Question

The "schema" for objects I am putting in a collection goes something like this:

{
    name: "Niranjan",  // <-- key
    children: ["Suraj", "kalidas", "Suraj"]
}

But I wanted to ensure that children would be treated as a set - in the example above I wanted Mongo to reject the record because "Suraj" appears twice in the array.

I tried creating a unique index over name and children:

db.folks.ensureIndex({name:1,children:1},{unique:true})

What this does, however, is create a unique index over every name + child combination across the whole collection. That is, if I DO NOT also have a unique index on name, I will be able to do the following:

db.folks.insert({name:'p',children:['c1','c2']})
db.folks.insert({name:'p',children:['c3']})

because p+c1, p+c2 and p+c3 are all distinct. But if I try to insert the following:

db.folks.insert({name:'p',children:['c4','c1']})

then I will get

E11000 duplicate key error index: test.folks.$name_1_children_1  dup key: { : "p", : "c1" }

Is there a way to achieve this without doing a check during insertion, as in https://stackoverflow.com/questions/9640233/unique-array-values-in-mongoose?

Solution

Unique indexes will act the way you expect in a future release of MongoDB:

https://jira.mongodb.org/browse/SERVER-1068

For now, you'll need to check for uniqueness in your application before inserting.
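For example, here is a minimal sketch of such a check in the mongo shell (insertFolk is a hypothetical helper; the collection and field names are taken from the question):

function insertFolk(doc) {
    // Reject the document if the children array contains duplicates.
    var seen = {};
    for (var i = 0; i < doc.children.length; i++) {
        if (seen[doc.children[i]]) {
            throw "duplicate child: " + doc.children[i];
        }
        seen[doc.children[i]] = true;
    }
    db.folks.insert(doc);
}

insertFolk({name: 'p', children: ['c1', 'c2']});   // inserted
insertFolk({name: 'q', children: ['c1', 'c1']});   // throws, nothing inserted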

Incidentally, when you're updating documents, you can use $addToSet instead of $push to enforce uniqueness:

http://www.mongodb.org/display/DOCS/Updating#Updating-%24addToSetand%24each
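For example (assuming a document with name 'p' already exists), an update like this only adds values that are not already in the array, and $each lets you offer several candidates at once:

db.folks.update({name: 'p'}, {$addToSet: {children: {$each: ['c1', 'c5']}}})

Note that $addToSet only guards updates; it will not deduplicate an array that was already inserted with duplicate values.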

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow