Question

According to RFC 4627, section 2.2 (Objects):

An object structure is represented as a pair of curly brackets surrounding zero or more name/value pairs (or members). A name is a string. A single colon comes after each name, separating the name from the value. A single comma separates a value from a following name. The names within an object SHOULD be unique.

But does "SHOULD be unique" conform to industry best practice? Do most JSON encoders/decoders enforce "MUST be unique"? JSONLint.com does not enforce it.

For example, { "foo":"value1", "foo":"value2" } is reported as valid JSON and parsed as { "foo":"value2" }.

That is, a later member with the same name replaces the earlier one.


Solution

RFC 4627:

The names within an object SHOULD be unique.

Douglas Crockford (the author) said about it:

This was the biggest blunder in the RFC. SHOULD should have been MUST.

It is, sadly, too late to repair this.

The more recent ECMA JSON standard doesn't require uniqueness either, to avoid invalidating existing JSON documents that conform to the RFC and might contain duplicate names.

In other words, { "foo":"value1", "foo":"value2" } is valid JSON, but using duplicate names in new JSON documents is not recommended.

Different parsers can behave differently (or can be configured to behave differently).
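As an illustration of this configurability, here is a sketch using Python's `json` module: by default it silently keeps the last value, but its `object_pairs_hook` parameter lets you substitute a stricter policy that rejects duplicates outright.

```python
import json

doc = '{ "foo":"value1", "foo":"value2" }'

# Default behaviour: the last value for a duplicate name wins.
print(json.loads(doc))  # {'foo': 'value2'}

def reject_duplicates(pairs):
    """A strict hook that raises on duplicate names instead of
    silently keeping the last occurrence."""
    obj = {}
    for key, value in pairs:
        if key in obj:
            raise ValueError(f"duplicate name in object: {key!r}")
        obj[key] = value
    return obj

try:
    json.loads(doc, object_pairs_hook=reject_duplicates)
except ValueError as err:
    print(err)  # duplicate name in object: 'foo'
```

The same document is thus either accepted (last value wins) or rejected, depending entirely on how the decoder is configured.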

Other Tips

For good or ill, duplicate names in JSON are permitted by the spec.

The problem is that the behaviour of decoders when faced with such duplicates is undefined.

Some parsers will reject such JSON as invalid (this is the only behaviour that can really be said to be "wrong", IMHO). Most others will return the last value encountered. At least one that I know of (because I wrote it :)) treats JSON strictly as a data structure, independent of any JavaScript parsing rules or execution semantics: it allows each named value to be accessed separately by ordinal index within the containing object, as an alternative to access by key name (in which case the first occurrence is returned).
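That ordinal-access behaviour can be approximated in Python by asking the decoder to hand back the raw member list instead of building a dict (passing `list` as the `object_pairs_hook` keeps every name/value pair in document order):

```python
import json

doc = '{ "foo":"value1", "foo":"value2" }'

# Keep every member, duplicates included, in document order.
pairs = json.loads(doc, object_pairs_hook=list)

print(pairs)        # [('foo', 'value1'), ('foo', 'value2')]
print(pairs[0][1])  # value1 -- first occurrence, by ordinal index
print(pairs[1][1])  # value2 -- second occurrence
```

Nothing is lost at parse time; it is then the application's choice whether to take the first occurrence, the last, or all of them.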

Some people argue that a decoder should replicate the behaviour of a JavaScript parser and execution environment when constructing an object described by JSON (that is, the last named value overwrites any earlier values of the same name). But JSON is only a data-structure standard: although it was inspired by, and draws on, the syntax of JavaScript, it does not demand JavaScript execution or behaviours that would reflect such execution.

Accordingly, neither the RFC nor the ECMA standard dictates how a decoder must (or even should) behave when faced with duplicates. With the exception of parsers that reject duplicate names outright, none of the various behaviours that accept duplicates can be said to be the "correct" one.

If you are producing and consuming JSON between processes under your own control, it might be tempting to simply pick a JSON encoder/decoder that works the way that suits you and go with that. But I would advise against it.

Which brings me to the bottom line:

Although the JSON standard allows duplicates, it does not require you to use them, thus the wisest path is simply to avoid them and avoid creating or running into problems entirely. :)

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow