The reason this doesn't work is the restrictions on covariance and contravariance in generics. The best way to explain this is to show an example of why it has to fail.
Assuming `Foo` has a valid signature, the type system promises that the following works:
```csharp
var myInput = new Dictionary<string, HashSet<string>>();
// assuming a valid signature for `Foo`
Foo(myInput);
// according to the type of `myInput` the following MUST work
HashSet<string> item = myInput["foo"];
item.Add("baz");
```
That is what absolutely has to work, so anything `Foo` does has to make sure that this still works.
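For contrast, here is a hypothetical variant of `Foo` (not from the question) whose parameter type matches `myInput` exactly. The type system accepts this call, because the method can only ever add `HashSet<string>` values, so the promise above is kept:

```csharp
using System;
using System.Collections.Generic;

class Program
{
    // Hypothetical variant: the parameter type matches the argument exactly,
    // so this method can only add HashSet<string> values to the dictionary.
    static void Foo(IDictionary<string, HashSet<string>> data)
    {
        data.Add("foo", new HashSet<string> { "bar" });
    }

    static void Main()
    {
        var myInput = new Dictionary<string, HashSet<string>>();
        Foo(myInput);

        // The assignment from the example above still works.
        HashSet<string> item = myInput["foo"];
        item.Add("baz");
        Console.WriteLine(item.Count); // 2
    }
}
```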
So, ignoring the above for a moment, let's assume the following valid implementation of `Foo`:
```csharp
public void Foo(IDictionary<string, IEnumerable<string>> data)
{
    List<string> item = new List<string>() { "foo", "bar" };
    data.Add("foo", item);
}
```
Because `List<string>` implements `IEnumerable<string>`, adding a list object to a dictionary that stores `IEnumerable<string>`s absolutely works. Again: the above is a valid implementation of `Foo` for its signature.
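To see that this implementation really is fine on its own terms, here is a sketch that calls it with a dictionary whose value type actually is `IEnumerable<string>`; everything compiles and runs:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

class Program
{
    // The implementation from above, unchanged.
    static void Foo(IDictionary<string, IEnumerable<string>> data)
    {
        List<string> item = new List<string>() { "foo", "bar" };
        data.Add("foo", item);
    }

    static void Main()
    {
        // A dictionary that genuinely stores IEnumerable<string> values:
        // putting a List<string> under a key is perfectly legal here.
        var data = new Dictionary<string, IEnumerable<string>>();
        Foo(data);
        Console.WriteLine(data["foo"].Count()); // 2
    }
}
```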
However, if we now combine both code segments, it falls apart: the object stored at key `"foo"` is not a `HashSet<string>` but a `List<string>`, so the assignment `HashSet<string> item = myInput["foo"]` would fail. And here is the conflict: the type system must ensure that the assignment works no matter what happens inside `Foo`, yet the implementation of `Foo` is completely valid for its signature.
So instead of inventing some arbitrary and opaque rules here, such a call is simply not allowed: the type system rejects any call to `Foo` with an incompatible argument. And no, because `IDictionary<TKey, TValue>` is invariant in both type parameters, there is no way to work around this restriction.
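What you can do, if you control the call site, is copy the entries into a new dictionary typed with the wider value type. Note that this does not contradict the restriction: `Foo` receives a different dictionary object, so any entries it adds land in the copy, not in the original. A sketch (using a hypothetical `Foo` body for illustration):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

class Program
{
    // Hypothetical body, just so the effect of calling Foo is observable.
    static void Foo(IDictionary<string, IEnumerable<string>> data)
    {
        data.Add("bar", new List<string> { "baz" });
    }

    static void Main()
    {
        var myInput = new Dictionary<string, HashSet<string>>
        {
            ["foo"] = new HashSet<string> { "a" }
        };

        // Copy into a dictionary with the wider value type. The HashSet
        // values themselves are reused; only the dictionary object is new.
        var widened = myInput.ToDictionary(
            kv => kv.Key,
            kv => (IEnumerable<string>)kv.Value);

        Foo(widened); // compiles: the types now match exactly

        Console.WriteLine(widened.Count); // 2
        Console.WriteLine(myInput.Count); // 1 -- the original is unaffected
    }
}
```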