Question

I have been investigating the implementation of XML-RPC in Java. The major open-source libraries (e.g. Apache XML-RPC) use raw types to represent certain XML-RPC types in Java, e.g. structs are represented as raw Maps.

Raw types in new code are discouraged in Java. From the JLS,

The use of raw types is allowed only as a concession to compatibility of legacy code. The use of raw types in code written after the introduction of genericity into the Java programming language is strongly discouraged. It is possible that future versions of the Java programming language will disallow the use of raw types.

But, because of the nature of the XML-RPC protocol, are raw types unavoidable in this case?

Solution

Given the XML-RPC examples of Structs:

<struct>
  <member>
    <name>submitter</name>
    <value><string>Deep Thought</string></value>
  </member>
  <member>
    <name>answer</name>
    <value><i4>42</i4></value>
  </member>
</struct>

... and arrays:

<array>
  <data>
    <value><string>Don't Panic</string></value>
    <value><boolean>0</boolean></value>
    <value><dateTime.iso8601>19791012T12:24:42</dateTime.iso8601></value>
  </data>
</array>

You can see that, per the spec, both structures can mix data types within the same container, which is exactly the kind of heterogeneity that generics are designed to rule out. Given the breadth of data types XML-RPC supports, the only common superclass of all of them is the trusty old java.lang.Object -- the same type you get from raw collections.
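
To make that concrete, here is a rough, hand-built illustration in Java (not any particular library's decoding API); the member names and values come straight from the XML above, but the construction code is purely illustrative:

import java.util.Arrays;
import java.util.Calendar;
import java.util.GregorianCalendar;
import java.util.List;

public class MixedTypes {
    public static void main(String[] args) {
        // Values decoded from the sample <struct> and <array> share no
        // useful supertype beyond java.lang.Object.
        Object submitter = "Deep Thought";        // <string>  -> String
        Object answer    = Integer.valueOf(42);   // <i4>      -> Integer
        Object panic     = Boolean.FALSE;         // <boolean> -> Boolean
        Object when      = new GregorianCalendar(1979, Calendar.OCTOBER, 12, 12, 24, 42)
                .getTime();                       // <dateTime.iso8601> -> java.util.Date

        // So any collection holding all of them can be no more specific
        // than a collection of Object.
        List<Object> values = Arrays.asList(submitter, answer, panic, when);
        for (Object v : values) {
            System.out.println(v.getClass().getSimpleName() + ": " + v);
        }
    }
}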

I think the answer to your question, technically, is: no, raw types aren't completely unavoidable. However, because java.lang.Object is the lowest common denominator anyway, there is probably little practical advantage to changing the pre-Java 5 API, since raw collection types already hand you Object.

For argument's sake, let's look at what might be done to introduce generics anyway. First, it would almost certainly break source compatibility for code written against a previous version of the library (e.g. Apache's implementation), even though erasure would keep the generified signatures binary compatible.

Next, structs would change from a raw Map to the best generics can offer in this situation: Map<String, Object>. Right off the bat, you no longer need to cast the keys to String to decide whether you care about a given member. But the value is still going to need some casting in most non-trivial cases, based on the program logic.
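
To sketch what that looks like in practice (the struct here is hand-built rather than coming from any specific library call), the key lookup is now type-safe but the value still has to be cast or checked with instanceof:

import java.util.HashMap;
import java.util.Map;

public class StructAccess {
    public static void main(String[] args) {
        // Stand-in for a decoded version of the <struct> example above.
        Map<String, Object> struct = new HashMap<>();
        struct.put("submitter", "Deep Thought");
        struct.put("answer", 42);

        // No cast needed on the key side any more...
        Object rawAnswer = struct.get("answer");

        // ...but the value side still needs a cast or an instanceof check,
        // driven by what the application logic expects to find there.
        if (rawAnswer instanceof Integer) {
            int answer = (Integer) rawAnswer;
            System.out.println("The answer is " + answer);
        }
    }
}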

Similar to the struct, the XML-RPC arrays could switch from a raw List to List<Object>, but that makes effectively no difference to the program logic. There's no additional metadata (like the struct's String member names) to make this advantageous.

Worse, both of these data structures can be nested inside each other. So that Object you get from one of the map values (or array/list elements) could be another List or Map -- and in this case even casting to List<Object> or Map<String, Object>, respectively, is going to spit out a slew of unchecked cast warnings.
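
Here is a small sketch of that nesting problem, again with a hand-built map standing in for whatever a library would return; the inner cast compiles, but only with an unchecked warning, because erasure means the runtime can verify only that the value is a Map, not its type arguments:

import java.util.HashMap;
import java.util.Map;

public class NestedStructs {
    @SuppressWarnings("unchecked") // the cast below cannot be verified at runtime
    public static void main(String[] args) {
        Map<String, Object> inner = new HashMap<>();
        inner.put("answer", 42);

        Map<String, Object> outer = new HashMap<>();
        outer.put("details", inner);

        // Without the annotation above, this line produces an unchecked cast
        // warning: only the raw Map part of the cast is actually checked.
        Map<String, Object> details = (Map<String, Object>) outer.get("details");
        System.out.println(details.get("answer"));
    }
}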

Practically speaking, as far as an API for handling or introspecting arbitrary XML-RPC data structures goes, yes, the raw types are probably unavoidable. Or not worth avoiding. (In many ways, it's a similar challenge to reflection and serialization in light of compile-time type erasure.)

Now, one glimmer of hope is the potential for such libraries to support marshalling these very generic XML-RPC data structures to/from strongly typed classes you've defined for your particular application. But if you have a library that supports that route, hopefully you're no longer dealing directly with the struct/array representations at all. (Or if you are, hopefully your data model doesn't mix different value types in the same "array".)
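
If a library offers (or you write) such a marshalling layer, the idea is roughly the following; Submission and its fromStruct method are hypothetical application code, not part of any real XML-RPC library:

import java.util.Map;

public class Submission {
    private final String submitter;
    private final int answer;

    public Submission(String submitter, int answer) {
        this.submitter = submitter;
        this.answer = answer;
    }

    // Hypothetical mapping from the generic struct representation to a
    // strongly typed application class; error handling omitted for brevity.
    public static Submission fromStruct(Map<String, Object> struct) {
        return new Submission(
                (String) struct.get("submitter"),
                (Integer) struct.get("answer"));
    }

    @Override
    public String toString() {
        return submitter + " says " + answer;
    }
}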

I hope that helps.

Licensed under: CC-BY-SA with attribution