Question

I'm having some trouble with spring-data-mongodb 1.3.3.RELEASE and Java 1.6.

My setup is a bit complicated because I have to deal with legacy data, so I have a custom TypeMapper (extending DefaultMongoTypeMapper) and two different read/write converter combinations. Additionally, I use @TypeAlias to set the _class information in the database.

The problematic models consist of several nested lists; some are typed, like

List<DocumentValue>

MyObject may contain an object list

List<Object>

which may contain another DocumentValue object.
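
For illustration, here is a stripped-down sketch of how such models might look; the field names (value, documentValues, mixedContent) are only placeholders for this example, not my real ones:

// DocumentValue.java
import org.springframework.data.annotation.TypeAlias;

@TypeAlias("documentValue")
public class DocumentValue {

    private String value; // placeholder field

    // getters/setters omitted
}

// MyObject.java
import java.util.List;

import org.springframework.data.annotation.TypeAlias;

@TypeAlias("myObject")
public class MyObject {

    // typed nested list
    private List<DocumentValue> documentValues;

    // untyped list which may also contain DocumentValue objects
    private List<Object> mixedContent;

    // getters/setters omitted
}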

This setup seems to work: unit tests run without any problems and the object mapping looks fine in the debugger. My application is a web application, and I'm able to write DocumentValue objects into a collection; the _class information is present.

As long as I do not shut down the server (a Tomcat in my case), the object mapping works. But when I restart the server (start a new JVM), DocumentValue objects are no longer mapped correctly and are treated as java.util.Map instead. The _class information seems to be ignored. I suppose there might be a problem with my mapping context (should my model entities be registered during Spring context startup?), but I'm not able to find the misconfiguration. Has anybody had similar problems, or does anyone have suggestions?

Solution

Thanks for your reply. I think I found the reason why _class was ignored. You are right, I use an unusual combination of TypeMapper and converters.

Let me show you my CustomTypeMapper:

import java.util.Arrays;

import org.springframework.data.convert.MappingContextTypeInformationMapper;
import org.springframework.data.mongodb.core.convert.DefaultMongoTypeMapper;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.util.TypeInformation;

import com.mongodb.DBObject;

public class CustomTypeMapper extends DefaultMongoTypeMapper {

    public CustomTypeMapper(MongoMappingContext mappingContext) {
        // register a MappingContextTypeInformationMapper so that the
        // @TypeAlias annotations known to the mapping context are honored
        super(DEFAULT_TYPE_KEY,
                Arrays.asList(new MappingContextTypeInformationMapper(
                        mappingContext)));
    }

    @Override
    public <T> TypeInformation<? extends T> readType(DBObject source,
            TypeInformation<T> basicType) {

        // do some custom recognition for the legacy data here ...

        // ... or just fall back to the default type mapping
        return super.readType(source, basicType);
    }
}

Besides the custom type recognition, the special thing here is the constructor, in which a MappingContextTypeInformationMapper is registered so that the @TypeAlias annotations are evaluated.

The key here is the MongoMappingContext which is needed. In the non-functional case I initialized the CustomTypeMapper like this:

<bean id="customTypeMapper" class="de.flexguse.repository.CustomTypeMapper">
    <constructor-arg name="mappingContext">
        <bean class="org.springframework.data.mongodb.core.mapping.MongoMappingContext" />
    </constructor-arg>
</bean>

which ran, but was wrong, because this new instance of MongoMappingContext did not contain any of the TypeInformation collected via the base-package attribute in

<mongo:mapping-converter id="customMappingConverter" db-factory-ref="mongoDbFactory"
        type-mapper-ref="customTypeMapper" base-package="de.flexguse.model" >

    <mongo:custom-converters>
            ... some custom converters ...
    </mongo:custom-converters>

</mongo:mapping-converter>

Unfortunately I was not able to figure out where the MongoMappingContext is created in order to use it as a constructor argument, so I wrote a simple bean, MongoMappingContextProvider, which obtains the MongoMappingContext via @Autowired:

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;

public class MongoMappingContextProvider {

    @Autowired
    private MongoMappingContext mappingContext;

    /**
     * @return the autowired mapping context
     */
    public MongoMappingContext getMappingContext() {
        return mappingContext;
    }

}

The Spring configuration of the CustomTypeMapper now looks like this:

<bean id="mongoMappingContextProvider" class="de.flexguse.repository.MongoMappingContextProvider" />

<bean id="customTypeMapper" class="de.flexguse.repository.CustomTypeMapper">
    <constructor-arg name="mappingContext" value="#{mongoMappingContextProvider.mappingContext}" />
</bean>

This solution works for me.
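
For completeness: if you use Java configuration instead of XML, the mapping context that Spring Data builds can be fetched directly from AbstractMongoConfiguration, so the provider bean would not be needed. The following is only a sketch of that idea; the database name and Mongo setup are placeholders, not my real configuration:

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.mongodb.config.AbstractMongoConfiguration;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;

import com.mongodb.Mongo;
import com.mongodb.MongoClient;

import de.flexguse.repository.CustomTypeMapper;

@Configuration
public class MongoConfig extends AbstractMongoConfiguration {

    @Override
    protected String getDatabaseName() {
        return "mydb"; // placeholder
    }

    @Override
    public Mongo mongo() throws Exception {
        return new MongoClient(); // placeholder, connects to localhost:27017
    }

    @Override
    protected String getMappingBasePackage() {
        // same effect as base-package in <mongo:mapping-converter>
        return "de.flexguse.model";
    }

    @Bean
    @Override
    public MappingMongoConverter mappingMongoConverter() throws Exception {
        MappingMongoConverter converter = super.mappingMongoConverter();
        // hand the converter's own mapping context to the type mapper
        // instead of creating a second, empty MongoMappingContext
        converter.setTypeMapper(new CustomTypeMapper(mongoMappingContext()));
        return converter;
    }
}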

By the way, to make sure that type information is available for all the model beans I use, I added @Document to every model bean that is persisted - just in case ;)

Maybe this is helpful for someone else with similar problems.

Regards, Christoph

OTHER TIPS

I suspect the issue occurs due to the unusual combination of TypeMapper usage and converters. If a manually implemented Converter instance is registered for a particular type, that Converter is responsible for creating an object that can be persisted to and read back from MongoDB. This means that your converter instance needs to write the type information itself.
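
To make that concrete, here is a minimal sketch of what such a write converter could look like for DocumentValue; the field written and the package are assumptions for the example, not taken from the setup above:

import org.springframework.core.convert.converter.Converter;

import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;

import de.flexguse.model.DocumentValue;

public class DocumentValueWriteConverter implements Converter<DocumentValue, DBObject> {

    @Override
    public DBObject convert(DocumentValue source) {
        DBObject dbo = new BasicDBObject();
        // the converter itself has to write the type key; without this,
        // nested DocumentValue objects are read back as plain Maps
        dbo.put("_class", DocumentValue.class.getName()); // or the @TypeAlias value
        dbo.put("value", source.getValue()); // assumed field
        return dbo;
    }
}

A corresponding read converter would then have to inspect that key itself before mapping nested values back.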

If you can't get it to work and can put together a tiny sample project that reproduces the issue (preferably with a test case to execute), feel free to file a ticket in our JIRA instance.

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow