Question

I am in the process of integrating a number of legacy systems. They each have a different database, and I will need to write data access code for most of them.

The database schemas cannot be changed (I might be able to apply some indexes and such, but tables and their columns must retain their structure). Some of the databases have an OK design, with appropriate relationships and primary/foreign keys, while some of the other databases lack that very much.

Which ORM would you choose for this task? I would like to use the same ORM across the project, and my requirements are:

  • Ability to rename tables or columns in code, but retain the old name in the database (see the sketch after this list).
  • Reasonable code generation
  • Efficient LINQ support (LINQ queries against the data model should be translated to efficient SQL).
  • Generated data classes should preferably be POCOs.
  • Preferably support for different database engines.
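
To make the first requirement concrete, here is roughly what I mean, sketched with the LINQ to SQL mapping attributes I already know (the table and column names are invented):

```csharp
using System.Data.Linq.Mapping;

// Hypothetical legacy table exposed under a friendlier name in code.
[Table(Name = "TBL_CUST_MSTR")]
public class Customer
{
    // Legacy column "CUST_ID" surfaced as "Id".
    [Column(Name = "CUST_ID", IsPrimaryKey = true)]
    public int Id { get; set; }

    // Legacy column "CUST_NM" surfaced as "Name".
    [Column(Name = "CUST_NM")]
    public string Name { get; set; }
}
```

Whatever ORM I end up with should allow this kind of mapping without touching the schema.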

I currently have the most experience with LINQ to SQL, but I have a feeling it might be the wrong choice for this project. I am willing to invest some time in learning a new framework.

Solution

At a guess, I think an ORM might cause you more trouble than it saves. If you have several different legacy databases where some of them are poorly designed, you might find it easier to build the data access layer at a lower level than an ORM. Fowler's Patterns of Enterprise Application Architecture does quite a good job of cataloguing various approaches to structuring data access layers.
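
As a rough illustration (not a full implementation), a hand-rolled layer in the style of Fowler's Table Data Gateway could be as simple as the sketch below; the table, column, and class names are hypothetical, and it assumes plain ADO.NET against SQL Server:

```csharp
using System.Collections.Generic;
using System.Data.SqlClient;

// Plain result object; the ugly legacy names stay inside the gateway.
public class Customer
{
    public int Id { get; set; }
    public string Name { get; set; }
}

// One gateway per legacy table: all SQL for that table lives here,
// so each database's quirks are isolated behind ordinary methods.
public class CustomerGateway
{
    private readonly string _connectionString;

    public CustomerGateway(string connectionString)
    {
        _connectionString = connectionString;
    }

    public IList<Customer> FindByRegion(string regionCode)
    {
        var customers = new List<Customer>();
        using (var connection = new SqlConnection(_connectionString))
        using (var command = new SqlCommand(
            "SELECT CUST_ID, CUST_NM FROM TBL_CUST_MSTR WHERE RGN_CD = @region",
            connection))
        {
            command.Parameters.AddWithValue("@region", regionCode);
            connection.Open();
            using (var reader = command.ExecuteReader())
            {
                while (reader.Read())
                {
                    customers.Add(new Customer
                    {
                        Id = reader.GetInt32(0),
                        Name = reader.GetString(1)
                    });
                }
            }
        }
        return customers;
    }
}
```

The point is that each gateway can be as clean or as pragmatic as the schema behind it demands, without forcing one mapping strategy across all of the databases.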

Some of the data access layer might be amenable to a code-generation solution; however, the presence of a variety of schemas (some messy, as you say) suggests that a one-size-fits-all approach may not work, or may involve disproportionate effort to make it play nicely with all of the legacy databases.

OTHER TIPS

NHibernate is going to be your best bet: it offers POCO support, supports every database I know of, and its LINQ support is more than adequate. It sounds like you will want to generate your mapping files; there are templates out there for MyGeneration and CodeSmith that will help you do that and avoid tons of manual work. (After the initial generation it is easy to tweak and change table names, relationships, and more, which is much harder in most generation-based frameworks.)
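
To give a feel for the tweaking step, here is a minimal sketch using Fluent NHibernate's mapping API instead of the raw hbm.xml files (the legacy table and column names are invented); the same renaming is equally possible in the generated XML mappings:

```csharp
using FluentNHibernate.Mapping;

// Plain POCO with the names you want in code; members are virtual
// so NHibernate can create lazy-loading proxies.
public class Customer
{
    public virtual int Id { get; set; }
    public virtual string Name { get; set; }
}

// The mapping pins the class to the legacy names, so renaming
// things in code never requires touching the database schema.
public class CustomerMap : ClassMap<Customer>
{
    public CustomerMap()
    {
        Table("TBL_CUST_MSTR");
        Id(x => x.Id).Column("CUST_ID");
        Map(x => x.Name).Column("CUST_NM");
    }
}
```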

LINQ to SQL only works with SQL Server. ADO.NET Entity Framework works with any database that has an Entity Framework-aware ADO.NET provider.

The only item on your list that Entity Framework does not cover is POCO support: its generated classes are not POCOs. In particular, it's a bad idea to expose one of them in a web service, as implementation-specific data will also be serialized.

NHibernate fits most of your requirements. The only limitation is LINQ; however, they have a full-time developer, donated by a sponsor, working on adding support for LINQ queries.

It supports a lot of database engines, it is free, and it can use POCOs. It has lots of hooks to make various pieces extensible.
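
As a rough sketch of the multi-engine side (exact dialect and driver names depend on your NHibernate version and databases), switching engines is mostly a matter of configuration:

```csharp
using NHibernate;
using NHibernate.Cfg;

// One SessionFactory per legacy database; they differ mainly in
// connection string, driver, and dialect.
public static class SessionFactories
{
    public static ISessionFactory BuildForSqlServer(string connectionString)
    {
        var cfg = new Configuration();
        cfg.SetProperty(Environment.Dialect, "NHibernate.Dialect.MsSql2005Dialect");
        cfg.SetProperty(Environment.ConnectionDriver, "NHibernate.Driver.SqlClientDriver");
        cfg.SetProperty(Environment.ConnectionString, connectionString);
        // Pick up the mapping files embedded in this assembly.
        cfg.AddAssembly(typeof(SessionFactories).Assembly);
        return cfg.BuildSessionFactory();
    }
}
```

For another engine (Oracle, MySQL, and so on) you would swap the dialect and driver and keep the mappings and POCOs unchanged.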

In my opinion, the most important factor is whether you plan to implement a domain model or map tables directly to data transfer objects (DTOs). If you are going to model the domain, then I definitely recommend NHibernate. If you are going to work with DTOs then there are many good candidates, including Entity Framework and DataTables.

I'm going to guess you're working with C#?

I'd say maybe give SubSonic a try; it's been a good choice when the database is already in place, especially when dealing with stored procedures and views. There's no XML to configure as with NHibernate, and you could be up and running with it in a few hours (or 20 minutes if you know what you're doing). It has support for a few databases as well.

The LINQ part I believe Rob and Co. are adding in now, so that might also be in there, but that would be part of the latest stuff that I haven't tried yet.

http://subsonicproject.com/

Licensed under: CC-BY-SA with attribution