Good data-tier development and design: what are the common bad practices in data-tier development?

StackOverflow https://stackoverflow.com/questions/3178565

Question

I am currently researching best practices (at a reasonably high level) for application design for highly maintainable systems that result in minimal friction to change. By "data tier" I mean database design, object-relational mappers (ORMs), and general data access technologies.

From your experience, what have you found to be the most common mistakes and bad practices in data-tier development, and what measures have you taken, put in place, or would recommend to make the data tier a better place to be from a developer's perspective?

An example answer might cover: what is the most common cause of slow, poorly scalable, and inextensible data tiers, and what steps can be taken (either in design or refactoring) to cure the problem?

I am looking for war stories and real-world advice that I can build into publicly available guidance documents and samples.


Solution

Magic.

I have used Hibernate, which automatically stores and fetches objects from a database. It also supports lazy loading, so that a related object is only retrieved from the database when you ask for it. This works in some magic way I don't understand.

This all works fine as long as it works, but when it breaks down it is impossible to track it down. I think we had a problem when we combined Hibernate with AOP, that somehow the object was not yet initialized by Hibernate when our code was executed. This problem was very hard to debug, because Hibernate works in such mysterious ways.
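For readers who have not run into this, the "mysterious" breakage typically surfaces as a LazyInitializationException: Hibernate hands back a lazy proxy inside a session, and the proxy only fails when something touches it after the session has closed. The following is a minimal sketch of that failure mode, assuming a Hibernate 5-era setup with an already-configured SessionFactory; the Order/OrderLine entities are hypothetical and not taken from the original answer.

```java
import java.util.List;

import javax.persistence.Entity;
import javax.persistence.FetchType;
import javax.persistence.GeneratedValue;
import javax.persistence.Id;
import javax.persistence.ManyToOne;
import javax.persistence.OneToMany;
import javax.persistence.Table;

import org.hibernate.Session;
import org.hibernate.SessionFactory;

// Hypothetical entities for illustration; @OneToMany associations are lazy by default.
@Entity
@Table(name = "orders")
class Order {
    @Id @GeneratedValue
    Long id;

    @OneToMany(mappedBy = "order", fetch = FetchType.LAZY)
    List<OrderLine> lines;   // Hibernate substitutes an uninitialised proxy collection here
}

@Entity
@Table(name = "order_lines")
class OrderLine {
    @Id @GeneratedValue
    Long id;

    @ManyToOne
    Order order;
}

class LazyLoadingPitfall {
    static void demo(SessionFactory sessionFactory) {
        Session session = sessionFactory.openSession();

        // Only the orders row is fetched; 'lines' remains an uninitialised proxy.
        Order order = session.get(Order.class, 1L);

        session.close();  // the session that could have run the deferred SQL is now gone

        // With default settings, touching the proxy outside the session throws
        // org.hibernate.LazyInitializationException.
        order.lines.size();
    }
}
```

The usual fixes are to initialise the association while the session is still open (a fetch join, or explicit initialisation), or to keep the session open for the unit of work that needs the data; the point of the answer stands either way, since none of this is visible until it breaks.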

Other tips

Object-Relational mapping is bad practice. By this, I mean that it tends to produce data schemas that can only loosely be described as "relational", and so they scale poorly and exhibit poor data integrity.

This is because properly relational schemas have been through the process of normalisation, whereas the results of O-R Mapping are normally object classes implemented as database tables. These will not normally have been normalised, but will instead have been designed for the immediate convenience of the OO developer.

Of course, in cases where the persistent data requirements are minimal, this is unimportant.
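To make the contrast concrete, here is a sketch of the difference between a table that is just a persisted class designed for the OO developer's convenience and a normalised design. The entity names are invented for illustration and are not from the original answer.

```java
import javax.persistence.Entity;
import javax.persistence.FetchType;
import javax.persistence.GeneratedValue;
import javax.persistence.Id;
import javax.persistence.ManyToOne;

// Anti-pattern: an entity shaped by the class (or screen) that uses it.
// Customer details are copied onto every row, so the table is not normalised
// and a change to a customer's address must touch many rows.
@Entity
class BookingRecord {
    @Id @GeneratedValue
    Long id;

    String customerName;      // duplicated per booking -> update anomalies
    String customerAddress;   // duplicated per booking
    String originPort;
    String destinationPort;
}

// Normalised alternative: customer facts are stored once and referenced.
@Entity
class Customer {
    @Id @GeneratedValue
    Long id;

    String name;
    String address;
}

@Entity
class Booking {
    @Id @GeneratedValue
    Long id;

    @ManyToOne(fetch = FetchType.LAZY)
    Customer customer;        // a foreign key instead of copied columns

    String originPort;
    String destinationPort;
}
```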

However, I once worked for a shipping company that had grown by taking over several other companies, and had outsourced development of an integrated operational system (to replace the various company-specific systems it had inherited) to a company using an OO methodology, with a data schema produced by O-R mapping. The performance characteristics of the system being developed were so poor, and the data schema so complex, that the shipping company dropped it after something like two years of development - before it even went live!

This was a direct consequence of the O-R mapping; the worst complexity in the schema (and the consequently poor performance) was caused by the existence of tables created solely as artifacts of the OO design process - they reflected screen layouts, not data relationships.

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow