Question

I am having a hard time coming up with a reasonable design for my project. I have an existing Postgres database that is constantly updated by other Python scripts. The web server, built on the Django framework, will access the Postgres database only to update User models and to display blog information to logged-in users. The blog information is what the other Python scripts update overnight.

Now my question: if I have to syncdb my blog model with the existing Postgres database, would that cause any problems? For example:

models.py
class Blog(models.Model):
    title = ...
    content = ...
    author = ...

And say my Postgres db, called mydb, has many tables, one of which is a blog table with columns for title, content, and author. How would I keep my model in sync with the existing database? Now let's say I add a new column to my db: date of entry.

If I simply update my model to :

class Blog(models.Model):
    title = ...
    content = ...
    author = ...
    date_of_entry = ...

will it work?

What are the potential problems here, and are there any simpler solutions?

P.S.: I have used South in the past, but the situation here is different. I am using a db that is read-only from Django's point of view, and no data migration is necessary as of now.


Solution

If your database is read-only, you don't have to run syncdb at all. Set managed = False and use the db_table Meta option on your model to specify the existing table it corresponds to, and db_column on each field for the column names.

If you haven't already, see the doc on legacy databases for more info.
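If you'd rather not write the model by hand, Django's inspectdb management command can generate a starting point from the existing schema (the models it emits are marked managed = False by default). This assumes a project already configured to point at mydb:

```shell
# Print generated model code for just the blog table; run from the
# project root and redirect the output wherever your models live.
python manage.py inspectdb blog
```

Treat the output as a draft: review the generated field types against the actual columns before committing it.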

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow