Question

Does anyone know of some Python package or function that can upload a Pandas DataFrame (or simply a .csv) to a PostgreSQL table, even if the table doesn't yet exist?

(i.e. it runs a CREATE TABLE with the appropriate column names and column types, based on a mapping between the Python data types and their closest equivalents in PostgreSQL)

In R, I use the ROracle package, which provides a dbWriteTable function that does what I've described above.


Solution

Since pandas 0.14, the SQL functions also support PostgreSQL (via SQLAlchemy, so all database flavors supported by SQLAlchemy will work). So you can simply use to_sql to write a pandas DataFrame to a PostgreSQL database:

import pandas as pd
from sqlalchemy import create_engine
import psycopg2  # the PostgreSQL driver SQLAlchemy uses under the hood

# connection string format: postgresql://user:password@host:port/database
engine = create_engine('postgresql://scott:tiger@localhost:5432/mydatabase')

# creates "table_name" with inferred column types if it doesn't exist yet
df.to_sql("table_name", engine)

See the docs: http://pandas.pydata.org/pandas-docs/stable/io.html#sql-queries
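
For completeness, to_sql also takes an if_exists argument that controls what happens when the table is already there, which is handy given the "even if the table doesn't yet exist" requirement. Reusing the engine from above:

# 'fail' (the default) raises an error if the table already exists,
# 'replace' drops and recreates it, 'append' adds the rows to it
df.to_sql("table_name", engine, if_exists="append")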

If you have an older version of pandas (< 0.14), see this question: How to write DataFrame to postgres table?
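
If upgrading isn't an option, here is a rough sketch of doing it by hand with psycopg2. The write_df function and PG_TYPES mapping are hypothetical, the type mapping is deliberately simplified, and it assumes PostgreSQL 9.1+ for CREATE TABLE IF NOT EXISTS:

import psycopg2

# hypothetical, simplified mapping from pandas dtypes to PostgreSQL types;
# anything unrecognized falls back to TEXT
PG_TYPES = {"int64": "BIGINT", "float64": "DOUBLE PRECISION", "bool": "BOOLEAN"}

def write_df(df, table, conn):
    # build the CREATE TABLE statement from the column names and dtypes
    cols = ", ".join("{} {}".format(c, PG_TYPES.get(str(t), "TEXT"))
                     for c, t in zip(df.columns, df.dtypes))
    cur = conn.cursor()
    cur.execute("CREATE TABLE IF NOT EXISTS {} ({})".format(table, cols))
    # convert numpy scalars to plain Python values so psycopg2 can adapt them
    rows = [tuple(v.item() if hasattr(v, "item") else v for v in row)
            for row in df.itertuples(index=False)]
    placeholders = ", ".join(["%s"] * len(df.columns))
    cur.executemany("INSERT INTO {} VALUES ({})".format(table, placeholders), rows)
    conn.commit()

conn = psycopg2.connect("dbname=mydatabase user=scott password=tiger")
write_df(df, "table_name", conn)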

Other tips

There's a package for this: https://gist.github.com/catawbasam/3164289. Not sure how well it works.

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow