Given that BigQuery is not meant as a platform for denormalizing data, can I denormalize the data in Google Cloud SQL before importing it into BigQuery?

I have the following tables: Table1 (500M rows), Table2 (2M rows), and Table3 (800K rows).

I can't denormalize in our existing relational database for various reasons. So I'd like to do a SQL dump of the database, load it into Google Cloud SQL, and then use SQL join scripts to create one large flat table to be imported into BigQuery.

Thanks.


Solution

That should work. You should be able to dump the generated flat table to CSV and import it into BigQuery. Currently, however, there is no direct Cloud SQL to BigQuery loading mechanism.
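As a minimal sketch of the join-then-flatten step, here is the idea using SQLite stand-ins for the three Cloud SQL tables; the table names, column names, and join keys are all hypothetical and would need to match your actual schema:

```python
import csv
import sqlite3

# Hypothetical stand-ins for the three tables; in practice these would
# live in Cloud SQL (MySQL/Postgres) and hold 500M / 2M / 800K rows.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE table1 (id INTEGER, t2_id INTEGER, value TEXT);
    CREATE TABLE table2 (id INTEGER, t3_id INTEGER, name TEXT);
    CREATE TABLE table3 (id INTEGER, category TEXT);
    INSERT INTO table1 VALUES (1, 10, 'a'), (2, 10, 'b');
    INSERT INTO table2 VALUES (10, 100, 'foo');
    INSERT INTO table3 VALUES (100, 'bar');
""")

# Join the three tables into one large flat result set.
rows = conn.execute("""
    SELECT t1.id, t1.value, t2.name, t3.category
    FROM table1 t1
    JOIN table2 t2 ON t1.t2_id = t2.id
    JOIN table3 t3 ON t2.t3_id = t3.id
    ORDER BY t1.id
""").fetchall()

# Dump the flat result to CSV, ready to load into BigQuery.
with open("flat.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["id", "value", "name", "category"])
    writer.writerows(rows)
```

Once the CSV is staged (e.g. in Cloud Storage), the `bq load` command-line tool can import it into a BigQuery table.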

Licensed under: CC-BY-SA with attribution
Not affiliated with Stack Overflow