Question

I have a CSV file containing a dump of a table's data, and I would like to import it directly into my database using Rails.

This is the code I currently have:

require "csv"

csv_text = File.read("public/csv_fetch/#{model.table_name}.csv")
ActiveRecord::Base.connection.execute("TRUNCATE TABLE #{model.table_name}")
puts "\nUpdating table #{model.table_name}"
csv = CSV.parse(csv_text, :headers => true)
csv.each do |row|
  row = row.to_hash.with_indifferent_access
  ActiveRecord::Base.record_timestamps = false
  model.create!(row.to_hash.symbolize_keys)
end

with help from here.

Consider my Sample csv:

id,code,created_at,updated_at,hashcode
10,00001,2012-04-12 06:07:26,2012-04-12 06:07:26,
2,00002,0000-00-00 00:00:00,0000-00-00 00:00:00,temphashcode
13,00007,0000-00-00 00:00:00,0000-00-00 00:00:00,temphashcode
43,00011,0000-00-00 00:00:00,0000-00-00 00:00:00,temphashcode
5,00012,0000-00-00 00:00:00,0000-00-00 00:00:00,temphashcode

But the problems with this code are:

  • It generates `id` as an auto-increment value (1, 2, 3, ...) instead of using the value from the CSV file.
  • For records where the timestamp is 0000-00-00 00:00:00, the value automatically defaults to null, which throws an error because the created_at column cannot be null.

Is there a generic way to import from CSV into models, or would I have to write custom code for each model to manipulate the attributes in each row manually?


Solution 2

Providing fields like `id`, and the timestamp fields as well, manually solved it:

model.id = row[:id]

and similarly for created_at and updated_at, if those columns exist in the model.
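
A minimal sketch of that approach, assuming `model` is the ActiveRecord class being imported, the CSV headers match its column names, and ActiveSupport's Hash#except and Hash#symbolize_keys are available (as they are in Rails), might look like this:

require "csv"

csv_text = File.read("public/csv_fetch/#{model.table_name}.csv")
ActiveRecord::Base.connection.execute("TRUNCATE TABLE #{model.table_name}")

# Disable automatic timestamping so created_at/updated_at from the CSV are kept.
ActiveRecord::Base.record_timestamps = false

CSV.parse(csv_text, :headers => true).each do |row|
  attrs = row.to_hash.symbolize_keys

  record = model.new(attrs.except(:id, :created_at, :updated_at))
  record.id = attrs[:id]  # preserve the id from the CSV instead of auto-incrementing
  record.created_at = attrs[:created_at] if model.column_names.include?("created_at")
  record.updated_at = attrs[:updated_at] if model.column_names.include?("updated_at")
  record.save!
end

ActiveRecord::Base.record_timestamps = true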

OTHER TIPS

For question 1, I suggest you output the row.to_hash.symbolize_keys, e.g.

# ...
csv.each do |row|
  #...
  hash = row.to_hash.symbolize_keys
  Rails.logger.info "hash: #{hash.inspect}"
  model.create!(hash)
end

to see if the "id" is assigned.

For question 2, I don't think it's a good idea to store "0000-00-00" instead of nil for the date.
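
If you normalize those values at import time instead, a small sketch could look like the following (the ZERO_TIMESTAMP constant and normalize_timestamps helper are made up here for illustration):

ZERO_TIMESTAMP = "0000-00-00 00:00:00"

# Turn MySQL-style zero timestamps into nil (or substitute a real time
# such as Time.current if the column is NOT NULL) before handing the row
# to the model.
def normalize_timestamps(attrs)
  [:created_at, :updated_at].each do |key|
    attrs[key] = nil if attrs[key] == ZERO_TIMESTAMP
  end
  attrs
end

model.create!(normalize_timestamps(row.to_hash.symbolize_keys))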

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow