Question

I am trying to load data into AWS Redshift using the following command:

copy venue from 's3://mybucket/venue'
credentials 'aws_access_key_id=<access-key-id>;aws_secret_access_key=<secret-access-key>'
delimiter '\t';

The data load is failing. When I checked the Queries section for that specific load, I noticed it failed with "Bad UTF8 hex sequence: a4 (error 3)".

Is there a way to skip bad records when loading data into Redshift?


Solution

Yes, you can use the maxerror parameter. The example below allows up to 250 bad records to be skipped before the load fails; details of each rejected row are written to the STL_LOAD_ERRORS system table:

copy venue
from 's3://mybucket/venue'
credentials 'aws_access_key_id=<access-key-id>;aws_secret_access_key=<secret-access-key>'
delimiter '\t'
maxerror as 250;
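
After the load completes, you can inspect the skipped rows. A query along these lines (a minimal sketch; the column names come from the STL_LOAD_ERRORS system table) returns the file, line number, column, and reason for each rejected record, most recent load first:

select starttime, filename, line_number, colname, err_reason
from stl_load_errors
order by starttime desc
limit 10;

Since the failure here is specifically an invalid UTF-8 byte, it may also be worth looking at COPY's acceptinvchars option, which replaces invalid UTF-8 characters (for example with '^' via acceptinvchars as '^') instead of rejecting the whole row. Whether to skip or repair such rows depends on your data.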