Problem

I am trying to load data into AWS Redshift using the following command:

copy venue from 's3://mybucket/venue'
credentials 'aws_access_key_id=<access-key-id>;aws_secret_access_key=<secret-access-key>'
delimiter '\t';

but the data load is failing. When I checked the Queries section for that specific load, I noticed it failed with "Bad UTF8 hex sequence: a4 (error 3)".

Is there a way to skip bad records when loading data into Redshift?


Solution

Yes, you can use the MAXERROR parameter. The following example allows up to 250 bad records to be skipped (the errors are written to the stl_load_errors system table):

copy venue 
from 's3://mybucket/venue' 
credentials 'aws_access_key_id=<access-key-id>;aws_secret_access_key=<secret-access-key>' 
delimiter '\t' 
maxerror as 250;
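
After the load completes, you can see which rows were skipped and why by querying stl_load_errors. A minimal sketch (stl_load_errors and its columns are standard Redshift system-table names; the filename filter is an assumption based on the bucket path above):

-- Show the most recent load errors for this file prefix.
-- filename is a fixed-width column, so the trailing % handles padding.
select starttime, filename, line_number, colname, err_reason
from stl_load_errors
where filename like 's3://mybucket/venue%'
order by starttime desc
limit 10;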
License: CC-BY-SA with attribution