Copying data from S3 to AWS Redshift using Python and psycopg2 -
I'm having issues executing a COPY command to load data from S3 into Amazon Redshift using Python.
I have the following COPY command:
copy moves 's3://<my_bucket_name>/moves_data/2013-03-24/18/moves' credentials 'aws_access_key_id=<key_id>;aws_secret_access_key=<key_secret>' removequotes delimiter ',';
When I execute this command using SQL Workbench/J it works as expected, but when I try to execute it with Python and psycopg2 the command passes OK, yet no data is loaded and no error is thrown.
I tried the following two options (assume the psycopg2 connection is OK, because it is):
cursor.execute(copy_command)
cursor.copy_expert(copy_command, sys.stdout)
Both pass with no warning, yet the data isn't loaded.
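For reference, a minimal sketch of what the Python side boils down to (the connection parameters, bucket name and credentials are placeholders):

import sys
import psycopg2

# Placeholder connection parameters
conn = psycopg2.connect(
    host="<redshift-endpoint>",
    port=5439,
    dbname="<dbname>",
    user="<user>",
    password="<password>",
)
cursor = conn.cursor()

copy_command = (
    "copy moves 's3://<my_bucket_name>/moves_data/2013-03-24/18/moves' "
    "credentials 'aws_access_key_id=<key_id>;aws_secret_access_key=<key_secret>' "
    "removequotes delimiter ',';"
)

# Option 1: plain execute
cursor.execute(copy_command)

# Option 2: copy_expert, sending any output to stdout
cursor.copy_expert(copy_command, sys.stdout)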
Any ideas?
Thanks.
I have used this exact setup (psycopg2 + Redshift + COPY) successfully. Did you commit afterwards? SQL Workbench defaults to auto-commit, while psycopg2 defaults to opening a transaction, so the data won't be visible until you call commit() on the connection.
The full workflow is:
conn = psycopg2.connect(...)
cur = conn.cursor()
cur.execute("copy...")
conn.commit()
I don't believe copy_expert() or any of the cursor.copy_* commands work with Redshift.
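For completeness, a minimal end-to-end sketch with the commit made explicit (table name, S3 path, credentials and connection parameters are placeholders):

import psycopg2

# Placeholder connection parameters
conn = psycopg2.connect(
    host="<redshift-endpoint>",
    port=5439,
    dbname="<dbname>",
    user="<user>",
    password="<password>",
)
try:
    cur = conn.cursor()
    cur.execute(
        "copy moves 's3://<my_bucket_name>/moves_data/2013-03-24/18/moves' "
        "credentials 'aws_access_key_id=<key_id>;aws_secret_access_key=<key_secret>' "
        "removequotes delimiter ',';"
    )
    # Without the commit, psycopg2 rolls the load back when the connection closes.
    conn.commit()
except Exception:
    conn.rollback()
    raise
finally:
    conn.close()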