I've been working on loading large data sets into BigQuery from CSVs in GCS. It works fine for some tables, but for others I get the following error:
bigquery.errors.JobExecutingException: Reason:invalid. Message:Error while reading data, error message: CSV table encountered too many errors, giving up. Rows: 1; errors: 1. Please look into the error stream for more details.
I have a feeling it's schema related, but I can't tell. How can I "look into the error stream" to see the details? insertErrors isn't on my job object after it fails, and I don't see any error message beyond what printing the exception gives me. Here's the relevant code:
job = bqClient.client.import_data_from_uris([gsFile], dataset, table, schema=None)
try:
    job_id, _results = bqClient.client.wait_for_job(job)
    print("Job ID: " + str(job_id))
    print("Results: " + str(_results))
except Exception as e:
    print(str(e))
    print(str(job))
My code works great for some tables but not others, so I'm trying to find out what's wrong with this one particular table.
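For what it's worth, my guess is that the "error stream" is the status.errors list on the failed job resource, so something like the sketch below (going straight at the BigQuery v2 REST API via google-api-python-client; PROJECT_ID is a placeholder for my project, and job_id would presumably come from job['jobReference']['jobId'] if import_data_from_uris returns the job resource) might be what I need, but I haven't been able to confirm it:

from googleapiclient import discovery

# Placeholders: PROJECT_ID is my GCP project, job_id is the id of the failed load job.
bq = discovery.build("bigquery", "v2")
failed_job = bq.jobs().get(projectId=PROJECT_ID, jobId=job_id).execute()

# status.errorResult is the fatal error; status.errors should hold the full error
# stream, including the per-row CSV parse errors.
for err in failed_job["status"].get("errors", []):
    print(err.get("reason"), err.get("location"), err.get("message"))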