
Why is it not possible to upload a Dask dataframe to BigQuery due to a Python value unrecognized by Arrow?

asked 2021-06-09 11:00:00 +0000

lakamha


1 Answer


answered 2022-07-19 11:00:00 +0000

plato

This error occurs because Dask uses Arrow to serialize its data when transferring it to BigQuery. Some Python data types, however, are not recognized by Arrow, so the serialization fails. This is especially common with object columns holding nested structures, missing values, or non-standard Python objects.

To resolve this issue, you need to coerce the offending columns to Arrow-supported types before uploading to BigQuery. One approach is to convert your Dask dataframe to a Pandas dataframe first, clean up its dtypes, and then upload the Pandas dataframe to BigQuery. Alternatively, you can write your data out as Parquet or ORC, formats that work well with Arrow and can also be loaded directly into BigQuery.



