Can Spark SQL perform merge schema function for parquet files?

asked 2023-05-27 23:43:55 +0000 by djk

1 Answer

answered 2023-05-27 23:56:01 +0000 by huitzilopochtli

Yes. When reading multiple Parquet files with different schemas, Spark SQL can merge them automatically: it reconciles column names, data types, and nullability across the inputs and produces a single schema containing every field that appears in any of the files. Because schema merging is a relatively expensive operation, it is disabled by default; you enable it per read with the "mergeSchema" data source option, or globally with the spark.sql.parquet.mergeSchema configuration setting.
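
A minimal PySpark sketch of the per-read option (the paths, app name, and column names below are illustrative, not from the question):

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("merge-schema-demo").getOrCreate()

# Write two Parquet datasets with different but compatible schemas.
# The key=1 / key=2 directory names are hypothetical and double as
# a discovered partition column when the parent directory is read.
spark.range(5).selectExpr("id", "id * 2 AS double_id") \
    .write.mode("overwrite").parquet("/tmp/merge_demo/key=1")
spark.range(5).selectExpr("id", "id * 3 AS triple_id") \
    .write.mode("overwrite").parquet("/tmp/merge_demo/key=2")

# Read the parent directory with schema merging enabled.
# The merged schema is the union of all fields:
# id, double_id, triple_id, plus the partition column key.
df = spark.read.option("mergeSchema", "true").parquet("/tmp/merge_demo")
df.printSchema()
```

Alternatively, turn it on for the whole session with spark.conf.set("spark.sql.parquet.mergeSchema", "true"). Since merging has to inspect the footers of many files, keep it off unless you actually expect divergent schemas.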
