
Why does Apache Spark return a date of January 1, 1900 when reading datetime data from SQL Server?

asked 2023-06-15 18:00:29 +0000 by woof


1 Answer


answered 2023-06-15 18:02:02 +0000 by scrum

This can happen when the datetime data in SQL Server falls outside the range Spark supports. Spark only handles datetime values between January 1, 1900 and December 31, 9999; when a value outside that range is read from SQL Server, Spark falls back to January 1, 1900. To avoid this, make sure every datetime value in SQL Server falls within the supported range, or convert out-of-range values to NULL on the SQL Server side before Spark reads them, as in the sketch below.
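
A minimal PySpark sketch of that approach, assuming a JDBC connection to SQL Server and a hypothetical table dbo.events with a datetime column event_time; the server URL, credentials, and names are placeholders, not part of the original question.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("sqlserver-datetime-check").getOrCreate()

# Push a query down to SQL Server that converts out-of-range values to NULL
# before Spark ever sees them (table and column names are placeholders).
query = """
SELECT id,
       CASE WHEN event_time < '1900-01-01' THEN NULL ELSE event_time END AS event_time
FROM dbo.events
"""

df = (
    spark.read.format("jdbc")
    .option("url", "jdbc:sqlserver://myserver:1433;databaseName=mydb")
    .option("query", query)
    .option("user", "spark_reader")      # placeholder credentials
    .option("password", "********")
    .load()
)

# Flag any rows that still carry the 1900-01-01 sentinel so they can be audited.
suspect = df.filter(F.col("event_time") == F.to_timestamp(F.lit("1900-01-01 00:00:00")))
suspect.show()
```

Handling the conversion in the pushdown query keeps the fix on the SQL Server side, so Spark never has to substitute a default date; the final filter is just a quick audit for any remaining sentinel values.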



