This issue can occur when datetime data in SQL Server falls outside the range that Spark supports: January 1, 1900 through December 31, 9999. When Spark reads a datetime value outside this range from SQL Server, it falls back to January 1, 1900. To avoid the issue, ensure that all datetime values in SQL Server fall within the supported range.
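One way to guard against this is to validate or clamp values before they reach Spark. The sketch below is a minimal illustration in plain Python, assuming the range boundaries stated above; in practice you would push the same check into the SQL query or a pre-load validation step, and the function name `clamp_to_spark_range` is hypothetical.

```python
from datetime import datetime

# Range boundaries taken from the answer above (an assumption,
# not verified against a specific Spark version).
SPARK_MIN = datetime(1900, 1, 1)
SPARK_MAX = datetime(9999, 12, 31, 23, 59, 59)

def clamp_to_spark_range(value: datetime) -> datetime:
    """Clamp a datetime into the supported range so Spark does not
    replace an out-of-range value with January 1, 1900."""
    if value < SPARK_MIN:
        return SPARK_MIN
    if value > SPARK_MAX:
        return SPARK_MAX
    return value

# Example: a SQL Server datetime column can hold values back to
# 1753-01-01, which falls below the range described above.
print(clamp_to_spark_range(datetime(1753, 1, 1)))   # clamped to 1900-01-01
print(clamp_to_spark_range(datetime(2023, 6, 15)))  # unchanged
```

The equivalent check can also be expressed as a `CASE WHEN` in the SQL Server query itself, so out-of-range rows are corrected (or filtered) before Spark ever sees them.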
Asked: 2023-06-15 18:00:29 +0000
Last updated: Jun 15 '23