Update readme to include datetime2(0) not being supported until spark…#157
hmayer1980 wants to merge 2 commits into microsoft:master from
Conversation
… pull request apache/spark#32655 is incorporated into your Spark environment.
Is it possible for the SQL Spark connector to truncate the datetime value instead of throwing the "connection is closed" error? That seems too strict for this case.
Not that I know of.
### datetime2(0) will result in com.microsoft.sqlserver.jdbc.SQLServerException: The connection is closed

This issue arises from Spark not supporting datetime2.
The SQL datetime data type only allows 3 digits of fractional seconds, while a Spark dataframe might carry more digits than datetime allows. There are two workarounds: 1. truncate the datetime values in the Spark dataframe to 3 digits (millisecond precision); 2. use the datetime2 data type for the SQL table column, which allows up to 7 digits of fractional seconds.
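Workaround 1 can be sketched in plain Python to show what the truncation does to a single value (in a real job you would apply the equivalent transformation to the dataframe column; the function name `truncate_to_millis` is illustrative, not part of the connector):

```python
from datetime import datetime

def truncate_to_millis(ts: datetime) -> datetime:
    # Drop sub-millisecond digits so the value fits SQL datetime's
    # 3-digit fractional-seconds precision.
    return ts.replace(microsecond=(ts.microsecond // 1000) * 1000)

ts = datetime(2021, 5, 25, 12, 30, 45, 123456)  # 6 fractional digits
print(truncate_to_millis(ts))  # 2021-05-25 12:30:45.123000
```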
Hello @luxu1-ms, the workarounds are clear and are documented in the two referenced issues. The point is: if your SQL table column has the datetime2(0) data type, it just does not work, so this needs to be documented!
If you cannot change the SQL table column from datetime2(0) to any other datetime2(x) where x > 0, the connector won't work!
Can we please include in the known issues the fact that datetime2(0) is not supported, as shown by issues #39 and #83?
This will really only be fixed once the Spark pull request is incorporated, and since that may take months or never happen, we need to have it noted.