Data Type Conversions
When saving a DataFrame to SingleStore, each DataFrame column type is converted to the corresponding SingleStore type:
| Spark Type | SingleStore Type |
|---|---|
| LongType | BIGINT |
| IntegerType | INT |
| ShortType | SMALLINT |
| FloatType | FLOAT |
| DoubleType | DOUBLE |
| ByteType | TINYINT |
| StringType | TEXT |
| BinaryType | BLOB |
| DecimalType | DECIMAL |
| BooleanType | TINYINT |
| TimestampType | TIMESTAMP(6) |
| DateType | DATE |
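As an illustration, the following minimal write sketch produces columns typed according to the table above. The endpoint, credentials, and the `exampleDb.users` table are placeholders, not values from this page:

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("typeConversionExample")
  // Hypothetical connection settings; replace with your own cluster details.
  .config("spark.datasource.singlestore.ddlEndpoint", "singlestore-host:3306")
  .config("spark.datasource.singlestore.user", "admin")
  .config("spark.datasource.singlestore.password", "password")
  .getOrCreate()

import spark.implicits._

// LongType, StringType, and BooleanType columns.
val df = Seq((1L, "alice", true), (2L, "bob", false))
  .toDF("id", "name", "active")

// Per the table above, these are written as BIGINT, TEXT, and TINYINT.
df.write
  .format("singlestore")
  .mode("overwrite")
  .save("exampleDb.users") // hypothetical database.table
```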
When reading a SingleStore table as a Spark DataFrame, each SingleStore column type is converted to the corresponding Spark type:
| SingleStore Type | Spark Type |
|---|---|
| TINYINT | ShortType |
| SMALLINT | ShortType |
| INT | IntegerType |
| BIGINT | LongType |
| DOUBLE | DoubleType |
| FLOAT | FloatType |
| DECIMAL | DecimalType |
| TIMESTAMP | TimestampType |
| TIMESTAMP(6) | TimestampType |
| DATE | DateType |
| TEXT | StringType |
| JSON | StringType |
| TIME | TimestampType |
| BIT | BinaryType |
| BLOB | BinaryType |
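Reading the hypothetical table from the write sketch back shows the reverse mapping. Note the asymmetry: a `BooleanType` column written as `TINYINT` comes back as `ShortType`:

```scala
// Read the table back; column types follow the second table above.
val readDf = spark.read
  .format("singlestore")
  .load("exampleDb.users") // hypothetical database.table from the write sketch

// Expected schema, per the read-side mappings:
//   id: long      (BIGINT -> LongType)
//   name: string  (TEXT -> StringType)
//   active: short (TINYINT -> ShortType; the original BooleanType is not restored)
readDf.printSchema()
```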
Data Type Conversion Remarks
- When using the `onDuplicateKeySQL` option, the connector returns an error when writing a null-terminated `StringType` value (i.e., one containing `\0`).
- `DECIMAL` in SingleStore and `DecimalType` in Spark have different maximum scales and precisions. An error occurs if you read or write a table or DataFrame with an unsupported precision or scale. SingleStore's maximum scale for the `DECIMAL` data type is 30, while Spark's maximum scale for `DecimalType` is 38. Similarly, SingleStore's maximum precision is 65, while Spark's maximum precision is 38. See the sketch after this list.
- `TIMESTAMP` in SingleStore supports values from 1000 to 2147483647999. SingleStore treats a null value in a `TIMESTAMP` column as the current time.
- The Avro format does not support writing `TIMESTAMP` and `DATE` types. As a result, the SingleStore Spark Connector currently does not support these types with Avro serialization.
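For example, the `DECIMAL` remark above means a column of `DecimalType(38, 38)` cannot be written as-is, because its scale exceeds SingleStore's maximum of 30. A minimal sketch, assuming a hypothetical DataFrame `decimalDf` with such a column named `amount`, is to cast into the supported range before writing:

```scala
import org.apache.spark.sql.functions.col
import org.apache.spark.sql.types.DecimalType

// "decimalDf" and its "amount" column of DecimalType(38, 38) are hypothetical.
// Scale 38 exceeds SingleStore's maximum DECIMAL scale of 30, so a direct write
// would fail. DecimalType(38, 30) stays within both Spark's limits
// (precision <= 38) and SingleStore's (precision <= 65, scale <= 30).
val writableDf = decimalDf.withColumn("amount", col("amount").cast(DecimalType(38, 30)))
```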
Last modified: July 28, 2022