How can I write JSON into a table from the Scala Spark connector?
Data Type Conversions · SingleStore Documentation
What DataFrame type should be used for writing JSON into a table?
Hi kshastry,
You can do this by creating the table before writing the DataFrame, using StringType for the JSON column and setting the "overwriteBehavior" option to "truncate".
Here is an example:
import org.apache.spark.sql.{Row, SaveMode}
import org.apache.spark.sql.types.{StringType, StructField, StructType}
import com.singlestore.spark.DefaultSource
import com.singlestore.spark.SQLHelper._

// Create the target table with a JSON column before writing.
spark.executeSinglestoreQuery("CREATE TABLE jsonTable(j JSON)")

// Build a DataFrame whose JSON values are plain strings.
val rows = List("[]", "{}", "{\"v\":null}", "{\"x\":\"foo\",\"y\":null,\"z\":[]}")
val schema = StructType(Seq(StructField("j", StringType, nullable = true)))
val df = spark.createDataFrame(spark.sparkContext.parallelize(rows.map(Row(_))), schema)

df.write
  .format(DefaultSource.SINGLESTORE_SOURCE_NAME)
  .mode(SaveMode.Overwrite)
  .option("overwriteBehavior", "truncate")
  .save("testdb.jsonTable")
And here is the result:
MySQL [testdb]> show columns from jsonTable;
+-------+------+------+------+---------+-------+
| Field | Type | Null | Key | Default | Extra |
+-------+------+------+------+---------+-------+
| j | JSON | YES | | NULL | |
+-------+------+------+------+---------+-------+
1 row in set (0.000 sec)
MySQL [testdb]> select * from jsonTable;
+-----------------------------+
| j |
+-----------------------------+
| [] |
| {} |
| {"v":null} |
| {"x":"foo","y":null,"z":[]} |
+-----------------------------+
4 rows in set (0.083 sec)
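To double-check from Spark rather than the MySQL client, you can also read the table back through the connector. This is a minimal sketch assuming the same testdb.jsonTable table and an already-configured SparkSession; JSON columns are surfaced as strings on read:

```scala
// Read the table back through the connector ("singlestore" is the
// connector's registered format name).
val readDf = spark.read
  .format("singlestore")
  .load("testdb.jsonTable")

// The JSON column comes back as a string type.
readDf.printSchema()
readDf.show(truncate = false)
```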