PySpark connector - JDK 11 support

We are using PySpark to write a Spark DataFrame into SingleStore.
It looks like the SingleStore PySpark connector only supports JDK 8, which is already deprecated.
We just wanted to understand whether you have any plan to support JDK 11, and if so, whether there is an ETA for it.
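For context, this is roughly how we write the DataFrame. A minimal sketch, assuming the connector's `singlestore` data source name and the `ddlEndpoint`/`user`/`password` options apply to the version we are on; the endpoint, credentials, and table names below are placeholders, and the exact option names should be checked against the connector docs:

```python
from pyspark.sql import SparkSession

# Placeholder endpoint and credentials -- replace with your own.
spark = (
    SparkSession.builder
    .appName("singlestore-write-example")
    .config("spark.datasource.singlestore.ddlEndpoint", "singlestore-host:3306")
    .config("spark.datasource.singlestore.user", "app_user")
    .config("spark.datasource.singlestore.password", "app_password")
    .getOrCreate()
)

df = spark.createDataFrame(
    [(1, "alpha"), (2, "beta")],
    ["id", "name"],
)

# Write the DataFrame into a SingleStore table ("db.table");
# "overwrite" replaces any existing rows.
(
    df.write
    .format("singlestore")
    .mode("overwrite")
    .save("example_db.example_table")
)
```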

Hi @baskar.sks, what version of the Spark connector are you using?

4.0.0-spark-3.2.0

spark-connector-4-0-0-3-2-0.jar

https://spark-packages.org/package/memsql/memsql-spark-connector
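We currently ship the jar by hand; a sketch of pulling the same release in via Spark's package mechanism instead is below. The Maven coordinates are an assumption on our side (the memsql-branded builds linked above may use different ones), so verify them before relying on this:

```python
from pyspark.sql import SparkSession

# Resolve the connector from Maven at session startup instead of
# copying the jar manually. Coordinates are assumed, not confirmed.
spark = (
    SparkSession.builder
    .appName("connector-package-example")
    .config(
        "spark.jars.packages",
        "com.singlestore:singlestore-spark-connector_2.12:4.0.0-spark-3.2.0",
    )
    .getOrCreate()
)
```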

JDK 11 should be supported. Are you receiving errors?