Error trying to use the spark-connector with pyspark

Hello, I have been trying to write to SingleStore with the Spark connector for several days. I downloaded and compiled the JDBC driver, added it to the Java classpath, and installed the Spark connector with the following command:
$SPARK_HOME/bin/pyspark --packages com.singlestore:singlestore-spark-connector_2.12:4.1.1-spark-3.3.0

My $SPARK_HOME variable is set to ~/spark_home. Here is the code I tried to execute (attached as a screenshot):
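In case the screenshot does not render, here is a minimal sketch of the kind of write I am attempting; the endpoint, credentials, and the testdb.testtable name are placeholders, not my real values:

```python
from pyspark.sql import SparkSession

# Session-level SingleStore connection options.
# Endpoint, user, password, and database/table below are placeholders.
spark = (
    SparkSession.builder
    .appName("singlestore-write-test")
    .config("spark.datasource.singlestore.ddlEndpoint", "svchost:3306")
    .config("spark.datasource.singlestore.user", "admin")
    .config("spark.datasource.singlestore.password", "password")
    .getOrCreate()
)

# A tiny sample DataFrame to write.
df = spark.createDataFrame([(1, "a"), (2, "b")], ["id", "value"])

# Write through the connector's "singlestore" data source;
# save() takes a "database.table" target.
df.write \
    .format("singlestore") \
    .mode("append") \
    .save("testdb.testtable")
```

When running inside the pyspark shell, a SparkSession already exists, so the same spark.datasource.singlestore.* options can instead be passed with --conf on the command line.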

And this is the error I get:

Thanks.

Hi Fredy! Welcome to our community forums :wave:

Sorry to hear that you are receiving this error. We’re happy to help!
Which service are you running (managed or self-hosted)?
Which version are you on?

Hi Fredy,
Which command did you use to execute the code from the screenshot?