Allow only some columns in the SingleStore Kafka connector

I am using Kafka to send CDC data, collected by Debezium, to a SingleStore database, with this Kafka Connect configuration:

{
  "name": "my-connector",
  "config": {
    "connector.class": "com.singlestore.kafka.SingleStoreSinkConnector",
    "tasks.max": "1",

    "transforms": "dropPrefix,unwrap",
    "transforms.dropPrefix.type": "org.apache.kafka.connect.transforms.RegexRouter",
    "transforms.dropPrefix.regex": "dbserver1.inventory.(.*)",
    "transforms.dropPrefix.replacement": "$1",
    "transforms.unwrap.type": "io.debezium.transforms.ExtractNewRecordState",
    "transforms.unwrap.delete.handling.mode": "rewrite",
    "transforms.unwrap.add.fields": "ts_ms",

    "errors.tolerance": "all",
    "errors.log.enable": "true",
    "errors.log.include.messages": "true",

    "topics": "dbserver1.inventory.addresses",

    "connection.ddlEndpoint": "memsql:3306",
    "connection.database": "test",
    "connection.user": "root",
    "connection.password": "password",

    "insert.mode": "upsert",
    "tableKey.primary.keyName": "id",
    "fields.whitelist": "id,city",

    "auto.create": "true",
    "auto.evolve": "true",

    "singlestore.metadata.allow": true,
    "singlestore.metadata.table": "kafka_connect_transaction_metadata"
  }
}

I want the SingleStore database to receive and save only the id and city columns, but apparently

        "fields.whitelist": "id,city",

does not work in this connector the way it does in the JDBC sink connector. How can I manage this?
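One possible workaround (a sketch I have not verified against this connector, but it uses only the stock Kafka Connect ReplaceField SMT, which runs on the Connect worker before the record ever reaches the sink) would be to filter the fields in the transform chain instead of in the sink:

```json
{
    "transforms": "dropPrefix,unwrap,filterFields",

    "transforms.filterFields.type": "org.apache.kafka.connect.transforms.ReplaceField$Value",
    "transforms.filterFields.include": "id,city"
}
```

Notes on this sketch: `filterFields` is a hypothetical transform name I chose; it should come after `unwrap` so it operates on the flattened record, not on the Debezium envelope. On Kafka Connect versions before 2.8 the config key is `whitelist` rather than `include`. Also, since `transforms.unwrap.add.fields` adds a `__ts_ms` field to the value, that field would be dropped by the filter unless you list it in `include` as well.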

Hi smartdnshail3! :wave:
Sorry you are experiencing this. Are you running on the managed or the self-hosted service, and which version?

Hi @smartdnshail3,
Can you also please provide some details on your Kafka deployment? Is it self-deployed and self-managed, or Confluent Kafka?
