Kafka pipeline that updates existing rows in a table

I have a Kafka topic whose messages describe updates that need to be applied to specific rows in a table (a columnstore table).
I want to create a pipeline that reads from this topic and updates the table accordingly.
Example of a message:

{
  "accountId": "my-account",
  "id": "my object",
  "columnToUpdate": "some_column",
  "valueToUpdate": "some_value",
  "table": "my-table"
}

I want to be able in the end to execute some kind of update query in this style:

UPDATE $table SET $columnToUpdate = $valueToUpdate WHERE id = $id AND account_id = $accountId

I wanted to avoid using a loop in a stored procedure: from what I've seen, a batch can take some time to complete, and I need these updates applied with low latency once the message is consumed.

Thanks!

To create a Kafka pipeline that updates rows in a table based on messages from a topic, use a Kafka Connect JDBC sink connector. Configure the connector to consume the messages and execute the update query dynamically, using fields like accountId, id, columnToUpdate, and valueToUpdate.

I’d recommend using “pipelines into stored procedures” for this. If that doesn’t work or isn’t fast enough, write an application that listens to the Kafka topic, reads the messages, and runs the updates over a connection to SingleStore. The app could be multi-threaded or distributed, and use batching, if you need to drive very high throughput.
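A rough sketch of the pipeline-into-stored-procedure approach, assuming SingleStore's procedural SQL with EXECUTE IMMEDIATE; the broker address, topic name, procedure name, and pipeline name are all hypothetical placeholders:

```sql
-- Sketch only: a procedure that receives each pipeline batch and builds
-- the UPDATE statement dynamically from the message fields.
CREATE OR REPLACE PROCEDURE apply_updates(
    batch QUERY(accountId TEXT, id TEXT, columnToUpdate TEXT,
                valueToUpdate TEXT, tableName TEXT))
AS
BEGIN
  FOR r IN COLLECT(batch) LOOP
    -- QUOTE() escapes the literal values. Table and column names cannot be
    -- bound as parameters, so in practice they should be validated against
    -- a whitelist before being concatenated into the statement.
    EXECUTE IMMEDIATE CONCAT(
      'UPDATE ', r.tableName,
      ' SET ', r.columnToUpdate, ' = ', QUOTE(r.valueToUpdate),
      ' WHERE id = ', QUOTE(r.id),
      ' AND account_id = ', QUOTE(r.accountId));
  END LOOP;
END;

-- Point the pipeline at the procedure; the mappings pull each column
-- from the corresponding JSON key in the Kafka message.
CREATE PIPELINE update_pipeline AS
LOAD DATA KAFKA 'kafka-host:9092/row-updates'
INTO PROCEDURE apply_updates
FORMAT JSON (
  accountId <- accountId,
  id <- id,
  columnToUpdate <- columnToUpdate,
  valueToUpdate <- valueToUpdate,
  tableName <- `table`);
```

Note that the loop here runs per batch, not per message, so latency is bounded by the pipeline batch interval rather than by a long-running cursor.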

Thanks for the replies. My question was more about how to construct a pipeline with dynamic SQL based on the message I get from Kafka. For now, as I couldn't get the dynamic SQL to work, I created a simple stored procedure to do it.
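For anyone landing here later, a non-dynamic stored procedure like the one described above could look roughly like this, assuming a single fixed target table (here called my_table, hypothetical) and a known, fixed set of updatable columns branched on with IF/ELSIF:

```sql
-- Sketch only: static UPDATEs per known column instead of dynamic SQL.
CREATE OR REPLACE PROCEDURE apply_updates_static(
    batch QUERY(accountId TEXT, id TEXT,
                columnToUpdate TEXT, valueToUpdate TEXT))
AS
BEGIN
  FOR r IN COLLECT(batch) LOOP
    IF r.columnToUpdate = 'some_column' THEN
      UPDATE my_table SET some_column = r.valueToUpdate
      WHERE id = r.id AND account_id = r.accountId;
    ELSIF r.columnToUpdate = 'other_column' THEN
      UPDATE my_table SET other_column = r.valueToUpdate
      WHERE id = r.id AND account_id = r.accountId;
    END IF;
  END LOOP;
END;
```

The trade-off is that every updatable column (and table) needs its own branch, but there is no string concatenation and no risk of SQL injection through the message fields.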