Pipeline from Kafka (Confluent) gets created but doesn't ingest data

CREATE PIPELINE AS LOAD DATA KAFKA 'host:9092/'
CONFIG '{"sasl.username": "",
"sasl.mechanism": "PLAIN",
"security.protocol": "",
"ssl.ca.location": "/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem"}'
CREDENTIALS '{"sasl.password": ""}'
INTO TABLE
FORMAT JSON;

I created this pipeline successfully, but when I check the table in SingleStore, it does not have any data.
This is the structure of the data in a Kafka message:
{ "transaction_id": "",
"date": "",
"account_id": "",
"transaction_type": "",
"amount": ,
"currency": "",
"status": "" }
When the data is extracted, what format is it extracted in? Is it being serialized?
Does specifying FORMAT JSON affect how the data is extracted, or does the conversion to JSON happen on the target (SingleStore) end? Can you help with the flow?

Did you start the pipeline after you created it?

START PIPELINE pipeline_name;
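
You can also confirm the pipeline's state with SHOW PIPELINES, which lists each pipeline and whether it is running or stopped:

SHOW PIPELINES;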

You can check the errors using:

SELECT * FROM information_schema.PIPELINES_ERRORS;
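
If you have several pipelines, you can filter that view down to the one in question. A sketch, where the pipeline name is a placeholder:

SELECT PIPELINE_NAME, ERROR_TYPE, ERROR_MESSAGE
FROM information_schema.PIPELINES_ERRORS
WHERE PIPELINE_NAME = 'pipeline_name';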

You are missing the JSON-to-column mapping in the pipeline definition.

e.g.:

CREATE PIPELINE teams_list AS
LOAD DATA FS '<file path>/jtinsert.json'
INTO TABLE teams
FORMAT JSON
(basketball <- teams::basketball,
baseball <- teams::baseball,
hockey <- teams::hockey);
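
Applied to the message structure you posted, the mapping would look roughly like this. This is just a sketch: the pipeline, topic, and table names are made up, it assumes a transactions table whose columns match the JSON keys, and you would keep your CONFIG and CREDENTIALS clauses as before (omitted here for brevity). The date column is backticked since DATE is a SQL keyword.

CREATE PIPELINE transactions_pipeline AS
LOAD DATA KAFKA 'host:9092/transactions-topic'
INTO TABLE transactions
FORMAT JSON
(transaction_id <- transaction_id,
`date` <- date,
account_id <- account_id,
transaction_type <- transaction_type,
amount <- amount,
currency <- currency,
status <- status);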

Yes, it was started, but the error turned out to be in the way I was getting at the JSON values. Using the example you showed, I was able to rectify my issue. It seems to be fetching the correct values now.
Thank you for your support.

Thank you. I am able to see the data in the respective columns now.