Pipeline with multiple brokers

Hello guys!

I’m here to ask how to connect a pipeline to multiple brokers. I have 3 brokers running.

Below is my configuration to create a pipeline with 1 broker. What config do I need to use in order to work with 3 brokers?

CREATE OR REPLACE PIPELINE test_pipeline
AS LOAD DATA KAFKA 'broker:29092/test_topic'
INTO PROCEDURE insert_customer;

I’m getting the following error:
Leaf Error (127.0.0.1:3307): Cannot extract data for pipeline. Invalid message offset, consumed out of order

Best regards

Hello @Carlao, thanks for trying out Kafka pipelines.

Can you verify whether you are using a transactional producer on the topic? I.e., does your producer ever call beginTransaction(), commitTransaction(), or abortTransaction()?

Hello @m_k

Thanks for the answer.

I’m using a microservice with a Kafka Streams processor (consumer/producer) with exactly-once semantics (EOS), provided by the Spring Cloud framework. This microservice publishes the event to the Kafka topic, and then the pipeline extracts the data from it.
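For context, this is roughly how EOS is turned on in my setup. Enabling the exactly-once processing guarantee makes the embedded Kafka Streams producer transactional under the hood, so it does use transactions even though my code never calls beginTransaction() directly (the property path below is the usual Spring Cloud Stream Kafka Streams binder form; adjust to your configuration):

```
# application.properties (sketch) -- EOS uses Kafka transactions internally
spring.cloud.stream.kafka.streams.binder.configuration.processing.guarantee=exactly_once_v2
```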

Best regards

Hello @m_k

I don’t know if my answer helped, but do you need any extra information to diagnose my problem?

I already tried wrapping the broker list in [ ] and { }, and without either of those, for example:

Best regards 🙂

Hello @Carlao,

I do not believe the issue is with multiple brokers, but rather with the producer using transactional commits.

SingleStore pipelines manage their own Kafka topic partition offsets for EOS, and I believe you are running into some logic I am revising right now.
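As for the multi-broker part of your question: the Kafka endpoint in CREATE PIPELINE accepts a comma-separated broker list before the topic name. A sketch, assuming placeholder host names (broker1/broker2/broker3 stand in for your actual brokers):

```
-- broker1/broker2/broker3 are placeholders; substitute your real hosts/ports
CREATE OR REPLACE PIPELINE test_pipeline
AS LOAD DATA KAFKA 'broker1:29092,broker2:29092,broker3:29092/test_topic'
INTO PROCEDURE insert_customer;
```

Any broker in the list can be used to bootstrap; the pipeline discovers the rest of the cluster from there, so you do not need brackets or braces around the list.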

Taking this conversation to DM.
