External Task not found: Camunda API and Kafka

Hi all,
My current architecture is like this:

A Spring Boot application starts a new process via the API.
On the Camunda instance, I have a plugin which publishes every external task to Kafka.
Once the first external task starts, Camunda publishes an event to Kafka and the Spring Boot application tries to fetch and lock this task, but I receive an error because there is no task.

If I open Cockpit, the task is already there, so I’m wondering whether the event is published too fast and the task has not yet been created at the moment the application receives the event.

Could this be possible? Can an event be published before the external task is created?

Thanks!

Hi @pipeline,

It is possible that you publish the event before the transaction that creates the external task has been committed to the database. One solution would be to use a transaction listener to perform the publish, so that you ensure the task has already been committed. Here’s a code snippet of something similar/related that might help you:
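A rough sketch of the idea, assuming the engine’s internal transaction API (`Context`, `TransactionState`, `TransactionListener` live in the engine’s `impl` packages, so this is internal API and may change between versions; `sendToKafka` is a placeholder, not a real method):

```java
// Sketch only: relies on Camunda's *internal* API (org.camunda.bpm.engine.impl.*),
// which is not a stable public contract. Requires the camunda-engine dependency.
import org.camunda.bpm.engine.delegate.DelegateExecution;
import org.camunda.bpm.engine.delegate.ExecutionListener;
import org.camunda.bpm.engine.impl.cfg.TransactionListener;
import org.camunda.bpm.engine.impl.cfg.TransactionState;
import org.camunda.bpm.engine.impl.context.Context;
import org.camunda.bpm.engine.impl.interceptor.CommandContext;

public class CommitAwareKafkaExecutionListener implements ExecutionListener {

  @Override
  public void notify(DelegateExecution execution) {
    // Capture the data now; the DelegateExecution should not be accessed
    // after the current command context has closed.
    final String processInstanceId = execution.getProcessInstanceId();
    final String activityId = execution.getCurrentActivityId();

    // Defer the publish until the surrounding transaction has committed,
    // so a worker reacting to the event will actually find the task.
    Context.getCommandContext()
        .getTransactionContext()
        .addTransactionListener(TransactionState.COMMITTED, new TransactionListener() {
          @Override
          public void execute(CommandContext commandContext) {
            sendToKafka(processInstanceId, activityId);
          }
        });
  }

  private void sendToKafka(String processInstanceId, String activityId) {
    // Placeholder: e.g. producer.send(new ProducerRecord<>(topic, processInstanceId, activityId));
  }
}
```

With `TransactionState.COMMITTED`, the callback only fires after the engine’s database transaction succeeds, so the external task is guaranteed to be visible when the event reaches your worker.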

Best,
Nikola


Hi @nikola.koevski, thanks for your help.
I was looking for a solution and for how to integrate this listener into my Kafka plugin, but I have no idea :sweat_smile:
In my Kafka plugin I have two listeners, an ExecutionListener and a TaskListener:

import org.camunda.bpm.engine.delegate.DelegateExecution;
import org.camunda.bpm.engine.delegate.ExecutionListener;

public class KafkaLoggerExecutionListener implements ExecutionListener {

    public KafkaLoggerExecutionListener() {
    }

    public void notify(DelegateExecution execution) throws Exception {
        // Send message to Kafka
    }
}

and…

import org.camunda.bpm.engine.delegate.DelegateTask;
import org.camunda.bpm.engine.delegate.TaskListener;

public class KafkaLoggerUserTaskListener implements TaskListener {

    public KafkaLoggerUserTaskListener() {
    }

    public void notify(DelegateTask delegateTask) {
        // Send message to Kafka
    }
}

Is there any way to know, at this point, whether the task has been committed?
Thanks!

@pipeline You can make the external task synchronous. That way Camunda will return only after the execution is successfully completed.

https://blog.camunda.com/post/2013/11/bpmn-service-synchronous-asynchronous/

This is actually not possible. All service tasks that are implemented by using external tasks will be asynchronous. The engine’s state will be committed while the task is being worked on by an external worker.

It’s a pity; I just read the post, and I thought it would be simple :sweat_smile:. Anyway, back to my plugin source code: could you point me in the right direction for listening to these commit events inside a Camunda engine plugin? I started from this example (https://github.com/camunda/camunda-bpm-examples/tree/master/process-engine-plugin/bpmn-parse-listener) to create my Kafka publisher plugin, but I don’t know how to listen for the transaction commit event and retrieve some external task data (like the process instance, variables, etc.).
Thanks!

Hi @pipeline,

You can have a look at the code here for how to set up an ExecutionListener to publish an event to Kafka:

You can see how to register the listener here:

In general, I think you can get some nice guidelines from the repository above. I’ve been playing around with MQTT for External Tasks, which is a similar use case to yours.
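As a sketch of how the registration could look, following the bpmn-parse-listener example linked earlier in the thread (class names are illustrative; `KafkaLoggerExecutionListener` is the listener from the post above, and this again uses internal `impl` API):

```java
// Sketch: attach the Kafka-publishing listener to every service task at parse
// time via a process engine plugin, in the style of the bpmn-parse-listener example.
import java.util.ArrayList;
import java.util.List;

import org.camunda.bpm.engine.delegate.ExecutionListener;
import org.camunda.bpm.engine.impl.bpmn.parser.AbstractBpmnParseListener;
import org.camunda.bpm.engine.impl.bpmn.parser.BpmnParseListener;
import org.camunda.bpm.engine.impl.cfg.AbstractProcessEnginePlugin;
import org.camunda.bpm.engine.impl.cfg.ProcessEngineConfigurationImpl;
import org.camunda.bpm.engine.impl.pvm.process.ActivityImpl;
import org.camunda.bpm.engine.impl.pvm.process.ScopeImpl;
import org.camunda.bpm.engine.impl.util.xml.Element;

public class KafkaPublisherPlugin extends AbstractProcessEnginePlugin {

  @Override
  public void preInit(ProcessEngineConfigurationImpl configuration) {
    List<BpmnParseListener> listeners = configuration.getCustomPostBPMNParseListeners();
    if (listeners == null) {
      listeners = new ArrayList<>();
      configuration.setCustomPostBPMNParseListeners(listeners);
    }
    listeners.add(new AbstractBpmnParseListener() {
      @Override
      public void parseServiceTask(Element serviceTaskElement, ScopeImpl scope, ActivityImpl activity) {
        // Service tasks with type="external" are parsed here too, so external
        // tasks get the listener; combine this with a COMMITTED transaction
        // listener inside KafkaLoggerExecutionListener to publish after commit.
        activity.addListener(ExecutionListener.EVENTNAME_START, new KafkaLoggerExecutionListener());
      }
    });
  }
}
```

The plugin is then registered in the engine configuration (e.g. in `bpm-platform.xml` or the Spring Boot configuration) like any other process engine plugin.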

Ping me if you need any more help.

Best,
Nikola

Hi @pipeline ,

My advice: Don’t use Kafka together with Camunda in most cases.

Use a message broker which supports distributed transactions (e.g. RabbitMQ, most JMS message brokers).
We evaluated Kafka in a request/response microservice pattern with Camunda, and it was a pain.
It led to (hidden Spring) retries (because the response arrived faster than the database commit) and put a high load on the database, which decreased throughput by more than 90%.

A message broker with distributed transaction support ensures the message only becomes visible to consumers once the database transaction has been committed.
The message is also rolled back if the transaction fails.

Best regards,

Roberto