Concurrent execution of multiple jobs from the same process with Spring Boot

Hi,

I have concurrency issues in my Camunda setup.
I have a simple process with three service tasks. For debugging purposes, all three tasks refer to the same class, which implements a dummy method that simply makes the thread sleep for 10 seconds.
When I start multiple instances of this process, they are all “stuck” on the first task. My debugging output tells me that one job at a time executes the dummy task and proceeds through the remaining two tasks until it is done. Only when the first job is done does the second one execute each task.
The desired behaviour would be the parallel execution of all the started jobs.
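For reference, the dummy task's body could look roughly like this. This is a sketch, not the actual class from the project (class and method names are made up, and the counting-with-sleeps behaviour is an assumption based on the logs later in this thread); in the real setup this logic would run inside a Camunda JavaDelegate's execute method.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical stand-in for the shared dummy task: it produces one log
// line per count and sleeps in between, which is exactly the kind of
// waiting time where another job could be scheduled in.
public class DummyTaskLogic {

    public static List<String> run(String processId, String taskName, long pauseMs)
            throws InterruptedException {
        List<String> lines = new ArrayList<>();
        for (int count = 1; count <= 3; count++) {
            lines.add(processId + " " + taskName + " count_" + count);
            Thread.sleep(pauseMs); // simulate work between counts
        }
        return lines;
    }
}
```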

My setup:
I am running a Spring Boot project with the Camunda Spring Boot Starter.

What did I possibly do wrong? Is there a way to configure this behaviour?

Thank you for your help
Sebastian

Hi Sebastian,

See the Concurrent Job Execution section in the docs.

Cheers,
Thorben


Hi Thorben,

thanks for your quick reply.
Setting the tasks to non-exclusive had some effect but didn’t solve my problem.
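For context, setting a task to non-exclusive means a BPMN snippet along these lines (a sketch: the element id, name, and package are made up; `camunda:asyncBefore` is assumed so the task runs as a job at all):

```xml
<serviceTask id="task1" name="Task1"
             camunda:class="org.example.DefaultTask"
             camunda:asyncBefore="true"
             camunda:exclusive="false" />
```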
Assume every task prints the process ID and the current task name, and counts to 3.
The original output was:

process_1 task_1 count_1
process_1 task_1 count_2
process_1 task_1 count_3
process_1 task_2 count_1
process_1 task_2 count_2
process_1 task_2 count_3
process_1 task_3 count_1
process_1 task_3 count_2
process_1 task_3 count_3
process_2 task_1 count_1
process_2 task_1 count_2
process_2 task_1 count_3
process_2 task_2 count_1
process_2 task_2 count_2
process_2 task_2 count_3
process_2 task_3 count_1
process_2 task_3 count_2
process_2 task_3 count_3

Every job finished all its tasks in a row before another job was allowed to execute its tasks.

Setting the tasks to non-exclusive resulted in this output:

process_1 task_1 count_1
process_1 task_1 count_2
process_1 task_1 count_3
process_2 task_1 count_1
process_2 task_1 count_2
process_2 task_1 count_3
process_1 task_2 count_1
process_1 task_2 count_2
process_1 task_2 count_3
process_2 task_2 count_1
process_2 task_2 count_2
process_2 task_2 count_3
process_1 task_3 count_1
process_1 task_3 count_2
process_1 task_3 count_3
process_2 task_3 count_1
process_2 task_3 count_2
process_2 task_3 count_3

Now the processes are running “in parallel”, but every task is still executed without interruption.

The desired execution pattern would be:

process_1 task_1 count_1
process_2 task_1 count_1
process_1 task_1 count_2
process_1 task_1 count_3
process_2 task_1 count_2
process_1 task_2 count_1
process_2 task_1 count_3
process_2 task_2 count_1
process_2 task_2 count_2
process_1 task_2 count_2
process_1 task_2 count_3
process_1 task_3 count_1
process_1 task_3 count_2
process_2 task_2 count_3
process_2 task_3 count_1
process_2 task_3 count_2
process_1 task_3 count_3
process_2 task_3 count_3

Tasks should be able to run completely in parallel and not be blocked by another instance's execution.

I hope my example explains what I want to accomplish.
Am I still misreading the docs entry about Exclusive Jobs?

Best regards
Sebastian

Hi Sebastian,

Your log output does not prove your point, since it does not say which things run in parallel. Perhaps you should log the start and end times of your activities, or obtain them from the historic activity instances table. That should give a clearer picture. It would also be helpful if you could put your code and process up on GitHub.

On a somewhat related side note: job execution works with a thread pool. If all execution threads are currently executing activities (e.g. because they are all sleeping), other activities won’t be processed. The default pool size is larger than three, though.
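The thread pool effect can be sketched engine-free (class and method names here are illustrative, not Camunda API): sleeping "jobs" serialize on a single-threaded pool but overlap on a larger one, which is what an exhausted or too-small job executor pool causes.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class PoolSizeDemo {

    // Submits `jobs` sleeping tasks to a pool of `poolSize` threads and
    // returns the wall-clock time in milliseconds until all are done.
    public static long runJobs(int poolSize, int jobs, long sleepMs) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(poolSize);
        List<Future<?>> futures = new ArrayList<>();
        long start = System.nanoTime();
        for (int i = 0; i < jobs; i++) {
            futures.add(pool.submit(() -> {
                try {
                    Thread.sleep(sleepMs); // simulate a blocking activity
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
            }));
        }
        for (Future<?> f : futures) {
            f.get(); // wait for completion
        }
        pool.shutdown();
        return (System.nanoTime() - start) / 1_000_000;
    }

    public static void main(String[] args) throws Exception {
        // One thread: roughly 3 x 200 ms; three threads: roughly 200 ms.
        System.out.println("pool of 1: " + runJobs(1, 3, 200) + " ms");
        System.out.println("pool of 3: " + runJobs(3, 3, 200) + " ms");
    }
}
```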

Cheers,
Thorben

Hi Thorben,

here is what my logger prints out if I add time logging:

Starting process 39 task Task1 Thu Nov 10 14:58:10 CET 2016
process: 39 task: Task1 count: 1
process: 39 task: Task1 count: 2
process: 39 task: Task1 count: 3
Ending process 39 task Task1 Thu Nov 10 14:58:13 CET 2016
Starting process 43 task Task1 Thu Nov 10 14:58:13 CET 2016
process: 43 task: Task1 count: 1
process: 43 task: Task1 count: 2
process: 43 task: Task1 count: 3
Ending process 43 task Task1 Thu Nov 10 14:58:16 CET 2016
Starting process 48 task Task1 Thu Nov 10 14:58:16 CET 2016
process: 48 task: Task1 count: 1
process: 48 task: Task1 count: 2
process: 48 task: Task1 count: 3
Ending process 48 task Task1 Thu Nov 10 14:58:19 CET 2016
Starting process 39 task Task2 Thu Nov 10 14:58:19 CET 2016
process: 39 task: Task2 count: 1
process: 39 task: Task2 count: 2
process: 39 task: Task2 count: 3
Ending process 39 task Task2 Thu Nov 10 14:58:22 CET 2016
Starting process 43 task Task2 Thu Nov 10 14:58:22 CET 2016
process: 43 task: Task2 count: 1
process: 43 task: Task2 count: 2
process: 43 task: Task2 count: 3
Ending process 43 task Task2 Thu Nov 10 14:58:25 CET 2016
Starting process 48 task Task2 Thu Nov 10 14:58:25 CET 2016
process: 48 task: Task2 count: 1
process: 48 task: Task2 count: 2
process: 48 task: Task2 count: 3
Ending process 48 task Task2 Thu Nov 10 14:58:28 CET 2016

As you can see, every task is executed in one piece, and the engine does not use the waiting time between the counts to start another task.

Best regards
Sebastian

Okay, thanks for the additional details. Would you mind sharing your code?

Thanks,
Thorben

Hi Thorben,

I am not able to share the whole project, but I have attached the process definition file and the implementation of the task in question.

example.bpmn (4.0 KB)

DefaultTask.java.txt (1.0 KB)

I hope this helps in understanding the problem.

Can you please also add the process engine configuration and the code that starts the process instances?

The process is started by this code:

ExampleProcessServiceImpl.java.txt (1.3 KB)

There is no custom process engine configuration so far.

That means you deploy the process model and Java code as part of a process application to a distribution you downloaded from camunda.org? Never mind, you use the Spring Boot Starter.

Edit: Which Camunda version are you using?

I tried executing your model in a unit test with Camunda 7.5.0 (plain, no Spring Boot). You can find the code here: https://github.com/ThorbenLindhauer/camunda-engine-unittest/tree/forum-1956. It gives me the following log output:

Nov 10, 2016 5:42:47 PM org.slf4j.impl.JCLLoggerAdapter info
INFORMATION: process: 8 task: Task1 count: 1
Nov 10, 2016 5:42:47 PM org.slf4j.impl.JCLLoggerAdapter info
INFORMATION: process: 12 task: Task1 count: 1
Nov 10, 2016 5:42:47 PM org.slf4j.impl.JCLLoggerAdapter info
INFORMATION: process: 16 task: Task1 count: 1
Nov 10, 2016 5:42:48 PM org.slf4j.impl.JCLLoggerAdapter info
INFORMATION: process: 12 task: Task1 count: 2
Nov 10, 2016 5:42:48 PM org.slf4j.impl.JCLLoggerAdapter info
INFORMATION: process: 16 task: Task1 count: 2
Nov 10, 2016 5:42:48 PM org.slf4j.impl.JCLLoggerAdapter info
INFORMATION: process: 8 task: Task1 count: 2
Nov 10, 2016 5:42:49 PM org.slf4j.impl.JCLLoggerAdapter info
INFORMATION: process: 8 task: Task1 count: 3
Nov 10, 2016 5:42:49 PM org.slf4j.impl.JCLLoggerAdapter info
INFORMATION: process: 12 task: Task1 count: 3
Nov 10, 2016 5:42:49 PM org.slf4j.impl.JCLLoggerAdapter info
INFORMATION: process: 16 task: Task1 count: 3
Nov 10, 2016 5:42:49 PM org.slf4j.impl.JCLLoggerAdapter info
INFORMATION: Ending process 8 task Task1 Thu Nov 10 17:42:49 CET 2016
Nov 10, 2016 5:42:49 PM org.slf4j.impl.JCLLoggerAdapter info
INFORMATION: Ending process 12 task Task1 Thu Nov 10 17:42:49 CET 2016
Nov 10, 2016 5:42:49 PM org.slf4j.impl.JCLLoggerAdapter info
INFORMATION: Ending process 16 task Task1 Thu Nov 10 17:42:49 CET 2016
Nov 10, 2016 5:42:49 PM org.slf4j.impl.JCLLoggerAdapter info
INFORMATION: Starting process 12 task Task2 Thu Nov 10 17:42:49 CET 2016
Nov 10, 2016 5:42:49 PM org.slf4j.impl.JCLLoggerAdapter info
INFORMATION: Starting process 8 task Task2 Thu Nov 10 17:42:49 CET 2016
Nov 10, 2016 5:42:49 PM org.slf4j.impl.JCLLoggerAdapter info
INFORMATION: Starting process 16 task Task2 Thu Nov 10 17:42:49 CET 2016
Nov 10, 2016 5:42:50 PM org.slf4j.impl.JCLLoggerAdapter info
INFORMATION: process: 12 task: Task2 count: 1
Nov 10, 2016 5:42:50 PM org.slf4j.impl.JCLLoggerAdapter info
INFORMATION: process: 16 task: Task2 count: 1
Nov 10, 2016 5:42:50 PM org.slf4j.impl.JCLLoggerAdapter info
INFORMATION: process: 8 task: Task2 count: 1
Nov 10, 2016 5:42:51 PM org.slf4j.impl.JCLLoggerAdapter info
INFORMATION: process: 12 task: Task2 count: 2
Nov 10, 2016 5:42:51 PM org.slf4j.impl.JCLLoggerAdapter info
INFORMATION: process: 8 task: Task2 count: 2
Nov 10, 2016 5:42:51 PM org.slf4j.impl.JCLLoggerAdapter info
INFORMATION: process: 16 task: Task2 count: 2
Nov 10, 2016 5:42:52 PM org.slf4j.impl.JCLLoggerAdapter info
INFORMATION: process: 12 task: Task2 count: 3
Nov 10, 2016 5:42:52 PM org.slf4j.impl.JCLLoggerAdapter info
INFORMATION: Ending process 12 task Task2 Thu Nov 10 17:42:52 CET 2016
Nov 10, 2016 5:42:52 PM org.slf4j.impl.JCLLoggerAdapter info
INFORMATION: process: 16 task: Task2 count: 3
Nov 10, 2016 5:42:52 PM org.slf4j.impl.JCLLoggerAdapter info
INFORMATION: Ending process 16 task Task2 Thu Nov 10 17:42:52 CET 2016
Nov 10, 2016 5:42:52 PM org.slf4j.impl.JCLLoggerAdapter info
INFORMATION: process: 8 task: Task2 count: 3

This indicates true parallel execution and is similar to your expected pattern, if I understand it correctly. So I guess something in your engine setup is not correct; however, I am not familiar with the Spring Boot starter and couldn’t find anything at a quick glance. Perhaps @jangalinski, maintainer of the extension, can help us out.

It seems like the DefaultJobConfiguration initializes a ThreadPoolTaskExecutor with a default pool size of 1. That means that although execution is asynchronous, the pool size makes it effectively sequential.
You can override the task executor by providing your own TaskExecutor bean.

At least in theory - I tried to do so on the current 2.0.0-SNAPSHOT but got a failure. Please try it out and report an issue if you do not succeed.


Hi @jangalinski,
this seems to have worked for me:

import org.springframework.context.annotation.Bean;
import org.springframework.core.task.TaskExecutor;
import org.springframework.scheduling.concurrent.ThreadPoolTaskExecutor;

@Bean
public TaskExecutor taskExecutor() {
  // Replace the default single-threaded executor with a pool of 10 threads
  // so that jobs from different process instances can run concurrently.
  ThreadPoolTaskExecutor threadPoolTaskExecutor = new ThreadPoolTaskExecutor();
  threadPoolTaskExecutor.setCorePoolSize(10);
  return threadPoolTaskExecutor;
}

Thanks for the hint!
