Variable Management with Camunda



At work we have been using Camunda for a while now.
One thing I have always thought about is the handling of execution variables.

A month ago I started a project to solve some problems I came across,
for example validation of execution variables, or writing ten variables in one JavaDelegate.
I tried to solve this by adding Bean Validation and a “JPA-like API”.
I pushed it to this repository.

I’m currently at a state where I can actually use the API, and it works.
Now I have some questions:

  • What’s your opinion on this API?
  • Can someone give me feedback?
  • Should this become a Camunda extension?

Thanks in advance!
Hope this is the right place to ask these questions :slight_smile:


I’m sorry, but I fail to see the added value.

What does this add over how Camunda already stores and retrieves serialized objects (besides bean validation)?

VariableMap variablesTyped = delegateTask.getVariablesTyped(true);
MyCompany myCompany = variablesTyped.getValue(VariableConstants.MY_COMPANY, MyCompany.class);


Ok I see.

This should be the same as:

MyCompany myCompany = variablesTyped.getValue("myCompany", MyCompany.class);

So what happens here is that you ask for your object MyCompany stored in the execution.
The object gets deserialized and stored in myCompany.

If you do this:

MyCompany myCompany = manager.get(MyCompany.class);

MyCompany.class is scanned. If it is annotated with @ObjectValue(storeFields = true), the manager doesn’t look for a single variable myCompany but for each field declared in MyCompany.class.

This gives you the ability to store an execution’s state as separate primitive variables while still retrieving them as objects.
It also lets you build several objects, each representing a different scope of your execution variables.
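To illustrate the mechanism: here is a minimal, self-contained sketch of what such a manager might do under the hood. Note that @ObjectValue and the manager belong to the project, not to Camunda core, so the annotation, the get method, and the Map standing in for the execution are all simplified stand-ins; the real API may differ.

```java
import java.lang.reflect.Field;
import java.util.HashMap;
import java.util.Map;

public class FieldScanSketch {

    // Stand-in for the project's @ObjectValue(storeFields = true) annotation.
    @interface ObjectValue { boolean storeFields(); }

    // A scope object whose fields map to individual execution variables.
    public static class MyCompany {
        public String name;
        public int employeeCount;
    }

    // Roughly what manager.get(MyCompany.class) might do: instead of
    // deserializing one serialized "myCompany" variable, read one
    // primitive execution variable per declared field.
    public static <T> T get(Class<T> type, Map<String, Object> executionVariables)
            throws Exception {
        T instance = type.getDeclaredConstructor().newInstance();
        for (Field field : type.getDeclaredFields()) {
            field.setAccessible(true);
            field.set(instance, executionVariables.get(field.getName()));
        }
        return instance;
    }

    public static void main(String[] args) throws Exception {
        // The execution holds only primitive variables, one per field.
        Map<String, Object> executionVariables = new HashMap<>();
        executionVariables.put("name", "ACME");
        executionVariables.put("employeeCount", 42);

        MyCompany company = get(MyCompany.class, executionVariables);
        System.out.println(company.name + " / " + company.employeeCount);
        // prints: ACME / 42
    }
}
```

The upside of this shape is that each field stays an individual, queryable engine variable while your delegate code only ever deals with one typed object.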

I hope that helps!


OK, I see the difference now, thank you for the update!

This is really interesting, as these separate fields can be used in the Tasklist filters while complex objects cannot (or at least not in version 7.5, which I use).

I’m just not sure whether over-using this feature could have long-term negative effects on the performance of the engine, as it might clutter the history tables when the history level is set to full. But it’s a nice feature nonetheless.


No problem.

I’m not quite sure whether this works on newer versions. The documentation seems to be the same.

That could be a problem, but the history tables should be cleared every year, as far as I know.
Otherwise this could affect performance. But you could write a plugin that hooks into the history event stream and sends each event to another service which manages everything. We did this; it works well and is pretty fast.
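As a rough, self-contained sketch of that forwarding pattern: in a real engine plugin you would implement Camunda’s HistoryEventHandler interface, but the handler, event, and “external service” below are simplified stand-ins so the example runs on its own.

```java
import java.util.ArrayList;
import java.util.List;

public class HistoryForwarderSketch {

    // Simplified stand-in for a Camunda history event.
    public static class HistoryEvent {
        public final String type;
        public HistoryEvent(String type) { this.type = type; }
    }

    // Stand-in for the external service that manages the history data.
    public static class ExternalHistoryService {
        public final List<String> received = new ArrayList<>();
        public void send(HistoryEvent event) { received.add(event.type); }
    }

    // Roughly what a custom history event handler does: instead of
    // writing to the engine's history tables, forward each event
    // to the external service.
    public static class ForwardingHandler {
        private final ExternalHistoryService service;
        public ForwardingHandler(ExternalHistoryService service) { this.service = service; }
        public void handleEvent(HistoryEvent event) { service.send(event); }
    }

    public static void main(String[] args) {
        ExternalHistoryService service = new ExternalHistoryService();
        ForwardingHandler handler = new ForwardingHandler(service);
        handler.handleEvent(new HistoryEvent("TASK_INSTANCE_CREATE"));
        System.out.println(service.received);
        // prints: [TASK_INSTANCE_CREATE]
    }
}
```

This keeps the engine’s own history tables lean while the external service can index and retain events however it likes.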

I’m currently working on the sample process if you want to check it out.

Thank you! :smiley: