Using BpmnModelInstance.definitions with bpmn.js


#1

Hi guys,

We are having severe performance issues with bpmn-io in the frontend.
We are trying an approach of parsing the XML in the backend and putting the definitions directly into
bpmn-js's BpmnModdle.
I was wondering if we can use BpmnModelInstance.definitions as-is in the frontend and save the time spent on XML parsing.

Bottom line question:
Is the Java class BpmnModelInstance::Definition the same class as the one used in bpmn-js:BpmnModdle::Definition?

Thanks a lot

Diana


#2

Not sure if I understood exactly what you are trying to achieve. I’m able to answer your bottom line question though:

Is the Java class BpmnModelInstance::Definition the same class as the one used in bpmn-js:BpmnModdle::Definition?

These are entirely different classes; they do, however, represent the same element of a BPMN 2.0 document.


#3

Cross-posted in bpmn.io forum.


#4

I will try to clarify my question.
Today we are using Viewer.importXML('input.bpmn').
The first thing this method does is parse the XML and create a definitions model from it.
(The parsing takes a few seconds in some cases.)
We are thinking of performing the parsing in the backend (our Java Camunda service)
and sending the frontend a ready-made model.
Instead of using Viewer.importXML(), we would use Viewer.prototype.importDefinitions(definitions).

Does this make sense?
Is there an off-the-shelf Java library for this?
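For reference, a sketch of the two import paths discussed above (promise-based API as in recent bpmn-js versions; importDefinitions has long been an internal method, so check your bpmn-js version before relying on it):

```javascript
// Current approach: hand the raw XML to the viewer, which
// parses it (via bpmn-moddle) and then renders the diagram.
const viewer = new BpmnJS({ container: '#canvas' });

await viewer.importXML(bpmnXML);

// Hoped-for approach: skip the parse step by feeding in
// ready-made definitions. Note that importDefinitions expects
// a moddle element tree produced by bpmn-moddle in the browser,
// NOT a serialized Java BpmnModelInstance - the two object
// models are unrelated, so a backend-parsed model cannot be
// passed in directly.
await viewer.importDefinitions(definitions);
```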


#5

This is not easily possible.

Please profile your diagram parsing and double check what takes that long.

  • Is it the actual XML parsing in the front-end
  • Is it the diagram rendering (label layouting, mostly)

What is your use-case?

  • Rendering the diagrams
  • Opening the diagrams for editing

If you’re looking into rendering the diagrams, pre-rendering to SVG could be a way to go.
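To see where the time actually goes, you can hook into the viewer's import lifecycle events, which separate the parse phase from the render phase (a sketch, assuming a recent bpmn-js version; event names per the bpmn-js importer):

```javascript
const viewer = new BpmnJS({ container: '#canvas' });

let parseStart, renderStart;

// fired before/after bpmn-moddle parses the XML
viewer.on('import.parse.start', () => (parseStart = performance.now()));
viewer.on('import.parse.complete', () => {
  console.log('XML parsing took', performance.now() - parseStart, 'ms');
});

// fired before/after the diagram is drawn (incl. label layouting)
viewer.on('import.render.start', () => (renderStart = performance.now()));
viewer.on('import.render.complete', () => {
  console.log('rendering took', performance.now() - renderStart, 'ms');
});

await viewer.importXML(bpmnXML);
```

If rendering dominates, backend pre-parsing would not help much; if parsing dominates, the numbers make that case concrete.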


#6

If I understand you correctly, you suggest pre-rendering the SVG using bpmn-io and then serving just this static SVG for the user to view.
We do need a performance boost for rendering at this point.
However, we use ElementRegistry, overlays, and other bpmn-io features at runtime in response to user clicks on the diagram.
It seems like a big change for us at this point…
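For context, those runtime features require a live bpmn-js instance, which is what a static SVG would give up. A sketch of the kind of usage that depends on the live viewer (standard ElementRegistry/Overlays APIs; the element id 'Task_1' is hypothetical, for illustration only):

```javascript
// Overlays and the element registry only exist on a live viewer,
// which is why a pre-rendered static SVG cannot replace them.
const elementRegistry = viewer.get('elementRegistry');
const overlays = viewer.get('overlays');

// look up a diagram element by id ('Task_1' is hypothetical)
const task = elementRegistry.get('Task_1');

// attach an HTML overlay to that element
overlays.add(task.id, {
  position: { bottom: 0, right: 0 },
  html: '<div class="badge">3</div>'
});

// react to user clicks on diagram elements
viewer.on('element.click', (event) => {
  console.log('clicked', event.element.id);
});
```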