I’m looking for advice on how to migrate our current batch jobs to Camunda. We are willing to write some code and change our overall process, but we would like to use Camunda in a way that is consistent with how it is designed to be used.
We have a home-grown batch job system loosely based on eGate / JCAPS that handles hundreds of batch jobs daily. My boss wants to move to BPMN to simplify communication with business users, and to a better-known system to make it easier to onboard employees and hire consultants.
Camunda meets most of our needs, but we have a concern around configuring workflows. Right now, we define a workflow once and override various parameters to make it work for various business partners. For example, we have a business process that sends new enrollees in health care to the appropriate carrier. We get flat files with changes in enrollment, convert them to the appropriate external format, and then post the files to servers for the carriers. At each step we set 5 to 10 parameters, then override a subset of them at the partner level.
Business Process: Update Health Enrollees
- Get Extract From HR
  - Kaiser {server: server1, extractDir: kaiserDir, username: kaiserUser, password: kaiserPass, …}
  - Blue Cross {server: server2, extractDir: bcDir, username: blueCrossUser, password: blueCrossPass, …}
  - …
- Call External Service To Transform Files
  - Kaiser {carrierCode: kaiserCarrierCode, encryptionKeyName: kaiserEncKeyName}
  - Blue Cross {carrierCode: bcCarrierCode, encryptionKeyName: bcEncKeyName}
  - …
- Send Extract To Carrier
  - Kaiser {server: kaiserServer, deliveryDir: kaiserDir, username: kaiserUser2, password: kaiserPass2, …}
  - Blue Cross {server: bcServer, deliveryDir: bcDir, username: bcUser2, password: bcPass2, …}
  - …
This is a lot like how a mainframe job gets defined: the workflows would be the JCL and the partners would be the PROCs.
We know we could create each of the workflows as a process and then call that process for each partner, but that would be unwieldy: for one job we would go from 3 workflows with 13 partners each to 3 “actual” workflows and 13 partner workflows. It would be very, very difficult to sell that solution to the operations team that administers the batch job services.
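For what it’s worth, the per-partner variant of that idea doesn’t necessarily need a separate partner workflow per job; a shared process could be invoked via a call activity with partner-specific variable mappings. A minimal sketch, assuming hypothetical ids like `sendExtractToCarrier` and variables such as `partnerServer` (none of these names come from our actual system):

```xml
<!-- Fragment of a BPMN process definition; the camunda namespace and
     surrounding <process> element are omitted for brevity. -->
<callActivity id="sendExtract" name="Send Extract To Carrier"
              calledElement="sendExtractToCarrier">
  <extensionElements>
    <!-- Map the caller's partner-specific variables onto the
         generic variable names the shared process expects. -->
    <camunda:in source="partnerServer"      target="server"/>
    <camunda:in source="partnerDeliveryDir" target="deliveryDir"/>
    <camunda:in source="partnerUsername"    target="username"/>
  </extensionElements>
</callActivity>
```

This keeps one shared workflow, but it still leaves the question of where the partner-specific values themselves live.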
Alternately, it looks like we could set up DMN tables for each stage where we need to override parameters and put all the partner specific parameters into the DMN tables. This would allow us to update the parameters on a per-partner basis, but would require a new deployment each time a business partner wants to change their encryption key or server address.
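To make the DMN idea concrete, a decision table keyed on the partner name could emit the per-step parameters. A minimal sketch (ids, names, and values are hypothetical, and the enclosing `<definitions>` element is omitted):

```xml
<decision id="partnerDeliveryConfig" name="Partner Delivery Config">
  <decisionTable hitPolicy="UNIQUE">
    <input id="partnerInput" label="Partner">
      <inputExpression typeRef="string"><text>partner</text></inputExpression>
    </input>
    <output id="serverOut" label="Server" name="server" typeRef="string"/>
    <output id="dirOut" label="Delivery Dir" name="deliveryDir" typeRef="string"/>
    <rule id="kaiserRule">
      <inputEntry><text>"Kaiser"</text></inputEntry>
      <outputEntry><text>"kaiserServer"</text></outputEntry>
      <outputEntry><text>"kaiserDir"</text></outputEntry>
    </rule>
    <rule id="blueCrossRule">
      <inputEntry><text>"Blue Cross"</text></inputEntry>
      <outputEntry><text>"bcServer"</text></outputEntry>
      <outputEntry><text>"bcDir"</text></outputEntry>
    </rule>
  </decisionTable>
</decision>
```

As noted above, the drawback is that changing any value in this table means redeploying the decision.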
Lastly, and perhaps best, we could put the partner-specific information in XML or JSON files and load it during the process using a script (something like this thread: “Process variables used for Process configuration stored in JSON/YML?”). This is good because our partner-specific configuration is stored outside of the model, but it feels a bit like a hack.
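The merge logic for that approach is simple enough to sketch. This is plain JavaScript rather than an actual Camunda script task, and all the names and values are made up for illustration; in practice the JSON would come from a file or a config service, and the merged result would be written to process variables:

```javascript
// Base parameters for a step, plus per-partner overrides
// (hypothetical values -- in practice these would be loaded
// from an external JSON/YAML file, not hard-coded).
const defaults = {
  server: "server1",
  extractDir: "hrDir",
  encryptionKeyName: "defaultKey"
};

const partnerOverrides = {
  "Kaiser":     { extractDir: "kaiserDir", encryptionKeyName: "kaiserEncKeyName" },
  "Blue Cross": { extractDir: "bcDir",     encryptionKeyName: "bcEncKeyName" }
};

// Merge the base parameters with the partner-specific overrides;
// keys present in the override win, everything else is inherited.
function configFor(partner) {
  return Object.assign({}, defaults, partnerOverrides[partner] || {});
}

const kaiser = configFor("Kaiser");
// kaiser.server is "server1" (inherited); kaiser.extractDir is "kaiserDir" (overridden)
```

Because the JSON lives outside the deployed model, a partner changing an encryption key or server address only means editing the file, not redeploying.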
Is there a better way to do this? Should we just use the Camel integration and put together our own system for retrieving configuration? Am I missing any options?
I’d be really happy to hear from any Camunda developers if any of the above proposals won’t work. I’d also be happy to hear from the Camunda community about any similar experiences.
Thank you for your time.
Tim