Thank you for the reply. We have now set up our DB and test suites and tried running them. At some point, some test cases were taking a long time, so I killed them to retry. I have a few questions about the part I have run so far.
In one test case, I see Skipped: 1 (as below). What does it indicate? Is it mandatory for all tests to run and pass?
Test set: org.camunda.bpm.engine.test.bpmn.multiinstance.MultiInstanceTest
Tests run: 77, Failures: 0, Errors: 0, Skipped: 1, Time elapsed: 0.02 s - in org.camunda.bpm.engine.test.bpmn.multiinstance.MultiInstanceTest
What is the indicator of pass or fail? For instance, at what point can we say that Azure SQL DB does indeed work for Camunda? Would it be some x percentage of test cases passing, or are there certain test cases that must pass?
We are currently running only the BPMN tests, as we use only the BPMN part of the Camunda engine in our organization.
Our test suite is rather big, and keep in mind that we support multiple databases and setups. Skipping or ignoring a test case might be related to:
- a test case written for a specific setup, e.g. a MariaDB test
- a test case that is not supported/relevant for a specific setup, e.g. due to H2 database limitations, we ignore/skip some tests for that database
- a test case demonstrating a reported bug or incorrect behavior; we add such tests to our suite so we can easily track and reproduce them whenever a bug fix is scheduled
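For context, a test usually lands in the "Skipped" count via a JUnit mechanism such as `@Ignore` or an assumption that aborts the test without failing it. This is only a sketch of the decision logic behind a database-specific skip; the names (`assumeDatabase`, the database identifiers) are illustrative, not Camunda's actual API.

```java
// Sketch: why a database-specific test is skipped rather than failed.
// In JUnit, Assume.assumeTrue(condition) aborts the test as "skipped"
// when the condition is false; here we just model that decision.
public class SkipExample {

    // Returns true when the test's required database matches the one
    // the suite is currently running against.
    static boolean assumeDatabase(String currentDatabase, String requiredDatabase) {
        return requiredDatabase.equals(currentDatabase);
    }

    public static void main(String[] args) {
        // A MariaDB-only test running against SQL Server is skipped, not failed.
        System.out.println(
            assumeDatabase("sqlserver", "mariadb") ? "run" : "skipped");
    }
}
```

So a skip is an expected outcome of the suite's configuration, not a sign that something went wrong.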
I would say there is no need to keep track of them.
I would say all of the tests should pass (excluding the skipped ones in the Maven run, as above). If there are failures, it is better to check them case by case, as they might indicate unexpected behavior of the engine running with that database.
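In other words, the pass criterion can be read straight off the Surefire summary line you quoted: zero failures and zero errors, with skips not counting against the result. A minimal sketch of that rule (the class and method names are ours, not part of any Camunda tooling):

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Minimal sketch: decide pass/fail from a Maven Surefire summary line
// of the form "Tests run: N, Failures: N, Errors: N, Skipped: N, ...".
public class SurefireSummaryCheck {

    static final Pattern SUMMARY = Pattern.compile(
        "Tests run: (\\d+), Failures: (\\d+), Errors: (\\d+), Skipped: (\\d+)");

    // A run passes when there are no failures and no errors;
    // skipped tests do not count against the result.
    static boolean passed(String summaryLine) {
        Matcher m = SUMMARY.matcher(summaryLine);
        if (!m.find()) {
            throw new IllegalArgumentException("not a Surefire summary line");
        }
        int failures = Integer.parseInt(m.group(2));
        int errors = Integer.parseInt(m.group(3));
        return failures == 0 && errors == 0;
    }

    public static void main(String[] args) {
        String line = "Tests run: 77, Failures: 0, Errors: 0, Skipped: 1, "
            + "Time elapsed: 0.02 s";
        System.out.println(passed(line)); // prints "true": the skip is fine
    }
}
```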