
As part of our continuous integration development, I would like to implement unit testing automation. This topic is to discuss some ideas on how to do that in Ataccama. I have some specific doubts, such as:

  • Architecture suggested for unit testing automation 
  • Database approach for unit testing

Thanks in advance

Hi @alexAguilarMx ,

Welcome to the Community and thank you for your question!

There are four ways of performing unit tests:

  • Expression debugging

  • Debugging Step

  • Debugging a whole plan

  • Simple scoring step to evaluate many Unit Tests

💡 Expression debugging

You can test your expressions easily through the Debug button.

  1. Go to the Step and Expression you would like to test
  2. Click on Debug
  3. Enter some sample values that appear in the source data and check in Results how the expression is evaluated

💡 Debugging step

You can also test the complete set of logic within a single step.

  1. Go to the Step you would like to test
  2. Click on the Bug icon in the top right corner
  3. Enter values in the selected row(s), click on Evaluate Input and check how the step is evaluated

💡 Debugging a whole plan

You can use the Multiplicator step and Text File Writer to test the plan's output if it can't be delivered to the endpoint. The Multiplicator step duplicates input data and sends them into several output flows, so you can write a copy of the results into a separate text file to debug them. See an example:

 

💡 Simple scoring step to evaluate many Unit Tests

You can use the Simple Scoring step to evaluate many Unit Tests at the same time over a record. In the example below, my colleague prepared 100 input records with a random system name and system age. Then they implemented 2 unit tests that check:

1. the system name is 1 of 3 permitted values

2. the system is not older than 10 years of age

The resultant text file shows, for each of the 100 records, whether any unit tests failed, and if so, which one.
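Just to make the logic of those two unit tests concrete, here is a minimal sketch in plain Python of what the Simple Scoring step evaluates per record. The column names, the permitted system names and the age limit below are assumptions for the sketch, not taken from the actual plan:

```python
# Illustrative sketch only: mimics, in plain Python, the two unit tests the
# Simple Scoring step evaluates per record. Column names, the permitted
# system names and the age limit are assumptions, not taken from the plan.
import csv

PERMITTED_NAMES = {"CRM", "ERP", "DWH"}   # assumed set of 3 permitted values
MAX_AGE_YEARS = 10                        # assumed age limit

def failed_tests(record):
    """Return the names of the unit tests this record fails."""
    failures = []
    if record["system_name"] not in PERMITTED_NAMES:
        failures.append("name_not_permitted")
    if int(record["system_age"]) > MAX_AGE_YEARS:
        failures.append("system_too_old")
    return failures

with open("input_records.csv", newline="") as src, \
     open("unit_test_results.csv", "w", newline="") as dst:
    writer = csv.writer(dst)
    writer.writerow(["system_name", "system_age", "failed_unit_tests"])
    for record in csv.DictReader(src):
        writer.writerow([record["system_name"], record["system_age"],
                         ";".join(failed_tests(record)) or "OK"])
```

Each output row then tells you which test(s) the record failed, which is exactly the kind of file a CI/CD job can pick up.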

Hope it helps.


Hi, what @Adela Sotonova mentioned is certainly all relevant during the actual development. Let me share some tips on how you can test specific parts of the solution after it’s actually deployed, and even after you process some data.

  1. solution components - e.g. complex validation components, cleansing components, transformation plans, even matching results can be tested this way.
    • you can create a unit test plan for every component, processing a test data set with expected results. The result can be read by your CI/CD process to report failures (see the first sketch after this list).
    • Every component you drag & drop from the Component palette on the right side contains the same folder structure. Check the UNIT_TESTS folder to see the unit test plan:
      • COMPONENTS
      • DATA
      • DOC
      • UNIT_TESTS

         

  2. Web services - you can have standalone web services, services running as part of the MDM hub, or any other validation services
    • similarly to the previous point, you can create unit test plans to call the particular web service and check the expected result (see the second sketch after this list)
    • the result can then be evaluated and reported by your CI/CD process
  3. Data loads - e.g. overall data stats, loads failures, attribute completeness etc.
    • those are a bit more tricky, as you need to run the load operation as part of the Ataccama server and the respective module and then check the results. Typically you need to do additional checks either directly in the DB, or you combine them with standard APIs (native services, verification exports, postprocessing plans etc.). It’s usually solution specific, but you can find some more or less stable patterns.
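To illustrate the first point, here is a minimal sketch of a CI/CD check around a component's unit test plan. The runner script name (run_plan.sh), the folder layout and the result-file format are assumptions for the example; adapt them to however your project executes plans and writes test output:

```python
# Hedged sketch: run a component's unit test plan and report failures to CI.
# "run_plan.sh", the folder layout and the results.csv format are assumptions.
import csv
import subprocess
import sys
from pathlib import Path

TEST_PLAN = Path("components/address_cleansing/UNIT_TESTS/unit_test.plan")      # assumed path
RESULT_FILE = Path("components/address_cleansing/UNIT_TESTS/out/results.csv")   # assumed output

# 1. Processing log level: did the plan run at all?
completed = subprocess.run(["./run_plan.sh", str(TEST_PLAN)])
if completed.returncode != 0:
    print("Unit test plan failed to run")
    sys.exit(1)

# 2. Processing result level: how many test cases failed?
# Assumed convention: the plan writes one row per test case with a
# "test_name" column and a "status" column that is either PASS or FAIL.
with open(RESULT_FILE, newline="") as f:
    failed = [row["test_name"] for row in csv.DictReader(f) if row["status"] == "FAIL"]

if failed:
    print(f"{len(failed)} failed test case(s): {', '.join(failed)}")
    sys.exit(1)

print("All component unit tests passed")
```

Note that the script already checks both levels discussed below: whether the process ran at all (exit code / log) and how many test cases failed (results file).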
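And for the second point, a similarly hedged sketch of a web service unit test. The endpoint URL, payload and expected response are placeholders; the pattern (call the service, compare with the expected result, exit non-zero so the CI/CD pipeline reports it) is what matters:

```python
# Hypothetical unit test for a deployed web service (e.g. a validation
# service exposed by the solution). URL, payload and expected response
# are placeholders, not a real Ataccama endpoint.
import json
import sys
import urllib.request

SERVICE_URL = "http://localhost:8888/validateAddress"    # assumed endpoint
request_payload = {"street": "Main St 1", "city": "Prague"}
expected = {"status": "VALID"}                            # assumed expected result

req = urllib.request.Request(
    SERVICE_URL,
    data=json.dumps(request_payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    actual = json.load(resp)

if all(actual.get(k) == v for k, v in expected.items()):
    print("Web service unit test passed")
    sys.exit(0)

print(f"Web service unit test failed: expected {expected}, got {actual}")
sys.exit(1)   # non-zero exit code lets the CI/CD pipeline report the failure
```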

 

The evaluation as part of your CI/CD process should consider both:

  • processing logs - whether the process ran successfully or failed
  • processing results - the number of failed test cases

Hope that helped a bit.

 

 

