In the realm of IBM i applications, the mantra “If you cannot test it, don’t do it” serves as a crucial reminder: testing plays a central role in ensuring the security and reliability of critical data and applications on the platform. Any change, update, or modification should be thoroughly tested before it is implemented; rushing changes into production without proper testing can lead to unforeseen complications and jeopardize the integrity of the IBM i environment. By adhering to rigorous testing practices, organizations can mitigate risks, identify vulnerabilities, and keep their IBM i applications robust, fostering a stable and secure computing environment.
We all understand the importance of testing
In every industry, testing and validation are a fundamental part of product development, representing 20 to 30% of development cost (ref. McKinsey – “Testing and validation: From hardware focus to full virtualization?”).
Leading companies plan testing and validation scenarios across functions and integrate customer testing and validation procedures early in the development process.
They extensively leverage virtual simulation and make virtual testing a prerequisite for any physical test.
They adapt their product development processes (PDPs) by integrating virtual simulation loops, and they institute strong feedback and improvement loops at each stage of testing.
What about the testing practice on the IBM i platform?
In the software industry, virtual testing is the paradigm, so testing should be all the more approachable; yet testing on the IBM i platform is still not as well integrated as it should be.
Testing on the IBM i platform may not be as popular due to several factors:
- Legacy Systems: The IBM i platform often hosts legacy applications that were developed before modern testing practices gained prominence.
- Skill Gap: There might be a shortage of testers familiar with the specific nuances of testing on the IBM i environment.
- Perceived Stability: Some organizations perceive IBM i applications as stable and reliable, leading to less emphasis on testing.
- Resource Constraints: Limited resources, including time and budget, can lead to reduced testing efforts.
- Lack of Tools: There might be a lack of specialized testing tools for the IBM i platform.
- Business Criticality: Organizations might consider IBM i applications as business-critical, leading to reluctance in making changes that require testing.
- Cultural Factors: Company culture and historical practices can influence testing priorities.
Despite these challenges, modernizing testing practices and recognizing the importance of rigorous testing can help address these issues and ensure the quality of IBM i applications.
Software testing follows a common process
Software testing involves a structured process with various tasks and steps to ensure the quality of applications:
- Defining the test environment: This encompasses tasks like preparing login credentials and initializing data.
- Developing test cases: This involves creating comprehensive scenarios to validate application functionality.
- Security/Authority: Testing includes examining different user access levels for security purposes.
- Writing scripts: Generating detailed scripts that encompass data inputs and navigation.
- Executing scripts: Running the tests; ideally, you want to be able to revisit the test recording when errors occur rather than having to redo the whole test from scratch.
- Analyzing test results: Entails selecting elements to compare and conducting comparisons.
- Submitting reports: Reporting defects or successes involves documenting issues, maintaining records, and creating KPIs.
- Encompassing Pgm, DB and UI: Testing has to span program logic, databases, and user interfaces, including 5250 screens on the IBM i platform.
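As a rough illustration, the steps above can be modeled in a small, hypothetical harness; the screen names, fields, and backend below are invented for the sketch and do not come from any IBM i product:

```python
from dataclasses import dataclass, field

@dataclass
class Step:
    screen: str    # e.g. a 5250 screen identifier
    inputs: dict   # field name -> value to enter
    expected: dict # field name -> value we expect back

@dataclass
class TestCase:
    name: str
    user: str      # credentials/authority level under test
    steps: list = field(default_factory=list)

def execute(step, backend):
    """Run one step against the application and return mismatched fields."""
    actual = backend(step.screen, step.inputs)
    return {k: (v, actual.get(k)) for k, v in step.expected.items()
            if actual.get(k) != v}

def run(case, backend):
    """Execute every step, collecting defects into a report."""
    report = {"case": case.name, "defects": []}
    for i, step in enumerate(case.steps, 1):
        diffs = execute(step, backend)
        if diffs:
            report["defects"].append({"step": i, "diffs": diffs})
    report["passed"] = not report["defects"]
    return report

# A fake backend standing in for the real application under test.
def fake_backend(screen, inputs):
    return {"total": str(int(inputs["qty"]) * 10)}

case = TestCase("order-entry", "TESTER1",
                [Step("ORD001", {"qty": "3"}, {"total": "30"})])
print(run(case, fake_backend)["passed"])  # prints True
```

A real tool would of course drive actual 5250 sessions, database state, and program calls behind the `backend` function, but the shape of the process is the same.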
Facing, analyzing, and eventually resolving all of these points will go a long way toward making testing more approachable on the IBM i platform.
IBM i End-to-end testing
Figure 1 illustrates the features that an end-to-end testing tool should cover.
Figure 1 – IBM i End-to-end testing
The more these features are automated, the more likely the tool will be accepted by end users.
But that’s not all: what also matters is the user experience when recording a test scenario and the flexibility to correct a recording once it is done. Imagine you have recorded a test case of an interactive function involving 30 screens, and at the end you realize you entered the wrong data on screen 25… the last thing you want is to redo everything from scratch. A tool that lets you correct the recording without re-recording it is a big plus for the end user.
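A sketch of that idea, assuming the recording is stored as an ordered list of screen interactions (the data structure here is invented for illustration, not taken from any product):

```python
# A recorded scenario is just an ordered list of screen interactions.
recording = [{"screen": f"SCR{i:03}", "fields": {"value": str(i)}}
             for i in range(1, 31)]   # 30 recorded screens

def correct_step(recording, step_no, new_fields):
    """Patch a single step in place instead of re-recording all 30 screens."""
    fixed = [dict(s) for s in recording]   # keep the original recording intact
    fixed[step_no - 1]["fields"] = new_fields
    return fixed

# Fix only screen 25; the other 29 steps are reused as recorded.
fixed = correct_step(recording, 25, {"value": "correct data"})
assert fixed[24]["fields"]["value"] == "correct data"
assert recording[24]["fields"]["value"] == "25"   # original untouched
```

Keeping the recording as editable data rather than an opaque capture is what makes this kind of surgical correction possible.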
Figure 2 illustrates the feature about Recording correction flexibility.
Figure 2 – Flexibility on correcting test recording
Another valuable point is the integration of code coverage. This feature is provided by IBM through the CODECOV system command.
RDi integrates it and can visualize the actual lines of code executed by a program. Running a test therefore shows which lines of code were executed and which were not, and from that the percentage of lines covered by the test. This is valuable information: if we get, for example, 30% code coverage, we know our test is not sufficient; we should aim for at least 80% coverage. This percentage is a KPI on the quality of the overall test.
Now, to reach 80% coverage we may have to run the same program (i.e. the same test) several times with different input data. Ultimately we create different test variations and need to merge their respective code coverage results. The merged result tells us the overall code coverage across all lines of code and all test variations.
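Conceptually, merging is just the union of the line sets each test variation executed; here is a minimal sketch (the line ranges and program size are made up for the example):

```python
def coverage_pct(executed, total_lines):
    """Percentage of the program's lines hit by a set of executed lines."""
    return round(100 * len(executed) / total_lines, 1)

def merge_coverage(runs):
    """Overall coverage is the union of lines hit by any test variation."""
    merged = set()
    for executed in runs:
        merged |= executed
    return merged

total = 100                       # lines in the program
run_a = set(range(1, 31))         # variation A hits lines 1-30
run_b = set(range(20, 81))        # variation B hits lines 20-80
merged = merge_coverage([run_a, run_b])

print(coverage_pct(run_a, total))   # prints 30.0
print(coverage_pct(merged, total))  # prints 80.0
```

Neither variation alone reaches the 80% target, but together they do, which is exactly why the merged figure is the KPI that matters.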
Having dynamic merged code coverage information available through an SQL UDF in an automated, test-driven workflow is obviously welcome.
Figure 3 illustrates the features of code coverage and merging code coverage.
Figure 3 – Merging code coverage and KPIs
If you are interested in knowing more, you can visit the ReplicTest product page.