The future of Android testing is bleak if we, as Quality Engineers, do not change the way we test. Automating test cases is good, but understanding how the app uses system resources (CPU, battery, RAM, ...) takes testing to the next level. At OpenSooq, our development cycle is extremely fast, and we have built tools to keep it that way. On the other hand, this puts more pressure on Quality to make sure that every release is bug-free, which is why automating quality matters: it gives us more time to focus on testing the business core. Automation testing is the most effective approach to fulfilling the greater part of our testing objectives while making the best use of time and effort.
There are different tools for automating tests. OpenSooq uses Appium, an open-source test automation tool driven by Selenium. Appium is one of the best choices for automating application test scripts: it works with both Android and iOS applications, and tests can be written in different programming languages. From a continuous delivery perspective, it integrates well with Continuous Integration (CI), provides better test execution reporting, and can be extended to integrate with a database. So a machine now does what a human used to do, faster, with fewer errors, 24/7.
The challenges behind Automating Android KPIs
OpenSooq always strives to find the best way to build its automation test scripts. In the first stage, when we created the test scripts and moved to the execution phase, the first challenge appeared: execution time. A run was taking a few hours, so we started parallelizing the execution to shorten it. There was a lack of documentation and the sources were scattered, so we investigated day and night how to do it in an optimal way. After working around the clock, we succeeded in cutting the execution time by more than 50%. Meanwhile, we built a visualization that explains the behavior of our test cases regardless of test status, along with the execution time of every test case.
The second challenge comes from interruptions by services running in the background. We got rid of this by turning off all running services on the device. So, do not forget to make sure that no services are running while testing.
Exploring the application's memory, CPU, and battery consumption is the deeper level of automation; it leads to a faster and better experience for the user and helps us avoid crashes and memory leaks in production. Here, the devil is in the detail: the idea seemed simple at first look, but in fact it was the exact opposite. We had trouble finding a way to measure the amount of battery and CPU consumed. We used Android commands to get these results, but the complexity comes from transforming the raw system output into readable information. The result was difficult to analyze after running the performance code, but we won the battle of understanding the story behind this methodology.
Dealing with performance evaluation is totally different from UI assertion. The results here are relative: they depend on the device specification, the internet connection, and the response time of the backend. The complexity of performance testing increases when we handle parallel execution, since at some point results from two different devices must be combined. To prevent false positives, using the same device model is highly recommended.
To get diagnostic output for all system services running on a connected Android device, we use adb (Android Debug Bridge) dumpsys. The output is hard to analyze, so how can we transform it into useful information?
Appium wraps this adb command and transforms the system-service output into readable information when you call:
List<List<Object>> performanceData = driver.getPerformanceData("my.app.package", "cpuinfo", <timeout>);
<my.app.package> is the package of the application you want to run the performance test against. <timeout> is an integer denoting the number of seconds Appium will poll for performance data if it is not immediately available.
The output of this instruction is a matrix that contains user and kernel process info. However, analyzing this output and turning it into useful information is your responsibility.
What matters here is the user process info: the amount of CPU time spent in user-mode code (outside the kernel) within the process, i.e. the actual CPU time used in executing the process. The instruction below extracts that value.
driver.getPerformanceData("my.app.package", "cpuinfo", <timeout>).get(1).get(0);
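Hard-coding the row and column indices is brittle, since the attribute order may vary across Appium versions. A small helper, our own sketch rather than part of the Appium client, can look a value up by the attribute name in the first row (the sample matrix below is illustrative; real data comes from getPerformanceData):

```java
import java.util.Arrays;
import java.util.List;

public class PerfDataUtil {
    // Look up a value in the two-row matrix returned by getPerformanceData:
    // row 0 holds the attribute names, row 1 holds the values.
    public static double readValue(List<List<Object>> data, String key) {
        int col = data.get(0).indexOf(key);
        if (col < 0) {
            throw new IllegalArgumentException("No attribute named " + key);
        }
        return Double.parseDouble(String.valueOf(data.get(1).get(col)));
    }

    public static void main(String[] args) {
        // Illustrative sample shaped like a "cpuinfo" result; real values
        // come from driver.getPerformanceData(pkg, "cpuinfo", timeout).
        List<List<Object>> cpuInfo = Arrays.asList(
                Arrays.<Object>asList("user", "kernel"),
                Arrays.<Object>asList("12.5", "3.1"));
        System.out.println(readValue(cpuInfo, "user"));  // user-mode CPU time
    }
}
```

Looking the column up by name keeps the test scripts working even if the driver ever adds or reorders attributes.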
Now, assertions for performance testing are relative; you have to understand your problem domain to apply them in the right way. You can always get the CPU info, but the question is where the assertion should happen.
The simple answer is around each test method (@BeforeMethod, @AfterMethod). Since the numbers are relative, we work with percentages: we read the CPU before and after the method and compute the difference. For example, if a method consumes more than 30% of CPU, the test fails.
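As a sketch of that check, the budget logic can live in a plain helper that the @BeforeMethod/@AfterMethod hooks feed with the two CPU readings. The 30% budget and all names below are our own illustration, not code from the article:

```java
public class CpuAssertion {
    // Illustrative budget: fail the test if a single method adds more
    // than 30 percentage points of user-mode CPU load.
    static final double MAX_CPU_DELTA = 30.0;

    // cpuBefore is sampled in @BeforeMethod and cpuAfter in @AfterMethod,
    // both via driver.getPerformanceData(pkg, "cpuinfo", timeout).get(1).get(0).
    public static boolean withinBudget(double cpuBefore, double cpuAfter) {
        return (cpuAfter - cpuBefore) <= MAX_CPU_DELTA;
    }

    public static void main(String[] args) {
        System.out.println(withinBudget(5.0, 25.0));  // 20-point rise: passes
        System.out.println(withinBudget(5.0, 40.0));  // 35-point rise: fails
    }
}
```

In a TestNG suite the @AfterMethod hook would simply call Assert.assertTrue(withinBudget(before, after)) so that the offending method is the one reported as failed.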
Checking the health of the battery is one of the crucial tasks that impacts a lot of sessions. Not every business cares about this, but the user, even a naive one, will notice it. Understanding battery health is similar to getting the CPU results: if you run the command with a battery parameter, you get the results in the terminal.
Running the instruction below translates the command output into a matrix:

driver.getPerformanceData(PKG, "batteryinfo", 10);

As with the CPU data, the first row contains the attribute names and the second row contains the values, so you can get a value by accessing the proper element in the matrix:

driver.getPerformanceData(PKG, "batteryinfo", 10).get(1).get(0);
There is no need to read the battery after each method unless you are running heavy operations; reading the initial and final battery values for the whole suite (before and after) is enough to give you insight into your application.
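A minimal sketch of that suite-level check follows; the helper, the "power" attribute name, and the sample values are our own assumptions, with real readings coming from the batteryinfo call above:

```java
import java.util.Arrays;
import java.util.List;

public class BatteryCheck {
    // Read the battery level (0-100) from the two-row "batteryinfo" matrix:
    // row 0 holds the attribute name(s), row 1 the value(s).
    public static double batteryLevel(List<List<Object>> data) {
        return Double.parseDouble(String.valueOf(data.get(1).get(0)));
    }

    public static void main(String[] args) {
        // Illustrative samples shaped like batteryinfo results captured
        // before and after the suite; real data comes from
        // driver.getPerformanceData(pkg, "batteryinfo", 10).
        List<List<Object>> before = Arrays.asList(
                Arrays.<Object>asList("power"), Arrays.<Object>asList("97"));
        List<List<Object>> after = Arrays.asList(
                Arrays.<Object>asList("power"), Arrays.<Object>asList("89"));
        double drain = batteryLevel(before) - batteryLevel(after);
        System.out.println(drain);  // points of battery the suite consumed
    }
}
```

Logging this drain per run, rather than asserting on it, is usually enough to spot a release that suddenly eats the battery.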
Analyzing the output of the shell command to understand the memory values is different: the memoryinfo result contains many attributes. Translating the command output into readable information by storing it in a hash map makes this task easy, and the final memory value can then be read as a float.
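A sketch of such a translation is shown here; the code is our own, assuming the usual two-row shape where the first row holds attribute names (for example totalPss) and the second row holds the values:

```java
import java.util.Arrays;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class MemoryInfoParser {
    // Turn the two-row matrix from getPerformanceData(pkg, "memoryinfo", t)
    // into a map of attribute name -> numeric value.
    public static Map<String, Float> toMap(List<List<Object>> data) {
        Map<String, Float> result = new HashMap<>();
        List<Object> keys = data.get(0);
        List<Object> values = data.get(1);
        for (int i = 0; i < keys.size(); i++) {
            Object value = values.get(i);
            if (value != null) {  // some attributes may be absent for a run
                result.put(String.valueOf(keys.get(i)),
                           Float.parseFloat(String.valueOf(value)));
            }
        }
        return result;
    }

    public static void main(String[] args) {
        // Illustrative sample; real attribute names and values come from
        // the memoryinfo matrix returned by Appium.
        List<List<Object>> memInfo = Arrays.asList(
                Arrays.<Object>asList("totalPss", "nativeHeapSize"),
                Arrays.<Object>asList("84231", "30720"));
        Map<String, Float> mem = toMap(memInfo);
        System.out.println(mem.get("totalPss"));  // memory as a float, in kB
    }
}
```

Once the map is built, the final memory figure is a single lookup, e.g. toMap(data).get("totalPss").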
Finally, we recommend running the memory check with each method; this reveals the trend of your memory usage over time and helps the engineer detect possible memory leaks early.
OpenSooq always aspires to increase the depth, scope, and performance of its test scripts to help improve application quality. With automation testing, the scripts are reusable and repeatable; you do not need new scripts all the time, and you can execute the same tests over and over again. You can run the test scripts anytime, and they help find bugs in the early stages of software development. I hope this helps you with performance testing of Android applications, so you can figure out the amount of memory, battery, and CPU consumed by each test script or by the whole suite.