Do You Test Your Performance Against the Competition?

Performance testing has, over the years, become an area of specialization within software testing. Demanding the right combination of test automation, analysis and debugging skills, it has become a niche that offers testers more challenging and rewarding opportunities than many other test areas. Testers have devised increasingly sophisticated performance tests, including forward-looking ones such as capacity planning and hard-to-simulate ones such as long-haul tests. Mobile devices have added their own challenges around how much of the testing should be done on the devices themselves and how much can be covered with a basic client-server setup. Undoubtedly, performance testing requires a great deal of patience, an eye for defects that are not straightforward, and hands-on experience built over time to understand the overall landscape.

Alongside all of this sits another aspect of performance testing that is probably not as difficult, is very valuable in the results it can yield, yet is still not often taken up by teams, either for lack of awareness of its value or for lack of time: running core performance tests (such as response times or page load times) on competing products to understand how one's own product fares against them. While feature richness, time to market, brand value, product quality and marketing all matter in determining product acceptance among users, product performance is another very important factor. Performance is largely accounted for under the heading of quality, but several teams do not appreciate the importance of comparing their performance with that of the competition. They need to understand that if a competitor's page load time on, say, a heavy page is 1 second while theirs is 3, over time they will lose a significant portion of their user base primarily because of slow page loads.

These comparative tests do not take much time, and they are rarely apples-to-apples comparisons, because different products are being measured. What can be done to bring in some consistency is to run them at specific times of the day, repeat them over different time slots and compare the results, and pick similar pages for comparison (for example, a search results page). Such steps help the team draw meaningful inferences and provide more data for understanding where product improvements, if any, are needed.

The tests themselves do not take much time; it is the comparison and analysis, followed by any implementation changes, that consume the bulk of it. Here the tester collaborates closely with the development team to arrive at what changes need to be made, whether at the configuration level, the database-settings level or something deeper at the design or technology level. Such comparative testing is becoming an important part of the overall performance testing effort, and if the test team takes this up voluntarily and brings actionable results back to the product team, it is a very welcome change that helps the entire team collaborate on a very important area: improving product performance in comparison to the competition.
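As a rough illustration, the sketch below shows one way such a comparative check could be scripted. The URLs, page choice and run counts are purely hypothetical placeholders, and timing an HTTP GET with the `requests` library is only a coarse proxy for true page load time, which would normally be measured with a browser-level tool; the point is simply to show repeated, like-for-like measurements that can be rerun across time slots.

```python
# Hypothetical comparative page-load check (illustrative only).
# Assumes two placeholder URLs: our product's search results page and a
# competitor's equivalent page. Uses the `requests` library, so it times the
# HTTP response rather than full in-browser page rendering.
import statistics
import time

import requests

PAGES = {
    "our_product": "https://example.com/search?q=laptops",            # placeholder URL
    "competitor": "https://competitor.example.org/search?q=laptops",  # placeholder URL
}


def measure(url, runs=5, pause_seconds=2):
    """Time several GET requests to the same page and return the timings."""
    timings = []
    for _ in range(runs):
        start = time.perf_counter()
        requests.get(url, timeout=30)
        timings.append(time.perf_counter() - start)
        time.sleep(pause_seconds)  # space out requests within a time slot
    return timings


if __name__ == "__main__":
    # Run the same comparison in different time slots (e.g. scheduled via
    # cron) and keep the results, so trends rather than single numbers
    # drive the analysis.
    for name, url in PAGES.items():
        timings = measure(url)
        print(f"{name}: median {statistics.median(timings):.2f}s "
              f"over {len(timings)} runs")
```

Keeping the measurement script this small makes it easy to rerun at fixed times of day; the real effort, as noted above, goes into analyzing the numbers with the development team and deciding what, if anything, to change.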

About the Author

QA InfoTech

Established in 2003 with fewer than five testing experts, QA InfoTech has grown by leaps and bounds and now operates three QA Centers of Excellence globally: two located in Noida, the hub of IT activity in India, and the third at our affiliate, QA InfoTech Inc., in Michigan, USA. In 2010 and 2011, QA InfoTech was ranked among the top 100 places to work in India.
