
Tim Hinds




How to Keep Test Cases in Sync By @Neotys | @DevOpsSummit [#DevOps]

With all the rapid change that happens, it's important to make sure the entire team is working off the same footing

How to Keep Test Cases in Sync Between QA and Production

The art of software development is being radically transformed by the Agile development methodology and the DevOps culture. Strong teams emphasize collaboration, and a focus on pushing code through to customers in near real time is delivering a real boost to productivity. But there is perhaps no metric more impacted by a successful Agile practice than software quality.

Agile affects quality in more ways than just what the end user sees. In fact, it builds quality into the entire development process: engineers working on specific modules can get feedback from live production users, operational monitoring can be triggered by issues identified in QA, and automated testing results can be fed directly back into engineering. More than ever, it's important to keep all of these teams in sync.

This post will examine one particular aspect of that challenge: keeping QA and Operations in lockstep. With all the rapid change that happens - particularly related to new tests being developed for new features and new versions of the app rolling out live into production - it's more important than ever to make sure the entire team is working off the same footing.

The Importance of Simulated Users
Simulated users are one of the most useful and important tools we have for keeping QA and Operations synced up. They are used at scale in load and performance testing prior to a software release to put the software through its paces under heavy stress. They are also used in the production environment to monitor site performance without impacting real users.
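
To make this concrete, here is a minimal sketch of a simulated user written in Python. The base URL, the paths, and the journey itself are placeholders for illustration, not anything specific to a particular product: the same journey definition can be run in volume before a release or on a schedule against production.

    import time
    import requests

    BASE_URL = "https://shop.example.com"   # placeholder application URL
    USER_PATH = ["/", "/login", "/search?q=shoes", "/checkout"]  # hypothetical journey

    def run_simulated_user():
        # A Session persists cookies across steps, the way a real browser would.
        session = requests.Session()
        timings = {}
        for path in USER_PATH:
            start = time.monotonic()
            response = session.get(BASE_URL + path, timeout=10)
            timings[path] = time.monotonic() - start
            response.raise_for_status()  # fail loudly if any step of the journey breaks
        return timings

    if __name__ == "__main__":
        for path, seconds in run_simulated_user().items():
            print(f"{path}: {seconds:.3f}s")

Run in volume, this becomes a load test; run every few minutes against production, it becomes a monitor. The point is that both uses share one definition of the user journey.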

Putting simulated users to work effectively will, in many cases, actually push the Operations and development teams closer together to meet and discuss. The data generated by simulated users gives each team a clearer picture of the other's performance characteristics - information they otherwise probably wouldn't have. Simulated users also allow the teams to be far more proactive in their problem-solving efforts by identifying issues before real people experience them.

Simulated Users Gone Wrong
The scenarios you run simulated users through can be a source of trouble if not properly handled. At best, old scenarios don't exercise the appropriate aspects of new software releases - at worst, old tests break new releases.

To fully understand the problems that can arise from a mismatch between your test and production environments, we can learn from the experience of Brad Stoner in a previous interview with Neotys. His story All About The Cookies describes a scenario in which a traffic spike caused a major site to malfunction, even though the company had done extensive load testing beforehand. The problem was traced to a mismatch between the Production and QA environments: cookies were being handled inconsistently in the two.
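
The cookie lesson translates directly into code. The snippet below is a sketch of the pitfall, not a reconstruction of the actual incident, and the endpoint is hypothetical: a load script that throws away cookies on every request forces the server to create a new session each time, a pattern real browsers in production never produce.

    import requests

    BASE_URL = "https://shop.example.com"  # placeholder application URL

    def load_without_cookie_reuse(request_count):
        # Each call is a brand-new client: the Set-Cookie header from the previous
        # response is discarded, so the server builds a fresh session every time.
        for _ in range(request_count):
            requests.get(BASE_URL + "/home", timeout=10)

    def load_with_cookie_reuse(request_count):
        # One Session replays cookies on every request, matching production browsers.
        with requests.Session() as session:
            for _ in range(request_count):
                session.get(BASE_URL + "/home", timeout=10)

If QA runs the first version while production traffic looks like the second (or vice versa), the load test is exercising a different server-side path than real users ever will.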

Consistency is critical, and your simulated users can play an important role in identifying risks before one of the following occurs:

  • Your site goes down because your testing environment didn't mimic your production environment, which means the testing was irrelevant in the first place
  • You aren't monitoring a crucial user path, so real users experience problems that you don't know about until it's too late
  • Your system experiences bottlenecks in a number of places, bringing the whole site to a halt
  • Troubleshooting becomes difficult as the QA and Operations teams struggle to communicate without a shared collection of data

Best Methods for Keeping in Sync
It is important to keep your testing scenarios in sync, and there are multiple ways to do so. Here are a few.

Automated script tagging. You can set up automated processes for tagging scripts whenever they are created, updated, redesigned, fixed or retired. This eliminates confusion around who owns which scripts, and an automated system keeps everyone looking at the same information.
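
As a sketch of what automated tagging might look like (the directory layout and manifest format here are assumptions, not a prescribed standard), a small script can stamp every test scenario with its last change straight from version control, so everyone reads the same record:

    import json
    import subprocess
    from pathlib import Path

    SCRIPTS_DIR = Path("load_scripts")        # assumed location of test scenarios
    MANIFEST = Path("script_manifest.json")   # shared, version-controlled manifest

    def last_change(path):
        # Ask git for the most recent commit touching this script: hash, author, date.
        result = subprocess.run(
            ["git", "log", "-1", "--format=%h %an %ad", "--", str(path)],
            capture_output=True, text=True, check=True,
        )
        return result.stdout.strip()

    def build_manifest():
        manifest = {
            script.name: {"status": "active", "last_change": last_change(script)}
            for script in sorted(SCRIPTS_DIR.glob("*.py"))
        }
        MANIFEST.write_text(json.dumps(manifest, indent=2))

    if __name__ == "__main__":
        build_manifest()

Hooked into the same pipeline that creates or retires scripts, the manifest stays current without anyone maintaining it by hand.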

Common testing dashboard. It is also important to establish a common testing dashboard that spans pre-release load testing and simulated-user monitoring in production. This reveals information from both pre-release and post-release systems and helps bring the QA and Operations teams together.
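
The dashboard itself can be as heavy or as light as you like; the important part is keying both data sets by the same scenario names. Here is a minimal sketch, assuming each side exports a JSON file of 95th-percentile response times (the file names and field names are assumptions):

    import json

    def load_results(path):
        # Expected shape (an assumption for this sketch): {"checkout": {"p95_ms": 850}, ...}
        with open(path) as f:
            return json.load(f)

    def combined_view(qa_path="qa_load_test.json", prod_path="prod_monitoring.json"):
        qa, prod = load_results(qa_path), load_results(prod_path)
        rows = []
        for scenario in sorted(set(qa) | set(prod)):
            rows.append({
                "scenario": scenario,
                "qa_p95_ms": qa.get(scenario, {}).get("p95_ms"),
                "prod_p95_ms": prod.get(scenario, {}).get("p95_ms"),
            })
        return rows

A scenario that appears in one column but not the other is itself a finding: something is being tested that isn't monitored, or monitored that was never tested.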

Regular meetings. Hold regular joint meetings and reviews in which the QA and Operations teams discuss performance data together, so that important issues are clear to both.

Process QA. Designate a QA specialist to observe and improve quality across the entire process from development all the way to the production environment, establishing a robust Testing-In-Production practice.

Automation in Operations. Designate an Operations specialist to be responsible for ensuring that automated testing and deployment are running without problems.
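
One hedged sketch of what that responsibility can look like in practice (the threshold is an assumed budget, and run_simulated_user refers to the earlier sketch): a post-deployment gate that replays the simulated-user journey and fails the pipeline if any step blows its time budget.

    import sys

    SLOW_THRESHOLD_SECONDS = 2.0  # assumed per-step response-time budget

    def verify_deployment(timings):
        # timings: {path: seconds}, e.g. the output of run_simulated_user() above.
        slow_steps = {path: t for path, t in timings.items() if t > SLOW_THRESHOLD_SECONDS}
        if slow_steps:
            print(f"Post-deploy check failed, slow steps: {slow_steps}")
            sys.exit(1)   # a non-zero exit fails the deployment pipeline
        print("Post-deploy check passed")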

It is crucial to give both teams objectives that are related to operational support and quality. Lastly, leverage technology that makes it easy to stay in sync, such as working off a platform that shares test scenario libraries between load testing and simulated-user monitoring. A few of our products here at Neotys (NeoLoad and NeoSense) will help you test in this fashion.
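
Independent of any particular product, the shared-library idea can be as simple as one scenario definition that both the load-test harness and the production monitor import; the scenarios below are hypothetical.

    # One definition of each user journey, imported by both the load generator
    # and the monitoring agent, so the two can never drift apart.
    SCENARIOS = {
        "checkout": {"steps": ["/", "/cart", "/checkout", "/confirmation"], "think_time_seconds": 2},
        "search": {"steps": ["/", "/search?q=shoes"], "think_time_seconds": 1},
    }

    def steps_for(name):
        return SCENARIOS[name]["steps"]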

Test Well, Test Often
Rapid software development affects everyone across the organization. Not only do all teams have to be ready, but it is necessary to leverage collaboration and tools to ease communication, share information, delegate accountability, improve upon each other's work and stay in sync. We must remember that performance and load testing are crucial to keeping code quality high - but it is even more important to invest in processes that keep the quality of the testing environment high. Happy testing!

More Stories By Tim Hinds

Tim Hinds is the Product Marketing Manager for NeoLoad at Neotys. He has a background in Agile software development, Scrum, Kanban, Continuous Integration, Continuous Delivery, and Continuous Testing practices.

Previously, Tim was Product Marketing Manager at AccuRev, a company acquired by Micro Focus, where he worked with software configuration management, issue tracking, Agile project management, continuous integration, workflow automation, and distributed version control systems.