Monday, November 19, 2012

Architectural Foundations of Testing

Today’s application testing matrix is full of rising complexities. In order to save time and money and to assure the success of a developed application, test automation teams have to start with the basics. A strong tower requires strong foundations. In this case, one of the most important parts of that foundation is the development of the test framework. In order to ascertain that your product can bear the weight of all of today’s demands – those of both users and technology – the framework has to be flexible and robust. We develop test architectures in a modular, data-driven, and event-driven fashion that incorporates rich logic. The main goal of test architecture is to achieve full test coverage with fewer test artifacts, which in turn means less maintenance of that architecture down the line. As Test Architects, our aim is to design, wherever possible, one test case per functional area, which can be achieved through framework modularity.

When designing a modular test architecture, one can define two classes of test cases: Test Components and Business Processes. Test Components are smaller, reusable test cases that each represent a particular functional area, while Business Processes are more procedural test cases oriented toward testing the application’s business processes. Let’s take a look at a basic example of a common application to illustrate these methods – a travel portal. The main purpose of this app is to help a user create an itinerary for a trip. First, we will define the functional areas as “login”, “book hotel”, and “book rental car”. In order to develop test coverage based on these principles, we will develop a separate test component for each of these areas, as sketched below.
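To make the idea concrete, here is a minimal Python sketch of this modular layout. It is an illustration only, not tied to any particular tool: the TravelPortal class is a hypothetical stand-in for the application under test, each Test Component covers one functional area, and the Business Process simply composes the components.

```python
# Minimal sketch of modular test design: each Test Component covers one
# functional area, and a Business Process is composed from those components.
# TravelPortal is a hypothetical stand-in for the real application under test.

class TravelPortal:
    """Hypothetical stand-in for the travel portal under test."""
    def __init__(self):
        self.user = None
        self.itinerary = []

    def login(self, username, password):
        self.user = username if password else None
        return self.user is not None

    def book_hotel(self, city):
        self.itinerary.append(("hotel", city))
        return True

    def book_rental_car(self, city):
        self.itinerary.append(("car", city))
        return True


# --- Test Components: small, reusable, one per functional area ---
def tc_login(app, username, password):
    assert app.login(username, password), "Login failed"

def tc_book_hotel(app, city):
    assert app.book_hotel(city), "Hotel booking failed"

def tc_book_rental_car(app, city):
    assert app.book_rental_car(city), "Car rental booking failed"


# --- Business Process: composes the components into one scenario ---
def bp_book_itinerary(app, username, password, city):
    tc_login(app, username, password)
    tc_book_hotel(app, city)
    tc_book_rental_car(app, city)
    assert len(app.itinerary) == 2, "Itinerary is incomplete"


if __name__ == "__main__":
    bp_book_itinerary(TravelPortal(), "demo_user", "secret", "Boston")
    print("book itinerary business process passed")
```

Because each functional area lives in its own component, a change to the login flow touches only tc_login; every business process that reuses it picks up the fix automatically.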

Data-driven testing follows a simple principle of input and output parameters: our test cases should not contain any static input or output values (wherever this is possible and logical). The advantage of data-driven testing is the ease of driving many test scenarios through input and output data within a single test artifact. Returning to the travel portal, we will incorporate the test components into a single “book itinerary” business process. Once we develop that business process, we can apply the data-driving principle. Our goal is to create an artifact that will accept any input and output values into its testing procedures, so we parameterize every input and output value, such as the “user name” text field, and so on.
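As an illustration of the principle in plain Python, the sketch below keeps a single test artifact and pulls every input and expected output from a data table instead of hard-coding them. The validate_login helper is a hypothetical stand-in for the real call into the application under test.

```python
# Minimal sketch of data-driven testing: one test artifact, many data rows.
# validate_login is a hypothetical stand-in for the real login validation.

def validate_login(username, password):
    """Pretend login check; the real framework would drive the AUT instead."""
    return bool(username) and password == "secret"

# Each row is (input username, input password, expected output).
TEST_DATA = [
    ("demo_user", "secret", True),
    ("demo_user", "wrong",  False),
    ("",          "secret", False),
]

def test_login_data_driven():
    for username, password, expected in TEST_DATA:
        actual = validate_login(username, password)
        assert actual == expected, (
            f"login({username!r}, {password!r}) returned {actual}, expected {expected}"
        )

if __name__ == "__main__":
    test_login_data_driven()
    print("all data rows passed")
```

Adding a new scenario means adding a row of data, not another test case, which is exactly where the maintenance savings come from.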

Another important characteristic of a modern test framework is event-driven testing. Event-driven architecture is a scripting pattern promoting the production, detection, consumption of, and reaction to events in the script. Every test case should end with a triggered event and present validation criteria for the scenario. Building a framework around an event-driven architecture reinforces the validation of expected results that every test step must have by definition. For example, when designing a Login Test Component we will create two test steps: Invoke AUT, and Login (typing the user name and password and clicking the Login button). At the end of each step, the test case triggers an event in the AUT and validates against it – the Login dialog is invoked, or the Welcome page appears (including output data from the previous input step).
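A rough illustration of the idea, again in plain Python and not modeled on any specific tool’s API: each step performs its action and then validates the event it was expected to trigger in the AUT. The FakeAUT class and event names are made up for the sketch.

```python
# Minimal sketch of event-driven test steps: every step ends by triggering an
# event in the application, and the step's validation waits on that event.

class FakeAUT:
    """Hypothetical application under test that emits events as it reacts."""
    def __init__(self):
        self.events = []

    def invoke(self):
        self.events.append("login_dialog_invoked")

    def login(self, username, password):
        if password == "secret":
            self.events.append(f"welcome_page_displayed:{username}")

def step(description, action, expected_event, aut):
    """Run one test step and validate the event it was expected to trigger."""
    action()
    assert any(e.startswith(expected_event) for e in aut.events), (
        f"Step '{description}' did not trigger expected event '{expected_event}'"
    )

if __name__ == "__main__":
    aut = FakeAUT()
    # Step 1: Invoke AUT -> expect the Login dialog event.
    step("Invoke AUT", aut.invoke, "login_dialog_invoked", aut)
    # Step 2: Login -> expect the Welcome page event, carrying the user name
    # entered in the previous step as its output data.
    step("Login", lambda: aut.login("demo_user", "secret"),
         "welcome_page_displayed:demo_user", aut)
    print("event-driven login component passed")
```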

Last but not least on our list of foundations is rich logic. This element allows testing teams to incorporate all business requirements by applying conditional statements to the expected-results area. For example, when there are multiple user groups in the application under test, we can create a conditional statement to validate which group the logged-in user belongs to and outline subsequent steps so that the user proceeds through the test in the appropriate way. Such a framework can be executed either manually or in an automated fashion across any platform. With this method in place, we can take our business process of booking an itinerary and use it anywhere (Windows, Mac, Unix, Mobile, etc.) without any further modification.
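For illustration, here is a small Python sketch of rich logic in the expected-results area. The user groups and landing pages are invented for the example: the validation branches on which group logged in and outlines the next steps for that group.

```python
# Minimal sketch of rich logic: the expected results branch on the user group,
# and the scenario proceeds differently for each group. All names are
# illustrative, not taken from a real application.

USER_GROUPS = {
    "admin_user": "admin",
    "agent_user": "agent",
    "guest_user": "guest",
}

def landing_page_for(group):
    """Pretend AUT behaviour: each group lands on a different page."""
    return {"admin": "Admin Console", "agent": "Booking Desk"}.get(group, "Home")

def validate_post_login(username):
    group = USER_GROUPS.get(username, "guest")
    page = landing_page_for(group)
    # Conditional expected results: a different validation per user group.
    if group == "admin":
        assert page == "Admin Console", f"admin saw {page!r}"
        next_steps = ["open user management"]
    elif group == "agent":
        assert page == "Booking Desk", f"agent saw {page!r}"
        next_steps = ["book itinerary for customer"]
    else:
        assert page == "Home", f"guest saw {page!r}"
        next_steps = ["book itinerary"]
    return next_steps

if __name__ == "__main__":
    for user in USER_GROUPS:
        print(user, "->", validate_post_login(user))
```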


This approach allows better collaboration among team members, such as application SMEs, Automation Specialists, and new colleagues. It improves change management and reduces maintenance of test artifacts, thereby increasing testing ROI.

In conclusion, designing your framework in the most efficient way from the very beginning is the ultimate test strategy. To keep pace with modern software development and offer proper test coverage, the foundations of an application’s test scripts must be strong. Once the foundations are built with care and focus, the tower will stand tall and unwavering even in the strongest of winds.

Friday, November 9, 2012

Test Automation – Choosing the Right Tools to Drive Success

In our most recent installments, we’ve talked about the efficiency of test automation, agile testing, and some best practices for test teams. We’ve covered the difficulties of catering to multiple platforms and browsers, and have mentioned that the modern market is ever-changing and filled with obstacles that must be overcome to stay ahead of the competition. There are many testing tools available today, and one of the most critical steps toward successful application testing is choosing the right one to get the job done.

Here, we can continue our car analogy. Just like building a custom car requires focusing on each separate part and how it will benefit the whole, test automation teams must also work together to determine what aspects of a testing tool will help them reach their goals in the most efficient way possible. Some companies may struggle with this, as there is a false rumor going around: that test automation is a low-maintenance cure-all for any development process.

The reality? Test automation is possibly the most difficult and rigorous form of application testing. Because everything is automated, teams must be doubly focused on ensuring that all scripts run as planned. The goal of this method is to save time and prevent error, but the scripts that run the tests must first themselves be error-free in order to take advantage of the benefits test automation offers. The quality of the tools used throughout testing can make or break a project.

When driving a car, the main goal is to go from point A to point B. However, when a critical situation presents itself – such as hydroplaning or a potential accident – the quality of the vehicle and its safety features can tip the scales in your favor and save your life. The same can be applied to testing scenarios. When you need to quickly overcome problems with object recognition or architectural aspects of a project, you must rely on the particular feature set of your tools to get you through. Even the most experienced tester can’t foresee all possible hazards on the road to launching a project, and – on the surface – freeware may appear to have the same functionality as paid software. However, each tool is different. You get what you pay for, after all.

For example, some very important factors to consider in modern test automation are: the ability to recognize objects at a higher-end scale, support for rich-logic scenarios, support for modularity, and configuration aspects of the device under test such as jailbreaking/rooting or instrumentation. Finding a tool that can handle these tasks with the greatest versatility and cover all platforms and browsers is crucial, since it’s impossible to predict whether the requirements for how the application will be used will take a sudden turn. End users may initially need the application to run on iPhone, but may unexpectedly change their requirements to running the app in Safari on Windows (yes, there is such a thing!).

The chart below shows StatCounter statistics about some of the mobile browsers of today and their popularity. Can your tools help you develop your scripts on all of these and more?


The knowledge and experience of test engineers, along with the operating budget, are also important things to consider. Certain tools may have a steeper learning curve than others. If a tool has a large feature set, it will take more time to train teams to use it. What is the budget for the project? Can it support the investment of a more advanced tool along with additional time to train the team in its usage? ZAP Specialists highly recommend investing in a high-end tool, but teams must always plan ahead to make sure that they will see a return on this investment. In the long run, putting aside a portion of the budget to acquire a better tool may save you days when you least expect it, helping the budget remain in the positive.

In conclusion, test automation is a difficult and complicated path that must be navigated with the correct tools at hand. Taking the time to make an informed decision regarding your test automation tools can save you time and money in the end.

Friday, November 2, 2012

Test Automation Best Practices for a Smooth Ride



Test automation can be compared to driving a race car. Just like test automation, a race car is an attractive way of getting where you need to go at lightning speed. By the same token, it can be very dangerous if you don’t possess all of the needed skills. Racing a car is a feat accomplished by many separate elements combined into one. The racer must be focused, agile, and adaptable.

The elements of racing apply to test automation as well. Development teams must be skilled, 100% focused on their goals, and adaptable, especially considering the overwhelming number of new platforms and emerging technologies that form new obstacles on the road every day. Most development teams that encounter such obstacles can overcome them, but only in the right environment.

Another roadblock comes in the form of cutting resources. When test managers try to cut corners, allocate less-skilled personnel to automation, or mix test automation with other responsibilities, they are handing the wheel of a Formula One car to someone who barely knows how to drive a go-kart at a theme park. Overwhelming testing teams leads to distractions that pull them away from their goals, lengthening the process and leaving room for errors and failures. These problems result in issues such as test automation teams trying to develop un-executable artifacts. The testing projects lose out – development is pushed back and the budget dwindles.

At ZAP, we have over 10 years of consulting experience. Pulling from this knowledge, we always advocate the approach of assigning specific tasks to particular team members and setting reasonable mini-goals that will build on each other to reach the ultimate conclusion. One of our best practices is dividing the work between “Experts” and “Specialists”. 

Experts perform QA analysis that focuses on testing goals and test documentation. Their roles also include managing requirements and building modularized test cases that adhere to test automation best practices. They schedule test execution based on the goals of the project and execute test sets while tracking defects and errors in the product. Simultaneously, they collaborate with Specialists to ensure that the scripts being developed address the testing goals through test case logic and validation criteria. Meanwhile, Specialists are solely in charge of developing and maintaining a robust test automation framework.

The productivity of such collaboration usually results in one Specialist supporting three to five Experts. Automation Specialists should always use advanced technologies such as HP QTP and ZAP-fiX to make their tasks easier. Test automation teams should implement HP ALM as the ultimate test automation product to be able to quickly address all testing objectives and apply them cross-platform. This will allow Experts to develop a single artifact and execute it on multiple platforms with ZAP-fiX’s revolutionary 1Script technology.

In conclusion, when Specialists use good tools like QTP and ZAP-fiX, they can execute their scripts cross-platform using ZAPFARM hosted services. To achieve an efficient test automation process, teams should define precise roles and responsibilities, match tasks to the particular skillsets of their members, and define QA processes while focusing on test automation tasks using advanced technologies. Following these guidelines completes the racing metaphor – proper technique, focus, and training will always result in a safe and exciting race.