Testing digital products and experiences is at the core of the Applause Platform. Applause crowdtesting and feedback solutions help Product, Engineering, and QA leaders gain rapid insight into the areas of digital quality that matter most to them. By leveraging the world’s largest, most diverse Community of digital experts, your Applause Team will form highly vetted, curated testing teams for fast, authentic results. Calibrated to your specific needs, our teams mimic your ideal customers and test globally in real time, in any language and on any device.
Providing quality test results at speed and scale requires preparation. Your Applause Team will serve as your trusted advisor in this process, helping you “translate” your testing strategy into the Applause crowdtesting model and maintain, improve, and optimize it over time.
This article reviews the preparations required for successful testing with Applause. Your Applause Team may be able to utilize additional functionality not yet available in the customer front-end. If you’re not sure which settings are needed or how best to configure them, please contact your Applause Team.
Testing in the Applause Platform is organized by product. In other words, the Test Cycle – the project under which testing instructions are detailed and executed – and the issues reported by the testers are always set against a single, specific product. Moreover, many of the functional and reporting options available to you, the testers, and the Applause Team are set at the product level. It is therefore imperative that your products are properly managed.
Here are the main product-related setups you need to consider:
- Product Setups: Learn more about the required data and steps to configure your product definitions here.
- Components: Learn more about identifying the distinguishable areas within your product you will be reporting on here.
- Builds: Learn more about management (and upload) of the product versions testers will be asked to test here.
- Integrations: Learn more about integrating your product with Bug Tracking Systems (BTS) here.
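To make the product-centric model above concrete, here is a minimal sketch in Python. All names and fields are hypothetical – this is not the Applause data model or API – but it illustrates how a single product anchors its components (distinguishable areas to report on) and builds (the versions testers will test).

```python
from dataclasses import dataclass, field

@dataclass
class Build:
    """A specific version of the product made available for testing."""
    version: str
    notes: str = ""

@dataclass
class Component:
    """A distinguishable area within the product that issues are reported against."""
    name: str

@dataclass
class Product:
    """The anchor entity: test cycles and issue reports are always set against one product."""
    name: str
    components: list = field(default_factory=list)
    builds: list = field(default_factory=list)

    def add_component(self, name):
        self.components.append(Component(name))

    def add_build(self, version, notes=""):
        self.builds.append(Build(version, notes))

# Hypothetical example product:
app = Product("Mobile Banking App")
app.add_component("Login")
app.add_component("Payments")
app.add_build("2.4.0", "adds biometric login")
```

Keeping components and builds attached to a single product mirrors the rule above: everything testers report on traces back to one product definition.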
Structured Testing (Test Cases)
Generally speaking, the tests you will run in the Applause Platform are of two kinds: Structured and Exploratory (or Unstructured). In an Exploratory test setting, testers are given clear instructions about what, where, and how to test; however, they are free – and encouraged – to use their skills, judgment, and best practices to “hunt” issues in your product. This method is popular for products or features that have yet to fully mature, and in cases where you are too “far” from your end users to accurately anticipate how they will use your product.
Structured testing, in contrast, is meant to validate that functionality works “as expected,” and is very common with mature products and features (including regression and automated tests). Structured test cases consist of a series of scripted steps the testers are required to follow. For each step, the expected result is clear to the tester, so they can accurately report back where failures occur.
Here are the main test case-related setups you need to consider:
- Test Cases: Learn more about creating and managing your product’s test cases in the Applause Platform here.
- Efficiency: Learn more about importing test cases in bulk using Excel spreadsheets here.
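The structure described above – scripted steps, each with an expected result, so failures can be pinpointed – can be sketched as follows. This is a generic Python illustration with hypothetical names, not Applause’s test case format.

```python
from dataclasses import dataclass

@dataclass
class Step:
    """One scripted step: an instruction plus the result the tester should see."""
    instruction: str
    expected_result: str

@dataclass
class TestCase:
    title: str
    steps: list

    def first_failure(self, actual_results):
        """Compare a tester's actual results to the expected results, step by step.
        Returns the index of the first failing step, or None if all steps passed."""
        for i, (step, actual) in enumerate(zip(self.steps, actual_results)):
            if actual != step.expected_result:
                return i
        return None

# Hypothetical structured test case:
login = TestCase("Valid login", [
    Step("Open the app", "Login screen is shown"),
    Step("Enter valid credentials and tap Sign In", "Dashboard is shown"),
])

# A tester's run where the second step does not match the expected result:
failure = login.first_failure(["Login screen is shown", "Error banner is shown"])
```

Because each step carries its own expected result, the report can say exactly which step failed rather than just that the test case failed.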
“Test Cycles” in the Applause Platform represent the project under which testing instructions are detailed and executed. A cycle is a clearly defined unit of work with start and end dates, a product build, a scope, a testing team, and additional definitions and tools. You can expect most of your day-to-day work in the Applause Platform to revolve around test cycles: setting them up, managing them as they are executed, and reporting on the results collected for them.
Here are the main test cycle-related setups to consider:
- Test Cycle: Learn more about creation of test cycles here.
- Best Practices: Learn more about some of the best practices collected over the years with regards to test cycle creation here.
- Customizations: Learn more about using custom fields for issue reports in a test cycle here.
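A test cycle as described above – a bounded unit of work tied to one build, with a scope and dates – can be sketched like this. Again, the names and fields are hypothetical illustrations, not the Applause Platform’s actual cycle definition.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class TestCycle:
    """A bounded unit of work: one build of one product, tested within a date range."""
    name: str
    product: str
    build: str
    start: date
    end: date
    scope: list = field(default_factory=list)  # e.g. the components in scope

    def is_active(self, today):
        """A cycle accepts testing activity only between its start and end dates."""
        return self.start <= today <= self.end

# Hypothetical cycle against the 2.4.0 build:
cycle = TestCycle(
    name="Payments regression",
    product="Mobile Banking App",
    build="2.4.0",
    start=date(2024, 5, 1),
    end=date(2024, 5, 7),
    scope=["Login", "Payments"],
)
```

Tying the cycle to a specific build and scope is what makes its results reportable as a discrete unit, which is why most day-to-day work revolves around cycles.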
Known Issues and Fix Verification
To ensure testers do not report issues you’re already aware of – wasting your time and theirs – you will maintain a list of known issues that is available to the testers. As your product’s known issues are fixed over time, it is important to keep the list updated, so that testers are not overwhelmed by its size and can identify when a previously fixed issue recurs. Moreover, you may want the testing community to actually verify your bug fixes before you formally resolve the issues.
Here are the main known issues and fix verification setups to consider:
- Known Issues List: Learn more about management of your product’s known issues list here.
- Bug Fix Verification: Learn more about utilizing the testing community to verify bug fixes here, and how to use 2-way integration with your Bug Tracking System to automate the Bug Fix Verification process here.
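The triage logic implied above – known open issues should not be re-reported, while fixed issues are candidates for fix verification – can be sketched as a small routine. This is a simplified, hypothetical illustration of the workflow, not how the Applause Platform or a BTS integration actually implements it.

```python
def triage(report_title, known_issues):
    """Classify an incoming tester report against the known-issues list.

    known_issues maps an issue title to its status: "open" or "fixed".
    Returns one of "duplicate", "verify-fix", or "new".
    """
    status = known_issues.get(report_title)
    if status == "open":
        return "duplicate"   # already known and still open: don't report it again
    if status == "fixed":
        return "verify-fix"  # previously fixed: route to bug fix verification
    return "new"             # not on the list: a genuinely new report

# Hypothetical known-issues list:
known = {"Crash on login": "open", "Typo on checkout": "fixed"}
```

This is also why the list must be kept current: a stale “open” entry suppresses reports of an issue that may have regressed, while a missing “fixed” entry forfeits a free verification pass from the community.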
Over time you may identify specific testers you would like to keep working with. Whether it’s the quality of their work, their knowledge of your product, or any other reason, you may want to make sure your Applause Team considers inviting these favorite testers to your product’s test cycles. Learn more about favorite testers here.