Automated testing with TestComplete
Transcript of Automated testing with TestComplete
What does software testing mean?
Software testing is the process of investigating an application and finding failures in it. The difference between testing and simply exploring is that testing involves comparing the application’s output to an expected standard and determining whether the application functions as expected.
So, the basic test sequence includes:
Defining the expected output.
Performing test actions.
Gathering the application output and comparing it to the expected result.
Preparing test evidence.
Creating defects for the failures found.
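The sequence above can be sketched as a minimal harness. This is a conceptual Python sketch, not TestComplete's API; all names in it are illustrative:

```python
def run_test(name, action, expected):
    """Minimal test sequence: perform the action, gather the output,
    compare it to the expected result, and record the evidence."""
    actual = action()                      # perform test actions
    passed = (actual == expected)          # compare output to expected result
    evidence = {"test": name, "expected": expected,
                "actual": actual, "passed": passed}
    if not passed:                         # create a defect for the failure
        evidence["defect"] = f"{name}: expected {expected!r}, got {actual!r}"
    return evidence

# Example: verify a simple calculation in the "application under test"
result = run_test("addition", lambda: 2 + 2, expected=4)
print(result["passed"])  # True
```

The returned dictionary plays the role of test evidence: it records what was expected, what actually happened, and, on failure, a defect description.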
Test automation is the automatic execution of software tests by a special program with little or no human interaction. Automated execution guarantees that no test action will be skipped; it relieves testers of having to repeat the same tedious steps over and over.
TestComplete provides special features for automating:
Performing test actions.
Defining test data.
Comparing expected data with test output (checkpoints).
Logging test results.
For example, it includes a special test recording feature that lets you create tests visually: you start recording, perform the needed actions against the tested application, and TestComplete automatically converts the recorded actions into a test. TestComplete also includes special dialogs and wizards that help you add comparison commands (checkpoints) to your tests.
TestComplete supports various testing types and methodologies:
unit testing, functional and GUI testing, regression testing, distributed testing
Functional tests check the interface between the application on one side and the rest of the system and users on the other. They verify that the application functions as expected.
A typical functional test consists of test commands that perform various actions such as simulating clicks and keystrokes, running test commands in a loop and verifying objects’ contents.
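A test of that shape might look like the following sketch, where a toy Counter class stands in for a GUI control and clicks are simulated in a loop. This is illustrative Python, not TestComplete code:

```python
class Counter:
    """Stand-in for a GUI control whose contents we verify."""
    def __init__(self):
        self.value = 0

    def click(self):
        # simulate a click on an "increment" button
        self.value += 1

def functional_test():
    app = Counter()
    for _ in range(5):        # run the test command in a loop
        app.click()           # simulate clicks/keystrokes
    return app.value == 5     # verify the object's contents

print(functional_test())  # True
```

In a real TestComplete test the loop body would simulate input against the tested application's windows instead of a Python object.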
TestComplete project structure
TestComplete operates with test projects. A project is a starting point for creating tests. It contains your tests, baseline data for checkpoints, information about tested applications and other items needed to perform testing. The project also defines the execution sequence of multiple tests and contains a log of all test runs since the start of the project.
Related projects can be united into a project suite that contains one or more projects. TestComplete automatically generates a project suite when you create a new project. You can also create an empty project suite and then use TestComplete’s dialogs to fill the suite with the desired project files.
TestComplete Test Object Model
In TestComplete, functional tests can be created in the form of keyword tests and scripts. Tests of both kinds can be recorded or created from scratch with the built-in editors. Keyword tests may be converted to script tests.
Keyword testing is visual, easy, and does not require a programming background.
Scripting requires understanding script commands, but gives you the ability to create more powerful and flexible tests. TestComplete supports scripting in
VBScript, JScript, DelphiScript, C++Script and C#Script
, so you can create scripts in the language you know best.
TestComplete test types
Testing types supported by TestComplete
TestComplete main window
The object structure is shown in the Object Browser panel:
TestComplete uses a tree-like model for test objects. The root node of the tree is the Sys object; its child Process objects correspond to applications running in the operating system. We use the term process rather than application because it corresponds to the concept of processes in the Windows documentation.
A process object’s name includes the name of the process executable and its index (the index is used only if several application instances are running):
The processes have child objects - windows - that correspond to top-level windows. These objects in their turn have other child window objects that correspond to controls. The window and control names depend on whether or not the test engine has access to internal methods and properties of the application under test. TestComplete works with applications of both types, but names their windows and controls in different ways.
Applications that do not provide access to their internal methods and properties are called black-box applications. The name of each window of such applications includes the window’s class name, the window’s text or title (caption) and its index. Controls are named in the same manner as windows, because in terms of the operating system, a control is just another type of a window:
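The naming scheme for black-box processes and windows can be sketched with two small helpers. This Python sketch only builds name strings in the style TestComplete displays; it is not the TestComplete API itself:

```python
def process_name(executable, index=1):
    """Process name = executable name plus index; the index matters
    only when several instances of the application are running."""
    if index == 1:
        return f'Process("{executable}")'
    return f'Process("{executable}", {index})'

def window_name(wnd_class, caption, index):
    """Black-box windows and controls are named by window class,
    caption (text or title), and index."""
    return f'Window("{wnd_class}", "{caption}", {index})'

print(process_name("notepad"))                          # Process("notepad")
print(window_name("Notepad", "Untitled - Notepad", 1))  # Window("Notepad", "Untitled - Notepad", 1)
```

Controls use the same Window(...) form as top-level windows because, to the operating system, a control is just another window.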
Applications that expose their internal objects, methods and properties to TestComplete are called white-box applications or Open Applications. They are marked with a special icon in the Object Browser.
To address windows and controls of Open Applications, TestComplete uses the names that reflect the window or control type and the name defined in the application’s sources.
A property checkpoint is a test operation that verifies that an object property has the expected value in the tested application. For example, it verifies the text in a text box or the state of a check box. This helps you check whether your tested application works correctly.
An object checkpoint is a test operation that verifies that an object’s properties contain expected values. For example, it can verify that a control in your application under test is enabled, visible on screen and located at the specified position.
A table checkpoint is a test operation that verifies that a control displaying information in tabular form contains the expected data. This verification is done by comparing the control’s actual contents with the baseline data stored in your project.
Table Checkpoint Results
During the test run, table checkpoints check the actual data of the control against the baseline data stored in your project.
If the actual and the expected data are equal, the checkpoint posts a success message to the test log.
Otherwise, the checkpoint posts an error message to the test log. The Additional Info panel reports all differences found during the verification.
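The comparison that a table checkpoint performs can be sketched as a cell-by-cell diff. This is a conceptual Python sketch of the idea, not TestComplete's implementation:

```python
def table_checkpoint(actual, baseline):
    """Compare a control's tabular contents cell by cell against
    baseline data; return (passed, list of differences found)."""
    diffs = []
    if len(actual) != len(baseline):
        diffs.append(f"row count: {len(actual)} vs {len(baseline)}")
    for r, (arow, brow) in enumerate(zip(actual, baseline)):
        for c, (a, b) in enumerate(zip(arow, brow)):
            if a != b:
                diffs.append(f"cell ({r}, {c}): {a!r} vs {b!r}")
    return (not diffs, diffs)

baseline = [["Alice", 30], ["Bob", 25]]
ok, diffs = table_checkpoint([["Alice", 30], ["Bob", 26]], baseline)
print(ok)     # False
print(diffs)  # ['cell (1, 1): 26 vs 25']
```

On a mismatch the difference list plays the role of the Additional Info panel: it reports every cell where the actual and baseline data diverge.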
A file checkpoint is a test operation that verifies that the specified file contains the expected data. This may be necessary if your application exports some data, for example, a customer list, to a file and you need to check whether the exporting procedure functions properly.
File Checkpoint Results
If the actual and the stored files are reported to be equal, the file checkpoint posts a success message to the test log.
Otherwise, the checkpoint posts an error message to the test log and verification results to the Additional Info panel. It also posts detailed file comparison results to the test log.
An XML checkpoint is a test operation that verifies the values of an XML document by comparing them with baseline values stored in your project. This may be necessary if the application under test exports data, for example, a customer list, into an XML file and you need to verify that the application generates the file correctly.
XML Checkpoint Results
If the actual data of the checked XML document and the baseline data stored in your project are equal, the checkpoint posts a success message to the test log.
Otherwise, the checkpoint posts an error message to the test log and brief verification results to the Additional Info panel. It also posts detailed verification results to the XML Checkpoint Results log panel:
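An XML comparison of this kind can be approximated with the standard library: walk both documents and compare tags, text, attributes and children. A hedged sketch of the concept, not TestComplete's actual comparison engine:

```python
import xml.etree.ElementTree as ET

def xml_checkpoint(actual_xml, baseline_xml):
    """Compare two XML documents element by element (tag, text,
    attributes and child elements); return True when they match."""
    def equal(a, b):
        if a.tag != b.tag or (a.text or "").strip() != (b.text or "").strip():
            return False
        if a.attrib != b.attrib or len(a) != len(b):
            return False
        return all(equal(ca, cb) for ca, cb in zip(a, b))
    return equal(ET.fromstring(actual_xml), ET.fromstring(baseline_xml))

baseline = "<customers><customer id='1'>Alice</customer></customers>"
print(xml_checkpoint("<customers><customer id='1'>Alice</customer></customers>", baseline))  # True
print(xml_checkpoint("<customers><customer id='2'>Alice</customer></customers>", baseline))  # False
```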
A database checkpoint is a test operation that verifies data retrieved from a database by comparing it with the baseline copy stored in your project. That may be needed, for example, when you test an application that modifies a database and want to verify that the appropriate tables are updated correctly.
Database Checkpoint Results
If the actual data retrieved from the database equals the baseline data stored in your project, the database checkpoint posts a success message to the test log.
Otherwise, the checkpoint fails and posts an error message to the test log. The Additional Info panel reports differences found during verification.
A region checkpoint is a test operation that verifies that an application’s window or control, or an arbitrary area inside a window or control, is displayed correctly. Verification is done by comparing the actual image of the window, control or area in your application with the baseline image stored in your test project.
Region Checkpoint Results
During the test run, region checkpoints perform a pixel-by-pixel comparison of actual images of objects and areas in the application under test with baseline images. If images match, the checkpoint logs a success message. Otherwise, it logs an error message. The Additional Info panel of the test log reports differences (in size, color or pixels) found between images. The Picture panel of the test log displays the expected and the actual images.
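The pixel-by-pixel comparison can be sketched over images represented as 2D lists of pixel values. A conceptual Python sketch only; real region checkpoints also handle color tolerance and masking:

```python
def region_checkpoint(actual, baseline):
    """Pixel-by-pixel comparison of two images represented as 2D
    lists of pixel values; returns (passed, differing positions)."""
    if len(actual) != len(baseline) or any(
            len(a) != len(b) for a, b in zip(actual, baseline)):
        return False, ["size mismatch"]          # images differ in size
    diffs = [(y, x)
             for y, (arow, brow) in enumerate(zip(actual, baseline))
             for x, (a, b) in enumerate(zip(arow, brow)) if a != b]
    return (not diffs, diffs)

baseline = [[0, 0], [0, 255]]
print(region_checkpoint([[0, 0], [0, 255]], baseline))    # (True, [])
print(region_checkpoint([[0, 0], [255, 255]], baseline))  # (False, [(1, 0)])
```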
A clipboard checkpoint enables you to verify the current text contents of the clipboard.
To create a manual checkpoint, write script code that will display verification instructions on screen. For instance, you can create a user form with text and two buttons: Success and Failure. A tester will read the instructions, perform the check and then click the Success or Failure button to signal TestComplete about the result.
You can either create the user form manually or use the manual checkpoint functionality provided by TestComplete (see below). The latter includes special menu items, input and output forms, so you will concentrate on writing verification instructions and all other tasks will be performed automatically.
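A console-based stand-in for such a form can be sketched as follows; a real manual checkpoint would use a user form with Success and Failure buttons, but the flow is the same (a hypothetical sketch, not TestComplete's API):

```python
def manual_checkpoint(instructions, get_response=input):
    """Show verification instructions and let the tester report the
    result ('s' for Success, 'f' for Failure), mimicking the
    Success/Failure buttons of a manual-checkpoint form."""
    print(instructions)
    answer = get_response("Did the check succeed? [s/f] ").strip().lower()
    return answer == "s"

# The tester's answer can be injected, e.g. for unattended demo runs:
print(manual_checkpoint("Verify the About box shows version 1.0.",
                        get_response=lambda prompt: "s"))  # True
```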
web service checkpoint
A web service checkpoint is a test operation that verifies that a SOAP response returned by a tested web service contains the appropriate data. Verification is done by comparing the actual SOAP response with the baseline copy of the response stored in your project. This helps you verify that the methods of the web service under test work correctly.
Web Service Checkpoint Results
If the response obtained from the web service equals the baseline copy stored in your project, the checkpoint posts a success message to the test log.
Otherwise, the checkpoint fails and posts an error message to the test log. The Additional Info panel contains brief verification results. Detailed results are posted to the XML Checkpoint Results panel. The panel displays differences found between the expected and the obtained SOAP responses:
web accessibility checkpoint
A web accessibility checkpoint is a test operation that verifies that a web page under test conforms to the Web Content Accessibility Guidelines and Section 508 standards by performing a number of checks against the page and its elements. For instance, it can verify that all IMG elements contain the ALT, WIDTH and HEIGHT attributes, or check that all links on the page under test are valid.
Web Accessibility Checkpoint Results
If all verification actions specified by the checkpoint settings pass successfully, the checkpoint posts a success message to the test log.
If any verification fails, the checkpoint logs an error message to the test log. The Additional Info panel reports elements that have not passed verification.
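One of the checks mentioned above, that every IMG element carries ALT, WIDTH and HEIGHT, can be sketched with Python's standard HTML parser (a conceptual sketch, not TestComplete's checker):

```python
from html.parser import HTMLParser

class ImgAttributeChecker(HTMLParser):
    """Collect IMG elements that are missing any of the ALT, WIDTH
    or HEIGHT attributes required by the accessibility check."""
    REQUIRED = {"alt", "width", "height"}

    def __init__(self):
        super().__init__()
        self.failures = []          # missing attributes per failing IMG

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            missing = self.REQUIRED - {name for name, _ in attrs}
            if missing:
                self.failures.append(sorted(missing))

checker = ImgAttributeChecker()
checker.feed('<img src="a.png" alt="logo" width="10" height="10">'
             '<img src="b.png">')
print(checker.failures)  # [['alt', 'height', 'width']]
```

The failures list corresponds to what the Additional Info panel would report: the elements that have not passed verification and why.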
web comparison checkpoint
A web comparison checkpoint is a test operation that verifies that the web page under test contains the correct data or has the correct structure. Verification is done by comparing the actual contents of the web page with the baseline copy of the page stored in your project.
Web Comparison Checkpoint Results
Each web comparison checkpoint can perform several verification actions that are specified by the checkpoint settings. If all verifications pass, the checkpoint posts a success message to the test log.
If any verification fails, the checkpoint fails and posts an error message to the test log. The Additional Info panel reports differences found during the verification process.
Senior Software Test Automation Engineer at EPAM Systems, Kiev, Ukraine
A checkpoint is a comparison (or verification) operation that is performed during testing. These operations are an essential part of the testing process as they control whether the tested application functions properly.
Property Checkpoint Results
During the test run, property checkpoints compare the actual and expected property values and post the comparison results to the test log. If the property values match, the checkpoint logs a success message; otherwise, it logs an error message. The Additional Info panel displays the checkpoint parameters and the difference between the expected and actual property values (if any).
Object Checkpoint Results
During the test run, object checkpoints check actual values of object properties against stored values and post check results to the test log. If the objects are identical (if the Store data of the selected object and its children option was enabled, TestComplete also checks all child objects), the checkpoint returns True and posts a successful checkpoint message () to the test log.
Otherwise, the checkpoint returns False and posts an error message () to the test log. The Additional Info panel displays a table containing properties that fail verification with their actual and expected values.
Stores is a list of items that are kept in TestComplete projects for future comparison and for creating checkpoints. By default, the items are saved to the folder specified for each test, but you can save them anywhere. The Stores project item supports the following item types:
Image files (BMP, JPEG, PNG, TIFF, GIF or ICO).
Collections of properties of one object (these collections are stored in XML format).
General files (text or data).
Special elements for storing and comparing XML documents.
Data of database tables and queries.
Data retrieved from controls that display information in a tabular form.
Special items used to compare and verify web pages.
Using DDT Drivers
By using the DDT driver objects, you can easily extract data stored in database tables accessible via Microsoft ADO, in Excel sheets, or in files holding comma-separated values (CSV files).
Use the ADODriver method to create a driver object for a recordset that can be accessed via Microsoft’s ADO DB (the recordset can be a table or result of an SQL query). You can then use this driver to iterate through the recordset, obtain values stored in its fields and use them in your data-driven tests. The names and order of the driver columns coincide with the order and names of the recordset columns.
ConnectionString [in], Required, String - specifies the connection string used to connect to the database holding the desired recordset.
TableName [in], Required, String - specifies the name of the desired table or a SELECT SQL statement that will be executed to obtain the recordset.
Result - an ADODriver object that provides access to data stored in a recordset.
Use the CSVDriver method to create a DDT driver for a CSV file that contains values separated with the predefined delimiter character. You can use this driver to iterate through the lines of the file, obtain values stored in these lines and use these values in your data-driven tests. The driver assumes the first line of the file holds the column names.
FileName [in], Required, String - the fully qualified name of the desired CSV file.
Result - a DDTDriver object that provides access to data stored in the specified CSV file.
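The behavior of such a driver, first line holds the column names, then you iterate row by row, can be sketched in a few lines of Python. This is a conceptual stand-in built on the csv module, not TestComplete's DDT.CSVDriver:

```python
import csv
import io

class CsvDriver:
    """Minimal DDT-style driver: the first line of the file holds
    the column names; eof()/value()/next() iterate the data rows."""
    def __init__(self, lines, delimiter=","):
        rows = list(csv.reader(lines, delimiter=delimiter))
        self.columns, self.rows = rows[0], rows[1:]
        self.index = 0

    def eof(self):
        return self.index >= len(self.rows)

    def value(self, column):
        return self.rows[self.index][self.columns.index(column)]

    def next(self):
        self.index += 1

driver = CsvDriver(io.StringIO("name,age\nAlice,30\nBob,25\n"))
names = []
while not driver.eof():          # typical data-driven test loop
    names.append(driver.value("name"))
    driver.next()
print(names)  # ['Alice', 'Bob']
```

In a data-driven test, the body of the while loop would feed each row's values into the test actions instead of collecting them.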
Use the ExcelDriver method to create a DDT driver for a sheet of an Excel document. You can use this driver to iterate through the rows of the sheet, obtain values stored in its cells and use these values in your data-driven tests.
FileName [in], Required, String - string that holds the fully qualified name of the Excel document.
Sheet [in], Required, String - string that specifies the name of the desired sheet.
UseACEDriver [in], Optional, Boolean, default value False - if this parameter is True, TestComplete uses the ACE driver to connect to the specified Excel sheet. If the parameter is False, TestComplete connects to the sheet via the Microsoft Excel ODBC driver.
Result - a DDTDriver object that provides access to data stored in an Excel sheet.
By default, the test log is automatically opened at the end of the test run session. Additionally, if the Show log on pause option is enabled, TestComplete displays the temporary test log when the test is paused. You can also open logs of previous test runs from the Project Explorer panel, which lists all available logs for the currently opened project suite and its projects.
Navigate to the Next Item
- Selects the next message of the specified type in the test log. For more information, see Navigating Within the Log Tree.
Export Test Results
- Exports test results to an external file. See Exporting Test Results (mht, html).
Export Log via Script Extensions | Post Bug to JIRA
- Invokes the Add Bug to JIRA Database dialog that allows you to add information on found errors to the Atlassian JIRA database directly from TestComplete’s test log.
Export Log via Script Extensions | Post Bug to OnTime
- Invokes the Add Defect Info to OnTime Database dialog that allows you to add information on found errors to the Axosoft OnTime database directly from TestComplete’s test log.
Export Log via Script Extensions | Post Defect to ALMComplete
- Invokes the ALMComplete Integration dialog that allows you to create bug reports and send them to your SmartBear ALMComplete database directly from TestComplete’s test log.
- Creates an item in an issue-tracking system.
Select Log Panel
- Invokes the Select Log Panel dialog that helps you choose the desired test log panel.
Move Focus to Parent Level
- Navigates to the parent item of the log item that is currently selected on the Log Items page.
Test Log's Toolbar
For each project, TestComplete keeps a complete test log. The test log can be displayed in the Log panel. For instance, TestComplete can generate error messages to pinpoint errors as well as generate messages about simulated user actions such as keystrokes or mouse clicks. Your keyword tests or scripts may post any kind of message. The topics of this section describe the techniques for posting messages, images and files to the log.
Users are able to:
Post Messages to the Log (Action, Checkpoint, Message, Warning and Error)
Post Images to the Log
Post Files to the Log
Collect Call Stack Information for Log Messages
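The capabilities above can be summarized as a tiny typed log collector. A hedged Python sketch of the concept; TestComplete's own Log object has a different interface:

```python
MESSAGE_TYPES = ("action", "checkpoint", "message", "warning", "error")

class TestLog:
    """Collects typed messages with optional image/file attachments,
    in the spirit of a test log's message stream."""
    def __init__(self):
        self.entries = []

    def post(self, kind, text, attachment=None):
        if kind not in MESSAGE_TYPES:
            raise ValueError(f"unknown message type: {kind}")
        self.entries.append({"type": kind, "text": text,
                             "attachment": attachment})

    def errors(self):
        return [e for e in self.entries if e["type"] == "error"]

log = TestLog()
log.post("action", "Clicked the OK button")
log.post("error", "Window 'Main' not found", attachment="screen.png")
print(len(log.errors()))  # 1
```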
Table of contents:
Manual vs Automated Testing
TestComplete Test Types
TestComplete Projects and Project Items
TestComplete User Interface
TestComplete Test Object Model
Checkpoints and Stores