This page contains information to help you get started with AutoTest. If you are eager to begin, download the program via the link provided in the section Obtaining AutoTest binary for execution below.
AutoTest is a test management, test execution, and test result reporting
application written in Java. It supports the following features:
- Intuitive, easy-to-use GUI.
- Test execution framework agnostic: can be customized to fit virtually
any test automation framework.
- Grouping of test cases into lists and categories, with an arbitrary
number of hierarchies, to accommodate any test strategy.
- Ability to export execution results in .html, .xls, or .txt format.
- Management and storage of test lists, categories, and execution results for future
reference.
- Live tracking of the progress of the test execution via a real-time
console.
- Scheduling of test execution to start at a given date and time in the future.
Obtaining AutoTest
There are a couple of ways to obtain a copy of AutoTest. Which way to
follow depends on your objectives. If you are interested just in using
the tool for executing test cases, you can just get the binary. If you
are interested in contributing to further development of the tool, you
can get the source code from an online SVN repository.
Obtaining AutoTest binary for execution
Download the latest executable file from SourceForge:
Once you download the file, move it to a directory of your choice and
double-click it. The AutoTest workspace should appear. Note that
since the program is written in Java, you will need a relatively recent
version of the Java VM from Oracle. If you don't have Java on your
machine, or if you are unsure, click on this link and follow the
on-screen instructions.
Obtaining AutoTest source code as an Eclipse project
Because AutoTest is provided as an Eclipse project, it is strongly
recommended to install Eclipse before accessing the source code (if you
don't have it, click here and choose Eclipse Classic).
You can check out the code from SourceForge using a
Subversion client (for example, TortoiseSVN or the command-line Subversion client provided here).
You can check out the code from the following URIs:
You can also browse the trunk here.
Using AutoTest
a. Getting familiar with the application
The figure above illustrates a typical AutoTest workspace.
On the left-hand side of the application is the test planning (management) view.
From here, the user can add new test
categories and new test
case lists. In the context of
AutoTest terminology, test categories and test lists are known as test
assets. Test categories are
folders which can hold one or more test case lists. They can
potentially also have sub-folders, i.e. more test categories, if the
test case organizational strategy is more complex. Test lists are lists
of test cases. Note that AutoTest treats every test list as a single
execution unit, i.e. all test cases in a list are executed in every
execution run. There are a few basic principles to keep in mind when
working with test management assets:
- Test Categories contain no
test cases and are used to group test lists and/or other test
categories.
- Test lists can contain zero or more test cases.
- Test case lists can be grouped under test categories but the opposite
is not possible.
- It is also not possible for test lists to be
grouped under other test lists.
To add a new category, click the New Category button; AutoTest will
prompt for a category name. Clicking on the New List button will bring
up a prompt asking for a test list name. Note that in the current
version, AutoTest does not support duplicate names for categories
or lists (i.e. category and test list names must be unique).
The test list or category is added under the test category (or the root
"Project" node) currently selected. Therefore, before adding a new test
asset, make sure that you select the node in the tree which you would
like to be the "parent" of your newly created asset.
It is also possible to remove test assets: select the asset for
removal, then click the Remove Assets button. This will remove
the currently selected asset, as well as all of its children, i.e. all
the assets grouped under it.
In the middle of the AutoTest workspace is the test case list view.
This is a list containing the names of zero or more test scripts that
can potentially be executed. Note that this view works in tandem with
the test management view: test cases can be added to a test case list
only if a test list is selected in the test management view.
There are two ways to add test cases: either by clicking on the
Add ... button, or by dragging and dropping test cases into the list
view. The advantage of dragging and dropping is that you can add
multiple test cases with one action.
It is also possible to remove test cases from a list. Select the test
case you want to remove and click the Remove Cases button.
Also note that the order in which the test cases appear on the list is
the order in which they will be executed. Execution starts by pressing
the Execute button.
On the right part of the AutoTest workspace is the test
execution results view as
well as the execution
console. These components
are used for reviewing the results of an execution and viewing the
current status of an ongoing execution respectively.
The test execution results view is a table, wherein every row
corresponds to one test case. There are currently six columns per row,
each containing specific information related to the execution of a test
case.
Test Case Number: An integer, incremented by 1 for each entry,
indicating the position of the test case in the list.
Test Case Name: The name of the test case, as it also appears in the
test case list view. This is typically the name of the executable test
script.
RQM Identifier: An RQM record identifier to be used when importing
execution results to RQM. This is work in progress (hence the "TODO"
statement).
Execution time: How much time was spent executing the test case (in
seconds, with millisecond accuracy).
Execution Status: The verdict on the test case. It can be PASSED if the
test case passes, FAILED if the execution was a failure, or Not Executed
if the test case has not yet been executed. In certain cases where the
result of the execution cannot be determined, this cell can have the
UNKNOWN value as well.
Reason for failure: In case of a FAILED execution, AutoTest will
attempt to list the reason for failure here.
When in execution mode, the Execution Progress bar shows the
percentage of test cases executed out of the total number of test
cases on the current test case list. At any time, it is possible
to export execution results using the Export Results ... button.
Clicking this button brings up a new window that prompts the user
to select an export format, as well as a folder path to export the
execution reports to.
There is a choice of three formats:
- An .html format option, which outputs the execution results in .html
format, including detailed test execution logs.
- An .xls format option, which outputs the execution results in .xls
format.
- A .txt format option, which outputs the results in an ASCII text file.
When choosing the path, remember to specify an absolute path ending
with a backslash ("\"), for example C:\Users\thanasis\Desktop\tests\
b. Configuring AutoTest
The Configuration window for AutoTest can be found under the
Application menu. Typically, this configuration is done only once,
after launching the application for the first time. The configuration
information is stored in a file called tte.cfg, which is read every
time the application starts. First, the user has to choose between the
Simple Configuration and Custom Configuration options. AutoTest uses
the shell of the Operating System (O/S) to execute test cases, and the
choice between the options depends on the underlying test execution
framework.
Choosing a configuration scheme
If your test scripts are executed simply by passing them as arguments
to a test execution program, then the simple configuration is the ideal
option. For example, assume a scenario where .tcl scripts are executed
by a Tcl interpreter. Such scripts can be executed manually from the
O/S shell in the following way:
tclsh test.tcl
In AutoTest, all that is required is to define the location of the
binary used for execution in the corresponding text box.
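In effect, the simple configuration amounts to AutoTest invoking the configured binary once for every test case on the list. The following is a minimal sketch of that behavior under that assumption, using echo to stand in for the real execution binary so it can run anywhere:

```shell
# "echo tclsh" stands in for the configured execution binary so this
# sketch is self-contained; in a real setup BINARY would be, for
# example, /usr/bin/tclsh.
BINARY="echo tclsh"
for script in testCase1.tcl testCase2.tcl; do
    $BINARY "$script"    # AutoTest effectively runs: <binary> <test case>
done
# prints:
# tclsh testCase1.tcl
# tclsh testCase2.tcl
```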
On the other hand, if test execution is more complex, or you want to
customize the execution by generating custom names for test reports,
you can choose the custom configuration option. With this option, it is
possible to define a pattern with which test execution will take
place. Click the Create Expression ... button to bring up the
Expression Configurator window, which helps to define a test execution
pattern.
Within the Expression Configurator window, the pattern is built in the
text box at the top, with the Expression label beside it. This
execution pattern is a combination of preset variables together with
manually written text. The preset variables are a convenient way to
express statements which change for every test case executed from the
list, or even configuration settings. There are currently three
variables which can be used inside a pattern:
- Current test case (incl. path): The current test case from the test
case list, including the absolute path to this test case.
- Log Location: The location of the log as specified in the Log Storage
panel in the Configuration dialog (see Choosing a local test execution
log repository below).
- Add a number increment: An integer which will be appended to the
pattern, and which is increased by 1 for every new test case executed
from the list.
You can use the provided buttons to insert these variables, as well as
other special characters, into the pattern string at the top of the
window. Notice that when inserted, these statements are enclosed
between pipe and underscore characters ("|_" and "_|").
This way, AutoTest understands that the statement enclosed in these
characters is a variable and has to be substituted with a concrete
value.
As an example, assume you want to build a custom Tcl execution
expression for a test framework running in Linux, which should execute
a cleanup.sh script after the execution of each test case. Presumably,
this script will prepare the environment for the next test case to be
executed. Also, the logs should be stored with a certain name, under a
custom directory. Then you can create the following pattern:
Assuming that the directory where the test cases are stored is
/home/thanasis/testCases, and that the first test case on the test case
list is called "testCase1", the pattern above could correspond to the
test execution statement below:
... and execution will continue until all the listed test cases are
executed.
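As a concrete sketch of the substitution mechanism, consider a hypothetical pattern (not the one from the original figure; only the |_..._| delimiter convention is taken from the text above). Its expansion for the first test case can be simulated from a shell with sed:

```shell
# Hypothetical pattern using the |_..._| variable delimiters described
# above; the exact variable labels inside the delimiters are assumed.
PATTERN='tclsh |_Current test case (incl. path)_| && ./cleanup.sh'

# Simulate the substitution AutoTest would perform for the first test
# case on the list:
TESTCASE='/home/thanasis/testCases/testCase1'
echo "$PATTERN" | sed "s#|_Current test case (incl. path)_|#$TESTCASE#"
# prints: tclsh /home/thanasis/testCases/testCase1 && ./cleanup.sh
```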
Choosing a local test execution log repository
Typically, logs are reported on the standard output, but AutoTest
gives you the option of redirecting this output to a file. You can
define a local storage location in the text box under the Log Storage
panel in the Execution Configurator window. Note that you don't need
to define a location for logs to be reported (as discussed here),
as AutoTest saves the logs in volatile memory, and then accesses this
memory when exporting test execution reports. However, these logs will
not be preserved if you quit and restart the application. If you want
to keep a local copy of the logs for reference (or if you don't plan
to use the built-in automated test reporting function), you can use
this option.
Identifying test results
AutoTest reads the standard output produced by the test execution
framework line-by-line. In order to determine which test cases pass and
which fail, AutoTest needs to know what part of a line would constitute
a passing or a failing test case. This can be specified using a regular
expression in the Regular
expression-based
execution verification panel
of the Execution
Configurator window. The
notation of the regular expression is the standard Java notation.
You can read more about regular expressions here, and then build your
own. A handy tool to experiment with is the Regexp Editor, which can
be found online at http://myregexp.com/.
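For instance, suppose your framework prints a verdict line for every test case (the log format below is an assumption for illustration, not AutoTest output). A Java-style regular expression matching "Verdict: PASSED" would then identify passing test cases; since such a simple pattern is valid for grep -E as well, you can prototype it from a shell first:

```shell
# Hypothetical framework output -- the exact line format depends on
# your test execution framework.
printf 'Running testCase1...\nVerdict: PASSED\n' >  run.log
printf 'Running testCase2...\nVerdict: FAILED\n' >> run.log

# This pattern works both as a Java regular expression and with
# grep -E, so it can be tried here before entering it in AutoTest.
grep -E 'Verdict: PASSED' run.log    # prints: Verdict: PASSED
grep -Ec 'Verdict: FAILED' run.log   # prints: 1
```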
Storing the current test management view
You can store the current test management view (i.e. test categories,
test lists, together with test execution results), using the Save
function in the Application menu
(Application,
Save Data Profile).
In the dialog that pops up, enter a filename, with the extension ".xml"
(since XML is used to store the data).
Then you can load the file back at any time by choosing Application,
Load Test Profile.
Note that saving will not store the execution logs; you can specify
where these logs should be stored in the Execution Configurator window.