3. The Generic Test Automation Architecture Flashcards

(70 cards)

1
Q

3.1 Introduction to gTAA

gTAA presents:

A
  • Layers
  • Components
  • Interfaces

which are then further refined into the concrete TAA

2
Q

3.1 Introduction to gTAA

The gTAA allows a structured and modular approach
to building a TAS by:

A
  • Defining the concept space, layers, services and interfaces
  • Supporting simplified components for the effective development of TA
  • Re-using test automation components for different or evolving TASs
  • Easing the maintenance and evolution of TASs
  • Defining the essential features for a user of a TAS
3
Q

3.1 Introduction to gTAA

TAA complies with the following principles:

A
  • Single responsibility - every TAS component must have a single responsibility, i.e., each component is in charge of exactly one thing
  • Extension (open/closed principle by B. Meyer) - every TAS component must be open for extension but closed for modification
  • Replacement (substitution principle by B. Liskov) - every TAS component must be replaceable without affecting the behavior of the TAS
  • Component segregation (interface segregation principle by R.C. Martin) - it is better to have more specific components than one general, multi-purpose component
  • Dependency inversion - the components of a TAS must depend on abstractions rather than on low-level details
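These principles can be sketched in code. The following is a minimal, hypothetical Python example (the names `SutDriver`, `RestApiDriver` and `TestExecutor` are invented for illustration, not from the syllabus) showing dependency inversion and replacement: the high-level executor depends only on an abstraction, so any concrete driver can be substituted without changing it.

```python
from abc import ABC, abstractmethod

# Abstraction the high-level component depends on (dependency inversion):
# the executor never references a concrete SUT driver directly.
class SutDriver(ABC):
    @abstractmethod
    def invoke(self, action: str) -> str: ...

# Concrete, replaceable low-level component (replacement principle):
# any SutDriver implementation can stand in without changing TestExecutor.
class RestApiDriver(SutDriver):
    def invoke(self, action: str) -> str:
        return f"REST call for '{action}'"

class TestExecutor:
    """High-level TAS component with a single responsibility: run test steps."""
    def __init__(self, driver: SutDriver):
        self.driver = driver  # injected abstraction, not a concrete class

    def run(self, steps):
        return [self.driver.invoke(step) for step in steps]

executor = TestExecutor(RestApiDriver())
print(executor.run(["login", "logout"]))
```

Swapping `RestApiDriver` for, say, a GUI driver leaves `TestExecutor` untouched, which is exactly what the replacement and dependency inversion principles aim for.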
4
Q

3.1.1 Overview of the gTAA

The gTAA is structured into horizontal layers:

A
  • Test generation
  • Test definition
  • Test execution
  • Test adaptation

It also has interfaces for:

  • Project management
  • Configuration management
  • Test management
5
Q

3.1.2 Test Generation Layer

Test Generation Layer

A

Consists of tool support for the following:

  • Manually designing test cases
  • Developing, capturing or deriving test data
  • Automatically generating test cases from models that define the SUT and/or its environment

Components in this layer are used to:

  • Edit and navigate test suite structures
  • Relate test cases to test objectives or SUT requirements
  • Document the test design

For automated test generation the following capabilities may also be included:

  • Ability to model the SUT, its environment, and/or test system
  • Ability to define test directives and to configure/parametrize test generation algorithms
  • Ability to trace the generated tests back to the model (elements)
6
Q

3.1.3 Test Definition Layer

Test Definition Layer

A

Consists of tool support for the following:

  • Specifying test cases (at a high and/or low level)
  • Defining test data for low-level test cases
  • Specifying test procedures for a test case or a set of test cases
  • Defining test scripts for the execution of the test cases
  • Providing access to test libraries as needed

The components in this layer are used to:

  • Partition/constrain, parametrize or instantiate test data
  • Specify test sequences or fully-fledged test behaviors, to parametrize and/or to group them
  • Document the test data, test cases and/or test procedures
7
Q

3.1.4 Test Execution Layer

Test Execution Layer

A

Consists of tool support for the following:

  • Executing test cases automatically
  • Logging the test case executions
  • Reporting the test results

This layer may consist of components that provide the following capabilities:

  • Set up and tear down the SUT for test execution
  • Set up and tear down test suites
  • Configure and parametrize the test setup
  • Interpret both test data and test cases and transform them into executable scripts
  • Instrument the test system and/or the SUT for logging of test execution and/or fault injection
  • Analyze the SUT responses during test execution to steer subsequent test runs
  • Validate SUT responses for automated test case execution results
8
Q

3.1.5 Test Adaptation Layer

Test Adaptation Layer

A

Consists of tool support for the following:

  • Controlling the test harness
  • Interacting with the SUT
  • Monitoring the SUT
  • Simulating or emulating the SUT environment

The test adaptation layer provides the following functionality:

  • Mediating between the technology-neutral test definitions and specific technology requirements of the SUT and test devices
  • Applying different technology-specific adaptors to interact with the SUT
  • Distributing the test execution across multiple test devices/test interfaces or executing tests locally
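The mediating role of the adaptation layer can be sketched as a simple dispatcher. This is an illustrative Python sketch, not a real framework: `ApiAdaptor`, `GuiAdaptor` and `AdaptationLayer` are hypothetical names, and the "calls" are just strings standing in for real technology-specific interactions.

```python
# Technology-specific adaptors: each translates a neutral test step
# into one concrete interaction style with the SUT.
class ApiAdaptor:
    def send(self, step):
        return f"HTTP POST /{step}"

class GuiAdaptor:
    def send(self, step):
        return f"click element '{step}'"

class AdaptationLayer:
    """Mediates between technology-neutral test definitions and SUT technologies."""
    def __init__(self):
        self.adaptors = {"api": ApiAdaptor(), "gui": GuiAdaptor()}

    def execute(self, technology, step):
        # The test definition stays the same; only the adaptor differs.
        return self.adaptors[technology].send(step)

layer = AdaptationLayer()
print(layer.execute("api", "login"))
print(layer.execute("gui", "login"))
```

The same neutral step ("login") is carried out via whichever adaptor matches the SUT's interface technology.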
9
Q

3.1.6 Configuration Management of a TAS

Configuration Management of a TAS

A

May need to include:

  • Test models
  • Test definitions/specifications including test data, test cases and components
  • Test scripts
  • Test execution engines and supplementary tools and components
  • Test adaptors for the SUT
  • Simulators and emulators for the SUT environment
  • Test results and reports
10
Q

3.1.7 Project Management of a TAS

Project Management of a TAS

A
  • The TAE needs to perform tasks for all phases of the SDLC
  • The environment of the TAS should be designed such that information (metrics) can be easily extracted or automatically reported
11
Q

3.1.8 TAS Support for Test Management

Support for Test Management

A
  • A TAS must support the test management for the SUT
  • Test reports, test logs and test results need to be easily extracted or automatically provided to the test management of the SUT
12
Q

3.2. TAA Design | 3.2.1 Introduction to TAA Design

Principal activities required to design TAA

A
  • Capture requirements needed to define an appropriate TAA
  • Compare and contrast different design/architecture approaches
  • Identify areas where abstraction can deliver benefits
  • Understand SUT technologies and how these interconnect with the TAS
  • Understand the SUT environment
  • Assess the time and complexity for a given testware architecture implementation
  • Assess the ease of use for a given testware architecture implementation
13
Q

3.2.1 Introduction to TAA Design

The requirements for a TA approach need to consider the following:

A
  • Which activity or phase of the test process should be automated
  • Which test level should be supported
  • Which type of test should be supported
  • Which test role should be supported
  • Which software product, software product line, software product family should be supported
  • Which SUT technologies should be supported
14
Q

3.2.1 Introduction to TAA Design

Different Design/Architecture Approach

  • Consideration for the test generation layer:
A
  • selection of manual or automated test generation
  • selection of (for example) requirements-based, data-based, scenario-based or behavior-based test generation
  • selection of test generation strategies
  • choice of the test selection strategy
15
Q

3.2.1 Introduction to TAA Design

Different Design/Architecture Approach

  • Consideration for the test definition layer:
A
  • selection of data-driven, keyword-driven, pattern-based or model-driven test definition
  • selection of notation for test definition
  • selection of style guides and guidelines for the definition of high quality tests
  • selection of test case repositories
16
Q

3.2.1 Introduction to TAA Design

Different Design/Architecture Approach

  • Consideration for the test execution layer:
A
  • selection of test execution tool
  • selection of interpretation or compilation approach for implementing test procedures
  • selection of implementation technology for implementing test procedures
  • selection of helper libraries to ease test execution
17
Q

3.2.1 Introduction to TAA Design

Different Design/Architecture Approach

  • Consideration for the test adaptation layer:
A
  • selection of test interfaces to the SUT
  • selection of tools to stimulate and observe the test interfaces
  • selection of tools to monitor the SUT during test execution
  • selection of tools to trace test execution
18
Q

3.2.1 Introduction to TAA Design

Benefits of Abstraction

Part I

A
  • Abstraction in TAA enables technology independence in that the same test suite can be used in different test environments and on different target technologies
  • The portability of test artifacts is increased
  • Abstraction improves maintainability and adaptability to new or evolving SUT technologies
  • Abstraction helps to make the TAA more accessible to non-technicians, as test suites can be documented and explained at a higher level
  • This improves readability and understandability
19
Q

3.2.1 Introduction to TAA Design

Benefits of Abstraction:

Part II

A
  • The TAE must be aware that there are trade-offs between sophisticated and straightforward implementations of a TAA with respect to overall functionality, maintainability, and expandability
  • A decision on which abstraction to use in a TAA needs to take into account these trade-offs
  • The more abstraction is used in TAA, the more flexible it is with respect to further evolution or transitioning to new approaches or technologies
  • This comes at the cost of larger initial investments, but can pay off in the long run
  • It may also lead to lower performance of the TAS
20
Q

3.2.1 Introduction to TAA Design

Benefits of Abstraction:

Part III

A

TAE provides inputs to the ROI analysis by providing technical evaluations and comparisons of different test automation architectures and approaches with respect to:

  • timing
  • cost
  • efforts
  • benefits
21
Q

3.2.1 Introduction to TAA Design

Interconnection of SUT Technologies with TAS

Part I

A

The access to the test interfaces of the SUT is central to any automated test execution, and it can be available at the following levels:

  • software level
  • API level
  • protocol level
  • service level
22
Q

3.2.1 Introduction to TAA Design

Interconnection of SUT Technologies with TAS

Part II

A

Paradigms of interaction (whenever the TAS and SUT are separated by APIs, protocols or services) include the following:

  • event-driven paradigm, which drives the interaction via events being exchanged on an event bus
  • client-server paradigm, which drives the interaction via service invocation from service requestors to service provider
  • peer-to-peer paradigm, which drives the interaction via service invocation from either peer
23
Q

3.2.1 Introduction to TAA Design

SUT Environment

A

SUT can be:

  • SUT as standalone software
  • SUT as software that works in relation to other:
    - software (systems of systems)
    - hardware (e.g. embedded systems)
    - environmental components
  • A TAS simulates or emulates the SUT environment as part of an automated test setup
24
Q

3.2.1 Introduction to TAA Design

Time and complexity of TAS

A

Methods for estimations and examples include the following:

  • analogy-based estimation such as function points, three-point estimation, Wideband Delphi and expert estimation
  • estimation by use of work breakdown structures such as those found in management software or project templates
  • parametric estimation such as Constructive Cost Model (COCOMO)
  • size-based estimation such as Function Point Analysis, Story Point Analysis or Use Case Analysis
  • group estimations such as Planning Poker
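As a worked example of parametric estimation, the basic COCOMO model computes effort from code size alone. The sketch below uses the published organic-mode coefficients (a = 2.4, b = 1.05, effort in person-months); treating a TAS as a 10 KLOC project is an invented illustration, not a syllabus figure.

```python
# Basic COCOMO (organic mode) as an example of parametric estimation:
# effort in person-months = a * KLOC**b, with Boehm's organic-mode
# coefficients a = 2.4, b = 1.05.
def cocomo_effort(kloc, a=2.4, b=1.05):
    return a * kloc ** b

# Hypothetical example: estimating a 10 KLOC test automation framework.
print(round(cocomo_effort(10), 1))  # → 26.9 person-months
```

Size-based techniques such as Function Point Analysis substitute a size measure other than KLOC, but the estimation idea is the same.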
25
# 3.2.1 Introduction to TAA Design | TAS and Ease of Use

Usability issues for a TAS include, but are not limited to:

* tester-oriented design
* ease of use of the TAS
* TAS support for other roles in software development, quality assurance, and project management
* effective organization, navigation, and search in/with the TAS
* useful documentation, manuals, and help text for the TAS
* practical reporting by and about the TAS
* iterative design to address TAS feedback and empirical insights
26
# 3.2.2 Approaches for Automating Test Cases

* TAE implements test cases **directly** into automated test scripts. This option is the least recommended as it lacks abstraction and increases the maintenance load
* TAE designs **test procedures** and **transforms** them into automated test scripts. This option has abstraction but lacks automation to generate test scripts
* TAE uses a **tool to translate** test procedures into automated test scripts. This option combines both abstraction and automated script generation
* TAE uses a tool that **generates automated procedures** and/or **translates** the test scripts directly from models. This option has the highest degree of automation
27
# 3.2.2 Approaches for Automating Test Cases | Capture/Playback Approach | Part I

* Tools are used to capture interactions with the SUT while performing the sequence of actions as defined by a test procedure
* Inputs are captured; outputs may also be recorded for later checks

During the replay of events, there are various manual and automated output checking possibilities:

* **Manual**: the tester has to watch the SUT outputs for anomalies
* **Complete**: all system outputs that were recorded during capture must be reproduced by the SUT
* **Exact**: all system outputs that were recorded during capture must be reproduced by the SUT to the level of detail of the recording
* **Checkpoints**: only selected system outputs are checked at certain points for specified values
28
# 3.2.2 Approaches for Automating Test Cases | Capture/Playback Approach | Part II

**Pros:**

* The capture/playback approach can be used for SUTs at the GUI and/or API level
* Initially, it is easy to set up and use

**Cons:**

* Capture/playback scripts are hard to maintain and evolve because the captured SUT execution depends strongly on the SUT version from which the capture was taken
* Implementation of the test cases (scripts) can only start when the SUT is available
29
# 3.2.2 Approaches for Automating Test Cases | Linear Scripting | Part I

* Starts with manual test procedures
* Each test is run manually while the test tool records the sequence of actions and in some cases captures the visible output from the SUT to the screen
* Recorded scripts may be edited to improve readability or add further checks using the scripting language of the tool
* The scripts can be replayed by the tool, causing the tool to repeat the same actions taken by the tester when the script was recorded
30
# 3.2.2 Approaches for Automating Test Cases | Linear Scripting | Part II

**Pros:**

* Little or no preparation work is required before one can start automating
* Programming skills are not required but are usually helpful

**Cons:**

* The amount of effort required to automate any given test procedure is mostly dependent on the size (number of steps or actions) required to perform it
* Similar tests, with different input values, would contain the same sequence of instructions; only the information included with the instructions (instruction arguments or parameters) would differ
* Non-programmers may find scripts difficult to understand as these are written in a programming or proprietary language
* Scripts can soon become very large when the test comprises many steps
* The scripts are non-modular and difficult to maintain
31
# 3.2.2 Approaches for Automating Test Cases | Structured Scripting | Part I

* Structured scripting introduces a script library
* The script library contains reusable scripts that perform sequences of instructions that are commonly required across a number of tests
32
# 3.2.2 Approaches for Automating Test Cases | Structured Scripting | Part II

**Pros:**

* Significant reduction of maintenance changes
* Reduced costs of automating new tests

**Cons:**

* The initial effort to create the shared scripts can be seen as a disadvantage, but this initial investment should pay big dividends if approached properly
* Programming skills will be required to create all the scripts, as simple recording alone will not be sufficient
* The script library must be well managed
33
# 3.2.2 Approaches for Automating Test Cases | Data-driven Testing | Part I

* Data-driven scripting builds on the structured scripting technique
* The most significant difference is how the test inputs are handled: the inputs are extracted from the scripts and put into one or more separate files
* The control script contains the sequence of instructions necessary to perform the tests but reads the input data from a data file
34
# 3.2.2 Approaches for Automating Test Cases | Data-driven Testing | Part II

**Pros:**

* The cost of adding new automated tests can be significantly reduced by this scripting technique
* This technique is used to automate many variations of a useful test, giving deeper testing in a specific area and possibly increasing test coverage

**Cons:**

* The need to manage data files and make sure they are readable by the TAS is a disadvantage, but can be approached properly
* Negative test cases may be missed
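The data-driven idea can be sketched in a few lines of Python: one control script, inputs in a separate data file. For self-containment the "file" here is an in-memory CSV, and `add` is an invented stand-in for an SUT operation.

```python
import csv
import io

# The "data file": each row is one test variation (inputs + expected result).
data_file = io.StringIO("a,b,expected\n2,3,5\n10,-4,6\n")

def add(a, b):
    # Stand-in for invoking an operation on the SUT.
    return a + b

# The control script: one fixed instruction sequence, driven by the data rows.
results = []
for row in csv.DictReader(data_file):
    actual = add(int(row["a"]), int(row["b"]))
    results.append(actual == int(row["expected"]))

print(results)  # → [True, True]
```

Adding another test variation means adding a CSV row, not writing another script.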
35
# 3.2.2 Approaches for Automating Test Cases | Keyword-driven Testing | Part I

* Builds on the data-driven scripting technique
* The data files are now called 'test definition' files or similar
* There is only one control script
* A test definition file contains data as do the data files, but keyword files also contain high-level instructions (the keywords, or 'action words')
36
# 3.2.2 Approaches for Automating Test Cases | Keyword-driven Testing | Part II

* The keywords are mostly (but not exclusively) used to represent high-level business interactions with the system
* Each keyword represents a number of detailed interactions with the SUT
* Sequences of keywords (including the relevant test data) are used to specify the test cases
* Special keywords can be used for verification steps, or keywords can contain both the actions and the verification steps
37
# 3.2.2 Approaches for Automating Test Cases | Keyword-driven Testing | Part III

**Pros:**

* The cost of adding new automated tests is very low (if the controlling and supporting scripts already exist)
* High level of reusability
* Test cases are easier to maintain, read and write, as the complexity can be hidden in the keywords

**Cons:**

* Implementing the keywords presents a big task, particularly if using a tool that offers no support for this scripting technique
* Care needs to be taken to ensure that the correct keywords are implemented
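A toy keyword interpreter makes the mechanism concrete: a single control script dispatches (keyword, argument) rows from a test definition to keyword implementations. The business keywords (`open_account`, `deposit`, `check_balance`) are invented for illustration; real SUT interactions are simulated with a log.

```python
log = []  # stands in for real interactions with the SUT

# Keyword implementations: each hides a number of detailed SUT interactions.
keywords = {
    "open_account":  lambda name: log.append(f"account opened for {name}"),
    "deposit":       lambda amount: log.append(f"deposited {amount}"),
    "check_balance": lambda expected: log.append(f"balance checked: {expected}"),
}

# Test definition file content: high-level keywords plus test data.
test_definition = [
    ("open_account", "alice"),
    ("deposit", "100"),
    ("check_balance", "100"),
]

# The one control script: interpret the test definition row by row.
for keyword, arg in test_definition:
    keywords[keyword](arg)

print(log)
```

New test cases are written as keyword sequences, so testers without programming skills can add them once the keyword library exists.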
38
# 3.2.2 Approaches for Automating Test Cases | Process-driven Testing | Part I

* Builds on the keyword-driven scripting technique
* Scenarios (representing use cases of the SUT and variants thereof) constitute the scripts, which are parametrized with test data or combined into higher-level test definitions
* Test definitions are easier to cope with, as the logical relation between actions in feature or robustness testing can be determined
39
# 3.2.2 Approaches for Automating Test Cases | Process-driven Testing | Part II

**Pros:**

* The use of process-like, scenario-based definition of test cases allows the test procedures to be defined from a workflow perspective

**Cons:**

* Processes of an SUT may not be easy to comprehend
* Care also needs to be taken to ensure that the correct processes, by use of correct keywords, are implemented
40
# 3.2.2 Approaches for Automating Test Cases | Model-based Testing | Part I

* Refers to the automated generation of test cases, as opposed to the automated execution of test cases in the previous approaches
* Model-based testing uses (semi-)formal models which abstract from the scripting technologies of the TAA
* Different test generation methods can be used to derive tests for any of the scripting frameworks discussed before
41
# 3.2.2 Approaches for Automating Test Cases | Model-based Testing | Part II

**Pros:**

* Abstraction allows concentrating on the essence of testing (in terms of business logic, data, scenarios, etc. to be tested)
* Allows generating tests for different target systems and technologies
* In case of changes in the requirements, only the test model has to be adapted
* Test case design techniques are incorporated in the test case generators

**Cons:**

* Modeling expertise is required to run a model-based testing approach effectively
* The task of modeling by abstracting an SUT's interfaces, data and/or behavior can be difficult
* Model-based testing approaches require adjustments in the test process
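One way to picture test generation from a model is a walk over a state-transition model of the SUT: test cases (keyword sequences) are derived from the model rather than written by hand. The two-state login model below is an invented toy, and the random walk is only one of many possible generation strategies.

```python
import random

# A tiny (semi-)formal model of SUT behavior:
# state -> {action: next_state}
model = {
    "logged_out": {"login": "logged_in"},
    "logged_in":  {"view_report": "logged_in", "logout": "logged_out"},
}

def generate_test(model, start="logged_out", steps=4, seed=0):
    """Derive one test case (an action sequence) by walking the model."""
    rng = random.Random(seed)  # seeded for reproducible generation
    state, path = start, []
    for _ in range(steps):
        action = rng.choice(sorted(model[state]))
        path.append(action)
        state = model[state][action]
    return path

print(generate_test(model))
```

Every generated sequence is valid by construction (each action is enabled in its state), and the same model can feed generators that emit scripts for any of the frameworks above.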
42
# 3.2.3 Technical considerations of the SUT | Technical aspects of an SUT

* Interfaces of the SUT
* SUT data
* SUT configurations
* SUT standards and legal settings
* Tools and tool environment used to develop the SUT
* Test interfaces in the software product
43
# 3.2.3 Technical considerations of the SUT | Interfaces of the SUT

* The SUT has:
  - internal interfaces (inside the system)
  - external interfaces (to the system environment and its users, or via exposed components)
* Interfaces need to be testable
* Logging of the interactions between the SUT and the TAS is needed at different levels of detail, typically including time stamps
44
# 3.2.3 Technical considerations of the SUT | SUT Data

* The SUT uses:
  - configuration data (to control its instantiation, configuration, administration, etc.)
  - user data (which it processes)
  - external data (from other systems, to complete tasks)
* Depending on the test procedures for an SUT, all these types of data need to be definable, configurable and capable of instantiation by the TAA
* Depending on the approach, data may be handled as:
  - parameters
  - test data sheets
  - test databases
  - real data, etc.
45
# 3.2.3 Technical considerations of the SUT | SUT Configurations

* The SUT may be deployed in different configurations:
  - on different operating systems
  - on different target devices
  - with different language settings
* Depending on the test procedures, different SUT configurations may have to be addressed by the TAA
* The test procedures may require different test setups or virtual test setups of the TAA in combination with a given SUT configuration
46
# 3.2.3 Technical considerations of the SUT | SUT Standards and Legal Settings

* The TAA design may need to comply with legal and/or standards requirements
* Examples:
  - privacy requirements for the test data
  - confidentiality requirements that impact the logging and reporting capabilities of the TAA
47
# 3.2.3 Technical considerations of the SUT | Tools and Tool Environments Used to Develop the SUT

* Different tools may be used for the requirements engineering, design and modeling, coding, integration and deployment of the SUT
* The TAA should take the SUT tool landscape into account in order to enable tool:
  - compatibility
  - traceability, and/or
  - reuse of artifacts
48
# 3.2.3 Technical considerations of the SUT | Test Interfaces in the Software Product

* Do not remove all the test interfaces prior to the product release
* The interfaces can be used by service and support engineers for:
  - problem diagnosis
  - testing maintenance releases
* Important: verify that the interfaces pose no security risks
49
# 3.2.4 Considerations for Development/QA Processes

* Test execution control requirements:
  - interactive test execution
  - batch mode test execution
  - fully automated test execution
* Reporting requirements:
  - fixed
  - parametrized
  - defined
* Role and access rights:
  - depending on the security requirements, the TAA may be required to provide a role and access rights system
* Established tool landscape:
  - tools can support different software development activities from the established tool landscape
  - the TAA can be supported by a tool or tool set, which needs to seamlessly integrate with the other tools in the landscape
  - test scripts should be stored and versioned like SUT code so that revisions follow the same process for both
50
# 3.3 TAS Development | 3.3.1 Introduction to TAS Development

* Development of a TAS is comparable to other software development projects. It can follow the same procedures and processes
* Specific to a TAS are its compatibility and synchronization with the SUT
* The SUT is impacted by the test strategy (e.g. having to make test interfaces available to the TAS)
51
# 3.3.1 Introduction to TAS development | The Basic SDLC for TAS | Part I

> Analyze > Design > Develop > Test > Deploy > Evolve >
52
# 3.3.1 Introduction to TAS development | The Basic SDLC for TAS | Part II

* The set of requirements needs to be analyzed and collected
* The requirements guide the design of the TAS as defined by its TAA
* The design is turned into software using software engineering approaches
53
# 3.3.1 Introduction to TAS development | The Basic SDLC for TAS | Part III

* The TAS needs to be tested. This is typically done by basic capability tests for the TAS, which are followed by an interplay between the TAS and the SUT
* After deployment and use of a TAS, a TAS evolution is often needed to:
  - add more test capability
  - change tests
  - update the TAS to match the changing SUT
* The TAS evolution requires a new round of TAS development according to the SDLC
54
# 3.3.1 Introduction to TAS development | The Basic SDLC for TAS | Part IV

* Additional activities in the SDLC regarding the TAS:
  - backup
  - archiving
  - teardown
* These procedures should follow established methods in an organization
55
# 3.3.2 Compatibility between the TAS and the SUT

* Process compatibility
* Team compatibility
* Technology compatibility
* Tool compatibility
56
# 3.3.2 Compatibility between the TAS and the SUT | Process Compatibility

* Testing of an SUT should be synchronized with its development and, in the case of test automation, with the TAS development
* A large gain can be achieved when the SUT and TAS development are compatible in terms of process structure, process management and tool support
57
# 3.3.2 Compatibility between the TAS and the SUT | Team Compatibility

* Both the TAS and SUT development teams will benefit by:
  - reviewing each other's requirements
  - reviewing designs and/or development artifacts
  - discussing issues
  - finding compatible solutions
* Team compatibility helps in the communication and interaction with each other
58
# 3.3.2 Compatibility between the TAS and the SUT | Tool Compatibility

* Tool compatibility between TAS and SUT management, development, and quality assurance needs to be considered
* Example:
  - if the same tools for requirements management and/or issue management are used, the exchange of information and the coordination of TAS and SUT development will be easier
59
# 3.3.3 Synchronization between TAS and SUT

* Synchronization of requirements
* Synchronization of development phases
* Synchronization of defect tracking
* Synchronization of SUT and TAS evolution
60
# 3.3.3 Synchronization between TAS and SUT | Synchronization of Requirements

* After requirements elicitation, both SUT and TAS requirements are to be developed
* TAS requirements can be grouped into requirements that address the:
  - development of the TAS as a software-based system
  - testing of the SUT by means of the TAS (these correspond to the SUT requirements)
* It is important to verify the consistency between the SUT and TAS requirements on any update
61
# 3.3.3 Synchronization between TAS and SUT | Synchronization of Development Phases

* In order to have the TAS ready when needed for testing the SUT, the development phases need to be coordinated
* This is most efficient when the SUT and TAS requirements, designs, specifications, and implementations are synchronized
62
# 3.3.3 Synchronization between TAS and SUT | Synchronization of Defect Tracking

* Defects can relate to the:
  - SUT
  - TAS
  - requirements/designs/specifications
* A defect corrected within one project may provoke a corrective action in the other
* Defect tracking and confirmation testing have to address both the TAS and the SUT
63
# 3.3.3 Synchronization between TAS and SUT | Synchronization of SUT and TAS Evolution

* Both the SUT and the TAS can evolve to:
  - accommodate new features
  - disable features
  - correct defects
  - address changes in their environment
* Any change applied to an SUT or to a TAS may impact the other, so the management of these changes should address both the SUT and the TAS
64
# 3.3.4 Building Reuse into the TAS | Part I

Reuse of a TAS refers to the reuse of TAS artifacts (from any level of its architecture) across:

* product lines
* product frameworks
* product domains
* project families
65
# 3.3.4 Building Reuse into the TAS | Part II

Reusable artifacts can include:

* (parts of) test models of test goals, test scenarios, test components or test data
* (parts of) test cases, test data, test procedures or test libraries themselves
* the test engine and/or test report framework
* the adaptors to the SUT components and/or interfaces
66
# 3.3.4 Building Reuse into the TAS | Part III

While reuse aspects are already settled when the TAA is defined, the TAS can help increase the ability for reuse by:

* following the TAA, or revising and updating it whenever needed
* documenting the TAS artifacts so that they are easily understood and can be incorporated into new contexts
* ensuring the correctness of any TAS artifact so that its usage in new contexts is supported by its high quality
67
# 3.3.4 Building Reuse into the TAS | Part IV

* Design for reuse is mainly a matter for the TAA
* The maintenance and improvements for reuse are a concern throughout the TAS lifecycle
* Continuous consideration and effort are required to:
  - make reuse happen
  - measure and demonstrate the added value of reuse
  - evangelize others to reuse the existing TAS
68
# 3.3.5 Support for a Variety of Target Systems | Part I

* TAS support for a variety of target systems refers to the ability of a TAS to test different configurations of a software product
* Different configurations refer to any of the following:
  - number and interconnection of SUT components
  - environments (both software and hardware) on which the SUT components run
  - technologies, programming languages or operating systems used to implement the SUT components
  - libraries and packages the SUT components are using
  - tools used to implement the SUT components
69
# 3.3.5 Support for a Variety of Target Systems | Part II

* The ability of a TAS to test different software product configurations is determined when the TAA is defined. However, the TAS has to implement the ability to handle the technical variance
* The handling of the TAS variety in relation to the variety of the software product can be dealt with differently:
  - version/configuration management for the TAS and SUT can be used to provide the respective versions and configurations of the TAS and SUT that fit each other
  - TAS parametrization can be used to adjust a TAS to an SUT configuration
70
# 3.3.5 Support for a Variety of Target Systems | Part III

* Design for TAS variability is mainly a matter for the TAA
* The maintenance of and improvements for variability are a concern throughout the TAS life cycle
* Continuous consideration and effort are required to revise, add and remove options and forms of variability