Lesson 7 - Software Testing (Ch 22) Flashcards

1
Q

Software testing must be planned carefully to avoid _____ development time and resources.

A

wasting

2
Q

Testing begins “in the _____” and progresses “to the _____”.

A

small, large

3
Q

Initially _____ components are tested and debugged.

A

individual

4
Q

After the individual components have been tested and added to the system, _____ testing takes place.

A

integration

5
Q

Once the full software product is completed, _____ testing is performed.

A

system

6
Q

The __________ document should be reviewed like all other software engineering work products.

A

Test Specification

7
Q

Strategic Approach to Software Testing

• Many software errors are eliminated before testing begins by conducting effective technical _____

A

reviews

8
Q

Strategic Approach to Software Testing

• Different testing techniques are _____ at different points in time.

A

appropriate

9
Q

Strategic Approach to Software Testing

• The developer of the software conducts testing and may be assisted by _____ test groups for large projects.

A

independent

10
Q

Strategic Approach to Software Testing

• Testing and debugging are _____ activities.

A

different

11
Q

Strategic Approach to Software Testing

• Debugging must be _____ in any testing strategy.

A

accommodated

12
Q

Verification and Validation

• Make a distinction between verification (are we building the __________?) and validation (are we building the __________?)

A

product right, right product

13
Q

Verification and Validation

• Software testing is only one _____ of Software Quality Assurance (SQA)

A

element

14
Q

Verification and Validation

• Quality must be built into the _____ process; you can’t use testing to add quality after the fact

A

development

15
Q

Organizing for Software Testing

• The role of the __________ is to remove the conflict of interest inherent when the builder is testing his or her own product.

A

Independent Test Group (ITG)

16
Q

Organizing for Software Testing

o The developer should do no testing at all
o Software is tossed “over the wall” to people who test it mercilessly
o Testers are not involved with the project until it is time for it to be tested

A

Misconceptions regarding the use of independent testing teams

17
Q

Organizing for Software Testing

• The developer and ITG must work together throughout the software project to ensure that thorough _____ will be conducted

A

tests

18
Q

Software Testing Strategy

• Unit Testing – makes heavy use of testing techniques that exercise specific control paths to detect errors in each software component _______

A

individually

19
Q

Software Testing Strategy

• Integration Testing – focuses on issues associated with verification and program _____ as components begin interacting with one another

A

construction

20
Q

Software Testing Strategy

• Validation Testing – provides assurance that the software meets the validation criteria (established during requirements analysis) and satisfies all functional, behavioral, and performance _____

A

requirements

21
Q

Software Testing Strategy

• System Testing – verifies that all system elements mesh properly and that _____ system function and performance has been achieved

A

overall

22
Q
  • Specify product requirements in a quantifiable manner before testing starts.
  • Specify testing objectives explicitly.
  • Identify categories of users for the software and develop a profile for each.
  • Develop a test plan that emphasizes rapid cycle testing.
  • Build robust software that is designed to test itself.
  • Use effective formal reviews as a filter prior to testing.
  • Conduct formal technical reviews to assess the test strategy and test cases.
  • Develop a continuous improvement approach for the testing process.
A

Strategic Testing Issues

23
Q
  • Module interfaces are tested for proper information flow.
  • Local data are examined to ensure that integrity is maintained.
  • Boundary conditions are tested.
  • Basis (independent) paths are tested.
  • All error handling paths should be tested.
  • Drivers and/or stubs need to be developed to test incomplete software.
A

Unit Testing
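To make the last two items concrete, here is a minimal Python sketch (all names hypothetical, not from the chapter): a unittest class acts as the driver for a small component, a stub stands in for a collaborator that is not yet integrated, and the tests exercise a boundary condition and an error-handling path.

    import unittest

    # Component under test: depends on a collaborator that may not exist yet.
    def compute_discount(total, pricing_service):
        if total < 0:
            raise ValueError("total must be non-negative")  # error-handling path
        rate = pricing_service.discount_rate()
        return total * (1 - rate)

    class PricingServiceStub:
        """Stub standing in for the real, not-yet-integrated service."""
        def discount_rate(self):
            return 0.10  # canned answer

    class ComputeDiscountTest(unittest.TestCase):  # the test class is the driver
        def test_boundary_zero_total(self):
            self.assertEqual(compute_discount(0, PricingServiceStub()), 0)

        def test_typical_total(self):
            self.assertAlmostEqual(compute_discount(100, PricingServiceStub()), 90.0)

        def test_error_handling_path(self):
            with self.assertRaises(ValueError):
                compute_discount(-1, PricingServiceStub())

    if __name__ == "__main__":
        unittest.main()

The stub is throwaway scaffolding; it is discarded once the real service is integrated.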

24
Q
  • Sandwich testing uses top-down tests for upper levels of program structure coupled with bottom-up tests for subordinate levels
  • Testers should strive to identify critical modules having the following requirements
  • Overall plan for integration of software and the specific tests are documented in a test specification
A

Integration Testing

25
Q

Integration Testing Strategies

  1. The main control module is used as a test driver, and stubs are substituted for components directly subordinate to it.
  2. Subordinate stubs are replaced one at a time with real components (following a depth-first or breadth-first approach).
  3. Tests are conducted as each component is integrated.
  4. On completion of each set of tests, another stub is replaced with a real component.
  5. Regression testing may be used to ensure that new errors are not introduced.
A

Top-down integration testing
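A hedged Python sketch of steps 1–4, with hypothetical names: the main control module calls subordinates through a lookup table, so each stub can be swapped for its real component one at a time and the tests re-run after every swap.

    # Hypothetical top-down integration: the main control module reaches its
    # subordinates through a registry, so stubs can be replaced one at a time.
    def report_stub(data):
        return "STUB-REPORT"                 # canned result until the real module lands

    def real_report(data):
        return f"report({len(data)} rows)"

    subordinates = {"report": report_stub}   # start fully stubbed

    def main_control(data):                  # also serves as the test driver
        return subordinates["report"](data)

    assert main_control([1, 2, 3]) == "STUB-REPORT"

    subordinates["report"] = real_report     # replace one stub with a real component
    assert main_control([1, 2, 3]) == "report(3 rows)"   # re-run tests after the swap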

26
Q

Integration Testing Strategies

  1. Low-level components are combined into clusters that perform a specific software function.
  2. A driver (control program) is written to coordinate test case input and output.
  3. The cluster is tested.
  4. Drivers are removed and clusters are combined moving upward in the program structure.
A

Bottom-up integration testing
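A minimal Python sketch of steps 1–3, assuming hypothetical components: two low-level functions form a cluster, and a throwaway driver coordinates test input and checks output until an upper-level module takes over.

    # Hypothetical low-level components combined into a cluster.
    def parse_record(line):
        return [field.strip() for field in line.split(",")]

    def total_amount(records):
        return sum(float(r[1]) for r in records)

    # Driver: a throwaway control program that coordinates test case input and
    # output for the cluster. It is removed once an upper-level module exists.
    def cluster_driver():
        records = [parse_record(line) for line in ["a, 1.5", "b, 2.5"]]
        assert total_amount(records) == 4.0
        print("cluster OK")

    cluster_driver()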

27
Q

Integration Testing Strategies

used to check for defects propagated to other modules by changes made to the existing program

  1. A representative sample of existing test cases is used to exercise all software functions.
  2. Additional test cases focus on software functions likely to be affected by the change.
  3. Test cases focus on the changed software components.
A

Regression testing
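One way to picture the three classes of test cases is a suite selected by tags; this Python sketch is illustrative only, with hypothetical test names rather than a prescribed tool.

    # Hypothetical regression suite assembled from the three classes of cases.
    all_tests = {
        "test_login":      {"representative"},
        "test_pdf_export": {"full-suite-only"},      # not part of the regression pass
        "test_checkout":   {"affected-by-change"},   # touches the changed pricing code
        "test_price_calc": {"changed-component"},
    }

    REGRESSION_CLASSES = {"representative", "affected-by-change", "changed-component"}

    def select_regression_suite(tests):
        # keep a case if it falls into any of the three classes above
        return [name for name, tags in tests.items() if tags & REGRESSION_CLASSES]

    print(select_regression_suite(all_tests))
    # -> ['test_login', 'test_checkout', 'test_price_calc']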

28
Q

Integration Testing Strategies

  1. Software components already translated into code are integrated into a build.
  2. A series of tests designed to expose errors that will keep the build from performing its functions is created.
  3. The build is integrated with the other builds and the entire product is smoke tested daily (either top-down or bottom-up integration may be used).
A

Smoke testing
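A daily smoke test can be as small as a script that pokes the build’s critical path; this Python sketch assumes a hypothetical package named myapp.

    # Hypothetical daily smoke test: exercise just enough of the build to show
    # it will not block further testing. Run against every integrated build.
    import subprocess
    import sys

    def smoke():
        checks = [
            ["python", "-c", "import myapp"],        # build imports cleanly
            ["python", "-m", "myapp", "--version"],  # entry point responds
        ]
        for cmd in checks:
            if subprocess.run(cmd).returncode != 0:
                sys.exit(f"smoke failed: {' '.join(cmd)}")
        print("build accepted for further testing")

    if __name__ == "__main__":
        smoke()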

29
Q
  • Interface integrity – internal and external module interfaces are tested as each module or cluster is added to the software
  • Functional validity – test to uncover functional defects in the software
  • Information content – test for errors in local or global data structures
  • Performance – verify that specified performance bounds are met during testing
A

General Software Test Criteria

30
Q
  • Unit Testing – components being tested are classes, not modules
  • Integration Testing – as classes are integrated into the architecture, regression tests are run to uncover communication and collaboration errors between objects
  • System Testing – the system as a whole is tested to uncover requirement errors
A

Object-Oriented Test Strategies

31
Q
  • smallest testable unit is the encapsulated class or object
  • similar to system testing of conventional software
  • do not test operations in isolation from one another
  • driven by class operations and state behavior, not algorithmic detail and data flow across the module interface
A

Object-Oriented Unit Testing
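Because the smallest testable unit is the encapsulated class, a test drives a sequence of operations through the object’s state rather than testing operations in isolation. A minimal Python sketch with a hypothetical Account class:

    import unittest

    class Account:
        """Hypothetical encapsulated class: state drives behavior."""
        def __init__(self):
            self.balance = 0
            self.frozen = False

        def deposit(self, amount):
            if self.frozen:
                raise RuntimeError("account frozen")
            self.balance += amount

        def freeze(self):
            self.frozen = True

    class AccountStateTest(unittest.TestCase):
        def test_operations_interact_through_state(self):
            acct = Account()
            acct.deposit(50)         # a sequence of operations, not isolated calls
            acct.freeze()
            with self.assertRaises(RuntimeError):
                acct.deposit(10)     # behavior depends on the prior state change
            self.assertEqual(acct.balance, 50)

    if __name__ == "__main__":
        unittest.main()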

32
Q
  • focuses on groups of classes that collaborate or communicate in some manner
  • integration of operations one at a time into classes is often meaningless
  • thread-based testing – testing all classes required to respond to one system input or event
  • use-based testing – begins by testing independent classes (classes that use very few server classes) first and then the dependent classes that make use of them
  • cluster testing – groups of collaborating classes are tested for interaction errors
  • regression testing is important as each thread, cluster, or subsystem is added to the system
A

Object-Oriented Integration Testing

33
Q
  1. WebApp content model is reviewed to uncover errors.
  2. Interface model is reviewed to ensure all use-cases are accommodated.
  3. Design model for WebApp is reviewed to uncover navigation errors.
  4. User interface is tested to uncover presentation errors and/or navigation mechanics problems.
  5. Selected functional components are unit tested.
  6. Navigation throughout the architecture is tested.
  7. WebApp is implemented in a variety of different environmental configurations and the compatibility of WebApp with each is assessed.
  8. Security tests are conducted.
  9. Performance tests are conducted.
  10. WebApp is tested by a controlled and monitored group of end-users (looking for content errors, navigation errors, usability concerns, compatibility issues, reliability, and performance).
A

WebApp Testing Strategies

34
Q
  • User experience testing – ensuring app meets stakeholder usability and accessibility expectations
  • Device compatibility testing – testing apps on multiple devices
  • Performance testing – testing non-functional app requirements
  • Connectivity testing – testing ability of app to connect reliably
  • Security testing – ensuring app meets stakeholder security expectations
  • Testing-in-the-wild – testing app on user devices in actual user environments
  • Certification testing – ensuring the app meets distribution standards
A

MobileApp Testing

35
Q
  • Focuses on visible user actions and user-recognizable outputs from the system
  • Validation tests are based on the use-case scenarios, the behavior model, and the event flow diagram created in the analysis model
  o Must ensure that each function or performance characteristic conforms to its specification.
  o Deviations (deficiencies) must be negotiated with the customer to establish a means for resolving the errors.
  • A configuration review or audit is used to ensure that all elements of the software configuration have been properly developed, cataloged, and documented to support the maintenance phase.
A

Validation Testing

36
Q
  • Making sure the software works correctly for the intended user in his or her normal work environment.
  • Alpha test – a version of the complete software is tested by the customer under the supervision of the developer at the developer’s site
  • Beta test – a version of the complete software is tested by the customer at his or her own site without the developer being present
A

Acceptance Testing

37
Q
  • A series of tests whose purpose is to exercise a computer-based system
  • The focus of these system test cases is to identify interfacing errors
  • Recovery testing – checks the system’s ability to recover from failures
  • Security testing – verifies that system protection mechanisms prevent improper penetration or data alteration
  • Stress testing – program is checked to see how well it deals with abnormal resource demands (i.e., quantity, frequency, or volume)
  • Performance testing – designed to test the run-time performance of software, especially real-time software
  • Deployment (or configuration) testing – exercises the software in each of the environments in which it is to operate
A

System Testing

38
Q
  • The symptom and the cause may be geographically remote (the symptom may appear in one part of a program while the cause is located elsewhere).
  • The symptom may disappear (temporarily) when another error is corrected.
  • The symptom may actually be caused by non-errors (e.g., round-off inaccuracies).
  • The symptom may be caused by human error that is not easily traced.
  • The symptom may be a result of timing problems, rather than processing problems.
  • It may be difficult to accurately reproduce input conditions (e.g., a real-time application in which input ordering is indeterminate).
  • The symptom may be intermittent. This is particularly common in embedded systems that couple hardware and software inextricably.
  • The symptom may be due to causes that are distributed across a number of tasks running on different processors.
A

Bug Causes

39
Q
  • Debugging (removal of a defect) occurs as a consequence of successful testing.
  • Some people are better at debugging than others.
  • Common approaches (may be partially automated with debugging tools):
  • Brute force – memory dumps and run-time traces are examined for clues to error causes
  • Backtracking – source code is examined by looking backwards from symptom to potential causes of errors
  • Cause elimination – uses binary partitioning to reduce the number of potential locations where errors can exist
A

Debugging Strategies
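The binary partitioning behind cause elimination can be sketched as a bisect over candidate error locations (here, a sequence of hypothetical change IDs): each probe halves the region where the error can hide, in the spirit of git bisect.

    # Hypothetical cause elimination by binary partitioning: find the first
    # change after which the symptom appears. Each test halves the set of
    # potential error locations.
    def first_bad(changes, is_bad):
        lo, hi = 0, len(changes) - 1
        while lo < hi:
            mid = (lo + hi) // 2
            if is_bad(changes[mid]):
                hi = mid          # bug introduced at or before mid
            else:
                lo = mid + 1      # bug introduced after mid
        return changes[lo]

    changes = list(range(1, 101))                  # pretend change IDs 1..100
    print(first_bad(changes, lambda c: c >= 73))   # -> 73, found in ~7 probes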

40
Q
  • Is the cause of the bug reproduced in another part of the program?
  • What “next bug” might be introduced by the fix that is being proposed?
  • What could have been done to prevent this bug in the first place?
A

Bug Removal Considerations