II - Testing Throughout the Software Development Lifecycle Flashcards
(35 cards)
Functional Testing
Evaluates “what” the system should do. Work products include: business requirements, functional specifications, epics and user stories, and use cases.
Non-Functional Testing
Evaluates “how well” the system behaves (e.g., performance, usability, security).
Change Related Testing
Required when functions have been changed or added, to confirm the fix and check for unintended side effects. Ex. Confirmation Testing and Regression Testing
Confirmation Testing
A type of Change Related Testing.
Confirms whether the original defect has been successfully fixed.
Regression Testing
A type of Change Related Testing.
Re-testing an already tested program after its modification (or a modification in the environment) to make sure that:
- the behavior of other code parts is not influenced unintentionally.
- no new defects have been introduced.
- no previously hidden defects have been uncovered.
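The two change-related test types can be sketched with a small pytest-style example. The discount() function and its fixed defect are hypothetical, invented only to illustrate the distinction:

```python
# Sketch of confirmation vs. regression testing. The discount() function
# and the "original defect" it once had are hypothetical examples.

def discount(price, percent):
    """Apply a percentage discount (the code under test, after the fix)."""
    return round(price * (1 - percent / 100), 2)

# Confirmation test: re-runs the exact case that originally failed,
# to confirm the reported defect has really been fixed.
def test_original_defect_is_fixed():
    assert discount(100.0, 25) == 75.0

# Regression tests: previously passing cases re-run after the change,
# to check that unchanged behavior is not influenced unintentionally.
def test_no_discount():
    assert discount(100.0, 0) == 100.0

def test_full_discount():
    assert discount(100.0, 100) == 0.0
```

In practice, regression suites like this are automated and re-run on every change, while the confirmation test is added at the moment the defect is fixed.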
Characteristics of Good Testing
- For every development activity, there is a corresponding test activity.
- Each test level has test objectives specific to that level.
- Test activities (starting with test analysis and design) for a given test level begin during the corresponding development.
- Testers participate in discussions to define and refine requirements and design.
- Testers are involved in reviewing work products (e.g., requirements, design, user stories).
Sequential Development Models
A type of Software Development Lifecycle Model. Linear, sequential process (step by step).
Typically done for long-term projects.
Ex. Waterfall or V model.
(Terms like validation and verification usually refer to the V model.)
Waterfall Model
An example of a Sequential Development Model.
- Testing is a “one time only” activity.
- “Final testing” at test closure of a software project.
- The model does not include error correction or a continuation of the project.
V Model
An example of a Sequential Development Model.
One side is developer activities, the other side is tester activities.
The V-model integrates the test process throughout the development process, implementing the principle of early testing: Development step, then a Verification + Validation step.
Iterative and Incremental Models
A type of Software Development Lifecycle Model. When groups of features are specified, designed, built, and tested together in a series of cycles.
The software’s features grow incrementally.
Typically used for shorter-term projects, as it delivers working increments in short cycles (e.g., weekly).
Ex. Rational Unified Process (RUP), Scrum/Agile, Kanban, and Spiral (Prototyping).
Rational Unified Process (RUP)
An example of an Iterative and Incremental Model.
An outdated model, not used much anymore.
Business value is delivered incrementally in time-boxed cross-disciplined iterations.
Scrum/Agile
An example of an Iterative and Incremental Model.
Pull user stories from the Backlog. Items are developed + tested over a short period of time (ex. days or weeks), to produce a potentially shippable product.
Kanban
An example of an Iterative and Incremental Model.
Work is visualized on a board with columns such as To Do, Doing, and Done.
Can be implemented with or without fixed-length iterations, and can deliver either a single enhancement or feature upon completion, or group features together for a single release.
Spiral (Prototyping)
An example of an Iterative and Incremental Model.
Involves creating experimental increments,
some of which may be heavily re-worked or even abandoned in subsequent development work.
Work with client to prototype the product. Testing begins once client signs off on the “final” prototype.
Software Development Lifecycle Model Selection
Software development lifecycle models must be selected and adapted to the context of project and product characteristics.
High-risk / long-term projects should use Sequential models.
Low-risk / shorter-term projects should use Iterative models.
Can also combine models together, depending on the project.
Test Levels
Test levels are groups of test activities that are organized and managed together.
Each test level uses at least one instance of the test process.
Includes Component Testing, Integration Testing, System Testing, and Acceptance Testing.
Component Testing
Focuses on a single component, such as a button or an input field.
Component testing is often done in isolation from the rest of the system and is usually performed by the developer who wrote the code.
Defects are typically fixed as soon as they are found, often with no formal defect management.
Test Basis:
- Detailed Design
- Code
- Data Model
- Component Specifications
Test Objects:
- Components, units, or modules
- Code and data structures
- Classes
- Database models
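Component testing of a single unit in isolation might look like the following sketch; the EmailField class and its validator are hypothetical examples of one component under test:

```python
# Sketch of component (unit) testing. EmailField is a hypothetical
# single form-field component, tested in isolation from the rest of
# the system (no UI, no database, no network).
import re

class EmailField:
    PATTERN = re.compile(r"^[\w.+-]+@[\w-]+\.[\w.]+$")

    def __init__(self, value=""):
        self.value = value

    def is_valid(self):
        return bool(self.PATTERN.match(self.value))

# Component tests exercise only this one unit; defects found here are
# typically fixed by the developer as soon as they are found, often
# without formal defect management.
def test_accepts_well_formed_address():
    assert EmailField("user@example.com").is_valid()

def test_rejects_missing_at_sign():
    assert not EmailField("user.example.com").is_valid()
```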
Integration Testing
Includes multiple components. Focuses on interactions between components or systems.
Includes:
- Component Integration Testing, which is usually done by the developer.
- System Integration Testing, which is usually done by the tester. Typically involves testing interactions between multiple software systems (ex. Amazon site + payment system).
Test Basis:
- Software and system design
- Sequence diagrams
- Interface and communication protocol specifications
- Use cases
- Architecture
- Workflows
- External interface definitions
Test Objects:
- Subsystems
- Databases
- Infrastructure
- Interfaces
- APIs
- Microservices
System Testing
Testing everything together (end to end testing). Focuses on the behavior and capabilities of a whole system or product.
Ex. from logging in to logging out
Typically performed by independent testers.
Test Basis:
- System and software requirement specifications
- Risk analysis reports
- Use cases
- Epics and user stories
- Models of system behavior
- State diagrams
- System and user manuals
Test Objects:
- Applications
- Hardware/software systems
- Operating systems
- System under test (SUT)
- System configuration and configuration data
Acceptance Testing
Often performed as User Acceptance Testing (UAT).
Validates that the system is fit for use and meets user needs, from a user’s perspective.
Is often the responsibility of the customers, business users, product owners, or operators of a system.
Test Basis:
- Business processes
- User or business requirements
- Regulations, legal contracts and standards
- Use cases and/or user stories
- System requirements
- System or user documentation
- Installation procedures
- Risk analysis reports
Test Objects:
- System under test
- System configuration and configuration data
- Business processes for a fully integrated system
- Recovery systems and hot sites (for business continuity and disaster recovery testing)
- Operational and maintenance processes
- Forms and reports
- Existing and converted production data
Component Integration Testing
Focuses on the interactions and interfaces between integrated components. This is generally automated.
Examples of Defects:
- Incorrect data, missing data, or incorrect data encoding
- Incorrect sequencing or timing of interface calls
- Interface mismatch
- Failures in communication between components
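An automated component integration test can be sketched as follows; the CartComponent and PriceComponent names are hypothetical. The focus is the interface between the two components, not either component alone:

```python
# Sketch of component integration testing with two hypothetical
# components. Each has already passed its own component tests; this
# test targets the interface between them.

class PriceComponent:
    """Returns unit prices for items."""
    PRICES = {"apple": 0.50, "bread": 2.00}

    def unit_price(self, item):
        return self.PRICES[item]

class CartComponent:
    """Computes totals by calling PriceComponent across an interface."""
    def __init__(self, pricing):
        self.pricing = pricing
        self.items = []

    def add(self, item, qty):
        self.items.append((item, qty))

    def total(self):
        # Defects caught here are interface defects: incorrect or
        # missing data, wrong call sequencing, interface mismatches.
        return sum(self.pricing.unit_price(item) * qty
                   for item, qty in self.items)

def test_cart_and_pricing_integration():
    cart = CartComponent(PriceComponent())
    cart.add("apple", 4)
    cart.add("bread", 1)
    assert cart.total() == 4.00
```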
System Integration Testing
Performed after system testing, or in parallel with ongoing system test activities.
Focuses on the interactions and interfaces between systems, packages, and microservices.
Examples of Defects:
- Inconsistent message structures between systems
- Incorrect data, missing data, or incorrect data encoding
- Interface mismatch
- Failures in communication between systems
- Failure to comply with mandatory security regulations
Common Forms of Acceptance Testing
- User acceptance testing
- Operational acceptance testing
- Contractual acceptance testing
- Regulatory acceptance testing
- Alpha and beta testing
User Acceptance Testing (UAT)
- Builds confidence that users can use the system to meet their needs and fulfill requirements.
- Verifies users can perform business processes with minimum difficulty, cost, and risk.