random Flashcards

(95 cards)

1
Q

is the testing of “how well the system behaves”.

A

-Non-functional testing

2
Q

It is sometimes appropriate to start early in the life cycle (e.g., as part of reviews and component testing or system testing).

A

Non-functional testing

3
Q

the degree to which the set of functions covers all the specified tasks and user objectives

A

Functional completeness

Key Focus: Ensuring the system has all the required features and functionalities for task completion.

4
Q

sometimes needs a very specific test environment, such as a usability lab for usability testing.

A

-Non-functional testing

5
Q

-The degree to which a component or system uses time, resources and capacity when accomplishing its designated functions.

A

Performance efficiency

(non-functional software quality characteristics)

6
Q

The degree to which a component or system provides the correct results with the needed degree of precision.

A

Functional correctness

Key Focus: Verifying that the system behaves as expected, performing tasks accurately and delivering correct outcomes.

7
Q

-The degree to which a component or system can exchange information with other components or systems, and/or perform its required functions while sharing the same hardware or software environment.

A

Compatibility

(non-functional software quality characteristics)

8
Q

-The degree to which a component or system can be used by specified users to achieve specified goals in a specified context of use.

A

Usability

(non-functional software quality characteristics)

9
Q

The degree to which the functions facilitate (make easier) the accomplishment of specified tasks and objectives

A

Functional appropriateness:

Key Focus: How well the functions of the system align with user needs and simplify task completion.

10
Q

-The degree to which a component or system performs specified functions under specified conditions for a specified period of time.

A

Reliability

(non-functional software quality characteristics)

11
Q

-The degree to which a component or system protects its data and resources against unauthorized access or use and secures unobstructed access and use for its legitimate users.

A

Security
(non-functional software quality characteristics)

12
Q

-The degree to which a component or system can be modified by the intended maintainers.


A

Maintainability

(non-functional software quality characteristics)

13
Q

-The degree to which a component or system can be transferred from one hardware, software or other operational or usage environment to another.

A

Portability

(non-functional software quality characteristics)

14
Q

with real users testing the system in realistic scenarios to evaluate ease of use.

A

usability lab

15
Q

-When time or money is short, this might be restricted to simply exercising the steps that should reproduce the failure caused by the defect and checking that the failure does not occur.

A

confirmation testing

16
Q

-confirms that no adverse consequences have been caused by a change, including a fix that has already been confirmation tested.

A

regression testing

17
Q

-It is advisable first to perform an impact analysis to optimize the extent of this testing

A

Regression testing

19
Q

Impact analysis may be done before a change is made, to help decide if the change should be made, based on the potential consequences in other areas of the system

A

maintenance testing

20
Q

It can be:

-corrective

-adaptive to changes in the environment

-improving performance or maintainability

A

maintenance categories

21
Q

Testing the changes to an operational system or the impact of a changed environment to an operational system.

A

maintenance testing

22
Q

-These adverse consequences could affect the same component where the change was made, other components in the same system, or even other connected systems.

A

regression testing

23
Q

-can involve planned releases/deployments and unplanned releases/deployments (hot fixes).

A

maintenance testing

24
Q

refer to specific attributes or features of a product, system, or service that define its overall quality.

A

Quality characteristics

25
-often requires specific support, such as test harnesses or unit test frameworks.
component testing
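As an illustration of the kind of support a unit test framework provides, here is a minimal sketch assuming Python and pytest; the apply_discount function and the test names are hypothetical.

```python
import pytest  # unit test framework providing test discovery, assertions, fixtures


def apply_discount(price: float, percent: float) -> float:
    """Hypothetical component under test: apply a percentage discount."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)


def test_apply_discount_reduces_price():
    # The framework discovers and runs functions named test_* automatically.
    assert apply_discount(100.0, 25) == 75.0


def test_apply_discount_rejects_invalid_percent():
    # The framework also provides helpers for checking expected failures.
    with pytest.raises(ValueError):
        apply_discount(100.0, 150)
```

Running pytest in the developer's environment executes these tests, which is the framework support the card refers to.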
26
refers to the activities involved in updating, modifying, or enhancing a software system after it has been deployed or delivered.
Maintenance
27
illustrate the order of interactions between components or objects in the system, showing how they collaborate to complete a function.
Sequence diagrams
28
this testing may not be restricted to the test object itself but can also be related to the environment.
Regression testing
29
Interfaces define how components communicate with each other, while communication protocols dictate the rules for data exchange
Interface and Communication Protocol Specifications
30
describe functional scenarios from the user's perspective, showing how different parts of the system interact to fulfill business requirements.
Use cases
31
define how the system interacts with external systems
External interfaces
32
The process of modifying a component or system after delivery to correct defects, improve quality characteristics, or adapt to a changed environment.
maintenance
33
Testing based on an analysis of the specification of the component or system.
Black-box testing Synonyms: specification-based testing
34
specification-based and derives tests from documentation external to the test object.
Black-box testing
35
-main objective is checking the system’s behavior against its specifications.
Black-box testing
36
Supports both functional and non-functional testing
All four test types (functional testing, non-functional testing, black-box testing, white-box testing) can be applied to all test levels.
37
A TEST LEVEL that focuses on determining whether to accept the system. See also: user acceptance testing
acceptance testing
38
-Ideally, this test should be performed by the intended users.
acceptance testing
39
The integration testing of components.
component integration testing Synonyms: module integration testing, unit integration testing
40
A test level that focuses on individual hardware or software components.
component testing (Synonyms: module testing, unit testing)
41
-normally performed by developers in their development environments.
component testing
42
-The functions are “what” the test object should do.
functional testing
43
Testing performed to evaluate if a component or system satisfies functional requirements.
functional testing
44
A test level that focuses on verifying that a system as a whole meets specified requirements
system testing
45
requires suitable test environments preferably similar to the operational environment.
system integration testing
46
shows which parts of the software could be affected.
Impact analysis
47
-focuses on testing the interfaces between the system under test, other systems, and external services.
system integration testing
48
-often including functional testing of end-to-end tasks and the non-functional testing of quality characteristics.
system testing
49
-For some non-functional quality characteristics, it is preferable to test them on a complete system in a representative test environment (e.g., usability).
system testing
50
The degree to which a component or system can be used by specified users to achieve specified goals in a specified context of use.
usability
51
Using simulations of sub-systems is also possible. It may be performed by an independent test team, and is related to specifications for the system.
system testing
52
A group of test activities based on specific test objectives aimed at specific characteristics of a component or system.
test type
53
Groups of test activities related to specific quality characteristics; most of those test activities can be performed at every test level.
test type
Examples:
-Functional testing: to verify that the software functions as expected.
-Performance testing: to check how the software performs under load or stress.
-Security testing: to find vulnerabilities and ensure the system is secure.
-Usability testing: to ensure the software is easy to use.
54
Testing based on an analysis of the internal structure of the component or system.
white box testing: Synonyms: clear-box testing, code-based testing, glass-box testing, logic-coverage testing, logic-driven testing, structural testing, structure-based testing
55
main objective is to cover the underlying structure by the tests to the acceptable level.
white box testing
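A minimal sketch of this idea, assuming Python; the classify function is hypothetical. The tests are derived from the code's internal structure so that both branches of the if statement are executed.

```python
def classify(temperature: float) -> str:
    """Hypothetical function with two branches."""
    if temperature >= 30:
        return "hot"   # branch taken when the condition is true
    return "mild"      # branch taken when the condition is false


def test_true_branch_is_covered():
    assert classify(35) == "hot"


def test_false_branch_is_covered():
    assert classify(20) == "mild"
```

A coverage tool (e.g., coverage.py) can then report how much of the structure the tests exercised, which is the "acceptable level" the card mentions.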
56
A type of software development lifecycle model in which a complete system is developed in a linear way of several discrete and successive phases with no overlap between them.
sequential development models
57
-The executable code is usually created in the later phases (dynamic testing cannot be performed early in the SDLC).
sequential development models
58
-in the initial phases testers typically participate in requirement reviews, test analysis, and test design.
sequential development models
59
A type of software development lifecycle model in which the component or system is developed through a series of increments.
incremental development models
60
(e.g., Unified Process).
incremental development models
61
e.g., waterfall model, V-model
sequential development models
62
-is structure-based and derives tests from the system’s implementation or internal structure (e.g., code, architecture, workflows, and data flows).
white box testing
63
(e.g., spiral model, prototyping),
iterative development models
64
A type of software development lifecycle model in which the component or system is developed through a series of repeated cycles.
iterative development models
65
A type of software development lifecycle model in which development of the component or system allows for revisions and continuous improvement, focusing on user feedback and risk management.
iterative development models
66
-Defining requirements, designing software, and testing are done in phases, where in each phase a piece of the system is added.
incremental development models
67
-Each iteration delivers a working prototype or product increment.
-In each iteration both static and dynamic testing may be performed at all test levels.
-Frequent delivery of increments requires fast feedback and extensive regression testing.
Iterative and incremental models
68
-lightweight work product documentation and extensive test automation to make regression testing easier
Agile software development:
69
This analysis is often done before dynamic testing (which tests the program while it runs), or as part of an automated process in CI.
static analysis
70
is an organizational approach aiming to create synergy by getting development (including testing) and operations to work together to achieve a set of common goals.
DevOps
71
-most of the manual testing tends to be done using experience-based test techniques that do not require extensive prior test analysis and design.
Agile software development:
72
-assumes that change may occur throughout the project.
Agile software development:
73
-groups of test activities that are organized and managed together
test levels
74
-The exit criteria of one level are part of the entry criteria for the next level.
-In some iterative models, this may not apply: development activities may span multiple test levels, and test levels may overlap in time.
test level
75
A test-first approach in iterative development:
-Directs the coding through test cases (instead of extensive software design)
-Tests are written first, then the code is written to satisfy the tests, and then the tests and code are refactored
Test-Driven Development (TDD):
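A sketch of one TDD cycle, assuming Python with pytest-style tests; fizzbuzz is a hypothetical example.

```python
# Step 1 (red): write the test first; it fails until fizzbuzz() exists and behaves correctly.
def test_fizzbuzz_returns_fizz_for_multiples_of_three():
    assert fizzbuzz(9) == "Fizz"


# Step 2 (green): write just enough code to make the test pass.
def fizzbuzz(n: int) -> str:
    if n % 3 == 0:
        return "Fizz"
    return str(n)

# Step 3 (refactor): improve the tests and the code while keeping all tests passing.
```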
76
-Derives tests from acceptance criteria as part of the system design process
-Tests are written before the part of the application is developed to satisfy the tests
Acceptance Test-Driven Development (ATDD)
77
Expresses the desired behavior of an application with test cases written in a simple form of natural language, which is easy to understand by stakeholders – usually using the Given/When/Then format. Test cases are then automatically translated into executable tests
Behavior-Driven Development (BDD):
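A sketch of the Given/When/Then structure, assuming Python. In practice the natural-language scenario lives in a stakeholder-readable file (e.g., Gherkin) and a BDD tool binds it to executable steps; here it is shown as comments over a plain test. The Account class is hypothetical.

```python
# Scenario: withdrawing money reduces the balance
#   Given an account with a balance of 100
#   When the user withdraws 30
#   Then the balance is 70

class Account:
    """Hypothetical domain object used by the scenario."""
    def __init__(self, balance: int) -> None:
        self.balance = balance

    def withdraw(self, amount: int) -> None:
        self.balance -= amount


def test_withdrawal_reduces_balance():
    # Given
    account = Account(balance=100)
    # When
    account.withdraw(30)
    # Then
    assert account.balance == 70
```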
78
promotes team autonomy, fast feedback, integrated toolchains, and technical practices like continuous integration (CI) and continuous delivery (CD). This enables the teams to build, test and release high-quality code faster through a DevOps delivery pipeline
DevOps
79
involves examining the source code for potential issues (like syntax errors, security vulnerabilities, or coding standard violations) without actually running the code.
static analysis
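For example, a static analyzer (such as flake8 or pylint) could flag issues in the snippet below without ever executing it; the snippet and the findings mentioned in the comments are illustrative.

```python
import os  # unused import: a typical static analysis finding


def is_admin(role):
    # 'is' compares identity, not equality; analyzers commonly flag
    # comparing a string literal with 'is' as a likely bug.
    if role is "admin":
        return True
    # The missing explicit return on this path is another common finding.
```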
80
Each is an instance of the test process, performed in relation to software at a given stage of development, from individual components to complete systems or, where applicable, systems of systems
test level
81
retrospective synonyms
project retrospective, retrospective meeting, post-project meeting
82
An automated software development procedure that merges, integrates and tests all changes as soon as they are committed.
continuous integration
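Conceptually, each commit triggers an automated job that integrates the change, builds it, and runs the tests. The Python sketch below only illustrates those steps; the actual commands depend on the project and the CI tool used.

```python
import subprocess
import sys

# Illustrative pipeline steps a CI job would run automatically on each commit.
STEPS = [
    ["git", "pull", "--ff-only"],   # integrate the latest changes
    ["pip", "install", "-e", "."],  # build and install the project
    ["pytest"],                     # run the automated test suite
]

for step in STEPS:
    print("running:", " ".join(step))
    if subprocess.run(step).returncode != 0:
        sys.exit("pipeline failed at: " + " ".join(step))

print("pipeline passed")
```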
83
Good practices of the shift-left approach
-Reviewing the specification from the perspective of testing: allows teams to catch defects in the planning stages, reducing rework later.
-Writing test cases before the code is written (TDD/ATDD): ensures code is developed to meet predefined test criteria.
-Using continuous integration (CI) and continuous delivery (CD): CI/CD provides fast feedback on code quality through automated tests and integration processes, helping detect issues early in the development process.
-Static analysis before dynamic testing: static analysis catches errors in the source code before it runs, preventing more complex issues in dynamic testing.
-Non-functional testing at the component level: allows performance, reliability, and security issues to be addressed early in development, reducing risks later in the SDLC.
84
A regular event in which team members discuss results, review their practices, and identify ways to improve.
retrospective
85
An approach that involves testing early, testing often, testing everywhere, and automating, in order to obtain feedback on the business risks associated with a software release candidate as rapidly as possible
Continuous testing
86
An approach to testing a system continuously in production
Shift right
87
An approach to software development in which the test cases are designed and implemented before the associated component or system is developed
Test-first
88
An approach to performing testing and quality assurance activities as early as possible in the software development lifecycle
Shift left
89
A type of testing initiated by modification to a component or system.
change-related testing
90
A type of acceptance testing performed to determine the compliance of the test object with relevant regulations.
Regulatory acceptance testing
(with a particular emphasis on ensuring the system adheres to governmental, legal, or safety standards)
91
A type of acceptance testing performed to verify whether a system satisfies its contractual requirements.
contractual acceptance testing
92
A group of software development methodologies based on iterative incremental development, where requirements and solutions evolve through collaboration between self-organizing cross-functional teams.
Agile software development
93
A collaborative approach to development in which the team is focusing on delivering expected behavior of a component or system for the customer, which forms the basis for testing.
behavior-driven development BDD
94
Benefits of DevOps
-Fast feedback on the code quality, and whether changes adversely affect existing code
-CI promotes a shift-left approach in testing (see section 2.1.5) by encouraging developers to submit high-quality code accompanied by component tests and static analysis
-Promotes automated processes like CI/CD that facilitate establishing stable test environments
-Increases the view on non-functional quality characteristics (e.g., performance, reliability)
-Automation through a delivery pipeline reduces the need for repetitive manual testing
-The risk in regression is minimized due to the scale and range of automated regression tests
95
Risks of DevOps
-The DevOps delivery pipeline must be defined and established
-CI/CD tools must be introduced and maintained
-Test automation requires additional resources and may be difficult to establish and maintain
96
Given the following statements about the relationships between software development activities and test activities in the software development lifecycle:
1. Each development activity should have a corresponding testing activity
2. Reviewing should start as soon as final versions of documents become available
3. The design and implementation of tests should start during the corresponding development activity
4. Testing activities should start in the early stages of the software development lifecycle
Which of the following CORRECTLY shows which are true and false?
True: 1, 4; False: 2, 3