Q&A3 Flashcards

(50 cards)

1
Q

Severity of the impact on the system and/or the product stakeholders is usually determined by what?

A

Technical behavior of the system

2
Q

Priority to fix the problem is usually determined by what?

A

Business impact of the failure

3
Q

What technique identifies the subsystem or component in which the defect lies?

A

Defect cluster analysis

4
Q

Distributed model

A

Alignments in methodologies

5
Q

Purpose of process improvements

A

?

6
Q

Structure of unorthodox (i.e., not following the norms or common practice) testing?

A

?

7
Q

What demotivates a tester?

A
  • working late despite a successful project
  • testing cut short, resulting in production defects

8
Q

Documentation standards is determined by whom?

A

Test Managers should be aware of standards and policies, and whether or not they are useful to apply.

Test Managers should determine the usefulness of the different standards in the context in which testing occurs.

9
Q

Requirements-based model?

A

?

10
Q

Which tool lifecycle stage involves porting a performance tool to new software?

A

Acquisition
Support and Maintenance
Evolution
Retirement

11
Q

Model failure modes of product?

A

?

12
Q

Model failure modes of product

A

?

13
Q

Test Policy?

A

Tbd

  • methodical
  • standard-compliant
  • model
14
Q

Software characteristics of an open-source application tool?

A
  • concurrency threshold
  • time limits?
  • memory leaks
  • coding standard issue
15
Q

Who has the best skills to test the new software?

A

Tbd
  • someone who worked in customer support and maintenance of the previously released software?
  • a former test manager?
16
Q

Who decides which test tool to use?

A

Tbd

  • test manager
  • pm
  • stakeholder
  • developer
17
Q

Current test progress: smoke tests 100% automated, regression tests 50%, functional tests 25%. The goal is to automate all tests by the end of the release. What should be done?

A

?

18
Q

Requirement: the wiper will increase speed through slow/medium/fast intermittent and slow/medium/fast constant settings
Test Condition: when moisture is increased to move the wiper speed from slow to fast
Test cases:
1.
2.
3.

  • the test cases meet the requirement 100%
  • the test cases meet the test condition 100%
  • per the test cases, the test condition is not met
  • per the test cases, the requirements are not met
A

Tbd

19
Q

Technique to use to reduce the number of defects

A

Tbd

  • cost of quality analysis
  • defect triage
20
Q

For distributed testing, the division of the test work across multiple locations MUST BE?

A

explicit and intelligently decided

21
Q

A review can eliminate what?

A

an issue at the requirements level before the problem is implemented into the code

22
Q

Which technique can help to enforce coding standards and check for problems that might be too laborious for the team to find by examination of the work product?

A

STATIC ANALYSIS
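As an illustrative sketch (not from the deck), static analysis can be demonstrated with Python's `ast` module: the checker below flags coding-standard problems without ever executing the code. The two rules it checks (snake_case names and missing docstrings) are invented for the example.

```python
import ast

# Sample source to analyze; note the non-conforming first function.
SOURCE = '''
def BadName(x):
    return x * 2

def good_name(x):
    """Double x."""
    return x * 2
'''

def check(source):
    """Return a list of (line, message) findings for the given source."""
    findings = []
    tree = ast.parse(source)
    for node in ast.walk(tree):
        if isinstance(node, ast.FunctionDef):
            # Rule 1 (example): function names should be snake_case.
            if not node.name.islower():
                findings.append((node.lineno, f"function '{node.name}' is not snake_case"))
            # Rule 2 (example): every function should have a docstring.
            if ast.get_docstring(node) is None:
                findings.append((node.lineno, f"function '{node.name}' has no docstring"))
    return findings

for line, msg in check(SOURCE):
    print(f"line {line}: {msg}")
```

Real static analyzers (linters, type checkers) work the same way at much larger scale, which is why they catch problems too laborious to find by manual examination.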

23
Q

Test metric that shows, as a percentage, how much has been tested (passed/failed/executed)?

A

Project Metrics
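A minimal worked example of such a progress metric (all counts are invented for illustration):

```python
# Execution-progress project metric: percent of planned tests
# executed, passed, and failed.
planned = 200
passed = 120
failed = 30
executed = passed + failed  # 150

print(f"executed: {100 * executed / planned:.0f}%")  # prints "executed: 75%"
print(f"passed:   {100 * passed / planned:.0f}%")    # prints "passed:   60%"
print(f"failed:   {100 * failed / planned:.0f}%")    # prints "failed:   15%"
```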

24
Q

Test Metrics: defects detected by testing?

A

Process Metrics

25
Q

Test Metrics: capability of an individual/group, implementation of test cases on schedule?

A

People Metrics

26
Q

Test Metrics: defect density?

A

Product Metrics
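As a worked example of the defect density product metric (the figures below are made up purely for illustration):

```python
# Defect density: defects found per unit of code size, commonly
# expressed per KLOC (thousand lines of code).
defects_found = 42
size_loc = 12_000  # lines of code in the component under test

defect_density = defects_found / (size_loc / 1000)  # defects per KLOC
print(f"{defect_density:.1f} defects/KLOC")  # prints "3.5 defects/KLOC"
```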
27
Q

Requirements-based testing may utilize what techniques?

A

1. Ambiguity reviews
2. Test condition analysis
3. Cause-effect graphing

28
Q

Which technique can reduce an extremely large testing problem to a manageable number of test cases, provide 100% functional coverage of the test basis, and identify gaps in the test basis during test case design, surfacing defects early in the SDLC when test design is started against the draft requirements?

A

Cause-effect graphing
29
Q

Model-based approach?

A

Operational profiles: a mix of use cases, users (personas), inputs, and outputs that depict the real-world use of the system
  • Functionality
  • Usability
  • Interoperability
  • Reliability
  • Security
  • Performance

30
Q

Methodical approach?

A

Checklists:
  • what to test
  • how much
  • in what order
31
Q

  • Bug clusters are the focus of testing
  • Tends to miss major areas that are important but not suffering from a large number of bugs
  • Dynamic testing

A

Reactive approach

32
Q

The test team:
  • selects tests,
  • allocates test effort, and
  • initially prioritizes tests during the requirements phase, with periodic adjustments

A

Sequential V-model
33
Q

Breakdown in the test process occurs during which SDLC process?

A

Design and Implementation

34
Q

Allocation and prioritization are determined when?

A

Test Planning

35
Q

Tests are run and defects found; testers can then examine the remaining residual risk level. Which approach?

A

Risk-based testing

36
Q

The Test Manager can measure the degree to which testing is complete during which process?

A

Results Reporting and Exit Criteria Evaluation

37
Q

The Test Manager should evaluate metrics and success criteria pertinent to the needs and expectations of the testing stakeholders, including customers' and users' needs and expectations in terms of quality. Which activity?

A

Test closure
38
Q

Test Policy?

A

Test objectives

39
Q

Test Strategy?

A

Test methodology

40
Q

Master Test Plan or Project Test Plan?

A

Implementation of test strategy

41
Q

Level Test Plan?

A

Particular activities to be carried out within each test level
42
Q

Analytical strategies?

A

Risk-based testing | The test team analyzes the test basis to identify the test conditions to cover

43
Q

Model-based strategies?

A

Operational profiling

44
Q

Methodical strategies?

A

Uses a predetermined set of test conditions | ISO 25000
45
Q

Process- or standard-compliant strategies?

A

Scrum Agile management technique - in each iteration, testers:
  • analyze the user stories that describe features
  • estimate test effort for each feature as part of the planning process
  • identify test conditions for each user story
  • execute tests covering those conditions
  • report the status of each user story (untested, failing, passing) during test execution

46
Q

Reactive strategies?

A

Defect-based attacks
  • the team waits to design and implement tests until the software is received
  • EXPLORATORY TESTING

47
Q

Consultative strategies?

A

User-directed testing - inputs from stakeholders determine the test conditions to cover
  • pairwise testing (high-priority options)
  • equivalence partitioning (lower-priority options)
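The two techniques on this card can be sketched together, assuming invented parameters and values: equivalence partitioning supplies one representative value per partition, and a naive greedy selector then shrinks the full combination set to one that still covers every pair of values (pairwise, i.e. 2-wise, coverage). This is illustrative only, not a production pairwise generator.

```python
from itertools import combinations, product

# One representative value per equivalence partition of each parameter.
# Names and values are invented for the example.
params = {
    "browser": ["Chrome", "Firefox"],
    "os": ["Windows", "Linux", "macOS"],
    "user": ["guest", "admin"],
}
names = list(params)

def pairs_of(row):
    """All (param, value) pairs covered by one test case."""
    cells = list(zip(names, row))
    return set(combinations(cells, 2))

# Every cross-parameter value pair that must appear in at least
# one test case for pairwise coverage.
required = set()
for combo in product(*params.values()):
    required |= pairs_of(combo)

chosen, uncovered = [], set(required)
while uncovered:
    # Greedily pick the candidate covering the most uncovered pairs.
    best = max(product(*params.values()),
               key=lambda r: len(pairs_of(r) & uncovered))
    chosen.append(best)
    uncovered -= pairs_of(best)

print(f"full product: {len(list(product(*params.values())))} cases")
print(f"pairwise set: {len(chosen)} cases")
```

The full Cartesian product here has 12 cases; the greedy pairwise set is noticeably smaller while still exercising every pair of values, which is why pairwise testing is reserved for the high-priority options.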
48
Q

Regression-averse strategies?

A

GUI-based test automation tool | Regression automation

49
Q

Overall governance of the testing effort?

A

Master Test Plan

50
Q

Less formal projects?

A

Test plan | with all informational elements