Software Systems and Engineering Flashcards

1
Q

Software Life Cycle Stages (The Waterfall Model)

A

Analysis -> Design -> Implementation -> Building -> Testing -> Deployment -> Maintenance

2
Q

Software Life Cycle Stages - Analysis

A

Describing the aspects of software development for which there is no choice, i.e. those to which the project has already committed.

3
Q

Software Life Cycle Stages - Design

A

Defining how project goals are going to be achieved.

4
Q

Software Life Cycle Stages - Implementation

A

The process of writing code, typically partitioned into many subprojects.

5
Q

Software Life Cycle Stages - Building

A

Creating a “complete” version of the software (i.e. putting all chunks of the code together).

6
Q

Software Life Cycle Stages - Testing

A

Making sure that small independent parts of the code work correctly, that code parts work together, that functionality meets requirements, and that new code doesn’t break the old.

7
Q

Software Life Cycle Stages - Deployment

A

Actual release of software into end user environment (e.g. application on Desktop or App Store).

8
Q

Software Life Cycle Stages - Maintenance

A

Supporting the software during its lifetime (e.g. releasing compatibility updates or improving performance).

80-90% of software system ‘Total Cost of Ownership’ is attributed to maintenance.

9
Q

Traditional Software Engineering Team

A

Architect, Project Manager, Lead Programmer, Programmer, Tester

10
Q

Waterfall Model - Advantages

A
  • Early clarification of system goals
  • Can charge for changes to the requirements
  • Works well with management tools
11
Q

Waterfall Model - Disadvantages

A
  • Iterations critical to software development process (requirements not always understood and environments always changing)
  • Lot of time spent on designing system and functional specification
  • Requirement changes costly to implement and may take too long (changes made may not be relevant later)
  • Unsuitable if requirements unable to be defined in advance (due to uncertainty and necessary experimentation)
12
Q

The Spiral Model

A

Iterative development combined with the systematic aspects of the waterfall model. Each loop of the spiral is a phase, and the number of loops depends on risk. Incremental refinement, with the spiral’s radius representing the cumulative cost.

13
Q

The Evolutionary Model

A

Iterative (with feedback provided by users) and incremental (development cycle consists of smaller incremental waterfall models) delivery.

14
Q

What to test? - Unit Tests

A

Code that exercises a small unit (module) of software separate from other units. Foundation of other types of testing.
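
As a minimal sketch of the idea (the function and all names here are hypothetical, not from the source), a unit test exercises one function in isolation from the rest of the system:

```python
# Hypothetical unit under test: a standalone pricing function.
def apply_discount(price, percent):
    """Return price reduced by percent, never below zero."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be in [0, 100]")
    return round(price * (1 - percent / 100), 2)

# Unit tests: exercise the function separately from any other module.
def test_apply_discount():
    assert apply_discount(100.0, 25) == 75.0
    assert apply_discount(100.0, 0) == 100.0
    assert apply_discount(100.0, 100) == 0.0

test_apply_discount()
```

In practice a framework such as unittest or pytest would discover and run tests like this automatically.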

15
Q

What to test? - Integration Tests

A

Test how all components (modules) interact with each other.
Potentially the largest source of bugs.
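
A minimal sketch with two hypothetical modules (names are illustrative): the integration test checks the interface between them rather than either unit alone:

```python
# Hypothetical module A: a parser.
def parse_csv_line(line):
    """Split a CSV line into stripped fields."""
    return [field.strip() for field in line.split(",")]

# Hypothetical module B: a formatter.
def format_record(fields):
    """Join fields into a display string."""
    return " | ".join(fields)

def test_parse_then_format():
    # Integration test: verify the two units interact correctly
    # across their shared interface (a list of strings).
    fields = parse_csv_line(" alice , 42 ")
    assert format_record(fields) == "alice | 42"

test_parse_then_format()
```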

16
Q

What to test? - Validation and verification

A

Validation: test whether the system matches user needs. Verification: test whether the functional requirements are met.

17
Q

What to test? - Resource exhaustion, errors, recovery

A
  • Memory
  • CPU bandwidth
  • Disk space
  • Video resolution
  • Network bandwidth
18
Q

What to test? - Performance/stress testing

A

Testing as the number of users / transactions / connections increases (scalability)
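
The idea can be sketched as timing a hypothetical workload at increasing load levels (all names here are illustrative, not from the source):

```python
import time

def handle_transactions(n):
    """Hypothetical workload: process n simulated transactions."""
    total = 0
    for i in range(n):
        total += i * i  # stand-in for per-transaction work
    return total

def measure(n):
    """Return the wall-clock time taken to handle n transactions."""
    start = time.perf_counter()
    handle_transactions(n)
    return time.perf_counter() - start

# Scalability check: does the time grow acceptably with the load?
for load in (1_000, 10_000, 100_000):
    print(f"{load:>7} transactions: {measure(load):.4f}s")
```

A real stress test would push the load until resource limits are hit and record where throughput degrades.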

19
Q

What to test? - Regression Tests

A

Compares the output of current tests with previously known values. Makes sure that fixing bugs now doesn’t break something else.
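
One common shape for a regression test, sketched with a hypothetical function and hand-captured “golden” values (all names are illustrative):

```python
def word_count(text):
    """Hypothetical function under regression test."""
    return len(text.split())

# Known-good outputs captured from an earlier, trusted run.
GOLDEN = {"hello world": 2, "one": 1, "": 0}

def run_regression():
    """Compare current outputs against the stored known values."""
    failures = []
    for text, expected in GOLDEN.items():
        actual = word_count(text)
        if actual != expected:
            failures.append((text, expected, actual))
    return failures

# An empty failure list means no previously working case has broken.
assert run_regression() == []
```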

20
Q

Types of Test Data

A

Real-world: “typical” data.
Synthetic: artificially generated, used when there is not enough real data or when specific statistical properties are needed.

21
Q

Test Planning -
Q1 Technology-facing / Supporting Development

A

Unit, component, and deployment tests.
Technology-facing tests are written and maintained exclusively by developers.

22
Q

Test Planning -
Q2 Business-facing / Supporting Development

A

Testing Quadrant:
Functional, Prototypes, Simulations
Acceptance tests conducted by the customer to verify that the system meets their criteria. These should be written by customers before development.

23
Q

Test Planning -
Q3 Business-facing / Critique the Product

A

Testing Quadrant:
Exploratory, Usability, User acceptance, Scenarios, Alpha/Beta, Showcasing

24
Q

Test Planning -
Q4 Technology-facing / Critique the Product

A

Testing Quadrant:
Performance and Load tests, Security

25
Q

Automated Testing

A

Goal of “coverage”: cover the largest practically possible number of cases.

26
Q

Profiling Tools and Analytics

A
  • Software Performance
  • Memory Allocation
  • Execution times (“bottlenecks”)
  • Metrics
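
The bullet points above can be sketched with Python’s standard cProfile profiler; `slow_part` is a deliberately heavy hypothetical function that the report should expose as the bottleneck:

```python
import cProfile
import io
import pstats

def slow_part():
    """Deliberately heavy: the bottleneck we expect profiling to reveal."""
    return sum(i * i for i in range(50_000))

def fast_part():
    return 42

def workload():
    for _ in range(20):
        slow_part()
    fast_part()

def profile_report():
    """Run the workload under cProfile and return a stats report string."""
    profiler = cProfile.Profile()
    profiler.enable()
    workload()
    profiler.disable()
    buf = io.StringIO()
    pstats.Stats(profiler, stream=buf).sort_stats("cumulative").print_stats()
    return buf.getvalue()

report = profile_report()
print(report)  # per-function call counts and execution times
```

Sorting by cumulative time is what surfaces “bottleneck” functions; memory allocation would need a separate tool such as tracemalloc.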
27
Q

Extreme Programming (XP)

A

Driving factors: improve software quality and responsiveness to changing requirements, reduce cost of change.

“Stay aware, Adapt, Change”

28
Q

XP Values

A
  • Communication
  • Simplicity
  • Feedback
  • Courage
  • Respect
29
Q

XP Principles

A
  • Humanity: meet needs
  • Economics: business values
  • Mutual benefit: automation, clean code
  • Self-similarity
  • Improvement
  • Diversity: skills, perspectives
  • Reflection: analyse success/failure
30
Q

XP Practices

A

Whole Team, Informative Workspace, Energised Work, Pair Programming, Stories, Weekly cycles and quarterly planning, Incremental Design, Team Continuity, Single Code Base, Shared Code, Daily Deployment

31
Q

Continuous Integration (CI)

A

Frequent integration and testing of changes. Check in changes, complete the build and run entire test suite before going any further.

Key Ingredients: source repository, automated build, team agreement.

32
Q

CI Key Practices

A
  • Maintain source repo
  • Automate builds
  • Make build self-testing
  • Daily commits
  • Easy access
  • Transparency of events
  • Automate deployment
33
Q

Software Innovation

A

Provide systems which change the practices of their user communities
- Novelty: not developed before
- Utility: a form of application that users value and are prepared to pay for

34
Q

Startup -
Advantages

A
  • Teams of dedicated people
  • Nothing to lose
  • No established business models to protect
  • No targets to meet (e.g. shareholders)
35
Q

Entrepreneurship
Defining the Idea

A
  • Product or Service?
  • Established or new market?
  • Growing or saturated?
  • Past success, learning experience
  • Accessible market?
  • Size and potential revenue?
36
Q

Business Model Canvas

A
  • Key Partners
  • Key Activities
  • Key Resources
  • Value Propositions
  • Customer Relationships
  • Channels
  • Customer Segments
  • Cost Structure
  • Revenue Streams
37
Q

Engineering Team -
Architect

A

The principal designer who defines the overall architecture, module structure and major interfaces

38
Q

Engineering Team -
Project Manager

A

Responsible for scheduling work, tracking progress and ensuring all steps are completed (on time/budget)

39
Q

Engineering Team -
Lead Programmer

A

Leader of a programming team. Spends typically 30% of time managing the rest of the team

40
Q

Engineering Team -
Programmer

A

Implements specific modules and requirements for the system

41
Q

Engineering Team -
Tester

A

Designs test and validation procedures for the completed software based on initial specification and overall product

42
Q

Deployment Pipeline

A
  1. Developers commit changes to source repo; first stage compiles code and runs tests; assembles and stores executable.
  2. Longer running automated acceptance tests.
  3. Pipeline branches to enable independent deployment of builds in environments.
43
Q

Benefits of Continuous Deployment

A
  • Allows more frequent feedback from colleagues/users
  • Can release important features earlier
  • Being closer to and understanding users can help developers create something new
  • Delivering better reliability and stability
  • Automating repeated tasks saves time
44
Q

Costs of Continuous Deployment

A
  • More intense collaboration between departments
  • More investment in automation
  • More effort required to deploy regularly
45
Q

Continuous Deployment

A

Extension of Continuous Integration (CI) that constantly runs a deployment pipeline to test whether a build is in a state to be delivered.

46
Q

Agile methodology -
Advantages

A
  • Based on assumption of changing requirements, technology, users and environments.
  • Concentrates on improving response to change
  • Allows exploration and experimentation to understand system
  • Release important features quickly
  • Frequent feedback from users
  • Understand users better and requirements
  • Automating repeated tasks saves costs and resources + improves reliability
47
Q

Agile methodology -
Disadvantages

A
  • Investment in automation and infrastructure
  • More effort required to deploy regularly
  • Automated tests and build scripts become a maintenance overhead
  • Close communication between various teams required
48
Q

Software Development Team -
Architect

A

The principal designer, defines the overall architecture, module structure and all major interfaces, usually also an expert in the associated technology. Responsible for specification and high level design.

49
Q

Software Development Team -
Project Manager

A

Responsible for scheduling/rescheduling work, tracking progress and ensuring all the process steps are properly completed (on time, on budget).

50
Q

Software Development Team -
Lead Programmer

A

Leader of a programming team. Typically spending 30% of time managing the rest of the team.

51
Q

Software Development Team -
Programmer

A

Implements specific modules and often implements module test procedures.

52
Q

Software Development Team -
Tester

A

Designs test and validation procedures for the completed software. Tests are based on the initial specification and will focus on the overall product rather than individual modules.

53
Q

Potential issues with software project

A
  • Inconsistent or not well investigated system requirements
  • System requirements improperly translated into software requirements.
  • Some aspects or issues may not be discovered till analysis stage.
  • Improper management of time and resources – over time and over budget.
  • Software may fail to meet functional specification or acceptance criteria.
  • Software may not perform in realistic real world conditions.
  • Unintended delays due to bugs or problems with development
  • Changing requirements in the middle of development process.
  • Team discontinuity – inexperience with problem and software
  • User may not use system as intended
  • Safety or security requirements might not be met
54
Q

Mitigation of software project risks

A
  • Critical to get functional, safety, security and other requirements correct through thorough analysis
  • Expert advice from primary domain (and security, regulatory, etc.) and user research
  • Effective project management tools to develop project structure
  • Effective testing strategy including: unit, integration, verification and validation, resource exhaustion and errors, performance testing, testing under load, usability tests
55
Q

Waterfall Model -
Considerations

A
  • Cost of core functionality not working is high (experiment then needs to be repeated which wastes limited resources).
  • Analysis stage conducted in depth through user research and expert opinion to consolidate requirements.
  • Conduct full and detailed risk analysis.
  • System needs to be made reliable to ensure results are valid and reproducible.
  • Consider regulatory requirements, testing to meet standards, requiring formal documentation.
  • Can be extended with visualization or reporting tools using an iterative process at the final stage.
56
Q

Waterfall model -
Stages

A
  • System requirements
  • Software requirements
  • Analysis
  • Design
  • Implementation
  • Testing
  • Operations
57
Q

Waterfall model stages -
Requirements and analysis

A

The environment and processes in which software will be used need to be analyzed to establish operational parameters and required interfaces. Product requirements documentation is produced in compliance with relevant regulations.

58
Q

Waterfall model stages -
Design

A

Software architecture is defined to meet functional specifications. The design is documented in relation to requirements being addressed.

59
Q

Waterfall model stages -
Implementation

A

Code is produced and tested at a component level to test functionality.

60
Q

Waterfall model stages -
Testing

A

The test specification is developed and the product is tested.

61
Q

Waterfall model stages -
Operations

A

Requirements and guidelines are established for installation, updates and maintenance of the product.

62
Q

Development process tools

A
  • Source code repository
  • Project management tools (e.g. Jira)
  • Automation tools for CI/CD (e.g. Jenkins)