Chapter 4 Definitions Flashcards

(16 cards)

1
Q

Experiment

A

Experiment
In probability, an experiment is a process or action that leads to one or more outcomes. It is repeatable under the same conditions and is used to gather data. An experiment could be random (like rolling a die) or controlled (like testing a drug). The result of each performance of the experiment is called an outcome. Experiments form the foundation for calculating probabilities.

2
Q

Element

A

Element
An element is an individual object or member of a set. In the context of Venn diagrams, an element is a single item placed within a circle to show that it belongs to that particular set.
Simply put, elements are what make up a set, and identifying elements helps us understand which items belong to which groups.

3
Q

Event

A

Event
* An event is a specific set of outcomes from a probability experiment.
* It can include one outcome (a simple event) or several outcomes (a compound event). For example, rolling an even number on a die (2, 4, 6) is an event.
* Events are usually denoted using capital letters such as A, B, or C.
* The probability of an event measures the chance that one of the outcomes in that set occurs (see the Python sketch below).

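For a concrete picture, here is a small Python sketch (not from the chapter itself; the names sample_space, simple_event and compound_event are purely illustrative) that treats an event as a subset of the sample space:

```python
# Illustrative only: events as subsets of a sample space (fair six-sided die).
sample_space = {1, 2, 3, 4, 5, 6}   # all possible outcomes

simple_event = {3}                  # one outcome: "roll a 3"
compound_event = {2, 4, 6}          # several outcomes: "roll an even number"

# An event should only contain outcomes that appear in the sample space.
assert simple_event <= sample_space and compound_event <= sample_space

roll = 4
print(roll in compound_event)       # True: this roll satisfies the event
```
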
4
Q

Sample Space

A

Sample Space
* The sample space is the complete set of all possible outcomes of a probability experiment.
* It is usually written using curly brackets, e.g. {1, 2, 3, 4, 5, 6} for a fair six-sided die.
* Each outcome in the sample space is called a sample point.
* The sample space helps to define probabilities by showing all options available. In notation, it is often represented by the symbol S or Ω (see the Python sketch below).

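As an optional illustration (the variable names die and two_coins are invented for this sketch), a sample space can be written out explicitly, either by hand or with a little Python:

```python
from itertools import product

# Illustrative only: listing sample spaces and counting their sample points.
die = {1, 2, 3, 4, 5, 6}              # one fair six-sided die

# Tossing a coin twice: every ordered pair of results is a sample point.
two_coins = set(product("HT", repeat=2))
print(sorted(two_coins))              # [('H', 'H'), ('H', 'T'), ('T', 'H'), ('T', 'T')]

print(len(die), len(two_coins))       # 6 sample points and 4 sample points
```
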
5
Q

Equally Likely

A

Equally Likely
* Outcomes are said to be equally likely if each one has the same chance of occurring.
* This assumption is crucial when using the classical definition of probability. For instance, each side of a fair coin (heads or tails) is equally likely with a probability of 0.5.
* If outcomes are not equally likely, more advanced methods must be used to calculate probability.
* Equally likely outcomes make calculating basic probabilities straightforward.

6
Q

Probability

A

Probability
Probability is a measure of how likely an event is to occur, expressed as a number between 0 and 1. A probability of 0 means the event is impossible, while a probability of 1 means it is certain. The classical formula, which assumes all outcomes are equally likely, is:

P(Event) = Number of favourable outcomes / Total number of possible outcomes

Probabilities can also be expressed as percentages or fractions. In statistics, probabilities are used to make predictions and assess risk.
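
To make the classical formula concrete, here is a brief Python sketch (illustrative only; the die example and variable names are assumptions, not taken from the chapter):

```python
from fractions import Fraction

# Illustrative only: classical probability with equally likely outcomes.
sample_space = {1, 2, 3, 4, 5, 6}    # fair six-sided die
event = {2, 4, 6}                    # favourable outcomes: "roll an even number"

p_event = Fraction(len(event), len(sample_space))
print(p_event)                       # 1/2
print(float(p_event))                # 0.5, i.e. a 50% chance
```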

7
Q

Outcome

A

Outcomes
Outcomes are the individual possible results of a probability experiment. Each time the experiment is performed, exactly one outcome occurs. For example, when a coin is tossed, “heads” and “tails” are the possible outcomes. Outcomes can be listed to form the sample space. Understanding outcomes is the first step in analysing any probabilistic situation.

8
Q

Sample

A

Sample
A sample is a smaller, manageable group selected from a larger population, used to represent the whole. In statistics, we study samples to make inferences about the population. The quality of a sample affects the accuracy of probability estimates. Sampling must be random to avoid bias and increase reliability. Larger samples generally provide more accurate and stable probability results.

9
Q

Theoretical Calculation

A

Theoretical Calculation
A theoretical calculation in probability is based on known mathematical principles and assumes that all outcomes are equally likely. It does not rely on actual experiments but instead uses formulas and logical reasoning. For example, the theoretical probability of rolling a 3 on a fair six-sided die is 1/6. These calculations are most effective under controlled conditions or within simple, idealised models. They are useful for predicting what should happen in perfect, predictable situations.

10
Q

Experimental Calculation

A

Experimental Calculation
An experimental calculation of probability uses actual results from repeated trials or real-life data. For example, if a coin lands on heads 57 times out of 100 tosses, the experimental probability is 57/100. This type of probability reflects what did happen, not what should happen. As the number of trials increases, the experimental probability tends to approach the theoretical probability—this is known as the Law of Large Numbers. Experimental probability is especially useful when theoretical probabilities are difficult to calculate or when testing for fairness in practical situations.
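
A short simulation can show this convergence. The sketch below is illustrative only (the seed and trial counts are arbitrary choices), but it mirrors the coin-tossing example above:

```python
import random

# Illustrative only: experimental probability of heads for growing numbers of tosses.
random.seed(1)  # fixed seed so the sketch gives the same numbers on each run

for trials in (10, 100, 10_000):
    heads = sum(random.choice("HT") == "H" for _ in range(trials))
    print(f"{trials} tosses -> experimental probability {heads / trials}")
```

As the number of tosses grows, the printed proportions typically drift towards the theoretical value of 0.5.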

11
Q

Proportion

A

Prop (Proportion)
In statistics, prop is short for proportion, which represents a part of a whole. It is often used to show how frequently an event occurs, usually written as a fraction or decimal. For example, if 3 out of 20 students are left-handed, the proportion is 3/20 = 0.15. Proportions are often converted to percentages for easier interpretation. They are used in both experimental and theoretical probability calculations.

12
Q

Theoretical Probability

A

Theoretical Probability
Theoretical probability is the likelihood of an event occurring, based on known facts and the assumption that all outcomes are equally likely. It is calculated using the formula:

P(Event) = Number of favourable outcomes / Total number of possible outcomes

This type of probability is ideal for simple events, such as tossing a coin or rolling a die. It provides an expected result based on logic and reasoning, not on actual observation. Theoretical probability is used to model ideal conditions and to guide expectations in practical situations.

13
Q

Idea of Chance

A

Idea of Chance
The idea of chance refers to the unpredictability of outcomes in random events. It recognises that while we may not know what will happen in a single event, we can predict long-term patterns. For instance, while we can’t say if a die will roll a 6 next time, we know the chance is 1 in 6. The concept of chance underpins all probabilistic thinking and is key to making decisions under uncertainty. It helps quantify uncertainty in a structured, logical way.

14
Q

Venn Diagram

A

Venn Diagram
A Venn diagram is a visual representation used to show relationships between different sets or events. It uses circles to represent events, with overlapping areas showing common outcomes (intersections). The diagram is drawn inside a rectangle that represents the entire sample space. Venn diagrams are particularly useful for understanding unions, intersections, and complements. They are widely used in probability to illustrate combined and separate event probabilities clearly.
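
For readers who find code helpful, the regions of a two-event Venn diagram correspond directly to set operations. The sketch below is illustrative only, with made-up events A and B on a fair die:

```python
# Illustrative only: Venn-diagram regions as Python set operations.
sample_space = {1, 2, 3, 4, 5, 6}   # the surrounding rectangle
A = {2, 4, 6}                       # event A: even numbers
B = {4, 5, 6}                       # event B: numbers greater than 3

print(A | B)                 # union (A or B):         {2, 4, 5, 6}
print(A & B)                 # intersection (A and B): {4, 6}
print(sample_space - A)      # complement of A:        {1, 3, 5}
```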

15
Q

Whole Sample Space

A

Whole Sample Space
The whole sample space refers to the entire set of all possible outcomes in an experiment. It is the universal set, usually denoted by S, and is drawn as the surrounding rectangle in a Venn diagram. Its probability is always equal to 1 (or 100%), because some outcome from the sample space must occur. Every event or outcome considered in probability is a subset of the sample space. Correct identification of the whole sample space is vital for accurate probability calculations.

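Here is a tiny illustrative check of that fact (the fair-die probabilities are an assumption made for this sketch):

```python
from fractions import Fraction

# Illustrative only: outcome probabilities across the whole sample space sum to 1.
sample_space = {1, 2, 3, 4, 5, 6}
p = {outcome: Fraction(1, 6) for outcome in sample_space}   # fair die

print(sum(p.values()))        # 1
print(sum(p.values()) == 1)   # True: some outcome must occur, so P(S) = 1
```
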
16
Q

Value

A

Value
In probability, a value typically refers to the result assigned to an outcome, especially in numerical contexts (e.g. the value shown on a die). Values are used to compute statistics such as mean, variance, or expected value. They can also refer to the probability value itself, ranging from 0 to 1. Understanding values allows you to calculate weighted probabilities and interpret data meaningfully. Values become essential in discrete and continuous probability distributions.