Chapter 16 - Decision Analysis Flashcards

1
Q

What is the “first big” difference between what happens here and what we did in LP, IP and NLP?

A

Previously we viewed the world as deterministic, where we also assumed that everything was “known”.

What we do now is drop those assumptions. We will now look at decisions under uncertainty.

2
Q

How do we picture the process of decision analysis?

A

We have a decision maker who must choose ONE decision alternative from among many different alternatives.
We consider a discrete set of alternatives. This contrasts with LP, where the divisibility assumption gives us infinitely many (continuous) alternatives.

Furthermore, we assume that the outcome of the decision is determined by random factors outside the decision maker's control.
Each outcome has a prior probability.
Each combination of decision and outcome yields a payoff.

The overview of the alternatives, their probabilities and their payoffs gives us the basis for deciding which alternative to choose.

3
Q

Look at the table. How do we make a decision in a situation like this?

A

It depends on what kind of criterion we use. There are several decision criteria, e.g. the maximin criterion (pessimistic).

4
Q

Explain the maximin criterion

A

Maximin is about choosing the decision that gives us the “best of the worst”. That is, for every decision we can make, we look at the outcomes it can lead to and note the single worst payoff it can give. We then choose the alternative whose worst payoff is the highest/best.

Maximin is often called pessimistic because it only considers the worst thing that can happen. Since that worst case may well be unlikely, it can be beneficial to use other decision criteria.

Formal notation is given in the image.
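
A minimal sketch of the maximin rule in Python, assuming a hypothetical payoff table with one row per alternative and one column per state of nature (the numbers echo the oil example used later in these cards):

# Hypothetical payoff table: rows = alternatives, columns = states of nature (oil, no oil).
payoffs = {
    "drill": [700, -100],
    "sell":  [90, 90],
}

def maximin(payoffs):
    # For each alternative, keep its worst payoff; then pick the alternative
    # whose worst payoff is the largest ("best of the worst").
    return max(payoffs, key=lambda alt: min(payoffs[alt]))

print(maximin(payoffs))  # -> "sell" (worst cases: drill = -100, sell = 90)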

5
Q

What is the maximax criterion?

A

Maximax is about looking at the best possible outcome of each individual decision, and then choosing the best of these. Extremely optimistic.
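
A maximax sketch under the same assumed payoff table as in the maximin card above; only the inner min becomes a max:

payoffs = {"drill": [700, -100], "sell": [90, 90]}   # same assumed table as above

def maximax(payoffs):
    # Pick the alternative whose best payoff is the largest ("best of the best").
    return max(payoffs, key=lambda alt: max(payoffs[alt]))

print(maximax(payoffs))  # -> "drill" (best cases: drill = 700, sell = 90)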

6
Q

Which decision criterion is best?

A

It does not quite make sense to talk about a best or worst criterion. It all depends on what we as a person/decision maker think is wise. Sometimes that means securing the highest possible worst-case payoff; other times other strategies make more sense.

7
Q

Explain the “maximum likelihood criterion”

A

We first identify the state of nature that is most likely to occur, and then we choose the alternative with the highest payoff within that most likely state.
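
A small sketch, assuming the same hypothetical payoff table and a made-up prior of 1/4 oil vs 3/4 no oil:

payoffs = {"drill": [700, -100], "sell": [90, 90]}   # assumed payoff table
priors = [0.25, 0.75]                                # assumed P(oil), P(no oil)

def maximum_likelihood(payoffs, priors):
    # Find the most likely state of nature, then pick the alternative
    # that pays best within that state.
    likely_state = priors.index(max(priors))
    return max(payoffs, key=lambda alt: payoffs[alt][likely_state])

print(maximum_likelihood(payoffs, priors))  # -> "sell" ("no oil" is most likely; 90 beats -100)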

8
Q

What is the main criticism of maximin, maximax and maximum likelihood?

A

We use very little of the information we have available. Each of these methods only looks at a small part of what is going on, e.g. picking the most likely state of nature while ignoring the payoffs of states that are only slightly less likely. As a result, we risk making poor decisions.

9
Q

Explain Bayes' criterion

A

It is about maximizing expected payoff based on the probabilities. This is, in some ways, more robust than the earlier criteria.

Bayes' rule is at its best when we face many decisions in sequence, because working with the probabilities means we maximize expected payoff in the long run.

The criterion works as follows: for each possible decision, multiply its payoff under a specific state of nature by the probability of that state. We do this for all states and sum the products.
We repeat this for every decision.
Then we choose the decision with the highest expected payoff.

max over decisions of { ∑ payoff × probability }
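
A minimal sketch of the rule, reusing the assumed payoff table and priors from the cards above:

payoffs = {"drill": [700, -100], "sell": [90, 90]}   # assumed payoff table
priors = [0.25, 0.75]                                # assumed prior probabilities

def bayes_decision_rule(payoffs, priors):
    # Expected payoff of an alternative = sum over states of payoff * probability.
    expected = {alt: sum(pay * prob for pay, prob in zip(row, priors))
                for alt, row in payoffs.items()}
    return max(expected, key=expected.get), expected

print(bayes_decision_rule(payoffs, priors))
# -> ('drill', {'drill': 100.0, 'sell': 90.0}), since 0.25*700 + 0.75*(-100) = 100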

10
Q

What do we do if we are not sure that our probabilities are correct, or if we do not know them at all?

A

We then perform sensitivity analysis on the probabilities. What we do is look at how the expected payoff changes for ALL the alternatives/decisions available to us as we vary the probability of a state. This assumes only two possible states of nature, so that the expected payoff becomes a linear function of a single probability p.

For example:
E[soy] = 35p + 8(1 - p) = 27p + 8
We do the same for all decisions and plot the lines in a graph. Then we can see whether some decisions are dominated by others, and how much slack there is between the lines in terms of probability. Say we believe the state has probability 40%. If it turns out that a decision is best as long as p is greater than 30%, we have a margin of 10 percentage points.
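
A small sketch (using the assumed two-state oil payoffs from the earlier cards) that turns each row of the payoff table into a line E(p) = payoff_state1 * p + payoff_state2 * (1 - p):

def expected_payoff_lines(payoffs):
    # Return (slope, intercept) of E(p) for each alternative, where p = P(state 1).
    lines = {}
    for alt, (pay_state1, pay_state2) in payoffs.items():
        lines[alt] = (pay_state1 - pay_state2, pay_state2)
    return lines

print(expected_payoff_lines({"drill": [700, -100], "sell": [90, 90]}))
# -> {'drill': (800, -100), 'sell': (0, 90)}, i.e. E_drill(p) = 800p - 100 and E_sell(p) = 90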

11
Q

What is the difference between conditional probability and Bayes' theorem?

A

Conditional probability just gives P(A|B) = P(A ∩ B) / P(B),

while Bayes' theorem gives us P(B|A) expressed in terms of P(A|B).

This means that if we know P(A|B), for example the probability that a disease produces a specific symptom, we can use it to find the probability that a person has the disease given the symptom. Why is this useful? The alternative would be plain conditional probability: the probability of having both the symptom and the disease, divided by the probability of having the symptom. The problem is that the probability of having the symptom can change considerably.

With Bayes we get a more robust model.
P(B|A) = P(A|B)P(B) / (P(A|B)P(B) + P(A|!B)P(!B))
That is, the probability of having the disease given the symptom equals the probability of the symptom given the disease times the probability of having the disease, divided by the total probability of having the symptom.
IMPORTANT: We never plug in P(A) directly, which would be the marginal probability of the symptom. This matters because P(A) represents the probability of the “effect”, which is hard to pin down and can change, whereas the probability of the symptom given the disease is stable.
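
A tiny numeric sketch of the formula above; the three input probabilities are made up for illustration:

# Assumed inputs: P(symptom|disease), P(disease), P(symptom|no disease).
p_sym_given_dis = 0.9
p_dis = 0.01
p_sym_given_no_dis = 0.05

# Bayes' theorem: P(disease | symptom)
numerator = p_sym_given_dis * p_dis
denominator = numerator + p_sym_given_no_dis * (1 - p_dis)
print(numerator / denominator)  # ~0.15, even though the symptom appears in 90% of disease cases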

12
Q

What is experimentation?

A

Experimentation is about performing some sort of test to gain evidence. The general idea is that we pay a small amount for some extra information, and this information lets us make a much better decision than we otherwise could. In other words, we consider the information valuable, and we can in fact put a price on it. More on this later.

The important part is that decision analysis very often involves experimentation precisely because it helps us make better decisions.

13
Q

In this chapter, what is “decision making”?

A

Decision making is about choosing one alternative from a set of discrete alternatives. The decision itself will be chosen based on some criterion.

14
Q

Why are these decisions subject to uncertainty?

A

We generally speak of the “state of nature”. There are very often factors that are simply outside of our control. We may never know for sure whether there is oil in a field unless we drill for it, but we probably know something about the probabilities.

15
Q

A payoff is associated with each …

A

… with each alternative AND state of nature. Alternative refers to the specific decision we can make, and the state of nature refers to the possible outcomes that can follow if we select the specific decision alternative.

16
Q

What is the game theory analogy to this chapter?

A

We can view decision analysis, or the payoff table at least, as a game. We choose an alternative, and the other player (nature) selects one of the possible states that can result from this alternative. There will of course be a specific probability assigned to each of the possible states that “nature” can choose from.

17
Q

What is Bayes’ decision rule?

A

Calculate the expected value for each alternative (long run perspective).

Choose the largest of the expected values.

18
Q

What is the advantage of Bayes’ decision rule?

A

It incorporates all of the information in the table.

19
Q

What is the big DISadvantage of Bayes’ decision rule?

A

It relies heavily on somehow knowing the probabilities. Although past events may provide good estimates for future probabilities, this is not certain; the probability estimates are themselves uncertain. We can therefore benefit from sensitivity analysis on the uncertain quantity, which here is the probability.

We do this by writing the expected payoff as a linear function of a parameter p equal to the probability of one state. This only works cleanly for two (binary) states, though.

ex: 700p - 100(1-p) = 700p - 100 + 100p = 800p - 100

20
Q

What points are interesting when using sensitivity analysis on probability variables?

A

The crossover point. The crossover point is where the expected payoff of one alternative equals the expected payoff of another alternative. It gives us both the common expected value and the probability at which the two alternatives are equal, so we can use that probability as a reference. If we believe the probability of the state is greater or smaller than the crossover value, we immediately know which alternative is better.
The purpose of this analysis is to be able to choose an alternative without having to know the exact probability of the state.
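
A minimal sketch of finding the crossover point for two expected-payoff lines, using the assumed 800p - 100 (drill) vs 90 (sell) lines from these cards:

def crossover(line_a, line_b):
    # Solve slope_a*p + intercept_a == slope_b*p + intercept_b for p.
    (sa, ia), (sb, ib) = line_a, line_b
    return (ib - ia) / (sa - sb)

p_star = crossover((800, -100), (0, 90))
print(p_star)  # 0.2375 -> drill if we believe P(oil) > 0.2375, otherwise sell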

21
Q

Say we perform experimentation. What do we call the new probability estimates?

A

Posterior

22
Q

What is a probability tree diagram?

A

It is a tree structure that shows the probabilities of the different findings and states of nature. It follows the structure “prior –> conditional –> joint –> posterior”. The whole point is to find the probability of each possible outcome, so that we can correctly apply Bayes' decision rule after experimentation. In other words, it is just a visual tool for the extension of Bayes' decision rule that lets it include evidence/experiments.
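
A small sketch of the prior -> conditional -> joint -> posterior chain. The priors and the conditional probabilities of each finding given the state are assumptions chosen for illustration:

priors = {"oil": 0.25, "dry": 0.75}   # assumed prior probabilities
cond = {                               # assumed P(finding | state)
    ("favorable", "oil"): 0.6, ("unfavorable", "oil"): 0.4,
    ("favorable", "dry"): 0.2, ("unfavorable", "dry"): 0.8,
}

def posteriors(priors, cond):
    # joint = conditional * prior; posterior = joint / P(finding).
    joint = {(f, s): cond[(f, s)] * priors[s] for (f, s) in cond}
    p_finding = {}
    for (f, s), p in joint.items():
        p_finding[f] = p_finding.get(f, 0.0) + p
    return {(s, f): joint[(f, s)] / p_finding[f] for (f, s) in joint}

print(posteriors(priors, cond))
# e.g. P(oil | favorable) = 0.15 / 0.30 = 0.5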

23
Q

elaborate on the expected value of perfect information

A

The expected value of perfect information (EVPI) is an upper bound on what we would be willing to pay for some piece of information/testing/experiment/evidence.

We find it by treating the experiment as if it removed ALL uncertainty. For instance, a seismic test that could tell us with 100% certainty whether oil is present or not. We then calculate the expected payoff with perfect information. The value of perfect information is the difference between the expected payoff with perfect information and the expected payoff without it; that is, the payoff with no uncertainty less the payoff under uncertainty.

How much would we be willing to pay, at most, for this piece of evidence? At most the EVPI. If the evidence costs more than the expected value of perfect information, we would obviously not pay for it. In other words, we would not perform the experiment.
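
A minimal sketch of the EVPI calculation, again under the assumed payoff table and priors from the earlier cards:

payoffs = {"drill": [700, -100], "sell": [90, 90]}   # assumed payoff table
priors = [0.25, 0.75]                                # assumed prior probabilities

def evpi(payoffs, priors):
    # Expected payoff WITH perfect information: for each state, take the best
    # payoff any alternative gives in that state, weighted by the prior.
    with_pi = sum(prob * max(row[i] for row in payoffs.values())
                  for i, prob in enumerate(priors))
    # Expected payoff WITHOUT experimentation: Bayes' decision rule on the priors.
    without = max(sum(pay * prob for pay, prob in zip(row, priors))
                  for row in payoffs.values())
    return with_pi - without

print(evpi(payoffs, priors))  # 0.25*700 + 0.75*90 = 242.5; EVPI = 242.5 - 100 = 142.5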

24
Q

Using the method of “the expected value of perfect information”, what does it tell us? What is the result?

A

If the expected value of perfect information provides an upper bound greater than the cost of acquiring the evidence, then we definitely want to explore further and calculate the “expected value of experimentation”. That calculation is more ‘difficult’, which is why we usually calculate the expected value of perfect information first, with the goal of quickly ruling out the cases where experimentation cannot pay off.

25
Q

What kind of probabilities are used with EVPI?

A

priors.

26
Q

How is EVE calculated?

A

expected payoff with experiment - expected payoff without experiment.
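
A rough numeric sketch under the assumed oil-example numbers used in the earlier cards (findings and posteriors as in the probability-tree sketch; the experiment's own cost is left out and would be compared against the result):

# Assumed numbers from the oil example in these cards.
p_fav, p_unfav = 0.30, 0.70                   # P(finding), from the joint probabilities
post_oil_fav, post_oil_unfav = 0.50, 1 / 7    # posterior P(oil | finding)

best_if_fav = max(post_oil_fav * 700 + (1 - post_oil_fav) * (-100), 90)        # drill: 300
best_if_unfav = max(post_oil_unfav * 700 + (1 - post_oil_unfav) * (-100), 90)  # sell: 90

with_experiment = p_fav * best_if_fav + p_unfav * best_if_unfav  # 0.3*300 + 0.7*90 = 153
without_experiment = 100                      # Bayes' rule payoff on the priors
print(with_experiment - without_experiment)   # EVE = 53, before subtracting the test's cost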

27
Q

When are decision trees truly powerful?

A

Whenever there are multiple decisions in sequence, for instance:
1) Should we experiment?
2) Which alternative should we pick?

28
Q

In a decision tree, how is a decision node represented, and what is it?

A

A decision node is represented by a SQUARE. A decision node represents a point where we make a decision, and its edges are labeled with the different alternatives.

For instance, a decision node could represent “Seismic experiment” and the edges could be:
1) Perform the seismic test
2) DO NOT perform the seismic test

29
Q

In a decision tree, what is an event node?

A

An event node, represented by a circle and also called a “chance node”, indicates that a random event occurs at that point. For instance, a chance/event node could follow from the alternative “choosing to perform the seismic test”. The event node could then split into “favorable” and “unfavorable”. The chance node indicates that these outcomes are picked randomly by nature.

30
Q

elaborate on probabilities and cash flows in decision trees

A

The probabilities are written in parentheses on the edges, above the edge.
Cash flows are also assigned to edges, but are written below the edge.

The probabilities at the later chance nodes (after a finding has been observed) are posteriors; the first set of probabilities are priors.

31
Q

Why are LP and other similar models not always great?

A

They are built on a great deal of assumed certainty. If we are operating in an environment that is difficult to predict, deterministic models may not cut it.

In other words, in cases where we find it appropriate to include probability theory, we need something other than the usual models under certainty. Decision analysis offers this kind of support.

32
Q

in general terms, describe the decision analysis problem

A

A decision maker must choose among several alternatives based on some criterion. Each alternative typically involves a payoff and probabilities for the different states of nature. There will also typically be ways to reduce the uncertainty (experimentation) by paying for information.

33
Q

What is “state of nature”?

A

A state of nature is an outcome that can be “dealt out” by nature. The states determine the possible outcomes, for instance “oil” or “no oil” on a certain patch of land.

For instance:
The choice can be between “drilling for oil” and “selling the field”.
In the case of “drilling for oil”, there are two possible states of nature:
1) Oil
2) No oil

Perhaps with probability 1/4 for oil and 3/4 for no oil.

34
Q

Is payoff the same as utility?

A

Must check this, but the two appear to be interchangeable: the payoff can be expressed as utility, or simply as money.

35
Q

What is the analogy to game theory that is relevant for decision analysis?

A

Two player game, the decision maker vs nature.

1) The decision maker makes a choice between available alternatives
2) Nature picks an outcome according to the underlying probability distribution
3) The decision maker receives a payoff according to the alternative of choice and the state of nature.

There is one key difference though: nature is not rational in the sense of being competitive. It is a game of probabilities rather than an adversarial one.

36
Q

In addition to the payoff for various alternatives and states of nature, what is typically included in the pay-off table?

A

Prior probability distribution. Most likely based on experience of the decision maker.

37
Q

Define the maximin payoff criterion

A

For each alternative, find the state of nature that gives the lowest payoff. From this set of (alternative: lowest payoff) pairs, select the alternative with the highest payoff.

This criterion is great as it offers the best guarantee. If we’re tight with our budget, this may be a good option.

However, it is not used that often as it is extremely pessimistic and does not account for probability information. That said, in the absence of priors, we may hedge with it.

38
Q

Define maximum likelihood criterion

A

Identify the state of nature with the highest probability of occurring. Then, within that state, find the alternative that offers the largest payoff.

Ex: “No oil” is the most likely state. Therefore, we sell the land and gain 90 grand.

39
Q

Drawback of the maximum likelihood criterion

A

It completely ignores potential huge payoffs. In the oil case, it would ignore the fact that we’re given 800 grand if there is oil.

40
Q

Define Bayes' decision rule

A

for each alternative, we compute the expected payoff. Then we compare this expected payoff for all different alternatives. Select the largest.

41
Q

What is the benefit of Bayes' decision rule?

A

Bayes' decision rule incorporates all of the information provided in the payoff table. Nothing goes to waste; we don't ignore any of it.

42
Q

Drawback of Bayes' decision rule

A

It places a lot of emphasis on the probabilities; if they are wrong, the decision can be badly off. Some say the probabilities are difficult to estimate. However, estimates based on past results have proven accurate in many situations.

43
Q

What is the gist of sensitivity analysis in decision analysis?

A

We avoid committing to fixed probabilities and instead use a variable for the probability. Then, in the case of two (binary) states of nature, we can easily solve a system of linear equations, which gives us the value of the probability at the point of equality.
What does this provide us?
It provides a reference point: probabilities above or below it make one alternative or the other the better choice, depending on the lines.

It basically provides a crossover point that we can use as a yardstick. The benefit is that we do not need accurate probability estimates; as long as we know which side of the crossover point we are on, we know what the decision should be.

44
Q

What happens to the crossover point if there are more than two alternatives?

A

We can get multiple crossover points. In such cases, the points split the probability axis into intervals, and each interval corresponds to a specific best decision.

45
Q

What happens when we do experimentation?

A

We turn priors into posteriors. This should provide a more valuable prediction.

46
Q

Before we perform any experiment, we should…?

A

Determine its value

47
Q

The expected value of perfect information provides an …

A

upper bound to the potential value of the experiment

48
Q

What is perfect information?

A

Perfect information is a piece of evidence that removes all uncertainty regarding the outcome it describes.

For instance, seismic testing of an oil field that gives 100% certain results.

49
Q

What happens if the upper bound provided by the perfect information does not exceed the cost of the information?

A

We should NOT perform the experiment. There is no value to be gained.

50
Q

What happens if the upper bound provided by the perfect piece of information exceeds the cost of it?

A

We need to use the other, slower, variant to calculate the value of the information more accurately. This is because the bound leaves room for a net gain.

51
Q

What is the formula for “Expected Value of Perfect Information”?

A

EVPI = expected payoff with perfect information - expected payoff without experimentation (the Bayes' decision rule payoff based on the priors).
52
Q

What is the difference between “expected value of information” and “Expected value of perfect information”?

A

EVPI provides an upper bound of what the evidence can provide.

EVI is the expected value of the evidence, meaning the direct increase in value that we expect to follow from the evidence.

53
Q

What is the formula for expected payoff with experimentation?

A

Expected payoff with experimentation = ∑ over findings j of P(finding j) × E[payoff | finding j], where E[payoff | finding j] is the best expected payoff (using the posterior probabilities) given that finding.
54
Q

The strength of decision trees is…

A

Modelling sequences of decisions

55
Q

What types of nodes do we have in our decision trees?

A

Decision nodes: represented by squares, indicating a point of decision. The decision node holds the question, and the edges emanating from it hold the answers, i.e. the alternatives we can choose between.

Event nodes: represented by circles, indicating a random event; also called chance nodes. The circle representing the event/chance can be regarded as a question about which state of nature we will get. For instance: say we choose to do seismic testing. Then nature makes its pick: “Will the test be positive or negative?”. The edges emanating from a chance/event node hold the different possible states of nature.

56
Q

Elaborate on the numbers in decision trees

A

There are two types of numbers we primarily use in decision trees. One type is written inside parentheses, the other is not.

Numbers written above or below edges that are NOT in parentheses are cash flows.

Numbers written inside parentheses are probabilities, either priors or posteriors.

The probabilities at the chance nodes that follow an experiment finding are posteriors, because we have to account for the finding. If we did not update them, there would be no point in doing the experiment.

57
Q

How do we analyze a decision tree?

A

The procedure starts at the right side of the tree (the terminal side) and moves one column at a time to the left.

For each column of chance nodes, calculate the expected payoff by applying the expected-value formula to the probabilities and payoffs of the relevant branches. This expected payoff is “stored” at the chance node.

For decision nodes, we compare the expected payoffs of the branches and select the highest one.

Branches corresponding to decisions we reject are marked with a double dash ||.
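
A minimal backward-induction (rollback) sketch on a hand-built tree. The node layout and numbers are assumptions for illustration, not taken from any specific exercise:

def rollback(node):
    if node["type"] == "terminal":
        return node["payoff"]
    if node["type"] == "chance":
        # Expected value over the branches.
        return sum(p * rollback(child) for p, child in node["branches"])
    # Decision node: take the best branch, net of any cash flow on its edge.
    return max(cash + rollback(child) for cash, child in node["choices"])

drill = {"type": "chance", "branches": [(0.25, {"type": "terminal", "payoff": 700}),
                                        (0.75, {"type": "terminal", "payoff": -100})]}
sell = {"type": "terminal", "payoff": 90}
root = {"type": "decision", "choices": [(0, drill), (0, sell)]}
print(rollback(root))  # 100 -> drilling is preferred under these assumed numbers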

58
Q

Just look at the image

A
59
Q

What is utility function of money?

A

It is a one-variable function that takes an amount of money as its input (x-variable) and outputs the corresponding utility.

60
Q

If the utility function of money is showing decreasing marginal utility of money, a person is …

A

Risk averse

61
Q

Increasing marginal utility of money corresponds to a

A

risk-seeking individual

62
Q

Linear (straight line) utility function of money corresponds to a

A

risk neutral individual

63
Q

What is important regarding utility functions and decision analysis

A

People generally have different utility functions as a result of having different preferences. Therefore, when we use a utility function to base our analysis on, we need to make sure that the utility function fits with the person making the decisions, or the persons affected by the decisions.

64
Q

how does scale impact the utility function?

A

Scale is irrelevant.

65
Q

Elaborate on how to do the decision tree task (IMP)

A

The decisions and states of nature are easy to lay out. Typically: the experiment decision first, then the chance outcome of the experiment, then our own decision, then the state-of-nature chance nodes, and finally the terminal nodes/edges.

1) Draw the outline of the tree.
2) Fill in the payoff we will receive at each terminal node.
3) Then move on to the probabilities.
4) Start at the terminal chance nodes. These have posterior probabilities: P(Outcome = outcome | Experiment = result).
5) We need probabilities for every chance branch.
6) Once we have all the probabilities, we calculate expected payoffs subtree by subtree, essentially applying Bayes' decision rule on the way back up towards the root of the tree.
