Lectures 22-27 - Ch 8 Notes (decision analysis) - part 2 of 2

Author: Michael Hanna
Course: Information System
Institution: McMaster University

5iii) Decision making under risk— ≥ 2 stages: decision trees

The manager can list the possible future outcomes and can estimate the probability that a specific outcome will occur. Two or more decisions are made, usually at different times (or stages). A decision tree is needed to depict and analyze the problem. One of three decision-making criteria is used to make a decision: EMV, EOL, or EU.

Example DA-1: Thompson Lumber (continued)

Payoffs for each Possible Future Outcome (probability):

Decision alternative         High demand  Moderate demand  Low demand   Expected payoff, EMV
                             (0.3)        (0.5)            (0.2)
Build large facility         $200k        $100k            -$120k       200×0.3 + 100×0.5 – 120×0.2 = $86k
Build small facility         $90k         $50k             -$20k        90×0.3 + 50×0.5 – 20×0.2 = $48k
Do nothing                   $0           $0               $0           $0
Best payoff for an outcome   $200k        $100k            $0           EVwPI = 200×0.3 + 100×0.5 + 0×0.2 = $110k

The best EMV is $86,000, so the decision is: build a large facility.

EVPI = EVwPI – best EMV = $110k – $86k = $24,000

Although this is a one-stage problem (i.e. one decision at one point in time) and we therefore use a payoff table (as shown), we can also draw a decision tree for this problem.

Decision trees

1. Draw the tree. Any decision analysis problem can be presented graphically as a decision tree. A decision tree presents the decision alternatives and future outcomes in a sequential (i.e. time) manner using decision nodes with arcs, outcome nodes with arcs, and end nodes with payoffs.

□ = decision node. The arcs (or lines) originating from a decision node represent the decision alternatives available to the decision maker at that point in time. Of these, the decision maker must select exactly one alternative. Most trees begin with a decision node.

○ = outcome (or event) node. The arcs (lines) originating from an outcome node represent all outcomes that could occur at that node. Each outcome has a probability, and only one outcome will actually occur. The decision maker has no control over which outcome occurs.

◄ = end (or terminal) node. Each path of decision alternatives and outcomes in the decision tree ends at an end node. The payoff (usually the monetary value, MV) at the end node is the result of the decision alternatives and outcomes on that path.

2. Fold back the tree. Decision trees are analyzed by a process called folding back the tree: 'back' means we work from the end of the tree back to the front of the tree, and 'folding' means we follow two rules:
1. At each outcome node, calculate the expected payoff (usually the expected monetary value, EMV, or the expected utility, EU).
2. At each decision node, select the alternative with the best EMV or EU.
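As a quick numerical check of the EMV and EVPI calculations above, here is a short Python sketch (payoffs in $k; the dictionary layout is ours, not part of the notes):

```python
# EMV, EVwPI and EVPI for Example DA-1 (payoffs in $k)
probs = [0.3, 0.5, 0.2]            # P(high), P(moderate), P(low demand)
payoffs = {
    "Large facility": [200, 100, -120],
    "Small facility": [90, 50, -20],
    "Do nothing":     [0, 0, 0],
}

# EMV of each alternative = probability-weighted payoff
emv = {alt: sum(p * v for p, v in zip(probs, pay)) for alt, pay in payoffs.items()}
best_alt = max(emv, key=emv.get)

# EVwPI = expected value of picking the best payoff under each outcome
ev_wpi = sum(p * max(pay[i] for pay in payoffs.values()) for i, p in enumerate(probs))
evpi = ev_wpi - emv[best_alt]

print(emv)       # EMVs: 86, 48, 0 ($k)
print(best_alt)  # Large facility
print(evpi)      # 24 ($k)
```

Running it reproduces the $86k, $48k and $0 EMVs and EVPI = $24k from the payoff table.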


Example DA-1: (continued) - Drawing the tree:

[Fig. 8.1, p. 331: the decision tree for Example DA-1. A decision node at time 0 with the three alternatives; outcome nodes with demand probabilities (0.30), (0.50), (0.20); end nodes with the payoffs; the 'Do nothing' branch carries probability (1.00). Timeline: 0 → 1 year → 2 years.]

- Folding back the tree: The tree in Fig. 8.1 above is equivalent to the following tree.

[Fig. 8.2: the same tree with each outcome node replaced by its expected payoff. These values are the same as in the payoff table on the previous page. Timeline: 0 → 1 year → 2 years.]

The tree in Fig. 8.2 above is equivalent to the following tree.

[The tree reduced to the time-0 decision node with the best EMV, $86k, for the 'Large Plant' alternative. Timeline: 0 → 1 year → 2 years.]

- Result: EMV = $86k for the 'Large Plant' alternative is the best, so the best decision at the decision node at time 0 is to build a large plant. From Fig. 8.2, after one year the plant will be finished and demand will begin. At the end of two years the demand will have been high, moderate, or low, and the actual payoff will be $200k, $100k, or -$120k, respectively.
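The two folding-back rules generalize to trees of any depth and can be sketched as a small recursive function. This is only an illustrative encoding (the dict-based node format is our own, not TreePlan's):

```python
# Hedged sketch: recursive fold-back of a decision tree (node format is ours)
def fold_back(node):
    if node["type"] == "end":
        return node["payoff"]                    # end nodes hold payoffs
    if node["type"] == "outcome":
        # rule 1: expected payoff at an outcome node
        return sum(p * fold_back(child) for p, child in node["branches"])
    # rule 2: best alternative at a decision node
    return max(fold_back(child) for _, child in node["branches"])

# Example DA-1 as a one-stage tree (payoffs in $k)
tree = {"type": "decision", "branches": [
    ("Large plant", {"type": "outcome", "branches": [
        (0.3, {"type": "end", "payoff": 200}),
        (0.5, {"type": "end", "payoff": 100}),
        (0.2, {"type": "end", "payoff": -120})]}),
    ("Small plant", {"type": "outcome", "branches": [
        (0.3, {"type": "end", "payoff": 90}),
        (0.5, {"type": "end", "payoff": 50}),
        (0.2, {"type": "end", "payoff": -20})]}),
    ("Do nothing", {"type": "end", "payoff": 0}),
]}

print(fold_back(tree))   # ≈ 86 (build the large plant)
```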


Example DA-4: Folding back another decision tree

Consider the following decision tree for a two-stage (i.e. two-decision) problem (which we will study later). Fold back the decision tree (i.e. apply the two rules from right to left in the tree).

[Tree 1: the full two-stage decision tree. At time 0 the decision is 'Conduct Survey' or 'No Survey'; at 3 months the survey result is known and the plant size is decided; the demand outcome probabilities follow, e.g. (0.30), (0.50), (0.20) on the no-survey branches and (0.468) for moderate demand after a positive survey result. Timeline: 0 → 3 months → 1 year → 2 years.]

Tree 1 is equivalent to:

Tree 2 (each outcome node replaced by its expected payoff):
- No survey: EMV = 200×0.3 + 100×0.5 – 120×0.2 = $86k (large plant); EMV = 90×0.3 + 50×0.5 – 20×0.2 = $48k (small plant); EMV = $0k (do nothing).
- Positive result: EMV = 196×0.509 + 96×0.468 – 124×0.023 = $141.84k, i.e. $141,840 (large plant); EMV = 86×0.509 + 46×0.468 – 24×0.023 = $64.75k (small plant); EMV = –$4k (do nothing; the $4k survey cost).
- Negative result: EMV = 196×0.023 + 96×0.543 – 124×0.434 = $2.82k (large plant); EMV = 86×0.023 + 46×0.543 – 24×0.434 = $16.54k (small plant); EMV = –$4k (do nothing; the $4k survey cost).

[Timeline: 0 → 3 months → 1 year.]


Tree 2 is equivalent to:

Tree 3 (each decision node replaced by its best EMV):
- No survey: best EMV = $86k (large plant).
- Conduct survey, positive result: best EMV = $141.84k (large plant).
- Conduct survey, negative result: best EMV = $16.54k (small plant).
At the survey-result outcome node: EMV = 141.84×0.57 + 16.54×0.43 = $87.961k, i.e. $87,961.

Tree 3 is equivalent to:

Tree 4 (only the time-0 decision node remains): best EMV = $87.961k for 'Conduct Survey', versus $86k for 'No Survey'.

[Timeline: 0 → 3 months.]

Trees 1, 2, 3, 4 are equivalent. Going from Tree 4 back through Trees 3, 2, 1, the best strategy (optimal solution) is:
- At time 0 (from Tree 4): make the decision 'Conduct Survey'.
- At time 3 months (from Tree 3):
  - If the survey gives 'Positive Results', then (from Tree 2) make the decision to build a 'Large Plant'. At time 1 year (from Tree 1) the demand will be high, moderate, or low. We have no control over which; our payoff will be $196k, $96k, or -$124k, respectively.
  - If the survey gives 'Negative Results', then (from Tree 2) make the decision to build a 'Small Plant'. At time 1 year (from Tree 1) the demand will be high, moderate, or low. We have no control over which; our payoff will be $86k, $46k, or -$24k, respectively.
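The whole Tree 1 → Tree 4 fold-back can be verified numerically. A hedged Python sketch (payoffs in $k, already net of the $4k survey cost on the survey branches; the helper name `emv` is ours):

```python
# Folding back Trees 1-4 for Example DA-4 (payoffs in $k, net of the $4k survey cost)
def emv(probs, payoffs):
    return sum(p * v for p, v in zip(probs, payoffs))

# Stage 2 decisions: best alternative after each survey result
pos = max(emv([0.509, 0.468, 0.023], [196, 96, -124]),   # large plant = 141.84
          emv([0.509, 0.468, 0.023], [86, 46, -24]),     # small plant = 64.75
          -4)                                            # no plant (survey cost)
neg = max(emv([0.023, 0.543, 0.434], [196, 96, -124]),   # large plant = 2.82
          emv([0.023, 0.543, 0.434], [86, 46, -24]),     # small plant = 16.54
          -4)

# Stage 1 decision: survey vs no survey
survey = 0.57 * pos + 0.43 * neg
no_survey = max(emv([0.3, 0.5, 0.2], [200, 100, -120]),  # large = 86
                emv([0.3, 0.5, 0.2], [90, 50, -20]),     # small = 48
                0)                                       # do nothing

print(round(survey, 3), round(no_survey, 1))             # 87.961 86.0
```

Since $87.961k > $86k, conducting the survey is the best first decision.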


We usually do all the folding back on the original decision tree (Tree 1):

[Tree 1 with all the folding back shown on it: the EMVs, e.g. $141,840 at the positive-result branch with its probabilities such as (0.468), are written beside the nodes. The red line shows the best strategy or optimal solution.]


Using TreePlan in Excel to draw and fold back / analyze decision trees

The TreePlan files are at: Avenue > Contents > Lectures 18-27 … Decision Analysis

There are two ways to use TreePlan: 'open' it each time you want to use it, or 'install' it so that it is available whenever you use Excel. 'Install' is more permanent than 'open'. Earlier in this course we installed Solver. We are not going to 'install' TreePlan; instead we will 'open' it each time we use it. This is appropriate for an add-in like TreePlan, which you won't use as often in the future as you will use Solver.

One of the TreePlan files is 'How to Install …pdf'. The first few pages of this file explain how to 'open' TreePlan. In summary:
- Download all the TreePlan files and save them in your Commerce 3QA3 folder,
- Copy and save the TreePlan xlam file to a convenient location such as your desktop,
- Unblock the xlam file (if possible),
- Start Excel and open a worksheet; from Excel, open the TreePlan xlam file,
- Use the shortcut ('Ctrl+Shift+t' on a PC or 'Opt+Cmd+t' on a Mac) to run TreePlan.

Read the file or go to the podcast for a very detailed explanation of how to do this. It's very easy once you know how. Once you complete these steps, the TreePlan dialog appears.

Click 'New Tree' to start a new decision tree; TreePlan draws the initial tree on the worksheet.


Highlight the decision node in cell B5. Click ‘Decision Tree (Ctrl+Shft+T)’ or press the shortcut (‘Ctrl+Shift+t’ on a PC or ‘Opt+Cmd+t’ on a Mac) to get the Decision Node dialog box.

Click 'Add branch' and click the 'OK' button; TreePlan adds another branch to the decision node.

Highlight the end node in cell F3. Click ‘Decision Tree (Ctrl+Shft+T)’ or press the shortcut (‘Ctrl+Shift+t’ on a PC or ‘Opt+Cmd+t’ on a Mac) to get the Terminal Node dialog box.

Click 'Change to event node'; click 'Three' branches; click 'OK'. TreePlan converts the end node to an event node with three branches.


Example DA-1: (continued) Continue in this way until all decision nodes, outcome or event nodes, and end or terminal nodes are created for Example DA-1 (on page 15). Change the default names, probabilities, and payoffs by highlighting the cells and typing in the new values. The final decision tree is:

Avenue > Content > … 'treeplan - thompson lumber - one stage - EMV.xlsx'

Folding back the decision tree: TreePlan automatically writes formulas to
1. Calculate the EMV at the outcome nodes (and shows these adjacent to the outcome nodes), and
2. Select the alternative with the best EMV at the decision nodes (and shows the selection inside the decision node).


Result: The red line* shows the best decision. At time 0 we decide to build a large plant. Later the demand will be high, moderate, or low; we have no control over which. If the demand is high the payoff will be $200k, if moderate the payoff will be $100k, if low the payoff will be -$120k.

*Add the red line yourself; TreePlan does not do it. The line should be red, be about 6 pt wide, and be about 50 percent transparent. This is done as follows:

PC:
- Insert > Shapes > Lines
- Select or highlight your line
- Format > Shape Outline > Colors … select a red colour > Weight … select 6 pt
- Format > Shape Outline > More Outline Colors … move the Transparency slider to about 50%

Mac:
- Select or highlight your line … click Format
- Select a red colour > Weights … select 6 pt
- > More Colors … move the Transparency slider to about 50%

See the last pages in these Lecture Notes for an up-to-date list of TreePlan hints.


Decision Trees for Multistage (≥ 2 stages) Decision Problems

In many problems a decision maker must make several decisions over a period of time, with intervening outcomes (or uncertain events) in between. Decision trees are the best way to analyze these problems. (Payoff tables could be used, but they are quite cumbersome.)

Example DA-4: Thompson Lumber and market research

Thompson Lumber can engage a market research firm to do a customer survey to determine whether the market conditions for storage sheds are positive or negative. The cost of the survey is $4,000. The survey will not give perfect information. Recall that the expected value of perfect information was EVPI = $24,000. Paying $4,000, which is much less, for imperfect information seems reasonable.

(i) Drawing and folding back the tree using probabilities calculated by Bayes' Theorem:

When there is 'No Survey' the probabilities are as given in the problem.

But when we 'Conduct a Survey' the probabilities will change a lot. (Otherwise there would be no point in conducting a survey.) The probabilities depend on whether the survey gives a 'Positive Result' or a 'Negative Result'.

You can:
(i) Estimate the new probabilities using your judgment and knowledge of the problem, and/or
(ii) Calculate the new probabilities using Bayes' Theorem. You will not be asked to do Bayes' Theorem calculations in this course (see below).

[Figure 8.4, p. 339: the two-stage decision tree. The no-survey branch carries the prior probabilities (0.30), (0.50), (0.20); the survey branches carry probabilities calculated from Bayes' Theorem, e.g. (0.468) for moderate demand after a positive result, and the folded-back value $141,840 at the positive-result branch. Timeline: 0 → 3 months → 1 year.]

Result: The red line shows the best decision. The best strategy (optimal solution) is: at time 0 we 'Conduct Survey'. Three months later: if the survey gives 'Positive Results', then build a 'Large Plant'; if the survey gives 'Negative Results', then build a 'Small Plant'.


(ii) Drawing and folding back the tree in TreePlan using probabilities based on personal judgment and knowledge of the problem:

With 'No survey' the probabilities are the prior probabilities:
P(HD) = 0.3, P(MD) = 0.5, P(LD) = 0.2

With the survey, each payoff is reduced by the $4k survey cost, e.g. $200k – $4k = $196k and $100k – $4k = $96k.

'Conduct survey' gives information that revises the probabilities. These are called posterior probabilities:
P(HD|positive) = 0.3 → 0.5,  P(MD|positive) = 0.5 → 0.4,  P(LD|positive) = 0.2 → 0.1
P(HD|negative) = 0.3 → 0.1,  P(MD|negative) = 0.5 → 0.5,  P(LD|negative) = 0.2 → 0.4

We could estimate that P(positive result) = 0.50 and P(negative result) = 0.50; but based on our personal judgment a better estimate might be:
P(positive result) = 0.5 → 0.65,  P(negative result) = 0.5 → 0.35

[TreePlan tree with these personal-judgment probabilities. Folding back: after a positive result the best EMV is $124.0k (large plant, versus $59.0k for small); after a negative result the best EMV is $22.0k (small plant, versus $18.0k for large); at the survey-result outcome node EMV = 0.65×124.0 + 0.35×22.0 = $88.3k.]

See file: Avenue > Content > … ‘treeplan - thompson lumber - two stages - EMV.xlsx’
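As a check on the personal-judgment version of the tree, the fold-back with the estimated probabilities can be sketched as follows (payoffs in $k, net of the $4k survey cost; the –$4k 'no plant' payoff is our assumption from the survey cost):

```python
# Folding back the personal-judgment tree (payoffs in $k, net of the $4k survey cost)
def emv(probs, payoffs):
    return sum(p * v for p, v in zip(probs, payoffs))

pos = max(emv([0.5, 0.4, 0.1], [196, 96, -124]),   # large plant = 124.0
          emv([0.5, 0.4, 0.1], [86, 46, -24]),     # small plant = 59.0
          -4)                                      # no plant (assumed: survey cost)
neg = max(emv([0.1, 0.5, 0.4], [196, 96, -124]),   # large plant = 18.0
          emv([0.1, 0.5, 0.4], [86, 46, -24]),     # small plant = 22.0
          -4)

conduct_survey = 0.65 * pos + 0.35 * neg
print(round(conduct_survey, 1))                    # 88.3
```

Again $88.3k > $86k, so 'Conduct Survey' remains the best first decision under these estimates.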


Expected Value of Sample Information, EVSI

EVSI = (EMV of the best decision with sample information, taking the cost of the sample information as $0) – (EMV of the best decision without any information)

(i) In this example, with Bayes' Theorem calculations of the conditional probabilities (p. 23 of Notes):
EVSI = ($87,961 + $4,000) – $86,000 = $5,961
Thompson Lumber would have paid up to $5,961 for this particular market survey.
Efficiency of sample information = EVSI/EVPI = $5,961/$24,000 = 0.25, or 25%.

(ii) In this example, with personal estimates of the conditional probabilities (p. 24 of Notes):
EVSI = ($88,300 + $4,000) – $86,000 = $6,300
Thompson Lumber would have paid up to $6,300 for this particular market survey.
Efficiency of sample information = EVSI/EVPI = $6,300/$24,000 = 0.26, or 26%.

Result: This survey information is worth only 25-26% of the best possible (i.e. perfect) information. This is low, so we should try to find better information (and not pay too much for it).
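The EVSI and efficiency arithmetic above amounts to a couple of lines; a sketch for the Bayes case:

```python
# EVSI and efficiency for the Bayes case (dollar amounts from the notes)
best_with_info = 87_961   # EMV of the best strategy when the survey is conducted
survey_cost = 4_000       # added back: EVSI prices the information as if it were free
best_without = 86_000     # EMV of the best decision with no survey
evpi = 24_000

evsi = (best_with_info + survey_cost) - best_without
efficiency = evsi / evpi
print(evsi, round(efficiency, 2))   # 5961 0.25
```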

Bayes' Theorem calculations

You will not be asked to do Bayes' Theorem calculations on any of the exams in Commerce 3QA3. You already learned Bayes' Theorem in your previous business statistics (or equivalent) course, so you will not be examined on it again in this course. In practice, when you need conditional probabilities in a decision problem like this one, you can:
(i) Estimate the probabilities yourself, as we just did, and/or
(ii) Do Bayes' Theorem calculations as shown below, on the podcast, and in the textbook, as you learned in your previous business statistics (or equivalent) course.


Bayes' Theorem

time 1: prior probabilities P(HD), P(MD), P(LD) (given)
time 2: survey; the new information is +ve or -ve; survey accuracy P(+ve|HD), P(-ve|HD), P(+ve|MD), P(-ve|MD), P(+ve|LD), P(-ve|LD) (given)
time 3: posterior probabilities P(HD|+ve), P(HD|-ve), P(MD|+ve), P(MD|-ve), P(LD|+ve), P(LD|-ve) (calculated)

Intersection: A ∩ B = A and B = both A and B

time 1. The prior probabilities are: P(HD) = 0.3, P(MD) = 0.5, P(LD) = 0.2.

time 2. A survey can be run to obtain new information (+ve, -ve). Data on the accuracy of 75 previous surveys are:

                   'States of nature'
Previous surveys   High demand, HD   Moderate demand, MD   Low demand, LD   Number
+ve                29                8                     2                39
-ve                1                 7                     28               36
Total              30                15                    30               75

The survey accuracy is:

                   'States of nature'
Previous surveys   High demand, HD             Moderate demand, MD        Low demand, LD
+ve                P(+ve|HD) = 29/30 = 0.967   P(+ve|MD) = 8/15 = 0.533   P(+ve|LD) = 2/30 = 0.067
-ve                P(-ve|HD) = 1/30 = 0.033    P(-ve|MD) = 7/15 = 0.467   P(-ve|LD) = 28/30 = 0.933

Conditional probability: P(A ∩ B) = P(A|B)P(B) and P(B ∩ A) = P(B|A)P(A)

Bayes' Theorem: P(A|B) = P(B|A)P(A) / P(B) and P(B|A) = P(A|B)P(B) / P(A)

time 3. Based on the new information, the decision maker calculates the (new, revised) posterior probabilities. These probabilities are placed on the decision tree.

P(+ve) = P(+ve ∩ HD) + P(+ve ∩ MD) + P(+ve ∩ LD)
       = P(+ve|HD)P(HD) + P(+ve|MD)P(MD) + P(+ve|LD)P(LD)
       = 0.967×0.3 + 0.533×0.5 + 0.067×0.2 = 0.570

P(HD|+ve) = P(+ve|HD)P(HD) / P(+ve) = 0.967×0.3 / 0.570 = 0.509
P(MD|+ve) = P(+ve|MD)P(MD) / P(+ve) = 0.533×0.5 / 0.570 = 0.468
P(LD|+ve) = P(+ve|LD)P(LD) / P(+ve) = 0.067×0.2 / 0.570 = 0.023

P(-ve) = P(-ve ∩ HD) + P(-ve ∩ MD) + P(-ve ∩ LD)
       = P(-ve|HD)P(HD) + P(-ve|MD)P(MD) + P(-ve|LD)P(LD)
       = 0.033×0.3 + 0.467×0.5 + 0.933×0.2 = 0.430
(Notice: P(-ve) = 1.0 – P(+ve) = 1.0 – 0.57 = 0.43)

P(HD|-ve) = P(-ve|HD)P(HD) / P(-ve) = 0.033×0.3 / 0.430 = 0.023
P(MD|-ve) = P(-ve|MD)P(MD) / P(-ve) = 0.467×0.5 / 0.430 = 0.543
P(LD|-ve) = P(-ve|LD)P(LD) / P(-ve) = 0.933×0.2 / 0.430 = 0.434
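The full Bayes' Theorem calculation above can be reproduced in a few lines of Python (the dictionary layout is ours; the numbers come from the 75 previous surveys):

```python
# Bayes' Theorem for the survey (priors and accuracies from the 75 previous surveys)
priors = {"HD": 0.3, "MD": 0.5, "LD": 0.2}
likelihood = {                      # P(result | state of nature)
    "+ve": {"HD": 29/30, "MD": 8/15, "LD": 2/30},
    "-ve": {"HD": 1/30,  "MD": 7/15, "LD": 28/30},
}

def posterior_given(result):
    # total probability of the survey result, then Bayes' Theorem per state
    p_result = sum(likelihood[result][s] * priors[s] for s in priors)
    post = {s: likelihood[result][s] * priors[s] / p_result for s in priors}
    return p_result, post

p_pos, post_pos = posterior_given("+ve")
p_neg, post_neg = posterior_given("-ve")
print(round(p_pos, 2), {s: round(v, 3) for s, v in post_pos.items()})
# 0.57 {'HD': 0.509, 'MD': 0.468, 'LD': 0.023}
print(round(p_neg, 2), {s: round(v, 3) for s, v in post_neg.items()})
# 0.43 {'HD': 0.023, 'MD': 0.543, 'LD': 0.434}
```

These are exactly the probabilities placed on the survey branches of the decision tree.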


Example DA-5: New Product Introduction

The marketing manager must make two decisions: (d1) introduce a new product (or not), and (d2) decide what price to charge. There are two future outcomes or uncertain events: (f1) a competitor introduces a competitive product (or not), and (f2) the competitor's price is high, medium, or low. The sequence of decision alternatives and future outcomes is:

d1: introduce a new product or not,
f1: the competitor introduces a competitive product (probability = 0.8) or not (probability = 0.2),
d2: the price to charge for our new product (high, medium, or low),
f2: the competitor sets a price (high, medium, or low; the probabilities depend on our decision in d2).

The payoffs (monthly profit) and probabilities are shown on the decision tree below. The sequence, payoffs, and probabilities are all given, but you have to create the decision tree.

Drawing the decision tree:

[Decision tree for Example DA-5: decision node d1 at time 0, outcome node f1 at 6 months, decision node d2 at 9 months, outcome node f2 at one year.]

Folding back the decision tree:

At the f2 outcome nodes (one per d2 price alternative):
EMV = 0.3×150 + 0.5×0 + 0.2×(–200) = 5
EMV = 0.1×250 + 0.6×100 + 0.3×(–50) = 70
EMV = 0.1×100 + 0.2×50 + 0.7×(–100) = –50
At d2, select the price alternative with the best EMV: 70.
At the f1 outcome node: EMV = 0.8×70 + 0.2×500 = 156

[Timeline: 0 → 6 months → 9 months → one year.]

...
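The fold-back above can be checked numerically. The notes do not label which EMV belongs to which price, so the sketch below simply takes the three f2 branches in the order shown (payoffs as monthly profit):

```python
# Folding back Example DA-5 (monthly-profit payoffs; branch order as listed above)
def emv(probs, payoffs):
    return sum(p * v for p, v in zip(probs, payoffs))

price_emvs = [
    emv([0.3, 0.5, 0.2], [150, 0, -200]),    # = 5
    emv([0.1, 0.6, 0.3], [250, 100, -50]),   # = 70
    emv([0.1, 0.2, 0.7], [100, 50, -100]),   # = -50
]
best_price_emv = max(price_emvs)             # d2: choose the best price branch

# f1: the competitor enters with probability 0.8; payoff 500 if it does not
introduce_emv = 0.8 * best_price_emv + 0.2 * 500
print(round(best_price_emv), round(introduce_emv))   # 70 156
```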

