In business decision making we are often faced with alternative scenarios, each carrying its own risks, and we have to choose between them. One very important tool for such decision making is the 'decision tree'. Organizations use decision trees to identify the strategy most likely to reach an identified goal. A decision tree can also serve as a descriptive means of calculating conditional probabilities.
The decision tree originally emerged as a tool to graphically represent the structural relationships among alternative choices. Decision trees were an improvement on simple yes/no (dichotomous) choices. Gradually they became more complex, and ultimately they paved the way for computer flow charts. Nowadays decision trees are drawn with computers, and very complex trees involving many decision variables are built. They are no longer merely dichotomous; instead, probabilities are assigned to the likelihood of each path.
A decision tree consists of three types of nodes:
1. Decision nodes – commonly represented by squares.
2. Chance nodes – represented by circles.
3. End nodes – represented by triangles.
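As a minimal sketch, the three node types above could be represented as simple Python classes (the class names and example payoffs here are illustrative, not part of any standard library):

```python
from dataclasses import dataclass, field

# Hypothetical classes illustrating the three node types described above.

@dataclass
class EndNode:
    """Triangle: a terminal outcome with a payoff."""
    payoff: float

@dataclass
class ChanceNode:
    """Circle: branches resolved by chance; each branch pairs a
    probability with the node it leads to."""
    branches: list = field(default_factory=list)  # [(prob, node), ...]

@dataclass
class DecisionNode:
    """Square: branches chosen by the decision maker."""
    options: dict = field(default_factory=dict)   # {label: node}

# Example tree: choose between a safe payoff of 40 and a fair gamble.
gamble = ChanceNode(branches=[(0.5, EndNode(100)), (0.5, EndNode(0))])
tree = DecisionNode(options={"safe": EndNode(40), "gamble": gamble})
```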
The decision tree is drawn from left to right, incorporating each condition and the probability attached to it. When making a decision from the tree, however, we read it from right to left. We start with the payoff at each end node and multiply it by the probability assigned to that branch. Moving one step to the left, we reach the chance nodes, where we sum the products of each end node's payoff and its probability; this sum is the expected payoff of that chance node. We compute such a value for every chance node. Moving further left, we reach a decision node, where we compare the expected payoffs of the alternatives beneath it and choose whichever gives the higher payoff. Continuing in this way, rejecting some branches and selecting others, we eventually reach the root decision node, where we make the final choice of which alternative to employ.
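The right-to-left "rollback" procedure described above can be sketched in a few lines of Python. The tuple-based tree encoding and the example payoffs are assumptions made for illustration:

```python
# Rollback (backward induction) over a tree built from plain tuples:
#   ("end", payoff)
#   ("chance", [(prob, subtree), ...])
#   ("decision", {label: subtree})

def rollback(node):
    """Return the expected payoff of a node by folding the tree back."""
    kind = node[0]
    if kind == "end":
        return node[1]                       # payoff at an end node
    if kind == "chance":
        # expected value: sum of probability * rolled-back payoff
        return sum(p * rollback(child) for p, child in node[1])
    if kind == "decision":
        # choose the alternative with the highest expected payoff
        return max(rollback(child) for child in node[1].values())
    raise ValueError(f"unknown node kind: {kind}")

# Example: launch a product (60% chance of payoff 100, 40% chance of -20)
# versus not launching (payoff 0).
tree = ("decision", {
    "launch":    ("chance", [(0.6, ("end", 100)), (0.4, ("end", -20))]),
    "no launch": ("end", 0),
})
# Expected value of "launch" = 0.6*100 + 0.4*(-20) = 52, so launching wins.
```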
Decision theory rests on the idea that the expected value of a discrete variable is its probability-weighted average. The expected value is especially useful for decision makers because it summarizes the most likely outcome implied by the probabilities of the distribution. Applying Bayes' theorem allows initial probability estimates to be modified, so the decision tree is refined as new evidence is introduced.
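As a sketch of the Bayesian refinement just mentioned, the update below revises a prior probability after new evidence arrives. All the numbers (prior demand estimate, survey likelihoods) are hypothetical:

```python
# Bayes' theorem: P(H|E) = P(E|H) P(H) / [P(E|H) P(H) + P(E|~H) P(~H)]

def bayes_update(prior, p_evidence_given_h, p_evidence_given_not_h):
    """Return the posterior probability of hypothesis H given evidence E."""
    numerator = p_evidence_given_h * prior
    evidence = numerator + p_evidence_given_not_h * (1 - prior)
    return numerator / evidence

prior_high = 0.5        # initial estimate: 50% chance demand is high
p_fav_given_high = 0.8  # favorable survey result if demand really is high
p_fav_given_low = 0.3   # favorable survey result even if demand is low

posterior = bayes_update(prior_high, p_fav_given_high, p_fav_given_low)
# A favorable survey raises the estimate from 0.50 to about 0.73, and the
# chance-node probabilities in the tree would be updated accordingly.
```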
Decision trees have the following advantages:
1) They are simple to understand and interpret.
2) They have value even with little hard data.
3) They can easily be combined with other decision-making tools, such as NPV or PERT, to make decisions under uncertainty.
A decision tree is a simple graphical way of evaluating situations under varying degrees of uncertainty. Using only probability and simple calculations, we can make decisions that might at first seem really tedious.
References:
1) Wikipedia, http://www.wikipedia.org/
2) David S. Walonick, An Overview of Forecasting Methodology