A lot of us use different methods to make tough decisions. Some flip a coin, others use the “eeny, meeny, miny, moe” rhyme, and others examine each option’s pros and cons. However, for arguably more professional decisions, some use a technique called the ‘decision tree.’
People often use decision trees as a way to work through a difficult problem. These trees are commonly used to determine a course of action in investing, finance, and business. In university classes, they are major components of finance, philosophy, and decision analysis. For all their uses in academia, many students and graduates do not fully understand their purpose. This is surprising given that these statistical representations are pivotal in economic forecasting and corporate finance.
What is it?
A decision tree is a diagram or chart that helps people determine a proper course of action, or shows them a statistical probability. Its outline resembles the namesake plant: typically upright, though sometimes drawn on its side. Each of the tree’s branches represents a potential decision, outcome, or reaction, and the farthest branches end in a series of endpoints.
At their core, these are easy-to-understand depictions of a decision and every possible outcome of making that specific decision. Individuals use decision trees in an array of situations. Some are more personal and simple, like deciding on whether or not to go out for dinner. Others are far more complex undertakings in a scientific, industrial, or microeconomic context.
By showing a sequence of steps, decision trees are an effective and simple way to visualize the potential options of a decision, and they help people understand the range of potential results. The trees also help identify every possible option and weigh each action against its risks and rewards.
Each outcome in the decision tree carries a risk and reward weight or number. When a person uses a decision tree to make a choice, they look at each result and evaluate the pros and cons. These trees can span as many branches as needed, long or short, to come to a fitting conclusion.
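The weighing step described above can be sketched in code. The following is a minimal example, with entirely invented probabilities and payoffs, that computes the expected value of each option so the two can be compared:

```python
# Hypothetical example: weighing two options by expected value.
# All probabilities and payoffs are made up for illustration.

def expected_value(outcomes):
    """Sum of payoff * probability over an option's possible outcomes."""
    return sum(payoff * prob for payoff, prob in outcomes)

# Option A: launch a product (60% chance of +50,000; 40% chance of -20,000)
option_a = [(50_000, 0.6), (-20_000, 0.4)]
# Option B: do nothing (a certain payoff of 0)
option_b = [(0, 1.0)]

print(expected_value(option_a))  # 22000.0
print(expected_value(option_b))  # 0.0
```

Under these assumed numbers, option A's higher expected value would favor it, though a real analysis would also weigh risk tolerance, not just the average.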
Common uses of this model
Decision trees are commonplace in operations management and research. When decisions must be made online, with no recall and under incomplete knowledge, a decision tree should be paralleled by a probability model, such as a best-choice model or an online selection algorithm. Another popular use of decision trees is to help with the calculation of conditional probabilities.
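As a sketch of that last use, a probability tree multiplies the probabilities along each branch and then sums the branches that lead to the event of interest. The scenario and numbers below are invented purely for illustration:

```python
# Hypothetical numbers: a probability tree for a diagnostic test.
p_disease = 0.01                 # root branch: has the disease
p_pos_given_disease = 0.95       # branch: disease -> positive test
p_pos_given_healthy = 0.05       # branch: healthy -> positive test

# Multiply along each branch, then sum the branches ending in "positive".
p_pos = (p_disease * p_pos_given_disease
         + (1 - p_disease) * p_pos_given_healthy)

# Conditional probability of disease given a positive result.
p_disease_given_pos = p_disease * p_pos_given_disease / p_pos
print(round(p_disease_given_pos, 3))  # 0.161
```

Laying the branches out this way makes the often-surprising result visible: even with an accurate test, a rare condition yields mostly false positives.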
For business school students, decision trees are recurring tools. Influence diagrams, utility functions, and other decision analysis methods are also quite common. These are especially prevalent in health economics, business, and public health. Decision trees and other tools are examples of operations research or management science techniques.
An organization may utilize decision trees as a support system for decision-making. The structure of the model allows readers to see how and why one choice leads to the next, with the branches of the tree representing mutually exclusive options. The framework lets users take a problem with various solutions and display those solutions in an easy-to-understand format that also shows the relationships between different decisions or events.
Decision tree analysis is commonplace in the realm of option pricing. For instance, the binomial option pricing model uses discrete possibilities to determine an option’s value at expiration. Fundamental binomial models assume that the underlying asset’s value will rise or fall with calculated probabilities at the European option’s maturity date.
With American options, the situation becomes more complex: the option can be exercised at any time up to maturity. The binomial tree then considers the various paths that the underlying asset’s price could take over time. As the number of nodes in the binomial decision tree rises, the model eventually converges to the Black-Scholes formula.
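That convergence can be demonstrated with a short sketch. The example below prices a European call with a Cox-Ross-Rubinstein binomial tree (the parameter values are arbitrary) and compares it with the closed-form Black-Scholes price as the number of steps grows:

```python
import math

def binomial_call(S, K, T, r, sigma, steps):
    """Cox-Ross-Rubinstein binomial price of a European call."""
    dt = T / steps
    u = math.exp(sigma * math.sqrt(dt))   # up-move factor
    d = 1 / u                             # down-move factor
    p = (math.exp(r * dt) - d) / (u - d)  # risk-neutral up probability
    disc = math.exp(-r * dt)              # one-step discount factor
    # Option payoffs at the final (maturity) layer of the tree.
    values = [max(S * u**j * d**(steps - j) - K, 0.0)
              for j in range(steps + 1)]
    # Roll back through the tree, discounting expected values.
    for _ in range(steps):
        values = [disc * (p * values[j + 1] + (1 - p) * values[j])
                  for j in range(len(values) - 1)]
    return values[0]

def black_scholes_call(S, K, T, r, sigma):
    """Closed-form Black-Scholes price of a European call."""
    N = lambda x: 0.5 * (1 + math.erf(x / math.sqrt(2)))  # standard normal CDF
    d1 = (math.log(S / K) + (r + sigma**2 / 2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return S * N(d1) - K * math.exp(-r * T) * N(d2)

# As the number of steps grows, the tree price approaches Black-Scholes.
for n in (10, 100, 1000):
    print(n, binomial_call(100, 100, 1, 0.05, 0.2, n))
print("BS", black_scholes_call(100, 100, 1, 0.05, 0.2))
```

With 1,000 steps the two prices agree to within a few tenths of a cent, illustrating why the closed-form model is preferred when the tree offers no extra flexibility.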
The Black-Scholes model (aka the Black-Scholes-Merton model) is a mathematical model used for pricing an options contract. It is an easier alternative to decision trees for option pricing; even so, computer software can build binomial option pricing models with countless nodes. This type of calculation frequently provides accurate pricing information, particularly for dividend-paying stocks and Bermuda options.
How to make one
Decision trees classify instances by sorting them down the tree from the root to a leaf node, which provides the instance’s classification. Classification starts at the root node of the tree by testing the attribute that this node specifies, then continues down the tree branch that corresponds to the value of the attribute. For the subtree rooted at the new node, this procedure repeats.
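The sort-down-the-tree procedure can be sketched with a small hand-built tree. The attributes, values, and labels below are invented (a toy weather example), not taken from any real dataset:

```python
# Toy decision tree, built by hand for illustration.
# Internal nodes are ("attribute", {value: subtree}); leaves are class labels.
tree = ("outlook", {
    "sunny": ("humidity", {"high": "no", "normal": "yes"}),
    "overcast": "yes",
    "rain": ("wind", {"strong": "no", "weak": "yes"}),
})

def classify(node, instance):
    """Walk from the root, testing the attribute each node specifies."""
    while isinstance(node, tuple):            # still at an internal node
        attribute, branches = node
        node = branches[instance[attribute]]  # follow the matching branch
    return node                               # a leaf: the class label

print(classify(tree, {"outlook": "sunny", "humidity": "normal"}))  # yes
```

Each step tests exactly one attribute and follows one branch, so classifying an instance costs at most one test per level of the tree.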
To make a decision tree, you first need a specific decision. Draw a small square at the far left of where the tree will start; this square represents the decision. From here, draw lines that move outward from the box. Each of these lines runs from left to right and represents a potential option. Alternatively, you can place the square at the top and draw the lines moving downward.
Analyze the results at the end of each line (i.e., each option). If an option’s outcome turns out to be a new decision, draw a box at the end of that particular line, then draw new lines protruding from that decision to represent the new options and label them accordingly. If an option’s result is uncertain, draw a circle at the end of the line to denote the potential risk.
If an option’s outcome is already resolved, leave that line blank. Proceed to expand until each line reaches an endpoint, meaning you have gone over every choice and outcome. At that point, draw a triangle to signify the endpoint.
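The drawing convention above translates directly into data: squares become decision nodes, circles become chance nodes, and triangles become endpoints. The following is a minimal sketch, with an invented scenario and made-up payoffs, that evaluates such a tree by averaging at circles and taking the best option at squares:

```python
# Sketch of the drawing convention as data structures.
# The scenario and all numbers are invented for illustration.
from dataclasses import dataclass

@dataclass
class End:        # triangle: a final payoff
    value: float

@dataclass
class Chance:     # circle: uncertain outcome with probabilities
    branches: list  # list of (probability, node) pairs

@dataclass
class Decision:   # square: choose the best option
    options: dict   # option label -> node

def rollback(node):
    """Evaluate the tree: average over chance nodes, maximize at decisions."""
    if isinstance(node, End):
        return node.value
    if isinstance(node, Chance):
        return sum(p * rollback(child) for p, child in node.branches)
    return max(rollback(child) for child in node.options.values())

tree = Decision({
    "develop": Chance([(0.4, End(120.0)), (0.6, End(-30.0))]),
    "license": End(25.0),
})
print(rollback(tree))  # 30.0
```

This "rollback" pass, working from the endpoints back toward the root, is how the drawn diagram is usually scored once every branch is labeled with a value and a probability.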
Strengths & Weaknesses
Using decision tree methods is advantageous for the following reasons:
- They are capable of generating coherent rules.
- They can perform classification without requiring much computation.
- They can handle both continuous and categorical variables.
- They provide straightforward indications about which fields possess more importance regarding classification or forecasting.
For all their strengths, decision tree methods have their fair share of weaknesses:
- They are less suitable for estimation tasks in which the objective is to predict a continuous attribute’s value.
- They are prone to errors when it comes to classification problems with many classes, as well as small amounts of training examples.
- The process of growing a decision tree is computationally expensive. At each node, every candidate splitting field must be sorted before its best split can be found. In some algorithms, combinations of fields are used, and a search must be made for optimal combining weights. Pruning algorithms also tend to be expensive because many candidate sub-trees must be formed and compared.