
Decision-making

Let’s say you are faced with a tough problem.  You have wrestled with it for a while and now you have to make a decision.  One approach is to use what I call the three by three method.
You start by considering many options.  You narrow this down to three choices.
  1. Your current best option, the CBO.
  2. A sound realistic alternative.
  3. An outlandish idea that might just work. This is deliberately quite different from options 1 and 2.  Indeed, it might be the opposite of option 1, the CBO.
Very often we seem to face a choice which is binary: yes or no, go or stay, and so on.  It is always worth trying to find a third option, even if at first it appears a little ridiculous.
For each of these options you now ponder three questions:
  1. What is the best that could happen if this worked well?
  2. What could go wrong? What is the worst that could happen?
  3. What are the dependencies and next steps to make this happen?
You compare and contrast these possibilities and plans.  Then you make a decision using both your head (what logically makes sense?) and your heart (what do you feel is best?).
For example, imagine you are a marketing director and one of your team, Fred, has been under-performing for some time.  His creative ideas are poor and he has lost the respect of other team members.   You select three options.
  1. Fire Fred. The current best option.
  2. Put Fred on a training course to improve his performance
  3. Promote Fred to a management position!
Now we consider three aspects for each choice.
A1) Fred goes and we replace him with someone fresh and creative. Team performance improves.
A2) Fred goes but sues us for wrongful dismissal because we did not follow correct procedure. The process takes a lot of time and it makes us look sloppy and incompetent.
A3) We discuss the next steps with HR and make sure we do it right.
B1)  Fred completes the training.  His performance improves and he is much more motivated.
B2) Fred completes the training but his performance does not improve and we are back to the current situation, with a lot of time and effort wasted.
B3) We discuss Fred’s issues in depth with Fred and ask him to identify courses which might help. We carefully assess whether these courses would really work.
C1) Promoting Fred within the Marketing department does not make any sense.  But although his creative skills are low, his operational and admin skills are good.  Perhaps we should look elsewhere in the company.  He might make a good logistics executive or mail room manager.
C2) We can find no suitable position, in which case we are no worse off than today and are back to choices A or B.
C3) We discuss Fred with HR, review his skills and identify any suitable vacancies.
We now weigh up these different scenarios and choose a way forward.  We might dismiss B and instead talk to HR to explore C.  We are now aware that if we choose to fire Fred we must follow proper procedure to avoid a calamity.
Of course if you have a really major decision to make then you would use a more detailed and thorough analysis but for most management decisions this technique works well.  It is quick and it can be used by individuals or groups.
Why should you use the three by three approach?  First, the mind can review and compare three options in depth more easily than, say, five or six.  Secondly, the third option is deliberately included to stimulate and provoke your thinking, pushing you to consider something unconventional.  This can lead to a fresh and better solution.  Thirdly, by considering both the best and worst possible outcomes we can see the benefits but also avoid bad options or bad implementations which could trip us up.
I suggest you try this approach as follows. Select your current biggest issue.  Generate a great many ideas and then let them rest for a while.  Tomorrow morning (maybe on your commute to work) select A, B and C as above.  Go through the three by three analysis and then make a decision.  Don’t let the issue fester for a long time.  It is often the case that a wrong decision is better than no decision because at least you are moving forward.  If it turns out that you made the wrong decision have the courage to admit it and correct it.

Decision trees

[Image: Decisions!]

What is a decision tree?
A decision tree is a map of the possible outcomes of a series of related choices. It allows an individual or organization to weigh possible actions against one another based on their costs, probabilities, and benefits. Decision trees can be used either to drive informal discussion or to map out an algorithm that predicts the best choice mathematically.
A decision tree typically starts with a single node, which branches into possible outcomes. Each of those outcomes leads to additional nodes, which branch off into other possibilities. This gives it a treelike shape.
There are three different types of nodes: chance nodes, decision nodes, and end nodes. A chance node, represented by a circle, shows the probabilities of certain results. A decision node, represented by a square, shows a decision to be made, and an end node shows the final outcome of a decision path.
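The three node types can be represented as a small data structure.  Here is a minimal sketch in Python; the class names and the product-launch example are illustrative, not part of any standard:

```python
from dataclasses import dataclass
from typing import List, Tuple, Union

# Minimal sketch of the three node types described above.
@dataclass
class EndNode:
    value: float                            # final payoff of this path

@dataclass
class ChanceNode:
    outcomes: List[Tuple[float, "Node"]]    # (probability, child) pairs

@dataclass
class DecisionNode:
    choices: List[Tuple[str, "Node"]]       # (action label, child) pairs

Node = Union[EndNode, ChanceNode, DecisionNode]

# Example: decide whether to launch a product into an uncertain market.
tree = DecisionNode(choices=[
    ("launch", ChanceNode(outcomes=[(0.6, EndNode(100.0)),
                                    (0.4, EndNode(-30.0))])),
    ("hold", EndNode(0.0)),
])
```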
[Image: decision tree]
Decision trees can also be drawn with flowchart symbols, which some people find easier to read and understand.

Decision tree symbols

  • Decision node (square): indicates a decision to be made
  • Chance node (circle): shows multiple uncertain outcomes
  • Alternative branches (lines): each branch indicates a possible outcome or action
  • Rejected alternative: shows a choice that was not selected
  • Endpoint node (triangle): indicates a final outcome

How to draw a decision tree

To draw a decision tree, first pick a medium. You can draw it by hand on paper or a whiteboard, or you can use special decision tree software. In either case, here are the steps to follow:
1. Start with the main decision. Draw a small box to represent this point, then draw a line from the box to the right for each possible solution or action. Label them accordingly.
[Image: how to draw a decision tree]
2. Add chance and decision nodes to expand the tree as follows:
  • If another decision is necessary, draw another box.
  • If the outcome is uncertain, draw a circle (circles represent chance nodes).
  • If the problem is solved, leave it blank (for now).
[Image: decision tree]
From each decision node, draw possible solutions. From each chance node, draw lines representing possible outcomes. If you intend to analyze your options numerically, include the probability of each outcome and the cost of each action.
3. Continue to expand until every line reaches an endpoint, meaning that there are no more choices to be made or chance outcomes to consider. Then, assign a value to each possible outcome. It could be an abstract score or a financial value. Add triangles to signify endpoints.
[Image: how to draw a decision tree]
With a complete decision tree, you’re now ready to begin analyzing the decision you face.

Decision tree analysis example

By calculating the expected utility or value of each choice in the tree, you can minimize risk and maximize the likelihood of reaching a desirable outcome.
To calculate the expected utility of a choice, just subtract the cost of that decision from the expected benefits. The expected benefits are equal to the total value of all the outcomes that could result from that choice, with each value multiplied by the likelihood that it’ll occur. Here’s how we’d calculate these values for the example we made above:
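As a sketch, here is that calculation in Python.  The costs, payoffs, and probabilities below are made up for illustration, since the original example figure is not reproduced here:

```python
# Expected utility = sum(value_i * probability_i over outcomes) - cost of the action.
def expected_utility(outcomes, cost):
    """outcomes: list of (value, probability) pairs for one choice."""
    expected_benefit = sum(value * prob for value, prob in outcomes)
    return expected_benefit - cost

# Illustrative figures (not from the original example):
# Choice A costs 50; 60% chance of paying 200, 40% chance of paying 20.
# Choice B costs 10; certain payoff of 80.
eu_a = expected_utility([(200, 0.6), (20, 0.4)], cost=50)   # 120 + 8 - 50 = 78
eu_b = expected_utility([(80, 1.0)], cost=10)               # 80 - 10 = 70
best = "A" if eu_a > eu_b else "B"                          # A has the higher expected utility
```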
[Image: decision tree analysis]
When identifying which outcome is the most desirable, it’s important to take the decision maker’s utility preferences into account. For instance, some may prefer low-risk options while others are willing to take risks for a larger benefit.
A decision tree used with an accompanying probability model can also calculate the conditional probability of an event, or the likelihood that it will happen given that another event happens. To do so, simply start with the initial event, then follow the path from that event to the target event, multiplying the probability of each of those events together.
In this way, a decision tree can be used like a traditional tree diagram, which maps out the probabilities of certain events, such as flipping a coin twice.
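The coin-flip case makes the path-multiplication rule concrete.  A minimal sketch:

```python
import math

# Probability of reaching a target event = product of the probabilities
# along the path from the initial event to the target.
def path_probability(probabilities):
    return math.prod(probabilities)

# Two fair coin flips: P(heads, then heads) = 0.5 * 0.5
p_two_heads = path_probability([0.5, 0.5])   # 0.25
```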
[Image: tree diagram]

Advantages and disadvantages

Decision trees remain popular for several reasons:
  • They are easy to understand
  • They can be useful with or without hard data, and any data they use requires minimal preparation
  • New options can be added to existing trees
  • They are valuable in picking out the best of several options
  • They combine easily with other decision-making tools
However, decision trees can become excessively complex. In such cases, a more compact influence diagram can be a good alternative. Influence diagrams narrow the focus to critical decisions, inputs, and objectives.
[Image: influence diagram]

Decision trees in machine learning and data mining

A decision tree can also be used to help build automated predictive models, which have applications in machine learning, data mining, and statistics. Known as decision tree learning, this method takes into account observations about an item to predict that item’s value.
In these decision trees, nodes represent data rather than decisions. This type of tree is also known as a classification tree. Each branch contains a set of attributes, or classification rules, that are associated with a particular class label, which is found at the end of the branch.
These rules, also known as decision rules, can be expressed in an if-then clause, with each decision or data value forming a clause, such that, for instance, “if conditions 1, 2 and 3 are fulfilled, then outcome x will be the result with y certainty.”
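For instance, such rules can be written directly as nested if-then clauses.  The "should we play tennis?" classifier below is a hypothetical example (the attributes and thresholds are not from the text):

```python
# Hypothetical decision rules for a "should we play tennis?" classifier.
# Each if-clause corresponds to one path from the root to a class label.
def classify(outlook, humidity, windy):
    if outlook == "sunny":
        return "play" if humidity <= 70 else "don't play"
    if outlook == "overcast":
        return "play"
    # remaining case: outlook == "rainy"
    return "don't play" if windy else "play"

label = classify("sunny", humidity=65, windy=False)   # "play"
```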
Each additional piece of data helps the model more accurately predict which of a finite set of values the subject in question belongs to. That information can then be used as an input in a larger decision making model.
Sometimes the predicted variable will be a real number, such as a price. Decision trees with continuous, infinite possible outcomes are called regression trees.
For increased accuracy, multiple trees are sometimes used together in ensemble methods:
  • Bagging creates multiple trees by resampling the source data, then has those trees vote to reach a consensus.
  • A random forest classifier consists of multiple trees designed to increase the classification rate.
  • Boosted trees can be used for both regression and classification.
  • The trees in a rotation forest are all trained using PCA (principal component analysis) on a random portion of the data.
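The bagging idea can be sketched in a few lines of Python.  In this toy version each "model" simply predicts the majority label of its bootstrap sample, a deliberate oversimplification of training a full tree per sample:

```python
import random
from collections import Counter

random.seed(0)   # fixed seed so the toy run is reproducible

def bag_predict(labels, n_models=25):
    """Toy bagging: resample with replacement, one 'model' per sample, then vote."""
    votes = []
    for _ in range(n_models):
        sample = [random.choice(labels) for _ in labels]      # bootstrap resample
        votes.append(Counter(sample).most_common(1)[0][0])    # this model's prediction
    return Counter(votes).most_common(1)[0][0]                # ensemble majority vote

prediction = bag_predict(["spam"] * 7 + ["ham"] * 3)
```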
A decision tree is considered optimal when it represents the most data with the fewest number of levels or questions. Algorithms designed to create optimized decision trees include CART, ASSISTANT, CLS and ID3/4/5. A decision tree can also be created by building association rules, placing the target variable on the right.
Each method has to determine which is the best way to split the data at each level. Common methods for doing so include measuring the Gini impurity, information gain, and variance reduction.
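Of these, Gini impurity is the simplest to compute by hand.  A minimal sketch:

```python
from collections import Counter

def gini(labels):
    """Gini impurity: 1 - sum of squared class proportions."""
    n = len(labels)
    return 1.0 - sum((count / n) ** 2 for count in Counter(labels).values())

pure = gini(["yes", "yes", "yes"])         # 0.0: a pure node needs no further splitting
mixed = gini(["yes", "no", "yes", "no"])   # 0.5: an even two-class split
```

A splitting algorithm would pick the attribute whose split produces child nodes with the lowest weighted impurity.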
Using decision trees in machine learning has several advantages:
  • The cost of using the tree to predict data decreases with each additional data point
  • Works for either categorical or numerical data
  • Can model problems with multiple outputs
  • Uses a white box model (making results easy to explain)
  • A tree’s reliability can be tested and quantified
  • Tends to remain accurate even when assumptions about the source data are violated
But they also have a few disadvantages:
  • When dealing with categorical data with multiple levels, the information gain is biased in favor of the attributes with the most levels.
  • Calculations can become complex when dealing with uncertainty and lots of linked outcomes.
  • Conjunctions between nodes are limited to AND, whereas decision graphs allow for nodes linked by OR.


Decision trees

There will be some posts appearing here tomorrow relating to decision trees.

THEY ARE NOT IN YOUR SYLLABUS but... those of you who do Mathematics and like looking at probability may find them quite interesting.

And they do help you make decisions...and decision-making IS in your syllabus.