Decision Tree
"A decision tree takes as input an object or situation described by a set of properties, and outputs a yes/no decision. Decision trees therefore represent Boolean functions. Functions with a larger range of outputs can also be represented...." |
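To make the quoted definition concrete, here is a minimal sketch of a decision tree represented as nested nodes and evaluated to a yes/no answer. The tree, attribute names, and test instance are illustrative only, not drawn from any of the resources listed below.

```python
# Minimal sketch: a decision tree as a Boolean function over discrete attributes.
# The tree, attribute names, and example instance are illustrative only.

class Leaf:
    def __init__(self, decision):
        self.decision = decision            # True ("yes") or False ("no")

class Node:
    def __init__(self, attribute, branches):
        self.attribute = attribute          # attribute tested at this node
        self.branches = branches            # maps attribute value -> subtree

def decide(tree, instance):
    """Walk from the root to a leaf, following the instance's attribute values."""
    while isinstance(tree, Node):
        tree = tree.branches[instance[tree.attribute]]
    return tree.decision

# A tiny illustrative tree: should we wait for a table at a restaurant?
tree = Node("patrons", {
    "none": Leaf(False),
    "some": Leaf(True),
    "full": Node("hungry", {"yes": Leaf(True), "no": Leaf(False)}),
})

print(decide(tree, {"patrons": "full", "hungry": "yes"}))   # True
```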
Introductory Readings
Decision Tree Learning from The "AIxploratorium" web site. A great introductory tutorial that even includes a demo...and it all begins with: "You are in the office pool, currently betting on the outcome of the basketball game next week, between the MallRats and the Chinooks. You have to decide which team will win, then bet on that team. Of course, you could just guess, or flip a coin. Here we present a way that (typically) will do better: by using observations about the past performance of the teams."
Introduction to Decision Tree Learning. Slides for Lecture #2, covering Section 2 of Chapter 11 [Learning] in Computational Intelligence: A Logical Approach, by David Poole, Alan Mackworth and Randy Goebel (Oxford University Press, New York; 1998).
- Also see the Decision Trees Applet: "Learning is the ability to improve one's behaviour based on experience and represents an important element of computational intelligence. Decision trees are a simple yet successful technique for supervised classification learning. This applet demonstrates how to build a decision tree using a training data set and then use the tree to classify unseen examples in a test data set." - From CIspace - Tools for Learning Computational Intelligence: "Here are some applets that are designed as tools for learning and exploring concepts in artificial intelligence. They are part of the online resources for Computational Intelligence. If you are teaching or learning about AI, you may use these applets freely."
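The applet above builds a tree from a training data set by repeatedly choosing the attribute that best separates the examples; a common criterion for that choice is information gain. The sketch below computes it for a tiny, made-up table of past games in the spirit of the AIxploratorium betting example (the attribute names and records are hypothetical).

```python
from collections import Counter
from math import log2

def entropy(labels):
    """Shannon entropy (in bits) of a list of class labels."""
    total = len(labels)
    return -sum((n / total) * log2(n / total) for n in Counter(labels).values())

def information_gain(examples, attribute, target="outcome"):
    """Entropy reduction obtained by splitting `examples` on `attribute`."""
    before = entropy([e[target] for e in examples])
    after = 0.0
    for value in {e[attribute] for e in examples}:
        subset = [e[target] for e in examples if e[attribute] == value]
        after += len(subset) / len(examples) * entropy(subset)
    return before - after

# Hypothetical "past games" records: which attribute best predicts a win?
games = [
    {"where": "home", "star_playing": "yes", "outcome": "win"},
    {"where": "home", "star_playing": "no",  "outcome": "loss"},
    {"where": "home", "star_playing": "yes", "outcome": "win"},
    {"where": "away", "star_playing": "yes", "outcome": "win"},
    {"where": "away", "star_playing": "no",  "outcome": "loss"},
    {"where": "away", "star_playing": "yes", "outcome": "win"},
]
for attr in ("where", "star_playing"):
    # "star_playing" gets the larger gain, so it would be tested at the root.
    print(attr, round(information_gain(games, attr), 3))
```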
Decision Trees. By Estelle Brand and Rob Gerritsen. From The DBMS Guide to Data Mining Solutions (1998). "When a businessperson needs to make a decision based on several factors, a decision tree can help identify which factors to consider and how each factor has historically been associated with different outcomes of the decision. Data mining tools that produce decision trees create graphical models and text rules that can both predict behavior and explain patterns. End users can easily understand and apply decision trees, so these techniques are very popular data exploration tools."
VIDEO: Holly Yanco's lecture about Data Structures [Trees, Trees, Trees]. Available from AAAI's Video Archive along with background information and links to the textbook and lecture handout.
How Chess Computers Work. By Marshall Brain. Howstuffworks. "Once black makes its move, the computer goes through this whole process again, generating a new tree and evaluating all of the board positions to figure out its next move. This approach is called the minimax algorithm because it alternates between the maximums and minimums as it moves up the tree. By applying a technique called alpha beta pruning, the algorithm can run about twice as fast and requires a lot less memory."
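The alternation between maximums and minimums described in that excerpt is short enough to write out directly. Below is a bare-bones minimax sketch over an abstract game interface; the `game` object and its `moves`, `result`, `evaluate`, and `is_terminal` methods are placeholders rather than a real chess engine. A sketch of the alpha-beta refinement appears later in this section, after the alpha-beta pruning entry.

```python
# Bare-bones minimax sketch over an abstract game interface.
# The `game` object and its methods are placeholders, not a real chess engine.

def minimax(game, state, depth, maximizing):
    """Return the backed-up evaluation of `state`, searching `depth` plies."""
    if depth == 0 or game.is_terminal(state):
        return game.evaluate(state)
    values = (minimax(game, game.result(state, move), depth - 1, not maximizing)
              for move in game.moves(state))
    return max(values) if maximizing else min(values)

def best_move(game, state, depth):
    """Pick the move whose subtree backs up the best value for the maximizing player."""
    return max(game.moves(state),
               key=lambda m: minimax(game, game.result(state, m), depth - 1, False))
```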
General Readings
The Minimax Game Tree. From AI Horizon. "A detailed explanation of one of the most important data structures ever created for Game Artificial Intelligence. The minimax tree is at the heart of almost every board game program in existence."
- Also see this interactive exhibit from the Computer History Museum: Chess Software Basics
Rules are Much More than Decision Trees. From Information Discovery, Inc. "Rule induction and decision trees are related approaches to discovering logical patterns within data sets. Although rules and decision trees may seem similar at first, they are in fact quite different both in terms of the information they discover from databases and in terms of their behavior on new data items. In essence, decision trees may be viewed as a simplistic approach to rule discovery."
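One way to see the relationship described above is that every root-to-leaf path in a decision tree can be read off as an if-then rule. The sketch below does exactly that for a small tree written as nested tuples and dicts; the weather-style tree is illustrative, not taken from the article.

```python
# Sketch: turn every root-to-leaf path of a small decision tree into an if-then rule.
# Internal nodes are (attribute, {value: subtree}) tuples; leaves are plain strings.

def tree_to_rules(tree, conditions=()):
    """Yield (conditions, conclusion) pairs, one per root-to-leaf path."""
    if isinstance(tree, str):                      # leaf: a conclusion was reached
        yield list(conditions), tree
        return
    attribute, branches = tree
    for value, subtree in branches.items():
        yield from tree_to_rules(subtree, conditions + ((attribute, value),))

toy_tree = ("outlook", {
    "sunny": ("humidity", {"high": "don't play", "normal": "play"}),
    "overcast": "play",
    "rain": ("windy", {"yes": "don't play", "no": "play"}),
})

for conds, conclusion in tree_to_rules(toy_tree):
    antecedent = " AND ".join(f"{attr} = {val}" for attr, val in conds)
    print(f"IF {antecedent} THEN {conclusion}")
```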
Decision Tree Learning lecture slides to accompany Tom Mitchell's book, Machine Learning (McGraw Hill, 1997).
- Also available: Decision tree learning code - Companion to Chapter 3
Decision Trees Tutorial Slides. By Andrew Moore of The Auton Lab, part of Carnegie Mellon University's School of Computer Science. "The Decision Tree is one of the most popular classification algorithms in current use in Data Mining and Machine Learning. This tutorial can be used as a self-contained introduction to the flavor and terminology of data mining without needing to review many statistical or probabilistic pre-requisites."
- Also see Andrew Moore's Game Tree Search Algorithms, including Alpha-Beta Search.
Decision Trees. Chapter Six of Introduction to Machine Learning - Draft of Incomplete Notes. By Nils J. Nilsson.
Decision Making Under Uncertainty. Some information on research by David Poole. "Michael Horsch and I have been working on flexible (anytime) evaluation of large influence diagrams by 'information refinement'. The idea is to build a decision tree expressing how each decision node depends on its information predecessors. Often good decisions can be expressed with small decision trees. The following paper shows how this can be done for a single decision: M.C. Horsch and D. Poole. Flexible Policy Construction by Information Refinement, Proc. Twelfth Conference on Uncertainty in Artificial Intelligence, Portland, Oregon, August 1996." (You can access this paper via a link from his site.)
Alpha-beta pruning. Part of Dr. A. N. Walker's course materials for Game Theory. "An insight into the help that alpha-beta pruning gives can be had by imagining that Kasparov is sitting next to you suggesting good moves to try...."
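The pruning that excerpt alludes to discards branches that provably cannot change the minimax value. Here is a sketch of the standard alpha-beta recursion, using the same abstract `game` interface assumed in the minimax sketch earlier in this section (still a placeholder, not production chess code).

```python
# Alpha-beta sketch over the same abstract `game` interface as the minimax sketch.
# alpha = best value the maximizer can already guarantee; beta = same for the minimizer.

def alphabeta(game, state, depth, alpha=float("-inf"), beta=float("inf"),
              maximizing=True):
    if depth == 0 or game.is_terminal(state):
        return game.evaluate(state)
    if maximizing:
        value = float("-inf")
        for move in game.moves(state):
            value = max(value, alphabeta(game, game.result(state, move),
                                         depth - 1, alpha, beta, False))
            alpha = max(alpha, value)
            if alpha >= beta:          # minimizer already has a better option elsewhere: prune
                break
        return value
    value = float("inf")
    for move in game.moves(state):
        value = min(value, alphabeta(game, game.result(state, move),
                                     depth - 1, alpha, beta, True))
        beta = min(beta, value)
        if alpha >= beta:              # maximizer already has a better option elsewhere: prune
            break
    return value
```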
Related Resources
Incremental Decision Tree Induction. Developed by the UMass Machine Learning Laboratory. "ITI (Incremental Tree Inducer) is a program that constructs decision trees automatically from labeled examples. Although the program is called ITI, it includes both the ITI and DMTI (Direct Metric Tree Induction) algorithms. One runs the ITI program, but with certain command options invokes the DMTI algorithm within."
Silicon Graphics' MineSet(TM) Tree Visualizer. "The Tree Visualizer allows one to navigate (fly) through the tree, zoom in on interesting nodes, click on bars to get counts, and mark interesting places in the tree." Check out the videos and snapshots.
Other References Offline
How Game-Playing Computers Think -> Minimaxing & Alpha-beta pruning. An illustration from Computers, Games and the Real World, by Matthew L. Ginsberg. Scientific American (special issue: Exploring Intelligence - Winter 1998).
Decision Trees. MIT Encyclopedia of Cognitive Science (subscription req'd.). Following the overview, you'll find references, readings and links to related resources. "A decision tree with a range of discrete (symbolic) class labels is called a classification tree, whereas a decision tree with a range of continuous (numeric) values is called a regression tree. A domain element is called an instance or an example or a case, or sometimes by another name appropriate to the context. An instance is represented as a conjunction of variable values. Each variable has its own domain of possible values, typically discrete or continuous. The space of all possible instances is defined by set of possible instances that one could generate using these variables and their possible values (the cross product). Decision trees are attractive because they show clearly how to reach a decision, and because they are easy to construct automatically from labeled instances. Two well known programs for constructing decision trees are C4.5 (Quinlan 1993) and CART (Classification and Regression Tree) (Breiman et al. 1984)."
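The classification-tree versus regression-tree distinction drawn in that overview is visible in most modern toolkits. The sketch below uses scikit-learn, which implements CART-style trees, to fit both kinds on a small made-up data set; the feature names and values are invented for illustration.

```python
# Illustration of the classification-tree vs. regression-tree distinction using
# scikit-learn (CART-style trees); the toy data are made up.

from sklearn.tree import DecisionTreeClassifier, DecisionTreeRegressor

X = [[25, 0], [35, 1], [45, 0], [55, 1], [65, 0], [75, 1]]   # e.g. [age, smoker]

# Classification tree: the target is a discrete (symbolic) class label.
y_class = ["low", "low", "medium", "medium", "high", "high"]
clf = DecisionTreeClassifier(criterion="entropy", max_depth=2).fit(X, y_class)
print(clf.predict([[40, 1]]))        # -> a label such as 'medium'

# Regression tree: the target is a continuous (numeric) value.
y_reg = [1.0, 1.5, 2.5, 3.0, 4.5, 5.0]
reg = DecisionTreeRegressor(max_depth=2).fit(X, y_reg)
print(reg.predict([[40, 1]]))        # -> a numeric prediction
```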