Decision Trees
A decision tree is a classifier in the form of a tree structure. Each node is either:
- A leaf node - indicates the value of the target attribute (class), or
- A decision node - specifies a test to be carried out, usually on a single attribute value, with one branch for each possible outcome of the test.
Decision Tree Example:
- Each internal node tests an attribute
- Each leaf node assigns a classification
- Each branch corresponds to an attribute value
Decision Tree Definition:
A decision tree is a tree in which:
- Each non-leaf node has associated with it an attribute (feature)
- Each leaf node has associated with it a classification (+ or -)
- Each arc corresponds to one possible value of the attribute of its parent node.
A decision tree is used to classify an example as follows (a small sketch appears after the steps):
- Start from the root node.
- Follow the decision branches that match the example's attribute values.
- On reaching a leaf node, the predicted class is obtained.
Classification using Decision Trees
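To make the classification procedure concrete, here is a minimal sketch (my own illustration, not from the original notes): a decision node is represented as a nested dict mapping an attribute's values to subtrees, a leaf is just a class label, and classification simply follows branches from the root until a leaf is reached. The tree shape, attribute names, and the `classify` helper are all hypothetical.

```python
# Minimal sketch (assumed representation): a decision node is
# {attribute: {value: subtree, ...}}, a leaf is a plain class label.

def classify(tree, example):
    # Leaf node reached: return the predicted class.
    if not isinstance(tree, dict):
        return tree
    # Decision node: test its attribute and follow the branch that
    # matches the example's value for that attribute.
    attribute, branches = next(iter(tree.items()))
    return classify(branches[example[attribute]], example)

# Hypothetical tree: test "outlook" at the root, then "windy" under "rain".
tree = {"outlook": {
    "sunny": "no",
    "overcast": "yes",
    "rain": {"windy": {True: "no", False: "yes"}},
}}

print(classify(tree, {"outlook": "rain", "windy": False}))  # -> yes
```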
What type of functions can decision trees represent?
Any disjunction of conjunctions of constraints on attribute values.
Try representing the following functions as decision trees (a sketch of the XOR case follows the list):
- AND
- OR
- XOR
- majority
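As a hedged illustration of the trickiest case above (my own sketch, not from the original notes): because each node tests a single attribute, XOR cannot be decided at the root alone; both branches must test the second attribute, with the leaf labels flipped between the two subtrees. The dict representation and `classify` helper below follow the same hypothetical convention used earlier.

```python
# XOR of two boolean attributes a and b as a decision tree:
# the root tests a, each branch still has to test b, and the
# "+" / "-" leaf labels are swapped between the two subtrees.
xor_tree = {"a": {
    False: {"b": {False: "-", True: "+"}},
    True:  {"b": {False: "+", True: "-"}},
}}

def classify(tree, example):
    if not isinstance(tree, dict):
        return tree
    attribute, branches = next(iter(tree.items()))
    return classify(branches[example[attribute]], example)

# Check the full truth table.
for a in (False, True):
    for b in (False, True):
        print(a, b, classify(xor_tree, {"a": a, "b": b}))
```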
Decision tree induction
- Top-down greedy search through the space of possible decision trees.
- Uses the training data.
- Quinlan developed ID3 in the 1970s, building on the earlier CLS (Concept Learning System) algorithm.
Top-Down Induction of Decision Trees: ID3
- A ← the "best" decision attribute for the next node
- Assign A as the decision attribute for the node
- For each value of A, create a new descendant node
- Sort the training examples to the leaf nodes according to the attribute value of the branch
- If all training examples are perfectly classified (same value of the target attribute), stop; else iterate over the new leaf nodes.
ID3(Atts, Q, S)
1. Input:
- Atts - a set of non-target attributes
- Q - the target attribute
- S - a training set
2. Returns a decision tree.
If S is empty, return a single node with value Failure.
If S consists of records that all have the same value of Q, return a single leaf node with that value.
If Atts is empty, return a single node with the most frequent value of Q in S.
Begin
- Let A be the attribute in Atts that best classifies S (ID3 uses the attribute with the highest information gain).
- Create a decision node that tests A.
- For each value v of A, let S_v be the records of S whose value of A is v; attach the subtree ID3(Atts - {A}, Q, S_v), or a leaf with the most frequent value of Q in S if S_v is empty.
- Return the decision node.
End
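The pseudocode above can be turned into a short runnable sketch. This is only an illustration under my own assumptions, not canonical ID3 code: examples are dicts of attribute values including the target attribute, the names `entropy`, `info_gain`, and `id3` are invented for this sketch, and "best" attribute is taken to mean highest information gain, which is the criterion ID3 uses.

```python
from collections import Counter
from math import log2

def entropy(examples, target):
    # Shannon entropy of the target attribute over this set of examples.
    counts = Counter(ex[target] for ex in examples)
    total = len(examples)
    return -sum((c / total) * log2(c / total) for c in counts.values())

def info_gain(examples, attr, target):
    # Expected reduction in entropy from splitting on attr.
    total = len(examples)
    remainder = 0.0
    for value in {ex[attr] for ex in examples}:
        subset = [ex for ex in examples if ex[attr] == value]
        remainder += len(subset) / total * entropy(subset, target)
    return entropy(examples, target) - remainder

def id3(examples, attributes, target):
    if not examples:
        return "Failure"                      # S is empty
    classes = {ex[target] for ex in examples}
    if len(classes) == 1:
        return classes.pop()                  # all records share one class: leaf
    majority = Counter(ex[target] for ex in examples).most_common(1)[0][0]
    if not attributes:
        return majority                       # Atts is empty: most frequent class
    # Pick the attribute with the highest information gain and recurse.
    best = max(attributes, key=lambda a: info_gain(examples, a, target))
    tree = {best: {}}
    for value in {ex[best] for ex in examples}:
        subset = [ex for ex in examples if ex[best] == value]
        rest = [a for a in attributes if a != best]
        tree[best][value] = id3(subset, rest, target)
    return tree

# Tiny, made-up training set: learn "play" from "outlook" and "windy".
data = [
    {"outlook": "sunny",    "windy": False, "play": "no"},
    {"outlook": "sunny",    "windy": True,  "play": "no"},
    {"outlook": "overcast", "windy": False, "play": "yes"},
    {"outlook": "rain",     "windy": False, "play": "yes"},
    {"outlook": "rain",     "windy": True,  "play": "no"},
]
print(id3(data, ["outlook", "windy"], "play"))
```

The returned tree has the same nested-dict shape used in the earlier `classify` sketch, so the two pieces can be composed to train on examples and then classify new ones.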