vumichien committed on
Commit f45df97 · 1 Parent(s): 0b7082e

Update app.py

Files changed (1)
  1. app.py +4 -2
app.py CHANGED
@@ -14,7 +14,10 @@ model_card = f"""
 ## Description
 
 The **DecisionTreeClassifier** employs a pruning technique that can be configured using the cost complexity parameter, commonly referred to as **ccp_alpha**.
-By increasing the value of **ccp_alpha**, a greater number of nodes can be pruned. This demo demonstrates the impact of **ccp_alpha** on tree regularization
+By increasing the value of **ccp_alpha**, a greater number of nodes can be pruned.
+In this demo, a DecisionTreeClassifier is trained on the Breast Cancer dataset, and several figures show how **ccp_alpha** affects the tree: the impurity of its leaves, its depth, its number of nodes, and its accuracy on the train and test data.
+Based on this information, the best value of **ccp_alpha** is chosen, and the demo reports the train and test accuracy obtained with it.
+You can play around with different ``test size`` and ``random state`` values.
 
 ## Dataset
 
@@ -54,7 +57,6 @@ def get_ccp(test_size, random_state):
     ax2.set_title("Number of nodes vs alpha")
 
     fig3, ax3 = plt.subplots()
-
     ax3.plot(ccp_alphas, depth, marker="o", drawstyle="steps-post")
     ax3.set_xlabel("alpha")
     ax3.set_ylabel("depth of tree")
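For context, here is a minimal sketch of the kind of analysis the updated description refers to, assuming the demo follows scikit-learn's standard cost-complexity pruning workflow (`cost_complexity_pruning_path`). The full app.py is not shown in this diff, so the concrete `test_size` and `random_state` values and the rule for picking the "best" **ccp_alpha** (highest test accuracy) are illustrative assumptions, not taken from the committed code.

```python
# Sketch only: approximates the pruning analysis described in the diff above.
import matplotlib.pyplot as plt
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

test_size, random_state = 0.25, 42  # assumed values; the demo lets you vary these

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=test_size, random_state=random_state
)

# Effective alphas along the pruning path of the unpruned tree.
path = DecisionTreeClassifier(random_state=random_state).cost_complexity_pruning_path(
    X_train, y_train
)
ccp_alphas = path.ccp_alphas

# Fit one tree per alpha and record its size, depth, and accuracy.
trees = [
    DecisionTreeClassifier(random_state=random_state, ccp_alpha=a).fit(X_train, y_train)
    for a in ccp_alphas
]
node_counts = [t.tree_.node_count for t in trees]
depth = [t.tree_.max_depth for t in trees]
train_scores = [t.score(X_train, y_train) for t in trees]
test_scores = [t.score(X_test, y_test) for t in trees]

# Figures like the ones in the diff: nodes vs alpha and depth vs alpha.
fig2, ax2 = plt.subplots()
ax2.plot(ccp_alphas, node_counts, marker="o", drawstyle="steps-post")
ax2.set_xlabel("alpha")
ax2.set_ylabel("number of nodes")
ax2.set_title("Number of nodes vs alpha")

fig3, ax3 = plt.subplots()
ax3.plot(ccp_alphas, depth, marker="o", drawstyle="steps-post")
ax3.set_xlabel("alpha")
ax3.set_ylabel("depth of tree")
ax3.set_title("Depth vs alpha")

# Assumed selection rule: the alpha with the highest test accuracy.
best_i = max(range(len(ccp_alphas)), key=lambda i: test_scores[i])
print(f"best ccp_alpha={ccp_alphas[best_i]:.4f}, "
      f"train acc={train_scores[best_i]:.3f}, test acc={test_scores[best_i]:.3f}")
```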