Commit f2973a0

Merge pull request #183 from JuliaAI/dont-export-measures

Stop exporting measures

2 parents: 1b1a35b + 88f9144

File tree: 4 files changed (+16 / -9 lines)

README.md (8 additions, 1 deletion)

````diff
@@ -92,7 +92,7 @@ apply_tree(model, [5.9,3.0,5.1,1.9])
 # apply model to all the samples
 preds = apply_tree(model, features)
 # generate confusion matrix, along with accuracy and kappa scores
-confusion_matrix(labels, preds)
+DecisionTree.confusion_matrix(labels, preds)
 # get the probability of each label
 apply_tree_proba(model, [5.9,3.0,5.1,1.9], ["Iris-setosa", "Iris-versicolor", "Iris-virginica"])
 # run 3-fold cross validation of pruned tree,
@@ -312,6 +312,13 @@ Available models are: `AdaBoostStumpClassifier`,
 `RandomForestClassifier`, `RandomForestRegressor`.
 
 
+## Feature Importances
+
+The following methods provide measures of feature importance for all models:
+`impurity_importance`, `split_importance`, `permutation_importance`. Query the document
+strings for details.
+
+
 ## Saving Models
 Models can be saved to disk and loaded back with the use of the [JLD2.jl](https://github.com/JuliaIO/JLD2.jl) package.
 ```julia
````
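
The practical effect of the README change is that `confusion_matrix` must now be qualified with the module name, while the feature-importance measures stay exported. Below is a minimal sketch of the updated workflow, reusing the iris helpers shown elsewhere in the README (`load_data`, `build_forest`, `apply_forest`); the forest hyperparameters are illustrative, not prescribed by this commit:

```julia
using DecisionTree

# Iris data, prepared as in the README examples.
features, labels = load_data("iris")
features = float.(features)
labels = string.(labels)

# 2 subfeatures, 10 trees, 0.5 subsampling, max depth 6 (illustrative values).
model = build_forest(labels, features, 2, 10, 0.5, 6)
preds = apply_forest(model, features)

# No longer exported: qualify with the module name.
cm = DecisionTree.confusion_matrix(labels, preds)

# Still exported: the feature-importance measures documented above.
impurity_importance(model)    # impurity-based importance, one entry per feature
split_importance(model)       # split-count-based importance, one entry per feature
```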

src/DecisionTree.jl (1 addition, 4 deletions)

````diff
@@ -1,5 +1,3 @@
-__precompile__()
-
 module DecisionTree
 
 import Base: length, show, convert, promote_rule, zero
@@ -13,8 +11,7 @@ export Leaf, Node, Root, Ensemble, print_tree, depth, build_stump, build_tree,
     prune_tree, apply_tree, apply_tree_proba, nfoldCV_tree, build_forest,
     apply_forest, apply_forest_proba, nfoldCV_forest, build_adaboost_stumps,
     apply_adaboost_stumps, apply_adaboost_stumps_proba, nfoldCV_stumps,
-    majority_vote, ConfusionMatrix, confusion_matrix, mean_squared_error, R2, load_data,
-    impurity_importance, split_importance, permutation_importance, accuracy
+    load_data, impurity_importance, split_importance, permutation_importance
 
 # ScikitLearn API
 export DecisionTreeClassifier, DecisionTreeRegressor, RandomForestClassifier,
````
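
For code that consumed these measures via `using DecisionTree`, the fix is a scoping change only; the functions themselves are unchanged. A brief sketch of the two ways to adapt (the `labels` and `preds` vectors here are placeholder data):

```julia
using DecisionTree

labels = ["a", "b", "a", "a"]   # placeholder ground truth
preds  = ["a", "b", "b", "a"]   # placeholder predictions

# Option 1: qualify each call with the module name.
cm = DecisionTree.confusion_matrix(labels, preds)

# Option 2: import the names explicitly, as the updated test suite does.
import DecisionTree: accuracy, R2, majority_vote, mean_squared_error
import DecisionTree: confusion_matrix, ConfusionMatrix
```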

src/classification/main.jl (4 additions, 4 deletions)

````diff
@@ -194,10 +194,10 @@ Prune tree based on prediction accuracy of each node.
   will be pruned and become a leaf.
 
 * `loss`: The loss function for computing node impurity. Available functions include
-  `DecisionTree.util.entropy`, `DecisionTree.util.gini` and `mean_squared_error`. Defaults
-  are `DecisionTree.util.entropy` and `mean_squared_error` for classification tree and
-  regression tree, respectively. If the tree is not a `Root`, this argument does not affect
-  the result.
+  `DecisionTree.util.entropy`, `DecisionTree.util.gini` and
+  `DecisionTree.mean_squared_error`. Defaults are `entropy` and `mean_squared_error` for
+  classification tree and regression tree, respectively. If the tree is not a `Root`, this
+  argument does not affect the result.
 
 For a tree of type `Root`, when any of its nodes is pruned, the `featim` field will be
 updated by recomputing the impurity decrease of that node divided by the total number of
````
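
The corrected docstring belongs to `prune_tree`. A sketch of how the `loss` argument might be exercised; it assumes `loss` is accepted as a keyword argument and that `build_tree` returns a `Root` by default, neither of which this diff states outright:

```julia
using DecisionTree

features, labels = load_data("iris")
features = float.(features)
labels = string.(labels)

model = build_tree(labels, features)   # assumed to be a `Root` here

# Default loss for a classification tree is `DecisionTree.util.entropy`;
# prune nodes whose combined purity exceeds the 0.9 threshold:
pruned = prune_tree(model, 0.9)

# Assumed keyword form, per the docstring, selecting gini impurity instead:
pruned_gini = prune_tree(model, 0.9; loss=DecisionTree.util.gini)
```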

test/runtests.jl (3 additions, 0 deletions)

````diff
@@ -9,6 +9,9 @@ using Statistics
 using Test
 using LinearAlgebra
 
+import DecisionTree: accuracy, R2, majority_vote, mean_squared_error
+import DecisionTree: confusion_matrix, ConfusionMatrix
+
 println("Julia version: ", VERSION)
 
 similarity(a, b) = first(reshape(a, 1, :) * b / norm(a) / norm(b))
````
