Notebook 9 COPY Flashcards

(51 cards)

1
Q

What are some advantages of Decision trees?

A
2
Q

What are some disadvantages of Decision trees?

A
3
Q

How do we apply the Decision Tree classifier to our iris dataset?

A
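A minimal sketch of applying a Decision Tree classifier to the iris dataset, assuming scikit-learn and a standard train/test split (variable names are illustrative, not from the notebook):

```python
# Fit a DecisionTreeClassifier on the iris dataset
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

iris = load_iris()
X_train, X_test, y_train, y_test = train_test_split(
    iris.data, iris.target, random_state=0)

clf = DecisionTreeClassifier(random_state=0)
clf.fit(X_train, y_train)
print(clf.score(X_test, y_test))  # accuracy on held-out data
```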
4
Q

How do you show a Decision Tree plot? How would you change this to only create a tree of max depth 2?

A

clf = tree.DecisionTreeClassifier(max_depth=2)  # limit the tree to depth 2
tree.plot_tree(clf)  # draws the fitted tree (clf must be fitted first)

5
Q

How do we interpret this decision tree plot?

A
6
Q

What does the Ensemble method do?

A
7
Q

What are the import statements needed for Random Forest and Gradient Boosted Trees?

A
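A sketch of the likely imports, assuming scikit-learn's ensemble module (the notebook's exact list may differ):

```python
# Both ensemble tree models live in sklearn.ensemble
from sklearn.ensemble import RandomForestClassifier
from sklearn.ensemble import GradientBoostingClassifier
```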
8
Q

Use a Random Forest Classifier on this dataset?

A

Go through the hyperparameters on the dataset.
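A minimal sketch, assuming the iris data and split used in earlier cards (dataset and hyperparameter values are illustrative):

```python
# RandomForestClassifier with a couple of explicit hyperparameters
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

iris = load_iris()
X_train, X_test, y_train, y_test = train_test_split(
    iris.data, iris.target, random_state=0)

rf = RandomForestClassifier(n_estimators=100, max_depth=None, random_state=0)
rf.fit(X_train, y_train)
print(rf.score(X_test, y_test))
```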

9
Q

Use Gradient Boosting on this dataset?

A
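A minimal sketch, again assuming the iris data from earlier cards (parameters are illustrative defaults, not the notebook's):

```python
# GradientBoostingClassifier fits trees sequentially on residual errors
from sklearn.datasets import load_iris
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

iris = load_iris()
X_train, X_test, y_train, y_test = train_test_split(
    iris.data, iris.target, random_state=0)

gb = GradientBoostingClassifier(n_estimators=100, learning_rate=0.1,
                                random_state=0)
gb.fit(X_train, y_train)
print(gb.score(X_test, y_test))
```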
10
Q

How do you code this?

A
11
Q

How do we apply the Random Forest Classifier to this imbalanced data set?

A

If this worked correctly you should find an AUC of about 0.84.

The RandomForestClassifier has a parameter class_weight. The "balanced" mode uses the values of y to automatically adjust weights inversely proportional to class frequencies in the input data, so setting class_weight="balanced" should improve the AUC to about 0.87 for this artificial data set.
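A sketch of the idea on an artificial imbalanced set. The make_classification parameters here are illustrative, so the AUC will not match the notebook's 0.84 / 0.87 figures:

```python
# class_weight="balanced" reweights classes inversely to their frequency
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1000, weights=[0.9, 0.1], random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

rf = RandomForestClassifier(class_weight="balanced", random_state=0)
rf.fit(X_train, y_train)
auc = roc_auc_score(y_test, rf.predict_proba(X_test)[:, 1])
print(auc)  # AUC on the held-out imbalanced data
```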

12
Q

What is a Multi-layer Perceptron?

A
13
Q

What are the advantages of a multi-layer Perceptron?

A
14
Q

What are the disadvantages of a multi-layer Perceptron?

A
15
Q

What does MLPClassifier do?

A
16
Q

How does the MLPClassifier train the algorithm?

A
17
Q

What does the MLPClassifier allow for?

A
18
Q

What does the MLPRegressor do?

A
19
Q

What parameter do both the MLPRegressor and MLPClassifier use for regularisation?

A
20
Q

What are the three algorithms MLP uses to train?

A
21
Q

What is the time complexity of backward propagation?

22
Q

What are some tips on using MLP in practice?

23
Q

What are all the import statements required for using MLPClassifier?
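A sketch of the likely imports for the MLPClassifier workflow (the notebook's exact list may differ, e.g. in which scalers it uses):

```python
# MLPClassifier plus the usual preprocessing and splitting helpers
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split
```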

25
Note that the dataset has 100 samples by default. This is a good size for our purposes. How does the complexity of backpropagation depend on the sample size?
ADD TO THIS
26
27
Why do we not fit as well on test data?
It's because of overfitting, but what is a more robust answer?
28
What does the MinMaxScaler do?
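A small self-contained example of what MinMaxScaler does (the data here is illustrative): it rescales each feature independently to a given range, [0, 1] by default.

```python
# MinMaxScaler maps each column to [0, 1] via (x - min) / (max - min)
import numpy as np
from sklearn.preprocessing import MinMaxScaler

X = np.array([[1.0, 200.0], [2.0, 300.0], [3.0, 400.0]])
scaled = MinMaxScaler().fit_transform(X)
print(scaled)  # each column now spans exactly 0..1
```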
29
Why does the training data differ?
30
What does QuantileTransformer do?
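A sketch of QuantileTransformer on an illustrative skewed feature: it maps each feature to a uniform (or normal) distribution using its empirical quantiles.

```python
# QuantileTransformer spreads a skewed feature uniformly over [0, 1]
import numpy as np
from sklearn.preprocessing import QuantileTransformer

rng = np.random.default_rng(0)
X = rng.exponential(size=(100, 1))  # heavily right-skewed feature
qt = QuantileTransformer(n_quantiles=100, output_distribution="uniform")
Xt = qt.fit_transform(X)
print(Xt.min(), Xt.max())  # transformed values lie in [0, 1]
```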
31
32
What is important to do before proceeding to tune the hyperparameters of MLP?
33
Sanity Check for MLP using SVC?
34
35
Use the MLPClassifier?
Note: given that the classifier score for moons using the StandardScaler is 0.95, we can conclude that is the best.
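A sketch of the setup the note describes, assuming scikit-learn's make_moons data; the 0.95 score is from the notebook, and the number here depends on the seed and noise level chosen for illustration:

```python
# MLPClassifier preceded by StandardScaler, wrapped in a pipeline
from sklearn.datasets import make_moons
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = make_moons(n_samples=200, noise=0.2, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = make_pipeline(StandardScaler(),
                      MLPClassifier(max_iter=2000, random_state=0))
model.fit(X_train, y_train)
print(model.score(X_test, y_test))
```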
36
What are some of the most important Hyperparameters for the MLP?
37
What are the different choices of activation hyperparameters for the MLPClassifier??
* The model gets worse with higher alpha and learning_rate_init.
38
What are the different solvers hyperparameters for the MLPClassifier?
Note: The default solver ‘adam’ works pretty well on relatively large datasets (with thousands of training samples or more) in terms of both training time and validation score. For small datasets, however, ‘lbfgs’ can converge faster and perform better.
39
For the hidden_layer_sizes hyperparameter, what do you need to do to create a single layer with 10 neurons? What about 3 layers of 10 neurons?
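The answer can be sketched directly: hidden_layer_sizes takes a tuple with one entry per hidden layer.

```python
from sklearn.neural_network import MLPClassifier

# One hidden layer of 10 neurons: a 1-tuple
clf_one = MLPClassifier(hidden_layer_sizes=(10,))
# Three hidden layers of 10 neurons each: a 3-tuple
clf_three = MLPClassifier(hidden_layer_sizes=(10, 10, 10))
print(clf_one.hidden_layer_sizes, clf_three.hidden_layer_sizes)
```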
40
Perform hyperparameter tuning using a loop over the number of neurons in a single hidden layer. Print a graph of the results and decide on the best parameter.
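A sketch of the loop, with an illustrative dataset, size range, and seed (the notebook's will differ); the recorded scores can then be plotted against the layer sizes:

```python
# Loop over single-hidden-layer sizes and record held-out scores
from sklearn.datasets import make_moons
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

X, y = make_moons(n_samples=200, noise=0.2, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

sizes = list(range(2, 21, 2))
scores = []
for n in sizes:
    clf = MLPClassifier(hidden_layer_sizes=(n,), max_iter=2000,
                        random_state=0)
    clf.fit(X_train, y_train)
    scores.append(clf.score(X_test, y_test))

best_score, best_n = max(zip(scores, sizes))
print(best_n, best_score)
# To visualise: plt.plot(sizes, scores) with matplotlib.pyplot as plt
```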
41
Perform hyperparameter tuning using a loop over the number of neurons from 2-40 for 2 hidden layers. Print a graph of the results and decide on the best parameter.
42
43
44
How do we plot the gradient descent over time for this small sample?
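A sketch, assuming an sgd/adam-trained MLP (the data here is illustrative): the fitted model exposes loss_curve_, one loss value per iteration, which can be plotted to show the descent over time.

```python
# loss_curve_ records the training loss at every iteration
from sklearn.datasets import make_moons
from sklearn.neural_network import MLPClassifier

X, y = make_moons(n_samples=100, noise=0.2, random_state=0)
clf = MLPClassifier(max_iter=500, random_state=0)  # default solver is adam
clf.fit(X, y)

losses = clf.loss_curve_
print(len(losses))  # number of iterations actually run
# plt.plot(losses) with matplotlib.pyplot as plt draws the descent curve
```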
45
What is the output of this data? How do we interpret it?
46
In supervised learning, what is important to note about the features?
47
What is the feature_importances_ property? How and when do we use it?
48
Apply the RandomForestClassifier to the Iris dataset using the feature_importances_ property?
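A minimal sketch (variable names are illustrative): after fitting, the forest's feature_importances_ gives one value per feature.

```python
# Impurity-based feature importances from a fitted random forest
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier

iris = load_iris()
rf = RandomForestClassifier(random_state=0).fit(iris.data, iris.target)

importances = rf.feature_importances_  # one value per feature, summing to 1
for name, imp in zip(iris.feature_names, importances):
    print(name, round(imp, 3))
```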
49
What should the importances sum to?
50
What is Permutation Importance?
51
Apply the RandomForestClassifier to the Iris dataset using permutation_importance? What does the perm_importance sum to? How can we interpret the graph?
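A sketch using scikit-learn's permutation_importance (split and repeat count are illustrative). Unlike feature_importances_, these values are mean score drops per shuffled feature, so they need not sum to 1:

```python
# Permutation importance: score drop when each feature is shuffled
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

iris = load_iris()
X_train, X_test, y_train, y_test = train_test_split(
    iris.data, iris.target, random_state=0)

rf = RandomForestClassifier(random_state=0).fit(X_train, y_train)
result = permutation_importance(rf, X_test, y_test, n_repeats=10,
                                random_state=0)
for name, imp in zip(iris.feature_names, result.importances_mean):
    print(name, round(imp, 3))
```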