Perceptron Learning Algorithm (pseudocode) Flashcards

(14 cards)

1
Q

Describe the Perceptron learning rule

A

An automatic algorithm that works efficiently even for high-dimensional problems

It learns the parameters, the weights and biases, directly from the data

2
Q

What is the purpose of the trainable parameters

A

To give more importance to the features that contribute more towards the prediction

3
Q

Explain the steps of an artificial neural network

A

Inputs are passed in with their corresponding weights

Σ: a linear transformation g(x) combines the multiple weighted inputs into one output value

f(z): the activation function is applied to g(x)

ŷ (y hat): forward pass - a prediction is calculated

Error: backward pass - the model parameters are adjusted according to the error
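The steps above can be sketched as a single artificial neuron in Python; the step activation and the example weight/bias values are assumptions for illustration:

```python
import numpy as np

def step(z):
    """Step activation f(z): 1 if z >= 0, else 0."""
    return 1 if z >= 0 else 0

# Inputs passed in with corresponding weights (example values assumed)
x = np.array([1.0, 0.0])   # input features
w = np.array([0.4, 0.6])   # weights
b = -0.5                   # bias

# Sigma: linear transformation g(x) combines the weighted inputs into one value
g = np.dot(w, x) + b

# f(z): activation applied to g(x); this forward pass gives the prediction y_hat
y_hat = step(g)

# Error: a backward pass would then adjust w and b according to (y - y_hat)
```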

4
Q

What is the bias term b

A

The bias b is a learnable scalar that shifts the decision boundary off the origin

5
Q

How do we merge the bias b with the weight vector

A

Append the bias b to the end of the weight vector, and append a constant 1 to the end of the input vector; when the two are dotted together, the constant 1 is multiplied by the bias, so the bias is included in the single dot product W^T X'
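A minimal numpy sketch of the merge (the example values are assumptions):

```python
import numpy as np

w = np.array([0.4, 0.6])   # weights (example values)
b = -0.5                   # bias term
x = np.array([1.0, 0.0])   # input vector

# Merge: append b to the weight vector, and a constant 1 to the input vector
w_aug = np.append(w, b)    # [0.4, 0.6, -0.5]
x_aug = np.append(x, 1.0)  # [1.0, 0.0, 1.0]

# The single dot product now includes the bias:
# w_aug . x_aug == w . x + b
merged = w_aug @ x_aug
```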

6
Q

Write the Perceptron learning algorithm in pseudocode

A

P and N are the input sets with class labels (+1, -1)

The learning rate is a hyperparameter

Initialise W randomly: this initialises the weights and biases

LINE: while not converged do:

Convergence refers to the training stopping criterion, which could be:
No more errors
The model stops making improvements
The maximum number of iterations is reached

LINE: do the forward pass
Apply the activation function

LINE: if X' ∈ P and W^T X' < 0
For each sample in the positive class, check whether it is classified as negative; if it is, append the negative of the sample (-X') to the error set (delta)

Next if statement:
If a sample in the negative class is classified as positive, append the sample (+X') to the error set (delta)

After the if statements:

New set of weights =
weights - learning rate * sum of all errors
Then increase t (the iteration count)
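The pseudocode above can be sketched as a runnable batch perceptron in Python; the zero initialisation, learning rate, and OR training data are assumptions for illustration (the deck initialises W randomly):

```python
import numpy as np

def train_perceptron(P, N, eta=0.1, max_iter=100):
    """P / N: lists of augmented inputs X' (constant 1 appended) for the
    positive and negative classes. Returns the learned weight vector W."""
    W = np.zeros(len(P[0]))        # initialised to zeros here for reproducibility
                                   # (the deck initialises W randomly)
    for t in range(max_iter):      # while not converged do
        delta = []                 # the error set
        for x in P:
            if W @ x < 0:          # positive sample classified as negative
                delta.append(-x)   # append -X'
        for x in N:
            if W @ x >= 0:         # negative sample classified as positive
                delta.append(x)    # append +X'
        if not delta:              # converged: no more errors
            break
        W = W - eta * np.sum(delta, axis=0)  # W <- W - eta * sum of errors
    return W

# OR as the target: (0,0) is the negative class, the rest are positive
P = [np.array([0., 1., 1.]), np.array([1., 0., 1.]), np.array([1., 1., 1.])]
N = [np.array([0., 0., 1.])]
W = train_perceptron(P, N)
```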

7
Q

How to tell which class is positive and negative

A

Take the first two components of the weight vector as coordinates; this vector points in the direction of the positive class. For example, (1,1) points into the top-right quadrant, so the class that occupies the majority of the top-right quadrant relative to the decision boundary is the positive one

8
Q
A

Sum of errors = (-0.6, 0.7, 0); the next step is shown in the image

9
Q

We are using the boolean logic OR function as the target and checking whether the activation function classifies it correctly

What is X’ for the training row 1

A

X' = (x1, x2, 1)

(the constant 1 is for the bias)

10
Q

The result of the activation function is the parameters dotted with X', as shown by g(x) = W^T X'. Which row will be misclassified and not give the same answer as OR?

A

Row 1: g = -0.5, so gives 0. Correct
Row 2: g = 0.1, so gives 1. Correct
Row 3: g = -0.1, so gives 0. Incorrect (OR expects 1)
Row 4: g = 0.5, so gives 1. Correct
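Working backwards from the four g(x) values listed, the merged weight vector must be W = [0.4, 0.6, -0.5]; a quick check in Python (the step threshold at 0 is an assumption):

```python
import numpy as np

# Weights reconstructed from the four listed g(x) values (bias merged in)
W = np.array([0.4, 0.6, -0.5])

# Augmented inputs X' = (x1, x2, 1) for the OR truth table rows 1-4
rows = [np.array([0., 0., 1.]),
        np.array([0., 1., 1.]),
        np.array([1., 0., 1.]),
        np.array([1., 1., 1.])]
targets = [0, 1, 1, 1]           # OR(x1, x2)

g_values = [float(W @ x) for x in rows]          # g(x) = W^T X'
preds = [1 if g >= 0 else 0 for g in g_values]   # step activation
# g_values come out as [-0.5, 0.1, -0.1, 0.5]; only row 3 disagrees with OR
```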

11
Q

What is the sum of errors (delta)

A

The sum of errors is given by the sum of the misclassified X' vectors
(X' is negated when a positive-class sample is wrongly classified as negative)

SO:

Row 3's X' = [1, 0, 1]

so the sum of errors (delta) = [-1, 0, -1]^T
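A short sketch of building delta for this example, using the weights reconstructed from the listed g(x) values:

```python
import numpy as np

# Weights reconstructed from the deck's g(x) values (bias merged in)
W = np.array([0.4, 0.6, -0.5])
P = [np.array([0., 1., 1.]), np.array([1., 0., 1.]), np.array([1., 1., 1.])]  # OR positives
N = [np.array([0., 0., 1.])]                                                  # OR negative

delta = []
for x in P:
    if W @ x < 0:          # positive sample misclassified as negative
        delta.append(-x)   # append -X'
for x in N:
    if W @ x >= 0:         # negative sample misclassified as positive
        delta.append(x)    # append +X'

total = np.sum(delta, axis=0)   # only row 3 qualifies -> [-1, 0, -1]
```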

12
Q

Formula for new set of weights

A

New set of weights =

weights - learning rate * sum of all errors
Then increase t (the iteration count)
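A worked instance with the OR example's numbers; the learning rate η = 0.7 is an assumption (it is the value consistent with the deck's final answer of [1.1, 0.6, 0.2]):

```python
import numpy as np

W = np.array([0.4, 0.6, -0.5])   # current weights (reconstructed from the deck's g(x) values)
delta = np.array([-1., 0., -1.]) # sum of errors from the OR example
eta = 0.7                        # learning rate (assumed)

# new weights = weights - learning rate * sum of all errors
W_new = W - eta * delta          # -> [1.1, 0.6, 0.2]
```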

13
Q

Formula for sum of errors (delta)

A

The sum of the misclassified X' vectors: add -X' when a positive-class sample is misclassified as negative, and +X' when a negative-class sample is misclassified as positive

14
Q
A

[1.1,0.6,0.2]
