
CSCI 3230 Written Assignment 1
I. Neural Network
a) For the following single perceptron, write the output unit O in terms of the input units Ij's
and the weights wj's, where the activation function is denoted by f. Assume there is an input
unit I0 = 1 connecting to the neuron with weight w0.
Figure 1. Single perceptron
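As an illustration for part a), here is a minimal sketch of a single perceptron's output, assuming (purely for concreteness) a sigmoid activation f and made-up example inputs and weights:

```python
import numpy as np

def perceptron_output(inputs, weights, w0, f=lambda z: 1.0 / (1.0 + np.exp(-z))):
    """Output of a single perceptron as in Figure 1: the weighted sum of the inputs
    plus the bias weight w0 (acting on I0 = 1), passed through the activation f.
    The sigmoid default is an assumed example, not the required answer format."""
    z = w0 + np.dot(weights, inputs)   # w0*I0 + w1*I1 + ... + wn*In, with I0 = 1
    return f(z)

# Example usage with made-up numbers
I = np.array([0.5, -1.0, 2.0])
w = np.array([0.1, 0.4, -0.3])
print(perceptron_output(I, w, w0=0.2))
```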
b) For the following multiple layer perceptron (neural network), the error term is
$E = \frac{1}{2}\sum_{m=1}^{H_{K+1}} (O_m - T_m)^2$,
where $T_m$ is the m-th target (desired) output. Let hi,k be the output of Ni,k and hi,0 = 1. Write down the equations of hi,k and Om.
c) For the activation function $f(z) = \frac{1}{1 + e^{-z}}$, show that $f'(z) = f(z)\,(1 - f(z))$.
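As a quick numerical sanity check for parts b) and c), the sketch below evaluates the error term E on made-up outputs and targets, and compares a central-difference estimate of f'(z) against f(z)(1 - f(z)); the specific values are arbitrary.

```python
import numpy as np

def f(z):
    """Sigmoid activation from part c)."""
    return 1.0 / (1.0 + np.exp(-z))

# Error term from part b): E = 1/2 * sum_m (O_m - T_m)^2 (made-up example values)
O = np.array([0.8, 0.3, 0.6])   # network outputs
T = np.array([1.0, 0.0, 1.0])   # target outputs
print("E =", 0.5 * np.sum((O - T) ** 2))

# Numerical check of the identity f'(z) = f(z) * (1 - f(z)) at a few points
eps = 1e-6
for z in (-2.0, 0.0, 1.5):
    numeric = (f(z + eps) - f(z - eps)) / (2 * eps)   # central difference
    analytic = f(z) * (1 - f(z))
    print(z, numeric, analytic)   # the two estimates should agree closely
```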
[Figure 2 diagram: inputs I1, I2, ..., In enter the input layer L0; hidden layers L1, ..., LK follow; the output layer LK+1 produces the outputs O1, O2, ..., OHK+1.]
Figure 2. Multiple layer perceptron, where L0 is the input layer and LK+1 is the output layer.
Each layer Li has Hi neurons. Each node Ni,j, the j-th node in layer Li, is a single perceptron
as in part a). Every pair of nodes in adjacent layers is connected by a weight wi,j,k, which is
the weight connecting Ni,j and Ni+1,k.
d) One way to iteratively minimize a smooth function g(x) is to use gradient descent.
The rule is $x_{t+1} = x_t - \alpha\, g'(x_t)$, applied until $|x_{t+1} - x_t| < \epsilon$, where $\alpha$ and $\epsilon$ are the learning
rate and the tolerance level respectively. Explain why we need the learning rate $\alpha$.
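To see why the learning rate matters in part d), here is a small sketch that runs the update rule on an assumed toy function g(x) = x^2 (so g'(x) = 2x; this choice is not part of the assignment): a small alpha converges, while a too-large alpha overshoots and diverges.

```python
def gradient_descent(g_prime, x0, alpha, eps=1e-6, max_steps=100):
    """Iterate x_{t+1} = x_t - alpha * g'(x_t) until |x_{t+1} - x_t| < eps."""
    x = x0
    for _ in range(max_steps):
        x_new = x - alpha * g_prime(x)
        if abs(x_new - x) < eps:          # tolerance test from part d)
            return x_new
        x = x_new
    return x                              # gave up after max_steps iterations

g_prime = lambda x: 2 * x                 # derivative of the assumed g(x) = x^2
print(gradient_descent(g_prime, x0=5.0, alpha=0.1))   # converges towards the minimum at 0
print(gradient_descent(g_prime, x0=5.0, alpha=1.1))   # step is too large: the iterates blow up
```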
e) For fixed input, the error term $E = \frac{1}{2}\sum_{m=1}^{H_{K+1}} (O_m - T_m)^2$ is also a function of the weights wi,j,k. Our
goal is to find a set of wi,j,k's such that E is minimized.
i) For figure 3, express $\frac{\partial E}{\partial w_{K,j,k}}$ for the purple-colored output neuron in terms of
the symbols in figure 3 and Tk.
ii) Referring to the yellow-colored hidden neuron and the symbols in figure 4, show
that $\frac{\partial E}{\partial h_{i+1,k}} = \sum_{m=1}^{H_{i+2}} \Delta_{i+2,m} \cdot w_{i+1,k,m}$, where $\Delta_{i,k} = \frac{\partial E}{\partial h_{i,k}} \cdot h_{i,k} \cdot (1 - h_{i,k})$.
iii) For figure 4, express $\frac{\partial E}{\partial w_{i,j,k}}$ for the yellow-colored hidden neuron in terms of
the symbols in figure 4 and $\Delta_{i+1,k}$.
[Figure 3 diagram: the hidden outputs hK,1, hK,2, ..., hK,j, ..., hK,HK in layer LK feed the output node Ok (= hK+1,k) in layer LK+1 through the weights WK,1,k, WK,2,k, ..., WK,j,k, ..., WK,HK,k; Tk is the target for Ok.]
[Figure 4 diagram: the hidden outputs hi,1, hi,2, ..., hi,j, ..., hi,Hi in layer Li feed the node hi+1,k in layer Li+1 through the weights Wi,1,k, Wi,2,k, ..., Wi,j,k, ..., Wi,Hi,k; hi+1,k in turn feeds the nodes hi+2,1, hi+2,2, ..., hi+2,m, ..., hi+2,Hi+2 in layer Li+2 through the weights Wi+1,k,1, Wi+1,k,2, ..., Wi+1,k,m, ..., Wi+1,k,Hi+2.]
Figure 3. The bottom part of the multiple layer perceptron, where hi,j is the output value of node
Ni,j. The purple-colored node indicates each of the HK+1 nodes in the output layer LK+1.
Figure 4. Intermediate three layers extracted from the multiple layer perceptron, where hi,j is the
output value of node Ni,j. The yellow-colored node indicates each of the Hi+1 nodes in layer Li+1.
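Whatever closed-form expressions you derive in e(i)-e(iii), a finite-difference check is a useful way to verify them: perturb one weight, recompute E, and compare the measured slope with your formula. The sketch below shows only the numerical side of such a check on a tiny, made-up 2-layer sigmoid network (its sizes and weights are arbitrary, not those of Figures 3 and 4).

```python
import numpy as np

rng = np.random.default_rng(0)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

# Tiny made-up network: 3 inputs -> 4 hidden units -> 2 outputs.
# Column 0 of each weight matrix acts on the constant bias unit (I0 = 1, h_{i,0} = 1).
W1 = rng.normal(size=(4, 4))
W2 = rng.normal(size=(2, 5))
x = rng.normal(size=3)
T = np.array([1.0, 0.0])

def error(W1, W2):
    h = sigmoid(W1 @ np.concatenate(([1.0], x)))   # hidden layer outputs
    O = sigmoid(W2 @ np.concatenate(([1.0], h)))   # output layer outputs
    return 0.5 * np.sum((O - T) ** 2)              # error term E from part b)

# Central-difference estimate of dE/dW2[0, 1]; compare it with your analytic expression
eps = 1e-6
W2_plus, W2_minus = W2.copy(), W2.copy()
W2_plus[0, 1] += eps
W2_minus[0, 1] -= eps
print((error(W1, W2_plus) - error(W1, W2_minus)) / (2 * eps))
```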
iv) Hence, write down the backward propagation algorithm using e(i), e(ii) and e(iii).
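For cross-checking your written answer to e(iv), here is a compact sketch of one standard formulation of the algorithm for a fully connected sigmoid network (the layer sizes, variable names, and the in-place weight update are assumptions of this sketch, not requirements of the assignment):

```python
import numpy as np

sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

def backprop_step(weights, x, target, alpha=0.1):
    """One forward pass, one backward pass, and one gradient-descent update.
    weights[i] maps layer i to layer i+1; its first column multiplies the bias unit."""
    # Forward pass: store every layer's output with a leading bias unit of 1
    h = [np.concatenate(([1.0], x))]
    for W in weights:
        h.append(np.concatenate(([1.0], sigmoid(W @ h[-1]))))
    O = h[-1][1:]

    # Backward pass: Delta = dE/dh * h * (1 - h), starting from the output layer
    delta = (O - target) * O * (1 - O)
    for i in reversed(range(len(weights))):
        grad = np.outer(delta, h[i])                 # dE/dw for the weights into layer i+1
        if i > 0:                                    # propagate Delta to the previous hidden layer
            hidden = h[i][1:]
            delta = (weights[i][:, 1:].T @ delta) * hidden * (1 - hidden)
        weights[i] -= alpha * grad                   # gradient-descent update
    return 0.5 * np.sum((O - target) ** 2)           # error before the update

# Example usage on a made-up 2-3-1 network
rng = np.random.default_rng(1)
weights = [rng.normal(size=(3, 3)), rng.normal(size=(1, 4))]
x, t = np.array([0.5, -0.2]), np.array([1.0])
for _ in range(5):
    print(backprop_step(weights, x, t))              # the error should tend to decrease
```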
f) Please illustrate the problem that occurs when the neural network has a large number
of layers.
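As a numerical hint for part f): with the sigmoid from part c), |f'(z)| never exceeds 0.25, so the factors multiplied together along a deep chain of layers shrink roughly geometrically with depth. The short sketch below only tabulates that worst-case product.

```python
# The sigmoid derivative f'(z) = f(z)(1 - f(z)) peaks at 0.25 (when f(z) = 0.5).
# Backpropagated gradients pick up one such factor (times a weight) per layer,
# so in deep networks their magnitude can shrink roughly like 0.25**depth.
max_deriv = 0.25
for depth in (1, 5, 10, 20, 50):
    print(depth, max_deriv ** depth)
```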
g) Please describe what overfitting is and how it can be avoided.
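For part g), a classic illustration of overfitting (from curve fitting rather than neural networks, but the idea is the same) is fitting polynomials of increasing degree to a few noisy points: the flexible model typically drives the training error to nearly zero while its error on held-out points grows. The data and degrees below are made up.

```python
import numpy as np

rng = np.random.default_rng(2)
x_train = np.linspace(0.0, 1.0, 10)
y_train = np.sin(2 * np.pi * x_train) + 0.2 * rng.normal(size=x_train.size)  # noisy samples
x_test = np.linspace(0.05, 0.95, 50)
y_test = np.sin(2 * np.pi * x_test)                                          # noise-free targets

for degree in (1, 3, 9):
    coeffs = np.polyfit(x_train, y_train, degree)     # least-squares polynomial fit
    train_err = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test_err = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    print(degree, round(train_err, 4), round(test_err, 4))
```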
II. Assignment Submission
You MUST complete this assignment using any computer text editor (e.g. MS Word, WordPad,
iWork Pages, etc.) and then save the document in PDF format with A4 printable page size.
Scanned versions of handwritten work are NOT accepted. Please keep the file size of the PDF
below 1 MB.
You MUST submit the PDF file to the submission system on our course homepage (within the
CUHK network); otherwise, we will NOT mark your assignment.
III. Important Points
You MUST STRICTLY follow these points:
a. You MUST strictly follow the submission guidelines.
b. Remember to type your FULL NAME and STUDENT ID on the assignment.
c. Late submission will NOT be entertained according to our submission system settings.
d. Plagiarism will be seriously punished.
IV. Late Submission
According to the course homepage, late submission will lead to marks deduction.
No. of Days Late  |  Marks Deduction
1                 |  10%
2                 |  30%
3                 |  60%
4 or above        |  100%