
COMP642 Exam 2 Solved





For each of the below statements, agree or disagree and explain your
reasoning:

Hierarchical clustering can’t handle big data well but K Means clustering can.
This is because the time complexity of K Means is linear, i.e. O(n), while that
of hierarchical clustering is quadratic, i.e. O(n²).
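
To make the scaling difference concrete, here is a minimal sketch (assuming scikit-learn and NumPy are available; the sample sizes, cluster count, and synthetic data are arbitrary illustrative choices) that times both algorithms as the number of points grows:

    import time

    import numpy as np
    from sklearn.cluster import AgglomerativeClustering, KMeans

    rng = np.random.default_rng(0)

    for n in (1_000, 2_000, 4_000):                   # arbitrary sample sizes
        X = rng.normal(size=(n, 2))                   # synthetic 2-D data

        t0 = time.perf_counter()
        KMeans(n_clusters=3, n_init=10).fit(X)        # cost grows roughly linearly in n
        t_kmeans = time.perf_counter() - t0

        t0 = time.perf_counter()
        AgglomerativeClustering(n_clusters=3).fit(X)  # cost grows roughly quadratically in n
        t_hier = time.perf_counter() - t0

        print(f"n={n:>5}: K Means {t_kmeans:.2f}s, hierarchical {t_hier:.2f}s")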

In K Means clustering, since we start with a random choice of clusters, the
results produced by running the algorithm multiple times might differ. The same
is true in hierarchical clustering since the order of the data can be random.
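
The K Means half of this statement is easy to check empirically. A minimal sketch (scikit-learn and NumPy assumed; the synthetic data and seeds are made up), where each run uses a single random initialisation:

    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(42)
    # loose synthetic groups, just enough structure for clustering to act on
    X = rng.normal(size=(300, 2)) + rng.integers(0, 4, size=(300, 1)) * 3.0

    for seed in (0, 1, 2):
        # n_init=1 means exactly one random initialisation per run
        km = KMeans(n_clusters=4, n_init=1, random_state=seed).fit(X)
        print(f"seed={seed}: inertia={km.inertia_:.1f}")  # the fit can differ across seeds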

K Means clustering requires prior knowledge of K, i.e. the number of clusters
you want to divide your data into. In hierarchical clustering, however, you can
stop at whatever number of clusters you find appropriate by interpreting the
dendrogram.
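
As an illustration of the second half (a minimal sketch assuming SciPy and NumPy; the synthetic blobs are made up), the merge tree is built once and can then be cut at any number of clusters afterwards:

    import numpy as np
    from scipy.cluster.hierarchy import dendrogram, fcluster, linkage

    rng = np.random.default_rng(0)
    # three synthetic blobs of 50 points each
    X = np.vstack([rng.normal(loc=c, scale=0.3, size=(50, 2)) for c in (0, 3, 6)])

    Z = linkage(X, method="ward")                       # full merge tree; no K needed up front
    labels_2 = fcluster(Z, t=2, criterion="maxclust")   # cut the tree into 2 clusters...
    labels_3 = fcluster(Z, t=3, criterion="maxclust")   # ...or into 3, from the same tree
    print(len(set(labels_2)), len(set(labels_3)))       # 2 3
    # dendrogram(Z) would plot the tree for visual inspection (needs matplotlib).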

 

For feed forward neural networks, explain what can be represented.
Explain how weights are updated in the back propagation step.
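
For the second part, the update rule is gradient descent applied to gradients obtained via the chain rule. A minimal sketch (NumPy only; the network size, synthetic data, and learning rate are made-up illustrative choices) of one backpropagation update for a single-hidden-layer network:

    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.normal(size=(8, 3))                   # 8 samples, 3 features (synthetic)
    y = rng.normal(size=(8, 1))

    W1, b1 = rng.normal(size=(3, 4)), np.zeros(4)  # hidden layer
    W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)  # output layer
    lr = 0.1                                       # learning rate (arbitrary)

    # forward pass
    h_pre = X @ W1 + b1
    h = np.tanh(h_pre)
    y_hat = h @ W2 + b2
    loss = np.mean((y_hat - y) ** 2)

    # backward pass: chain rule from the loss back to each weight
    d_yhat = 2 * (y_hat - y) / len(y)              # dL/dy_hat for mean squared error
    dW2 = h.T @ d_yhat
    db2 = d_yhat.sum(axis=0)
    d_h = d_yhat @ W2.T
    d_hpre = d_h * (1 - np.tanh(h_pre) ** 2)       # derivative of tanh
    dW1 = X.T @ d_hpre
    db1 = d_hpre.sum(axis=0)

    # weight update: step each parameter opposite its gradient
    for param, grad in ((W1, dW1), (b1, db1), (W2, dW2), (b2, db2)):
        param -= lr * grad

    print(f"loss before this update: {loss:.4f}")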

 

When are CNNs useful? In a CNN, how is the number of connections reduced in
comparison to a feed forward NN? Explain the pooling layer and the convolution
layer.
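
One way to see the reduction in connections is a back-of-the-envelope count (a minimal sketch; the image size, filter count, and kernel size are arbitrary):

    H, W = 28, 28                    # input image, e.g. 28x28 grayscale (arbitrary)
    n_filters, k = 8, 3              # 8 filters of size 3x3 (arbitrary)

    # Fully connected: every input pixel connects to every output unit.
    fc_outputs = (H - k + 1) * (W - k + 1) * n_filters
    fc_weights = (H * W) * fc_outputs

    # Convolutional: each filter reuses the same k*k weights at every position.
    conv_weights = n_filters * (k * k) + n_filters   # shared weights + biases

    print(f"fully connected weights: {fc_weights:,}")   # about 4.2 million
    print(f"convolutional weights:   {conv_weights:,}") # 80

    # A 2x2 max-pooling layer then halves each spatial dimension using no weights at all.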

 

What is the difference between bias and variance? How do these relate
to underfitting and overfitting? How can they help you select models?
How are they used to tune hyperparameters?
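
A minimal sketch of the model-selection angle (NumPy only; the synthetic data, split, and polynomial degrees are arbitrary choices): an overly simple model underfits (high bias), an overly flexible one overfits (high variance), and the held-out error is what guides the choice of the degree hyperparameter:

    import numpy as np

    rng = np.random.default_rng(0)
    x = rng.uniform(-1, 1, size=60)
    y = np.sin(3 * x) + rng.normal(scale=0.2, size=60)   # noisy underlying curve

    x_train, y_train = x[:40], y[:40]
    x_val, y_val = x[40:], y[40:]

    for degree in (1, 4, 15):   # too simple, moderate, very flexible
        # (a poor-conditioning warning for the high degree is expected)
        coeffs = np.polyfit(x_train, y_train, degree)
        train_err = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
        val_err = np.mean((np.polyval(coeffs, x_val) - y_val) ** 2)
        print(f"degree {degree:>2}: train MSE {train_err:.3f}, val MSE {val_err:.3f}")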

 

When are RNNs useful? What is the shortcoming of a simple RNN and how is that
solved using an LSTM?
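
The shortcoming usually cited here is the vanishing (or exploding) gradient over long sequences. A minimal sketch (plain Python, toy numbers) of why repeated multiplication by the same recurrent factor makes gradients shrink as they flow back through many time steps:

    w_recurrent = 0.9      # a single recurrent weight with magnitude below 1 (toy value)
    grad = 1.0
    for _ in range(100):   # backpropagate through 100 time steps
        grad *= w_recurrent
    print(f"gradient contribution after 100 steps: {grad:.2e}")  # about 2.7e-05, effectively gone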
