CS 440: Assignment 2: Logic-based and Bayesian Inference

CS 440: INTRO TO ARTIFICIAL INTELLIGENCE
PART A. Question 1: [10 points] Consider the following Bayesian network, where variables A through E
are all Boolean valued:
a) What is the probability that all five of these Boolean variables are simultaneously true?
[Hint: You have to compute the joint probability distribution (JPD). The structure of the
Bayesian network shows how the JPD decomposes into the available conditional probabilities.]
b) What is the probability that all five of these Boolean variables are simultaneously false?
[Hint: Answer similarly to above.]
c) What is the probability that A is false given that the four other variables are all known to be
true?
[Hint: Try to use the definition of the conditional probability and the structure of the network.
For probabilities that cannot be computed directly from the network, remember the following normalization trick: if P(a) = α · f(a) and P(¬a) = α · f(¬a), then you can compute the normalization factor as α = 1.0 / (f(a) + f(¬a)), since P(a) + P(¬a) = 1.0.]
Question 2: [10 points] For this problem, check the Variable Elimination algorithm in your book.
Also consider the Bayesian network from the “burglary” example.
a) Apply variable elimination to the query:
P(Brgry|JohnsCs = tre, MryCs = tre)
and show in detail the calculations that take place. Use your book to confirm that your answer
is correct.
b) Count the number of arithmetic operations performed (additions, multiplications, divisions),
and compare it against the number of operations performed by the tree enumeration algorithm.
c) Suppose a Bayesian network has the form of a chain: a sequence of Boolean variables
X1, . . . , Xn where Parents(Xi) = {Xi−1} for i = 2, . . . , n. What is the complexity of computing P(X1 | Xn = true) using enumeration? What is the complexity with variable elimination?
Question 3: [15 points]
In this problem, you will implement two kinds of sampling techniques (rejection sampling and
likelihood weighting) to perform approximate inference on the given Bayesian network.
a) Compute the probabilities P(d|c), P(b|c), and P(d|¬a, b) by enumeration. Then, employ
rejection sampling and likelihood weighting to approximate them. Use 1000 samples, and
document your results. How do the approximations and the actual values compare?
b) Next, we focus on P(d|c). We know that the accuracy of the approximation depends on the
number of samples used. For each of the sampling methods, plot the probability as a function
of the number of samples used. Do you notice large divergences in the convergence rates
among the methods?
c) Construct a query on this Bayes net such that the convergence and effectiveness of rejection
sampling is noticeably worse than that of likelihood weighting. List the query you are using,
and provide the probability plot as a function of samples used. Why is rejection sampling
noticeably worse for this query?
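
The Bayes net for Question 3 is given as a figure that is not reproduced on this page, so the sketch below runs both samplers on a hypothetical stand-in network (a chain A → B → C → D with made-up CPTs). Only the sampler mechanics are the point; replace `variables`, `parents`, and `cpts` with the assignment's network, listed in topological order.

```python
import random

# Sketch of the two samplers for Question 3 on a HYPOTHETICAL stand-in network.
variables = ["A", "B", "C", "D"]                   # topological order
parents = {"A": (), "B": ("A",), "C": ("B",), "D": ("C",)}
cpts = {                                           # P(var = True | parents)
    "A": {(): 0.4},
    "B": {(True,): 0.7, (False,): 0.2},
    "C": {(True,): 0.9, (False,): 0.3},
    "D": {(True,): 0.6, (False,): 0.1},
}

def p_true(var, sample):
    return cpts[var][tuple(sample[p] for p in parents[var])]

def prior_sample():
    """Sample every variable in topological order from its CPT."""
    sample = {}
    for var in variables:
        sample[var] = random.random() < p_true(var, sample)
    return sample

def rejection_sampling(query, evidence, n):
    """Estimate P(query = True | evidence) by discarding inconsistent samples."""
    kept = agree = 0
    for _ in range(n):
        s = prior_sample()
        if all(s[var] == val for var, val in evidence.items()):
            kept += 1
            agree += s[query]
    return agree / kept if kept else float("nan")

def likelihood_weighting(query, evidence, n):
    """Estimate P(query = True | evidence) by fixing evidence and weighting."""
    w_true = w_total = 0.0
    for _ in range(n):
        sample, weight = dict(evidence), 1.0
        for var in variables:
            pt = p_true(var, sample)
            if var in evidence:
                weight *= pt if evidence[var] else 1.0 - pt
            else:
                sample[var] = random.random() < pt
        w_total += weight
        w_true += weight * sample[query]
    return w_true / w_total

print("rejection :", rejection_sampling("D", {"C": True}, 1000))
print("weighting :", likelihood_weighting("D", {"C": True}, 1000))
```

For part b), call each estimator with increasing sample counts and plot the returned estimates against the exact value from enumeration.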
Question 4: [15 points]
You are an interplanetary search and rescue expert who has just received an urgent message:
a rover on Mercury has fallen and become trapped in Death Ravine, a deep, narrow gorge on the
borders of enemy territory. You zoom over to Mercury to investigate the situation. Death Ravine
is a narrow gorge 6 miles long, as shown below. There are volcanic vents at locations A and D,
indicated by the triangular symbols at those locations.
[Figure: Death Ravine, six locations in a row from west to east, with triangular symbols marking the volcanic vents at A and D.]
The rover was heavily damaged in the fall, and as a result, most of its sensors are broken.
The only ones still functioning are its thermometers, which register only two levels: hot and cold.
The rover sends back evidence E = hot when it is at a volcanic vent (A and D), and E = cold
otherwise. There is no chance of a mistaken reading. The rover fell into the gorge at position A
on day 1, so X1 = A. Let the rover’s position on day t be Xt ∈ {A, B, C, D, E, F}. The rover is still
executing its original programming, trying to move 1 mile east (i.e. right, towards F) every day.
However, because of the damage, it only moves east with probability 0.80, and it stays in place
with probability 0.20. Your job is to figure out where the rover is, so that you can dispatch your
rescue-bot.
a) Filtering: Three days have passed since the rover fell into the ravine. The observations were
(E1 = hot, E2 = cold, E3 = cold). What is P(X3 | hot1, cold2, cold3), the probability distribution
over the rover’s position on day 3, given the observations? (This is a probability distribution
over the six possible positions).
b) Smoothing: What is P(X2 | hot1, cold2, cold3), the probability distribution over the rover’s
position on day 2, given the observations? (This is a probability distribution over the six
possible positions).
c) Most Likely Explanation: What is the most likely sequence of the rover’s positions in the three
days given the observations (E1 = hot, E2 = cold, E3 = cold)?
d) Prediction: What is P(hot4, hot5, cold6 | hot1, cold2, cold3), the probability of observing hot4
and hot5 and cold6 in days 4,5,6 respectively, given the previous observations in days 1,2,
and 3? (This is a single value, not a distribution).
e) Prediction: You decide to attempt to rescue the rover on day 4. However, the transmission
of E4 seems to have been corrupted, and so it is not observed. What is the rover’s position
distribution for day 4 given the same evidence, P(X4 | hot1, cold2, cold3)?
The same thing happens again on day 5. What is the rover's position distribution for day 5 given the same evidence, P(X5 | hot1, cold2, cold3)?
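
Below is a minimal forward-algorithm (filtering) sketch for part a), using only the model stated above: move one mile east with probability 0.8, stay with probability 0.2, and a deterministic thermometer that reads hot exactly at A and D. One detail the handout leaves implicit is the behaviour at F; the code assumes the rover simply stays at F with probability 1. The smoothing, most-likely-sequence, and prediction parts reuse the same transition and emission tables.

```python
# Forward-algorithm sketch for Question 4, part a) (filtering).
# Model as stated in the handout: move one step east w.p. 0.8, stay w.p. 0.2;
# thermometer reads 'hot' exactly at the vents A and D, 'cold' elsewhere.
# Assumption not stated in the handout: once at F, the rover stays at F.

STATES = ["A", "B", "C", "D", "E", "F"]

def transition(prev):
    """P(X_t | X_{t-1} = prev) as a dict over states."""
    if prev == "F":
        return {"F": 1.0}
    i = STATES.index(prev)
    return {STATES[i]: 0.2, STATES[i + 1]: 0.8}

def emission_prob(state, reading):
    """Deterministic sensor: hot at the vents A and D, cold elsewhere."""
    hot = state in ("A", "D")
    return 1.0 if (reading == "hot") == hot else 0.0

def normalize(dist):
    total = sum(dist.values())
    return {s: p / total for s, p in dist.items()}

def filter_states(observations):
    """Return P(X_t | e_1..e_t) for the last day t, starting from X_1 = A."""
    # Day 1: the rover is at A with certainty; apply the day-1 evidence.
    belief = {s: (1.0 if s == "A" else 0.0) for s in STATES}
    belief = normalize({s: belief[s] * emission_prob(s, observations[0])
                        for s in STATES})
    for reading in observations[1:]:
        # Time update: push the belief through the transition model ...
        predicted = {s: 0.0 for s in STATES}
        for prev, p in belief.items():
            for nxt, pt in transition(prev).items():
                predicted[nxt] += p * pt
        # ... then condition on the day's thermometer reading.
        belief = normalize({s: predicted[s] * emission_prob(s, reading)
                            for s in STATES})
    return belief

print(filter_states(["hot", "cold", "cold"]))   # P(X3 | hot1, cold2, cold3)
```

For the prediction parts, run the same time update without the evidence step (day 4 and day 5), and for part d) weight each predicted belief by the probability of the hypothesized future readings.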