Review for Midterm Exams
CSU520 & CSG120 Artificial Intelligence - Spring 2006
Professor Futrelle
College of Computer and Information Science
Northeastern U., Boston, MA
This page was started March 30th, 2006; as of March 31st, it is complete.
The exams will be open book, open notes, based on material in the AIMA textbook:
Part III, Knowledge and Reasoning, and Part V, Uncertain Knowledge and
Reasoning.
The level of difficulty of the CSU520 and
CSG120 exams will be different.
The logic behind the design of the exams: the great majority of the
questions will be modeled directly on the examples in the textbook.
The questions do not assume that you have correctly done the homework exercises,
since you have had no opportunity to see those graded and will not
be given detailed answers to each of those exercises.
The specific sections in the book from which the questions will be
drawn are listed below.
The CSG120 exam will be Monday, April 3rd.
The CSU520 exam will be on Wednesday, April 5th.
Topics on Part III - Knowledge and Reasoning
- Be able to use the Fig. 7.8 tables as well as the identities in Fig. 7.11.
- Understand how the inference rules at the top of page 212 are derived.
- Be able to do a simple proof in the style of the one on page 212.
The question might be simplified further by asking you what
equivalences are used for some of the steps you will be given.
- Be able to do resolution as shown in the example of clauses
of length two, page 214.
- Be able to do conversion to conjunctive normal form, as described
in steps 1 through 4 on page 215.
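Since the exams are open book, a quick mechanical check of a hand conversion can be reassuring. Below is a sketch in Python (not from the textbook; the sentence B ⇔ (A ∨ C) is my own stand-in example) that verifies a steps-1-through-4 conversion by exhaustive truth-table comparison:

```python
from itertools import product

# Hand conversion of B <=> (A v C) to CNF, following steps 1-4:
#   1. Eliminate <=>:       (B => (A v C)) ^ ((A v C) => B)
#   2. Eliminate =>:        (~B v A v C) ^ (~(A v C) v B)
#   3. Move ~ inward:       (~B v A v C) ^ ((~A ^ ~C) v B)
#   4. Distribute v over ^: (~B v A v C) ^ (~A v B) ^ (~C v B)

original = lambda a, b, c: b == (a or c)
cnf = lambda a, b, c: ((not b) or a or c) and ((not a) or b) and ((not c) or b)

# The conversion is correct iff the two agree on all 8 truth assignments.
assert all(original(a, b, c) == cnf(a, b, c)
           for a, b, c in product((False, True), repeat=3))
```

The same exhaustive check works for any small propositional conversion you do by hand.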
- Be able to do a resolution proof as shown in Fig. 7.13.
I would give you the clauses in the top row except for the
one on the right. I would then give you a clause to negate
and then carry through the proof in the manner shown.
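For practicing refutation proofs of this kind, a tiny propositional resolution loop can check your work. This is my own sketch, not the book's pseudocode, and the KB {P, P ⇒ Q, Q ⇒ R} is a made-up example:

```python
from itertools import combinations

def resolve(c1, c2):
    """All resolvents of two clauses. A clause is a frozenset of
    literal strings; negation is marked with a leading '~'."""
    resolvents = []
    for lit in c1:
        comp = lit[1:] if lit.startswith('~') else '~' + lit
        if comp in c2:
            resolvents.append(frozenset((c1 - {lit}) | (c2 - {comp})))
    return resolvents

def refutes(clauses):
    """Resolution closure in the spirit of AIMA's PL-RESOLUTION: returns
    True when the empty clause is derivable (set is unsatisfiable)."""
    clauses = set(clauses)
    while True:
        new = set()
        for c1, c2 in combinations(clauses, 2):
            for r in resolve(c1, c2):
                if not r:          # empty clause: contradiction found
                    return True
                new.add(r)
        if new <= clauses:         # no new clauses: satisfiable
            return False
        clauses |= new

# Hypothetical KB {P, P=>Q, Q=>R} in clause form, plus the negated
# query ~R; deriving the empty clause shows KB |= R.
clauses = [frozenset({'P'}), frozenset({'~P', 'Q'}),
           frozenset({'~Q', 'R'}), frozenset({'~R'})]
assert refutes(clauses)
```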
- Be able to do a simple proof, with a diagram, as shown in Fig. 7.15.
- Be familiar with the difference between
∀x ∃y Loves(x,y), bottom of page 251, and
∃x ∀y Loves(x,y), top of page 252.
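One way to internalize the difference is to evaluate both sentences in a small model. The sketch below uses a made-up three-person Loves relation (my own example, not the book's) in which the first sentence holds but the second does not:

```python
people = ['alice', 'bob', 'carol']
# Hypothetical relation: everyone loves someone, but no single person
# loves everyone.
loves = {('alice', 'bob'), ('bob', 'carol'), ('carol', 'alice')}

# ∀x ∃y Loves(x,y): every x loves at least one y.
forall_exists = all(any((x, y) in loves for y in people) for x in people)
# ∃x ∀y Loves(x,y): some single x loves every y.
exists_forall = any(all((x, y) in loves for y in people) for x in people)

assert forall_exists
assert not exists_forall
```

Swapping the quantifiers changes which loop is `all` and which is `any`, which is exactly the semantic difference.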
- Be able to do an example of Generalized Modus Ponens as shown
on page 276. This requires understanding the Instantiation rules
of page 273, and particularly the introduction of a Skolem constant,
on that page and covered in more detail in Sec. 9.5, Resolution.
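Generalized Modus Ponens hinges on finding a substitution θ that unifies rule premises with known facts. Here is a minimal unification sketch, with my own representation ('?'-prefixed strings as variables, tuples as compound terms) and the occurs check omitted; it is not the book's UNIFY pseudocode:

```python
def is_var(t):
    # Convention (mine): variables are strings starting with '?'.
    return isinstance(t, str) and t.startswith('?')

def unify(x, y, theta):
    """Return a substitution extending theta that makes x and y equal,
    or None on failure. Occurs check omitted for brevity."""
    if theta is None:
        return None
    if x == y:
        return theta
    if is_var(x):
        return unify_var(x, y, theta)
    if is_var(y):
        return unify_var(y, x, theta)
    if isinstance(x, tuple) and isinstance(y, tuple) and len(x) == len(y):
        for xi, yi in zip(x, y):
            theta = unify(xi, yi, theta)
            if theta is None:
                return None
        return theta
    return None

def unify_var(var, val, theta):
    if var in theta:
        return unify(theta[var], val, theta)
    return {**theta, var: val}

# GMP needs theta with SUBST(theta, premise) = SUBST(theta, fact):
theta = unify(('Knows', 'John', '?x'), ('Knows', 'John', 'Jane'), {})
assert theta == {'?x': 'Jane'}
```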
- Understand simple forward chaining as described on pages 280 and 281.
Be able to do a proof in the style of Fig. 9.4.
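The fixed-point flavor of forward chaining is easy to see in a propositionalized sketch. The rule set below is a made-up skeleton loosely shaped like the chapter's crime example, not the actual KB:

```python
def forward_chain(rules, facts, query):
    """Naive forward chaining over ground definite clauses: keep firing
    any rule whose premises are all known, until nothing new is added."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if conclusion not in facts and all(p in facts for p in premises):
                facts.add(conclusion)
                changed = True
    return query in facts

# Hypothetical ground rules (assumed names, for illustration only):
rules = [(['American', 'Weapon', 'Sells', 'Hostile'], 'Criminal'),
         (['Missile'], 'Weapon'),
         (['Missile', 'Owns'], 'Sells'),
         (['Enemy'], 'Hostile')]
facts = ['American', 'Missile', 'Owns', 'Enemy']
assert forward_chain(rules, facts, 'Criminal')
```

In a Fig. 9.4-style proof you would draw each newly derived fact above the premises that produced it; the loop above derives them in the same bottom-up order.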
- Study Sec. 9.5 on resolution carefully, but only through the middle
of page 300. Be able to do simple proofs in the style of Figs. 9.11 and
9.12. I will not ask you to go from English sentences all the way
through to a proof as described on pages 298 and 299. Instead, I will
ask you about conversion to conjunctive normal form as described on
pages 296 and 297, as well as about proofs like the examples in Figs. 9.11
and 9.12.
- There will be no questions on Chapter 10.
Topics on Part V - Uncertain Knowledge and Reasoning
As before, pay most careful attention to the various examples that
occur throughout this material. Exam questions will tend to focus
on examples similar to those in AIMA.
- Know the basic ideas and terminology of Propositions, page 467
and the top of page 468.
- Understand what is meant by saying that
"An atomic event is a complete specification of
the state of the world about which the agent is uncertain," page 468.
- Know what is meant by unconditional or prior probability,
including the full joint probability distribution, pages 468 and 469.
- You must know the definition of conditional probability on page 470.
- The example in Fig. 13.3 and the discussion there and on the next
page are all important.
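It helps to practice the page 470 definition P(a|b) = P(a ∧ b)/P(b) directly on a full joint table. The sketch below uses entries in the style of Fig. 13.3 (check the numbers against your copy); each tuple is ordered (Cavity, Toothache, Catch):

```python
# Full joint distribution, Fig. 13.3-style entries (verify against the book):
joint = {
    (True,  True,  True):  0.108, (True,  True,  False): 0.012,
    (True,  False, True):  0.072, (True,  False, False): 0.008,
    (False, True,  True):  0.016, (False, True,  False): 0.064,
    (False, False, True):  0.144, (False, False, False): 0.576,
}

def p(event):
    """Marginal probability of an event given as {variable index: value},
    summed over all matching atomic events in the joint."""
    return sum(pr for world, pr in joint.items()
               if all(world[i] == v for i, v in event.items()))

# P(cavity | toothache) = P(cavity ^ toothache) / P(toothache)
p_cond = p({0: True, 1: True}) / p({1: True})
assert abs(p_cond - 0.6) < 1e-9
```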
- Normalization is explained on page 476.
There, the example of computing P(Cavity|toothache)
is described, in which you can avoid computing the common denominator
by realizing that the probability of the two cases must sum to 1.0.
(Skip Fig. 13.4).
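The normalization trick can be spelled out numerically. Using Fig. 13.3-style entries again (verify against your copy), the denominator P(toothache) falls out of requiring the two cases to sum to 1.0:

```python
# Unnormalized terms, P(Cavity | toothache) proportional to P(Cavity, toothache):
unnorm = [0.108 + 0.012,   # cavity ^ toothache
          0.016 + 0.064]   # ~cavity ^ toothache

# Normalizing constant alpha: chosen so the distribution sums to 1.0,
# which makes 1/alpha the common denominator P(toothache).
alpha = 1.0 / sum(unnorm)
dist = [alpha * t for t in unnorm]

assert abs(dist[0] - 0.6) < 1e-9   # P(cavity | toothache)
assert abs(dist[1] - 0.4) < 1e-9   # P(~cavity | toothache)
assert abs(sum(dist) - 1.0) < 1e-9
```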
- Independence, Sec. 13.5, is important, but remember that some variables may
not be independent, in which case you cannot use the simplifying assumptions of
independence.
- All topics in Sec. 13.6, Bayes' Rule, are fair game, though the major
emphasis will be on the basics of Bayesian networks in Chap. 14.
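A one-line Bayes' rule computation, with made-up numbers in the spirit of the chapter's diagnosis example (all three probabilities below are assumptions, not the book's values):

```python
# Bayes' rule: P(cause | effect) = P(effect | cause) * P(cause) / P(effect)
p_s_given_m = 0.7       # P(stiff neck | meningitis)  -- assumed value
p_m = 1 / 50000         # prior P(meningitis)         -- assumed value
p_s = 0.01              # P(stiff neck)               -- assumed value

p_m_given_s = p_s_given_m * p_m / p_s
assert abs(p_m_given_s - 0.0014) < 1e-9
```

The posterior stays small even with a strong likelihood, because the prior is tiny; that interplay is the point of most Bayes' rule exam questions.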
- Skip Sec. 13.7.
- Fig. 14.2, the familiar Burglary-Earthquake net, will be the basis
of at least one question. You will be given or asked to construct
a network of the same basic sort and do computations based on the network.
The example at the bottom of page 495 is useful to understand.
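The page 495-style computation is just the chain rule applied along the network. The CPT values below are Fig. 14.2-style numbers from memory; treat them as assumptions and check them against the figure:

```python
# CPTs for a Burglary-Earthquake-style net (assumed values):
P_b, P_e = 0.001, 0.002
P_a_given = {(True, True): 0.95, (True, False): 0.94,    # P(a | B, E)
             (False, True): 0.29, (False, False): 0.001}
P_j_given = {True: 0.90, False: 0.05}                    # P(j | A)
P_m_given = {True: 0.70, False: 0.01}                    # P(m | A)

# Chain rule: P(j, m, a, ~b, ~e) = P(j|a) P(m|a) P(a|~b,~e) P(~b) P(~e)
p = (P_j_given[True] * P_m_given[True] * P_a_given[(False, False)]
     * (1 - P_b) * (1 - P_e))
assert abs(p - 0.000628) < 1e-6
```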
- We will skip the Noisy-OR example, because it is too detailed - there
would be nothing more for you to do.
- The example on page 505 is important, since it shows how to
compute a conditional probability, given evidence.
Note that normalization is used: computing the unnormalized values
for P(b|j,m) and P(¬b|j,m) is equally easy, and normalization then
supplies the denominator. Compute the latter value yourself now
to make sure you understand how normalization works in this case.
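To check your hand computation, inference by enumeration can be sketched directly: sum out E and A for each value of B, then normalize. The CPT numbers are again assumed Fig. 14.2-style values, so verify them against the book before trusting the result:

```python
from itertools import product

# CPTs for a Burglary-Earthquake-style net (assumed values):
P_B, P_E = 0.001, 0.002
P_A = {(True, True): 0.95, (True, False): 0.94,
       (False, True): 0.29, (False, False): 0.001}
P_J = {True: 0.90, False: 0.05}
P_M = {True: 0.70, False: 0.01}

def pr(p, val):
    """Probability that a Boolean variable with P(true)=p takes value val."""
    return p if val else 1 - p

# Unnormalized P(B, j, m): sum out the hidden variables E and A.
unnorm = []
for b in (True, False):
    total = 0.0
    for e, a in product((True, False), repeat=2):
        total += (pr(P_B, b) * pr(P_E, e) * pr(P_A[(b, e)], a)
                  * P_J[a] * P_M[a])
    unnorm.append(total)

p_b, p_not_b = [t / sum(unnorm) for t in unnorm]
assert abs(p_b + p_not_b - 1.0) < 1e-12
print(round(p_b, 4))  # with these assumed CPTs, about 0.284
```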
There are some important topics in Parts III and V that will not be covered
on the Midterm. They might be
included on the Final Exam, but only if there is time to discuss them
further in class.
The Final Exam will cover some aspects of Knowledge and Uncertainty, again
based on the topics above. The major new topic on the Final will be
Learning, Part VI of AIMA.