Module Identifier 
CS36110 
Module Title 
INTELLIGENT LEARNING 
Academic Year 
2003/2004 
Coordinator 
Dr Mark B Ratcliffe 
Semester 
Semester 1 
Other staff 
Dr Yonghuai Liu 
PreRequisite 
CS26210 
Course delivery 
Lecture  20 lectures 
Assessment 
Assessment Type  Assessment Length/Details  Proportion 
Semester Exam  2 Hours  100% 
Supplementary Exam  Will take the same form, under the terms of the Department's policy.  

Further details 
http://www.aber.ac.uk/compsci/ModuleInfo/CS36110 
Learning outcomes
On successful completion of this module, students will be able to:
1. appreciate the range of applicability of intelligent learning concepts and techniques;
2. explain the state of the art of intelligent learning concepts and techniques;
3. explain in detail a basic rule induction method, such as ID3;
4. explain the theory behind artificial neural networks, in particular perceptron and back propagation;
5. describe and explain the theory behind concept learning, Bayesian learning, genetic algorithms, reinforcement learning, and inductive logic programming;
6. apply Bayesian theory to justify concept learning, decision tree learning, and artificial neural network learning.
Brief description
This module builds on CS26210 and examines some of the key ideas in Artificial Intelligence. A small number of topics are studied in depth in order to give insight and understanding of the methods and issues involved in state-of-the-art developments.
Content
1. Introduction  2 lectures
Possibility and necessity of learning, target function, components of a learning system, performance measurement of learning systems.
2. Concept learning  3 lectures
Generality ordering of hypotheses, FIND-S algorithm, candidate elimination algorithm, version space, the LIST-THEN-ELIMINATE algorithm, inductive bias
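For illustration, a minimal Python sketch of the FIND-S algorithm, which returns the most specific hypothesis consistent with the positive examples (the EnjoySport-style attributes and data below are invented):

```python
# Hypotheses are tuples of attribute values, where '?' matches anything.
def find_s(examples):
    """Most specific hypothesis consistent with the positive examples.

    examples: list of (attribute_tuple, label) pairs; labels are True/False.
    """
    h = None
    for x, label in examples:
        if not label:
            continue  # FIND-S ignores negative examples
        if h is None:
            h = list(x)  # first positive example: hypothesis is the example itself
        else:
            # Minimally generalise: replace mismatching attributes with '?'
            h = [hi if hi == xi else '?' for hi, xi in zip(h, x)]
    return tuple(h) if h else None

# Toy data (attributes: Sky, AirTemp, Humidity)
data = [
    (('Sunny', 'Warm', 'Normal'), True),
    (('Sunny', 'Warm', 'High'), True),
    (('Rainy', 'Cold', 'High'), False),
]
print(find_s(data))  # → ('Sunny', 'Warm', '?')
```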
3. Decision tree learning  3 lectures
Entropy, best attribute, information gain, best tree, inductive bias, Occam's razor, overfitting, reduced error pruning, rule post-pruning
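As a sketch of the entropy and information-gain calculations used to pick the best attribute (the toy data is invented; splitting on a perfectly predictive attribute recovers the full one bit of label entropy):

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a label sequence, in bits."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(rows, labels, attr_index):
    """Reduction in label entropy from splitting on one attribute."""
    n = len(labels)
    partitions = {}
    for row, label in zip(rows, labels):
        partitions.setdefault(row[attr_index], []).append(label)
    remainder = sum(len(p) / n * entropy(p) for p in partitions.values())
    return entropy(labels) - remainder

rows = [('sunny', 'hot'), ('sunny', 'mild'), ('rainy', 'hot'), ('rainy', 'mild')]
labels = ['yes', 'yes', 'no', 'no']
print(information_gain(rows, labels, 0))  # → 1.0 (attribute 0 predicts the label)
print(information_gain(rows, labels, 1))  # → 0.0 (attribute 1 is uninformative)
```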
4. Artificial neural network  3 lectures
Perceptron, linear separability, gradient descent, sigmoid function, back propagation algorithm, overfitting
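A minimal sketch of the perceptron learning rule, trained here on logical AND, which is linearly separable; the learning rate and epoch count are arbitrary illustrative choices:

```python
def train_perceptron(samples, epochs=20, lr=0.1):
    """Train a single perceptron; samples are (input_tuple, target) with targets 0/1."""
    n = len(samples[0][0])
    w, b = [0.0] * n, 0.0
    for _ in range(epochs):
        for x, t in samples:
            y = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
            # Perceptron rule: update only on misclassification, w <- w + lr*(t - y)*x
            for i in range(n):
                w[i] += lr * (t - y) * x[i]
            b += lr * (t - y)
    return w, b

def predict(w, b, x):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0

# Logical AND is linearly separable, so the perceptron converges on it.
and_data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(and_data)
print([predict(w, b, x) for x, _ in and_data])  # → [0, 0, 0, 1]
```

By contrast, XOR is not linearly separable, so no choice of weights and bias can classify it correctly — the motivation for multi-layer networks and back propagation.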
5. Bayesian learning  4 lectures
Bayesian theory, maximum a posteriori hypothesis, maximum likelihood, probability density, normal distribution, minimum description length principle, Bayes optimal classifier, naive Bayes classifier
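For illustration, a minimal naive Bayes classifier over categorical attributes; the classifier returns the class maximising P(c) times the product of P(x_i | c) under the conditional-independence assumption (the weather-style data is invented, and no smoothing is applied):

```python
from collections import Counter, defaultdict

def train_naive_bayes(rows, labels):
    """Estimate P(class) and P(attribute value | class) by counting."""
    priors = Counter(labels)
    cond = defaultdict(Counter)  # (class, attr_index) -> Counter of values
    for row, label in zip(rows, labels):
        for i, value in enumerate(row):
            cond[(label, i)][value] += 1
    return priors, cond

def classify(priors, cond, row):
    """Return argmax_c P(c) * prod_i P(x_i | c)."""
    n = sum(priors.values())
    best, best_score = None, -1.0
    for c, count in priors.items():
        score = count / n  # prior P(c)
        for i, value in enumerate(row):
            score *= cond[(c, i)][value] / count  # likelihood P(x_i | c)
        if score > best_score:
            best, best_score = c, score
    return best

rows = [('sunny', 'hot'), ('sunny', 'mild'), ('rainy', 'mild'), ('rainy', 'hot')]
labels = ['yes', 'yes', 'no', 'no']
priors, cond = train_naive_bayes(rows, labels)
print(classify(priors, cond, ('sunny', 'hot')))  # → 'yes'
```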
6. Genetic algorithm  3 lectures
Best hypothesis, hypothesis representation, genetic operators, fitness function, fitness-proportionate selection, steady-state selection, rank-based selection
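A minimal genetic-algorithm sketch combining fitness-proportionate (roulette-wheel) selection, single-point crossover, bit-flip mutation, and elitism, applied to the standard OneMax toy problem; all parameter values below are arbitrary illustrative choices:

```python
import random

def one_max(bits):
    """Toy fitness function: number of 1-bits in the individual."""
    return sum(bits)

def roulette_select(pop, fits):
    """Fitness-proportionate (roulette-wheel) selection."""
    r = random.uniform(0, sum(fits))
    acc = 0.0
    for individual, f in zip(pop, fits):
        acc += f
        if acc >= r:
            return individual
    return pop[-1]

def evolve(pop_size=20, length=16, generations=40, p_mut=0.02):
    random.seed(0)  # fixed seed so the run is repeatable
    pop = [[random.randint(0, 1) for _ in range(length)] for _ in range(pop_size)]
    for _ in range(generations):
        fits = [one_max(ind) for ind in pop]
        children = [max(pop, key=one_max)]      # elitism: carry over the best
        while len(children) < pop_size:
            a = roulette_select(pop, fits)
            b = roulette_select(pop, fits)
            cut = random.randrange(1, length)   # single-point crossover
            child = a[:cut] + b[cut:]
            # bit-flip mutation with probability p_mut per position
            children.append([bit ^ (random.random() < p_mut) for bit in child])
        pop = children
    return max(pop, key=one_max)

best = evolve()
print(one_max(best))
```

Because the elite individual is carried over unchanged each generation, the best fitness in the population never decreases.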
7. Reinforcement learning  2 lectures
Q-learning, adaptive dynamic programming, temporal difference learning.
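A minimal tabular Q-learning sketch; the corridor environment, learning rate, and discount factor below are invented for illustration:

```python
import random

def q_learning(n_states=5, episodes=300, alpha=0.5, gamma=0.9, epsilon=0.1):
    """Tabular Q-learning on a tiny deterministic corridor.

    States 0..n_states-1; action 0 moves left, action 1 moves right.
    Reaching the rightmost state yields reward 1 and ends the episode.
    """
    random.seed(0)  # fixed seed so the run is repeatable
    q = [[0.0, 0.0] for _ in range(n_states)]
    for _ in range(episodes):
        s = 0
        while s != n_states - 1:
            # epsilon-greedy behaviour policy
            if random.random() < epsilon:
                a = random.randrange(2)
            else:
                a = q[s].index(max(q[s]))
            s_next = s + 1 if a == 1 else max(0, s - 1)
            r = 1.0 if s_next == n_states - 1 else 0.0
            # Q-learning update: Q(s,a) += alpha*(r + gamma*max_a' Q(s',a') - Q(s,a))
            q[s][a] += alpha * (r + gamma * max(q[s_next]) - q[s][a])
            s = s_next
    return q

q = q_learning()
# Greedy policy in the non-terminal states (1 = move right)
print([row.index(max(row)) for row in q[:-1]])
```

After training, the greedy policy moves right in every non-terminal state, and the learned Q-values approximate the discounted returns gamma^k of reaching the goal.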
Reading Lists
Books
** Recommended Text
T. Mitchell (1997) Machine Learning. McGraw-Hill
S. M. Weiss and C. A. Kulikowski (1991) Computer Systems That Learn. Morgan Kaufmann
S. J. Russell and P. Norvig (1995) Artificial Intelligence: A Modern Approach. Prentice-Hall, ISBN 0131038052
Notes
This module is at CQFW Level 6