Syllabus
Computer Science 291A
Introduction to Pattern Recognition, Artificial Neural Networks, and Machine Learning
 Spring 2014


Lecture: TH 9:00am - 10:50am Room: PHELP 2510
Instructor: Yuan-Fang Wang Office: Harold Frank Hall, 3113
Office hours: TH 11:00am - 12:00pm Phone: 893-3866
Enrollment code: 53322

Prerequisite:

There is no undergraduate prerequisite (though some material covered in CS165A and CS165B might overlap). This is a first graduate-level introductory CS course in PR, ANN, and ML. Students should have graduate standing and good preparation in calculus, linear algebra, and probability.

Course Description:

This course will cover several topics in pattern recognition (PR), artificial neural networks (ANN), and machine learning (ML). Pattern recognition is a classical research area that deals with recognizing patterns (objects) based on their features (traits or appearance). It has seen wide application in speech recognition, image analysis, target detection, optical character recognition, fingerprint identification, dating services, insurance fraud detection, DNA sequence alignment, protein structure matching, data mining, network intrusion detection, and engine troubleshooting, among many others. Artificial neural networks provide a general computing framework that is purported to be highly parallel, distributed, and fault tolerant. Machine learning deals with algorithms and formulations that enable a machine to learn and improve its performance from experience, that is, to modify its behavior and execution on the basis of acquired information and data analysis. These areas share much common ground in addressing similar problems of classification and clustering.
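To make the classification task described above concrete, here is a minimal sketch of assigning a class label to a new sample based on its features. The data, labels, and the choice of a 1-nearest-neighbor rule are all hypothetical illustrations, not material from the course itself.

```python
import math

# Hypothetical training samples: (feature vector, class label).
training_data = [
    ((1.0, 1.0), "A"),
    ((1.2, 0.9), "A"),
    ((4.0, 4.2), "B"),
    ((3.8, 4.0), "B"),
]

def classify(sample):
    """Label a sample with the class of its nearest training example."""
    # Find the training pair whose feature vector is closest (Euclidean
    # distance) to the query sample, and return that pair's label.
    _, label = min(training_data,
                   key=lambda pair: math.dist(pair[0], sample))
    return label

print(classify((1.1, 1.1)))  # nearest to the class "A" examples
print(classify((4.1, 4.1)))  # nearest to the class "B" examples
```

The same "features in, label out" structure underlies the more sophisticated classifiers covered later in the course; only the decision rule changes.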

Though research in these areas is still active, many useful results and practical algorithms have already been developed. This course will discuss some of these results. You should consider the concepts and algorithms presented here as general mathematical tools which hopefully will enrich your math "toolbox" and may come in handy some day.

Grading:

There is no exam in the course. The final grade will be determined based on attendance and class participation, assignments, and research/project presentations. The difference between assignments (miniprojects) and class/research projects is whether you work on assigned topics or get to choose your own. Be warned that how much you get out of this course will to a very large degree be determined by your motivation to learn and your curiosity to explore and experiment with the many ideas and algorithms presented in the course.

Topics Covered:

We will divide the course into three parts. The first part will focus on many "traditional", tried-and-true techniques and algorithms in pattern recognition and artificial neural networks. The second part will focus on more advanced techniques in machine learning, in particular Support Vector Machines, Boosting, and Kernel Methods. The third part will cover recent developments in "deep" neural networks, particularly their applications to image and vision problems.

We will discuss the following "traditional" topics (topics may be dropped and additional topics added depending on the interests of the participants and the available time):

We will also discuss the following more advanced topics in machine learning.

Caveat: Based on past experience, we may not be able to cover all the topics listed above. Even though a fair amount of mathematics will be introduced, this is not intended to be a course in mathematics. The purpose of the course is to introduce you to some practical, useful concepts, designs, and algorithms in PR, ANN, and ML with a minimum amount of mathematical rigor. For example, we do not prove many of the results; instead, we provide intuition as to why things turn out the way they do.

Textbooks:

I will lecture using my own notes. We do not follow any single book in any particular order, so none of these books is required. You can probably find PDF versions of many of them online.

Other Useful References (oldies but goodies)

  1. Duda and Hart, Pattern Classification and Scene Analysis, John Wiley & Sons, New York, NY, 1973.
  2. Fukunaga, Introduction to Statistical Pattern Recognition, Academic Press, New York, NY, 1972.

General class policies and announcements: