CS824 - Quantitative Methods for Artificial Intelligence
Credits | 10
Level | 5
Semester | Semester 1
Availability | Mandatory
Prerequisites | None
Learning Activities Breakdown | Lectures: 12 hours (online); Lab: 12 hours; Tutorial: 6 hours; Homework / Private Study: 60 hours
Items of Assessment | 2
Assessment | Group-based lab submissions (40%) and individual assignment (60%)
Lecturer | Crawford Revie
Aims and Objectives
The aim of this class is to provide students with the mathematical foundations required to understand modern Artificial Intelligence techniques. The class will focus on three main topic areas: linear algebra, probability, and statistics.
Learning Outcomes
By the end of the class, students should be able to:
– understand the statistical techniques used in modern AI/Deep Learning, in particular basic data description (exploratory data analysis, EDA), statistical distributions, significance testing, and 'classical' and Bayesian inference;
– understand and be able to apply probability theory to common problems in modern AI/Deep Learning, including randomness, probability distributions, variance, and expected values;
– gain an appreciation of how techniques from linear algebra are used in modern AI/Deep Learning: vectors, matrices, and tensors, their structure, and their basic manipulation.
Syllabus
1 Probability: probability is used to make assumptions about the underlying data when designing deep learning or AI algorithms, so it is important to understand the key probability distributions. This part of the course will cover: Elements of Probability, Random Variables, Distributions, Expectation and Variance, and Special Random Variables (see the first sketch after this list).
2 Statistics: statistical methods are used in AI to analyse data and to quantify the performance of agents. This part of the course will cover: mean, standard deviation, variance, confidence intervals, statistical methods for data analytics, the use of statistics in performance measurement, and an introduction to statistics in Python (see the second sketch after this list).
3 Linear algebra: linear algebra notation is used in Machine Learning to describe the parameters and structure of different machine learning algorithms, which makes it essential for understanding how neural networks are put together and how they operate. This part of the course will cover: Scalars, Vectors, Matrices, Tensors, Matrix Norms, Special Matrices, and Eigenvalues / Eigenvectors (see the third sketch after this list).
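
As an informal illustration of the probability topic (not part of the official course material), the sketch below uses NumPy to estimate the expectation and variance of a binomial random variable from simulated samples and compares them with the theoretical values E[X] = np and Var[X] = np(1 - p); the parameters n and p are chosen arbitrarily.

```python
import numpy as np

# Binomial random variable X ~ Binomial(n, p); parameters chosen purely for illustration.
n, p = 10, 0.3

rng = np.random.default_rng(seed=0)
samples = rng.binomial(n, p, size=100_000)  # draw many realisations of X

# Sample-based estimates of expectation and variance.
print("estimated E[X]  :", samples.mean())
print("estimated Var[X]:", samples.var())

# Theoretical values for a binomial distribution.
print("theoretical E[X]  :", n * p)            # E[X] = n p
print("theoretical Var[X]:", n * p * (1 - p))  # Var[X] = n p (1 - p)
```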
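
A second informal sketch, for the statistics topic: it computes the mean, sample standard deviation, and an approximate 95% confidence interval for a set of hypothetical agent performance scores. The scores are invented for illustration, and the interval uses the normal critical value 1.96 rather than a t-distribution.

```python
import numpy as np

# Hypothetical accuracy scores from repeated runs of an agent (made up for illustration).
scores = np.array([0.71, 0.74, 0.69, 0.73, 0.75, 0.70, 0.72, 0.76, 0.68, 0.74])

mean = scores.mean()
std = scores.std(ddof=1)          # sample standard deviation
sem = std / np.sqrt(len(scores))  # standard error of the mean

# Approximate 95% confidence interval using the normal critical value 1.96
# (a t-based interval would be more appropriate for such a small sample).
low, high = mean - 1.96 * sem, mean + 1.96 * sem

print(f"mean accuracy: {mean:.3f}")
print(f"sample std:    {std:.3f}")
print(f"95% CI:        ({low:.3f}, {high:.3f})")
```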
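
A third informal sketch, for the linear algebra topic: it applies NumPy to a small symmetric matrix to show a matrix-vector product, vector and matrix norms, and an eigenvalue / eigenvector decomposition. The matrix and vector are arbitrary examples.

```python
import numpy as np

# A small symmetric matrix and a vector, chosen only for illustration.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
v = np.array([1.0, -1.0])

print("matrix-vector product A @ v:", A @ v)
print("Frobenius norm of A:", np.linalg.norm(A))  # matrix norm
print("L2 norm of v:", np.linalg.norm(v))         # vector norm

# Eigen-decomposition: A w_i = lambda_i w_i (eigh is suited to symmetric matrices).
eigenvalues, eigenvectors = np.linalg.eigh(A)
print("eigenvalues:", eigenvalues)
print("eigenvectors (columns):", eigenvectors)

# Check the defining property for the first eigenpair.
w0, lam0 = eigenvectors[:, 0], eigenvalues[0]
print("A @ w0 equals lam0 * w0:", np.allclose(A @ w0, lam0 * w0))
```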
Recommended Reading
This list is indicative only – the class lecturer may recommend alternative reading material. Please do not purchase any of the reading material listed below until you have confirmed with the class lecturer that it will be used for this class.
Various items of reading material will be suggested on the MyPlace pages associated with each main topic in this module.
Last updated: 2024-09-23 09:57:52