# CS801 – Quantitative Methods for AI

| | |
|---|---|
| **Credits** | 10 |
| **Level** | 5 |
| **Semester** | 1 |
| **Prerequisites** | N/A |
| **Availability** | Mandatory |
| **Contact** | Lectures / Tutorials: 14 hours; Labs: 14 hours; Homework / Private Study: 70 hours |
| **Assessment** | Small-scale quizzes (15%), lab submissions (35%), and individual assignments (50%) |
| **Resit** | The resit assessment will involve completing an assignment or a written exam |
| **Lecturer** | Professor Crawford Revie |

## General Aims

The aim of this class is to provide students with the mathematical foundations required to understand modern Artificial Intelligence techniques. The class will focus on three topics: probability, statistics, and linear algebra.

## Learning Outcomes

On completion of the class students will be able to:

• understand and apply probability theory as used in modern AI: randomness, probability distributions, expectation (expected value), and variance;
• understand and apply statistical techniques as used in modern AI: basic data analysis, significance tests, Bayesian inference;
• understand and use linear algebra techniques as used in modern AI: scalars, vectors, matrices, tensors.

## Syllabus

1. Probability: probability theory underpins the assumptions made about the underlying data when designing deep learning and other AI algorithms, so it is important to understand the key probability distributions, which this course covers in depth. This part of the course will cover: Elements of Probability, Random Variables, Distributions, Expectation and Variance, and Special Random Variables.
2. Statistics: statistical methods are used in AI to analyse data and to quantify the performance of agents. This part of the course will cover: mean, standard deviation, variance, confidence intervals, statistical methods for data analytics, the use of statistics in performance measurement, and an introduction to statistics in R.
3. Linear algebra: linear algebra notation is used in Machine Learning to describe the parameters and structure of different machine learning algorithms, making it essential for understanding how neural networks are put together and how they operate. This part of the course will cover: Scalars, Vectors, Matrices, Tensors, Matrix Norms, Special Matrices and Vectors, and Eigenvalues and Eigenvectors.
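To illustrate the probability topics above, the expectation and variance of a discrete random variable can be computed directly from its distribution. The sketch below is a hypothetical example (not taken from the course materials), using a fair six-sided die:

```python
from fractions import Fraction

# Distribution of a fair six-sided die: each face has probability 1/6.
outcomes = [1, 2, 3, 4, 5, 6]
p = Fraction(1, 6)

# Expected value: E[X] = sum over x of x * P(X = x)
expectation = sum(x * p for x in outcomes)

# Variance: Var(X) = E[X^2] - (E[X])^2
variance = sum(x**2 * p for x in outcomes) - expectation**2

print(expectation)  # 7/2
print(variance)     # 35/12
```

Using exact fractions rather than floats keeps the arithmetic free of rounding error, which makes small worked examples like this easy to check by hand.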
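The statistics topics (mean, standard deviation, confidence intervals) can likewise be sketched in a few lines. The course introduces statistics in R; the illustrative example below uses Python's standard library instead, with a hypothetical sample of agent performance scores:

```python
import math
import statistics

# Hypothetical sample of agent performance scores (illustrative data only).
scores = [12.1, 11.8, 12.5, 12.0, 11.9, 12.3, 12.2, 11.7]

mean = statistics.mean(scores)
sd = statistics.stdev(scores)  # sample standard deviation (n - 1 denominator)

# Approximate 95% confidence interval for the mean, using the normal critical
# value 1.96; for a sample this small a t-distribution value would be more apt.
half_width = 1.96 * sd / math.sqrt(len(scores))
ci = (mean - half_width, mean + half_width)

print(f"mean={mean:.3f}, sd={sd:.3f}, 95% CI=({ci[0]:.3f}, {ci[1]:.3f})")
```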
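Finally, the linear algebra material (matrix-vector products, eigenvalues) can be previewed with plain Python lists. This is a hypothetical sketch, not prescribed course code; the eigenvalue helper assumes a 2×2 matrix with real eigenvalues, solving the characteristic polynomial λ² − trace·λ + det = 0:

```python
import math

def matvec(A, x):
    """Multiply matrix A (given as a list of rows) by vector x."""
    return [sum(a * b for a, b in zip(row, x)) for row in A]

def eig2x2(A):
    """Eigenvalues of a 2x2 matrix via its characteristic polynomial
    lambda^2 - trace*lambda + det = 0 (assumes real eigenvalues)."""
    (a, b), (c, d) = A
    trace, det = a + d, a * d - b * c
    disc = math.sqrt(trace**2 - 4 * det)
    return ((trace + disc) / 2, (trace - disc) / 2)

A = [[2.0, 1.0],
     [1.0, 2.0]]
print(matvec(A, [1.0, 0.0]))  # first column of A: [2.0, 1.0]
print(eig2x2(A))              # (3.0, 1.0)
```

In practice a library such as NumPy would be used for this; the point here is only that the definitions taught in the course translate directly into a few lines of code.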