  • Offered by: School of Computing
  • ANU College: ANU College of Engineering, Computing & Cybernetics
  • Course subject: Computer Science
  • Areas of interest: Computer Science, Information Technology, Advanced Computing, Artificial Intelligence
  • Academic career: PGRD
  • Mode of delivery: In Person
  • Co-taught Course
  • Offered in: Second Semester 2024
  • STEM Course

This course explores in depth a selected area of statistical machine learning, taught by an SML staff member of internationally recognised standing with research interests in that area. Based on current SML staffing, this will be one of:

  • kernel methods
  • structured probabilistic models
  • reinforcement learning
  • convex analysis and optimisation
  • topics in information theory
  • decision theory
  • deep learning
  • differentiable optimisation in deep learning

Over the past several years the content has alternated between “convex analysis and optimisation” and “differentiable optimisation in deep learning”. Students should contact the course convenor to find out which topic is planned for the coming semester.

Learning Outcomes

Upon successful completion, students will have the knowledge and skills to:

  1. Distinguish definitions of key concepts in convex analysis, including convexity of sets and functions, subgradients, and the convex dual
  2. Derive results about convex functions, such as Jensen’s inequality (stated below for reference)
  3. Produce a formal optimisation problem from a high-level description and determine whether the problem is convex
  4. Recognise standard convex optimisation problems such as linear programs and quadratic programs
  5. Derive the standard (dual) quadratic program for support vector machines (also stated below) and understand the extension to max-margin methods for structured prediction
  6. Implement and analyse gradient descent algorithms such as stochastic gradient descent and mirror descent
  7. Understand advanced concepts in deep learning and relate them to principles from convex optimisation
  8. Implement simple algorithms and differentiable models using a common deep learning framework (such as PyTorch; see the sketch after this list)
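For orientation, the two standard results named in outcomes 2 and 5 can be stated as follows. This is a summary of textbook material (e.g. Boyd and Vandenberghe), not an excerpt from the course notes; the soft-margin penalty C and the data (x_i, y_i) are the usual textbook conventions.

```latex
% Jensen's inequality: for a convex function f and a random variable X,
\[
  f\big(\mathbb{E}[X]\big) \;\le\; \mathbb{E}\big[f(X)\big]
  \quad \text{for convex } f.
\]

% Standard dual quadratic program for the soft-margin support vector machine
% with training data (x_i, y_i), y_i \in \{-1, +1\}, and penalty C:
\[
  \max_{\alpha} \;\; \sum_{i=1}^{n} \alpha_i
    - \tfrac{1}{2} \sum_{i=1}^{n} \sum_{j=1}^{n}
      \alpha_i \alpha_j \, y_i y_j \, x_i^{\top} x_j
  \quad \text{s.t.} \quad
  0 \le \alpha_i \le C, \;\; \sum_{i=1}^{n} \alpha_i y_i = 0 .
\]
```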
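Outcomes 6 and 8 involve implementing gradient-based optimisation in a common deep learning framework. A minimal sketch of what that looks like in PyTorch is below; the model, data, and hyperparameters are illustrative assumptions and are not taken from the course materials.

```python
# Minimal sketch: stochastic gradient descent on a small differentiable
# model in PyTorch. All names and hyperparameters are illustrative only.
import torch

torch.manual_seed(0)

# Synthetic regression data: y = X w* + noise (assumed for illustration).
X = torch.randn(256, 4)
w_true = torch.tensor([1.0, -2.0, 0.5, 3.0])
y = X @ w_true + 0.1 * torch.randn(256)

# A simple differentiable model: one linear layer.
model = torch.nn.Linear(4, 1, bias=False)
optimiser = torch.optim.SGD(model.parameters(), lr=0.05)
loss_fn = torch.nn.MSELoss()

for epoch in range(50):
    # Sample a random mini-batch (the "stochastic" part of SGD).
    idx = torch.randint(0, X.shape[0], (32,))
    pred = model(X[idx]).squeeze(-1)
    loss = loss_fn(pred, y[idx])

    optimiser.zero_grad()
    loss.backward()   # autograd computes gradients of the mini-batch loss
    optimiser.step()  # SGD update: w <- w - lr * grad

print(model.weight.data)  # should be close to w_true
```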

Indicative Assessment

  1. Regular homework assignments (40%) [LO 1,2,3,4,5,6,7,8]
  2. Open-book final exam (60%) [LO 1,2,3,4,5,6,7]

The ANU uses Turnitin to enhance student citation and referencing techniques, and to assess assignment submissions as a component of the University's approach to managing Academic Integrity. While the use of Turnitin is not mandatory, the ANU highly recommends Turnitin is used by both teaching staff and students. For additional information regarding Turnitin please visit the ANU Online website.

Workload

Lectures, tutorials and self-study totalling 130 hours.

Inherent Requirements

None.

Requisite and Incompatibility

To enrol in this course you must have completed COMP6670, COMP3670, COMP8600 or COMP4670. Incompatible with COMP4680.

Prescribed Texts

Main text (depending on the topic taught):

  • Stephen Boyd and Lieven Vandenberghe, "Convex Optimization", Cambridge University Press, 2004 (convex optimisation)
  • Ian Goodfellow, Yoshua Bengio and Aaron Courville, "Deep Learning", MIT Press, 2016 (deep learning)
  • Kevin Murphy, "Machine Learning: A Probabilistic Perspective", MIT Press, 2012 (structured probabilistic models)


Reference texts:

  • Rockafellar, "Convex Analysis", Princeton University Press.
  • Hiriart-Urruty and Lemaréchal, "Fundamentals of Convex Analysis", Springer.
  • Bertsekas, Nedic and Ozdaglar, "Convex Analysis and Optimization", Athena Scientific.
  • Bertsekas, "Nonlinear Programming", Athena Scientific.
  • Koller and Friedman, "Probabilistic Graphical Models", MIT Press.
  • Bishop, "Pattern Recognition and Machine Learning", Springer.


Preliminary Reading

Background texts:

  • Strang, "Introduction to Linear Algebra", Wellesley-Cambridge Press.
  • Garrity, "All the Mathematics You Missed: But Need to Know for Graduate School", Cambridge University Press.


Assumed Knowledge

  • Familiarity with linear algebra (including norms, inner products, determinants, eigenvalues, eigenvectors, and singular value decomposition)
  • Familiarity with basic probability theory
  • Familiarity with multivariate differential calculus (e.g., derivative of a vector-valued function)
  • Exposure to mathematical proofs

Fees

Tuition fees are for the academic year indicated at the top of the page.  

Commonwealth Support (CSP) Students
If you have been offered a Commonwealth supported place, your fees are set by the Australian Government for each course. At ANU, 1 EFTSL is 48 units (normally 8 x 6-unit courses). More information about your student contribution amount for each course can be found at Fees.

Student Contribution Band: 2
Unit value: 6 units

If you are a domestic graduate coursework student with a Domestic Tuition Fee (DTF) place or international student you will be required to pay course tuition fees (see below). Course tuition fees are indexed annually. Further information for domestic and international students about tuition and other fees can be found at Fees.

Where there is a unit range displayed for this course, not all unit options below may be available.

Units: 6.00 (EFTSL 0.12500)

Domestic fee paying students
Year: 2024, Fee: $4980

International fee paying students
Year: 2024, Fee: $6360

Note: fee information is for the current year only.

Offerings, Dates and Class Summary Links

ANU utilises MyTimetable to enable students to view the timetable for their enrolled courses, browse, then self-allocate to small teaching activities / tutorials so they can better plan their time. Find out more on the Timetable webpage.

The list of offerings for future years is indicative only.
Class summaries, if available, can be accessed by clicking on the View link for the relevant class number.

Second Semester

Class number: 9225
Class start date: 22 Jul 2024
Last day to enrol: 29 Jul 2024
Census date: 31 Aug 2024
Class end date: 25 Oct 2024
Mode of delivery: In Person
Class summary: N/A
