This course explores a selected area relevant to statistical machine learning in depth, and will be taught by an SML staff member of internationally recognised standing and research interest in that area. Based on current SML staffing, this will be one of:
• Kernel methods
• Graphical models
• Reinforcement learning
• Convex analysis
• Minimum description length principle
• Topics in information theory
• Decision theory
Over the past several years the content has alternated between “convex analysis and optimisation” and “structured probabilistic models”. Students should contact the course convenor to find out what topic is planned for the coming semester.
Upon successful completion, students will have the knowledge and skills to:
- Distinguish definitions of key concepts in convex analysis, including convexity of sets and functions, subgradients, and the convex dual
- Derive basic results about convex functions such as Jensen’s inequality
- Deduce how Bregman divergences are constructed from convex functions and derive some of their properties
- Produce a formal optimization problem from a high-level description and determine whether the problem is convex
- Recognize standard convex optimization problems such as linear programs and quadratic programs
- Derive the standard (dual) quadratic program for support vector machines and understand the extension to max-margin methods for structured prediction
- Implement and analyse gradient descent algorithms such as stochastic gradient descent and mirror descent
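To illustrate two of the outcomes above — how a Bregman divergence is constructed from a convex function, and how stochastic gradient descent behaves on a convex problem — here is a minimal sketch in Python/NumPy. All function names (`bregman`, `sgd`) and the least-squares example are illustrative choices, not material from the course itself; for φ(x) = ‖x‖², the resulting Bregman divergence reduces to the squared Euclidean distance.

```python
import numpy as np

def bregman(phi, grad_phi, p, q):
    """Bregman divergence D_phi(p, q) = phi(p) - phi(q) - <grad phi(q), p - q>."""
    return phi(p) - phi(q) - grad_phi(q) @ (p - q)

# Generator: phi(x) = ||x||^2 (convex), whose Bregman divergence is ||p - q||^2.
phi = lambda x: x @ x
grad_phi = lambda x: 2 * x

def sgd(X, y, steps=2000, lr=0.01, seed=0):
    """SGD on the convex objective f(w) = (1/2n) * ||Xw - y||^2."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(steps):
        i = rng.integers(n)            # sample one data point uniformly
        g = (X[i] @ w - y[i]) * X[i]   # gradient of the single-example loss
        w -= lr * g
    return w

# Noiseless linear data: SGD should recover w_true.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true
w_hat = sgd(X, y)
```

Because the objective here is convex and the data are noiseless, the iterates converge to the unique minimiser; with noisy labels one would instead use a decaying step size, which is one of the analysis points the course's gradient-descent outcome covers.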
Assessment will be in the form of regular assignments and an open-book final examination.
The ANU uses Turnitin to enhance student citation and referencing techniques, and to assess assignment submissions as a component of the University's approach to managing Academic Integrity. While the use of Turnitin is not mandatory, the ANU highly recommends that both teaching staff and students use it. For additional information regarding Turnitin please visit the ANU Online website.
Workload: 2 hours of lectures and 8 hours of self-study per week
Requisite and Incompatibility
Assumed knowledge:
- Familiarity with linear algebra (including norms, inner products, determinants, eigenvalues, eigenvectors, and singular value decomposition)
- Familiarity with basic probability theory
- Familiarity with multivariate differential calculus (e.g., the derivative of a vector-valued function)
- Exposure to mathematical proofs
Main text (depending on the topic taught):
- Stephen Boyd and Lieven Vandenberghe, "Convex Optimization"
- Kevin Murphy, "Machine Learning: A Probabilistic Perspective" (structured probabilistic models)
- Hiriart-Urruty and Lemaréchal, "Fundamentals of Convex Analysis"
- Bertsekas, Nedic and Ozdaglar, "Convex Analysis and Optimization"
- Bertsekas, "Nonlinear Programming"
- Koller and Friedman, "Probabilistic Graphical Models"
- Bishop, "Pattern Recognition and Machine Learning"
Tuition fees are for the academic year indicated at the top of the page.
Commonwealth Support (CSP) Students
If you have been offered a Commonwealth supported place, your fees are set by the Australian Government for each course. At ANU 1 EFTSL is 48 units (normally 8 x 6-unit courses). More information about your student contribution amount for each course is available at Fees.
- Student Contribution Band:
- Unit value: 6 units
If you are a domestic graduate coursework student with a Domestic Tuition Fee (DTF) place or international student you will be required to pay course tuition fees (see below). Course tuition fees are indexed annually. Further information for domestic and international students about tuition and other fees can be found at Fees.
Where there is a unit range displayed for this course, not all unit options below may be available.
Offerings, Dates and Class Summary Links
ANU utilises MyTimetable to enable students to view the timetable for their enrolled courses, browse, and then self-allocate to small teaching activities/tutorials so they can better plan their time. Find out more on the Timetable webpage.