EE Department Intranet - intranet.ee.ic.ac.uk

ELEC70082 Distributed Optimisation and Learning


Lecturer(s): Dr Stefan Vlaski

Aims

You will see how learning algorithms, from least mean squares to backpropagation, can be derived from risk minimisation, and how these insights can be exploited to derive distributed counterparts of the algorithms, in both federated and decentralised settings. You will become familiar with the challenges associated with developing distributed systems (what to share, heterogeneity, adversaries, model fusion), as well as their advantages once properly designed (privacy, robustness, communication efficiency). The material will be accompanied by applications in signal processing, communications, and other areas.
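
As an illustration of the first point, the least-mean-squares (LMS) algorithm can be read as stochastic gradient descent applied to the mean-square-error risk E[(d - x^T w)^2]. The short Python sketch below walks through that reading in code; it is not module material, and the data model, filter length and step size are illustrative assumptions.

    # Minimal sketch: LMS as stochastic gradient descent on the
    # mean-square-error risk J(w) = E[(d - x^T w)^2].
    # The data model, dimensions and step size are assumptions for illustration.
    import numpy as np

    rng = np.random.default_rng(0)
    M = 5                                  # filter length (assumed)
    w_true = rng.standard_normal(M)        # unknown model to be estimated
    w = np.zeros(M)                        # LMS estimate
    mu = 0.01                              # step size (assumed)

    for n in range(5000):
        x = rng.standard_normal(M)                       # regressor
        d = x @ w_true + 0.1 * rng.standard_normal()     # noisy measurement
        e = d - x @ w                                    # instantaneous error
        # stochastic-gradient step on (d - x^T w)^2, i.e. the LMS update
        w = w + mu * x * e

    print("estimation error:", np.linalg.norm(w - w_true))

For a sufficiently small step size the estimate drifts toward w_true, which is the sense in which LMS learns the risk minimiser from streaming data.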

Learning Outcomes

By the end of this module, you will be able to (1) recommend settings where distributed solutions are likely to yield substantial improvements, (2) develop distributed/decentralised versions of centralised algorithms, (3) analytically quantify the trade-offs associated with distributed implementations, and (4) implement algorithmic solutions in code, making use of relevant packages.

Syllabus

The module will cover (1) empirical risk minimisation and (stochastic) gradient descent and its variants, (2) multi-objective optimisation and Pareto optimisation, (3) distributed and federated learning, (4) graph and network theory, (5) primal schemes for Pareto optimisation (penalty, diffusion), (6) primal-dual schemes for Pareto optimisation, (7) multitask learning, and (8) noncooperative learning.
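
To give a flavour of item (5), the sketch below implements a basic adapt-then-combine (ATC) diffusion strategy: each agent takes a local stochastic-gradient step and then averages its intermediate estimate with its neighbours through a combination matrix. This is a generic illustration rather than the module's reference implementation; the four-agent ring topology, data model and step size are assumptions.

    # Minimal sketch of adapt-then-combine (ATC) diffusion over a small network.
    # Topology, data model and step size are assumptions for illustration.
    import numpy as np

    rng = np.random.default_rng(1)
    K, M, mu = 4, 3, 0.05                  # agents, model dimension, step size (assumed)
    w_true = rng.standard_normal(M)        # common model shared by all agents

    # Doubly stochastic combination matrix for a ring of 4 agents (assumed)
    A = np.array([[0.50, 0.25, 0.00, 0.25],
                  [0.25, 0.50, 0.25, 0.00],
                  [0.00, 0.25, 0.50, 0.25],
                  [0.25, 0.00, 0.25, 0.50]])

    W = np.zeros((K, M))                   # one estimate per agent

    for n in range(3000):
        psi = np.zeros_like(W)
        for k in range(K):                 # adapt: local stochastic-gradient step
            x = rng.standard_normal(M)
            d = x @ w_true + 0.1 * rng.standard_normal()
            psi[k] = W[k] + mu * x * (d - x @ W[k])
        W = A @ psi                        # combine: average with neighbours

    print("max agent error:", np.abs(W - w_true).max())

The combine step is what distinguishes the diffusion (primal) scheme from each agent running LMS in isolation: local information propagates across the graph, so every agent benefits from the data observed by its neighbours.
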
Assessment

Exam Duration: N/A
Exam contribution: 60%
Coursework contribution: 40%

Term: Spring

Closed or Open Book (end of year exam): N/A

Coursework Requirement: N/A

Oral Exam Required (as final assessment): N/A

Prerequisite module(s): None required

Course Homepage: unavailable

Book List: