4pm - 5pm
Thursday 13 May 2021
Artificial Intelligence: An Information Theoretic Perspective
Free
This event has passed
There is no need to register; just log in to the Zoom call on the day.
Abstract
The core of any Artificial Intelligence (AI) application is machine learning. During the last decade the huge potential of AI has been accentuated by revolutionary progress in deep learning, whereby a task is solved by training a deep neural network (DNN) on training data with an appropriate objective function. The quest for an effective DNN architecture, as well as the learning objective, is the subject of hundreds, if not thousands, of papers published annually. The talk will focus on the problem of measuring the loss of a DNN, which drives the learning process. Noting that most researchers use heuristic methods to define the loss function, we resort to information theory to provide a better basis for selecting an objective function, one that is cognizant of the fact that in machine learning we are dealing with a multitude of probability distributions. The first question to consider is whether classical information measures such as Shannon entropy and Kullback-Leibler divergence, which were developed for communication applications, are equally relevant for decision-making tasks. We will show that there are arguments for adopting or developing variants better suited to machine learning. We will also address the problem of modelling the various distributions that play an important part in deep learning. The advocated comprehensive information-theoretic approach to machine learning will be illustrated on a number of AI tasks, including classification, retrieval, regression and classifier incongruence detection.
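As background for the abstract's reference to Kullback-Leibler divergence as a basis for learning objectives, here is a minimal illustrative sketch (not the speaker's method): minimising the KL divergence from a one-hot target distribution to a model's predicted distribution recovers the familiar cross-entropy classification loss. The function names and example values are illustrative only.

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """Kullback-Leibler divergence D(p || q) between two discrete
    probability distributions, in nats. eps guards against log(0)."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    return float(np.sum(p * np.log((p + eps) / (q + eps))))

# A one-hot target and a softmax-style prediction: for a one-hot
# target, D(target || prediction) equals the cross-entropy loss
# commonly used to train DNN classifiers.
target = np.array([0.0, 1.0, 0.0])
prediction = np.array([0.1, 0.7, 0.2])
loss = kl_divergence(target, prediction)  # = -log(0.7), approximately 0.357
```

The divergence is zero only when the two distributions coincide, which is why it can serve as a training objective: driving it towards zero pulls the model's predicted distribution towards the target distribution.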
Speaker
Professor Josef Kittler will be speaking at this event.
How to attend
This will be an online event held on Zoom.
- Meeting ID: 921 5419 43061
- Passcode: 163426
Visitor information
Find out how to get to the University, make your way around campus and see what you can do when you get here.