Interpretable AI: Theory and Practice

Follow this link to apply

PhD Scholarship Applications now open!

Deadline 24 October 2021

For January or April 2022 start

 

About the Project

A major bottleneck for enterprises adopting AI is the difficulty of selecting and interpreting the right method for a given problem. This project will survey available interpretable AI methods and communicate best practices in both lay and comprehensive terms, and it will explore new theoretical directions to extend and innovate interpretable AI methods, focusing on uncertainty (both aleatoric and epistemic) and causality. Emphasis will be on probabilistic inference, in particular using graphical models, including both neural networks and more general structures such as neural wirings and directed acyclic graphs. In partnership with Max Kelsen P/L, the team will evaluate the proposed methodologies on real datasets from different healthcare organizations.

 

About the Candidate

The ARC Training Centre for Information Resilience (CIRES) invites highly motivated and committed candidates to apply for a fully funded PhD position focused on interpretable machine learning algorithms, with the goals of understanding how black-box models behave and providing theoretical foundations for algorithmic safety. In line with CIRES’s industry engagement objectives, the position is defined and co-funded in close partnership with the highly successful Brisbane-based consultancy Max Kelsen. In collaboration with Max Kelsen Partner Investigator Dr Maciej Trzaskowski, an expert in machine learning and quantum computing, the candidate will develop applications for health and genomics data analysis using the data repositories held by Max Kelsen.

Max Kelsen has active research, development, and consulting activities in the fields of AI and cancer genomics, and has prioritized AI safety as a key ingredient of any new product prior to deployment. This scholarship is one of two CIRES projects with Max Kelsen related to organisational and transformational aspects of data, algorithms, and AI.

The candidate is expected to have a good understanding of concepts from applied statistics and probability, numerical linear algebra, and machine learning. Proficiency in the Python programming language and in machine learning software packages such as PyTorch is required.

 


 

Project Researchers
Dr Fred Roosta-Khorasani (Principal Advisor)
Dr Hassan Khosravi
Prof Shazia Sadiq
Dr Sen Wang
Partner Investigator
Max Kelsen