
PhD applications

Analytic Number Theory, Automorphic Forms and Representation Theory

Supervisor: Paul David Nelson

Research Area

The project will concern research in some part of analytic number theory, automorphic forms and/or representation theory, building on recent advances in these subjects. We seek applicants with strong mathematics backgrounds who are interested in learning about and contributing to these topics. The start date is flexible.

Earliest start date: 1 May 2026.


To apply and find more details, visit:

Deadline: 1 February 2026 at 23:59 CET.

Riemannian Geometry

The position is funded by a Villum Investigator Grant and will be part of the research group led by Professor Fabrice Baudoin.

Research Area

The successful candidate will work in Riemannian geometry and related areas. Topics of interest include, but are not limited to:

  • Geometric analysis on Riemannian manifolds
  • Curvature and comparison geometry on Riemannian foliations
  • Heat kernel methods and diffusion processes on manifolds
  • Interactions between geometry, analysis, and probability

The precise research direction will be shaped jointly by the PhD student and Prof. Baudoin in accordance with the goals of the Villum Investigator project.

Earliest start date: 1 May 2026.


To apply and find more details, visit:

Deadline: 1 February 2026 at 23:59 CET.

REMAX: Rigorous Evaluation Methods for AI Explainability

Supervisor: Rune Nyrup

Research area and project description

The Centre for Science Studies is looking to appoint a PhD candidate in the interdisciplinary ethics and epistemology of AI. The position is advertised as part of the project REMAX: Rigorous Evaluation Methods for AI Explainability, and offers an exciting opportunity to participate in a close collaboration between philosophy and computer science.

Background

Explaining the behaviour of complex AI models, such as deep neural networks, is very hard, and this is a key challenge for ethically responsible AI. To address it, computer scientists within the field of Explainable AI (XAI) develop algorithms that extract selective or simplified information about a given AI model.

However, rigorous methods for evaluating XAI tools are currently lacking. This is partly due to a gap between how explainability is evaluated in computer science and in philosophy. In computer science, explainability is mainly evaluated through data-driven metrics; while these provide some insights for developers, it is unclear whether existing metrics track any ethically relevant properties. Conversely, in philosophy, explainability is evaluated using normative criteria; while these are based on principled ethical analyses of why explainability matters, they remain too abstract and provide little practical guidance for AI development or governance.

To bridge this gap, REMAX aims to: (a) pioneer new normative criteria for AI explainability, drawing on philosophical theories of evidence and explanation; (b) develop novel metrics and algorithms for AI explainability; and (c) create a tight feedback loop where (a) and (b) can iteratively refine each other.

The Position

The PhD candidate will conduct philosophical research on normative criteria for AI explainability, collaborate with computer scientists on metrics for XAI, and contribute to organising workshops and other project activities. The various tasks of the position are organised in ongoing agreement with the main supervisor, Associate Professor Rune Nyrup (Science Studies), and the co-supervisor, Professor Ira Assent (Computer Science).

The candidate will be based at the Centre for Science Studies, with secondary office space at the Department of Computer Science. The project also includes opportunities and funding for conference attendance and guest researcher visits abroad.

This is a 3-year fixed term position, with a start date of 1 May 2026.


To apply and find more details, visit:

Deadline: 1 February 2026 at 23:59 CET.