Opportunity type
Fellowship
Funding
Recurring opportunity
Cause areas
AI safety & policy
Routes to impact
Skill-building & building career capital
Learning about important cause areas
Direct high impact on an important cause
Skill set
Conceptual & empirical research
Academia
Software engineering
Deadline
2025-11-15
Location
Stuttgart, Germany
Description
The University of Stuttgart is offering four PhD positions in AI Safety, providing an opportunity to conduct high-impact research on large reasoning models and novel machine behavior.
- Join a leading research group to empirically investigate AI safety, focusing on deception and situational awareness in advanced models.
- Work on cutting-edge projects, publish in top journals, and present at conferences, with support from ELLIS Stuttgart and the Graduate Academy.
- Benefit from a TV-L 13 contract (German federal wage agreement), with contracts of up to 2 years or 1.5 years depending on the project.
- Open to candidates from all fields (computer science background advantageous); advanced English required, German not necessary.
- Women are especially encouraged to apply, and severely disabled persons are given priority in case of equal suitability.