Technical Alignment Research Accelerator

Opportunity type
Training program
Cause areas
AI safety & policy
Routes to impact
Skill-building & building career capital
Testing your fit for a certain career path
Direct high impact on an important cause
Skill set
Software engineering
Conceptual & empirical research
Community building
Deadline
2026-01-23
Location
Singapore, Sydney, Melbourne, Brisbane, Manila, Tokyo, and Taipei
Description
The Technical Alignment Research Accelerator (TARA) is a free, 14-week part-time program designed to help strong Python programmers in the Asia-Pacific region transition into technical AI safety roles without relocating or taking time off.
  • Participate in weekly in-person Saturday sessions and receive remote expert support, with all costs (lunch, compute credits, study space) covered.
  • Build deep technical skills by implementing core machine learning and AI safety algorithms from scratch, guided by experienced Teaching Assistants.
  • Collaborate in small groups, engage in pair programming, and complete a capstone project with the opportunity for publication-quality research.
  • Join a vibrant APAC-wide community of peers passionate about AI safety, with alumni outcomes including fellowships and AI safety roles.
Visa sponsorship is not mentioned; participants must be able to attend the in-person sessions in one of the listed cities. Apply here: https://www.guidedtrack.com/programs/24689i7/run