Opportunity type
Training program
Part-time role
Recurring opportunity
Cause areas
AI safety & policy
Routes to impact
Skill-building & building career capital
Learning about important cause areas
Testing your fit for a certain career path
Networking with peers around AI governance topics
Skill set
Research
Software engineering
AI governance understanding
Collaborative problem-solving
Deadline
2026-03-01
Location
Sydney, Melbourne, Brisbane, Manila, Tokyo, & Singapore
Description
The Technical Alignment Research Accelerator (TARA) is a free, 14-week part-time program designed to help strong Python programmers in the Asia-Pacific region transition into technical AI safety careers without relocating or taking time off.
- Participate in weekly in-person Saturday sessions with remote expert support; all costs are covered, including teaching assistance, lunch, compute credits, and study space.
- Build deep technical skills by implementing core machine learning and AI safety algorithms from scratch, culminating in a publication-quality research project.
- Join a regional network of peers and alumni, with structured guidance and collaborative learning in small groups across major APAC cities.
- Entry requires strong English proficiency, a solid Python background, and in-person attendance; a refundable commitment bond is required, with waivers available for financial need.
Apply to be a participant here.