Opportunity type
Contest
Funding
Cause areas
AI safety & policy
Information security
Routes to impact
Skill-building & building career capital
Direct high impact on an important cause
Learning about important cause areas
Skill set
Research
Software engineering
Information security
Deadline
2026-03-31
Description
Schmidt Sciences, NDIF, and Cadenza Labs are seeking proposals for a Red Team competition to create challenging datasets of on-policy LLM lies, advancing research in AI lie detection.
- Selected Red Teams receive a $10,000 stipend and up to $2,000 in compute, with an additional $15,000 for teams whose datasets meet agreed specifications.
- Teams will design datasets of LLM-generated lies that are difficult for current detection methods, focusing on novelty, realism, and statistical robustness.
- Deliverables include the dataset, any fine-tuned models, and a concise report justifying the approach and label verification.
- Applications are reviewed on a rolling basis until March 31, 2026; participating teams receive co-authorship on the competition report.
No information about visa requirements is provided.
Apply here: https://airtable.com/appJnG5RQbzQBKu9y/pagpqSTsmkbPXsjd0/form