Center for AI Safety
safe.ai

The Center for AI Safety (CAIS) is a nonprofit research and field-building organization focused on reducing societal-scale risks from advanced artificial intelligence. CAIS conducts technical AI safety research, builds infrastructure and pathways into the field (including a compute cluster and researcher support), and publishes resources such as blog posts and a newsletter. It also works on advocacy and standards development to align industry, academia, and policymakers around stronger AI safety practices. Researchers, funders, and decision-makers rely on CAIS for pragmatic research, tools, and guidance to mitigate catastrophic AI risks.
Open Positions (5)