AI Safety & Entrepreneurship


Articles:

There should be more AI safety organisations

Why does the AI Safety Community need help founding projects?

AI Assurance Tech Report

AI Safety as a YC Startup

Alignment can be the ‘clean energy’ of AI

AI Tools for Existential Security

What makes an AI Startup “Net Positive” for Safety


Incubation Programs:

Def/acc at Entrepreneur First—a new program focused on “defensive” tech. It will produce a wide variety of startups, some of which may help reduce x-risk; others, such as autonomous weapons or diagnostic medical software, generally won’t increase x-risk but may not reduce it either.

Catalyze—AI Safety Incubation

Seldon Accelerator: “We take on AGI risk with optimism”

Constellation Incubator: Supports pre-idea entrepreneurs and early-stage organizations

AE Studio has expressed interest in helping seed neglected approaches, incl. ideas that take the form of business ideas

Fifty Years: “We help AI researchers build startups for a safe and aligned future”

AIS-Friendly General Programs:

Founding to Give: General EA incubator

Entrepreneur First—they run def/acc, so this could be a decent alternative

Halcyon Futures: “We’re an entrepreneurial nonprofit and VC fund dedicated to making AI safe, secure and good for humanity.” They do new project incubation, grants, investments, etc.

Brains Accelerator: “Brains is an accelerator that helps talented scientists and technologists execute on ambitious research visions that are beyond the scope of individual academic labs, startups, or large companies. These visions range from upending the way we deal with carbon to how we understand chronic disease or observe the universe.”

Communities:

Entrepreneurship channel on EA Anywhere

AI Safety Founders: Website (with links to resources), Discord, LinkedIn, collaboration request

VC:

AI Safety Seed Funding Network

Lionheart Ventures—“We invest in the wise development of transformative technologies.”

Juniper Ventures—Invests in companies “working to make AI safe, secure and beneficial for humanity”. Published the AI Assurance Tech Report

Polaris Ventures—“We support projects and people aiming to build a future guided by wisdom and compassion for all”

Mythos Ventures—“an early-stage venture capital firm investing in prosocial technologies for the transformative AI era.”

Metaplanet—Founded by Jaan Tallinn. “We invest in deep tech that benefits humanity in the long run. We also make grants to fund projects that don’t fit the venture model… We also love projects that reduce existential risks from AI and other advanced technologies. We tend to skip well-known immediate risks and remedies that get ample attention and investment”

Anthology Fund—Backed by Anthropic. One of the five key areas is “trust and safety tooling”

Menlo Ventures—led the Goodfire round (article)

Babuschkin Ventures (contact): “Supports AI safety research and backs startups in AI and agentic systems that advance humanity and unlock the mysteries of our universe”

Safe AI Fund—“an early-stage venture fund dedicated to supporting startups developing tools to enhance AI safety, security, and responsible deployment. The fund provides both financial investment and mentorship”

Organisational Support:

Ashgro—fiscal sponsorship

Rethink Priorities Special Projects—provides fiscal sponsorship or incubation

Other:

Funding options spreadsheet

Hackathon for AI Safety Startups—run once by Apart Research; may run again

Constellation Residency—Year-long position

Nonlinear—Free coaching for people who are running an AI safety startup or who are considering starting one


Dustbin (things that previously existed):

Longtermist Entrepreneurship Project (Retrospective)
