ea.domains—Domains Free to a Good Home

Tl;dr: I’ve set up a database of domains at ea.domains which are free to a good home, to prevent them from being squatted and blocked from use. You can add domains you control to it using this form.

Since my well-received post on setting up an Anti-squatted AI x-risk domains index, I’ve been picking up more and more domains, talking to other domain holders, and building an interface for viewing them. I’ve also put a[1] few of them[2] to good use[3] already!

Also a big thanks to Ben West for sharing 215 of CEA’s parked domains, which EA projects are welcome to request use of by emailing tech@centreforeffectivealtruism.org. He also offered to have CEA act as custodian of the domains I bought, on the condition that they remain pointed at nameservers I control and that CEA returns ownership for EA-aligned use on request, unless a domain is in active use by another EA project. This will save me from having to pay upkeep, which will help make this more sustainable. He’s open to extending this offer to holders of other relevant domains.

If you’d like to use one of these domains, message the contact person specified. Each holder has a different policy for handing over domains, but a good standard is to point the domain at your servers on request, and hand it over for free if and when you’ve built something useful at that location.
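For concreteness, "pointing the domain towards your servers" normally just means the current holder updates the domain’s DNS records (an A/AAAA or CNAME record, or a full nameserver delegation) so that it resolves to infrastructure you run. Below is a minimal sketch, using only Python’s standard library and hypothetical domain/IP values, of how you might confirm that such a change has taken effect:

```python
import socket

# Hypothetical values for illustration: the domain you've requested and the
# IP address of the server where your project is deployed.
DOMAIN = "aisafety.example"
EXPECTED_IP = "203.0.113.10"


def domain_points_at_server(domain: str, expected_ip: str) -> bool:
    """Resolve the domain's A record and check whether it matches your server."""
    try:
        resolved_ip = socket.gethostbyname(domain)
    except socket.gaierror:
        # The domain doesn't resolve yet: records not updated, or DNS changes
        # still propagating.
        return False
    return resolved_ip == expected_ip


if __name__ == "__main__":
    if domain_points_at_server(DOMAIN, EXPECTED_IP):
        print(f"{DOMAIN} resolves to {EXPECTED_IP} - the delegation looks live.")
    else:
        print(f"{DOMAIN} does not (yet) resolve to {EXPECTED_IP}.")
```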

Here are the top 40 domains I’m most excited about, but do go check the full list:

| Domain | Possible use | Contact |
|---|---|---|
| aisafety.global | AI Safety Conference | hello@alignment.dev |
| existential-risks.org | High quality explanation? | tech@centreforeffectivealtruism.org |
| ontological.tech | New org? | hello@alignment.dev |
| existential.dev | New org? | hello@alignment.dev |
| epistemic.dev | New org? | hello@alignment.dev |
| aisafety.tools | Directory of resources? | jj@aisafetysupport.org |
| aisafetycareers.com | | esben@apartresearch.com |
| agenty.org | New org? | hello@alignment.dev |
| x-risks.com | | drewspartz@nonlinear.org |
| aisafety.events | | jj@aisafetysupport.org |
| aisafety.me | | jj@aisafetysupport.org |
| effectivealtruism.ventures | EA entrepreneurs or impact investing group? | donychristie@gmail.com |
| aisafetybounties.com | | esben@apartresearch.com |
| aisafety.degree | AI Safety PhD cohorts | hello@alignment.dev |
| alignment.careers | 80k said they were happy for others to join the careers advice space | hello@alignment.dev |
| xrisk.fund | x-risk specific funding organization? | hello@alignment.dev |
| aisafety.careers | 80k said they were happy for others to join the careers advice space | hello@alignment.dev |
| aisafety.fund | AIS-specific funding org? | hello@alignment.dev |
| animalwelfare.day | Days to do a coordinated push for cause areas? | hello@alignment.dev |
| globalhealth.day | Days to do a coordinated push for cause areas? | hello@alignment.dev |
| alignment.day | Days to do a coordinated push for cause areas? | hello@alignment.dev |
| aisafety.day | Days to do a coordinated push for cause areas? | hello@alignment.dev |
| cause-x.day | Days to do a coordinated push for cause areas? | hello@alignment.dev |
| biosecurity.day | Days to do a coordinated push for cause areas? Anti-GOF? | hello@alignment.dev |
| rationality.day | Days to do a coordinated push for cause areas? | hello@alignment.dev |
| aisafety.quest | Project Euler for AIS? | hello@alignment.dev |
| aisafety.coach | An org which specializes in coaching AI safety people? | hello@alignment.dev |
| aisafety.institute | Research organization? | hello@alignment.dev |
| aisafety.observer | Articles on news in the AI safety space? | hello@alignment.dev |
| alignment.academy | Training program | hello@alignment.dev |
| alignment.fyi | | hello@alignment.dev |
| aisafety.ventures | Entrepreneurs org? | hello@alignment.dev |
| aisafety.group | Peer-to-peer study groups for skilling up maybe? | hello@alignment.dev |
| globalprioritiesresearch.com | | tech@centreforeffectivealtruism.org |
| bountiedrationality.org | Website to pair with the BR facebook group. | noahcremean@gmail.com |
| aisafety.foundation | AIS-specific funding org? | hello@alignment.dev |
| xrisk.foundation | x-risk specific funding organization? | hello@alignment.dev |
| alignment.courses | List of all training programs (using Stampy answer as backend)? | hello@alignment.dev |
| aisafety.network | Peer-to-peer researchers something? | hello@alignment.dev |
| aisafety.dev | SWEs for AI safety org? | hello@alignment.dev |



My theory is that, to align incentives well, I should go ahead and build useful things without requesting funding up front, and hope that someone thinks I’m doing good work and retrofunds me. I’ve spent $544.14 on domains listed on ea.domains, and $1,542.62[4] on other EA-relevant domains for various projects not listed here, along with at least a few dozen hours researching, buying, and setting up domains, as well as building the website.

I’d be encouraged to see my theory confirmed, and I’d like to set an example so that other people with the means try the build-first strategy. I’ll list retrofunders who step up in this post and in the footer of the website, if they’re happy with that.

There are more domains I’m excited to add to the collection, but they’re a bit more expensive (e.g. one which I think would make a great org name is $99/y). I’m not naming it publicly to avoid it being squatted, but I’m happy to tell people privately if they want to buy it for ea.domains, or to use some retrofunds for it.

  1. ^

    aisafety.training—A frontend for AI Safety Support’s comprehensive list of training programs, conferences, and other events.

  2. ^

    aisafety.community—A database of AI safety communities.

  3. ^

    aisafety.world—A collection, soon to be a map, of all notable orgs in the AI x-safety space, built in collaboration with Hamish, who will be claiming a Superlinear prize I wrote and Nonlinear funded.

  4. ^

    Accounting available on request.

Crossposted from EA Forum (48 points, 8 comments)