Hey there~ I’m Austin, currently building https://manifund.org. Always happy to meet LessWrong people; reach out at akrolsmir@gmail.com!
Austin Chen
Questions about animal welfare markets
I agree with the paper that paying here probably has minimal effects on devs, but even if it does have an effect, it doesn’t seem likely to change the results, unless somehow the AI group was more incentivized to be slow than the non-AI group.
Minor point of clarity: I briefly attended a talk/debate where Nate Soares and Scott Aaronson (not Sumner) were discussing these topics. Are we thinking of the same event, or was there a separate conversation with Nate Soares and Scott Sumner?
If you’re looking to do an event in San Francisco, lmk, we’d love to host one at Mox!
Thanks Ozzie—we didn’t invest that much effort into badges this year, but I totally agree there’s an opportunity to do something better. Organizer-wise it can be hard to line up all the required info before printing, but having a few sections where people can sharpie things in or pick stickers seems like low-hanging fruit.
This could also extend beyond badges—for example, attendees could pick different colored swag t-shirts to signal affiliation (eg academia vs lab vs funder) at a conference.
I’ll also send this to Rachel for the Curve; I expect she’ll enjoy it as a visual and event design challenge.
Post-Manifest coworking at Mox
Pre-LessOnline coworking at Mox
Huh, seems pretty cool and big-if-true. Is there a specific reason you’re posting this now? Eg asking people for feedback on the plan? Seeking additional funders for your $25m Series A?
My guess btw is that some donors like Michael have money parked in a DAF, and thus require a c3 sponsor like Manifund to facilitate that donation—until your own c3 status arrives, ofc.
(If that continues to get held up, but you receive an important c3 donation commitment in the meantime, let us know and we might be able to help—I think it’s possible to recharacterize same-year donations after c3 status arrives, which could unblock the c4 donation cap?)
From the Manifund side: we hadn’t spoken with CAIP previously but we’re generally happy to facilitate grants to them, either for their specific project or as general support.
A complicating factor is that, like many 501c3s, we have a limited budget that we can send towards c4s—eg I’m not sure if we could support their maximum ask of $400k on Manifund. I do feel happy to commit at least $50k of our “c4 budget” (which is their min ask) if they do raise that much through Manifund; beyond that, we should chat!
Kevin Roose
Manifund 2025 Regrants
Thanks to Elizabeth for hosting me! I really enjoyed this conversation; “winning” is a concept that seems important and undervalued among rationalists, and I’m glad to have had the time to throw ideas around here.
I do feel like this podcast focused a bit more on some of the weirder or more controversial choices I made, which is totally fine; but if I were properly stating the case for “what is important about winning” from scratch, I’d instead pull examples like how YCombinator won, or how EA has been winning relative to rationality in recruiting smart young folks. AppliedDivinityStudies’s “where are all the successful rationalists” is also great.
Very happy to answer questions ofc!
Fundraising for Mox: coworking & events in SF
San Francisco ACX Meetups Everywhere Spring 2025
Thanks for the feedback! I think the nature of a hackathon is that everyone is trying to get something that works at all, and “works well” is just a pipe dream haha. IIRC, there was some interest in incorporating this feature directly into Elicit, which would be pretty exciting.
Anyway, I’ll try to pass your feedback along to Panda and Charlie, but you might also enjoy looking at their source code and submitting a GitHub issue or pull request: https://github.com/CG80499/paper-retraction-detection
Oh cool! Nice demo, and happy to see it’s shipped and live, though I’d say the results were a bit disappointing on my very first prompt:
(if that’s not the kind of question you’re looking for, then I might suggest putting in some default example prompts to help the user understand what questions this is good for surfacing!)
Thanks! Appreciate the feedback for if we do a future hackathon or similar event~
Thanks, appreciate the thanks!
This looks awesome, congrats on announcing this! I would be extremely tempted myself were it not for a bunch of other likely obligations. Approximately how large do you expect this fellowship to be?
Also, structuring Inkhaven as a paid program is interesting; most fellowships (eg Asterisk, FLF, MATS) instead pay their participants. I wonder if this introduces mild adverse selection, in that only writers who are otherwise financially stable can afford to participate. Famously, startup incubators that charge (like OnDeck) are much worse than incubators that pay for equity (like YC or hf0).
I imagine you’ve thought about this a lot already, and you do offer need-based scholarships, which is great; events like LessOnline and Manifest have also shown some success with charging attendees. But maybe there’s some other way of finding sponsors or funders for these writers? For example, I think Manifund would be quite happy to sponsor 1–3 “full rides” at $5k+ each for a few bloggers who are interested in topics like AI safety funding, impact evaluations, and new opportunities, which we could occasionally crosspost to the Manifund newsletter. And I imagine other orgs like GGI might be too!