Convincing People of Alignment with Street Epistemology

Having a group of people who have experience convincing people of alignment helps us:

  • 1. Convince people of alignment

  • 2. Make explicit common reasons for skepticism and unwillingness

  • 3. Coach people who want to have these conversations w/​ their friends/​colleagues/​advisors/​etc.

If you’re not convinced of alignment, and would be willing to donate 1 hour of your time talking to me, I’d appreciate it if you messaged me to schedule a call in the next couple of weeks.

If you’d like to help convince people of alignment and build a curriculum w/​ me, I’d also appreciate it if you messaged me to schedule a call to chat.

What is Street Epistemology?

Street Epistemology is a set of tools that helps you have better conversations about difficult topics.

It’s a way of getting to the core disagreements of a conversation, instead of talking in circles, past each other, and never getting anywhere.

Recently I had a conversation that went like this (note: the details are vague, but this was the general flow of the conversation. They’re a friend who had read Superintelligence but wasn’t working in alignment):

L: What would convince you to start working on alignment?

C: There’s not really anything actionable to solve the problem.

L: Have you looked at people’s research agendas who work on this?

C: No.

L: Okay, say we have an actionable plan that you could work on, would you then want to work on alignment?

C: … there’s definitely a pay aspect; I couldn’t take much of a pay cut from what I make now (i.e., as a software engineer).

L: Okay, say there’s an actionable plan and you have funding at the same pay as your current job, then would you want to work on alignment?

[Note that we could’ve gone off on tangents like explaining different people’s research agendas or saying “if you truly believed AI could cause an existential risk, then a pay cut’s not a big deal”]

C: I’m not sure if I’d be happy working on it. When I’m studying ML, there’s a monotony to it.

L: I think you’re the type of person who’d be happy working on this type of problem; it’s not all ML. There are definitely parts that are a slog, but programming and learning are like that sometimes too.

C: Oh, I can definitely find joy in things that are a slog, but things like this and global warming make me feel bad.

[asking them to elaborate eventually yields]:

C: “...even if I made a perfect solution to alignment, it wouldn’t change anything. You could have some people on board, but then one guy in another country says ‘screw this’ and ruins everything.”

[I bring up AI governance and pivotal acts, which leads into a long reflection]

He later said, “I could see myself transitioning if I made it a hobby first.” A good question in general may then be “How could you see yourself transitioning into alignment, realistically?”, or giving people very concrete pathways that other people have gone through.

The Current Plan

I’ll meet up with or call 0–5 people each week to build my personal understanding so that I can better convince others and coach people in the future.

Find at least 1 other person interested in this to start creating a curriculum together. I would expect them to also have conversations, coach people, and potentially read up on the street epistemology site for tips and tricks.

After ~20 calls and building the curriculum, I’ll start coaching people on how to do it themselves. One idea is to do 5 sessions:

  • 1. Intro to street epistemology

  • 2. Conversation with Logan (that’s me!) pretending to be a skeptic

  • 3. Conversation with a volunteer skeptic

  • 4. Conversation with their friend/​colleague

  • 5. How to follow up

I also expect to uncover common reasons for skepticism and unwillingness, which I can write about and then convince others to help address.

Isn’t This Kind of Weird or Cult-like?

At a high level, I can see that comparison, but when you actually have these conversations, they come across as very honest and well-intentioned.

Call to Action

Repeating the beginning: if you’re not convinced of alignment, and would be willing to donate 1 hour of your time talking to me, I’d appreciate it if you messaged me to schedule a call in the next couple of weeks.

If you’d like to help convince people of alignment and build a curriculum w/​ me, I’d also appreciate it if you messaged me to schedule a call to chat.