But my internal reaction is something like “oh god no, I absolutely cannot be seen at an EA global, that would be super cringe”.
I think it’s cringe that the cringe-ness is stopping you from going! (Or probably not stopping you from going, since you wouldn’t otherwise have any reason to go.)
Yeah from my perspective EAG is a place where a lot of people interested in technical alignment go, to talk to other people interested in technical alignment, about technical alignment stuff.
Meanwhile there are other things happening at EAG too, but you can ignore them. You don’t have to attend the talks, you don’t have to talk to anyone you don’t want to talk to. And it’s not terribly expensive, and the location is (often) down the street from you (OP, John).
I wonder whether you’re thinking harder about countersignaling than about what would be object-level good things to do?
Separate from the cringe, I do also see little object-level reason to go, at least for me. The signal to noise ratio at noob-dominated events is pretty bad, EA branding specifically tends to make it worse, and if I’m just going to restrict to people I already know anyway then I can just talk to them outside of such events.
I’ve gone to two EAGx events and I really liked the focus on making 1-on-1 connections, but you’re right that a lot of those connections didn’t seem extremely valuable, and networking over the internet is probably more likely to turn up worthwhile people.
This could be good evidence that EA should focus on figuring out more valuable networking strategies for participants. I think there’s a lot of low-hanging fruit here. Speed meetings with random attendees, for example, are fun because of the random-scrying vibe, but they’re not good at all for meeting people with specific focuses.
But I think these EA events really are targeting people who are unskilled, unknowledgeable, and just starting out, and trying to get them focused on skilling up and networking in useful directions. In that case they are useful events that you have correctly identified as not being useful for you.
Though, I may be playing the loser game in AI Alignment. I’m in the situation where I want to be focusing on AI Alignment / AI-notkilleveryone work, and I have ideas for directions, but I’ll eventually need money to keep working on them. So I either waste a bunch of time making money doing something else to support my work, or figure out a signal I can use to show the people who could fund my work that it’s worthwhile. I have been trying to focus on what I think is worthwhile, and on becoming better at determining what is worthwhile, but honestly I need signals myself to know whether I’m succeeding or not.
I think there’s an interesting dynamic here where epistemic modesty suggests I should pay attention to what other people are focusing on and assume it is worthwhile, which leads to a kind of signalling game. Also, I want to optimize any ideas I am trying to work with and spread for being understandable and appealing to other people, which is another kind of signalling game. The problem is when the object level gets sacrificed for the signalling. Signalling on its own is actually extremely valuable.
The way I often model it: there are people focused on valuable, object-level things. To communicate about those things they create jargon, but people who just want to be associated with the valuable thing co-opt that jargon for signalling games. This forces the object-level people to constantly create new jargon to communicate about the object-level thing without that communication getting lost in the noise from signalling-game players who aren’t actually focused on it.
Perversely, it is unclear how to distinguish people who are trying to learn jargon for the sake of status-signalling games from people who are trying to learn it out of interest in the object level but are not yet skilled.
Obviously status-game people aren’t going to just tell you they’re only pretending to be interested, and I think many people are genuinely unable to distinguish their own motivations. Desire for status is evolutionarily selected for, afaik, so I’d expect it to feel like an innate drive rather than something most people think of as a means to an end.
If you’re fighting for control of a province, the compromise “your enemy gets an important sounding title like Archduke with almost completely ceremonial powers, and you get a boring sounding title like Undersecretary of Resource Management that controls the place’s economy and military” works a surprising amount of the time.
I think we might actually need more status traps: somewhere for status-focused people to go, something for them to do, that at least keeps them out of the way of object-level-focused people. If the status trap can use status people to produce some value, all the better.
One of the big problems is how to tell prospective object-level people which things are status traps and which are the real things, without also telling status gamers, who would then probably see the “real thing” as high status. Then again, reading dry reports is not glamorous to most people, so maybe fake status games with the appearance of high status, even though they are explicitly fake, would be enough for most status gamers.
Of course, that assumes there’s a fundamental and enduring difference between status-focused and object-focused people. If status-focused people can learn to be object-focused, that would obviously be better. I think EA is doing pretty well at that, given how much they emphasize “if you want to be doing charity you need to really, really pay attention to and learn about the actual object-level effects of what you’re doing”. Making genuine engagement with the object level an obviously high-status thing might convert status-focused people into object-focused people, or at least a good approximation, but it risks drawing in even more status-focused people to cause problems for the object-focused ones. Hard to say.
Yeah, I found this surprisingly focused on social reality, given that the sentence just before it was “The winner’s bracket isn’t focused on signalling games, it’s focused on something more object level.”
If you feel the need to signal how focused on the object level you are, you’re still playing the signaling game.
Hmm… I feel there are two games here:
(1) People trying to be seen as important and valuable, without actually having done the things that would make them valuable, as a way of cheating the system. This is like people who go to university but figure out how to cheat rather than actually learn, because they want the certificate to get a well-paying job that they hope will also be fairly easy.
(2) People trying to send signals to connect with others focused on doing the things they want to be doing. This is like someone who has already learned about a subject but then goes to school to get a certificate as social proof that they have actually learned it.
I feel like people playing both of these games would be interested in signalling how focused they are on the object level, but they would be doing so for very different reasons and to very different effects.
These can shade into each other or be indistinguishable. Suppose you’re trying to signal that you’re smart: is that #1 or #2? It depends on how smart you actually are. And if you think you’re smart but really aren’t, and you’re intending #2, does that still count as #2, or is it #1 instead?
Yeah! That seems like a really good distinction. Indeed, we have quite a few variables: {actual, self-perceived, other-perceived} x {motivation, skill}. Each of the three motivation variables takes a value from {status focus, object focus, both}, and each of the three skill variables takes a value from {low competence, high competence}, so there are 3^3 x 2^3 = 216 possible situations. Kinda surprisingly complicated, and obviously this is still a very simplified model of people’s actual situations.
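To sanity-check the counting, here’s a quick sketch that enumerates the combinations directly, assuming the model as stated: each of the three perspectives (actual, self-perceived, other-perceived) independently picks one of three motivations and one of two competence levels, which gives 3^3 * 2^3 = 216 situations.

```python
from itertools import product

# Encoding of the model above: three perspectives, each assigned
# a motivation (3 possible values) and a skill level (2 possible values).
perspectives = ["actual", "self-perceived", "other-perceived"]
motivations = ["status focus", "object focus", "both"]
skill_levels = ["low competence", "high competence"]

# One full "situation" is a choice of motivation and skill level
# per perspective: the Cartesian product of six small sets.
situations = list(product(*([motivations, skill_levels] * len(perspectives))))

print(len(situations))  # 3**3 * 2**3 = 216
```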
But in general, I suspect most people mostly think of themselves as playing game (2).
Most people probably have more status motivation than they realize, but this may or may not be a problem depending on whether they are also object focused.
I feel like the population is probably split between people who perceive their skill as higher than it actually is and people who perceive it as lower. Although, I have the impression the common conception of Dunning–Kruger is incorrect: more recent findings suggest people can rate themselves fairly accurately relative to others, but most people think they are closer to average than they are. Either way, this would mean many people need to update towards signalling their worth more strongly, and many others towards signalling it less strongly. Ideally we would have more, and better, social signalling mechanisms helping people coordinate, beyond individual agents signalling for themselves, but that’s a whole other topic.
Datapoint: I found EAG to be valuable when I lived in Sweden. After moving to London, I completely lost interest. I don’t need it anymore.
I’m reminded of “Things I Learned By Spending Five Thousand Years In An Alternate Universe”.