Along with your black hole example and Jeff’s light-bending example, relativity also predicted time dilation and gravitational waves before they were confirmed experimentally.
I worry I will sound like a jerk. I’m not trying to, but why?
What is the advantage of memorizing a grocery list over writing down a list?
Koch Industries claims that a major piece of social tech they use is compensating managers based on the net present value of the thing they’re managing, rather than on whether they’re hitting key targets.
I looked but can’t seem to find any information about this. Do you have any idea where I could explore this more?
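For context, net present value (NPV) just discounts a stream of future cash flows back to the present at some chosen rate. A minimal sketch with made-up numbers (the rate and cash flows below are illustrative only, not anything Koch publishes):

```python
def npv(rate, cashflows):
    """Net present value: sum of cash flows discounted back to period 0.

    cashflows[0] is the cash flow now; cashflows[t] arrives t periods out.
    """
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

# Hypothetical project: pay 100 now, receive 50 in each of the next 3 years,
# discounted at 10% per year.
print(round(npv(0.10, [-100, 50, 50, 50]), 2))  # positive, so worth doing at this rate
```

The intuition for the compensation claim: a manager paid on NPV is rewarded for the long-run discounted value of everything the unit produces, not for hitting a near-term target number.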
Bureaucracy is just as gameable as any other system. Human bad actors are able to use bureaucracies to their own ends; I see no reason to believe that AI couldn’t do the same.
Might be worth checking out the Immoral Mazes sequence and the Gervais Principle to see how that goes down.
It’s not just bending the truth. Being vague also gives you more discretion in decision-making.
If you list objective criteria for a decision, then you don’t have discretion to give things out to your friends or deny things to your enemies.
This is a great post exploring the academic environment through the lens of the Gervais Principle. I’d love more stuff like this.
This is mainly just a note to call out two great comments that I think may add to the theory.
This comment and a reply offer new names that I think make it easier to talk about the different categories:
Sociopath = Ruthless
Clueless = Believer
Loser = Slacker
This comment also points out the overlap between Rao’s theory and psychologist McClelland’s Needs Theory:
Sociopath = Need for Power
Clueless = Need for Achievement
Loser = Need for Affiliation
Sometimes people seem clueless just because we don’t understand them, but that doesn’t mean they are in fact clueless.
I suppose you aren’t using his suspect definition of Clueless. But your point is potentially valid either way.
It’s also true that just because something seems “excessively cynical, inaccurate,” or “counterproductive” doesn’t mean it is, in fact, excessively cynical, inaccurate, or counterproductive.
Does this framework actually explain how diffusion of responsibility works?
The framework alone doesn’t, but reading the whole thing does. You can also check out some of my shortforms for summaries.
You clearly don’t like his advice, and you certainly don’t have to follow it. I have found it very helpful (both for understanding some previously confusing situations and for getting promoted). I’m not the only one in this thread, either, so I humbly suggest it might be worth updating priors on how good or bad the framework is.
I strongly suspect you are incorrect. Having read much of Rao’s work, I can say he pretty explicitly advocates becoming more sociopathic (per his definition). One of his other books is called “Be Slightly Evil.”
As for underperformers getting promoted, Luthans has published work on the difference between successful managers (defined as those who get promoted) and effective managers (defined as those with high-performing teams). The reality is that they do very few of the same things, and there is very little overlap between the two groups. The evidence suggests that doing well at work is not the best way to get to the top.
That is explicitly stated in the post.
Losers recognize that being a wage slave IS a bad deal. As a result, they do the minimum necessary to not get fired and keep collecting their paycheck. Again, this is a reasonable thing to do in many cases. For example, you may be a Loser in your day job so you can pursue your real interests nights and weekends.
I suspect, but can’t prove, that so many people now justify what they want by appealing to climate change that they don’t actually want a solution unless and until they have already gotten whatever it is they actually want.
I like it. I’d love to pair it with a prediction market so we can find out who best understands what drives these metrics.
What values are exclusively centrist?
I think you’re overlooking the biggest (best?) reason to not worry about this.
Just because someone has said it doesn’t mean everyone has heard it. Unless everyone you’re writing for knows everything ever written, you are making the world a better place by letting them know whatever the thing is.
I love it, but since unpaid internships are strongly frowned upon and/or explicitly illegal, I can only imagine that “you pay me” internships are a non-starter.
How many people truly wish they were never born?
I’d like to read it. PM me please if you decide not to post it publicly.
First, how can we settle who has been a better forecaster so far?
For 2 out of the 3 things that didn’t occur, the first forecaster gave a lower probability that they would occur than the second forecaster did. So I think the first forecaster has a pretty easy case on this one.
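One standard way to settle “who has been better” is a proper scoring rule such as the Brier score (mean squared error between forecast probabilities and 0/1 outcomes; lower is better). A minimal sketch, with made-up probabilities for three events, two of which didn’t occur (these are not the actual forecasts from this exchange):

```python
def brier(probs, outcomes):
    """Mean squared error between forecast probabilities and 0/1 outcomes."""
    return sum((p - o) ** 2 for p, o in zip(probs, outcomes)) / len(probs)

# Hypothetical record: 1 = the event happened, 0 = it didn't.
outcomes = [0, 0, 1]
forecaster_a = [0.2, 0.3, 0.6]  # lower probabilities on the two non-events
forecaster_b = [0.6, 0.7, 0.6]  # higher probabilities on the two non-events

# A's score comes out lower (better), matching the intuition above.
print(brier(forecaster_a, outcomes), brier(forecaster_b, outcomes))
```

Three resolved predictions is of course far too small a sample to take any score very seriously, but the scoring rule at least makes "better so far" a well-defined question.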
I think the rest of your questions assume that the percentages are measuring something in the real world. They are a measure of the predictor’s confidence: a way to tell the world how seriously they think you should take the prediction.
What kind of argumentation can the first forecaster make to convince the other one that 42% is the ‘correct’ answer?
I don’t think he can. He is technically a little less sure that humans will land on Mars than the second forecaster is (or, if you prefer, a little more sure that they won’t), and a 1% difference is functionally no difference in this situation.
If they had vastly different levels of confidence, they could discuss the gaps in their optimism or pessimism, but at a 1% difference, that’s just personal preference.
And what does this numerical value actually mean, given that landing on Mars is not a repeatable random event, nor is it a quantity we can try to measure like the radius of Saturn?
To repeat myself: they are a measure of the predictor’s confidence, a way to tell the world how seriously they think you should take the prediction.
If one believes that 42% is a better estimate than 43%, how can it help in making any choices in the future?
Even if you had predictors with so many resolved predictions that you could actually take a 1% difference seriously, I still don’t know when that 1% would matter much.
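To put a rough number on how little the gap matters, here is a minimal sketch comparing the squared-error (Brier) score of a 42% and a 43% forecast under either possible outcome:

```python
def brier(p, outcome):
    """Squared error of a single probability forecast against a 0/1 outcome."""
    return (p - outcome) ** 2

# If humans do land on Mars (outcome = 1), the 43% forecast scores slightly better.
print(brier(0.42, 1) - brier(0.43, 1))
# If they don't (outcome = 0), the 42% forecast scores slightly better.
print(brier(0.43, 0) - brier(0.42, 0))
```

Either way the score gap is on the order of 0.01, which would be swamped by the noise from any realistic number of resolved predictions.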
There are people who want to be at the top of an organization, for status or money, not because they care about the stated goals of an organization.
Almost by definition, the larger and more successful the organization is, the more of these people it will attract.
These people are also willing to do far more to get to the top than people who only want to be good at their jobs.
Given that willingness to do whatever it takes to get to the top, they have an edge in hiring and promotions and will eventually succeed.
To ensure they don’t lose their position, they will bring in people who are loyal to them. The only way to prove loyalty is by doing something against one’s own self-interest and/or the interests of the organization. Loyalty means being willing to act dumb.
Over a long enough timeline, the odds that a large organization is led by people who don’t care about the goals of the organization approaches 1. And those leaders will bring in people whose most valuable trait is a willingness to act in a non-productive fashion.
I would suggest that most people aren’t really in the market to have their minds changed, especially about politics. They aren’t talking to open a discussion. They are talking to let you know how smart or ‘good’ they are (or how dumb or ‘bad’ the people who disagree with them are).
If you want to change people’s minds, start with people who accept the concept of ‘mind changing’.
Shameless plug for my idea for a new internet forum :)
The National Guard is blaming the DOD. Not sure whether the DOD has responded.