First, props for doing the experiment. And yeah, that sounds delicious.
The fact still stands that ice cream is what we mass produce and send to grocery stores. Even if our hypothetical aliens could reasonably predict that we’d enjoy any extra fatty, salty, and sweet food should we happen to come across it, that’s not sufficient information to determine what foods we mass produce in practice.
Is it really that hard to predict ice cream over bear fat with honey and salt? I’m skeptical.
To start with, it’s a good bet that we’re going to mass produce foods that are easily mass produced. Bears? Lol, no. Domesticated herbivores, obviously. Cream, not tallow. Plant sugar, not honey. Cavemen figured out how to solve the “mass produce food without much technology” problem, which is how we stopped being cavemen. If the aliens are willing to spend five minutes actually trying, you’d think they’d figure out that bear fat is out for this reason alone.
More centrally, I roll to doubt the implicit “But I should want to eat lots of pure fat, because I’m evolved to like calories!”. Stop being a baby about “Ew it’s gross”, and try eating 1000 calories of pure rendered fat by itself. I dare you to actually run the experiment, and see what happens. Find out where that “Ew it’s gross” comes from, and whether it’s legit or not. It’s not hard to figure out.
Tallow is delicious when potatoes are fried in it, but try to have a meal of pure tallow and you’ll feel sick to your stomach because your stomach is going to have a hard time digesting that. Butter is emulsified with water, and is easier to digest in large globs. Cream is emulsified fat-in-water so it actually disperses when consumed with more water, and is therefore way easier to digest in large amounts when not mixed in with other foods. Maybe part of the reason that we fry potatoes in tallow, put globs of butter on bread, and eat bowls of solidified cream—and not the other way around—is that the other way around doesn’t work?
On top of that though,
I don’t know any bear hunters and don’t want to get parasites,
Emphasis mine.
This is important too, and affects people’s taste in a very visceral way—and pathogen risk is exactly why I was disgusted by bear meat the one chance I had to eat it. Imagine taking a bite of raw chicken, or pork. Or even beef. Disgusting, right?
Except raw meats are delicious when we trust them. Sushi is the obvious example, despite the fact that you’d be disgusted by the idea of taking a bite out of a raw fish you caught in the river. But it’s true of other meats too. In Germany they sell raw pork sandwiches, and call it “Mett”. It’s delicious.
If you want to understand why people aren’t always immediately super on board with “Try this weird food that no one else you know eats and survives eating”, maybe this is partly why. When I was visiting Sweden, people there were having cheese for dessert. How easy do you think it’d be to sell people on the idea of stinky cheese, if not for cultural learning that it’s actually safe?
Is this really that surprising?
That we’d viscerally want to avoid food that brings risk of parasites and disease?
That we’d mass produce food that is easily mass producible?
And want to eat large quantities of food only when we can digest it in large quantities?
There are more details that aren’t so immediately obvious. Like why iced cream? Sure, maybe to make it solid, but why does that matter? Or, why do we not salt ice cream? Okay, I guess it’d melt. So maybe it is immediately obvious, since I literally figured that out as I was typing this.
Regardless, there’s work to be done in predicting which “superstimuli” people are going to tend towards, and it’s not always trivial. “Plant sugar and cream” may be trivial, but predicting “ice cream” in particular is a bit harder.
Back on the first hand though, we don’t just eat ice cream. We also drink milk shakes, for one. So the answer to “Why solid?” is “Not just solid!”. And ice cream sounds gross to me right now, but a fatty bear steak drizzled with a touch of honey and sprinkled with salt actually sounds delicious. Or cow steak, whatever. Ice cream is but one food we consume, and not some fixed pinnacle of yumminess.
Our tastes and desires actually change, as we learn about things like “How safe is it to eat raw pork in Germany?”, and “How much sugar is good for my body right now?”. That’s why you can’t tempt me with ice cream right now.
Run the experiment of eating all the sugar you want—way more than you should. Experience what it feels like to eat too much sugar, and allow yourself to update on that feeling of sickness. The result is learning that sugar isn’t all that great. I still enjoy little bits at the appropriate times, sure, but that actually aligns with my current best estimates of what’s best for me—and gone are the days of gorging on sweets. Try to restrain yourself, and treat your tastes as “unpredictable unchangeable unconscious stuff”, and you may never give yourself the chance to learn otherwise.
I agree that most people don’t put in more thought than “Uh, bear fat and honey and salt flakes?”, and therefore make terrible predictions. Maybe this is how the book presented it.
But I don’t think the right conclusion is “Unpredictable!” so much as “So put in the work if you care to predict it?”.
This is directly applicable to the alignment of AI, because it turns out we’re cultivating AI more than hard coding them. If we don’t learn to cultivate alignment of our own desires, to make sense of our preferences for ice cream over bear meat, and to allow those preferences to shift back to bear meat over ice cream when appropriate, then what chance do we have at aligning an AI?
You don’t want the AI craving something analogous to sweets and trying to restrain itself—look how well that works out for humans.
Nor do you want to plead with AI—or people working on AI—to resist the temptation of the forbidden fruit. Look at how well that one has worked out for humans.
I do think this is pretty interesting – I agree a lot of this is imaginable if you think about it, and I’m excited by the exercise of people trying to one-shot solutions to complex problems.
I am curious whether you think you could make some kind of equivalent prediction, about a random facet of the evolved world that is not the Bear Fat thing in particular, that you don’t already know about?
Your description makes this feel plausible, but, it’s a lot easier when you get to look at the evidence in hindsight.
That’s the right first question to consider, and it’s something I was thinking about while writing that comment.
I don’t think it’s quite the right question to answer though. What I’m doing to generate these explanations is very different than “Go back to the EEA, and predict forward based on first principles”, and my point is more about why that’s not the thing to be doing in the first place more than about the specific explanation for the popularity of ice cream over bear fat.
It can sound nitpicky, but I think it’s important to make hypotheticals concrete because a lot of the time the concrete details you notice upon implementation change which abstractions it makes sense to use. Or, to continue the metaphor, picking little nits when found is generally how you avoid major lice infestations.
In order to “predict” ice cream I have to pretend I don’t already know things I already know. Which things? Why? How are we making these choices? It will get much harder if you take away my knowledge of domestication, but are we to believe these aliens haven’t figured that out? That even if they don’t have domestication on their home planet, they traveled all this way and watched us with bears without noticing what we did to wolves? “Domestication” is hindsight in that it would take me much longer than five minutes as a caveman to figure out, but it’s a thing we did figure out as cavemen, before we had any reason to think about ice cream. And it’s hindsight that I do have, and that the aliens likely would too.
Similarly, I didn’t come up with the emulsification/digestion hypothesis until after learning from experience what happens when you consume a lot of pure oils by themselves. I’m sure a digestion expert could have predicted the result in advance, but I didn’t have to learn a new field of expertise because I could just run the experiment and then the obvious answer becomes obvious. A lot of times, explanations are a lot easier to verify once they’ve been identified than they are to generate in the first place, and the fact that the right explanations come to mind vastly more easily when you run the experiment is not a minor detail to gloss over. I mean, it’s possible that Zorgax is just musing idly and comes up with a dumb answer like “bear fat”, but if he came all this way to get the prediction right you bet your ass he’s abducting a few of us and running some experiments on how we handle eating pure fat.
As a general rule, in real life, fast feedback loops and half decent control laws dominate a priori reasoning. If I’m driving in the fog and can’t see but 10 feet ahead, I’m really uninterested in the question “What kind of rocks are at the bottom of the cliff 100 feet beyond the fog barrier?” and much more interested in making sure I notice the road swerving in time to keep on a track that points up the mountain. Or, in other words, I don’t care to predict which exact flavor of superstimuli I might be on track to overconsume, from the EEA. I care to notice before I get there, which is well in advance given how long ago we figured out domestication. I only need to keep my tastes tethered to reality so that when I get there ice cream and opioids don’t ruin my life—and I get to use all my current tools to do it.
I think this is the right focus for AI alignment too.
The way I see it, Eliezer has been making a critically important argument that if you keep driving in a straight line without checking the results, you inevitably end up driving off a cliff. And people really are this stupid, a lot of times. I’m very much on board with the whole “Holy fuck, guys, we can’t be driving with a stopping distance longer than our perceptual distance!” thing. The general lack of respect and terror is itself terrifying, because plenty of people have tried to fly too close to the sun and lost their wings because they were too stupid to notice the wax melting and descend.
And maybe he’s not actually saying this, but the connotations I associate with his framing, and more importantly the interpretation that seems widespread in the community, is that “We can’t proceed forward until we can predict vanilla ice cream specifically, from before observing domestication”. And that’s like saying “I can’t see the road all the way to the top of the mountain because of fog, so I will wisely stay here at the bottom”. And then feeling terror build from the pressure from people wanting to push forward. Quite reasonably, given that there actually aren’t any cliffs in view, and you can take at least the next step safely. And then reorient from there, with one more step down the road in view.
I don’t think this strategy is going to work, because I don’t think you can see that far ahead, no matter how hard you try. And I don’t think you can persuade people to stop completely, because I think they’re actually right not to.
I don’t think you have to see the whole road in advance because there’s a lot of years between livestock and widespread ice cream. Lots of chances to empirically notice the difference between cream and rendered fats. There’s still time to see it millennia in advance.
What’s important is making sure that’s enough.
It’s not a coincidence that I didn’t get to these explanations by doing EEA thinking at all. Ice cream is more popular than bear fat because it is cheaper to produce now. It’s easier to digest now. Aggliu was concerned with parasites this week. These aren’t things we need to refer to the EEA to understand, because they apply today. The only reason I could come up with these explanations, and trivially, is that I’m not throwing away most of what I know, declining to run cheap experiments, and then noticing how hard it is to reason 1M years in advance when I don’t have to.
The thread I followed to get there isn’t “What would people who knew less want, if they suddenly found themselves blasted with a firehose of new possibilities, and no ability to learn?”. The thread I followed is “What do I want, and why?”. What have I learned, and what have we all learned, or can we all learn, and what does this suggest going forward? This framing of people as agents fumbling through figuring out what’s good for them pays rent a lot more easily than the framing of “Our desires are set by the EEA”. No. Our priors are set by the EEA. But new evidence can overwhelm that pretty quickly—if you let it.
So for example, EEA thinking says “Well, I guess it makes sense that I eat too much sugar, because it’s energy which was probably scarce in the EEA”. Hard to do the experiment, not much you can do with that information if it proves true. On the other hand, if you let yourself engage with the question “Is a bunch of sugar actually good?”, you can run the experiment and learn “Ew, actually no. That’s gross”—and then watch your desires align with reality. This pays rent in fewer cavities and diabetes, and all sorts of good stuff.
Similarly, “NaCl was hard to get in the EEA, so therefore everyone is programmed to want lots of NaCl!”. I mean, maybe. But good luck testing that, and I actually don’t care. What I care about is knowing which salts I need in this environment, which will stop these damn cramps. And I can run that test by setting out a few glasses of water with different salts mixed in, and seeing what happens. The result of that experiment was that I already knew which I needed by taste, and it wasn’t NaCl that I found myself chugging the moment it touched my lips.
Or with opioids. I took opioids once at a dose that was prescribed to me, and by watching the effects learned from that one dose “Ooh, this feels amazing” and “I don’t have any desire to do that again”. It took a month or so for it to sink in, but one dose. I talked to a man the other day who had learned the same thing much deeper into that attractor—yet still in time to make all the difference.
Yes, “In the EEA those are endogenous signaling chemicals” or whatever, but we can also learn what they are now. Warning against the dangers of superstimuli is important, but “Woooah man! Don’t EVER try drugs, because you’re hard coded by the EEA to destroy your life if you do that!” is untrue and counterproductive. You can try opioids if you want, just pay real close attention, because the road may be slicker than you think and there are definitely cliffs ahead. Go on, try it. Are you sure you want to? A lot less tempting when framed like that, you know? How careful are you going to be if you do try it, compared to the guy responding “You’re not the boss of me Dad!” to the type of dad who evokes it?
So yes, lots of predictions and lots of rent paid. Just not those predictions.
Predictions about how I’ll feel if I eat a bowl full of bear fat the way one might with ice cream, despite never having eaten pure bear fat. Predictions about people’s abilities to align their desires to reality, and rent paid in actually aligning them. And in developing the skill of alignment so that I’m more capable of detecting and correcting alignment failures in the future, as they may arise.
I predict, too, that this will be crucial for aligning the behaviors of AI as well. Eliezer used to talk about how a mind that can hold religion must be fundamentally too broken to see reality clearly. So too, I predict, a mind that can hold a desire for overconsumption of sugar must necessarily lack the understanding needed to align even more sophisticated minds.
Though that’s one I’d prefer to heed in advance of experimental confirmation.
Our tastes and desires actually change, as we learn about things like “How safe is it to eat raw pork in Germany?”, and “How much sugar is good for my body right now?”. That’s why you can’t tempt me with ice cream right now.
This is important. Food preferences have a cognitive component, they’re not just stimulus-response, and the usual way I see evolved preferences talked about doesn’t seem to recognize this nearly enough. (This is tangential to the original point about alignment failure — which just requires that preferences be godshatter-like enough, which it seems obvious that they are — but, like you say, matters for understanding ourselves.)
I haven’t gotten bad physical consequences from eating too much sugar, but also I wouldn’t know if I did, because e.g. frosting is hard to stand for me in a visceral way, just due to the sweetness, and eating too much less-sweet stuff still makes me “sweet tired”. But I don’t notice an impact on e.g. my digestion or my energy (besides that of, like, eating any meal).
From what you said, it sounded like there is an impact from eating too much sugar? What is it?
Larger effects are easier to measure, and therefore quicker to update on. I didn’t take concerns of “too much sweets” very seriously, so I had no restraint whatsoever.
The clearest updates came after wildly overconsuming while also cutting weight. I basically felt like shit, which is probably a much-exaggerated “sweet tired”, and never ate Swedish Fish again. And Snickers bars before that.
Since then the updates have been more subtle and below the level of what’s easy to notice and keep good tabs on, but yes “sweet tired”. Just generally not feeling satisfied and fulfilled, and developing more of that visceral distaste for frosting that you have as well, until sweets in general have a very limited place in my desires.
It’s not a process like “Oh, I felt bad, so therefore I shall resist my cravings for sugar”, it’s “Ugh, frosting is gross” because it tastes like feeling tired and bad.
Interesting. To me frosting feels almost physically painful to eat more than a small amount of, and I have no memories of any consequences from eating frosting (besides the immediate “ow”).
and try eating 1000 calories of pure rendered fat by itself
I’ve done this many times. Drinking 150ml of olive oil is not that bad. If you just chug it you get somewhat nauseous, but if you sip it over 45 minutes, it’s fine.
First, props for doing the experiment. And yeah, that sounds delicious.
Is it really that hard to predict ice cream over bear fat with honey and salt? I’m skeptical.
To start with, it’s a good bet that we’re going to mass produce foods that are easily mass produced. Bears? Lol, no. Domesticated herbivores, obviously. Cream, not tallow. Plant sugar, not honey. Cavemen figured out how to solve the “mass produce food without much technology” problem, which is how we stopped being cavemen. If the aliens are willing to spend five minutes actually trying, you’d think they’d figure out that bear fat is out for this reason alone.
More centrally, I roll to doubt the implicit “But I should want to eat lots of pure fat, because I’m evolved to like calories!”. Stop being a baby about “Ew it’s gross”, and try eating 1000 calories of pure rendered fat by itself. I dare you to actually run the experiment, and see what happens. Find out where that “Ew it’s gross comes from” and whether it’s legit or not. It’s not hard to figure out.
Tallow is delicious when potatoes are fried in it, but try to have a meal of pure tallow and you’ll feel sick to your stomach because your stomach is going to have a hard time digesting that. Butter is emulsified with water, and is easier to digest in large globs. Cream is emulsified fat-in-water so it actually disperses when consumed with more water, and is therefore way easier to digest in large amounts when not mixed in with other foods. Maybe part of the reason that we fry potatoes in tallow, put globs of butter on bread, and eat bowls of solidified cream—and not the other way around—is that the other way around doesn’t work?
On top of that though,
Emphasis mine.
This is important too, and affects people’s taste in a very visceral way—and pathogen risk is exactly why I was disgusted by bear meat the one chance I had to eat it. Imagine taking a bite of raw chicken, or pork. Or even beef. Disgusting, right?
Except raw meats are delicious when we trust them. Sushi is the obvious example, despite the fact that you’d be disgusted by the idea of taking a bite out of a raw fish you caught in the river. But it’s true with other meats too, which are a lot like sushi. In Germany they sell raw pork sandwiches, and call it “mett”. It’s delicious.
If you want to understand why people aren’t always immediately super on board with “Try this weird food that no one else you know eats and survives eating”, maybe this is partly why. When I was visiting Sweden, people there were having cheese for dessert. How easy do you think it’d be to sell people on the idea of stinky cheese, if not for cultural learning that it’s actually safe?
Is this really that surprising?
That we’d viscerally want to avoid food that brings risk of parasites and disease?
That we’d mass produce food that is easily mass producible?
And want to eat large quantities of food only when we can digest it in large quantities?
There are more details that aren’t so immediately obvious. Like why iced cream? Sure, maybe to make it solid, but why does that matter? Or, why do we not salt ice cream? Okay, I guess it’d melt. So maybe it is immediately obvious, since I literally figured that out as I was typing this.
Regardless, there’s work to be done in predicting which “superstimuli” people are going to tend towards, and it’s not always trivial. “Plant sugar and cream” may be trivial, but predicting “ice cream” in particular is a bit harder.
Back on the first hand though, we don’t just eat ice cream. We also drink milk shakes, for one. So the answer to “Why solid?” is “Not just solid!”. And ice cream sounds gross to me right now, but a fatty bear steak drizzled with a touch of honey and sprinkled with salt actually sounds delicious. Or cow steak, whatever. Ice cream is but one food we consume, and not some fixed pinnacle of yumminess.
Our tastes and desires actually change, as we learn about things like “How safe is it to eat raw pork in Germany?”, and “How much sugar is good for my body right now?”. That’s why you can’t tempt me with ice cream right now.
Run the experiment of eating all the sugar you want—way more than you should. Experience what it feels like to eat too much sugar, and allow yourself to update on that feeling of sickness. The result is learning that sugar isn’t all that great. I still enjoy little bits at the appropriate times, sure, but that actually aligns with my current best estimates of what’s best for me—and gone are the days of gorging on sweets. Try to restrain yourself, and treat your tastes as “unpredictable unchangeable unconscious stuff”, and you may never give yourself the chance to learn otherwise.
I agree that most people don’t put in more thought than “Uh, bear fat and honey and salt flakes?”, and therefore make terrible predictions. Maybe this is how the book presented it.
But I don’t think the right conclusion is “Unpredictable!” so much as “So put in the work if you care to predict it?”.
This is directly applicable to the alignment of AI because it turns out we’re cultivating AI more than hard coding them, so if we don’t learn to cultivate alignment of our own desires.. and learn to make sense of our preferences for ice cream over bear meat—and to allow them to shift back to bear meat over ice cream when appropriate.. then what chance do we have at aligning an AI?
You don’t want the AI craving something analogous to sweets and trying to restrain itself—look how well that works out for humans.
Nor do you want to plead with AI—or people working on AI—to resist the temptation of the forbidden fruit. Look at how well that one has worked out for humans.
I do think this is pretty interesting – I agree a lot of this is imaginable if you think about it and I’m excited by the exercise of people trying to One-shot solutions to complex problems.
I am curious whether you think you could make some kind of equivalent prediction, about a random facet of the evolved world that is not the Bear Fat thing in particular, that you don’t already know about?
Your description makes this feel plausible, but, it’s a lot easier when you get to look at the evidence in hindsight.
That’s the right first question to consider, and it’s something I was thinking about while writing that comment.
I don’t think it’s quite the right question to answer though. What I’m doing to generate these explanations is very different than “Go back to the EEA, and predict forward based on first principles”, and my point is more about why that’s not the thing to be doing in the first place more than about the specific explanation for the popularity of ice cream over bear fat.
It can sound nitpicky, but I think it’s important to make hypotheticals concrete because a lot of the time the concrete details you notice upon implementation change which abstractions it makes sense to use. Or, to continue the metaphor, picking little nits when found is generally how you avoid major lice infestations.
In order to “predict” ice cream I have to pretend I don’t already know things I already know. Which? Why? How are we making these choices? It will get much harder if you take away my knowledge of domestication, but are we to believe these aliens haven’t figured that out? That even if they don’t have domestication on their home planet, they traveled all this way and watched us with bears without noticing what we did to wolves? “Domestication” is hindsight in that it would take me much longer than five minutes as a cave man to figure out, but it’s a thing we did figure out as cave men before we had any reason to think about ice cream. And it’s it’s sight that I do have and that the aliens likely would too.
Similarly, I didn’t come up with the emulsification/digestion hypothesis until after learning from experience what happens when you consume a lot of pure oils by themselves. I’m sure a digestion expert could have predicted the result in advance, but I didn’t have to learn a new field of expertise because I could just run the experiment and then the obvious answer becomes obvious. A lot of times, explanations are a lot easier to verify once they’ve been identified than they are to generate in the first place, and the fact that the right explanations come to mind vastly more easily when you run the experiment is not a minor detail to gloss over. I mean, it’s possible that Zorgax is just musing idly and comes up with a dumb answer like “bear fat”, but if he came all this way to get the prediction right you bet your ass he’s abducting a few of us and running some experiments on how we handle eating pure fat.
As a general rule, in real life, fast feedback loops and half decent control laws dominate a priori reasoning. If I’m driving in the fog and can’t see but 10 feet ahead, I’m really uninterested in the question “What kind of rocks are at the bottom of the cliff 100 feet beyond the fog barrier?” and much more interested in making sure I notice the road swerving in time to keep on a track that points up the mountain. Or, in other words, I don’t care to predict which exact flavor of superstimuli I might be on track to overconsume, from the EEA. I care to notice before I get there, which is well in advance given how long ago we figured out domestication. I only need to keep my tastes tethered to reality so that when I get there ice cream and opioids don’t ruin my life—and I get to use all my current tools to do it.
I think this is the right focus for AI alignment too.
The way I see it, Eliezer has been making a critically important argument that if you keep driving in a straight line without checking the results, you inevitably end up driving off a cliff. And people really are this stupid, a lot of times. I’m very much on board with the whole “Holy fuck, guys, we can’t be driving with a stopping distance longer than our perceptual distance!” thing. The general lack of respect and terror is itself terrifying, because plenty of people have tried to fly too close to the sun and lost their wings because they were too stupid to notice the wax melting and descend.
And maybe he’s not actually saying this, but the connotations I associate with his framing, and more importantly the interpretation that seems widespread in the community, is that “We can’t proceed forward until we can predict vanilla ice cream specifically, from before observing domestication”. And that’s like saying “I can’t see the road all the way to the top of the mountain because of fog, so I will wisely stay here at the bottom”. And then feeling terror build from the pressure from people wanting to push forward. Quite reasonably, given that there actually aren’t any cliffs in view, and you can take at least the next step safely. And then reorient from there, with one more step down the road in view.
I don’t think this strategy is going to work, because I don’t think you can see that far ahead, no matter how hard you try. And I don’t think you can persuade people to stop completely, because I think they’re actually right not to.
I don’t think you have to see the whole road in advance because there’s a lot of years between livestock and widespread ice cream. Lots of chances to empirically notice the difference between cream and rendered fats. There’s still time to see it millennia in advance.
What’s important is making sure that’s enough.
It’s not a coincidence that I didn’t get to these explanations by doing EEA thinking at all. Ice cream is more popular than bear fat because of how it is cheaper to produce now. It’s easier to digest now. Aggliu was concerned with parasites this week. These aren’t things we need to refer to the EEA to understand, because they apply today. The only reason I could come up with these explanations, and trivially, is because I’m not throwing away most of what I know, declining to run cheap experiments, and then noticing how hard it is to reason 1M years in advance when I don’t have to.
The thread I followed to get there isn’t “What would people who knew less want, if they suddenly found themselves blasted with a firehose of new possibilities, and no ability to learn?”. The thread I followed is “What do I want, and why”. What have I learned, and what have we all learned. Or can we all learn—and what does this suggest going forward? This framing of people as agents fumbling through figuring out what’s good for them pays rent a lot more easily than the framing of “Our desires are set by the EEA”. No. Our priors are set by the EEA. But new evidence can overwhelm that pretty quickly—if you let it.
So for example, EEA thinking says “Well, I guess it makes sense that I eat too much sugar, because it’s energy which was probably scarce in the EEA”. Hard to do the experiment, not much you can do with that information if it proves true. On the other hand, if you let yourself engage with the question “Is a bunch of sugar actually good?”, you can run the experiment and learn “Ew, actually no. That’s gross”—and then watch your desires align with reality. This pays rent in fewer cavities and diabetes, and all sorts of good stuff.
Similarly, “NaCl was hard to get in the EEA, so therefore everyone is programmed to want lots of NaCl!”. I mean, maybe. But good luck testing that, and I actually don’t care. What I care about is knowing which salts I need in this environment, which will stop these damn cramps. And I can run that test by setting out a few glasses of water with different salts mixed in, and seeing what happens. The result of that experiment was that I already knew which I needed by taste, and it wasn’t NaCl that I found my self chugging the moment it touched my lips.
Or with opioids. I took opioids once at a dose that was prescribed to me, and by watching the effects learned from that one dose “Ooh, this feels amazing” and “I don’t have any desire to do that again”. It took a month or so for it to sink in, but one dose. I talked to a man the other day who had learned the same thing much deeper into that attractor—yet still in time to make all the difference.
Yes, “In the EEA those are endogenous signaling chemicals” or whatever, but we can also learn what they are now. Warning against the dangers of superstimuli is important, but “Woooah man! Don’t EVER try drugs, because you’re hard coded by the EEA to destroy your life if you do that!” is untrue and counterproductive. You can try opioids if you want, just pay real close attention, because the road may be slicker than you think and there are definitely cliffs ahead. Go on, try it. Are you sure you want to? A lot less tempting when framed like that, you know? How careful are you going to be if you do try it, compared to the guy responding “You’re not the boss of me, Dad!” to the type of dad who evokes it?
So yes, lots of predictions and lots of rent paid. Just not those predictions.
Predictions about how I’ll feel if I eat a bowl full of bear fat the way one might with ice cream, despite never having eaten pure bear fat. Predictions about people’s abilities to align their desires to reality, and rent paid in actually aligning them. And in developing the skill of alignment so that I’m more capable of detecting and correcting alignment failures in the future, as they may arise.
I predict, too, that this will be crucial for aligning the behaviors of AI as well. Eliezer used to talk about how a mind that can hold religion must be too fundamentally broken to see reality clearly. So too, I predict, a mind that can hold a desire for overconsumption of sugar must necessarily lack the understanding needed to align even more sophisticated minds.
Though that’s one I’d prefer to heed in advance of experimental confirmation.
I consider it pretty normal to encounter salt as an integral component of fancy ice cream flavors, but my biases are formed from places like https://saltandstraw.com/collections/all-flavors
This is important. Food preferences have a cognitive component, they’re not just stimulus-response, and the usual way I see evolved preferences talked about doesn’t seem to recognize this nearly enough. (This is tangential to the original point about alignment failure — which just requires that preferences be godshatter-like enough, which it seems obvious that they are — but, like you say, matters for understanding ourselves.)
I haven’t gotten bad physical consequences from eating too much sugar, but also I wouldn’t know if I do, because e.g. frosting is hard for me to stand in a visceral way, just due to the sweetness, and eating too much less-sweet stuff still makes me “sweet tired”. But I don’t notice an impact on e.g. my digestion or my energy (besides that of, like, eating any meal).
From what you said, it sounded like there is an impact from eating too much sugar? What is it?
Larger effects are easier to measure, and therefore quicker to update on. I didn’t take concerns about “too much sweets” very seriously, so I had no restraint whatsoever.
The clearest updates came after wildly overconsuming while also cutting weight. I basically felt like shit, which is probably a much-exaggerated “sweet tired”, and I never ate Swedish Fish again. And Snickers bars before that.
Since then the updates have been more subtle and below the level of what’s easy to notice and keep good tabs on, but yes “sweet tired”. Just generally not feeling satisfied and fulfilled, and developing more of that visceral distaste for frosting that you have as well, until sweets in general have a very limited place in my desires.
It’s not a process like “Oh, I felt bad, so therefore I shall resist my cravings for sugar”. It’s “Ugh, frosting is gross”, because it tastes like feeling tired and bad.
Interesting. To me, frosting feels almost physically painful to eat more than a small amount of, and I have no memories of any consequences from eating frosting (besides the immediate “ow”).
I’ve done this many times. Drinking 150 ml of olive oil is not that bad. If you just chug it you get somewhat nauseous, but if you sip it over 45 minutes, it’s fine.