Yep, this is largely true: the current design is pretty bad. The previous color scheme was much better, but then a friend told me it reminded them of Pornhub, so we changed it to blue. And generally, yep, you’re right that the vibe is very important.
I think “I care about your opinion, please share” is very bad though, for a bunch of reasons (including that it would be misleading), and I would want to avoid that. I think the good vibe is “there is information you’re missing that’s important to you/for your goals”.
It’s often hard to address a person’s reasons for disbelieving a thing if you don’t know what they are, so there are ways of asking not from a place of feigned curiosity but from a place of, “let’s begin, where should we start?”
More saliently, I think you’re just not going to get any other kind of engagement from people who disbelieve. You need to invite them to tell the site why it’s wrong. I wonder if the question could be phrased as a challenge.
The site <smugly>: I can refute any counterargument :>
Text form: insert counterargument [ ]
I like this direction, but I think it’s better to invite a stance of intellectual curiosity and inquiry, not a stance of defending a counterargument. More of “What is my intuition missing?” than “I’m not stupid, my intuition is right, they won’t refute my smart counterarguments, let’s see how their chatbot is wrong”
Why would this be misleading?
The models aren’t trained on the conversations, and have never done self-directed data sourcing, so their curiosity is pure simulacrum; the information wouldn’t go anywhere.
Yeah, it is a different purpose and vibe compared to people going out and doing motivational interviewing in physical locations.
I guess there’s a question here about a framing that is more relevant to people, like some sort of empowerment framing or similar: “You might be worried about being disempowered by AI, and you should be; we’re here to help you answer your questions about it.” Still serious, but maybe more “welcoming” in vibes?
The most relevant thing to people is that a bunch of scientists are saying a future AI system might literally kill them and all of their loved ones, and that there are reasons behind that claim.