"How do you teach normies to use AI five years from now, for their own job? Altman says basically people learn on their own.
It’s great that they can learn on their own, but this definitely is not optimal.
As in, you should be able to do a lot better by teaching people?
There’s definitely a common theme of lack of curiosity, where people need pushes in the right directions. Perhaps AI itself can help more with this."
This hits on a point that is particularly relevant to me. There is a difference between teaching someone to fish and giving them a fish, between helping a person do something and doing it for them. There is a huge financial opportunity sitting right inside this question, and I do not understand the incentives that have kept everyone from jumping on it. Please help me see what I am missing.
Take dating apps. They make the match for the user. Fine, but why doesn’t at least one of them build AI personalities and train you to interact in the real world? Help you find your own weaknesses and overcome them. If I have an AI that can match two people, and it knows the most common desires and red flags, then it can profile a user: here are your strengths and weaknesses against the market, here is a training plan to minimize the weaknesses and maximize the strengths, here is what the partners you want in your local area are interested in. Then it can run a series of simulated experiences of approaching and interacting. Rinse, repeat, then go out into the real world and do it.
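To make the idea concrete, here is a minimal sketch of the profile-and-plan step, assuming nothing more than some text-generation call you can wrap in a Python callable. The generate() parameter, the prompts, and the field names are illustrative placeholders, not any real app’s API.

```python
# Hypothetical sketch: generate() stands in for any LLM call you have access to;
# the prompts and data fields below are invented for illustration.
from dataclasses import dataclass, field
from typing import Callable


@dataclass
class Assessment:
    strengths: list[str] = field(default_factory=list)
    weaknesses: list[str] = field(default_factory=list)
    training_plan: list[str] = field(default_factory=list)


def assess_user(generate: Callable[[str], str], profile: str, market_notes: str) -> Assessment:
    """Score one user's profile against the aggregate desires and red flags the app already knows."""
    strengths = generate(
        f"Profile:\n{profile}\n\nCommon desires in this market:\n{market_notes}\n"
        "List this user's three strongest points, one per line."
    ).splitlines()
    weaknesses = generate(
        f"Profile:\n{profile}\n\nCommon red flags in this market:\n{market_notes}\n"
        "List this user's three biggest weaknesses, one per line."
    ).splitlines()
    plan = generate(
        f"Strengths: {strengths}\nWeaknesses: {weaknesses}\n"
        "Write a five-step practice plan that shores up the weaknesses and builds on the "
        "strengths, one step per line."
    ).splitlines()
    return Assessment(strengths, weaknesses, plan)
```

The point is only that the profile, the strengths and weaknesses, and the plan are plain data the app could compute from information it already holds.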
The same goes for any topic or skill. Instead of doing the thing for me, the AI can teach me to do it. It can give me simulated practice. It can take on multiple personas and let me interact with them, so I learn to handle many different kinds of challenge.
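A similarly hedged sketch of the multi-persona practice loop: the same model plays several different characters in turn, so the learner faces varied challenges. The persona descriptions are invented for illustration.

```python
from typing import Callable

# Hypothetical personas; a real system would derive these from whatever
# challenges the learner actually struggles with.
PERSONAS = {
    "skeptic": "You are hard to impress and question every claim.",
    "rambler": "You drift onto tangents and need to be steered back on topic.",
    "reserved": "You give short answers and only open up when drawn out.",
}


def practice_session(generate: Callable[[str], str], persona: str, user_turns: list[str]) -> list[str]:
    """Run one simulated conversation against a single persona and return its replies."""
    history, replies = "", []
    for turn in user_turns:
        history += f"\nUser: {turn}"
        reply = generate(
            f"{PERSONAS[persona]}\nConversation so far:{history}\nReply in character."
        )
        history += f"\nPartner: {reply}"
        replies.append(reply)
    return replies
```

Rotate through the personas, debrief after each session, and you have the rinse-and-repeat loop from the dating-app example applied to any skill.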
I have seen a very small number of people build debate bots for this purpose, but that’s it. It looks like easy, obvious money, no one is doing it, and I don’t see why. It seems to be within current capabilities. What is the disincentive that I am missing?
MindMax are trying to get people hooked on AI-designed personalised curricula, but it’s pretty cynical: it assumes that all the learner wants to do is listen to short, mostly surface-level dialogs.