Depends on what it is you are predicting, and what you mean by simulating. I am going to take “simulating” to mean “running a comparable computation to the one that produced the result in the other entity”.
You cannot reliably predict genuinely novel (!), intelligent actions without being intelligent. (If you can reliably solve novel math problems, this means you can do math.) But you can predict the repetition of an intelligent action you have seen before, or something very similar, even if you are not quite intelligent enough to understand why it is so common. This is especially plausible if there is a relatively small range of intelligent responses. (E.g. I can imagine someone accurately predicting whether a government will initiate a COVID lockdown this week, without that person having done an in-depth analysis of the data that hopefully informed the government's choice, provided they have experienced lockdowns and the data and statements that preceded them before.)
You can predict what a person with empathy would say, even if you have no empathy, provided you can still model other minds relatively accurately and have observed people with empathy. Running emotions is a very complex affair, but the range of results is still relatively predictable from the outside based on the input, even if you never run through those internal states. If I have seen a roomful of toddlers cry while watching Bambi, and I then show them 100 other TV shows with parental deaths, I as a machine will likely be able to predict that they will cry again, even if I never feel sad myself.
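To caricature the point: predicting from the outside can be nothing more than input-output pattern matching over observed (stimulus, reaction) pairs, with no model of the internal emotional computation at all. A minimal sketch (all class and feature names here are illustrative, not any real system):

```python
from collections import Counter

class OutsidePredictor:
    """Predicts reactions purely from observed (stimulus, reaction) pairs.
    It never simulates any internal state; it only counts what it has seen."""

    def __init__(self):
        self.observations = {}  # stimulus feature -> Counter of reactions seen

    def observe(self, feature, reaction):
        self.observations.setdefault(feature, Counter())[reaction] += 1

    def predict(self, feature):
        counts = self.observations.get(feature)
        if not counts:
            return None  # genuinely novel input: no basis for prediction
        return counts.most_common(1)[0][0]  # most frequent past reaction

predictor = OutsidePredictor()
# One Bambi viewing, repeated across the roomful of toddlers:
for _ in range(20):
    predictor.observe("parental death scene", "crying")

print(predictor.predict("parental death scene"))  # predicts "crying"
print(predictor.predict("never-seen stimulus"))   # None: fails on the novel
```

Note how the sketch also captures the earlier caveat: on a genuinely novel stimulus the predictor has nothing to count and returns no prediction, which is exactly where mere pattern matching stops and actual intelligence would be required.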