(1) « people liking thing does not seem like a relevant parameter of design ».
This is quite a bold statement. I personally believe the mainstream theory according to which it’s easier to have designs adopted when they are liked by the adopters.
Fair point. I guess “not relevant” was too strong a phrasing. It would have been more accurate to say something like “people liking things might be neither sufficient nor necessary to get designs adopted, and it is not clear (at least to me) how much it matters compared to other aspects”.
Re (2): Interesting. I would be curious to know to what extent this is just a surface-level metaphor, or an unjustified anthropomorphisation of cells, versus actually having implications for AI design. (But I don’t understand biology at all, so I don’t really have a clue :( .)
(1) « Liking », or « desire », can be defined behaviourally: « all other things being equal, agents will go toward what they desire/like most, whenever given a choice ». Individual desires/likings/tastes vary.
(2) In Evolutionary Game Theory, consider a game where a mitochondria-like agent offers you a choice between:
(Join eukaryotes) mutualistic endosymbiosis, at the cost of obeying apoptosis or being flagged as a cancerous enemy;
(Non-eukaryotes) refusal of this offer, at the cost of being treated by the eukaryotes as a threat, or as a lesser symbiote.
Then that agent is likely to win. To a rational agent, it’s a winning wager. My latest publication expands on this.
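The “winning wager” above can be sketched as a one-shot choice between two options with stylized expected payoffs. All the numbers below are my own illustrative assumptions (not from the original comment), chosen only to reflect the qualitative claim that cooperation dominates refusal:

```python
# Minimal sketch of the wager as an expected-payoff comparison.
# Payoff values are hypothetical and purely illustrative.

def expected_payoff(choice: str) -> float:
    """Return a stylized expected payoff for each option."""
    payoffs = {
        # Join eukaryotes: share mutualistic gains, pay the apoptosis cost.
        "join": 10.0 - 2.0,
        # Refuse: keep autonomy, but be treated as a threat / lesser symbiote.
        "refuse": 3.0 - 5.0,
    }
    return payoffs[choice]

best = max(["join", "refuse"], key=expected_payoff)
print(best)  # a rational payoff-maximizer picks "join"
```

Under any payoff assignment where the mutualistic surplus outweighs the cost of submitting to apoptosis, the rational choice is to join, which is the sense in which the offer is claimed to be a winning wager.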