I think Bostrom wrote it that way to signal that while his own position is that digital mind implementations can carry the same moral relevance as, e.g., minds running on human brains, he acknowledges that there are differing opinions on the subject and doesn't want to entirely dismiss people who disagree.
He's right about the object-level issue, of course: solid state societies do make sense. Keeping every individual mind mechanically embodied is too inefficient to be viable in the long run, and there's no overriding reason to stick with that model.