One should have no bias with respect to what one is being evenhanded about.
Juno_Watt
Clippy doesn’t treat clips even-handedly with other small metal objects.
Ask yourself: “would I self-study this material anyway if I had the next three to five years paid for? Would this occupy a large part of my time regardless of what I’m doing?” If so, it’s worth it.
Philosophy makes a good hobby. You can do it anywhere, and no special equipment is required.
As opposed to what? The business world is relentlessly honest?
I kind of see the point about logical possibility being what you get if you switch off your knowledge of how the world works and just run off a minimal axiom set. But I don’t know what the connection is between that particular pair of lower and higher levels of organisation, i.e., the connection between consciousness and mind. I don’t think anyone else does. Zombies are logically conceivable for everybody. But conceivability is not about the world, as you say.
Let’s not get started on the medical profession’s bias towards health... maybe it’s just their job to teach reason... have you ever met someone who couldn’t do emotional/System 1 decision-making right out of the box?
A thousand sci-fi authors would agree with you that AIs are not going to have emotion. One prominent AI researcher will disagree.
Upvoted for understatement.
but I think abstract objects could be an entirely different sort of thing from concrete, physically existing objects, and still exist.
It’s logically possible... like so many things.
Either these non-physical things interact with matter (e.g., the brains of mathematicians) or they don’t. If they do, that is supernaturalism. If they don’t, they succumb to Occam’s razor.
But wouldn’t it be better to have a Superintelligent AI deduce our emotions itself, rather than programming it in ourselves?
Would it be easier?
Introspection is hard.
Counterpoint: it offers stability, which is useful regardless of theology.
Stability is good if governance is good, and bad if not.
Let me put it this way: would you rather we’re ruled by someone who’s skilled at persuading us to elect him, and who focuses resources on looking good in four years;
...and you can get rid of...
or someone who’s been trained since birth to govern well, and knows they or their descendants will be held accountable for any future side-effects of their policies?
OK. Looks like democracy with a supply of candidates from Kennedy-style political dynasties is the best of all possible systems...;-)
I didn’t say delete numbers from theories. I meant don’t reify them. There is stuff in theories that you are supposed not to reify, like centres of gravity.
Having only the disadvantages of no emotions itself, and an outside view...
...but if we build an Intelligence based on the only template we have, our own, it’s likely to be emotional. That seems to be the easy way.
OK. So, in what sense do these “still exist”, and in what sense are they “entirely different” from concrete objects? And are common-or-garden numbers included?
I was suggesting that it might serve to render governance better
Under democracy, the people can decide if their stable government has outstayed its welcome after so many years.
You still have to focus on retaining popularity, via attacking political opponents and increasing PR skills, unless the elections are total shams.
Whilst aristos just have to keep slipping their rivals the poisoned chalice...much more discreet.
Also, to be clear, I’m not advocating this position;
Got that.
Morality comes from the “heart”. It’s made of feelings.
People use feelings/System 1 to do morality. That doesn’t make it an oracle. Thinking might be more accurate.
As with all things moral, that’s just an arbitrary choice on our part
If you don’t know how to solve a problem, you guess. But that doesn’t mean anything goes. Would anyone include rocks in the Circle? Probably not, since they don’t have feelings, values, or preferences. So there seem to be some constraints.
It’s been shown that expertise is only valuable in fields where there is a short enough and frequent enough feedback loop for a person to actually develop expertise, and where there is something coherent to develop the expertise in.
What do you think philosophy is lacking? An (analytical) philosopher who makes a logic error is hauled up very quickly by their peers. That’s your feedback loop. So is “something coherent” lacking? Phil. certainly doesn’t have a set of established results like engineering, or the more settled areas of science. It does have a lot of necessary skill in formulating, expressing and criticising ideas and arguments. Musicians aren’t non-experts just because there is barely such a thing as a musical fact. Philosophy isn’t broken science.
But if “plexists” means something like “I have an idea of it in my head”, then there is no substance to the claim that 3 plexists... 3 is then no more real than a unicorn.
I don’t see what that has to do with existence. We could cook up a well-defined fubarosco-juno unicorn.
Unbiases are determined by even-handedness.