Report from Humanity+ UK 2010

“Theosophists have guessed at the awesome grandeur of the cosmic cycle wherein our world and human race form transient incidents. They have hinted at strange survival in terms which would freeze the blood if not masked by a bland optimism.”

– H.P. Lovecraft on transhumanism



Just thought I’d write a quick post to sum up the H+ UK conference and the subsequent LW meetup attended by me, ciphergoth, JulianMorrison, Leon and a few other LW lurkers. My thanks to David Wood for organizing the conference, and to Anders Sandberg for putting me up/​putting up with me the night before.

I made a poster giving a quick introduction to “Cognitive Bias and Futurism”, which I will put up on my website shortly. The LW crowd met up as advertised – we discussed the potential value of spreading the rationality message to the H+ community.1

One idea was for someone (possibly me) to do a talk at UKH+ on “Rationality and Futurism”, and to get the UK transhumanist crowd more involved and on board. The NYC Less Wrong guys seem to be doing remarkably well with a meetup group, about a billion members, a group house (?) – do you have any advice for us?

The talks were interesting and provocative – of particular note were:

  • Aubrey de Grey’s talk, which was far from his usual fare. He tailored it to suit the H+ audience, who are already familiar with SENS, and spoke about the pace of recent advances in induced pluripotency in stem cells, progress in migrating mitochondrial DNA from the mitochondria to the cell nucleus, and how the mainstream media can blow a minor paper with a sexy title up to undeserved levels of fame whilst passing over much more important but “technical”-sounding work. I asked what probability he assigned to his SENS program working, but he did not give an estimate. I think that rationalists could potentially help the life-extension movement by publishing an independent, critical review of the probability of the SENS program succeeding, perhaps in an economics journal.

  • Nick Bostrom added the definition of “capability potential” and “axiological potential” to his usual existential risk meme.

  • Other talks were interesting, but showed a lack of appreciation of rationalist methods. For example, though many talks made predictions about the future, only Bostrom’s gave any probabilities. Other talks used version 1.0 epistemology – phrases like “I think that X will happen, not Y”, rather than “I assign more probability (k%) to X than most people do”. Max More was particularly guilty of this in his talk on “Singularity Skepticism” – though only because his talk attempted to answer a much harder question than most of the other talks (e.g. those on transhumanist art, architectural style, etc.).



1: Particularly after hearing a man say that he wouldn’t sign up for cryonics because it “might not work”. We asked him for his probability estimate that it would work (20%), and then asked what probability estimate he would need before it would be worth paying for (40%) – a threshold which he then admitted he had made up on the spot as “an arbitrary number”. Oh, and seeing a poster claiming to have solved the problem of defining an objective morality, which I may or may not upload.
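The point of the footnote’s cross-examination is that a “worth it” threshold should fall out of an expected-value comparison, not be picked on the spot. A minimal sketch of that reasoning, with entirely illustrative cost and value figures that I am assuming for the example (none of these numbers came up in the conversation):

```python
def break_even_probability(cost: float, value_if_works: float) -> float:
    """Minimum success probability at which paying `cost` for an outcome
    worth `value_if_works` (in the same units) has non-negative expected value."""
    return cost / value_if_works

# Illustrative assumptions: signing up costs 30,000 units, and the person
# values a successful revival at 150,000 units.
threshold = break_even_probability(30_000, 150_000)
print(threshold)  # break-even at a 20% success probability under these numbers

# His stated 40% threshold would only be consistent with valuing revival
# at 2.5x the cost -- a constraint he could be asked to endorse or revise.
```

The useful move is running the argument in reverse: given a stated threshold, back out the implied valuation and check whether the person actually endorses it.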