Could you be Prof Nick Bostrom’s sidekick?

If funding were available, the Centre for Effective Altruism would consider hiring someone to work closely with Prof Nick Bostrom, providing anything and everything he needs to be more productive. Bostrom is, of course, the Director of the Future of Humanity Institute at Oxford University and author of Superintelligence, the best guide yet to the possible risks posed by artificial intelligence.

Nobody has yet confirmed they will fund this role, but we are nevertheless interested in getting expressions of interest from suitable candidates.

The list of required characteristics is hefty, and the position would be a challenging one:

  • Willing to commit to the role for at least a year, and preferably several

  • Able to live and work in Oxford during this time

  • Conscientious and discreet

  • Trustworthy

  • Able to keep flexible hours (some days a lot of work, others not much)

  • Highly competent at almost everything in life (for example, organising travel, media appearances, choosing good products, and so on)

  • Will not screw up and look bad when dealing with external parties (e.g. media, event organisers, the university)

  • Has a good personality ‘fit’ with Bostrom

  • Willing to do some tasks that are not high-status

  • Willing to help Bostrom with both his professional and personal life (to free up his attention)

  • Can speak English well

  • Knowledge of rationality, philosophy and artificial intelligence would also be helpful, and would allow you to do more work as a research assistant.

The research Bostrom can do is unique; to my knowledge, no one else has made such significant strides in clarifying the biggest risks facing humanity as a whole. As a result, helping increase Bostrom’s output by, say, 20% would be a major contribution. This person’s work would also help the rest of the Future of Humanity Institute run smoothly.

The role would offer significant skill development in operations, some skill development in communications and research, and the chance to build extensive relationships with the people and organisations working on existential risks.

If you would like to know more, or be added to the list of potential candidates, please email me: robert [dot] wiblin [at] centreforeffectivealtruism [dot] org. Feel free to share this post around.

Note that we are also hiring for a bunch of other roles, with applications closing Friday the 12th December.