This can be addressed by peer-to-peer tech and federation. PeerTube uses a few techniques that make it more tenable: hosting on the site itself, site-to-site sharing (ActivityPub), and BitTorrent to handle heavy demand. The BitTorrent part is on by default, and there are already more than a few instances people can share on.
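To make the "self-hosting plus federation plus torrents" point concrete, here is a minimal Python sketch of what a client sees: it asks a PeerTube instance for its recent videos over the public REST API and prints, for each one, both the plain HTTPS file URL and the magnet link that lets viewers seed to each other. The instance URL is made up, and field names such as magnetUri and fileUrl can differ between PeerTube versions, so treat this as an illustration rather than a reference.

```python
# Minimal sketch: list recent videos from a PeerTube instance and show how each
# one can be fetched either over plain HTTPS or via BitTorrent (magnet link).
# Assumes the instance exposes PeerTube's public REST API at /api/v1/videos;
# the instance URL is hypothetical, and field names may vary by version.
import json
import urllib.request

INSTANCE = "https://peertube.example.org"  # hypothetical instance

def list_videos(count: int = 5) -> list[dict]:
    """Fetch the most recently published public videos from the instance."""
    url = f"{INSTANCE}/api/v1/videos?count={count}&sort=-publishedAt"
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)["data"]

def video_sources(video_id: int) -> list[dict]:
    """Return the available sources: HTTPS URL and, if present, a magnet URI."""
    url = f"{INSTANCE}/api/v1/videos/{video_id}"
    with urllib.request.urlopen(url) as resp:
        detail = json.load(resp)
    return [
        {
            "resolution": f.get("resolution", {}).get("label"),
            "https": f.get("fileUrl"),
            "magnet": f.get("magnetUri"),  # lets viewers seed to each other
        }
        for f in detail.get("files", [])
    ]

if __name__ == "__main__":
    for video in list_videos():
        print(video["name"])
        for source in video_sources(video["id"]):
            print("   ", source)
```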
verbalshadow (Karma: −17)
Unaligned AI is coming regardless.
verbalshadow · Jul 26, 2024, 4:41 PM · −15 points · 3 comments · 2 min read · LW link
verbalshadow · Dec 27, 2018, 3:56 AM · 0 points
in reply to: TreyBoudreau’s comment on: Why Don’t Creators Switch to their Own Platforms?
Yes. That is one of the things in possibility space. I don’t think unaligned means safe. We work with unaligned people all the time, and some of them aren’t safe either.
The main thing I was hoping people would take from this is that an unaligned AI is a near-certainty. Alignment isn't the one-and-done goal so many people treat it as. Even if you successfully align an AI, all it takes is one failure to align and the genie is out of the bottle. A single point of failure becomes a cascading failure.
So let's imagine an ASI that works on improving itself. How does it ensure the alignment of an intelligence greater than itself?
With hundreds, maybe thousands of people working to create AI, someone will fail to align.
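To put rough numbers on that: treat each project as an independent shot at alignment. Even with a very optimistic 99% per-project success rate (a placeholder, not an estimate), the odds that nobody fails collapse as the number of projects grows:

```python
# Illustration only: if each of n independent AI projects gets alignment right
# with probability p, then all of them succeed with probability p**n, so the
# chance of at least one unaligned AI is 1 - p**n.
# p = 0.99 is an arbitrary, optimistic placeholder, not a real estimate.
p = 0.99
for n in (10, 100, 1000):
    print(f"{n:4d} projects -> P(at least one unaligned AI) = {1 - p**n:.3f}")
# prints roughly 0.096, 0.634, and 1.000
```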
The future is unaligned.
Are we taking that seriously? Working on alignment is great, but an aligned future is not the one we should be prepping for. Do you have a plan? I don't yet, but I'm thinking about a world where intelligences greater than me abound (already true) and we don't share the same interests (also already true).