There are extensions like AdNauseam which try to poison your data trail, though it's dubious whether they help much. You could run some kind of crawler that pretends to be a hundred normal users, so your real activity gets lost in the noise. But even that could probably be filtered out if someone really wanted to: it's hard to accurately simulate a human (I dimly recall reading an article about exactly this?). Maybe something that records other people's sessions and plays them back? Or an LLM doing the browsing (hehe)? But none of that helps while you're logged in to various services, and I'm guessing most people don't log out of Gmail every time they switch tabs?
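The "100 normal users" idea could be sketched roughly like this. Everything here is made up for illustration (the decoy site list, session lengths, dwell times); a real noise crawler would actually fetch pages, and as noted above, would still probably be distinguishable from a human.

```python
import random

# Hypothetical decoy sites the noise crawler would visit.
DECOY_SITES = [
    "https://news.example.com",
    "https://recipes.example.org",
    "https://sports.example.net",
    "https://weather.example.io",
]

def fake_session(rng, min_pages=3, max_pages=10):
    """Plan one fake browsing session: a list of (url, dwell_seconds) pairs."""
    n_pages = rng.randint(min_pages, max_pages)
    return [
        (rng.choice(DECOY_SITES), round(rng.uniform(5.0, 120.0), 1))
        for _ in range(n_pages)
    ]

def noise_plan(n_users=100, seed=0):
    """Plan sessions for n_users pretend users, diluting the real one."""
    rng = random.Random(seed)
    return [fake_session(rng) for _ in range(n_users)]
```

Even this toy version hints at the detection problem: uniform dwell times and a small fixed site pool are exactly the kind of statistical regularity a motivated analyst could filter out.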
One source of hope is that data gets stale quickly. People can change their minds (even if they mostly don't), so knowing what I thought a year ago doesn't mean you know what I think now. Then again, most people don't care either way, and it would be pretty simple to flag the small number of outliers who suddenly go dark. One possible way of cleaning up would be to spend a couple of months posting increasingly radical weirdness (e.g. going all in on flat-earthism) before going private, to poison any later analysis. This is hard, though, and requires passing an ITT (ideological Turing test).
Tor + clearing cookies + logging out of every service after use + separate user profiles goes a long way. But it's very inconvenient.
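The "separate user profiles" step can be approximated with throwaway browser profiles, one per task. A minimal sketch, assuming Chromium (the launch line is commented out since it needs a display):

```shell
# Per-task browser isolation: each task gets a fresh, empty profile,
# so cookies and logins never mix between contexts.
profile_dir="$(mktemp -d)"
# chromium --user-data-dir="$profile_dir" "https://example.com"  # launch isolated
# rm -rf "$profile_dir"  # wipe everything when the task is done
echo "$profile_dir"
```

Which is exactly the inconvenience being complained about: you pay for isolation by re-logging-in everywhere, every time.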
This sounds like there's a niche for technical writers who format/edit AF posts into proper ML articles, for credit, points, and glory.