There’s something rather massive that you’re missing here: if you have enough freedom to create personalized environments, then the bigger issue is what some fraction of people will choose to do with that kind of power, since this is far more than you can get from an experience machine.
The best version of this I can think of entails everyone getting a personalized AGI able to act as a perfect DM for simulated adventures, since you need someone to act out the roles of the NPCs, particularly the villains. This way people can play out adventures where they get to be the hero and maximally fulfill all their psychological instincts for purpose, in addition to all the lower-level Maslow stuff. Notably, though, people care about being admired by other real people. The AI would address this by making new people: using its superintelligence to predict in advance which characters you will befriend, and creating them as real digital minds who don’t necessarily know they’re in a simulation until you complete the adventure together. The AI would ensure these new digital minds are created with preferences such that they will never feel resentment once they find out the nature of their existence, and will be glad to have been created the way they were. They’ll even be glad they didn’t know it was a simulation at the time, since they got to believe they were, say, saving the world alongside their best friends!
There’s also a point to be made about how the sheer prevalence of escapist fantasy speaks to people’s desire for something exactly like this: people clearly have an unmet desire to feel like a hero whom others respect, going on amazing adventures with a group of close friends.
Of course, even the worse version of this arguably wouldn’t be that bad on net. Yes, some people would create simulated hells where they acted as petty tyrants, but for every one of those simulated hells there’d be a massive number of utopian simulations. Though if you’re much more cynical about how sadistic the median person is, then you might expect this scenario to be far worse on net.
It seems like you’re conflating forcing something on somebody with making somebody aware of an option. It seems rather implausible that, if people were all aware they could simply choose not to experience ennui after 500 years, they would decline to alter themselves this way when there’s no real downside.
As the original commenter here pointed out, given how one-sided this seems to be, it seems strange that humans would have converged on this bizarre deathist culture unless it was engineered that way by the minds on purpose, for reasons that are difficult to conceive of as anything but bad.