Video quality is mainly not determined by resolution
It appears that many people’s world model about digital video quality is something along the lines of “lower-resolution video looks worse because it is lower-resolution, and lower-resolution video files have a lower bitrate (a smaller filesize relative to video length) because they are lower-resolution”. This is neither accurate nor useful, and holding this belief might, for example, lead you to overspend on high-resolution displays that are not as necessary for an enjoyable video watching experience as the marketing would have you believe, hold you back from getting the full potential out of the video watching setup you already have, or, if you upload videos yourself, keep you from publishing your videos at high quality.
The accurate belief is complicated, as usual, but in short, digital video quality is mainly determined by bitrate, encoder, and encoding settings. The differences between encoders and encoding settings are outside the scope of this post, but bitrate is simple: everything else being equal, higher bitrate is always better. In fact, bitrate is so important that very often, higher bitrate is outright better even when everything else is not equal, although you do hit diminishing returns after a point (and then the other factors, such as resolution, become more relevant).
When you are watching YouTube in 360p, it looks bad mainly just because it has a very low bitrate; what you are perceiving as flaws in the quality are compression artifacts from a low bitrate, and the effect of the resolution being low is negligible in comparison. Incidentally, a hypothetical 1080p version of the video encoded at the same bitrate the actual 360p version is using would look even worse — YouTube is scaling the video down because they’ve got to work with a predetermined bitrate so that people with slow connections can watch the videos at all, and the compression algorithm has an easier time making the video look acceptable when there are fewer pixels to try to fit into that same tiny bitrate.
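To put rough numbers on that intuition, consider the bits-per-pixel budget at a fixed bitrate. The figures below are made up purely for illustration (they are not YouTube’s actual settings):

```python
# Back-of-the-envelope bits-per-pixel arithmetic at a fixed bitrate.
# 300 kb/s and 30 fps are illustrative placeholder numbers.
BITRATE = 300_000   # bits per second
FPS = 30

for name, width, height in [("360p", 640, 360), ("1080p", 1920, 1080)]:
    pixels_per_second = width * height * FPS
    print(f"{name}: {BITRATE / pixels_per_second:.4f} bits per pixel")

# 360p:  0.0434 bits per pixel
# 1080p: 0.0048 bits per pixel
```

At the same bitrate, the 1080p encode has nine times as many pixels competing for the same bits, which is why it ends up looking worse rather than better.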
Resolution does still make a difference, especially below 360p and/or for content with text, user interfaces, long still shots or animation. But although the difference is not insignificant, you might be surprised by just how little resolution a video needs to look pretty decent, and by how soon the returns diminish as you go higher. To demonstrate exactly what difference it makes, here is a comparison of a short clip from my band’s video at the original 1080p quality, alongside absurdly high-bitrate (beyond visually lossless by a safe margin) re-encodes at 144p, 240p, 360p, 480p and 720p. Effectively, all factors besides the resolution itself have been eliminated from this comparison, so it is representative of what the resolution alone is doing to the video quality.
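For reference, re-encodes like these can be produced with ffmpeg; the sketch below shows one way to do it, assuming ffmpeg with libx264 is installed. The filenames and the exact CRF value are illustrative placeholders rather than the precise settings used for the comparison.

```python
# Downscale a 1080p clip to each target height at a very low CRF, so that
# bitrate is effectively unlimited and resolution is the only variable.
# Assumes ffmpeg with libx264 on the PATH; filenames are placeholders.
import subprocess

SOURCE = "clip_1080p.mkv"
HEIGHTS = [144, 240, 360, 480, 720]

for height in HEIGHTS:
    subprocess.run(
        [
            "ffmpeg", "-y", "-i", SOURCE,
            "-vf", f"scale=-2:{height}",   # keep aspect ratio, force an even width
            "-c:v", "libx264",
            "-preset", "slow",
            "-crf", "10",                  # very low CRF, i.e. very high bitrate
            "-c:a", "copy",
            f"clip_{height}p.mkv",
        ],
        check=True,
    )
```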
In the subfolder, there is a comparison of the 1080p video encoded at different bitrates, to demonstrate what compression artifacts look like when that is the only factor changing (CRF 15 being practically indistinguishable from the original and CRF 51 completely unwatchable; a higher CRF value means a lower bitrate and lower quality). The values are pretty arbitrary, but 15 is what I personally consider as good as lossless for most intents and purposes, 22 is roughly comparable to YouTube’s 1080p quality, 25 is the kind of quality you might see somewhere worse than YouTube, and the rest are worse than that.
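The bitrate comparison works the same way, except the resolution stays at 1080p and only the CRF value changes; again, this is an illustrative sketch rather than the exact commands used, and only the CRF values mentioned above are listed.

```python
# Encode the same 1080p clip at several CRF values, keeping everything
# else fixed, to see pure compression artifacts. Assumes ffmpeg with libx264.
import subprocess

SOURCE = "clip_1080p.mkv"
CRF_VALUES = [15, 22, 25, 51]

for crf in CRF_VALUES:
    subprocess.run(
        [
            "ffmpeg", "-y", "-i", SOURCE,
            "-c:v", "libx264",
            "-preset", "slow",
            "-crf", str(crf),
            "-c:a", "copy",
            f"clip_1080p_crf{crf}.mkv",
        ],
        check=True,
    )
```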
My opinion, with my setup, is that high-bitrate 720p looks absolutely fine, 480p looks pretty much fine with just a hint of a problem that probably wouldn’t bother me if I wasn’t looking for it, and even 240p would be watchable in a pinch. Considering that I have been happily watching DVDs, which are 480p and extremely high-bitrate, I pretty much expected the 480p video to look as good as it does, but even I was surprised by how good the 240p and 144p versions still look. Meanwhile, it’s easier to notice the drop in quality between CRF 15 and CRF 22 than between 1080p and 720p, and it quickly goes downhill from there, although the subtitles hold up much better than they do when the resolution is decreased.
A few things you can do with this knowledge to improve your life:
Start watching 4K videos on streaming services when possible, even if you don’t have a 4K screen. You won’t benefit from the increased resolution since your device will downscale it back to your screen’s resolution, but you will benefit from the increased bitrate that the 4K video probably secretly has.
Keep watching DVDs, they’re fine. No need to be put off by the fact that they’re “only” 480p.
Streaming services are convenient compared to physical media, but if you care about quality, the improvement you get by buying physical media instead (or even pirating well-encoded Blu-ray Disc rips) is quite substantial.
4K TVs and projectors are more expensive than their Full HD counterparts. I am not definitively telling you that you shouldn’t buy them, but if you’re on the fence about spending that extra money, you should probably feel pretty confident that you can watch movies perfectly enjoyably in Full HD even on a large screen.
If you do any kind of content creation involving video, it’s worth learning how to encode. Just because a video is in 4K or 1080p doesn’t mean it’s automatically good.
This might technically be true, and yet my experience has been consistently that higher resolution is always better. Perhaps this is just because, in the real world, higher resolution automatically implies a higher filesize (bitrate)?
(My actual strategy is “get the largest file you can at 4K”, which seems to work pretty well.)
It is indeed often the case, but there are real-life scenarios where this is false (e.g. 480p DVDs probably look better than YouTube 720p because the latter has a much lower bitrate) and the causal relationship arguably exists in the opposite direction: less bitrate is required to make lower-resolution videos look decent, so lower resolutions tend to be preferred when people need to cut down on bitrate.
One situation where resolution matters is when the subtitles are hardcoded as a part of the image. At 720p they are nicely legible; at lower resolutions they are not.
I’m not sure if anyone still does this, but there was also a funny point early in the history of 4K streaming when people would encode 4K video at the same bitrate as 1080p, so they could technically advertise that their video was 4K, but it was completely pointless since it didn’t actually have any more detail than the 1080p video.
This is impossible. Assuming the content is originally 1080p, 360p is merely a type of lossy compression of the 1080p. The decompression algorithm is “upscale the 360p to 1080p” and the end result is an approximation of the original 1080p video that isn’t quite correct because information was thrown out.
The 1080p video at the same bitrate is also a lossy compression of the original 1080p, and the end result of decoding it will be an approximation of the original 1080p video that isn’t quite correct, because the exact same amount of information was thrown out.
That said, an ideal video encoding algorithm would always do better with a 1080p video because it has more options[1], but it’s not clear to me that actually-existing encoders meet this ideal.
If the optimal way to encode a video is to downscale it to 360p, an optimal 1080p encoder can downscale to 360p. If the optimal way to encode the video is to use information that’s not visible in 360p, the 1080p encoder can use it, but a 360p encoder can’t.
I mean, the ideal settings depend on content, use case and even personal preference when it comes to the details, so it’s probably not fundamentally possible to design an ideal video encoding algorithm that automatically produces the best possible video quality at the desired filesize without the need to manually finetune the settings. As long as we are manually finetuning the settings, the output resolution is one of the settings we can change, so in that sense, all the algorithms already implement that feature.