Took ya long enough
Flat panel displays that finally make it feel like we lost (almost) nothing by abandoning flat CRTs. Now everything is expensive and everyone is poorer. Good job.
People can't/The human eye won't/not/no/never
I'm tired of armchair experts declaring what the human eye can and can't see. So I'm going to join them. I'm sure there are limits to what can be perceived with a higher refresh monitor; I'm too dumb to measure them. Considering what refresh rates I see on monitors and TVs now, I can be sure the 'Bah! Humans can't see more than 60fps!' people are mostly gone. '24 was good enough for film!' is probably still around in some form in games, but I'd like to remind that person that while the film was 24/23.976, the projector was probably displaying it at 72Hz or more, since each frame is shown multiple times. That sync was (pretty much) perfect.

Sometimes the issue isn't that the framerate is low, it's that it's inconsistent, timed/paced badly, or linked to something else in the engine that makes the game worse to play. For games, I'm sure a brand new, really really pretty menu-based JRPG that ran at N64 levels of 20fps would be perfectly fine if every frame was paced and synced well. Go look at the original PlayStation Final Fantasy VII battles: menus at a full 60, camera movements at 30, animations locked to 15. All fit into each other well. On the movie side of things, shutter speed and sensor readout matter a lot to what makes footage look less 'video' and more film. It's all technically video now though...
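If you want the cadence thing spelled out: 24 divides evenly into 72, so a 72Hz projector shows each film frame exactly three times. It doesn't divide into 60, which is where 3:2 pulldown judder comes from. A tiny Python sketch, just to illustrate:

```python
# How many display refreshes does each film frame stay on screen?
def repeat_counts(fps, refresh, frames=8):
    """Count the refreshes each source frame occupies."""
    edges = [(i * refresh) // fps for i in range(frames + 1)]
    return [b - a for a, b in zip(edges, edges[1:])]

print(repeat_counts(24, 72))  # [3, 3, 3, 3, 3, 3, 3, 3] -> perfectly even cadence
print(repeat_counts(24, 60))  # [2, 3, 2, 3, 2, 3, 2, 3] -> uneven 3:2 pulldown
```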
When a 4K/UHD Blu-ray looks better than a regular Blu-ray, it's probably more for video codec and color sampling technicalities than for the resolution itself. The color information in video is not stored at the full resolution (chroma subsampling). But because there still is a boost in color resolution, upscaling something that was shot around cinema 2K probably could look better on a UHD Blu-ray: more color information is preserved than on Blu-ray, and well above a streaming service. That's why people want Speed Racer on a UHD disc even though it's well known it was shot digitally at a lower resolution. It was still shot with a lot of color information. I can't fully speak to whether HDR is more than a fancy thing to sell displays; the only HDR display I have that actually seemed to render things as advertised is a tiny iPhone 13 mini.
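To put rough numbers on the chroma subsampling thing: 4:2:0 video stores color at half resolution in each direction. A quick Python sketch (plane sizes only, ignoring bit depth and codecs):

```python
# Chroma (color) plane sizes for common subsampling schemes.
def chroma_plane(width, height, subsampling="4:2:0"):
    if subsampling == "4:2:0":   # Blu-ray and UHD Blu-ray both use this
        return width // 2, height // 2
    if subsampling == "4:2:2":
        return width // 2, height
    return width, height         # 4:4:4, full-resolution color

print(chroma_plane(1920, 1080))  # (960, 540)   -> Blu-ray's color detail
print(chroma_plane(3840, 2160))  # (1920, 1080) -> UHD carries a full HD
                                 # frame's worth of color, even for a film
                                 # upscaled from a ~2K master
```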
Displays in motion...
As Blur Busters, Digital Foundry, and others have pointed out, the type of display affects how motion is perceived. Honestly, I wonder if part of the reason we went all the way to 120/144/165Hz and higher displays isn't just because the numbers could be multiplied up or hit marketing targets, but because most of these displays were LCD. (Pretty sure 144 exists as a holdover from the 3DTV era chasing 72Hz in each eye for movies. If you saw a digital 3D movie, especially a Disney one, it was projected at 144Hz.) Or rather, 'TN' and 'IPS' types of displays. These displays basically 'smear' into the next frame. CRTs, projected film, and (basically) OLED are able to present an actual black frame between frames. Though, I think in the case of OLED, miniLED, (plasma?) and a few bleeding edge display types, they primarily just cleanly display new frames. That blank frame wasn't just a side effect of the display tech; it helped us perceive motion a certain way. High refresh on TN and IPS panels is basically a hack to get better motion by making the pixels smear into each frame faster.
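Blur Busters' rule of thumb makes the smear concrete: on a sample-and-hold panel, perceived motion blur is roughly how far your eye tracks while a frame stays lit. A rough Python sketch, with a made-up but plausible scroll speed:

```python
# Rough sample-and-hold motion blur: eye-tracking distance per frame.
# blur (px) ~= scroll speed (px/s) * frame persistence (s)
speed = 960  # px/s, e.g. something crossing half a 1920px screen per second

for name, persistence_ms in [
    ("60Hz LCD (full persistence)", 1000 / 60),
    ("144Hz LCD (full persistence)", 1000 / 144),
    ("CRT / strobed backlight (~1ms flash)", 1.0),
]:
    blur_px = speed * persistence_ms / 1000
    print(f"{name}: ~{blur_px:.1f} px of smear")
# 60Hz: ~16 px, 144Hz: ~6.7 px, CRT: ~1 px -- why low persistence reads clean
```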
My first 1080p monitor didn't have HDMI on it. It worked well enough for a time. However, it took about seven years to find a monitor at the same price point that looked noticeably better next to it. That better monitor still had similar issues, just slightly fewer of them. And better colors. Way better. Motion was still blurry. It actually got hard to read certain animations in games. In fighting games, it could be hard to tell what an opponent was doing: if two attacks looked similar except for a few cues, they were difficult to read because in fast motion the character was a blur. Overdrive and other motion fixes got added and improved over time, but the trade-off was usually visual artifacts.
CRT hipster
The entire time I had those blurry monitors I was getting into FPS games. Eventually Team Fortress 2 became my primary game. Luckily for me it was made at the tail end of 4:3 monitors being common, so I could output a picture to a CRT. And that sure saved my butt online. Nothing was a blurry mess. The only technical hurdles getting in my way were poor CPU optimization and network issues. Thankfully I had a good connection to a few servers with maps I liked. Not only that, at a certain point I figured out how to get 1024x768 at 120Hz working. Thanks to old tech, I experienced my first 120Hz monitor. I played back films at 72Hz and 120Hz on it, with action scenes more readable than on my LCD in some cases. It was a late-model monitor meant for computer labs that needed a good picture for graphic design, so the colors were pretty good. It may have even been calibrated at one point.

I played Team Fortress 2 for over 870 hours. Most of those hours were behind a CRT. At 120Hz. In 4:3. At the time nobody had to sell me on a CRT being a fun, nicer thing to try. I already knew we had abandoned quality for space saving. I had already figured out we lost something in chasing the flat panel. By the time I graduated I had figured out that part of that flat panel transition was marketing. Hype. New and shiny. Don't worry, I moved several times and was swiftly reminded why we wanted lighter, smaller displays. No, I do get it. Ow. It took me until getting a slightly old budget BenQ 3D Vision monitor on the cheap at a tournament to make the PC CRT monitor a sometimes thing.
Old stuff and resolution
Ok, I'm not going to say too much when RetroRGB exists. But please trust me when I say that old game consoles are helped by good HDMI converter boxes like the RetroTINKs or the OSSC. They still don't completely replace a CRT TV, but that's more about analog video oddities than anything else. Simply put, if you want to try retro gaming accurately, even just throwing a PS2 or older console into one over composite or S-Video is worth it and easy. Finding an old CRT is going to get harder, so I do get it if it's not worth chasing. It's for fewer and fewer people every year, for a variety of reasons including the financialization of and speculation on collectibles.
A bunch of the above applies, but something to remember about even standard definition TVs and computer monitors is that they do not scale. They are not fixed-resolution displays. Video may have come in a few set resolutions, but games did not. Computers did not. That CRT I played TF2 on could display Melty Blood in 640x480 at 60Hz and my desktop for checking graphic design colors at 1600x1200. I watched some movies at 1280x960 at 72Hz. A few times at higher resolutions, at 72Hz. No motion blur. The way the phosphors and the display work even kept aliasing artifacts from being too obvious, so playing games at lower resolutions to get more performance or fancier effects worked.
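That flexibility is because a CRT has no fixed pixel grid; the practical limit is its maximum horizontal scan rate, not a native resolution. A back-of-envelope Python sketch (the scan limit and blanking overhead here are hypothetical, not my actual monitor's specs):

```python
# Which resolution/refresh combos fit under a CRT's horizontal scan limit?
# Numbers below are illustrative; real modelines add their own blanking.
MAX_HFREQ_KHZ = 115      # hypothetical late-model CRT horizontal scan limit
V_BLANK_OVERHEAD = 1.05  # ~5% extra scanlines for vertical blanking

for w, h, hz in [(640, 480, 60), (1024, 768, 120), (1280, 960, 72), (1600, 1200, 85)]:
    hfreq_khz = h * V_BLANK_OVERHEAD * hz / 1000  # scanlines swept per second
    ok = "fits" if hfreq_khz <= MAX_HFREQ_KHZ else "out of range"
    print(f"{w}x{h}@{hz}Hz needs ~{hfreq_khz:.0f} kHz horizontal -> {ok}")
```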
Today-ish
I could replace all my flat panels with one 4K 120Hz OLED (or some other newer display) with variable refresh rate. Variable refresh rate is the thing that makes it easier for me to give up on older displays. Getting my first G-Sync display solved a problem I hinted at earlier: syncing frames to the refresh rate. Games don't always run well. Some even render at specific frame rates internally and then buffer, which also means a delay, to sync to the display. Variable refresh lets the game or software dictate when frames happen, so it mostly solves weird visual judder, reduces input lag in some games, and lets games run at whatever framerate. I can play an emulated old game at its original framerate easily. My fighting games don't need to run at the exact 60Hz my monitor expects. FFXIV doesn't need to hold the exact same framerate all the time. Lots of stuff just looks smoother because a different problem got fixed.
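Here's the judder fix in miniature: a game rendering a steady 50fps on a fixed 60Hz display has frames land unevenly on refresh ticks, while VRR just fires the moment each frame is ready. A little Python sketch of the timing:

```python
# Frame delivery on a fixed 60Hz display vs. variable refresh,
# for a game that renders a steady 50fps (20ms per frame).
import math

VSYNC = 1000 / 60                      # fixed display flips only on these ticks (ms)
ready = [i * 20.0 for i in range(10)]  # ms at which each frame finishes rendering

fixed = [math.ceil(t / VSYNC) * VSYNC for t in ready]  # wait for the next tick
vrr = ready                            # VRR: refresh happens when the frame is ready

print("vsync intervals:", [round(b - a, 1) for a, b in zip(fixed, fixed[1:])])
print("VRR intervals:  ", [round(b - a, 1) for a, b in zip(vrr, vrr[1:])])
# vsync: a mix of 16.7ms and 33.3ms frames -> visible judder at a steady 50fps
# VRR:   every frame lands exactly 20ms apart -> smooth
```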
"4k" resolution won't sell for everyone but I do love pixel peeping some things and playing games with particle effects. Though, as pixels get denser the necessicty of anti-aliasing lessens. Jagged edges and weird one pixel artifacts either cause subtle visual patterns or disappear.
I'm poorer than ever, and the new tech that finally fixes most of the things that made me keep a CRT around is here and expensive. It also feels late. I don't know how much of that is companies planning around some weird financial incentive that keeps them releasing a new product every year instead of making fully featured devices every once in a while. It feels like every industry relies on waste and production numbers to feed a machine that says they can still pay people to make things.