I'm not sure if I have the patience to watch the entire video - did he say whether he's using the anti-Newton glass carrier for his Coolscan? That makes an enormous difference in scanning resolution. I use an 8000, and when using the glass carrier at 4,000 dpi I can see grain pretty clearly defined when zooming in (though less so with some films like Velvia 50, but that's not an especially common stock for filmmakers anyway). The standard carriers for the Coolscan do a piss-poor job of keeping film flat, and the scanner has a really shallow depth of field - so non-flat film = unsharp scans (though still usually usable for a print or whatever; an 8x10 print at 300 dpi is only about 7 megapixels after all).
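The print math is just back-of-envelope arithmetic - width in inches times dpi, height times dpi, multiply. A quick sketch (the function name is mine, nothing scanner-specific):

```python
def print_megapixels(width_in, height_in, dpi):
    """Total pixels (in megapixels) needed for a print of the given size and dpi."""
    return (width_in * dpi) * (height_in * dpi) / 1e6

# 8x10" at 300 dpi -> 2400 x 3000 px = 7.2 MP,
# comfortably within reach of even a soft 4,000 dpi scan.
print(print_megapixels(8, 10, 300))
```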
Either way, it's just plain untrue that most Super 35 film has much more resolution than 4K. Once the grain structure is clearly visible at 100% (including both the bigger and smaller grains), there isn't much more detail left to extract.
The main reason most people consider 4K good enough now is that, unless you're sitting really close to a huge screen, we're well past the point of diminishing returns on resolution. The average human eye can't perceive any difference between 4K and 8K on a 50" screen viewed from around 2-3 meters - and even the perceived difference between 2K and 4K content at that distance isn't huge. Upgrade to a 75" TV and at that distance the sharp-eyed can perceive a difference between 4K and 8K... but most people won't notice it or care at all.
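You can sanity-check this with the usual visual-acuity figure: a 20/20 eye resolves roughly 1 arcminute, so once a single pixel subtends less than that, extra pixels are invisible. A rough sketch (screen size and distance from the text; the 1-arcminute acuity value and 16:9 aspect are standard assumptions, the function is mine):

```python
import math

ARCMIN = math.radians(1 / 60)  # 1 arcminute in radians

def pixel_arcmin(diag_in, h_pixels, distance_m, aspect=(16, 9)):
    """Angle (in arcminutes) that one pixel subtends at the viewer's eye."""
    w, h = aspect
    width_m = diag_in * 0.0254 * w / math.hypot(w, h)  # screen width in meters
    pitch_m = width_m / h_pixels                       # size of one pixel
    return math.atan2(pitch_m, distance_m) / ARCMIN

# 50" screen at 2.5 m: a 4K pixel already subtends well under 1 arcmin,
# so halving the pitch again with 8K can't add visible detail there.
print(pixel_arcmin(50, 3840, 2.5))  # 4K
print(pixel_arcmin(50, 7680, 2.5))  # 8K
```

By the same formula, the 75" case lands right around the acuity limit, which matches the "sharp-eyed viewers only" caveat.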
I can't speak for every market, but in my local one, many cinemas are still projecting most content at 2K. Even on a huge screen, people don't seem to mind when watching from 15 meters away.
Also - at 4K, lots of little flaws (make-up seams, etc.) become more visible, and VFX takes longer to render. Step up to 8K and those problems are compounded yet again.