> They did it with AMD's GPUs later down the line, pretending like Vulkan couldn't be implemented so they could promote Metal.
It was even worse than that: they just stopped updating OpenGL for years before either Vulkan or Metal existed at all. Taking a MacBook and using Boot Camp would instantly raise the GPU feature level by several generations just because Apple's GPU drivers were so fucking old & outdated.
What Geekbench 5 FPS are you talking about? Geekbench only has OpenCL and Vulkan scores for the 3090 as far as I can tell, and the M1 Ultra gets less than half the OpenCL score of the 3090. And the M1 Ultra was significantly more expensive.
Find or link these workloads you think exist, please
> The M1 was (well, is) a marvel and absolutely smokes a 3090 in perf per watt.
The GTX 1660 also smokes the 3090 in perf per watt. Being more efficient while being dramatically slower is not exactly an achievement; it's pretty typical power consumption scaling, in fact. Perf per watt is only meaningful if you're also able to match the perf itself. That's what actually made the M1 CPU notable. M-series GPUs (not just the M1, but even the latest) haven't managed to match or even come close to the perf, so being more efficient is not really any different than, say, Nvidia, AMD, or Intel mobile GPU offerings. Nice for laptops, insignificant otherwise.
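A minimal sketch of why that scaling is typical, assuming the classic dynamic-power model (P ≈ C·V²·f) and entirely made-up voltage/clock numbers:

```python
# Illustrative sketch of why slower parts "win" perf/watt: dynamic power
# scales roughly with C * V^2 * f, and higher clocks need higher voltage,
# so power grows superlinearly while performance grows only linearly.
# All numbers below are made up for illustration.

def perf_per_watt(freq_ghz: float, volts: float, capacitance: float = 1.0) -> float:
    perf = freq_ghz                             # performance ~ linear in clock
    power = capacitance * volts**2 * freq_ghz   # dynamic power ~ C * V^2 * f
    return perf / power

# A chip pushed to high clocks needs more voltage than one run conservatively.
high_clock = perf_per_watt(freq_ghz=2.0, volts=1.10)   # "3090-style" tuning
low_clock  = perf_per_watt(freq_ghz=1.2, volts=0.85)   # "1660-style" tuning

print(f"high-clock perf/W: {high_clock:.2f}")
print(f"low-clock  perf/W: {low_clock:.2f}")  # higher, despite being much slower
```

In this simple model frequency cancels out of perf/W entirely, so the efficiency gap is purely the extra voltage needed to sustain the higher clock, which is the "typical scaling" being described.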
Here you go[0]. 'Aztec Ruins offscreen'. Although I misremembered the exact FPS, the 3090 is at 506 FPS.
Also note how the M1 Ultra is pushing 2/3 of the FPS of the 3090 despite having 1/3 of the power budget, and despite the game itself being poorly optimized for the M-series architecture.
And here[1] you have it smoking an Intel i9 12900K + RTX 3090. The difference doesn't look too impressive until you realize the power envelope for that build is 700-800W.
Also, the GTX 1660 (technically an RTX 2000 series, but whatever) is about 26% less efficient than a 3090[2].
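To spell out that arithmetic (a rough sketch; the power figures are assumptions, not measurements):

```python
# Back-of-envelope perf/watt from the GFXBench numbers cited above.
# The 506 FPS figure is from the linked result; the power figures are
# rough assumptions (3090 board power ~350 W, M1 Ultra GPU ~115 W),
# not measurements.

fps_3090 = 506
fps_m1_ultra = 506 * 2 / 3          # "roughly 2/3 of the 3090's FPS"

watts_3090 = 350                    # assumed
watts_m1_ultra = 115                # assumed, ~1/3 of the 3090

print(f"3090:     {fps_3090 / watts_3090:.2f} FPS/W")
print(f"M1 Ultra: {fps_m1_ultra / watts_m1_ultra:.2f} FPS/W")
# -> the M1 Ultra comes out at roughly 2x the FPS/W under these assumptions
```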
> Being more efficient while being dramatically slower
That's my whole point and what you're refusing to see. The M1 is not dramatically slower than an i9 or 3090 despite having dramatically lower power use.
The proof for this will really start to come once Qualcomm and MediaTek have gotten a handle on their PC ARM chips and Valve decides they're good enough for a Steam Deck 2 or 3. You'll get to see 2-3x the battery life along with a modest performance increase.
> Here you go[0]. 'Aztec Ruins offscreen'. Although I misremembered the exact FPS, the 3090 is at 506 FPS.
Oh, GFXBench, not Geekbench.
Realistically that 506 FPS result is probably CPU bottlenecked, not that Aztec Ruins is all that relevant. It's a very old benchmark, released in 2018, that was designed for mobile GPUs, so it realistically uses a 2010-ish GPU feature set.
If that's your use case, great. But it's not significant at all.
> And here[1] you have it smoking an Intel i9 12900K + RTX 3090.
Not using the GPU, so irrelevant. Also not using 700-800W.
> Also, the GTX 1660 (technically an RTX 2000 series, but whatever) is about 26% less efficient than a 3090[2].
"bestvaluegpu" I've never heard of but holy AI slop nonsense batman. Taking 3dmark score and dividing it by TDP is easily one of the worst ways to compare possible.
> Then we decided to just have no markings at all on USB C cables.
I'm shocked the LTT TrueSpec cables are the first I'm aware of to do such a small and basic thing. I have so many USB C cables and no idea which are power only, USB 2 only, or what. Such a mess.
I mean... kinda everything about Mythos, for example? Anthropic has a good product, but they also pretty consistently say some stupid ass shit if you're being generous, and tell blatant lies if you aren't.
The only reason I haven't canceled my Plex is because I bought a lifetime pass a decade ago, so I literally can't. :/ I almost wish I hadn't, specifically so I could cancel it and send that signal.
But yes Plex is quite enshittified now. Would definitely start with Jellyfin or something else these days.
> OP here might be misremembering DVDs, here: the physical media skipped or froze intermittently and the players themselves were finicky
In my teens my friends and I watched probably hundreds of DVDs, and they almost never had a problem. Skips & freezes were almost only ever a factor for highly scratched copies, more typical of those from Blockbuster than anything we picked up in the $5 bargain bins.
I don't think I've ever encountered a "finicky" player, either. I don't even know what that'd mean.
I have programmed hundreds of DVDs, and I can assure you, there were finicky players. Apex players were infamous for how cheap they were, and finicky is an appropriate word. DVD had a spec, and there were parts of the spec that Apex players did not handle well at all.

The spec allowed for random play. Apex players cheaped out on any PRNG-type capability and instead shipped with a saved preset list of "random" values. If you programmed a disc with random playback, it would play back exactly the same way every. single. time. It really sucked when we were programming games using the random feature.

The spec allowed for 99 titles. Anywhere over 50 titles, and there was a better-than-not chance that an Apex player wasn't going to even recognize the disc. There were other quirks too, but I'm hoping the point was made.
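A hypothetical sketch of that random-play difference, just to illustrate the quirk (not actual player firmware):

```python
# Sketch of the Apex random-play quirk described above (hypothetical code,
# just to illustrate the behavior): a spec-compliant player effectively
# reshuffles on every playback, while the cheap implementation replays a
# baked-in "random" order every single time.

import random

TITLES = list(range(1, 21))  # 20 titles on the disc

def compliant_random_play(titles):
    # Fresh entropy each playback -> a different order every time.
    order = titles[:]
    random.shuffle(order)
    return order

# The cheap approach: "random" values chosen once and burned into
# firmware, so every playback comes out identical.
PRESET_ORDER = [7, 3, 18, 1, 12, 9, 20, 5, 14, 2,
                16, 11, 8, 19, 4, 13, 6, 17, 10, 15]

def apex_style_random_play(titles):
    return [t for t in PRESET_ORDER if t <= len(titles)]

print(compliant_random_play(TITLES))   # differs run to run
print(apex_style_random_play(TITLES))  # same order, every single time
```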
About half of the DVDs and Blu-rays I get from the library skip at some point in my PS5. They're usually not visibly scratched, though the scratches that matter are usually on the top, not the bottom.
I started just ripping everything when the studios started adding unskippable ads... I had a rental copy of Friday that I still have never actually seen: there was a bad scratch and it froze after 30+ minutes of unskippable previews.
I've never had a really bad player, though... I have seen players that had issues with burned discs, but not manufactured ones (unless they were scratched rentals).
> You can get the 4 lego movies for $5 on DVD on Amazon right now. A "Tom Cruise 10-Movie Collection" is $12. You get the idea.
The image quality on these is also quite bad, especially when cost cutting means compressing the movies further to fit on a single-layer DVD, often without any indication that it happened. Whether or not you find it acceptable is definitely a matter of personal taste, but it's very much apples & oranges vs. Netflix. Blu-ray, by contrast, is generally better quality than what you'll get from streaming services.
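Some back-of-envelope bitrate math on the single-layer point (the runtime and the audio/extras overhead are assumptions, just for illustration):

```python
# Back-of-envelope bitrate math for the single-layer-DVD point above.
# Runtime and audio/extras overhead are assumed values for illustration.

def avg_video_mbps(capacity_gb: float, runtime_min: float,
                   audio_extras_overhead: float = 0.15) -> float:
    # Bits available for video after reserving space for audio/extras,
    # spread over the runtime.
    bits = capacity_gb * 1e9 * 8 * (1 - audio_extras_overhead)
    return bits / (runtime_min * 60) / 1e6

runtime = 110  # assumed ~110-minute feature

print(f"DVD-5 (4.7 GB): {avg_video_mbps(4.7, runtime):.1f} Mbps MPEG-2")
print(f"DVD-9 (8.5 GB): {avg_video_mbps(8.5, runtime):.1f} Mbps MPEG-2")
print(f"BD-25 (25 GB):  {avg_video_mbps(25.0, runtime):.1f} Mbps AVC/HEVC")
# Squeezing a dual-layer master onto a DVD-5 roughly halves the video
# bitrate, and MPEG-2 degrades badly at those rates.
```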
As a kid I watched Fantasia on VHS so many times that the tape quality started to decay and my parents stored it away for special occasions. The quality decay didn’t bother me at all.
Long prior to that, I played my Jesus Christ Superstar album so many times that the tick where a scratch appeared is now embedded in my memory as part of the song.
I sometimes "sing" the tick, even in other versions.
The GeForce 4 generation as a whole, while solid enough, was historically not interesting. The cards were just basic spec bumps over the GeForce 3, with no new features or similar. And, critically, the Radeon 9700 Pro released the same year as the GeForce 4 and absolutely smoked the living shit out of it.
The MX440 allowed players that were playing games on id Tech 3 to finally play at high frame rates. I remember this card being all the rage back then in pro gaming circles for this reason.
The MX440 was an entry-level budget card? If it was all the rage in pro gaming circles at the time, that's really just a reflection of how poor pro gamers were back then rather than anything to do with the MX440 being particularly noteworthy. In fact, looking back at old reviews, it was if anything a flop: launch MSRP was too expensive for the performance it offered, especially since it was a DX7 card surrounded by DX8 cards at almost the same price point (including Nvidia's own Ti4200 for just $50 more).