Wow, I always thought Intel would go bankrupt, but I guess it's clear the US is going to back them.
Gotta give Intel a lifeline, because the longtime strategy of in-house design + manufacturing no longer works when you've lost the manufacturing edge, missed the transitions to mobile and GPU, and are stuck with the same crappy designs in the one market you still dominate: servers.
If Sapphire Rapids and Meteor Lake fail, they might be in danger. Increasingly the threat is not AMD but Arm: cloud hyperscalers can bring CPU design in-house and undercut x86, and Arm's fixed-length instruction encoding is fundamentally better suited to very wide decoders.
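The decoder-width point can be sketched with a toy example. This is not real x86 or Arm encoding, just the structural difference: with a fixed instruction width, every decode lane can compute its start offset independently, while a variable-length stream forces a serial walk (which real x86 decoders have to break with length-prediction hardware).

```python
# Toy sketch: why fixed-length encodings make wide decode easy.
# (Hypothetical encodings, not real ISAs.)

def decode_starts_fixed(code: bytes, width: int = 4) -> list[int]:
    # Lane i starts at byte i*width, with no knowledge of other lanes.
    # All starts can be computed in parallel.
    return list(range(0, len(code), width))

def decode_starts_variable(code: bytes) -> list[int]:
    # Must walk the stream: each start depends on all earlier lengths.
    # (Toy rule: the first byte of each instruction encodes its length.)
    starts, off = [], 0
    while off < len(code):
        starts.append(off)
        off += code[off]
    return starts

fixed_stream = bytes(16)                # four 4-byte instructions
var_stream = bytes([2, 0, 3, 0, 0, 1])  # three instructions: lengths 2, 3, 1

print(decode_starts_fixed(fixed_stream))   # [0, 4, 8, 12]
print(decode_starts_variable(var_stream))  # [0, 2, 5]
```

In the fixed case, a 4-wide (or 8-wide) decoder just fans out over known offsets; in the variable case, lane N can't even begin until the lengths of instructions 0..N-1 are known.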
Once they stumble, it's game over. You have to research and build fabs years before you can start using them. What happens when they get stuck and fall behind TSMC for a few generations?
AMD spun off its fabs because it actually had good designs but couldn't keep up in manufacturing; TSMC and ARM had different business models. Intel, by contrast, never had the best designs, but it did have the best manufacturing. Now it's in third place in both - third place, but spending as if it's going to dominate for the next decade. What happens when it has to write off an entire fab?
I don't think bankruptcy is likely at all, but they face more competition in their most profitable markets than they have in a long time (if ever). I don't think backwards-looking measures like historic earnings are that useful - a price-to-earnings ratio of 12 tells you the market is sceptical about Intel's prospects.
I'd be surprised if x86 is still a major thing in 5 years, and very surprised if it were still a major thing in 10. It's had a good run, but it's 40 years old (with, granted, many extensions and improvements). Sometimes you need to start with a fresh sheet of paper.
Apple's M1 was the first shot across the bow, and I believe AMD has an ARM-like architecture in development as well.
In contrast, I'd be surprised if x86 were to disappear from the mainstream in 5 or 10 years. Personally, I think there's too much similarly old software that's built first and foremost for x86, and porting it would take a great deal of time and effort.
Furthermore, ARM is only big in the consumer market as part of SoC designs, which are bad for a variety of reasons - notably, limited upgradeability. For example, I cannot go to an online store in my country and buy an ARM-compatible motherboard and an ARM CPU separately. As for Apple's M1 - I think it's a good start that demonstrates the feasibility of ARM, but I'd say ARM will most likely remain popular mostly for phones and tablets.
I'd bet on maybe another 20-40 years as the timeframe for ARM to replace x86 and for the latter to become irrelevant.
> Personally, I think there's too much similarly old software that's built first and foremost for x86, and porting it would take a great deal of time and effort.
I just got an M1 iMac this week. So far I haven't found any of my old x86 stuff that won't run. I'm positive that such software exists, but nothing I use on a regular basis.
While I haven't done any formal benchmarks, most of it feels faster under emulation than it did on my previous iMac (which, to be fair, was a few years old).
I feel like there's more to Apple's success, in part because they have full control over the whole ecosystem, from the hardware to the OS to the languages they want to support on their platform. Doing that for near-arbitrary hardware (everything from server hardware to regular workstations to budget off-brand laptops) would be far less likely to succeed at such a scale.
Unless everyone ditches Windows in favour of other OSes that play nicely with ARM, or unless Windows makes leaps in emulation technology, I don't see ARM going mainstream for the kind of personal computing people do on their PCs (as opposed to mobile devices, where ARM clearly dominates).
Plus, there's an argument to be made that we shouldn't need emulation in the first place - just software that's portable enough to be compiled for a new target, or written in a platform-agnostic enough way that it can easily be ported to a new architecture.
A consumer company started selling laptops with ARM chips, so obviously everything will change! /s
More seriously, companies like AWS have developed ARM-based chips in-house, and they seem to be very good (cheaper and faster than x86 equivalents) for specific workloads. That will certainly help with adoption, but I sincerely doubt x86 will stop being the dominant architecture. Maybe the consumer market will see a significant shift towards ARM - but not the real money maker, the datacenter.
Nothing's different, this is the culmination of the last decade's work. These changes are slow, but ARM is now as fast or faster than x86 in several contexts while being vastly more efficient, and looks likely to continue improving at a better rate. ARM has started appearing in laptops, desktops, and servers. x86 is running out of bastions.
I said ARM is vastly more efficient than x86, not previous implementations of ARM. You can see this in action in the new MacBooks, which do indeed have like 4x the battery life of the x86 versions.
Ah yeah, I misread that. I don't think the M1 gets 4x the battery life of x86, though. Any specific examples? E.g., the M1 Air vs the Acer Swift (i7-1065G7): 1.9 W vs 1.8 W at idle, and 30 W vs 27 W at full load. The power consumption looks to be about on par. But maybe you're thinking of some other measure of efficiency?
Also, I'd like to point out that the M1 is fabricated on a 5 nm process - which is an advantage of the node, not of ARM itself. I'm sure x86 will get there eventually.
I'm factoring in performance. The 1065G7 has a similar power profile to the M1, but it's much slower - both perceptually and in benchmarks. I like Passmark as a quick-and-dirty comparison tool, and it puts the M1 at 1.7x the 1065G7, in the realm of (but faster than!) the i9 MacBook Pros. 4x runtime was off the top of my head, and it looks like that was an overestimate, but it's still over 2x. Also, I've used those things, and there's no way they get 11 hours.
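To put rough numbers on "similar power, much faster": here's the perf-per-watt arithmetic using the figures quoted in this thread (the Passmark ratio and full-load wattages come from the comments above, not from fresh measurements).

```python
# Back-of-the-envelope perf-per-watt from the numbers in this thread.
m1_perf_ratio = 1.7   # M1 Passmark score relative to the i7-1065G7
m1_watts = 30.0       # full-load power, M1 Air (quoted above)
i7_watts = 27.0       # full-load power, Acer Swift i7-1065G7 (quoted above)

# Similar power draw but ~1.7x the throughput => ~1.5x work per joule.
efficiency_ratio = m1_perf_ratio / (m1_watts / i7_watts)
print(f"{efficiency_ratio:.2f}x perf/watt")  # ~1.53x
```

So at full load the M1 does roughly 1.5x the work per joule; battery runtime differences can be larger than that, since runtime also depends on battery capacity and on mixed/idle-heavy workloads rather than sustained full load.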
You're right about the process node. AMD's new laptop chips are on 7nm and they compare more favorably than Intel's. x86 isn't dead yet, but the trend is still not in its favor: