Hacker News

I disagree. The PC era is evolving, not ending. How many people still have and use a traditional desktop or laptop computer at home? Microsoft is filling a growing demand for an OS on smaller form factor devices by supporting ARM processors with the next release of Windows, but the traditional PC market is still alive and kicking; it's just not "cool" anymore. It's seen as more of a utilitarian tool than a novelty at this point, whereas smartphones and iTablets currently retain the novelty factor, thus their pervasiveness at CES.


I hope you don't mind. I just thought I'd try this on for size...

I disagree. The mainframe era is evolving, not ending. How many people still have and use a mainframe when they interact with their bank? PCs are filling a growing demand for an OS on a smaller form factor, but the traditional mainframe market is still alive and kicking; it's just not "cool" anymore. It's seen as more of a utilitarian tool than a novelty at this point, whereas PCs currently retain the novelty factor, thus their pervasiveness at BigIronExpo.

Yup. I actually recall people making this argument back in those days. And they were right in a sense: big iron still exists in its niche, after all. Regardless, the PC revolution was still a massive shift, large enough to justify describing the rise of the PC as heralding the demise of big iron.


When I saw the iPad launched I was very concerned we were moving toward a market of consumer-only devices: devices you can't use creatively to make content, program applications, etc. Things like programming will be considered a "specialist" activity, and new generations of nerds will have nothing to cut their teeth on because the only PCs left will be development workstations in corporate offices. Consumer computing will turn into pressing buttons and seeing magic happen, with no ability for curious people to figure out what's going on behind the scenes or modify it. A good litmus test is: can you use applications on the device to make entirely new applications for that device? You can do this on a TI-86 calculator, but not on an iPad (please correct me if I'm wrong).
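The litmus test above can be shown in miniature: a program that authors an entirely new program and runs it on the same machine. This is just an illustrative sketch in Python (not tied to any of the devices mentioned); the point is that a general-purpose device lets software create and execute new software.

```python
# Litmus test in miniature: an application that creates and runs
# an entirely new application on the same device.
import os
import subprocess
import sys
import tempfile

# Author a brand-new program as plain source text...
new_program = 'print("hello from a freshly created program")\n'

with tempfile.NamedTemporaryFile("w", suffix=".py", delete=False) as f:
    f.write(new_program)
    path = f.name

# ...then execute it with the same interpreter that created it.
result = subprocess.run([sys.executable, path],
                        capture_output=True, text=True)
print(result.stdout.strip())
os.remove(path)
```

On a locked-down consumer device, the second step (executing the freshly written code) is exactly what the platform forbids.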

I'm not entirely convinced things will pan out like this, but the thought still occurred to me.


We've been heading in that direction for some time, but it's only because of the increasing complexity of our devices.

Televisions, radios, cars, and home appliances are all harder to tinker with today than when they were first invented. Computer hardware, for the most part (outside of hobby kits), is all magic black boxes plugged into each other. Something wrong with your video card? There's not much you can do about it.

Luckily, the hobby/amateur market in computer hardware/software (and other hobbies, like cars) is still alive and kicking. IMO though, corporations seem to like taking steps to make that sort of work harder (e.g., using encryption to ensure only "blessed" hardware/software can be used).


IMO though, corporations seem to like taking steps to make that sort of work harder (e.g., using encryption to ensure only "blessed" hardware/software can be used).

I've given some thought to this, and I don't think it can mostly be blamed on the corporations. Sure, they're part of it, but I think the ever-increasing complexity of technology is the main reason things are harder to tinker with. There's far more to understand than in older technology, and that makes tinkering and hacking on things far more difficult.

It's kind of like classic cars vs. cars today. Older cars were much more tinkerable and hackable, but newer ones have certainly benefited from the increased complexity in the form of better gas mileage, safety, etc., at the cost of hackability.


I don't think complexity is nearly the enemy of hackability that manufacturer lockdown is. Modern cars can still be hacked: everything from OBD-II interfaces to modified firmware for engine control units. FPGAs provide a way to tinker with hardware concepts.

The demise of hackability is and will be due to DRM and the hardware equivalents (like impossible-to-find proprietary screwdrivers).


To me, that's the critical distinction. A Motorola Atrix (http://www.motorola.com/Consumers/US-EN/Consumer-Product-and...) is a "PC" to me. An iPhone in a dock with a keyboard isn't, even though they're very comparable otherwise.


I think the biggest difference is that mainframe customers were primarily limited to large businesses, whereas the PC has gained the mindshare of the masses in the last 15 years, giving it an inertia unseen before in tech. Unlike the transition from mainframes to PCs, I think that the iTablet/Smartphone market supplements the PC market, rather than outright replacing the core functionality provided by PCs.


But 'the PC' is a slippery term. That analyst didn't declare the death of 'the PC' in the sense of "a traditional desktop or laptop computer", he declared the death of 'the PC' in the sense of Wintel. So for example, the Mac is a PC in sense 1, but (still, mostly) not in sense 2. Of course the two phenomena are closely related by now, and the sense-2 PC is under a certain amount of pressure too. But the fate of the traditional desktop/laptop is a much wider question, and more tied in to Big Questions about the future of software.

