Hacker News

When I saw the iPad launch, I was very concerned that we were moving toward a market of consumer-only devices: devices you can't use creatively to make content, program applications, etc. Things like programming will be considered a "specialist" activity, and new generations of nerds will have nothing to cut their teeth on because the only PCs will be development workstations in corporate offices. Consumer computing will turn into pressing buttons and seeing magic happen, with no way for curious people to figure out what's going on behind the scenes or to modify it. A good litmus test is: can you use applications on the device to make entirely new applications for that device? You can do this on a TI-86 calculator, but not on an iPad (please correct me if I'm wrong).

I'm not entirely convinced things will pan out like this, but the thought still occurred to me.



We've been heading in that direction for some time, but it's only because of the increasing complexity of our devices.

Televisions, radios, cars, and home appliances are all harder to tinker with today than when they were first invented. Computer hardware, for the most part (outside of hobby kits), is magic black boxes plugged into each other. Something wrong with your video card? There's not much you can do about it.

Luckily, the hobby/amateur market in computer hardware/software (and in other hobbies, like cars) is still alive and kicking. IMO though, corporations seem to like taking steps to make that sort of work harder (e.g., using encryption to ensure that only "blessed" hardware/software can be used).


> IMO though, corporations seem to like taking steps to make that sort of work harder (e.g., using encryption to ensure that only "blessed" hardware/software can be used).

I've given some thought to this, and I don't think it can mostly be blamed on the corporations. Sure, they're part of it, but I think the ever-increasing complexity of technology is mostly what makes things harder to tinker with. There's far more to understand than in older technology, and that makes tinkering/hacking on things far more difficult.

It's kind of like classic cars vs. cars today. Older cars were much more tinkerable/hackable, but newer ones have certainly benefited from the increased complexity in the form of better gas mileage, safety, etc., at the cost of hackability.


I don't think complexity is nearly the enemy of hackability that manufacturer lockdown is. Modern cars can still be hacked -- everything from OBD-II interfaces to modified firmware for engine control units. FPGAs provide a way to tinker with hardware concepts.
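To illustrate how approachable OBD-II tinkering still is: a diagnostic query for engine RPM (mode 01, PID 0C) returns two data bytes, and the RPM is just ((A * 256) + B) / 4 per the SAE J1979 formula. A minimal sketch of the decoding, with a made-up sample response (the function name and sample bytes are my own, not from any particular tool):

```python
def decode_rpm(response_hex: str) -> float:
    """Decode engine RPM from an OBD-II mode-01 PID-0C response.

    The response looks like "41 0C A B", where 41 marks a reply to
    mode 01, 0C is the RPM PID, and A/B are the two data bytes.
    Per SAE J1979, RPM = ((A * 256) + B) / 4.
    """
    data = [int(b, 16) for b in response_hex.split()]
    if data[0] != 0x41 or data[1] != 0x0C:
        raise ValueError("not a mode-01 PID-0C reply")
    a, b = data[2], data[3]
    return ((a * 256) + b) / 4

print(decode_rpm("41 0C 1A F8"))  # 1726.0
```

In practice you'd get these bytes from a cheap ELM327 adapter over serial or Bluetooth; the point is that the protocol itself is documented and hand-decodable.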

The demise of hackability is and will be due to DRM and the hardware equivalents (like impossible-to-find proprietary screwdrivers).


To me, that's the critical distinction. A Motorola Atrix (http://www.motorola.com/Consumers/US-EN/Consumer-Product-and...) is a "PC" to me. An iPhone in a dock with a keyboard isn't, even though they're very comparable otherwise.



