Hacker News

Is this person a wizard?

To me, this seems an impossible feat.

But I wonder how it seems to people who understand how it works?

I'm reminded of this joke:

Two mathematicians are talking. One says a theorem is trivial. After two hours of explanation, the other agrees that it is indeed trivial.




I remember myself in my first year of CS, in set theory class, at the whiteboard, trying to write a proof. There was something I was not able to prove at all, so I said 'it's trivial', the professor said 'yeah, it's trivial', and we moved on.

"Trivial" doesn't exclusively mean "easy", though it is often used as a euphemism like that.

In a literal sense, it very well may have been trivial, even if neither you nor the professor would have been able to easily show it.


What's your definition of trivial?

The one I've always flown with is, trivial means (1) a special case of a more general theory (2) which flattens many of the extra frills and considerations of the general theory and (3) is intuitively clear ("easy") to appreciate and compute.

From this perspective, everything is trivial from the relative perspective of a god. I know of no absolute definition of trivial.


It originally hails from the trivium: the grammar, logic, and rhetoric taught to beginning students.

the absolute definition of trivial is trivial to show

Maybe it wasn't trivial at all for either of you ...

No, this was really something trivial, in the sense that you could feel it was true. Like 2+2=4: obvious, but to prove it formally you need to build up definitions, axioms, and a theorem.
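The 2+2=4 case can actually be carried out in a proof assistant. A sketch in Lean 4 (the point being that the one-word proof only works because the natural numbers and addition have already been defined Peano-style in the library):

```lean
-- 2 + 2 = 4 holds by computation: once `Nat` and `+` are defined
-- via zero and successor, both sides reduce to the same numeral,
-- so reflexivity closes the goal.
example : 2 + 2 = 4 := rfl

-- The "feel it's true" version still rests on that machinery:
-- without the definitions of Nat, +, and definitional equality,
-- there is nothing for `rfl` to check.
```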

> But I wonder how it seems to people who understand how it works?

As someone who mostly understands what's going on: it does not seem like wizardry to me, but I am very impressed that the author figured out the long list of arcane details needed to make it work.


The primary function of modern operating systems is to let multiple programs run without interfering with each other, even if they try to. Each program can only read its own limited region of memory, and only gets to use the processor for a limited time before another program gets a turn. Windows did not start using those features until Windows NT, on which XP is based. Through Windows 98, any program could do whatever it wanted, and that protection hardware sat idle. Windows versions up to 98 were more like a library of features a program could use to display a user interface and talk to hardware peripherals.

There's special hardware in the processor that lets the operating system limit each program's access to memory and processing time, which Windows 9x leaves unused. This means that the Windows 9x Subsystem for Linux can say "look at me, I'm the operating system now" and take over that hardware to run a modern operating system.


This is wildly inaccurate.

Windows 3.11 was a hypervisor running virtual machines: the 16-bit Windows virtual machine (within which everything was cooperatively multitasking), the 32-bit headless VM that ran 32-bit drivers, and any number of V86 DOS virtual machines.

Win9x was similar in the sense that it had the Windows virtual machine running 32-bit and 16-bit Windows software along with V86 DOS VMs. It did some bananas things by having KERNEL, USER, and GDI "thunk" between the environments to not just let 16-bit programs run but let them continue interacting with 32-bit programs. So no, Win9x was in fact 32-bit protected mode with pre-emptive multitasking.

What Win9x prioritized was compatibility. That meant it supported old 16-bit drivers and DOS TSRs among other things. It also did not have any of the modern notions of security or protection. Any program could read any other program's memory or inject code into it. As you might expect a combination of awful DOS drivers and constant 3rd party code injection was not a recipe for stability even absent bad intentions or incompetence.

Windows 2000/XP went further and degraded the original Windows NT design by pulling stuff into kernel mode for performance. GDI and the Window Manager were all kernel mode; see the many, many security vulnerabilities resulting from that.


This is correct. Win9x did have memory protection, it just made an intentional choice to set up wide open mappings for compatibility reasons.

WSL9x uses the same Win9x memory protection APIs to set up the mappings for Linux processes, and the memory protection in this context is solid. The difference is simply that there is no need to subvert it for compatibility.


That's greatly oversimplified, or less generously, just flat out wrong. Win32 programs have always had their own isolated address space. That infamous BSOD is the result of memory protection hardware catching an access to something outside of that address space. When you open a DOS box, it uses the paging and V86 hardware mechanisms to create a new virtual machine, even though it shares some memory with the instance of DOS from which Windows was booted.

What Windows 9x didn't have was security. A program could interfere with these mechanisms, but usually only if it was designed to do that, not as a result of a random bug (if the entire machine crashed, it was usually because of a buggy driver).


Win32 programs under Win32s share the same address space, though.

Thank you, that's a great explanation.

It's mostly explained if you go to the project page. For me, the hardest thing about something like this would be gleaning the Microsoft driver APIs. In the 9x days, Microsoft documentation was not very thorough and was difficult to access. It's still not pleasant.

It's a good use case for AI:

Have the model spit out example programs to study the API.


Never heard this joke before, very funny!

The Win9x kernel famously doesn't do very much, which seems like it gives you ample room to port some of Linux's low-level functionality to it.

She is indeed some sort of wizard

Now I know two mathematician jokes. The other one is "A mathematician is a device for converting coffee into theorems."

> this seems an impossible feat

What makes you think so?


AAA+ joke

[flagged]


The README states:

> Proudly written without AI.


amelius was saying that tongue-in-cheek.

As the repo says

> Proudly written without AI.


Stuff like this is far above the capabilities of today’s top AIs.

It’ll produce something, sure, but it won’t actually work, and making it work takes as much effort as building it from scratch.


This is way beyond the current capability of AI. You should likely know that instead of just trashing random projects.

I believe this wasn't really even a joke, but a real story that got distorted into a joke: https://hsm.stackexchange.com/a/8054

This is in the class of things where even if the specific text doesn't trace to a true story, it has certainly happened somewhere, many times over.

In the math space it's not even quite as silly as it sounds. Something can be both "obvious" and "true", but it can take substantial analysis to make sure the obvious thing is true, by hitting it with corner cases and possible exceptions. There is a long history of obvious-yet-false statements, so it's completely sensible for something to be trivially true yet still be worth substantial analysis to be sure it really is true.

I could analogize it in our space to "code so simple it is obviously bug free" [1]... even code that is so simple that it is obviously bug free could still stand to be analyzed for bugs. If it stands up to that analysis, it is still "so simple it is obviously bug free"... but that doesn't mean you couldn't spend hours carefully verifying that, especially if you were deeply dependent on it for some reason.

Heck, I've got a non-trivial number of unit tests that arguably fit that classification, making sure that the code that is so simple it is obviously bug free really is... because it's distressing how many times I've discovered I was wrong about that.

[1]: In reference to Tony Hoare's "There are two ways to write code: write code so simple there are obviously no bugs in it, or write code so complex that there are no obvious bugs in it."


I don't think you understand how jokes work. They are mostly "distortions" of real dialog or events to add incongruous or absurdist elements. Here, Hardy's not uncommon momentary doubt about whether a statement really was obvious, while faintly amusing, is made into a joke by turning the momentary doubt into a 15 minute excursion. People then riff on the joke by turning that excursion into a mathematician presenting an elaborate proof that a statement is "obvious", quite contrary to the meaning of "obvious".

> I don't think you understand how jokes work.

That makes sense, I was just born yesterday.



