oivey's comments | Hacker News

> Excess solar power generated by ordinary consumers is probably being priced correctly

Do you have any evidence for this position? Is this just regulations giving you bad vibes? I’m pretty sure everyone was quite aware the sun doesn’t shine at night whenever the previous rules and regulations were written. Your analysis isn’t breaking new ground.


Demand for electricity is higher during the day. The previous rules were written when solar was a smaller part of the grid and "generates during the day" was an advantage.

As the amount of solar increases, the supply during the day goes up, so the daytime price starts going down. Meanwhile the highest demand period is just after sunset, so that's going to be when the price is highest because not only is that the highest demand, that's when solar generation is zero. And it's when people selling solar during the day are trying to buy power back. But now they're selling low and buying high.


> I’m pretty sure everyone was quite aware the sun doesn’t shine at night whenever the previous rules and regulations were written.

And because they knew, the regulations I've heard of set some sort of statutory price that consumers get. This is because it's been fairly likely from the start that if the price is set by the market with reference to the value of the electricity, consumers won't get anything. Because their contribution is largely worthless and occasionally value-destructive.


No? Christianity, for example, has changed massively over time. You can pick any denomination, and it’s true.

You can also look at new religions, denominations, or sects popping up. At its core, the purpose of religion is supposed to be spirituality.


At a certain point, grades become arbitrary and won’t necessarily select for the best candidates. Obviously the current system doesn’t, either.

The actual solution is to increase the number of slots for training doctors to match the huge number of qualified applicants. It makes even more sense given that there is a shortage of doctors and health care costs are astronomical.


It’s free, but it’s not like they’re running Gmail as a charity, either. It has revenue and contributes to their other businesses.


Even that is arguably not lucky, it just followed a non-obvious trajectory. Graphics uses a fair amount of linear algebra, so people with large-scale physical modeling needs (among many others) became interested. To an extent, the deep learning craze kicked off because developments in GPU computation enabled economical training.


Nvidia started their GPGPU adventure by acquiring a physics engine and porting it over to run on their GPUs. Supporting linear algebra operations was pretty much the goal from the start.


They were also full of lies when they started their GPGPU adventure (as they are today).

For a few years they repeated continuously that GPGPU could provide about 100 times more speed than CPUs.

This has always been false. GPUs really are much faster, but their performance per watt has mostly hovered around 3 times, and sometimes up to 4 times, that of CPUs. This is impressive, but very far from the "100" factor originally claimed by NVIDIA.

Far more annoying than the exaggerated performance claims is how the NVIDIA CEO talked, during the first GPGPU years, about how their GPUs would cause a democratization of computing, giving everyone access to high-throughput computing.

After a few years, these optimistic prophecies stopped, and NVIDIA promptly removed FP64 support from its affordable GPUs.

A few years later, AMD followed NVIDIA's example.

Now, only Intel has made an attempt to revive GPUs as "GPGPUs", but there seems to be little conviction behind this attempt, as they do not even advertise the capabilities of their GPUs. If Intel also abandons this market, then the "general-purpose" in GPGPU will really be dead.


GPGPU is doing better than ever.

Sure, FP64 is a problem and not always available in the capacity people would like, but there are a lot of things you can do just fine with FP32, and all of that research and engineering absolutely is done on GPUs.

The AI craze also made all of it much more accessible. You don't need advanced C++ knowledge anymore to write and run a CUDA project. You can just take PyTorch, JAX, CuPy, or whatnot and accelerate your NumPy code by an order of magnitude or two. Basically everyone in STEM is using Python these days, and the scientific stack works beautifully with NVIDIA GPUs. Guess which chip maker will benefit if any of that research turns out to be a breakout success in need of more compute?
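The drop-in flavor of this can be sketched roughly as follows (a hypothetical example; it assumes CuPy is installed for the GPU path and silently falls back to plain NumPy otherwise):

```python
import numpy as np

try:
    import cupy as xp  # GPU-backed array library with a NumPy-like API
except ImportError:
    xp = np  # fall back to plain NumPy on machines without CUDA

def solve_least_squares(a, b):
    # The same NumPy-style call runs on CPU or GPU depending on xp.
    x, *_ = xp.linalg.lstsq(a, b, rcond=None)
    return x

a = xp.asarray([[1.0, 0.0], [0.0, 2.0]])
b = xp.asarray([3.0, 8.0])
x = solve_least_squares(a, b)  # solves a @ x = b exactly: [3.0, 4.0]
```

The point is that the numerical code itself is unchanged; only the array module differs between backends.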


> GPGPU can provide about 100 times more speed than CPUs

Ok. You're talking about performance.

> their performance per watt has oscillated during most of the time around 3 times and sometimes up to 4 times greater in comparison with CPUs

Now you're talking about perf/W.

> This is impressive, but very far from the "100" factor originally claimed by NVIDIA.

That's because you're comparing apples to apples per apple cart.


For determining the maximum performance achievable, the performance per watt is what matters, as the power consumption will always be limited by cooling and by the available power supply.

Even if we interpret the NVIDIA claim as referring to the performance available in a desktop, GPU cards drew at most about double the power of CPUs. Even with this extra factor, there has been more than an order of magnitude between reality and NVIDIA's claims.
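The arithmetic behind that bound is simple (an illustrative sketch using the upper-end figures from the comments above; the numbers are the thread's estimates, not measurements):

```python
# Since perf = (perf/W) * W, the speedup ratio factors as
#   speedup = (perf/W ratio) * (power ratio).
perf_per_watt_ratio = 4.0  # GPU perf/W vs CPU, upper end of the 3-4x range
power_ratio = 2.0          # GPU card drawing at most ~2x the CPU's power
max_speedup = perf_per_watt_ratio * power_ratio
print(max_speedup)  # 8.0, more than an order of magnitude below 100
```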

Moreover, I am not sure whether around 2010 and before, when these NVIDIA claims were frequent, the power permissible for PCIe cards had already reached 300 W or was still lower.

In any case the "100" factor claimed by NVIDIA was supported by flawed benchmarks, which compared an optimized parallel CUDA implementation of some algorithm with a naive sequential implementation on the CPU, instead of comparing it with an optimized multithreaded SIMD implementation on that CPU.


At the time, desktop power consumption was never a true limiter. Even for the notorious GTX 480, TDP was only 250 W.

That aside, it still didn't make sense to compare apples to apples per apple cart...


Well, the power envelope IS the limit in many applications; anyone can build a LOBOS (Lots Of Boxes On Shelves) supercomputer, but data bandwidth and power will limit its usefulness and size. Everyone has a power budget. For me, it's my desk outlet capacity (1.5 kW); for a hyperscaler, it's the capacity of the power plant that feeds their datacenter (1.5 GW); neither of us can exceed Pmax * MIPS/W of computation.
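That ceiling is just a multiplication (a minimal sketch; the 1 MIPS/W efficiency figure is made up for illustration):

```python
# Throughput ceiling under a fixed power budget:
#   work/s <= P_max [W] * efficiency [work/J]
def max_mips(p_max_watts, mips_per_watt):
    return p_max_watts * mips_per_watt

desk = max_mips(1.5e3, 1.0e6)  # 1.5 kW desk outlet, hypothetical 1 MIPS/W
dc = max_mips(1.5e9, 1.0e6)    # 1.5 GW datacenter, same efficiency
print(dc / desk)  # 1e6: the power budget, not the box count, sets the gap
```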


All of that may be true but it’s irrelevant.

If you’re dividing perf by perf/W, it makes no sense to yell “it’s not equal to 100!” You simply failed at the dimensional analysis taught in high school.


> A few years later, AMD has followed the NVIDIA example.

When bitcoin was still profitable to mine on GPUs, AMD's performed better due to not being segmented like NVIDIA cards. It didn't help AMD, not that it matters. AMD started segmenting because they couldn't make a competitive card at a competitive price for the consumer market.


That physics engine is an example of a dead-end.


I don’t think this is true. Advancements in technology often make things possible that previously were not at any price. Engines, for example, are better than ever in part due to computer modeling that would have been impossible in the 70s. Same deal with aerodynamics, safety features, and a million other things. In the 70s, you couldn’t have those things for any price. They required decades of development in other sectors to open possibilities for automobiles.


Most technology on cars existed years or decades before it became commonplace and affordable enough to use outside racing or exotic cars.

Airbags were patented in the 1950s. Modern ABS in 1971. The first electronic fuel injection in 1957. You could take the Formula 1-level technology of 1970 and, with enough money, apply it to a pickup truck. It would be shockingly expensive - and not as good.

That's my point! You are getting so much more for your dollar today, even though prices have risen faster than inflation. You are getting a multi-million-dollar truck for $50k.


> You are getting a multi-million dollar truck for $50k.

You’re not, though, because that truck never did and never could exist. A modern F-150 isn’t a 70s F1 car made cheap by new tech. This isn’t something you can wave away with an argument equivalent to “we put 1000 research points in the tech tree.”

When the US economy was working well, products got better and cheaper over time. Tech and increased labor productivity drove that. Now, tech and labor productivity has continued to increase, yet consumer prices have far outpaced inflation.


"A modern F-150 isn’t a 70s F1 car made cheap by new tech. "

Yes, it pretty much is. You have to consider that technology in cars moves along two separate/distinct paths.

1: improving manufacturing processes, materials, quality, which is lowering prices over time. Megacasting aluminum car parts is an example.

2: Adding totally new complex parts and systems that cars didn't have. These are things like airbags, anti-lock brakes, infotainment systems, and catalytic converters. This adds to the total cost.

#2 is far outpacing #1, which is why prices of cars are going up faster than inflation, wages, etc.


Again, the old car comparison is demonstrably untrue. To put the same example forward, computer modeling has wildly changed and accelerated car design in ways that were impossible for any sum of money in the 70s.

I think part of why this is hard to believe is that people strongly believe in the concept that time is money. On the margins for decisions like hiring someone to mow your lawn, it is true. For large scale things, you often cannot accelerate processes no matter how much money you dump into it. A good example of this is how long it has taken China to industrialize.

To be clear also, you have to prove your point that #2 is outpacing #1. The fact that the price keeps going up is not proof as there are other explanations. The poor quality of domestic manufacturers and their bad business practices, for example.


And 3D printing helped a lot too. Not as much as computer modelling, but still.


That is a low n, but I’m not sure what the alternative is. Surely random anecdotes (n=1) are even less powerful?


The low n is not the only questionable thing about the study. What a big n gives you is diversity of samples and tighter confidence intervals, but it cannot correct for methodological limitations. Specifically, they didn't invite any people with sleep issues or who already sleep under noise. Therefore the conclusion is a "duh": if you don't require pink noise to sleep, then don't add it.


Random anecdotes might be less biased. For example, no pressure to publish nor sell a product.


The alternative is higher n. The study makes a claim, it does not present the evidence necessary to back up that claim. Until someone does a larger study, no conclusion should be drawn.


I'm not inviting you to draw conclusions from my semi-random (but informed by years of professional thought about why people like different sounds) anecdote.


Personally, I trust the results of a sleep study, or any study on anything, by people I don’t know with questionable incentives less than I do the anecdotes of commenters I’ve been following for 10 years on HN, especially when they align with my own experiences and with conversations I’ve had over beers with people in industry (whatever that might be).

A lot of “science” is junk, not insofar as it’s false, but like water is wet.

Good science: there are compounds in cruciferous vegetables that appear to exert some health benefits.

Junk science: bok choy is green.

If a sleep lab is ignoring the fact of chronotypes (it’s obvious our genetic history would require some people to be predisposed to keep an eye out for toothy, clawed things and dangerous ‘others’ while most of their tribe / community are sleeping), the people who work there do so because it pays the bills, not because they’re passionate about working in the medicine / health industry at all.

I encourage people to get up and walk out if you find yourself at a service provider that doesn’t care about you. Find someone who gives a frak.


Once again, I am not suggesting you generalize from my anecdote. I like the sound of rain and sleep better with it. I have absolutely no idea how widespread this is in the population.


Lockheed, Boeing, Northrop, Raytheon, and all the others are private companies, too. NASA and others generally go through contractors to build things. SpaceX is on the dole just like them.


The satellite is built on Earth, so I’m not sure how it dodges any of those regulations practically. Why not just build a fully autonomous, solar powered datacenter on Earth? I guess in space Elon might think that no one can ban Grok for distributing CSAM?

There’s some truly magical thinking behind the idea that government regulations have somehow made it cheaper to launch a rocket than build a building. Rockets are fantastically expensive even with the major leaps SpaceX made and will be even with Starship. Everything about a space launch is expensive, dangerous, and highly regulated. Your datacenter on Earth can’t go boom.


Truly magical thinking, you say? OK, let's rewind the clock to 2008. In that year two things happened:

- SpaceX launched its first rocket successfully.

- California voted to build high speed rail.

Eighteen years later:

- SpaceX has taken over the space industry with reusable rockets and a global satcom network, which by itself contains more than half of all satellites in orbit.

- Californian HSR has spent over thirteen billion dollars and laid zero miles of track. That's more than 2x the cost of the Starship programme so far.

Building stuff on Earth can be difficult. People live there; they have opinions and power. Their governments can be dysfunctional. Trains are 19th-century technology; it should be easier to build a railway than a global satellite network. It may seem truly magical, but putting things into orbit can, apparently, be easier.


That’s a strange comparison to make. Those are entirely different sectors and sorts of engineering projects. In this example, also, SpaceX built all of that on Earth.

Why not do the obvious comparison with terrestrial data centers?


it should be easier to build a railway

No, because of the costs of acquiring land that the railway goes through.


Now how about procuring half a gigawatt when nearby residents are annoyed about their heating bills doubling, and are highly motivated to block you? This is already happening in some areas.


"fantastically expensive"

From individual POV yes, but already Falcons are not that expensive. In the sense that it is feasible for a relatively unimportant entity to buy their launch services.

"The satellite is built on Earth, so I’m not sure how it dodges any of those regulations practically."

It is easier to shop for jurisdiction when it comes to manufacturing, especially if your design is simple enough - which it has to be in order to run unattended for years. If you outsource the manufacturing to N chosen factories in different locations, you can always respond to local pressure by moving out of that particular country. In effect, you just rent time and services of a factory that can produce tons of other products.

A data center is much more expensive to build and move around. Once you build it in some location, you are committed quite seriously to staying there.


A truck is sluggish because of its weight and inertia. It’s a law of nature. What law of nature is making Gitlab slow?

