Hacker News

Is 5s really such a problem? I don't think I'd bail out of a website because it took 5s to load, unless it was something I didn't particularly want to see anyway. Which I guess might be why people didn't stick around for the tests from which the 35% number is drawn.


Yes. Numerous reputable entities have published reports demonstrating that users notice quite a lot. Amazon claims that every 100ms costs them 1% of revenue. Google claims 500ms costs them 20% of traffic. 5 seconds is a fucking eternity, and anything you expose to users on the web with such horrible performance will suffer greatly because of it. One exception may be banks: users are more forgiving of latency the stronger their financial connection to the service.


I guess I find this plausible if we're talking about n ms multiplied by the number of resources loaded, and your page doesn't render progressively. If we're talking about total load time, I don't see why you'd even bother clicking a link if you weren't prepared to wait a few seconds for it to load.

Edit: in the case of Google and Amazon, I can believe that being slow will cause users to defect to other services. I don't believe that anybody will not bother to read documentation because it takes a second to load.

Edit2: If this is true, can anybody explain why users behave in this seemingly bizarre way? Do you give up on pages after 500ms? Have you seen anybody else do that? What is going on?


Look at page speed vs. bounce rate in Google Analytics for any well-trafficked site. People are frequently casually clicking around, and the faster you have content up on the screen the more casual users will engage.

By analogy, to get into a different mindset, think about channel-surfing on the TV. If other channels show a picture in 0.2s, and as you flip around there's a channel that takes 0.8s to show a picture, are you more or less likely to surf past the slow channel?


Thank you for replying! I got a staggering number of unexplained downvotes before anybody was prepared to talk to me.

Thinking about it, I can well believe that if you want people to stay on your site and click around, probably because you want to show them ads or products, a small delay will impact the number of clicks. I would imagine that's not the motivation for most GitHub Pages sites, though.


Regarding giving up after 500ms: I don't think the issue is that people are consciously abandoning a site after a single page load that seems a bit slow. It's the cumulative burden of slightly slow pages that makes the site slightly less attractive compared to other alternatives that respond faster. The differences are noticeable, if only subconsciously, and the result is that a portion of the users will move to the other service that just feels more responsive. Responsiveness of the site is part of the value being offered, even if people don't recognize it explicitly.

For any site with a significant volume of users and some effective competition for its service, this distinction results in measurable changes in use/conversions. I think the _actual_ change in user activity or conversions for a specific site would depend a whole lot on the nature of the service being offered and the alternatives available.


I would imagine less serious viewers will drop off quicker than motivated viewers.

Nothing can stop me if I need to buy something on Amazon or need to pay a bill online. If I'm just filling time and there are three interesting links to the "fad of the day (hour?)", then the slowest link might lose.

A simple A/B test could insert an additional 50 ms into half the requests, and some data analysis could estimate the slope of the bounce-rate curve in that region. Assuming that slope stays perfectly linear, for no good reason, at extremes like 1500 seconds or 0.0000001 nanoseconds would be unwise.
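A minimal sketch of that experiment, in Python. All names here (`assign_bucket`, `bounce_rate`, `local_slope`, the 50 ms figure) are hypothetical illustrations of the idea, not any real service's API: hash each request into a control or delayed arm, tally bounce rates per arm, and divide the rate difference by the injected delay to get a local slope.

```python
import hashlib

DELAY_MS = 50  # extra latency injected into the treatment arm (example value)

def assign_bucket(request_id: str) -> str:
    """Deterministic 50/50 split so the same user always lands in the same arm."""
    digest = hashlib.md5(request_id.encode()).digest()
    return "delayed" if digest[0] % 2 == 0 else "control"

def bounce_rate(events):
    """events: iterable of (bucket, bounced) pairs; returns bounce rate per bucket."""
    totals, bounces = {}, {}
    for bucket, bounced in events:
        totals[bucket] = totals.get(bucket, 0) + 1
        bounces[bucket] = bounces.get(bucket, 0) + int(bounced)
    return {b: bounces[b] / totals[b] for b in totals}

def local_slope(rates, delay_ms=DELAY_MS):
    """Bounce-rate change per added millisecond.

    Only meaningful near the current operating point; extrapolating this
    slope to extreme latencies is exactly the mistake warned about above.
    """
    return (rates["delayed"] - rates["control"]) / delay_ms
```

For example, if the control arm bounces 20% of the time and the delayed arm 25%, the local slope is 0.05 / 50 = 0.001, i.e. roughly one extra bounce per thousand visitors per added millisecond, at that operating point only.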


You lose about 10% of users after 1 second, and about 5% for every second after that. So yes, it is a very big deal.
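Taking that rule of thumb at face value (a linear sketch of the quoted figures, not a measured model), the arithmetic for the 5-second case at the top of the thread looks like this:

```python
def estimated_loss(load_seconds: float) -> float:
    """Rough share of visitors lost, per the rule of thumb above:
    ~10% by the first second, then ~5% for each additional second.
    Purely illustrative; real bounce curves are not linear."""
    if load_seconds <= 0:
        return 0.0
    if load_seconds <= 1:
        return 0.10 * load_seconds
    return 0.10 + 0.05 * (load_seconds - 1)
```

A 5-second load comes out at 0.10 + 4 * 0.05 = 0.30, i.e. about 30% of visitors lost, in the same ballpark as the 35% figure mentioned at the start of the thread.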




