
I recently lost my internet provider, so while I wait for a new one I'm tethering from my phone. The Verizon unlimited data plan actually throttles me to modem speed, about 56k, so it's not really unlimited. Anyway, it's been interesting to judge various websites by how fast they load at that speed. Hacker News comes right up. Twitter is acceptable. Facebook is terrible and often does not load at all. The same goes for any site that uses React. I'm not sure why Facebook uses such a bloated system when they are trying to expand their user base into much of the world that does not have high-speed internet.


You can bypass the speed restrictions by decreasing the TTL on your machine by 1 [0], I tested it and it worked perfectly.

[0] https://android.stackexchange.com/questions/226580/how-is-ve...


The answer you linked talks about _increasing_ the TTL on your machine.


Hah! This is awesome, I'll give it a try. Thanks.


I travel a lot and have gotten quite used to browsing on airplane WiFi, so I get a similar low-bandwidth experience (at times).

I'll add a huge culprit to the list: Medium. They have their own "clever" code to progressively load images and I find it absurdly frustrating (because in most scenarios, the images just don't ever load), so I end up with lots of design articles with blurry blobs of color instead of images.

There are so many ways to natively progressively load images that I'm not sure why they've chosen the most user-hostile one. You see blurry blobs of color in no particular order, no indication of which ones are loading, no way to force a particular image, etc. I find myself frustrated often and I end up abandoning most of the stories (or avoiding Medium altogether).


Isn’t progressive loading actually built into the JPEG standard? Like you get it for free if you encode it for progressive decode. Yet another “let’s use JavaScript” waste of time. Developers gotta develop tho.
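
For what it's worth, progressive encoding is just a flag on the encoder. A minimal sketch, assuming Node with the sharp library (filenames made up, and obviously not whatever Medium actually runs):

    const sharp = require('sharp');

    // Re-encode an image as a progressive JPEG; browsers then paint a
    // coarse full-frame preview that sharpens as more bytes arrive.
    sharp('hero.jpg')
      .jpeg({ progressive: true, quality: 80 })
      .toFile('hero-progressive.jpg')
      .catch(console.error);

No client-side JavaScript needed at all.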


This came up with my remote team recently.

A coworker and I had the worst internet speeds in the company, but he recently got FTTH.

I went to replicate a bug by clicking a button quickly and excessively, and was able to add 5 duplicate entries to the DB.

The frontend dev could not replicate it until I suggested using the Chrome dev tools to simulate a slower connection.

I have Frontier DSL.


I remember tracking down corrupt entries in our DB. It was mostly one user introducing the inconsistencies. Turns out he would double-click on every button, and the browser would happily send-abort-send two requests every time. Sometimes these would race on the server.

We implemented disable-on-submit after that, and the inconsistencies went away. Other people would click again when the response didn't come fast enough, but that rarely led to corruption. Probably when their connection was lagging, they would click multiple times in frustration. But that one guy provoked enough destruction to make us notice and fix it for everybody!
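
The fix itself is tiny. A rough sketch of the idea (simplified, form id made up, not our actual code):

    // Disable the submit button the moment the form is submitted, so a
    // double-click or an impatient re-click can't fire a second request.
    document.querySelector('#order-form').addEventListener('submit', (event) => {
      const button = event.target.querySelector('button[type="submit"]');
      if (button) button.disabled = true;
    });

A server-side idempotency check would be the belt-and-suspenders version, but the client-side change alone was enough for us.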


I believe Twitter uses React? Although it’s possible that Twitter successfully served you Twitter-Lite while FB may have mistakenly sent the full version.


I just checked. You're right. So maybe it isn't React but something else Facebook is doing.


How about editing your above comment that was badmouthing React then?


Because I don't know it's not React. Twitter could be using a stripped down minimal version of React with limited functionality. Standard React could still be a bloated mess.


> Standard React could still be a bloated mess.

React is ~6KB (even less gzipped). React is very performant and its performance was one of its original claims to fame over previous frontend frameworks like Angular.


React never claimed performance, and the virtual DOM is quite wasteful. Any direct manipulation library will perform better.

Plus, it’s dishonest to look at React core size. react-dom, which is absolutely required, is 40KB minified AND compressed. Plus the usual plugins to handle icons, SVG, animations, and so on. A base React application easily crosses the 200KB gzipped mark.


> React never claimed performance, and the virtual DOM is quite wasteful. Any direct manipulation library will perform better.

React's performance (in large part because of its virtual DOM) over Angular was a common benefit cited in its rise to fame. Indeed no one ever claimed React was somehow faster than the document API (or even jQuery), and that isn't what I'm saying either. Whether or not you think the virtual DOM is wasteful in 2021 is a separate opinion, and even Svelte devs admit the virtual DOM approach is usually fast enough.

> A base React application easily crosses the 200KB gzipped mark.

I didn't bootstrap a CRA app to check its full size, but considering a hero image alone can easily be larger than this, it still stands that React is certainly not "bloated".


The mating call of all React devs: “but what about the images”. Who said a 100KB hero image was a good idea either? It doesn’t excuse dropping a 1MB (when executed) bomb of JavaScript to deliver a blog post and making mobile and low-speed users suffer.

It’s also established that byte for byte, JS is way more costly to download, parse, and execute than images; the comparison itself is dumb.


Being faster and smaller than Angular isn't much of a feat...


As I said, it's not the size of the React library that's the problem: https://news.ycombinator.com/item?id=26691150


There is no additional overhead in nesting React components vs normal HTML elements. React-rendered and normally rendered HTML are identical.


Dan Abramov has stated the reason hooks were created was to avoid the nesting hell that was used to manage state. I may be misunderstanding him. I am not a React expert.


React hooks are just JavaScript functions. The layout of your application is unaffected by the shape of your application state. Instead, the shape of your application's state is typically governed by its layout.
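
For example, a custom hook is nothing more than a function that happens to call other hooks. A trivial sketch (names made up):

    import { useCallback, useState } from 'react';

    // A plain function: it adds nothing to the DOM tree and doesn't
    // force any particular nesting of components around it.
    function useToggle(initial = false) {
      const [on, setOn] = useState(initial);
      const toggle = useCallback(() => setOn((v) => !v), []);
      return [on, toggle];
    }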


I think Twitter does some aggressive PWA-style caching.
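
Presumably via a service worker. A minimal cache-first sketch of the pattern (paths made up; not Twitter's actual code):

    // sw.js: precache an app shell, then answer repeat visits from cache
    // so the page comes up even on a terrible connection.
    const CACHE = 'app-shell-v1';

    self.addEventListener('install', (event) => {
      event.waitUntil(caches.open(CACHE).then((c) => c.addAll(['/', '/app.js'])));
    });

    self.addEventListener('fetch', (event) => {
      event.respondWith(
        caches.match(event.request).then((hit) => hit || fetch(event.request))
      );
    });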


Funny, I just went through a similar experience and had to tether off a Verizon connection for almost two weeks.

I certainly felt the pain of a slow connection too and felt frustrated at how badly this affected the experience on so many sites.

Here’s an idea: web developers should test their sites on the slowest connection speed still commonly available (i.e. 3G) and make sure the experience is still acceptable. I know that webpagetest [1] allows you to do this and the results are illuminating.
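
You can even script it locally. A rough sketch with Puppeteer, driving the Chrome DevTools Protocol to emulate a 3G-ish connection (the numbers and URL are illustrative):

    const puppeteer = require('puppeteer');

    (async () => {
      const browser = await puppeteer.launch();
      const page = await browser.newPage();

      // Throttle the network for this page via the DevTools Protocol.
      const client = await page.target().createCDPSession();
      await client.send('Network.emulateNetworkConditions', {
        offline: false,
        latency: 300,                          // ms of round-trip delay
        downloadThroughput: (400 * 1024) / 8,  // ~400 kbit/s, in bytes/sec
        uploadThroughput: (200 * 1024) / 8,
      });

      await page.goto('https://example.com', { waitUntil: 'load' });
      console.log('loaded under throttling');
      await browser.close();
    })();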

[1]: https://www.webpagetest.org/


No idea if they're still doing this, but Facebook has done "2G Tuesdays" for just that reason.

https://engineering.fb.com/2015/10/27/networking-traffic/bui...


Considering that Facebook manages to use up 800 MB of RAM, and it takes a good 2-3 seconds minimum to open a chat head, I honestly have a hard time believing they have humans test that shit of a UI in any shape or form, let alone in multiple ways (or care about the test results).


I’m living overseas and this happens with my phone from the US. It’s unbelievable how many apps time out. Why an app has implemented a timeout instead of relying on the network stack is beyond me.


React is not inherently slow. The sites you use are slow because they're bad sites, not because they use React.


Easier to start an ISP and give away free internet than to rewrite Facebook.com to be more performant


If not that, maybe try PdaNet. It is $10-15 for a license and it can mask that you're tethering through a tunnel from desktop->phone, so everything gets counted as mobile data instead of hotspot data.


Search for "Facebook Lite", it's their lightweight frontend for lower-bandwidth connections.


It appears to be an app for a phone? I have no throttling problem on the phone in case that wasn't clear, but I don't care to look at the small screen and want to use my desktop.


I think they mean https://m.facebook.com - a lightweight web version.


There's also https://mbasic.facebook.com, which is even more lightweight and still allows you to send and receive messages.


I tried it. It's very ugly and featureless on desktop but it does load fast.


It actually used to have -more- features. I use it on my phone exclusively; once upon a time I could use messenger from within it. They removed that feature and now try to foist the Messenger app on me. So I don't use FB Messenger from my phone.


After minification and compression, React is about 100kb and can be cached for a long time.

With all the images and videos on social media, it seems rather small in terms of bandwidth.
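
(The "cached for a long time" part is just an HTTP header on a content-hashed filename. A minimal sketch with Express, assuming the usual hashed bundle names like main.abc123.js:)

    const express = require('express');
    const app = express();

    // Hashed bundle filenames change whenever the contents change, so the
    // files can be served as immutable and cached by browsers for a year.
    app.use('/static', express.static('build/static', {
      maxAge: '1y',
      immutable: true,
    }));

    app.listen(3000);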


I'm not counting images and videos, because all sites have that problem. I mean things like clicking on the notifications icon and then waiting for it to load, and it never does, so I have to give up and go back to my phone.


I thought it was actually around 30kb, after being minified and gzipped?


It's not the size of the React library that is the problem, it's the nesting of components within components within components that is the problem. If you've ever viewed source and seen divs 20 to 30 deep, you're most likely looking at HTML that React produced.


A React component has no obligation to add a div or any other element to the DOM tree.
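
A fragment is enough when you need to return siblings. Trivial sketch (component and props made up):

    import React from 'react';

    // Renders two spans with no wrapper element of its own in the DOM.
    function FullName({ first, last }) {
      return (
        <>
          <span>{first}</span> <span>{last}</span>
        </>
      );
    }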


Dan Abramov has stated the reason hooks were created was to avoid the nesting hell that was used to manage state. I may be misunderstanding him. I am not a React expert.


I believe "nesting hell" was in reference to obtuse state management architecture rather than DOM structure. It seems like a common misconception based off other comments on this post. Angular inserts a new custom element for every component so it's definitely a problem elsewhere, if not with React.


React needs to borrow the idea of 'zero-cost abstractions' from Rust. Too often I've seen five divs used to add five layers of behavior where one (or zero) divs could have had the same effect.


With react-dom and everything?



