I recently lost my internet provider, so while I wait for a new one I'm tethering from my phone. The Verizon unlimited data plan throttles tethering to modem speed, about 56k, so it's not actually unlimited. Anyway, it's been interesting to judge various websites by how fast they load at that speed. Hacker News comes right up. Twitter is acceptable. Facebook is terrible and often does not load at all. The same goes for any site that uses React. I'm not sure why Facebook uses such a bloated system when they are trying to expand their user base into much of the world that does not have high-speed internet.
I travel a lot and have gotten quite used to browsing on airplane WiFi, which is a similarly low-bandwidth experience (at times).
I'll add a huge culprit to the list: Medium. They have their own "clever" code to progressively load images and I find it absurdly frustrating (because in most scenarios, the images just don't ever load), so I end up with lots of design articles with blurry blobs of color instead of images.
There are so many ways to natively progressively load images that I'm not sure why they've chosen the most user-hostile one. You see blurry blobs of color in no particular order, no indication of which ones are loading, no way to force a particular image, etc. I find myself frustrated often and I end up abandoning most of the stories (or avoiding Medium altogether).
Isn't progressive loading actually built into the JPEG standard? Like, you get it for free if you encode for progressive decode. Yet another "let's use JavaScript" waste of time. Developers gotta develop tho.
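For what it's worth, progressive decoding really is part of the JPEG standard: a progressive file starts its frame with the SOF2 marker (0xFF 0xC2) instead of the baseline SOF0 (0xFF 0xC0). Here's a minimal Node sketch (the marker values are from the spec; the function name is mine) that checks which one a file uses:

```javascript
// Sketch: detect whether a JPEG was encoded for progressive decode by
// walking its segment headers and looking for SOF2 (progressive DCT)
// versus SOF0/SOF1 (baseline). Not a full JPEG parser.
function isProgressiveJpeg(buf) {
  if (buf[0] !== 0xff || buf[1] !== 0xd8) return false; // no SOI: not a JPEG
  let i = 2;
  while (i + 1 < buf.length && buf[i] === 0xff) {
    const marker = buf[i + 1];
    if (marker === 0xc2) return true;                     // SOF2: progressive
    if (marker === 0xc0 || marker === 0xc1) return false; // SOF0/1: baseline
    if (marker === 0xda) break;   // start of scan; SOF always precedes it
    if (i + 3 >= buf.length) break;
    const len = (buf[i + 2] << 8) | buf[i + 3]; // segment length includes itself
    i += 2 + len;                               // skip to the next marker
  }
  return false;
}
```

Run it on a real file with something like `isProgressiveJpeg(require("fs").readFileSync("photo.jpg"))` — encoders like mozjpeg and Pillow can emit progressive files, and the browser handles the blurry-to-sharp rendering natively, no JS required.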
I remember tracking down corrupt entries in our DB. It was mostly one user introducing the inconsistencies. Turns out he would double-click on every button, and the browser would happily send-abort-send two requests every time. Sometimes these would race on the server.
We implemented disable-on-submit after that, and the inconsistencies went away. Other people would click again when the response didn't come fast enough, but that rarely led to corruption. Probably when their connection was lagging, they would click multiple times in frustration. But that one guy provoked enough destruction to make us notice and fix it for everybody!
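The core of a disable-on-submit fix can be sketched as a plain wrapper (names are mine, not the actual code described above): ignore clicks while a request is in flight, then accept them again once it settles.

```javascript
// Sketch: wrap a submit handler so a rapid double-click fires it only
// once until the in-flight request settles.
function guardSubmit(handler) {
  let inFlight = false;
  return async function (...args) {
    if (inFlight) return;            // swallow the double-click
    inFlight = true;
    try {
      return await handler.apply(this, args);
    } finally {
      inFlight = false;              // accept clicks again after the response
    }
  };
}
```

Usage would be something like `button.addEventListener("click", guardSubmit(sendForm))`. Note the client can't be fully trusted, so the server still needs its own idempotency guard (e.g. a unique request token) to survive the send-abort-send race.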
I believe Twitter uses React? Although it’s possible that Twitter successfully served you Twitter-Lite while FB may have mistakenly sent the full version.
Because I don't know it's not React. Twitter could be using a stripped down minimal version of React with limited functionality. Standard React could still be a bloated mess.
React is ~6KB (even less gzipped). React is very performant, and its performance was one of its original claims to fame over previous frontend frameworks like Angular.
React never claimed performance, and the virtual DOM is quite wasteful. Any direct manipulation library will perform better.
Plus, it's dishonest to look at React core size alone. react-dom, which is absolutely required, is 40KB minified AND compressed. Plus the usual plugins to handle icons, SVG, animations, and so on. A base React application easily crosses the 200KB gzipped mark.
> React never claimed performance, and the virtual DOM is quite wasteful. Any direct manipulation library will perform better.
React's performance (in large part because of its virtual DOM) over Angular was a common benefit cited in its rise to fame. Indeed, no one ever claimed React was somehow faster than the document API (or even jQuery), and that isn't what I'm saying either. Whether or not you think the virtual DOM is wasteful in 2021 is a separate opinion, and even Svelte devs admit the virtual DOM approach is usually fast enough.
> A base React application easily crosses the 200KB gzipped mark.
I didn't bootstrap a CRA app to check its full size, but considering a hero image alone can easily be larger than this, it still stands that React is certainly not "bloated".
The mating call of all react devs - “but what about the images”. Who said a 100KB hero image was a good idea either? It doesn’t excuse dropping a 1MB (when executed) bomb of JavaScript to deliver a blog post and making mobile and low speed users suffer.
It’s also established that byte per byte, JS is way more costly to download, parse and execute than images, the comparison itself is dumb.
Dan Abramov has stated that the reason hooks were created was to avoid the nesting hell that came from managing state. I may be misunderstanding him. I am not a React expert.
React hooks are just JavaScript functions. The layout of your application is unaffected by the shape of your application state; if anything, the shape of your application's state is typically governed by its layout.
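The "hooks are just functions" point can be illustrated with a toy sketch — this is NOT React's real implementation, just the core idea: each hook call reads and writes a slot in an array, keyed by call order, which is also why hooks can't go inside conditionals.

```javascript
// Toy model of useState: state lives in slots indexed by call order,
// and every render walks the hooks in the same order.
const slots = [];
let cursor = 0;

function useState(initial) {
  const i = cursor++;                      // slot index = call order
  if (!(i in slots)) slots[i] = initial;   // first render: seed the slot
  const setState = (value) => { slots[i] = value; };
  return [slots[i], setState];
}

function render(component) {
  cursor = 0;                              // reset call order each render
  return component();
}
```

A "component" here is just a function that calls `useState` and returns a value; re-rendering it picks up whatever the setters stored last time.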
Funny, I just went through a similar experience and had to tether off a Verizon connection for almost two weeks.
I certainly felt the pain of a slow connection too and felt frustrated at how badly this affected the experience on so many sites.
Here's an idea: web developers should test their sites on the slowest connection speed still commonly available (i.e. 3G) and make sure the experience is still acceptable. I know that webpagetest [1] allows you to do this, and the results are illuminating.
Considering that Facebook manages to use up 800 MB of RAM, and that it takes a good 2-3 seconds minimum to open a chat head, I honestly have a hard time believing they have humans test that shit of a UI in any shape or form, let alone in multiple ways (or that they care about the test results).
I'm living overseas and this happens with my phone from the US. It's unbelievable how many apps time out. Why an app implements its own timeout instead of relying on the network stack is beyond me.
If not that, maybe try PdaNet. It is $10-15 for a license and it can mask that you're tethering through a tunnel from desktop->phone, so everything gets counted as mobile data instead of hotspot data.
It appears to be an app for a phone? I have no throttling problem on the phone in case that wasn't clear, but I don't care to look at the small screen and want to use my desktop.
It actually used to have -more- features. I use it on my phone exclusively; once upon a time I could use messenger from within it. They removed that feature and now try to foist the Messenger app on me. So I don't use FB Messenger from my phone.
I'm not counting images and videos, because all sites have that problem, but things like clicking on the notifications icon and then waiting for it to load — and it never does, so I have to give up and go back to my phone.
It's not the size of the React library that is the problem; it's the nesting of components within components within components. If you've ever viewed source and seen divs 20 to 30 deep, you're most likely looking at HTML that React produced.
I believe "nesting hell" was in reference to obtuse state management architecture rather than DOM structure. It seems like a common misconception based off other comments on this post. Angular inserts a new custom element for every component so it's definitely a problem elsewhere, if not with React.
React needs to borrow the idea of 'zero-cost abstractions' from Rust. Too often I've seen five divs used to add five layers of behavior where one (or zero) divs could have had the same effect.
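React's Fragments are essentially this zero-cost idea for wrapper divs. A toy sketch (hypothetical `h()`/`renderToString`, not React's actual API) shows the difference: a Fragment groups children without emitting an element of its own, while a wrapper div adds a real node at every layer.

```javascript
// Toy virtual-DOM renderer: Fragment contributes no markup of its own,
// so nesting Fragments is "free" in the output, unlike nesting divs.
const Fragment = Symbol("Fragment");

function h(type, ...children) {
  return { type, children };
}

function renderToString(node) {
  if (typeof node === "string") return node;
  const inner = node.children.map(renderToString).join("");
  if (node.type === Fragment) return inner;  // zero cost: no wrapper emitted
  return `<${node.type}>${inner}</${node.type}>`;
}
```

Here `h("div", h(Fragment, "a", "b"))` renders as `<div>ab</div>`, while `h("div", h("div", "a", "b"))` renders as `<div><div>ab</div></div>` — same grouping in the code, one fewer node in the DOM.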