It's an old, old argument, but very true. My first Internet-capable computer had a 14.4k modem - that's a theoretical speed of 14,000 kilobits per second. But on the whole I don't really remember the web being slower or less usable than it is now, and that was before ad blockers were really a thing. And some of the technologies that we have now but not then could theoretically have made sites load even faster; PNG instead of GIF, MozJPEG compression for JPEGs, SVG for vector graphics, CSS for reducing repeated code and getting rid of table-based layouts, etc.
But, obviously, the web hasn't gotten faster, because the increased bandwidth has been filled with increased nonsense, and browsers now download 2MB of nonsense to show a 200-character text post. It's this sort of nonsense that is partly the genesis for ideas like Gemini, a return to Gopher, or my own KyuWeb: limit the capabilities of the medium to vastly improve its speed and legibility.
This sort of thing isn't exclusive to the web, either. Is an old late-'90s version of Microsoft Word that could run on a machine with 4MB of RAM really a thousand times better than one from today that requires 4GB?
I really love the web, but sometimes it does piss me off a bit.
Same here. I have very clear memories of dismissing the WWW as useless for exactly this reason. Those memories are 30 years old so they may not be entirely accurate, but the principle of embarrassment [1] lends them some support if you buy into that sort of thing.
I don't know about you guys but I got a hell of a lot done on a 56K modem all the way up until the mid-2000s. Hell, I was even doing group content on City of Heroes from the dial-up connection I had while working my night-shift hotel job, and the game played surprisingly well given the bandwidth and latencies I had.
You guys should try to dig up some old home movies of using the computer back in the '90s.
Nowadays if something isn't a bloated pile of trash, it'll literally load up in a few dozen milliseconds or less. Back then, even a pretty lightweight text-only site would take seconds to load and render (and of course, bloated crap sites back then took upwards of a minute to load if you left images on)
This was a killer feature in the iCab browser! I loved that ambitious little browser and used it for many years, and its ability to easily toggle image loading was one of the reasons.
I browse that way today. Images, CSS and javascript disabled by default and whitelisted on a case-by-case basis with uMatrix. 90% of the time this makes websites better and I don't end up whitelisting anything. Most of the images paired with articles online are irrelevant stock photos that are only added to articles because the illiterate half of the population will close the page immediately if there isn't a picture to look at.
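For the curious, the default-deny part is only a few lines in uMatrix's "My rules" pane. Something like this (syntax from memory, so treat it as a sketch and check the uMatrix wiki; example.com is a placeholder for a site you choose to whitelist):

* * css block
* * image block
* * script block
example.com example.com css allow
example.com example.com image allow

Everything else stays blocked until you flip a cell in the matrix for that site.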
When I was a kid we had 28.8 for a couple years then 56k a couple years later. I remember frequently leaving the room and coming back to check and see if the page loaded. I convinced my parents that if we got DSL I would spend less time online because I wouldn't have to wait for websites to load. Hahaha
Same here. I remember when I used to think twice before clicking a link and loading a page. It would take seconds to load most pages, so loading things unnecessarily was a huge waste of time. Now people don’t think twice about opening a bunch of links from search results all at the same time, etc.
I also remember loading being so slow that I could sit there watching an image file load a few rows of pixels at a time. And when downloading, I had to worry about tying up the phone line for x amount of time, even for things that were just a few MB. So yeah, it was definitely much slower “back in the day.”
Bandwidth has improved, i.e., internet speeds have improved. But the internet is not the web.
The amount of data websites try to make users transfer, e.g., through browsers that auto-load resources and run Javascript, has also increased. The number of websites using "simple HTML" has decreased, even though it makes the web faster. As such, for the web user, the speed of the web (cf. internet speed) has not improved. At best it has only stayed the same.
Thanks to increased internet speeds more data can be transferred in less time. But if the amount of data is increased instead of being held constant, e.g., simple HTML is no longer "good enough", then over time the user may not notice any speed improvement when using the web. The amount of data being transferred over the web today attributable to marketing, advertising, tracking and telemetry, and the number of DNS lookups to support those connections, has increased substantially.
When "tech" companies purport to work to "make the web faster" their strategy is never to reduce the practices that make the web slower, like advertising. In some cases they are working to allow more such traffic, without affecting speed too much so as to alert the user. Or to allow routing website traffic through their own servers, e.g., AMP.
Transferring the data required for 1990s websites over today's internet would make the 1990s web noticeably faster. Unfortunately, today's websites and "ad tech" are substantially more resource hungry, mainly for the benefit of generating revenue for "tech" company intermediaries. The user foots the bill for the increased resource usage.
> But, obviously, the web hasn't gotten faster, because the increased bandwidth has been filled with increased nonsense, and browsers now download 2MB of nonsense to show a 200-character text post.
A 10KB page at 14.4kbps takes about 5 seconds to download. A 2MB page on a typical 30Mbps connection takes less than a second (ignoring improvements in latency, changes in TCP window sizes, better line conditions, etc.). Web pages are definitely much bigger, and arguably bloated with unnecessary junk, but they still download faster than they did in the 14.4kbps days.
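The arithmetic is easy to check in a shell, if anyone wants to plug in other page sizes and link speeds (the sizes here are the two from above):

# transfer time = size in bits / link rate in bits per second
echo "10KB at 14.4kbps: $(echo '10*1024*8/14400' | bc -l) s"
echo "2MB at 30Mbps: $(echo '2*1024*1024*8/30000000' | bc -l) s"

which prints roughly 5.7 s and 0.56 s respectively.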
Bandwidth speed and page sizes don't explain why you think the web was faster decades ago.
I think it's more likely that you're just wrong. I remember using the web in 1995 on a 28.8kbps modem (such speed!), and it was slow as hell. It was genuinely click-a-link-and-get-a-cup-of-tea slow a lot of the time. Not only was my connection awful, a lot of hosting companies were slow too. Servers would throttle and kill sites that used too much bandwidth. Pages failed to load far more often than they do today. I still used it of course because it was amazing, but I didn't enjoy it. Surfing wasn't really possible. Uploading anything was a pipedream. When I went to uni in 1997 and got access to broadband for the first time it was a revelation. The web became truly useful. I had my first website up and running 3 months later and I've never looked back...
A 200-character text post was not 10KB. Tons of useful information was available in pure text.
Also, most webpages loaded in far less than 500ms on the newer 56k modems.
More importantly, a lot of pure-text pages were focused on providing dense information that you would read for minutes. The load time was negligible in comparison.
Today information density on the web is the opposite.
Speaking factually, I guess you have to be correct and nostalgia is tainting my memories. Maybe my standards were also much lower then, so a ten-second page load was less offensive then than it is now.
Your screen name forced me into a double take, as it's very rare for me to come across a similar name.
On topic, the idea of performance optimization seems to be completely lost in large sectors of modern-day computer science. My recent experience has been with Windows. I have several old laptops that had been rendered completely unusable by Windows updates (one even stopped charging). However, once I replaced Windows with Ubuntu, they 'feel' like some of the fastest machines I have. Hell, even the one that wouldn't charge now charges and works like a charm.
It's sad to think of all the waste, physical, electrical, and economical, that stems from poorly optimized code.
When product managers come to me and demand I make the <select> look a certain way to fit in with the rest of the "design language", I end up having to re-write all of the base and edge cases and accessibility and theming of what's already a perfectly functioning UI component. That code gets shipped to every user. And that's just one component... tomorrow I will make the <input type="checkbox"> look a certain way. Probably hundreds of KB just building these base UI components to fit in with some "design goal" that will change again in 6 months. If it makes the app unusable for some (unmeasured) percentage of our users on featurephones in developing countries, they don't care, because the product managers making these decisions will be gone in 6 months too, replaced with another with the same attitudes. At least the app "looks nice"!
Guess who is paying for all that increased nonsense. Hint: It is not the "tech" companies orchestrating its delivery.
It is true this problem is not limited to bandwidth. The same applies to RAM, secondary storage and in some cases CPU. Who pays for the increased nonsense that saps the computer owner's RAM, storage and CPU? Hint: It is not the companies that provide "free" software.
I initially experienced computing on a VAX at a university, before I ever owned a "personal computer". Perhaps it was only the program we were running, but the speed was faster than anything I ever experienced later on a PC. I worked continuously over the years to try to make the PC "user experience" as fast as that VAX "user experience" as possible, for myself only. It has worked quite well.
The web that the "tech" companies want, where websites can and do vary drastically in speed and design,^1 and the one I want are two different things. Under the text-only, no Javascript, no CSS, no DNS system^2 I have designed, all websites are more or less the same speed and they all present information in more or less the same format.
Like the parent comment, I like the web as a concept. However I dislike the "tech" company model for the web. I am not the only one. IMHO, adoption of the "tech" company model for the web, with its steadily increasing nonsense, relies on a world of computing without meaningful, informed choice and without feasible alternative options.
1. Or, as the author cites, one where the practical result is traffic being routed through Google, e.g., AMP.
2. NB. I am not against the use of such facilities. However, due to the way they are commonly used on today's web, and given the lack of user control over their use, eliminating them was necessary to achieve the speed I wanted.
> Is an old late-'90s version of Microsoft Word that could run on a machine with 4MB of RAM really a thousand times better than one from today that requires 4GB?
100,000x even. Obviously yes! Even the web -- the things the higher resource caps enable are unbelievable. Yes, bottom-feeding news sites are worse for it, boo hoo. Have you seen the shit you can do in a web browser today? On any device? It's so, so, so much cooler and better than anything the 90s had.
A lot less developer time spent making sure the application fit in a tiny amount of memory.
Of course that sucks for you as a user when an application takes up 4+GB of memory for no reason (I'm looking at you, Teams). But for the company that writes it, it is great. Compile and ship; no more weeks or months of trying to shave off every byte so you can fit this in 16MB of RAM, or on just a few floppies. You have a big customer that will sign a $10 million+ contract if you add feature X? Sorry, no way to make that work in under 105KB, which means we can't ship it.
Pretty much none. I used Word 97 well into the early 2000's. In the early 90's, my Amiga ran a decent word processor in 1 meg of RAM. Software expands to consume whatever resources are available: memory, disk, CPU, bandwidth...
I truly wish I had the data to back this up, but in my memory the web is WAY WAY faster than it was back then, even with all of the extra "bullshit". I mean, the images were like 16 colors and 200x300, and even just text-based sites were slower; latency was worse too. Am I crazy here?
I remember images loading one line at a time. Now I can stream 4k video. Not sure what OP is on about. There is lots of bloat but we are now on a jumbo jet and we used to be on a hot-air balloon
Somewhere around 1998 or 1999 I was working as a web developer and I listened to a pitch from a guy who wanted to build a video streaming site. For the life of me, I couldn't figure out who was going to pay a subscription to watch grainy, postage-stamp sized videos in a web browser, but apparently that sort of thing took off.
No doubt the modern web is faster, even on moderate xDSL connections. I remember web pages taking half a minute to load, even relatively basic ones. The 56k V.90 was quite a bit quicker than 14.4k too, but it's not even close to similar. I remember watching JPG images gradually progress through their low-quality versions upwards. Pages were a lot lighter back then but still much, much slower than today.
Latency was also a contributing factor. 56K modem latency is about 200 to 300ms higher than latency on a pure digital connection (ISDN, 56K frame relay, 56K DDS, etc.)
I can echo the sentiment. I would love to start over and save only the good stuff.
With that said, I am wary of my memory of the web of the 90's because everything was so new.
There are games I remember loving and seeing no serious flaws in twenty-five years ago, that I now go back to at original resolutions and original framerates and find completely unplayable. But having no point of comparison, what I was enthralled by at the time wasn't necessarily excellent control or excellent design -- it was being able to move in virtual 3d space at all.
My dad tells a similar story about Atari's Pong in the 1970's. Everyone was so excited by being able to control something on the TV screen that nothing else about the experience really mattered by comparison.
When the web became popular, the prospect of being able to interact with like-minded people anywhere in the world and from all walks of life and in communities of a size and power and with resources otherwise impossible -- it was all so new. And when I remember the time, I remember that. I didn't care about having to log into web sites on every visit. I didn't care about replicating data across devices. I didn't care about unprofessional web pages. I didn't care that everything was under construction, the design was universally bad, and that things often broke. I didn't care that any interaction had to go back to the server. I didn't care that pages were typically pretty static, and exploring meant constantly finding new ones rather than connecting with an author or community.
But if you dropped me back in the 90's now, I would care about all of that stuff.
I'm not defending the bloated and the awful. I often install video games now and think "GIGAbytes? What on earth FOR??" I often work on software now and see chains of chains of chains of dependencies and despair that we have clearly lost our way. Web sites crowded with ad-blocker defeating ads make me wish for the days when they would merely blink at you, rather than waiting 15 seconds and then starting up a video with sound in some random corner of the site. I do feel like if someone made me emperor, I could start over and do it right. I do wish we could all collectively agree to just not suck.
I'm just very skeptical that going back in time is quite as awesome as memory would lead one to expect.
The web has gotten much faster. I remember waiting around for at least 5 to 10 seconds for pages to load over 14.4K.
Still, modern web sites are incredibly bloated. There was a period in the late 90's where I had an early cable modem connection (3 megabits). All sites were still built for dialup, so things loaded at lightning speed. In general, I'd say effective speeds have remained constant since the early 2000's, since site bloat has kept pace with increasing broadband speeds.
I remember the web being scary more than anything in the 56k days. Viruses were something you constantly thought about on any site you didn't know well.
Now we just have data collection, which I suppose is theoretically pretty scary, but not an immediate concern for most people, or at least not a big enough one to change people's behavior.
I also remember it being a bit slow. Sometimes a lot slow. And media being obviously near unusable.
I'd definitely take almost everything about the modern web over the old web. Except the content, which was way better back then, and endless scrolling, which is ass. I strongly suspect the two may be related.
I remember the name of one particular jpeg I downloaded. We used to go to the school lab, start the download, put a paper sign on the monitor asking people not to turn it off and come back later for our new jpeg.
> But on the whole I don't really remember the web being slower or less usable than it is now
I don't know, perhaps the internet I had was just too shit, but it was so much slower than what I get now. I remember when the progress bar was actually meaningful when loading a pure text webpage; and I don't get that today unless I'm using a phone inside an elevator.
> a 14.4k modem - that's a theoretical speed of 14,000 kilobits per second
Nope, that'd be a 14 Mbit modem, a relatively high speed for ADSL modems and not something you'd be able to push through those rusty barbed-wire fences which make up some of those old PSTN installations. I guess you mean the 14.4k modem has a theoretical speed of 14,400 bits per second.
In the late 90's, you would've been forced to have a smaller set of dependencies, both out of necessity and because they didn't exist. In the early Java apps I worked on, we downloaded all the dependencies manually, generally from a shared drive or CVS repo, and put them in our classpath.
Yeah exactly, I would typically have a directory with all the dependencies, downloaded one by one, and use that as part of the classpath; whereas now I do mvn clean install --download-the-internet.
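For the youngsters, that looked something like this (the jar names and main class here are made up for illustration):

# every dependency downloaded by hand into lib/, then listed explicitly
java -cp lib/xerces.jar:lib/log4j.jar:classes com.example.Main

No transitive resolution: if a jar needed another jar, you went and found that one too.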
There's something to be said about constraint, and about enforcing some artificial limitations on yourself as a developer or designer.
I think the issue is down to developers, on their iMac Pro with a 4k screen, 5 mins from the data centre: a lack of appreciation for the bloat, because it's not really felt.
I recently took up the habit of not updating software. I noticed how many updates are totally useless and often cause regressions in my experience.
Maybe if you paid for your content it would suck less. The problem is no one wants to pay for content, so they have to pay for it with ads, which means satisfying the true customer, the advertiser. Google tried, for a while, to keep ads cruft-free, but ultimately the demand for more sensational advertising is what drives most of this cruft.
Another driver, which is something front-end devs can do something about, is to stop shipping 20MB bundles to the browser, stop using 10MB of fonts, and stop front-loading 200MB of images. Ignorant use of powerful tools that make adding more and more resources to your project feel basically free are a big contributor, at least in the webapp space. (Classic CMS software also accretes cruft, but generally more slowly over time.)
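If you want a first-order number for what you're shipping, even curl will do; note this only counts the HTML document itself, not the subresources it pulls in (example.com is a stand-in for your own site):

# bytes transferred for the document alone
curl -so /dev/null -w '%{size_download} bytes\n' https://example.com/

Seeing the bundle and image weight on top of that takes the browser's network tab, but the document alone is often revealing.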
You think I wanted 10MB of fonts? Nope. Talk to Public Affairs' design team.
You believe I wanted 200MB of images? Wrong. I wanted text, maybe some thumbnails with pop-up images. But the photography group has to exist for a reason.
20MB bundles of Javascript? Sorry, but between the trackers marketing wanted and the fact that my boss's boss wants some hip new framework on his resume, I'm stuck, when maybe just some jQuery would have done the job.
You are thinking about the web circa 1995, with the Webmaster being the master. That's been gone for a while. We implement what others demand, and they don't care if it is slow on other machines, they don't give a damn if it isn't accessible, and they couldn't care less if the content is essentially empty, just pictures of people smiling because they chose this product. We're really not in charge of this unless it's a passion-project personal site or we're some scrappy little startup.
Bingo. The technical decision-making has been delegated to product managers with business degrees who have probably never even used the product they're demanding be riddled with hundreds of KB of digital litter, much less written a single line of code. Capitalism will kill us all.
This has nothing to do with capitalism, it is about engineers being overridden. You can find all kinds of similar hideous stories under communism, and more, with its fixation on central planning. Any system which decouples decision-making from local ground-truth wisdom will run into this kind of error.
> Maybe if you paid for your content it would suck less
Haha, maybe, but I really don't think so. People pay for cable and still get ads; people pay for Netflix and still get ads, I still got ads when on Youtube Premium, etc. Even outside of media consumption, there are countless similar examples.
Perhaps, if content was paid for in juicy frictionless micropayments, the bullshit web would suck less - for a time. But it would begin to suck more and more, until we have the same situation or worse (while still paying, more and more).
The companies pumping out these trackers and selling our data couldn't care less whether people are subscribing to a service or not. They'll offer cash and buy our data until the day that shit is illegal.
The host of a podcast I enjoy recently held an AMA and added an anecdote supporting your opinion, and mine.
While he readily appreciated the Patreon subscribers and tailored special features for their contributions, he admitted the funds from the subscribers were eclipsed several times(?) over by the ad revenues.
I work for a company where we basically sell access to a SaaS behind a frontend powered by React. The customers pay, very well as I'm led to believe by our figures. There's no advertising. Nobody cares that the frontend is slow, because that slowness is because it's an SPA and that's factored in as a necessary cost of business.
In other words, even if you pay, there's no reason to think the software should be any better.
I pay for a lot of content nowadays, less and less of it on the web. I spend far less time on the web than I used to, and life has become better.
The web is just not a very good medium in general and unfortunately the promise of web search has turned out to be just a chimera. Conventional methods of research or specialized search engines are more time efficient. The only thing the web is good for is cheap stimulation, and even for that it isn’t very good.
You know what? We don't actually need all this content that is being pooped out of everyone. Because most of it's shit, and not worth the time to even watch. Yet people remain convinced that we need it, and some of you remain convinced that they need to pay for it. What a lie.
>A story at the Hill took over nine seconds to load
The Hill is what finally pushed me to install an ad blocker. Before that it didn't bother me too much to get served ads etc that I filtered out of my awareness anyway, but The Hill was unusable.
So, I installed an ad blocker and immediately realized how negligent I had been, in terms of quality of web browsing experience, in not doing so earlier. The Hill loaded near-instantly. Massive improvements all around on other sites as well. As in the article, it felt similar to upgrading from dialup to a cable modem.
Unfortunately it seems sites have upped their game in the ad-blocker arms race, and the gains I achieved when I first installed it are incrementally being chipped away as things become ever so slightly slower, tick... tick... tick... as time goes by.
Suggestions? Is there something more I can do? Besides simply disabling JavaScript, which breaks many sites outright.
For what it's worth, AdGuard still seems to be holding its own against ad blocker blockers. Unlike other blockers, AdGuard is a paid service, but well worth it in my opinion and I'd recommend it in an instant.
The only place where AdGuard has been failing recently is blocking Twitch ads, unfortunately.
I really don't like paid services; not the cost, but the mental friction and the fact that you can easily accumulate so many of them... Anyway, I'll at least check them out during the trial they offer to see how much of an improvement it is over uBlock Origin.
Worth browsing the web with Lynx for a week just to see how convoluted the modern web has become. Roughly 10% of sites are viewable in Lynx and have a simple enough layout. Lynx, being text-based, also filters out all the ad tech, trackers, and JS bloat.
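Easy to try if you have it installed:

# interactive text-mode browsing
lynx https://news.ycombinator.com/
# or just dump a page as plain text, minus the link list
lynx -dump -nolist https://example.com/ | less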
Back in 2008 I was doing a lot of Flash banners. Some clients would request the swf file to be as small as 9kb for a multi-scene banner. It was a pain in the ass, but we managed it most of the time with no design changes. We would spend a lot of time tuning image compression params to get the smallest swf possible. Sometimes the image quality would get awful.
Today, my clients just don't care at all if the frontpage is 10MB as long as they can have a multi-banner hero with a video running in the background.
Side note: I worked on a chatbot widget project where the tech PO was pushing us to move from React to Preact to save 8kb on a 60kb script. I said, why bother? There is a 1.2MB image on the front page. Just give me access to the front page assets and I can save 800kb on that image alone.
How the hell have we managed to relegate every step of our application design and decision-making to people whose technical expertise starts and stops at "it works on my machine"... we are doomed.
I wish I had a scalpel I could use to cut away the webcruft that comes from every direction. Setting up a pihole and turning off JS can help, I reckon, but there is still tons of cruft that simply doesn't belong.
For me the web, granted with an ad blocker, nowadays feels snappy. Some pages are indeed bloated but ironically these sites are more often than not the types with nothing but regurgitated blog spam.
Even with my first broadband connection I remember the web being a lot slower in the past. Actually sitting and waiting for images to load in, line by line, for example.
I do agree that webpages could go on a diet, making our current situation even better, but I believe the feeling of a faster web back then is a false memory born of nostalgia for a simpler time.
My first modem was a 2400bps MNP5, before the revelation that was the 14.4 HST (sysop discount, naturally). I first used the Internet in the early 90s, “pre-Netscape” and for a couple of years even “pre-Mosaic”. Precisely because so few people had good connections, web site owners were economical with their use of images, and information was structured so that it loaded quickly. Presentation took a back-seat to accessibility.
Very good reading. I always say this when people say "hey, remember when we had 10GB hard drives? They were so small." People don't realize that today we have bigger drives but we also have bigger files. Of course our pictures today have better quality, but proportionately we have similar storage headroom to what we had in the early 2000's.
That's true of course to some extent, but I don't think the trend will continue on quite as drastically.
Like sure, you have games nowadays like RDR2 and CoD that have inflated to absurd file sizes of up to 200 GB, compared to only a few gigs for AAA games maybe 10 years ago, but you also, for example, have more powerful hardware and more advanced codecs like H.265 (x265), which can store vastly more video in a lot less space than was ever possible before, since it can be decoded and unpacked in real time.
It used to take 4.7 GB of a DVD to hold a single movie at an abysmal 480p resolution, whereas nowadays you can pack a full HD one in easily a quarter of that size. I'm sure this will get even more extreme as processing power increases.
Same here, kind of. I'm reading this thread in the Linux framebuffer with w3m, a text-mode browser. You can see images with this setup, but only by hovering over a particular image link and launching an external viewer.
This has been my main computing setup for about half a year now, and it works surprisingly well (I'm neither a coder nor a web dev, though). The majority of the sites I visit are definitely bearable in text-only mode. It's a flexible setup, too, since I can see the images if I need to.
For more inspiration, see Ali G Rudi's framebuffer tools [1] and a great site on w3m [2].
It is sad; I thought this was submitted to HN not that long ago, but it turns out COVID somehow made four years of life feel like one.
The summary of my comment from previous discussions remains the same.
>My biggest problem is with the idea of extending the Web via Javascript, and that everything should be Javascript-only, rather than extending the Web browser's native functions.
But no one wants to tackle this problem. Google isn't interested, Mozilla was the first to want everything on the web to be Javascript, and Apple wants everything to be an App. Even governments want everything to be an App, so they only have to police Apple and Google.
The web is dying not because it can't be made better or because competition is driving it out. It is dying because no one in power gives a fuck about it.
That's easy enough to simulate on Linux if you want to try. Replace eth0 with your wan interface. Re-paste the "del" lines to clear the added latency. If testing this on a remote host I would suggest adding a cron job to run the "del" lines every 10 minutes. I am typing this from memory so it may not work quite right. Use "tc -s -p qdisc" to get the packet statistics. All of these commands need to be run as root or with sudo.
# outbound qdisc
tc qdisc del dev eth0 root > /dev/null 2>&1
tc qdisc del dev eth0 ingress > /dev/null 2>&1
tc qdisc add dev eth0 root netem delay 250ms 20ms
# inbound qdisc (ingress can't be shaped directly, so redirect it through an ifb device)
modprobe ifb numifbs=1 && ip link set dev ifb0 up
tc qdisc del dev ifb0 root > /dev/null 2>&1
tc qdisc add dev eth0 ingress
tc filter add dev eth0 parent ffff: protocol ip u32 match u32 0 0 action mirred egress redirect dev ifb0
tc qdisc add dev ifb0 root netem delay 250ms 20ms
250ms 20ms means: add a latency of 250ms with 20ms of variability. Browsing will break down at some point, but one could measure the impact using a speed test [1][2].
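A quick sanity check that the delay actually took effect; with netem on both the outbound and inbound paths as above, round-trip times should grow by roughly 500ms (250ms each way):

# compare against a baseline ping taken before adding the qdiscs
ping -c 5 1.1.1.1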
What if I want to shop for my groceries without waiting 10 seconds for Costco's app to load and then NOT have the "Grocery" button jump up half a page just as I'm about to tap it because a deals section below it just got finished dynamically loading?
> What if I want to shop for my groceries without waiting 10 seconds for Costco's app to load and then NOT have the "Grocery" button jump up half a page just as I'm about to tap it because a deals section below it just got finished dynamically loading?
Loading a webpage slower isn't going to do anybody any good, especially not when someone is trying to complete important tasks.
If someone wants to decrease engagement with the internet then they should use it less.
If your problem is with the attention economy and media overload, then your problem is more likely with apps and sites that are specifically tailored to addict the user, rather than the fast speeds that admittedly enable them.
It doesn't stop there. There's still no good way to validate information on the internet.
The internet, much like the real world, is full of untrue information, bad advice, and outright lies. Things that don't overcome this: Trusted experts, Wikipedia type citations, comment sections. Of those I think comment sections are the closest to being successful because they at least provide an open platform to raise doubt about any info.
I think engineers can sometimes become hyper-focused on a single metric, e.g. speed.
A webpage in 1992 or today just needs to load fast enough. There's a whole raft of other factors that may be more important than speed.
The web is only bullshit if you use bullshit websites. I get nearly all my news from Wikipedia or Twitter. Neither of them takes more than 1s to load for me.
How will the web make money if all this "bullshit" is removed? Unless you put all of it behind a paywall (which may not be a bad idea; in fact I'm waiting for a startup that will provide a "paywall for everything" service which you can plug your site into and get a cut).
Just because someone suggests a way to work around a problem doesn't mean that they aren't also annoyed by the problem.
There are some problems you do not have a viable way to attack at the root, and you just have to deal with them. Often other people control the thing you would have to change, and will not be persuaded to do it your way.
I understand the desire to not have to learn something just to be able to remove an annoyance, but isn't having that option better than not being able to remove the annoyance at all?
I don’t really understand the complaint. If you don’t like the CNN website, don’t go to CNN or use CNN lite, which has been well documented on HN. No one is forcing you to visit the site.
Also, blocking webfonts is trivially easy at this stage, so if you have specific complaints then you can take matters into your own hands with little more than the flip of a setting.
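For example (from memory, so verify the pref name), Firefox has a single about:config switch that disables downloadable fonts entirely:

# about:config
gfx.downloadable_fonts.enabled = false

and uBlock Origin has a per-site "block remote fonts" toggle if you want something less global.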
The complaint is more about the general direction that the web is taking.
It's all well and good saying "just don't use X website", but most news sites have plenty of features that are either annoying or outright disrespectful to the user.
That aside though, somebody who likes articles published by CNN has good reason to complain, because the things they like are being filled with bloat/are annoying to access.
Idk, in my case I just stopped using the web at all, with some exceptions like HN and some messengers. I've passed the point where I missed "the old web"; I don't really care anymore. I'm back to reading books, which is what I did before the Internet, so it's just getting back to my roots. I don't read news, I haven't used social networks for at least a decade, and I feel good. The modern web is a shit show I just can't stand.
I don't understand the complaint. If you don't have a problem with the web or don't experience those problems, don't comment on this HN thread. No one is forcing you to like it or engage with this post.
Google made a solid attempt at improving web performance and usability with AMP, offering preferred ranking as an incentive - and Hacker News mostly skewered them for it.
Publishers need to be compelled to improve performance, but the current model encourages bloat.
This is Google's problem; they smuggle in invasiveness and control in an attempt to conquer the web wearing a costume of benevolence and reasonability.
It's cyber-imperialism. The professed virtues are honorable, but they have to be implemented without the power grab.
AMP is the worst, and now Google has reneged and rolled it back, yet many sites still have it integrated, because it's more work to undo everything that was already done to migrate to AMP in the first place. A solid, first-class clusterfuck.