This article in different forms keeps making the rounds and it's just so tiring. Yeah, let's remember everything that was great about 25 years ago and forget everything that sucked. Juxtapose it with everything that sucks about today but omit everything that's great. Come on man.
If you think things suck now, just make it better! The world is your playground. Nobody makes you use YAML and Docker and VS Code or whatever your beef is. Eclipse is still around! There's still a data center around the corner! Walk over and hang a server in the rack, put your hardly-typechecked Java 1.4 code on there and off you go!
> The world is your playground. Nobody makes you use YAML and Docker and VS Code or whatever your beef is
Nobody, except your future employment prospects.
There are good reasons and bad reasons for a lot of technical choices; "can I hire people to do it?" is a very good reason, but it leads directly to CV-driven development, where we all chase whatever tech stack the people writing the job adverts have decided is good.
The same people who capitalise "MAC" in "MAC & PC", the same people who conflate Java with JavaScript, the same people who want 10 years' experience in things released only 3 years ago.
Sure, you can do that for your hobby projects. But "at work" you generally have these decisions made for you. And these decisions have changed over time for the wrong reasons.
As an aside: if we say k8s, we should also say j8t.
> But "at work" you generally have these decisions made for you.
The idea that most employers make terrible decisions now and amazing decisions back in the day is plainly false. The author vividly recollects working at a decent Java shop. Even there I strongly doubt everything was as amazing as they describe, but it does sound decent. But plenty of businesses at the time used C++ for no good reason other than inertia, usually in Windows-only, Visual C++ 6-specific dialects. Their "build server" ran overnight: you'd check in your code in the late afternoon and get your compile errors back in the morning. The "source control" worked with company-wide file locks, and you'd get your ass phoned back to the office if you forgot to check in a file before leaving. Meanwhile, half the web was written in epic spaghetti plates of Perl. PHP was a joy to deploy, as it is now, but it was also a pain to debug.
If you care deeply about this stuff, find an employer who cares too. They existed back then and they exist now.
> Their "build server" ran overnight, you'd check in your code in the late afternoon and get your compile errors back in the morning.
This. Let's keep things in perspective when people complain about long Rust compile cycles. And even that's a whole lot better than filling in paper-based FORTRAN or COBOL coding forms to be punched into cards in the computing room and getting back line-printed program output (or a compiler error) the next week.
It was definitely on the order of hours for large code bases. The Microsoft Excel team passed out punishment “suckers” to those who broke the build, since a broken build left 100+ people without a new working build to look at and test.
Linux kernel compiles in the 1990s were measured in hours, and that codebase was tiny compared to many. So, yep, builds were slow, slow enough to have an entire xkcd comic written about them.
The reasons are always the same: 20% because some changes are actually improvements, and 80% cargo cultism, where if you just build the right containers and chant the correct YAML incantations, you too will become a Google. Then comes the phase where everybody keeps doing these things because they organizationally don’t know any other way anymore, until the next big trendy thing, which reverts some of the issues of the previous trendy thing but introduces a new set of already-solved problems, because this profession is incredibly myopic when it comes to history.
You'll get old too one day and it will look a whole lot different watching the younguns stumble through completely avoidable mistakes and forget the long lessons of your life that weren't properly taught or were just ignored.
We have records from many periods in history of old men crowing about how society is collapsing because of the weak new generations. Thing is, maybe they were always right, and the new generations just had to grow up and take responsibility? And then again, maybe sometimes they were a little too right and society did in fact collapse, but locally.
“Hard times create strong men. Strong men create good times. Good times create weak men. And, weak men create hard times.”
Agreed! If anything, I think I'm tired of the "everyone says this when they get old!" hot take. Sometimes things really do get visibly worse and the intergenerational complaining about it is due to it really happening.
I bring this example up every time, but I'm a baseball fan, and seemingly every generation of fans has said there are more people striking out than there used to be. Is that just part of getting old as a baseball fan? No! It's really happening. Strikeouts have quite literally been going up from one decade to the next for basically a century.
So sometimes it's because the thing is really happening. Environmental ecosystem collapse? Real. People having shorter attention spans every next generation? Real! Politics getting worse? Well, maybe the 1860s were worse, but the downward trajectory over the last 50 years seems pretty real. Inefficiency of increasingly automagic programming paradigms? Real!
Sometimes things are true even if old people are saying them.
An interesting Reddit r/AskHistorians thread on the question """Does the aphorism "Hard times create strong men. Strong men create good times. Good times create weak men. Weak men create hard times", accurately reflect the evolution of civilizations through history and across different cultures?"""
copying only the conclusion for a tl;dr: "The only way that the aphorism explains history is by reinforcing confirmation bias - by seeming to confirm what we already believe about the state of the world and the causes behind it. Only those worried about a perceived crisis in masculinity are likely to care about the notion of "weak men" and what trouble they might cause. Only those who wish to see themselves or specific others as "strong men" are likely to believe that the mere existence of such men will bring about a better world. This has nothing to do with history and everything with stereotypes, prejudice and bias. It started as a baseless morality tale, and that is what it still is."
I’m not sure what you mean. I don’t see a bias here, the point is plainly stated: the notion of weak men is dubious. You might not agree, but then engage with something substantial.
They were not right, and I promise when I'm old I will not have this attitude. It's one of my least favorite types of people. And that's precisely my point: old men have been saying society is collapsing since ancient times, yet here we are, with things better than ever.
If you don't think they were at least sometimes right, to what do you instead attribute the various cases of socio-economic collapse documented throughout history?
> I promise when I’m old, I will not have this attitude.
To my ears this is a hilariously naive statement. It sounds to me roughly equivalent to a 7-year-old saying "Adults have boring jobs where they sit at a desk all day. I hate it. I promise when I'm old I'm gonna be an astronaut or play Major League Baseball."
It's not that they don't mean it; it's that one shouldn't make promises about a situation they can't yet understand. While some of those kids probably did end up being astronauts or baseball players, 99%+ of those who made that promise didn't. It turns out that being an adult gives them perspective that helps them realize why they want a desk job even if they don't like it, or, for many, that they actually enjoy their desk job (e.g. they like to program).
So if a million young people all thought similarly, and then magically changed their view when they got there, don't promise you're going to be the one who breaks the streak.
You might turn out to be an astronaut, but many people listening, based on good previous evidence, will rightly assume you won't.
That's because the thesis it's expounding on isn't "old people don't like change", but rather "experienced people often see their juniors unknowingly making avoidable mistakes".
Yes, sometimes it's that, but what I've observed in the field is self-inflicted by programmers. It's the impostor syndrome. They don't know, but they don't want to show that they don't know. They look around for clues of what to choose. Monkey see, monkey do. And then, like monkeys, they bash their keyboard until it seems to work.
>Nobody makes you use YAML and Docker and VS Code or whatever your beef is.
Not VS Code, but maybe YAML and Docker if your company is trying to align what tools it uses. C# places might still force you to use Visual Studio proper. Everyone says use the right tool for the job, but for bog standard CRUD web development, we do have a shitload of tools that work and there's multiple ways to get to a fast, working product.
I still chuckle that my laptop is 3 times as fast as the cloud thing that serves our CRUD app and we pay many times more for it, but I also know full well I do not ever want to be RDP'ing into a production box again and poring over IIS or Windows logs.
What I definitely do see is a degradation in making choices about when to adopt a more complicated technology because of the incentives of the hiring market.
People have loudly beaten the drum to keep your skills up to date and so people choose stuff that's popular because it benefits them personally, even when the product doesn't need it. This in turn leads companies to only select people who know that stack, and the more that companies do that, the more people make technical choices to get them the best job that they can handle.
We absolutely, very much 100% see that happening now with LLM AI if you ever needed a bigger piece of proof. Pretty much everything that is happening now has just been a louder example of every bad practice since the run up to the dotcom bust.
Because of that, I'd frankly never suggest running on-prem or building a local-only app unless there was a much bigger reason (legal, security, whatever) especially if the other products in the company have chosen the cloud.
Why? Because convincing the next job that that would have been the right choice is too hard.
Edit: and to someone else's point, I made the choice to be in the Microsoft/Azure/Windows hell hole but digging myself out and moving to something else is practically working a second full-time job and holding 2 ecosystems in my head at once
On the one hand: yes, this dev has clearly chosen a career/language specialization that puts him knee-deep in the absolute worst tooling imaginable... I cannot fathom a workflow this fucking miserable, and if this were my day-to-day, I would be far, far more depressed than I already am.
AND, the fact that so much of our industry, perhaps not all of it, runs a workflow not awfully different from this is, IMO, an indictment of our trade. To invoke the immortal sentiment of the hockey coach from Letterkenny: this shit is FUCKING embarrassing.
So much major software ships in a web browser because writing for Windows, Mac and Linux is just too hard, you guys, simply too much for a sweet little bean like Microsoft ($3.62 trillion) to manage as they burn billions on AI garbage. That's FUCKING embarrassing.
Half the apps on my phone are written this way which is why they can barely manage 30hz on their animations, die entirely when S3 goes down, and when they are working, make my phone hot. To run an app that lets me control my thermostat from my desk. That's FUCKING embarrassing.
And my desktop is only saved by virtue of being magnitudes more powerful than my original one back in the 90's, yet it only seems a scant more capable. In the early 00's I was sitting on Empire Earth and chatting with people over TeamSpeak. My computer can still do this, and with the added benefit of Discord can stream my game so my friends can all watch each other, and that's cool, apart from I lose about 10 fps just by virtue of having Discord open, and when I'm traveling? Oh god forget it, Discord flounders to death on hotel wifi despite it being perfectly cromulent DSL speeds. Not BLAZING, surely, but TeamSpeak handled VOIP over an actual DSL connection, with DSL latency, in the 00's. That's FUCKING embarrassing.
All our software now updates automatically by default, and it's notable when that's a GOOD thing. Usually what it actually means is the layout of known features changes for seemingly arbitrary reasons. Other times more dark patterns are injected. And Discord, not to pick on them, but they're the absolute fucking worst for this. I swear they push an "update" every time one of their devs sneezes, I usually have to install 18 such updates on each launch, and I run it very regularly! And for all that churn, I couldn't tell you one goddamn thing they actually added recently. FUCKING embarrassing.
And people will say "oh they could be better," "we know we can do it better," "these aren't the best companies or apps" okay but they are BIG ones. If the mean average car in America got awful fuel economy, needed constant service, was ill-designed for its purpose and cost insane amounts of money...
Oh, that happened too. I think I just made my metaphor more embarrassing for another industry.
It's horrifying to see someone who started working 10 years after me talking about "when I was young" :-D
Programming was more exciting when you had amazing things to imagine having - like a 256 colour screen or BASIC that ran as fast as machine code (ACORN ARCHIMEDES) or the incredible thought of having a computer powerful enough to play a video and even having enough storage to hold a whole video!
2400bps! Checking email once a day
Everything came true. My screen has 16 million colours and I don't notice. I have 12 cores and 64GB of memory and fibre optic internet. The extremely exciting future arrived. That mind-expanding Byte article I read about object-oriented programming came true enough for us to see that it wasn't the answer to everything.
There are two problems now. Nothing one does is unique: you can't make a difference, because someone else has already done whatever you can think of doing. And of course AI, which is fun to use but threatens to make us even more useless.
I just cannot really get my sense of excitement back - which may just be because I'm old and burned out.
In an ironic twist of life, this is almost what I'm back doing right now. I turned off notifications and pull messages years ago because of all the messages I'm getting for a dozen different systems. I check mail at most a few times per day and that's it. I wouldn't be able to work if I'd have to actively keep an eye on them. I can get away with it because everybody is using Slack for work or WhatsApp for personal life, so there is no urgency to check mail. I'm on Slack too, so I see if I have messages there but WhatsApp is silenced and I allow no notification of any sort on the lock screen of my phone.
I'm disappointed that you said "2400bps" instead of "2400 baud". :/
It's always surprising to me when I see people being nostalgic for the old days. Yes, things seemed simpler, but it was because there was less you could do.
I'll always fondly remember my attempt to get on GeoLink with a 300 baud modem, and then my parents realizing that the long distance calls made it far, far too expensive to use, and returning it. Sure, I was disappointed at the time, but it wasn't too much later that 56k modems existed and we had a local unlimited internet provider. And now it's a fun story.
But I was actually just as frustrated then as I am now, only for different reasons. Change exists, and that's good.
I agree that it feels harder to make your mark today. But I don't think it's actually harder. There are plenty of fun things that people would love to see done. Just yesterday, I found out about the Strudel musical programming language and watched an amazing video of someone making trance with it. And those kinds of discoveries happen constantly now, where they were pretty rare 30 years ago.
We're at the point that anyone can make a game, app, webpage, etc if they put enough effort into it. The tools are so easy and the tutorials are so plentiful and free that it's really just about effort, instead of being blocked from it.
I've been saying "we live in the future" about once a month for years now. It's amazing what we have today.
> Yes, things seemed simpler, but it was because there was less you could do.
And because computing was less mature and a younger field overall. If computers had remained stagnant for 500 years, fixed as they were in 1980, I bet that programming would have become increasingly complex anyway, just to enable doing more with less.
> I'm disappointed that you said "2400bps" instead of "2400 baud". :/
haha :-) It was of course 2400 baud and we were using FidoNET which was very very exciting at that time in Zimbabwe. We'd spend 10 minutes trying to get a dial tone sometimes but it was magic when you connected and saw something was coming in. International telephone calls were so expensive that we talked to my brothers overseas once or twice a month at best. With email we could chat every day if we wanted.
The limitation then was information - no internet, no manuals, no documentation. I wrote a text editor and did my best to make it fast with clever schemes, but it always flickered. Many years later a pal at university in South Africa casually mentioned that graphics memory was slow, so it was actually best to write to regular memory and then REP MOVSB that to the graphics memory. I cursed out loud at this and asked him how he knew that?! Well, he lived in a more modern country and could buy the right books. Nowadays you really can be a Linux kernel programmer in the Congo if you want to.
It's not just that you're old and burned out. There is a declining marginal value of improvements.
Take sound, for example. Going from "no sound" to "sound" was huge. Going from just beeps to IBM PC sound was a real step. CD-quality was a real step. Going from CD-quality to 64-bit samples at a 1 MHz sample rate is a yawn. Nobody cares. The improvement on CD quality isn't enough to be interesting.
I have high enough bandwidth. Enough screen resolution. Enough RAM, enough CPU speed, good enough languages freely available, enough data.
The problem is, everything that was an easy win with all that has already been done. All that's left is things that aren't all that exciting to do.
(I don't think this is permanent. Something will come along eventually - a new paradigm, a new language, a new kind of data, something - that will open a bunch of doors, and then everything will be interesting again.)
> Funnily enough, everything ran at about the same speed as it does now.
Actually, where I was sitting on a decent PC with broadband Internet at the time, everything was much, much faster. I remember seeing a video on here where someone actually booted up a computer from the 2000's and showed how snappy everything was, including Visual Studio, but when I search YouTube for it, it ignores most of my keywords and returns a bunch of "how to speed up your computer" spam. And I can't find it in my bookmarks. Oh well.
Wow that was a great read, thank you. It's funny that it is already starting to break due to all of the links and ad tracking, which is another kind of rot.
Use Linux/KDE. None of the gains from the switch to SSDs have been lost. Everything is instant even on an n100. You only need something more powerful for compilation, gaming, or heavy multimedia (like the 200 Mbps 4k60 video my camera produces, which isn't accelerated by most processors because it's using 4:2:2 chroma subsampling).
Xfce or LXQt are also great alternatives, blazing fast even on 15 year-old hardware. Old hardware can be slow for basic web browsing and multimedia (e.g. watching videos with modern codecs) but other low-level uses are absolutely fine.
These days to get a snappy experience, one has to aggressively block everything and only selectively unblock the bare minimum, so that one doesn't get tons of bloat thrown in the direction of one's browser. Oh and forget about running JavaScript, because it _will_ be abused by websites. And then sites have the audacity to claim one is a bot.
Many websites are so shitty, they don't even manage to display static text, without one downloading tons of their JS BS.
You know what made javascript so common, accessible, and thus later universal?
Putting things on a screen was (is) stupidly simple.
That's the whole thing. It's not the types or npm or whatever. It's that you could start with C/Python/Java and spend your first 6 months as a coder printing and reading values on a dark terminal (something a newbie might never even have interacted with before), or you could go the HTML/CSS/JavaScript route and have animations moving on your screen you can show to your friends on day one.
Everything flows from that human experience. UIs are Electron because creating UIs for a browser is a vastly more universal, easy and creative-friendly experience than creating UIs for native apps, particularly if JS is your native language for the previously stated reason.
The industry failed to adapt to the fact that the terminal wasn't any longer the reality of computer users, and the web kinda filled that niche by default.
Also, React was an answer to the advent of mobile phones, tablets, smart TVs, and basically the explosion of not-a-desktop computer form factors. You could no longer assume your HTML was a proper format for everything. So you needed a common API to be consumed by a mobile app, a TV app, a tablet app... React was the way to make the web just another of the N apps that use the API, and not get special treatment. The idea made sense in context, back then.
> Putting things on a screen was (is) stupidly simple.
This is why HTML is still a great language for building UIs, and why Visual Basic had huge success in the early 90s: drag UI components onto a panel, write the callbacks for clicks, save and distribute the exe. Almost anybody could do it.
React and its siblings are much more complicated than that.
It's so very ironic that your comment appears on this particular post. We had a much better version of this with Interface Builder on NeXT in the late 1980s (and later Mac OS X). Similar tooling existed for DOS and Windows.
You literally laid out your UI in a WYSIWYG, drag and dropped connections to your code, and had something working on day one.
It was even easier than the web, because what you were laying out were full components with code and UI and behavior. The web still hasn’t caught up with that fully.
When I see comments like these, I better understand why old timers shake their fist at the youngsters reinventing the wheel badly because they don’t understand what came before.
The author is writing like Java was outlawed or something. There are tons of shitty enterprise Java jobs out there for those who want them. Personally, I worked one of those jobs a decade ago, and the article's description of the "golden age" didn't bring back good memories.
It's easy enough to avoid the NPM circus as well. Just don't put JavaScript on your resume and don't get anywhere near frontend development.
I've been coding for 40 years and professionally for about 30. And let's set this straight: it is much (much, like really much) better nowadays.
We have super powerful editors, powerful languages, a gazillion of libraries ready for download.
I used to write video games and it took months (yeah, months of hobby time) to put a sprite on a screen.
And for Java, yeah, things have improved so much too: the language of today is better, the tooling is better, the whole security is more complex but better, the JVM keeps rocking...
I am still a happy programmer after all these years using only node, express, postgres and sublime. I try not to listen to the sirens singing on the rocky shores...
Maven was a great idea. Introduce a high barrier to entry for publishing packages. Paired with a search box operated by carrier pigeon, this effectively meant what you were looking for didn't exist. Every time you had any kind of quirky need, you had to write it out by hand, or find someone smarter than you to do it for you, or worst of all buy it from a vendor for lots of money at horrible quality. People recite "DSA is bad for coding challenges, when will I ever need to write a hash map", but once upon a time you did have to write a hash map here and there. Supply-chain vulnerability is a cost, but the product was worth the cost: you can just import the darn package!
I need a map of key ranges to values with intelligent range merging; it's right there on crates.io to import and has been since 2016, while Maven Central didn't get one until 2018. In the olden days either it was in Apache Commons or it didn't exist. Halcyon days, those.
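For anyone who never had to do it, here is roughly what "write it out by hand" meant in practice: a minimal, hand-rolled range map over `java.util.TreeMap`. This is a sketch only; the class name and API are illustrative rather than the library artifact mentioned above, and it does no overlap checking or range merging:

```java
import java.util.Map;
import java.util.TreeMap;

// A minimal hand-rolled "range map": half-open [start, end) integer
// ranges mapped to values, backed by a TreeMap keyed on range starts.
// The kind of helper you once wrote yourself instead of importing.
class RangeMap<V> {
    private final TreeMap<Integer, Map.Entry<Integer, V>> ranges = new TreeMap<>();

    // Register the range [start, end) -> value. No overlap checks.
    void put(int start, int end, V value) {
        ranges.put(start, Map.entry(end, value));
    }

    // Look up the value whose range contains key, or null if none does.
    V get(int key) {
        // Nearest range starting at or below the key...
        Map.Entry<Integer, Map.Entry<Integer, V>> e = ranges.floorEntry(key);
        // ...then check the key falls before that range's end.
        if (e == null || key >= e.getValue().getKey()) return null;
        return e.getValue().getValue();
    }
}
```

`TreeMap.floorEntry` does all the heavy lifting here; the real point is that getting even this much right (plus merging, splitting on overlapping inserts, and so on) was your problem, not a one-line dependency.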
> We run our k8s cluster in the “Cloud”. It’s a bunch of services that run on Linux, but we don’t run Linux ourselves, we run it on VMs that we rent by the hour for approximately the same cost as buying a computer outright every month. We do this because no one knows how to plug a computer in any more.
My mind struggles with this reality everyday... the "cloud" has to be the most successful rebrand of all time. In 2005: "Be very careful what data you share on the internet". In 2025: "Yeah, I just put all my shit in the cloud".
It's just that the landscape has changed significantly compared to 2010.
Before, you didn't have current smartphones and tablets. On desktop you could support Windows only (today you probably have to support macOS as well). In the browser you didn't have to support Safari. You didn't have to worry much about screen DPI, aspect ratio, landscape vs portrait, or small screen sizes. Your layout didn't have to be adaptive. There was not much demand for SPAs. And back then far fewer people worldwide had access to the internet.
My first text editor for PHP was Komodo Edit. It was super slow and everyone jumped ship to Sublime Text; then VS Code was slower than Sublime but had incredible industry-supported extensions like Git and Prettier. Programming didn't peak during my PHP days; in fact, the PHP framework sites I used are still around and incredibly slow.
I am building an AI coding tool that doesn't eat RAM, there are super lightweight alternatives to Electron.
In this respect I'm liking vibe coding. I can tell it to use HTML, CSS and JS only and make me a frontend.
That obviously has severe limitations, but it's ideal if you don't like the front-end framework scene and want to put all your logic in the backend. And there I find it a bit easier to navigate.
RIP guys in corporate that have things imposed on them
I get it. I agree with most of this article. But also like, nothing went away.
If you pine for the days of Java and Maven, you can still do that. It’s all still there (Eclipse and NetBeans, too!)
If you don’t like using Node and NPM, that’s totally valid, don’t use them. You can spin up a new mobile app, desktop app, and even a SaaS-style web app without touching NPM. (Even on fancy modern latest-version web frameworks like Hanami or Phoenix)
If you don’t want everyone to use JS and NPM and React without thinking, be the pushback on a project at work, to not start there.
Java is usable now, but in 2013 it was the worst debugging experience one could have. I would rather work with PHP5 than with Java (unless I started a project from scratch). Also auto-refactoring was clearly worse, because well, Java. It was around that time that I tried Scala then Clojure, and even if debugging the JVM was still an experience (to avoid as much as possible), at least limited side effects reduced the issues.
If programming peaked, it certainly wasn't in 2010.
> it was the worst debugging experience one could have.
Hard disagree. I'm not going to argue that Java debugging was the best, however:
1. You could remote debug your code as it ran on the server.
2. You could debug code which wouldn't even compile, as long as your execution path stayed within the clean code.
3. You could then fix a section of the broken code and continue, and the debugger would backtrack and execute the code you just patched in during your debugging session.†
This is what I remember as someone who spent decades (since Java 1.0) working as a contract consultant, mainly on server side Java.
Of course this will not convince anyone who is determined to remain skeptical, but I think those are compelling capabilities.
† Now I code in Rust a lot, and I really enjoy it, but the long compile times and the inability to run broken code are two things that make me miss those Java days. And the modern 2025 debugger for it is often unable to inspect some of the variables for some reason, a bug I never encountered with Java.
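For anyone who hasn't seen it, the remote debugging in point 1 is a standard JVM capability via the JDWP agent rather than anything exotic; a typical launch (the port and jar name here are placeholders) looks something like:

```shell
# Start the JVM with the JDWP debug agent listening on port 5005;
# suspend=n means the app runs normally until a debugger attaches.
# Attach from Eclipse or IntelliJ with a "Remote JVM Debug" configuration.
java -agentlib:jdwp=transport=dt_socket,server=y,suspend=n,address=5005 -jar app.jar
```

Point 3 is what the Java tooling calls hot code replace: the IDE pushes the recompiled method body into the running JVM over this same connection, with some limits (e.g. changing a class's shape usually still requires a restart).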
I think I’m lucky, because for me it’s the other way. In 2009 I started my first real programming job writing c++ in vim. For the last 5 years I’ve been writing rust in helix and things have never been better.
Javascript wins by keeping the costs down. Companies today want to do more with less, which is how it should be and you are still free to choose from a myriad of technologies. When you pair this setup with LLMs, it's actually the best it has ever been IMO.
If the author doesn't want to work with NPM and the JavaScript ecosystem he could just get a job writing Spring/Boot, which makes up probably 90% of the jobs at large enterprise companies. I don't agree that this world has disappeared...
There's something to this. I recently shipped a music curation site and deliberately avoided React/Next/etc - just HTML, CSS, vanilla JS. The cognitive load difference is stark. The 'peak' might be less about capability and more about us rediscovering that simpler tools often suffice.
It's all about picking the right tools for the job. The "cognitive load" might be larger in a vanilla project compared to React when your interface is more complex and interactive.
Same. I build stuff for local businesses in my area with nothing but boring old HTML, PHP, CSS and JS. I guess my shit isn't "web scale" but it works, and it works consistently, with minimal downtime, and it worked during both Amazon and Cloudflare's latest outages.
I don't need my software to eat the world, I'm perfectly content with it just solving someone's problems.
Honestly, the person should spend their time fixing their shit instead of writing blog posts.
I find IntelliJ a great IDE, modern frameworks are fun to use, and AI helps me do things I don't want to do or things I just need (like generating a good README.md for other people).
Containers are great and mine build fast.
My startup has an HA setup with self-healing thanks to k8s, and everything is in code, so I don't need to worry about backing up some random config files.
Hardware has never been this cheap: NVMe, RAM and compute. A modern laptop today has a brilliant display, is quiet, can run everything, and has long battery life.
Traffic? No brainer.
WebSphere was a monster with shitty features. I remember when all the JEE servers finally had a startup time of just a few seconds instead of minutes, and RAM finally got cheap enough that you could run Eclipse, the web server, etc. locally.
This feels very much like the tired "the modern internet sucks - the old web with old websites was better!" trope that appears on here regularly.
You can still code the old way just like you can still put up your old website. No one is forcing you to use AI or VS Code or even JavaScript.
Of course you might not get those choices at work, but that is entirely different, since you're paid to do a job that benefits your employer, not paid to do something you enjoy.
Defining peak programming as enterprise Java using Eclipse on SVN is borderline delusional imo.
> And here’s the funny thing: it never broke, because the person who built it did it well, because it wasn’t taxed within an inch of its life, and because we were keeping an eye on it.
> The most popular is “VS Code”, which needs only a few gigabytes of RAM to render the text.
> We used Eclipse, which was a bit like VS Code.
This made me laugh. You can't possibly be a serious person if you think Eclipse was better in any way shape or form than VS Code.
I don't have a great memory, but one thing I can absolutely still remember with 100% clarity is how bloated and memory hungry Eclipse was. It was almost unusable.
This article in different forms keeps making the rounds and it's just so tiring. Yeah, let's remember everything that was great about 25 years ago and forget everything that sucked. Juxtapose it with everything that sucks about today but omit everything that's great. Come on man.
If you think things suck now, just make it better! The world is your playground. Nobody makes you use YAML and Docker and VS Code or whatever your beef is. Eclipse is still around! There's still a data center around your corner! Walk over and hang a server in the rack, put your hardly-typechecked Java 1.4 code on there and off you go!
> The world is your playground. Nobody makes you use YAML and Docker and VS Code or whatever your beef is
Nobody, except your future employment prospects.
There's good reasons and bad reasons for a lot of technical options; "can I hire people to do it?" is a very good reason, but it does directly lead to CV-driven-development, where we all chase whatever tech stack the people writing the job adverts have decided is good.
The same people who capitalise "MAC" in "MAC & PC", the same people who conflate Java with JavaScript, the same people who want 10 years experience in things only released 3 years ago.
Sure, you can do that for your hobby projects. But "at work" you generally have these decisions made for you. And these decisions have changed over time for the wrong reasons.
As an aside: if we say k8s, we should also say j8t.
> But "at work" you generally have these decisions made for you.
The idea that most employers make terrible decisions now, and amazing decisions back in the day, is plainly false. The author vividly recollects working at a decent Java shop. Even there I strongly doubt everything was as amazing as they describe, but it sounds decent indeed. But plenty of businesses at the time used C++, for no good reason other than inertia, usually in Windows-only Visual C++ 6-specific dialects. Their "build server" ran overnight, you'd check in your code in the late afternoon and get your compile errors back in the morning. The "source control" worked with company-wide file locks, and you'd get your ass phoned back to the office if you forgot to check in a file before leaving. Meanwhile, half the web was written in epic spaghetti plates of Perl. PHP was a joy to deploy, as it is now, but it was also a pain to debug.
If you care deeply about this stuff, find an employer who cares too. They existed back then and they exist now.
> Their "build server" ran overnight, you'd check in your code in the late afternoon and get your compile errors back in the morning.
This. Let's keep things in perspective when people complain about long Rust compile cycles. And even that's a whole lot better than filling in paper-based FORTRAN or COBOL coding forms to be punched into cards in the computing room and getting back line-printed program output (or a compiler error) the next week.
Did people really only compile once a night in the days of Visual Studio 6? There were Pentium IIs and IIIs back then.
It was definitely on the order of hours for large code bases - the Microsoft Excel team passed out punishment “suckers” for those who broke the build - causing 100+ people to not have a new working build to look at and test.
Linux kernel compiles in the 1990s were measured in hours, and that codebase was tiny compared to many. So, yep, builds were slow, slow enough to have an entire xkcd comic written about them.
Amen.
We love the idea that we don't have any agency in this field and we're constantly being pushed by the mean baddies at the top.
The "terrible decisions" of yore hold no comparison to today's "terrible decisions". It's not the same ballpark, it's not the same sport.
jubernetes?
That's the "let's rewrite k8s in proper enterprise Java like it was meant to be in the first place, none of this Golang nonsense" project.
jubernetet
> if we say k8s, we should also say j8t
It’s that one extra spoken syllable that pushes it into k8s I guess. ¯\_(ツ)_/¯
But it's only used in written form. You don't actually say out loud "kay eight es", do you?
Kocho rolls out the tongue for Spanish speakers.
Sounds like “coso” (“thingy”, mildly derogatory).
kate's
Seriously? What if you have a Kate in your team?
Then she ends up taking the blame for causing lots of headaches from people that can’t parse context
The automated system rejects their CV when applying obviously
Haha, oh my, no. That would be silly.
Clearly it’s “Kates”. Like “sk8er boy”.
Kates vs Jate?
How do you say i18n?
In my head?
Something like slurring the word “internationalization” combined with 18. “Ill-ateen-yon”
(In general I hate these numeric abbreviations, they’re terribly opaque to anyone who’s not clued in)
Eye eighteenin
institutionalization
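(For anyone not clued in: these numeronyms are purely mechanical. First letter, count of the letters in between, last letter. A throwaway sketch:)

```python
def numeronym(word: str) -> str:
    """Abbreviate a word as first letter + count of middle letters + last letter."""
    if len(word) < 4:
        return word  # too short to bother abbreviating
    return f"{word[0]}{len(word) - 2}{word[-1]}"

print(numeronym("kubernetes"))            # k8s
print(numeronym("internationalization"))  # i18n
print(numeronym("localization"))          # l10n
```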
> And these decisions have changed over time for the wrong reasons.
Have you ever considered that you don’t understand why those decisions were made and that’s why you think they were made for the wrong reasons?
I simply don't agree with the reasons, while you seem to imply that all decisions made are good.
The reasons are always the same. 20% because some changes are actually improvements, and 80% cargo cultism where if you just build the right containers and chant the correct YAML incantations, you too will become a Google… followed by the phase where everybody just keeps doing these things because they organizationally don't know any other way anymore, until the next big trendy thing, which does revert some of the issues of the previous trendy thing, but introduces a new set of already-solved problems because this profession is incredibly myopic when it comes to history.
They don't want a cure, they want something they can bitch about.
Speak for yourself. The pathologies/neuroses on display here... 'cure'... 'bitch'... seriously, are we still talking about software?
old man claims society collapsing; back in his day…
You'll get old too one day and it will look a whole lot different watching the younguns stumble through completely avoidable mistakes and forget the long lessons of your life that weren't properly taught or were just ignored.
We have records from many periods in history of old men crowing about how society is collapsing because of the weak new generations. Thing is, maybe they were always right, and the new generations just had to grow up and take responsibility? And then again, maybe sometimes they were a little too right and society did in fact collapse, but locally.
“Hard times create strong men. Strong men create good times. Good times create weak men. And, weak men create hard times.”
― G. Michael Hopf, Those Who Remain
Agreed! If anything, I think I'm tired of the "everyone says this when they get old!" hot take. Sometimes things really do get visibly worse and the intergenerational complaining about it is due to it really happening.
I bring this example up every time, but I'm a baseball fan. And seemingly every generation of fans has said there's more people striking out than there used to be. Is that just part of getting old as a baseball fan? No! It's really happening. Strikeouts have quite literally been going up from one decade to the next for basically a century.
So sometimes it's because the thing is really happening. Environmental ecosystem collapse? Real. People having shorter attention spans every next generation? Real! Politics getting worse? Well, maybe the 1860s were worse, but the downward trajectory over the last 50 years seems pretty real. Inefficiency of increasingly automagic programming paradigms? Real!
Sometimes things are true even if old people are saying them.
An interesting Reddit r/AskHistorians thread on the question """Does the aphorism "Hard times create strong men. Strong men create good times. Good times create weak men. Weak men create hard times", accurately reflect the evolution of civilizations through history and across different cultures?"""
copying only the conclusion for a tl;dr: "The only way that the aphorism explains history is by reinforcing confirmation bias - by seeming to confirm what we already believe about the state of the world and the causes behind it. Only those worried about a perceived crisis in masculinity are likely to care about the notion of "weak men" and what trouble they might cause. Only those who wish to see themselves or specific others as "strong men" are likely to believe that the mere existence of such men will bring about a better world. This has nothing to do with history and everything with stereotypes, prejudice and bias. It started as a baseless morality tale, and that is what it still is."
https://old.reddit.com/r/AskHistorians/comments/hd78tv/does_...
Ironic given how strongly Reddit, r/AskHistorians, and the humanities in general bring their own biased lens to bear. Just like everyone else.
I’m not sure what you mean. I don’t see a bias here, the point is plainly stated: the notion of weak men is dubious. You might not agree, but then engage with something substantial.
they were not right and I promise when I’m old, I will not have this attitude. it’s one of my least favorite types of people; and that’s precisely my point, old men have been saying society is collapsing since ancient times, yet here we are, with things better than ever
> they were not right
If you don't think they were at least sometimes right, to what do you instead attribute the various cases of socio-economic collapse documented throughout history?
> I promise when I’m old, I will not have this attitude.
To my ears this is a hilariously naive statement. It sounds to me like roughly the equivalent of a 7-year-old saying "Adults have boring jobs where they sit at a desk all day. I hate it. I promise when I'm old I'm gonna be an astronaut or play Major League Baseball."
It's not that they don't mean it, it's that one shouldn't make promises about a situation they can't yet understand. While some of those kids probably did end up being astronauts or baseball players, 99%+ of those who made that promise didn't. It turns out that being an adult gives them perspective that helps them realize the reasons they want a desk job even if they don't like it, or, for many, that they actually enjoy their desk job (e.g. they like to program).
So if a million young people all thought similarly, and then magically changed their view when they got there, don't promise you're going to be the one who breaks the streak.
You might turn out to be an astronaut, but many people listening, based on good previous evidence, will rightly assume you won't.
The quote at the end has absolutely nothing to do with old people not liking change.
That's because the thesis it's expounding on isn't "old people don't like change", but rather "experienced people often see their juniors unknowingly making avoidable mistakes".
It kind of does, no one repeating it ever thinks they’re the weak men creating hard times
Old man yells at cloud native...
The better tools are already there. It's not about making it better. It's about most programmers today choosing mediocrity.
Sometimes, you can avoid contact with this mediocrity, but often you don't and you're forced to play in this swamp.
> choosing
Sometimes the programmer doesn't get to choose. I do find the business drive toward mediocrity quite maddening.
Yes, sometimes it's that, but what I've observed in the field is self-inflicted by programmers. It's the impostor syndrome. They don't know, but they don't want to show that they don't know. They look around for clues of what to choose. Monkey see, monkey do. And then, like monkeys, they bash their keyboard until it seems to work.
>Nobody makes you use YAML and Docker and VS Code or whatever your beef is.
Not VS Code, but maybe YAML and Docker if your company is trying to align what tools it uses. C# places might still force you to use Visual Studio proper. Everyone says use the right tool for the job, but for bog standard CRUD web development, we do have a shitload of tools that work and there's multiple ways to get to a fast, working product.
I still chuckle that my laptop is 3 times as fast as the cloud thing that serves our CRUD app and we pay many times more for it, but also knowing full well I do not ever want to be RDP'ing into a production box again and poring over IIS or Windows logs.
What I definitely do see is a degradation in making choices about when to adopt a more complicated technology because of the incentives of the hiring market.
People have loudly beaten the drum to keep your skills up to date and so people choose stuff that's popular because it benefits them personally, even when the product doesn't need it. This in turn leads companies to only select people who know that stack, and the more that companies do that, the more people make technical choices to get them the best job that they can handle.
We absolutely, very much 100% see that happening now with LLM AI if you ever needed a bigger piece of proof. Pretty much everything that is happening now has just been a louder example of every bad practice since the run up to the dotcom bust.
Because of that, I'd frankly never suggest running on-prem or building a local-only app unless there was a much bigger reason (legal, security, whatever) especially if the other products in the company have chosen the cloud.
Why? Because convincing the next job that that would have been the right choice is too hard.
Edit: and to someone else's point, I made the choice to be in the Microsoft/Azure/Windows hell hole but digging myself out and moving to something else is practically working a second full-time job and holding 2 ecosystems in my head at once
I both agree and disagree with you.
On the one hand: yes, this dev has clearly chosen a career/language specialization that puts him knee-deep in the absolute worst tooling imaginable.. I cannot fathom a workflow this fucking miserable and if this was my day to day, I would be far, far more depressed than I already am.
AND, the fact that so very, very much of our industry runs, if not all of this workflow, then a significant chunk of one not awfully different from it, is IMO an indictment of our trade. To invoke the immortal sentiment of the hockey coach from Letterkenny, this shit is FUCKING embarrassing.
So much major software that ships in a web browser because writing for Windows, Mac and Linux is just too hard you guys, it's simply too much for a sweet little bean like Microsoft ($3.62 trillion) to manage as they burn billions on AI garbage, is FUCKING embarrassing.
Half the apps on my phone are written this way which is why they can barely manage 30hz on their animations, die entirely when S3 goes down, and when they are working, make my phone hot. To run an app that lets me control my thermostat from my desk. That's FUCKING embarrassing.
And my desktop is only saved by virtue of being magnitudes more powerful than my original one back in the 90's, yet it only seems a scant more capable. In the early 00's I was sitting on Empire Earth and chatting with people over TeamSpeak. My computer can still do this, and with the added benefit of Discord can stream my game so my friends can all watch each other, and that's cool, apart from I lose about 10 fps just by virtue of having Discord open, and when I'm traveling? Oh god forget it, Discord flounders to death on hotel wifi despite it being perfectly cromulent DSL speeds. Not BLAZING, surely, but TeamSpeak handled VOIP over an actual DSL connection, with DSL latency, in the 00's. That's FUCKING embarrassing.
All our software now updates automatically by default, and it's notable when that's a GOOD thing. Usually what it actually means is the layout of known features changes for seemingly arbitrary reasons. Other times more dark patterns are injected. And Discord, not to pick on them, but they're the absolute fucking worst for this. I swear they push an "update" every time one of their devs sneezes, I usually have to install 18 such updates on each launch, and I run it very regularly! And for all that churn, I couldn't tell you one goddamn thing they actually added recently. FUCKING embarrassing.
And people will say "oh they could be better," "we know we can do it better," "these aren't the best companies or apps" okay but they are BIG ones. If the mean average car in America got awful fuel economy, needed constant service, was ill-designed for it's purpose and cost insane amounts of money...
Oh, that happened too. I think I just made my metaphor more embarrassing for another industry.
Agreed. If folks want to write Java in Eclipse they are more than welcome to do so... don't understand these yelling-at-clouds posts, really.
Somehow this reminded me of a similar rant about devops:
It’s the Future - https://blog.paulbiggar.com/its-the-future/
And now I see that's from June 2015 -- it's over 10 years old now! Wow
I'm not sure we're really in a better place in the cloud now ... The article says
I’m going back to Heroku
and in 2025, I think people still want that
It's horrifying to see someone who started working 10 years after me talking about "when I was young" :-D
Programming was more exciting when you had amazing things to imagine having - like a 256 colour screen or BASIC that ran as fast as machine code (ACORN ARCHIMEDES) or the incredible thought of having a computer powerful enough to play a video and even having enough storage to hold a whole video!
2400bps! Checking email once a day
Everything came true. My screen has 16 million colours and I don't notice. I have 12 cores and 64GB of memory and fibre optic internet. The extremely exciting future arrived. That mind-expanding BYTE article I read about object-oriented programming came true enough for us to see that it wasn't the answer to everything.
There are 2 problems now - nothing one does is unique. You can't make a difference because someone else already has done whatever you can think of doing ....and of course AI which is fun to use but threatens to make us even more useless.
I just cannot really get my sense of excitement back - which may just be because I'm old and burned out.
> Checking email once a day
In an ironic twist of life, this is almost what I'm back doing right now. I turned off notifications and pull messages years ago because of all the messages I'm getting for a dozen different systems. I check mail at most a few times per day and that's it. I wouldn't be able to work if I'd have to actively keep an eye on them. I can get away with it because everybody is using Slack for work or WhatsApp for personal life, so there is no urgency to check mail. I'm on Slack too, so I see if I have messages there but WhatsApp is silenced and I allow no notification of any sort on the lock screen of my phone.
I'm disappointed that you said "2400bps" instead of "2400 baud". :/
It's always surprising to me when I see people being nostalgic for the old days. Yes, things seemed simpler, but it was because there was less you could do.
I'll always fondly remember my attempt to get on GeoLink with a 300 baud modem, and then my parents realizing that the long distance calls made it far, far too expensive to use, and returning it. Sure, I was disappointed at the time, but it wasn't too much later that 56k modems existed and we had a local unlimited internet provider. And now it's a fun story.
But I was actually just as frustrated at the time as I am now, but for different reasons. Change exists, and that's good.
I agree that it feels harder to make your mark today. But I don't think it's actually harder. There are plenty of fun things that people would love to see done. Just yesterday, I found out about the Strudel musical programming language and watched an amazing video of someone making Trance with it. And those kinds of discoveries happen constantly now, where they were pretty seldom back 30 years ago.
We're at the point that anyone can make a game, app, webpage, etc if they put enough effort into it. The tools are so easy and the tutorials are so plentiful and free that it's really just about effort, instead of being blocked from it.
I've been saying "we live in the future" about once a month for years now. It's amazing what we have today.
> Yes, things seemed simpler, but it was because there was less you could do.
And because computing was less mature and a younger field overall. If computers remained stagnant for 500 years, fixed as they were in 1980, I bet that programming would become increasingly more complex, just to enable doing more with less.
> I'm disappointed that you said "2400bps" instead of "2400 baud". :/
haha :-) It was of course 2400 baud and we were using FidoNET which was very very exciting at that time in Zimbabwe. We'd spend 10 minutes trying to get a dial tone sometimes but it was magic when you connected and saw something was coming in. International telephone calls were so expensive that we talked to my brothers overseas once or twice a month at best. With email we could chat every day if we wanted.
The limitation then was information - no internet, no manuals no documentation. I wrote a text editor and did my best to make it fast with clever schemes but it always flickered. Many years later a pal at university in South Africa casually mentioned that graphics memory was slow so it was actually best to write to memory and then REP MOVSB that to the graphics memory. I cursed out loud at this and asked him how he knew that?! Well, he lived in a more modern country and could buy the right books. Nowadays you really can be a linux kernel programmer in the Congo if you want to.
From no internet to internet but only at work, to 56k at home, to DSL, to fiber. That's quite a ride in one lifetime.
It's not just that you're old and burned out. There is a declining marginal value of improvements.
Take sound, for example. Going from "no sound" to "sound" was huge. Going from just beeps to IBM PC sound was a real step. CD-quality was a real step. Going from CD-quality to 64-bit samples at a 1 MHz sample rate is a yawn. Nobody cares. The improvement on CD quality isn't enough to be interesting.
I have high enough bandwidth. Enough screen resolution. Enough RAM, enough CPU speed, good enough languages freely available, enough data.
The problem is, everything that was an easy win with all that has already been done. All that's left is things that aren't all that exciting to do.
(I don't think this is permanent. Something will come along eventually - a new paradigm, a new language, a new kind of data, something - that will open a bunch of doors, and then everything will be interesting again.)
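The diminishing returns are easy to put numbers on. A rough back-of-the-envelope sketch (assuming uncompressed stereo PCM, which is my assumption, not the parent's):

```python
def audio_bitrate(sample_rate_hz: int, bits_per_sample: int, channels: int = 2) -> float:
    """Uncompressed PCM bit rate in megabits per second."""
    return sample_rate_hz * bits_per_sample * channels / 1e6

cd = audio_bitrate(44_100, 16)          # CD quality: 44.1 kHz, 16-bit
absurd = audio_bitrate(1_000_000, 64)   # the hypothetical 1 MHz, 64-bit format

print(f"CD quality: {cd:.2f} Mbps")        # 1.41 Mbps
print(f"1 MHz / 64-bit: {absurd:.0f} Mbps")  # 128 Mbps
```

Roughly 90x the data for a difference no human can hear, versus the clearly audible jump from PC-speaker beeps to CD audio.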
> Funnily enough, everything ran at about the same speed as it does now.
Actually, where I was sitting on a decent PC with broadband Internet at the time, everything was much, much faster. I remember seeing a video on here where someone actually booted up a computer from the 2000's and showed how snappy everything was, including Visual Studio, but when I search YouTube for it, it ignores most of my keywords and returns a bunch of "how to speed up your computer" spam. And I can't find it in my bookmarks. Oh well.
> I remember seeing a video on here where someone actually booted up a computer from the 2000's and showed how snappy everything was...
Was this what you were referring to?: https://jmmv.dev/2023/06/fast-machines-slow-machines.html
Notice also how you can clearly see which window is selected, and which item within that window is selected. Microsoft has just lost the plot.
Wow that was a great read, thank you. It's funny that it is already starting to break due to all of the links and ad tracking, which is another kind of rot.
Use Linux/KDE. None of the gains from the switch to SSDs have been lost. Everything is instant even on an n100. You only need something more powerful for compilation, gaming, or heavy multimedia (like the 200 Mbps 4k60 video my camera produces, which isn't accelerated by most processors because it's using 4:2:2 chroma subsampling).
Xfce or LXQt are also great alternatives, blazing fast even on 15 year-old hardware. Old hardware can be slow for basic web browsing and multimedia (e.g. watching videos with modern codecs) but other low-level uses are absolutely fine.
> Use Linux/KDE
I do. It's not without its problems but currently it's the least bad solution.
These days to get a snappy experience, one has to aggressively block everything and only selectively unblock the bare minimum, so that one doesn't get tons of bloat thrown in the direction of one's browser. Oh and forget about running JavaScript, because it _will_ be abused by websites. And then sites have the audacity to claim one is a bot.
Many websites are so shitty, they don't even manage to display static text, without one downloading tons of their JS BS.
I think the web is not livable anymore without Firefox and uBlock Origin.
YouTube has become enshittified at a record clip. Search is useless and shorts have turned it into just another TikTok brain dopamine machine.
You know what made javascript so common, accessible, and thus later universal?
Putting things on a screen was (is) stupidly simple.
That's the whole thing. It's not the types or npm or whatever. It's that you could start with C/Python/Java and spend your first 6 months as a coder printing and reading values on a dark terminal (which is something a newbie might not even have interacted with before), or you could go the HTML/CSS/JavaScript route and have animations moving on your screen that you can show to your friends on day one.
Everything flows from that human experience. UIs are Electron because creating UIs for a browser is an extremely more universal, easy and creative-friendly experience than creating UIs for native apps, particularly if JS is your native language for the previously stated reason.
The industry failed to adapt to the fact that the terminal wasn't any longer the reality of computer users, and the web kinda filled that niche by default.
Also, React was an answer to the advent of mobile phones, tablets, smart tvs, and basically all the explosion of not-a-desktop computer form factors. You could no longer assume your html was a proper format for everything. So you need an common API to be consumed by a mobile app, a tv app, a tablet app... react was the way to make the web another of the N apps that use the api, and not get special treatment. The idea made sense in context, back then.
> Putting things on a screen was (is) stupidly simple.
This is why HTML is still a great language for building UIs, and it's why Visual Basic had huge success in the early 90s: drag UI components onto a panel, write the callbacks for clicks, save and distribute the exe. Almost anybody could do it.
React and its siblings are much more complicated than that.
> The industry failed to adapt to the fact that the terminal wasn't any longer the reality of computer users
I'm still waiting for some GUI-based terminal to appear
I want to call git in Scratch!
Jupyter notebook UI is essentially a proper GUI-based REPL/terminal.
It’s so very ironic that your comment appears on this particular post. We had a much better version of this with Interface Builder on NeXT in the late 1980s (and later Mac OS X). Similar tooling existed for DOS and Windows.
You literally laid out your UI in a WYSIWYG, drag and dropped connections to your code, and had something working on day one.
It was even easier than the web, because what you were laying out were full components with code and UI and behavior. The web still hasn’t caught up with that fully.
When I see comments like these, I better understand why old timers shake their fist at the youngsters reinventing the wheel badly because they don’t understand what came before.
The author is writing like Java was outlawed or something. There are tons of shitty enterprise Java jobs out there for those who want them. Personally, I worked one of those jobs a decade ago, and the article's description of the "golden age" didn't bring back good memories.
It's easy enough to avoid the NPM circus as well. Just don't put JavaScript on your resume and don't get anywhere near frontend development.
I've been coding since 40 years old and professionally since about 30. And let's set this straight: it is much (much, like really much) better nowadays.
We have super powerful editors, powerful languages, a gazillion of libraries ready for download.
I used to write video games and it took months (yeah, months of hobby time) to put a sprite on a screen.
And for Java, yeah, things have improved so much too: the language of today is better, the tooling is better, the whole security is more complex but better, the JVM keeps rocking...
And now we've got Claude.
I'm really happy to be now.
were you trying to say you've been coding *for* 40 years? that "old" in "40 years old" confuses me a lot
or are you 60 years of age now, starting at 40 and have been coding long enough to see editors progress?
I am still a happy programmer after all these years using only node, express, postgres and sublime. I try not to listen to the sirens singing on the rocky shores...
Maven was a great idea. Introduce a high barrier to entry for publishing packages. Paired with a search box operated by carrier pigeon, this effectively meant what you were looking for didn't exist. Every time you had any kind of quirky need, you had to write it out by hand, or find someone smarter than you to do it for you, or worst of all buy it from a vendor for lots of money at horrible quality. People recite 'DSA is bad for coding challenges, when will I need to write a hash map', but once upon a time you did have to write a hash map here and there. Supply chain vulnerability is a cost, but the product was worth the cost: you can just import the darn package!
I need a map of key ranges to values with intelligent range merging; it is right there on crates.io to import, and it has been there since 2016. Maven Central didn't get one until 2018. In the olden days either it was in Apache Commons or it didn't exist. Halcyon days, those.
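For anyone wondering what that structure is: an interval map, half-open key ranges mapped to values, where inserts overwrite overlaps and adjacent ranges with equal values get coalesced. A toy Python sketch of the idea (my own illustration, not any particular library's API):

```python
from typing import Any

def insert_range(ranges: list[tuple[int, int, Any]],
                 start: int, end: int, value: Any) -> list[tuple[int, int, Any]]:
    """Insert [start, end) -> value into a sorted, non-overlapping range list,
    overwriting overlaps and merging adjacent ranges that share a value."""
    out = []
    for s, e, v in ranges:
        if e <= start or s >= end:
            out.append((s, e, v))          # no overlap: keep as-is
        else:
            if s < start:
                out.append((s, start, v))  # keep the left remainder
            if e > end:
                out.append((end, e, v))    # keep the right remainder
    out.append((start, end, value))
    out.sort()
    # coalesce neighbours that touch and carry the same value
    merged: list[tuple[int, int, Any]] = []
    for s, e, v in out:
        if merged and merged[-1][1] == s and merged[-1][2] == v:
            merged[-1] = (merged[-1][0], e, v)
        else:
            merged.append((s, e, v))
    return merged

m = insert_range([], 0, 10, "a")
m = insert_range(m, 10, 20, "a")   # adjacent, same value: merges into one range
m = insert_range(m, 5, 15, "b")    # overwrites the middle
print(m)  # [(0, 5, 'a'), (5, 15, 'b'), (15, 20, 'a')]
```

Writing (and debugging) exactly this kind of thing by hand was the pre-package-registry experience the parent is describing.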
> We run our k8s cluster in the “Cloud”. It’s a bunch of services that run on Linux, but we don’t run Linux ourselves, we run it on VMs that we rent by the hour for approximately the same cost as buying a computer outright every month. We do this because no one knows how to plug a computer in any more.
My mind struggles with this reality everyday... the "cloud" has to be the most successful rebrand of all time. In 2005: "Be very careful what data you share on the internet". In 2025: "Yeah, I just put all my shit in the cloud".
[dead]
> Where did it all go wrong?
It's just that the landscape changed significantly compared to 2010.
Before, you didn't have current smartphones and tablets. On desktop you could just support Windows only (today you probably have to support macOS as well). In the browser you didn't have to support Safari. You didn't have to worry much about screen DPI, aspect ratio, landscape vs portrait, or small screen sizes. Your layout didn't have to be adaptive. There was not much demand for SPAs. Back then there were far fewer people worldwide with access to the internet.
My first text editor for PHP was Komodo Edit. It was super slow and everyone jumped ship to Sublime Text; then VS Code came along, slower than Sublime but with incredible industry-supported extensions like Git and Prettier. Programming didn't peak during my PHP days; in fact, the PHP framework sites I used are still around and incredibly slow.
I am building an AI coding tool that doesn't eat RAM, there are super lightweight alternatives to Electron.
In this respect I’m liking vibe coding. I can tell it to use HTML, CSS and JS only and make me a frontend.
That obviously has severe limitations, but it's ideal if you don’t like the frontend framework scene and want to put all your logic in the backend. And there I find it a bit easier to navigate.
RIP the guys in corporate who have these things imposed on them.
I get it. I agree with most of this article. But also like, nothing went away.
If you pine for the days of Java and Maven, you can still do that. It’s all still there (Eclipse and NetBeans, too!)
If you don’t like using Node and NPM, that’s totally valid, don’t use them. You can spin up a new mobile app, desktop app, and even a SaaS-style web app without touching NPM. (Even on fancy modern latest-version web frameworks like Hanami or Phoenix)
If you don’t want everyone to use JS and NPM and React without thinking, be the pushback on a project at work, to not start there.
Java is usable now, but in 2013 it was the worst debugging experience one could have. I would rather have worked with PHP 5 than with Java (unless I was starting a project from scratch). Auto-refactoring was also clearly worse, because, well, Java. It was around that time that I tried Scala and then Clojure, and even if debugging the JVM was still an experience (one to avoid as much as possible), at least limited side effects reduced the issues.
If programming peaked, it certainly wasn't in 2010.
> it was the worst debugging experience one could have.
Hard disagree. I'm not going to argue that Java debugging was the best, however:
1. You could remote debug your code as it ran on the server.
2. You could debug code which wouldn't even compile, as long as your execution path stayed within the clean code.
3. You could then fix a section of the broken code and continue, and the debugger would backtrack and execute the code you just patched in during your debugging session.†
This is what I remember as someone who spent decades (since Java 1.0) working as a contract consultant, mainly on server side Java.
Of course this will not convince anyone who is determined to remain skeptical, but I think those are compelling capabilities.
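For point 1, remote debugging has long been just a matter of launching the JVM with the JDWP agent and attaching the IDE's debugger to the port. A sketch, not a prescription: the port and jar name are placeholders, and the `*:` host prefix is the Java 9+ syntax (older JVMs take plain `address=5005`):

```shell
# Start the server JVM with the debug agent listening on port 5005.
# server=y  -> the JVM listens for a debugger to attach
# suspend=n -> the app starts immediately instead of waiting for the debugger
java -agentlib:jdwp=transport=dt_socket,server=y,suspend=n,address=*:5005 -jar MyServer.jar
```

The IDE then attaches to host:5005, sets breakpoints, and, within the limits of HotSwap, pushes edited method bodies into the running process.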
† Now I code in Rust a lot, and I really enjoy it, but the long compile times and the inability to run broken code are two things I really miss from those Java days. And often the modern 2025 debugger is unable to inspect some of the variables for some reason, a bug I never encountered with Java.
About your †, I think that new Rust tooling like dx will eventually enable hot patching code during debugging
https://lib.rs/crates/subsecond
(Note, it was created by Dioxus, but it's usable in any Rust project)
> 2. You could debug code which wouldn't even compile, as long as your execution path stayed within the clean code.
In Java? How?
I think I’m lucky, because for me it’s the other way. In 2009 I started my first real programming job writing c++ in vim. For the last 5 years I’ve been writing rust in helix and things have never been better.
Javascript wins by keeping the costs down. Companies today want to do more with less, which is how it should be and you are still free to choose from a myriad of technologies. When you pair this setup with LLMs, it's actually the best it has ever been IMO.
If the author doesn't want to work with NPM and the JavaScript ecosystem he could just get a job writing Spring Boot, which makes up probably 90% of the jobs at large enterprise companies. I don't agree that this world has disappeared...
There's something to this. I recently shipped a music curation site and deliberately avoided React/Next/etc - just HTML, CSS, vanilla JS. The cognitive load difference is stark. The 'peak' might be less about capability and more about us rediscovering that simpler tools often suffice.
It's all about picking the right tools for the job. The "cognitive load" might be larger in a vanilla project compared to React when your interface is more complex and interactive.
Same. I build stuff for local businesses in my area with nothing but boring old HTML, PHP, CSS and JS. I guess my shit isn't "web scale" but it works, and it works consistently, with minimal downtime, and it worked during both Amazon and Cloudflare's latest outages.
I don't need my software to eat the world, I'm perfectly content with it just solving someone's problems.
Honestly, this person should spend their time fixing their shit instead of writing blog posts.
I find intellij a great IDE, modern frameworks are fun to use, ai helps me doing things i don't want to do or things i just need (like generate a good README.md for the other people).
Containers are great and mine build fast.
My startup has an HA setup with self-healing thanks to k8s, and everything is in code, so I don't need to worry about backing up some random config files.
Hardware has never been this cheap: NVMe storage, RAM and compute. A modern laptop today has a brilliant display, is quiet, can run everything, and has long battery life.
Traffic? No brainer.
WebSphere was a monster with shitty features. I remember when all the JEE servers finally got a startup time of just a few seconds instead of minutes, and RAM finally got cheap enough that you could run Eclipse, a web server etc. locally.
Java was verbose, a lot more verbose than today.
jQuery was everywhere.
Ew
truthnuke
This feels very much like the tired "the modern internet sucks - the old web with old websites was better!" trope that appears on here regularly.
You can still code the old way just like you can still put up your old website. No one is forcing you to use AI or VS Code or even JavaScript.
Of course you might not get those choices at work, but that's entirely different, since you're paid to do a job that benefits your employer, not paid to do something you enjoy.
Have fun.
MY GOD THIS IS GOLD. Nothing but the truth here.
Programming was so much better 15 years ago, except for all the parts that sucked.
> I love writing JavaScript, and I’m glad I can run it on the server. But this doesn’t mean I think it’s a good idea to use it for everything.
"Javascript" programming peaked.
this is going straight into my funny folder
Defining peak programming as enterprise Java using Eclipse on SVN is borderline delusional imo.
> And here’s the funny thing: it never broke, because the person who built it did it well, because it wasn’t taxed within an inch of its life, and because we were keeping an eye on it.
Sure, if you say so buddy...
> The most popular is “VS Code”, which needs only a few gigabytes of RAM to render the text.
> We used Eclipse, which was a bit like VS Code.
This made me laugh. You can't possibly be a serious person if you think Eclipse was better in any way shape or form than VS Code.
I don't have a great memory, but one thing I can absolutely still remember with 100% clarity is how bloated and memory hungry Eclipse was. It was almost unusable.
In other words you have competitive advantage because your cloud costs will be 10x less.
This is exactly what 10 years of experience did for you. Why complain?
how tedious must a life be if you go through it with this kind of thinking