
I'm not sure how I can prove it, but ~25 years ago building software without open source sucked. You had to build everything from scratch! It took months to get even the most basic things up and running.

I think open source is the single most important productivity boost to our industry that's ever existed. Automated testing is a close second.

Google, Facebook, many others would not have existed without open source to build on.

And those giants and others like them that were enabled by open source employed a TON of people, at competitive rates that greatly increased our salaries.



25 years ago, I was slinging apps together super fast using VB6. It was awesome. It was a level of productivity few modern stacks can approach.


I'm too young to have used VB in the workforce, but I did use it in school, and honestly off that alone I'm inclined to agree.

I've seen VB namedropped frequently, but I feel like I've yet to see a proper discussion of why it seems like nothing can match its productivity and ease of use for simple desktop apps. Like, what even is the modern approach for a simple GUI program? Is Electron really the best we can do?

MS Access is another retro classic of sorts: despite its many flaws, nothing seems to have risen to fill its niche other than SaaS webapps like Airtable.


You can add Macromedia Flash to that list - nothing has really replaced it, and as a result the world no longer has an approachable tool for building interactive animations.


https://www.youtube.com/watch?v=hnaGZHe8wws

This is a nice video on why Electron is the best you might be able to do.


Thanks for the link - this is a cool video. Though it seems like it's mostly focusing on the performance/"bloat" side of things. I do agree that's an annoying aspect of Electron, and I do think his justifications for it are totally fair, but I was thinking more about ease of use, especially for nontechnical people / beginners.

My memory of it is very fuzzy, but I recall VB being literally drag-and-drop, and yet still being able to make... well, acceptable UIs. I was able to figure it out just fine in middle school.

In comparison, here's Electron's getting started page: https://www.electronjs.org/docs/latest/ The "quick start" is two different languages across three different files. The number of technologies and buzzwords flying around is crazy: HTML, JS, CSS, Electron, Node, DOM, Chromium, random `charset` and `http-equiv` boilerplate... I have to imagine it'd be rather demoralizing as a beginner. I think there's a large group of "nontechnical" users out there (usually derided by us tech bros as "Excel programmers" or such) who can perfectly well understand the actual logic of programming, but are put off by the number of buzzwords and moving parts involved, and I don't blame them at all.

(And sure, don't want to go in too hard on the nostalgia. 2000s software was full of buzzwords and insane syntax too, we've improved a lot. But it had some upsides.)

It just feels like we lost the plot at some point when we're all using GUI-based computers, but there's no simple, singular, default path to making a desktop GUI app anymore on... any, I think, of the popular desktop OSes?


You are totally right. Going even further back, in the days of Turbo Pascal or Turbo C, you could add `uses Graph` or include graphics.h and get a very cool snake game going within half an hour. Today, doing anything like that takes a week of advanced stuff. Someone wanted to recreate that experience today and came up with this: https://github.com/dascandy/pixel

But look at how much boilerplate they had to write to make this work.

https://github.com/dascandy/pixel/blob/master/examples/simpl...

See the user example and then look at src for the boilerplate.
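
For contrast, here's roughly what the old Borland BGI version of "get something moving on screen" looked like. This is a from-memory sketch, so take the exact headers and signatures with a grain of salt; they varied between compiler versions and it won't build on a modern toolchain:

    /* From-memory sketch of the DOS-era Borland BGI style --
       headers/signatures varied by compiler version. */
    #include <graphics.h>  /* initgraph, cleardevice, circle, closegraph */
    #include <conio.h>     /* kbhit */
    #include <dos.h>       /* delay */

    int main(void)
    {
        int driver = DETECT, mode;
        int x = 100, y = 100;

        initgraph(&driver, &mode, "");   /* autodetect the video card */

        while (!kbhit()) {               /* run until any key is pressed */
            cleardevice();
            circle(x, y, 10);            /* the "snake head" */
            x += 2;                      /* drift to the right */
            delay(50);                   /* crude ~20 fps pacing */
        }

        closegraph();
        return 0;
    }

And that was the entire program: no project file, no build system, no window-system glue.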

In the old days, you could easily write a full operating system from scratch on an 8051 while using PS/2 peripherals. Today all peripherals are USB, and the USB 2.0 standard is 500 pages long.

I also agree that we have left behind the idea of teaching programming properly, or at least removed it from the mainstream.


> 25 years ago, I was slinging apps together super fast using VB6. It was awesome. It was a level of productivity few modern stacks can approach.

If that were true, wouldn't we all be using VB today?


Ever try to maintain a bunch of specialized one-off thrown-together things like that? I inherited a bunch of MS Access apps once ...

everything old is new again


Excel (and spreadsheets in general) is not quite the same as VB but is similar in that it solves practical problems and normal people can work with it.


How are you measuring productivity?

What one can make with VB6 (final release in 1998) is very far from what one can make with modern stacks. (My efficiency at building LEGO structures is unbelievable! I put the real civil engineers to shame.)

Perhaps you mean that you can go from idea to working software (by the standards and expectations of 1998) very quickly. If so, that probably felt awesome. But we live in 2025. Would you reach for VB6 now? How much credit does VB6 deserve? Also think about how 1998 was a simpler time, with lower expectations in many ways.

Will I grant advantages to certain aspects of VB6? Sure. Could some lessons be applicable today? Probably. But just like historians say, don't make the mistake of ignoring context when you compare things from different eras.


Agentic coding is just another rhyme of the 25-year-old frenzy of "let's outsource everything to India." The new generation thinks that this time it really is special. Let's check again in 25 years.


Indeed it did; I remember those times. All else being equal, I still think SWE salaries would on average have been higher if we had kept it that way, given basic economics: there would have been far fewer people capable of doing the work, but the high-ROI automation opportunities would still have been there. The fact that "it sucked" usually creates more scarcity on the supply side, which, all else being equal, means higher wages and, in our capitalist society, status. Other, older professions already know this and don't see SWEs as very "street smart" for disrupting themselves. I've seen articles recently along the lines of "at least we aren't in coding" from law, accounting, etc. as anecdotes of this.

With AI, at least locally, I'm seeing the opposite now: less hiring, less wage pressure, and in social circles a lot less status when I mention I'm a SWE (almost sympathy for my lot, versus respect only 5 years ago). While I don't care about the status aspect (though I do care about my ability to earn money), some people do.

At least locally, inflation-adjusted SWE wages in my city bought more and were generally higher relative to other professions in the '90s-2000s than they have been since (ex big tech), partly because the difficulty and low-level knowledge required meant only very skilled people could participate.


Monopolizing the work doesn't work unless you have the power to stop anyone else from joining the competition, i.e. "certified developers only".

Otherwise people would have realized they could charge 3x as much by being 5x as productive with better tools while you were writing your code in Notepad for maximum ROI, and you would have either adjusted or gone out of business.

Increased productivity isn't a choice, it's a result of competition. And that's a good thing overall, even if it sucks for some developers who now have to actually work for the first time in decades. But it's good for society at large, because more things can be done.


Sure, I agree with that, and I agree it's good for society, but as you state it's probably not as good for the SWE, who has to work harder for the same, which was my point and I think you agree. Other professions have done what you have stated (i.e. certification) and seen higher wages than otherwise, which also proves my point. They see this as the "street smart" thing to do, and society generally respects them for it, putting their profession on a higher pedestal as a result. People generally respect those who take care of themselves first, I find. Personally I think there should be a balance between the two (i.e. a fair go for all parties; a fair day's work with some job security over a standard career lifetime, but nothing extortionate).

Also, your notion of "better tools" might not have happened, or might have happened more slowly, without open source, AI, etc., which most probably would have meant higher salaries for longer. That's where I disagree with the parent poster's claim of higher salaries: AI seems to be a great recent example of "better tools" disrupting the premium SWEs enjoy rather than improving their salaries. Whether that's fair or not is a different debate.

I was just doubting the parent comment's notion that "open source software" and "automated testing" create higher salaries. Economically, efficiency usually (with some exceptional cases) creates lower salaries for the people who are made more efficient, all else being equal, and the value shifts from them to either consumers or employers.


> Other professions have done what you have stated (i.e. certification) and seen higher wages than otherwise, which also proves my point.

I'd generally agree with that where it regards safety (e.g. industrial control systems), but we manage that by certifying the manufacturer, not the individual developer. But otherwise I think it's harmful to society, even if beneficial to the individuals - though there are a lot of things falling into that bucket, and they're usually not the things we strive for at a societal level.

In my experience, getting better and faster has always translated into being paid more. I don't know that there's a direct relationship to specific tools, but I'm pretty sure that the mainstreaming of software development has caused the huge inflation of total comp that you see at many companies. If it were slow and there were only a handful of people who could do it, but they weren't really adding a huge amount of value, you wouldn't be seeing that kind of multiplier vs the average job.


> But otherwise I think it's harmful to society, even if beneficial to the individuals

I disagree a little, in that stability/predictability for people also adds some benefit to society. Constant disruption/change for the sake of efficiency would, at extreme levels, be bad for mental health at the very least, and would probably cause some level of outrage and dysfunction. I know that as an SWE, tbh, I'm feeling a bit of it; I can't imagine if it were everyone.

I personally think there is a tradeoff: people on average have limits to their adaptability over their lifetimes, so it needs to be worth it for people to invest in and enter a given profession (some level of economic profit that makes their limited time worth spending on it). It shouldn't be excessive though; both client and producer should get fair/equal value for the time/effort they each need to put in.


> ex big tech

I mean, this seems like a pretty big thing to leave out, no? That's where all the crazy high salaries were!

Also, there are still legacy places that more or less build software like it's 1999. I get the impression that embedded, automotive, and such still rely a lot on proprietary tools, finicky manual processes, low level languages (obviously), etc. But those are notorious for being annoying and not very well paid.


I'm talking about what I perceive to be the median salary/conditions, with big tech being only a part of that. My point is more that I remember that back in that period good salaries could be had outside big tech too, even at the boring standard companies you mention. I remember banks, insurance, etc. paying very well for an SWE/tech worker compared to today, for example - the good opportunities seemed more distributed. For instance, I've seen that contract rates for some of the developers we hire haven't really changed in 10 years. Now at best they are on par with other professional white-collar workers, and the competition seems fiercer (e.g. 5 interviews for a similar salary, with leetcode games rather than experience-based interviews).

Making software easier and more abstract has let less technical people into the profession, made outsourcing easier, meant more competition/interview prep to filter people out (even if those skills aren't used in the job at all), provided more material for AI to train on, etc. To the parent comment's point, I don't think it has boosted salaries and/or conditions on average for the SWE - in the long run (10+ years) it could be argued that economically the opposite has occurred.


Even if that's true, it's clear enough that AI will reduce the demand for SWEs.


I don't think that's certain. I'm hoping for a Jevons paradox situation where AI drives down the cost of producing software to the point that companies that previously weren't in the market for custom software start hiring software engineers. I think we could see demand go up.



