Hacker News | teucris's comments

Software developers should be worried about their jobs, not because these tools are capable of replacing them or reducing a company’s need for human developers, but rather because the _perception_ that they can/will replace developers is causing a major disruption in hiring practices.

I truly don’t know how this is going to play out. Will the software industry just be a total mess until agents can actually replace developers? Or will companies come to their senses and learn that they still need to hire humans - just humans that know how to use agents to augment their work?


Software development hiring is terrible right now, but hiring has been pretty slow in general. We gained 2 million jobs in 2024 and only 500,000 in 2025.

That can't possibly be a long-term disruption. If it doesn't work, it doesn't work.

If AI can't replace developers, companies can't replace developers with it. They can try — and then they'll be met with the reality, good or bad.


You may be right if AI truly can never replace devs. But history shows many examples of inadequate technologies getting hammered on by industry until they work due to distorted perceptions of economic gain. I’ve heard stories like this regarding mechanical looms, CNC, and cloud services. My understanding is those all work (decently) now not because they were obviously better, but because economic pressure pushed innovation to make them work, for better or worse.

> the _perception_ that they can/will replace developers is causing a major disruption in hiring practices.

Bingo.

And it’s causing the careers of a majority of juniors to experience fatal delays. Juniors need to leap into their careers and build up a good head of steam by demonstrating acquired experience, or they will wander off into other industries and fail to acquire said experience.

And when others who haven’t even gone through training yet see how juniors have an abysmally hard time finding a job, this will discourage them from even considering the industry before they ever begin to learn how to code.

But when no one is hiring, such that even students reconsider their career choice, this “failure to launch” will cause a massive developer shortage in the next 5-15 years, to the point where I believe entire governments will treat it as a policy pain point.

After all, when companies are loath to actually conduct any kind of on-the-job training, and demand 2-5 years of experience in a whole IT department’s worth of skills for “entry level” jobs, an entire generation of potential applicants with a fraction of that (or none at all) will cause the industry to have figurative kittens.

I mean, it will be the industry’s own footgun that has hurt them so badly. I would posit it may even become a leggun. The schadenfreude will be copious and well-deserved. But it’s going to produce massive amounts of economic pain.


> Juniors need to leap into their careers and build up a good head of steam by demonstrating acquired experience,

Junior devs at least have the option of building a portfolio of useful software on their own machine at home, while eating ramen.

They can build websites for mom'n'pop stores. They can participate in open source projects. Etc, etc...

I dread for the people who won't get jobs in other fields because managers have been told by corporate that "we don't need people, chatgpt can do everything".


> Junior devs at least have the option of building a portfolio of useful software on their own machine at home, while eating ramen.

Bold of you to assume today’s young adults can live without a steady paycheque, one that sucks up 40-80hrs of their time a week to earn. I mean, how else will that roof end up over their heads?

And most parents have been brainwashed to believe that any child not living on their own once they become an adult is a failure, and still needs to be kicked out of the house so that they are forced to learn self-sufficiency.

AFAICT, most parents of adult offspring have zero clue about how bad things actually are out there, with most still telling their children to go from business to business with printed-off résumés. Outside of blue-collar jobs, I haven’t seen this work for a good twenty years now.


I'm probably picturing "junior" as "people still in college preparing to get their first job".

My point is that a luxury of the software engineering craft is that you can practice "at pro level" very cheaply. Even as a teenager, learning and using vscode/python/react/etc... on your own is a possibility.

Learning Salesforce and SAP and the internal support tool of BigCo is not.

That being said, I completely agree that we're going to put a generation of "wannabe white collar" workers in a dire situation. Cynically, this might be an overdue correction from the years of "college degree for everyone", and maybe (just maybe) some people will actually thrive in "hard-to-llm-ize" professions if they can retrain.

(If market laws apply, someone will build "turn advertisement-copy writers into electricians", and it would not necessarily be for the worse. I know, that's easy for me to say, having gotten the opposite deal as an in-demand software engineer at the right time at the beginning of my career.)


> while eating ramens

For many, even cutting their budget isn’t enough to pursue what you’re describing. Modern careers in software are very hard to reach for people who can’t afford to wait for a real paycheck, and it drives away a massive group of potential talent.


> All of the "discoverability" algorithms are specifically and fundamentally about sifting through the millions to find the few that are preferred.

They are fundamentally about finding the content that will generate the most revenue. That changes the dynamics quite a bit.


You're not wrong, but the need to please the user is still paramount, otherwise they'll just do something else. This is why TikTok is eating everyone's lunch.


I don't agree with this, and to answer the question you originally asked me, I do think users are consuming things they don't actually enjoy. The goal isn't to please the user; the goal is to not bore the user. If you talk to people, I'm sure you'll find a lot of the music they listen to isn't "enjoyed" so much as it is inoffensive background noise.


It's not surprising that some people are mindless consumers, but it's not useful to assume the majority are, especially among paying customers, and especially when competition exists.


You're assuming it's not useful because it doesn't bode well for your argument. What makes you think assuming the majority aren't mindless consumers is useful?


Again, if people were content watching things they didn't enjoy, TikTok would not be eating everyone's lunch.

TikTok is not eating everyone's lunch. Instagram Reels and YouTube Shorts have caught up to it and in some metrics even beat it.

The hypocrisy lies in the fact that the philosophy of Ayn Rand - that an elite few held up society and the rest were pretty much just parasites - has been used at great length to justify the gutting of social programs.


Please read my comment in good faith. There is no contradiction with Rand’s philosophy here. According to her framework, the state stole from her throughout her life. Using public assistance is merely retrieving a small piece of that stolen money.


I agree that it was in her philosophical framework to accept social security - apologies if my comment seemed in bad faith due to that not being clearer. The irony does not lie with her, but rather those that use her philosophy to eliminate the safety net that she herself ended up using.

Sure, she could have used the money she had put into social security to invest, and maybe would have come out better off. But for those of us who see how public services can enrich an entire society, there is irony to how this all played out.


> I agree that it was in her philosophical framework to accept social security

Then where exactly is the irony or hypocrisy here?


"The irony is with those who believe that thievery is wrong. She obviously didn't believe what she wrote because her actions reveal she believed in stealing your stolen property back from a thief, which is itself thievery."


"The irony is with those who believe that thievery is wrong. She obviously didn't believe what she wrote because her actions reveal she was OK accepting when the thieve gave her your property to make up for the theft she suffered earlier"

FTFY.

She didn't steal from the thief; she became complicit with the thief stealing other people's work in order to get her own money back (graciously handed over by the thief).


And the gutting is done by the people she described as the parasites.


She believed that even wealthy kids that just live off their trust funds were parasites too. It was about consuming vs producing, not elite vs non-elite.


Indeed, dehumanizing people shouldn't be the foundation of a logical argument.

Have a wonderful day =3


It's pretty clear which group she would place Elon Musk into, probably the most Randian character out there.


I think of foundational models like CPUs. They’re the core of powerful, general-purpose computers, and will likely remain popular and common for most computing solutions. But we also have GPUs, microcontrollers, FPGAs, etc. that don’t just act as the core of a wide variety of solutions, but are also paired alongside CPUs for specific use cases that need specialization.

Foundational models are not great for many specific tasks. Assuming that one architecture will eventually work for everything is like saying that x86/amd64/ARM will be all we ever need for processors.


Not to be too pedantic, but code is a kind of specification. I think making the blanket statement "Prompt is code" is inaccurate, but there does exist a methodology of writing prompts as if they are specifications that can be reliably converted to computational actions, and I believe we're heading toward that.
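
To make that concrete, here is a rough sketch of what a prompt written as a specification might look like; the structure and field names are invented for illustration, not taken from any real tooling:

    # Purely illustrative: a prompt written with the structure of a spec
    # rather than free-form text. Nothing here refers to a real tool or API.
    spec_prompt = """
    Task: implement dedupe_orders(orders).

    Inputs:
      - orders: list of dicts with keys "id" (str) and "placed_at" (ISO-8601 str).

    Behavior:
      - Return a new list with duplicate ids removed, keeping the earliest
        "placed_at" for each id and preserving the relative order of survivors.

    Constraints:
      - Pure function, no I/O, no external libraries.

    Acceptance:
      - dedupe_orders([]) == []
      - Every id in the output is unique.
    """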


Yeah, I assumed someone would say this.

My manager gives me specifications, which I turn into code.

My manager is not coding.

This is how to look at it.


I’m all for this movement provided it’s actually focusing on the rights of individuals rather than empowering corporations to own and operate massive amounts of computing power unchecked. When I first read the article, I frankly assumed this was meant to limit regulation on AI. From what I’ve read in the law that doesn’t seem to explicitly be the case, but given the organizations involved, I fully expect to see more in that vein.


But this isn’t a suggestion to turn away from AI threats - it’s a matter of prioritization. There are more imminent threats that we know can turn apocalyptic that swaths of people in power are completely ignoring and instead fretting over AI.


Really appreciate the detailed article! I was on the team that shipped D3D11 and helped with the start of D3D12. I went off to other things and lost touch - the API has come a long way! So many of these features were just little whispers of ideas exchanged in the hallways. So cool to see them come to life!


That requires the operating system to “hint” to the display that there’s no refresh necessary and for the display to shut down during those times. That’s currently not supported as these kits just take a video signal, but it’s something being worked on for a future version!


Edit: You work on that stuff, right? Then this armchair experting feels silly, just imagine it's for other readers.

It seems much more practical (if a little less power-efficient) to implement the no diff -> no refresh logic for screen regions in the display hardware. The RAM and logic for a display-side framebuffer can't be expensive today, a couple of Euros for the extra board space and chip(s). If that stuff takes off, just additional transistors in the all-in-one ASIC.

For the whole screen, that more or less already exists in laptop hardware: "panel self-refresh". HDMI and DisplayPort might need a new extension or something? Is there anything?
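
For what it's worth, the "no diff -> no refresh" part is easy to sketch in software. A minimal illustration, assuming the display side keeps a copy of the previous frame (tile size and frame representation are made up here):

    import numpy as np

    TILE = 32  # hypothetical tile size in pixels

    def dirty_tiles(prev, curr):
        """Compare the incoming frame against the display-side copy and
        return the (x, y, w, h) rectangles that actually changed.
        Unchanged tiles are skipped entirely: no diff, no refresh."""
        h, w = curr.shape[:2]
        dirty = []
        for y in range(0, h, TILE):
            for x in range(0, w, TILE):
                a = prev[y:y + TILE, x:x + TILE]
                b = curr[y:y + TILE, x:x + TILE]
                if not np.array_equal(a, b):
                    dirty.append((x, y, b.shape[1], b.shape[0]))
        return dirty

An empty result means the whole frame is static and the panel can stay in self-refresh; real hardware would presumably keep per-tile checksums rather than doing a full compare.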


The Embedded DisplayPort standard has had the panel self-refresh feature since 2011, and the partial update feature since ~2015. I found a press release from Parade in early 2017 for a TCON supporting the partial refresh feature. I don't think there's anything missing from the ecosystem other than a strong impetus to use those features.


Yes! I work with the Modos team. You’re exactly right - ideally we want region-based refresh hinting. The SDK supports some region-based features - we’d like to extend that functionality.


“Traditionally, the [e-paper display] controller used a single-state machine to control the entire panel, with only two states: static and updating,” says Modos cofounder Wenting Zhang. “Caster treats each pixel individually rather than as a whole panel, which allows localized control on the pixels.”
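
A toy way to picture the per-pixel approach described in that quote (just a conceptual sketch, not Caster's actual implementation): instead of a single static/updating flag for the whole panel, each pixel carries its own count of remaining waveform frames, so an update can start in one region while another is still mid-transition.

    import numpy as np

    WAVEFORM_FRAMES = 10  # hypothetical number of driving frames per transition

    class PerPixelState:
        """Toy model: per-pixel update progress instead of a single
        static/updating state machine for the entire panel."""

        def __init__(self, height, width):
            self.remaining = np.zeros((height, width), dtype=np.int16)

        def request_update(self, y0, y1, x0, x1):
            # Start the waveform only for pixels in this region; pixels
            # elsewhere keep whatever progress they already had.
            self.remaining[y0:y1, x0:x1] = WAVEFORM_FRAMES

        def tick(self):
            # One output frame: every mid-transition pixel advances by one
            # waveform step; static pixels (remaining == 0) are untouched.
            self.remaining = np.maximum(self.remaining - 1, 0)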


Response time is on par with LCDs - the trailing you’re seeing is ghosting, which isn’t common in most situations but does occur occasionally.

