
Let me start by saying that I completely agree with your points about the downsides of this.

It is, however, extremely hard to imagine the world that would result if we took a different route. What would be the effects of slower improvement? I assume (since I sometimes do) that many people see the software industry as narcissistic and self-aggrandizing when it talks of "delivering experiences" and "solving global problems with scalable middleware". But which other industries would be held back if this one moved much slower? Could we lose advances in medicine, finance, safety, or the developing world?

But as you say, I'm also very worried about stability if we can't find some middle ground between the two.



To borrow a line from Larry Page: more wood behind fewer arrows. That alone would work wonders. Instead of endlessly repeating and rehashing the same concepts in disposable form, it would likely be better (and possibly even faster, so no effective slowdown!) to do things a bit more carefully, and to merge more often rather than fork and start all over.

I think the feeling of fresh, new development (before complexity sets in) is so compelling that it tends to drive us away from actual progress. After all, it is much easier to launch yet another half-baked language, framework, or product than to pick up something existing and really improve it, or update it so that it lasts much longer. That's relatively thankless, anonymous work compared to slapping your handle on a new framework.

This is part of what makes good software hard: good software isn't sexy (think: Erlang versus Node.js, just as an example; I've tried very hard to keep this discussion brand- and tech-free, but I feel an example helps illustrate what I'm getting at).



