no more software layers please! my PC was much faster when my programs were native. Apps rendered with Chrome, rendered with WebKit... 8GB used to be enough, then all these "developers" with their "development machines" who think it's no big deal that their Hangouts implementation takes 200MB when Pidgin takes 20.
people used to complain about Firefox hogging memory, I think it's Chrome we should be worried about...
At my day job I create apps for TV decoders. Most of those render and run in browsers.
The boxes have 1 GB memory, the browser uses just a small fraction of that. A lot of the browser memory is used for graphics (think bitmaps), the JS-heap is only 4 MB, not sure about the rendering engine.
Granted, we use SVG, not HTML/CSS, for most of these apps.
At the same time, it's likely much more expensive to code native apps than it is to just build an HTML/CSS/JS app using Chrome, with a lot of the heavy lifting already done for you (at the expense of increased resource consumption).
A lot of these apps wouldn't exist if they couldn't have been built quickly and cheaply.
You see those little dots? Those are individual silicon atoms from one of a few fins in the gate on Intel's 22nm node.
Also, I Am A C Programmer. I Think Everything Should Be Written In C. C Is Close To The Hardware And Is Faster And Use Less Resource Compared To The JavaSlow and WebKittenz and CSS. But I Like CSS Because It Has C In It. I Am A C Programmer.
Get back to me when we have compressed ZRAM implementations and Same Page Merging everywhere to save memory on each Chrome process and each JS interpreter instance. Don't you want to do the green thing?
Congratulations, I thought I'd managed to go a whole year without hearing about process limits and the grim future we face. Seriously, every 6 months or so for the last 15 years or more there is an article published explaining the end of the PC revolution. Google the terms "end of silicon" or "end of moore's law" and add a custom date range like 1998-1999 and you'll get dozens if not hundreds of articles from reputable sources saying that we've come to an impasse or that in 3, 5 or 10 years we will be screwed.
When chips started hitting the limits of the photo-lithography process, they switched to lasers, UV, DUV, and now EUV. Transistors start leaking so they redesign them time and time again. We've gone from completely 2D circuits to 2-layer 3D to, what is it now, 16 layers? The end is nigh and has been since Moore came up with that god damn law.
My point is that there are a lot of really brilliant people in the field who've been working for years if not decades to address problems we haven't even had yet. We're perpetually on the verge of a crisis, and simultaneously on the verge of a solution to said crisis.
We're in an age of ridiculously powerful machines that we don't know what to do with. If suddenly all progress halted in chip development and we had to make do, we'd probably enter an age of hyper-optimization where software engineers would scrutinize every clock cycle, continuing the annual performance gains we've had for the last 40+ years.
I'm not suggesting a solution, I'm simply stating that lots of brilliant people have been aware of this eventuality and have been actively working on next generation processes to address it.
if "next generation processes" is really the best you can come up with...
Those people are working on EUV at ASML. Those are the next generation processes. What you're suggesting is they have some super science quantum-tunneling barrier that the world has never heard of nor seen nor thought could exist.
I'm with you. I'm a "mechanically sympathetic" anomaly amongst my age group and even many of my senior colleagues who came up in the 90s when Moore's Law == clock speed. My whining about cache thrashing and branch mispredictions might fall on deaf ears now, but I'll be ready to fix all the slow-ass code when people realize that silicon atoms aren't getting any smaller and heat isn't getting any less hot.
you have my approval :) I've been toying with the notion of continuing to focus on embedded C since most of the Computer Science d00ds I've worked with have no clue how things work "what? the runtime takes care of it" besides O-notation.
If we do hit process node limits, we'll start going massively parallel, in which case you still won't be using C to program your 10,000-processor machine.
I don't agree that it's "much more expensive" - and I've been developing applications with UIs in Win32 for a long time.
Also, even if native apps were "more expensive", that's only from the developers' perspective -- and for a good app, the number of users far outnumbers its developers, so any "increased resource consumption" gets foisted on and effectively multiplied by all the users. Only the developers get any benefit from this; but even then, since developers are themselves users, if they use apps that others have similarly developed with this culture of "selfishness", they get to experience the "increased resource consumption" too. In the end, I don't think this vicious cycle of waste benefits anyone except the hardware manufacturers.
In general, I think treating resources like they're infinite and "there will always be more" is almost certainly guaranteed to make it so there won't ever be enough.
This Firefox.html is a fun experiment to push boundaries, like WebKit.js and all the other interesting things you can do with JS and an HTML rendering engine (e.g. see Fabrice Bellard's complete PC emulator in a browser), and somewhat reminds me of other tricks like nesting VMs.
As long as the users want more and more apps very cheaply, they might have to accept the performance they get. It has also been a problem that most OSes have different GUI frameworks, which means you have to re-develop every app/application multiple times. This is an enormous time sink. When all operating systems use the same GUI framework (and possibly the same languages), it might be very effective to use that instead of HTML+CSS. Until then it is still expensive and, as I started with, the users want more for less.
> it's likely much more expensive to code native apps
No, and not even close. Web tech is primitive compared to the native SDKs and there is no sign of that improving anytime soon. All that happens is reinvention of some MVC and reshuffling of the libs.
I've been gradually moving back to native apps: hexchat for IRC, pidgin for gtalk+fb messenger, thunderbird for email (heavy, but can't find something better for me).
With this setup I can boot to a system that uses about 1.5GB of RAM occupied with Firefox with a couple of tabs open.
Some stuff that I'm trying to eliminate that's a big memory hog: dropbox (150MB), skype (150MB).
I don't like the idea of developing for current machines. With the constant advances in computers, the next generation of applications, "developer apps", should be made for the next generation of computers, meaning it's okay if it's a bit of a hog. I mean, 200MB is just a drop in the bucket now.
...and fuck those people who can't afford to build a new bleeding edge gaming rig every year, right? </s>
Developer arrogance like this is exactly why all software sucks. It's actually a lot worse than it used to be due to mobile phones, which are hugely underspecced but attract the same mentality of "my code is surely the only important thing users will ever run so it's ok to use 700MB for a browser".
If you want to have bleeding edge developer software then you may need bleeding edge developer hardware. To clarify, I don't think mainstream applications or even betas should aim at bleeding edge hardware, just that when we plan on making some new piece of software that may take years to build, it's okay to assume that people in the future will have better computers.
In 2006, the next computer was the iPhone with 1GB of memory and the iPhone 6 still has 1GB today. I wouldn't count on everything being a 16GB+ monster rig just to bring a browser up.
I don't necessarily disagree that stuff is getting too bloated, but the iPhone 1 had 128 MB of RAM. So iPhone memory is doubling every 2-3 years so far. That trend will likely continue as Apple's software is also getting more bloated.