
I think people are looking at this too drastically. Machines can only replace humans up to a certain point. They're ideal for easily quantifiable and scripted tasks, but past that there needs to be human contact and judgement involved. At worst, we'll just have new jobs for people to maintain these systems, one level of abstraction higher, and even then there will still be legacy systems and people who are slow to adopt.

For this specific example, I haven't been following AI that closely, but from the news it doesn't seem to be advancing that far. Sure, we have self-driving cars, but a whole team went into creating them, and before that there were teams working on image recognition, mapping, navigation, engineering, and more. Natural language processing has advanced pretty far, I'll give you that. But there's a lot of stuff that happens between the query "Siri, what's the weather?" and the response, and people will need to be involved in those intermediate processes.



> "can only replace humans up to a certain point"

But with advances in both hardware (3D memory, memristors, neural network chips, graphene) and software (recent advances in CV, statistics, deep neural networks), that point is moving forward all the time.

Not too long ago we couldn't imagine that a computer could recognise house numbers and street names in arbitrary fonts, but that is what powers Google Maps. Luckily in this case the technology has not replaced a huge workforce of would-be Google-mappers. But that doesn't mean it can't be done.
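To put that in perspective: basic digit recognition, which was a research problem not long ago, now takes a few lines with off-the-shelf tools. This is just a toy sketch using scikit-learn's small 8x8 digit dataset, nothing like the large-scale neural network pipeline Google actually used for Street View imagery:

```python
# Toy illustration: classifying handwritten digits with scikit-learn.
# Uses the bundled 8x8 digit images, not real street-number photos.
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

digits = load_digits()
X_train, X_test, y_train, y_test = train_test_split(
    digits.data, digits.target, random_state=0)

# A plain logistic regression already scores well above 90% here.
clf = LogisticRegression(max_iter=1000)
clf.fit(X_train, y_train)
print(f"test accuracy: {clf.score(X_test, y_test):.2f}")
```

The point isn't the specific model; it's that a task once thought to require human judgement is now a commodity.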

Technology, and especially AI, has the potential to be enormously disruptive. And even if the risk seems remote, it's too big to ignore imo.


Yes, I'm not disagreeing that we're making huge progress. My main argument was that it's only going to advance up to a certain point. For one thing, computing power is reaching its limits. Many consider Moore's law to be dying [0]; at that point, further progress will rely on the discovery or implementation of a new material like graphene, or on quantum computing. Then we have limits in storage and computation, though of course those can be surmounted with smart software (RAID and Hadoop come to mind). I'm not saying you're wrong, just that there's a limit to all of it.

HN seems to be stuck on some singularity, but personally I just think it's a byproduct of futurology. Just look at the past and how people envisioned the future. Heck, we can study the industrial revolution and the reaction of the Luddites, but people still have jobs today. Like I said, we might just have to move an abstraction higher, just like how we went from manual labor to controlling machines that do manual labor.

Also, just a nitpick, but all that Google Maps OCR was done by humans via reCAPTCHA.

[0]: http://www.eetimes.com/document.asp?doc_id=1319330


Fortunately, by the time that point moves to "100% of human-based endeavors," we will be yielding the planet to our new AI overlords, and we can pat ourselves on the back for being the first species in history to engineer both its own extinction and its successor lifeforms.




