
Okay, you had me up until > up to and including a technological singularity.

Please... please stop this Ray Kurzweil religious BS. This will most likely not happen within our lifetime, and I sure hope it doesn't. The Singularity is based on the notion that we will all have access to immortality, knowledge, and so on: the very things we are already losing access to (see SOPA, PIPA, freedoms being taken away, etc.)

The 'sheeple' will never get to reap the benefits of 'The Singularity'.



This may surprise you, but I think Ray Kurzweil is way too optimistic. Partly because reaching the Singularity will mean messing with forces we barely comprehend, and partly because we may collapse before we even have time to mess with those forces. If you have some time, I suggest you take a look at http://facingthesingularity.com/

In the meantime, just ponder this: we are, in a trivial sense, machines. Machines capable of reasoning about themselves. In principle, there is nothing stopping us from making actual machines that beat us in every domain. Yesterday, chess. Today, Jeopardy. Tomorrow, driving. And maybe someday, machines building machines.

If we ever reach that point, you know enough about recursion to know how this will go: recursive self-improvement leading to something way more capable than your average Einstein. Now let's hope that our little Skynet is programmed to do good (whatever "good" is), instead of, say, using us as raw material to fill the solar system with paper clips. Because unlike with the fictional Skynet, we won't even stand a fighting chance.
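The recursion gestured at above can be made concrete with a toy model. This is purely illustrative: the function name, the 10% per-generation gain, and the generation count are all assumptions, not predictions. The only point is that when each generation's improvement is proportional to its current capability, capability compounds rather than adding up.

```python
def self_improve(capability: float, gain: float = 0.1, generations: int = 50) -> float:
    """Toy model of recursive self-improvement.

    Each generation, the machine redesigns itself; the improvement it
    achieves is proportional to its current capability (a smarter
    designer makes a bigger design gain). The 10% gain per generation
    is an arbitrary assumption for illustration.
    """
    for _ in range(generations):
        capability += gain * capability  # better designer -> better successor
    return capability

# Starting at "human level" (1.0), fifty rounds of 10% compounding
# already exceed 100x the starting capability (1.1 ** 50 ~ 117).
print(self_improve(1.0))
```

The takeaway is not the numbers, which are made up, but the shape: linear effort in, exponential capability out, which is exactly why the process is hard to reason about past its early steps.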


link2009: I suspect your account may have been killed (i.e. everything you post is visible only to you). You should check since your most recent post is dead. (However that can also happen if you post a duplicate, so I can't say for sure.)

It's risky for a new member to post something controversial - you have to build up a karma cushion first. (I personally don't approve of downvoting a controversial comment, but I'm fighting a losing battle on the subject.)


The core concept is an event beyond which we can't predict anything. So claiming that it means any specific benefit (such as immortality) misunderstands it; so does claiming that some specific people won't be affected; and so does claiming that the outcome must include 'benefits' for living humans.


There are three distinct core concepts that we may call the "Singularity".

1) Event horizon: past the point where there are entities smarter than us, we virtually can't predict anything.

2) Accelerating change: things will get bigger/better/smarter at an exponential rate. The "Singularity" is a point somewhere on that exponential curve. (Or something)

3) Intelligence explosion: anything smarter than us (whether an AI, a brain-computer interface, or whatever) will in particular be better than us at building minds. Therefore it can recursively improve itself to super-intelligence very quickly.

Taken loosely, these three visions are mostly compatible. But taken to their extremes, they contradict each other. It is important to distinguish the three to avoid confusion.



