Wednesday, September 28, 2005

I'm Working on It

Our sole responsibility is to produce something smarter than we are; any problems beyond that are not ours to solve -- Eliezer S. Yudkowsky, from "Staring into the Singularity"
Helping create an entity that is more intelligent than we are is one of my (admittedly lesser) goals that I hope to achieve in my robotics work. I believe that it's achievable, if not in my lifetime, at least well before any truly unsolvable, catastrophic environmental problems arise. That's why I push far harder for policies that enhance economic growth (which in turn drives technology) than I do for things like environment friendly regulation.

I'm currently reading (scanning) Ray Kurzweil's The Singularity is Near, in which he reiterates a core belief of mine -- that technology is expanding at an exponential rate that is itself increasing exponentially. The wealth this expansion creates, and our ability to mitigate the problems it brings, will surpass our current capabilities by orders of magnitude in just a few decades.

So far, I recommend Kurzweil's book, though I'm only on chapter three. The quote above appears in the book.


johnt said...

New cars, penicillin, and this thing called the computer: great. New human beings I'm far less sure of. History and the present are cursed with people who are distraught that other people aren't like them, or are recalcitrant, or just different. It goes back before the ziggurat and plagues us today. I'm leery of what seems like a Huxley-type world, one that those who would be adventurous with the lives of others view as a golden opportunity. Implants, wiring, genetic manipulation -- who needs politics?

Bret said...


I'm not sure about the "new human beings" concept either. Note that Yudkowsky wrote "something" smarter; in my opinion, by not specifically mentioning humans, he seems to imply a non-human entity -- for example, a robot.

If, instead of Yudkowsky's quote, you're referring directly to Kurzweil's vision, it still depends a lot on how "implants, wiring", etc. affect our humanity. For example, am I less human when I log on to the Internet and have its vast body of knowledge at my command? Is it any different, then, if I have access to that same knowledge hardwired into my brain (whatever that means)? I think not.

By the way, as always, your comments are sincerely appreciated.

johnt said...

Many workmen
Built a huge ball of masonry
Upon a mountain-top.
Then they went to the valley below,
And turned to behold their work.
"It is grand," they said;
They loved the thing.

Of a sudden, it moved:
It came upon them swiftly;
It crushed them all to blood.
But some had opportunity to squeal.

-- Stephen Crane

You may guess that Crane is not talking about masonry. Technology is one tough genie to get back in the bottle, and with the future marvels of genetics hovering over us and the avatars of cloning smiling benevolently in our direction, Crane's poem may be worth remembering. Thanks for your kind response, keep trucking.

Howard said...

Bret would like to
"create an entity that is more intelligent than we are"

of course, one already exists -- "the extended order of human cooperation"

the central planners, control freaks and socialists, scientific or otherwise, don't like it much...

Bret said...

I agree with that. Note that the plural "we" in both Yudkowsky's quote and my first sentence allows my post to be interpreted that way.