"Our sole responsibility is to produce something smarter than we are; any problems beyond that are not ours to solve." -- Eliezer S. Yudkowsky, from "Staring into the Singularity"

Helping create an entity more intelligent than we are is one of my (admittedly lesser) goals for my robotics work. I believe it's achievable, if not in my lifetime, then at least well before any truly unsolvable, catastrophic environmental problems arise. That's why I push far harder for policies that enhance economic growth (which in turn drives technology) than for things like environment-friendly regulation.
I'm currently reading (scanning) Ray Kurzweil's The Singularity Is Near, in which he reiterates a core belief of mine: that technology is expanding at an exponential rate that is itself increasing exponentially. The wealth this expansion generates, and the ability it gives us to mitigate problems, will surpass our current capabilities by orders of magnitude in just a few decades.
So far, I recommend Kurzweil's book, though I'm only on chapter three. The quote above appears in it.