>Ray Kurzweil will be a guest on National Public Radio (NPR) "Science Friday"
>this Friday (March 17) at 3 PM EST (for one hour). Host Ira Flatow will be
>interviewing Ray and Bill Joy (Cofounder and Chief Scientist of SUN
>Microsystems).
Questions for this interview:
1. Do you really think the human species can live out its entire
existence without ever coming face-to-face with enhanced humans or AIs
or *some* form of greater-than-human intelligence?
2. Wouldn't it be a good idea to concentrate on avoiding the avoidable
dangers, like nuclear or biological warfare, by confronting the
unavoidable danger/opportunity of greater-than-human intelligence as
fast as possible?
3. Will relinquishment really work? History seems to show that you
can't make a commitment not to develop a technology, you can only make a
commitment not to develop it first.
4. In the past, new technology has had side effects, but other
technologies have cushioned the impact. "Internet time" has increased
the pace of change but it's also given us tools that help us deal with
that change. Won't government controls on advanced technology slow these
cushioning civilian applications more than they slow military development?
--
sentience@pobox.com    Eliezer S. Yudkowsky
http://pobox.com/~sentience/beyond.html
Member, Extropy Institute
Senior Associate, Foresight Institute