From: Gordon Worley (redbird@rbisland.cx)
Date: Mon Sep 16 2002 - 21:20:02 MDT
On Monday, September 16, 2002, at 05:58 PM, Ben Goertzel wrote:
> You have talked about rationality a lot on this list, but you haven't
> defined it in such a way that I can really understand what you mean by
> the word.
>
> Furthermore, it seemed to me in some previous discussions that you and
> Eliezer meant subtly different things by the term.
We used to define it differently. Now it's just that I'm not as good at
writing the definition because my understanding of rationality is not
quite as deep as Eliezer's. Anyway, here's my latest attempt at
defining rationality in a way that I think you'll like, Ben.
rationality: a force in the universe, specifically the BPT, that is
required for all nonaccidental successes to occur
work: any nonaccidental success, where success is defined as the
completion of a process
Let U be the set of all processes in the Universe.
Let T be the proper subset of U consisting of all processes that work.
If x is a member of T, then x draws upon rationality.
Rationality, the force, is different from rational thought, the
consistent application of rationality in your mind. All thought draws
upon rationality, but in the human mind irrationality is often
introduced into thought. Purely rational thought (and no one here
claims to be capable of it) would mean making all decisions according
to Bayesian decision theory.
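To make that last point concrete, here is a minimal sketch (my own illustration, not anything from the discussion above) of what deciding "according to Bayesian decision theory" amounts to: update beliefs on evidence via Bayes' theorem, then pick the action with the highest expected utility under the posterior. The hypotheses, likelihoods, and utilities are invented purely for the example.

```python
# Minimal Bayesian decision sketch: posterior update + expected-utility
# maximization. All numbers below are made up for illustration.

def posterior(prior, likelihood):
    """Bayes' theorem over a discrete set of hypotheses.

    prior[h] = P(h), likelihood[h] = P(evidence | h).
    """
    unnorm = {h: prior[h] * likelihood[h] for h in prior}
    total = sum(unnorm.values())
    return {h: p / total for h, p in unnorm.items()}

def best_action(post, utility):
    """Choose the action maximizing expected utility under `post`.

    utility[action][h] = payoff of taking `action` if `h` is true.
    """
    def expected_utility(action):
        return sum(post[h] * utility[action][h] for h in post)
    return max(utility, key=expected_utility)

# Two hypotheses; the evidence favors h1 four-to-one.
prior = {"h1": 0.5, "h2": 0.5}
likelihood = {"h1": 0.8, "h2": 0.2}
post = posterior(prior, likelihood)   # h1 -> 0.8, h2 -> 0.2

# A risky action that pays off under h1, and a safe one.
utility = {
    "risky": {"h1": 10, "h2": -5},
    "safe":  {"h1": 1,  "h2": 1},
}
print(best_action(post, utility))     # prints "risky" (EU 7 vs. 1)
```

A human mind falls short of this standard whenever its effective priors, likelihoods, or utilities are distorted, which is the sense in which irrationality gets "introduced" into otherwise rationality-drawing thought.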
As an aside, if you're curious why T is a proper subset of U, it is
because the Universe is by default arational and, for example, the
formation of stars and planets in no way draws upon rationality; all
those successes were accidental. The Universe, as far as we know, did
not set out with the goal of creating Earth.
>> Can things be done now to make acceptance of the Singularity easier?
>> Possibly. When it actually happens, though, I bet everyone on this
>> list will still be shocked to see it actually happening. Rather than
>> trying to prepare people for the Singularity, your time would be
>> better spent actually making the Singularity happen by recruiting,
>> raising money, coding AI, etc.
>
> I don't think one can responsibly make a blanket statement like that.
> There is a lot of uncertainty about the Singularity, and it's quite
> plausible that preparing people mentally and emotionally for the
> Singularity will help the Singularity to go down better. For example,
> people who are better-prepared for the Singularity are less likely to
> try to pass laws halting Singularity-pertinent research, to try to
> blow up researchers doing useful Singularity-ward work, etc.
I doubt that, since the Singularity isn't a technology like nanotech,
which has both Earth-shattering and incremental advances. To be honest,
I highly doubt that swaying public opinion one way or the other will
have any real impact unless one of what I consider to be outside
scenarios occurs. One is that everyone suddenly gets the Singularity,
gives us lots of money to work on it, and everyone lives happily ever
after. Another is that everyone but a few dozen people is strongly
opposed to the Singularity and the 6 billion people on this planet make
quick work of us. More likely, we'll face any number of slightly
helpful or slightly harmful environments that will either make our
lives easy or make us jump through a few hoops. In sum, though, I think
a person's work saving us from jumping through a few hoops will have
far less impact than if that same person spent the time working more
directly towards the Singularity. In other words, leave the memetic
battles to those like Kurzweil who are interested in more than just SL4
technology. If you want the Singularity to happen, you should be doing
what you can to work towards it.
--
Gordon Worley "You're not going crazy ...
http://www.rbisland.cx/ you're going sane ... in a
redbird@rbisland.cx crazy world!"
PGP: 0xBBD3B003 --The Tick
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:41 MDT