From: Samantha Atkins (samantha@objectent.com)
Date: Wed Mar 10 2004 - 02:22:10 MST
I am most worried about this present slice of time, in which existing
all-too-human oligarchies attempt to control technology and social
change so as to preserve their power. The level of surveillance and
coercion both required and made available by accelerating technology
may yet doom humanity before greater effective intelligence can be
built, organized, and deployed.
- samantha
On Mar 7, 2004, at 11:59 PM, Charles Hixson wrote:
> Philip Sutton wrote:
>
>> Hi Charles,
>>
>>> I've said this in other contexts, but *I* think it bears repeating.
>>> If it weren't for the actions taken by persons in power, I would
>>> feel
>>> that all reasonable steps should be taken to slow the onset of the
>>> singularity. As it is, I feel the singularity may represent our only
>>> hope for survival, dangerous though it is.
>> Given your preamble on the dynamic of instability in the face of
>> rapid change, what makes you feel that a coercive oligarchy won't be
>> the force that actually controls the launch into the singularity, i.e.
>> that controls the critical bulk of computational power and other key
>> technologies?
>> Cheers, Philip
>>
> They may well initiate its launch. If they do, I feel our chances of
> survival are worse. But in any case, those who launch it will soon
> lose control over it. Soon may mean within minutes. The course we
> are attempting to follow is to create an entity that will, of its
> own volition, make decisions that we would, if we understood the
> situation, consider life-affirming. I don't think anyone here has
> the illusion that control would rest with any human or group of
> humans, at least not humans as we would recognize them today.
> Multiple paths to transcendence imply that in some of them control
> would rest with a group of entities that had once been human. Those
> aren't necessarily the most hopeful paths, either.