From: Samantha Atkins (samantha@objectent.com)
Date: Wed Jun 26 2002 - 15:56:22 MDT
James Higgins wrote:
> At 03:59 PM 6/26/2002 -0400, Eliezer S. Yudkowsky wrote:
>
>> James Higgins wrote:
>>
>>> Actually, I was thinking about this earlier, glad you asked.
>>> I think the best solution would be to assemble a board of maybe 10
>>> people. These people should be intelligent, but not necessarily
>>> geniuses. Some should be experts on AI, but not all. I would say
>>> the criteria ALL members must possess would be:
>>> 1. A strong desire to see the Singularity occur
>>> 2. Strongly value human life and the survival of the human race
>>> 3. Must be willing and able to accept that a solution other
>>> than their own is a better solution
>>> The deciding body SHOULD NOT have exactly coinciding interests. They
>>> should not, under any circumstances, all be working on the same
>>> project (such as the Singularity Institute).
>>
>> James,
>>
>> Do you believe that a committee of experts could be assembled to
>> successfully build an AI? Or even to successfully judge which new AI
>> theories are most likely to succeed?
>
> Do I believe a committee could successfully build an AI? Maybe. But I
> don't think it would be a good idea to do it that way.
>
>> If not, why would they be able to do it for Friendly AI?
>
> I never said they could, or should, DESIGN anything. Simply approve
> designs. Zoning committees don't build anything, but they are important
> for maintaining order in a metropolitan area. I believe a Singularity
> Committee (or whatever it should be called - I'd like to avoid the term
> "committee") would be a very useful asset to the human race.
Sorry, but this cannot be made to work. Committees approving
software designs generally bog down the process and strip out
anything truly innovative. I have seen this over more than two
decades of software architecture, design, and implementation
work, on software far more mundane and committee-comprehensible
than an AI seed could ever be. Humans, especially in their
aggregated "committee" form, simply don't have remotely the
intelligence, creativity, or ability to grasp the gestalt that
would be required.
> Although I
> can see where it could easily be viewed as a detriment to willful,
> single-minded, solo players or even like-minded teams. Individual
> accomplishment is irrelevant in light of the Singularity; successful
> completion of the project in the safest manner possible is the only
> rational goal.
>
Sorry again, but only individuals are relevant when you are
considering vastly sophisticated, largely blue-sky new software
designs. Those come from individuals and nowhere else.
- samantha