From: Kevin Osborne (kevin.osborne@gmail.com)
Date: Sun Jan 29 2006 - 22:49:15 MST
a few points, probably worth significantly less than two cents:
- coding an AGI will require programmers. As a programmer newly
interested in the field (my current n00b entry vector is
Vinge->Anissimov->Kurzweil->Yudkowsky->Goertzel), the SIAI page in
question was, while almost certainly valid, the HTML equivalent of
being given the finger. As an open source nut, I'm used to programmers
being wooed to add momentum to a given project; I thought the idea was
that any code is better than no code (cf. vaporware) and that the
more programmers you have, the more likely you are to get a few who
are any good. It seems intuitive that encouraging people _not_ to
program in the AGI sphere would be counterproductive. Add to that a
seeming lack of open source projects, tools, frameworks or templates
of any substance to contribute to, and you're left with someone who
isn't going to donate either time _or_ money until they figure out
whether _you_ are the Crazy Person... and who will more likely just
go back to hacking their Linux kernel.
- having said that, definitely seek out wingnuts instead of jocks.
People who work well in a team are nice to chat to at the water
cooler, but in terms of getting code flying they are worth their
weight in popcorn. The guys I've met or worked with whose code runs
on a good proportion of boxen planet-wide have invariably been one,
maybe two people who in person are at best rather objectionable and
at worst complete nutballs. But what they've achieved by working in
basements and underdeveloping their social skills has had a huge
impact - just count the number of CPU ticks people give to them. How
many real breakthroughs and achievements are made by pubic-headed
dingbats working at 3am, and how many by well-funded back-scratchers
working on the corporate farm? This seems to apply in science as well
as tech. Social skills may keep mediocre project teams hitting their
goals, but all the real groundbreakers seem to be half nuts - Bram
Cohen is a great example. Vinge has a theme in one book (the "Focus"
of A Deepness in the Sky) alluding to chemically enforced Asperger's
used to create teams of super-achievers.
- rehashing the first point, you guys need more life in this thing.
Having a high bar is necessary for some things, but you also need
lower entry points to develop a groundswell. Some of the non-geniuses
could help put together an AGI@home to see if they can get the
citizenry to contribute some of the heavyweight distributed
processing that'll be needed; and while they're at it, raise some
awareness and maybe bring in some donations via welcome pages rather
unlike seed-ai-programmer.html (a rough sketch of what such a client
might look like follows below). And where is the open source project?
For shame. Farming out tasks to teams of varying ability is a good
way of getting the job done. Provide avenues for lesser coders to be
testers, builders, system integrators and plumbing pipers, as opposed
to the algorithm and psynet freaks who are, yes, absolutely essential
but shouldn't be doing journeyman work. And you never know: maybe one
of your plumber coders will have personal problems, go half nuts on
crystal meth and end up writing half your thought-pattern code in a
year-long fit of beneficent mental misfiring before he offs himself.
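
Since the AGI@home bit above is hand-wavy, here's a minimal sketch of
what the volunteer client loop might look like. Everything in it is a
placeholder assumption of mine - the server URL
agi-at-home.example.org, the /work and /result endpoints, and the
work-unit format are illustrative, not any real project's API:

# Minimal volunteer-computing client sketch (hypothetical AGI@home).
# Server URL, endpoints and payload shape are illustrative assumptions,
# not a real project's API.
import json
import time
import urllib.request

SERVER = "http://agi-at-home.example.org"  # hypothetical work server

def fetch_work_unit():
    # Ask the server for a chunk of work to crunch locally.
    with urllib.request.urlopen(SERVER + "/work") as resp:
        return json.load(resp)

def crunch(unit):
    # Stand-in for the real heavyweight computation (e.g. scoring a
    # candidate parameter set); here it just sums the payload numbers.
    return sum(unit["payload"])

def report(unit_id, result):
    # Post the result back so the server can aggregate contributions.
    body = json.dumps({"id": unit_id, "result": result}).encode()
    req = urllib.request.Request(
        SERVER + "/result", data=body,
        headers={"Content-Type": "application/json"})
    urllib.request.urlopen(req)

if __name__ == "__main__":
    while True:
        unit = fetch_work_unit()
        report(unit["id"], crunch(unit))
        time.sleep(1)  # be polite to the server between units

Wrap that in a screensaver and you might just get your groundswell.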
On 1/30/06, Olie L <neomorphy@hotmail.com> wrote:
> Richard's refusal: Not Strong Enough
>
> >From: Richard Loosemore <rpwl@lightlink.com>
> >Subject: Re: Syllabus for Seed Developer Qualifications [WAS Re: Some
> >considerations about AGI]
> >Date: Sun, 29 Jan 2006 19:39:00 -0500
> >
> >Patrick Crenshaw wrote:
> >>On 1/25/06, Richard Loosemore <rpwl@lightlink.com> wrote:
> >>>Huh? You don't need a Ph.D. in all of those subjects!
> >>>
> >>>It would certainly help to be well-rounded, but it is crazy to require a
> >>>Ph.D. in (e.g.) M-Theory. Heck, a couple of undergrad physics courses
> >>>and a dose of Brian Greene ought to be enough.
> >>
> >>A dose of Brian Greene is just about useless. What could be very
> >>useful for AGI is... thermodynamics.
>
> Useful, possibly. How is it _necessary_?
>
> Imagine a Seed AI encoded on DNA... Does anyone deny this as a possibility?
>
> Imagine a project going about creating a Seed AI using DNA. Sure, it's an
> arse-about way to go about it today, but if our genetics knowledge were
> greater (times many many)...
>
> The original physics developments needed to create such a seed AI? Nowt.
>
> The advanced physics knowledge needed by the development team? Nowt.
>
> The simple physics knowledge needed by the team? Hrmmm... could be useful
> to help the seed AI get a grip on reality... before it learned advanced
> physics for itself.
>
> The simple physics knowledge needed by most members of the development team?
> Zip.
>
> Some well-rounded knowledge could be useful to help the project know
> where it doesn't need to go. F'rinstance, some Neurology and Quantum
> mech might help an AGI team see that Penrose's "brain /= computing
> machine" objections are, at best, tangential to AGI.
>
> But not every team member needs to be a polymath.
>
> Polymaths are useful. At least, I hope so, as I have a limited degree of
> vested interest in demand for polymaths. But not every member of a team has
> to be a pan-Paragon for the team to achieve amazing things.
>
> Teams ought to combine individuals' skills and abilities in compounding
> ways. They need to be able to communicate on the application of specialist
> abilities. That doesn't mean everyone has to be an expert in all areas.
>
> I'm not playing down what's required to be on a successful AGI team. Even
> though I "get" (a lot of) AI and cog-sci theory, I can't programme for shit.
> So don't hire me.
>
> But don't turn down a great neuroscientist/cog scientist with meta-math and
> programming skills (think: Giulio Tononi crossed with Jaynes, with coding
> skills) just because they don't know diddley about networking / operating
> systems / number theory.
>
> Put them in a team, and make them useful.
>
> -- Olie
>
> More examples:
>
> - If the AGI project is not utilising nanotech, no nanotech knowledge is
> required. If it's using existing nanotech (eg: nanocomputers supplied by a
> 3rd party, which are faster but appear identical to conventional computers),
> no nanotech knowledge is required.
>
> - Much is made of rationality tools. Any AGI project will have >=1
> rationality tool (eg: Bayesian reasoning), so all programmers will likely
> have *some* connection to the AGI's use of rationality tools. But if they
> can institute the programming, does it really matter whether or not the
> programmers are themselves hyper-rational?
>
> If they are knowledgeable and insightful, does it matter if *some*
> team members have mediocre "Judgement Quotients"? If they occasionally
> have a teary over which O/S they have to programme with, does this
> matter outside of team morale issues? I fail to see how such natural
> deficiencies are a problem with well-managed teams.
>
>
>