From: Eugene Leitl (Eugene.Leitl@lrz.uni-muenchen.de)
Date: Mon Mar 31 1997 - 04:53:37 MST
On Mon, 31 Mar 1997, Gregory Houston wrote:
Greg, your problem seems to be a language/concept artefact: you're mixing
the abstraction layers. If you think computer, you instantly see a
particular machine, programmed in a particular way. It is as senseless to
seek emotions, or hydrodynamical shockwaves, in flipping bits as it is to
seek emotion in spiking neurons. Emotion is an activity pattern, a physical
process occurring in the exquisitely structured slab of space between your ears.
A computer simulation of a particular physical system has a degree of
equivalence to it. If we're just after information about that particular
system, the information extracted is perfectly valid in the real world.
Unless you are part of the simulation (imagine the Universe, observers
included, being sucked into your computer in a solipsistic implosion; we
wouldn't even notice), you need I/O to be able to interface with it (look
and tweak) at all, of course.
A perfect simulation of the photosynthesis process won't feed anyone in
the real world directly. Yet ANN-controlled agents moving through a spatial
ALife simulation, looking for food and enemies, look very much alive to
me. Primitive, yet clearly alive. I wonder how an orthodox Brahmin
must feel when cutting the juice off a long-evolved ALife ecosystem.
Technically it would be inconsistent behaviour: watching your toes for
fear of crushing bugs, yet carelessly flipping off the switch.
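
For concreteness, a minimal sketch of such an agent loop (modern Python;
the net, the grid, and every number here are my own illustrative
assumptions, not any particular ALife system): a tiny feedforward net
steering agents toward food and away from enemies on a toroidal grid.

import math
import random

GRID = 20
MOVES = [(0, 1), (0, -1), (1, 0), (-1, 0)]

def dist(a, b):
    return abs(a[0] - b[0]) + abs(a[1] - b[1])

class TinyNet:
    """One hidden layer, random weights; a stand-in for an evolved ANN."""
    def __init__(self, n_in=4, n_hid=6, n_out=4):
        self.w1 = [[random.uniform(-1, 1) for _ in range(n_in)] for _ in range(n_hid)]
        self.w2 = [[random.uniform(-1, 1) for _ in range(n_hid)] for _ in range(n_out)]

    def forward(self, x):
        h = [math.tanh(sum(w * xi for w, xi in zip(row, x))) for row in self.w1]
        return [sum(w * hi for w, hi in zip(row, h)) for row in self.w2]

class Agent:
    def __init__(self):
        self.net = TinyNet()
        self.pos = (random.randrange(GRID), random.randrange(GRID))
        self.energy = 10.0

    def sense(self, food, enemies):
        # Inputs: signed offsets to the nearest food item and nearest enemy.
        fx, fy = min(food, key=lambda f: dist(self.pos, f))
        ex, ey = min(enemies, key=lambda e: dist(self.pos, e))
        x, y = self.pos
        return [fx - x, fy - y, ex - x, ey - y]

    def step(self, food, enemies):
        out = self.net.forward(self.sense(food, enemies))
        dx, dy = MOVES[out.index(max(out))]   # act on the strongest output
        x, y = self.pos
        self.pos = ((x + dx) % GRID, (y + dy) % GRID)
        self.energy -= 0.1                    # moving costs energy
        if self.pos in food:
            food.remove(self.pos)
            self.energy += 5.0                # eating restores it
        if self.pos in enemies:
            self.energy = 0.0                 # caught

if __name__ == "__main__":
    food = [(random.randrange(GRID), random.randrange(GRID)) for _ in range(15)]
    enemies = [(random.randrange(GRID), random.randrange(GRID)) for _ in range(3)]
    agents = [Agent() for _ in range(5)]
    for t in range(100):
        for a in agents:
            if a.energy > 0 and food:
                a.step(food, enemies)
    print([round(a.energy, 1) for a in agents])

With evolved rather than random weights, the survivors of such a loop are
exactly the "primitive, yet clearly alive" agents meant above.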
It's a good thing that thinking is mostly about processing information, and
that sensors and actuators are easy to come by. Evolving agent behaviour in
a robot colony is actually easier than simulating it, because doing correct
physics for a nontrivial dynamic structure assembly is terribly expensive
(in terms of computer horsepower and code complexity).
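
The shape of that argument shows in a bare-bones evolutionary loop (again a
hypothetical sketch, not Beer's actual setup): the loop itself is trivial,
and all the cost hides in the fitness evaluation, which on real hardware
means "run it and watch", but in software means integrating the body's
physics.

import random

def mutate(genome, rate=0.1):
    return [g + random.gauss(0, rate) for g in genome]

def evaluate(genome):
    # On hardware: load the weights into the robot, measure how far it walks.
    # In simulation: integrate nontrivial body dynamics, the expensive part.
    return -sum(g * g for g in genome)   # dummy fitness, for the sketch only

population = [[random.uniform(-1, 1) for _ in range(20)] for _ in range(30)]
for generation in range(50):
    scored = sorted(population, key=evaluate, reverse=True)
    parents = scored[:10]                        # keep the best third
    population = parents + [mutate(random.choice(parents))
                            for _ in range(20)]  # refill with mutants

best = max(population, key=evaluate)
print(round(evaluate(best), 3))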
Attaching particular values to certain physical processes is merely an
artefact of the observer. You are within the you-system; that's why love
and pain feel different to you. Torturing uploads by tweaking the
representation of their R&P (reward & punishment) system is a crime, even
though in cyberspace no one can hear you scream.
ciao,
'gene
> David McFadzean wrote:
>
> > What if the images that the computer analyses were generated with
> > a ray-tracing program? No optical hardware needed.
>
> That would be much more like imagination than seeing. I do not require
> eyes to see things in my head that I make up, but if I want to see
> things from the real world then I need something that will receive
> information from the real world.
>
> > Randall Beer and his students successfully taught (programmed/trained/
> > evolved) a simulated cockroach to walk though it had no real (hardware)
> > legs. When the neural nets were later downloaded into a real robot it
> > could instantly walk in the real world.
>
> It was trained to walk, but it could not walk until it had the robot
> embodiment. It could think about walking all day long, but until the
> instructions were downloaded into the hardware (the robot) there was no
> walking going on.
>
> --
> Gregory Houston Triberian Institute of Emotive Education
> vertigo@triberian.com http://www.triberian.com
> phone: 816.561.1524 info@triberian.com
> cellular: 816.807.6660 snail: PO Box 32046 Kansas City MO 64171
>
> "Empowered, impassioned, we have a lust for life insatiable!"
>
>
______________________________________________________________________________
|mailto:ui22204@sunmail.lrz-muenchen.de |transhumanism >H, cryonics, |
|mailto:Eugene.Leitl@uni-muenchen.de |nanotechnology, etc. etc. |
|mailto:c438@org.chemie.uni-muenchen.de |"deus ex machina, v.0.0.alpha" |
|icbmto:N 48 10'07'' E 011 33'53'' |http://www.lrz-muenchen.de/~ui22204 |