[p2p-research] Drone hacking

Ryan Lanham rlanham1963 at gmail.com
Thu Dec 24 03:29:34 CET 2009


Several things about this argument confuse me, because I simply don't share
one or more of its points of view--not in a falsifiable way...but in a fundamental
way.

First, there is the interesting focus on the legitimacy of fields of
specialization.  Human enough, and just the sort of oddness computers won't
have for long...because they don't need to.  We can focus as much as we
want; we won't be able to compete.  Specialization is inherently about
competition (and I suppose a little about self-fulfillment...but hardly).
Competition falls away rapidly when systems surpass us.  Power lifting is
still a sport, but it isn't something very gripping for most of us.  Fernhout
was actually particularly elegant on this point, I thought.  He actually
helped me start to visualize post-market worlds more clearly.  The key word
is abundance...which has nothing to do with stuff, but with capacity.  When
you have abundant cranes and forklifts, power lifters are uninteresting.

With regard to privacy, I suppose it is a worry.  But my dog doesn't worry
about privacy from me.  I protect it, walk with it, am its companion, feed
it, take it out to pee.  If I were the dog, I don't think I'd care very much
about privacy.  I don't think less of dogs because they don't, and I am not
wicked toward them because I could be.  As pervasive sensors become more or
less omniscient, I don't see why I'd care.  I don't care now.  The only
threat is from other people...not from machines.  If machines wish to kill
us, I suspect others will wish to protect us.

We can easily stop most crime now by RFID-tagging all vehicles and making all
money electronic.  I suppose it is better to have privacy, but I for one
would quickly trade it for a stamp on my car and a new card in my wallet (with
no cash).  Fearing governments is a bit 20th century.  Orwell was a brilliant
writer and an interesting guy, but the gig is sort of up.  I actually like
my Big Brothers.  Wouldn't mind a few more.  My brothers aren't bad beings.
Should I abandon the idea of big brothers because they might be bad?

Overall, of course people can try to prohibit progress and learning, but it
has never worked before.  It won't work with machines.  Learning is a
feedback technology.  Dark Ages result from stifling forces, but
renaissances are practically inevitable.  Our evolution makes us learners.
Machines' evolution is making them learners. For now, we breed them.  Soon,
they will breed themselves.

I never realized (and I mean this with complete sincerity and wonder) that
humans were so anthropocentric in social philosophy.  I am genuinely
surprised.  I guess I have been hanging with biologists and AI people so
long that this comes as a surprise to me.  What is the big deal?  We're
really not that remarkable.  It is learning that is remarkable.  And
learning will be increasingly decoupled from us.

-- 
Ryan