[p2p-research] Drone hacking
J. Andrew Rogers
reality.miner at gmail.com
Mon Dec 21 22:58:01 CET 2009
On Mon, Dec 21, 2009 at 12:20 PM, Andy Robinson <ldxar1 at gmail.com> wrote:
> It is sad that there are people around who think the whole world is made up
> of "behaviours" and numbers, and who are in denial about the importance of
> culture and social construction in human action.
This has nothing whatsoever to do with being "in denial about the
importance of culture and social construction in human action".
Indeed, it quantifies the importance far better than anyone is ever
likely to be comfortable with. Go ahead and keep your ineffability; a
computer will still be able to predict and manipulate your behavior
below the threshold of your ability to detect it.
This is hardly a novel idea. In the 1950s, Hagelbarger and Shannon
demonstrated with their penny-matching machines that a computer can
predict human behavior better than humans can predict their own.
Mathematics and computers have only gotten better since.
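(For the skeptical, the idea is trivial to reproduce. Here is a toy
sketch in Python -- my own reconstruction, not Bell Labs' actual
circuit -- that guesses a player's next heads/tails call from how that
player behaved the last time the same short context came up. Play it
honestly for a few dozen rounds and it tends to pull ahead of 50%.)

from collections import Counter, defaultdict

class PennyPredictor:
    def __init__(self, context_len=2):
        self.context_len = context_len
        self.history = []                  # every observed call, 'H' or 'T'
        self.table = defaultdict(Counter)  # recent context -> counts of the next call

    def predict(self):
        """Guess the player's next call; default to 'H' with no data."""
        ctx = tuple(self.history[-self.context_len:])
        counts = self.table.get(ctx)
        return counts.most_common(1)[0][0] if counts else 'H'

    def observe(self, call):
        """Record what the player actually did and update the context table."""
        if len(self.history) >= self.context_len:
            ctx = tuple(self.history[-self.context_len:])
            self.table[ctx][call] += 1
        self.history.append(call)

if __name__ == "__main__":
    predictor, wins, total = PennyPredictor(), 0, 0
    print("Type H or T each round; anything else quits.")
    while True:
        guess = predictor.predict()
        call = input("H/T? ").strip().upper()
        if call not in ("H", "T"):
            break
        total += 1
        wins += guess == call
        predictor.observe(call)
        print("I guessed %s, you played %s -- machine wins %d/%d." % (guess, call, wins, total))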
> So, we have a sci-fi future with computers which do not require inputs or
> programmers, and which never have bugs or backdoors.
No, we have an only slightly sci-fi future where the models are
generated via generalized inductive methods instead of hardwired
pattern extraction. Throw lots and lots of raw data into the system
and let the mathematics do the useful pattern extraction -- such
methods are much more reliable than humans at discerning subtle
relationships.
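A hedged illustration of what "let the mathematics deal with it" means
in the simplest possible case: fit a model on a raw event stream and
let it discover whatever conditional structure is there, with nothing
about the domain hardwired. The event names below are made up for the
example.

from collections import Counter, defaultdict

def fit_transitions(events):
    """Count how often each event follows each other event -- nothing hardwired."""
    table = defaultdict(Counter)
    for prev, nxt in zip(events, events[1:]):
        table[prev][nxt] += 1
    return table

def predict_next(table, event):
    """Most likely follower of `event` and its empirical probability."""
    counts = table.get(event)
    if not counts:
        return None
    best, n = counts.most_common(1)[0]
    return best, n / sum(counts.values())

# Invented clickstream; the model knows nothing about what the tokens mean.
log = ["login", "browse", "search", "browse", "search",
       "browse", "add_to_cart", "checkout"]

model = fit_transitions(log)
print(predict_next(model, "browse"))  # ('search', 0.666...) on this toy log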
> These perfect
> computers accumulate vast quantities of data on desert nomads and mountain
> clans, not to mention the millions of slum-dwellers without computers. They
> then deploy aggregate models which somehow manage to eliminate variation for
> individual difference, despite the fact that they only know numbers, not
> people.
Nobody is relying on sensors and data sources you own, though those
most certainly help. More importantly, all of this technology works
by building unique and detailed predictive behavioral models of each
and every individual, which is a big part of why it is computationally
intensive. No one is predicting the behavior of some standardized
"average" individual; they are predicting the behavior of *you* in a
specific context. Shades of Minority Report, except not fully
realized and not relying on magic.
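To make the distinction concrete, a minimal sketch under my own
assumptions (not anyone's production system): one small predictive
model per individual, keyed by that individual's own history, rather
than a single pooled model. The per-person bookkeeping is exactly
where the computational weight comes from.

from collections import Counter, defaultdict

class PerUserModels:
    """One tiny behavioral model per individual, not one pooled model."""

    def __init__(self):
        # user id -> context (their last action) -> counts of their next action
        self.models = defaultdict(lambda: defaultdict(Counter))
        self.last = {}  # user id -> last observed action

    def observe(self, user_id, action):
        prev = self.last.get(user_id)
        if prev is not None:
            self.models[user_id][prev][action] += 1
        self.last[user_id] = action

    def predict(self, user_id, context):
        counts = self.models[user_id].get(context)
        return counts.most_common(1)[0][0] if counts else None

m = PerUserModels()
for uid, act in [("alice", "browse"), ("alice", "search"), ("alice", "browse"),
                 ("alice", "search"), ("bob", "browse"), ("bob", "checkout")]:
    m.observe(uid, act)

# Same context, different individuals, different predictions:
print(m.predict("alice", "browse"))  # 'search'
print(m.predict("bob", "browse"))    # 'checkout'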
> By which point they had bloody well better know how to
> make themselves invisible, so the companies/strategists/politicians who are
> using them don't think to come for the wall-plug. (Oh wait. More naivete
> on my part. They're bound to be powered by self-refuelling warp drives.)
They are already invisible; that's the point. Very limited versions of
this type of technology are already in use at a couple of big
e-commerce sites; it does wonderful things for sales, and to the end
user nothing appears to have changed. Hell, I know how this stuff
works, and even *I* can't see it when I use those sites, despite
knowing that it is there and that they are manipulating a model built
by measuring and testing my behavior (also invisibly).
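If you want a rough picture of that invisible measure-and-test loop --
this is my own guess at a minimal version, not any particular site's
implementation -- it can be as simple as an explore/exploit loop that
quietly tries variants on a visitor and converges on whatever that
visitor responds to:

import random
from collections import defaultdict

class VariantTester:
    """Quietly try page variants on each visitor and keep what converts."""

    def __init__(self, variants, epsilon=0.1):
        self.variants = variants
        self.epsilon = epsilon
        self.shown = defaultdict(lambda: defaultdict(int))      # user -> variant -> impressions
        self.converted = defaultdict(lambda: defaultdict(int))  # user -> variant -> conversions

    def choose(self, user_id):
        """Usually serve this visitor's best-performing variant; occasionally explore."""
        if random.random() < self.epsilon:
            return random.choice(self.variants)
        def rate(v):
            n = self.shown[user_id][v]
            return self.converted[user_id][v] / n if n else 0.0
        return max(self.variants, key=rate)

    def record(self, user_id, variant, converted):
        self.shown[user_id][variant] += 1
        if converted:
            self.converted[user_id][variant] += 1

tester = VariantTester(["layout_a", "layout_b", "layout_c"])
variant = tester.choose("visitor_42")                  # looks like an ordinary page load
tester.record("visitor_42", variant, converted=False)  # the test itself is invisible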
> The only people stupid enough to think that people are predictable, are
> people who are boring enough to BE predictable. And who have only ever
> bothered to look at "behaviour" in *mass* societies, where the people around
> them have also been made stupid and boring.
Yes, I know, you are a precious and unique snowflake of unfathomable depth.
Ironically, people seem to be more predictable (from the perspective
of a computer) when they are intentionally trying to be unpredictable:
deliberate attempts at randomness are generated by relatively
low-entropy cognitive processes. There is more entropy in relatively
thoughtless activity, probably because such behaviors are more closely
coupled to environmental entropy, and the environment has far more
ambient entropy than human cognition.
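You can see the entropy point in a few lines of code: estimate the
conditional entropy of a choice sequence given its recent context. The
"trying to be random" sequence below is a caricature (people
over-alternate and avoid runs), and the numbers are whatever the toy
data gives, not results from any study.

import math
import random
from collections import Counter, defaultdict

def conditional_entropy(seq, k=2):
    """Empirical entropy (bits/symbol) of the next symbol given the last k."""
    contexts = defaultdict(Counter)
    for i in range(k, len(seq)):
        contexts[tuple(seq[i - k:i])][seq[i]] += 1
    total = sum(sum(c.values()) for c in contexts.values())
    h = 0.0
    for counts in contexts.values():
        n = sum(counts.values())
        for c in counts.values():
            h -= (c / total) * math.log2(c / n)
    return h

random.seed(0)

# Environment-like noise: a fair coin.
coin = [random.choice("HT") for _ in range(5000)]

# Caricatured "trying to be random" player: mostly alternates, rarely repeats.
trying = ["H"]
for _ in range(4999):
    if random.random() < 0.8:
        trying.append("T" if trying[-1] == "H" else "H")
    else:
        trying.append(trying[-1])

print("fair coin:           %.3f bits/symbol" % conditional_entropy(coin))
print("trying to be random: %.3f bits/symbol" % conditional_entropy(trying))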
--
J. Andrew Rogers
realityminer.blogspot.com