From: Jason Joel Thompson (jasonjthompson@home.com)
Date: Fri Aug 04 2000 - 01:20:51 MDT
> Jason's perspective is that this will be a disaster if we don't have a
> way to protect ownership of IP, and therefore we should come up with
> institutions today that protect the forms of IP that we have now, so
> that in the future we will be able to have a functioning economy.
I'm not really in favor of institutional protection in the long term, but I
have yet to articulate exactly what I am in favor of. More on this later.
> The article by John Gilmore follows much the same reasoning, but comes to
> exactly the opposite conclusion. John, like Jason, sees today's threats
> to IP as portending a future society where all productive efforts will
> face similar forms of piracy. Rather than fighting this trend, though,
> he seeks ways to live with it.
I see the 'trend' as entropy, and, yes, I do think we should fight it.
But the way I want to fight it involves using it, which means there may be
points of commonality between John's philosophy and my own.
> His experience no doubt teaches him that you *can* make money off of
> software which is given away for free; the millions of dollars in his
> bank account attest to that.
And clearly he won't be the last to make lots of money off of these sorts of
new business models. But I believe the ways in which people make money off
of free software are available to them (increasingly!) in a world with
'not-free' software.
> Intellectual property law, and expected business practice, is being
> driven by the entertainment middleman industry in exactly the wrong
> direction: Artificially restricting computers and citizens so that they
> will not make the copies that they are very good at making.
Hmm, well I've never been a big fan of most uses of the words 'artificial'
and 'natural.' I'm an advocate of creating a new form of 'smart'
information that is not only data, but meta-data... that is to say, it
contains rules that govern its own behavior... if that's 'artificial,' then
so be it. If we need to create a social contract to protect information
until we get there, then I can probably go along with that too.
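To make that "data plus meta-data" idea a little more concrete, here is a toy sketch, purely illustrative (the `SmartInfo` class and its field names are my invention, not any existing system): a packet of information that carries its own rules along with its payload.

```python
from dataclasses import dataclass, field

@dataclass
class SmartInfo:
    """A toy information packet: data plus the meta-data that governs it."""
    payload: str                               # the data itself
    creator: str                               # who made it
    rules: dict = field(default_factory=dict)  # rules governing its own behavior

    def allows(self, action: str) -> bool:
        # Consult the embedded rules; actions not listed default to allowed.
        return self.rules.get(action, True)

packet = SmartInfo("a song", "jason", rules={"copy": False})
```

Here `packet.allows("copy")` is False because the packet forbids it, while `packet.allows("read")` stays True by default: the rules travel with the data rather than living in some external enforcement layer.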
> Following this path will lead the economy to a massive dislocation (much
> bigger than the record companies' dismay about Napster and MP3 --
> record companies' sales today are greater than ever). Almost nobody
> understands this yet.
I try to avoid strong predictive statements because I'm all too aware of how
superfluid the next ten years are going to be. IMO we don't have good
benchmarks by which to make statements like "following this path will lead
the economy to a massive dislocation." (And actually, I'm not quite sure
what that means...) However, I'll grant that it ain't much fun if we don't
get to take a few stabs in the dark... so on!
> The solution he sees is to move society towards acceptance of the open
> source philosophy in all areas of human endeavor. Only in that way can
> we move into a nanotech economy without a crash, and allowing everyone to
> share in the fruits of the new technology:
>
> If even a third or a half of the economy is running on open source
> principles before assemblers start assembling more assemblers, we
> can probably avoid war and worldwide civil unrest.
Wow. Now we're gettin' into the big stuff. Okay, well let me try to
articulate a contrary viewpoint.
This isn't fully formed yet folks, so I hope a few broad strokes will
suffice:
For a moment, let's empty our heads of some of the commonly held beliefs
about information in the Internet age: foremost, the idea that
"information wants to be free," but also the whole realm of wide-eyed dotcom
day-dreamiery: free, anonymous, infinitely self-replicating, frictionless
information. I would call this sort of information "dumb, nameless,
randomly replicating information with no rules." Let's refer to it as
Dinfo. (dumb info)
Let's consider instead a nearly antithetical type of information: smart,
named, intelligently replicating information with rules. Information that
knows who made it. Information that can be inherently possessed of rules.
Sinfo.
Let's forget for a moment that proper Sinfo requires technology that we
don't yet have-- let's instead just sit back and play a mind game
with Dinfo and Sinfo.
At first glance the biological parallel to Dinfo is cancer, whereas Sinfo
seems to have some strong correlation to the gene. To be more precise:
cancer replicates uncontrollably without good feedback mechanisms to control
its behavior, whereas the gene has some very specific rules about how it can
multiply, with what it can interact, and to "whom" it belongs.
Cancer clearly is the master of unrestrained growth. The gene, on the other
hand, must make use of a tremendously complicated higher order emergent
system in order to perpetuate itself. It follows an exceedingly intricate
and precise set of instructions in order to develop a complex adaptive
vehicle for itself: life.
Without straining the analogy further, the point is simply this: rules +
complexity + selection pressure = emergence. Unrestrained growth is a
negative attractor state-- one into which I fear we are being rapidly drawn.
Okay, okay, like I said, this is just going to be some broad strokes, so let
me move in a slightly different direction.
Sinfo knows who made it. It knows where it's been. Copies may be
functionally identical, but -they- "know" they're different. Some of it
follows rules. Some of the rules are extremely complicated: On Thursday you
can hang out with Bob's Sinfo, but on Friday you have a son and then commit
suicide. Some Sinfo lets you copy it indefinitely. Some you can only copy
once, some not at all.
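Those copy rules are the easiest part of Sinfo to sketch. Here's a minimal toy model, assuming nothing beyond the paragraph above (the `Sinfo` class and its `copies_left` budget are hypothetical names of mine): each packet enforces its own replication limit.

```python
from dataclasses import dataclass

@dataclass
class Sinfo:
    payload: str
    copies_left: float  # float("inf") = copy indefinitely; 0 = not at all

    def copy(self) -> "Sinfo":
        # The packet itself enforces its replication rule.
        if self.copies_left <= 0:
            raise PermissionError("this Sinfo refuses to be copied")
        self.copies_left -= 1  # spend one copy from the budget
        return Sinfo(self.payload, self.copies_left)

free = Sinfo("meme", float("inf"))   # copy indefinitely
once = Sinfo("secret", 1)            # copy exactly once
never = Sinfo("diamond", 0)          # not at all
```

Copying `free` works forever; `once` yields a single child (which itself refuses to be copied); `never` raises on the first attempt. The real trick, of course, is making a rule like this tamper-proof outside a toy interpreter.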
When you build Sinfo you can make it as free or as "owned" as you want. You
are empowered to create information with a legacy, that knows that you made
it, that won't let anyone else copy it precisely, and you can free it into
the environment to prosper or perish.
Now, I think there's a whole realm of inherent advantages to this model, but
I'm not going to get into all of them here, 'cause that's a whole book
folks. We've discussed some of the big issues already on this list. But I
do want to point out what I consider to be one of the biggest pluses to
Sinfo:
You can write yourself into it.
I know that this is the kiss of death for the "information wants to be free"
advocates, and it's the precise reason why I am not among that group: Even
if totally free information is wildly successful (especially if!), the bad
consequence is that it doesn't care about its creator. We would live in a
state of making superficial use of the massive information network that has
coalesced around us, unable to control or affect it, passive bystanders to
an ongoing process that may very well write its own rules eventually (become
Sinfo!) and simply evolve past us. A robust, undisciplined cancer.
Alternatively, we hardcode ourselves into the information-- in fact, we make
the "smartness" fundamental to the information. Every thought that comes
out of your brain becomes an extension of you, as opposed to an anonymous
dust mote in the cyclone. The system that is the "creator" becomes
inextricably linked to the data that is the "created." Widespread
empowerment of ideas in the frictionless matrices of the future gives you
exposure to limitless designer ideas, with a direct positive feedback loop
to the "creator" based upon your successful use of the "created."
In the attention economy, you probably won't want to prevent copying of your
information, even if you could-- you get 'paid' every time someone takes a
look. But maybe some uncopiable Sinfo swirls around the matrix-- absolutely
precious, diamond sharp and outrageously valuable. If you have to work hard
to get it, then...*you'll work hard to get it!* It won't be free or easy or
infinite but will instead require considerable innovation on your part to
procure.
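That "paid every time someone takes a look" loop is simple enough to sketch too. This is an illustration only, with invented names (`AttentionLedger`, `view`): a ledger that credits the creator a unit of attention per view.

```python
class AttentionLedger:
    """Credits a creator one unit of attention each time their info is viewed."""

    def __init__(self):
        self.credits = {}  # creator -> accumulated attention

    def view(self, creator: str, payload: str) -> str:
        # Every look 'pays' the creator: the feedback loop from
        # successful use of the "created" back to the "creator."
        self.credits[creator] = self.credits.get(creator, 0) + 1
        return payload

ledger = AttentionLedger()
for _ in range(3):
    ledger.view("jason", "an idea")
# ledger.credits["jason"] is now 3
```

The point is only that the positive feedback to the creator rides on the viewing itself, so restricting copies becomes unnecessary: more copies mean more views mean more credit.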
Okay, well it's obvious I could go on (and on, and on...)
The primary objection to Sinfo is likely to be on grounds of practicality.
(Think of all the overhead...!) But I'd argue that it's worth all the
overhead... all that meta-data that appears to be slowing the information
down is actually integral to all sorts of emergent success. Kind of like
the government. :)
This does mean that we have to give up some control, in favor of interesting
rules. And if the technology isn't good enough to make those rules work,
then, again, yep, I am in favor of social engineering to make it happen.
But I really don't think the technology is impossible. I really don't. I
actually do think that the gene is a good example of a working model. The
gene is really just information-- written into a less plastic substrate:
physical reality. I'd like to copy and improve upon its success in the
media of the future.
But maybe I'll do that tomorrow, 'cause right now I gotta go to sleep.
Night all.
-- ::jason.joel.thompson:: ::wild.ghost.studios::
This archive was generated by hypermail 2.1.5 : Fri Nov 01 2002 - 15:30:19 MST