From: Anders Sandberg (asa@nada.kth.se)
Date: Tue Sep 11 2001 - 14:06:10 MDT
On Tue, Sep 11, 2001 at 12:12:46PM -0700, Samantha Atkins wrote:
> Anders Sandberg wrote:
> >
>
> > At the same time this is in many ways just a test :-( The problem of
> > destruction - that the amount of destruction attainable by an
> > individual is increasing along with technology, and countermeasures
> > seldom work against irrational agents - is serious and growing.
>
> You could harden likely targets a bit, like shooting down off-course
> aircraft.
It is expensive and difficult to harden many targets. And a policy of
shooting down off-course aircraft will of course mean bloody accidents,
with innocents killed, every few years - is that price worth the slight
safety improvement?
> > The big problem is that we don't have any good solutions to it: the
> > standard response at present is to demand that Big Brother save us
> > through Echelon, Carnivore and DMCA - which will not really work,
> > but will erode civil liberties, add significant risks of abuse and
> > quite likely make people feel even more alienated against the
> > government, increasing risks of new attacks.
>
> Yes. Although DMCA has nothing to do with detecting or
> preventing any sort of terrorist activity.
True (although M$ and RIAA might disagree). But it is part of the same
meme complex: information must be controlled. The originating reasons
are of course different, but the basic assumption of all the interests
behind these systems is that if information is power, then it must be
controlled so that they (and the groups they work for, including of
course civilian governments) can retain that power.
> >The
> > transparent society might at least keep some civil liberties and
> > remain an open society, but the cultural changes needed will be
> > rather wrenching.
>
> The biggest change required is that governments be severely
> limited from interfering with individual rights and with any and
> all behaviors that do not directly harm or defraud others.
> Transparency is not safe unless governments are forbidden to act
> on the information received to suppress unpopular attitudes and
> activities or to attempt to force compliance with government
> endorsed positions.
The transparency option needs a big number of what Brin calls
"antibodies", those people brave and tenacious enough to keep poiting
out abuses and demanding their rights. Without enough of them it will of
course not work. This is why the transparency option needs more than
just small cameras, it needs a cultural awareness of the need for
freedom and the need to stand up for one's rights.
> > Of these partial solutions, some appear more promising than others.
> > The transparent society ideas of accountability and the ability to
> > trace stuff would help discourage the more rational irrationalists,
> > and might even allow the discovery of dangerous plots before they
> > are executed. If these traits could be combined with a
> > self-enforcing structure (legal, economic, software, whatever) that
> > prevents abuse, much would be won.
>
> But who gets to define what is and isn't abuse? Some of the
> current definitions of those in power in the US and elsewhere in
> the world would lead straight to a more rigid and horrific
> totalitarian state than the world has ever seen. As long as
> political theory and practice are highly irrational, our best
> protection is in what privacy we have and whatever inefficiencies
> in enforcing the government's will exist now. I will live
> (but not happily of course) with the threat of terrorism rather
> than under the grim certainty of a totalitarian state.
True. Of course, totalitarian states tend to breed terrorists too, so
you will not even be safe in that case.
In the end, abuse has to be citizen-defined, and defined in such a way
that the power of definition cannot be taken away from the citizenry.
How to achieve that is an important question; I really hope we can rise
to the challenge and come up with creative answers that can be
implemented.
> > We need more than just a requirement of court orders to release
> > keys from escrow - something that cannot easily be subverted by a
> > new regime. Distributed encrypted legal smart contracts, perhaps?
> >
>
> The Fifth Amendment should cover the right of any and all
> persons not to divulge their electronic information. This
> information should be seen as a direct extension of the person's
> mind. It should be as illegal for the court to order it open as
> for the court to order truth-serum to be administered to a
> defendant. Nothing less imho will protect us as individuals as
> we become more intimately augmented by electronic and
> computational devices.
I agree, but the worrying trend right now is in the other direction. I
noticed yesterday that brain fingerprinting is now admissible in US
courts, although I guess that is hard to do on a resisting subject. This
right to informational integrity is important, and despite the awful
implementations of it here in the EU, at least some parts of it are
getting into law (unfortunately only parts, and with many contradictions
from equally sweeping police powers). Of course, this is an area where
crypto can help a bit, but we still need the legal and ethical
infrastructure to keep the use of crypto legal.
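(As a minimal illustration of what I mean by crypto helping - just a
sketch of my own, not anything proposed above, assuming Python and the
third-party "cryptography" package - data encrypted under a key only its
owner holds simply cannot be produced without the owner's cooperation,
whatever a court orders:

# Hypothetical sketch: personal data encrypted under a key that only its
# owner holds, so "informational integrity" rests on math rather than on
# court procedure. Requires the third-party cryptography package.
from cryptography.fernet import Fernet

# The owner generates and keeps the key; no escrow copy exists anywhere.
owner_key = Fernet.generate_key()
cipher = Fernet(owner_key)

diary_entry = b"augmented-memory note: private thoughts"
stored_blob = cipher.encrypt(diary_entry)  # what a seized device would hold

# Without owner_key the blob is useless; with it, the owner may choose to reveal.
assert cipher.decrypt(stored_blob) == diary_entry

The legal question is then whether one can be compelled to hand over
owner_key, which is exactly the Fifth Amendment point above.)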
> > The risk with disasters like today is that people feel powerless and
> > stressed, and hence opt for the easiest solutions rather than think
> > of new ones. We have a huge problem, we need to solve it, and we
> > need a *good* solution. Otherwise this will happen again and again,
> > on larger and larger scales.
>
> I could see a more transparent society (in both directions,
> public and private) with and only with severe restrictions on
> government and corporate meddling with individual rights.
It is a feedback loop: in a somewhat transparent society, anybody
meddling with your rights is more easily exposed, and meddling doesn't
pay as much as before. That helps make the society more transparent. On
the other hand, the reverse is also true. That is why we need those
antibodies.
> On today's tragedy, I wonder whether these planes were
> simultaneously hijacked in a coordinated manner (difficult) or
> whether the automatic guidance systems were somehow subverted
> and made impervious to manual override. Just a thought.
I think the first; to my knowledge (which is admittedly near zero) it is
not possible to completely override manual in airplanes. This event
might however lead to the reverse: what if planes could be remote
controlled in the event of hijacking? I can envison a cryptographically
secure remote control system that could in certain emergencies override
the pilot. This is likely a Bad Idea - both a hacking opportunity par
excellence (even if it is based on some clever hardware scheme there are
always mistakes), and it would undermine the trust of the pilot and
plane. However, some might think it might be worth it.
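(To be concrete about what such a system might look like - purely my own
sketch, ordinary challenge-response authentication with a shared secret
using Python's standard hmac module; all the names and the message format
are invented - the plane would only accept an override command
authenticated against a nonce it issued itself:

# Hypothetical sketch of an authenticated remote-override command.
# Ground station and aircraft share a secret; the plane accepts a command
# only if its MAC verifies against a fresh nonce, which blocks replays.
import hmac, hashlib, secrets

SHARED_SECRET = secrets.token_bytes(32)  # provisioned in tamper-resistant hardware

def aircraft_issue_nonce() -> bytes:
    """Plane generates a one-time challenge for any would-be controller."""
    return secrets.token_bytes(16)

def ground_sign_command(command: bytes, nonce: bytes) -> bytes:
    """Ground station authenticates its override command against the challenge."""
    return hmac.new(SHARED_SECRET, nonce + command, hashlib.sha256).digest()

def aircraft_verify(command: bytes, nonce: bytes, tag: bytes) -> bool:
    """Plane accepts the command only if the MAC checks out."""
    expected = hmac.new(SHARED_SECRET, nonce + command, hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag)

# Example exchange
nonce = aircraft_issue_nonce()
cmd = b"OVERRIDE: divert to nearest runway"
tag = ground_sign_command(cmd, nonce)
assert aircraft_verify(cmd, nonce, tag)

Even so, the objections above stand: a leaked shared secret or a coerced
ground station defeats the whole scheme, and the trust problem remains.)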
-- 
-----------------------------------------------------------------------
Anders Sandberg                                      Towards Ascension!
asa@nada.kth.se                            http://www.nada.kth.se/~asa/
GCS/M/S/O d++ -p+ c++++ !l u+ e++ m++ s+/+ n--- h+/* f+ g+ w++ t+ r+ !y