Re: law enforcement for profit

From: Michael S. Lorrey (retroman@turbont.net)
Date: Fri May 05 2000 - 04:57:59 MDT


Billy Brown wrote:
>
> Michael S. Lorrey wrote:
> > Which speaks more to how the internet violates our privacy and the
> > security of our property rather than to a problem with the legal
> > principle. The internet is our tool, not the reverse, we should not have
> > our rights be enslaved to its limitations. We should not accept a police
> > state just so we can feel better about ordering books online with our
> > credit cards.
>
> True. But I didn't say anything about accepting a police state. I said that
> the idea that only human observations can constitute evidence is rapidly
> becoming impractical, and will be completely unworkable in the very near
> future. If you don't like my solution to the problem, offer another one -
> but don't tell me we can just bury our heads in the sand and ignore it.

Simple: require at least a chimp-level AI running the cameras.
Principle should not bend to practicality; the only reason cameras are
going up now rather than 20 years ago is that cameras are now cheap
relative to the value of a traffic citation. Demanding a higher
standard of practicality to match one's legal principles merely puts it
off for a few more years, but in the end people won't be so upset,
because they themselves, rather than their vehicles, would be cited for
something they actually did, and victimless crimes would no longer be
prosecutable with camera evidence.

>
> Nothing should ever be assigned a 100% level of trust in a court of law. But
> that doesn't mean the level of trust has to be 0%. The idea is supposed to
> be to present the jury with all available data and let them decide who to
> believe, remember?
>
> Besides, the original topic was what to do with systems that are far more
> difficult to tamper with. It is impossible to make any device completely
> tamper-proof, but it is pretty easy to make them robust enough that it takes
> real expertise to falsify their data. If the device also has reasonably high
> accuracy (say, <0.01% false positives), it is going to be much more common
> for a human witness to lie than for the device to give false data.

The thing is, though, that the device's data does not go straight to
the courtroom. It gets transmitted somewhere, stored, analysed, stored
again, and then ticketing reports are processed based on the sorting
analysis. Every time the data is stored, it is ripe for manipulation or
corruption.
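
To put the point concretely: unless the record is cryptographically
sealed inside the camera itself, every one of those storage and
processing hops is a chance to rewrite it without a trace. A minimal
sketch of the idea in Python (hypothetical field names and key
handling, not any actual ticketing system):

import hashlib
import hmac

# Assumed: a key held inside the camera that downstream systems
# cannot forge.
CAMERA_KEY = b"secret-key-burned-into-the-camera"

def seal(record: bytes) -> bytes:
    # Computed inside the camera, before the data is ever transmitted.
    return hmac.new(CAMERA_KEY, record, hashlib.sha256).digest()

def verify(record: bytes, tag: bytes) -> bool:
    # Any later hop that altered the record will fail this check.
    return hmac.compare_digest(seal(record), tag)

original = b"plate=ABC123 speed=47 limit=35 time=2000-05-05T04:57:59"
tag = seal(original)

# The same record after transmission, storage, analysis, and
# re-storage; here the speed has been quietly bumped up somewhere
# along the way.
tampered = b"plate=ABC123 speed=57 limit=35 time=2000-05-05T04:57:59"

print(verify(original, tag))   # True:  unchanged copy checks out
print(verify(tampered, tag))   # False: manipulation at some hop shows

And even a seal like that only proves the bits were not changed after
capture; it says nothing about whether the camera recorded the scene
correctly in the first place.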

>
> I have to admit, BTW, that I find it both amusing and sad that we can be
> having this argument in this particular forum. Can you picture applying a
> 'only direct observations by sentients can be evidence' rule to a society of
> Jupiter brains? Even relatively puny technologies like mature VR and
> animal-level AI make the idea unworkable. Either you eventually adopt a more
> flexible standard, or you end up with large swaths of social interaction
> that are ripe for criminal abuse, but completely beyond the reach of law
> enforcement.

There have been rare occasions where the reactions of dogs, and the
testimony of young children, have been admissible in court. Give me a
chimpanzee-level AI doing the traffic work and I might change my mind.
Until then, don't demand that I regard a machine-produced traffic
citation as anything other than automated tyranny.
