On Thu, 15 Mar 2001, Eliezer S. Yudkowsky wrote:
> My Standing Challenge is as follows:
>
> "Name one concrete thing that you should be able to do, but that a Sysop
> won't let you do." It can't be an intangible quality, like "having
> nanobots fully under my control" - you have to name something specific
> that you want to do with those nanobots, but which you won't be able to do
> under the Sysop Scenario.
How about manufacturing my *conscious* girlfriend-du-jour with an
auto-self-disassembly program on a 24-hour timer (because all
that sword-fighting and head-lopping is just too much trouble)?
Built in is the memory that she gave informed consent to
auto-self-disassembly.
Robert