From: Nick Tarleton (nickptar@gmail.com)
Date: Thu Apr 17 2008 - 21:59:31 MDT
On Wed, Apr 16, 2008 at 10:28 PM, Tim Freeman <tim@fungible.com> wrote:
> From: "Nick Tarleton" <nickptar@gmail.com>
>
> >Different respect vs. compassion weighting is not the only reason
> >humans judge the car-buying contest as acceptable but the mugging bad;
> >we disapprove of using the threat of force to take others' property.
> >This should be taken into account.
>
> You're saying interesting stuff, so I don't want to ignore you, but my
> response is close enough to my response to Lee Corbin that I don't
> want to redundantly send it to SL4.
>
> I would work hard to do that, if I knew how to express the concept of
> "others' property" formally. Any ideas?
>
>
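For what it's worth, here's a toy starting point. This is a minimal
sketch, not a serious proposal: it assumes ownership can be modeled as
a map from items to agents, and treats "taking others' property" as
any transfer the current owner doesn't consent to. All the names below
are hypothetical illustrations, not part of any existing formalism.

from dataclasses import dataclass, field
from typing import Dict

@dataclass
class World:
    # owner maps each owned item to the agent holding title to it;
    # items absent from the map are unowned.
    owner: Dict[str, str] = field(default_factory=dict)

def is_others_property(world: World, item: str, agent: str) -> bool:
    # "Others' property": the item has an owner, and it isn't `agent`.
    holder = world.owner.get(item)
    return holder is not None and holder != agent

def transfer(world: World, item: str, to_agent: str,
             owner_consents: bool) -> None:
    # Acquiring others' property is allowed only with the owner's consent.
    if is_others_property(world, item, to_agent) and not owner_consents:
        raise PermissionError("taking others' property without consent")
    world.owner[item] = to_agent

Of course, the consent flag is doing all the work: a mugging victim
"consents" at gunpoint, so saying when consent is genuine (and where
the ownership map comes from at all) is the actual hard part of your
question.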
> >In any case, you really shouldn't specify this level of detail in
> >advance.
>
> I'm not sure what you mean by "this level of detail". Are you saying
> it's overspecification to figure out whether the AI cares about
> non-humans? Or is it overspecification to talk about cows? Or
> something else?
Fixing in advance who the AI cares about is over-specification. That's
what the AI (in the CFAI model) or the extrapolated volition (in the
newer model) is supposed to figure out.
> Actually, I was hoping to find some principled way to decide who
> benefits without having to bring political expediency into the
> discussion.
As I said, political expediency definitely doesn't matter after the
Singularity, and probably not before either.
> As you can see, I failed, but I don't know how to do
> better.
"And if I myself dared to set down ten Rights, I would get at least
three Wrong."
- http://www.sl4.org/wiki/CoherentExtrapolatedVolition