From: Robin Hanson (hanson@econ.berkeley.edu)
Date: Fri Jul 17 1998 - 10:47:10 MDT
'gene writes:
> > Are you saying, for example, that on average the forecasts I might make
> > today about 2098 are no more accurate than the forecast an ancestor of
> > mine might have made in 1898 about 2098?
>
>Essentially, yes. Assuming 2098 is PostSingularity, you can only safely
>exclude anything not in accordance with physical laws, but given what
>little we know this set of constraints is not very stringent. Of
>course if we run into saturation it might turn out your predictions
>are accurate after all. The point is, we can't tell yet what actually
>happens.
I don't know what you mean by "saturation."
You don't need to "safely exclude anything" to be able to make more
accurate forecasts. You just have to assign a higher probability to
what turns out to be the right answer. Consider the probability
distributions my 1898 ancestor would have assigned to these:
1) Number of his/her human descendants in 2098
2) Median physical mass of each descendant in 2098
3) Median age till death of each descendant in 2098
4) Median number of immediate descendants of each descendant
5) Median distance each descendant lives from where he/she lived
6) Modal religious affiliation of descendants
*Any* difference between the distributions he/she would have assigned
and the ones I would assign should mean I will make better forecasts
(assuming I'm rational). Even if the distribution I would assign is
broader than the one he/she would have assigned, reflecting my
recognition of a wider range of possibilities, it still means I am more
likely to assign a higher probability to the right answer.
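To make that concrete, here is a small numerical sketch (Python, with
made-up bins and probabilities chosen only for illustration). Under a
proper scoring rule such as the log score, the broader distribution
still scores better whenever it puts more probability on the outcome
that actually occurs:

    # Toy numbers, for illustration only: probability each forecaster
    # assigns to the (binned) number of descendants alive in 2098.
    import math

    bins     = ["<1", "1-10", "10-100", "100-1000", ">1000"]
    ancestor = [0.05, 0.60, 0.30, 0.04, 0.01]  # narrow, centered on the wrong bin
    today    = [0.10, 0.20, 0.40, 0.20, 0.10]  # broader, but more weight on the truth

    true_bin = 2  # suppose "10-100" is what actually happens

    def log_score(dist, outcome):
        # Log score: higher is better; it rewards probability put on the truth.
        return math.log(dist[outcome])

    print("ancestor log score:", log_score(ancestor, true_bin))  # ~ -1.20
    print("today's  log score:", log_score(today, true_bin))     # ~ -0.92

The ancestor's tighter distribution only wins if its extra confidence
happens to land on the truth; a rational forecaster with more
information should not expect that.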
Robin Hanson
hanson@econ.berkeley.edu http://hanson.berkeley.edu/
RWJF Health Policy Scholar, Sch. of Public Health 510-643-1884
140 Warren Hall, UC Berkeley, CA 94720-7360 FAX: 510-643-2627