Robin Hanson writes:
> I don't know what you mean by "saturation."
Sorry for being notoriously informal; I mean that certain developments start to lag relative to the hitherto linear log plot of some relevant metric vs. time.
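A toy illustration of what I mean, everything in it invented for the occasion: on a log plot an exponentially growing metric is a straight line, while a logistic one tracks that line early and then flattens out -- the flattening is the saturation. A minimal Python sketch, assuming an arbitrary growth rate r and ceiling K:

    import numpy as np

    # Toy numbers only: r is an assumed growth rate, K an assumed ceiling.
    t = np.linspace(0, 10, 11)
    r, K = 1.0, 1000.0
    exponential = np.exp(r * t)                         # straight line in log space
    logistic = K / (1.0 + (K - 1.0) * np.exp(-r * t))   # same early slope, then flattens

    # Print both trends in log space; the logistic lags the line as it nears K.
    for ti, e, l in zip(t, np.log(exponential), np.log(logistic)):
        print(f"t={ti:4.1f}  log-exp={e:6.2f}  log-logistic={l:6.2f}")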
> You don't need to "safely exclude anything" to be able to make more
> accurate forecasts. You just have to assign a higher probability to
> what turns out to be the right answer. Consider the probability
> distributions my 1898 ancestor would have assigned to these:
Hmm, let's see. (Btw, would you let me have a copy of your algorithm
for assigning these numbers to actual forecast events? Big thanks.)
> 1) Number of his/her human descendants in 2098
Anything from zero (we can't go negative, can we?) to some astronomically large number.
> 2) Median physical mass of each descendant in 2098
If physical mass even has a meaning after some timespace engineering (show me a proof you can't do that by 2098). Even if you can't build these, I still have no idea how many persons can really fit into the volume of a glass of water, whether with quantum computers or not. What's the mass of matter in the PS lightcone @ 2098? (And molecular circuitry we will certainly be able to do by then. Urgh. Was this a forecast? Forget whatever I said.)
> 3) Median age till death of each descendant in 2098
Age? (Well, if you go Lamarckian, identity stops having its canonical meaning.) Death? What's that?
> 4) Median number of immediate descendants of each descendant
If identity still exists, as does timespace.
> 5) Median distance each descendant lives from where he lives
Urrr. 'Distance'? 'Live'?
> 6) Modal religious affiliation of descendants
'Religion'?
{All right, I've been deliberately nonconstructively dadaist (within my current mental model), but the point (still) is: you can't be sure what actually happens. Personally, I would adhere to more conservative scenarios (you'd be surprised), but no one can be certain. No, sir. Not even Robin Hanson.}
> *Any* difference in his/her vs. my distributions assigned should mean
> I will make better forecasts (assuming I'm rational). Even if the
Maximum difference is what you'll get, but it still does not mean anything a priori.
> distribution I would assign is broader than the distribution
> he/she would have assigned, reflecting the realization of a wider range
> of possibilities, that still means I am more likely to assign a
> higher probability to the right answer.
The distinction is not practical. Yes, you would; and much good it will do you.
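To make the scoring claim concrete (a toy sketch, every number below is mine): under a log score -- the log of the probability assigned to what actually happened, higher is better -- a broad distribution can indeed beat a narrow one, but only when the narrow one is confidently wrong:

    import math

    # Invented buckets and probabilities for "number of descendants in 2098".
    narrow_1898 = {"0": 0.05, "1-10": 0.80, "11-100": 0.14, ">100": 0.01}
    broad_2098  = {"0": 0.25, "1-10": 0.25, "11-100": 0.25, ">100": 0.25}
    truth = ">100"   # assumed outcome, purely for illustration

    def log_score(dist, outcome):
        # Higher is better: log of the probability given to what occurred.
        return math.log(dist[outcome])

    print("narrow:", log_score(narrow_1898, truth))  # log(0.01) ~= -4.61
    print("broad: ", log_score(broad_2098, truth))   # log(0.25) ~= -1.39

Which of the two cases you are in is exactly what you don't know a priori.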
ciao,
'gene