From: Marc Geddes (marc_geddes@yahoo.co.nz)
Date: Sat Oct 23 2004 - 00:35:39 MDT
80% of all new businesses fail. Even with an
excellent business plan, I think Eliezer is right that the
chances of success wouldn't rise above 50%. Also
correct is that the business would become your life.
You'd have to do nothing else for at least 5-10 years.
Finally, and this is probably the most important
point, though it wasn't mentioned by Eliezer: running a
successful business requires a certain mind-set, a
certain psychology.
Someone asked: if there are lots of clever people, why
can't they get rich quick? Well, brains aren't what's
required. Business is all about people skills, I
think. I judge that Eli does not have the right
psychology for business. Asking him to run a business
is about as sensible as asking a hard-arse Hollywood
exec to become an FAI researcher. Wrong psychology.
And I'd be the first to admit that I don't have the
psychology for business either.
I really really don't know what the solution to the
funding problem is. I do wish Sing Inst well.
I can only point out again the reasons why I think few
are donating:
(1) most people don't understand SL4 concepts,
and (2) those who have the right psychology for SL4
concepts are highly individualistic and skeptical of
authority by nature, which makes co-operation
difficult. I think Eli correctly analysed the problem
here in the Raelian discussion. Dangerous to be
half-rationalistic and so forth.
Can I just say, I recognize that a Singularity is
*quite likely* and that there is *quite possibly* an
Existential risk in the not too distant future. But
this is still far from clear.
I think that Sing Inst supporters have to admit to
themselves they can't be *very sure* that this is, in
fact, the case. Even Eliezer cannot possibly know
what the actual time-line for existential risk is, and
what will, in fact, actually happen at Singularity.
It is not in fact the case that the risk is something
that is *really obvious*. Only a fair bit of thinking
can persuade one that there is, in fact, a real risk
here. And it is not in fact the case that we know
that the risk is actually near-term. Personally I do
not think that there is any really significant
existential risk until around 2030 and beyond. This
could be wrong. The probability of the extinction of
humanity could rise significantly far earlier than
this date. But who can honestly say they can make an
accurate risk assessment? So the situation does not
appear to me to be as *obvious* as, say, a big asteroid
coming towards us (although the situation could quite
definitely be just as urgent).
=====
"Live Free or Die, Death is not the Worst of Evils."
- Gen. John Stark
"The Universe...or nothing!"
-H.G.Wells
Please visit my web-sites.
Sci-Fi and Fantasy : http://www.prometheuscrack.com
Mathematics, Mind and Matter : http://www.riemannai.org