From: Dan Fabulich (dfab@cinenet.net)
Date: Tue Mar 04 1997 - 21:20:03 MST
Hal Finney wrote:
> The idea of using Usenet is interesting, although it's not clear to me that
> the Usenet model of flooding the world with postings is going to survive.
> The web approach of just linking your new data into existing structures seems
> so much more efficient.
More efficient, yes, but Usenet also has its subtle charms. Most
notably, a decentralized flood is really the only way to prevent people
from retracting articles. And, of course, when retracting articles is
impossible, Max M's point about editing isn't relevant anymore. The
article with that message id will remain constant forever in an ideal
hypertext system, so referring to that message and its byte range will
always refer to the same data.
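To make that concrete, here is a minimal sketch of what a byte-range reference into an immutable article might look like. The "news:" URL scheme is real, but the "#bytes=" fragment syntax, the message id, and the article body below are all invented for illustration:

```python
# Sketch of addressing a span of an immutable Usenet article by
# message id plus byte range. The fragment syntax is hypothetical.

def make_reference(message_id, start, end):
    """Build a stable reference to bytes [start, end) of an article."""
    return "news:%s#bytes=%d-%d" % (message_id.strip("<>"), start, end)

def resolve(article_body, start, end):
    """Since the article body never changes, the same range always
    yields the same data."""
    return article_body[start:end]

body = b"Hello, Usenet hypertext!"
ref = make_reference("<19970304.dfab@cinenet.net>", 7, 13)
print(ref)                   # news:19970304.dfab@cinenet.net#bytes=7-13
print(resolve(body, 7, 13))  # b'Usenet'
```

The whole scheme only works because flooding makes retraction impossible: if the article could be edited, the byte range would silently point at different data.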
As for DejaNews, AltaVista, and other archives... These are
good, but not good enough. Ideally, the hypertext system would allow
us to move beyond the plaintext discussion of Usenet and into other
media. Yet even today dejanews wouldn't even THINK of archiving a
binaries group. Like a library, the hypertext system would have to
be able to deal not only with text but anything and everything which
computer technology might throw at it. Is this possible?
Ken Meyering wrote:
> The point is, the media content should have some kind of unique
> identifier, and the *medium* should be of no concern.
Very true, and MIME already accomplishes this... It doesn't matter
whether my Usenet is running off a DVD disk, a RAID drive or a punch
card, so long as the data gets to my computer intact. MIME will help me
to encode the data and will tell me what media type I'm receiving. AND it's
already installed on almost every popular newsreader on the Internet.
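A quick sketch of the point, using modern tooling rather than a 1997 newsreader: the Content-Type header names the media, and base64 makes the binary safe for a text transport. The newsgroup, message id, and image bytes below are made up:

```python
from email.message import EmailMessage

# Sketch: a text-based article carrying arbitrary binary media via MIME.
# All identifiers here are invented examples.
msg = EmailMessage()
msg["Newsgroups"] = "alt.binaries.example"        # hypothetical group
msg["Message-ID"] = "<example.1997@cinenet.net>"  # hypothetical id
msg.set_content("A short plain-text part.")
msg.add_attachment(b"\x89PNG...not a real image",
                   maintype="image", subtype="png",
                   filename="demo.png")

# The article is now multipart: text plus a typed, encoded binary.
print(msg.get_content_type())  # multipart/mixed
```

The receiving newsreader doesn't care whether the bytes came off a DVD, a RAID array, or a punch card; the MIME headers alone say how to decode and render them.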
> So, I suggest skipping all the R&D of *text-based* systems, and
> jumping straight into designing systems that will work for
> entertainment and education.
Usenet is a text-based architecture, but it doesn't just contain text.
With MIME and other binary-text converters, Usenet has been retooled to
deal with all conceivable media... And we can refer to these articles
by message id and byte range, since each message id is globally unique.
Really, the Usenet is already basically ready to take on hypertext. It
already does the coarse-grained work by threading subjects together.
With very little effort, this threading could become fine-grained and
non-linear. Netscape Navigator and Internet Explorer both have
tools for reading news with links and binaries embedded in them. The
framework is already laid out.
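The coarse-grained threading I mean is already encoded in the References header of every followup. A toy sketch, with invented message ids, of how a reader reconstructs the thread; a fine-grained link would just add a byte range to one of these ids:

```python
# Sketch: Usenet's References header already links articles into a
# thread tree. Message ids below are invented examples.
articles = {
    "<a@x>": {"References": ""},            # thread root
    "<b@x>": {"References": "<a@x>"},       # reply to a
    "<c@x>": {"References": "<a@x> <b@x>"}, # reply to b
}

def parent(msgid):
    """The last id in References is the article being replied to."""
    refs = articles[msgid]["References"].split()
    return refs[-1] if refs else None

print(parent("<c@x>"))  # <b@x>
print(parent("<a@x>"))  # None
```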
> Who will PROFIT from building this type of system? Those who own
> content. ...Do Ted Turner, Gerald Levin, Michael Ovitz, etc. understand
> the technical requirements for these types of systems? I would guess
> that they don't. Do ADVERTISERS appreciate the benefits of these
> systems. I'd guess that they do. FINE-GRAINED PSYCHOGRAPHICS. Such
> systems would enable the tracking of information preferences to the
> most subtle levels.
Certainly, this system has obvious benefits for almost everybody. In
fact, your stimulating education and entertainment could almost emerge
straight from the current Usenet, were it more popular. But the real
genius of hypertext emerges when the archive is permanent, and the
acquisition of data becomes a constantly expanding time capsule which
anyone can study from, contribute to, and look back on for guidance.
> Maybe if you target your hypertext ideas to this audience. You may
> find that you get results more quickly than by targeting the W3
> Consortium.
Heh. No kidding. Maybe I'm just talking up pipe dreams. But I still
believe that hypertext is invaluable to the human race. Is there
anything more we can do to bring it about completely?
-He who laughs last thinks slowest-
dAN
This archive was generated by hypermail 2.1.5 : Fri Nov 01 2002 - 14:44:13 MST