[p2p-research] From the document web, via the data web, to the active web

Athina Karatzogianni athina.k at gmail.com
Sun Mar 30 04:17:13 CEST 2008


Hi

Well, it's the middle of the night here too, and after reading that I
thought there is a little more to the technical side of this, which
Henrik very competently discusses here.

I think this piece makes a serious point about our understanding of
knowledge and how our philosophy of human-to-human and human-computer
interaction influences knowledge, particularly 'universal' knowledge (a
truly bizarre utopian Enlightenment concept), as well as the
uncertainties created by 'globalization' and technocracy.

The article deserves attention because it warns against relying on
'Boolean' logics in IT terms; I would argue also against relying on
'binary' terms in the political sense (although that is more relevant
to the point of allowing for interaction, disagreement and, happily,
conflict, as they produce the most interesting results).

What has been done 'technically' with Web 2.0 is not enough, and
fundamentally the architecture is still the same, relying on certain
logics of 'universal' truths, fact triangulation and customer-client
relationships, instead of networking, building on each other and, why
not, even producing 'biased' knowledge. The reason Web 2.0 is not that
'fun' is that it is impersonal (or often too personal!), alienating to
the computer-illiterate, and not catering for exciting interactions
for those who are IT-literate.

To put it simply, some aspects are too centralized (control of
platforms, software, e-commerce, etc.), while others are too scattered
and lost to the few better-known blogs and webpages. The architecture
is not enabling, because it was devised for different purposes.

All the great efforts and amendments to that will always fall short.
It is like a house constantly changing builders, architects and
engineers: however good these people are, I find it difficult to see
better cyberspaces emerging unless the foundations are looked at, and
not only in technical terms.

The philosophy part this author advances is in my opinion spot on.

Hello from chilly England

Athina

On Sat, Mar 29, 2008 at 10:43 PM, Henrik Ingo <henrik.ingo at avoinelama.fi>
wrote:

> Michel,
>
> It's in the middle of the night, but I just feel compelled to add some
> points. This is not good enough for a blog, but may be freely
> copypasted by anyone...
>
>
> My impression of the article is that it is written by some academic
> non-programmer, who has tried to study the HTTP protocol and some W3C
> standardisation efforts, but has no experience in actually producing
> web applications. As first impressions go I could of course be very
> wrong; I didn't even bother to read some of the middle parts of the
> article!
>
> The Document web definition is fine, it is what anyone would consider
> "Web 1.0". What I strongly disagree with is the authors criticism or
> belittlement of current "Web2.0". In my opinion a significant shift in
> the web happened with the maturation of the Firefox browser, which
> released an avalanche of web based applications and portals that made
> heavy use of JavaScript and CSS. (If someone wouldn't like the term
> "Web2.0" it may be better and clearer to call this "The advent of
> AJAX".)
>
> Before Firefox there were two browsers that supported advanced
> JavaScript, Internet Explorer and Netscape, but they supported
> totally different versions of it (the standardised version today is
> the IE one, a testament to the fact that MS indeed employs some very
> good programmers, the ones who happened to work on IE from 4.x to 6.x
> before 2001). Therefore most pages that tried to do anything with
> JavaScript or advanced CSS supported only one of these browsers, or
> sometimes tried to support both of them, often with poor results. And
> many in the universities or Open Source crowds, for instance, were
> still using text-based browsers, which is notable because at the time
> this group had significant mindshare in the web's development. For all
> of these reasons the use of JavaScript was considered evil by (in my
> opinion) a majority of web developers, and what was then called
> "Dynamic HTML" was mostly a phenomenon of the Microsoft camp. (Even
> today, if you use the web interface to Microsoft Exchange email, it is
> very nice on IE but barely usable on Firefox.)
>
> With the advent of Firefox - which supported the then standardised IE
> style of JavaScript - the situation started changing, since there now
> was a standard, and a free multiplatform browser to support the
> standard. Quite soon very cool web based apps were born, led by Google
> maps, Google mail... This was called AJAX programming, as in
> Asynchronous JavaScript and XML. Compared to Microsoft's DHTML
> evangelisation this was much cooler technology than anyone had ever
> dreamt of, and the availability of an Open Source browser to support
> it also made the opposition vanish. So imho this, and not IE4.x with
> DHTML support, was the de facto next phase of the web.
>
> At the same time we had developed some additional techniques, the
> most significant perhaps being RSS and the family of XML markups used
> to provide blog feeds. This led to collaboration between websites
> beyond linking: you could provide parts of another blog or news site
> on your own page, for instance. Or, to take a very different example,
> BookMooch uses Amazon to provide data and cataloguing of books. Yet
> BookMooch is a site for free sharing of old books; you'd think Amazon
> wouldn't like "helping out" such a project. Not so: in reality lots of
> BookMooch users end up buying books on Amazon. In fact, BookMooch
> probably makes most of its income based on money they get from Amazon
> for these referrals.
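To make the feed-syndication idea concrete, here is a minimal sketch using only Python's standard library; the feed content and URLs are invented for illustration, and a real portal would of course fetch the XML over HTTP first:

```python
import xml.etree.ElementTree as ET

# A tiny hand-written RSS 2.0 feed, standing in for what a blog or
# news site would publish at some address like /feed.xml.
FEED = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Example Blog</title>
    <item><title>First post</title><link>http://example.org/1</link></item>
    <item><title>Second post</title><link>http://example.org/2</link></item>
  </channel>
</rss>"""

def latest_headlines(feed_xml, limit=5):
    """Extract (title, link) pairs, as a site embedding another
    site's feed would do before rendering them on its own page."""
    root = ET.fromstring(feed_xml)
    items = root.findall("./channel/item")[:limit]
    return [(i.findtext("title"), i.findtext("link")) for i in items]

for title, link in latest_headlines(FEED):
    print(title, "->", link)
```

Rendering those (title, link) pairs inline is exactly the "parts of another blog or news site on your own page" collaboration beyond linking.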
>
> AJAX combined with RSS and some other by then standard tools (wiki is
> a significant one) is in my opinion rightly called Web2.0. This is
> very different from the original document based web and rightly has
> been given its own name.
>
> Web2.0 is NOT the social web (like FaceBook, LinkedIn). The social web
> is merely an application of Web2.0; technically it doesn't contribute
> anything new. (Well, apart from FaceBook's innovation of letting 3rd
> parties develop applications embedded in its own site, which is a
> great innovation, but it is not "THE social web".) Why the social web
> is so hyped is in this context in fact a good question; I believe
> there is a little pyramid scheme to it all. I mean, Facebook is fun
> and all, but it isn't THAT fun; I think the effective inviting
> mechanism plays a part.
>
> This is the point where we are now. Now for my own predictions:
>
> Next we will see the advent of the Single sign-on web, most likely
> embodied in the form of OpenID. (SSO means you don't have to create new
> logins for every site, you just use one main identity and password to
> log in to each site. Obviously the sites you log in to don't get to
> know your password, they just accept the referral from your ISP, mail
> provider, or other OpenID provider you are using.) This imho will add
> further granularity to the web, in that users can come and go more
> fluidly than today, where you make a choice to register and join
> FaceBook but not something else. This in turn should foster a
> development where we can again have smaller sites providing one small
> funny little piece of the social web, instead of the monolithic
> FaceBooks of today. This would be in line with what Web2.0 was all
> about, Facebook et al are in fact a countertrend to the Web2.0 trend
> if seen in this light.
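The single sign-on idea described above can be sketched in a few lines. This is a toy illustration of the trust relationship only, not the actual OpenID protocol (which involves discovery and association steps over HTTP); the identity URLs and the secret are invented:

```python
import hashlib
import hmac

# Toy sketch of single sign-on: the identity provider signs an
# assertion that "this user is who they claim to be", and relying
# sites verify that signature instead of keeping their own password
# database. The shared secret below stands in for the association
# a relying site would establish with the provider.
PROVIDER_SECRET = b"shared-with-relying-site"  # invented for the sketch

def provider_sign(identity_url):
    """The provider vouches for an identity URL after the user has
    logged in there with their one main password."""
    sig = hmac.new(PROVIDER_SECRET, identity_url.encode(),
                   hashlib.sha256).hexdigest()
    return {"identity": identity_url, "sig": sig}

def relying_site_accepts(assertion):
    """A site logs the user in if the provider's signature checks
    out; it never sees the user's password."""
    expected = hmac.new(PROVIDER_SECRET, assertion["identity"].encode(),
                        hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, assertion["sig"])

token = provider_sign("http://example.org/users/athina")
print(relying_site_accepts(token))
```

The point of the design is visible in the code: the relying site holds no credentials, so users can come and go between small sites as fluidly as the paragraph above suggests.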
>
> Whether a "decentralised social web" will arise from this is a good
> question, and whether the Global Giant Graph will emerge from that is
> an even better question. It might, but it might end up something
> entirely different. The GGG is technically possible today, and in how
> OpenID works there are some similarities to the RDF used in the GGG, so
> once OpenID becomes popular, the next step might be to not just
> externalise (or decentralise) your login credentials but also your
> social connections. But we will know the answer to this in something
> like 5 years.
>
> The proposal at the end for new HTTP commands is just pure folly (it
> is simply the wrong place to do it, period), which underlines that the
> author wasn't just slightly off with his Web2.0 comments, but in fact
> knows nothing at all about the technology he is talking about. To
> implement such functionality by extending HTTP would imho be quite
> silly, and in fact a peer-to-peer protocol like SIP would probably be
> a better starting point in the first place, and even then you wouldn't
> do it by commands like those, but you'd develop an XML based document
> language to transmit this kind of information.
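A rough sketch of that last alternative, with element names invented for illustration: the social information is encoded as an XML document that any existing transport (an HTTP POST, a SIP message) can carry, so no new protocol commands are needed:

```python
import xml.etree.ElementTree as ET

# Sketch of the "XML document language" approach: rather than adding
# a verb to HTTP, describe the social event in a document and send it
# over whatever protocol is already in place. The vocabulary here
# (social-update, subject, connection) is made up for the example.
def make_connection_doc(subject_url, friend_url):
    root = ET.Element("social-update")
    ET.SubElement(root, "subject").text = subject_url
    link = ET.SubElement(root, "connection", rel="friend")
    link.text = friend_url
    return ET.tostring(root, encoding="unicode")

doc = make_connection_doc("http://example.org/henrik",
                          "http://example.org/michel")
print(doc)
```

Any endpoint that can receive a document can then interpret it, which is why layering the semantics in a markup language is more natural than baking them into the transport protocol itself.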
>
>
> So, I guess it turned out to be a semi-good commentary after all. OTOH I
> think you stole my evening with this link so I'll have to do what I
> really was going to do tomorrow. Good night!
>
> henrik
>
> On Sat, Mar 29, 2008 at 8:29 AM, Michel Bauwens <michelsub2004 at gmail.com>
> wrote:
> > This is a great article to understand the technical evolution of the
> web, see
> >
> http://www.dur.ac.uk/j.r.c.geldart/essays/there_again/towards_the_active_web.html
> >
> >  Any comments about this to our blog would be most appreciated,
> >
> >  Michel
> >
> >  --
> >  The P2P Foundation researches, documents and promotes peer to peer
> alternatives.
> >
> >  Wiki and Encyclopedia, at http://p2pfoundation.net; Blog, at
> >  http://blog.p2pfoundation.net; Newsletter, at
> >  http://integralvisioning.org/index.php?topic=p2p
> >
> >  Basic essay at http://www.ctheory.net/articles.aspx?id=499; interview
> >  at
> http://poynder.blogspot.com/2006/09/p2p-very-core-of-world-to-come.html
> >  BEST VIDEO ON P2P:
> >  http://video.google.com.au/videoplay?docid=4549818267592301968&hl=en-AU
> >
> >  KEEP UP TO DATE through our Delicious tags at
> http://del.icio.us/mbauwens
> >
> >  The work of the P2P Foundation is supported by SHIFTN,
> http://www.shiftn.com/
> >
> >  _______________________________________________
> >  p2presearch mailing list
> >  p2presearch at listcultures.org
> >  http://listcultures.org/mailman/listinfo/p2presearch_listcultures.org
> >
>
>
>
> --
> email: henrik.ingo at avoinelama.fi
> tel: +358-40-5697354
> www: www.avoinelama.fi/~hingo <http://www.avoinelama.fi/%7Ehingo>
> book: www.openlife.cc
>
> _______________________________________________
> p2presearch mailing list
> p2presearch at listcultures.org
> http://listcultures.org/mailman/listinfo/p2presearch_listcultures.org
>



-- 
Dr Athina Karatzogianni
Lecturer in Media, Culture and Society
The University of Hull
United Kingdom
HU6 7RX
phone: +44 (0) 1482 46 5790

Check out Athina's work:

http://www.amazon.co.uk/Cyberconflict-Routledge-Research-Information-Technology/dp/0415396840/

http://www.amazon.co.uk/Power-Resistance-Conflict-Contemporary-World/dp/0415452988/

http://www.amazon.co.uk/Cyber-conflict-Politics-Contemporary-Security-Studies/dp/0415459702/

http://vectors.usc.edu/thoughtmesh/publish/135.php

Press interviews:

France:http://www.lemonde.fr/web/article/0,1-0,36-924253,0.html
http://www.20minutes.fr/article/180599/Monde-La-Chine-a-soif-d-informations.php
Greece:http://www.enet.gr/online/online_text/c=112,id=78490200
Brazil:
http://jbonline.terra.com.br/editorias/internacional/papel/2007/11/04/internacional20071104008.html
Poland:http://hacking.pl/6789-Czas_cyberwojen.html

