[p2p-research] Google chief: Vast changes to Web loom

Paul D. Fernhout pdfernhout at kurtz-fernhout.com
Tue Oct 27 19:47:44 CET 2009


Pamela McLean wrote:
 > 2009/10/25 Paul D. Fernhout <pdfernhout at kurtz-fernhout.com>
 > From:
 >http://www.hollywoodreporter.com/hr/content_display/technology/news/e3ib38cebd2d45985dd25cc8166a405e123
 > """
 > (snip) Schmidt said.
 > "You will tend to listen to other people," he said.
 >  The problem is how to organize all the information, he said. It is the
 > fundamental problem facing Google, a company that offers many products but
 > was built on a Web search engine that trolls for information, gathers it
 > and ranks it for users. Schmidt asked rhetorically how, for instance,
 > Google might be able to rank a user's individual tweets. ...
 > """

> I'm responding to the part of Paul's email about more effective ways of
> automatically organising information. Reading the email I had several
> thoughts. They connect in my mind - but it would take a while to justify how
> they connect - however they may strike a chord with someone - and if
> so I would be interested in further discussion.

I was waiting to see if someone else replied on this...

A few comments.

> There is so much information now flooding around us all - which gives us new
> problems of organising it. Some of us  have to do it for ourselves - some
> have automatic systems of one kind and another - some of us have trusted
> friends or colleagues who act as information filters and pointers for us.
> Many of us help each other to a lesser or greater degree - pushing, pulling,
> parking, packing and personalising information for the people we connect
> with.
 >
> We are moving to a new kind of "shared knowing" - much of our memory is
> external and sharable; much of what we don't yet know but might soon know is
> easily accessible; much of our thinking is becoming more external and
> sharable......
> 
> The internet isn't just about the technology and the information - it's also
> about people and roles and relationships and how the internet is impacting
> on them.

Agreed.

The term "agents" (for some sort of artificial intelligence) was a big 
buzzword on this for a time around 2000. Example:
   "What are Information Agents?"
   http://www.dbgroup.unimo.it/IIA/briefintroduction.html
General Magic talked a lot about this.

But as you point out, there are important social processes in all this.

So, how all these approaches interoperate, the human and the artificial, is 
another issue too (artificial integration, which is still at the research 
stage; artificial search, like Google; and artificial memory, like Wikipedia).

It seems there has long been a tension in research funding between those who 
support human replacement by AI and those who support individual (or group) 
augmentation by computers.

Much of the earlier work in AI was about replacing people. Doug Engelbart 
was one of the few earlier pioneers in augmenting individuals and groups.
   http://www.ibiblio.org/pioneers/englebart.html
"Douglas Engelbart has always been ahead of his time, having ideas that 
seemed far-fetched at the time but later were taken for granted. For 
instance, as far back as the 1960s he was touting the use of computers for 
online conferencing and collaboration. Engelbart's most famous invention is 
the computer mouse, also developed in the 1960s, but not used commercially 
until the 1980s. Like Vannevar Bush and J.C.R. Licklider, Engelbart wanted 
to use technology to augment human intellect. He saw technology, especially 
computers, as the answers to the problem of dealing with the ever more 
complex modern world and has dedicated his life to the pursuit of developing 
technology to augment human intellect."

It seems there is still that tension even now.

Compare, for example, those who want to make a "friendly AI" to be our next 
"god" :-)
   http://en.wikipedia.org/wiki/Friendly_artificial_intelligence
versus those working toward community-based approaches to sensing and 
security. :-)
   http://www.opensourcesensing.org/
"A long and expensive battle is looming between those using sensors to 
collect data and those whose data is being collected. But this conflict can 
be reduced and agreement accelerated. We can use open source-style processes 
to develop sensor and data handling standards that take into account both 
the right to privacy and the right (or perceived need) to sense."

I can see that theme in sci-fi too, now that I look for it, even in things 
going back to the 1950s. Theodore Sturgeon, with his "The Skills of Xanadu" 
and "To Marry Medusa", explored that theme for a time.
   http://p2pfoundation.net/Skills_of_Xanadu

But how that will all work out in practice remains to be seen or 
incrementally invented.

> Social networking on the Internet is not just about networking at a distance
> with people we have never met Face to Face (F2F), it is also about
> continuing to network with people we initially knew F2F and might otherwise
> lose contact with, equally it is about meeting people F2F who we have
> initially networked with online.

Great point. Clay Shirky talks about something like this in "Here Comes 
Everybody", where he says that, with things like meetup.com, the virtual and 
the physical are coming together. Same with conferences.

Also, FabLabs, RepRap, the "maker" scene, Instructables, 3D printing, open 
manufacturing, and so on, are another way that the digital and the physical 
are converging.

> The Internet impacts on individual relationships to information, and group
> relationships to information, and people's relationships to information,
> and to knowledge, and to each other. All kinds of subtle realignments are
> happening. The present is unsure of itself and there is such a fluidity
> about the future...  Issues about automatically organising information are
> as much about the context in which the information will be used as about the
> information itself - hmm - we face some intriguing challenges when it comes
> to organising information..

Yes, that is a great way to describe the bigger picture.

And I agree, context is a good way to think about it, including levels of 
information.

I also like William Kent's book "Data and Reality" about some related issues 
(from a more technical and abstract perspective).
   http://www.bkent.net/Doc/darxrp.htm
As he says, quoting others, "the map is not the territory".

 From one comment there on the book: "I remember my first exposure to the 
work of Edward Tufte. The richness of detail that could be presented simply 
was almost a physical shock. Were it not for Bill Kent I might have 
forgotten that the data represented by that richness was only a 
representation of reality, and not the reality itself. In a world which 
reinvents the Perfect Semantic Representation Language to End All Semantic 
Representation Languages every ten years or so, it is a pleasure to have 
Bill's calming influence in print in the form of Data and Reality.  -Richard 
Mark Soley, Ph.D., Chairman and CEO, Object Management Group, Inc. (1999)"

Still, with computers, sometimes the map may be the territory. It depends, 
as you say, on context. :-)

But, one thing to muse on is that we do build our social and political rules 
into our infrastructure eventually. Clay Shirky talks about that a little here:
   http://www.shirky.com/writings/group_enemy.html

Langdon Winner talks about that a bit too in various books.

For myself, I wonder: what rules have we built into our infrastructure? And, 
if we change those rules, how would the infrastructure change?

What rules is, say, Google building into our society, if any?

Broadly, I see this as about scarcity vs. abundance, but there are all sorts 
of other rules about things like levels of privacy, types of security, 
issues of concentration of power, and so on, that one could think about, 
especially in a peer-to-peer context.

What are our assumptions? What are our values? How do those connect with our 
social organization and supporting infrastructure?

--Paul Fernhout
http://www.pdfernhout.net/




