ERCIM News No.40 - January 2000


The Web, Europe, and ERCIM

The ERCIM jubilee events ended with a video-recorded statement by Tim Berners-Lee, director of the World Wide Web Consortium (W3C), followed by a discussion led by Dick Bulterman (Oratrix Development BV, The Netherlands). The following is a summary of the points made during the session.

Photos: Dick Bulterman (Oratrix, The Netherlands); Tim Berners-Lee, Director of the World Wide Web Consortium (W3C), during the video-recorded statement.

The New York Times recently described the Web as a uniquely American phenomenon, although it was invented at CERN in Geneva, Switzerland. Why and how did Europe let the Web get away? Looking back may seem unproductive, but one also learns a thing or two. There are several reasons. The Internet was much more widely deployed in the US, where many universities were connected, so the Web spread naturally across the country; moreover, local telephone calls are free in the USA, which lowered the barrier to acceptance. The fact that the first user-friendly browser was American certainly helped. Europe initially had a strong ‘not invented here’ feeling towards the Internet: it wanted to wait for the ISO solutions (which never really came), when in fact the Internet was running fine with IP. Indeed CWI, the first non-military Internet site in Europe, had great trouble selling the Internet; only when the Web came along was the Internet politically accepted in Europe. Another reason for the much faster spread of the Web in the US was the American entrepreneurial spirit (there are many start-ups) of ‘do first and discuss afterwards’, against the more considered European way. And, of course, there is a linguistic reason: monolinguality helped the Web spread faster in the US, and there was an incentive to put up a website because English is understood around the globe.

While, in contrast with the monolithic American culture, European diversity may be an impediment to certain developments, it also offers unique opportunities, for example in using the multilingual possibilities of the Web: before long, native English speakers will no longer form the majority on the Web. What matters most on the Web is content, not technology, and Europe possesses a richness and diversity of content. A much richer set of ideas is also likely to emerge when boundaries allow people to think independently. Europe can distinguish itself in several ways in the further development of the Web. There is, for example, a lot of know-how in graphics. Europe is strong in mobile communication and can play an important role in matters like intellectual property rights (IPR) and privacy. The rapidly growing bulk of legacy HTML in the US gives Europe (and ERCIM) the chance to take the lead in resource discovery over heterogeneous resources, by connecting its pre-existing databases to the Web and putting metadata in front of them (using the progressively improving XML and RDF).
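To make the metadata idea above concrete, a small sketch may help. Assuming a hypothetical record URI and a few Dublin Core fields (the resource name and values here are invented for illustration, not taken from the discussion), an RDF description placed ‘in front of’ a legacy database record might look like:

```xml
<?xml version="1.0"?>
<!-- Minimal RDF/XML sketch: machine-readable metadata describing one
     record of a pre-existing (non-Web) database, using Dublin Core. -->
<rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
         xmlns:dc="http://purl.org/dc/elements/1.1/">
  <!-- The rdf:about URI is a hypothetical identifier for the record. -->
  <rdf:Description rdf:about="http://example.org/archive/record42">
    <dc:title>Catalogue record from a legacy database</dc:title>
    <dc:language>nl</dc:language>
    <dc:subject>cultural heritage</dc:subject>
  </rdf:Description>
</rdf:RDF>
```

A resource-discovery service can harvest such descriptions across heterogeneous sources without needing access to the underlying databases themselves.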

ERCIM creates a balance between European diversity and necessary homogeneity by building bridges between different cultures and by letting technical ideas move freely within academia and across borders. Several ERCIM members also house a W3C office, which has helped connect the W3C Consortium into Europe. The future Web will be much more of a collaborative system (as it was always meant to be). ERCIM, being a collaborative body itself, can play an important role here, for example in creating a very consistent user interface. ERCIM may strengthen its collaborative force by using the Web as a tool, thus contributing to the more general task for Europe of building the new society on top of the Web. ERCIM could also play an honest-broker role with respect to the ‘big four’ companies (essentially all American). W3C is very heavily swayed by its members, several of whom have sectoral interests in the Web (databases, graphics, ...), whereas ERCIM jointly has a very broad interest in the many concepts on the Web and can thus bring integrity to it, as well as bring together people who are interested in the Web as a whole, rather than each in their own little niche.

New ideas are often born in an ivory tower, far away from tradition. On the other hand, the consumer requires new products to be incrementally compatible, so to realize a brilliant idea in the next round’s software and websites it is important to find a balance between idea and compatibility. In the US there is a broad feeling that European programmes like ESPRIT introduced a level of overhead that inhibited quick, focused entry into new areas. ESPRIT was worth its weight in gold in getting people who were not used to collaborating to do so. However, in introducing new technologies it is hardly possible to compete with commerce, which can often act non-bureaucratically and has fairly large budgets at its disposal (the budget of the Framework Programme is no more than 4% of the total research money in Europe). At the same time, American companies are extremely eager to participate in the European IST Programme, and in general there is a lot of interest in doing research in these programmes, as the ongoing over-subscription shows. In the past the creation of the W3C consortium in Europe also received support from ESPRIT. When entering a new area, consortium building and submitting a proposal take one year, followed by the project period, and only then does the standardization process start. For some things this works and for some it doesn’t. In any case, ideas and their time-to-market should be in balance. To be sure, work that needs to be done quickly is, also in the US, done not in governmental programmes but in companies. One has to put activities and their funding in the right place.

The research community certainly has a role in developing standards. In general much of this process is political, not technical: the very high costs of developing new software for economies-of-scale markets are only recouped if you hit the mainstream. Experience with the Web multimedia activity around the development of the language SMIL shows that companies initially show a broad willingness to accept outsiders into the process, but as soon as something becomes a recommendation, commercial interest prevails. Companies coming up with a new product see to it that it apparently conforms to standards, but a keen eye, for example in the form of more ERCIM involvement in the W3C process, is required here. To give technical arguments sufficient weight against company policies, it is very important that researchers come to standardization meetings with something that works (a product, pilot or demonstrator).
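As an illustration of the kind of specification at stake, a minimal SMIL 1.0 document can be sketched as follows (the media file names are invented for the example); it plays a video and an audio commentary in parallel:

```xml
<!-- Minimal SMIL 1.0 sketch: a layout region for video, and a <par>
     group that plays its children at the same time. -->
<smil>
  <head>
    <layout>
      <region id="vid" width="320" height="240"/>
    </layout>
  </head>
  <body>
    <par>
      <!-- src values are hypothetical example files -->
      <video src="talk.mpg" region="vid"/>
      <audio src="commentary.wav"/>
    </par>
  </body>
</smil>
```

Whether a player renders such a document as the recommendation intends is exactly the kind of conformance question a keen research eye can test with working demonstrators.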

W3C is concerned about intellectual property rights: will the Web remain available for everyone if some people take control, for example through patents, over protocols used on the Web? W3C is supposed to steer developments, but in fact a few giants like Microsoft and Netscape have ‘de facto’ control, because they have all the resources to control market access.
It is difficult to see what the future Web will be like. It did not start with someone saying ‘Let’s invent the Web’; it is a gradual and global process, which happens almost by chance. There was a very strong feeling from US industry that systems like this should evolve through combative market forces. However, dividing the Web market, with its multiple technologies, between only a few players will not work, as the consumer industry has shown, because of insufficient volume. This is relevant for the transition to the mobile Internet, which is also hampered by the dozen-odd coding formats used on the Web. (A variety of standards is not a problem in itself; not using them, or changing them, is.) An important driving force, completely changing the technology and its use on the Web, is the expected twenty-fold increase in bandwidth next year, leading to 1 Megabit for everybody in five years. Of course, everybody is now speaking about video and images, but the future use of the Web will probably be something very different. The bandwidth explosion will cause a shift away from the academic market. (There is some inverse similarity with the movie industry in the thirties: movies were then completely outside academic life.) Dealing with this social phenomenon is the next challenge.

In any case the Next Generation Web should enable collaborative work to be done immediately, with all participants synchronously attached to the Web; this may, for example, considerably reduce the development time of a prototype. Development is definitely going in this direction. Shared knowledge, which becomes information by association with activity, plays a crucial role in a knowledge-based economy. Paradoxically, such an economy runs on buying and selling information, thus opposing the motivation to share. In view of the debates raging over ownership of faculty members’ notes, it seems that issues like IPR play a larger role than technology in bringing about a knowledge-based economy.
