Archive for the ‘Technology’ Category

Looking forward to Web 3.0

September 7, 2007

Web 2.0 companies like Digg and Facebook are all the rage, but The New York Times looks beyond the current trends and muses on the next direction the Web will take. While Web 2.0 used human intelligence to bring structure and meaning to the mountains of data generated by Web 1.0, Web 3.0, according to the Grey Lady, will use artificial intelligence to answer complex questions in a way that resembles a human response. It will do this by integrating large amounts of common-sense data with the formal data held in databases.

For example, a Web 1.0 search for a sushi restaurant would be a Yahoo web search for “sushi +restaurant +”san jose””. A Web 2.0 search would involve going to Yelp, entering your zip code, selecting the categories “restaurants” and “sushi”, and then viewing a list of restaurants rated by other Yelp users. A Web 3.0 search would let you ask for “a good sushi restaurant near me that is open now and takes reservations”. Web 3.0 will know where “here” is and when “now” is. It will be able to predict which sushi restaurant you think is “good” by examining your rating data from previous outings. It will be able to integrate many different types of data, so that it could get the restaurant’s hours and directions from one site, its reviews from another, and make reservations using yet another. By using intelligence to make sense of the user-generated data of Web 2.0, the next incarnation of the Web will be able to answer much more complex questions, and in doing so will become a vastly more powerful tool than today’s Web.
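To make the contrast concrete, here is a minimal Python sketch of the kind of reasoning a Web 3.0 agent might do behind the scenes. Everything in it is invented for illustration: the restaurant data, the personal rating history, and the assumption that “good” means a past rating of 4.0 or better.

```python
from datetime import datetime

# Hypothetical data; a real Web 3.0 agent would pull each piece from a
# different service (hours from one site, reviews from another) and merge them.
RESTAURANTS = [
    {"name": "Sushi Maru", "hours": (11, 22), "takes_reservations": True},
    {"name": "Tokyo Bay",  "hours": (17, 21), "takes_reservations": False},
]
MY_PAST_RATINGS = {"Sushi Maru": 4.5, "Tokyo Bay": 3.0}  # "good" is personal

def good_open_reservable(now=None):
    """Answer: 'a good sushi restaurant near me, open now, takes reservations'."""
    hour = (now or datetime.now()).hour
    candidates = [
        r for r in RESTAURANTS
        if r["hours"][0] <= hour < r["hours"][1]       # "open now"
        and r["takes_reservations"]                    # "takes reservations"
        and MY_PAST_RATINGS.get(r["name"], 0) >= 4.0   # "good", per my history
    ]
    # Best-liked restaurant first.
    return sorted(candidates, key=lambda r: -MY_PAST_RATINGS[r["name"]])

if __name__ == "__main__":
    for r in good_open_reservable():
        print(r["name"])
```

The point is not the code itself but where the data comes from: each condition in the filter would be answered by a different site, stitched together without the user doing any of the stitching.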

A New Networking Paradigm

May 21, 2007

Here is a tech talk that Van Jacobson, a research fellow at PARC, gave at Google, entitled “A New Way to Look at Networking.” The main thrust of his idea is that to solve the major challenges the Internet faces today (security, spam, the Slashdot effect, and the difficulty of implementing ubiquitous computing) we need a different model for thinking about and designing Internet applications. Before the days of the Internet, the telephone system was built around point-to-point connections between callers. This was relatively unreliable and did not scale well: enormous resources were spent designing ever more complicated and robust equipment for the phone system, because if any one piece failed, the entire connection failed. The Internet changed this completely. Rather than making point-to-point connections, TCP/IP abstracted away the individual links and instead used routing algorithms and a robust protocol to make sure all the data got from sender to receiver, without worrying about what happens underneath. Similarly, Jacobson suggests that we develop new protocols that stand above the connection layer, because right now we spend a lot of time and effort worrying about where our data comes from, whether connections are secure, and whether the sender is reliable. By focusing the network on the data itself, rather than on the connections, we can greatly increase the power and flexibility of the Internet.
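A toy way to see the shift from connections to data is content addressing: if a piece of data is named by a hash of its contents, it no longer matters which machine supplies it, because the receiver can check the data against its name. The following Python sketch is only meant to convey the flavor of that idea, not Jacobson’s actual protocol design; the in-memory store stands in for caches scattered across the network.

```python
import hashlib

STORE = {}  # name (hash) -> bytes; stands in for caches all over the network

def publish(data: bytes) -> str:
    """Name a piece of content by the SHA-256 digest of its bytes and store it."""
    name = hashlib.sha256(data).hexdigest()
    STORE[name] = data
    return name

def fetch(name: str) -> bytes:
    """Retrieve content by name; the source is irrelevant because the
    receiver verifies the bytes against the name itself."""
    data = STORE[name]
    if hashlib.sha256(data).hexdigest() != name:
        raise ValueError("content does not match its name")
    return data

name = publish(b"a new way to look at networking")
assert fetch(name) == b"a new way to look at networking"
```

Notice that nothing in fetch() asks who sent the data or whether the connection was trustworthy; trust attaches to the data, which is exactly the inversion Jacobson is arguing for.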

One example of this new view is the BitTorrent file distribution system. With BitTorrent, a user requests a specific file. The file itself is actually held in bits and pieces on the machines of hundreds of different users, but the user does not need to know where all the parts are or how to download them; all of that work is done by the protocol. The user just needs to say what he wants to download, and the network takes care of the rest.
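Here is a rough Python sketch of that idea: a file is split into hashed pieces, scattered across imaginary peers, and reassembled by matching pieces to their hashes rather than by asking any particular machine. The swarm, the peers, and the tiny piece size are all invented for illustration; real BitTorrent is far more involved.

```python
import hashlib

PIECE_SIZE = 4  # absurdly small, just for the demo

def split(data: bytes):
    """Break a file into fixed-size pieces, as a .torrent's metadata does."""
    return [data[i:i + PIECE_SIZE] for i in range(0, len(data), PIECE_SIZE)]

original = b"hello, distributed world"
pieces = split(original)
piece_hashes = [hashlib.sha1(p).hexdigest() for p in pieces]  # from metadata

# An imaginary swarm: each "peer" holds only some pieces of the file.
peers = [
    {0: pieces[0], 3: pieces[3]},
    {1: pieces[1], 4: pieces[4]},
    {2: pieces[2], 5: pieces[5]},
]

def download(piece_hashes, peers):
    """Gather pieces from whichever peer happens to have them, verifying
    each piece against its known hash before accepting it."""
    assembled = [None] * len(piece_hashes)
    for peer in peers:
        for index, piece in peer.items():
            if hashlib.sha1(piece).hexdigest() == piece_hashes[index]:
                assembled[index] = piece
    return b"".join(assembled)

assert download(piece_hashes, peers) == original
```

The downloader never names a source machine; it names the data it wants, and any peer that can supply a verifiable piece will do.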

This new paradigm points to a near future in which the Internet will be very different from the one we know now. In particular, it points to light at the end of the tunnel for solving several long-standing problems of security, data integrity, and ubiquity.

More information on content centric networking can be found here. Also related are the ideas of ubiquitous computing and mesh networking.