Since the early 1990s, and especially since the widespread deployment of broadband to the home, we have seen remarkable progress in the ease with which end users can access the Internet. Both the commercial and private sectors now rely heavily on the availability of the Internet to conduct normal day-to-day functions. Underpinning this exponential growth in the Internet's popularity are the advances made in applying basic algorithms to design and architect the Internet. The most obvious example is the use of search engines to collect and correlate the vast amounts of information spread throughout the Internet.
With the dawn of this new century, we are on the verge of expanding the very notion of what it means to communicate. A new generation of netizens is poised to leverage the Internet for a myriad of applications that we have not yet envisioned. This will require the Internet to be flexible and to adapt to the requirements of next-generation applications. To address this challenge, the National Science Foundation in the United States has initiated a large research project, GENI (Global Environment for Network Innovations). The goal of GENI is to perform a clean-slate design of a new Internet. In particular, the project aims to rethink the basic design assumptions on which the current Internet is built; in the quest for greater flexibility for new services, this may lead to a radically different Internet, beyond what one might imagine from simply evolving the current network. Given this context of Internet research, the purpose of this book is to provide a comprehensive survey of the algorithms and methodologies presently used in the design and deployment of the Internet. We believe that a thorough understanding of the algorithms used by the Internet today is critical to developing the new algorithms that will form the basis of the future Internet.
The book is divided into three parts dealing with the application of algorithms to different aspects of network design, network operations, and next-generation applications. Part 1 provides an algorithmic basis for the design of networks at both the physical and the service layers; it is extensive because it considers several different physical-layer network technologies. Part 2 covers two important topics in network operations and management. Network providers have already completed field trials of 100 Gbps networks, and it should not be long before such capacities become commonplace on the Internet. The challenge of processing packets at these speeds places tremendous importance on efficient and fast packet-processing algorithms. Part 3 discusses algorithmic techniques that form the basis of emerging applications.
In this book we have attempted to convey how algorithms have formed the basis of the Internet as we know it today. It is our hope that the book will provide a useful overview of algorithms applied to communication networks for any student who aspires to do research in network architecture or in the application of algorithms to communication networks. We believe that a robust design of the future Internet must be founded on sound algorithmic principles.