Towards an auction-based internet
January 13, 2012
Are we headed toward an auction-based internet? The current state of technology, and the trends shaping it, seem to point to such a possibility.
An auction-based internet would be a business model in which bandwidth is allocated to different data traffic on the internet through dynamic bidding by different network elements. Such an eventuality is a distinct possibility given the economics and latencies involved in data transfer, the evolution of the smart grid concept, and the emergence of a promising technology known as the OpenFlow protocol.
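To make the idea of dynamic bidding concrete, here is a minimal sketch of how a network element might run a sealed-bid, second-price (Vickrey) auction for a slice of link bandwidth. The flow names, bid values, and the choice of auction format are all illustrative assumptions, not anything specified by the article.

```python
# Hypothetical second-price auction for a bandwidth slice: the highest
# bidder wins, but pays the second-highest bid. Flow names and bids are
# invented for illustration.

def allocate_bandwidth(bids):
    """bids: dict mapping flow name -> bid (price per Mbps).
    Returns (winner, price) under the second-price rule."""
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner = ranked[0][0]
    price = ranked[1][1] if len(ranked) > 1 else ranked[0][1]
    return winner, price

winner, price = allocate_bandwidth({"video": 0.9, "voip": 0.7, "bulk": 0.2})
# winner == "video", price == 0.7 (the second-highest bid)
```

The second-price rule is one plausible choice because it encourages bidders to bid their true valuation; an actual auction-based internet could of course use any number of pricing mechanisms.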
In the book “Grids, Clouds and Virtualization”, Massimo Cafaro and Giovanni Aloisio highlight a typical problem of today's computing infrastructure. The authors contend that a key issue in large-scale computing is data affinity, which is the result of the dual issues of data latency and the economics of data transfer.
They quote Jim Gray (Turing Award, 1998), whose paper “Distributed Computing Economics” states that programs need to be migrated to the data on which they operate, rather than transferring large amounts of data to the programs. This is in fact what the Hadoop paradigm does, where the principle of locality is maintained by keeping programs close to the data on which they operate.
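The locality principle above can be sketched in a few lines: a scheduler prefers to place a task on a node that already holds a replica of the task's input block, falling back to a remote read only when no such node is free. The node and block names are hypothetical, and this is a simplification of what a real Hadoop scheduler does.

```python
# Minimal sketch of Hadoop-style data locality: move the program to the
# data when possible. Node names are invented for illustration.

def pick_node(block_replicas, free_nodes):
    """block_replicas: set of nodes holding the input block.
    free_nodes: list of nodes with a free task slot.
    Returns a node-local choice if one exists, else any free node
    (which forces a remote read of the block)."""
    local = [n for n in free_nodes if n in block_replicas]
    return local[0] if local else free_nodes[0]

# Block replicated on nodes A and C; nodes B and C have free slots:
# the scheduler picks C, so the program moves to the data.
print(pick_node({"A", "C"}, ["B", "C"]))  # C
# No free node holds the block: fall back to B, and the data must move.
print(pick_node({"A"}, ["B", "C"]))       # B
```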
The book highlights another interesting fact: the cheapest and fastest way to move a terabyte cross-country is “sneakernet”, the transfer of electronic information by physically carrying removable media (magnetic tape, compact discs, DVDs, USB flash drives, or external drives) from one computer to another. Google used sneakernet to transfer 120 TB of data, and SETI@home also used sneakernet to move data recorded by the telescope in Arecibo, Puerto Rico, stored on magnetic tapes, to Berkeley, California.
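A quick back-of-the-envelope calculation shows why sneakernet wins for the 120 TB figure quoted above. The link speeds and the one-day shipping time are illustrative assumptions, not figures from the book.

```python
# How long would 120 TB take over a network link, versus shipping the
# media overnight? Link speeds and shipping time are assumptions.

TB = 10**12  # bytes (decimal terabyte)

def transfer_days(num_bytes, bits_per_sec):
    """Days needed to push num_bytes through a link of the given speed."""
    return num_bytes * 8 / bits_per_sec / 86400

data = 120 * TB
print(f"Over 100 Mbit/s: {transfer_days(data, 100e6):.0f} days")   # ~111 days
print(f"Over 1 Gbit/s:   {transfer_days(data, 1e9):.1f} days")     # ~11 days
# Overnight shipping (~1 day in transit) is an effective throughput of:
print(f"Sneakernet:      {data * 8 / 86400 / 1e9:.1f} Gbit/s")     # ~11 Gbit/s
```

Even at a full gigabit per second, the network takes over a week; a box of tapes in transit for a day sustains an effective throughput no 2012-era WAN link could match.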
It is now a well-known fact that mobile and fixed-line data traffic has exploded, clogging the internet. YouTube, video downloads and other streaming data choke the internet's data pipes, and service providers have not found a good way to monetize this data explosion. While there has been tremendous advancement in CPU processing power (now in the range of petaflops) and enormous increases in storage capacity (of the order of petabytes), coupled with dropping prices, there has been no corresponding drop in bandwidth prices relative to bandwidth capacity.