Technological hurdles: 2012 and beyond

11 Jan 2012

You must have heard it all by now – the technology trends for 2012 and beyond. The predictions range across big data, cloud computing, the internet of things, the semantic web, social commerce and so on.

This post focuses on what seem to be the significant hurdles as we advance into the future. The positive trends are bound to continue, and in our exuberance we may lose sight of the obstacles before us. Besides, “problems are usually opportunities in disguise”. So here is my list of the top issues facing the industry now.

Bandwidth shortage: A key issue in today's computing infrastructure is data affinity, the result of the dual issues of data latency and the economics of data transfer. Jim Gray's paper on distributed computing economics states that programs need to be migrated to the data on which they operate, rather than transferring large amounts of data to the programs. In the paper, Gray tells us that the economics of today's computing depends on four factors, namely computation, networking, database storage and database access. He then sets out what $1 buys, approximately:

≈ 1 GB sent over the WAN
≈ 10 Tops (tera CPU operations)
≈ 8 hours of CPU time
≈ 1 GB of disk space
≈ 10 M database accesses
≈ 10 TB of disk bandwidth
≈ 10 TB of LAN bandwidth

As can be seen from the above breakdown, WAN bandwidth contributes disproportionately to cost in comparison with the other factors. In other words, while the processing power of CPUs and the capacities of storage have multiplied, accompanied by dropping prices, the cost of bandwidth has stayed high. Moreover, the available bandwidth is insufficient to handle the explosion of data traffic.

In fact, Gray found that the “cheapest and fastest way to move a Terabyte cross country is sneakernet” – the transfer of electronic information by physically carrying removable media such as magnetic tape, compact discs, DVDs, USB flash drives or external drives from one computer to another.
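
To make Gray's trade-off concrete, here is a minimal back-of-the-envelope sketch in Python. Only the $1-per-GB WAN figure comes from the list above; the courier cost, shipping time and sustained WAN throughput are illustrative assumptions of mine, not numbers from the paper.

    # Back-of-the-envelope: WAN transfer vs. "sneakernet" for bulk data.
    # Only the $1/GB WAN figure is Gray's; the rest are assumptions.
    WAN_COST_PER_GB = 1.00   # dollars per GB over the WAN (Gray)
    WAN_MBPS = 10            # assumed sustained WAN throughput, megabits/s
    COURIER_COST = 50.00     # assumed cost to ship a disk, dollars
    COURIER_DAYS = 2.0       # assumed door-to-door shipping time

    def wan_days(gigabytes):
        """Days needed to push the data over the WAN at the assumed rate."""
        seconds = gigabytes * 8_000 / WAN_MBPS   # 1 GB = 8,000 megabits
        return seconds / 86_400                  # 86,400 seconds per day

    for gb in (10, 100, 1_000):                  # 1,000 GB = 1 TB
        print(f"{gb:>5} GB: WAN ${gb * WAN_COST_PER_GB:>6.0f} in "
              f"{wan_days(gb):4.1f} days  vs  courier ${COURIER_COST:.0f} in "
              f"{COURIER_DAYS:.0f} days")

On these assumptions the courier wins decisively at the terabyte scale – roughly $50 and two days against $1,000 and more than nine days over the WAN – which is precisely Gray's point.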

With the burgeoning of bandwidth-hungry applications, it is obvious that we are going to face a bandwidth shortage. The industry will have to come up with innovative solutions to provide what I would like to refer to as “bandwidth-on-demand”.

The Spectrum Crunch: Powerful smartphones, extremely fast networks, content-rich applications and increasing user awareness have together resulted in a virtual explosion of mobile broadband data usage. There are two key drivers behind this phenomenal growth in mobile data. One is the explosion of devices – smartphones, tablet PCs, e-readers and laptops with wireless access – all of which deliver high-speed content and web browsing on the move. The second is video: over 30% of overall mobile data traffic is video streaming, which is extremely bandwidth-hungry. The remaining traffic is web browsing, file downloads and email.

The growth in mobile data traffic has been exponential. According to a report by Ericsson, mobile data is expected to double annually through to 2015. Mobile broadband reached 1 billion subscribers in 2011 and could touch 5 billion by 2015.

According to a report by IDATE, a consulting firm, total mobile data traffic will exceed 127 exabytes (an exabyte is 10^18 bytes, or 1 million terabytes) by 2020, a more than 33-fold increase over 2010.
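
As a quick sanity check on what “doubling annually” implies, here is a short sketch in Python; the 2011 baseline of 0.6 exabytes per month is an illustrative assumption, not a figure from either report.

    # Compound a "doubles annually" trend from an assumed 2011 baseline.
    BASELINE_EB_PER_MONTH = 0.6      # assumed 2011 mobile traffic, EB/month
    for year in range(2011, 2016):
        traffic = BASELINE_EB_PER_MONTH * 2 ** (year - 2011)
        print(f"{year}: ~{traffic:.1f} EB/month")

Four doublings make a sixteen-fold increase by 2015, whatever the starting point – growth that no incremental capacity upgrade can absorb for long.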

Given the current usage trends, coupled with the theoretical limits of the available spectrum, the world will run out of spectrum for the growing army of mobile users. The current spectrum availability cannot support the surge in mobile data traffic indefinitely; demand for wireless capacity will outstrip spectrum availability by the middle of this decade, possibly as early as 2014.

This is a really serious problem. In fact, it is serious enough for the White House to have issued a memo titled “Unleashing the Wireless Broadband Revolution”. The US Federal Communications Commission (FCC) has since taken steps to meet the demand by letting wireless users access content via unused airwaves in the broadcast spectrum, known as “white spaces”. Google and Microsoft are already working on this technology, which will allow laptops, smartphones and other wireless devices to transfer data in gigabytes instead of megabytes over Wi-Fi-like links.

But the spectrum shortage is an immediate problem, and one that needs to be addressed now.

IPv4 exhaustion: IPv4 address space exhaustion has been looming for quite some time and warrants serious attention in the not-too-distant future. This problem may be even more serious than the Y2K problem. The issue is that IPv4 can address only 2^32, or about 4.3 billion, devices. The pool has already been exhausted, thanks to new technologies like IMS, which uses an all-IP core, plus the internet of things, with ever more devices and sensors connected to the internet – each identified by an IP address. The solution is known: the Internet must adopt the IPv6 addressing scheme. IPv6 uses 128-bit addresses, allowing 3.4 × 10^38, or 340 trillion, trillion, trillion, unique addresses. However, the conversion to IPv6 is not happening at the required pace, and it will soon have to be adopted on a war footing. It is clear that while the transition takes place, IPv4 and IPv6 will co-exist, so devices on the internet will additionally need to be able to translate between the two.
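
The arithmetic behind those address counts, along with one standard coexistence mechanism (IPv4-mapped IPv6 addresses), can be sketched with Python's standard ipaddress module; the address used below is a documentation address chosen purely for illustration.

    import ipaddress

    # The sizes of the two address spaces.
    print(f"IPv4: 2**32  = {2**32:,} addresses")     # ~4.3 billion
    print(f"IPv6: 2**128 = {2**128:.1e} addresses")  # ~3.4 x 10^38

    # During the transition, an IPv4 host can be embedded in the IPv6
    # space as an IPv4-mapped address of the form ::ffff:a.b.c.d.
    v4 = ipaddress.IPv4Address("192.0.2.1")
    mapped = ipaddress.IPv6Address(f"::ffff:{v4}")
    print(mapped)               # ::ffff:c000:201
    print(mapped.ipv4_mapped)   # recovers 192.0.2.1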

We are bound to run into a wall if organizations and enterprises do not upgrade their devices to be able to handle IPv6.

These are some of the technological obstacles that confront the computing industry. Given mankind’s ability to come up with innovative solutions, we may find new industries being spawned to solve these bottlenecks.

Tinniam V Ganesh is an infrastructure architect at IBM India, Global Technology Services. You can write to him at tvganesh.85@gmail.com and read his blog at http://gigadom.wordpress.com.
