I’m considering taking up a new career as a fortune-teller. I may not have predicted the horrendous floods sweeping through Queensland, Australia, but my words on preparedness were certainly timely.
“Maybe it’s time to review the massive spend on marketing activities and re-allocate some monies to enterprise risk management, disaster recovery and business continuity exercises. How many telcos simply pay lip-service to those critically important sectors of their business, preferring to spend big money on customer acquisition and hope that nothing goes wrong with operations in the future?
“The years of cutbacks in IT and OSS may have helped them weather the economic downturn but may have left operators exposed through lower levels of surveillance and no plans for emergency response.
“The recent spate of unusual weather patterns and natural disasters should be reason enough for every service provider to review its contingency plans going into 2011.”
When experts tell you always to be ready for that once-in-a-hundred-years event, they know what they are talking about. The Queensland floods are such an event, and news reports emanating from the flood-stricken state capital, Brisbane, tell the tale of those that thought ahead and those that didn’t.
As you would expect, fixed-line service has been severely disrupted, and some mobile services reliant on mains power or linked by fixed line for backhaul have had issues. The hapless Vodafone network, already grappling with major service problems and a pending class action by up to 12,000 disgruntled customers, was the worst affected, with over 200 sites reportedly going down.
Telstra went to extreme measures to protect some of its assets including sandbagging and wrapping at least one threatened exchange in plastic wrap – brilliant!
It appears that the biggest problems were caused not by water directly but by power outages deliberately initiated by electricity utilities in many areas, both to avoid short circuits caused by the water and to eliminate the risk of live current in areas where people were wading through water to escape the floods.
Although many exchanges and base stations have batteries and diesel generators for backup, crews could not access them to replace the batteries or top up fuel tanks. There have also been added dangers not usually experienced by technicians: mud, leaking sewage, dead animals and even one pit that housed five deadly snakes wrapped around conduit!
The best news to come out was that, of eleven major data centers, only two had been affected, one by some water ingress and the other by forced power outages.
Most data centers were situated on high ground and had adequate backup power. The same could not be said for many of their corporate customers that had to close their offices in the Brisbane CBD because of the dangerous conditions, lack of public transport and power.
It has been a great example of good planning and management that cloud services operated through these data centers were not affected, even though access to them may have been.
The priority for all providers in the stricken areas has been to maintain service for emergency use. Many stories are emerging of lives being saved because people were able to use their mobile phones to get help, and the tracing of missing persons has been sped up dramatically using location-based tracking.
Those companies that did plan ahead, and proved to their customers and the nation as a whole that they could be relied upon in times of national disaster, have achieved the best outcome in terms of customer awareness and satisfaction they could ever have imagined, without spending one cent on marketing.