The promise of big data is profound: massive datasets that allow segmentation down to the individual. Big data lets users track their fitness levels on a mobile phone app - and compare results with thousands of other Netizens. Businesses can focus their efforts on their best customers with tailor-made loyalty programs. Built-in GPS trackers show where users actually go and how much time they spend there.
Wait a minute...that last bit seems a bit intrusive, doesn’t it? Yes, GPS tracking is a useful feature if you get lost while hiking. But does that mean you want to be tracked as you walk past retail establishments that sniff out your profile?
We’ve known since the early days of SMS marketing that there’s a fine line between retail enticement and retail harassment. But when it comes to an individual’s health records, the stakes are higher. Yes, someone with a medical condition could benefit from a mobile app that communicates with a body sensor. But do you want that personal data stored on your handset? If so, who gets access? You, your doctor, your boss, your insurance company? What happens when that data is leaked or stolen? And who is accountable?
Big Data Insights April 2016
These are the questions that telcos - and indeed everyone in the big-data value chain - are facing. The answers are hard, and they’re only going to get harder from here on out.
Security expert Bruce Schneier made a similar point in a March 2016 blog post titled “Data Is a Toxic Asset”. “Retailers save our purchasing habits. Cell phone companies and app providers save our location information,” Schneier observed. “Telecommunications providers, social networks, and many other types of companies save information about who we talk to and share things with. Data brokers save everything about us they can get their hands on. This data is saved and analyzed, bought and sold, and used for marketing and other persuasive purposes.”
Schneier points out that while 2015 was a big year for large-scale data theft, that in itself isn’t unusual - such thefts are practically a weekly occurrence, carried out for a variety of reasons, from fraud to embarrassing and/or blackmailing the company being hacked (last year’s Ashley Madison hack being a famous example).
More to the point, 2015 was the year that businesses started to realize that big data is also a “toxic asset” - it’s dangerous to save because many players want access to it (marketers, hackers, government agencies, etc), it’s hard to secure, and the consequences of failing to secure it can be severe in terms of damage to a company’s share price, reputation and profits (not to mention the expense of fighting lawsuits).
The same week Schneier’s essay was posted (as if to put an exclamation point on it), US cancer clinic 21st Century Oncology admitted that its system had been breached, exposing private information on 2.2 million patients and employees, including names, social security numbers, diagnosis and treatment details and insurance information.
Telcos are by no means exempt from or immune to all this. UK-based telecoms company TalkTalk found that out the hard way. It has been hacked three times in the past year, and admitted in February that one 2015 data breach enabled criminals to use stolen customer information to request bank details, which were then used to steal cash from customers’ accounts. TalkTalk has reportedly lost over 100,000 customers as a result.
Stories like this will become more commonplace as the value of big data increases, unless everyone - including telcos - takes the security and privacy of big data seriously, if only because their customers will demand it. After all, it’s their personal information being stolen - data they entrusted companies to protect. And breaching that trust is unacceptable.