Those General Data Protection Regulation (GDPR) regulations are coming into effect soon, and there’s still a lot of uncertainty over what they’ll really mean. That means the time is now to apply some scrutiny to your whole relationship with data: not just personal customer details, but any sort of data that’s important to your business, and which would cause serious trouble if it were to leak or get lost.
Of course, that’s a big ask. With the specific demands of GDPR already taxing your resources, you might question whether you want to make the job any bigger than is strictly required. But the philosophy that underpins GDPR isn’t just about customer protections; it’s about information security in general, and this might be the first time you’ve had to take a proper look at your practices and processes through that lens. Indeed, you might be alarmed by what you find. Even if GDPR didn’t exist, it ought to be a priority to establish whether your data is actually secure, and what risks you face.
And, by the way, if you’re worried about the stringent demands of GDPR and data security in general, don’t feel too anxious. The day I sat down to write this piece, the news broke that the Information Commissioner’s Office (ICO) website had been compromised to serve a cryptocurrency-mining script to visitors’ PCs. That might not sound encouraging, but it illustrates an important fact: nobody’s security is perfect. With new threats emerging constantly, it’s simply not possible to guarantee that you won’t ever be hacked or caught out by a bug, no matter how much that reality might frustrate irascible chief execs.
Rather, the focus should be on understanding how to identify and neutralise threats, and on having contingencies in place to recover quickly and cleanly from any problems. And you can at least reduce your exposure by identifying the biggest risk to your data security and addressing it, until you reach the point where something else is a bigger liability. In practice, that’s likely to be not strictly a technical vulnerability, but something more philosophical.
“We’re not important enough to hack”
I hear this all the time, and not just from one-man bands, but from multi-million-pound enterprises. It might seem obvious to you that the data you’re working with is of limited value to the wider world. But do you want to bet your company on the idea that hackers go about carefully benchmarking the value of their targets and, if they find it’s not high enough, simply stop their misdemeanours, tidy up behind them and politely leave?
No matter how humble you may feel your own situation is, hackers can find a way to exploit it. That might involve stealing valuable data, but it’s still a stretch to imagine that most hackers are interested in analysing and hawking esoteric data. More likely, they’re simply looking to blindly assimilate your machines into a botnet, and then rent out or sell access to idiots.
Once one of these idiots gets into your network, it’s often only then that they will try to work out whether they’ve landed a minnow or a whale. In fact, if they conclude that you’re not worth the effort, that could work out worse for you. High-value targets are more likely to be kept in good working order, while less valuable systems may become hosts for malware of various kinds, system testing, or just goofing around, which brings the concomitant risk of your “worthless” data being leaked or sabotaged.
In short, while GDPR encourages us to think about security in terms of the intrinsic value of our data, to the bad guys, that’s often an incidental consideration. It may be that your data really is of no value or interest to anyone outside your own company. But when the ICO comes calling, asking how it ended up strewn all over the internet, that’s not going to be a satisfactory defence.
“It’s not our concern, we’ve outsourced”
Outsourcing can be a smart way to handle certain areas of your business. But the term is dangerously loose; there is, for example, a big difference between outsourcing your delivery and fulfilment to a third party, and outsourcing your email to Google. It’s tempting to think you’ve entirely washed your hands of certain functions, when in reality you’ve merely made yourself responsible for a process you don’t own. Unless you handle absolutely everything in-house, it might be a good idea to draw up a “balance sheet” that details the benefits and liabilities of each outsourcing decision.

If you want to skip that exercise, I can give you a basic summary now: any form of outsourcing that takes your data out of your control has the potential to turn into a disaster.
Why is that? It may be okay for you to run a bit of a messy shop at home, with a few security groups and some standard office PCs, but once you’re in outsourcing land, the whole approach has to be different. I’m talking about blocking off USB ports, to ensure that outsourced workers can’t access your address list or leak your proprietary data. I’m talking about banning personal smartphones inside the packing centre, to ensure that delivery addresses don’t get shared around. If there’s web access at all, it must only be via a strictly managed and filtered VPN.
To a paper-moving head office, all of this might sound ridiculous. But if you’re entrusting business-critical roles to staff who aren’t your own, you need to recognise the risk of names, addresses and other handy bits of digital identity being either deliberately stolen or carelessly exposed. It doesn’t take much for an attacker to compile an in-depth profile of someone they want to impersonate or defraud.
Things are, perhaps, a bit more straightforward for purely IT-outsourced environments, as these come ready-loaded with industry-tested small print. This covers key issues such as who owns the data on the servers, and what the outsourcer might do when presented with legal papers by an apparently relevant law-enforcement agency. What it often doesn’t cover, however, are nitty-gritty issues such as whether the entire software stack, right from the bootloader all the way through to your files, can provide a reportable, usable output that, for example, shows that someone has successfully exercised their right to be forgotten. If it doesn’t, you could have some difficult questions to answer in the event of a dispute.
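To make that concrete, here’s a minimal sketch, in Python, of the kind of reportable output I mean: a tamper-evident log of erasure requests that you could later hand to an auditor. The schema, field names and hashing scheme are my own invention for illustration; a real system would need every layer of the stack to feed it.

```python
# A minimal sketch (not a product) of an erasure audit trail: every
# right-to-be-forgotten request gets a hash-chained record, so you can
# later *show* that it was honoured. Table and field names are hypothetical.

import hashlib
import json
import sqlite3
from datetime import datetime, timezone

db = sqlite3.connect("erasure_audit.db")
db.execute("""CREATE TABLE IF NOT EXISTS erasure_log (
    subject_ref TEXT,   -- pseudonymous reference, never the raw identity
    erased_at   TEXT,
    entry_hash  TEXT    -- chaining makes after-the-fact edits visible
)""")

def log_erasure(subject_ref: str) -> None:
    """Record that a subject's data was erased, chained to the previous entry."""
    prev = db.execute(
        "SELECT entry_hash FROM erasure_log ORDER BY rowid DESC LIMIT 1"
    ).fetchone()
    entry = {
        "subject_ref": subject_ref,
        "erased_at": datetime.now(timezone.utc).isoformat(),
        "prev": prev[0] if prev else "",
    }
    digest = hashlib.sha256(json.dumps(entry, sort_keys=True).encode()).hexdigest()
    db.execute("INSERT INTO erasure_log VALUES (?, ?, ?)",
               (entry["subject_ref"], entry["erased_at"], digest))
    db.commit()

log_erasure("subject-00042")  # hypothetical pseudonymous reference
```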
At the end of the day, the problem with outsourcing is that it hasn’t caught up with the realities of business computing in 2018; and because of the strictly prescriptive nature of the business, it seems destined to perpetually lag behind emergent needs and ways of doing things. Way back in 1994, I was contracting at a bank when the job of distributing laser-printer toner was outsourced to the same company that already supplied the business with fresh toilet roll. This was fine until the first time a cartridge threw up an obscure error and knocked a busy printer out of action. Do you think the toilet-roll dispensers knew how to help, or wanted to?
These days, that sort of issue is more urgent still: even the simplest failure can cost a whole day’s revenue, and most businesses can’t afford that kind of risk. Hence the classic “cloud service adoption curve”: at first, there’s an upward leap of enthusiasm as the boring, tricky jobs are waved away into the ether. Then the merciless logic of the finance director applies the brakes, and the really important jobs start to come back in-house, as the company undergoes the long, grumbling transformation into a hybrid environment where visibility and accountability reign once more.
“Don’t worry, we’re fully backed up”
Who said that you were fully backed up? Because the very last person you should be trusting with this sort of life-or-death responsibility is your IT services supplier. Perhaps the logic of “one throat to choke” made sense when your relationship with the whizzy consultants was mostly limited to web hosting. But as we come to rely on third-party providers for a wider range of services and advice, so we give them more opportunities to let us down, in ever more catastrophic ways, and it becomes ever more vital to maintain a healthy distrust of them. In the case of backups, you can engage a third party as part of your strategy, but putting all your eggs in one basket is an extremely bad idea, and you should be suspicious of any consultant that allows you to do it (I’ll sketch what checking for yourself might look like at the end of this section).

On a side note, certain suppliers also get a black mark for allowing clients to use dangerously outdated systems. Windows Server 2003 was a respectable platform, to be sure, and a good match for the simple needs and outlook of most companies that were looking to get into a networked compute environment around that time. But it’s now 2018. When I hear one of those IT service companies asking around for an install CD or activation key for Server 2003 (as I still do from time to time), it makes me wonder in what other ways they’re behind the curve.
It’s a similar situation with hardware. Only last week, I saw a machine equipped with six 72GB 3.5in drives, mounted proudly in an air-conditioned server room rack. Running such ancient machines so far beyond their best-before date isn’t just dangerous in terms of software vulnerabilities: it means that as and when one of those hard disks goes kaput, the chances of finding a replacement of the same make and model are effectively zero. So from a RAID perspective, this arrangement makes no sense at all.
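As promised, here’s a minimal sketch of that healthy distrust in practice: restore a sample from the backup onto a machine the supplier doesn’t control, then compare it against the live data, file by file. The paths are placeholders.

```python
# A healthy-distrust sketch: rather than taking the supplier's word for it,
# periodically restore a sample from the backup yourself and compare hashes
# against the live copies. Paths below are placeholders.

import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Hash a file in chunks so large files don't exhaust memory."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_restore(live_root: Path, restored_root: Path) -> list[Path]:
    """Return every file that is missing or differs in the restored copy."""
    failures = []
    for live_file in live_root.rglob("*"):
        if not live_file.is_file():
            continue
        restored = restored_root / live_file.relative_to(live_root)
        if not restored.exists() or sha256_of(restored) != sha256_of(live_file):
            failures.append(live_file)
    return failures

bad = verify_restore(Path("/data/live"), Path("/mnt/test-restore"))
print(f"{len(bad)} file(s) failed verification")
```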
“We empower our employees to use their own devices”
“Bring your own device” (BYOD) has become very fashionable in recent years and, if you really want to, it is possible to run a business solely on your staff’s personal phones and tablets, using mainstream apps.

But that’s not the whole story. It’s not just about trusting your staff and your MDM solution. Most of the apps they’re relying on (and that your business is relying on) are cosmetic front ends to massive, invisible cloud compute farms. You can bet that the owners and developers don’t have your best interests at heart.
This isn’t to say that every business needs to be run on some sort of vastly powerful public platform. You can deliver planks of wood on remarkably little compute power and make a profit. That’s what business is about. Even so, the emphasis has gradually shifted away from processing and over to the value of information. To deliver a plank of wood might require a truck, which might not fit under all the bridges in your town, and therefore might need a smarter routing algorithm.
One relevant case study here is the crowdsourced satnav app Waze, which has been fighting a steady rearguard action against requests for a “truck mode” for at least the last two to three years. The reason is a classic illustration of mass-market logic: reportedly, the developers didn’t originally build the ability to capture bridge heights into the data model, so that data hasn’t been collected, and indeed their database lacks the necessary structure for storing it.
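To see why that missing field matters, here’s a toy sketch of what a “truck mode” actually amounts to: the same shortest-path search a satnav already runs, but with any road segment whose recorded clearance is below the vehicle height treated as impassable. The graph, distances and clearances are invented for illustration.

```python
# A sketch of the missing "truck mode": Dijkstra's shortest-path search,
# where edges the vehicle can't fit under are simply not traversable.

import heapq

# graph[node] = [(neighbour, distance_km, clearance_m), ...]  (made-up data)
graph = {
    "depot":     [("bridge_a", 2.0, 3.4), ("ring_road", 5.0, 99.0)],
    "bridge_a":  [("site", 1.0, 99.0)],
    "ring_road": [("site", 2.5, 99.0)],
    "site":      [],
}

def shortest_route(start: str, goal: str, vehicle_height_m: float):
    """Find the shortest route, skipping any edge with too little clearance."""
    queue = [(0.0, start, [start])]
    seen = set()
    while queue:
        dist, node, path = heapq.heappop(queue)
        if node == goal:
            return dist, path
        if node in seen:
            continue
        seen.add(node)
        for nxt, d, clearance in graph[node]:
            if clearance >= vehicle_height_m:   # the check Waze's model can't make
                heapq.heappush(queue, (dist + d, nxt, path + [nxt]))
    return None  # no passable route at all

# A 1.5m car is sent under bridge_a; the 4.2m truck gets the ring road,
# because the 3.4m bridge edge is filtered out of its search entirely.
print(shortest_route("depot", "site", vehicle_height_m=1.5))
print(shortest_route("depot", "site", vehicle_height_m=4.2))
```

Without a clearance field in the data model, that one-line check has nothing to test against, which is exactly the Waze situation.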
So, as a rule, you should expect public platforms to provide only an approximate fit for your needs. This applies to free email services as much as free satnav apps, or free anything; and then, when a revenue model does emerge, you’re likely to find your crucial data is heavily tied into a service that you don’t control, that you must pay to continue using, and from which you may not easily be able to extricate it. It’s important to remember that a breach isn’t the only way you can lose control of your data.
“We’re all going wireless because that’s the future”
Allow me to recount a conversation I had a week ago, standing outside a building so secure that you can’t get in without photo ID, preferably your passport, and where the internal staff must keep you within line of sight at all times. This organisation took the view that there was no need to supply Ethernet connections to people’s desks, because the future was wireless. So I asked them how confident they were about signal footprints, and whether they had done any building evaluation for the forthcoming 5G cellular rollout.

Of course, almost no-one has. 5G is, at first at least, being deployed on crazy things such as hydrogen blimps, to provide a temporary bandwidth boost at big sporting events and the like. Technically speaking, it has more in common with supercharged Wi-Fi than with previous generations of “G” cellular signals.
One thing that’s important to know is that phones able to make use of 5G will be bandwidth-hopping like mad. Your typing might be going up via 2G, while you watch a video over 4G and download an OS update over 5G. Diverse routing for cellular data is touted as an advantage, but I can’t shake the suspicion that any phone that can actually do this will need a trolley loaded with batteries.
Not only is this wireless future complicated, it’s very far from being secure. It’s been calculated that it would take much more than the lifetime of the universe to decrypt secure Wi-Fi (well into the lifetime of several universes), but that’s an assessment that forgets an important lesson from the start of computing as a serious tool. It dates back to the days when Bletchley Park was working to develop Alan Turing’s concepts into a workable device with a specific job to do: cracking the codes used by the German Enigma machines during the Second World War.
As you surely know, it was a task at which those pioneers eventually succeeded, but they were helped along by the Germans’ unfortunate habit of repeating the same phrase at the end of their encrypted messages: “Heil Hitler”. This consistent pattern in the flow of messages massively reduced the amount of thinking time and pencil work needed to establish the rest of the cryptological alphabet in use that day, from the lifetime of the universe to an afternoon.
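The principle is easy to demonstrate. The sketch below uses a trivial Caesar cipher rather than an Enigma machine, but the lesson carries over: a predictable phrase in a known position, a “crib”, lets an attacker discard almost every candidate key without any expensive work.

```python
# A toy illustration of how a known phrase (a "crib") collapses a key search.
# This is a simple Caesar cipher, not Enigma, but the principle is the same:
# predictable plaintext at a known position lets you reject keys instantly.

import string

ALPHABET = string.ascii_uppercase

def caesar(text: str, shift: int) -> str:
    """Shift each letter by `shift` positions, leaving other characters alone."""
    return "".join(
        ALPHABET[(ALPHABET.index(c) + shift) % 26] if c in ALPHABET else c
        for c in text
    )

def crack_with_crib(ciphertext: str, crib: str) -> list[int]:
    """Return every shift under which the decrypted message ends with the crib."""
    return [
        shift for shift in range(26)
        if caesar(ciphertext, -shift).endswith(crib)
    ]

secret = caesar("ATTACK AT DAWN HEIL HITLER", 7)
# Without the crib: 26 candidates, each needing human inspection.
# With it: the field narrows to a single survivor.
print(crack_with_crib(secret, "HEIL HITLER"))   # -> [7]
```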
Roll forward to the internet, and you can be sure that a fair proportion of web traffic is going to involve repetition and consistency. It won’t be anything as simple as “https://www/”, but think along those lines.
So let’s not place our faith in the wireless future. Businesses will still want the convenience of a wireless connection, but when the business owner needs to prove to regulators and the public that their data is being securely handled, the answer isn’t hiding in the technical specifications of the protocol. Regardless of the transport, it’s your job to monitor and manage the flow.