SecTor 2017 Cloud Security Alliance conference – Data protection front and center

It was all business at the Cloud Security Alliance Summit in Toronto as discussions around how organizations can design for data security predominated.

SecTor bills itself as Canada’s premier IT security conference, and this year it did not disappoint.

I attended a pre-show event run by the Cloud Security Alliance that featured a full room talking about cloud application design and data protection. CIRA was thrilled to sponsor the CSA and to speak at such an important event. The topic touches on CIRA’s role in running what can be considered the original subscription-based internet service (domains) and on our new cloud security offerings, like the D-Zone DNS Firewall, that can help organizations add defence to their perimeter.

After listening to various speakers, I started to notice a common thread that wound through the presentations. Unsurprisingly, that theme was data protection, because the General Data Protection Regulation (GDPR) comes into effect next year in Europe and has global organizations worried. In Canada we have similar regulations in the form of the Personal Information Protection and Electronic Documents Act (PIPEDA), and those regulations are about to gain a lot more teeth with an obligation to report any cyber security breach within a very short timeframe. Every Canadian business needs to pay attention, and the extent of the data policy requirements and legal liabilities is only now starting to be understood.

Data privacy - the big buzz at the CSA Summit

Let’s start off the synopsis with a few interesting facts shared by John DiMaria of the BSI Group:

  • 90 percent of information in the world was created in the last two years.
  • 79 percent of companies collect data from individuals.
  • 21 percent of users trust companies with their data.

These cascading stats led to a theme that was echoed by several speakers: legal liability in the cloud. Businesses want data, consumers don’t trust them, and so governments are acting. Organizations need to know with whom data is shared and how it is used, and they need processes that provide appropriate access to specific, required data within departments and among suppliers.

If these processes aren’t well documented and managed, then a corporate defence can go out the window in litigation. When you go to court to contest a fine that has been levied, among the first things the judge will ask are, “who touched this data and what are the access controls?” If you don’t have precise answers, then any data you present could be thrown out as evidence. In other words, data governance is critical.

A sidebar on data sovereignty

An interesting discussion happened on the notion of data crossing borders and what that means because, in a cloud world, server location impacts jurisdiction and local regulations may not map to what businesses require for privacy.

In a panel discussion, Mark Gaudet, Product Manager at CIRA, said, “We are a prime example with our DNS Firewall service. We deliver the service from Canadian servers to help maintain privacy but importantly sovereignty. Additionally, we do not monetize the underlying recursive data, and store it only for as long as is necessary for security monitoring.” Other speakers echoed that the other good news for Canadians is that many of the other (large) cloud vendors are starting to locate their servers on our soil. Until recently, this topic was something most vendors actively avoided drawing attention to.

On to the cloud – what is the current situation?

During a panel on shared responsibility, Krishna Narayanaswamy of Netskope said that there are 25,000+ cloud applications. Choosing suppliers with enterprise customers can help to ensure that you are servicing your customer while meeting compliance needs – but that doesn’t absolve you from accountability. It is up to every organization to require that their cloud vendors provide documentation supporting their security policies, monitoring, and enforcement capabilities. For the big infrastructure and platform vendors, his opinion was that when you choose them you are probably getting a more secure environment than you could build by running it yourself. “Amazon and others have a lot invested in delivering quality services and if there is a problem it could be huge for them.”

Which takes us to a few interesting stats that summarize the top risk points for most organizations. First, 93% of organizations are using cloud services. In fact, a typical large enterprise uses 500 to 1,000 cloud services, across the IaaS, PaaS, and SaaS options, that it knows about. Within this soup is a shadow IT problem, where users and departments use unauthorized technology that nobody knows about – often in ways that open the organization up to possible data breaches.

The combination of shadow IT and a huge, diverse set of projects means that the biggest problem still rests where it always has – firmly with the humans. According to Centrify’s presentation, over 80% of breaches are caused by identity compromise. This is 10x higher than, for example, missing a patch on a machine. Moreover, they showed that 95% of IaaS security failures are the customer’s fault, not the service provider’s – and more than half involve user authentication. One risk mitigation solution presented was segmentation and compartmentalization of applications and data.

From the cloud infrastructure to the developer’s desktop

Rich Mogull gave an interesting talk titled “Five-Ish Ways to Kick Traditional Security’s Ass with Cloud and DevOps”. It was deeply technical, with a lot of demos that make it a little hard to write about, but one of the core messages was that cloud computing lets organizations design compartmentalized architectures because, unlike in the past, there is almost no cost to doing so. This includes segregating networks and services into tiny micro-containers so that a compromise is limited to what is in the container. You can put the master passwords in a vault and give each developer access to only the accounts they need.
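To make the vault idea concrete, here is a minimal, hypothetical sketch (toy class and names of my own invention, not any real vault product’s API): secrets live in one store, and each developer is issued a token scoped to only the specific entries their role requires.

```python
# Toy illustration of least-privilege secret access: a single vault,
# with per-developer tokens that can read only an explicit set of paths.

class Vault:
    def __init__(self):
        self._secrets = {}   # path -> secret value
        self._grants = {}    # token -> set of allowed paths

    def store(self, path, secret):
        self._secrets[path] = secret

    def issue_token(self, token, allowed_paths):
        """Grant a token read access to a minimal, explicit set of paths."""
        self._grants[token] = set(allowed_paths)

    def read(self, token, path):
        # Deny by default: only explicitly granted paths are readable.
        if path not in self._grants.get(token, set()):
            raise PermissionError(f"{token!r} may not read {path!r}")
        return self._secrets[path]

vault = Vault()
vault.store("db/orders/password", "s3cret")
vault.store("db/billing/password", "t0psecret")

# The orders developer gets only the orders credential -- nothing else.
vault.issue_token("dev-alice", ["db/orders/password"])
print(vault.read("dev-alice", "db/orders/password"))   # prints "s3cret"
# vault.read("dev-alice", "db/billing/password")       # raises PermissionError
```

The design choice mirrors the talk’s point: if “dev-alice” is compromised, the blast radius is one credential, not the whole vault.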

Responding to breaches

A panel on Security Response in a Cloud World covered the up-front policy and management work organizations need to do. During this panel, Ken Bell, Deputy CISO at ForcePoint, drove the point home when he said, “If you want to see the fastest incident response ever, walk in and tell the security person that you have no logs and they will walk right back out.”

Why? Because without logs, the security professional incurs a lot of liability in offering an opinion, let alone in attempting to deliver a security service. For DevOps, it is not just about having logs but about using them every day to make things better. With proper tools and techniques, you can see behaviours that deviate from the norms on the network - but only if you are actually watching for changes every hour of every day. It was a sobering statement that the average dwell time (for a hacker on a network or application) before a response is three to six months - and it should be a day. It was cited that Snowden’s behaviour changes took 14 days to spot, and by then it was too late (for those in charge of the data security, anyway).
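The “watch for deviations from the norm” idea can be sketched very simply. This is a toy baseline detector of my own construction (the window size and threshold are illustrative, not a recommendation): compare each day’s event count against a trailing window and flag days that stand far outside it.

```python
# Toy anomaly flagging for log-derived counts (e.g. daily logins):
# flag any day whose count deviates from the trailing window's mean
# by more than `threshold` standard deviations.

from statistics import mean, stdev

def flag_anomalies(daily_counts, window=7, threshold=3.0):
    """Return indices of days whose count is a statistical outlier
    relative to the preceding `window` days."""
    flagged = []
    for i in range(window, len(daily_counts)):
        baseline = daily_counts[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma > 0 and abs(daily_counts[i] - mu) > threshold * sigma:
            flagged.append(i)
    return flagged

# Fourteen quiet days, then a burst of activity on day 14.
counts = [10, 12, 11, 9, 10, 13, 11, 10, 12, 11, 10, 9, 12, 11, 80]
print(flag_anomalies(counts))  # -> [14]
```

A real deployment would feed this from actual log pipelines and run continuously - which is exactly the “watching every hour of every day” discipline the panel described.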

Data breaches – are the headlines about to get bigger?

Scalar, a great Canadian success story in this industry (and a .CA Registrant, by the way!), presented data from their security survey:

  • Organizations report, on average, that they face 40 significant attacks annually
  • 50% say they don’t have sufficient security resources
  • Average cost per breach is $175K
  • 51% suffered a data loss

This last data point warrants unpacking, so I will close on this thought. If 51% of organizations report suffering a data loss, and they will soon be legally obliged to report those losses, then we can predict that in 2018 security headlines will become much more frequent.
