Best of 2019: Getting to grips with privacy obligations

Information Technology Professionals Association (ITPA)

By Jonathan Nally
Monday, 30 December, 2019


A panel of experts canvasses the pros and cons of, and obligations imposed by, Australia’s privacy compliance regime.

A wide mix of representatives from the cybersecurity industry, government, small and large businesses, and non-profit organisations gathered in North Sydney on 16 October for a briefing on the pros and cons of, and obligations imposed by, Australia’s privacy compliance regime.

The Information Technology Professionals Association (ITPA) Breakfast Briefing attracted almost 50 IT professionals, who were treated to an in-depth look into the world of privacy legislation, data breach risks, security systems and philosophies, and real-world examples of dealing with this complex topic.

On hand to lend their expertise were four speakers from a variety of backgrounds and with many years of relevant experience between them:

  • Anna Johnston, a highly respected expert in privacy law. Previously Deputy Privacy Commissioner for NSW, in 2004 she founded Salinger Privacy to offer privacy consulting and training services.
  • William Shipway, an IT security and risk professional, currently serving as Senior Security Manager for Sydney Trains; he previously managed the IT Security and Risk team for Service NSW, within the NSW Department of Customer Service.
  • Phil Kernick, CTO and co-founder of CQR Consulting. He has 25 years of experience in infosec and technology. His consultancy assists both private and public sector organisations to manage their security strategies and architectures.
  • Jason Duerden, Country Manager, ANZ, for BlackBerry Cylance. He has 10 years of sales, leadership and management experience in the cyber realm.

Clockwise from top left: Anna Johnston, William Shipway, Phil Kernick and Jason Duerden.

What are data and privacy breaches?

Johnston began by explaining the Australian Privacy Act 1988 and what constitutes a notifiable data breach (NDB).

The Australian Privacy Act covers the federal government and the private sector in Australia. (There are also state and territory privacy laws, none of which presently have mandatory data breach requirements, although NSW is looking into it at the moment.) Under the Act, a data breach means one of three events: “either an unauthorised disclosure, an unauthorised access, or loss of one of three types of data: personal information, tax file numbers or credit information,” Johnston said.

“Just because something is a data breach, doesn’t [necessarily] make it a notifiable data breach. And just because something is a data breach, doesn’t [necessarily] make it a cybersecurity incident,” she added, giving the example of the loss of paper files.

A DDoS attack that jams up a system is not a notifiable data breach as defined under the law, unless there’s a loss of information or unauthorised disclosure or access involved as well.

What makes a data breach notifiable is that it is likely to result in ‘serious harm’ to one or more individuals. “You might have a million or more people affected by the data breach, but [even] if only one of those people is likely to suffer serious harm, then that would tip it into this category — that then makes it notifiable, meaning you have to notify or report to the OAIC and to the affected individuals,” Johnston said.

“Some people think they have 30 days to notify — that’s not strictly the requirement under the law,” she said. “As soon as you know you have a notifiable data breach, you must notify as soon as practicable. The 30-day rule is the time period you have in which to do an assessment of your incident to determine whether or not it meets this threshold of notifiable data breach. Whenever you determine within that 30-day period that you’ve met that threshold, you must notify as soon as possible.”
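As a rough sketch of the assessment logic Johnston describes (purely illustrative, not legal advice; the function names and flags below are invented for this example), the decision flow might look like this:

```python
from datetime import date, timedelta

ASSESSMENT_WINDOW = timedelta(days=30)  # time allowed to assess a suspected breach

def is_data_breach(unauthorised_disclosure: bool, unauthorised_access: bool,
                   loss: bool, involves_covered_data: bool) -> bool:
    """A data breach: unauthorised disclosure, unauthorised access or loss of
    personal information, tax file numbers or credit information."""
    return (unauthorised_disclosure or unauthorised_access or loss) and involves_covered_data

def must_notify(is_breach: bool, serious_harm_likely_for_anyone: bool) -> bool:
    """Notifiable only if the breach is likely to cause serious harm to at
    least one individual; even one person out of a million tips it over."""
    return is_breach and serious_harm_likely_for_anyone

def assessment_deadline(became_aware: date) -> date:
    """Up to 30 days to assess whether the threshold is met, but once it is
    determined to be met, notification must follow as soon as practicable."""
    return became_aware + ASSESSMENT_WINDOW

# Example: a lost paper file containing TFNs, judged likely to cause serious harm
breach = is_data_breach(False, False, True, True)
print(must_notify(breach, serious_harm_likely_for_anyone=True))  # True
print(assessment_deadline(date(2019, 10, 16)))                   # 2019-11-15
```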

The provisions apply to private sector organisations with annual turnovers of more than $3 million, which means many small businesses are exempt. “There’s a caveat to that — health service providers are covered no matter their size, even if they’re a very small business,” Johnston added.

“But if you think about three categories of data — personal information, tax file numbers or credit information — tax file numbers are held by every employer. And so to the extent that a data breach involves TFNs of staff, any organisation, any employer will be covered,” she said. “This includes state and local governments and universities and small businesses that are otherwise exempt from the Australian Privacy Act.”

Johnston then explained the difference between a data breach and a privacy breach. The federal Privacy Act has a set of 13 Australian Privacy Principles (APPs) that cover the entire life cycle of handling personal information, from collection to use to disclosure, and they address matters such as access and correction rights, data quality and data security.

“A privacy breach is a breach of any of those principles. You might breach the rules about collection, but it’s not necessarily a data breach,” Johnston said. “But if you have a data breach that is caused by a failure of data security, it might be both a notifiable data breach and a breach of Australian Privacy Principle number 11, which is your data security obligations. So the one event might trigger two different legal problems.

“In terms of the repercussions of a data breach or privacy breach, they’re actually the same. Both of them are what’s called an ‘interference with privacy’, which triggers all of the Privacy Commissioner’s investigative powers [and] the complaints handling requirements,” she added. “And so that means that potentially you’re up for compensation for anyone harmed by either the data breach or the breach of the APPs, and civil penalties of up to $2.1 million per incident.”

With the data breach notification requirements, “the penalties only kick in if you have failed to notify when the law says you’re supposed to notify. So having a data breach itself is not the legal problem, it’s failing to notify about the data breach when the law says you’re supposed to … that’s what triggers the $2.1 million fines and compensation.”

Prevention, detection and response

Shipway began by acknowledging the traditional owners of the land, and reflected on the juxtaposition of the tens of thousands of years of Indigenous history and the “very recent history in information security and IT generally”, and how “the short period of time we’ve had to develop our industry reflects on some of the issues we see”.

Shipway said that the challenge in his role is “to help people to understand why security is important”.

“The conversation in security is often about getting security closer to the people who are practising IT — developers, engineers, people working within an agile framework — people who, as part of their daily work, probably should be doing security but might not have thought about it, or might not know how to do it,” he said.

Shipway said that “automation is one of the key things that can improve that, because you can take the poor security decisions out of people’s work and have the decisions made automatically”.
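A hedged example of the kind of automation Shipway is describing (a generic sketch, not Sydney Trains’ or Service NSW’s actual tooling; the policy check and configuration fields are hypothetical):

```python
# Hypothetical pre-deployment gate: the security decision is made by code,
# not left to whoever happens to write the config that day.
def check_storage_config(config: dict) -> list[str]:
    """Return a list of policy violations for a proposed storage bucket config."""
    violations = []
    if config.get("public_read", False):
        violations.append("bucket must not be publicly readable")
    if not config.get("encryption_at_rest", True):
        violations.append("encryption at rest must be enabled")
    return violations

proposed = {"name": "customer-exports", "public_read": True, "encryption_at_rest": False}
problems = check_storage_config(proposed)
if problems:
    raise SystemExit("deployment blocked: " + "; ".join(problems))
```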

He added that while prevention often takes the form of technology solutions that are very expensive and deliver unknown and potentially dubious value, “detection has become more important and many organisations are starting to do that quite well”.

However, “Response is still the area where many of us struggle, and I think it’s the place where most of our effort is going to be going over the next few years … particularly in light of penalties for breaches,” he said.

“Response is firstly about knowing your environment, knowing the technical environment and knowing your organisation’s business context … because you might not need to respond quickly if you don’t hold sensitive data,” he said. “You might need to have a business case already prepared if you do, and you need to be able to secure it really quickly.”

Almost 50 people, from a broad spectrum of the IT world, attended the ITPA Breakfast Briefing on privacy compliance on 16 October in North Sydney.

Knowing what you have to lose

Beginning his presentation, Kernick asked for a show of hands of those who could honestly say they know the difference between a data breach, a privacy breach and a cybersecurity incident. Only a few hands went up.

Regarding that uncertainty, he said, “Here’s my overarching advice to you — don’t worry about the complications. Don’t worry about the detail. Don’t worry about whether you’re in or out, or this is in or that’s out. Just assume you’re in. Assume it will be of business benefit to protect the personal information that you, as businesses, hold.”

Citing the statistics released since the NDB regime began, Kernick said that the number of reported breaches is “not going down, because at this stage there’s no consequence to Australian business for having a data breach”.

He said that while businesses are required to notify in the case of a notifiable breach, “they’ve discovered that … you don’t get outed, you don’t have to do anything, so it’s cheaper to notify than fix the problem. And that fundamentally is why the attackers are so successful.

“Now there are businesses and governments who are doing this [well],” he said. “They’re the exception, they’re not the rule. So when we come in and do testing of businesses and testing of government agencies, we win pretty much every single time. We are able to access whatever they thought was protected, every single time, because they’re not protecting it well.

“But more than that, they don’t know what they have to protect,” he added. “So the first thing we would say to all Australian businesses is, work out what personal data you hold. Genuinely, people don’t know where it is, they don’t know that they’ve got it, they don’t know who’s looking after it.

“That’s step one. You can’t protect it if you don’t know where it is,” he said.

“Imagine that you were a manufacturing business dealing with a process that deals with toxic waste. Would you just let that lie around the office?” he asked. “We need to think of personal data as toxic waste. This is information that we hold that we don’t have a need for.

“In our cloud-first, storage-is-free world, we hold data because we can, because it doesn’t cost anything to hold it,” he said. “This is like accumulating toxic waste randomly in your businesses because you’ve got spare office space. It doesn’t make sense.

“It can’t get breached if you don’t hold it. Our teams can’t get in and find where it is if you don’t hold it. The hackers can’t get in and steal it and ransom it back to you or sell it off, or anything else, if you don’t hold it,” he said. “The best advice we can give you … is to not hold this information at all. And if you do need to hold it, protect it really well.”
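Kernick’s two-step advice (work out what personal data you hold, then stop holding what you don’t need) could be sketched roughly as follows; the patterns and retention period are illustrative assumptions, not a substitute for a proper data inventory:

```python
import re
from datetime import date, timedelta

# Crude patterns to flag fields that *might* contain personal data. A real
# inventory would use proper classification tooling plus human review.
PATTERNS = {
    "email": re.compile(r"[^@\s]+@[^@\s]+\.[^@\s]+"),
    "tax_file_number": re.compile(r"\b\d{3}\s?\d{3}\s?\d{3}\b"),
}

RETENTION = timedelta(days=365 * 7)  # hypothetical retention period

def classify(record: dict) -> set[str]:
    """Step one: work out what personal data a record holds."""
    found = set()
    for field, value in record.items():
        for label, pattern in PATTERNS.items():
            if isinstance(value, str) and pattern.search(value):
                found.add(f"{field}:{label}")
    return found

def should_purge(record: dict, today: date) -> bool:
    """Step two: data past retention that still holds personal information is
    'toxic waste'. It can't be breached if you don't hold it."""
    return bool(classify(record)) and today - record["last_needed"] > RETENTION

record = {"contact": "jane@example.com", "tfn": "123 456 789", "last_needed": date(2010, 1, 1)}
print(classify(record))                          # {'contact:email', 'tfn:tax_file_number'} (order may vary)
print(should_purge(record, date(2019, 10, 16)))  # True
```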

On the subject of penalties for data and privacy breaches, Kernick said that Australia’s current regime is not perfect but it is the necessary first stage. “Eventually governments will realise that this was a first step towards trying to get businesses to internalise the costs of data breaches.”

Taking responsibility

Duerden explained how BlackBerry, having gotten out of the mobile phone business, has recast itself as a cybersecurity firm, bringing its highly regarded mobile phone security expertise to the wider market. BlackBerry security technology is now available for use on other manufacturers’ devices, and is finding a new role in the world of the IoT. Duerden cited the example of Jaguar cars, which contain BlackBerry know-how.

BlackBerry Cylance, which sponsored the Breakfast Briefing, is a data science and AI company specialising in endpoint security, such as next-generation antivirus, endpoint detection and response, and the incident response capabilities that go along with that.

“Most people don’t really have much of an understanding of the landscape, process, policy, focus areas, where the data is, why it’s important, what we should do to protect it, and how we’re going to respond effectively if we have an incident,” Duerden said.

“It’s definitely a hard space to be in, so don’t feel like you need to be an expert, because there are experts out there who are still continuously learning.”

Duerden said that from a threat landscape perspective, “I think the biggest thing that we really talk to customers about is aligning the technology with the people — you’re never going to just put technology in that doesn’t need people to operate.

“The biggest challenge in security is that we’ve always had technologies that are dependent on humans to operate,” he added.

“If we think about the scale of malware and ransomware and everything that comes out every single day, the way that we’ve been doing security [is] we write a rule and we stop it. Then the next one comes out, and the next one. We’re just constantly in a hamster wheel … chasing around and around.”

But Duerden said that we’re now finally in a world where technologies are available to help us get to a zero-trust model where we don’t trust anything that comes into the environment — whether that’s a device, software, email or whatever.

“And the backbone of that is really driven by machine learning,” he said. “We’re able to scale leveraging technology that can help us make better decisions, and decisions that are autonomous from an operator having to constantly manage that system.”
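A minimal, generic sketch of the rule-versus-model contrast Duerden is drawing (this is not BlackBerry Cylance’s technology; the features and training data below are invented purely to show the shape of a learned classifier replacing hand-written signatures):

```python
# Instead of writing a new signature for every sample, a model learns a decision
# boundary from features of past samples and generalises to unseen ones.
from sklearn.ensemble import RandomForestClassifier

# Invented features per file: [size_kb, entropy, imports_count, is_signed]
X_train = [
    [120, 4.1, 30, 1], [300, 4.5, 55, 1], [80, 3.9, 12, 1],  # benign examples
    [95, 7.8, 3, 0],   [210, 7.5, 5, 0],  [60, 7.9, 2, 0],   # malicious examples
]
y_train = [0, 0, 0, 1, 1, 1]  # 0 = benign, 1 = malicious

model = RandomForestClassifier(n_estimators=50, random_state=0).fit(X_train, y_train)

# A new, never-before-seen file: high entropy, few imports, unsigned
print(model.predict([[150, 7.7, 4, 0]]))  # likely [1] -> flag as malicious
```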

Duerden said that from an incident response perspective, “If you don’t have an incident response plan, you should get one tomorrow.

“At least understand where your information is, who’s responsible, what happens, who do we call, what do we do,” he said. “You might not think it, but it is very common that a lot of organisations don’t have that, including governments and big corporates.”

Duerden said you should also test yourself. “How do you know how you’re going to handle an incident if you’ve never experienced one? There are plenty of exercises that you can do to test your team’s capabilities,” he said.
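As a starting point for the kind of plan and self-test Duerden describes (a sketch under assumed headings; real plans and tabletop exercises are far more detailed), the basics could be captured and checked like this:

```python
# Hypothetical incident response plan skeleton: where the information is,
# who is responsible, who to call and what to do, plus a trivial
# completeness check you could run before a tabletop exercise.
REQUIRED_SECTIONS = ["data_locations", "roles", "contacts", "first_steps"]

ir_plan = {
    "data_locations": ["CRM (customer PII)", "payroll system (TFNs)"],
    "roles": {"incident lead": "CISO", "communications": "Head of Legal"},
    "contacts": {"OAIC": "https://www.oaic.gov.au", "MSP hotline": "<to be confirmed>"},
    "first_steps": ["contain", "assess within 30 days", "notify if serious harm likely"],
}

missing = [s for s in REQUIRED_SECTIONS if not ir_plan.get(s)]
print("plan complete" if not missing else f"missing sections: {missing}")
```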

Duerden added that it’s important to make sure your service provider knows what they’re doing as well; just because you have a contract with a third party, it doesn’t mean the risk isn’t yours.

“There’s a big misconception that, ‘Hey, I’ve got a managed service provider that does all my IT’,” he said. “That doesn’t mean that when a breach happens they are responsible; you are still responsible.”

This is Part 1 of our report from the ITPA Breakfast Briefing on privacy compliance. Part 2 will cover discussion of real-world examples and concerns, as expressed by members of the audience.

Image credit: ©stock.adobe.com/au/andranik123

This article was first published on 29 October, 2019.

Information Technology Professionals Association (ITPA) is a not-for-profit organisation focused on continual professional development for its 18,700 members. To learn more about becoming an ITPA member, and the range of training opportunities, mentoring programs, events and online forums available, go to www.itpa.org.au.

