Australia wants boards held to account for infosec

Company directors better get schooled up on the cybers

Australia’s new 2020 cyber security strategy is the latest national plan to propose that company directors be held accountable for meeting minimum information security baselines prescribed by the government.

Australia’s Department of Home Affairs flagged in the strategy “possible legislative changes that clarify the obligations for businesses… to protect themselves and their customers from cyber security threats”, including new “duties for company directors”. These new rules of the road would affect both regulated and previously unregulated entities.

The strategy document correctly identifies that most critical assets in Western economies are held by public companies and that the boards of public companies play an outsize role in making decisions that impact the national security posture.

It’s disconcerting, however, to see governments, most of which have an atrocious record of meeting minimum security baselines themselves, force private industry to comply with them. This stuff is hard, and baselines don’t make it magically easy.

The strategy document doesn’t instil confidence that Home Affairs – which hasn’t replied to our requests for clarification – is up to the job. The nonsensical section on ‘metrics’ in the strategy doesn’t define a single metric. News reports suggest the department, which has already replaced its policy leads since inheriting responsibility for cyber security policy, will continue to let industry panels guide its thinking on standards.

The Risky Biz view is that prescriptive baselines don’t really work. Every company has a radically different exposure to cyber security risk, and prescriptive baselines are very difficult to maintain (consider how outdated the password policies in most standards are).

Forcing companies to make security metrics public will expose them to additional risk by showing adversaries where they are weak, and will inevitably encourage them to fiddle with their numbers.

One idea tossed around in US policy circles in recent months has been to simply compel companies to report security metrics internally to the board, just as you would for your financials. US Senator Angus King, a co-chair of the Cyberspace Solarium Commission, proposed an amendment to Sarbanes-Oxley reporting rules that would do just that. His recent attempt to attach it to the 2021 NDAA (the US defence authorisation bill) failed, but we’re told he will try again in February 2021.

Under King’s amendment, the CISO of a listed company (or an equivalent principal security or risk officer) would be required to report to external audit and the board audit committee on a regular basis. Their report would need to inform the board of all significant cyber security risks in critical information systems, of any fraud involving management or employees on the infosec team, of any significant changes to infosec controls, and of any corrective actions taken to address cyber security risks.

This sort of reporting already happens in banking and finance. It doesn’t solve every problem, but it does prevent company directors from taking a “see no evil, hear no evil” approach to cyber security risks.

None of the information in those reports needs to be published or submitted to regulators. The act of tabling it does enough. It creates a record that can be subpoenaed during eDiscovery in lawsuits. It nudges company directors and chief executives toward declaring security goals and being genuinely invested in meeting them. It doesn’t sound like much, but that’s better than where most companies are today.

The question then becomes, what is a meaningful goal to strive for?

Policy wonks won’t need to reinvent the wheel here. Seasoned CISOs suggest that most large US companies have converged on the NIST Cybersecurity Framework for self-assessment of cybersecurity maturity.

Melody Hildebrandt, CISO at Fox, told Risky.Biz in a recent chat that US CISOs are converging on common approaches to reporting and compliance. Cloud computing has brought about standardisation, easier enumeration of infrastructure and endpoints, and a common vocabulary for measuring cyber security risk.

“In the past we’d all hire one of the Big Four [consulting firms] to measure our maturity using some sort of special sauce proprietary methodology,” Hildebrandt said. “You would be crazy to do that today. We turned a corner in the last two years into some generally accepted practices.”

It would be worthwhile to “canonise” those agreed practices, she said, if only to help assess how other organisations measure and treat risks when completing due diligence for proposed mergers and acquisitions.

Other CISOs have warned that thresholds would need to apply: smaller companies might find standards like NIST CSF a little overwhelming.

Dmitri Alperovitch, chairman of the Silverado Policy Accelerator, suggests simple, universal metrics that capture how an organisation responds to incidents, such as speed of detection and speed of response.

Hildebrandt can live with that.

“I think it is entirely reasonable that an organisation should be required to document and report internally on material security incidents,” she said. “If a metric was to be forced on me, ‘mean time to detection’ is a pretty good one. It’s a hard one, but it’s also something that might move the needle.”
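To make that concrete, here’s a minimal, purely illustrative sketch of how a ‘mean time to detection’ figure might be derived from internal incident records. It’s in Python with made-up data; nothing here comes from the strategy, King’s amendment or Hildebrandt, and any real reporting regime would first have to agree on what counts as the start of an incident.

```python
from datetime import datetime, timedelta

# Hypothetical incident records: (when the intrusion began, when it was detected).
incidents = [
    (datetime(2020, 8, 1, 9, 0),    datetime(2020, 8, 3, 14, 30)),
    (datetime(2020, 8, 10, 22, 15), datetime(2020, 8, 11, 6, 0)),
    (datetime(2020, 9, 2, 3, 45),   datetime(2020, 9, 2, 4, 10)),
]

def mean_time_to_detection(records):
    """Average gap between compromise and detection across all incidents."""
    gaps = [detected - began for began, detected in records]
    return sum(gaps, timedelta()) / len(gaps)

print(mean_time_to_detection(incidents))  # 20:33:20 for the sample data above
```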

Enjoy this story? You can subscribe to the weekly Seriously Risky Business newsletter at our Substack page.
