Could you really go to jail for highlighting text in a PDF?

Yesterday’s reporting of leaked private information shocked the public – and triggered further debate on how we go about constructing and overseeing the management of data and personal information. Reading that the mobile numbers of virtually all of Australia’s serving MPs (including some former Prime Ministers) had been disclosed prompted mass disbelief. To hear that the failed redaction was apparently due to a contractor accidentally using white-coloured font to block out the phone numbers was astonishing, particularly given that a remarkably clear Department of Defence training manual on redacting PDF files from 2011 lists this very method as “poor redaction: example 3”. This affair won’t feature in any future presentation on Australia’s desire to attain world-leading status in cyber security.
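
For readers wondering how text “hidden” behind white font surfaces so easily: a PDF stores its text content separately from the way it is painted on the page, so any standard text-extraction tool recovers that content regardless of the colour it was rendered in. The sketch below is purely illustrative, assuming the open-source pypdf library and a hypothetical file named report.pdf; it is not the document at issue, nor anyone’s actual workflow.

```python
# Illustrative sketch only: why white-font "redaction" fails.
# Assumes the open-source pypdf library (pip install pypdf) and a
# hypothetical file "report.pdf" -- not the actual document at issue.
import re

from pypdf import PdfReader

reader = PdfReader("report.pdf")

# Text extraction reads the PDF's content streams, which record the
# characters themselves; the colour used to paint them is irrelevant.
full_text = "\n".join(page.extract_text() or "" for page in reader.pages)

# Anything typed in white-on-white comes out along with everything else.
print(full_text)

# A crude pattern for Australian mobile numbers, to show how easily
# "hidden" digits would surface in the extracted text.
for match in re.findall(r"\b04\d{2}\s?\d{3}\s?\d{3}\b", full_text):
    print("exposed number-like string:", match)
```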

Yet the second-day story on this news is also pretty worrying from my perspective, with a number of Twitter users asking the question: could individuals who highlighted the text in the PDF file go to jail in the future? For those unfamiliar with the pending legislation on the re-identification of de-identified data, this may seem just as unbelievable as the initial story. Perhaps you are all fearing another entry in the “expansive government power secured through broadly worded statutory provisions” ledger.

The last couple of weeks have illustrated to me the importance of people understanding the rule of law values which guide statutory interpretation. The disclosure powers publicly claimed by the Department of Human Services have been the subject of (in my view welcome) criticism, but have also produced a climate of fear and real pessimism about the ability of government agencies to pursue expansive interpretations of their powers. What I want to show is that how we construe words, as lawyers, does discipline government and express fundamental values such as reasonableness and procedural fairness. I don’t believe that the new laws could stretch to catch a private citizen who spotted the redaction error that led to yesterday’s story. But the quite serious effort I have to go to in order to establish this does once again raise rule of law concerns about the limits of government’s power. I’m thankful that law, as a science of words, never involves dictionary swallowing but rather working with an undertow of “quantum”, value-led elements.

Edit: based on some feedback from readers, I do want to clarify that while I believe this section can be interpreted to protect the rights of individuals, the very fact I have to write this post raises rule of law concerns. I wish to underline my broader opposition to the Bill as disproportionate and not justified in its current form.

Privacy Amendment (Re-identification Offence) Bill 2016 Section 16D:

The relevant section which may threaten those who highlighted the text in the PDF file is section 16D(1) of the Privacy Amendment (Re-identification Offence) Bill 2016, which criminalises the re-identification of de-identified data. Quick test for the reader without legal training: can you spot the broad phrase that opens up the coverage of the offence?

 

16D  De‑identified personal information must not be re‑identified

(1)  An entity contravenes this subsection if:

(a)  information has been published by, or on behalf of, an agency (the responsible agency) in a generally available publication; and

(b)  the information was published on the basis that it was de‑identified personal information; and

(c)  on or after 29 September 2016, the entity does an act with the intention of achieving the result that the information is no longer de‑identified; and

(d)  the act has the result that the information is no longer de‑identified.

 

The flexible phrase that my eye runs straight to is contained in subsection (b): “the information was published on the basis that it was de-identified personal information”. The language of 16D(1)(b) has been constructed so that criminal liability does not hinge upon the porousness of the government’s initial practices. In short, the government is attempting to delink potential criminal liability from any initial incompetence in its redaction techniques. As the editor of ZDNet Australia, Chris Duckett, commented on one of my tweets yesterday, citing earlier statements by the Attorney-General’s Department: “Problem with this is that it doesn’t seem to matter if the de-identification is screwed”. Let’s look at how we can construe this section to avoid the farcical situation where a citizen becomes criminalised by hovering a cursor over white font, copying and pasting.

I believe that, reading the section as a whole, any re-identification offence can only occur where a process which can rationally be described as de-identification has taken place. Given the ridiculous nature of yesterday’s fail, I think that the redaction would not be interpreted so as to attract the label “de-identified”. My interpretation is supported by the following elements of the above section:

  1. The requirement in section 16D(1)(d) that “the act has the result that the information is no longer de‑identified”. This element of the offence is in my view premised on the idea that a process which can rationally be described as de-identification (“no longer de-identified”) has occurred. It seems to me impossible to argue that an individual, by highlighting the text of a PDF, could be said to have achieved the result that the information is no longer de-identified.
  2. The heading of the section underlines its overall purpose without qualification: “De‑identified personal information must not be re‑identified”. This will be construed harmoniously with the “on the basis” element in section 16D(1)(b).
  3. The intent requirement in the section seems to underline that the individual must – in a calculated way – engage in an act of re-identification. Because it is so unbelievable that hovering your cursor would re-identify the data, I don’t think a court would find that the individual had formed a specific enough intent to re-identify it. I would ask the court to construe the intention requirement as being made out where the individual has actively identified, tested or otherwise assessed the vulnerability of the data and then taken a conscious action to exploit it.

Tim Watts MP then chimed in on Twitter to ask whether my interpretation was in fact circular – that is, in requiring effective de-identification to have occurred, the person who re-identifies the dataset acquits themselves by proving – through cracking the code – that the data was never de-identified. Credit to the politician (a rare statement!) for spotting that while my above points work to avoid criminalising a citizen for spotting this one utterly ridiculous, elementary fail, there is a point at which the section must move to prevent the individual from justifying their conduct through the submission that the de-identification was “screwy”.

I actually would support an amendment to tighten up the “on the basis” phrase – which to me threatens to become a little engine of ambiguity. While I’m confident we are fine in this case, I’m a little worried that less tech-savvy members of the judiciary might struggle to see why it would be irrational to construe something like white font as a process of de-identification. There is a better way of drafting subsection (b) so that it reads harmoniously with the title of the section, the intention requirement and paragraph (d). Drawing these subtle lines is in my view the responsibility of the legislature, but sadly, these days it increasingly seems to be the job of the profession and of us academics.

With great power comes relegated responsibility? Of Whitelisting and Retrospectivity

Now, while you might have come for my indefensibly clickbaiting headline – why not stay for the broader rule of law and policy questions at the heart of the Bill? Beyond the narrow questions of drafting, we do need to map the elephant footprints of how the government approaches the framing and delivery of privacy protection and cybersecurity. Criminal provisions are the low-hanging fruit of regulatory interventions, casting government in a hue of toughness and vigour. Yet we may be ill served by the selective, overconfident treatment this approach gives to the complexities of data security in the globalised age. The threat of prosecution may matter little to those outside Australia’s jurisdiction – consider yesterday’s alarming story on the possible use of overseas companies to assist outsourced debt collection. Too many anti-terrorism or crime prevention initiatives seize upon the criminal law without asking broader questions of governance. Modern government is haunted by the tendency, as set out in the Queen song lyric, to “build your muscles while your body decays”. It can be argued that the rush to deliver this Bill sets its face against the de-centralised, watchdog culture of data breach monitoring which exists within our tech community. It sets its face against the culture of research, and the idea that any good data management policy plays chess against itself. This is embodied in the submission on the legislation made by the University of Melbourne researchers who successfully re-identified the mass dataset that led to the hurried drafting of the legislation.

The legislation does come replete with exemptions for statutory contractors, research bodies and, crucially, as stressed by the Privacy Commissioner and the explanatory memorandum of the Bill, journalists. A journalism exception is already installed in the Privacy Act, but I do want to mention that the operation of that section of the Act does have some difficulties regarding its ambit and practical application. While these exemptions are by and large quite welcome, the legislation vests significant power in the Attorney-General to approve or disapprove of entities conducting data security work. In effect the Attorney-General will administer a ‘whitelist’ system of exemption, including a public interest exemption of somewhat contestable scope.

Another aspect which the Law Council of Australia has criticised is the legislation’s retrospectivity, defended by the Attorney-General’s Department in the following terms:

“The new offences in the Bill will operate retrospectively from the 29th of September 2016; the day after the Government announced its intention to introduce these offences. This creates a strong disincentive for entities to attempt re-identification while Parliament considers the Bill. Releases of private information can have significant consequences for individuals beyond their privacy and reputation, which cannot be easily remedied. This warrants swift and decisive action by the Australian Government to prohibit such conduct.”

Australian governments are intoxicated by this “backdate to our original press release” retrospectivity. It places an executive aura around the process of legislating, pointing to the party-political realities eroding Parliament’s status as a separate branch of government. To argue that an initial, superficial assertion that “we are criminalising this act” puts the public on effective notice to avoid re-identifying data while Parliament debates the Bill does not reflect rule of law values. The precise contours of the offence are produced not by the Attorney-General’s draft but only after the process of legislative engagement and debate central to our representative democracy. In opposing this retrospectivity before Christmas, the Law Council of Australia pointed to the important statements made by a majority of the High Court in Director of Public Prosecutions (Cth) v Keating (2013) 248 CLR 459, which emphasised the common law principle that the criminal law ‘should be certain and its reach ascertainable by those who are subject to it’. This concept is ‘fundamental to criminal responsibility’ and ‘underpins the strength of the presumption against retrospectivity in the interpretation of statutes that impose criminal liability’.

A “Bits and Pieces of Government” Approach to Privacy Protection

For further information on all this, but also to see our public service at its very best, I do want to direct readers to the wonderful information note prepared for the public by the Bills and Digest section of the Senate. This prescient note, written prior to yesterday’s story, actually predicted the possible application of the Bill to poorly redacted, white-fonted PDFs. The note displays the highest standard of professionalism and fair-mindedness. Reading through this excellent digest has really driven home a theme emerging from the past couple of weeks’ privacy debate. At the moment privacy protection is not being driven by a whole-of-government approach but by episodic, reactive management shaped by crises and political currents. Veterans’ Affairs Minister Dan Tehan appears to have secured passage of the Digital Readiness Bill with its public interest certificate system permitting the use of veterans’ personal information. The Minister did react to the recent public outcry, releasing some guidance on the protections within his proposed system late last Friday evening.

This document, especially the frankly bracing table comparing the Privacy Act protections to the Bill, nevertheless foregrounds the implication that privacy protections are denuded outside the context of the Minister’s own bill. The points made within it do not seem to align with the Attorney-General’s Department’s explanation (in the above digest) of why we can trust the Privacy Act and other primary legislation to effectively promote the proper de-identification of public datasets. Nor do the Attorney-General’s Department’s arguments about the weight given to privacy and the nature of the public interest under statute, in construing the exemptions to the Re-identification Bill, seem to align with the Department of Human Services’ broad approach to the exemption provided by section 202 of its primary legislation. It is exasperating to try to herd, parse and identify any consistency in these interpretations.

 

The Battle to Know Where We Stand

Chief Justice Robert French retired at the end of last year. If there was one unifying theme to the French Court, it was to stress the idea that the language of statutory powers must be read against the backdrop of immanent values – statutory implications such as the right to procedural fairness and the requirement of reasonableness. This is why, despite administrative law’s extremely contextual nature and incredibly fine distinctions, I was happy to go on the public record arguing for a narrow reading of the Department of Human Services’ disclosure powers. I do not want Australians to lose faith in the interpretive values which the legal community brings to reading any government power. Even if the Department of Human Services’ approach to its powers is upheld, the Department has still failed to deliver fair and legitimate governance of that interpretation: no policy guidance on its interpretation was published prior to this controversy. Section 209 Directions were created defining when it is in the public interest to “correct the record”, even though the explanatory memorandum of the Social Security (Administration) Act 1999 states that the s208 certification process was intended for releases “in circumstances where release would normally be barred”. These directions are a disallowable instrument (meaning a Senate motion opposing them could be laid down): did parliamentarians know that the Department actually viewed s202, not s208, as its standard mechanism for correcting the record? How is this an expression of the values of representative government the Australian Constitution is premised upon? It took a public debate (and simple ‘please explain’ notes from myself, Victoria Legal Aid and former Deputy Privacy Commissioner of NSW Anna Johnston) for us to gain insight into how the Department viewed its powers. There is more at stake than points-scoring against an individual department or party politics, more at stake than a “test case”. Far too often in Australia we have to litigate and agitate for certainty, rather than having legislated for it.

Darren O’Donovan