This is an interesting read and a very relevant article. To check out his blog, see the details at the bottom. Enjoy!
NZLawyer, issue 184, 18 May 2012
A heroic – but slightly defective – plan to save the online world
There are times in the life of a mild-mannered reporter (and barrister and blogger) when he must shuck off his civilian garb, don his cape, and save the world. This is one of those times.
There are evil people online. They bully, they defame, they harass, they intimidate, they denigrate, they post naked pictures of their ex on Facebook. Their innocent victims must be protected.
As the Law Commission has amply demonstrated in its issues paper, The News Media meets ‘New Media’: Rights, Responsibilities and Regulation in the Digital Age (NZLC IP 27, December 2011), these harms are real. Netsafe says it gets 75 calls a month from desperate punters facing this sort of problem. They are often frustrated by the inability of the police to deal with them; police in turn say there are gaps in the law. Social media sites aren’t always responsive to complaints. The victims can’t afford to go to court, even when legal remedies are available.
What’s to be done? The Law Commission has mooted a Communications Tribunal that could make take-down orders (and orders for damages, rights of reply, or an apology) where a complainant can show that the law is being broken and the breach is causing him or her significant harm. This process would parallel the civil and criminal law, but be quicker, cheaper, and easier to access.
I’ve suggested that this is problematic. Criminal defendants might face an attack on two fronts, with different standards of proof and evidence rules. They’d at least have to make awkward tactical calls that might prejudice their criminal defence. A Tribunal ruling might influence their criminal trial rights. Besides, the Tribunal would have to deal with tricky mens rea issues that ought, for its purposes, to be beside the point.
Parallel civil claims raise different problems. Take defamation. Defamation cases involve a cornucopia of fiddly issues about the meaning of words, the truth of particular imputations, classification as opinion, the existence of reciprocal social or moral interests or duties, and much more. The Courts also exhibit a muscular aversion to injunctions. What would a Tribunal do with all that?
Conscious of some of these problems, the Law Commission proposed an alternative: a Communications Commissioner, with no real powers but an informal role of assisting resolution of online claims. This strikes me as a great idea (particularly if this person were to be able to facilitate resolutions with the big social media organisations), but if push comes to shove, he or she won’t be able to shove very hard.
Here’s where I save the day. I have a plan that solves these problems, while still providing a remedy for our innocent victims.
But first, a confession. This plan is pretty similar to the Law Commission’s two plans rolled together. They’ve done the hard work here. And my plan also has its problems. At this point, it’s a bit half-baked. What’s more, it won’t stop a determined and savvy online abuser. But then, I’m not sure what will. The best we can hope for is a cheap and quick way of having harmful material removed where that’s possible, with care taken to ensure free speech rights aren’t trampled on.
Here’s the gist. A claimant would have to provide evidence of four things:
1. Material is published online in relation to the claimant (who must be a natural person);
2. The ongoing availability of the material is causing the claimant significant harm;
3. The claimant has made reasonable attempts to have it removed, but has failed;
4. The material features one or more of the following characteristics:
   (a) It is false or misleading;
   (b) It contains sensitive personal information (including an image);
   (c) It breaches an obligation of confidentiality;
   (d) It denigrates the claimant by reason of race, religion, sexual orientation, etcetera;
   (e) It claims, without authority, to represent the claimant;
   (f) It encourages others to abuse the claimant.
These latter elements obviously have much in common with the law: item (a) echoes defamation law, (b) the tort of invasion of privacy, etcetera. But they strip away the complexities, and – I hope – reduce the clash with the criminal law.
That’s enough to get a claim up and running. After that happens, the Tribunal would have a discretion to make a take-down order. But it could only do so after considering a range of statutory factors, and only if it concludes that a take-down order is demonstrably justified. (That last bit is intended to magic away the obvious clash with the New Zealand Bill of Rights Act 1990.)
What are the factors? Well, for a start, obvious ones like the degree of harm likely to be caused; the breadth, nature, and understanding of the likely audience; whether an order would be futile; and the importance of the right to freedom of expression, including anonymous expression, and the inherent dangers of censorship.
I’d also throw in a series of other factors designed to reflect the principles of the law, but avoid their complexity. So the Tribunal would have to factor in, where relevant:
- The extent to which the material is accurate;
- The extent to which the material is recognisable as opinion;
- The extent to which the material is recognisable as humour or satire;
- The extent to which the material contributes to a discussion of a matter of importance to its audience;
- Whether a right of reply has been offered, whether it has been taken up, and whether it is likely to be effective in addressing the harm.
These factors are designed to reflect defamation defences; different factors may need to apply in privacy cases, for example.
Notice what’s not covered. This plan doesn’t deal with harassing emails or texts; it doesn’t relate to copyright, contempt of court, or hate speech against groups (there are remedies elsewhere for these); it wouldn’t cover the mainstream media (if they have signed up to the single regulator that’s also being proposed by the Law Commission); and it wouldn’t be available to corporate bodies (unless the material reflected particularly on a natural person – an attack on a small family firm, for example).
But I’m just laying down a possible framework. I think there’s room for debate about the content.
What about the complaints process? I have in mind a two-stage process, with a role for a Communications Commissioner at the beginning.
A complaint would be lodged (with a small filing fee, I think). The Commissioner would check that the four elements discussed above were covered, and would filter frivolous or vexatious complaints. He or she would have a duty to try to ensure the respondent is provided with details of the claim (perhaps via an ISP) and given information about how a response can be made (perhaps even anonymously).
The Commissioner could then decide to take a range of actions:
- Provide information to the parties;
- Help the complainant deal with social media organisations;
- Merely warn the respondent of laws that may apply;
- Try to mediate/settle;
- Intervene on behalf of the complainant;
- Refer the case to the Tribunal;
- Refer it to the Tribunal for fast-track consideration.
So the Tribunal would only come into play if the Commissioner was unsuccessful. It too would be required to seek and consider the respondent’s response (if possible). It could make interim orders in very serious cases after fast-track consideration, perhaps applying a higher threshold.
The consideration of the complaint would usually be done on the papers, fairly informally, perhaps with provision for a hearing in rare cases. The Broadcasting Standards Authority has operated successfully this way. The Tribunal could make final orders after an exchange of submissions. I wouldn’t be inclined to give it power to order damages. Let the Courts handle that.
There should be a right of appeal on the merits. And it should be an offence to disobey a take-down notice, to repost the material, or to post something substantially similar.
The Tribunal would probably need powers to make take-down orders against website hosts and ISPs where the respondent can’t be found or won’t comply, suppress the names of claimants on occasion, order disclosure of the respondent’s identity where necessary, make declarations of inaccuracy, and order rights of reply. We’d probably want a provision that the evidence and outcome at the Tribunal couldn’t be used in court proceedings.
There. Problem solved. Now, how about world hunger…?
Not so fast, you say. And you’re right. You’ve noticed that my plan suffers from some of the same defects as the Law Commission’s when it comes to the dangers of parallel proceedings. And my plan too creates a fairly complex task for the Tribunal. In addition, it presents greater danger of abuse. The Tribunal would have a very broad discretion to censor online material. My proposal plainly authorises suppression beyond the edges of current laws.
I’m sure you can think of your own objections. I hope you do. This is a debate worth having. The government is likely to be forced to take some action on these issues in the near future, and the harder we’ve thought about what that action should be, the lower the chances of us screwing it up.
Steven Price is a Wellington barrister specialising in media law. He writes a blog at www.medialawjournal.co.nz.