Law Society Journal 2008

Cyber-racism: can the RDA prevent it?

Jonathon Hunyor

Jonathon Hunyor is the Director of Legal Services at the Human Rights and
Equal Opportunity Commission


The reach of the internet and the freedom it gives those wanting to share
information and publish opinions presents a wide range of regulatory challenges.
One such challenge is preventing online acts of racial hatred, or
‘cyber-racism’.[1]

In a number of cases, the Federal Court has held that posting racially
offensive material on the internet is subject to the racial hatred provisions
of the Racial Discrimination Act 1975 (Cth) (‘RDA’). More recently, the Court
has held that the RDA also applies to those hosting websites who fail to
remove offensive material posted by users.

There are, however, significant limitations in the ability of the RDA to
prevent cyber-racism. Recognising this, the Standing Committee of
Attorneys-General is reported to be considering giving the Australian
Communications and Media Authority (ACMA) specific powers to order internet
service providers to take down websites expressing racial hatred.[2]

The RDA

Section 18C of the RDA makes it unlawful to do an act ‘otherwise than
in private’ if

  • the act is reasonably likely to offend, insult, humiliate or intimidate
    another person or group of people; and
  • the act is done because of the race, colour or national or ethnic origin of
    the other person or people in the group.

An act is taken not to be done ‘in private’ if,
relevantly, it ‘causes words, sounds, images or writing to be communicated
to the public’ (s 18C(2)).

The RDA provides an exemption for things done ‘reasonably and in good
faith’ in the context of artistic works, discussions and debates, fair and
accurate reporting and fair comment expressing a genuine belief (see s
18D).[3]

Jones v Toben

The first case to consider the application of the RDA to internet publication
was Jones v Toben.[4] The respondent had published anti-Semitic material that, amongst other
things, sought to cast doubt on whether the Holocaust had occurred.

Branson J held that the act fell within the scope of s 18C: ‘placing of
material on a website which is not password protected is an act which, for the
purposes of the RDA, is taken not to be done in
private’.[5]

The material was also found to be offensive to members of the Australian
Jewish community and to have been done ‘because of’ race, being plainly
calculated to convey a message about Jewish people.[6]

On appeal the Full Federal Court upheld the decision at first instance and
also considered the application of the exemption in s 18D.[7] Although the
material published purported to be part of an historical debate, the Full
Court held that the respondent had not acted reasonably and in good faith, as
the material published was ‘deliberately provocative and inflammatory’,
intended to ‘smear, hurt, offend, insult and humiliate Jews’.[8]

The respondent was ordered to remove the offensive material from the World
Wide Web and restrained from publishing the material on the Web or
otherwise.[9]

An order to the same effect was made in another case involving material of a
similar nature published on the internet: Jones v The Bible Believers’
Church.[10]

Silberberg

In Silberberg v The Builders Collective of Australia and Buckley,[11] the
first respondent (‘the Collective’) conducted an internet website which
included an online ‘Forum’ to enable discussion and debate of issues
relating to the building industry.

All members of the public with internet access could view the messages posted
in the Forum. While only registered users were entitled to post messages, in
practical terms users could remain anonymous. Messages were posted automatically
without intervention or monitoring by the Collective and there was no
systematic monitoring thereafter, although postings were reviewed from time to
time and there was a general policy of deleting objectionable material. People
posting material were also required to indicate their agreement that they would
not post information that was ‘vulgar, hateful, threatening, invading of
others privacy, sexually oriented or violates any
laws.’[12]

The second respondent, Buckley, posted material that made reference to the
applicant’s Jewish ethnicity and conveyed imputations that were found to
be likely to offend and insult the applicant and other persons of Jewish
ethnicity. Gyles J found that this was in breach of the RDA, citing with
approval Jones v Toben and Jones v The Bible Believers’ Church.[13] Buckley
was ordered not to publish the impugned messages or any material conveying
similar content or imputations.

The case against the Collective was, however, dismissed. Gyles J noted that
while there was ‘little difficulty in applying s 18C to the author of a
message’, the position of ‘others involved in the chain between
author and ultimate reader is not so
clear’.[14]

Given that the essence of the complaint against the Collective was their
failure to remove material posted by Buckley, it was significant that s 3(3) of
the RDA provides that ‘refusing or failing to do an act shall be deemed to
be the doing of such an act’.

Gyles J considered a range of cases that had dealt with issues of liability
for publication and broadcasting in the context of copyright and defamation
proceedings. Based on these authorities and s 3(3) of the RDA, his Honour
concluded that it was ‘clear enough that failure to remove known offensive
material would be caught by s 18C(1)’.[15]

Gyles J further concluded that s 18C(1) caught failures to remove offensive
material within a reasonable time, even in the absence of actual knowledge of
the offensive contents of the message. His Honour observed that ‘[t]he
Collective chose to conduct an open anonymous forum available to the world
without any system for scrutinising what was posted. The party controlling a
website of such a nature is in no different position to publishers of other
media’.[16] The fact that the
material was said to be posted in breach of user conditions did not, in his
Honour’s view, alter that conclusion: ‘In one sense it underlines
the fact that the Collective took no steps to ensure that its conditions were
obeyed.’[17]

However, the failure could not be shown to have any relevant connection with
race or ethnic origin. The failure ‘is just as easily explained by
inattention or lack of diligence.’ On this basis, the Collective was found
not to have breached the RDA and the case against it was dismissed.

Although not referred to by his Honour, it is relevant to note that while s
18E of the RDA provides for vicarious liability for acts of racial hatred
(not applicable here, as Buckley was not an employee or agent of the
Collective), there is no provision for ancillary liability. Section 17 makes
it unlawful to ‘assist or promote’ the doing of acts of unlawful
discrimination proscribed by Part II of the RDA, but this does not apply to
the prohibition on racial hatred in Part IIA.

Limited protection

The cases discussed above reveal a number of potential and actual limitations
of the RDA in preventing cyber-racism.

First, the courts have yet to consider whether s 18C(1) applies to
password-protected sites. Although much is likely to turn on the facts of a
case, such sites arguably should still be considered to be ‘otherwise than in
private’ on the basis that there is a communication to ‘the public’, albeit a
limited section of it. Such an approach is supported by s 18C(3), which
provides that a public place ‘includes any place to which the public have
access as of right or by invitation...’ (emphasis added).

More significantly, given the absence of ancillary liability provisions,
those hosting websites that allow comments to be posted would appear to be
required to take very little care. It is not enough to show that a host
organisation was aware of offensive material appearing on its website, or
even that it refused (rather than simply neglected) to remove that material.
An applicant must prove that the failure or refusal was connected with the
race of the relevant group.

The need to pursue the individual responsible for posting offensive material
creates particular difficulties in the context of the internet, which can allow
anonymous publication from anywhere with internet access.

While internet content hosts and service providers are also subject to
content regulation under the Broadcasting Services Act 1992 (Cth)
(‘BSA’), the obligations in the BSA are based upon the film classification
regime,[18] which concerns itself primarily with material depicting sex,
violence and certain criminal activity. Although this will potentially
capture more serious forms of vilification involving, for example, incitement
to violence (which may also fall within the reach of the criminal law), the
regime does not address the specific types of harm that the RDA prohibits and
seeks to prevent.

The proposal to give ACMA specific powers to regulate racial hatred appearing
online is therefore timely.


[1] Further discussion of the issue of cyber-racism can be found on the HREOC website at http://www.humanrights.gov.au/racial_discrimination/cyberracism/index.html.

[2] Imre Salusinszky, ‘Law chiefs place ban on race-hate websites’, The Australian, 1 April 2008, 8.

[3] For a discussion of cases that have considered these sections, see Federal Discrimination Law 2005: http://www.humanrights.gov.au/legal/FDL/fed_discrimination_law_05/index.html. An updated version of Federal Discrimination Law will be published later this year.

[4] [2002] FCA 1150.

[5] Ibid [74].

[6] Ibid [95], [99]-[100].

[7] [2003] FCAFC 137. The issue of exemptions was not considered in detail at first instance as the respondent had not filed a defence and the onus of making out an exemption rests with the respondent: [2002] FCA 1150, [101].

[8] [2003] FCAFC 137, [161]-[163] (Allsop J), [45] (Carr J, with whom Branson J agreed).

[9] [2002] FCA 1150, [113].

[10] [2007] FCA 55.

[11] [2007] FCA 1512 (‘Silberberg’).

[12] Ibid [3]-[5].

[13] Ibid [18]-[23].

[14] Ibid [26].

[15] Ibid [34].

[16] Ibid.

[17] Ibid.

[18] See Schedule 5. The key concept is ‘prohibited content’, which is defined by reference to classification by the Classification Board: see cl 20, Schedule 7.