Internet Regulation in Australia
Race Discrimination Unit, HREOC, October 2002
This paper will outline the regulatory provisions and non-regulatory strategies potentially available in Australia to manage the problem of racism on the Internet. It will examine the content regulation regime in Australia, including the classificatory standards applied by the Australian Broadcasting Authority (ABA) and the Office of Film and Literature Classification (OFLC). It will also review the requirements of the relevant industry codes registered by the ABA, as well as some of the possible non-regulatory options available for Internet content management.
It is important to note from the outset that racial vilification on the Internet is not prohibited by the classificatory standards administered by the Australian Broadcasting Authority and the Office of Film and Literature Classification. So while the following section describes the principal mechanism for Internet regulation in Australia, this regime does not deal with Internet content that breaches, or potentially breaches, the Racial Discrimination Act 1975 or other anti-vilification laws.
1. The Australian Broadcasting Authority
The Australian Broadcasting Authority (ABA) is the principal agency responsible for Internet content regulation in Australia. The ABA describes itself as administering a 'co-regulatory' scheme established by the Broadcasting Services Act 1992. 'Co-regulation' refers to the policy underpinning that Act that industry should be a regulatory partner with the ABA. The other principal tool available to the ABA for Internet regulation is its complaints mechanism. The ABA can refer material to the Office of Film and Literature Classification (OFLC): material is assessed by the ABA and classified by the Classification Board by reference to the standards set out in Australian classification legislation and codes.
Schedule 5 of the Broadcasting Services Act 1992 gives the ABA the following functions:
One of the core objectives of the scheme is to "address community concerns about offensive and illegal material on the Internet and, in particular, to protect children from exposure to material that is unsuitable for them". Clearly, then, the issue of racism on the Internet, which concerns both offensive and potentially illegal material (including material, such as games and music, targeted principally at young people), falls within the parameters of one of the ABA's core aims, despite the fact that it is not dealt with in the regulatory framework.
The regulatory scheme applies only to material that can be stored, not to real-time activities or communications via the Internet. Thus website pages are subject to ABA regulation. Where games can be downloaded and sampled, they form Internet content subject to ABA regulation. The ABA also regulates audio, such as music samples, and where lyrics are posted on the Internet they are stored content subject to the scheme. The ABA does not regulate email, or real-time activities or discussions, as these are not Internet material that would be stored or archived.
The ABA complaints mechanism
The ABA utilises a complaints-activated monitoring system as a key tool in discharging its legislative functions. The ABA's on-line complaints information describes the following Internet material as prohibited and consequently as material which may be the subject of a complaint:
These two categories of prohibited content are based on the National Classification Code and Classification Guidelines as authorised by the Classification (Publications, Films and Computer Games) Act 1995.  We will examine these classification standards more closely when discussing the role of the Office of Film and Literature Classification below.
As noted, racially vilificatory content would not generally be prohibited by these classificatory standards.
As part of an investigation, the ABA may request the Classification Board, based in the Australian Office of Film and Literature Classification, to classify the content according to its Guidelines for the Classification of Films and Videotapes (Amendment No.3).
If the content breaches, or is likely to breach, the classificatory standards, the ABA has the power to require the content host to remove the material from its service - where the content host is based in Australia. Therefore, dealing with prohibited Internet content in Australia is a relatively straightforward matter and failure by content hosts to comply with ABA orders can attract significant penalties.
Like all agencies regulating Internet content, the ABA faces greater difficulties in dealing with material carried and hosted by off-shore providers. Where content breaches or is likely to breach the classificatory standards, and it is not hosted in Australia, the ABA will notify the suppliers of approved filters of the content so that these filters can be configured to block access to it. The codes of practice require Australian ISPs to provide one of these products to subscribers (see below). The ABA may also liaise with overseas regulators to bring the problem to the attention of overseas content hosts.
In cases where the content breaches criminal standards (for example, child pornography), the ABA may refer the material to the appropriate law enforcement agency, the relevant state or territory Police or the Australian Federal Police. As we have seen, some racial vilification may fall into this category: some racist Internet material may constitute a criminal offence in some Australian states. It is unclear what referral processes to the police actually occur in such cases, but it is unlikely that the ABA would act on such material unless it also contained child pornography or other serious illegality.
2. Classification Standards and the Office of Film and Literature Classification
The ABA may request the Classification Board to classify Internet content according to the Classification Guidelines and the National Classification Code.  In addition to publications, including Internet publications, this classificatory regime applies to visuals and computer games. As noted above though, the classificatory standards do not prohibit racial vilification on the Internet despite the unlawful nature of the activity. This raises the issue of the adequacy of the current classificatory standards in dealing with content that is unlawful, or potentially unlawful, under the Racial Discrimination Act 1975 or other anti-vilification laws.
The classificatory standards in Australia are concerned with the prohibition or restriction of material which is sexually explicit or portrays extreme violence.  There are six main classification categories.  For the purposes of this discussion it can be noted that material classified RC and X is prohibited and R-rated material must be subject to a restricted access system. The descriptions applying to these classifications within the relevant Guidelines are principally concerned with material of a sexual or violent nature. 
It is problematic that the classificatory regime in Australia does not prohibit Internet material that is unlawful or potentially unlawful. The classificatory regime is intended to reflect contemporary community standards  and these must include the standards established by federal law in the Racial Discrimination Act 1975, as well as state criminal provisions.
It can be argued that the prohibition of racial vilification on the Internet is an objective consistent with the aims and principles of the classificatory system in Australia. Moreover, the inconsistency between the classificatory standards and the Racial Discrimination Act and criminal anti-vilification laws creates obvious uncertainty and inefficiencies in Internet content regulation.
3. Industry Codes of Practice registered by the ABA
The regulatory scheme in Australia emphasises industry self-regulation through the development of industry codes of practice.  Where codes are not developed by industry or are inadequate, the ABA is able to develop and impose an industry standard. There are currently three codes of practice, or content codes, developed by industry and registered by the ABA which provide some of the industry standards applicable to Internet content.  Two of these codes apply to Internet service providers and one to Internet content hosts. Internet service providers (ISPs) offer a service for carrying communications to the public.  An Internet content host (ICH) is an organisation that hosts Internet content in Australia. 
The three codes of practice have been developed by the Internet Industry Association, the key industry representative organisation in Australia. The original codes were revised, amended and re-registered in 2001 and 2002, and are due to be reviewed again in November 2003. 
The ABA may direct an ISP or ICH to comply with a code if satisfied that it is not already doing so. Failure to comply with such a direction may be an offence under the Broadcasting Services Act 1992.
Content Code 1 deals with ISP obligations in relation to general Internet access. In conformity with the stated priority of the Broadcasting Services Act 1992, the code is principally concerned with minimising access by children to unsuitable Internet material. For example, the code outlines steps intended to ensure that Internet access accounts are not provided to persons under the age of 18 years without the consent of a parent, teacher or other responsible adult. It also requires ISPs to encourage those of their subscribers that are content providers to use appropriate labelling "in respect of Content which is likely to be considered unsuitable for children." Furthermore, the code requires ISPs to provide users with information about the supervision of children's access to the Internet and other matters such as Internet content filtering software, labelling systems and filtered Internet carriage services.
ISPs are deemed to have fulfilled these requirements "where they direct users, by means of a link on their Home Page or otherwise, to resources made available for the purpose from time to time by the IIA, the ABA, NetAlert or other organisation approved by the IIA". Accordingly, ISPs can generally discharge their responsibilities under the code through referral information.
The code also requires ISPs to have procedures to deal with complaints from subscribers about unsolicited email that advertises Internet information and sites "likely to cause offence to a reasonable adult".  This responsibility is discharged by an ISP by providing "complainants with, or direct[ing] them to, information describing methods by which receipt of unsolicited email of this nature can be minimised". 
Significantly, the code also requires ISPs to inform content providers "of their legal responsibilities, as they may exist under the Act or complementary State or Territory legislation in relation to Content which they intend to provide to the public via the Internet from within Australia".  With respect to all subscribers (and not just content providers) the code states that ISPs must take reasonable steps to inform subscribers:
The Code clarifies that ISPs will have fulfilled these obligations to inform subscribers "where they have included, on their Home Page or prominent Web Page" the information stipulated above, or provided a link to a Web Page containing that information and approved for that purpose by the IIA.  Again then, the responsibilities of ISPs, which are sketched in very general terms under the code, are principally discharged through referral.
Content Code 2 deals with ISP obligations in relation to access to content hosted outside Australia. Specifically, the code provides that ISPs must provide filter technology at a reasonable cost when notified by the ABA of prohibited or potentially prohibited content, except where the end user already has technology in place, such as a firewall, which is likely to provide a reasonably effective means of preventing access to the material.
Finally, Content Code 3 deals with Internet content host (ICH) obligations. Again, this code is principally concerned with minimising children's access to unsuitable material, and so it replicates many of the provisions outlined in Content Code 1. It also outlines the same standards as Content Code 1 regarding unsolicited emails advertising offensive sites.
Content Code 3 also requires content hosts to advise the content providers who use their services "not to place on the Internet content in contravention of any State, Territory or Commonwealth law".  It also refers to providing users with information about their right to complain to the ABA about content. Again, this obligation is discharged when the right and means to complain to the ABA are referred to in a notice on the content host's Home Page, a web page link or in the service contract. 
Finally, Content Code 3 confirms the compliance of content hosts with take-down and other notices from the ABA. It also requires content hosts to comply with the directions of other "Relevant Authorities" and to advise other content hosts of prohibited content.
There are many other industry codes adopted by ISPs around the world, and these are briefly outlined in the United Nations report that supplements this paper.
4. Non-regulatory Approaches to Content Control
A range of non-regulatory mechanisms is available to assist in responding to and minimising racist content on the Internet. These include hotlines, filtering, rating systems, and education and awareness initiatives. Many of these are overviewed in the Safer Internet Action Plan (SIAP) developed by the European Union, as well as by the United Nations.
Hotlines are one approach used to deal with inappropriate or unsuitable Internet content, a strategy particularly emphasised in Europe. Hotlines provide a complaints service to the public and hence are a good way of monitoring Internet content. Some hotline agencies are state funded while others are industry financed. One of the most significant European hotline associations is INHOPE (Internet Hotline Providers in Europe). INHOPE's principal focus has been on child pornography, though in recent years it has broadened its focus to the problem of racism on the Internet.
Reference has already been made to filtering systems which can automatically restrict access to problematic sites according to general notifications, end-user selection or keywords. These filtering technologies are canvassed in a range of reports, and particularly by the Australian Broadcasting Authority.  Several of these filters are applied specifically to limit access to hate speech. 
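The keyword approach mentioned above can be sketched very simply. The blocklist and sample pages below are invented for illustration; commercial filter products combine vendor-maintained blacklists, site category databases and heuristics rather than a bare word list:

```python
# Minimal sketch of a keyword-based content filter.
# The blocked_keywords set is illustrative only; real filter
# products ship vendor-maintained lists and category databases.
blocked_keywords = {"examplehateword", "anotherbannedterm"}

def is_blocked(page_text: str) -> bool:
    """Return True if the page text contains any blocked keyword."""
    words = page_text.lower().split()
    return any(word in blocked_keywords for word in words)

print(is_blocked("an innocuous page about gardening"))       # False
print(is_blocked("a page containing ExampleHateWord here"))  # True
```

This sketch also illustrates the familiar weakness of keyword filtering: it can both under-block (misspellings, images) and over-block (legitimate anti-racism material that quotes the offending terms).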
Rating systems allow content creators and/or third parties to classify content. This rating is then identified by the end-user's filtering system and access is determined accordingly. The major system used for associating labels with Internet content is the PICS (Platform for Internet Content Selection) mechanism developed by the World Wide Web Consortium (W3C). If an appropriate vocabulary can be identified, PICS can be used to classify material based on racist content, although the system does not guarantee that all sites will be rated, or rated appropriately. The Internet Content Rating Association (ICRA) is the most prominent rating association in the world, and it operates its rating process by way of a questionnaire completed by content providers. Again, there is no requirement that sites be rated.
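To illustrate how PICS associates machine-readable labels with content, a label can be embedded in a page's HTML head. The rating-service URL and the category code below are hypothetical, modelled on the general PICS-1.1 label syntax; an anti-racism vocabulary of the kind discussed above would define its own service URL and categories:

```html
<!-- Hypothetical PICS-1.1 label embedded in a page's <head>.
     The service URL and the category "h" (hate content, 0 = none)
     are invented for illustration. -->
<meta http-equiv="PICS-Label"
      content='(PICS-1.1 "http://example.org/anti-racism-ratings.html"
                l gen true
                for "http://example.com/page.html"
                r (h 0))'>
```

A PICS-aware filter on the end-user's machine reads such labels and compares the declared category values against the user's configured thresholds before allowing access, which is why unrated or dishonestly rated sites remain the scheme's main limitation.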
Search engines also provide a possible framework through which racism on the Internet can be limited. Search engine operations are generally based on keywords, with each system using a different approach to rank the search results requested by an end-user. It has been suggested that search engines could be effectively utilised to apply content rating frameworks such as PICS.  It is also proposed that search engine catalogues can be directed to rank anti-racist material as highly as racist material in returning search requests. 
End-user education is another non-regulatory tool available to combat racism on the Internet. In line with its emphasis on protecting children from harmful content, the Australian Broadcasting Authority has developed its "Cybersmart Kids Online" education tool for children. Other important 'net literacy' resources internationally include Childnet International and the "Quality Information Checklist". Some community organisations established to combat racism and hatred can also be important resources in this regard. The better-known international hate monitoring groups include the Anti-Defamation League in the US, Hate.com, Tolerance.org, Cyber-squatters against Hate, the Simon Wiesenthal Centre, the Southern Poverty Law Centre and Turn it Down. Other non-regulatory measures that could assist include monitoring and promotional frameworks, such as the development of guidelines and education programs.
1. The following description of the role and complaints process of the ABA has been taken from the ABA's website. Please see: http://www.aba.gov.au/internet/index.htm.
© Human Rights and Equal Opportunity Commission, 2003.