
Jamaica in the digital era (Part II)

Published: Tuesday | March 19, 2019 | 12:07 AM
Cordel Green
The Chamber of Parliament, where legislation is debated and approved.

The following is the second part of a paper updated by Anthony Clayton, chairman of the Broadcasting Commission of Jamaica, and Cordel Green, executive director of the commission. Part I appeared yesterday.

Media literacy

The priority for Jamaica is no longer accessibility, as almost everyone in the nation has a mobile device and can choose a data plan. It is now media literacy. Having access to the Internet is only the first step. People also have to know how to use the Internet to change the way they live, learn, and work. The evolving digital environment requires citizens who are Internet-literate, are confident creators and consumers of content, and have the technical and social skills needed to participate positively in the digital world. The Broadcasting Commission has therefore redoubled its commitment to improve media literacy throughout the land.

The new approach to content regulation

Content now flows seamlessly across borders. Consumers are able to bypass traditional networks and all associated safeguards. A vast new array of opportunities is opening up for people to participate in the digital economy, no longer as passive consumers, but now as prosumers, i.e. both consumers and creators of content, and this economic mobilisation of the people has tremendous potential benefits for Jamaica.

Sadly, it is also true that evil never sleeps. New forms of crime, terrorism, political manipulation, hate speech, and malice have already emerged in the digital world and are quick to move into any undefended spaces. The current generation of problems includes uncontrolled access to extreme pornography and ultra-violent content; hate speech; unethical advertising; Internet addiction disorder; cyberbullying, grooming, and revenge porn; induced suicides; the use of manipulative disinformation and hate speech on social media to instigate riots and murders; cybercrime, including scamming, phishing and bank hacks, identity theft, fraud, and the resale of stolen credit card details; narcotics distribution via the dark web; terrorist recruitment on social media platforms, which has allowed organisations such as the Islamic State to reach out to disaffected youth across the world, bypassing all border controls, and convince them to carry out lone-wolf attacks in their own countries; the rapid spread of bogus conspiracy theories; the polarisation of politics by fake news and unfiltered hate speech; and the undermining of the concept of truth, which happens because many people can no longer tell the difference between fake and real news.

The need for standards and controls

As the population shifts to online sources for content, the traditional media have rapidly become less profitable as advertising has migrated along with the eyeballs. Many newspapers have closed while other media houses have been pushed into mergers or are desperately trying to find new sources of revenue in order to survive. As their profitability erodes, all but a few are losing much of their primary news-gathering and fact-checking capacity, and some are now just pulling their news off the web in a process that has become self-referential. The loss of the fact-checking gatekeepers and the increasing reliance on trending topics on social media makes it increasingly difficult for people to distinguish between fake news, Internet gossip, and reliable sources of information.

The diminution of authoritative and independent sources of news also means that many people now obtain their information from closed loops of like-minded individuals in the same social media groups, which encourages political tribalism and more extreme views, increases vulnerability to fake news and manipulation via social media, and thereby starts to undermine the basis of tolerance that is the foundation of democracy and participation in society. One of the vital roles of government is to create the conditions for economic prosperity and development. This means that government has to address the issues outlined above and develop a modern regulatory framework to deal with the new era of content proliferation and monopolistic concentration.

The need for action

Social media has become the largest news source in the world but has little editorial control, regulation, or legal recourse against lies and slander. Governments have become deeply concerned about the ways that social media can now be used to undermine the truth, create division, spread hatred, and manipulate democracy, and a number of governments are, therefore, now preparing to take steps to change the law and regulate social media.

In the United Kingdom, for example, a recent parliamentary report concluded an 18-month investigation into fake news. It found that Facebook had deliberately and repeatedly broken data privacy and competition laws, and it called for the company to be regulated.

The report makes a number of important recommendations, including:

• That the existing standards for broadcasting should become the basis for standards for all content, including conventional and Internet broadcasting and social media. The same standards for truth and decency should apply, regardless of the technology used to carry the information.

• That there must be clear guidelines on misinformation and disinformation (rather than the better-known phrase ‘fake news’), which will then form the basis for regulation and enforcement across all media platforms.

• That there is need for regular audits of the security of social networks to ensure that the technology companies are operating responsibly and have safeguards in place to protect the data of users and prevent their legitimate privacy from being compromised.

• That the rules on political campaigns and finance must be updated to make them fit for the digital age.

• To consider introducing a new tax on technology companies and to use this to fund digital literacy programmes.

The UK’s communications regulator, Ofcom, also published a discussion paper on harmful online content, which takes a more nuanced approach. It suggests that broadcasting rules cannot simply be applied online, mainly because of the sheer volume of content generated or shared by online platforms, the enormous diversity of online content (which includes user-generated content and conversations), the fact that many online platforms do not create the content accessed by their users, differences in context and audience expectations between broadcast and online sources, and the multinational nature of online platform operators. Ofcom, therefore, took a principles-based approach, i.e., it identified principles from broadcasting regulation that could form the basis for a framework for online protection. The main principles are:

• Protection and reassurance against harmful content and conduct, reflecting appropriate societal norms and setting clear standards that the regulated parties are required to adopt in their practices and procedures. This is similar to the idea that the same standards for truth and decency would apply regardless of the technology used to carry the information.

• Enforcement, involving appropriate sanctions to deter bad behaviour. This means that regulators would be given new powers to sanction offenders.

• Upholding freedom of expression. In practice, this means that regulators will pay more attention to the technical processes that platforms employ to identify, assess, and address harmful content, as well as how they handle appeals, to ensure that legitimate comment gets through while malicious attacks can be prevented or retracted.

• That the regulatory approach must be flexible and adapt to changing consumer behaviour and expectations and to allow for technological innovation in developing better ways to protect users.

• Transparency about the rules underpinning the regulatory regime, including availability of information to consumers about how platforms decide what content is shown or given prominence, and the source of specific content. One of the goals here is to make it easier to see the difference between real news and deliberate disinformation.

• Public consultations to inform all future changes to regulatory requirements. This is to ensure that the rules regarding, for example, decency will change in line with social values and standards.

The Ofcom report identified the most pressing priorities in the development of online standards. These were:

• The protection of children across all sources and types of content. This is to ensure that child pornography and the arrangements for trafficking children are driven off the Internet.

• Protection from illegal or harmful content in viewing or online interactions, including exposure to hate speech, the promotion of terrorism, encouragement of suicide, self-harm or violence, bullying, harassment or trolling, disinformation, and fake news.

• Mandatory provision of information to users to allow them to make a more informed assessment of material they view online with regard to whether it is factual or fictitious. This could include greater transparency requirements for the algorithms used to rank search results.

Jamaicans want action!

The Broadcasting Commission commissioned a survey in October 2018 which revealed that the public was now ‘extremely concerned’ about the extent to which children in Jamaica are now exposed to a range of serious online dangers. About 87 per cent of Jamaicans said that their greatest concern was that their children might see pornographic videos involving children.

Other major concerns about online risks to children were exposure to violent and pornographic material, including ‘revenge porn’ and human trafficking.

Jamaicans are also very concerned about fake news (70 per cent) and exposure to graphic videos and images from accident and crime scenes filmed by bystanders on their cellphones then circulated on social media.

An overwhelming majority of Jamaicans (82 per cent) said that they wanted the Broadcasting Commission to educate the public on how to protect themselves and their families online. They agreed that protection against malicious and harmful online content was important and necessary, especially for young people who are increasingly able to access unrestricted content.

The BCJ’s response

Jamaica’s legislation and approach to media and communications regulation were largely developed in a bygone age when telephones, televisions, radios, film, and newspapers were all different technologies that required separate rules and regulation. They will not serve in the new, integrated environment. The Broadcasting Commission believes that regulation in the new era requires a much more sophisticated approach than the traditional directives and sanctions employed by regulators in the past. Content regulation has to be limited to what is both essential and feasible and must always have regard for the right to freedom of expression and access to information. All interventions, in this new era, must be lean, transparent, efficient, effective, content-focused, and technology-agnostic.

The BCJ is, therefore, developing a new toolkit, which now includes a mix of sophisticated educational and advisory interventions, as well as the traditional sanctions, and we are developing more flexible models of light, cost-effective influence. This includes digital literacy, as digitally literate citizens are now the first line of defence against online crime, fraud, and disinformation. So one of the BCJ’s goals is to equip citizens with the skills and knowledge needed to recognise if there has been an infringement or misuse of their personal information; to detect media manipulation via disinformation and botnet operations; to detect penetration of social media by terrorist or criminal networks; and to guard themselves against malicious, harmful, and inappropriate content.

The BCJ’s current public education initiatives include a schools’ outreach programme and advertisements such as the award-winning video ‘Pinchy Dead’, which warns about fake news. Other BCJ interventions are designed to encourage ethical and responsible sharing of information, for example, warning about why it is important to avoid circulating pictures of dead bodies and horrific accident scenes on social media.

The way forward for Jamaica

The BCJ recently submitted two substantial papers to the Government, outlining the way forward for Jamaica. Some of the main points are as follows:

• We need a fundamental rethink of the legislative framework for media and content regulation. Content should be regulated in a dedicated, specialised, and technology-agnostic manner, across platforms and devices, encompassing broadcasting, cinemas, video games, social media, virtual reality, augmented reality and AI applications.

• We need to further expand the media literacy and digital-awareness programmes, working through schools and educational organisations to engage students and adults (especially parents) on the critical media and information issues of the day such as cyberbullying, revenge porn, Internet addiction, and other problematic Internet-use issues. We need to build the capacity of youth, parents/guardians, and teachers to detect and report risks and sensitise people to signs of radicalisation or gang recruitment.

• In the longer term, convergence will move beyond telecommunications, broadcasting and spectrum to include infrastructure (such as the smart grid, smart roads, and autonomous transport systems, industrial symbiosis and other components of the smart city), government operations, education and training, and all other aspects of life as platforms and networks will continue to converge. The institutional arrangements to deal with technological convergence must, therefore, be carefully considered.

New institutional structures will be required as many existing regulatory boundaries will disappear.

• We must deepen our engagement with regional and international counterparts, as some jurisdictional issues (such as regulating technology companies) will require a regional or global approach.

• We must consider taxing foreign-based technology companies with a strong presence in the Jamaican market, for example, and use part of the proceeds to fund digital literacy and build regulatory capacity.

• We must develop algorithms to monitor content online and assess the efficacy of the tools used by online operators to protect against various harms.

• We must support our media companies in developing their capacity for fact-checking and detecting disinformation and political manipulation.

• We need to update the provisions for political advertising and campaign finance to make them fit for the age of social media.

The BCJ’s new mission

The BCJ believes that Jamaica’s future depends on making a successful transition to a fully digital society and economy, so the BCJ emphasises the new opportunities to create and share information for knowledge building, learning, development, and economic activity while at the same time giving the citizens the skills they need to defend themselves against the predators that operate online.

This is why the Broadcasting Commission of Jamaica has a new mission. Its mission now is “to ensure a successful national transition to a digital economy, using the empowering and liberating potential of technological innovation to encourage new forms of business, social, cultural, and media development while protecting the people of Jamaica from potential abuses of communication and influence. We guard against malicious, harmful, and inappropriate content; we operate public education programmes to build the capacity of youth, parents, guardians, teachers, and the general public to detect and respond to harmful material; and we work with the media to encourage high standards and trustworthiness in journalism”.

Anthony Clayton is a professor of Caribbean Sustainable Development at The University of the West Indies and chairman of the Broadcasting Commission of Jamaica.

Cordel Green is an attorney-at-law and executive director of the Broadcasting Commission.