Technology, Cyberspace and Reality Versus Fantasy
It could be said that “information poison” infects the mind of a reader like poison ivy, wrapping its noxious tendrils around fact and logic and leaving reality and fantasy intertwined. Consequently, countless reputations, careers and families are destroyed every day for the sake of clicks, sales and advertising revenue. This essay will first define “information poison” and the various guises it can assume, including misinformation, disinformation and mal-information. It will then explain how this assortment of metaphorical toxins is spread, detailing the roles that both the media and the public play in not only the initial dissemination of information disorder, but also its inevitable proliferation. Following this, the possible social consequences of this “poisoning” will be examined. Finally, in addition to technology’s contribution to the well of polluted information, the ways in which it could provide a potential antidote will be analysed. It will be argued that technology cannot provide a cure for “information poisoning” without encroaching on fundamental human rights and freedom of speech.
“Information poison” comes in many forms, three of which are misinformation, disinformation and mal-information. While these terms are often used interchangeably, they are in fact distinct forms of information disorder. Misinformation is false information spread by a person who mistakenly believes it to be accurate. Conversely, disinformation is also false, but it is deliberately disseminated in full knowledge of its erroneous nature. This could be done for many reasons, such as a desire to intentionally mislead the public, increase wealth or gain followers on social media. Lastly, mal-information is factual, but it is spread maliciously and unjustifiably. A hypothetical example would be releasing the laboratory results of a celebrity who tests positive for the Human Immunodeficiency Virus. Sharing personal and sensitive information such as this without consent is considered “unethical journalism” (Wardle & Derakhshan, 2018) and is a violation of privacy. Bigotry, racism and homophobia create hate-fuelled “information poison”. The purpose of this variety is generally to incite fear, distrust and anger by preying on minorities at emotionally heightened times, for example during and directly after the infamous 2014 “Sydney Siege” (Ali & Khattab, 2018). During this terrifying ordeal, eighteen innocent people were held hostage in a café throughout a 16-hour standoff before the lone perpetrator opened fire. The gunman’s motives have since been heatedly debated; although the siege was officially deemed an act of Islamic extremism, it is doubtful that the same label would have been applied had the shooter been Caucasian. As Kurasawa (2018, p. 3) argued, in reference to a similarly sensationalised shooting in Toronto: “Without a shred of evidence and fuelled by pre-existing agendas and biased assumptions, many social media insta-pundits and click-starved members of the media are more than willing to jump to conclusions.” It is important to note that while there are many varying forms of “information poison”, all are destructive and often have long-term negative ramifications.
Infectious Fantasies and Contagious Tall Tales
The spread of “information poison” now occurs swiftly and irrevocably, as opposed to its slower, more insidious transmission in bygone years. It is no longer necessary to wait for the morning paper or the evening news to stay abreast of the latest local, national and international happenings; the World Wide Web and social media now track news in real time. In this age of immediacy, when law enforcement personnel exercise justified restraint in their release of facts, the media has an all too familiar tendency to fill in the gaps (Kurasawa, 2018). This ultimately results in conflicting versions of events when the official story is eventually made public. These differing stories leave the public confused and inflict further suffering on the families of perpetrators, who regularly become victims of prejudiced assumptions. “What is clear is that we live in a social environment characterized by severe information pollution, in which the well is poisoned for everyone” (Kurasawa, 2018). Today, society is not only consuming “information poison” directly from a single media source, but instantly sharing the toxicity with a diverse cyber network of friends, family and acquaintances.

People want to believe the dramatic; they seek out the supernatural and the paranormal with little care for logic or sound reasoning. The Cottingley Fairies photographs of 1917 and The Blair Witch Project film of 1999 are two apt examples of historic “information poison” (Atkinson, 2018). One need only glance at the alleged photographs of fairies to see how unconvincing and poorly executed the images are, which is hardly surprising given that they were captured by children playing with cardboard cut-outs and a camera. Yet the photographs were believed to be real because people wanted them to be. Similarly, The Blair Witch Project was claimed to be legitimate footage filmed by a group of students who went missing while making a documentary on the legend of the Blair Witch; years later, it was revealed that the “students” were actors. These two examples indicate that the poison itself is not new; rather, it has evolved, becoming more frequent, accessible and detrimental with time.

Long-Term Ramifications
There are very real social consequences of the malicious dissemination of “information poison”. One example is the deepfake, sometimes referred to as face-swapping. When this occurs in the form of pornography, it can be highly destructive and damaging to the reputation of the victim, and deleting the initial post is unlikely to mitigate these negative effects: so wide and far-reaching is the network and rapidity of the internet that the content may already have been saved or downloaded by millions of people. Any of these people could then re-upload their own copy to their personal social media accounts, and thus the vicious cycle begins again. Aside from reputations, “information poison” can also negatively impact mental health. This is evidenced by the 1938 radio broadcast “The War of the Worlds” (Atkinson, 2018), an early example of fictional news-style reporting that dramatised an alien invasion. Thousands of Americans were unaware of the report’s fictitiousness and fled their homes, believing Martians were truly coming. The possible long-term impacts of such a frightening ordeal include anxiety, lowered or lost self-esteem, post-traumatic stress disorder and an ongoing distrust of the media in general. Character defamation and mental health issues are an absurdly unjust price to pay for personal entertainment, amusement or the furthering of one’s career.

Human Rights and Freedom of Speech
Technology can, at best, aid willing individuals in their pursuit of truth, knowledge and facts, but it is unconscionable to envisage it as capable of providing a total cure. This is partly because “information poison has social and cognitive roots” (Ciampaglia & Menczer, 2018), and therefore any possible technological solution would require a certain element of user uptake, input and engagement. Forcing these onto unwilling citizens would be a gross abuse of power and human rights. To illustrate this, it is helpful to note that even amid the global health crisis caused by the 2019 coronavirus (COVID-19), the Australian government refused to make mandatory its “COVIDSafe” (Department of Health, 2020) disease tracking and tracing application, a tool designed to help authorities contact people who have been exposed to the virus. The voluntary status of COVIDSafe is a relevant and timely example of the technological and privacy freedoms Australian leaders are committed to protecting, even in times of crisis. These freedoms further extend to creativity, one example being satire news sites such as “The Onion” (Nackers, 1996). Satire news can be problematic if it is mistakenly interpreted as authentic; however, making this form of creative writing illegal would violate freedom of speech. In America, where this source of entertainment is quite popular, it is protected under the First Amendment (Egemenoglu, 2020). Also notable is that these sites clearly state that their content is fictional, so the onus for misinterpretation rests essentially on the reader. For those who are willing to think critically and exert a little effort, technology such as search engines, credible websites and reputable online fact checkers can be of great assistance in researching and evaluating information, helping to weed out the “poison” from the truth.

References
Ali, S., & Khattab, U. (2018). Trans-mediatized terrorism: The Sydney Lindt Café siege. Global Media and Communication, 14(3), 301-323. Retrieved from http://journals.sagepub.com/doi/10.1177/1742766518811367

Atkinson, S. (2018, September 18). Deepfakes: What fairies and aliens can teach us about fake videos. The Conversation. Retrieved from https://theconversation.com/deepfakes-what-fairies-and-aliens-can-teach-us-about-fake-videos-102747

Ciampaglia, G. L., & Menczer, F. (2018, June 20). Misinformation and biases infect social media, both intentionally and accidentally. The Conversation. Retrieved from https://theconversation.com/misinformation-and-biases-infect-social-media-both-intentionally-and-accidentally-97148

Department of Health. (2020, May 27). COVIDSafe app. Australian Government. Retrieved from http://www.health.gov.au/resources/apps-and-tools/covidsafe-app

Egemenoglu, E. (2020, March). First Amendment. Cornell Law School. Retrieved from http://www.law.cornell.edu/wex/first_amendment

Kurasawa, F. (2018, July 27). Social media can be information poison when we need facts most. The Conversation. Retrieved from https://theconversation.com/social-media-can-be-information-poison-when-we-need-facts-most-100495

Nackers, C. (1996, October 18). About The Onion. The Onion. Retrieved from https://www.theonion.com/about

Newman, N. (2018). Digital News Project 2018: Journalism, media, technology trends and predictions 2018. Retrieved from http://www.digitalnewsreport.org

Wardle, C., & Derakhshan, H. (2018). Thinking about “information disorder”: Formats of misinformation, disinformation, and mal-information. Journalism, “Fake News” and Disinformation, pp. 32-42. Retrieved from https://en.unesco.org