
How people get sucked into misinformation rabbit holes – and how to get them out

<p><em><a href="https://theconversation.com/profiles/emily-booth-715018">Emily Booth</a>, <a href="https://theconversation.com/institutions/university-of-technology-sydney-936">University of Technology Sydney</a> and <a href="https://theconversation.com/profiles/marian-andrei-rizoiu-850922">Marian-Andrei Rizoiu</a>, <a href="https://theconversation.com/institutions/university-of-technology-sydney-936">University of Technology Sydney</a></em></p> <p>As misinformation and radicalisation rise, it’s tempting to look for something to blame: the internet, social media personalities, sensationalised political campaigns, religion, or conspiracy theories. And once we’ve settled on a cause, solutions usually follow: do more fact-checking, regulate advertising, ban YouTubers deemed to have “gone too far”.</p> <p>However, if these strategies were the whole answer, we should already be seeing a decrease in people being drawn into fringe communities and beliefs, and less misinformation in the online environment. We’re not.</p> <p>In new research <a href="https://doi.org/10.1177/14407833241231756">published in the Journal of Sociology</a>, we and our colleagues found radicalisation is a process of increasingly intense stages, and only a small number of people progress to the point where they commit violent acts.</p> <p>Our work shows the misinformation radicalisation process is a pathway driven by human emotions rather than the information itself – and this understanding may be a first step in finding solutions.</p> <h2>A feeling of control</h2> <p>We analysed dozens of public statements from newspapers and online in which former radicalised people described their experiences. 
We identified different levels of intensity in misinformation and its online communities, associated with common recurring behaviours.</p> <p>In the early stages, we found people either encountered misinformation about an anxiety-inducing topic through algorithms or friends, or they went looking for an explanation for something that gave them a “bad feeling”.</p> <p>Regardless, they often reported finding the same things: a new sense of certainty, a new community they could talk to, and feeling they had regained some control of their lives.</p> <p>Once people reached the middle stages of our proposed radicalisation pathway, we considered them to be invested in the new community, its goals, and its values.</p> <h2>Growing intensity</h2> <p>It was during these more intense stages that people began to report more negative impacts on their own lives. This could include the loss of friends and family, health issues caused by too much time spent on screens and too little sleep, and feelings of stress and paranoia. To soothe these pains, they turned again to their fringe communities for support.</p> <p>Most people in our dataset didn’t progress past these middle stages. However, their continued activity in these spaces kept the misinformation ecosystem alive.</p> <p>When people did move further and reach the extreme final stages in our model, they were doing active harm.</p> <p>In their recounting of their experiences at these high levels of intensity, individuals spoke of choosing to break ties with loved ones, participating in public acts of disruption and, in some cases, engaging in violence against other people in the name of their cause.</p> <p>Once people reached this stage, it took pretty strong interventions to get them out of it. The challenge, then, is how to intervene safely and effectively when people are in the earlier stages of being drawn into a fringe community.</p> <h2>Respond with empathy, not shame</h2> <p>We have a few suggestions. 
For people who are still in the earlier stages, friends and trusted advisers, like a doctor or a nurse, can have a big impact by simply responding with empathy.</p> <p>If a loved one starts voicing possible fringe views, like a fear of vaccines, or animosity against women or other marginalised groups, a calm response that seeks to understand the person’s underlying concern can go a long way.</p> <p>The worst response is one that might leave them feeling ashamed or upset. It may drive them back to their fringe community and accelerate their radicalisation.</p> <p>Even if the person’s views intensify, maintaining your connection with them can turn you into a lifeline that will see them get out sooner rather than later.</p> <p>Once people reached the middle stages, we found third-party online content – not produced by governments, but by regular users – could reach people without backfiring. Considering that many people in our research sample had their radicalisation instigated by social media, we also suggest the private companies behind such platforms should be held responsible for the effects of their automated tools on society.</p> <p>By the middle stages, arguments on the basis of logic or fact are ineffective.
It doesn’t matter whether they are delivered by a friend, a news anchor, or a platform-affiliated fact-checking tool.</p> <p>At the most extreme final stages, we found that only heavy-handed interventions worked, such as family members forcibly hospitalising their radicalised relative, or individuals undergoing government-supported deradicalisation programs.</p> <h2>How not to be radicalised</h2> <p>After all this, you might be wondering: how do you protect <em>yourself</em> from being radicalised?</p> <p>As much of society becomes more dependent on digital technologies, we’re going to get exposed to even more misinformation, and our world is likely going to get smaller through online echo chambers.</p> <p>One strategy is to foster your critical thinking skills by <a href="https://www.cell.com/trends/cognitive-sciences/abstract/S1364-6613(23)00198-5">reading long-form texts from paper books</a>.</p> <p>Another is to protect yourself from the emotional manipulation of platform algorithms by <a href="https://guilfordjournals.com/doi/10.1521/jscp.2018.37.10.751">limiting your social media use</a> to small, infrequent, purposefully-directed pockets of time.</p> <p>And a third is to sustain connections with other humans, and lead a more analogue life – which has other benefits as well.</p> <p>So in short: log off, read a book, and spend time with people you care about. <!-- Below is The Conversation's page counter tag. Please DO NOT REMOVE. --><img style="border: none !important; box-shadow: none !important; margin: 0 !important; max-height: 1px !important; max-width: 1px !important; min-height: 1px !important; min-width: 1px !important; opacity: 0 !important; outline: none !important; padding: 0 !important;" src="https://counter.theconversation.com/content/223717/count.gif?distributor=republish-lightbox-basic" alt="The Conversation" width="1" height="1" /><!-- End of code. 
If you don't see any code above, please get new code from the Advanced tab after you click the republish button. The page counter does not collect any personal data. More info: https://theconversation.com/republishing-guidelines --></p> <p><em><a href="https://theconversation.com/profiles/emily-booth-715018">Emily Booth</a>, Research assistant, <a href="https://theconversation.com/institutions/university-of-technology-sydney-936">University of Technology Sydney</a> and <a href="https://theconversation.com/profiles/marian-andrei-rizoiu-850922">Marian-Andrei Rizoiu</a>, Associate Professor in Behavioral Data Science, <a href="https://theconversation.com/institutions/university-of-technology-sydney-936">University of Technology Sydney</a></em></p> <p><em>Image credits: Getty Images</em></p> <p><em>This article is republished from <a href="https://theconversation.com">The Conversation</a> under a Creative Commons license. Read the <a href="https://theconversation.com/how-people-get-sucked-into-misinformation-rabbit-holes-and-how-to-get-them-out-223717">original article</a>.</em></p>

Mind


Misinformation and the Voice: how can you spot and defuse false claims?

<p>On 14 October, Australians will vote in their first referendum in 24 years.</p> <div class="copy"> <p>The question – whether to establish an Aboriginal and Torres Strait Islander Voice to Parliament – has been hotly debated for much of this year already, and campaigning will ramp up for both the Yes and No votes in coming weeks.</p> <p><a href="https://cosmosmagazine.com/technology/what-if-instead-of-blaming-readers-of-misinformation-we-showed-them-how-to-tell-the-difference-between-facts-and-falsehoods/" target="_blank" rel="noreferrer noopener" data-type="link" data-id="https://cosmosmagazine.com/technology/what-if-instead-of-blaming-readers-of-misinformation-we-showed-them-how-to-tell-the-difference-between-facts-and-falsehoods/">Misinformation</a> and <a href="https://cosmosmagazine.com/health/covid/inoculating-against-disinformation/" target="_blank" rel="noreferrer noopener" data-type="link" data-id="https://cosmosmagazine.com/health/covid/inoculating-against-disinformation/">disinformation</a> about the referendum have also been circulating, both on- and offline.</p> <p>What should we be keeping an eye out for, and what are the best methods of dealing with misinformation? 
<em>Cosmos</em> investigates.</p> <p>“There’s a whole field unto itself on how you classify misinformation,” says Dr Natasha van Antwerpen, a lecturer in psychology at the University of Adelaide.</p> <p>It can vary “from the very blatant, absolute lie, through to something that, even if all the facts are correct, the actual impression that you get is not true”, she says.</p> <p>It’s particularly difficult to see if you’re dealing with statements about the future – such as, ‘a Yes or No vote will cause this thing to happen’.</p> <p>“With prediction, it can be really challenging, because you don’t really have a ground truth to work with,” says van Antwerpen.</p> <p>“Things that you can always look out for tend to be: if it’s a really extreme statement, if there’s no degree of uncertainty in the prediction, and sometimes if it’s very obviously feeding into a politicised narrative, that can be a bit of a red flag.”</p> <p>Acknowledging uncertainty is often a better sign that the information is true, says van Antwerpen, as is checking someone’s citations.</p> <p>“What are the bases that they’re making those predictions on? Have they actually got solid research evidence behind the predictions that they’re making, as opposed to speculation?”</p> <p>While the actions both campaigns want people to take in this referendum are very simple – either vote yes, or no – they rest on a very complicated cultural context.</p> <p>“There’s a lot of things that are feeding into people’s decision making that don’t just come from the campaign, they have extraordinary long legacies in Australia,” says Dr Clare Southerton, a lecturer in digital technology and pedagogy at La Trobe University.</p> <p>“When you’re trying to inform people, they’re always going to be interpreting it through their own lens. 
And that’s how misinformation is able to circulate so rapidly: people respond to it in emotional ways, because they’re coming to it from their own personal histories.”</p> <p>What’s the best way to deal with misinformation if you do come across it?</p> <p>“I wish there was a simple answer,” says Southerton.</p> <p>“Unfortunately, research shows that at this point there is really no <em>most</em> successful strategy.”</p> <p>That said, there are things that work in different circumstances. Southerton says that on social media, reporting the misinformation is a reliable strategy. “When misinformation is mass-reported, it does get taken down – unfortunately, not usually before many, many eyeballs have seen it.”</p> <p>What about your friend or relative who’s dead-set on a stance you know is factually incorrect? Southerton says that while, once again, there’s no method with strong evidence proving it to be the best, connecting with the person “on an emotional level” often helps change their beliefs.</p> <p>“If you can think about where they might be coming from, and connect with them on that level, that’s going to be the most successful. 
Because we know that people share misinformation because the position that the misinformation has taken makes them feel good,” says Southerton.</p> <p>Southerton warns against “debunking” by simply telling someone that they’re wrong.</p> <p>“Correcting someone, or fact checking, feels good to us, but often shames the person who’s shared the misinformation and can radicalise them further.”</p> <p>This doesn’t mean you need to legitimise their viewpoint.</p> <p>“Try and think about ways that you can humanise your position to them,” says Southerton.</p> <p>“Ultimately, this is a very emotional time for Aboriginal people in Australia, to have these kinds of debates happening about them in a way that can open up conversation for extreme racism to happen in the public sphere.</p> <p>“So it’s really important that we don’t legitimise that racism. But at the same time, […] what is actually successful, as a way to combat misinformation, is about connecting with people who are sharing it, and seeing what ways we can best reach them.”</p> <p>For people who deal with a lot of misinformation professionally, van Antwerpen says it’s important to choose which myths to debunk – you won’t be able to fight every single false statement.</p> <p>Once chosen, she recommends <a href="https://www.climatechangecommunication.org/debunking-handbook-2020/" target="_blank" rel="noopener"><em>The Debunking Handbook</em></a> by Stephan Lewandowsky for evidence-based advice on challenging myths.</p> <p>In general, “you want to start with the facts in a very clear way, so you want it to be as concise as possible,” she says.</p> <p>“We used to say ‘never repeat the misinformation’, but that’s changed a bit now. Generally, it’s best to warn that you’re going to say misinformation, and then just say it once.”</p> <p>Then, van Antwerpen says it’s very important to explain why the misinformation is wrong.</p> <p>“Our brains like to have some sort of explanation. 
If we don’t have something to fill the gap that’s left when we correct the misinformation, it will just go back to the misinformation.”</p> <p>Being conscious of political narratives, without feeding them and getting more polarised, is important too.</p> <p>“When we present these really polarised arguments, people often tend to either polarise or they’ll get apathetic and drop out,” says van Antwerpen.</p> <p>“So if you’re looking at informing people, it’s finding how can you communicate it in a way that’s not encouraging that split.”</p> <p><em>Image credits: Getty Images</em></p> <p><em><a href="https://cosmosmagazine.com/people/behaviour/misinformation-voice-referendum/">This article</a> was originally published on <a href="https://cosmosmagazine.com">Cosmos Magazine</a> and was written by <a href="https://cosmosmagazine.com/contributor/ellen-phiddian/">Ellen Phiddian</a>. </em></p> </div>

Legal


Pete Evans "silenced" by Kyle and Jackie O

<p>Pete Evans has been silenced while defending his controversial opinions on <em>The Kyle and Jackie O Show</em>. </p> <p>The disgraced celebrity chef appeared on the KIISFM show on Thursday morning in an attempt to clear his name, after destroying his own career by spreading misinformation about the Covid pandemic. </p> <p>Throughout 2020 and 2021, Evans found himself in hot water after claiming that Covid-19 was a "f**king hoax", and was slapped with many fines after peddling fake, and often dangerous, treatments for the virus online. </p> <p>Calling into the show from his property in north-eastern New South Wales, the 50-year-old doubled down on his opposition to Covid vaccines, masks and social distancing.</p> <p>The former <em>My Kitchen Rules</em> judge also went on to cast doubt upon Covid rapid antigen tests and defended his claims about the "healing" abilities of the BioCharger lamps he was fined for promoting in 2020. </p> <p>However, much of his conversation with Kyle and Jackie O was beeped out by the station's censor.</p> <p>The censorship divided listeners, while some fans of the show called in to declare it was time to "un-cancel" him, because he is a "real man" who "stands up" for his beliefs. </p> <p>Another alleged that Evans had "gone and done the research" and "wasn't just talking s**t", and therefore shouldn't be barred from sharing his opinions publicly, even if they are viewed as scientifically incorrect.</p> <p>Others praised the work of the station's censor, saying the hosts and the station have a responsibility to stop dangerous misinformation being spread, especially when it could cause harm. 
</p> <p>One caller, who was a medical professional, pointed out that Evans isn't qualified to give health advice, and that what he said on-air was "just the tip of the iceberg".</p> <p>"The issue is everyone with 100,000 followers, or whatever it may be, thinks all of a sudden they're a doctor or a personal trainer or wherever it may be and that they're qualified to give this health advice," she said.</p> <p>Another argued that Evans should remain cancelled because he has never apologised for his claims that were proven wrong by Australia's top medical authorities.</p> <p>Kyle Sandilands later explained why the censor had beeped out parts of Evans' interview, as well as some of what was said by the fans who called in to defend him. </p> <p>"I believe that this isn't the censor beeping out what she doesn't believe is right or wrong. This is the censor beeping out what legally we can and cannot put to air," Sandilands explained. </p> <p><em>Image credits: Instagram</em></p>

Caring


The Voice isn’t apartheid or a veto over parliament – this misinformation is undermining democratic debate

<p><em><a href="https://theconversation.com/profiles/dominic-osullivan-12535">Dominic O'Sullivan</a>, <a href="https://theconversation.com/institutions/charles-sturt-university-849">Charles Sturt University</a></em></p> <p><em>Readers please be advised this article discusses racism.</em></p> <p>We’ve heard many different arguments for and against the Voice to Parliament in the lead-up to this year’s referendum. This has included some <a href="https://www.youtube.com/watch?v=4a5MgbXj9kI">media</a> and <a href="https://www.skynews.com.au/australia-news/voice-to-parliament/pauline-hanson-claims-indigenous-voice-is-australias-version-of-apartheid-in-speech-aimed-at-lidia-thorpe-and-albanese/news-story/2d988413c54d81ba0cb9c55f19d9cffa">politicians</a> drawing comparisons between the Voice and <a href="https://au.int/en/auhrm-project-focus-area-apartheid">South Africa’s apartheid regime</a>.</p> <p>Cory Bernardi, a Sky News commentator, <a href="https://www.theguardian.com/australia-news/2023/may/02/liberals-accused-of-flirting-with-far-right-fringe-after-sky-news-show-where-indigenous-voice-compared-to-apartheid">argued</a>, for instance, that by implementing the Voice, “we’re effectively announcing an apartheid-type state, where some citizens have more legal rights or more rights in general than others”.</p> <p>As legal scholar Bede Harris has <a href="https://news.csu.edu.au/opinion/the-voice-to-parliament,-apartheid-and-cory-bernardi">pointed out</a>, it’s quite clear Bernardi doesn’t understand apartheid. 
He said,</p> <blockquote> <p>How the Voice could be described as creating such a system is unfathomable.</p> </blockquote> <h2>Comparisons to apartheid</h2> <p>Apartheid was a system of racial segregation implemented by the South African government to control and restrict the lives of the non-white populations, and to stop them from voting.</p> <p>During apartheid, non-white people could not freely visit the same beaches, live in the same neighbourhoods, attend the same schools or queue in the same lines as white people. My wife recalls her white parents being questioned by police after visiting the home of a Black colleague.</p> <p>The proposed Voice will ensure First Nations peoples have their views heard by parliament. It won’t have the power to stop people swimming at the same beaches or living, studying or shopping together. It won’t stop interracial marriages as the apartheid regime did. It doesn’t give anybody extra political rights.</p> <p>It simply provides First Nations people, who have previously had no say in developing the country’s system of government, with an opportunity to participate in a way that many say is meaningful and respectful.</p> <p>Apartheid and the Voice are polar opposites. The Voice is a path towards democratic participation, while apartheid eliminated any opportunity for this.</p> <p>Evoking emotional responses, like Bernardi attempted to do, can <a href="https://www.pnas.org/doi/10.1073/pnas.1618923114">inspire people</a> to quickly align with a political cause that moderation and reason might not encourage. 
This means opinions may be formed from <a href="https://royalsocietypublishing.org/doi/full/10.1098/rsos.180593">limited understanding</a> and misinformation.</p> <h2>Misinformation doesn’t stop at apartheid comparisons</h2> <p>The Institute of Public Affairs, a conservative lobby group, has published a “research” paper claiming the Voice would be like New Zealand’s Waitangi Tribunal and be able to veto decisions of the parliament.</p> <p>The <a href="https://www.aap.com.au/factcheck/voice-comparisons-with-nz-tribunal-are-just-wrong/">truth</a> is the tribunal is not a “Maori Voice to Parliament”. It can’t <a href="https://www.abc.net.au/news/2023-04-14/fact-check-checkmate-maori-voice-waitangi-tribunal/102217998">veto</a> parliament.</p> <p>The Waitangi Tribunal is a permanent commission of inquiry. It is chaired by a judge and has Maori and non-Maori membership. Its job is to investigate alleged breaches of the Treaty of Waitangi.</p> <p>The tribunal’s task is an independent search for truth. When it upholds a claim, its recommended remedies become the subject of political negotiation between government and claimants.</p> <p>The Voice in Australia would make representations to parliament. This is also not a veto. A veto is to stop parliament making a law.</p> <h2>We need to raise the quality of debate</h2> <p>Unlike the apartheid and Waitangi arguments, many <a href="https://theconversation.com/for-a-lot-of-first-nations-peoples-debates-around-the-voice-to-parliament-are-not-about-a-simple-yes-or-no-199766">objections</a> to the Voice are grounded in fact.</p> <p>Making representations to parliament and the government is a standard and necessary democratic practice. There are already many ways of doing this, but in the judgement of the First Nations’ people who developed the Voice proposal, a constitutionally enshrined Voice would be a better way of making these representations.</p> <p>Many people disagree with this judgement. 
The <a href="https://nationals.org.au/the-nationals-oppose-a-voice-to-parliament/">National Party</a> argues a Voice won’t actually improve people’s lives.</p> <p>Independent Senator Lidia Thorpe says she speaks for a Black Sovereignty movement when she advocates for a treaty to <a href="https://www.abc.net.au/news/2023-01-31/lidia-thorpe-wants-treaty-and-seats-not-voice-qa/101909286">come first</a>. The argument is that without a treaty, the system of government isn’t morally legitimate.</p> <p>Other people support the Voice in principle but think it will have <a href="https://independentaustralia.net/politics/politics-display/voice-to-parliament-yes-vote-has-many-enemies,17190">too much</a> power; <a href="https://theconversation.com/what-australia-could-learn-from-new-zealand-about-indigenous-representation-201761">others</a> think it won’t have enough.</p> <p>Thinking about honest differences of opinion helps us to understand and critique a proposal for what it is, rather than what it is not. Our vote then stands a better chance of reflecting what we really think.</p> <p>Lies can mask people’s real reasons for holding a particular point of view. When people’s true reasons can’t be scrutinised and tested, it prevents an honest exchange of ideas. Collective wisdom can’t emerge, and the final decision doesn’t demonstrate each voter’s full reflection on other perspectives.</p> <p>Altering the Constitution is very serious, and deliberately difficult to do. Whatever the referendum’s outcome, confidence in our collective judgement is more likely when truth and reason inform our debate.</p> <p>In my recently published book, <a href="https://link.springer.com/book/10.1007/978-981-99-0581-2">Indigeneity, Culture and the UN Sustainable Development Goals</a>, I argue the Voice could contribute to a more just and democratic system of government through ensuring decision-making is informed by what First Nations’ people want and why. 
Informed, also, by deep knowledge of what works and why.</p> <p>People may agree or disagree. But one thing is clear: deliberate misinformation doesn’t make a counter argument. It diminishes democracy.<!-- Below is The Conversation's page counter tag. Please DO NOT REMOVE. --><img style="border: none !important; box-shadow: none !important; margin: 0 !important; max-height: 1px !important; max-width: 1px !important; min-height: 1px !important; min-width: 1px !important; opacity: 0 !important; outline: none !important; padding: 0 !important;" src="https://counter.theconversation.com/content/205474/count.gif?distributor=republish-lightbox-basic" alt="The Conversation" width="1" height="1" /><!-- End of code. If you don't see any code above, please get new code from the Advanced tab after you click the republish button. The page counter does not collect any personal data. More info: https://theconversation.com/republishing-guidelines --></p> <p><em><a href="https://theconversation.com/profiles/dominic-osullivan-12535">Dominic O'Sullivan</a>, Adjunct Professor, Faculty of Health and Environmental Sciences, Auckland University of Technology, and Professor of Political Science, <a href="https://theconversation.com/institutions/charles-sturt-university-849">Charles Sturt University</a></em></p> <p><em>Image credits: Getty Images</em></p> <p><em>This article is republished from <a href="https://theconversation.com">The Conversation</a> under a Creative Commons license. Read the <a href="https://theconversation.com/the-voice-isnt-apartheid-or-a-veto-over-parliament-this-misinformation-is-undermining-democratic-debate-205474">original article</a>.</em></p>

Legal


Why fake news and misinformation are sabotaging the election

<p dir="ltr">After a messy election campaign, a lot of Australians have been left feeling confused about who they should be voting for at the polls this weekend. </p> <p dir="ltr">With conflicting media reports about both Scott Morrison and Anthony Albanese, it’s easy to lose track of which information is correct. </p> <p dir="ltr">A recent report published by Avast, a global leader in digital security and privacy, has warned Australians to be wary of fake news and misinformation when casting their vote. </p> <p dir="ltr">The new research commissioned by Avast found that over half of Australians say they have believed a fake news story in the past, and a staggering 9 in 10 believe that fake news has the ability to impact Australians and their vote in the upcoming election. </p> <p dir="ltr">Stephen Kho, cyber security expert at Avast, says, “Sensationalist fake news is often used to generate clicks onto a webpage to improve ad revenue. It has also been used to influence public thought…it’s increasingly important that Australians are aware of how to spot misinformation and misleading news that isn’t based in solid fact.”</p> <p dir="ltr">Concerningly, the research found that 38% of Australians are not confident in their ability to identify fake news online, so Stephen Kho recommends readers run through these three criteria when assessing a news source for misinformation.</p> <p dir="ltr"><strong>Check the source</strong></p> <p dir="ltr">Readers should question the source, ask themselves if they have ever heard of it, and assess the source's appearance. </p> <p dir="ltr">Readers should also research the source to see what has been reported about it, and whether it has a vested interest in subjective reporting. </p> <p dir="ltr"><strong>Check the headline</strong></p> <p dir="ltr">Clickbait articles are designed to garner as many clicks as possible and often have very catchy headlines. 
</p> <p dir="ltr">It is therefore important for readers to question articles where the headline and the actual story have little or no connection, and short articles that offer little to no insight.</p> <p dir="ltr"><strong>Check the publication date</strong></p> <p dir="ltr">Readers should check the date of articles, regardless of whether they are real or fake, to make sure they are reading the most current information.</p> <p dir="ltr">Stephen Kho also shared helpful tips on how to avoid fake news, and how to spot blatant misinformation.</p> <p dir="ltr"><strong>Avoid relying on social media</strong></p> <p dir="ltr">While social media giants are making an effort to flag fake news shared within their networks, it's best to avoid consuming news and current affairs via social media news feeds.</p> <p dir="ltr">Instead, go directly to a news site you trust.</p> <p dir="ltr"><strong>Read a variety of sources before forming an opinion</strong></p> <p dir="ltr">Reading multiple reliable news sources can help people avoid fake news. If one article reports facts that differ from other reliable coverage, the news could be fake.</p> <p dir="ltr"><em>Image credits: Getty Images</em></p>

Mind


There is, in fact, a ‘wrong’ way to use Google

<p>I was recently reading comments on a post related to COVID-19, and saw a reply I would classify as misinformation, bordering on conspiracy. I couldn’t help but ask the commenter for evidence.</p> <p>Their response came with some web links and “do your own research”. I then asked about their research methodology, which turned out to be searching for specific terms on Google.</p> <p>As an academic, I was intrigued. Academic research aims to establish the truth of a phenomenon based on evidence, analysis and peer review.</p> <p>On the other hand, a search on Google provides links with content written by known or unknown authors, who may or may not have knowledge in that area, based on a ranking system that either follows the preferences of the user, or the collective popularity of certain sites.</p> <p>In other words, Google’s algorithms can penalise the truth for not being popular.</p> <p><a href="https://www.google.com/search/howsearchworks/algorithms" target="_blank" rel="noopener">Google Search’s</a> ranking system has a <a href="https://youtu.be/tFq6Q_muwG0" target="_blank" rel="noopener">fraction of a second</a> to sort through hundreds of billions of web pages, and index them to find the most relevant and (ideally) useful information.</p> <p>Somewhere along the way, mistakes get made. And it’ll be a while before these algorithms become foolproof – if ever. Until then, what can you do to make sure you’re not getting the short end of the stick?</p> <p><strong>One question, millions of answers</strong></p> <p>There are around <a href="https://morningscore.io/how-does-google-rank-websites/" target="_blank" rel="noopener">201 known factors</a> on which a website is analysed and ranked by Google’s algorithms. 
Some of the main ones are:</p> <ul> <li>the specific key words used in the search</li> <li>the meaning of the key words</li> <li>the relevance of the web page, as assessed by the ranking algorithm</li> <li>the “quality” of the contents</li> <li>the usability of the web page</li> <li>and user-specific factors such as their location and profiling data taken from connected Google products, including Gmail, YouTube and Google Maps.</li> </ul> <p><a href="https://link.springer.com/article/10.1007/s10676-013-9321-6" target="_blank" rel="noopener">Research has shown</a> users pay more attention to higher-ranked results on the first page. And there are known ways to ensure a website makes it to the first page.</p> <p>One of these is “<a href="https://en.wikipedia.org/wiki/Search_engine_optimization" target="_blank" rel="noopener">search engine optimisation</a>”, which can help a web page float into the top results even if its content isn’t necessarily quality.</p> <p>The other issue is Google Search results <a href="https://mcculloughwebservices.com/2021/01/07/why-google-results-look-different-for-everyone/" target="_blank" rel="noopener">are different for different people</a>, sometimes even if they have the exact same search query.</p> <p>Results are tailored to the user conducting the search. In his book <a href="https://www.penguin.co.uk/books/181/181850/the-filter-bubble/9780241954522.html" target="_blank" rel="noopener">The Filter Bubble</a>, Eli Pariser points out the dangers of this – especially when the topic is of a controversial nature.</p> <p>Personalised search results create alternate versions of the flow of information. 
Users receive more of what they’ve already engaged with (which is likely also what they already believe).</p> <p>This leads to a dangerous cycle which can further polarise people’s views, and in which more searching doesn’t necessarily mean getting closer to the truth.</p> <p><strong>A work in progress</strong></p> <p>While Google Search is a brilliant search engine, it’s also a work in progress. Google is <a href="https://ai.googleblog.com/2020/04/a-scalable-approach-to-reducing-gender.html" target="_blank" rel="noopener">continuously addressing various issues</a> related to its performance.</p> <p>One major challenge relates to societal biases <a href="https://www.kcl.ac.uk/news/artificial-intelligence-is-demonstrating-gender-bias-and-its-our-fault" target="_blank" rel="noopener">concerning race and gender</a>. For example, searching Google Images for “truck driver” or “president” returns images of mostly men, whereas “model” and “teacher” returns images of mostly women.</p> <p>While the results may represent what has <em>historically</em> been true (such as in the case of male presidents), this isn’t always the same as what is <em>currently</em> true – let alone representative of the world we wish to live in.</p> <p>Some years ago, Google <a href="https://www.theverge.com/2018/1/12/16882408/google-racist-gorillas-photo-recognition-algorithm-ai" target="_blank" rel="noopener">reportedly</a> had to block its image recognition algorithms from identifying “gorillas”, after they began classifying images of black people with the term.</p> <p>Another issue highlighted by health practitioners relates to people <a href="https://www.healthline.com/health/please-stop-using-doctor-google-dangerous" target="_blank" rel="noopener">self diagnosing based on symptoms</a>. 
It’s estimated about <a href="https://onlinelibrary.wiley.com/doi/full/10.5694/mja2.50600" target="_blank" rel="noopener">40% of Australians</a> search online for self diagnoses, and there are about 70,000 health-related searches conducted on Google each minute.</p> <p>There can be serious repercussions for those who <a href="https://www.medicaldirector.com/press/new-study-reveals-the-worrying-impact-of-doctor-google-in-australia" target="_blank" rel="noopener">incorrectly interpret</a> information found through “<a href="https://www.ideas.org.au/blogs/dr-google-should-you-trust-it.html" target="_blank" rel="noopener">Dr Google</a>” – not to mention what this means in the midst of a pandemic.</p> <p>Google has delivered a plethora of COVID misinformation related to unregistered medicines, fake cures, mask effectiveness, contact tracing, lockdowns and, of course, vaccines.</p> <p>According to <a href="https://www.ajtmh.org/view/journals/tpmd/103/4/article-p1621.xml" target="_blank" rel="noopener">one study</a>, an estimated 6,000 hospitalisations and 800 deaths during the first few months of the pandemic were attributable to misinformation (specifically the false claim that <a href="https://www.abc.net.au/news/2020-04-28/hundreds-dead-in-iran-after-drinking-methanol-to-cure-virus/12192582" target="_blank" rel="noopener">drinking methanol can cure COVID</a>).</p> <p>To combat this, <a href="https://misinforeview.hks.harvard.edu/article/how-search-engines-disseminate-information-about-covid-19-and-why-they-should-do-better/" target="_blank" rel="noopener">Google eventually prioritised</a> authoritative sources in its search results. But there’s only so much Google can do.</p> <p>We each have a responsibility to make sure we’re thinking critically about the information we come across. 
What can you do to make sure you’re asking Google the best question for the answer you need?</p> <p><strong>How to Google smarter</strong></p> <p>In summary, a Google Search user must be aware of the following facts:</p> <ol> <li> <p>Google Search will bring you the top-ranked web pages which are also the most relevant to your search terms. Your results will be as good as your terms, so always consider context and how the inclusion of certain terms might affect the result.</p> </li> <li> <p>You’re better off starting with a <a href="https://support.google.com/websearch/answer/134479?hl=enr" target="_blank" rel="noopener">simple search</a>, and adding more descriptive terms later. For instance, which of the following do you think is a more effective question: “<em>will hydroxychloroquine help cure my COVID?</em>” or “<em>what is hydroxychloroquine used for?</em>”</p> </li> <li> <p>Quality content comes from verified (or verifiable) sources. While scouring through results, look at the individual URLs and think about whether that source holds much authority (for instance, is it a government website?). Continue this process once you’re in the page, too, always checking for author credentials and information sources.</p> </li> <li> <p>Google may personalise your results based on your previous search history, current location and interests (gleaned through other products such as Gmail, YouTube or Maps). You can use <a href="https://support.google.com/chrome/answer/95464?hl=en&amp;co=GENIE.Platform%3DDesktop" target="_blank" rel="noopener">incognito mode</a> to prevent these factors from impacting your search results.</p> </li> <li> <p>Google Search isn’t the only option. And you don’t just have to leave your reading to the discretion of its algorithms. 
There are several other search engines available, including <a href="https://www.bing.com/" target="_blank" rel="noopener">Bing</a>, <a href="https://au.yahoo.com/" target="_blank" rel="noopener">Yahoo</a>, <a href="https://www.baidu.com/" target="_blank" rel="noopener">Baidu</a>, <a href="https://duckduckgo.com/" target="_blank" rel="noopener">DuckDuckGo</a> and <a href="https://www.ecosia.org/" target="_blank" rel="noopener">Ecosia</a>. Sometimes it’s good to triangulate your results from outside the filter bubble. <!-- Below is The Conversation's page counter tag. Please DO NOT REMOVE. --><img style="border: none !important; box-shadow: none !important; margin: 0 !important; max-height: 1px !important; max-width: 1px !important; min-height: 1px !important; min-width: 1px !important; opacity: 0 !important; outline: none !important; padding: 0 !important; text-shadow: none !important;" src="https://counter.theconversation.com/content/179099/count.gif?distributor=republish-lightbox-basic" alt="The Conversation" width="1" height="1" /><!-- End of code. If you don't see any code above, please get new code from the Advanced tab after you click the republish button. The page counter does not collect any personal data. More info: https://theconversation.com/republishing-guidelines --></p> </li> </ol> <p><em><a href="https://theconversation.com/profiles/muneera-bano-398400" target="_blank" rel="noopener">Muneera Bano</a>, Senior Lecturer, Software Engineering, <a href="https://theconversation.com/institutions/deakin-university-757" target="_blank" rel="noopener">Deakin University</a></em></p> <p><em>This article is republished from <a href="https://theconversation.com">The Conversation</a> under a Creative Commons license. Read the <a href="https://theconversation.com/there-is-in-fact-a-wrong-way-to-use-google-here-are-5-tips-to-set-you-on-the-right-path-179099" target="_blank" rel="noopener">original article</a>.</em></p> <p><em>Image: Getty Images</em></p>
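<p><em>To make the point that popularity can outweigh accuracy concrete, here is a minimal Python sketch of a ranking score. It is purely hypothetical: the function, weights and example pages are invented for illustration, and bear no relation to Google's actual 201-factor ranking system.</em></p>

```python
# Hypothetical toy ranking (NOT Google's algorithm): blends term-frequency
# relevance with a 0-1 popularity signal. With popularity weighted heavily,
# an accurate but unpopular page loses to a popular, misleading one.

def toy_rank_score(query_terms, page_text, popularity,
                   w_relevance=0.4, w_popularity=0.6):
    """Score a page by crude term overlap plus an assumed popularity signal."""
    words = page_text.lower().split()
    overlap = sum(words.count(t.lower()) for t in query_terms)
    relevance = overlap / max(len(words), 1)  # fraction of words matching the query
    return w_relevance * relevance + w_popularity * popularity

q = ["covid", "cure"]
accurate_page = "methanol is toxic and does not cure covid"  # accurate, unpopular
viral_page = "miracle covid cure covid cure covid cure"      # misleading, popular

# The misleading page outranks the accurate one on popularity alone.
print(toy_rank_score(q, accurate_page, popularity=0.1))
print(toy_rank_score(q, viral_page, popularity=0.9))
```

<p><em>Swapping the weights so relevance and source quality dominate is, in essence, what "prioritising authoritative sources" amounts to.</em></p>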

Technology

Placeholder Content Image

Spotify’s response to Rogan-gate falls short of its ethical and editorial obligations

<p>Audio streaming giant <a href="https://www.spotify.com/au/" target="_blank" rel="noopener">Spotify</a> is getting a crash course in the tension between free speech and the need to protect the public from harmful misinformation.</p><p>The Swedish-founded platform, which has 400 million active users, has faced a hail of criticism over misinformation broadcast on its <a href="https://variety.com/2021/digital/news/joe-rogan-experience-most-popular-podcast-news-roundup-1235123361/" target="_blank" rel="noopener">most popular podcast</a>, the Joe Rogan Experience.</p><p>Rogan, a former ultimate fighting commentator and television presenter, has <a href="https://variety.com/2021/digital/news/joe-rogan-anti-vaccine-podcast-spotify-1234961803/" target="_blank" rel="noopener">argued</a> healthy young people should not get a COVID vaccination. This is contrary to medical advice from governments all over the world, not to mention the <a href="https://www.who.int/emergencies/diseases/novel-coronavirus-2019/covid-19-vaccines/advice" target="_blank" rel="noopener">World Health Organization</a>.</p><p>A recent episode of his podcast, featuring virologist Robert Malone, drew <a href="https://www.theguardian.com/technology/2022/jan/14/spotify-joe-rogan-podcast-open-letter" target="_blank" rel="noopener">criticism from public health experts</a> over its various conspiracist claims about COVID vaccination programs.</p><p>There were widespread calls for Spotify to deplatform Rogan and his interviewees. 
Rock legend Neil Young issued an ultimatum that Spotify could broadcast Rogan or Young, but not both.</p><p>Spotify made its choice: the Joe Rogan Experience is still on the air, while Young’s <a href="https://www.theguardian.com/commentisfree/2022/jan/28/joe-rogan-neil-young-spotify-streaming-service" target="_blank" rel="noopener">music</a> is gone, along with <a href="https://www.abc.net.au/news/2022-01-29/joni-mitchell-take-songs-off-spotify-solidarity-with-neil-young/100790200" target="_blank" rel="noopener">Joni Mitchell</a> and <a href="https://www.rollingstone.com/music/music-news/nils-lofgren-spotify-neil-young-1292480/" target="_blank" rel="noopener">Nils Lofgren</a>, who removed their content in solidarity.</p><p><strong>Spotify’s response</strong></p><p>Spotify co-founder Daniel Ek has since <a href="https://newsroom.spotify.com/2022-01-30/spotifys-platform-rules-and-approach-to-covid-19/" target="_blank" rel="noopener">promised</a> to tag controversial COVID-related content with links to a “hub” containing trustworthy information. But he stopped short of pledging to remove misinformation outright.</p><p>In a statement, Ek <a href="https://newsroom.spotify.com/2022-01-30/spotifys-platform-rules-and-approach-to-covid-19/" target="_blank" rel="noopener">said</a>:</p><blockquote><p>We know we have a critical role to play in supporting creator expression while balancing it with the safety of our users. In that role, it is important to me that we don’t take on the position of being content censor while also making sure that there are rules in place and consequences for those who violate them.</p></blockquote><p><strong>Does it go far enough?</strong></p><p>Freedom of expression is important, but so is prevention of harm. When what is being advocated is likely to cause harm or loss of life, a line has been crossed. 
Spotify has a moral obligation to restrict speech that damages the public interest.</p><p>In response to the controversy, Spotify also publicly shared its <a href="https://newsroom.spotify.com/2022-01-30/spotify-platform-rules/" target="_blank" rel="noopener">rules of engagement</a>. They are comprehensive and proactive in helping to make content creators aware of the lines that must not be crossed, while allowing for freedom of expression within these constraints.  </p><p>Has Spotify fulfilled its duty of care to customers? If it applies the rules as stated, provides listeners with links to trustworthy information, and refuses to let controversial yet profitable content creators off the hook, this is certainly a move in the right direction.</p><p><strong>Platform or publisher?</strong></p><p>At the crux of the problem is the question of whether social media providers are <a href="https://socialmediahq.com/if-social-media-companies-are-publishers-and-not-platforms-that-changes-everything/" target="_blank" rel="noopener">platforms or publishers</a>.</p><p>Spotify and other Big Tech players claim they are simply providing a platform for people’s opinions. 
But <a href="https://www.zdnet.com/article/scott-morrison-says-social-media-platforms-are-publishers-if-unwilling-to-identify-users/" target="_blank" rel="noopener">regulators</a> are beginning to say no, they are in fact publishers of information, and like any publisher must be accountable for their content.</p><figure class="align-center "><img src="https://images.theconversation.com/files/443600/original/file-20220201-19-1kyj1oy.jpg?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=754&amp;fit=clip" alt="Logos of big tech platforms" /><figcaption><span class="caption">Tech platforms like to claim they’re not publishers.</span> <span class="attribution"><span class="source">Pixabay</span>, <a class="license" href="http://creativecommons.org/licenses/by/4.0/" target="_blank" rel="noopener">CC BY</a></span></figcaption></figure><p>Facebook, YouTube, Twitter and other platforms <a href="https://www.brookings.edu/blog/techtank/2021/06/01/addressing-big-techs-power-over-speech/" target="_blank" rel="noopener">have significant power</a> to promote particular views and limit others, thereby influencing millions or even <a href="https://www.statista.com/statistics/264810/number-of-monthly-active-facebook-users-worldwide/#:%7E:text=How%20many%20users%20does%20Facebook,the%20biggest%20social%20network%20worldwide." target="_blank" rel="noopener">billions</a> of users.</p><p>In the United States, these platforms have immunity from civil and criminal liability under a <a href="https://www.eff.org/issues/cda230" target="_blank" rel="noopener">1996 federal law</a> that shields them from liability as sites that host user-generated content. 
Being US corporations, their actions are primarily based on US legislation.</p><p>It is an ingenious business model that allows Facebook, for example, to turn a steady stream of free user-posted content into <a href="https://www.statista.com/statistics/277963/facebooks-quarterly-global-revenue-by-segment/" target="_blank" rel="noopener">US$28 billion in quarterly advertising revenue</a>.</p><p>Established newspapers and magazines also sell advertising, but they pay journalists to write content and are legally liable for what they publish. It’s little wonder they are <a href="https://www.theguardian.com/commentisfree/2020/apr/24/newspapers-journalists-coronavirus-press-democracy" target="_blank" rel="noopener">struggling</a> to survive, and little wonder the tech platforms are keen to avoid similar responsibilities.</p><p>But the fact is that social media companies do make editorial decisions about what appears on their platforms. So it is not morally defensible to hide behind the legal protections afforded to them as platforms, when they operate as publishers and reap considerable profits by doing so.</p><p><strong>How best to combat misinformation?</strong></p><p>Misinformation in the form of fake news, intentional disinformation and misinformed opinion has become a crucial issue for democratic systems around the world. How to combat this influence without compromising democratic values and free speech?</p><p>One way is to cultivate “news literacy” – an ability to discern misinformation. This can be done by making a practice of sampling news from across the political spectrum, then averaging out the message to the moderate middle. Most of us confine ourselves to the echo chamber of our preferred source, avoiding contrary opinions as we go.</p><p>If you are not sampling at least three reputable sources, you’re not getting the full picture. 
Here are the <a href="https://libguides.ucmerced.edu/news/reputable" target="_blank" rel="noopener">characteristics</a> of a reputable news source.</p><p>Social media, meanwhile, should invest in artificial intelligence (AI) tools to sift the deluge of real-time content and flag potential fake news. Some progress in this area has been made, but there is room for improvement.</p><p>The tide is turning for the big social media companies. Governments around the world are formulating laws that will oblige them to be more responsible for the content they publish. They won’t have long to wait.<img style="border: none !important;margin: 0 !important;max-height: 1px !important;max-width: 1px !important;min-height: 1px !important;min-width: 1px !important;padding: 0 !important" src="https://counter.theconversation.com/content/176022/count.gif?distributor=republish-lightbox-basic" alt="The Conversation" width="1" height="1" /></p><p><em><a href="https://theconversation.com/profiles/david-tuffley-13731" target="_blank" rel="noopener">David Tuffley</a>, Senior Lecturer in Applied Ethics &amp; CyberSecurity, <a href="https://theconversation.com/institutions/griffith-university-828" target="_blank" rel="noopener">Griffith University</a></em></p><p><em>This article is republished from <a href="https://theconversation.com" target="_blank" rel="noopener">The Conversation</a> under a Creative Commons license. Read the <a href="https://theconversation.com/spotifys-response-to-rogan-gate-falls-short-of-its-ethical-and-editorial-obligations-176022" target="_blank" rel="noopener">original article</a>.</em></p><p><em>Image: Getty Images</em></p>
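<p><em>Ek's approach of tagging rather than removing content can be sketched in a few lines. The keyword list, function name and hub URL below are assumptions made purely for illustration; Spotify has not published how its tagging actually works.</em></p>

```python
# Illustrative sketch only: how a platform might attach an information-hub
# link to COVID-related episodes, as Spotify pledged. Everything here
# (keywords, hub URL, function) is hypothetical, not Spotify's system.

COVID_KEYWORDS = {"covid", "vaccine", "vaccination", "coronavirus", "pandemic"}
HUB_LINK = "https://example.com/covid-info-hub"  # placeholder URL

def tag_episode(description: str) -> dict:
    """Flag an episode and attach an advisory link if it mentions COVID topics."""
    words = {w.strip(".,!?").lower() for w in description.split()}
    flagged = bool(words & COVID_KEYWORDS)
    return {"flagged": flagged, "advisory_link": HUB_LINK if flagged else None}

print(tag_episode("A virologist discusses vaccine safety"))
print(tag_episode("Stand-up comedy special"))
```

<p><em>Note the design choice this encodes: the content stays up, and the counterweight is a pointer to trustworthy information rather than removal.</em></p>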

Technology

Placeholder Content Image

Finding climate misinformation

<div> <div class="copy"> <p>We learnt only last month that <a rel="noreferrer noopener" href="https://cosmosmagazine.com/people/behaviour/trolling-abuse-of-scientists-during-the-pandemic/" target="_blank">scientists have been abused</a> on social media for telling the truth during the COVID pandemic.</p> <p>Now, an international team of researchers has delved into a related phenomenon – climate misinformation – and found that attacks on the reliability of climate science are the most common form of misinformation, and that misinformation targeting climate solutions is on the rise.</p> <p>Monash University research fellow Dr John Cook and colleagues from the University of Exeter, UK, and Trinity College Dublin, Ireland, trained a machine-learning model to automatically detect and categorise climate misinformation.</p> <p>Then they reviewed 255,449 documents from 20 prominent conservative think-tank (CTT) websites and 33 climate change denial blogs to build a two-decade history of climate misinformation and find common topics, themes, peaks, and changes over time.</p> <p>It’s the largest content analysis to date on climate misinformation, with findings <a rel="noreferrer noopener" href="https://doi.org/10.1038/s41598-021-01714-4" target="_blank">published</a> today in the <em>Nature </em>journal <em>Scientific Reports</em>.</p> <p>“Our study found claims used by such think-tanks and blogs focus on attacking the integrity of climate science and scientists, and, increasingly, challenged climate policy and renewable energy,” Cook says.</p> <p>“Organised climate change contrarianism has played a significant role in the spread of misinformation and the delay to meaningful action to mitigate climate change.”</p> <p>As a result of their analysis, the researchers developed a taxonomy to categorise claims about climate science and policy used by opponents of climate action.</p> <p>They found the five major claims about climate change used by CTTs and blogs were:</p> <ol
type="1"> <li>It’s not happening</li> <li>It’s not us</li> <li>It’s not bad</li> <li>Solutions won’t work</li> <li>Climate science/scientists are unreliable</li> </ol> <p>Within these were a number of sub-claims providing a detailed delineation of specific arguments.</p> <p>The researchers say climate misinformation leads to a number of negative outcomes, including reduced climate literacy, public polarisation, cancelling out accurate information and influencing how scientists engage with the public.</p> <p>“The problem of misinformation is so widespread, practical solutions need to be scalable to match the size of the problem,” Cook says.</p> <p>“Misinformation spreads so quickly across social networks, we need to be able to identify misinformation claims instantly in order to respond quickly. Our research provides a tool to achieve this.”</p> <!-- Start of tracking content syndication. Please do not remove this section as it allows us to keep track of republished articles --> <img id="cosmos-post-tracker" style="opacity: 0; height: 1px!important; width: 1px!important; border: 0!important; position: absolute!important; z-index: -1!important;" src="https://syndication.cosmosmagazine.com/?id=172828&amp;title=Finding+climate+misinformation" alt="" width="1" height="1" /> <!-- End of tracking content syndication --></div> <div id="contributors"> <p><a href="https://cosmosmagazine.com/earth/climate/finding-climate-misinformation/">This article</a> was originally published on <a href="https://cosmosmagazine.com">Cosmos Magazine</a> and was written by <a href="https://cosmosmagazine.com/contributor/dr-deborah-devis">Deborah Devis</a>. Deborah Devis is a science journalist at Cosmos. She has a Bachelor of Liberal Arts and Science (Honours) in biology and philosophy from the University of Sydney, and a PhD in plant molecular genetics from the University of Adelaide.</p> <p><em>Image: Yasin Ozturk/Anadolu Agency via Getty Images</em></p> </div> </div>
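<p><em>The study trained a machine-learning model on labelled documents; the sketch below is a deliberately naive stand-in that maps text onto the five major claim categories by keyword overlap. The keyword sets are invented for illustration and are far cruder than the paper's actual classifier.</em></p>

```python
# Toy claim categoriser (illustration only, not the study's model): assign
# a text to whichever of the five major claim categories has the largest
# keyword overlap. Keyword sets are hypothetical examples.

CLAIM_KEYWORDS = {
    "It's not happening": {"hoax", "cooling", "pause", "ice"},
    "It's not us": {"natural", "sun", "cycles", "volcanoes"},
    "It's not bad": {"beneficial", "harmless", "adapt"},
    "Solutions won't work": {"renewables", "expensive", "policy", "unreliable"},
    "Climate science/scientists are unreliable": {"alarmist", "biased", "models", "fraud"},
}

def categorise(text: str) -> str:
    """Return the claim category whose keyword set best overlaps the text."""
    words = {w.strip(".,!?'\"").lower() for w in text.split()}
    return max(CLAIM_KEYWORDS, key=lambda c: len(words & CLAIM_KEYWORDS[c]))

print(categorise("renewables are expensive and the policy is unreliable"))
print(categorise("warming is driven by the sun and natural cycles"))
```

<p><em>A real system replaces the hand-written keyword sets with weights learned from labelled examples, which is what lets it keep up with the scale and speed Cook describes.</em></p>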

International Travel

Placeholder Content Image

Is it even possible to regulate Facebook effectively? Time and again, attempts have led to the same outcome

<p>The Australian government’s <a href="https://theconversation.com/this-is-why-australia-may-be-powerless-to-force-tech-giants-to-regulate-harmful-content-169826">recent warning</a> to Facebook over misinformation is just the latest salvo in the seemingly constant battle to hold the social media giant to account for the content posted on its platform.</p> <p>It came in the same week as the US Senate heard <a href="https://www.bbc.com/news/world-us-canada-58805965">whistleblowing testimony</a> in which former Facebook executive Frances Haugen alleged the company knew of harmful consequences for its users but chose not to act.</p> <p>Governments all over the world have been pushing for years to make social media giants more accountable, both in terms of the quality of information they host, and their use of users’ data as part of their business models.</p> <p>The Australian government’s <a href="https://www.aph.gov.au/Parliamentary_Business/Bills_LEGislation/Bills_Search_Results/Result?bId=r6680">Online Safety Act</a> will <a href="https://perma.cc/95A5-T79H">come into effect in January 2022</a>, giving the eSafety Commissioner unprecedented powers to crack down on abusive or violent content, or sexual images posted without consent.</p> <p>But even if successful, this legislation will only deal with a small proportion of the issues that require regulation. On many such issues, social media platforms have attempted to regulate themselves rather than submit to legislation. But whether we are talking about legislation or self-regulation, past experience does not inspire much confidence that tech platforms can be regulated successfully, or that regulation can be easily put into action.</p> <p>Our <a href="https://aisel.aisnet.org/ecis2021_rip/35">research</a> has examined previous attempts to regulate tech giants in Australia. We analysed 269 media articles and 282 policy documents and industry reports published from 2015 to 2021.
Let’s discuss a couple of relevant case studies.</p> <h2>1. Ads and news</h2> <p>In 2019, the Australian Competition and Consumer Commission (ACCC) <a href="https://www.accc.gov.au/publications/digital-platforms-inquiry-final-report">inquiry into digital platforms</a> described Facebook’s algorithms, particularly those that determine the positioning of advertising on Facebook pages, as “opaque”. It concluded media companies needed more assurance about the use of their content.</p> <p>Facebook initially welcomed the inquiry, but then <a href="https://www.accc.gov.au/system/files/Facebook_0.pdf">publicly opposed it</a> when the government argued the problems related to Facebook’s substantial market power in display advertising, and Facebook and Google’s dominance of news content generated by media companies, were too important to be left to the companies themselves.</p> <p>Facebook argued there was <a href="https://www.accc.gov.au/system/files/Facebook.pdf">no evidence of an imbalance of bargaining power</a> between it and news media companies, adding it would have no choice but to withdraw news services in Australia if forced to pay publishers for hosting their content. The standoff resulted in Facebook’s <a href="https://theconversation.com/facebook-has-pulled-the-trigger-on-news-content-and-possibly-shot-itself-in-the-foot-155547">infamous week-long embargo on Australian news</a>.</p> <p><span>The revised and amended News Media Bargaining Code was </span><a href="https://www.accc.gov.au/system/files/Final%20legislation%20as%20passed%20by%20both%20houses.pdf">passed by the parliament in February</a><span>. Both the government and Facebook declared victory, the former having managed to pass its legislation, and the latter ending up striking its own bargains with news publishers without having to be held legally to the code.</span></p> <h2>2.
Hate speech and terrorism</h2> <p>In 2015, to deal with violent extremism on social media, the Australian government initially worked with the tech giant to develop joint AI solutions that would improve the technical processes of content identification.</p> <p>This voluntary solution worked brilliantly, until it did not. In March 2019, mass shootings at mosques in Christchurch were live-streamed on Facebook by an Australian-born white supremacist terrorist, and the recordings subsequently circulated on the internet.</p> <p>This brought to light <a href="https://www.stuff.co.nz/national/christchurch-shooting/111473473/facebook-ai-failed-to-detect-christchurch-shooting-video">the inability of Facebook’s artificial intelligence algorithms</a> to detect and remove the live footage of the shooting and how fast it was shared on the platform.</p> <p>The Australian government responded in 2019 by <a href="https://www.ag.gov.au/crime/abhorrent-violent-material">amending the Criminal Code</a> to require social media platforms to remove abhorrent or violent material “in reasonable time” and, where relevant, refer it to the Australian Federal Police.</p> <h2>What have we learned?</h2> <p>These two examples, while strikingly different, both unfolded in a similar way: an initial dialogue in which Facebook proposes an in-house solution involving its own algorithms, before a subsequent shift towards mandatory government regulation, which is met with resistance or bargaining (or both) from Facebook, and the final upshot which is piecemeal legislation that is either watered down or only covers a subset of specific types of harm.</p> <p>There are several obvious problems with this.
The first is that only the tech giants themselves know how their algorithms work, so it is difficult for regulators to oversee them properly.</p> <p>Then there’s the fact that legislation typically applies at a national level, yet Facebook is a global company with billions of users across the world and a platform that is incorporated into our daily lives in all sorts of ways.</p> <p>How do we resolve the impasse? One option is for regulations to be drawn up by independent bodies appointed by governments and tech giants to drive the co-regulation agenda globally. But relying on regulation alone to guide tech giants’ behaviour against potential abuses might not be sufficient. There is also the need for self-discipline and appropriate corporate governance - potentially enforced by these independent bodies.</p> <p><em>Image credits: Shutterstock </em></p> <p><em>This article first appeared on <a rel="noopener" href="https://theconversation.com/is-it-even-possible-to-regulate-facebook-effectively-time-and-again-attempts-have-led-to-the-same-outcome-169947" target="_blank">The Conversation</a>.</em></p>

Technology

Placeholder Content Image

"It has to stop": Karl's blunt plea after anti-vaxxers hijack student's death

<p>Karl Stefanovic used his platform on <em>The Today Show</em> to plead with those spreading misinformation about the COVID-19 vaccine. </p> <p>The TV host discussed the high profile case of Year 12 student Tom van Dijk, who passed away last week after suffering from a cardiac arrest while swimming with his family. </p> <p>Karl went on to say that Tom's school was forced to step in when a flood of social media messages incorrectly linked his untimely death to the COVID-19 vaccine. </p> <p>"He had a cardiac arrest. So what happens? Thousands of faceless keyboard warriors take it upon themselves to seize on his awful death in front of his family, circulating false information, blaming his death on vaccinations," he said.</p> <p>"His school already dealing with grief amongst students was forced to confirm he hadn't even had a vaccination yet."</p> <p><span>"Imagine the added stress on that poor young man's family...their pain and their tragedy made worse by lies on social media."</span></p> <p><span>Karl added that the spread of misinformation is dangerous and "it has to stop".</span></p> <p>Tom van Dijk died on August 21, which was confirmed by his school principal, John Couani, at St Pius X College in Chatswood, Sydney.  </p> <p>Couani reiterated Karl Stefanovic's claims about vaccine misinformation, saying he is unsure why the dangerous rumours started in the first place. </p> <p><span>“The misinformation is horrific, there was nothing to do with mental health, this is purely a health issue,” he said. </span></p> <p><span>“The school did not force students to be vaccinated — the school is not in a hotspot or local government area of concern, nor was it eligible for the priority vaccination program — we’ve made no statement calling for vaccinations of students.”</span></p> <p><em>Image credit: Channel Nine</em></p>

Caring

Placeholder Content Image

The sneaky way anti-vaxx groups are remaining undetected on Facebook

<p><span style="font-weight: 400;">Anti-vaccination groups on Facebook are relying on an interesting tactic to avoid detection by those who don’t share their beliefs. </span></p> <p><span style="font-weight: 400;">The groups are changing their names to euphemisms like ‘dance party’ or ‘dinner party’ to skirt rules put in place by the social media giant.</span></p> <p><span style="font-weight: 400;">Harsher bans were put in place by Facebook to crack down on dangerous misinformation about COVID-19 and subsequent vaccines. </span></p> <p><span style="font-weight: 400;">The groups are largely private and difficult to find on the social networking site, but still retain a large user base and have learned how to swap out detectable language to remain unseen. </span></p> <p><span style="font-weight: 400;">One major ‘dance party’ group has over 40,000 followers and has stopped allowing new users to join due to public backlash.</span></p> <p><span style="font-weight: 400;">The backup group for ‘Dance Party’, known as ‘Dinner Party’ and created by the same moderators, has more than 20,000 followers.</span></p> <p><span style="font-weight: 400;">Other anti-vaxx influencers on Instagram have adopted similar tactics, such as referring to vaccinated people as ‘swimmers’ and the act of vaccination as joining a ‘swim club’.</span></p> <p><span style="font-weight: 400;">These devious tactics have been recognised by governments internationally, as there is mounting pressure on officials to push the social media platforms to do more to contain vaccine misinformation.</span></p> <p><span style="font-weight: 400;">An administrator for the ‘Dance Party’ wrote that beating Facebook’s moderating system “feels like a badge of honour”, as they urged users to stay away from ‘unapproved words’. 
</span></p> <p><span style="font-weight: 400;">Using code words and euphemisms is not new among the anti-vaxx community, as it borrows from a playbook used by extremists on Facebook and other social networking sites for many years.</span></p> <p><em><span style="font-weight: 400;">Image credit: Shutterstock</span></em></p>
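<p><em>A few lines of Python show why this renaming tactic works against automated filters. The blocklist and euphemism map below are invented for illustration; they are not Facebook's actual moderation rules.</em></p>

```python
# Hypothetical illustration of why static keyword filters miss code words:
# a blocklist built for vaccine vocabulary never matches "dance party" or
# "swimmers", so flagging them requires a separate, hand-maintained map of
# known euphemisms (all entries here are examples from the article).

BLOCKLIST = {"vaccine", "vaccination", "anti-vaxx"}
KNOWN_EUPHEMISMS = {
    "dance party": "anti-vaccination group",
    "swim club": "vaccination",
    "swimmers": "vaccinated people",
}

def flags(post: str) -> list:
    """Return moderation hits: direct blocklist matches plus known code words."""
    post_l = post.lower()
    hits = [w for w in BLOCKLIST if w in post_l]
    hits += [f"{e} (code for {m})" for e, m in KNOWN_EUPHEMISMS.items() if e in post_l]
    return hits

print(flags("Don't get the vaccine!"))                 # caught by the blocklist
print(flags("Join our dance party to meet swimmers"))  # only caught via the map
```

<p><em>The weakness is obvious: as soon as moderators add ‘dance party’ to the map, the groups can move to ‘dinner party’, which is exactly the cat-and-mouse game the article describes.</em></p>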



Peddlers of fake news to be punished by Facebook

<p><span style="font-weight: 400;">People sharing false or misleading information on Facebook could soon be penalised.</span></p> <p><span style="font-weight: 400;">The social media giant has announced it will crack down on fake news by doling out harsher punishments to individual accounts that repeatedly share misinformation.</span></p> <p><span style="font-weight: 400;">Under the new rules, Facebook will “reduce the distribution of all posts” from people who do this, making it harder for other users to see their content.</span></p> <p><span style="font-weight: 400;">Though this already happens to Pages and Groups that post misinformation, Facebook had not extended the penalty to individual accounts until now.</span></p> <p><span style="font-weight: 400;">Facebook does limit the reach of individual posts that have been flagged by fact-checkers, but the new policy will act as a broader penalty for account holders who share misinformation.</span></p> <p><span style="font-weight: 400;">However, Facebook has not specified how many times a user’s posts must be flagged before they are punished.</span></p> <p><span style="font-weight: 400;">The company will also start showing pop-up messages to users who click the “like” button of a Page that frequently shares misinformation, alerting them that fact-checkers have previously flagged posts from that Page.</span></p> <p><span style="font-weight: 400;">“This will help people make an informed decision about whether they want to follow the Page,” the company wrote in a blog post.</span></p> <p><span style="font-weight: 400;">The rules are the company’s latest effort to curb fake news, as Facebook continues to struggle to control rumours and misleading posts from its nearly 3 billion users, despite creating dedicated information hubs for topics such as the pandemic and climate change to point users to reliable information.</span></p>



Hair salon bans patrons who have had the jab

<p><span style="font-weight: 400;">A Gold Coast hair salon is turning away customers who have been vaccinated against COVID-19, claiming it is concerned for the “health and safety” of its staff.</span></p> <p><span style="font-weight: 400;">The Khemia HI Vibe Frequency Salon at Palm Beach posted the policy change on its social media pages, claiming the “unknown health effects of the mRNA vaccine” are not covered by its public liability insurance.</span></p> <p><span style="font-weight: 400;">Under the new policy, customers are required to notify the salon if they have had the vaccine before making an appointment.</span></p> <p><span style="font-weight: 400;">“The unknown health effects of the mRNA vaccine are not covered by our public liability insurance,” the salon wrote on its Facebook and Instagram pages.</span></p> <blockquote class="instagram-media" data-instgrm-captioned="" data-instgrm-permalink="https://www.instagram.com/p/CO96i_CJ7iz/?utm_source=ig_embed&amp;utm_campaign=loading" data-instgrm-version="13"> <p><a rel="noopener" href="https://www.instagram.com/p/CO96i_CJ7iz/?utm_source=ig_embed&amp;utm_campaign=loading" target="_blank">A post shared by Khemia HI vibe Frequency Salon (@khemia_frequency_salon)</a></p> </blockquote> <p><span style="font-weight: 400;">“We are deeply sorry for any inconvenience to you.”</span></p> <p><span style="font-weight: 400;">All vaccines approved for use in Australia have been heavily regulated by the Therapeutic Goods Administration (TGA) to ensure their safety.</span></p> <p><span style="font-weight: 400;">“Australia’s vaccine safety and regulatory process is world class and people can be confident that vaccines approved for use are safe and effective,” Acting Chief Medical Officer Professor Michael Kidd and Head of the TGA Adjunct Professor John Skerritt said in a joint statement in April.</span></p> <p><span style="font-weight: 400;">“Our vaccines will save lives and are an essential part of tackling this global pandemic.”</span></p> <p><span style="font-weight: 400;">The Khemia team said the policy would be reevaluated after the completion of clinical trials in 2023.</span></p> <p><span style="font-weight: 400;">The post was also flagged by Facebook as “missing context”, with a fact box beneath it that read: “Independent fact-checkers say that this information could mislead people.”</span></p> <p><span style="font-weight: 400;">The salon’s owner, Ms Adler, told 
9News she had heard of women contracting side effects without actually being vaccinated.</span></p> <p><span style="font-weight: 400;">“I guess a lot of people would question that and I think it’s like anything, it’s like the disease or the virus at the moment - it’s spreading somehow and somehow women are reporting side effects when they haven’t had the host,” she told 9News.</span></p> <p><span style="font-weight: 400;">Though vaccine side effects can make you feel sick, they do not mean you are infected or that your symptoms are contagious, and there are no verified reports supporting Ms Adler’s claims.</span></p> <p><span style="font-weight: 400;">“Side effect symptoms cannot be spread to others,” Manisha Juthani, MD, an infectious disease specialist at Yale Medicine and associate professor at the Yale School of Medicine, told </span><a href="https://www.verywellhealth.com/covid-vaccine-side-effects-not-contagious-5182483"><span style="font-weight: 400;">Verywell</span></a><span style="font-weight: 400;">. “The vaccine cannot give you the virus, so the symptoms you experience are a manifestation of your immune system building a response so that you can fight the virus in the future should you be exposed to it.”</span></p> <p><span style="font-weight: 400;">Treasurer Josh Frydenberg told Today that public hesitancy over getting the jab was understandable but would not derail the government’s plans to reopen Australia’s borders.</span></p> <p><span style="font-weight: 400;">“It’s understandable that some people are hesitant, but ultimately, the more people that get the jab, the better,” he said.</span></p>



How to talk to someone you believe is misinformed about the coronavirus

<p>The medical evidence is clear: The coronavirus global health threat is not an elaborate hoax. Bill Gates did not create the coronavirus to sell more vaccines. Essential oils are <a href="https://nccih.nih.gov/health/in-the-news-in-the-news-coronavirus-and-alternative-treatments">not effective</a> at protecting you from coronavirus.</p> <p>But those facts have not stopped contrary claims from spreading both on and offline.</p> <p>No matter the topic, people often hear conflicting information and must decide which sources to trust. The internet and the fast-paced news environment mean that information travels quickly, leaving little time for fact-checking.</p> <p>As a <a href="https://scholar.google.com/citations?user=Li4FgBUAAAAJ&amp;hl=en">researcher</a> interested in science communication and controversies, I study how scientific misinformation spreads and how to correct it.</p> <p>I’ve been very busy lately. Whether we are talking about the coronavirus, climate change, vaccines or something else, <a href="https://www.cnn.com/2020/03/05/tech/facebook-google-who-coronavirus-misinformation/index.html">misinformation abounds</a>. Maybe you have shared something on Facebook that turned out to be false, or retweeted something before <a href="https://theconversation.com/4-ways-to-protect-yourself-from-disinformation-130767">double-checking the source</a>. <a href="https://www.unlv.edu/news/article/future-alternative-facts">This can happen</a> to anyone.</p> <p>It’s also common to encounter people who are misinformed but don’t know it yet. It’s one thing to double-check your own information, but what’s the best way to talk to someone else about what they think is true – but which is not true?</p> <p><strong>Is it worth engaging?</strong></p> <p>First, consider the context of the situation. Is there enough time to engage them in a conversation? Do they seem interested in and open to discussion? 
Do you have a personal connection with them where they value your opinion?</p> <p>Evaluating the situation can help you decide whether you want to start a conversation to correct their misinformation. Sometimes we interact with people who are closed-minded and not willing to listen. <a href="https://rightingamerica.net/when-the-juice-is-not-worth-the-squeeze-distinguishing-between-productive-and-unproductive-conversations/">It’s OK</a> not to engage with them.</p> <p>In interpersonal interactions, correcting misinformation can be helped by the strength of the relationship. For example, it may be easier to correct misinformation held by a family member or partner because they are already aware that you care for them and you are interested in their well-being.</p> <p><strong>Don’t patronize</strong></p> <p>One approach is to engage in a back-and-forth discussion about the topic. This is often called a <a href="https://theconversation.com/understanding-christians-climate-views-can-lead-to-better-conversations-about-the-environment-115693">dialogue</a> approach to communication.</p> <p>That means you care about the person behind the opinion, even when you disagree. It is important not to enter conversations with a patronizing attitude. For example, when talking to climate change skeptics, the <a href="https://www.npr.org/2017/05/09/527541032/there-must-be-more-productive-ways-to-talk-about-climate-change">attitude</a> that the speaker holds toward an audience affects the success of the interaction and can lead to conversations ending before they’ve started.</p> <p>Instead of treating the conversation as a corrective lecture, treat the other person as an equal partner in the discussion. One way to create that common bond is to acknowledge the shared struggles of locating accurate information. 
Saying that there is a lot of information circulating can help someone feel comfortable changing their opinion and accepting new information, instead of <a href="https://bigthink.com/age-of-engagement/study-warns-of-boomerang-effects-in-climate-change-campaigns">resisting and sticking to</a> their previous beliefs to avoid admitting they were wrong.</p> <p>Part of creating dialogue is asking questions. For example, if someone says that they heard coronavirus was all a hoax, you might ask, “That’s not something I’d heard before, what was the source for that?” By being interested in their opinion and not rejecting it out of hand, you open the door for conversation about the information and can engage them in evaluating it.</p> <p><strong>Offer to trade information</strong></p> <p>Another strategy is to introduce the person to new sources. In my <a href="https://www.routledge.com/Communication-Strategies-for-Engaging-Climate-Skeptics-Religion-and-the/Bloomfield/p/book/9781138585935">book</a>, I discuss a conversation I had with a climate skeptic who did not believe that scientists had reached a 97% consensus on the existence of climate change. They dismissed this well-established number by referring to nonscientific sources and blog posts. Instead of rejecting their resources, I offered to trade with them. For each of their sources I read, they would read one of mine.</p> <p>It is likely that the misinformation people have received is not coming from a credible source, so you can propose an alternative. For example, you could offer to send them an article from the <a href="http://cdc.gov/">Centers for Disease Control</a> for medical and health information, the <a href="https://www.ipcc.ch/">Intergovernmental Panel on Climate Change</a> for environmental information, or the reputable debunking site <a href="http://snopes.com/">Snopes</a> to compare the information. 
If someone you are talking to is open to learning more, encourage that continued curiosity.</p> <p>It is sometimes hard, inconvenient, or awkward to engage someone who is misinformed. But I feel very strongly that opening ourselves up to have these conversations can help to correct misinformation. To ensure that society can make the best decisions about important topics, share accurate information and combat the spread of misinformation.</p> <p><em><a href="https://theconversation.com/profiles/emma-frances-bloomfield-712710">Emma Frances Bloomfield</a>, Assistant Professor of Communication Studies, <a href="https://theconversation.com/institutions/university-of-nevada-las-vegas-826">University of Nevada, Las Vegas</a></em></p> <p><em>This article is republished from <a href="https://theconversation.com">The Conversation</a> under a Creative Commons license. Read the <a href="https://theconversation.com/how-to-talk-to-someone-you-believe-is-misinformed-about-the-coronavirus-133044">original article</a>.</em></p>



The best, smartest post about the bushfires you'll ever read

<div class="post_body_wrapper"> <div class="post_body"> <div class="body_text "> <p>An irritated firefighter has hit back at the misinformation circulating on social media about the current bushfire crisis in Australia.</p> <p>He took to Facebook to bust some of the myths spreading about the bushfires, their causes and the barriers firefighters face.</p> <p>“First of all, does being a firey give me all the insight to this complex issue? Not even close and I need to make that clear,” the decorated firey began.</p> <p>“However I’ve felt a strong need to say something here because I just can’t stomach some of the false science and outright lies being peddled on social media as news or facts.</p> <p>“No, the Greens haven’t been stopping hazard reduction burns from taking place. We still do them and yes we should absolutely do more of them.”</p> <p>NSW Rural Fire Service Commissioner Shane Fitzsimmons agrees with the firefighter, saying there are “challenges” with hazard reduction.</p> <p>“Our biggest challenge with hazard reduction is the weather and the windows available to do it safely and effectively,” Mr Fitzsimmons said in an interview on <em>Sunrise.</em></p> <p>“Sure, there’s environmental and other checks to go through but we streamline those. 
There’s special legislation to give us clearance and to cut through what would otherwise be a very complex environment.”</p> <blockquote class="twitter-tweet" data-lang="en"> <p dir="ltr">"We've had tremendous support from the commonwealth - everything we've asked for, we've got"<a href="https://twitter.com/NSWRFS?ref_src=twsrc%5Etfw">@NSWRFS</a> Commissioner Shane Fitzsimmons responds to former fire chief Greg Mullins' claim the federal government ignored state requests for bushfire assistance.<a href="https://t.co/vg47W3JHmd">https://t.co/vg47W3JHmd</a> <a href="https://t.co/APqhKovp1N">pic.twitter.com/APqhKovp1N</a></p> — Sunrise (@sunriseon7) <a href="https://twitter.com/sunriseon7/status/1214290851553140736?ref_src=twsrc%5Etfw">January 6, 2020</a></blockquote> <p>The viral firefighter said drought and extreme weather conditions have made firefighters’ jobs harder.</p> <p>“Yes, conditions have been so bad this season that fires have still burnt through areas where hazard reduction burns were completed earlier in the year,” he said.</p> <p>He said the government needs to invest more in hazard reduction burns.</p> <p>“NSW for example, as an estimate, would need to increase their budget from $100 million to a half billion, a fivefold increase, and that money needs to come from somewhere,” he said.</p> <p>He then went on to challenge both sides of politics, saying people should look beyond their social media feeds.</p> <p>“No, a video on Facebook of a guy in the bush screaming at the greens is not facts about what caused these fires. No, a video of someone shouting at ScoMo for not funding the NSW Rural Fire Service (state gov funded) is not facts about what caused these fires.”</p> <p>Viral posts about the bushfires have spread misinformation, with one popular post claiming the fires were started by firebugs. 
Another popular post claimed the fires were started by climate change activists to prove their point about the issue of climate change.</p> <p>Queensland University of Technology researcher Timothy Graham said the misinformation has been spread by Twitter accounts using the hashtag #ArsonEmergency to get their point across.</p> <blockquote class="twitter-tweet" data-conversation="none" data-lang="en"> <p dir="ltr">More population means more arsonist <a href="https://twitter.com/hashtag/ArsonEmergency?src=hash&amp;ref_src=twsrc%5Etfw">#ArsonEmergency</a> <a href="https://t.co/mrwjJgCyYL">https://t.co/mrwjJgCyYL</a></p> — LifeMatters (@Joshn11) <a href="https://twitter.com/Joshn11/status/1215042328353624064?ref_src=twsrc%5Etfw">January 8, 2020</a></blockquote> <p>“The motivation underlying this often tends to not be changing people’s opinions about the bushfire itself and how it’s happening, but to sow discord and magnify already existing tensions in polarised political issues,” Dr Graham told the ABC.</p> <p>Meanwhile, University of Queensland lecturer in critical thinking Peter Ellerton said the misinformation is spreading so rapidly because people look for information that confirms their existing beliefs.</p> <p>“This is a wonderful example of ‘motivated reasoning’, where we justify how we hold onto a world view that’s served us in the past even as the evidence mounts against it,” Dr Ellerton told <em><a rel="noopener" href="https://www.news.com.au/technology/online/social/firefighter-slams-outright-lies-about-bushfires-as-experts-expose-bots-and-bizarre-conspiracies/news-story/239e251201616f686a5e4d28c004947a" target="_blank">news.com.au.</a> </em></p> <p>“The attempts to preserve it are becoming more and more disparate and chaotic. 
You see this kind of thing happening more intensely.”</p> <p>With some posts suggesting that Muslims have lit the fires as some kind of terror attack, Dr Ellerton urged caution when reading such posts.</p> <p>“That stuff is only shocking if you begin with the assumption that people make decisions based on facts,” Dr Ellerton said.</p> <p>“They don’t. And we seldom have.</p> <p>“We’re far more persuaded by narratives than we are by facts. Facts are important, there’s no question about it, but they’re not enough.”</p> </div> </div> </div>

