
Journal of Mass Communication & Journalism

ISSN: 2165-7912

Open Access

Perspective - (2023) Volume 13, Issue 2

Politics and Upheld Cultural Values Implications on the Credibility of Fake News

Kathrin Dwivedi*
*Correspondence: Kathrin Dwivedi, Department of Communication and International Relations, Ramon Llull University, Barcelona, Spain, Email:
Department of Communication and International Relations, Ramon Llull University, Barcelona, Spain

Received: 01-Feb-2023, Manuscript No. jmcj-23-93064; Editor assigned: 03-Feb-2023, Pre QC No. P-93064; Reviewed: 15-Feb-2023, QC No. Q-93064; Revised: 20-Feb-2023, Manuscript No. R-93064; Published: 27-Feb-2023, DOI: 10.37421/2165-7912.2023.13.509
Citation: Dwivedi, Kathrin. “Politics and Upheld Cultural Values Implications on the Credibility of Fake News.” J Mass Communicat Journalism 13 (2023): 509.
Copyright: © 2023 Dwivedi K. This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.

Introduction

TikTok, Douyin, WeChat, Facebook, Reddit, Snapchat, and WhatsApp are examples of social networking sites (SNS) that provide a platform for the instantaneous viral dissemination of information across culturally diverse and geographically dispersed regions of the world. Because of the speed at which information flows, SNS are a popular tool for the dissemination of both true and fake news. Given how rapidly such "digital wildfires" spread, the possibility of fake news being propagated through social media (SM) is regarded as one of the greatest threats to modern society. Here, fake news is understood as "fabricated information that mimics news media content in form but not in organizational process or intent." Fake news has been identified in the context of presidential campaigns and politics, the COVID-19 pandemic, climate change and global warming, immigration, and religion. As a result, societies have become increasingly polarized [1].

Description

SM-induced polarization on SNS has indeed emerged as a new academic field. Academic interest in the connection between SM, political polarization, and disinformation (i.e., false information spread to deceive people, for example as part of efforts to influence individuals' political beliefs and ideologies) is growing. Through the associated informational cascades and echo chambers, SM (and SNS) can help incite and exacerbate polarization. These informational cascades not only increase the number of rumors and the speed at which they spread, but also make individuals give more credence to the rumors in question [2].

This, in turn, triggers viewpoint-alteration processes, partisan epistemologies, and worldview gaps. Computational propaganda, defined as the "assemblage of social-media platforms, autonomous agents, and big data tasked with the manipulation of public opinion," operates by way of information and communications "that deliberately misrepresent symbols, appealing to emotions and prejudices and bypassing rational thought, to achieve a specific goal," and has been studied and identified worldwide. By fostering, instigating, and escalating polarization, computational propaganda has been used to sow discord and dissent. During Brazil's 2014 presidential elections, bots (automated programs that perform simple, repetitive tasks) and networks of bots were used on Facebook, Twitter, and WhatsApp to "support and attack political figures," debate issues such as corruption, and encourage protest movements.

These bots are essential to the spread of computational propaganda. In addition, authoritarian regimes, such as those in China and Russia, have used bots to spread computational propaganda targeting not only their own people but also dissenting political figures in other nations, such as Taiwan, Poland, and Ukraine. Sadly, computational propaganda is being used more and more frequently: there was evidence of formally organized SM manipulation campaigns in 28 countries in 2018, and in 2019 "at least one political party or government agency using SM to manipulate public opinion" was found in 20 more countries [3].

Presumably as a result, economists have investigated, among other things, the overall levels of exposure to fake news circulating on SM, as well as how persuasive that fake news would have had to be in order to be decisive in changing the way people voted in the 2016 US presidential election. Economists have also used agent-based modeling to create a large sample of SM networks and to simulate how bots holding opposing viewpoints may influence opinion throughout these networks.

It turns out that these bots can not only spread fake news, which hinders information aggregation and consensus, but can also lead to complete misinformation, much as in the market for lemons. They can therefore increase polarization, in the sense that SM algorithms can limit users' exposure to news that opposes their viewpoint. While the negative effects of polarization, partisanship, and divisiveness on political gridlock and institutional dysfunction were harder to foresee, it seems intuitive that discord imposes additional transaction costs on information aggregation, political compromise, and consensus building, particularly in democracies (even well-established ones). Critically, when analyzing "the relationship between polarization and what is perceived as misinformation," there is "an increase in the polarization of users and URLs (in terms of their associated political viewpoints) for information labeled with fake news keywords and hashtags, when compared to information not labeled as 'fake news'" [4].
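To make the mechanism concrete, the minimal Python sketch below illustrates the general idea behind such agent-based simulations: a handful of bots holding a fixed extreme opinion are embedded in a random network of ordinary agents who partially average their views with those of their neighbours. The network structure, update rule, and parameter values here are illustrative assumptions for this sketch only, not the specifications of the models used in the studies discussed above.

# Toy agent-based sketch (hypothetical): bots with a fixed extreme opinion
# embedded in a random network of agents who average their neighbours' views.
import random

random.seed(42)

N_AGENTS = 100        # ordinary users
N_BOTS = 10           # automated accounts pushing a fixed opinion
EDGE_PROB = 0.05      # probability of a tie between any two accounts
STEPS = 50            # rounds of opinion updating

n_total = N_AGENTS + N_BOTS
is_bot = [i >= N_AGENTS for i in range(n_total)]

# Opinions on a -1..+1 scale; ordinary agents start near the centre,
# bots sit at the extreme and never move.
opinion = [1.0 if is_bot[i] else random.uniform(-0.3, 0.3)
           for i in range(n_total)]

# Random (Erdos-Renyi style) network of ties.
neighbours = {i: set() for i in range(n_total)}
for i in range(n_total):
    for j in range(i + 1, n_total):
        if random.random() < EDGE_PROB:
            neighbours[i].add(j)
            neighbours[j].add(i)

for _ in range(STEPS):
    new_opinion = opinion[:]
    for i in range(n_total):
        if is_bot[i] or not neighbours[i]:
            continue  # bots never update; isolated agents keep their view
        avg = sum(opinion[j] for j in neighbours[i]) / len(neighbours[i])
        # Ordinary agents move part-way toward the average view of their contacts.
        new_opinion[i] = 0.8 * opinion[i] + 0.2 * avg
    opinion = new_opinion

human_mean = sum(opinion[i] for i in range(N_AGENTS)) / N_AGENTS
print(f"Mean opinion of ordinary agents after {STEPS} steps: {human_mean:+.2f}")

In this toy setting, the mean opinion of ordinary agents drifts toward the bots' fixed position, which captures the intuition behind the finding that even a small number of automated accounts can distort information aggregation and consensus.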

In a similar vein, previous research has demonstrated that a person's use of technology can be influenced by their culture. A person's cultural values can also be used to predict how they will act on SNS. In addition, it turns out that cultural values can help explain how well a person is able to spot lies in technology-mediated interactions between interviewers and interviewees. There are many different kinds of information disorders, such as skewed information, hyperpartisan information, rumors, misinformation, and disinformation. Here, we are primarily concerned with the effect that espoused cultural values might have on the credibility of fake news at the individual level when using SNS, rather than at the country level. A repeated-measures design is used for the first time in this study [5].

Conclusion

Given the prevalence of fake news on SNS, we must all work to better understand the behaviors that may make people more or less likely to give credence to it, especially because rumors, misinformation, and disinformation can cause group polarization, partisan epistemologies, and confrontations triggered by gaps in worldview. We wanted to find out how people's political views and cultural beliefs affect whether they believe fake news shared on social media. Based on data gathered from a culturally diverse sample that included participants from the United States and India, our findings supported the hypothesis that conservatism contributes to the credulity of fake news.

Acknowledgement

None.

Conflict of Interest

None.

References

  1. Dwivedi, Yogesh K, Gerald Kelly, Marijn Janssen and Nripendra P. Rana, et al. "Social media: The good, the bad, and the ugly." Inf Syst Front 20 (2018): 419-423.

  2. Bail, Christopher A, Lisa P. Argyle, Taylor W. Brown and John P. Bumpus, et al. "Exposure to opposing views on social media can increase political polarization." Proc Natl Acad Sci 115 (2018): 9216-9221.

  3. Barberá, Pablo, John T. Jost, Jonathan Nagler and Joshua A. Tucker, et al. "Tweeting from left to right: Is online political communication more than an echo chamber?" Psychol Sci 26 (2015): 1531-1542.

  4. Linvill, Darren L and Patrick L. Warren. "Troll factories: Manufacturing specialized disinformation on Twitter." Political Commun 37 (2020): 447-467.

  5. Modgil, Sachin, Rohit Kumar Singh, Shivam Gupta and Denis Dennehy. "A confirmation bias view on social media induced polarisation during Covid-19." Inf Syst Front (2021): 1-25.
