
Litera

Disinformation in the media: problems, challenges and solutions

Direev Ivan Dmitrievich

ORCID: 0009-0008-3669-7774

Lecturer, Department of Communications, South Ural State University (National Research University);
Head of Service, Branch of VUNTS VVS 'VVA' in Chelyabinsk

Burdenyuk str., 6, Chelyabinsk, 454015, Russia

tifey@udm.ru

DOI:

10.25136/2409-8698.2025.1.73205

EDN:

QOUINW

Received:

27-01-2025


Published:

03-02-2025


Abstract: The author examines the problem of disinformation in the mass media. The article analyzes the reasons for the spread of disinformation, its impact on society, and possible measures to reduce that impact. Special attention is paid to the role of new technologies and social networks in spreading false information. In the era of globalization and the rapid development of information technology, information has acquired key importance, influencing all aspects of public life. The media have become the main channel of information dissemination, shaping public opinion and influencing political processes. Alongside this, however, a serious problem has arisen: disinformation, the dissemination of false or distorted information for the purpose of manipulation. Disinformation in the media undermines public confidence in information sources, contributes to the polarization of opinions, and leads to flawed political decisions. With the rapid development of the Internet and social networks, fake news and manipulation are becoming more sophisticated and widespread, raising serious concerns about the stability and security of society. The scientific novelty of the work lies in an interdisciplinary approach linking theories of media interaction, cognitive psychology, and political science. Unlike previous studies that focus on individual aspects (for example, fact-checking), this article considers disinformation as a systemic phenomenon that requires a multi-level response. The research hypothesis is that effective counteraction to disinformation is possible only through a synthesis of "hard" measures (platform regulation, AI moderation) and "soft" tools (media literacy, strengthening trust in expertise). The study sets the framework for a detailed analysis, emphasizing that disinformation is not merely a technical problem but a symptom of the deep social and political contradictions of the digital age.


Keywords:

disinformation, mass media, fake news, media education, media, social networks, manipulation, fact checking, public opinion, Internet

This article is an automatic translation; the original text is available on the journal's website.

Introduction

In the modern world, the information space plays an important role in shaping public opinion and the perception of reality. However, in recent years, the problem of disinformation has become one of the most pressing threats to society. Disinformation is a deliberately created or distorted message that misleads people [33, p. 1147]. In the context of globalization and the development of digital technologies, this problem is becoming increasingly significant, requiring comprehensive analysis and the development of effective countermeasures.

The main reasons for the spread of disinformation

The spread of disinformation in the modern information space is caused by many factors that interact and reinforce each other. Let's look at the most significant of them in more detail.

1. The rapid growth in the use of social media

Social media platforms have become key channels for the dissemination of information in modern society. They allow users to quickly and easily share news, photos, videos, and other content. However, this freedom also creates conditions for the rapid spread of fake news and false information [12].

One important factor is algorithmic content recommendation. Social networks use complex algorithms to analyze users' interests and show them the material the system predicts will be most engaging. Such algorithms often favor sensational and emotionally charged content, which contributes to its wider distribution [13]. For example, studies have shown that fake news spreads faster and more widely than reliable information because of its emotional appeal [33, p. 1149].
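The ranking logic described above can be illustrated with a deliberately simplified sketch (the scoring weights, field names, and sample posts are invented for illustration; real platform algorithms are proprietary and vastly more complex):

```python
# Hypothetical illustration of engagement-weighted ranking: posts that
# provoke strong reactions outrank calmer, better-sourced posts.
# All weights and sample data below are invented for illustration.

def engagement_score(post: dict) -> float:
    """Rank a post by predicted engagement, not by accuracy."""
    return (3.0 * post["shares"]      # resharing spreads content widest
            + 2.0 * post["comments"]  # arguments keep users on the page
            + 1.0 * post["likes"])

posts = [
    {"title": "Routine budget report published", "shares": 4,  "comments": 2,  "likes": 30},
    {"title": "SHOCKING claim goes viral",       "shares": 90, "comments": 40, "likes": 15},
]

feed = sorted(posts, key=engagement_score, reverse=True)
# The sensational post ranks first even though nothing in the
# score reflects whether its content is true.
```

Nothing in such a score measures accuracy, which is precisely why emotionally charged material can outcompete reliable reporting in an engagement-driven feed.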

In addition, social networks often become a place of active discussion and reposting, which further enhances the effect of spreading misinformation. Users may not check the information before posting it, relying on the authority of the source or simply trusting their friends and followers. As a result, even a small percentage of unreliable data can quickly become a mass phenomenon [18].

2. The economic component

The economic interests of media companies play a significant role in spreading disinformation. In the face of fierce competition for audience attention, many media outlets strive to attract as many readers, viewers, and advertisers as possible. To do this, they use various methods, including creating scandalous headlines and publishing sensational materials [32].

This practice, known as "yellow journalism" or the "yellow press," relies on shocking or provocative topics to increase traffic and advertising revenue. Journalists and editors may deliberately exaggerate facts or frame events so as to generate maximum audience interest. In some cases, this leads to the dissemination of false or distorted information [11].

Another aspect of economic motivation is the influence of sponsors and advertisers. Some media companies may be forced to publish materials beneficial to certain financial groups or political forces, even if these materials do not correspond to reality. Such situations are especially dangerous because they undermine trust in the media and create a false perception of reality [22].

3. Political influence and manipulation

Political forces also actively use disinformation to achieve their goals. In the context of globalization and growing political tensions, information manipulation is becoming one of the main tools in the struggle for power. Politicians and their supporters can use fake news to discredit opponents, cultivate a positive self-image, or manufacture artificial crises [14].

A special role in this process is played by so-called "bots" and "trolls," which actively spread false information through social networks and other digital platforms. These accounts are often created specifically for information campaigns and can operate automatically, generating a huge number of messages in a short time [19]. This makes it possible to create the illusion of mass support or outrage, which can significantly influence public opinion.

An example of such an impact is the use of disinformation during elections. Fake news can change voters' perceptions of specific candidates or parties, which ultimately affects the voting results. In some countries, cases of the use of disinformation to interfere in electoral processes have already been recorded, which raises serious concerns about the democracy and transparency of elections [10].

The Russian experience also illustrates how disinformation can influence political processes. For example, a study conducted by domestic experts showed that social media was actively used during the elections in the country to spread false information that could change the course of voting [3].

4. Lack of regulation and control

The lack of effective regulation and control over the dissemination of information also contributes to the growth of disinformation. Unlike traditional media, online platforms often bear no responsibility for the content of the materials they publish. This creates conditions for the uncontrolled dissemination of false information and allows bad actors to operate with near impunity [34].

Even where legislation provides for liability for the dissemination of fake news, its enforcement often faces difficulties. Drawing the boundary between freedom of speech and the prohibition of disinformation is a difficult task that requires careful balancing. In addition, differences in the legal systems of different countries complicate international cooperation in this area [26].

The lack of control by government agencies and independent organizations also creates favorable conditions for the spread of disinformation. In some cases, governments themselves can be sources of false information, using it to manipulate public opinion or achieve political goals. This is especially true in authoritarian regimes, where freedom of the press is restricted and access to reliable information is limited [15].

5. Technological factors and new media

With the development of technology, new opportunities for the creation and dissemination of disinformation are emerging. One example is the use of artificial intelligence (AI) and machine learning to create high-quality fake materials. Deepfake technologies, for instance, make it possible to create videos in which real people appear to say things they never said or perform actions they never performed [35].

Such technologies pose a serious threat because they make disinformation more convincing and difficult to detect. Even experienced users can be misled if they do not have the special knowledge and tools to verify the authenticity of the materials. This creates new challenges for journalists, researchers, and law enforcement agencies, which must adapt to changing conditions [25].

In addition, the development of new media creates a new information space where traditional sources of information are losing their importance. People are increasingly turning to bloggers, video bloggers, and other independent authors who may not have sufficient training to verify information and comply with professional standards. This creates additional risks for the spread of disinformation and requires new approaches to its prevention [28].

The impact of disinformation on society

Disinformation has a wide range of negative effects on various aspects of society. Its influence extends to politics, economics, healthcare, social relations, and even individual perception of reality. Let's look at these aspects in more detail.

1. Political impact

One of the most significant dimensions of disinformation is its impact on the political sphere. Disinformation can significantly affect election results, the formation of public opinion, and trust in government institutions.

Formation of public opinion

Fake news and false information can change people's perceptions of specific political figures or parties. For example, spreading false information about corruption, links to criminal gangs, or other scandalous facts can seriously undermine the reputation of a candidate or a political force [14]. This is especially true in the context of information warfare, when opponents use disinformation to discredit each other.

Research results show that fake news can significantly influence elections. In particular, they can change the preferences of voters by creating artificial trends and manipulating their emotions [21]. Some studies indicate that fake news can have long-term consequences, forming persistent stereotypes and biases that persist even after the lie has been exposed [17].

Interference in electoral processes

In recent years, it has become obvious that disinformation is being used not only for internal political struggle, but also to interfere in the electoral processes of other countries. An example of this is interference in the 2016 US presidential election, when cases of using fake news and bots to change public opinion were recorded [19]. Such actions undermine democratic principles and raise serious concerns about the security and transparency of elections.


Undermining trust in government institutions

Disinformation can also undermine trust in government agencies and institutions. If citizens begin to believe that official information is false or distorted, this can lead to a decrease in the level of trust in the government, parliament and other government institutions. This creates conditions for increasing political polarization and social conflicts, as people lose faith in the possibility of fair and objective governance [29].

Thus, a study conducted with the support of the Ministry of Science and Higher Education of the Russian Federation and the Expert Institute for Social Research showed that in the context of information manipulation, the level of trust in state institutions decreases significantly, which can lead to social tension and protest sentiments [6].

2. Health impact

Misinformation in the field of healthcare poses a particular threat because it can directly affect people's health and lives. False information about medical treatments, vaccines, and other aspects of healthcare can lead to serious consequences.

Refusal of vaccination

One of the most striking examples of the impact of misinformation on healthcare is the refusal of vaccination. In recent years, there has been an increase in the number of parents in a number of countries who refuse to vaccinate their children due to the spread of myths about the dangers of vaccines. These myths are often based on false information that is disseminated through social networks and other platforms [27]. As a result, the level of collective immunity decreases, which increases the risk of outbreaks of infectious diseases such as measles and polio.

Vaccination rates in Russia are also trending downward due to the spread of anti-vaccination campaigns. The results of an applied study of anti-vaccination behavioral strategies among Russians have shown that unreliable information about vaccines has become one of the reasons for vaccine refusal among the population [5].

Incorrect treatment

Misinformation can also lead to the use of incorrect treatment methods. People may rely on advice found on the Internet that is not based on scientific evidence and may be dangerous. For example, spreading false information about "miraculous" remedies for cancer or other serious diseases can lead patients to abandon the treatment methods recommended by doctors in favor of ineffective or even harmful remedies [30].

Russian studies also confirm this trend, as according to scientific data, many patients prefer to use non-traditional methods of treatment based on unreliable information, which can negatively affect their health [4].

Psychological impact

In addition, misinformation in the field of healthcare can have a psychological impact, causing fear, anxiety and panic among the population. For example, spreading rumors about the possible end of the world due to a global pandemic or environmental disaster can lead to mass hysteria and chaos [20]. This creates additional problems for medical and law enforcement agencies, which must deal with the consequences of such a panic.

In Russia, during the COVID-19 pandemic, a significant number of cases of spreading misinformation related to the virus and measures to prevent it were recorded. A study conducted with the help of the Ministry of Science and Higher Education of the Russian Federation showed that false information about the coronavirus caused alarm and concern among the population, which complicated the work of medical services [2].

3. Economic impact

Misinformation also has an impact on the economy, as it can undermine confidence in financial markets, companies, and products.

Investment decisions

False information about the state of the economy, the financial situation of companies, or market prospects can lead to erroneous investment decisions. Investors can make decisions based on false information, which can lead to capital losses and even bankruptcy of companies [24]. For example, spreading rumors about the imminent bankruptcy of a large company can lead to a sharp drop in its shares, even if these rumors have no basis in fact.

Damage to business

In addition, misinformation can cause direct damage to a business. For example, spreading false information about the quality of a company's products or services can lead to a decline in its reputation and loss of customers. This is especially true for small and medium-sized businesses, which may not have sufficient resources to protect their reputation [16]. In some cases, companies are forced to spend significant funds on combating disinformation and restoring their image.

Russian companies also face problems related to disinformation. Small and medium-sized businesses in Russia often suffer from the spread of false information, which leads to lower profits and loss of competitiveness.

4. Social consequences

Disinformation can significantly influence social relations and structures of society, increasing polarization, creating artificial conflicts and forming false stereotypes.

The polarization of society

One of the most noticeable effects of disinformation is the increased social polarization. False information is often used to create opposing points of view and increase conflicts between different groups of the population. For example, the spread of myths that one ethnic group is the source of all problems in society can lead to an increase in racism and xenophobia. This creates conditions for social tension and even violence.

Russian society is also facing the problem of polarization caused by disinformation. The spread of false information on social networks contributes to the growth of intergroup conflicts and a decrease in the level of mutual understanding between different social groups [8].

Creating artificial conflicts

Disinformation can also be used to create artificial conflicts between different groups of the population. For example, false information that one professional group (such as doctors or teachers) receives too high salaries can cause discontent among other categories of workers and lead to social protests. As a result, society becomes more divided, and its ability to solve common problems decreases.

There are also cases of disinformation being used in Russia to create artificial conflicts. The conducted research has shown that the dissemination of false information about salaries of various professions can lead to social unrest and protests [7].

The formation of false stereotypes

Finally, misinformation can form false stereotypes and biases that persist for a long time. For example, the spread of myths that women are less competent than men in certain professions can lead to discrimination and limited opportunities for women [9, p. 457]. This creates additional barriers to equality and social justice.

Russian studies also confirm the existence of false stereotypes [1].

A study conducted by the Don State Agrarian University showed that the spread of misinformation about the role of women in society can lead to gender discrimination and limited career opportunities [9, p. 460].

Measures to reduce the impact of misinformation

Reducing the impact of disinformation is a complex task that requires an integrated approach at the level of individuals, society, technology platforms, and states. Let us consider the main measures that can be applied to combat disinformation:

1) Education and media literacy:

- Media literacy: programs that train people in critical thinking and information analysis will help them distinguish reliable sources from fake news. It is important to be able to recognize manipulative techniques such as out-of-context quotes or faked photos;

- Financing of educational programs: schools and universities should include media literacy modules in their programs so that students can learn to evaluate information and understand the mechanisms of spreading disinformation [23].

- Public campaigns: The development of information campaigns aimed at promoting information verification skills can help a wide audience realize the importance of critical attitude to the news.

2) Regulation of social networks and digital platforms:

- Increased control over platforms: social networks and other online platforms should be responsible for the content they distribute. This may include the mandatory use of algorithms to detect fake news, as well as transparent rules for advertisers;

- Transparency of algorithms: platforms should be more open about how their algorithms choose which news to show to users. This will help minimize the effect of information bubbles, in which people see only information that matches their views;

- Legislation against disinformation: States can adopt laws that will regulate the activities of social networks and oblige them to block or label false information.

3) Support for journalistic ethics and media independence:

- Creation and support of independent media: independent journalistic organizations play a key role in providing high-quality, verified information. Supporting such organizations through grants, donations, or other forms of financing can help create a reliable information space;

- Ethics of journalism: journalists must adhere to strict standards when preparing and publishing materials, including fact-checking and avoiding unreliable sources;

- Fact-checking: the development of independent fact-checking projects helps to refute false claims and prevent the further spread of misinformation.
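The fact-checking step described above can be sketched, under strong simplifying assumptions, as a lookup of incoming claims against a database of already-debunked claims (the normalization rule, database entries, and verdict texts below are hypothetical; production fact-checking systems rely on semantic similarity rather than exact text matching):

```python
# Minimal sketch: match an incoming claim against a hypothetical
# database of debunked claims using normalized exact matching.
import re

def normalize(text: str) -> str:
    """Lowercase and strip punctuation so trivial rewordings still match."""
    return re.sub(r"[^\w\s]", "", text.lower()).strip()

# Hypothetical debunked-claims database: normalized claim -> verdict.
debunked = {
    normalize("Vaccine X causes disease Y."): "Refuted by health authority Z.",
}

def check_claim(claim: str) -> str:
    """Return a stored verdict, or escalate unknown claims to humans."""
    verdict = debunked.get(normalize(claim))
    return verdict if verdict else "No match; route to human fact-checkers."

print(check_claim("vaccine x causes disease y!"))  # matches the debunked entry
```

Exact matching breaks as soon as a claim is reworded, which is why real fact-checking pipelines pair automated retrieval with human review.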

4) Coordination between different actors:

- Public-private partnership: authorities, technology companies, and civil society can collaborate to develop and implement effective measures against disinformation. For example, governments can provide data for analysis, and platforms can use it to improve content-filtering mechanisms;

- International cooperation: disinformation is often a cross-border problem. Therefore, it is important to coordinate efforts between countries, especially within the framework of international organizations such as the UN or OSCE, in order to develop common approaches to its solution.

5) Technological solutions:

- Using artificial intelligence (AI): AI algorithms can automatically scan content for fakes, forged images or videos ("deepfake"). Such technologies can help to find and block false information faster [31];

- Early warning systems: systems that can react quickly to the mass spread of disinformation would make it possible to limit its impact promptly;

- Content labeling: Platforms can implement systems that flag controversial or unverified content, alerting users to the potential risk.

6) Responsibility for the creation and dissemination of disinformation:

- Legal responsibility: laws that provide for legal liability for the deliberate dissemination of disinformation can serve as a deterrent [34]. However, it is important to strike a balance between protecting freedom of speech and fighting fakes;

- Fines for platforms: If platforms fail to fulfill their responsibilities to remove or label false information, they may be subject to fines or other sanctions.

7) Increasing trust in official sources of information:

- Transparency and accessibility of data: government agencies and organizations should be as open as possible about their work and provide accurate data so that people can trust them as sources of reliable information;

- Communication during crises: During emergencies such as epidemics or natural disasters, authorities should actively engage with the public by providing accurate and timely information to minimize the risk of spreading misinformation.

Conclusion

In conclusion, the problem of disinformation is one of the most urgent and complex challenges of modern society. It affects various spheres of life, including politics, economics, science, and social relations, posing significant risks to stability and security. Our research has shown that the fight against disinformation requires an integrated approach that must take into account many aspects, from improving media literacy to implementing technological solutions and creating legislative mechanisms.

The first step in this direction is to increase media literacy. Educational programs aimed at developing critical thinking and information analysis skills play a key role in shaping a sustainable society capable of resisting manipulation. It is important not only to educate the younger generation, but also to continue educational work among the adult population, since misinformation can affect everyone without exception.

At the same time, it is necessary to strengthen the responsibility of digital platforms for distributed content. Technology companies should be more transparent about how their algorithms work and take an active part in filtering out false information. The introduction of legal responsibility for the creation and dissemination of disinformation, as well as the use of artificial intelligence to detect it, can become important tools in this fight. However, it should be remembered that there is a need to maintain a balance between regulation and protection of freedom of speech, so as not to restrict citizens' rights to express their opinions.

Support for journalistic ethics and media independence also plays an important role. Independent media organizations and fact-checking projects help ensure access to reliable information and minimize the impact of fake news. States and international organizations should support such initiatives by providing them with the necessary resources and protection from pressure from those who are interested in spreading disinformation.

In addition, it is important to develop international cooperation in the fight against disinformation. Since this problem is often of a cross-border nature, coordination of efforts between different countries and organizations can significantly improve the effectiveness of the measures taken. The creation of global standards and practices, as well as the exchange of experience and knowledge between the participants in the process are important steps towards a successful solution to the problem.

Finally, it is worth noting that disinformation will continue to evolve along with the development of technology. Therefore, it is necessary to pay attention to the continuous improvement of strategies and methods to combat it. Research into new forms of disinformation, such as "deepfake" videos and automated botnets, as well as the development of appropriate technologies to detect and prevent them should become a priority for the scientific community.

Thus, the fight against disinformation is a multifaceted and constantly evolving task that requires a systematic approach and joint efforts of various participants. Only through the combined actions of citizens, States, technology companies and international organizations can significant results be achieved in this area. It is important to remember that success in the fight against disinformation depends not only on the implementation of specific measures, but also on changing cultural norms and attitudes that contribute to the formation of a society based on trust and mutual respect.

References
1. Bartenev, D.G. (2016). Prohibited professions for women: a new way of dialogue between the Constitutional Court of Russia and the UN Committee? International Justice, 3(19), 37-47. https://doi.org/10.21128/2226-2059-2016-3-37-47
2. Burkova, V.N., & Butovskaya, M.L. (2023). Coronaphobia, infodemic and fakes during COVID-19. Siberian Historical Research, 40(2), 55-75. https://doi.org/10.17223/2312461X/40/3
3. Derevianchenko, A.A. (2022). Fraud in the elections to the State Duma of the VIII convocation in the focus of social networks. E-Scio, 9(72), 76-82. https://e-scio.ru/wp-content/uploads/2022/09/Деревянченко-А.-А.pdf
4. Zotova, L.A., & Chetvertakova, S.I. (2023). Analysis of the frequency of independent use of complementary and alternative medicine in outpatient clinical practice by patients with rheumatological pathology. Modern Problems of Science and Education, 2. https://science-education.ru/article/view?id=32521
5. Leskova, I.V., & Zyazin, S.Y. (2021). Distrust of vaccination as an informational stuffing. Problems of social hygiene, public health and the history of medicine, 29(1), 37-40. https://doi.org/10.32687/0869-866X-2021-29-1-37-40
6. Matyuk, Y.S. (2023). Threats to the digital environment in the context of maintaining trust in state institutions. Pacific RIM: Economics, Politics, Law, 25(2), 50-55. https://doi.org/10.24866/1813-3274/2023-2/50-55
7. Murasheva, S.V. (2021). Social conflicts as causes and consequences of contradictions in Russian society. Society and Security Insights, 2, 151-164. https://doi.org/10.14258/ssi(2021)2-12
8. Panishchev, A.L., & Sapronov, A.V. (2019). Factors of social polarization. Proceedings of the Southwestern State University. Series: Economics. Sociology. Management, 9(4), 175-182.
9. Fedorov, V.K., Sheikhova, M.S., Safonova, S.G., & Kuvichkin, N.M. (2021). The influence of gender stereotypes on the formation and effectiveness of the Russian labor market. Moscow Economic Journal, 12, 447-453. https://doi.org/10.24412/2413-046X-2021-10724
10. Fel’dman, P.I., Fediakin, A. V., & Ezhov, D. A. (2019). Technologies of election interference: scientific understanding in search of semantic certainty. Bulletin of Tomsk State University. Philosophy. Sociology. Political Science, 50, 210–218. https://doi.org/10.17223/1998863x/50/18
11. Frolova, V.I. (2018). Provocative Strategies in the Media Text's Headline Complex: an Ethical Aspect. Bulletin of the V.N. Tatishchev Volga University, 1(1), 152-162.
12. Allcott, H., & Gentzkow, M. (2017). Social media and fake news in the 2016 election. Journal of Economic Perspectives, 31(2), 211-236. https://doi.org/10.1257/jep.31.2.211
13. Bakshy, E., Messing, S., & Adamic, L. A. (2015). Exposure to ideologically diverse news and opinion on Facebook. Science, 348(6239), 1130-1132. https://doi.org/10.1126/science.aaa1160
14. Benkler, Y., Faris, R., Roberts, H., & Zuckerman, E. (2018). Network Propaganda: Manipulation, Disinformation, and Radicalization in American Politics. Oxford: Oxford University Press.
15. Chesney, B., & Citron, D.K. (2019). Deep fakes: A looming challenge for privacy, democracy, and national security. California Law Review, 107(6), 1753-1790.
16. Eagly, A.H., & Karau, S.J. (2002). Role congruity theory of prejudice toward female leaders. Psychological Review, 109(3), 573-598. https://doi.org/10.1037/0033-295X.109.3.573
17. Grady, R.H., Ditto, P.H., & Loftus, E.F. (2021). Nevertheless, partisanship persisted: Fake news warnings help briefly, but bias returns with time. Cognitive Research: Principles and Implications, 6, Article 52. https://doi.org/10.1186/s41235-021-00315-z
18. Guess, A. M., Nagler, J., & Tucker, J.A. (2019). Less than you think: Prevalence and predictors of fake news dissemination on Facebook. Science Advances, 5(1), eaau4586. https://doi.org/10.1126/sciadv.aau4586
19. Howard, P.N., Kollanyi, B., Bradshaw, S., & Neudert, L.M. (2017). Social Media, News and Political Information During the US Election: Was There a Bot Problem? Oxford: Oxford Internet Institute.
20. Iyengar, S., Sood, G., & Lelkes, Y. (2012). Affect, not ideology: A social identity perspective on polarization. Public Opinion Quarterly, 76(3), 405-431. https://doi.org/10.1093/poq/nfs038
21. Levitsky, S., & Ziblatt, D. (2018). How democracies die. Crown.
22. McChesney, R.W., & Nichols, J. (2010). The Death and Life of American Journalism: The Media Revolution that Will Begin the World Again. Nation Books.
23. McGrew, S., Breakstone, J., Ortega, T., Smith, M., & Wineburg, S. (2018). Can students evaluate online sources? Learning from assessments of civic online reasoning. Theory & Research in Social Education, 46(2), 165-193. https://doi.org/10.1080/00933104.2017.1416320
24. Mutz, D. C. (2006). Hearing the Other Side: Deliberative versus Participatory Democracy. Cambridge University Press.
25. Newman, N., Fletcher, R., Schulz, A., Andı, S., & Nielsen, R. K. (2020). Reuters Institute Digital News Report 2020. Reuters Institute for the Study of Journalism.
26. Norris, P. (2001). Digital Divide: Civic Engagement, Information Poverty, and the Internet Worldwide. Cambridge University Press.
27. Nyhan, B., & Reifler, J. (2019). The effect of fact-checking on elites: A field experiment on US state legislators. American Journal of Political Science, 63(3), 547-564. https://doi.org/10.1111/ajps.12423
28. Pennycook, G., & Rand, D.G. (2019). The implied truth effect: Attaching warnings to a subset of fake news stories increases perceived accuracy of stories without warnings. Management Science, 66(11), 4944-4957. https://doi.org/10.1287/mnsc.2019.3478
29. Roser, M., Ritchie, H., Ortiz-Ospina, E., & Hasell, J. (2020). Coronavirus Pandemic (COVID-19). Our World in Data. https://ourworldindata.org/coronavirus
30. Shiller, R.J. (2019). Narrative Economics: How Stories Go Viral and Drive Major Economic Events. Princeton University Press.
31. Silverman, C. (2015). Lies, damn lies, and viral content: How news websites spread (and debunk) online rumors, unverified claims, and misinformation. Tow Center for Digital Journalism.
32. Tandoc, E.C., Lim, Z.W., & Ling, R. (2018). Defining "fake news". Digital Journalism, 6(2), 137-153. https://doi.org/10.1080/21670811.2017.1360143
33. Vosoughi, S., Roy, D., & Aral, S. (2018). The spread of true and false news online. Science, 359(6380), 1146-1151. https://doi.org/10.1126/science.aap9559
34. Wardle, C., & Derakhshan, H. (2017). Information disorder: Toward an interdisciplinary framework for research and policy making. Council of Europe. https://www.researchgate.net/publication/339031969
35. West, J. (2019). Deepfakes and the new disinformation war: The coming age of post-truth geopolitics. Foreign Affairs. https://www.foreignaffairs.com/articles/world/2018-12-11/deepfakes-and-new-disinformation-wa

Peer Review

Peer reviewers' evaluations remain confidential and are not disclosed to the public. Only external reviews, authorized for publication by the article's author(s), are made public. Typically, these final reviews are conducted after the manuscript's revision. Adhering to our double-blind review policy, the reviewer's identity is kept confidential.
The list of publisher reviewers can be found here.

The topic of the reviewed article, "Disinformation in the media: problems, challenges and solutions," has great potential, and its relevance is beyond doubt. The significance of the research is determined by scientific, socio-cultural, and political factors. The author identifies the informational challenges facing modern society that shape contemporary media communication, shows how the information flow has changed, and pinpoints its areas of risk. The author rightly notes the growing role of the media in shaping public opinion and the perception of reality, while singling out as a significant threat the increased flow of disinformation, which he defines as "deliberately created or distorted messages that mislead people," emphasizing its manipulative nature. At the same time, the author sets out not only to establish the importance of this problem in the context of globalization and the development of digital technologies, but also to propose methods of combating disinformation. The article is well structured: it clearly defines the subject and purpose of the research. The research methods are not spelled out separately, but the unity of the methodological base is evident from the analysis itself and from the list of references (note that the article is well grounded, although for some reason the citation of source 33 is not given in the same form as the others). The article rests on a solid theoretical basis: the list of references contains 35 sources, 24 of them foreign; thus the author takes into account not only domestic but also international experience in countering disinformation. The main part of the article highlights the causes of the spread of disinformation, shows how it affects society, and discusses which measures can effectively reduce its impact.
The conclusion offers detailed findings that fully represent the main content of the work. Importantly, the author examines the problem comprehensively, showing that disinformation affects different spheres of human activity (political: it shapes public opinion, influences electoral processes, and undermines the authority of government; life-sustaining: the article details how disinformation spreads in the healthcare sector; economic, and others) and constitutes a real threat, while the countermeasures he proposes are multidirectional, with special emphasis on developing media literacy and skills of information perception. Given the significance of the problem, the manner of its presentation, and its practical orientation, the article is in some respects even a groundbreaking study that will be in demand not only in the disciplines concerned with mass media, communication theory, and political research, but also in practical work. The results and conclusions will also interest a wide range of readers, helping them to form tools for assessing the information flow in which they are immersed and to develop criteria for distinguishing reliable from unreliable information. The article is recommended for publication.