
Technological Acceleration: The Crisis of Information, Reality and One’s Sense of Self


The punchline here is that mediated messages developed via AI, metaverse and deepfake technologies are likely to greatly amplify the risks we have outlined in previous essays:
* In a highly polarized society, where in-groups and out-groups deeply distrust each other, mediated misinformation is far more likely to be believed without cognitive scrutiny.
* People with lower cognitive abilities and limited critical thinking skills are likely to be easily misled by messaging built with the technologies discussed in this chapter.
* Machiavellian leaders and influencers are likely to adopt these technologies quickly to manipulate susceptible people, and their use is likely to spread rapidly.

Advice about how to overcome this sinister use of AI and metaverse technologies often leans on the need for effort and education in seeking the truth. [For example, the MIT Media Lab website (Overview ‹ Detect DeepFakes: How to counteract misinformation created by AI — MIT Media Lab) lists numerous techniques for identifying deepfake videos.]

Mitigation strategies

The problem with all of these suggestions is that each requires us to “pay attention” to various aspects of the message and video. As we have described previously, many people either have no interest in critically examining material that aligns with their in-group thinking and beliefs, or lack the cognitive endurance to do so.

For those interested, the Massachusetts Institute of Technology (MIT) research website noted in the article above lets anyone test their ability to detect deepfakes and misinformation. Our limited technical knowledge suggests that technological sophistication has progressed significantly since this MIT project, to the point where some examples of this type – which we correctly identified – would now be undetectable.

As artificial intelligence technologies are used to create deepfakes and misinformation, new technologies are being developed to counter this risk. The report referenced earlier in this chapter describes measures to counter misinformation and deepfakes. This counter-effort is referred to as “warfare” – an indication of how seriously people and organizations regard this threat to a society’s values and beliefs. Experts in this field consider it a new and evolving threat to our democratic freedoms – the manipulation of how people think and believe in order to weaken western countries from the inside out. [Counter Misinformation (DeepFake & Fake News) Solutions Market – 2020-2026 (reportlinker.com)]

The above report notes that since most fake news and misinformation is disseminated via social media, “significant efforts are being made by technology companies Facebook, Twitter, Microsoft, Youtube and other leading content platforms to invest time and money to better understand and detect deepfakes to ensure their platforms are not misused by criminals and state-owned operators. However, these efforts alone will not be enough. These institutions will have to take a more prominent role by allocating larger budgets to purchase or develop capabilities to mitigate the risk. In addition to detecting these deepfakes, journalists and the social media platforms also need to figure out how best to warn people about deepfakes when they are detected to minimize the damage done”.

As previously noted, in the near future content that scores high on a deepfake/misinformation detection algorithm may carry highly visible pop-up notifications warning people of the risk. But even this “in-your-face” notification is unlikely to convince the committed conspiracy theorist. These are the kinds of people (as frequently portrayed in the media) who, when confronted with convincing evidence – for example, that there was no fraud in recent elections – simply won’t believe it; they don’t want to believe it. Other strategies and tactics are required here.
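To make this mechanism concrete, here is a minimal sketch of how such a flag might work, assuming a platform already has a detector that returns a confidence score between 0 and 1 for each piece of content. The detector stub, the 0.8 threshold, and the warning wording below are hypothetical choices made for illustration, not a description of any platform’s actual system.

```python
# Hypothetical sketch: show a warning when a deepfake/misinformation detector's
# score crosses a threshold. The detector is stubbed out; the threshold and the
# label wording are illustrative assumptions only.
from dataclasses import dataclass
from typing import Optional

FLAG_THRESHOLD = 0.8  # assumed cut-off; a real system would tune this carefully


@dataclass
class ContentItem:
    content_id: str
    text: str


def detector_score(item: ContentItem) -> float:
    """Stand-in for a trained deepfake/misinformation classifier (returns 0.0-1.0)."""
    # A real platform would call its own model here; we return a fixed value
    # so the example runs on its own.
    return 0.92


def build_notification(item: ContentItem, score: float) -> Optional[str]:
    """Return a visible warning label if the score exceeds the threshold, else None."""
    if score >= FLAG_THRESHOLD:
        return (f"Warning: this content scored {score:.0%} on an automated "
                f"deepfake/misinformation check. Verify before sharing.")
    return None  # below threshold: no pop-up is shown


if __name__ == "__main__":
    post = ContentItem(content_id="example-1", text="A suspiciously edited video...")
    label = build_notification(post, detector_score(post))
    print(label or "No warning shown.")
```

Even a pipeline like this only addresses detection and labeling; as argued above, the person the label is aimed at may simply refuse to believe it.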
