Americans are increasingly being subjected to technologies that were formerly deployed against the country’s adversaries.
According to a recent revelation by Just the News, the Biden administration is working with companies, universities, and other organizations to fund an artificial intelligence (AI) censorship initiative that uses technology originally developed for cyber warfare against the Islamic State.
One free-expression watchdog claims the National Science Foundation supports the notion that if citizens’ trust in their government can’t be won by traditional means, then it must be established by scientific means.
Technologies previously used against America's enemies are now being used against us. The Biden admin (with Big Tech, academia & big corporations), is pouring taxpayer money into an AI censorship program that utilizes systems once used to wage information warfare against ISIS. pic.twitter.com/cEkQ1Ia9bo
— 🇺🇲✝️Ram_USA✝️🇺🇲 (@RamKou) January 27, 2023
When and Why?
The Intercept reported in late 2022 that DHS had gone well beyond the promises of the canceled “Disinformation Governance Board” to stifle free expression and mold internet conversation.
A whole-of-government strategy for managing the risk of [mal-information] was developed by government agencies, as evidenced by publicly available documents.
This approach aims to determine which tools, officials, and approaches are most appropriate to address the threats affecting the media environment.
According to reports, DHS has justified these restrictions on speech, and its decisions about what material individuals should be allowed to engage with, by arguing that the spread of misinformation and disinformation online might fuel terrorism.
The Biden Admin makes clear their goals of total censorship.
If a super AI comes online during this Admin or a similar one, we’re all toast.
— Jack Murphy ⚡️ (@jackmurphylive) July 16, 2021
Since Elon Musk’s “Twitter Files” exposed how federal operatives coerced private corporations into silencing journalists, political opponents, and even a former president, it has become evident that other government-aligned groups, sometimes acting in concert, also engaged in suppression and narrative planting.
The results may have been felt in the polls, as well as in the community.
It has been revealed that the FBI pressured at least one major social media platform to suppress a damaging story that could have hurt then-candidate Joe Biden’s prospects of being elected.
These labor-intensive attempts to control discourse and narratives would presumably need some refinement.
Ill-Informed Coverage on the Home Front
The National Science Foundation distributed grants totaling millions of dollars in government funding to academic institutions and commercial companies to create censorship mechanisms.
Just the News stated these technologies are quite similar to those created by the Defense Advanced Research Projects Agency during its Social Media in Strategic Communications initiative, which ran from 2011 to 2017.
The goal of developing these instruments was to lessen adversaries’ influence on the course of events by detecting their campaigns of misinformation and deception and countering them with accurate data.
That’s why DARPA said SMISC would study things like linguistic cues, information flow patterns, and opinion detection in social media content. The researchers’ goal was to identify patterns and cultural narratives by following the flow of ideas and opinions.
Rand Waltzman, DARPA’s program manager when SMISC was created, saw how social media might be used to improve the military’s understanding of the environment in which it operates and to enable more nimble use of information in support of operations.
He went on to say that in order to detect, categorize, measure, track, and influence events in social media in a timely manner, they would need systematic, automated, and semi-automated support for human operators, as opposed to relying on luck and rudimentary manual processes as they do now.

This article appeared in NewsHouse and has been published here with permission.