The Feds Have Hired Moonshot CVE, Be Worried

There was a time when CVE (countering violent extremism), a term first used by the Obama administration, described efforts against terrorists such as the Haqqani Network, al Qaeda, or the Islamic State. Now it is used to classify anyone the Federal government decides it should, and that could mean you, based solely on your various internet searches. Your searches for patterns, reports, names, dates, and other details are captured by Google (that is, if you are still using Google, which you should not be), and then you are scored.

Sound crazy? This is a long read, so hang on through it all; this post is an effort to give you full context, or at least as much as possible.

Ready?

Let’s begin here:

In February 2021, a little more than a month after the January 6 event in D.C., The Hill reported the following:

When armed insurrectionists stormed the Capitol on Jan. 6, Vidhya Ramalingam wasn’t surprised.

A day earlier, her company Moonshot CVE, which monitors and combats online extremism, set up a crisis team in response to a flood of indications that the pro-Trump rally scheduled for Washington could turn violent.

Moonshot works to pull back from the brink people who have been inculcated into white supremacist movements, conspiracy theories and radical ideologies, and it offered crisis intervention to some 270,000 high-risk users around the time of the Capitol breach.

“For organizations like ours that have been working on domestic violent extremism for many years, and in the run up to the election and the months that followed, this was not a surprise, that this attack happened,” Ramalingam said.

But even the 33-year-old Ramalingam, who has spent her entire career focused on the issue both domestically and abroad, says the widespread nature of radicalization in the U.S. is alarming.

“It’s a very scary moment in America right now. I mean, the implications are so wide-reaching,” she told The Hill in a recent interview. “There’s just the potential for so much more violence right now.”

Ramalingam got her start embedding herself with white nationalist groups in Sweden for two years as part of her graduate studies.

“It was really tough. I spent a lot of time around people saying and spouting lies about people of color and about immigrants and people like me, so there were moments that were really horrible to sit and listen to,” she said.

But the experience helped open a window into their world and how people become radicalized.

“Some of them had life experiences that had led them here. And for me, it was really important to see that in order to then start to piece together, well, how could you get someone out?” she said.

When far-right extremist Anders Behring Breivik murdered 77 people in Norway in July 2011, the European Union tapped Ramalingam to lead its first intergovernmental initiative to respond to right-wing terrorism, a job she held for three years.

She worked on deradicalizing initiatives such as Exit Germany and Exit Sweden that included efforts setting up counseling interventions and training family members and loved ones.

She also sits on the board of Life After Hate, a U.S.-based group that provides similar interventions along with building a network of former extremists to push back on extremist content.

But Ramalingam says the problem needs larger-scale responses, something that became clear with the rise of ISIS and its use of social media to radicalize people.

“There was this sense of defeat, that the terrorists were winning and that they were just better than we were, they were able to use technology better,” she said.

The London-based Moonshot, which opened its D.C. office this week, seeks to scale up monitoring and intervention using the kinds of targeting that have become commonplace in business to build personalized responses.

The company counts a slew of governments, the United Nations and major tech companies such as Facebook and Google among its funders, and groups including the Anti-Defamation League among its partners.

“Technology can actually have the power to scale up really deeply personalized interactions the same way that every single advertisement we see is personalized towards me, my gender, my behavior online, my identity, where I live,” said Ramalingam.

“It really is literally the same thing that Coca-Cola is doing to sell us more Coke. We’re using those same tools to reach people and try and offer them safer alternatives, and either save their lives or save other people’s lives.”

Those efforts, which range from widely used platforms such as Google and Facebook to more niche ones such as Gab and Telegram, have led to some surprising results.

While countering facts and ideological debates seldom work to engage people online, a more empathic approach seems to yield gains. In a recent round of tests, Moonshot’s target audience was 17 percent more likely to engage with posts featuring the simple message that “anger and grief can be isolating” compared to other tested messages.

Other content focused on deescalating anger and even breathing exercises also found fertile ground.

But Ramalingam says the threat is also evolving.

“We’ve seen this kind of blending and metastasization of various once-distinct ideologies, groups and movements. You know, everything from white supremacist and neo-Nazis with armed groups and anti-vaxxers and election conspiracies,” she says.

“These groups weren’t always coordinating, and now we’re suddenly seeing this mess online come together.”

There are a slew of factors at play, including what Ramalingam says is a tepid response from technology companies that have “systematically overlooked and been unwilling to respond” to the threat, though the Capitol insurrection last month could be changing that. Tech platforms, she notes, were far more aggressive when dealing with ISIS and have proven tools on issues such as suicide prevention that show how much more they could be doing.

Another major contributor to the problem has been the willingness of people in positions of power to bolster conspiracy theories and misinformation, whether through full-throated endorsements or more subtle means, such as winking claims that questions remain in actual clear-cut cases or that certain facts are unknowable.

“Political leaders and people in that level of power should absolutely not be lending any credence to conspiracy theories and disinformation. Lending even the tiniest inkling of credence to those conspiracy theories is hugely dangerous because of the position of power that they’re in,” she said.

Ramalingam is no stranger to Washington, having grown up just a few hours away and later testifying before Congress on the threat of white nationalism.

She says she has been in touch with senior members of the Biden administration on how to take a whole-of-government approach to combatting right-wing extremism, which FBI Director Christopher Wray says is the top terrorism threat the country faces.

She worries that the country will assume that the events of Jan. 6 were the apex of a movement, rather than simply the latest in a series of deadly attacks ranging from Charlottesville, Va., to Pittsburgh to El Paso, Texas.

“For those of us that have been working on this form of extremism for 10 plus years now, it would be misleading to say that this is the — kind of the crescendo and now it’s going to dissipate,” she said.

“I think there’s a risk for the U.S. government, that the response following the Jan. 6 events focuses on public statements and on Band-Aids and not on the changes and the real shifts that need to take place in the entire system to deal with domestic violent extremism,” she said.

Got it? Hold on, here comes the terrifying part…

From Fast Company:

How do you pull people out of the rabbit holes that lead to violent extremism, or keep them from falling in? If conspiracy-laced hate is another kind of pandemic pushed by online superspreaders, could we build something like a cure or a vaccine?

The deadly Capitol riot on January 6 has set off a fresh scramble to answer these questions, and prompted experts like Vidhya Ramalingam to look for new ways to reach extremists—like search ads for mindfulness.

“It’s so counterintuitive, you would just think that those audiences would be turned off by that messaging,” says Ramalingam, cofounder and CEO of Moonshot CVE, a digital counter-extremism company that counts governments like the U.S. and Canada and groups like the Anti-Defamation League (ADL) and Life After Hate among its clients. But Moonshot’s researchers recently found that Americans searching for information about armed groups online were more likely than typical audiences to click on messages that encourage calmness and mindful thinking.

“Our team tried it, and it seems to be working,” Ramalingam says. The finding echoes previous evidence suggesting that some violent extremists tend to be more receptive to messages offering mental health support. “And that’s an opening to a conversation with them.”

It’s a promising idea in a growing multimillion-dollar war—an effort that, even decades after 9/11 and especially after 1/6, is still hungry for tools to reach extremists. Old currents of violence and hate, amplified by a virtuous cycle of platforms and propagandists, are straining relationships and communities, draining wallets, and putting new pressure on the U.S. government to steer its anti-terror focus toward homegrown threats. Last month, the Department of Homeland Security said it was granting at least $77 million toward ideas for stopping what the agency says now represents the biggest danger to Americans’ safety: “small groups of individuals who commit acts of violence motivated by domestic extremist ideological beliefs.” 

The risk of violence is buoyed by a rising tide of conspiracy theories and extremist interest, which Ramalingam says has reached levels comparable to other “high risk” countries like Brazil, Sri Lanka, India, and Myanmar. In terms of indicators of extremism in the U.S., “the numbers are skyrocketing.”

How to reach people—and redirect them

To get those numbers, Moonshot goes to where the toxicity tends to spread, and where violent far-right groups do much of their recruiting: Twitter, YouTube, Instagram, and Facebook, but also niche platforms like MyMilitia, Zello, and Gab. But core to its strategy is the place where many of us start seeking answers—the most trafficked website of all. “We all live our lives by search engines,” Ramalingam says.

 

From an analysis of U.S. social media and search data by Moonshot and the ADL [Image: courtesy of Moonshot]

Social media tends to get the bulk of the attention when it comes to radicalization, but Google is also integral to the extremism on-ramp. And unlike social media, with its posts and shares and filters, a search can feel like a more private, largely unmoderated, experience. “We tell Google our deepest, darkest thoughts,” Ramalingam says. “We turn to Google and ask the things that we won’t ask our family members or partners or our brothers or sisters.”

Search can also convey to users an illusory sense of objectivity and authority in a way that social media doesn’t. “It’s important that we keep our eye on search engines as much, if not more than we do social media,” Safiya Noble, associate professor at the University of California, Los Angeles, and cofounder and codirector of the UCLA Center for Critical Internet Inquiry, recently wrote on Twitter. “The subjective nature of social media is much more obvious. With search, people truly believe they are experiencing credible, vetted information. Google is an ad platform, the end.”

Moonshot began in 2015 with a simple, insurgent strategy: Use Google’s ad platform—and the personal data it collects—to redirect people away from extremist movements and toward more constructive content. The idea, called the Redirect Method, was developed in partnership with Google, and widely touted as a way to reach people searching for jihadist content, targeting young men who were just getting into ISIS propaganda, or more radicalized people who might be Googling for information on how to sneak across the border into Syria. The idea is to steer potential extremists away—known as counterradicalization—or to help people who are deep down a rabbit hole find their way out through deradicalization. That might mean connecting them with a mentor or counselor, possibly a former extremist.
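
To make the mechanics concrete, here is a minimal, purely hypothetical Python sketch of the core logic behind a redirect-style campaign: match a search query against curated risk-keyword lists and serve a nonjudgmental message pointing to safer content. The keyword lists, messages, and URLs below are invented for illustration; this is not Moonshot's code and it does not use Google's ad systems.

```python
# Hypothetical sketch of redirect-style keyword targeting; data and URLs are invented.
from dataclasses import dataclass
from typing import Optional

@dataclass
class RedirectAd:
    message: str      # short, nonjudgmental headline shown above search results
    landing_url: str  # safer alternative content (counseling, calming videos, etc.)

# Toy keyword categories; a real campaign would rely on much larger, curated lists.
CAMPAIGNS = {
    "armed_groups": (
        {"join oath keepers militia", "join militia"},
        RedirectAd("Anger and grief can be isolating.", "https://example.org/calm"),
    ),
    "violent_intent": (
        {"how to make c4", "i want to shoot"},
        RedirectAd("Don't do something you'll regret.", "https://example.org/talk"),
    ),
}

def match_campaign(query: str) -> Optional[RedirectAd]:
    """Return the redirect ad for the first campaign whose keywords appear in the query."""
    q = query.lower()
    for keywords, ad in CAMPAIGNS.values():
        if any(kw in q for kw in keywords):
            return ad
    return None

if __name__ == "__main__":
    ad = match_campaign("How to make C4 at home")
    if ad is not None:
        print(f"Show ad: {ad.message!r} -> {ad.landing_url}")
```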

 

[Image: courtesy of Moonshot]

Ramalingam has seen these methods work up close. A decade ago, as part of her graduate studies, she embedded herself among neo-Nazis in Scandinavia, where a system of counseling and exit programs was helping bring people back to sanity and family. In 2015, she and another counter-extremism researcher named Ross Frenett started Moonshot to drive that approach using search ads, with a name that described their far-reaching goal. “If we knew that that worked offline,” she says, “couldn’t we test whether this would work online?” 

What began with a focus on jihadism and European white supremacy is now part of an effort to track a nexus of extremism, conspiracy theories, and disinformation—from QAnon to child exploitation content—from Canada to Sri Lanka. But for Moonshot, the U.S. is a new priority. Last month, Ramalingam, who grew up in the states, returned to open the company’s second office in D.C., where it can be closer to policy makers and practitioners. The company is also dropping the acronym from its birth name, Moonshot CVE: “Countering violent extremism” has become nearly synonymous with a misguided overemphasis on Muslim communities, Ramalingam points out, and in any case, old tactics aren’t sufficient. As extremist ideas have stretched into the mainstream, Moonshot’s once tiny target audiences now number in the millions.

“We can’t rely on what we knew worked when we were dealing with the dozens and the tens of people that were really on the fringes,” she says. “We need to be testing all sorts of new messaging.”

Understanding the data

If you were among the thousands of Americans who Googled for certain extremist-related keywords in the months around the election—phrases like “Join Oath Keepers Militia,” “I want to shoot Ron Wyden,” and “How to make C4”—you may have been targeted by the Redirect Method. It could have been a vague, nonjudgmental message at the top of your search results, like “Don’t do something you’ll regret.” Click, and you could end up at a playlist of YouTube videos with violence-prevention content, like a TED Talk by a would-be school shooter or testimonies from former neo-Nazis. Or you might encounter videos promoting calmness, or a page directing you to mental health resources. Around January 6 alone, Ramalingam says more than 270,000 Americans clicked on Moonshot’s crisis-counseling ads.

To do this, Google has given Moonshot special permission to target ads against extremist keywords that are typically banned. But while Moonshot launched the Redirect Method with Google’s help, these days it typically pays the ad giant to run its campaigns, just like any other advertiser. And now, given the sheer scale of the audiences Moonshot is reaching in the U.S., “the costs are off the charts,” Ramalingam says. Regarding its recent ADL-backed campaign, she says, “We’ve never paid this much for advertising in any one country on a monthly basis.”

This ad data comes with caveats. When looking at extremist search terms, for instance, Moonshot can’t be certain it’s measuring individual people or the same person searching multiple times. It also can’t know if it’s targeting an extremist or a journalist who’s simply writing about extremism.

 

Sample of U.S. Google search data during the three months around Election Day [Image: courtesy of Moonshot]

Still, the company is bringing more empirical evidence and scientific rigor to a field that sorely needs it, says Colin Clarke, the director of policy and research at the Soufan Center, an independent non-profit group that studies extremism. Moonshot’s data is even more concerning, Clarke says, because of another statistic that’s not exactly captured in Google analytics.

“At a time when people have been locked in their homes and consuming disinformation, with record levels of domestic violence, anxiety, depression, substance abuse, what’s the antidote? People have bought guns and ammunition in record numbers. So they’re anxious, they’re angry, isolated, and they’re well-armed,” he says. “It’s a perfect storm.”

In a recent analysis, done in partnership with the ADL and gathered in a report titled “From Shitposting to Sedition,” Moonshot tracked tens of thousands of extremist Google searches by Americans across all 50 states during the three months around Election Day. It saw searches spike around big political events, but also along geographic and political lines. In states where pandemic stay-at-home orders lasted 9 or fewer days, white-supremacist-related searches grew by only 1%; in states where stay-at-home orders were 10 days or longer, the increase was 21%.
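
As a rough illustration of how that kind of comparison could be computed (not the report's actual methodology), the sketch below groups hypothetical state-level search counts by stay-at-home-order length and averages the percentage change; all numbers are made up.

```python
# Illustrative only: invented state-level counts, grouped by stay-at-home-order length.
import pandas as pd

data = pd.DataFrame({
    "state": ["A", "B", "C", "D"],
    "stay_home_days": [7, 9, 14, 30],
    "searches_before": [1000, 800, 1200, 900],
    "searches_after": [1010, 808, 1450, 1090],
})

data["pct_change"] = (data["searches_after"] - data["searches_before"]) / data["searches_before"] * 100
data["order_length"] = data["stay_home_days"].apply(lambda d: "9 days or fewer" if d <= 9 else "10 days or longer")

# Average percentage growth in searches for each group of states.
print(data.groupby("order_length")["pct_change"].mean().round(1))
```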

The politics of the pandemic fomented domestic extremist interest, but also helped unite disparate fringe movements, from militias to climate denialists to anti-maskers and anti-vaxxers. “We started to see this worrying blending and metastasization of all these different ideologies in the U.S.—far-right groups blending and reaching across the aisle to work with anti-vax movements,” Ramalingam says. And it’s during times of crisis, she notes, “when we see these actors just grasping to turn fear and anxiety in society into an opportunity for them to grow.”

But Ramalingam isn’t just concerned about the most hard-core armed believers. After the election and the events of January 6, she worries now about splintered far-right groups and disaffected conspiracy theorists who are grappling for meaning. That puts them at risk of further radicalization, or worse.

“There are a lot of people who basically just feel misled, who feel like they’ve lost a lot because they followed these conspiracy theories,” she says. QAnon channels filled up with anxiety, self-harm, and talk of suicide, “like a father saying, ‘My son won’t speak to me,’ people who have lost their jobs, people who said, ‘I lost my family because of this,’ ” Ramalingam says. “And so there’s a real moment now where we need to be thinking about the mental health needs of people who, at scale, bought into these conspiracy theories and lies.”

What to say 

To reach violent radicals or conspiracy theorists to begin with, Ramalingam urges caution with ideological arguments. Shaming, ridiculing, and fact-based arguing can prove counterproductive. In some cases, it can be more effective to use nonjudgmental and nonideological messages that don’t directly threaten people’s beliefs or tastes but that try to meet them where they are. For instance, as Frenett suggests, if someone is searching for Nazi death metal, don’t show them a lecture; instead, show them a video with a death metal score, but without the racism.

Simple reminders to be mindful, and to think about how one’s actions impact others, may help. In its recent campaign, some of Moonshot’s most effective messaging asked people to “reflect and think on their neighbors, their loved ones, the people in their immediate community around them, and just to reflect on how their actions might be harmful to their loved ones,” Ramalingam says.

People interested in armed groups were most receptive to messages of “calm” offering mindfulness exercises. For all audiences, Moonshot found particularly high traction with an ad that said, “Anger and grief can be isolating.” When people clicked through, to meditation content or mental health resources, Ramalingam notes that “they seem to be watching it, or listening to it, or engaging with it for a long time.”

To reach QAnon supporters, Moonshot found the most success with messages that seek to empathize with their need for psychological and social support. “Are you feeling angry? Learn how to escape the anger and move forward,” said one Moonshot ad directed at QAnon keywords, which saw a click-through rate around 6%, twice that of other types of redirect messages. Clicking on that took some users to r/Qult_Headquarters, a subreddit that includes posts by former adherents.

Preventing the spread of violent extremist ideas involves a broader set of strategies. To bolster trust and a shared reality among the general public—people who haven’t yet gone down the rabbit hole—researchers are exploring other countermeasures like “attitudinal inoculation,” alerting people to emerging conspiracy theories and warning of extremist groups’ attempts at manipulation.

Experts also urge enhancing public media literacy through education and fact-checking systems. Governments may not be trusted messengers themselves, but they could help in other ways, through a public health model for countering violent extremism. That could mean funding for local community activities that can keep people engaged, and for mental health and social programs, an idea that then-Vice President Joe Biden endorsed at a 2015 White House summit on countering violent extremism.

Speaking of the White House, Ramalingam emphasizes that extremist ideologies warrant stern condemnation from public figures. Companies should deplatform the superspreaders of racism and disinformation, and political, cultural, and religious leaders should vehemently denounce them.

“Rhetoric that’s shaming of those ideologies can be really important and powerful from people in positions of power,” Ramalingam says. That’s for an already skeptical audience “that needs to hear it reinforced, but also the audience that is in the middle and doesn’t really know or doesn’t care that much. And that audience really needs to hear, ‘This is not okay. This is not acceptable. This is not a social norm.’ ”

But when addressing more extremist-minded individuals, Ramalingam suggests a gentler approach. “If someone is coming at you with an attack, you kind of pull yourself back into a corner and stand your ground and defend it,” she says. “And so if that’s our approach with the most extreme of society, that will actually worsen the problem.”

Does this work?

In the face of the domestic terror threat, mindfulness and compassion might sound like entering a space-laser fight with a water pistol or a hug.

But to Brad Galloway, who helps people exit right-wing extremist groups, Moonshot’s messaging makes sense. In a previous life, he used chat rooms and message boards to recruit people into a neo-Nazi group. After he joined—drawn in largely by camaraderie and music—what had been a U.S.-only organization eventually grew to 12 countries, thanks largely to the internet. Now Galloway is a coordinator at the Center on Hate, Bias and Extremism at Ontario Tech University, where he often urges his mentees to be more mindful, especially online.

“I ask people to think, Do I really need to watch this video of a school shooting?” Instead he encourages “positive content” to displace the stuff that can accelerate or even provoke radicalization.

Galloway, who has worked with Moonshot, Life After Hate, the Organization for the Prevention of Violence, and other groups, says the same principle of positive content applies to real life, too: Connecting with old friends and finding fun new activities can help people leave corrosive extremist communities. “What’s positive to that user, and how do we make that more prominent to them?”

 

Sample of U.S. Google search data around Election Day [Image: courtesy of Moonshot]

That’s not just a rhetorical question. What content works with which audience? Who is reachable? What counts as success? And how do strategies like the Redirect Method influence extremists? 

A 2018 Rand Corp. report on digital deradicalization tactics found that extremist audiences targeted with Redirect “clicked on these ads at a rate on par with industry standards.” Still, they couldn’t say what eventual impact it had on their behavior. As new funding flows in, and as experts throw up an arsenal of counter-radicalization ideas, there’s still scant evidence of what works.

For its part, Moonshot says its data suggests that some of its target audiences have viewed less extremist content, and points to the thousands of people it has connected to exit counseling and mental health resources. Still, Ramalingam says that the company sees “greater potential for us to assess whether our digital campaigns can lead to longer-term engagement with users, and longer-term change.”

There are other serious concerns as well. The missteps of previous digital wars on terror haunt Moonshot’s work: secret and extralegal surveillance systems, big data political warfare by military counter-radicalization contractors-turned-conspiracy mongers, untold violations of privacy and other civil rights. If Moonshot is tracking what messaging influences who, what data does it collect about “at risk” users, and where does that end up, and why? And who is at risk to begin with?

Ramalingam worries about the privacy concerns; she acknowledges that thanks to ad platforms and brokers, Moonshot can tap into “actually a heck of a lot of data.” But, she stresses, Moonshot isn’t accessing people’s private messages, and its work is bound by the stricter European personal data protections of the GDPR, as well as by an ethics panel that helps evaluate impacts. In any case, she argues, Moonshot is simply taking advantage of the multibillion-dollar digital platforms that drive most of the internet, not to mention the markets.

“As long as Nike and Coca-Cola are able to use personal data to understand how best to sell us Coke and sneakers, I’m quite comfortable using personal data to make sure that I can try and convince people not to do violent things,” Ramalingam says. Should that system of influence exist at all? “I’m totally up for that debate,” she says. “But while we’re in a context where that’s happening, I think it’s perfectly reasonable for us to use that sort of data for public safety purposes.”

What about the platforms?

The tech giants have run their own redirect and counter-speech programs as part of ongoing efforts to stem the toxicity that flourishes on their platforms. Google touts its work with Moonshot battling ISIS, its research on extremism, and its efforts to remove objectionable content and reduce recommendations to “borderline” content on YouTube. In December, its rights group Jigsaw published its findings on the digital spread of violent white supremacy.

Facebook tested the Redirect Method in a 2019 pilot in Australia aimed at nudging extremist users toward educational resources and off-platform support, a system that echoes its suicide-prevention efforts, which use pop-ups to redirect at-risk users to prevention hotlines. In an evaluation commissioned by Facebook last year, Moonshot called the program “broadly successful,” and recommended changes for future iterations. Facebook has also tested the program in Indonesia and Germany.

Ramalingam praises the tech platforms for their efforts, and supports their decisions to deplatform vast numbers of far-right and QAnon-related accounts, even if that’s made researching online extremism harder. Still, she says, Big Tech is doing “not nearly enough.”

Extremist content continues to slip through the platforms’ moderation filters, and then gets rewarded by the algorithms. Facebook’s own researchers have repeatedly shown how its growth-focused algorithms favor extremist content. Despite YouTube’s moderation efforts, political scientist Brendan Nyhan recently reported, the site’s design can still “reinforce exposure patterns” among extremist-curious people.

“The tech companies have an obligation to use their great privilege . . . of being a conduit of information, to get information and helpful resources to people that might be trapped in violent movements,” Ramalingam says.

As companies and lawmakers and law enforcement scramble for solutions in the wake of the events of January 6, Ramalingam also cautions against rash decisions and short-term thinking. “There’s an imperative to act now, and I have seen in the past mistakes get made by governments and by the tech companies just delivering on knee-jerk responses,” she says. “And then once the conversation dies down, they go back to essentially the way things were.”

Emotional reactions are understandable, given the shock of January 6, or of a family member who’s fallen down a rabbit hole, but they tend to be counterproductive. What works for battling violent extremism on a personal, one-on-one level, Ramalingam says, can also help fight it on a national scale: Avoid assumptions, be mindful, and consider the actual evidence.

“The way counselors and social workers do their work is they start by asking questions, by trying to understand,” she says. “It’s about asking questions so those people can reflect on themselves.”

Not finished yet… it gets worse.

From CSP:

The Defense Department, led by controversial diversity chief Bishop Garrison, has commissioned a study to investigate “extremism” in its ranks. But the chosen contractor may raise additional questions for a DOD that is already facing increasing Congressional scrutiny over accusations of politicization.

The U.S. Military Academy reportedly is working with a London, England-based firm, Moonshot CVE [Countering Violent Extremism], whose CEO is Vidhya Ramalingam, a former Obama Foundation leader. Ramalingam is also the author of a 2013 paper on immigration in Europe funded by a grant from George Soros’ Open Society Foundations.

Ramalingam told Defense One she spoke with Garrison personally last month about how the Pentagon could use technology developed by her company to “find and eliminate extremism in the ranks.”

Why would the Pentagon hire a U.K.-based company to study allegations of extremism in the U.S. military? Why hire a politically connected group like Ramalingam’s?

It suggests that Garrison and Secretary of Defense Lloyd Austin may be looking for a predetermined answer. A deeper dive into Moonshot CVE might help unravel what they have in mind.

Moonshot CVE co-founder Ross Frenett expressed his support for Critical Race Theory (CRT) on Twitter last month, calling the opposition “Horrifying.” Joint Chiefs Chairman Mark Milley recently faced stiff criticism from congressional Republicans over the military’s recent moves to incorporate CRT elements into their training.

Moonshot CVE’s website dismisses Antifa’s and Black Lives Matter’s Marxist leanings and claims that those who assert its Marxism have engaged in a “white supremacist disinformation” campaign “as a means of delegitimizing it.”

“These sources echo far-right extremist disinformation narratives about BLM protesters trying to overthrow the republic and harm American citizens in a Marxist coup,” Moonshot CVE wrote in a paper jointly published with the Anti-Defamation League (ADL).

Of course, Antifa and BLM groups haven’t been shy about identifying themselves as Marxists. A popular graphic that circulated on pro-Antifa websites and Telegram accounts during the so-called “George Floyd Rebellion” of June 2020 claimed, “Militant networks will defend our revolutionary communities. Liberation begins where America dies,” and the status of BLM founders as self-identified “trained Marxists” has been openly discussed in the press.

Ramalingam and her organization claim that Antifa is unorganized, ignoring evidence of significant local, regional, and international Antifa networks, and substantial material support from an extensive far-left network (including, as noted below, the Rosa Luxemburg Stiftung). An extensive social media network, including peer-to-peer encryption apps, also exists, where BLM and Antifa activists share propaganda and techniques.

Why does Moonshot CVE fixate exclusively on “far-right” extremism, and work to minimize or deny the evidence of left-wing extremism?

One reason might be Moonshot’s apparent association with a German far-Left organization which is overtly pro-Marxist and pro-Antifa, and whose leaders have historical ties to Russian intelligence.

Ramalingam is a regular contributor to programs for an initiative at American University in Washington, D.C. called The Polarization and Extremism Research and Innovation Lab (PERIL). She participated in PERIL-sponsored seminars in October 2020, in April, and last month.

PERIL has partnered with The Rosa Luxemburg Stiftung (RLS), the think tank of the German political party Die Linke (The Left). Die Linke is the successor of the former East German communist party. The think tank is named for Rosa Luxemburg, a German Communist revolutionary whose ideas pioneered the Marxist examination of race and gender and who was killed during the 1919 German communist uprising. A 2008 report by the German Federal Office for the Protection of the Constitution calls “the memory” of Luxemburg a “traditional element of Left-wing extremism.”

This alliance could be revealing about Ramalingam’s and PERIL’s ideological orientation.

PERIL’s description of the RLS is misinformation and raises questions about what else it glosses over.

PERIL unsurprisingly omits the fact that the organization’s top leaders belonged to East Germany’s ruling party, the Socialist Unity Party (SED), and/or were either employees or informants of the Soviet KGB-run STASI. Many former STASI members shifted their allegiance to the KGB after the STASI was disbanded, a defector told The Washington Post in 1990. Die Linke is a pro-Russia stalwart. RLS’s representative in Moscow is a woman named Kerstin Kaiser, a former STASI employee who provided reports that were given to the KGB.

Kaiser belongs to the Petersburger Dialogue, along with Andre Brie, another RLS leader and former STASI employee. Vladimir Putin and former German Chancellor Gerhard Schroeder, an important figure in Russia’s controversial Nordstream 2 pipeline, created the group in 2001 to foster closer Russian-German relations.

“It stands in the tradition of the workers’ and women’s movements, as well as anti-fascism and anti-racism,” PERIL says on its website.

Given that The Rosa Luxemburg Stiftung was founded in 1990 after the fall of the Berlin Wall, known officially in East Germany as the “Antifascist Protection Barrier,” one might have questions about what “traditions” of antifascism the group actually stands for.

PERIL’s head, Cynthia Miller-Idriss, wrote a blog post for the RLS’s New York office on “radicalization” during COVID last year. On Twitter, she thanked RLS for the opportunity to write for it.

Miller-Idriss and Ramalingam both participated in a conference in Jena, Germany called “Hate Not Found” sponsored by the Institute for Democracy and Civil Society last December where Miller-Idriss was the keynote speaker. Rosa Luxemburg Foundation member Maik Fleilitz was on a panel at the conference that discussed “deplatforming the far-Right.”

Ramalingam and Miller-Idriss both contributed articles to a journal on “radicalization” on the Far-Right in November of 2020.

RLS’s global head Dagmar Enkelmann belonged to the SED and the East German parliament before the wall fell. Gregor Gysi, who helped open the RLS’s New York office in 2012 and who visited last month, headed the SED when it rebranded itself as the “Party of Democratic Socialism” in December 1989. Gysi allegedly informed on his legal clients to the STASI. A bloc in the German Bundestag expelled him in 1992 for seeming to defend the STASI.

STASI informants played a key role in promoting the climate of fear that kept East German society under control. RLS hosted former East German spy chief Werner Grossmann in 2010 for a talk on his book.

East Germany’s last Premier Hans Modrow is an RLS member, and the RLS manages his foundation, The Hans Modrow Stiftung. Modrow had close KGB ties, including to KGB Chairman Vladimir Kryuchkov, who ran the Soviet spy agency during Modrow’s tenure as Dresden Communist Party boss. Modrow supervised the dismantling of the STASI together with Grossmann. Modrow received the Order of Friendship from Vladimir Putin in 2017. He remains embittered toward Mikhail Gorbachev for allowing the collapse of the East German regime.

As a young KGB major, Putin supervised a local STASI office in Dresden, while Modrow was the local party boss.

The STASI trained the Red Army Faction (RAF), a predecessor of today’s Antifa.

RLS funded Antifa activities in Germany, and Die Linke openly supports Antifa. The Hamburg, Germany, Antifa chapter even promoted a Rosa Luxemburg Stiftung panel on its Facebook page. Friedrich Burschel, editor of “Antifascitisiches Info Blatt,” advises the Rosa Luxemburg Foundation on subjects related to right-wing extremism and fascism. “Antifascitisiches Info Blatt,” the oldest Antifa publication, having first entered publication in 1987 in East Berlin, publishes articles on the Rosa Luxemburg Stiftung-funded website Linksnet, a collaboration of far-Left magazines.

The RLS hosted two BLM founders, Alicia Garza and Opal Tometi, in 2014 and 2015, respectively. Garza attended the RLS-sponsored “Mapping Socialist Strategies” seminar in August 2014. RLS leader and former “unofficial STASI employee” Michael Brie spoke at this event. His brother Andre Brie spoke at a 1994 “Committees of Correspondence for Liberation and Socialism” conference along with Angela Davis, who has become influential in BLM. Davis worked closely with the East German regime in the 1970s, and she was a guest of honor at an event sponsored by Die Linke a decade ago. RLS’s New York office hosted BLM propagandist Shaun King in 2017.

The Southern Poverty Law Center (SPLC) is another PERIL partner with which Ramalingam has worked. The SPLC has also received money from the Rosa Luxemburg Stiftung. The SPLC is an extremely controversial organization which has been accused by its own former employees of bias and of deliberately overinflating supposed far-right threats for fundraising. SPLC has defended Antifa. Former SPLC Intelligence Project Director Heidi Beirich and SPLC Intelligence Project Senior Analyst Evelyn Schlatter participated in a June 2017 RLS-sponsored session in New York called “Strategies Against the Far Right.” Ramalingam and Beirich are both advisory group members of a pan-European “anti-radicalization” project called The DARE Consortium. In October, Ramalingam, Beirich and Miller-Idriss collaborated on a podcast on countering extremism sponsored by the ADL.

Moonshot CVE’s alliance with RLS-backed PERIL reinforces the perception that the Biden Pentagon’s hunt for extremism is actually an excuse for classifying dissenting views as “extremist.” And the pro-Russian/ex-STASI-controlled RLS’s endorsement of the same talking points as Moonshot CVE shows it comes from a far-Left extremist perspective. U.S. troops shouldn’t be subjected to ideological warfare.

The fact Moonshot CVE equates opposing Antifa with extremism reminds us that this company doesn’t deserve taxpayer money or the Pentagon’s cooperation.

You’re already guilty just because of the research you do, while so many other cases are not prosecuted at all. Take caution, reader…

Even Federal contracts have gone to universities….

George Washington University School of Law’s Program on Extremism has created an online resource for tracking the hundreds of criminal cases filed by the Biden Justice Department against United States citizens for their alleged actions on January 6th. The Administration has charged people from all 50 states, and as is reflected in the “Capitol Hill Siege” project archive, every case has been filed in the District of Columbia. Read more here.


Yandex, a Russian Tech Company, is on 250 US University Campuses

It is no wonder they are hacking America to death…

Do we really know what is inside these machines?

Forbes:

Yandex, a Russian tech company working on self-driving systems, is partnering with GrubHub to deploy a fleet of delivery robots on selected college campuses in the United States later this year.

[Image: Yandex NV tests autonomous delivery robots on Russian streets. © 2020 Bloomberg Finance LP]

Financial terms of the partnership were not disclosed.

Yandex often compares itself to Google. It offers a range of services, including a search engine, ride-hailing and food delivery. The company began operating food delivery robots, called Rovers, in 2019 in Moscow, Tel Aviv and Ann Arbor, Michigan.

“We chose to partner with GrubHub for campus delivery because of GrubHub’s unparalleled reach into college campuses across the United States, as well as the flexibility and strength of their ordering platform,” said Dmitry Polishchuk, CEO of Yandex Self-Driving Group. “We are delighted to deploy dozens of our rovers, taking the next step in actively commercializing our self-driving technology in different markets across the globe.’’

The partnership plans to launch the Rovers on 250 campuses.

“While college campuses are notoriously difficult for cars to navigate, specifically as it relates to food delivery, Yandex robots easily access parts of campuses that vehicles cannot,” Brian Madigan, GrubHub vice president of corporate and campus partners, said in a statement.

Yandex robot fleets have logged seven million autonomous miles since the team was founded in 2017, second only to Alphabet’s Waymo. That’s up from two million miles in February 2020.


Yandex employs about 400 engineers, plus operational and support staff.

Artem Fokin, Yandex’s head of business development, told Forbes.com that the company has spent only $100 million on development in the four years since inception. That’s relatively frugal compared to Silicon Valley teams, which have raised billions toward the same goal.

The company’s Rovers deliver take-out meals, groceries, and retail consumer goods. Yandex has increased the dimensions and carrying capacity of the Rovers over time, to accommodate larger loads.

“We’ve worked to make the cost of Rover delivery extremely economical,” spokesman Yulia Shveyko told Forbes.com contributor David Silver in May. “In Russia, human delivery is very price-competitive, and we have to be even more affordable than that.”

Unlike public roads, where vehicles travel in lanes and their travel patterns are predictable, Yandex vehicles must navigate sidewalks and other pedestrian paths where people’s movements are less orderly.

They can operate in broad daylight and in the dark of night, in moderate snowfall and rain, as well as in controlled and uncontrolled pedestrian crossing scenarios. But they can only travel at speeds up to 5 miles per hour.

Yandex owns 73% of its Self-Driving Group, while Uber owns 19% and a group of Yandex employees own the remaining 8%.

 

REvil, the Ransomware Hackers’ System Identified

Ahead of the three-day Fourth of July weekend, the REvil gang is suspected to be behind a new ransomware attack Friday that affected at least 200 companies in the U.S.

REvil, based in Russia, was likely behind the JBS Meat Packing attack in May, according to the FBI. The Flashpoint Intelligence Platform has suggested that former REvil members were also involved in the Colonial Pipeline attack earlier this year, which was allegedly carried out by the DarkSide ransomware group. More here from Newsweek.

Per the FBI’s most recent statement:

Updated July 4, 2021: 

If you feel your systems have been compromised as a result of the Kaseya ransomware incident, we encourage you to employ all recommended mitigations, follow guidance from Kaseya and the Cybersecurity and Infrastructure Security Agency (CISA) to shut down your VSA servers immediately, and report your compromise to the FBI at ic3.gov. Please include as much information as possible to assist the FBI and CISA in determining prioritization for victim outreach. Due to the potential scale of this incident, the FBI and CISA may be unable to respond to each victim individually, but all information we receive will be useful in countering this threat.


Original statement:

The FBI is investigating this situation and working with Kaseya, in coordination with CISA, to conduct outreach to possibly impacted victims. We encourage all who might be affected to employ the recommended mitigations and for users to follow Kaseya’s guidance to shut down VSA servers immediately. As always, we stand ready to assist any impacted entities.

Additionally:

Kaseya had expected that it would be able to patch and restore its VSA software-as-a-service product by today, but technical problems its developers encountered have blocked the rollout. As of 8:00 AM EDT today, the company was still working to resolve the issues it encountered.

Reuters quotes US President Biden as offering, yesterday, a relatively upbeat preliminary assessment of the consequences of the ransomware campaign: “It appears to have caused minimal damage to U.S. businesses, but we’re still gathering information,” Mr. Biden said, adding “I feel good about our ability to be able to respond.”

That said, the US Government is continuing its investigation and is signalling an intention to do something about REvil and other gangs or privateers. Among other things, the US Administration said that it has communicated very clearly to Russian authorities that the US wants the REvil operators brought to book. CBS News reported yesterday that White House press secretary Psaki said that the US had been in touch with Russian officials about the REvil operation, and that if Russia doesn’t take action against its ransomware gangs, “we will.” TASS is, of course, authorized to disclose that Russia not only had nothing to do with the attack but also knew nothing about it, and that in fact Moscow had heard nothing from Washington about the matter.

But cyber experts outside government have uncovered the following:

Hat tip source

Resecurity® HUNTER, a cyber threat intelligence and R&D unit, identified a strong connection to a cloud hosting and IoT company servicing a domain belonging to the cybercriminals.

According to recent research published by Resecurity on Twitter, starting in January 2021 REvil leveraged a new domain, ‘decoder[.]re’, in addition to a ransomware page available on the TOR network.

***

The domain was included within the ransom notes dropped by the recent version of REvil; it came in the form of a text file containing contact and payment instructions.

[Image: REvil infrastructure map]

Typically, the collaboration between the victim and REvil was organized via a page in TOR, but in cases where a victim is not able to access the Onion Network, the group prepared domains available on the Clearnet (WWW) acting as a ‘mirror’.

[Image: REvil victim page on the TOR host]

 

[Image: REvil victim page on the WWW host (decoder[.]re)]

To access the page on the WWW or TOR host, the victim needs to provide a valid UID (e.g., “9343467A488841AC”). The researchers acquired a significant number of UIDs and private keys as a result of detonated ransomware samples and through collaboration with victims globally. The private keys determine if the same functional process is available on both resources, confirming they’re delivering exactly the same content.

Like decryptor[.]cc and decryptor[.]top in previous REvil / Sodinokibi versions, decoder[.]re is used to grant the victims access to the threat actors’ website for further negotiations. The application hosted on it contains ‘chat’ functionality enabling interactive, close to real-time communications between the victim and REvil.
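
One generic way a researcher could confirm that a clearnet mirror and an onion site serve the same content is to fetch both pages and compare content hashes, routing the onion request through a local Tor SOCKS proxy. The sketch below is a hypothetical illustration: the URLs are placeholders, it assumes a Tor client on 127.0.0.1:9050 and the requests library with SOCKS support installed, and it is not Resecurity’s tooling.

```python
# Hypothetical sketch: compare a clearnet mirror and an onion page by SHA-256 of the body.
# Assumes: pip install "requests[socks]" and a local Tor client listening on 127.0.0.1:9050.
import hashlib
from typing import Optional

import requests

CLEARNET_URL = "http://decoder.example/"       # placeholder for a clearnet mirror
ONION_URL = "http://exampleonionaddr.onion/"   # fabricated onion address
TOR_PROXIES = {
    "http": "socks5h://127.0.0.1:9050",
    "https": "socks5h://127.0.0.1:9050",
}

def page_hash(url: str, proxies: Optional[dict] = None) -> str:
    """Fetch a URL and return the SHA-256 hex digest of its response body."""
    resp = requests.get(url, proxies=proxies, timeout=60)
    resp.raise_for_status()
    return hashlib.sha256(resp.content).hexdigest()

if __name__ == "__main__":
    clearnet_digest = page_hash(CLEARNET_URL)
    onion_digest = page_hash(ONION_URL, proxies=TOR_PROXIES)
    print("identical content" if clearnet_digest == onion_digest else "content differs")
```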

The threat actors also used a disposable temporary e-mail address created via https://guerrillamail.com to anonymously register the domain name, which was later used for name servers too; this also allowed them to park other elements of their infrastructure. Such e-mail addresses can only be used for a limited time; for example, all communications with them are automatically deleted within one hour.

Resecurity was able to collect the available and historical DNS records, then create a visual graph representing the current network infrastructure used by REVil and shared it with the cybersecurity community. According to experts, such a step may facilitate proper legal action against ransomware, as well as outline parties responsible for such malicious activity, as the uncovered details raise significant questions regarding the reaction from hosting providers and law enforcement.
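
For a sense of what basic DNS collection on such an indicator can look like, here is a small sketch using dnspython and networkx: it re-fangs a defanged domain, resolves its current A and NS records, and stores them as a tiny graph. This is an assumption-laden illustration, not Resecurity’s method; historical records would require a passive-DNS source, which is not shown.

```python
# Illustrative DNS collection for a defanged indicator (current records only).
# Assumes: pip install dnspython networkx
import dns.exception
import dns.resolver
import networkx as nx

def refang(indicator: str) -> str:
    """Turn a defanged indicator like 'decoder[.]re' back into a resolvable name."""
    return indicator.replace("[.]", ".")

def collect_records(domain: str) -> nx.Graph:
    """Resolve A and NS records and link them to the domain in a small graph."""
    graph = nx.Graph()
    graph.add_node(domain, kind="domain")
    for rtype in ("A", "NS"):
        try:
            answers = dns.resolver.resolve(domain, rtype)
        except (dns.resolver.NXDOMAIN, dns.resolver.NoAnswer,
                dns.resolver.NoNameservers, dns.exception.Timeout):
            continue
        for rdata in answers:
            value = rdata.to_text()
            graph.add_node(value, kind=rtype)
            graph.add_edge(domain, value, record=rtype)
    return graph

if __name__ == "__main__":
    g = collect_records(refang("decoder[.]re"))
    print(list(g.edges(data=True)))
```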

[Image: REvil network infrastructure map]

Based on the network and DNS intelligence collected by experts, the IPs associated with the domain have been rotated at least three times in Q1 2021 and were related to a particular cloud hosting and IoT solutions provider located in Eastern Europe, which continues to service them.

“It’s hard to believe such malicious activity has gone unnoticed by certain governments, resulting in damage to thousands of enterprises globally,” said Gene Yoo, Chief Executive Officer of Resecurity.

President Joe Biden has ordered U.S. intelligence agencies to investigate the sophisticated ransomware attack on Kaseya presumably conducted by REVil, a notorious cybercriminal syndicate believed to have ties to Russian-speaking actors that’s previously gone after high-profile targets such as Apple and Acer.

The group is also believed to be behind last month’s successful attack on the world’s largest meat processing company, JBS, that extorted $11 million in ransom. REvil took official responsibility for the attack and released an announcement on their blog, which is available on the TOR network, asking for a $70 million payment from Kaseya – the biggest ransom payment demand known in the industry today.

The attack has already affected over 1,000 businesses globally disrupting their operations. One suspected victim of the breach, the Sweden-based retailer Coop, closed at least 800 stores over the weekend after its systems were taken offline.

The White House Press Secretary Jen Psaki said the US will take action against the cybercriminal groups from Russia if the Russian government refuses to do so.

The investigation is still ongoing.

About the author: Gene Yoo, Chief Executive Officer (Resecurity, Inc.)

Cartel Del Golfo is Operating Stash Houses in Texas

Primer, from the Justice Department, January 2020: CDG is a violent Mexican criminal organization engaged in the manufacture, distribution, and importation of ton quantities of cocaine and marijuana into the United States. In the late 1990s, the Gulf Cartel recruited an elite group of former Mexican military personnel to join their ranks as security and enforcers who became known as Los Zetas. The Gulf Cartel and Los Zetas operated under the name of “The Company.” Costilla-Sanchez became the leader of The Company for several years following the arrest of Osiel Cardenas in 2003 and before Costilla-Sanchez’s arrest in September 2012. More details here.

***

[Image: Mexican authorities rescue 47 kidnap victims from cartel]

So, with that classification already in place, and with stash houses operating inside the United States, why has it not been declared a domestic terror organization, and where are the arrests by Federal agents?

Texas border stash house packed with 108 migrants in searing heat

Nearly 930,000 illegal migrant crossings were reported by CBP through the end of May

A large human smuggling stash house harboring 108 migrants in southeast Texas was uncovered by U.S. Border Patrol agents Monday afternoon.

The migrants were found crammed inside what appeared to be an old car garage, enduring extreme heat and harsh living conditions.

Border Patrol officials told Fox News that smugglers keep migrants in stash houses located near the southern border before dispersing them deeper into the U.S.

The insignia for “Cartel Del Golfo,” which means Gulf Cartel, was spray-painted on one of the interior garage walls – which law enforcement said was the cartel’s method for laying claim to the operation.

 

Border Patrol said the Gulf Cartel is known to be heavily involved in running human smuggling operations across Texas’ southeast border.

Law enforcement initially said 107 migrants were found at the house before upping the count by one.

Officials identified one migrant caretaker during their apprehension near Alton, Texas Monday, but did not confirm whether he was involved in the running of the smuggling operation.

Five unaccompanied children and two family units with children as young as six years old were uncovered in the stash house, U.S. Customs and Border Protection (CBP) confirmed Tuesday.

The migrants arrived from Mexico, Ecuador, El Salvador, Honduras, and Guatemala.

Stash houses like the garage discovered Monday are not rare sights for Border Patrol agents.

One hour after the stash house in Alton was discovered, CBP reported that a residence near Rio Grande City was found to have been harboring 23 adult migrants.

Fox News could not immediately reach CBP to confirm the number of stash houses found in 2021 but earlier this month local news outlet KGNS reported that over 4,000 migrants had been arrested in more than 200 dismantled stash homes.

CBP has reported nearly 930,000 illegal immigrant encounters at the southern border since January.

More than 180,000 migrants were encountered in May alone.

 

Hunter Gets Big Money for his Paintings Likely Due to his Shady Art Dealer

Are any officials investigating the criminal activity, other than the strident journalists at the New York Post? (Rhetorical.)

Hat tip:

As federal prosecutors continue their criminal probes into Hunter Biden’s taxes and international business dealings, the President’s son — shuttling between Washington DC and a sprawling Los Angeles home — is lying low, consulting with lawyers and focusing on his new career in art.

[Image: The Georges Berges Gallery at 462 West Broadway in Soho. Photo: Helayne Seidman]

Biden, who turns 51 next week, is prepping a solo show with Soho art dealer Georges Berges, who currently represents Sylvester Stallone. Berges was once arrested for “terrorist threats” and assault with a deadly weapon in California and has strong ties to China.

Biden, who continues to hold business interests in a billion-dollar Chinese investment firm, recently moved to a sprawling Venice Beach rental with his wife Melissa Cohen and 10-month-old son, according to the Daily Mail. He was previously living in a Hollywood Hills home where he had set up an art studio.

That home is connected to Shane Khoh, a Los Angeles-based entrepreneur and real estate investor who is CEO of SXU Investment Holdings LLC, the California company that has owned the $3.8 million property since 2011, according to public records. Khoh, an American who is fluent in Chinese, sits on the board of Siong Heng Realty Pte Ltd., a Singapore-based real estate holding company, according to his LinkedIn profile. He is also listed as a “venture partner” of Diverse Communities Impact Fund, a private-equity group that features former Democratic New Mexico Gov. Bill Richardson on its board of advisors.

The house was featured in a New York Times profile of Biden as an emerging abstract painter last year. Khoh told The Washington Examiner last year that Biden was paying $12,000 a month for the property, which features a pool house that Biden has turned into an art studio. Khoh denied any prior relationship with Biden to the newspaper.

But when The Post asked this week about his arrangements with his tenant, Khoh clammed up: “I have nothing to say about Hunter Biden. I have no comment.”

Biden and his family have since moved into a $5.4 million Venice Beach home owned by Sweetgreen co-founder and CEO Jonathan Neman, according to the Daily Mail report.

Others in Biden’s orbit were even more reticent.

Lunden Alexis Roberts, an Arkansas stripper who sued Biden for paternity and child support after the birth of their 2-year-old daughter, refused to comment, as did her lawyer. It is not known how much Biden is paying in child support for “Baby Doe,” as she is referred to in court papers. The father of five had initially argued that the child was not his, and repeatedly tried to delay the case. Roberts, who met Biden at a Washington, DC, strip club where she used to work, said in a December 2019 court filing that Biden had not provided any financial support for the child.

Although Biden has divested himself of many of his old business interests, he does not seem to be hard up for cash. He has been seen driving around Los Angeles in a Porsche Panamera, which retails for more than $90,000. He retains control of a limited liability corporation that has a 10 percent stake in BHR Partners, a Chinese private-equity firm with $2 billion in assets and partly owned by the Bank of China, according to reports.

Biden’s stake in the Chinese firm is owned by Skaneateles LLC, a company named for his mother Neilia Hunter Biden’s upstate New York hometown. The company has used the Hollywood Hills home as one of its addresses. Neilia, Joe Biden’s first wife, died in a 1972 car crash in Delaware that also killed Biden’s 1-year-old sister Naomi. Hunter Biden and his older brother Beau, who were toddlers, were injured in the accident.

“It’s like a lottery ticket he has in his hand with a 10 percent stake in a company worth billions,” said a source. “Just imagine if that company is worth $2 billion, Biden takes home $200 million.”

Biden’s convoluted international business dealings became a heated political issue in the final months of the 2020 presidential campaign after The Post revealed a trove of emails from Hunter’s laptop that raised questions about then-candidate Joe Biden’s ties to his son’s foreign business ventures, including Burisma. The Ukrainian energy company reportedly paid Hunter $50,000 a month between 2014 and 2019 to sit on its board of directors. Hunter Biden is also accused of promoting the interests of CEFC China Energy Co, a Chinese conglomerate that was to pay him more than $10 million a year for introductions to officials in Washington.

Last year, a federal watchdog called on the Department of Justice to launch “a full investigation” of Hunter Biden, who it claims did not register under federal Foreign Agent Registration Act rules that govern those lobbying for a foreign entity.

“Hunter Biden’s tangled web of shell companies, LLCs, investment vehicles, and options agreements make it virtually impossible to know where he is getting income from,” said Thomas Anderson, director for the National Legal Policy Center, adding that circumventing the FARA regulations allowed Biden and his associates to operate under the radar.

Selling his abstract artwork to wealthy investors may also be a lucrative way to rake in cash, Anderson said. “We highly doubt, however, a career as an artist will do anything more than act as a vehicle to further shield where that income is coming from,” he said.

But Hunter Biden told The Times he had another reason for turning to art. Painting is “literally keeping me sane right now,” he said, adding that it helped him in his battles with addiction to drugs and alcohol.

“If I didn’t know who it was and I saw it for the first time, I would think it was pretty interesting stuff. He’s got talent,” New York art critic Anthony Haden-Guest told The Post.

The paintings feature pastel bursts of flowers and other shapes made with layers of alcohol ink that he blows with a metallic straw onto Japanese Yupo paper, a smooth synthetic material made from recycled paper.

Biden’s new dealer, who opened his Soho gallery in 2015, is tight-lipped about his galleries in New York and Berlin, which are reportedly frequented by Spike Lee, Dave Chapelle and Susan Sarandon as well as international titans of industry.

“He’s got this Woody Allen look to him … He’s crazy in a good way,” one artist who’s worked with Berges told The Post.

Berges, 44, regularly features works by Chinese artists and told a Chinese network that he was keen to open other art galleries in Beijing and Shanghai in 2015. “The questions that I always had was how’s China changing the world in terms of art and culture,” Berges told the China Daily in 2014.

Berges was accused of defrauding an investor in a 2016 federal lawsuit. Ingrid Arneberg claims she invested $500,000 in Berges’ gallery for a promised expansion, but instead he used the cash to pay off old debts. Berges later countersued Arneberg, and the case was settled in 2018.

In 1998, he was charged with assault with a deadly weapon and making “terrorist threats,” which were dismissed. He pled “no contest” to the assault and received 36 months probation and served 90 days in jail, according to Santa Cruz Superior Court documents — the only information publicly available about the case.

Berges did not return several messages seeking comment. A worker at his gallery in Soho told The Post he didn’t know anything about Hunter Biden’s solo exhibition, which is scheduled for later this year, according to reports.

George Mesires, a lawyer for Hunter Biden, did not return The Post’s calls.