An affordable price is probably the main benefit persuading people to buy drugs at www.americanbestpills.com. The cost of medications in Canadian drugstores is considerably lower than anywhere else, simply because the medications there are oriented toward international customers. In many cases you will be able to cut your costs considerably, and perhaps even save a small fortune on your prescription drugs. What's more, Canadian pharmacies offer free-of-charge shipping, a convenient addition to the other benefits on offer. A low price is especially appealing to users who are on a tight budget.
Service Quality and Reputation
Although some believe that buying online is buying a pig in a poke, it is not. Canadian online pharmacies are excellent sources of information and are open to discussion. There one can read tons of user feedback, where customers share their experience with a particular pharmacy and say what they like or dislike about the drugs and/or service. A reputable online pharmacy such as canadianrxon.com takes this feedback into consideration and relies on it as a kind of expert advice, which helps it constantly improve its service and ensure that clients buy safe and effective drugs. Last, but not least, is the effort to attract professional doctors. As a result, users can directly contact a qualified doctor and ask whatever questions they have about a particular drug. Most likely, the doctor will ask several questions about the condition for which the drug is going to be used. Based on this information, he or she will advise whether or not to use the medication.
In August of 2017, Yahoo News reporter Natasha Bertrand posted, in part, the following:
A website launched on Wednesday by a former FBI special agent-turned disinformation expert claims to track Russian propaganda in near-real time, as it spreads via Twitter accounts that have been linked to Russian influence operations.
Clint Watts, who garnered national media attention after testifying before the Senate Intelligence Committee about Russia’s ongoing cyber and propaganda war against the West, spearheaded the project called Hamilton 68 — a hat tip to the founding father’s Federalist Papers No. 68.
“In the Federalist Papers No. 68, Alexander Hamilton wrote of protecting America’s electoral process from foreign meddling,” the site reads, alluding to Russia’s interference in the 2016 election. “Today, we face foreign interference of a type Hamilton could scarcely have imagined.”
Watts worked on Hamilton 68 with JM Berger, a fellow with the International Centre for Counter-Terrorism who studies extremism and propaganda on social media; Andrew Weisburd, a fellow at the Center for Cyber & Homeland Security; and Jonathon Morgan, the CEO of New Knowledge AI and head of Data for Democracy, a volunteer collective of data scientists and technologists. More here
Now you would think that former Federal government officials would tell the truth, or at least issue retractions when something is proven false…not so much.
In full disclosure, years ago, I read JM Berger’s book and interviewed him on my radio show. Furthermore, I followed Clint Watts on Twitter because, as a former FBI agent, perhaps truth and context were important to him. They still are important, but apparently not to those former ‘intelligence’ experts, who now include even more former officials such as former Acting CIA Director Michael Morell and former Ambassador to Russia Michael McFaul.
They among others created a fraud upon America as discovered by Matt Taibbi and the Twitter files.
Read in depth here to see just how scandalous media and the officials really were…perhaps still are actually. The New York Post in part has the following paragraph:
The Hamilton 68 “dashboard” was the brainchild of former FBI special agent and MSNBC contributor Clint Watts and operated under the Alliance for Securing Democracy, a think tank founded in 2017 — shortly after former President Trump took office. (Alliance for Securing Democracy, REALLY?)
Further from the New York Post: Emails in the disclosure show that Twitter’s own internal audits repeatedly showed that accounts flagged by Hamilton 68 were not Russian bots.
The Hamilton 68 website/screenshot as of the moment of this post:
Other names also include Bill Kristol, editor of the now defunct Weekly Standard, John Podesta and of course Hillary Clinton. Now we have some more questions for sure, including: who funded all of this? Perhaps the Clinton Foundation? How nutty is all this going to get when House Republicans on the Oversight Committee take a deeper dive into the other tech/media outlets like Google, Reddit, YouTube and Facebook?
Bullshit is right…more like KGB/Stasi tactics brought into the American public square, and news outlets like CNN and the Washington Post need to own this too. Gotta wonder if the White House under Biden, much less Obama’s White House team, will get subpoenas… How much interaction was there between those former government officials and those in the House and Senate, much like Adam Schiff?
This all brings a new definition to cyber wars and news media terrorism.
President Biden said that anyone making less than $400,000 per year would not pay a dime more in taxes…now a lie. Apps of all sorts are already asking for your banking information. Note: the banking information is being reported by payment apps and other online sites such as Etsy, Marketplace and OfferUp. As you read further, understand what is not being revealed. The IRS is using private corporations to aid it in reporting personal information about you. Getting a 1099 could easily put you in a higher tax bracket just because you collected dues from team members, sold an old umbrella or made a little money on the side selling a potholder you knitted.
FNC: Americans who made money online this year could be in for a potentially brutal shock when they file their taxes in 2023.
That’s because, beginning next year, taxpayers must report to the IRS transactions of at least $600 that are received through payment apps like Venmo, PayPal and Cash App.
In an explainer posted online last month, the IRS warned small business owners about the new $600 threshold for receiving Form 1099-K for third-party payments.
Third-party payment processors will now be required to report a user’s business transactions to the IRS if they exceed $600 for the year. The payment apps were previously required to send users Form 1099-K if their gross income exceeded $20,000 or they had 200 separate transactions within a calendar year.
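The old-rule versus new-rule threshold logic described above can be sketched in a few lines. This is a minimal illustration of the figures as the article states them (the function names and example amounts are my own, not anything from the IRS):

```python
# Sketch of the 1099-K reporting thresholds described in the article.
# Old rule: BOTH over $20,000 gross AND over 200 transactions.
# New rule: a single $600 gross-payment threshold.

OLD_GROSS_THRESHOLD = 20_000   # dollars
OLD_TXN_THRESHOLD = 200        # separate transactions per calendar year
NEW_GROSS_THRESHOLD = 600      # dollars

def gets_1099k_old_rule(gross: float, txn_count: int) -> bool:
    """Pre-change rule: both conditions had to be met."""
    return gross > OLD_GROSS_THRESHOLD and txn_count > OLD_TXN_THRESHOLD

def gets_1099k_new_rule(gross: float) -> bool:
    """New rule: goods-and-services payments exceeding $600."""
    return gross > NEW_GROSS_THRESHOLD

# A casual seller with $1,500 across 12 sales:
print(gets_1099k_old_rule(1_500, 12))  # False under the old rule
print(gets_1099k_new_rule(1_500))      # True under the new rule
```

The gap between the two outcomes is exactly why, as the tax lawyer quoted below notes, many casual sellers will be surprised to receive the form for the first time.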
“I think it will come as a shock out of nowhere that people are getting these,” Nancy Dollar, a tax lawyer at Hanson Bridgett, told FOX Business.
Democrats made the change in March 2021, when they passed the American Rescue Plan without any Republican votes.
Now, a single transaction over $600 will trigger the form. The change is intended to crack down on Americans evading taxes by not reporting the full extent of their gross income. However, critics say that it amounts to government overreach at its worst and that it could ultimately hurt small businesses.
The lower reporting threshold threatens to sweep up millions of Americans who make money online. Roughly one in four Americans rakes in extra income on the side by selling something online, renting their home or using a digital platform to do work, according to the Pew Research Center.
The change could discourage some Americans from participating in the gig economy, according to Dollar.
“Everyone I know offloads old goods that they have on these platforms because it’s so easy,” Dollar said. “Or they’ve been engaging in gig work on a very casual basis, and that affects gig workers as well who have been underreporting their income. I think it’s going to force people to either cut down on those activities or kind of take them more seriously and track them.”
The new rule only applies to payments received for goods and services transactions, meaning that using Venmo or PayPal to send a loved one a gift, pay your roommate rent or reimburse a friend for dinner will be excluded. Also excluded is anyone who receives money from selling a personal item at a loss; for example, if you purchased a couch for $300 and sold it for $250, the amount is not taxable.
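The couch example above follows a simple rule: a gain on a personal item is taxable, while a loss is simply excluded (and is not deductible, so the taxable amount floors at zero). A minimal sketch, using the article's $300/$250 figures (the function name is mine, for illustration only):

```python
# Personal-item sale rule described in the article: only gains are
# taxable; a loss on a personal item is excluded, not deducted.

def taxable_amount(purchase_price: float, sale_price: float) -> float:
    """Return the taxable gain on a personal-item sale (never negative)."""
    return max(0.0, sale_price - purchase_price)

print(taxable_amount(300, 250))  # 0.0 -- couch sold at a loss, not taxable
print(taxable_amount(300, 450))  # 150.0 -- a gain would be taxable
```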
“This doesn’t include things like paying your family or friends back using PayPal or Venmo for dinner, gifts, shared trips,” PayPal previously said.
To be clear, business owners were already required to report that income to the IRS. The new rule simply means that the IRS will see what business owners earned through the cash apps, regardless of what those individuals actually report themselves, because it broadens the scope of the reporting threshold.
Form 1099-K is used to report goods and services payments received by a business or individual in the calendar year, but there are certain exclusions from gross income that are not subject to income tax, including amounts from selling personal items at a loss, amounts sent as reimbursements and amounts sent as gifts.
“For the 2022 tax year, you should consider the amounts shown on your Form 1099-K when calculating gross receipts for your income tax return,” PayPal said in a Q&A on its website. “The IRS will be able to cross-reference both our report and yours.”
The cash apps will now be required to send Form 1099-K, electronically or by mail, to users who meet the new requirements.
The apps may request additional information from users shortly to properly report transactions, and users may be asked to provide their Employer Identification Number (EIN), Individual Tax Identification Number (ITIN), or Social Security Number (SSN) if it’s not already on file.
A herd of deer outside the equipment yard of the Google data center campus in Council Bluffs, Iowa. (Photo: Google)
The investment milestone by Google is the latest data point on the extraordinary growth of the data center industry in Iowa, which is also home to Meta’s largest cloud campus and a massive build-out by Microsoft in West Des Moines. The Iowa cloud cluster shows the prominent role of the Midwest in cloud geography, providing a data distribution hub in the center of the United States.
It’s being sued by price-comparison firm PriceRunner for around $2.4 billion.
The Swedish company alleges the tech giant manipulated search results.
PriceRunner wants Google to pay compensation for profits it claims it has lost in the UK since 2008, and in Sweden and Denmark since 2013.
A Google spokesperson said the company would defend the lawsuit in court.
It claimed changes made to shopping ads five years ago have worked successfully.
It also said PriceRunner chose not to use shopping ads on Google, so may not have seen the same successes as others.
But PriceRunner said it was ready to fight for years, with financing in place and steps prepared in the event it does not win.
In November Google lost an appeal against a fine of over $2.7 billion imposed by the European Commission in 2017.
It found that the search giant used its own price comparison shopping service to gain an unfair advantage over smaller European rivals.
The seven-year investigation came about due to complaints that Google distorted internet search results in favour of its own shopping service.
PriceRunner is currently in the process of being bought by payments firm Klarna.
Source: PriceRunner said Monday that it plans to take Google to court in Stockholm. It’s seeking compensation for damages in relation to a 2017 ruling from the European Commission that Google breached antitrust laws by giving preference to its own shopping comparison product, Google Shopping, through its popular search engine.
After a seven-year investigation into the practices, the EU executive body dealt Google a historic $2.7 billion fine. Google appealed the penalty, but in November 2021, the decision was upheld by the EU’s General Court. The verdict can still be appealed and taken to the EU’s highest court.
PriceRunner CEO Mikael Lindahl said the company launched its lawsuit following “extensive and thorough preparations.”
“We are of course seeking compensation for the damage Google has caused us during many years, but are also seeing this lawsuit as a fight for consumers who have suffered tremendously from Google’s infringement of the competition law for the past fourteen years and still today,” Lindahl said in a statement.
A Google spokesperson said the company looks forward to defending its case in court. The company made a number of changes in 2017 aimed at addressing the commission’s concerns.
“The changes we made to shopping ads back in 2017 are working successfully, generating growth and jobs for hundreds of comparison shopping services who operate more than 800 websites across Europe,” the spokesperson said in an emailed statement.
“The system is subject to intensive monitoring by the EU Commission and two sets of outside experts. PriceRunner chose not to use shopping ads on Google, so may not have seen the same successes that others have.”
PriceRunner alleges Google has not complied with the commission’s ruling and is still abusing its dominant position among internet search engines. It expects the final damages to be “significantly higher” than the interim sum of 2.1 billion euros.
The company, which in November agreed to be taken over by Swedish fintech firm Klarna, wants Google to pay compensation for profits it lost in the U.K. since 2008, and in Sweden and Denmark from 2013 onward.
Klarna spokeswoman Aoife Houlihan said the company was “aware and supportive of this suit.”
“It is fundamental that all tech companies no matter where they operate, compete on the basis of their own merit with the best product and service and then gain consumers’ trust,” Houlihan told CNBC.
“European consumers have been denied real choice in shopping services for many years and this is one step to ensuring this ends now.”
PriceRunner says it’s the largest independent price comparison service in the Nordic region, with over 3.7 million products to select from 22,500 stores across 25 different countries.
There was a time when the term CVE (countering violent extremism), first used by the Obama administration, described terrorists such as the Haqqani Network, al Qaeda or Islamic State. Now it is used to classify anyone the Federal government thinks it should, and that could mean you, based simply on your various internet searches. Your searches for patterns, reports, names, dates and other details are captured by Google (that is, if you are still using Google, and you should not be), and then you are scored.
Sound crazy? This is a long read so hang on through it all as this post is an effort to give you full context, well as much as possible.
Let’s begin here:
From February of 2021, a little more than a month since the J6 event in DC, The Hill reported the following:
When armed insurrectionists stormed the Capitol on Jan. 6, Vidhya Ramalingam wasn’t surprised.
A day earlier, her company Moonshot CVE, which monitors and combats online extremism, set up a crisis team in response to a flood of indications that the pro-Trump rally scheduled for Washington could turn violent.
Moonshot works to pull back from the brink people who have been inculcated into white supremacist movements, conspiracy theories and radical ideologies, and it offered crisis intervention to some 270,000 high-risk users around the time of the Capitol breach.
“For organizations like ours that have been working on domestic violent extremism for many years, and in the run up to the election and the months that followed, this was not a surprise, that this attack happened,” Ramalingam said.
But even the 33-year-old Ramalingam, who has spent her entire career focused on the issue both domestically and abroad, says the widespread nature of radicalization in the U.S. is alarming.
“It’s a very scary moment in America right now. I mean, the implications are so wide-reaching,” she told The Hill in a recent interview. “There’s just the potential for so much more violence right now.”
Ramalingam got her start embedding herself with white nationalist groups in Sweden for two years as part of her graduate studies.
“It was really tough. I spent a lot of time around people saying and spouting lies about people of color and about immigrants and people like me, so there were moments that were really horrible to sit and listen to,” she said.
But the experience helped open a window into their world and how people become radicalized.
“Some of them had life experiences that had led them here. And for me, it was really important to see that in order to then start to piece together, well, how could you get someone out?” she said.
When far-right extremist Anders Behring Breivik murdered 77 people in Norway in July 2011, the European Union tapped Ramalingam to lead its first intergovernmental initiative to respond to right-wing terrorism, a job she held for three years.
She worked on deradicalizing initiatives such as Exit Germany and Exit Sweden that included efforts setting up counseling interventions and training family members and loved ones.
She also sits on the board of Life After Hate, a U.S.-based group that provides similar interventions along with building a network of former extremists to push back on extremist content.
But Ramalingam says the problem needs larger-scale responses, something that became clear with the rise of ISIS and its use of social media to radicalize people.
“There was this sense of defeat, that the terrorists were winning and that they were just better than we were, they were able to use technology better,” she said.
The London-based Moonshot, which opened its D.C. office this week, seeks to scale up monitoring and intervention using the kinds of targeting that have become commonplace in business to build personalized responses.
The company counts a slew of governments, the United Nations and major tech companies such as Facebook and Google among its funders, and groups including the Anti-Defamation League among its partners.
“Technology can actually have the power to scale up really deeply personalized interactions the same way that every single advertisement we see is personalized towards me, my gender, my behavior online, my identity, where I live,” said Ramalingam.
“It really is literally the same thing that Coca-Cola is doing to sell us more Coke. We’re using those same tools to reach people and try and offer them safer alternatives, and either save their lives or save other people’s lives.”
Those efforts, which range from widely used platforms such as Google and Facebook to more niche ones such as Gab and Telegram, have led to some surprising results.
While countering facts and ideological debates seldom work to engage people online, a more empathic approach seems to yield gains. In a recent round of tests, Moonshot’s target audience was 17 percent more likely to engage with posts featuring the simple message that “anger and grief can be isolating” compared to other tested messages.
Other content focused on deescalating anger and even breathing exercises also found fertile ground.
But Ramalingam says the threat is also evolving.
“We’ve seen this kind of blending and metastasization of various once-distinct ideologies, groups and movements. You know, everything from white supremacist and neo-Nazis with armed groups and anti-vaxxers and election conspiracies,” she says.
“These groups weren’t always coordinating, and now we’re suddenly seeing this mess online come together.”
There are a slew of factors at play, including what Ramalingam says is a tepid response from technology companies that have “systematically overlooked and been unwilling to respond” to the threat, though the Capitol insurrection last month could be changing that. Tech platforms, she notes, were far more aggressive when dealing with ISIS and have proven tools on issues such as suicide prevention that show how much more they could be doing.
Another major contributor to the problem has been the willingness of people in positions of power to bolster conspiracy theories and misinformation, whether through full-throated endorsements or more subtle means, such as winking claims that questions remain in actual clear-cut cases or that certain facts are unknowable.
“Political leaders and people in that level of power should absolutely not be lending any credence to conspiracy theories and disinformation. Lending even the tiniest inkling of credence to those conspiracy theories is hugely dangerous because of the position of power that they’re in,” she said.
Ramalingam is no stranger to Washington, having grown up just a few hours away and later testifying before Congress on the threat of white nationalism.
She says she has been in touch with senior members of the Biden administration on how to take a whole-of-government approach to combatting right-wing extremism, which FBI Director Christopher Wray says is the top terrorism threat the country faces.
She worries that the country will assume that the events of Jan. 6 were the apex of a movement, rather than simply the latest in a series of deadly attacks ranging from Charlottesville, Va., to Pittsburgh to El Paso, Texas.
“For those of us that have been working on this form of extremism for 10 plus years now, it would be misleading to say that this is the — kind of the crescendo and now it’s going to dissipate,” she said.
“I think there’s a risk for the U.S. government, that the response following the Jan. 6 events focuses on public statements and on Band-Aids and not on the changes and the real shifts that need to take place in the entire system to deal with domestic violent extremism,” she said.
How do you pull people out of the rabbit holes that lead to violent extremism, or keep them from falling in? If conspiracy-laced hate is another kind of pandemic pushed by online superspreaders, could we build something like a cure or a vaccine?
The deadly Capitol riot on January 6 has set off a fresh scramble to answer these questions, and prompted experts like Vidhya Ramalingam to look for new ways to reach extremists—like search ads for mindfulness.
“It’s so counterintuitive, you would just think that those audiences would be turned off by that messaging,” says Ramalingam, cofounder and CEO of Moonshot CVE, a digital counter-extremism company that counts governments like the U.S. and Canada and groups like the Anti-Defamation League (ADL) and Life After Hate among its clients. But Moonshot’s researchers recently found that Americans searching for information about armed groups online were more likely than typical audiences to click on messages that encourage calmness and mindful thinking.
“Our team tried it, and it seems to be working,” Ramalingam says. The finding echoes previous evidence suggesting that some violent extremists tend to be more receptive to messages offering mental health support. “And that’s an opening to a conversation with them.”
It’s a promising idea in a growing multimillion-dollar war—an effort that, even decades after 9/11 and especially after 1/6, is still hungry for tools to reach extremists. Old currents of violence and hate, amplified by a vicious cycle of platforms and propagandists, are straining relationships and communities, draining wallets, and putting new pressure on the U.S. government to steer its anti-terror focus toward homegrown threats. Last month, the Department of Homeland Security said it was granting at least $77 million toward ideas for stopping what the agency says now represents the biggest danger to Americans’ safety: “small groups of individuals who commit acts of violence motivated by domestic extremist ideological beliefs.”
The risk of violence is buoyed by a rising tide of conspiracy theories and extremist interest, which Ramalingam says has reached levels comparable to other “high risk” countries like Brazil, Sri Lanka, India, and Myanmar. In terms of indicators of extremism in the U.S., “the numbers are skyrocketing.”
How to reach people—and redirect them
To get those numbers, Moonshot goes to where the toxicity tends to spread, and where violent far-right groups do much of their recruiting: Twitter, YouTube, Instagram, and Facebook, but also niche platforms like MyMilitia, Zello, and Gab. But core to its strategy is the place where many of us start seeking answers—the most trafficked website of all. “We all live our lives by search engines,” Ramalingam says.
Social media tends to get the bulk of the attention when it comes to radicalization, but Google is also integral to the extremism on-ramp. And unlike social media, with its posts and shares and filters, a search can feel like a more private, largely unmoderated, experience. “We tell Google our deepest, darkest thoughts,” Ramalingam says. “We turn to Google and ask the things that we won’t ask our family members or partners or our brothers or sisters.”
Search can also convey to users an illusory sense of objectivity and authority in a way that social media doesn’t. “It’s important that we keep our eye on search engines as much, if not more than we do social media,” Safiya Noble, associate professor at the University of California, Los Angeles, and cofounder and codirector of the UCLA Center for Critical Internet Inquiry, recently wrote on Twitter. “The subjective nature of social media is much more obvious. With search, people truly believe they are experiencing credible, vetted information. Google is an ad platform, the end.”
Moonshot began in 2015 with a simple, insurgent strategy: Use Google’s ad platform—and the personal data it collects—to redirect people away from extremist movements and toward more constructive content. The idea, called the Redirect Method, was developed in partnership with Google, and widely touted as a way to reach people searching for jihadist content, targeting young men who were just getting into ISIS propaganda, or more radicalized people who might be Googling for information on how to sneak across the border into Syria. The idea is to steer potential extremists away—known as counterradicalization—or to help people who are deep down a rabbit hole find their way out through deradicalization. That might mean connecting them with a mentor or counselor, possibly a former extremist.
Ramalingam has seen these methods work up close. A decade ago, as part of her graduate studies, she embedded herself among neo-Nazis in Scandinavia, where a system of counseling and exit programs was helping bring people back to sanity and family. In 2015, she and another counter-extremism researcher named Ross Frenett started Moonshot to drive that approach using search ads, with a name that described their far-reaching goal. “If we knew that that worked offline,” she says, “couldn’t we test whether this would work online?”
What began with a focus on jihadism and European white supremacy is now part of an effort to track a nexus of extremism, conspiracy theories, and disinformation—from QAnon to child exploitation content—from Canada to Sri Lanka. But for Moonshot, the U.S. is a new priority. Last month, Ramalingam, who grew up in the states, returned to open the company’s second office in D.C., where it can be closer to policy makers and practitioners. The company is also dropping the acronym from its birth name, Moonshot CVE: “Countering violent extremism” has become nearly synonymous with a misguided overemphasis on Muslim communities, Ramalingam points out, and in any case, old tactics aren’t sufficient. As extremist ideas have stretched into the mainstream, Moonshot’s once tiny target audiences now number in the millions.
“We can’t rely on what we knew worked when we were dealing with the dozens and the tens of people that were really on the fringes,” she says. “We need to be testing all sorts of new messaging.”
Understanding the data
If you were among the thousands of Americans who Googled for certain extremist-related keywords in the months around the election—phrases like “Join Oath Keepers Militia,” “I want to shoot Ron Wyden,” and “How to make C4”—you may have been targeted by the Redirect Method. It could have been a vague, nonjudgmental message at the top of your search results, like “Don’t do something you’ll regret.” Click, and you could end up at a playlist of YouTube videos with violence-prevention content, like a TED Talk by a would-be school shooter or testimonies from former neo-Nazis. Or you might encounter videos promoting calmness, or a page directing you to mental health resources. Around January 6 alone, Ramalingam says more than 270,000 Americans clicked on Moonshot’s crisis-counseling ads.
To do this, Google has given Moonshot special permission to target ads against extremist keywords that are typically banned. But while Moonshot launched the Redirect Method with Google’s help, these days it typically pays the ad giant to run its campaigns, just like any other advertiser. And now, given the sheer scale of the audiences Moonshot is reaching in the U.S., “the costs are off the charts,” Ramalingam says. Regarding its recent ADL-backed campaign, she says, “We’ve never paid this much for advertising in any one country on a monthly basis.”
This ad data comes with caveats. When looking at extremist search terms, for instance, Moonshot can’t be certain it’s measuring individual people or the same person searching multiple times. It also can’t know if it’s targeting an extremist or a journalist who’s simply writing about extremism.
Still, the company is bringing more empirical evidence and scientific rigor to a field that sorely needs it, says Colin Clarke, the director of policy and research at the Soufan Center, an independent non-profit group that studies extremism. Moonshot’s data is even more concerning, Clarke says, because of another statistic that’s not exactly captured in Google analytics.
“At a time when people have been locked in their homes and consuming disinformation, with record levels of domestic violence, anxiety, depression, substance abuse, what’s the antidote? People have bought guns and ammunition in record numbers. So they’re anxious, they’re angry, isolated, and they’re well-armed,” he says. “It’s a perfect storm.”
In a recent analysis, done in partnership with the ADL and gathered in a report titled “From Shitposting to Sedition,” Moonshot tracked tens of thousands of extremist Google searches by Americans across all 50 states during the three months around Election Day. It saw searches spike around big political events, but also along geographic and political lines. In states where pandemic stay-at-home orders lasted 9 or fewer days, white-supremacist-related searches grew by only 1%; in states where stay-at-home orders were 10 days or longer, the increase was 21%.
The politics of the pandemic fomented domestic extremist interest, but also helped unite disparate fringe movements, from militias to climate denialists to anti-maskers and anti-vaxxers. “We started to see this worrying blending and metastasization of all these different ideologies in the U.S.—far-right groups blending and reaching across the aisle to work with anti-vax movements,” Ramalingam says. And it’s during times of crisis, she notes, “when we see these actors just grasping to turn fear and anxiety in society into an opportunity for them to grow.”
But Ramalingam isn’t just concerned about the most hard-core armed believers. After the election and the events of January 6, she worries now about splintered far-right groups and disaffected conspiracy theorists who are grappling for meaning. That puts them at risk of further radicalization, or worse.
“There are a lot of people who basically just feel misled, who feel like they’ve lost a lot because they followed these conspiracy theories,” she says. QAnon channels filled up with anxiety, self-harm, and talk of suicide, “like a father saying, ‘My son won’t speak to me,’ people who have lost their jobs, people who said, ‘I lost my family because of this,’ ” Ramalingam says. “And so there’s a real moment now where we need to be thinking about the mental health needs of people who, at scale, bought into these conspiracy theories and lies.”
What to say
To reach violent radicals or conspiracy theorists to begin with, Ramalingam urges caution with ideological arguments. Shaming, ridiculing, and fact-based arguing can prove counterproductive. In some cases, it can be more effective to use nonjudgmental and nonideological messages that don’t directly threaten people’s beliefs or tastes but that try to meet them where they are. For instance, as Frenett suggests, if someone is searching for Nazi death metal, don’t show them a lecture; instead, show them a video with a death metal score, but without the racism.
Simple reminders to be mindful, and to think about how one’s actions impact others, may help. In its recent campaign, some of Moonshot’s most effective messaging asked people to “reflect and think on their neighbors, their loved ones, the people in their immediate community around them, and just to reflect on how their actions might be harmful to their loved ones,” Ramalingam says.
People interested in armed groups were most receptive to messages of “calm” offering mindfulness exercises. For all audiences, Moonshot found particularly high traction with an ad that said, “Anger and grief can be isolating.” When people clicked through, to meditation content or mental health resources, Ramalingam notes that “they seem to be watching it, or listening to it, or engaging with it for a long time.”
To reach QAnon supporters, Moonshot found the most success with messages that seek to empathize with their need for psychological and social support. “Are you feeling angry? Learn how to escape the anger and move forward,” said one Moonshot ad directed at QAnon keywords, which saw a click-through rate around 6%, twice that of other types of redirect messages. Clicking on that took some users to r/Qult_Headquarters, a subreddit that includes posts by former adherents.
Preventing the spread of violent extremist ideas involves a broader set of strategies. To bolster trust and a shared reality among the general public—people who haven’t yet gone down the rabbit hole—researchers are exploring other countermeasures like “attitudinal inoculation,” alerting people to emerging conspiracy theories and warning of extremist groups’ attempts at manipulation.
Experts also urge enhancing public media literacy through education and fact-checking systems. Governments may not be trusted messengers themselves, but they could help in other ways, through a public health model for countering violent extremism. That could mean funding for local community activities that can keep people engaged, and for mental health and social programs, an idea that then-Vice President Joe Biden endorsed at a 2015 White House summit on countering violent extremism.
Speaking of the White House, Ramalingam emphasizes that extremist ideologies warrant stern condemnation from public figures. Companies should deplatform the superspreaders of racism and disinformation, and political, cultural, and religious leaders should vehemently denounce them.
“Rhetoric that’s shaming of those ideologies can be really important and powerful from people in positions of power,” Ramalingam says. That’s for an already skeptical audience “that needs to hear it reinforced, but also the audience that is in the middle and doesn’t really know or doesn’t care that much. And that audience really needs to hear, ‘This is not okay. This is not acceptable. This is not a social norm.’ ”
But when addressing more extremist-minded individuals, Ramalingam suggests a gentler approach. “If someone is coming at you with an attack, you kind of pull yourself back into a corner and stand your ground and defend it,” she says. “And so if that’s our approach with the most extreme of society, that will actually worsen the problem.”
Does this work?
In the face of the domestic terror threat, mindfulness and compassion might sound like entering a space-laser fight with a water pistol or a hug.
But to Brad Galloway, who helps people exit right-wing extremist groups, Moonshot’s messaging makes sense. In a previous life, he used chat rooms and message boards to recruit people into a neo-Nazi group. After he joined—drawn in largely by camaraderie and music—what had been a U.S.-only organization eventually grew to 12 countries, thanks largely to the internet. Now Galloway is a coordinator at the Center on Hate, Bias and Extremism at Ontario Tech University, where he often urges his mentees to be more mindful, especially online.
“I ask people to think, Do I really need to watch this video of a school shooting?” Instead he encourages “positive content” to displace the stuff that can accelerate or even provoke radicalization.
Galloway, who has worked with Moonshot, Life After Hate, the Organization for the Prevention of Violence, and other groups, says the same principle of positive content applies to real life, too: Connecting with old friends and finding fun new activities can help people leave corrosive extremist communities. “What’s positive to that user, and how do we make that more prominent to them?”
That’s not just a rhetorical question. What content works with which audience? Who is reachable? What counts as success? And how do strategies like the Redirect Method influence extremists?
A 2018 Rand Corp. report on digital deradicalization tactics found that extremist audiences targeted with Redirect “clicked on these ads at a rate on par with industry standards.” Still, the researchers couldn’t say what eventual impact the ads had on behavior. As new funding flows in, and as experts assemble an arsenal of counter-radicalization ideas, there’s still scant evidence of what works.
For its part, Moonshot says its data suggests that some of its target audiences have viewed less extremist content, and points to the thousands of people it has connected to exit counseling and mental health resources. Still, Ramalingam says that the company sees “greater potential for us to assess whether our digital campaigns can lead to longer-term engagement with users, and longer-term change.”
There are other serious concerns as well. The missteps of previous digital wars on terror haunt Moonshot’s work: secret and extralegal surveillance systems, big data political warfare by military counter-radicalization contractors-turned-conspiracy mongers, untold violations of privacy and other civil rights. If Moonshot is tracking what messaging influences who, what data does it collect about “at risk” users, and where does that end up, and why? And who is at risk to begin with?
Ramalingam worries about the privacy concerns; she acknowledges that thanks to ad platforms and brokers, Moonshot can tap into “actually a heck of a lot of data.” But, she stresses, Moonshot isn’t accessing people’s private messages, and its work is bound by the stricter European personal data protections of the GDPR, as well as by an ethics panel that helps evaluate impacts. In any case, she argues, Moonshot is simply taking advantage of the multibillion-dollar digital platforms that drive most of the internet, not to mention the markets.
“As long as Nike and Coca-Cola are able to use personal data to understand how best to sell us Coke and sneakers, I’m quite comfortable using personal data to make sure that I can try and convince people not to do violent things,” Ramalingam says. Should that system of influence exist at all? “I’m totally up for that debate,” she says. “But while we’re in a context where that’s happening, I think it’s perfectly reasonable for us to use that sort of data for public safety purposes.”
What about the platforms?
The tech giants have run their own redirect and counter-speech programs as part of ongoing efforts to stem the toxicity that flourishes on their platforms. Google touts its work with Moonshot battling ISIS, its research on extremism, and its efforts to remove objectionable content and reduce recommendations to “borderline” content on YouTube. In December, its research unit Jigsaw published its findings on the digital spread of violent white supremacy.
Facebook tested the Redirect Method in a 2019 pilot in Australia aimed at nudging extremist users toward educational resources and off-platform support, a system that echoes its suicide-prevention efforts, which use pop-ups to redirect at-risk users to prevention hotlines. In an evaluation commissioned by Facebook last year, Moonshot called the program “broadly successful,” and recommended changes for future iterations. Facebook has also tested the program in Indonesia and Germany.
Ramalingam praises the tech platforms for their efforts, and supports their decisions to deplatform vast numbers of far-right and QAnon-related accounts, even if that’s made researching online extremism harder. Still, she says, Big Tech is doing “not nearly enough.”
Extremist content continues to slip through the platforms’ moderation filters, and then gets rewarded by the algorithms. Facebook’s own researchers have repeatedly shown how its growth-focused algorithms favor extremist content. Despite YouTube’s moderation efforts, political scientist Brendan Nyhan recently reported, the site’s design can still “reinforce exposure patterns” among extremist-curious people.
“The tech companies have an obligation to use their great privilege . . . of being a conduit of information, to get information and helpful resources to people that might be trapped in violent movements,” Ramalingam says.
As companies and lawmakers and law enforcement scramble for solutions in the wake of the events of January 6, Ramalingam also cautions against rash decisions and short-term thinking. “There’s an imperative to act now, and I have seen in the past mistakes get made by governments and by the tech companies just delivering on knee-jerk responses,” she says. “And then once the conversation dies down, they go back to essentially the way things were.”
Emotional reactions are understandable, given the shock of January 6, or of a family member who’s fallen down a rabbit hole, but they tend to be counterproductive. What works for battling violent extremism on a personal, one-on-one level, Ramalingam says, can also help fight it on a national scale: Avoid assumptions, be mindful, and consider the actual evidence.
“The way counselors and social workers do their work is they start by asking questions, by trying to understand,” she says. “It’s about asking questions so those people can reflect on themselves.”
The U.S. Military Academy reportedly is working with a London-based firm, Moonshot CVE [Countering Violent Extremism], whose CEO is Vidhya Ramalingam, a former Obama Foundation leader. Ramalingam is also the author of a 2013 paper on immigration in Europe funded by a grant from George Soros’ Open Society Foundations.
Ramalingam told Defense One she spoke with Garrison personally last month about how the Pentagon could use technology developed by her company to “find and eliminate extremism in the ranks.”
Why would the Pentagon hire a U.K.-based company to study allegations of extremism in the U.S. military? Why hire a politically connected group like Ramalingam’s?
It suggests that Garrison and Secretary of Defense Lloyd Austin may be looking for a predetermined answer. A deeper dive into Moonshot CVE might help unravel what they have in mind.
Moonshot CVE co-founder Ross Frenett expressed his support for Critical Race Theory (CRT) on Twitter last month, calling the opposition “Horrifying.” Joint Chiefs Chairman Mark Milley recently faced stiff criticism from congressional Republicans over the military’s recent moves to incorporate CRT elements into their training.
Moonshot CVE’s website dismisses Antifa’s and Black Lives Matter’s Marxist leanings and claims that those who assert its Marxism have engaged in a “white supremacist disinformation” campaign “as a means of delegitimizing it.”
“These sources echo far-right extremist disinformation narratives about BLM protesters trying to overthrow the republic and harm American citizens in a Marxist coup,” Moonshot CVE wrote in a paper jointly published with the Anti-Defamation League (ADL).
Of course, Antifa and BLM groups haven’t been shy about identifying themselves as Marxists. A popular graphic that circulated on pro-Antifa websites and Telegram accounts during the so-called “George Floyd Rebellion” of June 2020 declared, “Militant networks will defend our revolutionary communities. Liberation begins where America dies.” And the status of BLM founders as self-identified “trained Marxists” has been openly discussed in the press.
Ramalingam and her organization claim that Antifa is unorganized, ignoring evidence of significant local, regional, and international Antifa networks, and of substantial material support from an extensive far-left network (including, as discussed below, the Rosa Luxemburg Stiftung). An extensive social media network, including peer-to-peer encrypted messaging apps, also exists, where BLM and Antifa activists share propaganda and techniques.
Why does Moonshot CVE fixate exclusively on “far-right” extremism, and work to minimize or deny the evidence of left-wing extremism?
One reason might be Moonshot’s apparent association with a German far-Left organization which is overtly pro-Marxist and pro-Antifa, and whose leaders have historical ties to Russian intelligence.
Ramalingam is a regular contributor to programs for an initiative at American University in Washington, D.C. called The Polarization and Extremism Research and Innovation Lab (PERIL). She participated in PERIL-sponsored seminars in October 2020, in April, and last month.
PERIL has partnered with The Rosa Luxemburg Stiftung (RLS), the think tank of the German political party Die Linke (The Left). Die Linke is the successor of the former East German communist party. The think tank is named for Rosa Luxemburg, a German Communist revolutionary whose ideas pioneered the Marxist examination of race and gender and who was killed during the 1919 German communist uprising. A 2008 report by the German Federal Office for the Protection of the Constitution calls “the memory” of Luxemburg a “traditional element of Left-wing extremism.”
This alliance could be revealing about Ramalingam’s and PERIL’s ideological orientation.
PERIL’s description of the RLS is misinformation, and it raises questions about what else PERIL glosses over.
PERIL unsurprisingly omits the fact that the organization’s top leaders belonged to East Germany’s ruling party, the Socialist Unity Party (SED), and/or were either employees or informants of the Soviet KGB-run STASI. Many former STASI members shifted their allegiance to the KGB after the STASI was disbanded, a defector told “The Washington Post” in 1990. Die Linke is a pro-Russia stalwart. RLS’s representative in Moscow, Kerstin Kaiser, is a former STASI employee who provided reports that were passed to the KGB.
“It stands in the tradition of the workers’ and women’s movements, as well as anti-fascism and anti-racism,” PERIL says on its website.
Given that The Rosa Luxemburg Stiftung was founded in 1990, after the fall of the Berlin Wall (known officially in East Germany as the “Antifascist Protection Barrier”), one might have questions about what “traditions” of antifascism the group actually stands for.
PERIL’s head Cynthia Miller-Idriss wrote a blog post for the RLS’s New York office on “radicalization” during COVID last year. On Twitter, she thanked RLS for the opportunity to write for it.
Miller-Idriss and Ramalingam both participated in a conference in Jena, Germany, called “Hate Not Found,” sponsored by the Institute for Democracy and Civil Society last December, where Miller-Idriss was the keynote speaker. Rosa Luxemburg Foundation member Maik Fielitz was on a panel at the conference that discussed “deplatforming the far-Right.”
Ramalingam and Miller-Idriss both contributed articles to a journal on “radicalization” on the Far-Right in November of 2020.
RLS’s global head Dagmar Enkelmann belonged to the SED and the East German parliament before the wall fell. Gregor Gysi, who helped open the RLS’s New York office in 2012 and who visited last month, headed the SED when it rebranded itself as the “Party of Democratic Socialism” in December 1989. Gysi allegedly informed on his legal clients to the STASI. A bloc in the German Bundestag expelled him in 1992 for seeming to defend the STASI.
STASI informants played a key role in promoting the climate of fear that kept East German society under control. RLS hosted former East German spy chief Werner Grossmann in 2010 for a talk on his book.
East Germany’s last Premier Hans Modrow is an RLS member, and the RLS manages his foundation, The Hans Modrow Stiftung. Modrow had close KGB ties, including to KGB Chairman Vladimir Kryuchkov, dating to Modrow’s tenure as Dresden Communist Party boss. Modrow supervised the dismantling of the STASI together with Grossmann. Modrow received the Order of Friendship from Vladimir Putin in 2017, and he remains embittered toward Mikhail Gorbachev for allowing the collapse of the East German regime.
As a young KGB major, Putin supervised a local STASI office in Dresden, while Modrow was the local party boss.
RLS funded Antifa activities in Germany, and Die Linke openly supports Antifa. The Hamburg, Germany, Antifa chapter even promoted a Rosa Luxemburg Stiftung panel on its Facebook page. Friedrich Burschel, editor of “Antifaschistisches Infoblatt,” advises the Rosa Luxemburg Foundation on subjects related to right-wing extremism and fascism. “Antifaschistisches Infoblatt,” the oldest Antifa publication, having first entered publication in 1987 in East Berlin, publishes articles on the Rosa Luxemburg Stiftung-funded website Linksnet, a collaboration of far-Left magazines.
The RLS hosted two BLM founders, Alicia Garza and Opal Tometi in 2014 and 2015 respectively. Garza attended the RLS-sponsored “Mapping Socialist Strategies” seminar in August 2014. RLS leader and former “unofficial STASI employee” Michael Brie spoke at this event. His brother Andre Brie spoke at a 1994 “Committees of Correspondence for Liberation and Socialism” conference along with Angela Davis, who has become influential in BLM. Davis worked closely with the East German regime in the 1970s, and she was a guest of honor at an event sponsored by Die Linke a decade ago. RLS’s New York office hosted BLM propagandist Shaun King in 2017.
The Southern Poverty Law Center (SPLC) is another PERIL partner with which Ramalingam has worked. The SPLC also has received money from the Rosa Luxemburg Stiftung. The SPLC is an extremely controversial organization that has been accused by its own former employees of bias and of deliberately inflating supposed far-right threats for fundraising. The SPLC has defended Antifa. Former SPLC Intelligence Project Director Heidi Beirich and SPLC Intelligence Project Senior Analyst Evelyn Schlatter participated in a June 2017 RLS-sponsored session in New York called “Strategies Against the Far Right.” Ramalingam and Beirich are both advisory group members of a pan-European “anti-radicalization” project called The DARE Consortium. In October, Ramalingam, Beirich, and Miller-Idriss collaborated on a podcast on countering extremism sponsored by the ADL.
Moonshot CVE’s alliance with RLS-backed PERIL reinforces the perception that the Biden Pentagon’s hunt for extremism actually is an excuse for classifying dissenting views as “extremist.” And the pro-Russian/ex-STASI-controlled RLS’s endorsement of the same talking points as Moonshot CVE shows they come from a far-Left extremist perspective. U.S. troops shouldn’t be subjected to ideological warfare.
The fact Moonshot CVE equates opposing Antifa with extremism reminds us that this company doesn’t deserve taxpayer money or the Pentagon’s cooperation.
You’re already guilty just for the research you do, while so many other cases are not prosecuted at all. Take caution, reader…
Even federal contracts have gone to universities…
George Washington University School of Law’s Program on Extremism has created an online resource for tracking the hundreds of criminal cases filed by the Biden Justice Department against United States citizens for their alleged actions on January 6th. The Administration has charged people from all 50 states, and as is reflected in the “Capitol Hill Siege” project archive, every case has been filed in the District of Columbia. Read more here.
“Due to the growing number of threats our nation is combating,” the grant synopsis explains, the DHS Science and Technology Directorate “supports the evolving threat landscape of a dynamic world with changing motivations, actors, communication models and weaponry.”
The grant prioritizes data collection and technological innovation as means to identify, understand and combat the purported threat of penetration of U.S. law enforcement agencies by violent extremists.
“Objectives of this effort will identify high quality data to understand the risks posed to the United States by the potential for violent extremist organizations or lone actors to infiltrate law enforcement agencies (LEAs) and other government institutions,” the synopsis states.
While billing U.S. taxpayers $500K for this initiative to understand these clandestine “extremist organizations” infiltrating law enforcement, the grant neglects to define what it means by “extremist organizations.”
The research and data collected under the grant is to be shared with a variety of agencies, including private organizations. Yet civil rights and liberties will not be violated in the combined public-private harvesting and sharing of data about undefined “extremists,” DHS insists.
“Knowledge and findings from this research will be transferred to federal, state, local, and private organizations to enable education and awareness to reinforce a whole-of-society prevention architecture while respecting civil rights and civil liberties,” according to the grant description. “These prevention efforts will equip and empower local efforts — including peers, teachers, community leaders, and law enforcement — to minimize a threat as it evolves while enhancing emergency preparedness and response.”
The grant will task the awardee with understanding law enforcement threats from the perspectives of numerous fields, including economics, psychology, politics, and criminology. “The awardee(s) will assist with a range of activities,” the grant specifies, “including designing data collection strategies, collecting data from primary and secondary sources, and analyzing data while identifying subject matter experts to participate in interviews and/or focus groups.”
Analyzing research from these various fields and experts will help fill in the gaps in understanding the threat environment and help “counter the threats posed by violent extremists and violent ideologies to United States LEAs and the public.”
The closing date for the grant applications is May 16, a day after the country concludes National Police Week. The week of May 9-May 15 has been designated as National Police Week since 1962 to recognize the service and sacrifice of federal, state and local law enforcement.
As reported by Just the News this week, the DHS and the Department of Defense have announced internal investigations of “extremism” within their departments, raising alarms among conservative civil liberties watchdogs, as the agencies’ notions of “extremism” were vague and appeared to omit from scrutiny far-left extremist groups implicated in widespread political violence in 2020.
The Secretary of Homeland Security has issued a new National Terrorism Advisory System (NTAS) Bulletin regarding the current heightened threat environment across the United States. The Homeland is facing threats that have evolved significantly and become increasingly complex and volatile in 2021. These threats include those posed by domestic terrorists, individuals and groups engaged in grievance-based violence, and those inspired or influenced by foreign terrorists and other malign foreign influences. Social media and online forums are increasingly exploited by these actors to influence and spread violent extremist narratives and activity. Such threats also are exacerbated by the impacts from the ongoing global pandemic.
Issued: May 14, 2021 02:00 pm Expires: August 13, 2021 02:00 pm
Violent extremists may seek to exploit the easing of COVID-19-related restrictions across the United States to conduct attacks against a broader range of targets after previous public capacity limits reduced opportunities for lethal attacks.
Historically, mass-casualty Domestic Violent Extremist (DVE) attacks linked to racially- or ethnically-motivated violent extremists (RMVEs) have targeted houses of worship and crowded commercial facilities or gatherings. Some RMVEs advocate via social media and online platforms for a race war and have stated that civil disorder provides opportunities to engage in violence in furtherance of ideological objectives.
Through 2020 and into 2021, government facilities and personnel have been common targets of DVEs, and opportunistic violent criminals are likely to exploit Constitutionally-protected freedom of speech activity linked to racial justice grievances and police use of force concerns, potentially targeting protestors perceived to be ideological opponents.
Ideologically-motivated violent extremists fueled by perceived grievances, false narratives, and conspiracy theories continue to share information online with the intent to incite violence. Online narratives across sites known to be frequented by individuals who hold violent extremist ideologies have called for violence against elected officials, political representatives, government facilities, law enforcement, religious or commercial facilities, and perceived ideologically-opposed individuals.
The use of encrypted messaging by lone offenders and small violent extremist cells may obscure operational indicators that provide specific warning of a pending act of violence.
Messaging from foreign terrorist organizations, including al-Qa‘ida and ISIS, intended to inspire U.S.-based homegrown violent extremists (HVEs) continues to amplify narratives related to exploiting protests. HVEs, who have typically conducted attacks against soft targets, mass gatherings, and law enforcement, remain a threat to the Homeland.
Nation-state adversaries have increased efforts to sow discord. For example, Russian, Chinese and Iranian government-linked media outlets have repeatedly amplified conspiracy theories concerning the origins of COVID-19 and effectiveness of vaccines; in some cases, amplifying calls for violence targeting persons of Asian descent.
DHS encourages law enforcement and homeland security partners to be alert to these developments and prepared for any effects to public safety. Consistent with applicable law, state, local, tribal, and territorial (SLTT) law enforcement organizations should maintain situational awareness of online and physical activities that may be related to an evolving threat of violence.
How We Are Responding
DHS and the Federal Bureau of Investigation (FBI) continue to provide guidance to SLTT partners about the current threat environment. Specifically, DHS has issued numerous intelligence assessments to SLTT officials on the evolving threat.
DHS is collaborating with industry partners to identify and respond to those individuals encouraging violence and attempting to radicalize others through spreading disinformation, conspiracy theories, and false narratives on social media and other online platforms.
DHS has prioritized combatting DVE threats within its FEMA grants as a National Priority Area.
DHS remains committed to identifying and preventing domestic terrorism.