Germany’s al Qaeda/Jihad Problems Include Welfare Payments

Primer: May 16. Ziyad K., a 32-year-old Iraqi Yazidi, was sentenced to 11 years in prison for raping two Chinese students, aged 22 and 28, at the University of Bochum in August and November 2016. Police linked the man, who was living with his wife and two children in a refugee shelter in Bochum, to both crimes through DNA evidence. “He has never shown remorse,” Prosecutor Andreas Bachmann said. “How could a person fleeing from violence and danger come to do this terrible violence to other people?”

The Muslim population of Germany surpassed six million in 2017 to become approximately 7.2% of the overall population of 83 million, according to calculations by the Gatestone Institute.

A recent Pew Research Center study on the growth of the Muslim population in Europe estimated that Germany’s Muslim population had reached five million by the middle of 2016, but that number is short by at least a million.

Pew, for instance, “decided not to count” the more than one million Muslim asylum seekers who arrived in the country in 2015-2017 because “they are not expected to receive refugee status.” European Union human rights laws, however, prohibit Germany from deporting many, if not most, of the refugees and asylum seekers back to conflict areas. As a result, most migrants who arrived in the country will almost certainly remain there over the long term.

In addition, German authorities have admitted to losing track of potentially hundreds of thousands of illegal immigrants, many of whom are living on German streets and are believed to be sustaining themselves on a steady diet of drug dealing, pickpocketing, purse snatching and other forms of petty crime.

Photo credit: Matthias Graben (WAZ)

According to the German newspaper WAZ, Sami A. allegedly recruited young Muslims in Bochum mosques to join the “Holy War.” The paper also linked him to the radicalization of two members of the so-called Düsseldorf al Qaeda cell.

WAZ also reported that Sami A. had taught two terrorists in Bochum mosques: 21-year-old Amid C. from Bochum and 28-year-old Halil S. from nearby Gelsenkirchen. Both reportedly received ideological training from him for their alleged terrorist plan. The two young men are on trial in Düsseldorf, accused of planning an attack together with two accomplices. According to the indictment, they intended to plant a cluster bomb in a crowd of people and “spread fear and terror in Germany.”

Newsweek: The alleged former bodyguard of 9/11 mastermind Osama Bin Laden has been found collecting welfare checks from the government in Germany, according to local media, because he cannot be deported—even though he was refused asylum status.

A report in the German tabloid Bild said the man, named only as 42-year-old Sami A to protect his privacy, cannot be deported to his native Tunisia because he is at risk of torture there. He has lived in Germany since 1997 and has a wife and three children.

Sami A collects around $1,430 a month in welfare from the German government, a figure revealed after the far-right political party Alternative for Germany (AfD) asked questions of the local authority where he lives in Bochum, near the Dutch border.

He was accused by witnesses in a terrorism trial back in 2005 of having been Bin Laden’s bodyguard near the Afghanistan-Pakistan border for a few months at the turn of the millennium, something the judge said he believed to be true, though Sami A denies it.

German authorities regard Sami A as a “dangerous preacher,” reported Spiegel Online in 2012, and prosecutors say he was responsible for the radicalization of two men who later formed part of a terror cell caught planning a bomb attack.

Though considered a security risk, no charges of Al Qaeda membership have so far been brought against Sami A. He must report to the police in Bochum every day, which he has done since 2006. He was refused asylum status because of the security concerns, the BBC reported.

So After Congressional Hearings, Facebook Changes the Rules

The rules, eh? Yeah, those rules under which employees are free to remove content with no explanation and often no appeals process. What is missing from the new rules, which Facebook states can change from time to time, is the whole censorship issue, especially when it comes to conservatives.

How Facebook controlled and managed content was an internal secret; in fact, it still appears to be a secret. That means lawyers are involved, lots of them. In this day and age, the definition of words and terms has become slippery and subjective, and that continues to be the case at Facebook. So what are “community standards,” and exactly who decided those standards? Well, 8,000 words of community standards later (that is, IF anyone takes the time to read the text), we still don’t know.


Oh yeah, one other item… that fake news thing… crickets. Further, Mark Zuckerberg himself is quite naive about the ugliness around the world. Connecting people to talk about rainbows and bunnies will make it all better?

Facebook Terms and Policies

Facebook Terms of Service, still from 2015

MENLO PARK, Calif. (Reuters) – Facebook Inc (FB.O) on Tuesday released a rule book for the types of posts it allows on its social network, giving far more detail than ever before on what is permitted on subjects ranging from drug use and sex work to bullying, hate speech and inciting violence.

Facebook for years has had “community standards” for what people can post. But only a relatively brief and general version was publicly available, while it had a far more detailed internal document to decide when individual posts or accounts should be removed.

Now, the company is providing the longer document on its website to clear up confusion and be more open about its operations, said Monika Bickert, Facebook’s vice president of product policy and counter-terrorism.

“You should, when you come to Facebook, understand where we draw these lines and what’s OK and what’s not OK,” Bickert told reporters in a briefing at Facebook’s headquarters.

Facebook has faced fierce criticism from governments and rights groups in many countries for failing to do enough to stem hate speech and prevent the service from being used to promote terrorism, stir sectarian violence and broadcast acts including murder and suicide.

At the same time, the company has also been accused of doing the bidding of repressive regimes by aggressively removing content that crosses governments and providing too little information on why certain posts and accounts are removed.

New policies will, for the first time, allow people to appeal a decision to take down an individual piece of content. Previously, only the removal of accounts, Groups and Pages could be appealed.

Facebook is also beginning to provide the specific reason why content is being taken down for a wider variety of situations.

Facebook, the world’s largest social network, has become a dominant source of information in many countries around the world. It uses both automated software and an army of moderators that now numbers 7,500 to take down text, pictures and videos that violate its rules. Under pressure from several governments, it has been beefing up its moderator ranks since last year.

Bickert told Reuters in an interview that the standards are constantly evolving, based in part on feedback from more than 100 outside organizations and experts in areas such as counter-terrorism and child exploitation.

“Everybody should expect that these will be updated frequently,” she said.

The company considers changes to its content policy every two weeks at a meeting called the “Content Standards Forum,” led by Bickert. A small group of reporters was allowed to observe the meeting last week on the condition that they could describe the process, but not the substance.

At the April 17 meeting, about 25 employees sat around a conference table while others joined by video from New York, Dublin, Mexico City, Washington and elsewhere.

Attendees included people who specialize in public policy, legal matters, product development, communication and other areas. They heard reports from smaller working groups, relayed feedback they had gotten from civil rights groups and other outsiders and suggested ways that a policy or product could go wrong in the future. There was little mention of what competitors such as Alphabet Inc’s Google (GOOGL.O) do in similar situations.

Bickert, a former U.S. federal prosecutor, posed questions, provided background and kept the discussion moving. The meeting lasted about an hour.

Facebook is planning a series of public forums in May and June in different countries to get more feedback on its rules, said Mary deBree, Facebook’s head of content policy.

FROM CURSING TO MURDER

The longer version of the community standards document, some 8,000 words long, covers a wide array of words and images that Facebook sometimes censors, with detailed discussion of each category.

Videos of people wounded by cannibalism are not permitted, for instance, but such imagery is allowed with a warning screen if it is “in a medical setting.”

Facebook has long made clear that it does not allow people to buy and sell prescription drugs, marijuana or firearms on the social network, but the newly published document details what other speech on those subjects is permitted.

Content in which someone “admits to personal use of non-medical drugs” should not be posted on Facebook, the rule book says.

The document elaborates on harassment and bullying, barring for example “cursing at a minor.” It also prohibits content that comes from a hacked source, “except in limited cases of newsworthiness.”

The new community standards do not incorporate separate procedures under which governments can demand the removal of content that violates local law.

In those cases, Bickert said, formal written requests are required and are reviewed by Facebook’s legal team and outside attorneys. Content deemed to be permissible under community standards but in violation of local law, such as a prohibition in Thailand on disparaging the royal family, is then blocked in that country, but not globally.

The community standards also do not address false information – Facebook does not prohibit it but it does try to reduce its distribution – or other contentious issues such as use of personal data.