Photo, above: A screenshot from one of the terror-supporting jihadi videos on YouTube that was flagged by MEMRI. The video remains on YouTube to this day.
Terrorists are using YouTube to recruit, inspire, and train jihadis to commit acts of terror and martyrdom.
But that’s OK, right? We have to be politically correct and give terrorists every benefit of the doubt before deleting their recruiting videos, as evidenced by the number that remain online even after they have been flagged as terrorist recruiting tools. Sickening!
However, if you would like to post a video to YouTube denouncing terrorism and its source, radical Islam, be prepared to have your video flagged for deletion. Gotta be PC! YouTube can’t appear to be Islamophobic.
From Gatestone Institute
- If anyone still doubted at that point the connection between terrorism and Google’s video platform, the Daily Telegraph revealed that British counterterrorism police had been monitoring a cell of ISIS “wannabes” since March, and recorded its members discussing how to use YouTube to plot a vehicular ramming and stabbing attack in London. Terrorists have learned that YouTube can be as deadly a weapon as knives and cars.
- YouTube and Google, by posting such videos, are effectively being accessories to murder. They are also inviting class-action lawsuits from families and individuals victimized by terrorism. They need to be held criminally liable for aiding and abetting mass murder.
- In Arabic with French subtitles, the clip lauds terrorists “martyred for Allah.” User comments include: “beautiful… may Allah give us all the knowledge and power to accelerate our imams.” In other words, the pictures of smiling terrorists and their dead bodies serve as an inspiration to young Muslims seeking Paradise through martyrdom. This is not theoretical. According to the website Wired UK, as of June 5, there were 535 terrorist attacks around the world — with 3,635 fatalities — since the beginning of 2017 alone.
In mid-March this year, major companies began withdrawing or reducing advertising from Google Inc., the owner of YouTube, for allowing their brand names to pop up alongside videos promoting jihad, a new report released on June 15 by the Middle East Media Research Institute (MEMRI) reveals.
According to the report — which documents the failure of Google to remove jihadi content that MEMRI volunteered to assist in flagging — thus far, AT&T, Verizon, Johnson & Johnson, Enterprise Holdings and GSK are among the companies pulling their ads from the platform. Google responded by promising to be more aggressive in ensuring brand safety of ad placements.
Then came the Westminster attack. On March 22, 2017, Khalid Masood rammed his car into pedestrians — killing four people and wounding dozens of others – then stabbed an unarmed police officer to death.
Exactly two months later, on May 22, Salman Ramadan Abedi detonated a shrapnel-laden homemade bomb at the Manchester Arena, after a concert by American singer Ariana Grande. The blast killed 22 people and wounded more than 100 others.
On June 3, ahead of Britain’s general election five days later, Khuram Shazad Butt, Rachid Redouane and Youssef Zaghba murdered eight people and wounded 48 others in a combined van-ramming and stabbing attack on London Bridge.
On June 6, Britain’s three main political parties pulled their campaign advertisements from YouTube, after realizing that they were placed in or alongside jihadi videos.
If anyone still doubted at that point the connection between terrorism and Google’s video platform, the Daily Telegraph revealed that British counterterrorism police had been monitoring a cell of ISIS “wannabes” since March, and recorded its members discussing how to use YouTube to plot a vehicular ramming and stabbing attack in London.
Appallingly, the surveillance did nothing to prevent the carnage. It did provide further evidence, however, that jihadis purposely use the major online platform to spread their message and recruit soldiers in their war against the West and any Muslims deemed “infidels.” Terrorists have learned that YouTube can be as deadly a weapon as cars and knives.
Nor could Google claim that it is unaware of the increasingly pernicious use of its platform, or that it lacks the algorithmic tools to monitor YouTube’s massive traffic – involving 1.3 billion users and 300 hours of video uploaded every minute.
In the first place, complaints about jihadi content have been lodged by individuals and organizations for years. Secondly, Google vowed to tackle the problem through a flagging feature that alerts YouTube to material that “promotes terrorism.” Furthermore, YouTube itself claims: “Our staff reviews flagged videos 24 hours a day, 7 days a week to determine whether they violate our Community Guidelines.”
In 2010, five years after YouTube’s inception, MEMRI Executive Director Steven Stalinsky met with Google and YouTube free-speech attorneys and other company officials to discuss this issue in detail and offer assistance in monitoring jihadi online activity. Nevertheless, despite YouTube’s assurances, it has continued to serve as a virtual soap box for radical imams and recruiters of “martyrs” for missions against both general and specific targets.
During that period seven years ago, MEMRI also presented findings to members of Congress from both sides of the aisle, resulting in written appeals from both Democrats and Republicans to YouTube CEO Chad Hurley to take the matter seriously and do something about it.
In spite of YouTube’s earlier promises, MEMRI found that most of the videos it had flagged, beginning in 2010, remained online two or three years later.
The breakdown was as follows:
- Al-Qaeda leader Osama bin Laden and 9/11 attack glorification videos – 100 were flagged, 58 remained online.
- Yemeni-American Al-Qaeda in the Arabian Peninsula (AQAP) cleric Anwar Al-Awlaki videos – 127 were flagged, 111 remained online.
- Al-Qaeda leader Ayman Al-Zawahiri videos – 125 were flagged, 57 remained online.
More recently, of the 115 videos that MEMRI flagged on YouTube in 2015, 69 remained active as of February 27, 2017. Many are still online to this day. Some are so gruesome that the MEMRI report includes a warning to readers about “graphic images.”
One example is a clip titled: “A Martyr From the Taliban Laughs and Utters the Two Declarations [Of Faith] Before He is Martyred.” Posted on July 5, 2011 — and viewed by nearly three million people — it shows a terrorist welcoming death with a smile on his face. The comments beneath the video are all in Arabic.
Another, titled “Shuhada (Martyrs) Of Islam, Look They Are Smiling In Death,” was posted on September 22, 2009, with the YouTube disclaimer, “This video may be inappropriate for some users,” and the user option: “I understand and wish to proceed.” In Arabic with French subtitles, the clip lauds terrorists “martyred for Allah.” User comments include: “beautiful… may Allah give us all the knowledge and power to accelerate our imams.” In other words, the pictures of smiling terrorists and their dead bodies serve as an inspiration to young Muslims seeking Paradise through martyrdom.
This is not theoretical. According to the website Wired UK, as of June 5, there were 535 terrorist attacks around the world — with 3,635 fatalities — since the beginning of 2017 alone. It is only because the bulk of these attacks took place in countries such as Nigeria, Yemen, Somalia and Bangladesh — and involved Muslims killing other Muslims — that they were barely reported, and even less noticed, in the West.
Whenever a Western country is targeted successfully, however, the issue of global jihad hits the headlines – and now threatens to hit the coffers of the social media giants that have been acting as enablers. According to the analyst firm Nomura Instinet, YouTube could lose $750 million in advertising revenue this year as a result of its “funding” of terrorism and, in effect, its enabling of wide-scale murder. Although this figure would not put Google in the red, it represents a protest on the part of users increasingly concerned about international security.
In what was clearly a move to counteract the latest outcry about jihadi videos on YouTube, Google announced on June 18 that it was introducing a “four-step plan” to “fight terrorism online,” referring specifically to ISIS propaganda.
In an op-ed in the Financial Times and a subsequent post on “Google in Europe,” Google General Counsel Kent Walker wrote:
“Terrorism is an attack on open societies, and addressing the threat posed by violence and hate is a critical challenge for us all. Google and YouTube are committed to being part of the solution. We are working with government, law enforcement and civil society groups to tackle the problem of violent extremism online. There should be no place for terrorist content on our services.
While we and others have worked for years to identify and remove content that violates our policies, the uncomfortable truth is that we, as an industry, must acknowledge that more needs to be done. Now.”
The steps Walker listed were: increasing the use of technology to identify terrorism-related videos; increasing the number of independent experts in YouTube’s “Trusted Flagger” program; making it harder for videos that do not strictly violate YouTube’s “community standards,” but which contain extremist content, to be located on the site; and implementing a “Redirect Method,” to send viewers in search of radical content to videos that debunk jihadi recruitment messages.
Robert Spencer, of Jihad Watch, responded wryly to these ostensibly new measures, some of which MEMRI found had gone unimplemented over the years in any case:
“Google says it will put ‘warnings on those videos and make them harder to find.’ Ten to one these warnings will end up going not on jihad videos, but on anti-jihad videos.”