The attacks came by land, sea, air – and online.
Terror propaganda – violent videos and graphic images of kidnappings and murders of civilians and soldiers that flooded social media from the deadly cross-border incursion into Israel – was a key element in Hamas’ military campaign, said Graham Brookie, senior director of the Atlantic Council’s Digital Forensic Research Lab.
Hamas streamed from the war zone “in closer to real-time than in past conflicts,” indicating that the online strategy “was an essential part of the overall planning for their attack on Israel” to take advantage of how hyperconnected Israel is with its widespread use of smartphones and social media, Brookie said.
Veterans of conflict zones on Brookie’s team tracking Hamas' digital footprint were struck by the ferocity and volume of the footage, the likes of which they had never seen, he said.
The effects could become more grave if Hamas makes good on a threat to broadcast executions of hostages, Brookie said. Some Hamas officials have since walked back the threat.
Spreading propaganda on social media networks – especially those with more lax moderation systems – is a low-cost and highly effective tactic that is increasingly being deployed by terrorist groups, said Colin P. Clarke, director of research at the Soufan Group, a global intelligence and security consultancy.
The videos and images from the deadly attacks posted by Hamas-affiliated accounts were shared by Hamas allies and unwitting participants around the globe, Clarke said.
The tactic is pulled from the playbooks of the Islamic State and al-Qaida, which experts said pioneered the viral form of psychological warfare by livestreaming beheadings and other graphic footage.
“This is the nature of asymmetric warfare – you've got to utilize tools that are going to make you seem bigger and more powerful than you are,” Clarke said. “For Hamas, it costs them nothing to videotape something and send it out there, and now everyone thinks that they're creeping around every corner.”
Hamas fired off posts from official accounts on the encrypted messaging and social media app Telegram that ricocheted to X, formerly Twitter, and beyond.
Telegram – which allows messaging between users but also has a “Channel” function that works similarly to X – has a long history of being used by terrorists, among them the Islamic State, or ISIS. The platform, often labeled by extremism experts as “Terrorgram,” has been increasingly embraced by domestic extremists in the U.S. in recent years because of its almost nonexistent moderation.
“As soon as something posts on Telegram, we see this crossflow of content to mainstream platforms, either intentionally or just through the oxygen of amplification,” Brookie said.
X, which gutted its trust and safety teams under new owner Elon Musk, was unprepared for the wave of terror propaganda that then spread to other social media platforms and messaging apps like WhatsApp.
This week, X said it was removing newly created Hamas-affiliated accounts and coordinating with industry peers “to try and prevent terrorist content from being distributed online.” It also recommended that people adjust their sensitive media settings if they don’t want to see disturbing content.
The U.K. summoned social media executives Wednesday to demand platforms remove violent content of the Hamas attacks on Israel.
Telegram and X did not immediately respond to requests for comment.
If social media sites continue to erode guardrails and safeguards, Clarke warned, what is happening today will get worse.
“This is just the facet of what’s called ‘fourth generation warfare,’ or ‘modern warfare’ in this era,” he said. “So, whatever the next conflict is, it doesn't really matter, you're going to see this on steroids.”
Once propaganda and disinformation reach X, the platform allows the content to be shared and even monetized, a report from the Tech Transparency Project found.
The report, an advance copy of which was provided exclusively to USA TODAY, found several premium subscriber accounts – which pay a fee of $8 a month to the platform – sharing Hamas propaganda including violent and graphic images of the attacks on Israel.
According to the report, some video content appears to violate X’s “violent and hateful entities policy,” which prohibits promotion of terrorist organizations and their propaganda. It also appears to violate an X policy stating that the platform will remove accounts sharing content produced by terrorists, the report states.
Despite X’s announcement that it was working to remove Hamas-affiliated accounts, TTP researchers "found no evidence that X had restricted the accounts sharing the Hamas attack videos,” the report says. It also notes that in some cases, X ran advertisements in the comments under the video posts.
The TTP report found several instances of “verified” X users, some with more than 500,000 followers, spreading uncensored Hamas propaganda videos.
One clip posted to X shows bodycam footage of Hamas militants going room-to-room in an Israeli military base firing automatic weapons. Bloodied bodies lie on the ground in the video, which has been watched more than 60,000 times.
“It's been nearly 10 years since the Islamic State made heavy use of social media to amplify its propaganda and recruitment efforts. Although some platforms have improved at reducing the spread of Islamic State content, today we see Hamas deploying very similar tactics on X,” TTP director Katie A. Paul told USA TODAY. “The company has monetized the spread of Hamas propaganda despite its clear violations of X policy. X needs to effectively enforce the rules it has on the books as the conflict continues.”
Brookie said social media platforms are being forced to grapple with a multi-front crisis as they try to shield users from terror propaganda, misleading information, hate speech, and violent and graphic images and footage.
“The amount of content is going to be more than any team, whether it’s at a platform or a research team like mine, or a journalism outlet or organization could possibly sift through,” he said. “That changes the frame from who’s doing well to who’s doing the least bad in a hard situation.”
The major platforms are working with the Global Internet Forum to Counter Terrorism (GIFCT), which was founded by Facebook, Microsoft, X and YouTube in 2017 to prevent terrorists and violent extremists from exploiting digital platforms.
“In this rapidly evolving situation, we continue to work with our members to identify and follow trends in content and activity online related to terrorist and violent extremist actors involved in the offline violence,” GIFCT said in a statement. “This includes the possibility that a range of terrorist and violent extremist networks and groups may seek to exploit the conflict for their own purposes, as we have seen in past events.”
Facebook, YouTube and TikTok say they have poured resources into moderating content from the Israel-Hamas war.
“After the terrorist attacks by Hamas on Israel on Saturday, we quickly established a special operations center staffed with experts, including fluent Hebrew and Arabic speakers, to closely monitor and respond to this rapidly evolving situation,” Meta said in a statement.
The major platforms prohibit content produced by terrorist organizations including Hamas as well as footage showing hostages.