CRESTive Columns: Yasmine Houri and the shaping power of social media platforms over online communications about Israel and Palestine


Yasmine is a PhD candidate in Sociology at CREST, Institut Polytechnique de Paris (France).

She is interested in understanding the social mechanisms underlying the diffusion of information on social media. She analyses how online social groups form and collectively assess the appropriateness of content with respect to their reference norms, values, and authority groups. As a case study, she is mapping networks of Telegram channels and observing the re-contextualization of polarised content across communities. Trained as a data scientist, she is committed to combining qualitative analysis with social network analysis and computational tools.

Glueing back the middleman: the shaping power of social media platforms over online communications about Israel and Palestine. 

Social media have played a significant role in information and opinion sharing about the military offensives in Israel and Gaza since the events of October 2023. On visual-based platforms like TikTok, war-related videos have generated billions of views. According to the Washington Post, by October 10th, the “#Palestine” and “#Israel” hashtags had garnered 27.8 and 23 billion views on TikTok, respectively [1]. Although the documentation and spread of content related to Palestine and Israel on social media is not a new phenomenon, such viral and controversial topics are a growing concern for public policymakers and NGOs, as they often result in the dissemination of misleading information. In my research, I study the processes through which Internet users collectively judge the quality and appropriateness of information. I hope my work can shed light on how social norms and values frame the acceptance of potentially harmful information, notably in relation to contemporary geopolitical issues that are not commonly studied by computational sociologists.

My sociological approach to mass communication is inherited from Paul Lazarsfeld and Elihu Katz. According to their two-step flow of communication model, information flows from mass media to opinion leaders, and from them to the wider population. Building on this hypothesis, the theory of personal influence embeds mass communication in structural and socio-contextual flows: areas of dense communication in a social network often revolve around central opinion leaders, who are likely to share common interests, demographics, and socio-economic profiles with their followers [2].
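
To make this structural intuition concrete, here is a minimal sketch in Python, my own toy illustration rather than anything from Katz and Lazarsfeld: on a small, hypothetical follower network (all account names invented), the opinion leaders of the two-step flow model surface as the locally central nodes through which media content reaches everyone else.

```python
# Toy illustration of the two-step flow model using networkx.
# All account names are hypothetical.
import networkx as nx

# A directed edge (a, b) means "a follows b".
follows = [
    ("alice", "leader1"), ("bob", "leader1"), ("carol", "leader1"),
    ("dave", "leader2"), ("erin", "leader2"),
    ("leader1", "media_outlet"), ("leader2", "media_outlet"),
]
G = nx.DiGraph(follows)

# In-degree counts followers: leaving the mass-media node aside, the
# highest in-degree accounts occupy the "opinion leader" position,
# relaying media content to the wider population.
for node, n_followers in sorted(G.in_degree(), key=lambda kv: -kv[1]):
    print(node, n_followers)
```

On this toy graph, media_outlet is followed only by the two leaders, who are in turn followed by everyone else: information must pass through them, which is precisely the intermediation the two-step flow model describes.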

Yet, the social study of online information diffusion typically investigates the phenomenon of disintermediation, which refers to the fading mediation of institutions in individual access to information, or, as Katz would phrase it, “what happens when a communicator invents a new medium, or adapts an extant one, to disintermediate some middleman” [3]. The paradox of disintermediation is that institutions are, by definition, a social structure characterized by a set of beliefs, norms, attitudes, and practices which are relatively stable in time and meant to regulate social interactions [4]. Their removal from the information-sharing process would therefore be a threat to social cohesion and information quality.

Historically, technological breakthroughs have consistently been accused of disrupting information diffusion processes and, thus, social order – in this regard, the World Wide Web is the latest suspect. As early as 1962, Canadian philosopher Marshall McLuhan analysed the effects of mass media on European culture and elaborated on the idea that mass communication had turned the entire world into a “global village” [5]. According to this idea, now known as “media ecology theory” [6], each new medium of communication constrains our brains to process information in a certain way, and this constraint is ultimately reflected in social organizations. To illustrate this idea, Katz uses the telling example of the linearity of print, which forces readers into sequential cause-and-effect reasoning, a logic then imported into rational social forms such as the assembly line. With this in mind, in my research, I challenge the theory of disintermediation and investigate the power of the (new) media to influence the online masses on how to think and where to belong [3]. In this essay, I will focus on the former, and illustrate how my theoretical and empirical frameworks can be employed to study the contemporary conflict in the Middle East.

How to think: distorted perceptions of the Palestinian reality through algorithmic frames of thinking and content-based communication.

Social media affordances, that is, the models of concepts, relations, and entities that frame communication on each social media platform, directly influence individual and collective perceptions of mediatised content. In particular, the fast-paced, algorithmically curated diffusion of short video content out of context represents a specific risk in times of conventional and informational warfare.

One of the most consequential effects of technological innovations is the considerable acceleration of the flow of information produced by laypeople. By allowing content to circulate further and deeper in worldwide communication networks, new technologies interfere with prior human relationships to time and space, and the social consequences of this shift are non-negligible. According to Kantian philosophy, time and space are a priori frames of perception, a lens through which humans perceive any object in the world. When technological innovations such as social media tamper with these frames, they radically disrupt the human perception of space-time reality and, in turn, threaten to interfere with the existing social order. In contemporary mass communication, instances of such disruptive innovations arguably include the prevalence of algorithmic recommendations and free access to generative artificial intelligence.

The current social media landscape is characterised by the soaring success of infinite scrolling through a curated selection of short video clips, a feature most salient on TikTok. Ever since the popularisation of this affordance, and especially over the past months, users have easily been pushed down a rabbit hole of war-related videos and partisan content. In the absence of institutional intermediation, the risk of disseminating misleading information and propaganda is tremendously high. This affordance can be particularly damaging when it comes to sharing graphic images of casualties and material destruction in Gaza and Israel: the content displayed looks more real, is consumed compulsively and out of context, and gives consumers the illusion of direct access to the reality of refugees and victims through their screens, without their being consciously aware of the algorithmic mediation of their feed. The latter is particularly concerning considering that NGOs are accusing giant platforms of operating massive systemic censorship: between October and November 2023, Human Rights Watch reported that over 1,050 pieces of pro-Palestine content and accounts were taken down on Instagram and Facebook, 1,049 of which are said to have involved peaceful content [7].

Yet, in a fast-paced technological environment where it has never been easier to access generative artificial intelligence models, some users are well-equipped to bypass algorithmic regulations. This was demonstrated after the attack on a refugee camp in Rafah on May 26th, when an artificially generated image posted on Instagram by a Malaysian user went viral. The image depicts a refugee camp, with rows of tents stretching as far as the eye can see in a desert surrounded by mountains. At the centre of the frame, the pro-Palestinian motto “All eyes on Rafah”, originally uttered by Richard Peeperkorn, a representative of the World Health Organization, is laid out in capital letters as if formed by white tents. Many features of this sanitized image suggest that it was created to circumvent platform censorship and be massively shared: the text is shaped by objects, making it harder to detect for basic language-detection algorithms than if it were directly included in the description, and the absence of bodies and violence amongst these perfectly aligned tents apparently complies with the terms of use of most social media platforms. A few days after being published, the visual had already been shared 47 million times, yet it quickly drew criticism from media and research experts. Indeed, the image does not actually represent Rafah at the time of publication; as a matter of fact, it does not show a real place at all. In an interview with the BBC, Dr. Paul Reilly, a senior lecturer in Communications, Media and Democracy at the University of Glasgow, drew attention to the fact that content posted on social media by journalists in Gaza received less attention than this watered-down representation of a fictitious Rafah [8]. In an interview with France 24, Dr. Giada Pistilli, head of ethics at Hugging Face, argued that the overwhelming presence of artificially generated images such as this one poses a threat to our collective memory [9].
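
To see why rendering a slogan inside the image defeats text-based filters, consider the toy sketch below; this is my own illustration, not a description of any platform’s actual moderation pipeline, and the field names and blocklist are hypothetical.

```python
# Toy caption-based keyword filter: it only ever sees the textual
# metadata of a post, never the pixels of the attached image.
BLOCKLIST = {"all eyes on rafah"}

def caption_filter(post: dict) -> bool:
    """Flag a post if a blocked phrase appears in its textual fields."""
    text = " ".join(
        post.get(field, "") for field in ("caption", "description")
    ).lower()
    return any(phrase in text for phrase in BLOCKLIST)

plain_post = {"caption": "All eyes on Rafah", "image": "photo.jpg"}
image_only = {"caption": "", "image": "tents_spelling_slogan.jpg"}

print(caption_filter(plain_post))  # True: the phrase sits in the caption
print(caption_filter(image_only))  # False: the slogan lives in the pixels,
                                   # invisible without OCR on the image
```

Catching the second post would require optical character recognition on every uploaded image, a far more expensive operation than string matching, which is exactly the asymmetry such visuals exploit.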

As a computational sociologist, I am interested in collective perceptions of these phenomena, and in how they translate into online community building and information sharing. A major feature of content propagation on social media is the accelerating role of high-profile users. Whether they be the online profiles of real-life notorious figures or individuals who make a career out of online content creation, such accounts prove to have a central role in information spreading for several reasons. First, because of the large size of their audience, they are central communicators whose posts have the potential to reach a significant number of other individuals. Furthermore, some of these high-profile users benefit either from real-life status or from online notoriety, which endows them with a level of trust and establishment that might encourage their followers to relay their content with more confidence and weaker critical appraisal. The fact that the “All eyes on Rafah” image was relayed by a number of celebrities on social media [10] is most certainly an illustration of this phenomenon.
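
A simple simulation can make this accelerating role tangible. The sketch below is a hedged toy model, not my actual research pipeline: it runs an independent-cascade diffusion on a synthetic, hub-heavy network and compares the average reach of a post seeded by the highest-degree “celebrity” account with one seeded by an arbitrary user; all parameters are illustrative.

```python
# Toy independent-cascade diffusion on a synthetic follower network.
import random
import networkx as nx

random.seed(0)
# Barabási–Albert graphs are hub-heavy, a common proxy for follower networks.
G = nx.barabasi_albert_graph(n=2000, m=3, seed=0)

def average_reach(graph, seed_node, p=0.05, runs=200):
    """Average number of users eventually reached from seed_node when each
    exposure independently convinces a neighbour with probability p."""
    total = 0
    for _ in range(runs):
        active, frontier = {seed_node}, [seed_node]
        while frontier:
            nxt = []
            for u in frontier:
                for v in graph.neighbors(u):
                    if v not in active and random.random() < p:
                        active.add(v)
                        nxt.append(v)
            frontier = nxt
        total += len(active)
    return total / runs

hub = max(G.degree(), key=lambda kv: kv[1])[0]   # the "celebrity" account
ordinary = random.choice(list(G.nodes()))        # an arbitrary user

print("seeded by the hub:      ", average_reach(G, hub))
print("seeded by ordinary user:", average_reach(G, ordinary))
```

Even with identical content and an identical sharing probability, the cascade seeded at the hub reaches many times more users on average, which is the purely structural component of the advantage enjoyed by high-profile accounts.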

Taken together, accusations of biased content moderation, the algorithmic framing of content diffusion, and artificially created war-related content sound the alarm on the risks that social media communication poses to informational transparency and to accuracy in the collective appraisal of news content. When social media users encounter information related to Israel and Gaza, they are most often presented with a curated and fractionated snapshot of violent and partisan content, which impedes any accurate representation of the conflict in time and space.

To go further: identity politics in technocratic, privately-owned digital spaces.

My study of algorithmic framing and social influence on social media suggests a potential “re-intermediation” of information flows. In my current work, I seek to identify and characterise these intermediaries. Are they equivalent to those identified by McLuhan, Katz and Lazarsfeld during the last century? Are they the same ones, now operating on different, newly born platforms? Or have completely unprecedented types of opinion leaders emerged, calling for a new theory of online intermediation? To answer these questions, I study the institutional role of giant social media platforms. I am particularly interested in the paradox that arises when authorities impose legal constraints on privately-owned, for-profit tech companies to regulate harmful content, even though the business model of these organisations specifically thrives on content virality. An interesting continuation of this research would seek to characterise how lobbying and partisanship among the shareholders of giant platforms impact content moderation.

[1] Lorenz, T. (2023, October 10). Why TikTok videos on Israel-Hamas war have drawn billions of views. The Washington Post.
https://www.washingtonpost.com/technology/2023/10/10/tiktok-hamas-israel-war-videos/

[2] Katz, E., Lazarsfeld, P. F., & Roper, E. (1955). Personal influence: The part played by people in the flow of mass communications. Routledge.

[3] Katz, E. (1988). Disintermediation: Cutting out the middle man. Intermedia, 16(2), 30-31.

[4] Hodgson, G. M. (2006). What are institutions? Journal of Economic Issues, 40(1), 1-25.

[5] McLuhan, M. (1962). The Gutenberg Galaxy: The Making of Typographic Man. University of Toronto Press.

[6] Setiawan, H., Pawito, P., & Purwasito, A. (2022). Communication Behavior Use of Short Video Features by Adolescents. International Journal of Multicultural and Multireligious Understanding, 9(3), 710-716.

[7] Human Rights Watch. (2023, December 21). Meta’s broken promises: Systemic censorship of Palestine content on Instagram and Facebook.
https://www.hrw.org/report/2023/12/21/metas-broken-promises/systemic-censorship-palestine-content-instagram-and

[8] Davies, A., & BBC Arabic. (2024, May 30). All Eyes on Rafah: The post that’s been shared by more than 47m people. BBC News. https://www.bbc.com/news/articles/cjkkj5jejleo

[9] Gabel, B. (2024, May 31). Gaza : “All eyes on Rafah”, une image virale qui “apaise nos consciences” [Gaza: “All eyes on Rafah”, a viral image that “soothes our consciences”]. France 24.
https://www.france24.com/fr/moyen-orient/20240531-gaza-all-eyes-on-rafah-une-image-qui-apaise-nos-consciences

[10] NDTV. (2024, May 29). Alia Bhatt To Dua Lipa: Celebs Draw Attention To Gaza After Rafah Horror.
https://www.ndtv.com/world-news/alia-bhatt-to-dua-lipa-celebs-who-shared-all-eyes-on-rafah-pic-5770022