Revue française d’économie (2024/1, Vol. XXXIX, pp. 81-124): "Inégalités de revenu et de patrimoine : modèles, données et perspectives croisées", by Stéphane Auray, Aurélien Eyquem, Bertrand Garbinti and Jonathan Goupille-Lebret
VoxDev: Free contraception had no impact on birth rates in Burkina Faso
An article co-written by Pauline Rossi, Pascaline Dupas, Seema Jayachandran and Adriana Lleras-Muney
Published 24.07.24
CRESTive Columns: Yasmine Houri on the shaping power of social media platforms over online communications about Israel and Palestine
Yasmine is a PhD candidate in Sociology at CREST, Institut Polytechnique de Paris (France).
She is interested in understanding the social mechanisms underlying the diffusion of information on social media. She analyses how online social groups form and collectively assess the appropriateness of content with respect to their reference norms, values, and authority groups. As a case study, she is mapping networks of Telegram channels and observing the re-contextualization of polarised content across communities. Trained as a data scientist, she is committed to combining qualitative analysis with social networks and computational tools.
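As an illustration of what this kind of network mapping can involve in practice, here is a minimal Python sketch, not her actual pipeline, that builds a directed "forwarded from" graph between Telegram channels from a pre-collected set of message records; the field names channel and forwarded_from are hypothetical placeholders.

```python
# Minimal sketch: build a directed forwarding network between Telegram channels.
# The message records and their fields ("channel", "forwarded_from") are invented
# placeholders for whatever a real data-collection step would produce.
import networkx as nx

messages = [
    {"channel": "channel_a", "forwarded_from": "channel_b"},
    {"channel": "channel_a", "forwarded_from": None},
    {"channel": "channel_c", "forwarded_from": "channel_b"},
]

G = nx.DiGraph()
for msg in messages:
    src = msg.get("forwarded_from")  # channel the content originally came from
    dst = msg["channel"]             # channel where it was re-posted
    if src:
        # count how often dst re-posts content from src
        if G.has_edge(src, dst):
            G[src][dst]["weight"] += 1
        else:
            G.add_edge(src, dst, weight=1)

# Channels whose content is heavily re-posted elsewhere are candidate intermediaries.
print(sorted(G.out_degree(weight="weight"), key=lambda x: -x[1]))
```

A graph built this way makes it possible to single out the channels that content repeatedly flows out of, which can then be examined qualitatively, the kind of mixed-methods step her work combines with computational mapping.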
Glueing back the middleman: the shaping power of social media platforms over online communications about Israel and Palestine.
Social media have played a significant role in information and opinion sharing about the military offensives in Israel and Gaza since the events of October 2023. On visual-based platforms like TikTok, war-related videos have generated billions of views. According to the Washington Post, by October 10th, the “#Palestine” and “#Israel” hashtags had respectively garnered 27.8 and 23 billion views on TikTok [1]. Although the documentation and spread of content related to Palestine and Israel on social media is not a new phenomenon, such viral and controversial topics are a growing concern for public policymakers and NGOs, as they often result in the dissemination of misleading information. In my research, I study the processes through which Internet users collectively judge the quality and appropriateness of information. I hope my work can shed light on how social norms and values frame the acceptance of potentially harmful information, notably in relation to contemporary geopolitical issues that are not commonly studied by computational sociologists.
My sociological approach to mass communication is inherited from Paul Lazarsfeld and Elihu Katz. According to the two-step flow of communication model, information flows from mass media to opinion leaders and from them to a wider population. Based on this hypothesis, the theory of personal influence embeds mass communication in structural and socio-contextual flows: areas of dense communication in a social network often revolve around central opinion leaders who are likely to share common interests, demographics, and socio-economic profiles with their followers [2].
Yet, the social study of online information diffusion typically investigates the phenomenon of disintermediation, which refers to the fading mediation of institutions in individual access to information, or as Katz would phrase it, “what happens when a communicator invents a new medium, or adapts an extant one, to disintermediate some middleman” [3]. The paradox of disintermediation is that institutions are, by definition, social structures characterized by sets of beliefs, norms, attitudes, and practices which are relatively stable in time and meant to regulate social interactions [4]. Their removal from the information-sharing process would be a threat to social cohesion and information quality.
Historically, technological breakthroughs have consistently been accused of disrupting information diffusion processes and, thus, social order – in this regard, the World Wide Web is the latest suspect. In 1962, the Canadian philosopher Marshall McLuhan was already analysing the effects of mass media on European culture, and elaborated on the idea that mass communication had turned the entire world into a “global village” [5]. According to this idea, now known as media ecology theory [6], new media constrain our brains to process information in a certain way, which is ultimately reflected in social organizations. To illustrate this idea, Katz uses the telling example of the linearity of print, which forces readers into sequential cause-and-effect reasoning, a logic later imported into rational social forms such as the assembly line. With this in mind, in my research I challenge the theory of disintermediation and investigate the power of the (new) media to influence the online masses on how to think and where to belong [3]. In this essay, I will focus on the former, and illustrate how my theoretical and empirical frameworks can be employed to study the contemporary conflict in the Middle East.
How to think: distorted perceptions of the Palestinian reality through algorithmic frames of thinking and content-based communication.
Social media affordances, which are the various models of concepts, relations and entities that frame communications on each social media platform, directly influence individual and collective perceptions of mediatised content. In particular, the fast-paced and algorithmically-curated diffusion of short video content out of context represents a specific risk in times of conventional and informational warfare.
One of the most consequential effects of technological innovations is the considerable acceleration of the flow of information produced by laypeople. By allowing content to circulate further and deeper in worldwide communication networks, new technologies interfere with prior human relations to time and space, and the social consequences of this phenomenon are non-negligible. According to Kantian philosophy, time and space are a priori frames of perception, a lens through which humans perceive any object in the world. When technological innovations such as social media tamper with these frames, they radically disrupt the human perception of space-time reality, and in turn, threaten to interfere with the existing social order. In contemporary mass communication, instances of such disruptive innovations arguably include the prevalence of algorithmic recommendations and free access to generative artificial intelligence.
The current social media landscape is characterised by the soaring success of infinite scrolling through a curated selection of short video clips, a feature most salient on TikTok. Ever since the popularisation of this affordance, and especially over the past months, users have easily been pushed down a rabbit hole of war-related videos and partisan content. In the absence of institutional intermediation, the risk of disseminating misleading information and propaganda is tremendously high. This affordance can be particularly damaging when it comes to sharing graphic images of casualties and material destruction in Gaza and Israel, because the content displayed looks more real, is consumed compulsively and out of context, and provides consumers with the illusion that they have direct access to the reality of refugees and victims through their screens, without being consciously aware of the algorithmic mediation of their feed. The latter is particularly concerning considering that NGOs are accusing giant platforms of operating massive, systemic censorship: between October and November 2023, Human Rights Watch documented over 1,050 takedowns of pro-Palestine content and accounts on Instagram and Facebook, 1,049 of which are said to have involved peaceful content [7].
Yet, in a fast-paced technological environment where it has never been easier to access generative artificial intelligence models, some users are well-equipped to bypass algorithmic regulations. This was demonstrated after the attack on a refugee camp in Rafah on May 26th, when an artificially generated image posted on Instagram by a Malaysian user went viral. The image depicts a refugee camp, with rows of tents stretching as far as the eye can see in a desert surrounded by mountains. At the centre of the frame, the pro-Palestinian motto “All eyes on Rafah”, originally uttered by Richard Peeperkorn, a representative of the World Health Organization, is laid out in capital letters as if formed by white tents. Many features of this sanitized image suggest that it was created to circumvent platform censorship and be massively shared: the text is shaped by objects, making it harder to monitor for basic language-detection algorithms than if it were directly included in the description, and the absence of bodies and violence amongst these perfectly aligned tents apparently respects the terms-of-use guidelines of most social media platforms. A few days after being published, the visual had already been shared 47 million times, but it quickly drew criticism from media and research experts. Indeed, the image does not actually represent Rafah at the time of publication; in fact, it does not show a real place at all. In an interview with the BBC, Dr. Paul Reilly, a senior lecturer in Communications, Media and Democracy at the University of Glasgow, drew attention to the fact that content posted on social media by journalists in Gaza received less attention than this watered-down representation of a fictitious Rafah [8]. In an interview with France 24, Dr. Giada Pistilli, head of ethics at Hugging Face, argued that the overwhelming presence of artificially generated images such as this one poses a threat to our collective memory [9].
As a computational sociologist, I am interested in collective perceptions of these phenomena, and in how they translate into online community building and information sharing. A major feature of content propagation on social media is the accelerating role of high-profile users. Whether they are the online profiles of well-known real-life figures or individuals who have made a career out of online content creation, such accounts play a central role in information spreading for several reasons. First, because of the large size of their audience, they are central communicators whose posts have the potential to reach a significant number of other individuals. Furthermore, some of these high-profile users benefit either from real-life status or from online notoriety, which endows them with a level of trust and establishment that may encourage their followers to relay their content with more confidence and weaker critical appraisal. The fact that the “All eyes on Rafah” image was relayed by a number of celebrities on social media [10] is most certainly an illustration of this phenomenon.
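To make the notion of a "central communicator" concrete, the following minimal sketch, with invented account names and repost data rather than material from this study, ranks accounts in a repost network by out-degree centrality, a simple proxy for how many other accounts directly relay their content:

```python
# Minimal sketch: rank accounts by how many others directly repost their content.
# The edge list is invented for illustration; edge (a, b) means b reposted a's content.
import networkx as nx

reposts = [
    ("celebrity_1", "user_1"), ("celebrity_1", "user_2"), ("celebrity_1", "user_3"),
    ("journalist_1", "user_2"), ("user_1", "user_4"),
]

G = nx.DiGraph()
G.add_edges_from(reposts)

# High out-degree centrality flags accounts whose posts are relayed by many others,
# i.e. the high-profile users discussed above.
centrality = nx.out_degree_centrality(G)
for account, score in sorted(centrality.items(), key=lambda x: -x[1]):
    print(f"{account}: {score:.2f}")
```

Degree-based measures are only the crudest option; follower counts, repost cascades or betweenness centrality would refine the picture, but the underlying logic of identifying structurally advantaged accounts is the same.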
Taken together, accusations of biased content moderation, the algorithmic framing of content diffusion, and artificially generated war-related content sound the alarm on the risks that social media communication poses to informational transparency and to the accuracy of collective appraisals of news content. When social media users encounter information related to Israel and Gaza, they are most often presented with a curated and fragmented snapshot of violent and partisan content, which impedes any accurate representation of the conflict in time and space.
To go further: identity politics in technocratic, privately-owned digital spaces.
My study of algorithmic framing and social influence on social media suggests a potential “re-intermediation” of information flows. In my current work, I seek to identify and characterise these intermediaries. Are they equivalent to those identified by McLuhan, Katz and Lazarsfeld during the last century? Are they the same ones, now operating on different, newly born platforms? Or have completely unprecedented types of opinion leaders emerged, calling for a new theory of online intermediation? To answer these questions, I study the institutional role of giant social media platforms. I am particularly interested in the paradox that arises when authorities impose legal constraints on privately-owned, for-profit tech companies to regulate harmful content, even though the business model of these organisations specifically thrives on content virality. An interesting continuation of this research would seek to characterise how lobbying and partisanship among the shareholders of giant platforms impact content moderation.
[1] Lorenz, T. (2023, October 10). Why TikTok videos on Israel-Hamas war have drawn billions of views. The Washington Post. https://www.washingtonpost.com/technology/2023/10/10/tiktok-hamas-israel-war-videos/
[2] Katz, E., Lazarsfeld, P. F., & Roper, E. (1955). Personal influence: The part played by people in the flow of mass communications. Routledge.
[3] Katz, E. (1988). Disintermediation: Cutting out the middle man. Intermedia, 16(2), 30-31.
[4] Hodgson, G. M. (2006). What are institutions? Journal of Economic Issues, 40(1), 1-25.
[5] McLuhan, M. (1962). The Gutenberg Galaxy: The Making of Typographic Man. University of Toronto Press.
[6] Setiawan, H., Pawito, P., & Purwasito, A. (2022). Communication behavior use of short video features by adolescents. International Journal of Multicultural and Multireligious Understanding, 9(3), 710-716.
[7] Human Rights Watch. (2023, December 21). Meta’s broken promises: Systemic censorship of Palestine content on Instagram and Facebook. https://www.hrw.org/report/2023/12/21/metas-broken-promises/systemic-censorship-palestine-content-instagram-and
[8] Davies, A., & BBC Arabic. (2024, May 30). All Eyes on Rafah: The post that’s been shared by more than 47m people. BBC News. https://www.bbc.com/news/articles/cjkkj5jejleo
[9] Gabel, B. (2024, May 31). Gaza : “All eyes on Rafah”, une image virale qui “apaise nos consciences”. France 24. https://www.france24.com/fr/moyen-orient/20240531-gaza-all-eyes-on-rafah-une-image-qui-apaise-nos-consciences
[10] NDTV. (2024, May 29). Alia Bhatt to Dua Lipa: Celebs draw attention to Gaza after Rafah horror. https://www.ndtv.com/world-news/alia-bhatt-to-dua-lipa-celebs-who-shared-all-eyes-on-rafah-pic-5770022
CEPR DP19261: The Wage of Temporary Agency Workers, by Sara Signorelli, Clément Malgouyres, Antonin Bergeaud, Pierre Cahuc and Thomas Zuber
17 Jul 2024
GAIMSS’24, an event organised by Simon Finster, Felipe Garrido-Lucero, Atulya Jain and Emilien Macault.
Published 11.07.24 in Factuel, the news outlet of the Université de Lorraine
Dissolution of the French National Assembly and a new political landscape: Etienne Ollion in the media
Etienne Ollion, sociologist, CNRS research director and professor at École polytechnique, specialises in political sociology. In 2017, he began a long-term field study at the National Assembly on the place of political newcomers under Emmanuel Macron’s presidency, which became the subject of his book Les Candidats. Novices et professionnels en politique (PUF, 2021).
In June 2024, following the results of the European elections, President Emmanuel Macron announced the dissolution of the National Assembly. The decision came after a significant victory for the Rassemblement National (RN) in those elections, in which the party won 31.5% of the vote, well ahead of the presidential majority’s list (14.5%).
The dissolution of the National Assembly immediately suspends all parliamentary work in progress, including major bills. New legislative elections were scheduled for 30 June and 7 July 2024. A dissolution of this kind is rare: the previous one dates back to 1997, under Jacques Chirac.
Etienne Ollion, who has studied the composition and workings of the Assembly at length, and more recently the Italian situation following Giorgia Meloni’s rise to power, has been interviewed by numerous media outlets.
Emma Bonutti d’Agostini, a first-year PhD candidate supervised by Etienne Ollion, studies the circulation of far-right discourse and ideology in the journalistic and media spheres in France and Italy. She was also asked about her work by the magazine Le Pèlerin: https://www.lepelerin.com/monde/decryptage/quel-est-le-bilan-pour-les-5-pays-d-europe-dans-lesquels-l-extreme-droite-gouverne-9867
Print media
L’Express :
03/07/2024
https://www.lexpress.fr/idees-et-debats/etienne-ollion-bardella-au-pouvoir-pourrait-avoir-un-impact-bien-plus-fort-que-meloni-en-italie-XBQVX6R7PJDD7ITJZP55QIGPXM/?utm_medium=Social&utm_source=Twitter#Echobox=1720002052
L’Echo :
29/06/24
https://www.lecho.be/opinions/general/etienne-ollion-en-cas-de-victoire-du-rn-en-france-le-chef-serait-le-premier-ministre-et-non-le-president/10553529.html
AOC Media :
23/06/24
https://aoc.media/opinion/2024/06/23/le-mythe-de-limmunite-des-democraties-a-lautoritarisme/
La Nouvelle République :
23/06/2024
https://www.lanouvellerepublique.fr/a-la-une/l-arrivee-au-pouvoir-de-bardella-aurait-un-impact-plus-fort-que-celle-de-meloni-en-italie
Nice-Matin :
23/06/2024
https://www.pressreader.com/france/nice-matin-menton/20240623/282102051853070
Le Télégramme :
23/06/24
https://www.letelegramme.fr/france/peut-on-comparer-giorgia-meloni-en-italie-et-jordan-bardella-en-france-6610523.php
06/07/24
https://www.letelegramme.fr/elections/legislatives/dans-une-coalition-ce-nest-pas-toujours-le-parti-du-milieu-qui-gouverne-6620952.php
France Info :
03/07/24
https://www.francetvinfo.fr/elections/legislatives/legislatives-2024-comment-la-figure-de-jean-luc-melenchon-est-devenue-plus-repoussoir-que-celle-de-marine-le-pen_6639738.html
Le Nouvel Obs :
3/07/24
https://www.nouvelobs.com/idees/20240703.OBS90607/a-l-assemblee-des-deputes-lfi-anti-republicains-ou-ultra-republicains.html
9/07/24
https://www.nouvelobs.com/politique/20240709.OBS90886/legislatives-le-sursaut-et-l-inconnu.html
10/07/24
https://www.nouvelobs.com/politique/20240710.OBS90972/l-immense-majorite-des-pays-europeens-a-un-gouvernement-de-coalition-la-france-constitue-une-exception.html
Podcasts
France Culture :
1/07/24
https://www.radiofrance.fr/franceculture/podcasts/france-culture-va-plus-loin-l-invite-e-des-matins/apres-le-premier-tour-des-elections-legislatives-la-resurrection-d-un-front-republicain-1520144
Videos
Regards #LaMidinale :
21 juin 2024
https://regards.fr/sans-majorite-macron-pourrait-nommer-un-premier-ministre-entoure-par-un-gouvernement-technique/
Mediapart – election night live coverage:
30/06/24
https://www.mediapart.fr/journal/politique/300624/legislatives-la-soiree-electorale-en-direct#at_medium=custom7&at_campaign=1047
7/07/24
https://www.mediapart.fr/journal/politique/050724/dimanche-7-juillet-19-h-30-la-soiree-electorale-en-direct-sur-mediapart