Communication

The Role of Requests for Information in Governing Digital Platforms Under the Digital Services Act: The Case of X

Matteo Fabbri

IMT School for Advanced Studies Lucca, 55100 Lucca, LU, Italy
Journal. Media 2025, 6(1), 41; https://doi.org/10.3390/journalmedia6010041
Submission received: 31 December 2024 / Revised: 26 February 2025 / Accepted: 28 February 2025 / Published: 12 March 2025

Abstract

The Digital Services Act (DSA) is the first supranational regulation aimed at improving the safety, transparency and accountability of online platforms. However, the DSA enforcement process is substantially opaque due to the scarcity of publicly available legal documents on methods, sources and results of the investigations carried out under its scope. This paper examines the transparency of the DSA enforcement process, focusing on the legal and political motivations of the progression from requests for information (RFIs) to the initiation of proceedings, using the European Commission’s investigation against X as a case study.

1. Introduction

The Digital Services Act (DSA) (European Parliament and Council, 2022) is the first supranational regulation addressing the influence of online platforms on society through provisions aimed at fostering their safety, transparency and accountability towards users. As of 17 February 2024, the DSA applies “to all online intermediaries in the EU”, and therefore also to online platforms1 (European Commission, 2024d): this means that every online platform is now liable for failing to comply with its requirements on, among other things, content moderation, recommender systems, advertising transparency and systemic risk assessment. As regards very large online platforms (VLOPs) and very large online search engines (VLOSEs), defined as those reaching at least 45 million monthly active users in the EU, i.e., around 10% of the EU population, the deadline for full compliance with the DSA was set for August 2023. The Commission has exclusive supervisory and enforcement power for the obligations concerning VLOPs and VLOSEs. Consequently, to verify any failure to comply with the new rules, the Commission has sent requests for information (RFIs) to various VLOPs, including X, Facebook, Instagram, AliExpress, TikTok and YouTube, starting from October 2023 (European Commission, 2025b). The RFI is the first stage of an investigation under the DSA, which can be followed by access to the data and algorithms used by the platform, interviews with informed subjects and inspections at the platform’s premises. The information gathered through RFIs has already motivated the opening of several infringement proceedings, including against X, TikTok, AliExpress, Facebook, Instagram and Temu between December 2023 and December 2024.
With a view to clarifying the structure and implications of the DSA enforcement process, this analysis focuses on the progression from an RFI to the initiation of proceedings against a platform: in particular, this contribution aims to provide insights into the type of evidence collected by the Commission through investigations that can justify the decision to open infringement proceedings under the DSA. Given that the text of already submitted RFIs is not publicly available, nor is it possible to access official documents about further investigative steps (as they may not have taken place yet), this article compares the press release about a specific RFI with its first legal consequence, i.e., the Commission decision on initiating proceedings against X (hereafter Commission decision), published on 18 December 2023 (European Commission, 2023a). Preceded by considerations on the problems raised by an opaque public governance of online platforms, this comparison lays the groundwork for understanding how the evidence collected by the Commission can justify the initiation of proceedings. The analysis relies on documents available through the European Commission website, including press releases, executive summaries, calls for applications and the Commission decision itself, which is the first legal document publicly disclosed regarding the DSA enforcement process, apart from designation decisions. The only other decisions on initiating proceedings published in full to date concern AliExpress, the second of the three proceedings against TikTok (regarding TikTok Lite’s “Task and Reward Programme”, which has been suspended in the EU) and Temu, in order of appearance (European Commission, 2025b). X is the only platform whose proceedings have led to preliminary findings, which were announced by the Commission through a press release (European Commission, 2024c).

2. The Problems Raised by Opacity

Notifying the initiation and development of proceedings against VLOPs through press releases, without publicly available legal documents backing most of the announcements, underscores the opacity of the communication strategy adopted by the Commission regarding DSA enforcement. No reason has been provided for sharing only a minority of the legal documents about the proceedings, and in such an inconsistent fashion. However, the public availability of legal and policy evidence about the DSA enforcement procedures is essential for fostering scrutiny by researchers and platform users (Liesenfeld, 2024). Moreover, sharing information on who does what and how, i.e., which entity of the Commission is responsible for selecting specific evidence about the illegal activities of platforms and how such evidence can justify the opening of proceedings, is a necessary prerequisite for transparent and accountable governance.
The lack of full legal documents on the DSA enforcement strategy and on how the Commission evaluates DSA-mandated audits and the information provided by VLOPs motivates concerns about platforms’ compliance practices2, raising the risk of transparency washing (Zalnieriute, 2021) and regulatory capture. As Terzis et al. (2024) highlight with regard to algorithmic auditing for VLOPs, “the breadth of systemic risks that the DSA requires to be considered also allows considerable room for framing, bringing the auditor’s priorities and worldview and constructing the DSA requirements around them, rather than the other way around”. Auditors’ priorities also include maintaining good business relationships with VLOPs that may already use their services for financial audits, as the Delegated Regulation on the Performance of Audits (DRPA) (European Commission, 2023b) clarifies that concerns about conflicts of interest “should not exclude auditing organisations who have performed statutory financial audits” (recital 8 DRPA). While “responsibility for the initial formulation of the benchmarks against which compliance is/will be sought, rests with the audited organisation (DRPA art. 5 (1) (a))”, the “back-and-forth between the auditing organisation and the [...] audited organisation is likely to have a considerable impact on the way auditing benchmarks are formulated and/or standardised in the future” (Terzis et al., 2024). Consequently, as, de facto, “the primary responsibility for the formulation of benchmarks is left to the auditing organisation, [...] benchmark disparities amongst different auditors may incentivise platforms to choose their assessors and auditors based on their benchmarks (easy or difficult, simple or complicated)” (ibidem). Therefore, it can be argued that the market for platform auditing is going to expand, but not necessarily by diversifying its offer: it might instead align with minimalist interpretations of regulatory provisions advanced by large consulting companies.
The need for transparency regarding the DSA enforcement process has also been highlighted by an EU institution holding the Commission accountable: on 14 November 2024, the European Ombudsman (2024) issued a letter to the President of the Commission challenging the “general presumption of confidentiality” which, the Commission argues, applies to all documents related to DSA enforcement. In particular, following the Commission’s rejection of a request to publish X’s systemic risk assessment report in 2023 (which would in any case have been published by November 2024 as mandated by the DSA), the Ombudsman argues that “the requirement for professional secrecy laid down in Article 84 of the DSA […] does not mean that all information the Commission obtains under the DSA should, as such, be considered as confidential” (European Ombudsman, 2024). Therefore, in the Ombudsman’s view, “applying a general presumption of confidentiality to the risk assessment report at issue […] constituted maladministration” (ibidem). In this regard, clarifications on the boundaries of what the Commission considers confidential information within the scope of the DSA are urgently needed to prevent accusations of maladministration, which might weaken public trust (Field & Roberts, 2020) and limit the overall effectiveness of the enforcement process. Indeed, the sharing of knowledge among the different social stakeholders involved, which would help design solutions for a pluralistic platform governance, is prevented by “behind-closed-doors” enforcement and compliance dynamics. While the legal premises of the DSA align, in theory, with the principle of cooperative responsibility outlined by Helberger et al. (2018), the practices of its application have so far followed a different approach, focused on political bargaining between regulators and regulated entities. For example, while recital 90 DSA invites VLOPs to consult users’ representatives for the assessment and mitigation of systemic risks, it is not possible to know, in many cases even after reading platforms’ systemic risk reports, which organisations have been consulted; the same holds for the investigations carried out by the Commission.

3. The Content and Consequences of a Request for Information

According to the related press release, the RFI sent to X (European Commission, 2023d) on 12 October 2023 concerned “the assessment and mitigation of risks related to the dissemination of illegal content, disinformation, gender-based violence, and any negative effects on the exercise of fundamental rights, rights of the child, public security and mental well-being”. The Commission aimed to scrutinise X’s “policies and actions regarding notices on illegal content, complaint handling, risk assessment and measures to mitigate the risks identified”. Following the submission of the RFI, the Commission decision identified five areas of concern in which the platform is suspected of having infringed the DSA provisions. A closer look at the text of the decision can help highlight the content and consequences of an RFI to a VLOP.
Firstly, Twitter International Unlimited Company (TIUC), “the main establishment of the provider of X in the European Union”, failed to “diligently assess certain systemic risks in the European Union stemming from the design and functioning of X and its related systems, including algorithmic systems, or from the use made of their services” (European Commission, 2023a). In particular, the company did not “put in place reasonable, proportionate and effective mitigation measures” for “the actual and foreseeable negative effects on civic discourse and electoral process stemming from the design and functioning of X in the European Union”: in fact, the current solutions “appear inadequate […] notably in the absence of well-defined and objectively verifiable performance metrics” (ibidem). This failure is particularly evident in the moderation of content in languages other than English or pertaining to specific local and regional contexts. Suspicions that “insufficient resources [are] dedicated to mitigation measures” (ibidem) focus on the role of Community Notes, the collaborative feature that allows users to “leave notes on any post and, if enough contributors from different points of view rate that note as helpful, the note will be publicly shown on a post” (X Help Center, n.d.).
Secondly, the company would systematically fail to process efficiently, “take decisions in a diligent, non-arbitrary, and objective manner” and respond “without undue delay” to the “notices […] of allegedly illegal content hosted on X” (European Commission, 2023a); therefore, X’s content moderation would also be insufficient when the input comes from users. Thirdly, the recently introduced possibility of purchasing the blue checkmark that once marked an account as verified is considered deceptive and manipulative for users of X, who “are led to interpret […] checkmarks as an indication that they are interacting with an account whose identity has been verified or is otherwise more trustworthy, when in fact no such verification or confirmation of trustworthiness appear to have taken place” (ibidem). Fourthly, the company did not abide by the transparency requirements for online advertising, “by not providing searchable and reliable tools that allow multi-criteria queries and application programming interfaces to obtain all the information on such advertisements as required by Article 39(2)” of the DSA (ibidem). Lastly, the VLOP seems “to have denied access to data that are publicly accessible on X’s online interface to qualified researchers” (ibidem) by imposing costs for using the API and prohibiting the scraping of publicly accessible data, in violation of Article 40(12) of the DSA.
As these excerpts from the Commission decision highlight, the proceedings against X were initiated as a follow-up to the platform’s response to the RFI submitted on 12 October 2023: in particular, the problematic issues regarding the handling of notices of illegal content, the measures to mitigate systemic risks and the platform’s moderation policy, indicated in the press release about the RFI, correspond to the main areas of concern addressed by the decision. The consequentiality between the RFI and the decision is also underlined by points 4 and 5 of the latter: while point 4 refers to the submission of the RFI and the response provided by X, point 5 states that, according to Article 66(1) of the DSA, “the Commission may initiate proceedings in view of the possible adoption of decisions […] in respect of the relevant conduct by the provider of the very large online platform or of the very large online search engine that the Commission suspects of having infringed any of the provisions” (ibidem).

4. Next Steps of the Enforcement Process

The areas of concern addressed by the Commission decision include issues that emerge both from the bottom-up interaction with the platform and from the top-down governance of its socio-technical ecosystem: data access for researchers and users’ notices of illegal content pertain to the former, while the identification of systemic risks and the measures adopted to mitigate them pertain to the latter. The co-existence of top-down and bottom-up perspectives is not only motivated by the necessity to address issues coming from different sources (e.g., complaints filed by recipients of the service versus a letter from the Commission to the platform requesting clarifications); it is also functional to investigating different aspects of the same area of concern (e.g., the use of Community Notes to support content moderation, whose effectiveness is questioned both by users and by the Commission, as the platform’s reliance on this collaborative feature may hint at a lack of dedicated personnel for such a delicate role). The structure and content of the decision indicate that, with the enforcement of the DSA, VLOPs and VLOSEs may no longer be able to treat prima facie compliance as separate from the obligation to provide reasonably prompt and reliable answers to the issues raised by individual users and researchers. In practice, platforms are expected to go beyond an understanding of compliance as a box-ticking exercise; instead, they should engage in a substantive dialogue with users regarding content moderation and with researchers regarding data access.
The investigation initiated by the Commission decision aims to establish whether the failures outlined above “would constitute infringements of Articles 34(1), 34(2) and 35(1)” of the DSA as regards the first area of concern (inadequate assessment and mitigation of systemic risks), Articles 16(5) and 16(6) as concerns the second (handling of notices of illegal content), Article 25(1) with respect to the third (deceptive design of checkmarks), Article 39 with reference to the fourth (lack of tools to ensure transparency in advertising) and Article 40 as regards the fifth (denied data access for researchers) (European Commission, 2023a). According to the European Commission (2023c), the next steps of the investigation will include gathering further “evidence, for example by sending additional requests for information, conducting interviews or inspections”. Following the opening of formal infringement proceedings, the Commission will be empowered “to take further enforcement steps, such as interim measures, and non-compliance decisions” and “to accept any commitment made by X to remedy on the matters subject to the proceeding” (ibidem). Interestingly, the “DSA does not set any legal deadline” for the end of the proceedings, whose duration will depend on several “factors, including the complexity of the case, the extent to which the company concerned cooperates with the Commission and the exercise of the rights of defence” (European Commission, 2023c). The responsibility for carrying out the investigation lies with the Commission alone, whose decision “relieves Digital Services Coordinators, or any other competent authority of EU Member States, of their powers to supervise and enforce the DSA in relation to the suspected infringements of Articles 16(5), 16(6) and 25(1)” (ibidem). In this context, it is difficult to hypothesise a timeline for the conclusion of the proceedings against X.
The preliminary findings announced by the Commission in July 2024 (European Commission, 2024c) address three of the issues motivating the initiation of proceedings (dark patterns, advertising transparency and data access for researchers), while the investigation continues on the remaining ones. The investigation was based on an “analysis of internal company documents, interviews with experts, as well as cooperation with national Digital Services Coordinators” (ibidem). As can be observed, neither the evidence sent by X in response to the RFI nor the content of the interviews with experts was made available by the Commission, which has kept both the methodology and the results of its investigation opaque. The preliminary findings sent to X substantially correspond to the suspicions underlying the Commission decision, as the investigation outcome highlights that: firstly, “X designs and operates its [...] ‘Blue checkmark’ in a way that [...] deceives users”, because, as “anyone can subscribe to obtain such a ‘verified’ status”, users may be prevented from making “free and informed decisions about the authenticity of the accounts and the content they interact with”; secondly, “X does not comply with the required transparency on advertising, as it does not provide a searchable and reliable advertisement repository” allowing “for the required supervision and research into emerging risks brought about by the distribution of advertising online”; thirdly, “X fails to provide access to its public data to researchers in line with the conditions set out in the DSA”, for example by preventing scraping or making access to its API too burdensome or expensive (ibidem).

5. The Intertwined Role of ECAT, DG Connect and External Contractors

Given the amount of investigative work that the Commission will embark on to enforce the DSA, it is useful to focus on the entities involved in this process: on the one side, the European Centre for Algorithmic Transparency (ECAT), within the Joint Research Centre (JRC), and, on the other side, the DSA enforcement team, within Directorate F (Platforms Policy and Enforcement) (European Commission, 2024e) of DG Connect. While the DSA enforcement team should have the responsibility of enforcing the regulation, some crucial investigative procedures, like “algorithmic system inspections” and “technical tests to enhance [...] the understanding of their functioning” (European Centre for Algorithmic Transparency, n.d.-a), would be carried out by ECAT; therefore, it is unclear how the collaboration and the division of duties between the two institutional bodies will be managed. On the research side, the mission of ECAT is to study the “short, mid and long-term societal impact” of algorithmic systems, identify and measure “systemic risks associated with VLOPs and VLOSEs” and develop “practical methodologies towards fair, transparent and accountable algorithmic approaches, with a focus on recommender systems and information retrieval” (ibidem). On the enforcement side, the DSA enforcement team will be composed of “multi-disciplinary teams […] co-operating with regulatory authorities in the Member States”, which “will engage with stakeholders and gather knowledge and evidence to support the application of the DSA and to detect, investigate and analyse potential infringements of the DSA” (European Commission, 2024e).
The information provided by the Commission is currently not sufficient to understand how the work of ECAT and that of the DSA enforcement team specifically intertwine. Indeed, both these entities focus on gathering knowledge and evidence about VLOPs and VLOSEs to support the enforcement of the DSA, to the point that ECAT features an “Algorithm inspections & DSA enforcement” team (European Centre for Algorithmic Transparency, n.d.-b)3, whose work overlaps even nominally with that carried out at DG Connect. In particular, the profile of a technology specialist in the enforcement team at DG Connect is, if not similar, at least complementary to that of a researcher/inspector at ECAT. In fact, technology specialists “will work in seamless cooperation with the European Centre for Algorithmic Transparency (ECAT) and facilitate interactions with technical teams at very large online platforms and search engines” (European Commission, 2024f). The relevance of distinguishing between researchers at ECAT and technology specialists in the DSA enforcement team lies in the undefined boundaries of the presumption of confidentiality regarding DSA enforcement (European Ombudsman, 2024). In fact, although ECAT is a governmental research centre, no publication on the DSA, systemic risks or algorithmic system inspections can be found in its publicly accessible repository (European Commission, n.d.-a). This underscores, again, an undeclared confidentiality policy that does not align with ECAT’s aim of sharing “knowledge and facilitation of discussions on algorithmic transparency with international stakeholders” (European Centre for Algorithmic Transparency, n.d.-a). If the research on the DSA produced by ECAT is considered confidential, then it would fall within the scope of DSA enforcement according to the presumption of confidentiality advanced by the Commission and challenged by the Ombudsman. If this were the case, however, there would be no distinction between the role of an ECAT researcher working on algorithmic inspections and that of a data scientist or technology specialist in the DSA enforcement team; therefore, it is not clear why these two overlapping roles would be included in two different directorates-general, i.e., the JRC and Connect.
Between November and December 2024, VLOPs and VLOSEs published the systemic risk assessment reports describing the measures they have undertaken to comply with the DSA requirements on the identification and mitigation of systemic risks (Articles 34 and 35), which lie at the core of the proceedings initiated by the Commission. However, the amount of work needed to verify the information provided by platforms cannot be carried out by the Commission alone, which will in fact rely on external contractors to produce the evidence needed to enforce the DSA systemic risk framework. In June and July 2024, respectively, two calls for tenders were published by DG Connect to recruit external contractors to support the gathering of evidence for DSA enforcement regarding VLOPs and VLOSEs. The first call (European Commission, 2024a), with an estimated total value of EUR 600,000 for a maximum duration of four months, involves carrying out a study that “should inform with scientific and technical and other relevant insights the Commission’s work on the first edition of a report on systemic risks and their mitigation, as referred to in Article 35(2) DSA”. The second call (European Commission, 2024b), with an estimated total value of EUR 12 million for a maximum duration of three years, is divided into three lots: the first is aimed at developing “(real-time) monitoring and “early warning” systems and tools to inform and alert the European Commission about technological developments related to platform architectures, functions and features; economic and market trends; the emergence of systemic risks (including linked to the protection of minors, i.e., dangerous challenges on social media); as well as the emergence of potential digital threats”; the second is devoted to “Evidence gathering and compliance monitoring of prominent and recurring risks in Member States and specific areas”; the third concerns “Evidence gathering and compliance monitoring of online marketplaces and compliance with online advertising provisions”.

6. Conclusions

The application of the DSA is proceeding at a fast pace: between the end of 2023 and the end of 2024, most social media VLOPs were subject to investigations under the DSA, and four pornographic and two fashion retail platforms were designated as VLOPs. While the DSA has applied to every online platform since 17 February 2024, the main questions about the modalities and timeline of its application and the actions that platforms will need to take to ensure compliance are still open. In particular, it should be clarified which regulatory mechanisms and investigative evidence underpin the progression from an RFI to the initiation of proceedings against a platform. Considering that there is no legal deadline for the end of the proceedings, neither platforms nor users can estimate their duration. Users are the main beneficiaries of the protection granted by the DSA, but if they are not involved in shaping its enforcement, the new provisions may not have tangible beneficial outcomes. The evidence on which DSA enforcement relies risks being crafted by auditing companies’ self-interested interpretations of the regulatory requirements, while wider stakeholder involvement does not appear to be under way. This contribution shows the direction of the first enforcement procedure under the DSA, thereby highlighting the type of evidence that European regulators see as symptomatic of non-compliance. The outcome of the ongoing proceedings against VLOPs will set a milestone for future DSA enforcement strategies and their implications for users’ online experience. The DSA has the potential to change the interaction between digital companies and EU citizens by enhancing public accountability and user empowerment. However, for this potential to be realised, regulatory principles need to be translated into clear and transparent enforcement practices, which should be publicly disclosed to European researchers and citizens.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The documentary evidence used for this study is available through the links to the sources cited in the references.

Conflicts of Interest

The author declares no conflict of interest.

Notes

1. European Commission (n.d.-b) provides an outline of the pyramidal governance framework established by the DSA for intermediary services, hosting services and online platforms: https://commission.europa.eu/strategy-and-policy/priorities-2019-2024/europe-fit-digital-age/digital-services-act_en (accessed on 18 December 2023).
2. Regulatory guidelines are emerging to direct compliance in certain areas, such as the Code of Conduct on Disinformation, which was approved on 13 February 2025 (European Commission, 2025c). When its commitments are aligned with related DSA articles, the Code “represents a further operational and complementary layer, including reporting obligations, on top of existing legal obligations” (European Commission, 2025a) and therefore constitutes a compliance benchmark for its signatories, including VLOPs. However, the adoption of the Code is voluntary, and not following it does not prejudice the evaluation of compliance.
3. Notably, the webpage corresponding to this reference, which detailed ECAT’s teams (Research and Algorithms & Inspections) with the names of their respective members, has recently been removed or hidden, pointing to a voluntary reduction in transparency about the organisational structure of this research centre.

References

  1. European Centre for Algorithmic Transparency. (n.d.-a). About ECAT. Available online: https://algorithmic-transparency.ec.europa.eu/about_en (accessed on 10 May 2024).
  2. European Centre for Algorithmic Transparency. (n.d.-b). Who we are. Available online: https://algorithmic-transparency.ec.europa.eu/team_en (accessed on 10 May 2024).
  3. European Commission. (n.d.-a). HUMAINT publications. Available online: https://ai-watch.ec.europa.eu/humaint/publications_en (accessed on 20 February 2025).
  4. European Commission. (n.d.-b). The Digital Services Act—Ensuring a safe and accountable online environment. Available online: https://commission.europa.eu/strategy-and-policy/priorities-2019-2024/europe-fit-digital-age/digital-services-act_en (accessed on 12 December 2024).
  5. European Commission. (2023a). Commission decision initiating proceedings pursuant to Article 66(1) of Regulation (EU) 2022/2065. Available online: https://ec.europa.eu/newsroom/dae/redirection/document/101292 (accessed on 12 December 2024).
  6. European Commission. (2023b). Commission delegated regulation supplementing regulation (EU) 2022/2065 of the European Parliament and of the Council, by laying down rules on the performance of audits for very large online platforms and very large online search engines (DRPA). Available online: https://ec.europa.eu/newsroom/dae/redirection/document/99436 (accessed on 12 December 2024).
  7. European Commission. (2023c). Commission opens formal proceedings against X under the Digital Services Act. Available online: https://ec.europa.eu/commission/presscorner/detail/en/ip_23_6709 (accessed on 12 December 2024).
  8. European Commission. (2023d). The commission sends request for information to X under the Digital Services Act. Available online: https://ec.europa.eu/commission/presscorner/detail/en/IP_23_4953 (accessed on 12 December 2024).
  9. European Commission. (2024a). Call for tenders EC-CNECT/2024/OP/0032—Digital Services Act—Study on systemic risks and their mitigation. Available online: https://ec.europa.eu/info/funding-tenders/opportunities/portal/screen/opportunities/tender-details/c7f44485-4fac-457a-95ca-4a1671ac7b8d-CN?isExactMatch=true&order=DESC&pageNumber=1&pageSize=50&sortBy=startDate (accessed on 12 December 2024).
  10. European Commission. (2024b). Call for tenders EC-CNECT/2024/OP/0052—Digital Services Act: Technical assistance for market intelligence, evidence gathering and compliance monitoring. Available online: https://ec.europa.eu/info/funding-tenders/opportunities/portal/screen/opportunities/tender-details/efd2992a-c0a2-498c-83ca-a1729a3863ff-CN (accessed on 12 December 2024).
  11. European Commission. (2024c). Commission sends preliminary findings to X for breach of the Digital Services Act. Available online: https://ec.europa.eu/commission/presscorner/detail/en/ip_24_3761 (accessed on 12 December 2024).
  12. European Commission. (2024d). Digital Services Act starts applying to all online platforms in the EU. Available online: https://ec.europa.eu/commission/presscorner/detail/en/ip_24_881 (accessed on 12 December 2024).
  13. European Commission. (2024e). Do you want to help enforce the Digital Services Act? Apply now to be part of the DSA enforcement team! Available online: https://digital-strategy.ec.europa.eu/en/news/do-you-want-help-enforce-digital-services-act-apply-now-be-part-dsa-enforcement-team (accessed on 12 December 2024).
  14. European Commission. (2024f). Job opportunities within the Digital Services Act enforcement team. Available online: https://eu-careers.europa.eu/sites/default/files/eu_vacancies/2024-01/2024_2nd%20call_Job_opportunities%20within%20the%20DSA%20Enforcement%20Team%20final%20with%20privacy%20statement%20Final.pdf (accessed on 12 December 2024).
  15. European Commission. (2025a). Commission opinion of 13.2.2025 on the assessment of the code of practice on disinformation within the meaning of Article 45 of Regulation 2022/2065. Available online: https://ec.europa.eu/newsroom/dae/redirection/document/112679 (accessed on 20 February 2025).
  16. European Commission. (2025b). Supervision of the designated very large online platforms and search engines under DSA. Available online: https://digital-strategy.ec.europa.eu/en/policies/list-designated-vlops-and-vloses#ecl-inpage-lqfbha7w (accessed on 20 February 2025).
  17. European Commission. (2025c). The code of conduct on disinformation. Available online: https://digital-strategy.ec.europa.eu/en/library/code-conduct-disinformation (accessed on 20 February 2025).
  18. European Ombudsman. (2024). The European Ombudsman’s preliminary views on the European Commission’s refusal to give public access to the risk assessment report of a large social media company on its compliance with the provisions of the Digital Services Act. Available online: https://www.ombudsman.europa.eu/en/doc/correspondence/en/199731 (accessed on 20 February 2025).
  19. European Parliament and Council. (2022). Regulation (EU) 2022/2065 of the European Parliament and of the Council of 19 October 2022 on a single market for digital services and amending Directive 2000/31/EC (Digital Services Act). Official Journal of the European Union. Available online: https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:32022R2065 (accessed on 12 December 2024).
  20. Field, M. B., & Roberts, S. (2020). Trust, integrity and the weaponising of information: The EU’s transparency paradox. Journal of Contemporary European Research, 16(3).
  21. Helberger, N., Pierson, J., & Poell, T. (2018). Governing online platforms: From contested to cooperative responsibility. The Information Society, 34(1), 1–14.
  22. Liesenfeld, A. (2024). The legal significance of independent research based on Article 40 DSA for the management of systemic risks in the Digital Services Act. European Journal of Risk Regulation, 1–13.
  23. Terzis, P., Veale, M., & Gaumann, N. (2024, June 3–6). Law and the emerging political economy of algorithmic audits. The 2024 ACM Conference on Fairness, Accountability, and Transparency (pp. 1255–1267), Rio de Janeiro, Brazil.
  24. X Help Center. (n.d.). About community notes on X. Available online: https://help.twitter.com/en/using-x/community-notes (accessed on 12 December 2024).
  25. Zalnieriute, M. (2021). “Transparency washing” in the digital age: A corporate agenda of procedural fetishism. Critical Analysis of Law, 8(1), 139.
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
