By Oguzhan Yesiltuna, 10 February 2026
Introduction
As a regulation, the AI Act has been made available in all official languages of the EU, as required by the founding treaties. Given the AI Act’s extraterritorial reach and the EU’s ambition for global leadership in digital governance, translating this landmark regulation into the languages of third countries is also essential for its international dissemination and influence. However, terms from the technology field, where English is the lingua franca, have hardened into legal concepts, which may complicate the search for equivalents in languages other than English. This terminological tension is evident in the case of the “deployer” – alongside the “provider”, a pivotal functional actor under the AI Act – for which a precise equivalent in other languages is hard to pin down. In a recent paper, I explored this challenge within the Turkish context. Although Turkey has yet to enact comprehensive national legislation, its status as an EU candidate country and a Customs Union member has already signalled a commitment to mirroring the AI Act’s framework in its domestic policy, which will in turn require a Turkish equivalent for the term “deployer”. This blogpost presents the key takeaways of my paper to an international audience.
Journey of the Concept in the Legislative Process
While absent from the Commission’s proposal, the term “deployer” emerged during the ordinary legislative procedure – specifically through the Parliament’s amendments – as a more precise replacement for the broader concept of the “user”. During the work of the Parliamentary Committees, certain proposals sought to bring specific uses of the AI system (e.g. during personal activities or under the authority of a deployer) or specific users (e.g. educational and training institutions) within the scope of the definition, while others aimed to retain the Commission’s proposed definition as it stood and merely substitute “deployer” for “user”. Some proposals also sought to introduce new definitions not found in the initial proposal, such as the end user, end recipient, AI subject, and affected person. The opinion of the Committee on the Environment, Public Health and Food Safety, for instance, states that “many of the applications mentioned in the proposed AI Act will involve not just users but end recipients. In the case of healthcare applications this distinction is crucial as there is a clear differentiation between the intended use and capabilities of patients and doctors. Therefore, the draft report now includes a new definition of end recipients and grants them the appropriate degree of transparency and provision of specific information.”
The following table presents an overview of the institutional stances before the trilogues.
| Commission Proposal (“user”) | European Parliament (“deployer”) | Council General Approach (“user”) |
| --- | --- | --- |
| ‘user’ means any natural or legal person, public authority, agency or other body using an AI system under its authority, except where the AI system is used in the course of a personal non-professional activity | ‘deployer’ means any natural or legal person, public authority, agency or other body using an AI system under its authority, except where the AI system is used in the course of a personal non-professional activity | ‘user’ means any natural or legal person, including a public authority, agency or other body, under whose authority the system is used |
The legislative journey ended with the AI Act introducing a new actor into the EU acquis – a term that had previously appeared only in the Payment Services Directive, and even there only in a non-binding recital. “Deployer”, as defined under Article 3(4) AI Act, refers to “a natural or legal person, public authority, agency or other body using an AI system under its authority except where the AI system is used in the course of a personal non-professional activity.” By integrating the AI system into specific operational environments, deployers occupy a pivotal role in identifying context-specific risks that may have eluded providers during development. Consequently, while providers retain primary responsibility for the system’s compliance, Article 26 AI Act imposes specific obligations on deployers to manage the risks associated with real-world applications. Identifying the deployer is therefore crucial in determining the responsibilities arising from the use of a system, including in digital health applications.
Justification for the Concept
One justification in the legislative records for replacing “user” with “deployer” articulates the rationale for this conceptual shift as follows: “Deployer is the term that is used in the AI community. ‘User’ would at the same time lead to legal overlaps and contradictions with other laws such as the GDPR.” This justification is accurate but incomplete. The GDPR does not define the term “user”; in fact, its provisions do not even mention it – the term appears only in the recitals, three times in total. Nevertheless, the adoption of “deployer” appears intended to ensure cross-regulatory coherence. By avoiding “user,” the legislator mitigates the risk of conceptual conflation with existing definitions established under the New Legislative Framework, to which the AI Act frequently refers – including, but not limited to, the “end user” of the Market Surveillance Regulation, the “consumer” of the General Product Safety Regulation, the “user” of the MDR and IVDR, and the “professional user” of the Machinery Regulation.
How Do Different EU Languages Render the Term “Deployer”?
The EU has 24 official languages, all of which are considered “equally authentic” for legislation. To ensure legal certainty, regulations must be made available in every official language so that citizens and practitioners can understand their rights and duties. While legislative acts are often drafted in English or French and then translated, these versions are not, legally speaking, “translations” but “equivalent versions” of the same act. The term “translation” implies a derivative relationship to a primary source; the “equal authenticity” principle rejects such a hierarchy, granting every linguistic version an original legal status. This equivalence is accordingly recognized as a vital “legal fiction”, essential for the uniform application of law within a multilingual Union.
The AI Act was published in the Official Journal of the EU on 12 July 2024 in all 24 official languages. When the different language versions of the AI Act are compared, they reveal varied approaches to integrating the new “deployer” term. The Italian version borrows the English word “deployer” directly, as no native Italian term exists, while the French uses déployeur, a direct adaptation of the English concept. Some versions prefer functional, descriptive expressions over a single word: Spanish uses responsable del despliegue (responsible for deployment), Portuguese responsável pela implantação (responsible for implementation), and Croatian subjekt koji uvodi sustav (the subject that introduces the system). The German version opts for Betreiber (operator), a common term in German technical law, while using Akteur for the AI Act’s broader category of “operator”. Although the Dutch initially used the term exploitant, the final version prefers gebruiksverantwoordelijke, which mirrors the structure of the GDPR’s term for the data controller (verwerkingsverantwoordelijke).
Turning back to my paper, I argue that these varying approaches can serve as examples when seeking an equivalent for “deployer” in Turkish, where different terms such as kullanıcı, dağıtıcı, and konuşlandırıcı have been used in the first legislative proposal and in scholarly texts. I suggest that uygulayıcı, uygulama sorumlusu (mirroring the Spanish and Portuguese approach) or kullanım sorumlusu (mirroring the Dutch approach) would be a good fit, as these terms better emphasise the responsibility arising from the use of AI systems and ensure consistency with other product safety laws.
Conclusion
The emergence of the “deployer” as a new actor under the AI Act reflects a sophisticated attempt to align legal language with the technical realities of the AI community while ensuring cross-regulatory coherence. By moving away from the “user” concept, the EU has established a functional definition that pinpoints the responsibility for identifying context-specific risks during real-world applications. As the “equal authenticity” principle demonstrates, the challenge of translating such techno-legal concepts is a complex exercise in legal fiction. The diverse approaches taken by Member States – ranging from direct borrowing in Italian to the functional descriptive structures found in Spanish and Dutch – provide a valuable roadmap for non-EU jurisdictions. For countries like Turkey, navigating the tension between technical English and domestic legal terminology remains essential for alignment.
Credit: Cover Image created with Canva