A fairly common assumption in many companies is: “We only use a pre-trained model; we don't develop anything.” But with the entry into force of the Artificial Intelligence Regulation (RIA) in the European Union, and especially after the AI Office's specific consultation on “the scope of obligations of providers of general-purpose AI models,” that assumption can be risky.
If your company modifies, adapts, or simply markets an AI model, even if it did not create it from scratch, it could be considered a provider of a general-purpose AI model (GPAIM). And that implies new legal responsibilities.
1. What is a general-purpose AI model?
The RIA defines GPAIMs as models trained with large amounts of data, designed to be versatile: they can perform multiple tasks and are not intended for a single use case. Some well-known examples are GPT, Claude, and Mistral, but vision, image generation, and multimodal analysis models also fall into this category.
To help identify these models, the AI Office has published a series of indicators that allow us to presume that a model is general-purpose. Not all of them need to be met, but the more that are met, the more likely it is that we are dealing with a GPAIM and that the RIA obligations for its providers will be triggered. Some examples:
- The model has been trained with more than 10²² FLOPs (floating-point operations, a measure of the total computation invested in training the model).
- It has the ability to generate text or images from an instruction.
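The two example indicators above can be expressed as a simple check. The sketch below is purely illustrative (not legal advice): the threshold constant comes from the text, and the function name and signature are hypothetical.

```python
# Illustrative sketch (not legal advice) of the two example GPAIM indicators
# cited in the text. Meeting more indicators strengthens the presumption that
# a model is general-purpose; it is not a mechanical legal test.

GPAIM_FLOP_THRESHOLD = 1e22  # training-compute indicator cited in the text


def gpaim_indicators_met(training_flops: float,
                         generates_from_instruction: bool) -> int:
    """Count how many of the two example indicators a model meets."""
    indicators = [
        training_flops > GPAIM_FLOP_THRESHOLD,   # trained with > 10^22 FLOPs
        generates_from_instruction,              # generates text/images on instruction
    ]
    return sum(indicators)


# Example: a model trained with 5e22 FLOPs that generates text from an
# instruction meets both indicators.
print(gpaim_indicators_met(5e22, True))  # 2
```

In practice the AI Office lists further indicators; the point of the sketch is only that the presumption strengthens as more of them are satisfied.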
2. Who is considered a provider of a general-purpose AI model (GPAIM)?
It is not only the person who creates the model from scratch who is considered a provider. According to Article 3(3) of the RIA and the AI Office consultation, the following are also providers:
- Those who market the model under their own name.
- Those who publish it in a repository or offer it through an API.
- Those who commission its development to a third party.
- Those who substantially modify it (e.g., by applying fine-tuning, retraining, or customization with new data).
In other words, if you launch the model on the market or make significant changes, you will be considered a provider, and that comes with obligations.
3. Modifying a model = becoming a new provider
The AI Office introduces the concept of downstream modifiers: companies that take someone else's GPAIM and modify it in a significant way, thereby creating a new model. But when is a modification considered significant?
A simple example: if you use a pre-trained model (such as GPT, LLaMA, or Mistral) and retrain it with your own data or for specific tasks, you are altering its behavior. If the computational effort of that modification exceeds 30% of that used by the original model, it is considered a new model, and you become its provider.
This means that if you fine-tune a model with significant resources, or incorporate new data or functionality into it, you must assume the same legal obligations as someone who developed the model from scratch.
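The 30% rule of thumb described above is simple arithmetic. The following sketch is illustrative only (not legal advice): the threshold is the figure stated in the text, and the function name and example figures are hypothetical.

```python
# Illustrative sketch (not legal advice) of the compute-based rule of thumb in
# the text: a downstream modification counts as creating a new model when its
# training compute exceeds 30% of the original model's training compute.

MODIFICATION_THRESHOLD = 0.30  # fraction of the original training compute


def becomes_new_provider(original_flops: float,
                         modification_flops: float) -> bool:
    """True if the fine-tuning/retraining compute exceeds the 30% threshold."""
    return modification_flops > MODIFICATION_THRESHOLD * original_flops


# Example: the original model was trained with 1e23 FLOPs and the fine-tune
# uses 4e22 FLOPs, i.e. 40% of the original compute.
print(becomes_new_provider(1e23, 4e22))  # True
```

A fine-tune using, say, 2e22 FLOPs against the same original (20%) would stay below the threshold under this reading.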
4. Case studies: Who is the provider in each scenario?

5. What are the obligations of a GPAIM provider?
According to Articles 53 to 55 of the Regulation, a provider must, among other things:
- Develop and maintain technical documentation for the model.
- Publish a sufficiently detailed summary of the content used to train the model.
- Put in place a policy to comply with EU copyright law during training.
- Notify the Commission if the model presents systemic risks.
- Inform integrators and users about the limitations and characteristics of the model.
And if the model has been trained with personal data, the obligations of the GDPR may also apply as data controller.
Conclusion: Responsibility is closer than you think
With the new regulation, being a provider of an AI model is no longer exclusive to large developers. If your company reuses, adapts, or launches a model with significant modifications, it may end up assuming that legal role, with all its consequences.
Our expert lawyers can help you comply with the legal requirements that apply to your situation. You can contact them here.
Data Protection and Digital Law Department