When AI Arrives in Your Company: A Systemic Lens to Navigate a Change That Transforms Everything
- Jordi Vinadé Sais
- Dec 7, 2025
- 4 min read

Most organizations are beginning to explore how to incorporate Artificial Intelligence (AI) into their processes. There is enthusiasm, concern, curiosity, and also fear. And all of that is normal. AI is not just a new technology: it is a systemic change, because it impacts roles, identities, ways of working, and expectations about the future.
Many companies try to manage this moment as if it were a technological project. But the truth is that the most complex —and also the most interesting— part is not the technology, but how the human system (people, teams, relationships, decision-making, culture) integrates this new piece.
This article proposes a systemic lens to understand what happens inside an organization when AI enters the scene and how to support the change without forcing it or generating unnecessary resistance.
1. The Systemic Lens: AI Doesn’t Enter a Company, It Enters a System
When a company incorporates AI, it is not simply installing software.
It is introducing a new element into a complex system, and this alters:
- relationships,
- expectations,
- roles,
- decision flows,
- professional identities.
Any system —human, family, or organizational— tends to preserve the balance it knows. Therefore, change depends not only on what we implement, but on how this affects the meaning people attribute to their place, value, and contribution.
The key question is not only “What can AI do?”, but: “What will happen to the whole system when it comes in?”
2. Roles and Functions: What We Do… and What We Fear Losing
In an AI context, many technical, administrative, or analytical roles will change. But more relevant than the role is the function that the role fulfilled in the system.
For example:
A person who has always contributed value through technical expertise may feel that AI is replacing them.
Someone who was a reference point for problem-solving may lose their “symbolic place” if automation now does it faster.
Creative profiles may question whether their originality will still be distinctive.
Under the surface, the system is not protecting tasks: it is protecting identities, relationships, and symbolic places.
And this is one of the main drivers of resistance.
3. Emerging Functions: How the Organization Speaks in Response to AI
When a system experiences profound change, emerging functions appear: dynamics that no one plans but that express what the whole is trying to regulate.
In an AI adoption process, it is common to see:
The Questioner: “Is this safe? Can we trust it?”
The Informal Spokesperson of Discomfort: “They’re not telling us the whole truth.”
The Disconnected One: “This doesn’t affect me; I’ll keep doing what I’ve always done.”
The Activator: “We need to move faster or we’ll fall behind!”
The Humorist: “In the end, AI will take our jobs, haha!” (and not so haha).
These functions are not defects or problems. They are signals. The system is expressing:
“A need for safety, meaning, and a space to understand this new element.”
If you listen to the function instead of labeling the person, transformation accelerates.
4. Why Is There Resistance? Because AI Touches Deep Layers
When an organization incorporates AI, it does so to improve processes, optimize costs, or expand capacity. But resistance does not arise from that. It arises from what the system tries to preserve:
Loyalty to the past: “We’ve always done it this way.”
Identity protection: “If AI does it better, who am I now?”
Preservation of relationships: “Working with people is what gave meaning to my job.”
Fear of incompetence: “What if I’m not able to adapt?”
Fear of invisibility: “Will I be disposable with AI?”
AI implies technological change, but resistance implies existential change. A common mistake is trying to convince. A common success is trying to understand.
5. Microchanges to Integrate AI: Small Movements That Transform Systems
Digital transformation is not achieved with a big strategic plan: it is achieved with continuous microchanges.
Examples:
- Changing who leads the first pilot test: this redistributes functions and legitimizes profiles that previously had little visibility.
- Creating demonstration spaces without evaluation: where people can try AI without fear of being exposed.
- Introducing questions that open possibilities, such as:
“What part of your work would you like AI to take off your plate?”
“What part do you want to keep doing yourself, and why?”
- Valuing both technical knowledge and human judgment: so that the system understands that AI adds, not replaces.
A microchange is a small action that the system can tolerate without feeling threatened.And for that very reason, it generates real movement.
6. Communication and Double Binds: The Real Battlefield of Change
If an organization says:
“AI is an opportunity for everyone”… but internally cuts roles or automates indiscriminately, or says:
“You can share your opinion”… but doubts and personal rhythms are penalized,
then double binds are created: contradictory messages that block any evolution.
Communicative coherence is the foundation of trust. AI does not generate fear: inconsistency does.
Conclusion: The Technology May Be New, but the Challenge Is Old
AI is powerful. But an organization’s ability to adapt depends on much more human factors:
- how functions are distributed,
- how resistance is recognized,
- how emotions are legitimized,
- how uncertainty is supported,
- how the system’s attempts at preservation are heard.
AI can transform processes. But only people can transform the organization.
When we stop seeing AI as a threat and start seeing it as a new piece within a living system, change stops being a fear and becomes a possibility.