

The inherent nature of systems
Why good intentions often wither in the grip of structure -- and what we can do about it
Most systems start with a sincere desire for improvement. A quality system should ensure equal treatment. A digital platform should free up time. A decision structure should provide predictability. But in practice, the opposite often happens: systems become rigid, self-reinforcing — and, in the worst case, they discourage human judgment, initiative, and development.
This is not a new insight. Many who have worked in health, education, and organizational development have warned against the same thing for decades. As early as the 1970s, the social psychologist Stein Bråthen wrote about the terms of dialogue in the computer society, showing how technological systems and structural pressures can displace the open, human conversation. Bråthen warned against "mechanical rationality" -- when computer systems and standards become governing at the expense of dialogue, ethics, and judgment.
More recently, Per Fugelli, professor of social medicine and public debater, was an important voice raising the same paradox: when systems grow, people shrink. He used strong words to describe it -- not to provoke, but to wake us up. To make us ask the question: who really benefits from the system as it currently works?
When Systems Become Obstacles
Research by Amy Edmondson (2019) on psychological safety shows that learning organizations rely on structures that support openness, experimentation, and error as a source of growth. But in many organizations we see the opposite: control mechanisms and procedural systems that punish deviations, and in which people do not dare to ask questions because "the system is like that".
A concrete example: NAV and algorithm-driven case management
In the NAV scandal (2019), it became apparent how systems logic can override human judgment. Employees who suspected something was amiss felt bound by the "regulations" -- despite the fact that the practice violated both the rule of law and EEA legislation. When a system is given the power to operate without real control by the people within it, it doesn't just become inefficient -- it becomes dangerous.
Empirical evidence: When systems undermine trust
A study from MIT Sloan (2022) shows that complex compliance systems in large organizations often lead to what is called performative adherence: people behave correctly on the surface but avoid real reflection or responsibility. The result is stagnation and a loss of meaning in the work.
This is what we call the inherent nature of the system — that is, the paradox that the system gradually develops a force that undermines its own purpose.
And now -- at a time of increasing use of AI
These issues are becoming even more urgent today, as artificial intelligence and algorithmic governance take an ever larger place in decision-making. AI systems promise efficiency, precision, and objectivity — but here, too, we see that they can quickly develop their own logic, detached from the human. When decisions are made on the basis of pattern recognition and prediction, it can come at the expense of context, discretion, and ethical judgment.
Stein Bråthen's warnings about technological rationality gain renewed relevance in the face of automated hiring processes, health assessments, and case reviews — where the human role is reduced to the last link in a chain of control, rather than an actively assessing subject.
It's easy to be dazzled by AI as the solution — but we have to ask: Who questions the system's assumptions? Who has access to understand how it works? And what happens to those who sound the alarm?
What does this mean in practice?
- A municipality is implementing a new digital record system, but healthcare professionals spend up to 40% of their time documenting rather than being with patients (SINTEF, 2023).
- A school is given requirements for extensive testing and reporting. Teachers spend more time on registration than on feedback. Students' learning suffers.
- An organization introduces a performance management system that in theory should support development, but in practice becomes a goal-driven straitjacket.
- An AI system in HR selects candidates based on historical data — and recreates old biases with new technology.
So, what can we do?
We need to ask three simple but powerful questions:
- Does the system still serve its original purpose?
- Is there room for discretion, doubt and dialogue in this system?
- What happens to those who question the workings of the system -- are they rewarded or marginalized?
When the answer to these questions is negative, it is time to lift our gaze and adjust course. Systems are not neutral. They have shaping power. And when they are shaped without critical reflection, they can develop what I deliberately call an inherent devilry -- a kind of structural blindness to what they are really for.
References
Bråthen, S. (1972). Models of man and society: Terms of dialogue in the computer society. Universitetsforlaget.
Edmondson, A. C. (2019). The fearless organization: Creating psychological safety in the workplace for learning, innovation, and growth. John Wiley & Sons.
Fugelli, P. (2010). Death, shall we dance? Universitetsforlaget.
NOU 2020:9. (2020). The blind zone: Examination of the misapplication of the EEA regulations in the Labour and Welfare Administration and the Social Insurance Court. Ministry of Labour and Social Affairs.
O'Neil, C. (2016). Weapons of math destruction: How big data increases inequality and threatens democracy. Crown.
Power, M. (1997). The audit society: Rituals of verification. Oxford University Press.
Raji, I. D., Bender, E. M., Paullada, A., Denton, E., & Hanna, A. (2020). Saving face: Investigating the ethical concerns of facial recognition auditing. Proceedings of the AAAI/ACM Conference on AI, Ethics, and Society, 145–151. https://doi.org/10.1145/3375627.3375824
Selbst, A. D., Boyd, D., Friedler, S. A., Venkatasubramanian, S., & Vertesi, J. (2019). Fairness and abstraction in sociotechnical systems. Proceedings of the Conference on Fairness, Accountability, and Transparency, 59–68. https://doi.org/10.1145/3287560.3287598
SINTEF. (2023). Time use and documentation requirements in municipal health and care services [Report]. SINTEF.
Vaughan, D. (1996). The Challenger launch decision: Risky technology, culture, and deviance at NASA. University of Chicago Press.




