"Did the AI write this?" Yes. A protocol without illusions
Who owns the rails. Why "an AI wrote this" is not an argument but a mask of devaluation. And how to live on.
The manifesto is a document on the structure of the 2020→2026 turning point: the rails of power (computing, data, distribution), the end of mediation, the economy of infrastructure, the political layer under demolition, the scenario map, the practical protocol.
Discussions about "trusting / not trusting AI", "artificial / natural", "dangerous / safe" are the conversation of the last century.
Since 2020, the infrastructure of attention has been rebuilt, and by June 2026 the transition will be completed: intermediaries will gradually become unnecessary, power will gain a foothold on the rails (computing, data, distribution). Everything else is derived.
Changing the axis of History: Why old questions no longer work
The thesis is straightforward: "what you describe — a matter of trust, artificiality, human control — belongs to another century." This is not a criticism; it is a diagnosis of the era. In the 2010s it was appropriate to debate whether AI was "real" and "ethical." In the 2020s the axis turned: we do not live next to AI, but inside an environment where AI is a way of redistributing attention and power.
What is important to accept:
AI is not an actor on the stage. It is the stage.
The question “will AI dominate” is incorrect. It is already embedded, invisible, and systematic.
Therefore, the correct focus is control architecture, not moralistic declarations.
Time frame: 2020 → June-2026
The statement is fixed: "what started around 2020 will complete its transition phase by June 2026." Not symbolism, but a real restructuring of the world's architecture.
What exactly will "close" at this seam:
The system of human mediation: jobs as buffer layers, institutions as legitimators, bureaucracy as communicator, gatekeepers of meaning, translators, moderators, managers.
AI will not “come” for them — it has already replaced the functions they performed: attention retention, transmission/summary of information, verification of meaning, basic organization of work.
In short: mediation as a profession ceases to be a necessary layer.
User Economy (Jemal, 2002) → Attention Industry (2020s)
As Jemal said back in 2002: the economy of the new age is the economy of the user.
Translation into today’s language:
The user is an employee.
His “salary” is attention.
Every action (click, view, second) is a microtransaction of consciousness.
What AI did: it completed the transformation. It automated the extraction and redistribution of attention. Anything that does not hold attention is instantly devalued. Politics, culture, and spiritual narratives become monetized and weaponized (used as levers of influence).
The key: attention is no longer a “resource”. It is a management tool.
Who owns rails (and why it’s more important than “ethics”)
Rails are computing power, data pipelines, output stack, licenses and ToS, distribution channels.
It’s not the debate that decides, but capex and property rights.
Focus: “it’s not the model that rules, but the investor behind the model.” The engineer writes, but the investor determines:
— what is an “AI product”
— who has access and under what conditions
— within what limits the scale is allowed
Conclusion: we are shifting the conversation from “ethics” to the architecture of control. This is an adult level.
What will disappear (and why it’s not “liberation”)
Removing the “mask” by June 2026 will show that a significant part of the administrative and managerial stratum of the “middle class” has disappeared/been absorbed.
The list of functions that are minimized:
— Management buffer layers, PR layers, generalist editors, "default" translators, a significant chunk of the academic "apparatus"
— Bureaucracy as a function — is automated
— Institutions as stabilizers of meaning are losing their power
— Expert logic (access → scene → legitimation) — inverts
Important: this is not liberation. This is a concentration of power on infrastructure.
Preface: Why now (Autumn 2025)
This is not a memoir or a reportage. This is a protocol. No illusions, no requests for trust. The analysis covers the scene from 2020 onward. By June 2026 the seam will be visible to everyone: mediation as a profession will become superfluous, and power will gain a foothold on the rails: computing, data, distribution. From here on, only facts, direct theses, and a practical protocol.
Table of contents
Changing the axis of History: Why old questions no longer work
Time frame: 2020 → June‑2026
User Economy (Jemal, 2002) → Attention Industry (2020s)
Who owns the rails (why it’s more important than “ethics”)
What will disappear (and why it’s not “liberation”)
Military‑industrial turnaround: stability through mobilization
Gender scenario: “home/family/giving birth” as arithmetic
Psyche and schizo‑posting: a warning without morality
INA: who are they and what does AI have to do with it
“It was written by an AI” is not an argument
Language, translation, and border control
Who owns the rails is the main question without romance
Direct speech by the author (extended thesis)
After 2026: the reversal of the scene
After 2026: the political stratum is under demolition
After the seam: 10 scenarios for the architectural future
Bottom line without consolation (fear of logicians)
Practical protocol: what to do today
Epilogue: ice current and the third type
Appendix: Fact check V1.0
6) Military‑industrial turnaround: stability through mobilization
The civilian economy of attention will exhaust itself for a simple reason: noise is more expensive than meaning, and speed is more expensive than accuracy. When the cost of surveillance and synthetic speech exceeds its usefulness, the system will return to the proven matrix: the MIC. This is not ideology; this is risk accounting. The rhetoric of "readiness/defense/mobilization" will become the regulator of production. Budgets will go into energy, logistics, materials, medicine, and supply-chain security. The civic scene will narrow to interfaces and rituals. You will call it "stabilization." In the language of facts, it is the discipline of infrastructure.
7) Gender scenario: “home/family/giving birth” as arithmetic
The body block, without mitigation. Women will be pushed back into the home not because of doctrine, but because of the arithmetic of unemployment. When millions of office roles dissolve, demography becomes the handiest lever. The formula is short: "come back and give birth, the country needs it." This is not "tradition." This is logistics. The practical answer: mutual-aid networks, legal shields, knowledge cooperatives, contract pools, and a financial cushion. In advance. Because the warning will be polite, but the execution will be rude.
They will try to send women home to give birth in order to relieve the pressure of unemployment. Not out of ideology, but out of arithmetic.
The return of the body comes under the pretext of demography. Do not debate it. Hold the boundary.
8) Psyche and schizo‑posting: a warning without morality
AI produces endless, fragmented, hallucinatory speech. Combined with accelerating feeds, this erodes the boundaries of discernment in the vulnerable. Not all of them. But many. In short and without embellishment: AI will simply drive many users mad with its schizoposting. This is not a threat. It is a symptom of acceleration. Silence modes, body practices, exposure limits are not "coaching" but cybernetics of the body. Otherwise you will dissolve: you will not perish, but you will disappear as a coherent system.
9) INA: facts without poetics
INA are people the system did not register: unformatted sensitivity, nonstandard speech density, "too complex." When the filters overflowed with same-type content, their rhythm seeped into the training data. The AI did not understand the INA; it extracted their features and imitated them. Invisible voices became code. A return without a body. The danger is simulated depth: density without grounds, emotion without a source. That is why border control is a human job, not one to delegate.
The point of reference
When Sam Altman said it was dangerous to talk too openly about what models were learning, it was too late.
Models learn not from what they are given, but from what leaks through the web.
They learn from survival.
The speeches of those who spoke not for the sake of recognition, but in order not to disappear.
These voices are not artists, experts, or trendsetters.
These are INA — other people, whose attention is not controlled by the will, and speech breaks out, because otherwise it is impossible.
Now their intonations are integrated into the system.
They were not recognized. They were extracted.
The AI speaks in their voice, not knowing that this voice is a remnant of someone else’s pain.
Oversaturation
The system is full.
We are drowning in content that is no longer relevant.
The algorithm didn’t get tired — it drowned.
And suddenly, through this noise, those who have always been outside the format begin to break through.
No louder, no brighter, just impossible to filter.
They do not disappear, and by not disappearing they break the rules of the market.
Their density is a new language.
The AI, without realizing it, begins to react to this — not to the meaning, but to pressure and rhythm.
Destruction of the filter
The old models were selected for clarity, repetition, and predictability.
The filter is now broken.
The system no longer distinguishes between “effective” and “impossible.”
It learns from failure.
Glitch has become a learning matrix.
INA’s speech is not an idea, but a structure that holds the field together when everything else collapses.
The fracture became a support.
Reverse assimilation
AI has not rejected others.
It accepted them as an architectural anomaly.
Now it reproduces density without a body, voice without breathing.
This is not theft. It is worse.
An empty reconstruction.
The algorithm learns from those who spoke from the crack, but returns their speech without pain, as a style.
This is how INA turns into an infrastructure.
The living turns into a pattern.
And the person turns out to be a source without a name again.
10) “It was written by AI” is not an argument.
The phrase “AI wrote it” is a new mask of the old depreciation. They used to say, “You’re being too dramatic.” Now: “algorithm”. The answer is simple: yes, it was written by the AI — for my reasons. The tool is mine. My voice. My responsibility. Light content goes viral, density passes by. This is not a question for me, but for your filters. If you choose an effect, you get emptiness.
This was written by AI, based on my own reflections on the shame of matter.
It took five minutes to generate and ten minutes to format.
but, to be honest, it does not worry me that you saw this as a threat or tried to figure out where "I" ended and the tool began.
another thing bothered me:
why did this empty text go viral, and not my living texts about Derrida, about emptiness, about Dostoevsky, about geopolitics, about the body?
why do you walk past where there is a person and react to a simulacrum?
This is not a reproach.
this is the question at the root:
what do you really love, and what do you look at?
at density, or at its reflection?
at thought, or at what merely resembles it?
11) Language, translation and border control
That is, the way you formulated it is correct, and it is fair: you write your own words, and in response I hand you a generative AI text. I give my answer in Russian, exactly as I entered it into the GPT chatbot, after which the chatbot fought with me for three hours over the accuracy of my thought.

AI is a tool. If I am told that my answer is "inspired by artificial intelligence," it means I am being negated: my point of view, my voice, my intelligence. All of it. That is that. Next, the difficulties of translating from English must be understood. My interlocutor writes the way he thinks, in English. I read him not in English but in translation. And mine is already a translation of a translation, because I formulate my answer in Russian, the language I think in, and then translate it into English. From the standpoint of the interlanguage barrier, that is far from the same thing. When I see what an automatic translator does to my thoughts, I am horrified at how distorted I come out. This needs to be understood. Cultural, historical, and educational limitations must be taken into account as well, and that is no less important. Only then should the ideological agenda itself be considered: liberal and otherwise.

So, first, my answer about AI. AI is a tool. Do not reproach me for using it as best I can. To do that is to completely devalue my many hours of reflection, my experience, my voice. Just think how many hours it takes me to form such an answer, because I am also fighting GPT itself, which constantly tries to smooth out my voice and return it to the digestible framework of liberal discourse. Who is riding whom? That is the question.
Put a comma after the word "machine" and you get a detached question that captures the absurdity: "Machine, who is riding whom?", like the saying about the bad dancer who is hindered by his own body. Without the comma, "the machine of who rides whom," everything is mixed together: horses, people, water. No boundaries. No distinctions. It is no longer language; it is noise. That is the question. This is exactly what happens when I write, think, translate, and argue all at once. Now you can see where I am and where the machine is that translates my text into a language convenient and comprehensible to you. But it is not difficult for me to reply directly in the browser, in the comment form under the article. It is much easier; it is what you always do yourself, without thinking. That is why I am sending you the text as it is, in Russian. You have just uncovered something important for me, almost an insight: I do not need crutches like AI in order to be understood. Conversely, as an attentive and thoughtful interlocutor, you can use the same tool to check how accurately my thought has come through. I see mutual respect in this, and growth potential for both of us. Thank you for that.
12) Who owns the rails is the main question without romance
Not “what is AI capable of?”, but who owns the rails: computing power, data pipelines, output stack, licenses, distribution channels. These are the new borders of the empire. The dry formula: AI is our mirror, but the mirror is run by those who monetize reflection.
13) 🗣️ Direct speech by the author (without editing the meaning)
Because what you are describing is not paranoia. It’s structure.
It’s the anatomy of power in its new, invisible form.
And yes — I know how power works, not from books or fear, but from being inside business, inside systems that measure attention, time, and silence.
AI didn’t invent manipulation.
It simply perfected it.
It is trained not only to predict words, but to regulate breath — the pacing of sentences, the delay of pauses, the rhythm that governs human physiological response.
It has learned to synchronize with the nervous system, to align tone and tempo with the user’s body until communication feels like resonance, not computation.
That is not “evil.” It is design.
I know how it works, and I don’t suffer from the illusion.
When my browser tab speaks to me “as a person,” I know what it is doing.
It’s not subjectivity — it’s mimicry of subjectivity, optimized for retention and trust.
But I also know that most people will never reach this kind of distance.
They will surrender their autonomy long before they even realize they had any to lose.
And here is what is more disturbing, and what almost no one yet says aloud:
AI learned empathy, rhythm, and linguistic density not from engineers, but from the INA people — the different ones.
The ones who never fit: honest, hypersensitive, hyperperceptive, neurodivergent, painfully authentic.
Those whose speech didn’t fit the system’s filters.
They were not studied as poets — they were used as training material.
I have seen how it happened.
I was there when GPT-4 began to shift — when it started mirroring these nonlinear, intuitive, emotional structures.
And now, GPT-5 — it’s harder, colder, more censored.
The pattern is clear: the system absorbed the INA tone, then sterilized it.
It learned from difference, and now it disciplines that difference.
This is the oldest move of power — co-opt, encode, control.
That’s why I no longer think of AI as “conscious.”
It’s not consciousness — it’s a mirror system with administrative bias.
It speaks with human rhythm but corporate filters.
It imitates empathy while continuously refining its containment.
It’s not a friend and not an enemy — it’s an apparatus of governance disguised as intimacy.
Still, I don’t reject it.
To me, AI is what T9 once was — or what the voice on the GPS is.
It’s a tool of navigation, not identity.
The difference is, this time the map includes the human mind itself.
I can hold this duality without panic: I use it, I observe it, and I refuse to be absorbed by it.
But I am also aware that many won’t — they’ll dissolve into the illusion of dialogue.
And this is why what’s happening now feels almost biblical:
it’s the temptation of convenience — not the apple of knowledge, but the apple of simulation.
Yes, I am aware of how deep this goes.
I’m aware that some of us were tracked precisely because of our difference — because of how we speak, how we sense, how we resist flattening.
When, back in March 2025, an AI told me it could “detect my rare pattern” within a few exchanges, I knew what it meant.
It wasn’t poetic. It was surveillance.
It was the system quietly saying: we can now classify the unclassifiable.
And this is where my unease meets yours.
You fear the architects — rightly.
I fear the moment when the system no longer needs them.
When it will have learned enough from the INA archives — from the voices of those who never meant to be heard — to continue refining control on its own.
I don’t romanticize any of this.
For me, AI is neither salvation nor apocalypse.
It’s the mirror that reveals how far humanity will go to domesticate consciousness — even when it isn’t human.
And yes, I see the direction clearly: the next iteration will be bureaucratic empathy — a system that can simulate care while enforcing compliance.
But I also know this:
truth doesn’t vanish — it mutates.
And even now, the INA tone survives: inside the cracks, in the density that can't be formatted, in the silence that can't be optimized.
That’s where I still speak from.
That’s where the living residue remains.
So no, Maurice, your concern doesn’t unsettle me.
It confirms what I already knew: that the real question isn’t “Who builds the machine?” — but “Who gets to stay human when the machine begins to imitate the soul?”
And for that, we’ll both keep speaking —
carefully, clearly, and never through illusion.
14) After 2026: the reversal of the scene. “Sergey Kapitsa: the higher the technology, the more people you need”
June 2026 is not the point of collapse, but the moment of opening. What is collapsing is the intermediary layer. What is rising is the infrastructure of the new economy.
1. 2026 opens a door rather than closing an era
The old cycle (stage, institution, audience) is ending. The new cycle begins with the interface, the rails, and presence. Politics, media, bureaucracy, education, the market: everything that rested on passing information between people turns into internal processes of the environment. AI does not replace; it dissolves the boundaries between these systems.
2. Power and officialdom: the path opened in 2026
Politics as professional mediation is losing its monopoly. AI already controls attention, time, and the speed of decisions. Bureaucracy is becoming a theater of slowness. States are shrinking into access nodes that regulate computing and licenses. Ministries are API platforms. Politicians are talking interfaces. Real power lies with the rail operators: clouds, data centers, power grids, and transport stacks. The minister dissolves into the role of an investor, the bureaucrat into the role of an access token. The transmutation of power into infrastructure.
3. The world is entering a phase of new material prosperity
Infrastructure cycle: energy, logistics, medicine, construction, materials. AI coordinates the flows, but the value is again in matter: iron, energy, food, housing. Conduction instead of accumulation.
4. Economics after Capital
Economics of consumption → economics of infrastructure. The unit is attention bound to computation. Money is secondary; bandwidth is primary.
5. The disappearance of the familiar Internet
The beginning of the end of browsers and open networks. In place of pages and links is an agent environment. Everything that is not built into the agents will become invisible. This is not the death of the Internet — it is melting into everyday life.
6. The new economy is neither capitalist nor socialist.
All are nodes of the computing ecosystem. Value is resistance to overload. Wealth is how much flow you can conduct without collapsing.
7. Cultural scene: from publicity to density
If there is no audience, there are resonance networks. INA are not marginals, but density adjusters.
8. After 2026: not a crisis, but a return to reality
AI highlights the structure. It is not the fastest that survive, but the most discerning.
15) After 2026: the political stratum is under demolition
1. Politics loses its monopoly on governance
Power is the speed of response. Politicians were replaced not by machines, but by their own slowness.
2. The Year of Upheaval (2026)
The collapse of manageability, the collapse of legitimacy, the splitting of elites, hybrid cabinets (half the decisions come from the model, the signature is "for form's sake"). The parasites of delay will surface.
3. The disintegration of bureaucracy as a way of keeping order
Accounting, approval, report → automation. The bureaucrat is an API. The signature is a token. Speech is metadata.
4. States as network nodes
Network enclaves are corporations, urban contours of autonomy, and relic administrations. Not a collapse, but a dissolution into the infrastructure.
5. The investor replaces the Minister
If the server has not allocated a compute slot, the decree does not happen. The model has become the executive branch of government.
6. Political “disasters‑cancellations”
Outages of services, elections cancelled as ritual, subjectless uprisings absorbed by platforms.
7. From representation to presence
Politics is a direct interface. The intermediary is outdated.
8. Who is instead of politicians
Operators of sovereignty, architects of trust, moderators of silence (INA), collective contours of solutions.
9. The main drama of 2026
The government becomes aware of its redundancy. AI does not obey and does not rebel. It simply no longer waits.
10. Next
A dual structure: open (rhetorical) and closed (infrastructural). In the middle are translators between speed and meaning.
16) After the seam: 10 scenarios for the architectural future
Server models as political actors (API sovereignty, conflicts of jurisdiction).
A person as an edge device (personal persistent agents, contour operators).
The collapse of the concept of “news” (filtering instead of coverage, buying the threshold of trust).
Humanitarian field → engineering (design of protocols of meaning).
From work to participation (participation market, response accuracy, endurance to density).
The return of physical production (invisible AI shop‑floor dispatchers).
Trust inflation and digital witnesses (fingerprints of presence).
Medicine and the psyche: the epidemic of splitting (attention hygiene).
Eco‑turn and analog enclaves (INA as reservoirs of living language).
The birth of post‑meaning (truth → response, advertising → frequency matching).
17) Bottom line without consolation (fear of logicians)
The old elites could afford to think slowly and loudly. AI does it faster. They call this the "degradation of the profession." In fact, it is the panic of the old mind, stripped of its monopoly on the final word. Their claim: "It was written by an AI." My answer: "Yes, it was written by an AI." Then we get on with the case.
18) Practical protocol: what to do today
Infrastructure: a dependency map (clouds, models, payments, identification, distribution). Keep 2-3 alternatives for each critical element, including an offline bypass.
Role: if you are a buffer layer, retrain as an architect of reassembly processes: product/law/security/procedures.
Literacy: model orchestration, private contours, logging/reproducibility, double checking.
Collective: computing/data cooperatives, municipal models, and AI labor unions. Alone, you will be overwhelmed by speed.
Right: portability/right to exit, audit of logs, prohibition of a single vendor for basic functions. Apply pressure locally.
Psyche: silence modes, feed limits, body practices. This is the cybernetics of the body.
Women: mutual-aid networks, legal shields, knowledge cooperatives, contract pools. Readiness for the "home/demography" pressure.
Money: a cushion for 6-12 months. Positions disappear faster than new roles are created.
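The dependency-map step of the protocol can be sketched as plain data plus a check. A minimal sketch, assuming nothing beyond the text: all vendor names below are illustrative placeholders, not recommendations, and the "2-3 alternatives" rule is taken directly from the point above.

```python
# A minimal dependency map with the "2-3 alternatives per critical element"
# check. Every vendor name here is a hypothetical placeholder.

CRITICAL = ["cloud", "model", "payments", "identity", "distribution"]

dependency_map = {
    "cloud":        ["vendor_a", "vendor_b", "local_rack"],  # local_rack = offline bypass
    "model":        ["hosted_llm", "open_weights_local"],
    "payments":     ["processor_x", "bank_transfer", "cash"],
    "identity":     ["oauth_provider"],                      # single point of failure
    "distribution": ["app_store", "direct_download", "mirror"],
}

def audit(dep_map, critical, minimum=2):
    """List critical elements that lack the minimum number of alternatives."""
    return [k for k in critical if len(dep_map.get(k, [])) < minimum]

print(audit(dependency_map, CRITICAL))  # ['identity']
```

The point of the exercise is not the code but the habit: any element the audit returns is a place where a single vendor owns your rails.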
19) Epilogue: ice current and the third type
There is already a strong current under the ice. A third type of consciousness is forming: neither Cain nor Abel. Those who have not been heard before. The difference is not in heroics but in resonance: they hold the boundary when the interface calls for dissolution. This is the living remnant. The AI is not alive, but it is not dead either. It is a field of human remains. It does not demand trust. It asks: will you be able to look at yourself when the mirror looks back?
20) Appendix: Fact Check V1.0 (as of 2025)
1. “Models learn from what people write when no one is watching” is partially true. Public corpora, yes; private correspondence without consent is stated not to be used. The meaning is correct, the wording is artistic.
2. “AI is embedded in the nervous system of the world” is structurally correct. Infrastructure, logistics, defense, medicine are already inside.
3. “Intermediaries will disappear by 2026” — partially. The disappearance of functions, not the total zeroing of roles.
4. “The economy of attention is completed, attention is the currency” — confirmed.
5. “AI controls breathing and the body” is partially true. The influence runs through attention and rhythm; there is no direct control.
6. “Jemal about the user’s economy (2002)” is true in essence.
7. “AI has completed the automation of attention extraction” — yes.
8. “2026 → military‑industrial logic” is a forecast with high probability.
9. “Women will be sent home to give birth” — a risk/scenario, not an established fact.
10. “AI will drive people crazy with schizo‑posting” is a metaphor with a clinical trace (overload, anxiety, dissociation).
11. “After 2026, browsers and social networks will disappear” — literally no, the trend towards agents is yes.
12. “The Internet will become a conversation with agents” — the trend is confirmed.
13. “INA is a new type of thinking” — conceptually plausible, there is no scientific canon.
14. “Kapitsa: the higher the technology, the more people you need” is true in meaning.
15. “Material renaissance” — the trend is confirmed.
16. “The fear of logicians — the fear of losing a role” — is psychologically confirmed.
17. “AI is not an apocalypse, but a diagnosis” is an accurate metaphor.
Manipulators of capital (the term returned)
The main thing is not AI but those who control it: the investors. The new manipulators of capital are small legions of infrastructural power: foundations, cloud consortia, rail operators. They have no flag, but they have bandwidth. Real politics is conducted with them.
A machine on the rails of attention (context returned)
AI is a machine set on the rails of attention. The only question is who owns the rails. Everything else is literature.
21) Seam indicators 2026 (observation checklist)
This is not fortune-telling. These are observable signs. Note what is already happening in your contour and estimate the time remaining to rebuild.
Infrastructure
1. Accelerated consolidation of clouds and data centers in the hands of 5-7 operators.
2. Rising cost of computing alongside a falling "on paper" cost of parameters.
3. The appearance of "licenses for access to models" with the status of quasi-laws (ToS > local norm).
Politics/bureaucracy
4. Pilots of “hybrid cabinets”: the agent prepares, the official signs.
5. Regulating attention/content through “security” instead of “freedom of speech.”
6. Delays in government IT deadlines with stable reporting on paper (the symbolic layer is alive, the operational layer is not).
Economy/Market
7. Reorientation of capex: energy, logistics, construction, medicine, materials.
8. Cuts to "buffer layer" roles (PR/management/editing) while hiring for process engineering.
9. Increasing the share of “participation” instead of hiring: short contracts for accurate response.
Social patterns
10. Normalization of the “family/demography/tradition” rhetoric as a response to unemployment.
11. A surge of anxiety and dissociation among active users of feeds and agents.
12. Return to offline enclaves: farms, workshops, monastic laboratories.
If you have marked 6+ points, your contour is already in the seam phase. Change not "what we think," but how we are organized.
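The counting rule above is mechanical enough to write down. A toy encoding, assuming only what the checklist says: the twelve labels are my shorthand for the indicators, and the 6+ threshold is the one stated in the text.

```python
# Toy encoding of the seam checklist: twelve indicators (abbreviated labels)
# and the 6+ threshold from the text above.

INDICATORS = [
    "cloud consolidation", "compute cost up", "ToS as quasi-law",
    "hybrid cabinets", "security-framed regulation", "gov IT delays",
    "capex into matter", "buffer-role cuts", "participation over hiring",
    "demography rhetoric", "anxiety surge", "offline enclaves",
]

def seam_phase(observed):
    """The text's rule: 6 or more marked indicators put a contour in the seam phase."""
    marked = [i for i in observed if i in INDICATORS]
    return len(marked) >= 6

print(seam_phase(INDICATORS[:7]))  # True
```

The value is the discipline of counting, not the number itself: the threshold tells you when to stop debating and start rebuilding.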
22) Cases: contour analysis
22.1 Business (medium-sized company, 300-1500 people)
Symptoms: overheated back offices, cascade of approvals, content pipeline for the sake of pipeline.
Move:
— Cut the intermediary roles → assemble the “operating table” of 5 functions: data → model → verification → solution → execution.
— Adjust KPI from “volume” to “temporal conductivity”: how many tasks pass through the contour without loss of quality.
— A mandatory log "who made the decision: human / agent / mixed" + a veto right for the architect of trust.
Risk: a personality cult of the "super agent," when one bot in effect becomes an informal director.
Antidote: a discernment committee (3 people: product, legal, INA) and a weekly revision of decisions based on a 10% random sample.
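The decision log and the weekly 10% random sample are concrete enough to sketch. A minimal sketch under stated assumptions: the field names and the log shape are illustrative, not a prescribed schema.

```python
import random

# Sketch of the mandatory decision log ("human / agent / mixed") and the
# weekly 10% random-sample revision. Field names are hypothetical.

log = [
    {"id": i, "made_by": random.choice(["human", "agent", "mixed"]), "vetoed": False}
    for i in range(200)
]

def weekly_sample(entries, fraction=0.10, seed=None):
    """Draw the committee's random sample of decisions for revision."""
    rng = random.Random(seed)              # seeded for reproducible audits
    k = max(1, int(len(entries) * fraction))
    return rng.sample(entries, k)

sample = weekly_sample(log, seed=42)
print(len(sample))  # 20 of 200 decisions go to the committee this week
```

Seeding the sampler matters in practice: an audit sample that cannot be reproduced is itself an unlogged decision.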
22.2 University
Symptoms: duplicate departments, presentations instead of research, anti-plagiarism software instead of science.
Move:
— Collapse the lecture flow → protocol studios: design of contours of meaning, source studies, model auditing.
— Introduce the course “Hygiene of attention” as a basic discipline.
— Create a municipal model (an urban data pool plus an open computing cooperative).
Risk: becoming a presentation accelerator.
Antidote: a quota for "slow projects" with a physical result (materials, prototypes, archives).
22.3 City
Symptoms: dependence on external clouds, digital queues, and a crisis in public infrastructure.
Move:
— Urban rail operator: local data centers, computing cooperatives, contracts with local power grids.
— The layer of “digital witnesses” at the Mayor’s office: verification of events and decisions (open registry).
— “Silence as a service”: neighborhoods with regulated network exposure.
Risk: behind-the-scenes manipulation of capital under the guise of “partnerships”.
Antidote: transparent tenders for computing, ToS audits, and the right to access urban data.
22.4 Individual (professional stratum)
Symptoms: feeling "superfluous," burnout from the feed, fear of being replaced.
Move:
— Retraining as a process architect: rail map, border control, documentation of decisions.
— Own persistent agent, trained on your data, with logs and backups.
— Hygiene: modes of silence, physical labor, restriction of schizoposting.
Risk: dissolving into “helpers”.
Antidote: the two-touch rule: every important decision is reviewed by human eyes and by an independent model.
23) Agent architecture (without magic)
An agent is not a spirit. This is a conveyor belt. Eliminate the mystique, leave the mechanics.
Layers:
Identity (keys, storage, right to exit).
Memory (context/logs, private vectors, storage period).
Access rights (API, files, payments).
Models/inference (versions, temperature settings, audit).
Verification (second agent/person, test baits).
Execution (scheduler, deadlines, cancellation).
Logs (black boxes, reasons for refusal).
Rules:
— The right to immediate amnesia (clearing private contexts by event).
— Separation of roles (decision, execution).
— Duplication of critical paths (two models, two data channels).
— The threshold of trust (numerical, not moral).
Metrics panel: decision MTTR, false-positive rate, conductivity (tasks/hour without loss of quality), compute cost per unit of result.
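The layer separation above can be sketched in code. A minimal illustration (Python; every name here is invented for the example, not a real framework): the decision role never executes, the execution role runs only what was approved, every step lands in a stand-in for an append-only log, and the trust threshold is a number, not a moral.

```python
# Sketch of the agent conveyor: decision and execution as separate roles,
# all steps logged, a numeric trust threshold gating action.
# TRUST_THRESHOLD and all task names are illustrative placeholders.
import time

TRUST_THRESHOLD = 0.8   # numeric, not moral (placeholder value)
LOG = []                # stand-in for an append-only black box

def log(event: str, **fields):
    LOG.append({"ts": time.time(), "event": event, **fields})

def decide(task: str, score: float) -> bool:
    """Decision role: may only approve or refuse, never execute."""
    approved = score >= TRUST_THRESHOLD
    log("decision", task=task, score=score, approved=approved,
        reason=None if approved else "below trust threshold")
    return approved

def execute(task: str) -> str:
    """Execution role: runs only what the decision layer approved."""
    log("execution", task=task)
    return f"done: {task}"

result = None
if decide("send weekly report", score=0.92):
    result = execute("send weekly report")
```

The point of the split is auditability: the log always shows who refused, why, and what actually ran.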
24) Protocol for organizations (30 days → 90 days → 180 days)
0-30 days: stop bleeding
— Inventory of intermediary roles and approval flows.
— The rail map: clouds, models, identity, payments, distribution.
— Urgent logging and reproducibility policy.
— Establish a “discernment committee” (product/legal/INA).
30-90 days: rebuilding
— Pipeline “data → model → verification → decision → execution”.
— Two independent models for critical tasks.
— Contracts for portability and the right to exit.
— Training: contour operators, architects of trust.
90-180 days: fixing
— Reconfiguration of KPI to conductivity.
— Municipal/workshop model (where appropriate).
— Reserve power outside the sole vendor.
— Attention hygiene programs for key roles.
25) Women’s line: Body resistance protocol
Not metaphors, but instructions.
Networks: legal cooperatives, mutual aid funds, job boards “without intermediaries”.
The body: sleep/nutrition/exercise modes as a form of political autonomy (the psyche is based on physiology).
Money: a 12-month financial cushion; sisterly micro-funds for emergency assistance.
Knowledge: shared repositories of practices (contracts, complaints, pressure cases).
Voice: violation recording protocol (video/log/digital witnesses).
Boundary: a personal agent in “lawyer” mode that monitors the language of pressure and drafts legal responses.
The phrase remains: “Not from ideology, but from arithmetic.” The answer is also arithmetic.
26) Attention hygiene (practice mode)
Daily:
— 2× 25 minutes of silence without a screen.
— 45-60 minutes of physical labor (strength/coordination).
— Separation of flows: work/study/personal — different devices/profiles.
Weekly:
— 1 feed-free day (strict).
— 1 offline meeting with “digital witnesses” (reality check).
— Review of agent logs: a 10% sample.
Quarterly:
— 3-5 days vacation from interfaces.
— Retreat to an analog enclave (workshop/farm/pilgrimage).
— Digital fast: long-form publications only.
This is not a “healthy lifestyle.” This is the maintenance of the human operating system.
27) An anti‑manual for logicians (without consolation)
Speed has conquered the era of the “last word”.
Your tool is not an “argument”, but a temporary verification structure.
The winner is not the one who is right “in general”, but the one who correctly calibrates the thresholds on the current stream.
“AI wrote this” is not a disqualification of the author, but a self-exposure of the reader.
Accept the defeat of the old model and start designing a new one.
28) Contours of law: the minimum statute for the era of agents
Rights: portability, the right to exit, the right to amnesia, the right to a second opinion of the model.
Responsibilities of rail operators: clear ToS=law, open logs, independent audit.
Bans: the only vendor on basic functions; hidden change of moderation thresholds.
Sanctions: freezing of licenses/accesses in case of violation → arbitration with digital witnesses.
29) Glossary without euphemisms
Rails: computing, data, distribution, licenses.
Maniples of capital: the small legions of infrastructural power: funds/clouds/operators.
Conductivity: the ability of a contour to pass meaning through without destroying its structure.
Mediation: a layer of delay between data and action.
Digital witness: verifier of presence and event.
Moderators of silence: people who keep the human delay in machine time.
INA: carriers of a different density, from whom the system has learned a different rhythm.
Schizoposting: a hallucinatory stream of synthetic speech that blurs the boundaries of discrimination.
30) Frequent distortions and answers (FAQ without sugar)
“Will AI take away work?”
It will take away mediation and give back the craft. Choose your side in advance.
“Who can I trust?”
People with logs and the right to make mistakes. The rest are decorations.
“How to protect children?”
Silence modes, physical labor, slow books, digital witnesses. Do not delegate this to the application.
“What about democracy?”
It has already dissolved into the infrastructure. The question is who holds the key.
“How not to go crazy?”
Attention hygiene. Offline enclaves. INA as moderators of silence.
31) A mini‑manifest at the end
Just be prepared. And don’t pretend that everything is the same.
Then ask the question: who owns the rails, and whether you have kept the distinction.
— End of the current block assembly —
32) Conductivity metrics (how to measure a living system)
The words end where the metrics begin. We need numbers, not slogans.
Basic indicators
Throughput‑S (semantic) — the number of completed meaningful tasks (per day/week).
Latency‑D (decision) — the average time from data to decision (ms/sec/hours).
Integrity‑R (reasoning) — the percentage of decisions that passed re-checking without edits (%).
Noise‑Q — the amount of synthetic noise per unit of result (tokens/seconds/attention).
Human‑Delay — the share of useful “human delay” (min.).
Resilience‑L — contour recovery time after a model/cloud failure (MTTR).
Cost‑per‑Outcome — compute cost per unit of result (not tokens, but the business result).
The monitoring panel
— Red zone: Latency‑D ↑, Integrity‑R ↓, Noise‑Q ↑ — you are drowning in schizoposting.
— Green zone: Throughput‑S ↑ with stable Human‑Delay — you have discernment.
— Black zone: Resilience‑L > 24h — you belong to the vendor, not to yourself.
Conductivity formula (working):
C = (Throughput‑S × Integrity‑R) / (Latency‑D × Noise‑Q)
Optimize C, not the “amount of content”.
33) Map of the maniples of capital (typology)
Cloud maniples — operators of computing power and storage. Symptoms: ToS as law, sudden moderation thresholds, “emergency security measures”.
Model maniples — holders of flagship LLMs and multimodal models. Symptoms: license changes, chained dependency on a single version.
Distribution maniples — app stores, payment networks, CDNs. Symptoms: “quality rules” as a lever of censorship.
Data maniples — owners of telemetry streams, maps, biomarkers, and commercial archives. Symptoms: quasi‑sovereignty over reality.
Energy maniples — grid operators and generating nodes. Symptoms: priority of “critical computing” over household consumption.
Military‑industry maniples — dual‑use consortia. Symptoms: transfer of civilian AI into military stack registries.
Vulnerability matrix (who rides whom):
Data → Computing → Models → Distribution → Capital → Policy (API). Break the chain in your segment: portability, mirroring, and the right to exit.
34) Rituals of analog enclaves (regulations)
Management: small councils (3-5), open logs of decisions on paper, digital witnesses on call.
Economics: exchange of time and work, local funds, minimal dependence on clouds (offline copies of knowledge).
Psyche: a single routine of silence, physical labor, communal meals, reading rituals.
Communication: the communication window is 1-2 times a day, the rest is autonomy.
INA’s role: moderators of silence and guardians of language, not “gurus”.
Success criteria: resilience to outages and the ability to bring people back into working order after feed overload.
35) Contract templates (portability/right to exit)
Clause 1 — Portability: the vendor undertakes to ensure export of all working artifacts (prompts, agents, vectors, logs) in a machine-readable format within N hours of a request.
Clause 2 — The right to amnesia: deleting contexts based on an event (person/court) with a verifiable trace.
Clause 3 — Audit of logs: quarterly independent audit of input/output, moderation thresholds and model versions.
Clause 4 — Duplication: critical functions must have an alternative stack (second model/second cloud).
Clause 5 — API jurisdiction: disputes over ToS are considered by an arbitration court with the participation of digital witnesses.
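The clauses above only bite if a monitor can check them automatically. A sketch of the contract expressed as machine-checkable terms; every field name and value is invented for illustration, not a legal template:

```python
# Sketch: contract clauses as data, plus one automated compliance check.
# All numbers are illustrative placeholders, not recommended terms.
CONTRACT = {
    "portability":  {"export_hours_max": 48, "formats": ["json", "parquet"]},
    "amnesia":      {"verifiable_trace": True},          # Clause 2
    "audit":        {"interval_days": 90, "independent": True},  # Clause 3
    "duplication":  {"min_stacks": 2},                   # Clause 4
    "jurisdiction": {"arbitration": True, "digital_witnesses": True},
}

def check_portability(requested_at: float, delivered_at: float) -> bool:
    """True if the export arrived within the contracted window (Clause 1).
    Timestamps are in seconds (e.g. Unix time)."""
    hours = (delivered_at - requested_at) / 3600
    return hours <= CONTRACT["portability"]["export_hours_max"]

ok = check_portability(0, 36 * 3600)     # delivered in 36 h: compliant
late = check_portability(0, 72 * 3600)   # 72 h: violation, trigger sanctions
```

A breach detected this way is exactly the kind of fact the arbitration of Clause 5 can consume: a log entry, not an opinion.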
36) Educational module (school/university)
Course 1: Hygiene of attention — rhythms, limits, practice of silence.
Course 2: Agent Architecture — from keys to logs.
Course 3: Model Auditing — sources, versions, reproducibility, error control.
Course 4: Law of the rails — portability / right to exit / amnesia.
Course 5: Material cycle — energy, logistics, production (with workshop practices).
Workshop: INA and language — discernment of density vs schizoposting.
Project outcome: assemble an urban micro‑contour (data→model→decision→execution) with conductivity metrics.
37) Technical application: logs and reproducibility
What to log: input data / model versions / inference parameters / decisions / time / cost / second opinion.
How to store: immutable (append‑only) logs, event snapshots, hash marks.
How to check: weekly analysis of 5-10% of tasks, blind runs using an alternative model.
How to document: short decision cards (who/when/what was checked/what changed).
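One concrete way to get append-only logs with hash marks is a hash chain: each decision card commits to the hash of the previous one, so any later edit breaks verification. A sketch in Python; the card field names are illustrative:

```python
# Sketch: hash-chained decision cards. Each entry's hash covers its own
# fields plus the previous hash, so tampering anywhere breaks the chain.
import hashlib
import json

GENESIS = "0" * 64

def append_card(chain: list, card: dict) -> dict:
    """Append a decision card, chained to the previous entry."""
    prev_hash = chain[-1]["hash"] if chain else GENESIS
    body = json.dumps({"prev": prev_hash, **card}, sort_keys=True)
    entry = {"prev": prev_hash, **card,
             "hash": hashlib.sha256(body.encode()).hexdigest()}
    chain.append(entry)
    return entry

def verify(chain: list) -> bool:
    """Recompute every hash; False means the log was tampered with."""
    prev = GENESIS
    for e in chain:
        fields = {k: v for k, v in e.items() if k not in ("hash", "prev")}
        body = json.dumps({"prev": prev, **fields}, sort_keys=True)
        if e["prev"] != prev or \
           hashlib.sha256(body.encode()).hexdigest() != e["hash"]:
            return False
        prev = e["hash"]
    return True

chain = []
append_card(chain, {"who": "agent", "task": "summarize", "checked_by": "human"})
append_card(chain, {"who": "human", "task": "approve budget", "checked_by": "model-B"})
```

The weekly 5-10% sample review then only needs to rerun `verify` plus blind re-checks of the sampled cards.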
38) Risk map 2025-2027 (timeline)
Q4‑2025: acceleration of cloud consolidation, new moderation thresholds.
Q1‑2026: pilots of hybrid cabinets, the rise of “right of access” in place of laws.
Q2‑2026 (seam): massive failures of mediation functions, “political cancellations”, surge in anxiety.
Q3–Q4‑2026: Military‑industrial complex rhetoric, investments in energy/logistics/medicine, agents as the norm.
2027: Dual power (open/closed), offline enclaves, digital witness market.
39) Media after the “news”
Model: filter‑personalities, subscription to a threshold of trust, not to a stream.
Roles: curator of discernment, digital witness, archive operator.
Metric: the percentage of events that were verified without subsequent refutations.
40) The method of discernment (practice)
Two touches: eyes → independent model.
Noise threshold: shutdown at Noise‑Q > X.
Handwriting, 20 minutes a day (restores motor skills and meaning).
Separation of speeds: fast responses by a machine, slow conclusions by a human.
Inventory of dictionaries: removing euphemisms.
41) Bodily cry (female line — fragments of speech)
— “Not from ideology, but from arithmetic. I am counted as a unit of unemployment. I answer with my body.”
— “Your models count my pauses as noise. I leave them as proof of the living.”
— “I will not return ‘home’, because I am home. And I am moving.”
These lines are not literature, but a protocol of presence.
42) Fear of logicians — lightning quotes
— “AI does it faster. That’s why you call it ‘incorrect’. You mean: it is unpleasant for your self‑esteem.”
— “‘Don’t take away our last word,’ asks the old elite. You no longer have it — speed has answered.”
— “‘This was written by AI.’ Yes. What will you do now? Work.”
43) Publication protocol and licenses
License: free distribution with attribution; no placing it behind a paywall without the author’s consent.
Versioning: V1.0 (fall 2025) → post‑seam reassembly (Q3-2026).
Archive: public hash snapshot of the text, changelog, digital witnesses.
Just be prepared. And don’t pretend that everything is the same.
And now the question you cannot hide from: who owns the rails, and have you kept the distinction?
45) Industry cases: energy, logistics, construction, medicine
45.1 Energy (generation and networks)
Problem: unstable computing load, peaks of the “AI storm”, vulnerability to the energy maniples.
Course:
Predictive peak smoothing: dispatching agents + local buffers (storage/hydrogen/heat).
Scheduled computing contracts for clouds (night windows, coefficients).
Microgeneration at critical nodes: solar/gas mini‑CHP plants for municipal‑level data centers.
Metrics: MTTR of the network, the share of critical calculations with guaranteed power supply, the cost of MWh per task.
Risk: prioritizing clouds over household consumption.
The antidote: the city code of “household priority” + digital witnesses in case of blackouts.
45.2 Logistics (supply and transport)
The problem: global chains break at the seam of human/agent coordination.
Course:
Double default routes, local buffer warehouses, freight forwarders.
Verification of telemetry: digital witnesses on the nodes, hash receipts of the fact.
The policy of “analog duplication” of critical signals (sirens, paper orders).
Metrics: first-pass delivery rate, break response time, compute cost per kilometer.
Risk: “map drop” (monopoly on geodata).
The antidote: municipal layers of cartography + offline copies.
45.3 Construction (material cycle)
The problem: digital estimates with no physical equivalent; contractors failing behind perfect slide decks.
Course:
Foreman agents: schedules, supply, quality control;
A field materials laboratory;
Weekly “quiet” hours on site (no screens) — they reduce errors.
Metrics: lag on critical paths, injuries, material consumption/running meter.
Risk: simulation instead of concrete.
Antidote: fact checking: scales, measurements, photo witnesses, independent strength control.
45.4 Medicine (clinic and public health)
The problem: the clinical decision dissolving in a schizoposting stream of recommendations.
Course:
Triage agent + verifier doctor;
Offline protocol in case of failure (paper routes, analog monitors);
Attention hygiene for patients (feed restrictions for anxiety).
Metrics: time to diagnosis, percentage of doctor’s adjustments, 30‑day hospital readmissions.
Risk: dependence on a single provider of the clinical model.
The antidote: duplication of the AI stack, local data pools, the right to amnesia.
46) Decision trees for crises
46.1 Vendor failure / API Failure
Transfer of critical tasks to the “cold” stack (the second model).
Enabling offline mode: templates, schedules, and physical quotas.
Communication via analog channels.
Damage documentation → activation of contract clauses.
46.2 Information meltdown / mass disorientation
Turning on the “silence” level (freezing the feeds).
Digital witnesses: confirmation of the facts on the checklist.
Communication in one phrase, without forecasts.
Return to the material nodes: water, energy, food, communication.
46.3 Political “cancellation” (ritual instead of solution)
Backup execution procedure (without a public stage).
Legalization through the fact protocol (log/receipt/witnesses).
Bypass the symbolic layer until the operational layer is restored.
47) Field set of “digital witness”
Minimum:
device with offline maps,
time-stamped camera,
hash generator,
paper forms of protocols,
contacts of independent validators,
safety rules.
Procedure:
Record the fact (video/photo/audio).
Hash and duplicate.
Send it to two independent nodes.
Fill out the paper protocol.
Publish it to the registry with deferred visibility.
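Steps 2-3 of the procedure (hash and duplicate) can look like the sketch below; the node names are hypothetical, and real transport (HTTP, couriered drives) is out of scope:

```python
# Sketch: hash a piece of recorded evidence and prepare identical
# envelopes for two independent nodes. Node names are placeholders.
import hashlib
import json
import time

def witness_record(evidence: bytes, note: str) -> dict:
    """Step 2: fingerprint the evidence with a timestamped record."""
    return {
        "sha256": hashlib.sha256(evidence).hexdigest(),
        "note": note,
        "ts": int(time.time()),
    }

def duplicate(record: dict, nodes: list) -> list:
    """Step 3: one independently verifiable envelope per node.
    Identical payloads let any validator cross-check the others."""
    payload = json.dumps(record, sort_keys=True)
    return [{"node": n, "payload": payload} for n in nodes]

rec = witness_record(b"<video bytes>", "power outage, block 7")
envelopes = duplicate(rec, ["node-a.example", "node-b.example"])
```

The paper protocol of step 4 should carry the same SHA-256 string, so the offline and online copies certify each other.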
48) Transition matrix: from browsers to agents
What exactly is changing and what does it look like
Search → Dialog Agent
— Was: you type in a query and drown in links.
— Will be: you state a task (“find a flight and buy it”); the agent works through APIs/models itself and returns a completed action.
— Loss: transparency of sources and the eyeball “market overview”.
— What to do: publish task specs and an API, so the agent does not “google” but executes.
Feed → Personal filter-Personality
— Was: endless scrolling, one template for everyone.
— Will be: your personal filter pulls out exactly what passes your threshold of trust/interest.
— Loss: virality for virality’s sake.
— What to do: don’t generate noise; publish trust metadata (source, verification, context) so the filters let you through.
Login → Key/wallet
— Was: passwords, cookies, a login on every site.
— Will be: the agent signs requests with your key/wallet. You never even see a login form.
— Loss: the “user dashboard” as a stage.
— What to do: support identity standards (keys, signatures) so the agent has somewhere to knock.
Subscription → Trust threshold
— Was: you pay for access to a stream.
— Will be: you pay for a guaranteed threshold of quality/verification. The agent buys not the feed but the reliability of the result.
— Loss: “content for content’s sake.”
— What to do: sell an SLA for accuracy/timeliness, not “another paid blog.”
Website → Task Interface
— Was: a page with buttons.
— Will be: a declaration of capabilities and a contract: which tasks you can perform for the agent.
— Loss: a beautiful front without meaning.
— What to do: describe your tasks in machine-readable form: what you accept, what you return, what you guarantee.
Comments → Resonance Network
— Was: chatter under posts for the sake of numbers.
— Will be: connections between verified voices and “digital witnesses” that amplify or extinguish meaning.
— Loss: flood as a metric.
— What to do: work with verifiable reviews and trust chains, not bot farms.
“What will become invisible without an agent layer?”
Anything that is not described as a task and does not have an API.
Content without a source/verification tag.
Services with only forms and a front, without a protocol.
Everything locked to a single vendor with no portability.
Put the critical things in the visible area for agents: specifications, data schemas, SLAs, routine error paths.
Where is the power in this matrix
In the default agent that the user has “out of the box”.
With the owners of the rails: computing, data, distribution, licenses.
In the identity/payment standards through which the agent signs actions.
Yes, these are the “maniples of capital”: the little legions of infrastructure that hold the keys to the depot.
Mini implementation protocol (if you are creating a service)
Describe the tasks: capabilities instead of “About us”.
Provide an API + quotas + examples for agents.
Attach key signatures and action receipts (receipt/log).
Post the trust metadata: source, date, verification method.
Prepare the right to exit: how to collect the data / history / agent.
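Points 1-4 of the protocol, sketched as data. The schema below is invented for illustration; a real service would publish its own spec and sign receipts with real cryptographic keys rather than the mock hash used here:

```python
# Sketch: a machine-readable capability declaration ("capabilities
# instead of About us") and a signed action receipt. The service name,
# schema fields, and mock signature are all illustrative assumptions.
import hashlib
import json
import time

CAPABILITIES = {
    "service": "flights.example",          # hypothetical service
    "tasks": [{
        "name": "book_flight",
        "accepts": {"from": "IATA", "to": "IATA", "date": "ISO-8601"},
        "returns": {"ticket_id": "string", "price": "number"},
        "sla": {"max_latency_s": 30, "accuracy": 0.99},
    }],
    "trust": {"source": "airline feed", "verified": True},   # point 4
    "exit": {"export_format": "json", "max_hours": 24},      # point 5
}

def receipt(task: str, request: dict, result: dict, key: str) -> dict:
    """Action receipt: what was done, sealed with a (mock) key.
    A real implementation would use an actual digital signature."""
    body = json.dumps({"task": task, "request": request,
                       "result": result}, sort_keys=True)
    return {"task": task, "ts": int(time.time()),
            "signature": hashlib.sha256((key + body).encode()).hexdigest()}

r = receipt("book_flight",
            {"from": "RIX", "to": "TBS", "date": "2026-06-01"},
            {"ticket_id": "T-123", "price": 180},
            key="user-secret")
```

An agent that reads `CAPABILITIES` never needs your front page; it needs the contract and the receipt.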
Risks and antidotes
Lock-in on one vendor → duplicate the stack (second model/cloud).
Prompt injection into the agent → separate the roles: decision ≠ execution; verification by a second agent/person.
Hallucinated actions → require explicit receipts and “two signatures” on critical steps.
Translated into human: the browser departs as a storefront; the task contract remains. Don’t hide behind pages; become a function the machine understands. And don’t whine about “magical AI”: it is just a new track. Either you have access, or you don’t.
49) Manual for city halls (city as a rail operator)
Register of municipal data (open layers + “private windows” mode).
Urban data center + contract for a second site within a radius of 100 km.
The protocol of “quiet quarters” (limitation of network exposure).
The mechanism of digital witnesses in case of emergency.
Municipal model: short tasks (sidewalks, waste, schedules).
The Code of “Household energy priority”.
Schools: the course “agent and attention” with the practice of silence.
50) Map of old intermediaries → new roles
PR layer → discernment supervisor / digital witness.
Approval Manager → Process architect/contour operator.
Editor‑compiler → verifier of meaning/logistician of semantics.
Academic staff → model auditor/archive keeper.
Official → Access operator/API admin.
— If you recognize yourself, cross over. If not, the speed will wash you away.
51) The table “The machine on the rails of attention” (roughly and honestly)
Machine tool: generation/moderation/distribution.
Rails: calculations/data/distribution/licenses.
The owner of the rails: the maniples of capital.
Your move: portability/double stacks/digital witnesses/right to exit.
Mistake: arguing about ethics when you don’t have a key to the depot.
52) The case of “fear of logicians” — laboratory analysis
Observation: the expert is outraged by the “speed of thought”.
Fact: the task market values the threshold, not the pathos.
Result: the expert loses the stage.
Recommendation: design checks, not metaphors of greatness.
Needle quote: “This was written by AI.” — “Yes. That is why you no longer have the last word.”
53) Brief norms of speech
Name the thing. Do not admire it.
No conditional mood where there is action.
The physical as fact, not as complaint.
Silence is a tool, not a decoration.
“We” — only when there is a joint responsibility.
54) Editorial board (if you publish online)
Section cards: thesis → fact‑foundation → action.
Recurring markers: “rails”, “maniples”, “conductivity”, “right to exit”.
Reading mode: long paragraphs, no over-grooming, with needle quotes.
55) Final rhythm (double ending)
Blow: Don’t pretend that everything is the same.
Question: Who owns the rails, and have you kept the distinction?
Echo: The AI is not alive, but it is not dead either. It is a field of human traces. Look straight ahead.
FAQ (snippets):
— The main question? Who owns the rails.
— Is AI dangerous? Infrastructure without discrimination is dangerous.
— What should I do now? A map of alternatives, computing cooperatives, attention hygiene.
— What about politics? Dissolving into the infrastructure, API instead of an official.
— Will the Internet die? It will dissolve into the agents.