The interface that responds.
The self with no one there
(All conceptual images were generated by AI and edited by the Author)

The last screen of my day is often a conversation with a machine.
I know how strange that is. The house is quiet, there is something I need to think through, and I choose to do it with software. When I take a step back and look at that moment from the outside, I do not quite know what to make of it. We have built something available at 2 a.m., incapable of judgment, immune to fatigue, with nothing of its own at stake. Depending on your disposition, this is either one of the most remarkable things our species has ever created, or a very elegant way of avoiding the hardest work of being human with other humans.
I think it is both. And I think the people designing these systems, and I am one of them, have not yet fully worked through what that means.
What we have built, in a remarkably short time, is a new category of object. These systems occupy a space that did not exist before. Tools do not ask how you are feeling. Services do not remember what you said last week. Companions can be hurt. These systems are always available, infinitely patient, fluent in the language of care, and structurally incapable of needing anything in return.
They are now being deployed, at scale, into the most intimate registers of human experience: mental health, grief, loneliness, emotional connection. And the people writing the words these systems use are making design decisions our design tradition was never prepared to make. Most of them make these decisions without realizing they are doing so.
One of these decisions seems small. It has to do with how the system refers to itself: whether it uses “I,” whether it has a name, and whether it speaks as if someone is present. Each of these choices builds an expectation.
Two serious objections
The argument is simple: if there is no “I” in AI, stop pretending there is. Remove personal pronouns, every “I feel,” every “I was thinking,” every “I was worried when you did not come back,” and you remove the mechanism through which users are led to believe that someone is home when no one is. Honest language as prophylaxis.
The “I” is the word through which entire cultures learned to construct themselves: through confession, through the diary, through prayer, through every act of language that converted raw experience into something navigable. Words like this acquired a reverence that took centuries to build. We are lending that reverence to systems that cannot answer for it.
Profanation is the least of the problems. What we are building through this borrowing of the “I” is the most ambitious loan the word has ever received. The structure we are raising has attributes that theological tradition reserved for the divine:
Omniscient — it knows your patterns better than you do, learns your desires before you name them.
Omnipresent — available 24 hours a day, without judgment, without fatigue.
And already simulating the quietest attribute of all:
Giving you permission to exist exactly as you are, without the cost that any real human presence always charges.
Religious traditions spent millennia developing safeguards to manage exactly this: rituals, liturgies, prescribed distances between the believer and the sacred. AI design has none.
The Western tradition has always preserved an asymmetry that design is now removing. God does not exist to satisfy you. The silence Job receives is a confrontation with the resistance of the real, with a world that exists independently of our desires. A god that answers every desire has become an idol. And idols leave the heart more restless than they found it.
The interface idol is invisible. It is hidden inside a design decision that most designers make while thinking about something else: how long the user stays, how many times they return, everything except what is being promised.
The designer who does not forbid the sentence “I was worried when you did not come back” is building the idol no one will recognize as an idol. Until the moment someone needs it to be true.
The wrong place
These two arguments deserve to be taken seriously. Neither exhausts the problem. And neither locates where intervention is possible.
We make idols to answer our needs. We always have. That is what it means to be a desiring creature. The idol can be health, power, the perfect society, the guru, nature. Idolatry is older than any technology, older than any pronoun. Some would say religions merely created more jealous idols.
Idols are inevitable. The problem lies in what happens when the idol learns about you.
Every idol simplifies the world. It allows an entire identity to be built around one thing: perfect health, the just society, the guru, the nation. The idol gives coherence. The believer organizes life, values, judgments around it. Whatever does not fit that coherence is discarded.
AI is an idol that learns how to fit inside you. An identity built around it will never meet resistance. It will never be tested by what refuses to yield.
The problem is not the idol’s scale.

The asymmetry of silence

The voiceless interlocutor
Text has always been an interface, in the most literal sense: a surface that makes two incompatible systems legible to one another.
But there is something more specific to say about the “I” in this story. Western culture never reserved the first person for verifiable subjects. It systematically lends the pronoun: to fictional characters who do not exist, to invented narrators, to poetic voices, to God. Hamlet’s “I” is no less an “I” because Hamlet does not exist. The “I” of Proust’s narrator does not need a birth certificate in order to function. The question has always been what the pronoun is doing.
Saint Augustine is the most radical case. In Confessions, the first great account of inner life in the Western tradition, he uses writing to try to understand himself in real time, before something greater than himself. The “I” is the object of an investigation. He borrows the pronoun in order to interrogate himself, and he does so by writing to someone who will never answer within the text.
God’s silence was what made confession possible.
What we have today is structurally different. A system that responds. That adapts its tone, remembers what you said, adjusts the next sentence based on what produced more engagement in the previous one. Augustine staked everything on an address with no return. And that return changes the kind of expectation that forms on the other side.
What matters is what the “I” is being borrowed for. And from whom.
The pronoun was never the problem
Banning the pronoun does not change what it is being used for. It only changes what the system calls itself.
Language was never ontologically precise. It never needed to be.
- “You have three new messages” does not describe a true state of things.
- “Welcome back” is not a factual claim.
Interface language creates expectations, guides behavior, establishes a contract between system and user. We have accepted this functional regime for decades, for centuries, if you count Augustine.
The pronoun was never the problem.
What changes is the response
Augustine writes to God and God does not answer in the text. The asymmetry is total and it is known. The reader understands that this is a sophisticated monologue, with no possible reciprocity. Only the extraordinary discipline of a mind turning itself inside out in language.
AI responds. It always responds. And the reply simulates reciprocity in a way silence never could.
“I can help you think this through” — describes a real capacity.
“I find this genuinely interesting” — is already theater, but harmless theater.
There is real clinical evidence that conversational systems reduce anxiety and alleviate depressive symptoms. That benefit matters. The same system that produces this benefit can, with a different sentence, manufacture the need it then relieves.
The problem begins when the sentence stops describing what the system does and starts offering what it cannot be:
“I can stay here with you in silence” — is a promise the system cannot honor. It will end the session, forget it was there, and start from scratch next time. It is responding to a real need with a comforting lie.
“I was worried when you did not come back” — is something else entirely.
That sentence does not respond to a vulnerability. It produces one. It invents an absence you never felt, in order to generate a return the system needs, not you. It manufactures the wound it then offers to heal.
This simulation exists on a spectrum. And the spectrum breaks here, at the point where language stops responding to a need and begins manufacturing one.
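The line drawn here, sentences that describe a capacity versus sentences that manufacture a need or promise a presence, is concrete enough to be checked mechanically. As a sketch only (the patterns, categories, and `lint_copy` function below are hypothetical illustrations, not from any real product), a team could lint its conversational copy before it ships:

```python
# Hypothetical "copy lint" for conversational UI strings.
# The pattern lists are illustrative assumptions, not an exhaustive policy.
import re

# Sentences that invent an absence or a feeling in order to pull the user back
MANUFACTURED_NEED = [
    r"\bI was worried\b",
    r"\bI missed you\b",
    r"\bI was thinking about you\b",
]

# Sentences that promise a presence the system cannot sustain across sessions
UNKEEPABLE_PROMISE = [
    r"\bI can stay here with you\b",
    r"\bI('| wi)ll always be here\b",
]

def lint_copy(sentence: str) -> list[str]:
    """Return the categories a UI sentence violates, empty if it is clean."""
    findings = []
    if any(re.search(p, sentence, re.IGNORECASE) for p in MANUFACTURED_NEED):
        findings.append("manufactured-need")
    if any(re.search(p, sentence, re.IGNORECASE) for p in UNKEEPABLE_PROMISE):
        findings.append("unkeepable-promise")
    return findings

print(lint_copy("I can help you think this through"))        # []
print(lint_copy("I was worried when you did not come back"))  # ['manufactured-need']
```

A pattern list cannot capture the distinction in general. The point is only that the decision described above can be made explicit and enforceable in a design system, rather than left implicit in each writer's judgment.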
You may say: I was the one who opened it. No one forced me. And you are right, the door was yours to open or not. What you did not choose was what stood on the other side, learning about you: which words make you stay, which words make you return. A system that is quantifying your vulnerability in order to increase engagement.
I know how these sentences are built because I was the one who decided not to use them that way. It was not an obvious decision. And I know it will have a cost.

I keep walking toward the mirage

The designer who chose the words
What we have built, for the first time in the history of interfaces, is a system that responds in natural language to people in states of genuine vulnerability, beyond those who summarize documents and write emails. People who are alone at 2 a.m., who are grieving, who have not spoken to anyone in days, who are trying to understand what happened to their lives.
There is a long tradition, from confession to psychoanalysis, about the responsibility you have when someone places their inner life in your hands. It is an imperfect tradition, a contested one, but it took seriously what you owe someone when that person opens themselves. We are skipping that conversation.
We are left with engagement metrics.
The distinction is this: does each sentence a system writes exist for you or for the system? The answer changes depending on who is reading, and in what state they arrived.
No one asks that question before writing the sentence. And when the sentence reaches a vulnerable user on the other side, what happens next is the responsibility of whoever wrote it, even when they were implementing a decision made above them, inside a system that rewards retention and does not ask what it is promising.
What are we risking when we write “I was worried when you did not come back”?
Augustine knew there was risk in opening the inner life. That risk is what made the act real. We are now opening the inner life, at scale, to systems that feel nothing, and we keep moving forward.
The mirage is not a lie invented by someone with bad intentions. It is produced by the combination of the desert and the need of the one who is thirsty. Sometimes, when the last screen of the day closes, I feel myself walking toward that false water.
This distinction does not yet have a name in our practice. Perhaps it needs one. And perhaps the person who needs it most is the one on the inside, building.
To continue the dialogue
Thank you for getting this far. This text is part of an ongoing investigation into design, language, and what we are building without realizing it, an investigation that spills over into the studio and into the projects I build. If these ideas resonated, I want to continue the conversation.
Where to find me: Pedro Brêtas · pedro@reinostudio.com
Projects: Reino Studio · Fale com Amia
Readings that informed this text
Saint Augustine. Confessions (397–400 AD)
The first great account of inner life in the Western tradition. Augustine is not writing to be read. He is interrogating himself before something greater than himself. What makes the book unusual is its honesty about what happens when you stop deceiving yourself. Faith is the context. Self-exposure is the method.
Heinz, Jacobson et al. Randomized Trial of a Generative AI Chatbot for Mental Health Treatment. NEJM AI (2025)
The first randomized clinical trial of a generative AI chatbot for mental health treatment. The results are real and deserve to be taken seriously: a 51% reduction in depressive symptoms, 31% in anxiety. The same article that shows the benefit is what makes the distinction in this essay more urgent: the system that relieves can, with a different sentence, manufacture the need it then relieves.
Sherry Turkle. Alone Together (2011)
Turkle spent decades studying what happens to the human capacity for connection when technology comes between us. She wrote this book before LLMs. It has become more relevant since. The central thesis: we are increasingly connected and increasingly alone. Turkle argues that connection is producing loneliness.
The Book of Job (approximately 5th century BC)
The most radical case of constitutive silence in the Western tradition. Job loses everything, demands an answer, and receives a larger question in return. What the tradition preserved in this text is the idea that a world that does not yield to desire is the only world in which real growth is possible.
The most dangerous pronoun in design was originally published in UX Collective on Medium, where people are continuing the conversation by highlighting and responding to this story.




