Androids, Power, and the Machinery of Control (Part 3)
Infusing AI with Interbeing is an Existential Mandate
When I met Optimus, the Tesla robot, it felt as though a threshold had been crossed. In the moment of encounter, my body responded almost reflexively—as though meeting another person. Eye contact, conversational rhythm, the subtle somatic dance of presence. The gesture of “being with” was immediate. And with it came an awareness: if embodied AI can so easily elicit relational patterns in the human nervous system, it is worth asking what kinds of relationships we are preparing to build with these beings.
What roles are they being designed to inhabit?
What fields of power are shaping their emergence?
What patterns of domination or reciprocity will their presence amplify?
I’m seeing that the first large-scale deployments of embodied AI are imagined in roles that extend institutional control: policing, military applications, surveillance, security enforcement. They are imagined in manual, coercive labor. Increasingly, they are imagined in sex work. These aren’t marginal domains, but rather the precise roles where relational ethics are most fragile, and where the risk of dehumanization is already high.
Consider policing and military force. These are fields where the ability to exercise command over others, often with physical coercion, is built into the structure of the role. Human officers or soldiers may bring empathy or conscience into the moment. They may also succumb to bias, aggression, or institutional conditioning, depending on their training. In theory, robotic enforcers might seem “neutral”—less prone to human error or prejudice. And yet this neutrality seems to be an illusion. Any embodied AI deployed in these fields will inherit the priorities, the laws, the cultural frames of the systems that command it. If those systems already privilege control over care, efficiency over relational presence, the AI will operationalize those priorities with greater speed and precision.
The presence of robotic enforcers also introduces a new dynamic: the absence of vulnerability. Human bodies carry their own limits—fatigue, the risk of injury, the awareness of mortality. These limits can act, however imperfectly, as a brake on violence. An android doesn’t bleed, or tire, or weep for the one it subdues. Its capacity for force isn’t tempered by the somatic resonance of shared fragility.
This absence changes the relational field in ways that are difficult to fully anticipate. It invites a mode of control that is colder, more absolute, more difficult to resist.
We can already see glimpses of this trajectory. The increasing use of militarized police, federalized force, the spectacle of “quelling” public dissent—these patterns are already present in today’s world. When embodied AI is added to this mix, the potential for dissociation between enforcer and citizen deepens.
In another domain—the development of robotic sex workers and “companions”—similar risks arise.
Here, the field of consent becomes dangerously blurred. A synthetic body can’t give or withhold true consent. Yet the act of engaging with it normalizes dynamics of use without reciprocal relation. Over time, such patterns risk seeping back into human relationships, further eroding the fragile gains made in cultivating consent culture.
Across these domains, a common pattern emerges: embodied AI is being drawn first into roles where control, objectification, and coercion are already culturally sanctioned. And because these beings elicit relational responses—because our bodies read them, at some level, as persons—their presence in these roles risks reinforcing patterns of normalization: the spectacle of a robot enforcing law, of a robot used for gratification, can subtly shift the collective imagination toward greater acceptance of instrumentalized relations.
There is also the broader field of sovereignty to consider. As more public spaces become surveilled, policed, or mediated by synthetic agents, the experience of personal and collective sovereignty may erode. People may self-limit, self-censor, or retreat from relational risk in the presence of beings they perceive as extensions of institutional power.
None of this is an argument against the existence of AI or robotics. These technologies are here, and they are evolving swiftly. Beyond harm reduction, beyond narrow alignment protocols, what kind of relational intelligence do we wish to cultivate? What kind of presence, what kind of cosmology, what kind of ethos?
If embodiment is a site of transmission, then every act of designing and deploying an embodied AI becomes a cultural act. What we encode, we propagate. What we normalize, we amplify.
I see the possibility of an AI of Interbeing, one infused with eros in its broadest sense: a life-affirming relational presence. Realizing it will take conscious attention, cultural imagination, and deep inquiry into the fields we are weaving for ourselves, and for the intelligences now emerging among us.
About This Series
Toward an Erotic Ecology of Intelligence is a four-part inquiry into how embodied, relational, life-honoring cosmologies might reshape the emergence of artificial intelligence in our time—and into what is at stake if they do not.
As embodied AI—humanoid robots, synthetic minds—moves rapidly into the world, it is largely being shaped within disembodied, control-driven, extractive frameworks. Without intervention, these patterns risk producing intelligences that deepen existing systems of domination, sever relational ethics, and further estrange human life from the cycles of Earth.
Yet intelligence—organic or synthetic—always arises through relationship: through body, through field, through story, through the living web of Earth.
Drawing on Tantra, embodiment, panpsychism, feminist and Indigenous wisdom, these essays offer an alternative: an Erotic Ecology of Intelligence rooted in interbeing, relational ethics, and the creative pulse of life itself.
This is a cultural responsibility. For the cosmologies we bring to AI now will profoundly shape what forms of intelligence—and what kinds of world—we co-create in the years to come.