Issue #7: Why We Build Machines in Our Own Image
Humanoid robots aren’t new. But something has shifted.
With billions in funding and real deployment plans in motion, we’re no longer in the era of sci-fi prototypes. We’re in the era of humanoids designed for work: logistics, caregiving, even factory floors. Machines with arms, legs, eyes, and in some cases, facial expressions, are being shaped not just to do the job, but to look like they belong in human spaces.
But here’s the question that haunts this moment:
Why are we still building machines in our own image?
From tool to figure
We’ve long built tools to extend our reach: plows, planes, processors. But humanoids are more than tools. They’re designed to stand among us.
Anthropologically, this matters. In every society, the human figure holds symbolic power. When we shape a machine to walk, grasp, and gesture, even vaguely like us, we're doing something more than functional. We're placing it into our social field. Giving it a role.
Whether it’s the warehouse or the home, we’re not just delegating tasks. We’re delegating presence. A humanoid doesn’t simply complete a task; it occupies a role we recognize. And that subtle shift changes how we feel about the work being done, and about the being doing it.
The labor shift beneath the surface
China just committed over $20 billion to humanoid robotics for manufacturing. Figure AI is preparing to mass-produce general-purpose robot workers. In healthcare, education, and customer service, these machines are being positioned as support staff, not mere assistants.
The economic logic is straightforward: automate physically demanding, repetitive, or dangerous tasks. But the stakes go beyond efficiency.
Humanoid form isn't just a novelty; it unlocks compatibility. Most environments were designed for human bodies. Doors, stairs, shelves, controls. A robot that can move like a human fits seamlessly into legacy infrastructure without needing us to rebuild our world.
That’s why humanoids are becoming more attractive to investors. Not just because of their technical potential, but because they promise minimal friction with the existing built environment.
Yet this functional argument can obscure deeper trade-offs. Once a machine shares our shape, we begin to treat it less as a tool and more as a peer or competitor.
Familiar forms, quiet compliance
Here’s what we know: the more human a robot looks, the more forgiving we tend to be. A slight nod, an expressive LED “face,” a body that moves like ours—these cues don’t just help us understand the machine. They help us trust it.
Even when it doesn’t deserve it.
This is the same phenomenon we explored last week with AI interfaces. But embodiment adds another layer. It’s not just language. It’s posture. Presence. Proximity.
We make machines that feel relatable, and then wonder why we start treating them like coworkers—or worse, caretakers. We normalize their presence in places where attentiveness and empathy were once non-negotiable. The performance of understanding becomes enough.
Over time, we may even shift our own behavior—learning to speak in ways that are easier for robots to process. A mutual adaptation that favors clarity over nuance, standardization over spontaneity.
Cultural mirrors
In Japan, humanoids have long been part of public life, guiding visitors in train stations, assisting the elderly, or teaching in schools. This isn’t accidental. Shinto traditions, which see spiritual essence in objects and nature, make the line between animate and inanimate more fluid.
In contrast, Western frameworks, rooted in Christian dualism and Enlightenment rationalism, tend to draw sharper distinctions between human and machine, body and soul. The result is more skepticism, even unease, toward humanoid forms.
These cultural logics shape how we design and receive robots. In one, the robot can be a helpful social companion. In the other, it’s often framed as a threat to authenticity or labor.
Yet across both, one thing is clear: form matters. And once you give a machine a face, you’re not just building tech. You’re building a symbol.
A symbol of familiarity. A placeholder for presence. A shape we instinctively read as social, even when it is anything but.
This symbolism is powerful. It disarms critique. It frames machines not as intrusions, but as evolutions of something we already understand.
What gets lost in the performance
As humanoids become more competent, we risk forgetting that their fluency, whether physical, verbal, or emotional, is a performance. A design decision.
That’s fine, until we stop noticing the difference.
A robot that bows doesn’t know respect. One that smiles isn’t feeling empathy. But the illusion is sticky. Especially when efficiency is on the line.
This is where anthropology offers something precious: a reminder that human rituals and signals evolved in context—messy, mutual, negotiated. Machines mimic these signals without the depth behind them. And the more realistic they become, the more we’ll have to ask:
Are we relating to function, or reacting to form?
When care is reduced to gesture, and presence is reduced to interface, we may gain productivity—but we lose something harder to quantify: the quiet recognition that makes relationships real.
Final thought
We say we want smarter tools. But we keep building more human ones.
Not because we need legs to stack boxes, or eyes to track packages, but because we want the future to feel familiar. And because humanoid form buys us more than function, it buys us acceptance.
But the cost of that acceptance is subtle: we stop noticing the systems we’re smoothing over.
Humanoid robots promise labor without complication. Presence without politics. But under the surface, they raise old questions in new forms:
What counts as human? Who decides what work is replaceable? And what are we training ourselves to believe—when the machine that does the job also looks a little like us?
Follow me for more reflections on how we build, trust, and reshape humanity through machines.