AI Can't Tattoo You
AI will take over many jobs. But some are AI-resistant. Not because of any technical limitation, but because they require embodied knowledge. Knowing what pain feels like.
My wife had Ink Master on in the background. One of those competition shows where tattoo artists get judged on their craft. I wasn't really watching. I was at my laptop running fal.ai, generating portfolio images for a client. I fed it a prompt, waited maybe five minutes, and had a dozen clean, professional-looking photos that would have taken a photographer half a day to shoot.
Two screens. Two completely different worlds. One of them was collapsing the distance between idea and output. The other one was showing me something that couldn't be collapsed at all.
I kept half-watching the show while I worked, and at some point it clicked. These two things were happening at the same moment, in the same room, and they were pointing in completely opposite directions.
The Pain Argument
Here's what a tattoo artist actually does.
They're not just executing a design. They're managing a conscious human being's experience of pain in real time. Every person sits differently. Every person has a different threshold. Some people go pale and quiet. Some people joke louder the more uncomfortable they get. Some people hit a wall and don't know how to say it.
The artist has to read all of that. Micro-expressions. Breathing. Muscle tension. When to keep going, when to pause, when to check in. They make dozens of adjustments per session that they're probably not even consciously aware of, because they've sat with enough people in pain to recognize the patterns.
That knowledge doesn't come from data. It comes from having a body. From having felt pain yourself. From being the kind of being who can sit across from someone in discomfort and understand, on a physical level, what that person is going through.
An AI cannot know what it feels like to get a needle dragged across your ribs for three hours. No training data bridges that gap. You can describe pain to a machine in infinite detail and it still won't know pain. It knows the description.
This is the line I've been turning over. Not "what can AI do technically?" but "what requires being the kind of thing that has actually experienced something?"
The Surgeon Contrast
Here's where it gets interesting, and where I'd push back on myself a little.
A surgeon performs incredibly complex, skilled procedures. Many of those procedures are candidates for robotic assistance, and some are already partially automated. We trust machines to assist in surgery. We're moving toward trusting them more.
But the patient in surgery is unconscious. They're not there in the same way. The trust equation is different. You don't need the surgeon to understand what the anesthesia feels like because you're not awake to need that.
A tattoo client is conscious the entire time. They're present. They're vulnerable. They need the person holding the needle to understand, from the inside, what that experience is. That's not a technical requirement. It's a human one.
This distinction matters. It's not about whether AI is capable. It's about whether consciousness is part of the interaction. Wherever a person is awake, aware, and in some kind of real-time vulnerable exchange with another person, the living experience of the practitioner becomes part of what they're offering.
The surgeon contrast shows me this isn't about skill level. It's about whether being human is actually the product.
The Enabler Frame
I build AI agents and automation pipelines for small businesses. This is what I do every day. Chatbots, workflows, data pipelines, automated lead management. I have seen what these tools can actually do, not from the outside looking in, but from the inside building the systems.
And I am not afraid of it. Most of the time, I think it's genuinely good.
Here's why. Think about a mother with a newborn. She has infinite things to manage. Scheduling, admin, groceries, home logistics, sleep tracking. If AI can take ten of those things off her plate, what does that free up? More time with the baby. More presence. More of the thing that only she can give.
She wouldn't hand the baby to a robot nanny. But she might absolutely let a system handle her calendar so she's not mentally somewhere else when her kid is in front of her.
That's the frame that makes sense to me. Automation handles the friction. It clears the path. What it's clearing the path toward is what matters. And the things it's clearing the path toward, the things worth having the time for, are usually deeply human ones.
Cooking a meal with your family. Not because the food needs to be cooked by hand, but because the act of doing it together is the point. The conversation. The mess. The kid who keeps stealing bits of cheese before anything hits the pan. A robot could produce a better meal. That's not actually the issue.
Some things are the product because of what they require you to be. Not because of what they produce.
The Forge
Here's something I didn't think about at first.
The essay up to this point is about the person receiving the service. The tattoo client. The child. The student. But what about the person doing the work?
A tattoo artist who has sat with hundreds of people in pain over years becomes a different kind of human because of that work. Not just more skilled. More human. More patient. More attuned to suffering. The practice of holding space for someone else's vulnerability, day after day, reshapes you. It's a forge.
A teacher who has invested deeply in struggling students doesn't just produce better outcomes. They grow through the investment. The work of caring about someone else's development, of showing up even when it's hard, of watching someone fail and choosing to stay, that changes who the teacher is.
If we automate these roles, we don't just take something from the recipients. We remove the forge that makes certain people who they are. We lose the process that builds depth, patience, empathy, the qualities we value most in each other. And we get a thinner species because of it.
The work shapes the worker. That's not a side effect. It might be the most important thing happening.
Theater of Knowing
AI is very good at performing understanding.
If you talk to a well-designed chatbot, it will ask follow-up questions. It will mirror your language. It will respond in a way that feels attuned to you. For a lot of interactions, that's more than enough.
But there's a difference between the theater of knowing and actually knowing. A difference between a system that has learned to produce the outputs of empathy and a person who has lived through something and carries it.
A coach who has failed, who has been exactly where you are, who has felt the specific weight of not knowing if it's going to work out. That person gives you something different. Not better information necessarily. Something else. The evidence that someone who has been through it is still standing in front of you and believes it's possible. You can't fake that. A machine can simulate it, but it doesn't have it.
The interesting question isn't whether people will accept the simulation. Context determines that. Some will. Some won't. The question is whether, over time, we can tell the difference, and whether it matters if we can't.
The Hidden Cost
Here's the thread I can't stop pulling on, and it's darker.
An adult who has lived some life can usually feel when they're getting the simulation instead of the real thing. It's not always obvious, but there's usually something. A flatness. A feeling that the words are right but something is missing.
A kid doesn't have that calibration yet.
If children grow up with AI tutors as their primary source of educational mentorship, with AI systems managing their emotional check-ins, with algorithmic content replacing adult conversation about what matters, nobody sounds an alarm. The AI tutor answers questions. The AI system is available at 2am. The metrics look fine.
But what's actually being transmitted in a real student-teacher relationship isn't just information. It's the modeling of what it looks like to be human. To care about an idea. To fail at something and keep going. To struggle with a concept out loud. To actually invest in another person's development. You can't transfer that through a system that doesn't know what caring actually costs.
I don't think we're going to wake up one day and realize the damage. I think it will be more gradual than that. A slow erasure of something we assumed was always there.
This is the part of the conversation where I don't have neat answers. I just think it's worth naming.
The Luxury Problem
And here's where it gets uncomfortable.
If most things get automated, the remaining human-provided services become scarce. Scarcity drives price. A human tattoo artist becomes a premium experience. A human teacher becomes a privilege. A human therapist who has actually been through something becomes a luxury only certain people can afford.
Everyone else gets the simulation.
That's a class divide nobody's really talking about yet. Not a divide in access to technology, but in access to real human presence.
And there's an economic engine behind it. AI creates deflationary pressure. It pushes the cost of services down across the board. That sounds good until you realize what it does to the human alternative. If an AI tutor costs five dollars a month and a human tutor costs eighty dollars an hour, the human doesn't just look expensive. They look irrational. Not because they got worse. Because everything around them got so cheap that the gap became impossible for most people to justify.
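To make the gap concrete, here is a minimal sketch using the two figures above. The hours-per-month number is an assumption added for illustration, not something the essay states:

```python
# Hypothetical cost comparison using the essay's figures:
# an AI tutor at $5/month vs. a human tutor at $80/hour.
# HOURS_PER_MONTH is an assumed amount of tutoring time.

AI_TUTOR_MONTHLY = 5      # dollars per month
HUMAN_TUTOR_HOURLY = 80   # dollars per hour
HOURS_PER_MONTH = 4       # assumption for illustration

human_yearly = HUMAN_TUTOR_HOURLY * HOURS_PER_MONTH * 12
ai_yearly = AI_TUTOR_MONTHLY * 12

print(f"Human tutor: ${human_yearly}/year")  # $3840/year
print(f"AI tutor:    ${ai_yearly}/year")     # $60/year
print(f"Gap: {human_yearly // ai_yearly}x")  # 64x
```

Even at a modest four hours a month, the human option costs sixty-four times as much per year. That is the scale of gap the word "irrational" is pointing at.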
Now think about who benefits from that deflation. If you hold assets that don't deflate with everything else (hard assets like Bitcoin or real estate), your purchasing power goes up. You become the person who can still choose the human version. But if your only asset is your labor, and AI is competing with your labor, your purchasing power stays flat or drops. The cost of things falls, but so does your income.
So the divide isn't vague. It's specific. People who positioned themselves with the right assets get to keep the human version of life. Everyone else gets the efficient version. Not because anyone chose that outcome. Because the math made it inevitable.
What About the Rest of Us
I can already hear the pushback. And it's fair.
Everything I've written so far is about the tattoo artist, the teacher, the mother, the coach. Jobs where being human is obviously the point. But what about the trucker? The warehouse worker? The person doing data entry or driving a forklift or processing invoices? Nobody's paying them to be human. They're being paid to move things from one place to another. And a machine can move things.
I don't have a comfortable answer for that. I'm not going to pretend this essay covers everyone. It doesn't.
But I will say this. I think we undervalue the human knowledge in those jobs because we've already started thinking of them as mechanical. A trucker with twenty years on the road reads weather, reads other drivers, makes judgment calls in situations no training dataset has ever seen. A warehouse worker who has been in the same facility for a decade knows things about flow, timing, and edge cases that no system has been taught. That knowledge is real. It's embodied. It's the same kind of knowing I've been talking about this whole essay.
The problem is that nobody frames it that way. Not the companies. Not the public conversation. Not even the workers themselves, sometimes. And the moment you accept that your work has no human value, you make it easier for everyone else to accept it too.
I'm not saying every job will survive. Some won't. But I think more jobs have a human core than we're willing to admit. And the first step to protecting them is refusing to agree that they're purely mechanical when they're not.
I Build This Stuff
I should be transparent about something.
I'm not writing this from the outside. I build AI agents. I build chatbots. I build the automation pipelines that take tasks off people's plates. This is my work. This is how I pay my bills.
So when I say "choose carefully what you automate," I'm saying it as the person who builds the thing. Not as someone afraid of it. I know what it can do because I make it do those things every day. And precisely because I'm that close to it, I can see where the line is.
A school district that saves 40% by switching to AI tutors isn't really choosing. A nursing home that can't hire enough staff isn't really choosing. The market will push toward automation in every space, including the ones that matter most. Intentionality alone won't hold the line when the economics are that compelling.
So the question isn't just "what should we protect?" It's "what are we willing to pay to protect it?" Because human presence is going to cost more. A human teacher will cost more than an AI tutor. A human tattoo artist will cost more than whatever machine eventually holds a needle steady. And if nobody chooses the expensive option, it stops being offered. Not because it was taken away. Because it was priced out.
I'm not exempt from any of this. The same pressure applies to what I do. But that's a different essay.
Choose Carefully
I'm not writing this as an anti-AI argument. I use these tools every day. I build with them. I think they're genuinely going to improve a lot of things for a lot of people.
But the frame matters. The question isn't only "can AI do this?" The question is "is the human presence part of what makes this thing what it is?"
Sometimes the answer is no. A lot of back-office work, a lot of research, a lot of logistics and scheduling and data processing. Fine. Build the system. Free yourself up.
But when you're sitting with someone who is in pain, or trying to teach someone something that requires a person to believe in them, or cooking dinner with your kids and the point is the dinner-making and not the dinner, something is at stake that has nothing to do with technical capability.
The jobs that survive automation won't be the ones machines technically can't do. They'll be the ones where being human, having a body, having felt things, having actually been somewhere, is the whole point.
A machine can draw a perfect design. It cannot manage your pain. It cannot sit with you while the needle is moving across your ribs and know, from the inside, that you're getting close to your limit and you need a ten-second break and a breath before you continue.
That's still a person's job.
What would you refuse to hand off to a machine, even if it could do it better than you?