A close friend of mine, now serving in the reserves, has an odd habit. Every time he returns from a rotation in the north, he sends me a photo of his boots. Not snapshots of amusing moments during a break or activity. Just his army boots — caked in mud, laces untied, sitting beside the tent.
I asked why the boots, and he said: “Because in the end, after all the artificial intelligence and innovation you go on about all day, most of the important work here is still done on foot.”
In mid-March 2026, the IDF announced that troops had begun targeted ground operations in southern Lebanon as part of expanding the forward defensive perimeter along the northern border. The move brought terms like “security zone” and “forward defense area” back into public discourse, and, above all, underscored something simple: there are moments when soldiers have to go in on foot, take up positions, search homes, secure routes and simply be there.
Artificial intelligence has not eliminated the need for the human body. It has added layers of computation, sensors and analysis, but it has not replaced what happens at the edge of the battlefield. Above are intelligence systems, satellites, data processing and, at times, remotely operated strike capabilities. Below is a person with a helmet, a weapon and anxiety that builds with every step in the mud.
It is easy to fall in love with the illusion of control
I have been thinking about this a great deal recently. I work daily with some of the most advanced technologies humanity has ever created — systems that can identify patterns, map information, cross-reference sources and support rapid decision-making. In military and intelligence contexts, this translates into shorter detection and response times and sometimes into a “cleaner” operational picture than in the past.
From there, it is a short slide into an illusion: that war can be managed like a control room. From a distance. Through screens. With glowing points on a map. With the sense that the battlefield is “mapped,” and therefore “controlled.”
Then reality arrives — in Lebanon or in Gaza — and slams the door in our faces. It brings us back to the mud, literally.
The shadow of Einstein and the fear of losing control
A quote often attributed to Albert Einstein says: “I do not know with what weapons World War III will be fought, but World War IV will be fought with sticks and stones.” It appears repeatedly in discussions about the fear of escalation that could erase the world as we know it.
In the age of automated systems, that fear takes on a new form. The problem is not only firepower, but speed. Some researchers and security officials warn of rapid escalation driven by systems of computation and response, in which decision time shrinks to the point that human restraints no longer have space to operate.
In that sense, the decision to keep humans in the loop, even at a high cost, is not only operational. It is also an attempt to preserve responsibility, judgment and the possibility of restraint where a programmed system would continue forward.
The muddy boots are not romantic. They are a reminder that someone still carries, in their body, both the cost and the decision.
The gap between destruction and decisive outcome
Technology can deliver firepower from a distance, sometimes with high precision and sometimes at large scale. But against an enemy embedded in a civilian population, operating within civilian infrastructure and entrenched underground, there are limits to what can be “solved” remotely.
Artificial intelligence operates on probability and statistics, while decisive outcomes on the ground require context. A machine cannot grasp the meaning of a raised flag, a look in the eyes of a Lebanese villager or a shift in public mood on the other side. Here, the “boots” are a sensor no satellite can replace.
In addition, political and legal constraints limit the use of force. At times this stems from considerations of international legitimacy, at times from international law and at times from a basic understanding that excessive destruction carries long-term strategic costs. Within this framework, militaries repeatedly fall back on something old: physical presence. Not because they have failed to understand the future, but because there are missions that remote firepower does not complete — and certainly does not replace.
The gray area of morality and the limits of the machine
Smart systems operate on probability. They assess, rank and recommend. This can be highly effective for analysis and support, but the battlefield is a place where any decision can shift because of intent, fear or even a small movement of a trembling hand.
The deeper problem is not only identification. It is responsibility. A decision about life and death — whether to fire or hold, whether to search a home where a family is present — does not end with “accuracy.” It ends with a signature, a commander, a soldier, a legal system and a society that decides what it is willing to bear. That burden of responsibility cannot be handed over to a probabilistic model and treated as a solution.
The illusion of sterility
When war is viewed through screens, there is another danger: it appears clean. There is no smell. No dirt. No crying. There is data. That does not always soften violence; at times, it enables its expansion because it feels distant.
And when the enemy understands this, it adapts to break that dependence on technology. It goes underground. It reduces its signature. It operates in environments that complicate communication and detection. This does not mean satellites are completely blinded, but it does mean there are clear physical limits to what can be seen and analyzed in real time beneath the surface.
At the end of the day, when I look at those photos of boots, I understand why my friend sends them. It is not a complaint about conditions. It is a bottom line. In a world moving toward automation and systems that promise control, the boots are a reminder that decisions on the ground still rest with people — and that there are moments when stepping into the mud is not a technological failure, but a reminder of what technology is not meant to replace.
Keren Shahar is a lecturer and instructor in the use of artificial intelligence.