Race for Deploying Humanoid Soldiers Has Begun

The next soldier will not bleed, will not tire, and will not hesitate. It is already being built, and the race to send it to war is underway.

In late January 2026, three Russian soldiers emerged from a destroyed building to surrender. There was no Ukrainian infantryman waiting for them. There was an armed ground robot, holding the position. The humans were already behind the line. That moment was not a military curiosity. It was a marker of where war is heading, and how fast it is getting there. When U.S. and Israeli forces struck Iran in February 2026, AI was embedded across the entire operation, from target identification to guiding autonomous drones through GPS-denied, signal-jammed environments. Nearly 900 strikes in the first 12 hours, a tempo no previous conflict had achieved. Two wars. Two continents. Same conclusion. The age of AI war is not arriving. It is already here.

While Ukraine remains the world’s most consequential testing ground for autonomous war, its front line increasingly held not by soldiers but by machines and the skeleton crews that control them, Iran has shown what the next level looks like in combat. In the strikes on Iran, air defense networks, drone salvos, and electronic warfare operated simultaneously across multiple theaters at a speed and complexity that compressed years of strategic assumption into days. In both wars the pattern is identical. The human body has become the most vulnerable object in modern war. The machine has become the primary fighter. The soldier has become support.

Every serious military establishment on earth is watching, and accelerating.

What they are accelerating toward is a new generation of bipedal robots designed to do what a soldier does. Carry weapons. Breach doors. Move through terrain. Hold a position. Resupply under fire. The most advanced can pick up and operate rifles, pistols, shotguns, and grenade launchers already in service across existing armies. The design logic is deliberate. Decades of weapons, vehicles, and military infrastructure have been built for human hands and human bodies. A robot engineered to fit that existing architecture requires no new logistics chain. It steps into one already built. Ukraine proved these systems endure. Iran proved they can decide.

The most advanced humanoid built explicitly for war is the Phantom MK-1, developed by Foundation, a San Francisco startup with U.S. Army, Navy, and Air Force research contracts and approved military vendor status. At 5 feet 9 inches and 180 pounds, it is designed around one principle: operate with everything a soldier already carries. Two units are currently on reconnaissance trials in Ukraine. The Marine Corps is training them on breach entry, placing explosives on doors so troops stay back from the fatal funnel. Current per-unit cost sits at approximately $150,000, projected to fall below $100,000 by 2028 and below $20,000 at scale. Production targets for 2026 stand at 10,000 units, scaling to between 40,000 and 50,000 by end of 2027. At that price a robot battalion becomes economically competitive with a human one, without the casualties, the trauma, or the political cost of repatriated bodies.

The United States is not alone in this. Anduril, founded by Palmer Luckey, builds autonomous drone interceptors, electromagnetic warfare systems capable of collapsing enemy drone swarms, and the Ghost Shark, a fully autonomous submarine already operational with the Australian Navy. Scout AI demonstrated in February 2026 a complete autonomous kill chain in which seven AI agents identified, located, and neutralized a target with no human involvement at any stage. Boston Dynamics, majority owned by Hyundai, has been testing its Atlas bipedal robot in environments with direct military adjacency since 2021. Figure AI is developing general purpose humanoids with clear dual-use potential.

China’s People’s Liberation Army has been funding humanoid robotics research through state institutions including Beijing Institute of Technology and Zhejiang University since at least 2015. Russia is developing dual-use platforms under direct military sponsorship, with the Central Research Institute for Robotics and Technical Cybernetics in St. Petersburg among the primary state facilities. Iran unveiled Aria, a domestically built autonomous combat robot, in September 2025, built entirely under international sanctions. Goldman Sachs projected between 50,000 and 100,000 humanoid robots shipping globally in 2026 alone. Morgan Stanley forecasts the total humanoid market exceeding $5 trillion by 2050. The largest share of that growth is in defense. Every major power is building. None are waiting.

While the race accelerates, the technology has real distance left to travel. A humanoid moves through roughly 20 individual motors, each a potential failure point under combat stress. The platforms are heavy, power dependent, and not yet proven against sustained rain, mud, extreme cold, and kinetic impact. A captured or compromised humanoid is not simply lost equipment. It carries intelligence, has potential software access points, and could in the wrong hands be turned. These are engineering problems, and engineering problems get solved. Expert consensus places initial combat deployment at two to three years for leading platforms, with broader fielding across multiple militaries by the early 2030s.
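The compounding effect of those failure points can be sketched with a simple reliability calculation. The per-motor failure rate below is an illustrative assumption, not a measured figure; the point is only that many serial failure points degrade a platform fast even when each component is individually reliable.

```python
# Illustrative reliability sketch. A humanoid with ~20 motors fails a
# mission if any single motor fails; the 1% per-mission rate per motor
# is an assumed placeholder, not a real specification.

def mission_failure_probability(n_components: int, p_fail_each: float) -> float:
    """P(at least one of n independent components fails during a mission)."""
    return 1.0 - (1.0 - p_fail_each) ** n_components

# 20 motors, each assumed 99% reliable per mission
p = mission_failure_probability(20, 0.01)
print(f"{p:.1%}")  # about an 18% chance the platform suffers at least one motor failure
```

Under these assumed numbers, a platform that is 99 percent reliable at the component level still fails roughly one mission in five, which is why hardening each actuator matters as much as the AI that drives them.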

The harder problem is judgment. International Humanitarian Law requires that any use of force distinguish between combatants and civilians, that it be proportionate, and that all feasible precautions be taken to avoid civilian harm. These obligations do not change because the trigger is pulled by a machine. But in both Ukraine and Iran that standard is already under pressure. In Ukraine, when communications are jammed, drones default to onboard AI targeting because the operational alternative is paralysis. In Iran, AI systems processed and prioritized over a thousand targets at a speed no human oversight structure was built to match. These are black box decisions, made by opaque models whose reasoning cannot be audited, reconstructed, or explained after the fact. The law says one thing. The war is doing another.

That gap is where the most consequential argument of this era is playing out. The United Nations and the International Committee of the Red Cross have jointly called for a binding treaty prohibiting autonomous weapons that operate without meaningful human control, with a target deadline of year-end 2026. More than 120 nations have expressed support. The United States, Russia, and Israel have not. The direction of travel is clear.

No government has yet answered, and no legal framework has yet resolved, who is responsible when an autonomous system kills the wrong person in the wrong place for reasons its own designers cannot explain after the fact. That question will not wait for the law to catch up.

The machines are already in training. What comes after training is war.
