Army leaders are taking a hard look at how the service's new way of warfare, a fast-paced, multi-domain, wide-ranging style of combat, will bring a host of challenges to the laws of war.

In an event titled "The Future Character of War and the Law of Armed Conflict," hosted by the Army Futures Command and the Lieber Institute for Law and Land Warfare at West Point on April 22, military planners, professors, lawyers and the four-star tasked with charting the Army's future shared what they think commanders in 2035 will face. The key takeaways: Everything will be much faster; information overload is likely; and deciphering good data from bad will matter more than ever.

The Army is planning for a sensor-saturated environment in which hiding will be "almost impossible" for units, fighting in dense urban areas is highly likely, and nation-state-level technology will move rapidly into the hands of nonstate actors, said Gen. Mike Murray, the head of AFC.

Future Army commanders will have to lean on technology such as artificial intelligence and autonomous and semi-autonomous systems just to do their jobs, Murray said.

He painted a scenario in which a swarm of 100 drones is headed toward an Army formation.

“Is it within a human’s ability to pick out which ones have to be engaged first? Is it even necessary to have a human in the loop when you’re talking about effects against an unmanned system?” Murray said.

That's where technology and operations merge in a relatively clear way.

But what about mimicry? What are the ethical or legal boundaries for masking your formation to look like a civilian object? A Stryker emitting a signature similar to that of a city bus, for instance?

That was a scenario West Point Professor Rob Lawless presented.

That’s where things get murky.

“There really won’t be anywhere to hide on that future battlefield,” he said. “It’s hard to overstate this.”

As cyber and electronic warfare tools progress, thinkers have to confront the tough questions military leaders face: how they can use their own tools in a legal and ethical way, and how their adversaries might use such tools against them without the same constraints.

And those tools, such as autonomous platforms and artificial intelligence, are not simply add-ons to existing weapons. They are the entry fee for future war.

“That’s how entrenched AI and autonomy are in 2035,” said Dr. Sasha Radin, Lieber Institute professor.

The battalion or brigade commander of the near future, a West Point cadet today, will face more pressure to understand what their autonomous systems can and can't do.

They will also need to know how much they can trust all of the data that’s being delivered to them.

“Is this data accurate, biased, corrupted, hacked?” she said.

Core principles of lawful warfare, such as judging the foreseeability of certain actions by commanders in combat, will remain but will also be altered by these more complex, fast-paced scenarios, she said.

Military commanders, like civilians facing criminal accountability, are now judged on a standard of "reasonableness": What would a reasonable person do in the same situation?

That standard, like the battlespace, may change over time.

“What does reasonable mean in 2035 with all of this reliance on data in decision making?” Radin said.

Some of this is being tested, at least the technology and decision-making part. The Army conducted its first major iteration of "Project Convergence" in fall 2020, with a scaled-up version scheduled for this year.

The project blends AI, robotics and autonomy across multiple systems. Murray said that at last year’s event in Yuma, Arizona, they were able to take processes that now take 10 minutes or more and reduce them to 10 seconds.

“That’s going to happen across the battlefield,” Murray said. “We’re going to have to learn to trust AI.”

Murray reflected on his own experience in Afghanistan, where he would be surrounded by an operations officer, intelligence officer, fires officer and a staff judge advocate to weigh in on the legal parameters and ramifications of certain life-or-death decisions.

Lt. Col. Keith Donnell, chief of the Army Future Studies and Integration Branch, noted that while the Army is on the edge of "unpacking capabilities for the force," it will take a significant amount of work to merge that tech with the human decision maker.

“The same thing will apply to how we examine it through a legal eye,” Donnell said. “We have to understand the operational risk.”

Future commanders involved in such situations will need legal experts alongside them who are able to identify and dissect these risks and provide options, he said.

Todd South has written about crime, courts, government and the military for multiple publications since 2004 and was named a 2014 Pulitzer finalist for a co-written project on witness intimidation. Todd is a Marine veteran of the Iraq War.
