U.S. Approach to AI in Warfare Would Reflect "Judeo-Christian" Values, Air Force General Says

A U.S. Air Force general recently suggested that autonomous AI weapons, should they ever be employed in war, would be programmed with a "Judeo-Christian" value system.

According to the Congressional Research Service, lethal autonomous weapon systems, or LAWS, are "a special class of weapon systems that use sensor suites and computer algorithms to independently identify a target and employ an onboard weapon system to engage and destroy the target without manual human control of the system."

During an event at the Hudson Institute last Thursday, Lt. Gen. Richard G. Moore Jr., deputy chief of staff for plans and programs in the U.S. Air Force, was asked when the country would begin deploying LAWS and if it was ethical for the military to keep humans involved once the technology is available.

Although Moore said that LAWS were not ready to be deployed, he noted that the technology is an area of concern for Deputy Secretary of Defense Kathleen Hicks, The Christian Post reported.

"What will the adversary do? It depends [on] who plays by the rules of warfare and who doesn't. There are societies that have a very different foundation [than] ours," he said. "Our society is a Judeo-Christian society, and we have a moral compass. Not everybody does. And there are those that are willing to go for the ends, regardless of what means have to be employed. And we'll have to be ready for that."

Even though LAWS have yet to undergo widespread development, the Congressional Research Service also explained, "it is believed they would enable military operations in communications degraded or denied environments in which traditional systems may not be able to operate."

The True Problem with AI

Moore noted that the development of ethical AI will be a significant component of the Defense Department's budget in 2024.

"And that takes several forms. The first one is, what do we think we're allowed to let AI [do]? The second one is how do we know how the algorithm made decisions? And do we trust it? And the third one is, at what point are we ready to let the algorithm start doing some things on its own that maybe we are or aren't comfortable with?" he said.

Citing Chris Brose's 2020 book, The Kill Chain: Defending America in the Future of High-Tech Warfare, Moore warned that America's military is threatened by new technologies.

"He talks extensively about whether you would trust a young soldier on the ground that maybe hasn't had sleep in three or four days and hasn't had a good meal or certainly a shower," Moore said of Brose.

"This young soldier, that heat, sweat, fatigue, all of that is making a decision about employing lethal force or not, or an algorithm that never gets tired. You might actually think that if you can understand how the algorithm makes decisions and trust it, you might rather have that algorithm that never gets hot and never gets tired, it never gets hungry. You might rather have it making decisions for you," the Air Force general said.

"But until you have in place the foundations of ethical AI that allow that to happen, you can't get there. So it is a very important discussion. It's one that's being held at the very highest levels of the Department of Defense."

Milton Quintanilla is a freelance writer and content creator. He is a contributing writer for Crosswalk Headlines and the host of the For Your Soul Podcast, a podcast devoted to sound doctrine and biblical truth. He holds a Master of Divinity from Alliance Theological Seminary.