Big Data, AI to Advance Modeling and Simulation

Jan. 13, 2018 • Section: Science and Technology

Yasmin Tadjdeh

1/3/2018. Military officials and industry experts have long discussed how artificial intelligence can benefit the warfighter. The technology promises to crunch mountains of data into easily digestible bites of actionable information and to predict when parts on a vehicle are about to wear out.

However, there has been less emphasis on how it can improve modeling and simulations for training purposes — a market area that is becoming increasingly important as service leaders across the board call for more investment to improve readiness.

During the recent National Training and Simulation Association’s Interservice/Industry Training, Simulation and Education Conference in Orlando, Florida, military officials and industry experts discussed the benefits of applying the emerging technology to such scenarios.

Retired Rear Adm. James Robb, president of NTSA, the host of I/ITSEC and an affiliate of the National Defense Industrial Association, said there was a strong emphasis during the show on big data and artificial intelligence.

“What we’re really studying is how we can … bring data that’s collected from training exercises and bring it back in to replay, improve performance and give feedback to the trainees,” he said. “The emphasis is improving the training in whatever way we can with data analysis.”

Tony Cerri, director of data science, models and simulations at U.S. Army Training and Doctrine Command G2, said industry must begin incorporating artificial intelligence into modeling and simulation.

“This is going to differentiate our efforts from folks like … [Russian President Vladimir] Putin — who says that AI is the next battlefield and he plans to own it by 2030 — or the Chinese who are pouring phenomenal amounts of money into big data and AI.”

Because of the data-intensive nature of modeling and simulation, those working in the field already have an acute understanding of how to work with gobs of information and are well suited to take advantage of new technologies, he noted.

“If we can marry big data and AI with [modeling and simulation] … that’s an unbeatable advantage for not only the nation but our DoD and where we’re trying to go,” Cerri said. “I’m really excited about the potential here.”

Young Bang, senior vice president at Booz Allen Hamilton, said more countries around the world — particularly Russia, China and the United Arab Emirates — are investing in such technologies. All three have published national strategies for artificial intelligence, and each aims to be the world leader in AI by 2030, he noted.

Putin last year went as far as to say that whichever country masters artificial intelligence “will be ruler of the world.”

China is already spending more than $15 billion a year directly or indirectly to support the technology’s development, Bang noted.

“They are doing it in the commercial side, they’re doing it on the government side,” he said. They are also investing in Silicon Valley companies that are focusing on the development of artificial intelligence algorithms, he added.

Eric Schmidt, executive chairman of Alphabet, the parent company of Google, said China’s investment in AI shouldn’t be underestimated.

“They’re going to use it for both commercial as well as military objectives — with all sorts of implications,” he said during a recent conference hosted by the Center for a New American Security, a Washington, D.C.-based think tank. “We know what they are doing: They have announced their strategy. You’re crazy to treat them as somehow second-class citizens.”

In a recent blog post published by the Council on Foreign Relations, Gregory C. Allen, an adjunct fellow at CNAS, said China is far from lagging behind in the artificial intelligence game.

“The United States national security and tech communities urgently need to adjust to this new reality,” he said. “China’s military and commercial AI ambitions pose the first credible threat to United States technological supremacy since the Soviet Union.”

Beijing’s dominance is not only possible but also likely unless there is “a massive change in U.S. policy,” he added.

“The United States needs a surge of focus and funding comparable to the ‘Sputnik moment’ that launched the space race,” Allen said.

Despite its “astonishing” advances, China still remains behind the United States and its allies in the development of the technology, he said.

In a recent report, “Department of Defense: Artificial Intelligence, Big Data and Cloud Taxonomy,” Govini found that Pentagon spending on all three fields collectively reached $7.4 billion in fiscal year 2017, 32.4 percent higher than in fiscal year 2012. AI accounted for 33 percent and big data for 47.9 percent of the total spending. Overall, investments in artificial intelligence have grown at a compound annual growth rate of 14.5 percent since 2012.
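The Govini figures above imply a rough baseline for fiscal year 2012. As a back-of-the-envelope sketch (the dollar amounts below other than the quoted $7.4 billion, 32.4 percent, 33 percent and 14.5 percent are derived, not from the report):

```python
# Implied FY2012 figures from the Govini numbers quoted above.
# Assumes a 5-year span (FY2012 -> FY2017) for the 14.5% AI CAGR.

total_fy17 = 7.4                  # $ billions: AI + big data + cloud, FY2017
total_fy12 = total_fy17 / 1.324   # implied FY2012 total (32.4% lower)

ai_fy17 = 0.33 * total_fy17       # AI share: 33% of the FY2017 total
ai_fy12 = ai_fy17 / (1.145 ** 5)  # implied FY2012 AI spend at 14.5% CAGR

print(f"Implied FY2012 combined total: ${total_fy12:.2f}B")
print(f"FY2017 AI spending:            ${ai_fy17:.2f}B")
print(f"Implied FY2012 AI spending:    ${ai_fy12:.2f}B")
```

This puts the implied combined FY2012 total near $5.6 billion and FY2012 AI spending near $1.2 billion, consistent with the growth rates the report cites.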

Spending on AI has gained traction, particularly with the development of virtual reality systems for training and simulation, the report noted.

While artificial intelligence can improve modeling and simulation, modeling and simulation can in turn benefit AI, said Richard Fujimoto, a Regents’ professor at the School of Computational Science and Engineering at the Georgia Institute of Technology.

“With the emergence of big data and the internet of things, there is really increased importance in online decision-making,” he said during a panel discussion at I/ITSEC. “Computational models such as simulations clearly have an important role to play in this area particularly in the predictive element and analytics. … Modeling and simulation has been contributing, but I think there could be much more activity in this area.”

Bang added: “There is huge brain power in this community that should be mobilized to really help us in … artificial intelligence for the Defense Department to actually catch up and leapfrog our competitors.”

http://www.nationaldefensemagazine.org/articles/2018/1/3/big-data-ai-to-advance-modeling-and-simulation