These kinds of initiatives are now taking on a newer, more advanced character as AI-enabled sensors, computers and targeting systems process and organize information more quickly, enabling ever-greater measures of autonomy.
Dr. Bruce Jette, Assistant Secretary of the Army for Acquisition, Logistics & Technology, told TNI that weapons developers are seeking a ground-vehicle “sensor fusion” to enable soldiers to make rapid decisions when faced with fast-changing combat variables.
“Vehicle crews are seeking optimal data to understand the terrain in front of them, to decide whether or not they should drive into it. Can I activate additional sensors, whether active or passive, to discern what really is there?” Jette told TNI.
Commercial applications of autonomy, such as those now used for driverless cars, have been advancing for quite some time; Army developers, however, have been taking on something quite different. Combat vehicles need autonomy not just for linear navigation but for managing an integrated series of complex, fast-changing variables such as incoming attacks, rocky terrain, air integration and optimal methods of attack.
“We don’t want Soldiers to be operating these remote-controlled vehicles with their heads down, constantly paying attention to the vehicle in order to control it. We want these systems to be fully autonomous so that these Soldiers can do their jobs and these autonomous systems can work as teammates and perform effectively in the battlefield,” Dr. John Fossaceca, Artificial Intelligence for Maneuver and Mobility Program Manager, Army Research Laboratory, Combat Capabilities Development Command, Army Futures Command, said in an Army report.
Jette used an interesting term when describing the Army’s sought-after technological advantages, calling it a kind of “sensor fusion.” The term was likely not used by accident, as it often refers to the integrated sensor applications now operational in the F-35. Using early iterations of AI, computers on board the F-35 take otherwise disparate or stovepiped streams of combat-relevant data, perform analytics on the information, organize it and present a single coherent picture for pilots to view. A single screen display therefore contains navigational, targeting, flight and threat information simultaneously, merging a 360-degree camera system called the Distributed Aperture System with a long-range Electro-Optical Targeting System and other crucial flight variables. A ground equivalent of this kind of application would seem to call for an even greater measure of complexity, as ground autonomy must account for a wider range of variables.
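As a rough illustration of the fusion pattern described above, the sketch below merges several otherwise separate sensor feeds into one consolidated picture. It is a minimal, hypothetical example in Python; the feed names, fields and values are invented and do not represent any F-35 or Army system.

```python
# Minimal sketch of the "sensor fusion" idea described above: several
# independent sensor feeds are merged into one coherent picture for the crew.
# All names and data fields are hypothetical and purely illustrative.
from dataclasses import dataclass, field
from typing import Any, Dict


@dataclass
class FusedPicture:
    """Single integrated view built from otherwise stovepiped feeds."""
    layers: Dict[str, Dict[str, Any]] = field(default_factory=dict)

    def update(self, source: str, reading: Dict[str, Any]) -> None:
        # Each source (camera, targeting sensor, navigation) contributes its
        # latest reading; the fused picture keeps only the newest per source.
        current = self.layers.get(source, {})
        if reading.get("timestamp", 0) >= current.get("timestamp", 0):
            self.layers[source] = reading

    def render(self) -> Dict[str, Any]:
        # Flatten all layers into the single display a crew member would see.
        merged: Dict[str, Any] = {}
        for source, reading in self.layers.items():
            for key, value in reading.items():
                if key != "timestamp":
                    merged[f"{source}.{key}"] = value
        return merged


picture = FusedPicture()
picture.update("camera_360", {"timestamp": 101.2, "contact_bearing_deg": 245})
picture.update("targeting", {"timestamp": 101.4, "threat_class": "unknown"})
picture.update("navigation", {"timestamp": 101.3, "terrain": "rocky, 8% grade"})
print(picture.render())
```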
The concept is aligned with ongoing research into new generations of AI being engineered not only to gather and organize information for human decision makers but also to advance networking between humans and machines. Drawing upon advanced algorithms, computer technology can organize and disseminate otherwise disaggregated pools of data in seconds. AI-empowered sensors can bounce incoming images, video or data off a seemingly limitless existing database to assess comparisons and differences and perform near real-time analytics. This kind of phenomenon seems to represent exactly what Jette was describing when he mentioned integrated armored-vehicle sensors analyzing the upcoming terrain to make immediate decisions. At the speed of the most advanced computer processing, various AI systems can simultaneously organize and share information, perform analyses and solve certain problems that would otherwise be impossible for humans to address within any comparable timeframe. At the same time, there are many key attributes, faculties and problem-solving abilities unique to human cognition.
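The database-comparison idea can be sketched in a few lines: an incoming sensor reading is matched against a small reference library to produce a fast, provisional label. Everything here, the library, the feature vectors and the labels, is invented for illustration and does not describe any fielded system.

```python
# Illustrative sketch of comparing an incoming sensor reading against an
# existing reference "database" to get a quick, provisional classification.
# All vectors and labels are invented for the example.
import numpy as np

# Hypothetical reference library of labeled feature vectors.
reference_library = {
    "shadow":       np.array([0.9, 0.1, 0.0, 0.2]),
    "disabled_car": np.array([0.2, 0.8, 0.3, 0.1]),
    "berm":         np.array([0.3, 0.2, 0.9, 0.4]),
}


def classify(incoming: np.ndarray) -> tuple[str, float]:
    """Return the closest reference label and its cosine similarity."""
    best_label, best_score = "unknown", -1.0
    for label, ref in reference_library.items():
        score = float(incoming @ ref /
                      (np.linalg.norm(incoming) * np.linalg.norm(ref)))
        if score > best_score:
            best_label, best_score = label, score
    return best_label, best_score


label, score = classify(np.array([0.85, 0.15, 0.05, 0.25]))
print(f"closest match: {label} (similarity {score:.2f})")
```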
Jette explained that the technology has made massive leaps forward since earlier iterations of sensor integration were pursued in the Army’s Future Combat Systems (FCS) program. FCS, which began to take shape more than ten years ago, built a small fleet of Manned Ground Vehicles engineered with advanced sensors to provide a 360-degree camera view of the surrounding terrain. The Army’s now-cancelled Non-Line-of-Sight Cannon, for instance, was built with integrated surrounding cameras; however, Jette explained that the system lacked the maturity to make key combat-sensitive distinctions. Jette, who participated in FCS development while at White Sands Missile Range, N.M., years ago, said the FCS “optical systems would try to figure out what they were seeing in a ‘dark spot.’ They could not tell whether it was a shadow or a VBIED (Vehicle-Borne IED). You needed multiple sensors from different angles with a more holistic view.”
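The shadow-versus-VBIED problem Jette describes is, at bottom, a matter of combining evidence from several viewpoints rather than trusting a single sensor. The toy sketch below fuses per-sensor threat estimates with a simple log-odds sum under an independence assumption; the numbers are invented and this is only one of many ways such evidence could be combined.

```python
# Toy sketch of multi-angle evidence fusion: no single sensor can tell a
# shadow from a VBIED, so per-sensor threat estimates are combined before a
# call is made. Uses a naive log-odds sum with invented probabilities.
import math


def fuse_threat_estimates(per_sensor_probs: list[float]) -> float:
    """Naive log-odds fusion of independent per-sensor threat probabilities."""
    log_odds = 0.0  # uninformative 50/50 prior
    for p in per_sensor_probs:
        p = min(max(p, 1e-6), 1 - 1e-6)   # keep the math numerically stable
        log_odds += math.log(p / (1 - p))
    return 1 / (1 + math.exp(-log_odds))


# One camera looking straight at a dark spot is ambiguous (0.5), while two
# sensors viewing it from other angles see nothing solid there (0.1, 0.15);
# the fused estimate drops well below the single ambiguous reading.
combined = fuse_threat_estimates([0.5, 0.1, 0.15])
print(f"fused threat probability: {combined:.3f}")
```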
Interestingly, while it was cancelled more than a decade ago, the fundamental networking concept pursued for the FCS program remains largely intact, albeit with different and far more advanced technical systems. FCS was engineered upon the technical premise that a fleet of forces would operate in a coordinated, “networked” fashion wherein otherwise disparate sensors would share information in real time. It was envisioned as a layered system of sensors. The MGVs, for example, were built to be lighter weight than comparable combat platforms due to what developers called a “survivability onion.” The concept was that an armored combat vehicle could be faster, lighter and more expeditionary by virtue of having a surrounding, layered sensor system with which to detect and destroy incoming enemy fire. While this premise, as made manifest in early MGV prototypes, was deemed insufficiently survivable and the program was cancelled, the fundamental strategic effort to sustain survivability while fielding lighter-weight combat vehicles persists to this day. Moreover, it is informing many of the parameters of the Army’s more expeditionary “light tank,” the Mobile Protected Firepower vehicle.
New technologies, including active protection systems, lighter-weight armored materials, new sensor applications and rapid advancements in AI, are now making the initial FCS vision much more attainable. Reconciling, or optimizing, the seemingly contradictory balance between survivability and mobility very much informs the Army’s rationale for its family of Next-Generation Combat Vehicles. Given this, it is not surprising that the advent of advanced, AI-empowered computer algorithms is greatly impacting the developmental equation, as explained by Jette.
Using AI, sensor integration and integrated command and control, the Army is already demonstrating new applications for autonomous systems in combat. For instance, teams of Army robots conducted a “deep assault through a breach” during an exercise last year. The experiment was intended to prepare the service for a new kind of man-machine drone warfare.
The Army exercise, which pitted groups of unmanned vehicles or ground drones against a mock enemy “tank ditch” and “minefield,” was part of a massive service-wide modernization effort to prepare for a new generation of combat—one wherein self-navigating drones directly confront enemy fire in high-threat war scenarios while humans perform command and control at safer distances.
During the Army demonstration, which took place several months ago, there “was not a single soldier in any vehicle” conducting the initial breach, the commander of Army Futures Command, Gen. John Murray, told reporters.
Various kinds of advanced autonomy, naturally, already exist, such as self-guiding aerial drones and the Navy’s emerging “ghost fleet” of coordinated unmanned surface vessels operating in tandem. Most kinds of air and sea autonomous vehicles confront fewer operational challenges than ground autonomy does. Nevertheless, the concepts and developmental trajectories of air, sea and ground autonomy have distinct similarities; all are engineered to operate as part of a coordinated group of platforms able to share sensor information, gather targeting data and forward-position weapons, all while remaining networked with human decision makers.
“Future military missions are going to require autonomous vehicles that can determine what the passable routes might be, calculate the best route and make assessments about what’s happening in the environment,” Fossaceca said.