Army Improves Systems Testing to Deliver More Capability to Fight

Dr. Jeffery Holland, director of the U.S. Army Engineer Research and Development Center and chief scientist for the U.S. Army Corps of Engineers, speaks during an ERS seminar in Springfield, Va., March 25, 2015. (U.S. Army photo/David Vergun)

SPRINGFIELD, Va. -- The Army is now using high-performance computer modeling and simulation to ensure its weapons platforms and systems deliver greater effectiveness to warfighters, said Dr. Jeffery Holland, director of the U.S. Army Engineer Research and Development Center.

Holland, who is also chief scientist for the U.S. Army Corps of Engineers, spoke at a National Defense Industrial Association-sponsored Engineered Resilient Systems, or ERS, seminar here March 25.

Weapons platforms, like trucks and aircraft and all their internal components, are being subjected to more rigorous testing throughout the design phase, and even after production, using supercomputers to model and simulate all kinds of extreme conditions, Holland said.

The supercomputers are really good at digesting huge chunks of data using many variables that simulate such things as dust, humidity, shock and vibration, materiel fatigue over time and so on, he said.

Once that is all digested, the supercomputer spits out its analysis of that data, often within mere seconds, he said, providing an easy-to-understand picture of failure and friction points between systems or between components within systems.

Different types of software programs can even massage the data to produce analyses of alternative designs, cost-benefit comparisons, link analyses, risk assessments and so forth.

The supercomputer might come up with two billion design alternatives and then narrow that down to the 10 best. Analysts then need not question whether something is optimal, but rather why it is optimal, he said.

In Army-speak, the "why" question might be that such and such a design will make the system more (or less) lethal, survivable, mobile and so on. Optimal component designs will involve tradeoffs: a system might go faster, for example, but only by carrying lighter protective armor, he said.

Those trades inform Army decision makers so they can make better decisions for the warfighter in a budget-constrained environment, he said.
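Narrowing billions of alternatives down to a short list of best tradeoffs is commonly done with Pareto filtering: discard any design that another design beats on every attribute at once. The sketch below is purely illustrative; the attribute names and values are hypothetical and not drawn from the ERS toolset.

```python
# Illustrative Pareto filter over a trade space of speed vs. protection.
# A design is "dominated" if some other design is at least as good on
# both attributes and strictly better on one; dominated designs are cut.

def pareto_front(designs):
    """Return the designs no other design dominates."""
    front = []
    for d in designs:
        dominated = any(
            other["speed"] >= d["speed"] and other["armor"] >= d["armor"]
            and (other["speed"] > d["speed"] or other["armor"] > d["armor"])
            for other in designs
        )
        if not dominated:
            front.append(d)
    return front

candidates = [
    {"name": "A", "speed": 70, "armor": 3},  # fast, lightly armored
    {"name": "B", "speed": 55, "armor": 8},  # slower, heavily armored
    {"name": "C", "speed": 50, "armor": 5},  # worse than B on both counts
]

print([d["name"] for d in pareto_front(candidates)])  # ['A', 'B']
```

Every surviving design represents a genuine tradeoff, which is why the analyst's question shifts from "is it optimal?" to "why is it optimal?"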


The Army and the rest of the Department of Defense, or DoD, are testing equipment in seven laboratories, including the Army Research Laboratory, with 2,500 ERS personnel spread throughout the United States in what is known as the ERS Community of Interest, or CoI, he said, meaning ERS is not physically located in one place.

The ERS CoI is less than two years old, he said, so harnessing the potential of this computing power is only beginning to take place, with many of the protocols still being worked out. 

Incidentally, there are 17 communities of interest across DoD, which include counter-weapons of mass destruction, autonomy, space, sensors, human systems, electronic warfare, air platforms and others. The CoI effort brings the DoD science and technology enterprise together in what is broadly termed Reliance 21.

All of the CoIs collaborate, as do a number of industry and academic partners, he said.


Holland, who has been working with supercomputers for three decades, has already seen some successes.

For example, Boeing asked ERS to evaluate a rotor-blade design for the CH-47. Boeing was hoping it might also use the rotor for the Future Vertical Lift program. Holland's team cranked up the supercomputer, feeding shovels full of data into it, and within hours, his team suddenly realized that the new rotor design was so efficient, it would increase the overall range of the helicopter "by a factor of two," he said. 

Without using this approach "we wouldn't have seen the results at this point," he said. Instead, the DoD approach would have been simply to fly the aircraft around and nothing would have been known for quite a while, if at all.

Without getting too technical, the military approach to designing ships, aircraft, tanks and so on uses "a point-based, spiral design." What that means, he said, is that designers first shape the shell of the platform, and once the size and shape are determined, they then try to figure out how much equipment they can fit into it.

He termed this approach: "Worship the hull, and everything shall be added unto you."

What ends up happening is that the design adds cost, size and weight, and is much less efficient than designing to mission requirements in a "set-based design" process using the data-crunching power of supercomputers and various software packages.
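The set-based idea can be sketched in a few lines: rather than committing to one "point" design early and stuffing it afterward, keep the entire feasible design set and prune it as mission requirements firm up. The design variables and requirement thresholds below are invented for illustration and are not from any actual Army program.

```python
# Toy set-based design: enumerate a small design space, then eliminate
# designs requirement by requirement. Variables and thresholds are
# hypothetical, for illustration only.
import itertools

# A tiny design space: hull length (m), engine power (kW), payload (t).
space = [
    {"hull_m": h, "power_kw": p, "payload_t": t}
    for h, p, t in itertools.product((10, 12, 14), (400, 600), (2, 4, 6))
]

requirements = [
    lambda d: d["payload_t"] >= 4,                 # carries mission payload
    lambda d: d["power_kw"] / d["hull_m"] >= 40,   # power per meter of hull
]

for req in requirements:  # prune the set as each requirement firms up
    space = [d for d in space if req(d)]

print(len(space), "designs remain in the feasible set")  # 8 remain of 18
```

The point-based alternative would pick one hull up front and discover only later that it cannot meet the payload or power requirement, which is the cost-and-weight spiral the article describes.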

ERS is beginning work on the Army's autonomous ground vehicle systems in Warren, Michigan, as well, he said, although it's still in the planning stages.

Even before the ERS CoI was formed, Holland was experimenting with supercomputers, doing things that ERS hopes to continue in a more methodical fashion moving forward.

For example, while wearing his Corps of Engineers hat, he and his team fed reams of data into a supercomputer, involving characteristics of hurricanes moving through the Gulf of Mexico. This was post-Hurricane Katrina, so there was a lot of interest, he said.

It was found that of 60 named storms in the Gulf of Mexico, 22 percent came within 90 nautical miles of New Orleans, he said. The supercomputer then ran a series of 1,000 "synthetic storms" based on the tracks and characteristics of past storms.

Risk-approach software showed that storms even farther than 90 nautical miles from the city could adversely impact the city, he said.
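The synthetic-storm technique is essentially a Monte Carlo exercise: draw many plausible storm tracks from distributions fitted to historical storms, then measure how often they threaten the city. The sketch below is a toy illustration; the straight-line track model, drift parameters and step counts are invented and bear no relation to the Corps' actual hurricane models.

```python
# Toy Monte Carlo "synthetic storms": sample hypothetical Gulf storm
# tracks and count how many pass within 90 nautical miles of New Orleans.
# All model parameters here are invented for illustration.
import math
import random

random.seed(42)

CITY = (29.95, -90.07)  # New Orleans (lat, lon)

def closest_approach_nm(track):
    """Minimum distance from the city over a track, in nautical miles
    (one degree of latitude is about 60 nm)."""
    return min(
        60 * math.hypot(lat - CITY[0],
                        (lon - CITY[1]) * math.cos(math.radians(CITY[0])))
        for lat, lon in track
    )

def synthetic_track():
    """One synthetic storm: starts in the Gulf, drifts roughly north."""
    lat, lon = random.uniform(24, 27), random.uniform(-94, -86)
    track = []
    for _ in range(48):                 # 48 six-hour time steps
        track.append((lat, lon))
        lat += random.gauss(0.25, 0.1)  # northward drift per step
        lon += random.gauss(0.05, 0.15) # east-west wobble per step
    return track

storms = [synthetic_track() for _ in range(1000)]
near = sum(closest_approach_nm(t) <= 90 for t in storms)
print(f"{near / 10:.1f}% of synthetic storms pass within 90 nm")
```

Running thousands of such synthetic storms, rather than waiting for real ones, is what lets risk software flag threats even from storms tracking well outside the historical 90-nautical-mile band.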

In 2008, Hurricane Gustav entered the Gulf and, although it later weakened, passed about 300 miles south of New Orleans.

Before Gustav got that far, Holland and his team were tracking it, feeding synthetically modeled data into the supercomputer and sending the results to the Corps, the Federal Emergency Management Agency and other responders.

The data, he said, showed that Gustav's track would have a "wraparound effect" on the eastern side of New Orleans, causing Lake Pontchartrain to breach and inundate populated areas, so his team recommended that the sea gates be closed as a precaution.

The simulation turned out to be highly accurate, validating the approach and potentially saving millions of dollars and possibly lives, he said.

Ten years ago, all of that data fed into the supercomputer might have been thrown away because no one would have known how to make use of it. Now that's changed, he said. "That's the power of advanced modeling and analytics." 
