Bringing the Virtual World into Reality
Manned Flight Simulators Integrate VR/MR
Technology for CMV-22 Platform
By Rob Perry
For decades, pilots have taken to flight simulators at Naval Air Warfare Center Aircraft Division’s (NAWCAD) Manned Flight Simulator (MFS) to train, test software and equipment, experiment, develop flight envelopes and even investigate aerial accidents in a safe and inexpensive manner. As newer aircraft and technologies are created, these simulators need to be upgraded to reflect those developments, as was the case with one of the Navy’s newest aircraft, the unique tilt-rotor CMV-22 Osprey.
While the Osprey can land on an aircraft carrier in the traditional fashion, by approaching the flight deck and landing, its tilt-rotor capability also allows it to take off and land vertically on carriers and other ship classes, as a helicopter would.
But in practicing vertical take-off and landing (VTOL) procedures for the CMV-22 in a simulator situation, there was one main problem: Osprey pilots are used to looking down from the cockpit and seeing the deck of the ship or the landing zone. When in a traditional simulator, where the environment is projected on a large screen in front of and surrounding the cockpit, if the pilot looks down, all they see is the simulator floor.
In addition, the existing motion platform and projection simulator at Manned Flight does not quite have the feel and range of motion to accurately depict real-time, on-station dynamics typical of the CMV-22. To upgrade the platform to the necessary specifications, the engineers at Manned Flight working on the Dynamic Integrated Virtual Environment (DIVE) Program began consulting and troubleshooting, and found that the cost of upgrading the already aging platform would be upwards of $3 million and take a considerable amount of time to construct.
Up to this point, the DIVE program had been sponsoring the integration of virtual reality (VR) and mixed reality (MR) technologies with aircraft flight simulators at Manned Flight Simulator, particularly helicopters and other rotary-wing aircraft. When the motion platform upgrade cost estimate was finalized, Dr. Umberto Saetti, currently at Auburn University in Alabama, coincidentally proposed a research project to use motion-based VR simulators for aerospace research, specifically for VTOL shipboard landing operations.
Robert Calvillo and Donald Gaublomme, aerospace engineers at NAWCAD, consulted with Saetti and learned they could develop their own VR/MR motion platform and software for the CMV-22 for a fraction of the existing motion platform upgrade cost—roughly $500,000—and then tailor it for use with other aircraft as demand arises.
“There are many benefits to moving toward this virtual display,” said Matt Mueller, the Division Head at Manned Flight Simulator. “The biggest benefit being that when you have a facility where you are trying to put more and more simulators in, the advantage of being able to make your footprint smaller is huge. At this facility, we can fit nine simulators in here. But, if we were able to pull out the display systems with projectors and put something like the [VR/MR platform] in, we can have several more motion-capable simulators. They are cheaper to maintain, have a smaller footprint and offer us more capability.”
The new VR/MR motion platform sits in the center of what was once a cubicle office area at Manned Flight, a space that was repurposed more than 18 months ago once the COVID-19 pandemic hit and many workers were forced into a telework capacity. The platform sits on the floor, with hydraulic pistons extending upward and attaching below the cockpit seat. An array of computer servers and equipment are tucked away in towers to the side of the room, powering the software and platform itself. To get into the pilot seat, one must climb a short ladder and be limber enough to maneuver into the seated position. Once situated, the pilot can don the VR headset, put their hands on the throttle and flight stick and toggle the foot pedals. After a brief calibration, the pilot can look around and see the inside of the cockpit, as well as the deck view of an aircraft carrier deployed at sea. Pilots can then practice taking off from and landing on the carrier, while others observe what the pilot sees on a computer monitor only a few feet away.
The installation of the newest platform began in earnest about 18 months ago, but the push to move toward VR capability began even earlier.
“About four years ago when we were at the tail end of our first virtual reality project, our crew chiefs said they really enjoyed the VR aspect and that’s when we really started working on the mixed reality concept,” Calvillo said. “We had pilots come in and put on the VR headset and they were essentially able to see outside the windshield of the real cockpit and see the virtual environment, but when they look inside the cockpit, they see the real world, their real hands and instrument panels. Everyone started to see that virtual reality and mixed reality had some really good potential, so we needed to expand on it further.”
Calvillo said that’s when the team realized that refurbishing the old motion platform would not accomplish the needs of the VR/MR environment and alternatives were explored.
While the CMV-22 cockpit motion platform is the only one currently being used, it has the capability to be modified to suit other platforms including the MH-60 Seahawk and the F/A-18. Calvillo said the F-35 Lightning II Joint Program Office has expressed interest in developing a VR/MR trainer with an existing spare cockpit in storage at Manned Flight. In addition, Calvillo said there has also been interest in creating a VR training space for landing signal officers (LSO) to use and perform their deck duties as if on a deployed ship in first-person virtual reality.
Jim Pritchard, a former Marine aviator who has experience flying CH-53E Super Stallion and other single main rotor/tail rotor helicopters, said all of his familiarity with the V-22 has been through virtual simulators at Manned Flight.
“I’ve been involved with virtual simulation and handling qualities studies for the past 15 years on all type/model/series [T/M/S] rotorcraft hosted by Manned Flight Simulator,” he said.
Pritchard, the Flight Dynamics Rotary Wing Team Lead in the Aeromechanics Division with NAWCAD, first tested out the VR/MR platform in January and provided his feedback to the team, stating that the visual scene and ship models were rendered accurately and believably.
“We generated a list of items necessary for further development,” Pritchard said. “However, the overall assessment is very positive. This new technology has the potential to revolutionize modeling and simulation in the shipboard environment.”
In comparing the current and frequently used motion platform simulators with immersive projector screens to the newer VR/MR motion platform, Calvillo said there is still plenty of room for improvement with the new VR/MR equipment. For example, he said he is currently working with a Small Business Innovation Research (SBIR) effort to develop a higher resolution headset that would also be more ergonomic and cause less eye strain, which can be a byproduct of staring into a VR headset for an extended period of time.
“One of the limitations is head tracking and jitter update in the system, a key area that the industry in general—not just us— is trying to focus on,” Mueller said. “If you keep your head relatively stationary, and you are not moving your head a lot, the visuals look pretty good. But if you were in a situation where you are moving around, reaching for knobs and your whole body is shifting along with your eyes, it can be harder for the visuals to stay in sync. So, you will notice a bit of a jitter, which takes you out of the simulation.”
Calvillo also said that latency in the software, platform motion and control response is constantly being addressed.
“From stick to endpoint, the goal in the simulation is to get the response to under 100 milliseconds,” Mueller said. “If you can stay under that, the human brain has a hard time perceiving the delay. If you go over 100 milliseconds, then you can get motion sickness as, over time, the brain will notice there is a delay in when you told something to happen and when it is actually happening.”
The software currently uses the Unity 3D game engine, which is also used in the gaming industry. Calvillo said another team is working to develop software using the Unreal Engine, another widely popular 3D game engine used throughout the gaming industry to render games for platforms such as PlayStation, Xbox and PC.
“We’re hoping the Unreal Engine can address a lot of the limitations of Unity and also address a lot of the limitations of our existing image generator software capabilities for our traditional display systems,” Calvillo said.
Calvillo said pilots who have tried the VR/MR headset have described some of the limitations of the headset, including not having as wide a field of vision as they would see in the real world.
“We’re hoping with the newer headsets, when they come online, pilots will be able to fly like they would in the real aircraft as opposed to flying to the limitations of the headset,” he said.
Pritchard said the difference between the standard simulators and the VR/MR platform is “significant.”
“First of all is its adaptability,” Pritchard said of the VR/MR platform. “Only a few hardware components need to be changed to accommodate the wide range of T/M/S air vehicles in Naval Aviation—the rest of the changes come in the form of software. Second is its increased fidelity. The standard cab simulator is limited in visual rendering. In the shipboard environment the air crew uses visual information very close to nadir as well as behind the beam; the MR headset is eminently capable of rendering that information to the air crew. And third is its flexibility. MR allows the engineers to display information in the best possible manner, either artificially or virtually or both.”
While still working on improving the newest VR/MR motion platform simulator, Calvillo said the NAWCAD team is working to use VR in a Joint Simulation Environment—a scalable, expandable, high-fidelity, government-owned, non-proprietary modeling and simulation environment for conducting testing on fifth-plus generation aircraft and systems, accreditable for test as a supplement to open-air testing. Calvillo said his team has been contacted to consult on VR/MR simulators for the CH-47 Chinook helicopter, as well as systems for the Army, Air Force, Naval Sea Systems Command (NAVSEA) and even NASA.
Mueller said the Joint Simulation Environment aims to set up entire facilities comprised of only mixed reality headsets: “no projectors or display systems as we see today.”
“There are problems to solve, which is why we keep tackling them,” Mueller said. “One of the big goals that we are shooting toward is to be able to have a facility of 30-plus simulators all flying in close proximity to each other.
“The advantages and the versatility you get with the VR headsets and the advances that are being made, I expect they will become the more accepted solution moving forward.”
Pritchard said he sees the simulators as a cost-saving measure as well.
“Here at Naval Air Systems Command, one of our primary data products comes through flight test, which is resource intensive, both in cost and schedule. This motion platform with MR headset has the potential to augment, and in some cases even replace, flight test, thereby realizing significant cost savings in a critical aspect of data collection,” he said. “Regarding the larger scale naval fleet technology application, this unique simulator has the potential to provide high fidelity simulation for training, certification and currency, which would be a major breakthrough for Naval Aviation simulation.”
Rob Perry is a writer/editor with Naval Aviation News.