FLIR [FLIR] officials said Monday the company is “actively working on teaming” for the Army’s Optionally Manned Fighting Vehicle (OMFV) competition, while also supporting the service’s Project Convergence effort with a focus on applying artificial intelligence to multispectral imaging.

Troy Boonstra, FLIR’s vice president of product management for sensors, said the company’s work on the Army’s Robotic Combat Vehicle-Medium program has been an opportunity to showcase its potential as a teaming partner for a prime contractor pursuing the Bradley-replacing OMFV.

U.S. Army Sgt. 1st Class Chez Carter, assigned to Alpha Company, 2nd Battalion, 5th Cavalry Regiment, 1st Armored Brigade Combat Team, 1st Cavalry Division, ground guides an M2A3 Bradley Fighting Vehicle during a Table XII Live Fire Exercise at Novo Selo Training Area in Bulgaria in 2018. (U.S. Army Photo)

“We could give you 360-degree vision and stitch the picture together. We can do digital zoom in-picture. We can do a gimbaled-sensor up on top that will give you extended range capability. Those are all things that come into play from a sensor standpoint,” Boonstra told reporters. “Being in active competition, we probably don’t want to go much further than that but we are definitely interested and definitely bring a lot to the table for the central nervous system of the vehicle.”

The update on FLIR’s plans for OMFV follows an October discussion with reporters in which officials said the company was interested in providing technology and potentially partnering with a prime contractor. The Army subsequently released its Request for Proposals in December (Defense Daily, Oct. 22).

Tom Frost, FLIR’s vice president of unmanned ground systems and integrated solutions, said the company’s sensors, CBRN tools, UAS and ground robot capabilities, and autonomy tools could all be applicable to OMFV.

FLIR is currently part of a Textron Systems [TXT] team, along with Howe & Howe, that’s delivering the Ripsaw M5 platform for the Army’s Robotic Combat Vehicle-Medium prototype effort.

For the Army’s Project Convergence, which aims to prove out how the service can create a new ‘sensor-to-shooter’ network with its future weapon systems, FLIR is building on work from 2020’s first demonstration that informed how AI can support multispectral imaging, with plans to further support target recognition and decision aiding in this year’s event.

“FLIR is one of the only ones that has built up some of our own libraries, if you will, of what a multi-spectral image looks like because it’s very different than a visual image,” Boonstra said. “At the end of the day, folks need actionable intel, and that actionable intel is understanding the context.”

Boonstra said FLIR now has a Cooperative Research and Development Agreement (CRADA) with the Army C5ISR Center’s Night Vision and Electronic Sensors Directorate to provide access to libraries “that will help us grow that capability.”

The first Project Convergence demonstration at Yuma Proving Ground in Arizona brought together future capabilities, artificial intelligence-enabled systems and a new “computer brain” to prove out capacity for passing targeting data in a matter of seconds (Defense Daily, Sept. 24). 

David Proulx, FLIR’s vice president of product management for unmanned systems and integrated solutions, said the company is also interested in supporting the idea of “vehicles within vehicles” at Project Convergence.

“Concepts like the air-launched effects where you’ve got a larger manned or optionally-manned or unmanned platform that is the carrying vehicle for a smaller surrogate multi-sensor aircraft is really interesting,” Proulx told reporters. “As we look at our role in that, it’s not just providing a sensor or a bunch of sensors, it’s having a position on both the big vehicle and the smaller vehicles, and even smaller, smaller vehicles, recognizing you can nest these things like Russian dolls.”