Project Abstract
The intent of VIRDAD was to display and capture data
from the FLIR A70 infrared camera in order to detect subsurface defects
in concrete, wind turbine blades, and other civil engineering structures.
The A70 is a research camera meant to be mounted on drones, and it can transmit a
live stream of infrared and visible video, along with other data, over a wireless connection.
The application was meant to display an overlay of the visible and infrared
live streams with variable transparency, as well as to capture and store still
images and videos from those streams for later analysis by computer vision algorithms
that could detect voids and cracks in the material under inspection.
The captured images and videos could be played back within the
application, which was meant to run natively on Windows and Android,
as well as through a web application.
Project Description
What we built
Over the course of this semester we ended up building a web application capable of interfacing with a FLIR infrared camera.
Our web application currently consists of four tabs: Home, Images, Videos, and Live Feed, each with its own functionality.
On the Home tab you can test your connection to a camera by clicking the "Check Connection" button. On the Images tab you
can browse still images on the left; clicking an image displays an infrared and a visible image overlaid
on each other on the right side of the screen, and an opacity slider (located below the overlaid images) lets you fade
the infrared image in and out over the visible image. On the Videos tab you can view stored videos. Finally, on the Live Feed
tab, when connected to a camera, you can view the live feed from the FLIR camera.
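The overlay interaction on the Images tab boils down to stacking two images and letting a slider drive the opacity of the top one. Below is a minimal sketch of that idea as a small React component; the component name, image paths, and dimensions are placeholders rather than the project's actual code.

    import React, { useState } from 'react';

    // Sketch of the Images-tab overlay: the infrared image sits on top of the
    // visible-light image, and the slider below fades it in and out.
    export default function ImageOverlay() {
      const [opacity, setOpacity] = useState(0.5);

      return (
        <div style={{ width: 640 }}>
          <div style={{ position: 'relative', height: 480 }}>
            {/* Visible-light image underneath */}
            <img src="/images/sample_visible.jpg" alt="visible"
                 style={{ position: 'absolute', inset: 0, width: '100%', height: '100%' }} />
            {/* Infrared image on top; the slider drives its opacity */}
            <img src="/images/sample_ir.jpg" alt="infrared"
                 style={{ position: 'absolute', inset: 0, width: '100%', height: '100%', opacity }} />
          </div>
          {/* 0 shows only the visible image, 1 shows only the infrared image */}
          <input type="range" min="0" max="1" step="0.01" value={opacity}
                 onChange={(e) => setOpacity(Number(e.target.value))}
                 style={{ width: '100%' }} />
        </div>
      );
    }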
How it works
VIRDAD is a React Native and Node.js application. The React Native portion of the
code gives us a cross-platform front end where the user can view images,
videos, and live streams. We provide the user with a slider that varies the opacity of
the videos using JavaScript, allowing them to see the infrared and visual streams
overlaid. The front end is currently configured to run in a web browser only.
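The Live Feed overlay works the same way as the image overlay, except that the layers are live HLS streams played through Video.js (described below). The sketch that follows is one way it could be wired up; the playlist URLs, sizes, and player options are placeholders, not the project's actual configuration.

    import React, { useEffect, useRef, useState } from 'react';
    import videojs from 'video.js';
    import 'video.js/dist/video-js.css';

    // Attach a Video.js player to a <video> element and point it at an HLS playlist.
    // Video.js 7+ can play HLS in the browser through its bundled http-streaming support.
    function useHlsPlayer(videoRef, src) {
      useEffect(() => {
        const player = videojs(videoRef.current, {
          autoplay: true,
          muted: true,
          controls: false,
          fluid: true,
          sources: [{ src, type: 'application/x-mpegURL' }],
        });
        return () => player.dispose();
      }, [videoRef, src]);
    }

    export default function LiveOverlay() {
      const visualRef = useRef(null);
      const irRef = useRef(null);
      const [opacity, setOpacity] = useState(0.5);

      // Hypothetical playlist URLs produced by the Node relay sketched below.
      useHlsPlayer(visualRef, '/streams/visual/index.m3u8');
      useHlsPlayer(irRef, '/streams/ir/index.m3u8');

      return (
        <div style={{ width: 640 }}>
          <div style={{ position: 'relative' }}>
            {/* Visible-light stream underneath */}
            <video ref={visualRef} className="video-js" />
            {/* Infrared stream stacked on top; the slider drives its opacity */}
            <div style={{ position: 'absolute', inset: 0, opacity }}>
              <video ref={irRef} className="video-js" />
            </div>
          </div>
          <input type="range" min="0" max="1" step="0.01" value={opacity}
                 onChange={(e) => setOpacity(Number(e.target.value))}
                 style={{ width: '100%' }} />
        </div>
      );
    }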
The Node application uses the Express library to serve video from the FLIR A70
camera, which uses the Real Time Streaming Protocol (RTSP) to serve video over its
hidden network. Both the visual and infrared video streams are transmitted simultaneously.
Since RTSP isn't natively supported by web browsers or many video players, our Node application uses a library
called ffmpeg-static to convert the RTSP streams to HTTP Live Streaming (HLS)
in real time, allowing them to be rendered in a browser. It also uses a library called
Video.js, which provides the video player used in the browser. Using Express, we serve the
streams to the front end. Since the Node application uses only the local network of the
device it is running on, the application requires no internet connection, allowing it to be
used by civil engineers in the field.
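As a concrete illustration, the sketch below shows one way the Node relay could be structured. The camera's RTSP URLs, the port, and the directory layout are assumptions (the real values depend on the A70's configuration); ffmpeg-static supplies a bundled FFmpeg binary, which is spawned once per stream to rewrite the RTSP feed into a short rolling HLS playlist that Express then serves as static files.

    const { spawn } = require('child_process');
    const fs = require('fs');
    const path = require('path');
    const express = require('express');
    const ffmpegPath = require('ffmpeg-static'); // absolute path to a bundled FFmpeg binary

    // Hypothetical RTSP endpoints; the real URLs depend on the A70's configuration.
    const STREAMS = {
      visual: 'rtsp://192.168.1.1/visual',
      ir: 'rtsp://192.168.1.1/ir',
    };

    const HLS_DIR = path.join(__dirname, 'hls');

    // Spawn one FFmpeg process per stream: pull RTSP over TCP and emit a short,
    // continuously rewritten HLS playlist that a browser player can consume.
    function startTranscode(name, rtspUrl) {
      const outDir = path.join(HLS_DIR, name);
      fs.mkdirSync(outDir, { recursive: true });
      return spawn(ffmpegPath, [
        '-rtsp_transport', 'tcp',
        '-i', rtspUrl,
        '-c:v', 'libx264', '-preset', 'veryfast', '-tune', 'zerolatency',
        '-f', 'hls',
        '-hls_time', '1',        // one-second segments keep latency low
        '-hls_list_size', '3',   // keep only a few segments in the playlist
        '-hls_flags', 'delete_segments',
        path.join(outDir, 'index.m3u8'),
      ], { stdio: 'inherit' });  // FFmpeg logs to stderr; surface it for debugging
    }

    Object.entries(STREAMS).forEach(([name, url]) => startTranscode(name, url));

    // The front end requests e.g. /streams/ir/index.m3u8 from this server.
    const app = express();
    app.use('/streams', express.static(HLS_DIR));
    app.listen(3000, () => console.log('Serving HLS on http://localhost:3000'));

The delete_segments flag keeps disk usage bounded during long inspection sessions, since only the few most recent segments are kept on disk.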