Map901: Rich Interior Hazard Maps for First Responders
Background
The University of Memphis, in conjunction with the City of Memphis, applied for the 2018 NIST Public Safety Innovation Accelerator Program (PSIAP) Point Cloud City award. The research aims to build an extensive catalog of annotated 3D indoor point clouds that industry, academia, and government can use to advance research and development in indoor mapping, localization, and navigation for public safety, and to demonstrate the potential value of indoor mapping.
On October 1, 2018, the City of Memphis was selected, along with three other applicants, for its Map901: Building Rich Interior Hazard Maps for First Responders proposal.
Service Question
How can we advance research in the areas of indoor mapping, localization, and navigation to benefit public safety and demonstrate the potential value of indoor mapping?
Implementation
The City of Memphis partnered with the University of Memphis to survey 1.86 million square feet of indoor space across seven facilities.
When surveying began, the team used a camera and LiDAR separately, which proved problematic when it came time to sync the data. To correct the issue, GVI LiBackpacks with GPS input for timing and synchronization were combined with an Insta360 camera (providing colors that can easily be referenced to the LiDAR points) and SLAM software that stitches all snapshots into one 3D model. Data is gathered as the team walks through the building with the backpack.
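With GPS supplying a shared clock, the synchronization problem reduces to pairing each camera frame with the nearest LiDAR scan in time. A minimal sketch of that nearest-timestamp matching (the timestamps, tolerance, and function name here are illustrative, not from the project's software):

```python
import bisect

def match_frames_to_scans(frame_ts, scan_ts, tolerance=0.05):
    """Pair each camera frame with the nearest LiDAR scan on a shared
    (e.g. GPS-disciplined) clock; drop pairs farther apart than `tolerance` s."""
    pairs = []
    for i, t in enumerate(frame_ts):
        j = bisect.bisect_left(scan_ts, t)
        # Candidates: the scan just before and just after the frame time.
        best = min(
            (k for k in (j - 1, j) if 0 <= k < len(scan_ts)),
            key=lambda k: abs(scan_ts[k] - t),
        )
        if abs(scan_ts[best] - t) <= tolerance:
            pairs.append((i, best))
    return pairs

# Hypothetical timestamps (seconds on the shared clock).
frames = [0.00, 0.10, 0.20, 0.30]
scans = [0.01, 0.11, 0.24, 0.42]
print(match_frames_to_scans(frames, scans))  # → [(0, 0), (1, 1), (2, 2)]
```

The last frame is dropped because no scan falls within the tolerance window, which is exactly the failure mode that separate, unsynchronized clocks made common.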
The resulting 3D model is a point cloud with accurate color.
A good parallel is dot matrix printing. Every LiDAR point is georeferenced to the millimeter through GPS-RTK (real-time kinematic) positioning, which reaches maximum accuracy within five to ten minutes. The color reveals objects that wouldn't be visible with LiDAR alone, such as bicycles, and could be used for virtual reality applications.
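Coloring each LiDAR point from a 360° camera amounts to projecting the point into the panorama's equirectangular image and sampling the pixel there. A simplified sketch of that projection (assumes the point is already expressed in the camera frame; a real pipeline would also handle sensor pose, lens calibration, and occlusion):

```python
import math

def point_to_equirect_pixel(x, y, z, width, height):
    """Map a 3D point (camera frame: x forward, y left, z up) to pixel
    coordinates in a width x height equirectangular panorama."""
    yaw = math.atan2(y, x)                    # -pi..pi around the vertical axis
    pitch = math.atan2(z, math.hypot(x, y))   # -pi/2..pi/2 above/below horizon
    u = (0.5 - yaw / (2 * math.pi)) * width   # column: forward maps to center
    v = (0.5 - pitch / math.pi) * height      # row: horizon maps to mid-height
    return int(u) % width, min(max(int(v), 0), height - 1)

# A point straight ahead lands at the center of the panorama.
print(point_to_equirect_pixel(1.0, 0.0, 0.0, 4096, 2048))  # → (2048, 1024)
```

The RGB value at that pixel is then attached to the LiDAR point, yielding the colorized point cloud described above.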
Objects of Interest
In addition to LiDAR mapping, objects of interest are catalogued through two machine-learning models: Mask R-CNN (a region-based convolutional neural network) and Google's Inception-ResNet-v2. Mask R-CNN, a gold-standard model for object detection and instance segmentation, analyzes snapshots and localizes each object with a bounding box. Inception-ResNet-v2 then classifies those objects and refines the bounding boxes. Forty-three label classes have been created for the training and testing dataset, e.g. exit sign, fire alarm, person.
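The two-stage flow can be sketched schematically as follows. The stub functions below merely stand in for Mask R-CNN (detection) and Inception-ResNet-v2 (classification and box refinement); the boxes, scores, threshold, and function names are all illustrative, not the project's actual code:

```python
# Schematic of the detect-then-classify pipeline; stubs replace the real models.
CONFIDENCE_THRESHOLD = 0.5  # hypothetical cutoff for keeping a detection

def detect_objects(snapshot):
    """Stub detector: returns candidate boxes (x1, y1, x2, y2) with scores,
    as a Mask R-CNN-style model would."""
    return [
        {"box": (10, 10, 60, 120), "score": 0.92},
        {"box": (200, 40, 230, 80), "score": 0.31},  # low-confidence candidate
    ]

def classify_and_refine(snapshot, detection):
    """Stub classifier: assigns one of the label classes and nudges the box;
    a real model would refine it from image features."""
    x1, y1, x2, y2 = detection["box"]
    return {"label": "exit sign", "box": (x1 + 1, y1 + 1, x2 - 1, y2 - 1)}

def catalog_objects(snapshot):
    results = []
    for det in detect_objects(snapshot):
        if det["score"] < CONFIDENCE_THRESHOLD:
            continue  # discard weak detections before the second stage
        results.append(classify_and_refine(snapshot, det))
    return results

print(catalog_objects("frame_0001.jpg"))
# → [{'label': 'exit sign', 'box': (11, 11, 59, 119)}]
```

The key design point is the split: the first stage only localizes candidates, and the second stage assigns one of the 43 label classes and tightens each box.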
Results
The results of this project could change the way firefighters respond to an event. A commander could better coordinate a response by visualizing the team's locations in the building as they move. A firefighter could see past smoke to the interior of a building, with important objects highlighted, e.g. exits and people.
The team collected 17 TB of raw data and 3 TB of processed data. The processed data is shared with NIST to accelerate research related to indoor mapping for public safety use cases, and a 3D visualization application, Insight 3D, is shared with first responders.
Major Contributors
National Institute of Standards and Technology, City of Memphis, University of Memphis
Video demo of the FedEx Institute of Technology
Learn more about indoor mapping in the ArcGIS story map In Search of the Holy Grail: Indoor Tracking for First Responders.
Download or print the Map901 Book.