
NASA NeMO-Net

Evidence Tier: DOCUMENTED

Published in academic literature

For: Researchers & Academics · General Public & Enthusiasts

App Summary

NASA NeMO-Net is a citizen science game where players help classify coral reefs by painting 3D images, generating training data for a neural network that maps marine habitats. The associated research describes a deep learning approach that uses these citizen-generated labels to train a convolutional neural network, which demonstrated up to 94.4% accuracy in preliminary four-class coral classification. The authors conclude that this gamified approach can generate high-resolution 3D labels at an unprecedented scale, augmenting satellite data to produce global habitat maps for conservation efforts.

App Screenshots

[30 app screenshots]

Detailed Description

Functionality & Mechanism

Developed by NASA, NeMO-Net is a citizen science application designed to generate training data for a machine learning model. The interface presents users with high-resolution 3D models of coral reefs captured by NASA's FluidCam, a fluid-lensing remote sensing instrument that corrects for ocean wave distortion. Sessions involve semantically segmenting these models by "painting" classifications onto different benthic habitats. An integrated active learning framework lets users rate and edit one another's classifications, contributing to a high-quality dataset used for global coral reef mapping.
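The rate-and-edit loop implies some consensus step that turns many overlapping user "paintings" into one label per pixel. The publications do not spell out that algorithm, so the sketch below is only a hypothetical weighted majority vote, with user reliability weights standing in for peer ratings (all names, weights, and classes are illustrative):

```python
from collections import defaultdict

def aggregate_labels(classifications, user_weights):
    """Combine per-pixel labels from multiple users into a consensus map.

    classifications: {user_id: [label per pixel]} for the same patch
    user_weights:    {user_id: reliability in [0, 1]}, e.g. derived from
                     peer ratings of that user's past classifications
    Returns the weighted-majority label for each pixel.
    """
    n_pixels = len(next(iter(classifications.values())))
    consensus = []
    for i in range(n_pixels):
        votes = defaultdict(float)
        for user, labels in classifications.items():
            votes[labels[i]] += user_weights.get(user, 0.0)
        consensus.append(max(votes, key=votes.get))
    return consensus

# Three users label the same 4-pixel patch with benthic classes.
labels = {
    "u1": ["coral", "sand", "coral", "algae"],
    "u2": ["coral", "sand", "sand",  "algae"],
    "u3": ["rock",  "sand", "coral", "coral"],
}
weights = {"u1": 0.9, "u2": 0.8, "u3": 0.3}  # u3 rated less reliable by peers
print(aggregate_labels(labels, weights))  # ['coral', 'sand', 'coral', 'algae']
```

Down-weighting poorly rated users is one simple way such a framework could filter out the variability inherent in citizen science data before it reaches the CNN.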

Evidence & Research Context

  • The system's underlying convolutional neural network demonstrated total classification accuracies of approximately 85% (WorldView-2 imagery) and 80% (Planet imagery) across nine benthic classes, trained and tested on a Fijian island chain imaged under highly variable day-to-day spectral conditions.
  • A preliminary validation study reported a four-class coral classification accuracy of 94.4% during early development stages.
  • The citizen science platform generated over 70,000 user classifications within its initial seven months, demonstrating its capacity for large-scale data acquisition.
  • An active learning framework is integrated to evaluate and filter user-generated data, a documented method for enhancing the quality of the final training dataset.
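Li et al. report that the FCN was combined with classical pixel-based methods such as K-nearest neighbors, which excel at classification within the spectral spaces the network identifies. As a self-contained illustration of the KNN half of that pairing, here is a minimal pixel classifier over spectral band vectors; the band values and classes are invented toy data, not NeMO-Net imagery:

```python
import math

def knn_classify(pixel, training_pixels, k=3):
    """Classify one pixel's spectral vector by majority vote of its
    k nearest labeled neighbors (Euclidean distance in band space)."""
    by_dist = sorted(training_pixels, key=lambda tp: math.dist(pixel, tp[0]))
    nearest = [label for _, label in by_dist[:k]]
    return max(set(nearest), key=nearest.count)

# Toy 3-band reflectance samples (hypothetical values).
training = [
    ((0.10, 0.30, 0.20), "coral"),
    ((0.12, 0.28, 0.22), "coral"),
    ((0.55, 0.60, 0.58), "sand"),
    ((0.50, 0.62, 0.55), "sand"),
    ((0.05, 0.08, 0.04), "deep water"),
]
print(knn_classify((0.11, 0.29, 0.21), training))  # coral
```

In the published system the heavy lifting is done by the FCN on augmented multimodal data; a per-pixel method like this would operate only within the spectral-spatial regions the network has already delineated.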

Intended Use & Scope

This application is intended for the general public to contribute to a citizen science research project. Its primary utility is to generate a large-scale, human-validated dataset for training and refining an automated global coral reef mapping algorithm. The tool does not provide direct ecological assessments but facilitates the creation of foundational data for large-scale environmental research and future conservation planning.

Studies & Publications

3 publications

Peer-reviewed research associated with this app.

Development/Design Paper

NeMO-Net: Gamifying 3D Labeling of Multi-Modal Reference Datasets to Support Automated Marine Habitat Mapping

van den Bergh et al. (2021) · Frontiers in Marine Science

Describes the research-driven development of this app
NASA NeMO-Net, the Neural Multimodal Observation and Training Network for global coral reef assessment, is a convolutional neural network (CNN) that generates benthic habitat classification maps for coral reef and other shallow marine ecosystems from 2D satellite and 3D airborne remote sensing imagery. Training CNNs with high-accuracy labels for automated 2D and 3D semantic segmentation is a challenging task in machine learning. To overcome this big data challenge, we present a novel 3D online classification video game for mobile and desktop devices. Leveraging the power of citizen science, the NeMO-Net video game is able to generate high-resolution 3D benthic habitat labels at an unprecedented scale. The NeMO-Net video game trains users to accurately identify coral reef families and semantically segment 3D scenes captured using NASA FluidCam, the first remote sensing system capable of mitigating refractive ocean wave distortion. An active learning framework is used to allow users to rate and edit other user classifications. Data labels from the game are used to train the NeMO-Net CNN to autonomously map shallow marine systems, significantly augmenting satellite habitat mapping accuracy in these regions.

We share the NeMO-Net approach to user training and retention, outline the 3D labeling technique developed to accurately label complex coral reef imagery, and present preliminary results from over 70,000 user classifications as well as criteria for evaluating and filtering user data, a vital step in overcoming the inherent variability of citizen science. Finally, we examine how future citizen science and machine learning approaches might benefit from label training in 3D space using an active learning framework.

Within 7 months of its launch, NeMO-Net has reached over 300 million people globally and enabled a new generation to directly participate in a scientific campaign, uninhibited by geography, language, or physical ability. As the NeMO-Net video game reaches the needed training threshold for the NeMO-Net CNN, anticipated in early 2021, it will help produce the first cm-scale trained global shallow marine habitat mapping products later in 2021. These multimodal, multidecadal, high-resolution global data products will enable novel conservation applications by the UN (SDG 14), IUCN, federal, state, indigenous, and non-profit organizations.
Development/Design Paper

NASA NeMO-Net's Convolutional Neural Network: Mapping Marine Habitats with Spectrally Heterogeneous Remote Sensing Imagery

Li et al. (2020) · IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing

Describes the research-driven development of this app
Recent advances in machine learning and computer vision have enabled increased automation in benthic habitat mapping through airborne and satellite remote sensing. Here, we applied deep learning and neural network architectures in NASA NeMO-Net, a novel neural multimodal observation and training network for global habitat mapping of shallow benthic tropical marine systems. These ecosystems, particularly coral reefs, are undergoing rapid changes as a result of increasing ocean temperatures, acidification, and pollution, among other stressors. Remote sensing from air and space has been the primary method in which changes are assessed within these important, often remote, ecosystems at a global scale. However, such global datasets often suffer from large spectral variances due to the time of observation, atmospheric effects, water column properties, and heterogeneous instruments and calibrations. To address these challenges, we developed an object-based fully convolutional network (FCN) to improve upon the spatial-spectral classification problem inherent in multimodal datasets. We showed that with training upon augmented data in conjunction with classical methods, such as K-nearest neighbors, we were able to achieve better overall classification and segmentation results. This suggests FCNs are able to effectively identify the relative applicable spectral and spatial spaces within an image, whereas pixel-based classical methods excel at classification within those identified spaces. Our spectrally invariant results, based on minimally preprocessed WorldView-2 and Planet satellite imagery, show a total accuracy of approximately 85% and 80%, respectively, over nine classes when trained and tested upon a chain of Fijian islands imaged under highly variable day-to-day spectral inputs.

In the Media

NeMO-Net

NASA developed NeMO-Net as a single-player iPad game where players help classify coral reefs by painting 3D and 2D coral images, using game data to train the first neural multi-modal observation and training network for global coral reef assessment. The app leverages NASA's Supercomputer Pleiades and exploits active learning with mm-scale remotely sensed 3D images captured using fluid lensing technology to remove ocean wave distortion. Current global coral reef assessments suffer from segmentation errors greater than 40%, which NeMO-Net aims to improve through unprecedented spatial and temporal scale analysis.

Source: NeMO-Net

Map the World's Coral Reefs for NASA with NeMO-Net

NASA scientists created NeMO-Net to help map the world's coral reefs by having players trace corals in satellite photos, training algorithms to automatically identify reef features. The game greets new players with oceanographer Sylvia Earle explaining "Your mission is to take command of a research vessel, and travel the world collecting data on the ocean." Players use in-game paintbrushes to color-code coral and ocean-floor features in three dimensions while learning to identify different types of corals.

Source: Discover Magazine

NASA NeMO-Net video game helps researchers understand global coral reef health

NASA's Ames Research Center developed NeMO-Net to address coral reef conservation challenges, using a video game approach that trains artificial intelligence tools while engaging citizen scientists. Lead author Jarrett van den Bergh explains that "vast amounts of 3D coral reef imagery need to be classified so that we can get an idea of how coral reef ecosystems are faring over time." The game utilizes convolutional neural networks to automatically analyze complex underwater imagery collected from divers, snorkelers, and satellites.

Source: Frontiers

NASA asks gamers to map coral reefs with NeMO-Net

NASA Ames Research Center developed NeMO-Net to help analyze coral reef data through citizen science gaming, using 3D "fluid lensing" camera images captured from locations including Puerto Rico, Guam and American Samoa. "NeMO-Net leverages the most powerful force on this planet: not a fancy camera or a supercomputer, but people," said principal investigator Ved Chirayath, who developed the neural network that uses player input to build a global coral map. The game trains NASA's Pleiades supercomputer to recognize corals through machine learning, with classification accuracy improving as more players participate.

Source: Oceanographic Magazine

NASA Calls on Gamers, Citizen Scientists to Help Map World's Corals

NASA's Ames Research Center developed NeMO-Net to help map the world's coral reefs by having gamers and citizen scientists classify corals using 3D ocean floor images captured by fluid-lensing cameras. "NeMO-Net leverages the most powerful force on this planet: not a fancy camera or a supercomputer, but people," said principal investigator Ved Chirayath. Players interact with real NASA data while virtually traveling on the research vessel Nautilus, with their input training NASA's Pleiades supercomputer to recognize corals from ocean imagery.

Source: NASA

NASA NeMO-Net · Free