Dharini Jha - AIMS@JCU

Dharini Jha

dharini.jha@my.jcu.edu.au

Recipient of an AIMS@JCU Scholarship

PhD
College of Science and Engineering

Dharini Jha completed her bachelor’s in Electronics and Telecommunication Engineering and master’s in Remote Sensing at the Birla Institute of Technology, India. Her areas of interest are Remote Sensing, Geographic Information Systems, Satellite Navigation and Digital Image Processing. Her noteworthy past experience includes two years as a research assistant at the Indian Institute of Technology (IIT) Kharagpur and Banaras Hindu University, India.

Investigation into Artificial Intelligence Methods for Hyperspectral Image Analysis in Coral Reef Science Applications.

2019 to 2024

Project Description

Her research project aims to investigate the use of advanced artificial intelligence (AI) methods for hyperspectral image analysis in coral reef science applications.

Project Importance

The vast majority of benthic and pelagic fish health assessments are conducted with traditional visual imagery and manual classification by expert marine researchers. Camera sensors have evolved to cover a larger contiguous spectrum, making it possible to capture more information at each pixel. Conventional cameras use an array of RGB sensors to capture the spatial and colour information of a scene, with colour information limited to three spectral bands: red, green and blue. Hyperspectral imaging systems, on the other hand, use optical techniques (e.g., prisms, diffraction gratings) to capture not only the spatial information but also a large number of contiguous spectral bands (200 to 500). The combination of spatial and spectral data allows artificial intelligence (AI) analysis methods (e.g., neural networks, machine learning, deep learning) to be applied with higher effectiveness, providing definitive information on species, stress state, material type or other aspects relevant to marine health. One example where the additional information could make a significant difference is improving the accuracy of benthic classification AI algorithms.
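As a rough illustration of the difference in per-pixel information, the minimal sketch below (assuming NumPy and purely illustrative image and band dimensions, not the project's actual sensor specifications) contrasts an RGB frame with a hyperspectral datacube and extracts a single pixel's spectrum.

```python
import numpy as np

height, width = 512, 512

# Conventional camera: 3 spectral bands (red, green, blue) per pixel.
rgb_frame = np.zeros((height, width, 3), dtype=np.float32)

# Hyperspectral imager: hundreds of contiguous narrow bands per pixel
# (300 is an illustrative value within the 200-500 range quoted above).
n_bands = 300
hsi_cube = np.zeros((height, width, n_bands), dtype=np.float32)

# Each pixel of the cube is a full spectrum that AI methods can exploit.
pixel_spectrum = hsi_cube[256, 256, :]   # shape: (300,)
print(rgb_frame.shape, hsi_cube.shape, pixel_spectrum.shape)
```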

Hyperspectral imaging paves the way for novel research opportunities that have yet to be extensively explored in marine science. Its capabilities have been proven, and even commercialised, in other areas such as the military, mineralogy and agriculture; however, its application within marine science is not yet well established. This is because adapting hyperspectral imaging to marine science is not a trivial exercise and requires significant considerations (e.g., seawater attenuation, expensive underwater positioning technology, complex data collection processes, high corrosion risk, experimental controls). These barriers to entry hindered research traction until recently, when technological advancements made affordable and practical solutions possible.

Hyperspectral sensors are now much more compact and affordable than before. Instead of requiring a small aircraft to carry large sensors, the sensors are now compact enough to fit on aerial or underwater drones. Aerial and underwater drone technologies are also more capable than they were a few years ago, with enterprise-quality systems readily available off-the-shelf (e.g., DJI M200, DJI M600, BlueROV). Powerful AI algorithms and engines from commercial companies, such as Google AI, Amazon AI and OpenAI, are now open to public use. With all these resources available and coral reef health at an all-time low, now is the best time to devote research to hyperspectral imaging AI methods in marine science to save the Great Barrier Reef before it is too late.

Project Methods

The two main research questions to be addressed are:

1. Can a digital coral hyperspectral library be created for identifying and studying benthic invertebrate community structure?

2. What are the accuracy and reproducibility of the results in real-world environments?

To create a digital coral hyperspectral library with calibrated data under known environmental conditions, two key systems are needed: a waterproof hyperspectral imaging system and a controllable sea environment system. AIMS is the only facility in the world capable of collecting hyperspectral data in a finely controlled, state-of-the-art sea simulator facility (a.k.a. SeaSim). Data will be collected using existing AIMS infrastructure. A variety of corals will be placed into a SeaSim tank calibrated to a range of Great Barrier Reef conditions, and hyperspectral data will be collected above and under water. Active lighting will be used to generate higher illumination intensities for higher signal-to-noise-ratio data. Spectral and spatial calibration targets will be set up in the experiment tank to calibrate and validate the data in post-processing.
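As one way of picturing what a digital coral hyperspectral library entry could contain, the sketch below (a hypothetical structure using NumPy; the function, taxon and field names are illustrative, not the project's actual schema) reduces one calibrated SeaSim capture to a mean reflectance spectrum tagged with taxon and tank conditions.

```python
import numpy as np

def make_library_entry(reflectance_cube, coral_mask, taxon, conditions):
    """Average the reflectance spectra of all pixels belonging to one coral.

    reflectance_cube : (rows, cols, bands) calibrated reflectance data
    coral_mask       : (rows, cols) boolean mask of the coral's pixels
    taxon            : species / morphotype label
    conditions       : dict of tank metadata (temperature, depth, lighting, ...)
    """
    mean_spectrum = reflectance_cube[coral_mask].mean(axis=0)
    return {"taxon": taxon, "conditions": conditions, "spectrum": mean_spectrum}

# Illustrative usage with synthetic data.
cube = np.random.rand(100, 100, 300).astype(np.float32)
mask = np.zeros((100, 100), dtype=bool)
mask[40:60, 40:60] = True
entry = make_library_entry(cube, mask, "Acropora millepora",
                           {"temp_C": 27.0, "depth_m": 3.0, "lighting": "active"})
print(entry["taxon"], entry["spectrum"].shape)
```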

Hyperspectral data is inherently large in volume, requiring significant storage capacity, which can be provided by the AIMS server or cloud services. The data must be processed before analysis. First, the collected raw hyperspectral data will be converted into radiance. Second, the dark current noise will be removed. The data will then be converted to reflectance using the spectral calibration targets and, finally, spatially corrected using the spatial calibration targets. Metadata describing the environmental conditions will be tagged to the corresponding hyperspectral datasets.
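A minimal sketch of that processing chain is given below. The linear gain/offset model and the white-panel handling are hypothetical placeholders for the sensor-specific calibration, and the spatial correction step is only indicated in a comment.

```python
import numpy as np

def raw_to_reflectance(raw_cube, dark_frame, gain, offset, white_panel_mask,
                       panel_reflectance=0.99):
    # 1. Convert raw digital numbers to at-sensor radiance (linear model;
    #    real sensors ship per-band calibration coefficients).
    radiance = raw_cube * gain + offset

    # 2. Remove dark-current noise using a dark frame captured with the
    #    sensor covered, converted with the same model.
    radiance = radiance - (dark_frame * gain + offset)

    # 3. Convert to reflectance using the in-scene spectral calibration target:
    #    divide each pixel's radiance by the white panel's mean radiance per band.
    panel_radiance = radiance[white_panel_mask].mean(axis=0)
    reflectance = panel_reflectance * radiance / panel_radiance

    # 4. Spatial correction against the spatial calibration targets would
    #    follow here (omitted in this sketch).
    return np.clip(reflectance, 0.0, 1.0)

# Illustrative usage with synthetic numbers.
raw = np.random.randint(0, 4096, size=(50, 50, 300)).astype(np.float32)
dark = np.random.randint(0, 50, size=(50, 50, 300)).astype(np.float32)
panel_mask = np.zeros((50, 50), dtype=bool)
panel_mask[:5, :5] = True
reflectance_cube = raw_to_reflectance(raw, dark, gain=0.01, offset=0.0,
                                      white_panel_mask=panel_mask)

# Environmental metadata (e.g. {"temp_C": 27.0, "salinity_psu": 35}) would be
# stored alongside each processed cube.
```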

To identify the appropriate AI methods and parameters for classifying benthic invertebrate community structure, publicly available AI tools (such as Amazon AI, Google AI, OpenAI, etc.) will be evaluated first. If the results fall below a 90% confidence threshold, a new custom AI for benthic classification will be developed and evaluated.
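The sketch below illustrates that evaluation logic with a generic scikit-learn classifier standing in for the commercial tools, synthetic spectra and labels, and overall accuracy used as a stand-in metric; all names and numbers are illustrative.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in data: one reflectance spectrum (300 bands) per labelled
# pixel, with labels drawn from hypothetical benthic classes.
rng = np.random.default_rng(0)
spectra = rng.random((2000, 300))
labels = rng.integers(0, 4, size=2000)   # e.g. coral / algae / sand / rubble

X_train, X_test, y_train, y_test = train_test_split(
    spectra, labels, test_size=0.3, random_state=0)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_train, y_train)
accuracy = accuracy_score(y_test, clf.predict(X_test))

if accuracy < 0.90:
    print(f"Accuracy {accuracy:.2%} below the 90% target: develop a custom AI.")
else:
    print(f"Accuracy {accuracy:.2%} meets the 90% target.")
```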

Field data collection for benthic classification will further validate the effectiveness of the AI methods in real-world environments. Hyperspectral imagery will be collected at AIMS Long Term Monitoring Program (LTMP) sites along the Great Barrier Reef, where the benthic community is known and actively studied. AIMS hyperspectral imaging underwater and aerial drones will be used to collect imagery at different altitudes above the LTMP sites, with the platforms deployed from the AIMS RV Cape Ferguson. Spectral and spatial calibration targets will be set up in the field to calibrate and validate the data in post-processing. The field data will be processed following the same steps as the SeaSim experiments. The developed AI methods will then be run on the field data and the results investigated for effectiveness.
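One simple way the field results could be checked, sketched below with hypothetical class names and cover values, is to compare per-class benthic cover derived from the classified imagery against the cover recorded for the same LTMP site.

```python
import numpy as np

classes = ["hard coral", "soft coral", "algae", "sand"]

# Per-pixel class predictions over a field transect (synthetic stand-in).
rng = np.random.default_rng(1)
predicted_map = rng.integers(0, len(classes), size=(200, 200))

# Cover fraction per class estimated from the classified imagery.
predicted_cover = {c: float((predicted_map == i).mean())
                   for i, c in enumerate(classes)}

# Known cover at the LTMP site (hypothetical values for illustration only).
ltmp_cover = {"hard coral": 0.32, "soft coral": 0.08, "algae": 0.35, "sand": 0.25}

for c in classes:
    diff = predicted_cover[c] - ltmp_cover[c]
    print(f"{c:>10s}: predicted {predicted_cover[c]:.2f}, "
          f"LTMP {ltmp_cover[c]:.2f}, difference {diff:+.2f}")
```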

Results from the research investigations, from the SeaSim to the field experiments, will be published in high-impact peer-reviewed journals and conferences. The PhD dissertation will be formatted as a thesis by publication.

Project Results

The outcome of this proposed project will provide actionable information on broad-scale community diversity, species, stress state, material type or other aspects relevant to marine health across the Great Barrier Reef.

Keywords

Biodiscovery,
Coral reefs,
Corals,
Field based,
Human use,
Mapping,
Modelling,
Monitoring,
Natural disturbance,
Oceanography,
Pollution,
Remote Sensing

Supervised By:

Jon Kok (AIMS)

Mia Hoogenboom (JCU)

Manuel Gonzalez Rivero (AIMS)
