Have you ever thought of using sound to navigate through the landscape? A team of scientists has converted sound into a spectrum of coded colour bands to decipher hidden clues about the environment. Their work is making waves in ecology circles, identifying species so cryptic that trained specialists can’t spot them in the field.

False Colour Spectrogram. Image courtesy of QUT Ecoacoustics.

In the paper “Long-duration, false-colour spectrograms for detecting species in large audio data-sets” (Journal of Ecoacoustics), led by Dr Michael Towsey at the Queensland University of Technology, long-duration sound recordings are represented visually as long-duration, false-colour (LDFC) spectrograms. By applying a set of mathematical formulae, the sound waves are summarised into numerical values called spectral indices. Several spectral indices (each symbolised by a three-letter code) are calculated, and each represents a different concentration of acoustic energy recorded in the study area.
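To give a feel for what a spectral index is, here is a minimal sketch of two common ones, temporal entropy (H(t)) and spectral cover (CVR). These are simplified illustrations of the general idea, not the exact formulae from the paper:

```python
import math

def temporal_entropy(samples):
    """H(t): entropy of the normalised energy envelope of a signal.
    Near 1.0 for a steady, even sound; near 0.0 when the energy is
    concentrated in a few bursts (e.g. a single call)."""
    energy = [s * s for s in samples]
    total = sum(energy)
    probs = [e / total for e in energy if e > 0]
    h = -sum(p * math.log2(p) for p in probs)
    return h / math.log2(len(samples))  # normalise to the range [0, 1]

def spectral_cover(spectrogram_db, threshold_db=-60.0):
    """CVR: for each frequency bin (one row of a decibel spectrogram),
    the fraction of time frames whose energy exceeds a noise threshold."""
    return [sum(1 for db in row if db > threshold_db) / len(row)
            for row in spectrogram_db]
```

A steady tone gives a temporal entropy near 1, while a single sharp click gives a value near 0, which is why such indices can separate continuous insect drones from intermittent bird calls.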

Long-duration spectrograms prepared from three different acoustic indices, each representing four hours of recording. The H(t) index refers to temporal entropy; CVR is short for cover. Each index reveals different components or events in the acoustic sound-space. Image courtesy of QUT Ecoacoustics.

Depending on the aims of the research, the spectrogram produced reflects different combinations of these spectral (acoustic) indices, which are assigned to the red, green and blue (RGB) colour channels – a process inspired by the false-colour techniques used to produce satellite images of the Earth’s surface from space. “The eyes have got the capacity to absorb huge amounts of information very quickly, so it can scan an image much faster than the ear can scan a recording,” says Towsey.
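The channel assignment itself is straightforward to sketch. Assuming each index has already been normalised to the range 0–1 per minute and frequency bin (the function names and grid layout below are illustrative, not taken from the paper), the mapping looks like this:

```python
def false_colour_pixel(r_index, g_index, b_index):
    """Map three acoustic-index values (each normalised to 0..1) onto
    the red, green and blue channels of one pixel, the same trick used
    in false-colour satellite imagery."""
    clamp = lambda v: max(0.0, min(1.0, v))
    return tuple(int(round(clamp(v) * 255)) for v in (r_index, g_index, b_index))

def false_colour_image(red_grid, green_grid, blue_grid):
    """Combine three per-minute, per-frequency index grids
    (indexed [minute][frequency_bin]) into one RGB image."""
    return [[false_colour_pixel(r, g, b)
             for r, g, b in zip(red_row, green_row, blue_row)]
            for red_row, green_row, blue_row in zip(red_grid, green_grid, blue_grid)]
```

Swapping which index feeds which channel is what produces the different “views” of the same soundscape shown in the figures below.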

Image representing the same four-hour recording (16:00 to 20:00) as above. Red, green and blue colours are assigned to the three different spectrograms to produce the long-duration, false-colour spectrogram (right-hand side). CITATION: Towsey, M. et al., (2014). Visualisation of long duration acoustic recordings of the environment. Proceedings of the International Conference on Computational Science (ICCS 2014), Cairns, Australia, 9–12 June 2014. Image courtesy of QUT Ecoacoustics.

The final spectrogram is a colourful account of the soundscape. The calls of wild organisms such as frogs, insects and birds contrast distinctly with the background environmental sound and are referred to as soundmarks or acoustic signatures. The research team uses them like landmarks to ‘navigate’ through the study environment and answer specific ecological questions.

Different combinations of indices give different views of the soundscape. Here are two LDFC spectrograms of the same recording, using different combinations of indices. Image courtesy of QUT Ecoacoustics.
Ecologists can identify the calls of different wildlife species in the spectrogram according to the filter applied. Image courtesy of QUT Ecoacoustics.

The LDFC technique was vital in helping the researchers find clues to the whereabouts of the Lewin’s Rail on Tasman Island, Tasmania – a shy bird normally hidden from ‘view’ in its wetland habitat and usually identifiable only by its vocalisations. The spectrogram reduced the need for manual analysis of hundreds of hours of sound and enabled quick identification of the species. It also saved the research team the cost of hiring extra crew to monitor the site visually on the ground.

Ecologist Elizabeth Znidersic in the field, collecting data from a passive audio recorder. Image courtesy of Elizabeth Znidersic.

Elizabeth Znidersic, an ecologist at Charles Sturt University, uses the less invasive method of passive sound recording to study wildlife in Tasmania and recognises the value of the LDFC technique. Armed with a spectrogram, Znidersic can not only capture cryptic species but can also infer the presence of bird species that make no noise at all, because they share a relationship with a wildlife species recorded nearby. “Not all species will be primarily detected by their vocalisations, some will be silent, so we can look outside the box and see if there is a surrogate species for that species that doesn’t vocalise, so we can have that relationship and we can start to look for that species on a visual level,” says Znidersic.

The “grunt” and “wheeze” vocalisations of the Lewin’s Rail can be identified in the eight seconds of grey-scale spectrogram (Figure B) and as green vertical lines in the range 100–3,500 Hz (in the white rectangles) in the six hour sample represented by the LDFC spectrogram (Figure A). The bird chorus at dawn is represented by the green and pink hues that commence at 05:00 in the 1,500 – 5,000 Hz frequency range (Figure A). CITATION: Towsey M., Znidersic E., Broken-Brow J., Indraswari K., Watson D., Phillips Y., et al. (2018). Long-duration, false-colour spectrograms for detecting species in large audio data-sets. Journal of Ecoacoustics. 2: #IUSWUI, https://doi.org/10.22261/JEA.IUSWUI

The soundscapes being produced by the team at QUT Ecoacoustics with the LDFC technique are starting to blur the line between ecoacoustics and bioacoustics – research areas normally considered two distinct disciplines. Ecoacoustics studies the total sound generated by an environment, while bioacoustics records and monitors the calls of specific wildlife species. “The more experience we get with interpreting images of soundscapes, the more we’re seeing they reflect what bioacousticians have already published,” says Towsey.

Image shows two LDFC spectrograms of a 24-hour recording taken with a hydrophone in a pond of the Einasleigh River, northern Queensland, in the dry season. It highlights the change in sound during the day compared to the night. All the acoustic activity in this recording is due to aquatic insects. Recording courtesy of Simon Link and Toby Gifford, Griffith University, Brisbane.

The sound recorded at a location can be separated into three categories: geophony (surf, wind and rain), biophony (wildlife calls) and anthropophony (man-made noise).

Soundscape ecologists broadly categorise sound sources into three classes, which they label biophony, geophony and anthropophony; sometimes a fourth, technophony, is added. A spectrogram like this can direct an ecologist to those parts of the recording in which birds are singing, saving a lot of time. Image courtesy of QUT Ecoacoustics.

Towsey uses insects chorusing at the start and end of the day, and birdcalls in the morning, as soundmarks to determine the acoustic structure of sites – especially useful for observing slight differences between ecosystems located close together.

Two consecutive days of recording were made at six sites for another research study, giving 12 days of recording in total. The contents of the 27 clusters were identified by selecting the false-colour spectrum of each minute in each cluster (top image). Cluster Y contained very quiet night-time recording segments, while cluster V included the morning chorus and other segments with much bird activity. The use of acoustic indices enables the calculation of acoustic signatures that characterise the soundscapes at different locations. CITATION: Sankupellay, M., Towsey, M., Truskinger, A., & Roe, P. (2015). Visual Fingerprints of the Acoustic Environment: The Use of Acoustic Indices to Characterise Natural Habitats, IEEE International Symposium on Big Data Visual Analytics, Tasmania, Australia, 22 – 25 Sep 2015. Image courtesy of QUT Ecoacoustics.

Once a wildlife call is identified, Towsey can use the combination of spectral indices to construct an automated recogniser, apply it to the data by computer and locate the acoustic signature or soundmark of that wildlife species at a much faster rate. “We are using machine learning technology or artificial intelligence to recognise all the different categories of sound and we can break the day up into that,” says Towsey. The team can even pinpoint the geographic location of a study, just by looking at an LDFC spectrogram. “I actually can look at a spectrogram and have a bit of an idea where that spectrogram was taken from and that can be two locations in America or multiple in Tasmania. I look for certain species, I look for frog chorus, I look for insects and for the intensity of dawn chorus and evening chorus, and what kind of night time activity there is,” says Znidersic.
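The article doesn’t detail the team’s recogniser, but the underlying idea – assigning each minute’s vector of index values to the closest known acoustic signature – can be sketched as a minimal nearest-centroid classifier. The labels and centroid values here are invented for illustration:

```python
import math

def euclidean(a, b):
    """Straight-line distance between two index vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def classify_minute(index_vector, signatures):
    """Label one minute of recording with the acoustic signature
    (a labelled centroid of index values) that it lies closest to."""
    return min(signatures, key=lambda label: euclidean(index_vector, signatures[label]))

# Illustrative signatures: per-minute [entropy, cover, activity] centroids.
signatures = {
    "quiet night": [0.10, 0.05, 0.10],
    "morning chorus": [0.85, 0.80, 0.75],
    "insect chorus": [0.95, 0.40, 0.90],
}
```

Real systems would learn these centroids (or a full classifier) from labelled training minutes, but even this simple scheme shows how a day of recording can be broken up into categories of sound.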

This image compares three 24-hour, false-colour spectrograms of three soundscapes from different latitudes. All these recordings were obtained in the first week of July (winter) 2015. The top recording in Papua New Guinea is dominated by insects (Eddie Game, The Nature Conservancy). The middle recording in Brisbane is dominated by birds (Yvonne Phillips, QUT Ecoacoustics Research Group) and the bottom desert recording is dominated by wind (David Watson, Charles Sturt University).

Towsey says the applications for the LDFC technique are limitless; it has already been used to visually monitor the progress of environmental restoration projects and to provide corroborating evidence for the conservation of natural environments. “People think about this field as being relatively new but I like to think it is beginning to mature. The ecological applications are only just being scratched,” says Towsey.

This LDFC spectrogram was obtained from the Adelbert Ranges, Papua New Guinea, by The Nature Conservancy (TNC), a global conservation organisation attempting to preserve some of the natural forests of PNG. The local terrain for this recording is mountainous jungle. The entire sound-space is filled with acoustic activity, most of it due to insects, while birds are, for the most part, restricted to the lower frequency band. Image courtesy of QUT Ecoacoustics.

Dr Anthony Truskinger is the research software engineer responsible for building the computer infrastructure vital to the research team’s work at QUT, and he compares their library of sounds with an astronomical observatory’s. “We actually use a service provided by a collaboration of universities to store research data. We store 90 terabytes of data. That’s only possible because there’s a national infrastructure for technological investment and prices keep dropping in storage,” says Truskinger.

Four LDFC spectrograms, each 3 hours duration. White rectangles identify frog choruses and calls of interest. The vertical Hertz scale is the same for all spectrograms. (a) Intermittent chorusing of the ornate burrowing frog. (b) Chorusing of the Northern dwarf tree frog. (c) Chorusing of the flood plain toadlet. (d) The evening soundscape. CITATION: Towsey M., Znidersic E., Broken-Brow J., Indraswari K., Watson D., Phillips Y., et al. (2018). Long-duration, false-colour spectrograms for detecting species in large audio data-sets. Journal of Ecoacoustics. 2: #IUSWUI, https://doi.org/10.22261/JEA.IUSWUI

In the past, the team applied the LDFC technique to process other scientists’ recordings, but they have recently released the Ecoacoustics Analysis Programs software package on GitHub as open source, so researchers can run their own analyses. “Open source sciences is what the future is,” explains Truskinger. In the long term, the team will investigate how climate change influences subtle temporal changes in soundscapes across land and water – for example, in biodiversity, ecosystem health and the behaviour of migratory wildlife populations.

Written by Gabrielle Ahern

Thank you to Dr Michael Towsey, Dr Anthony Truskinger and Elizabeth Znidersic for permission to use their images. Follow the link to QUT Ecoacoustics environmental sound recordings available via Ecosounds.

My interview with the research team will feature in an episode of the NOISEMAKERS podcast series, so stay tuned.
