Photogrammetry Insights

By Leila Lemos, PhD Candidate, Fisheries and Wildlife Department, Oregon State University

After three years of fieldwork and analyzing a large dataset, it is finally time to start compiling the results, creating plots, and seeing what the trends are. The first dataset I am analyzing is the photogrammetry data (more on our photogrammetry method here), which so far has been full of unexpected results.

Our first big expectation was to find a noticeable intra-year variation. Gray whales spend their winter in the warm waters of Baja California, Mexico, a period during which they fast. In the spring, they undertake a long migration to higher latitudes. Only when they reach their summer feeding grounds, which extend from Northern California to the Bering and Chukchi seas off Alaska, do they start feeding and gaining enough calories to support their migration back to Mexico and the subsequent fasting period.

 

Northeastern gray whale migration route along the NE Pacific Ocean.
Source: https://journeynorth.org/tm/gwhale/annual/map.html

 

Thus, we expected to see whales arriving along the Oregon coast with a skinny body condition that would gradually improve over the months of the feeding season. Some exceptions are reasonable, such as a lactating mother or a debilitated individual. However, datasets are often more complex than we expect, and many variables can influence the results. Our photogrammetry dataset is no different!

In addition, I need to decide which plots best display the results, and how to make them. For years now I’ve been hearing about the wonders of R, but I’ve been skeptical about learning a whole new programming language “just to make plots”, as I first thought. I have always used statistical programs such as SPSS or Prism for my plots, and they were easy to work with. However, there is a lot more we can do in R than “just plots”. Also, just because something seems hard doesn’t mean you shouldn’t try. We need to push ourselves sometimes. So, I decided to give it a try (and I am proud of myself that I did), and here are some of the results:

 

Plot 1: Body Area Index (BAI) vs Day of the Year (DOY)

 

In this plot, we wanted to assess the annual Body Area Index (BAI) trends that describe how skinny (lower number) or fat (higher number) a whale is. BAI is a simplified version of the BMI (Body Mass Index) used for humans. If you are interested in this method, which we developed at our lab in collaboration with the Aerial Information Systems Laboratory/OSU, you can read more about it in our publication.

The plots above are three versions of the same data displayed in different ways. The first plot on the left shows all the data points by year, with polynomial best fit lines, and the confidence intervals (in gray). There are many overlapping observation points, so for the middle plot I tried to “clean up the plot” by reducing the size of the points and taking out the gray confidence interval range around the lines. In the last plot on the right, I used a linear regression best fit line, instead of polynomial.
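For readers curious what the underlying fitting step looks like, here is a minimal sketch in Python. (The blog’s actual plots were made in R; the data below are synthetic stand-ins for real BAI measurements, and all numbers are made up for illustration.)

```python
import numpy as np

# Synthetic BAI-vs-DOY points standing in for one year's measurements.
rng = np.random.default_rng(0)
doy = np.sort(rng.uniform(150, 300, 40))                       # day of year
bai = 25 + 0.02 * (doy - 150) + rng.normal(0, 0.4, doy.size)   # slowly fattening whales

# Polynomial (degree 2) best-fit curve, like the left and middle panels
poly_coef = np.polyfit(doy, bai, deg=2)

# Linear regression best-fit line, like the right panel; the slope is the
# rate of weight gain (BAI units per day) compared across years in the text
slope, intercept = np.polyfit(doy, bai, deg=1)

print(f"linear slope: {slope:.4f} BAI units per day")
```

Comparing the fitted `slope` across years is the quantitative version of the observation below that all three years show a similar rate of weight gain.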

We can see a general trend that the BAI was considerably higher in 2016 (red line), when compared to the following years, which makes us question the accuracy of the dataset for that year. In 2016, we also didn’t sample in the month of July, which is causing the 2016 polynomial line to show a sharp decrease in this month (DOY: ~200-230). But it is also interesting to note that the increasing slope of the linear regression line in all three years is very similar, indicating that the whales gained weight at about the same rate in all years.

 

Plot 2: Body Area Index (BAI) vs Body Condition Score (BCS)

 

In addition to the photogrammetry method of assessing whale body condition, we have also applied a body condition scoring method to all the photos we took in the field (based on the method described by Bradford et al. 2012). Thus, with this second set of plots, we wanted to compare both methods of assessing whale body condition, in order to evaluate when the methods agree and which method works best in which situation. Our hypothesis was that whales with a ‘fair’ body condition would have a lower BAI than whales with a ‘good’ body condition.

The plots above illustrate two versions of the same data, with the data in the left plot grouped by year and the data in the right plot grouped by month. In general, we see that no whales were observed with a poor body condition in the last months analyzed (August to October), with both methods agreeing on this fact. Additionally, many whales still had a fair body condition in August and September, but fewer in October, indicating that most whales gained weight over the foraging season and were ready to start their southbound migration and another fasting period. This result provides important information for monitoring and conservation.

However, the 2016 dataset is still a concern, since the whales appear to have considerably higher body condition (BAI) when compared to other years.

 

Plot 3: Temporal Body Area Index (BAI) for individual whales

 

In this last group of plots, we wanted to visualize BAI trends over the season (using day of year, DOY, on the x-axis) for individuals we measured more than once. Here we can see the temporal patterns for the whales “Bit”, “Clouds”, “Pearl”, “Scarback”, “Pointy”, and “White Hole”.

We expected to see an overall gradual increase in body condition (BAI) over the season, such as what we observe for Pointy in 2018. However, some whales decreased in condition, such as Bit in 2018. Could this trend be accurate? Furthermore, what about BAI measurements that deviate from the trend, such as Scarback in 2017, where the last observation point shows a lower BAI than the previous ones? In addition, we still observe a high BAI in 2016 at this individual level, when compared to the other years.

My next step will be to check the whole dataset again and search for inconsistencies. Something is causing these 2016 values to be anomalous, and I need to find out what it is. The measured photogrammetry images were generally of good quality and in focus, but other variables could be influencing the quality and accuracy of the measurements.

For instance, when measuring images, I often struggled with glare, water splash, water turbidity, ocean swell, and shadows, as you can see in the photos below. All of these variables can make the borders of the whale body unclear, which may have led to erroneous measurements.

 

Examples of bad conditions for performing photogrammetry: (1) glare and water splash, (2) water turbidity, (3) ocean swell, and (4) a shadow on one side of the whale’s body.
Source: GEMM Lab. Taken under NMFS permit 16111 issued to John Calambokidis.

 

Thus, I will need to check all of these variables to identify the causes of bad measurements and “clean the dataset”. Only after this process will I be able to make these plots again and look at the trends (which will be easy since I already have my R code written!). Then I’ll move on to my next hypothesis: that the BAI of individual whales varies with demographics, including sex, age and reproductive state.

To carry out robust science that produces results we can trust, we can’t simply collect data, perform a basic analysis, create plots and believe everything we see. Data is often messy, especially when developing new methods, as we have done here with drone-based photogrammetry and the BAI. So, I need to spend some serious time checking my data for accuracy and examining confounding variables that might affect the dataset. Science can be challenging, whether interpreting data or learning a new coding language, but it is all worth it in the end when we produce results we know we can trust.

 

 

 

Over the Ocean and Under the Bridges: STEM Cruise on the R/V Oceanus

By Alexa Kownacki, Ph.D. Student, OSU Department of Fisheries and Wildlife, Geospatial Ecology of Marine Megafauna Lab

From September 22nd through 30th, the GEMM Lab participated in a STEM research cruise aboard the R/V Oceanus, Oregon State University’s (OSU) largest research vessel, which served as a fully functioning, floating research laboratory and field station. The STEM cruise focused on integrating science, technology, engineering and mathematics (STEM) into hands-on teaching experiences alongside professionals in the marine sciences. The official science crew consisted of high school teachers and students, community college students, and Oregon State University graduate students and professors. As on a typical research cruise, there was ample set-up, data collection, data entry, experimentation, successes, and failures. And because everyone in the science party actively participated in the research process, everyone also experienced these successes, failures, and moments of inspiration.

The science party enjoying the sunset from the aft deck with the Astoria-Megler bridge in the background. (Image source: Alexa Kownacki)

Dr. Leigh Torres, Dr. Rachael Orben, and I were primarily stationed on the flybridge—one deck above the bridge—fully exposed to the elements, at the highest possible location on the ship for the best viewing. We scanned the seas in hopes of spotting a blow, a splash, or any other sign of a marine mammal or seabird. Beside us, students and teachers donned binoculars and positioned themselves around the mast, with Leigh and me each taking a 90-degree swath from the mast—either to starboard or to port. For those who had not been part of marine mammal observations previously, it was a crash course in the peaks and troughs—of both the waves and the sightings. We emphasized the importance of absence data: knowledge of what is not “there” is equally as important as what is. Fortunately, Leigh chose a course that proved to have surprisingly excellent environmental conditions and amazing sightings. Therefore, we collected a large amount of presence data: data collected when marine mammals or seabirds are present.

High school student, Chris Quashnick Holloway, records a seabird sighting for observer, Dr. Rachael Orben. (Image source: Alexa Kownacki).

When someone sighted a whale that surfaced regularly, we assessed the conditions: the sea state, the animal’s behavior, the wind conditions, etc. If we deemed conditions “good to fly”, our licensed drone pilot and Orange Coast Community College student, Jason, prepared his Phantom 4 drone. While he and Leigh set up drone operations, the other science team members and I maintained a visual on the whale and stayed in constant communication with the bridge via radio. When the drone was ready and the bridge gave the “all clear”, Jason launched his drone from the aft deck. Then, someone tossed an unassuming, meter-long wood plank overboard—keeping it attached to the ship with a line. This wood board serves as a calibration tool: the drone flies over it at varying heights as determined by its built-in altimeter. Later, we analyze how many pixels one meter occupied at different heights, and can thereby determine the body length of the whale from still images by converting pixel length to a metric unit.
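The pixel-to-meter conversion at the heart of this calibration can be sketched in a few lines. This is an illustrative simplification (the function names and numbers below are hypothetical; the lab’s actual workflow handles altitude and camera parameters more carefully):

```python
def pixels_per_meter(board_pixels, board_length_m=1.0):
    """Pixels spanning the calibration board in a frame taken at some altitude."""
    return board_pixels / board_length_m

def whale_length_m(whale_pixels, board_pixels):
    """Convert a whale's pixel length to meters, using a board image
    taken at the same drone altitude as the whale image."""
    return whale_pixels / pixels_per_meter(board_pixels)

# Hypothetical example: at a given altitude the 1 m board spans 50 px,
# so a whale spanning 600 px in a frame at that altitude is 12 m long.
print(whale_length_m(600, 50))  # -> 12.0
```

Because the board is always exactly one meter, repeating this measurement at several altitudes yields a lookup from altimeter reading to scale, which is then applied to the whale stills.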

High school student, Alishia Keller, uses binoculars to observe a whale, while PhD student, Alexa Kownacki, radios updates on the whale’s location to the bridge and the aft deck. (Image source: Tracy Crews)

Finally, when the drone is calibrated, I radio the most recent location of our animal. For example, “Blow at 9 o’clock, 250 meters away”. Then, the bridge and I constantly adjust the ship’s speed and location. If the whale “flukes” (dives and exposes the ventral side of its tail), and later resurfaces 500 meters away at our 10 o’clock, I might radio the bridge to “turn 60 degrees to port and increase speed to 5 knots”. (See the Hidden Math Lesson below.) Jason then positions the drone over the whale, adjusting the camera angle as necessary, and records high-quality video footage for later analysis. The aerial viewpoint provides major advantages. Whales usually expose only about 10 percent of their body above the water’s surface; with an aerial vantage point, we can see much more of the whale and its surroundings. From here, we can observe behaviors that are otherwise obscured (Torres et al. 2018), and record footage that helps quantify body condition (i.e., lengths and girths). Before the batteries run low, Jason returns the drone to the aft deck, the vessel comes to an idle, and Leigh catches the drone. Throughout these operations, those of us on the flybridge photograph flukes for identification and document any behaviors we observe. Later, we match the whale we sighted to the whale the drone flew over, and then to prior sightings of that same individual—adding information like body condition or the presence of a calf. I like to think of it as whale detective work. Moreover, it is a team effort; everyone has a critical role in the mission. When it’s all said and done, this noninvasive approach provides life history context for the health and behaviors of the animal.

Drone pilot, Jason Miranda, flying his drone using his handheld ground station on the aft deck. (Photo source: Tracy Crews)

Hidden Math Lesson: The location of 10 o’clock and 60 degrees to port refer to the exact same direction. The bow of the ship is our 12 o’clock, with the stern at our 6 o’clock; you always orient yourself in this manner when giving directions. The same goes for a compass measurement in degrees when relating a direction to the boat: the bow is 360/0 degrees. The angle between two consecutive numbers on a clock is 360 degrees divided by 12 “hour” markers = 30 degrees. Therefore, 10 o’clock is 0 degrees - (2 “hours”) = 0 degrees - (2 x 30 degrees) = -60 degrees. A negative angle with magnitude less than 180 degrees refers to the port side (left).
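The same conversion can be written as a tiny function, following the convention above (bow = 12 o’clock = 0 degrees; positive angles to starboard, negative to port):

```python
def clock_to_relative_bearing(clock_hour):
    """Relative bearing in degrees from a clock position on the ship.
    Bow = 12 o'clock = 0 degrees; positive = starboard, negative = port."""
    degrees = (clock_hour % 12) * 30   # each "hour" on the clock is 30 degrees
    if degrees > 180:                  # fold into the (-180, 180] range
        degrees -= 360
    return degrees

print(clock_to_relative_bearing(10))  # -> -60 (60 degrees to port)
print(clock_to_relative_bearing(3))   # -> 90 (90 degrees to starboard)
```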

Killer whale traveling northbound.

Our trip was chock-full of science and graced with cooperative weather conditions. There were more highlights than I could list in a single sitting. We towed zooplankton nets under the night sky while eating ice cream bars; we sang together at sunset and watched an atmospheric phenomenon, the green flash; we witnessed a humpback lunge-feeding beside the ship’s bow; and we saw a sperm whale traveling across calm seas.

Sperm whale surfacing before a long dive.

On this cruise, our lab focused on marine mammal observations—which proved excellent. In only four days of surveying, we had 43 marine mammal sightings comprising 362 individuals of 9 species (see Figure 1). As you can see from Figure 2, we traveled over shallow, coastal, and deep waters, in both Washington and Oregon, before heading inland to Portland, OR. Because we ventured to areas with different bathymetric and oceanographic conditions, we increased our likelihood of seeing a higher diversity of species than if we had stayed in a single depth or area.

Humpback whale lunge feeding off the bow.
Species                        Number of sightings    Total number of individuals
Humpback whale                 22                     40
Pacific white-sided dolphin    3                      249
Northern right whale dolphin   1                      9
Killer whale                   1                      3
Dall’s porpoise                5                      49
Sperm whale                    1                      1
Gray whale                     1                      1
Harbor seal                    1                      1
California sea lion            8                      9
Total                          43                     362

Figure 1. Summary table of all species sightings during cruise while the science team observed from the flybridge.

Pacific white-sided dolphins swimming towards the vessel.

Figure 2. Map with inset displaying study area and sightings observed by species during the cruise, made in ArcMap. (Image source: Alexa Kownacki).

Even after two days of STEM outreach events in Portland, we were excited to incorporate more science. For the transit from Portland, OR to Newport, OR, the entire science team consisted of two people: me and Jason. But even with poor weather conditions, we still used science to answer questions and help us along our journey—only with different goals than on our main leg. With the help of the marine technician, we set up a camera on the bow of the ship, facing aft, to watch the vessel maneuver through the famous Portland bridges.

Video 1. Time-lapse footage of the R/V Oceanus maneuvering the Portland Bridges from a GoPro. Compiled by Alexa Kownacki, assisted by Jason Miranda and Kristin Beem.

Prior to crossing the Columbia River bar and re-entering the Pacific Ocean, the R/V Oceanus maneuvered up the picturesque Columbia River. We used our geospatial skills to locate our fellow science team member and high school student, Chris, who was on land. We tracked each other using the GPS technology in our cell phones until the ship got close enough to use natural landmarks as reference points, and finally we could use our binoculars to see Chris shining a light from shore. As the ship powered forward and passed under the famous Astoria-Megler bridge that connects Oregon to Washington, Chris drove over it; he directed us “100 degrees to port”. And, thanks to clear directions, bright visual aids, and spatiotemporal analysis, we managed to find our team member waving from shore. This is only one of many examples of how, in a few days at sea, students picked up new skills, such as marine mammal observation techniques, and honed them for additional applications.

On the bow, Alexa and Jason use binoculars to find Chris—over 4 miles away—on the Washington side of the Columbia River. (Image source: Kristin Beem)

Great science is the result of teamwork, passion, and ingenuity. Working alongside students, teachers, and other, more experienced scientists provided everyone with opportunities to learn from each other. We created great science because we asked questions, we passed on our knowledge to the next person, and we did so with enthusiasm.

High school students, Jason and Chris, alongside Dr. Leigh Torres, all try to get a glimpse at the zooplankton under Dr. Kim Bernard’s microscope. (Image source: Tracy Crews).

Check out other blog posts written by the science team about the trip here.

Big Data: Big possibilities with bigger challenges

By Alexa Kownacki, Ph.D. Student, OSU Department of Fisheries and Wildlife, Geospatial Ecology of Marine Megafauna Lab

Did you know that Excel has a maximum number of rows? I do. During winter term, for my GIS project, I was using Excel to merge oceanographic data from a publicly available data source website, and Excel continuously quit. Naturally, I assumed I had caused some sort of computer error. [As an aside, I’ve concluded that most problems related to technology are human error-based.] Therefore, I tried reformatting the data, restarting my computer, the program, etc. Nothing. Then, thanks to the magic of Google, I discovered that Excel allows no more than 1,048,576 rows by 16,384 columns. ONLY 1.05 million rows?! The oceanography data was more than 3 million rows—and that’s after I had eliminated data points. This is what happens when we’re dealing with big data.
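Tools that stream a file row by row sidestep Excel’s limit entirely, because only one row needs to be in memory at a time. Here is a minimal sketch using Python’s standard csv module (the file contents below are a made-up stand-in for the oceanography data):

```python
import csv
import io

EXCEL_MAX_ROWS = 1_048_576  # Excel's hard row limit per worksheet

def count_rows(csv_file):
    """Stream a CSV one row at a time -- no row limit, minimal memory."""
    reader = csv.reader(csv_file)
    return sum(1 for _ in reader)

# A tiny stand-in for a multi-million-row oceanography file:
sample = io.StringIO("time,lat,lon,sst\n" + "2018-01-01,44.6,-124.1,10.2\n" * 5)
n = count_rows(sample)
print(f"{n} rows; exceeds Excel? {n - 1 > EXCEL_MAX_ROWS}")
```

The same streaming pattern works for filtering or aggregating millions of rows that a spreadsheet cannot even open.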

According to the Merriam-Webster dictionary, big data is an accumulation of data that is too large and complex for processing by traditional database management tools (www.merriam-webster.com). However, there are articles, like this one from Forbes, that discuss the ongoing debate over how to define “big data”. According to the article, there are 12 major definitions; so, I’ll let you decide what you qualify as “big data”. Either way, I think that when Excel reaches its maximum row capacity, I’m working with big data.

Collecting oceanography data aboard the R/V Shimada. Photo source: Alexa K.

Here’s the thing: the oceanography data that I referred to was just a snippet of my data. Technically, it’s not even MY data; it’s data I accessed from NOAA’s ERDDAP website that had been consistently observed over the time frame of my dolphin data points. You may recall my blog about maps and geospatial analysis that highlights some of the reasons these variables, such as temperature and salinity, are important. However, what I didn’t previously mention was that I spent weeks editing this NOAA data. My project on common bottlenose dolphins overlays environmental variables to better understand dolphin population health off California. These variables should have similar spatiotemporal attributes to the dolphin data I’m working with, which has a time series beginning in the 1980s. Without taking out a calculator, I still know that equates to a lot of data. Great data: data that will let me answer interesting, pertinent questions. But big data nonetheless.

This is a screenshot of what the oceanography data looked like when I downloaded it to Excel. This format repeats for nearly 3 million rows.

Excel Screen Shot. Image source: Alexa K.

I showed this Excel spreadsheet to my GIS professor, and his response was something akin to “holy smokes”, with a few more expletives and a look of horror. It was not the sheer number of rows that shocked him; it was the data format. Nowadays, nearly everyone works with big data. It’s par for the course. However, the way data are formatted is the major split between what I’ll call “easy” data and “hard” data. The oceanography data could have been “easy” data: it could have had its many variables listed in columns. Instead, the data alternated between rows with variable headings and columns with variable headings, for millions of cells. And, as described earlier, this is only one example of big data and its challenges.
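To make the problem concrete, here is a toy sketch of untangling such an alternating layout into one tidy record per observation. The layout below is a hypothetical stand-in (the real ERDDAP export was far larger and messier), but the reshaping idea is the same:

```python
import csv
import io

# Hypothetical "hard" layout: the file alternates a row of variable
# headings with a row of their values, over and over.
raw = """\
time,lat,lon
2018-01-01,44.6,-124.1
time,lat,lon
2018-01-02,44.7,-124.2
"""

def tidy(alternating_csv):
    """Collapse heading/value row pairs into one dict per observation."""
    rows = list(csv.reader(io.StringIO(alternating_csv)))
    records = []
    for header, values in zip(rows[0::2], rows[1::2]):  # pair row i with row i+1
        records.append(dict(zip(header, values)))
    return records

for rec in tidy(raw):
    print(rec)
```

Once every observation is a flat record, the data drops straight into any “easy” columnar workflow.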

Data does not always come in a form with text and numbers; sometimes it appears as media such as photographs, videos, and audio files. Big data just got a whole lot bigger. While I was working as a scientist at NOAA’s Southwest Fisheries Science Center, one project brought in over 80 terabytes of raw data per year. The project centered on the Eastern North Pacific gray whale population and, more specifically, its migration. Scientists have observed the gray whale migration annually since 1994 from Piedras Blancas Light Station for the northbound migration, and 2 out of every 5 years from Granite Canyon Field Station (GCFS) for the southbound migration. One of my roles was to ground-truth software that would help transition from humans as observers to computers as observers. One avenue we assessed was how well a computer “counted” whales compared to people. For this question, three infrared cameras at the GCFS recorded during the same time span that human observers were counting the migratory whales. Next, scientists such as myself would transfer those video files, upwards of 80 TB, from the hard drives to Synology boxes and on to a different facility—miles away. Synology boxes store arrays of hard drives and can be accessed remotely. To review: three locations, each holding the same 80 TB of raw data. Once the data were saved in triplicate, I could run a computer program to detect whales. In summary, three months of recorded infrared video files requires upwards of 240 TB before processing. This is big data.

Scientists on an observation shift at Granite Canyon Field Station in Northern California. Photo source: Alexa K.
Alexa and another NOAA scientist watching for gray whales at Piedras Blancas Light Station. Photo source: Alexa K.

In the GEMM Laboratory, we have so many sources of data that I did not bother trying to count them. I’m entering my second year of the Ph.D. program and I already have a hard drive of data that I’ve backed up in three different locations. It’s no longer a matter of “if” you work with big data; it’s “how”. How will you format the data? How will you store the data? How will you maintain back-ups of the data? How will you share the data with collaborators, funders, and the public?

The wonderful aspect of big data is in the name: big, and data. The scientific community can answer more in-depth, challenging questions because of access to more data. Data is often the limiting factor in what researchers can do, because a larger sample size allows more questions to be asked and gives greater confidence in the results. That, and funding of course. It’s the reason why, when you see GEMM Lab members in the field, we’re not only using drones to capture aerial images of whales; we’re also taking fecal, biopsy, and phytoplankton samples. We’re recording the location, temperature, water conditions, wind conditions, cloud cover, date/time, water depth, and so much more. All of this data will help us and other scientists answer critical questions. Thus, to my fellow scientists, I feel your pain and I applaud you, because I too know that the challenges that come with big data are worth it. And, to the non-scientists out there, hopefully this gives you some insight as to why we scientists ask for external hard drives as gifts.

Leila launching the drone to collect aerial images of gray whales to measure body condition. Photo source: Alexa K.
Using the theodolite to collect tracking data on the Pacific Coast Feeding Group in Port Orford, OR. Photo source: Alexa K.

References:

https://support.office.com/en-us/article/excel-specifications-and-limits-1672b34d-7043-467e-8e27-269d656771c3

https://www.merriam-webster.com/dictionary/big%20data

How Unmanned Aircraft Systems (UAS, aka “drones”) are being applied in conservation research

By Leila Lemos, Ph.D. Student, Department of Fisheries and Wildlife, OSU

 

Unmanned Aircraft Systems (UAS), also known as “drones”, have been increasingly used in many diverse areas. In field research, the use of drones has brought benefits such as reduced error, increased safety, and greater survey effort, as described in a previous blog post of mine.

Several study groups around the world have been applying this new technology to a great variety of research applications, aiding in the conservation of certain areas and their respective fauna and flora. Examples of these applications, described below, include forest monitoring and tree cover analyses, mapping of vulnerable areas, marine debris detection, and wildlife monitoring.

Using drones for forest monitoring and tree cover analyses allows for many applications, such as biodiversity and tree height monitoring, forest classification and inventory, and plant disease detection. The Ugalla Primate Project, for example, performed an interesting study on tree coverage mapping in western Tanzania (Figure 1).

Figure 1: Tree coverage analyses in Tanzania.
Source: Conservation Drones, 2016.

 

Access to this data (not possible before from the ground) and the acquired knowledge of tree density and structure were important for better understanding how wild primates exploit a mosaic landscape. Here is a video about this project:

 

Forest restoration activities can also be monitored by drones. Rainforests around the world have been depleted through deforestation, partly to open up space for agriculture. To meet conservation goals, large areas are being restored to rainforest today (Elsevier 2015). It is important to monitor the success of the forest regeneration and to ensure that the inspected area is being replenished with the right vegetation. Since inspection events can be costly, labor-intensive and time-consuming, drones can facilitate these procedures, making the monitoring process more feasible.

Zahawi et al. (2015) conducted an interesting study in Costa Rica, tracking the success of forest regeneration. They were also able to spot many fruit-eating birds that are important for forest regeneration (e.g., mountain thrush, black guan and sooty-capped bush tanager). The researchers concluded that automating the process led to equally accurate results.

Drones can also be used to inspect areas for illegal logging and habitat destruction. Conservationists have struggled to identify illegal activities, and drones can accelerate the identification of these activities, help monitor their spread, and ensure that they do not intersect with protected areas.

The Amazon Basin Conservation Association Los Amigos conservancy concession (LACC) has been monitoring 145,000 hectares of the local conservation area. Illegal gold mining and logging activities were identified (Figure 2), and drones have aided in tracking the spread of these activities and the progress of reforestation efforts.

Figure 2: Identification of illegal activities in the Amazon Basin.
Source: NPR, 2015.

 

Another remarkable project was conducted in Mexico, in one of the most important sites for monarch butterflies in the country: the Monarch Butterfly Biosphere Reserve. Around 10 hectares of vital trees were cut down in the reserve during 2013-2015, and a great decrease in the monarch population was observed. The reserve did not allow researchers to enter the area for inspection due to safety concerns. Therefore, drones were used, and they were able to reveal the illegal logging activity (Figure 3).

Figure 3: Identification of illegal logging at the Monarch Butterfly Biosphere Reserve, Mexico.
Source: Take Part, 2016.

 

Regarding the use of drones for mapping vulnerable areas, this new technology can be used to map potentially exposed areas and help avoid catastrophes. Concerning responses to fires or other natural disasters, drones can fly immediately, while planes and helicopters require a certain lead time. Drones can also operate successfully under challenging conditions such as rain, snow and high temperatures, as in the case of fires. Data can be assessed in real time, with no need to have firefighters or other personnel at a dangerous location anymore; drones can now fulfill this role. Examples of drone applications in this regard are the detection, monitoring and support of responses to catastrophes such as landslides, tsunamis, ship collisions, volcanic eruptions, nuclear accidents, fire scenes, flooding, storms and hurricanes, as well as the rescue of people and wildlife at risk. In addition, the use of a thermal imaging camera can better assist in rescue operations.

Researchers from the Universidad Politécnica de Madrid (UPM) are developing a system to detect forest fires by using a color index (Cruz et al. 2016). This index is based on vegetation classification techniques that have been adapted to detect different tonalities of flames and smoke (Figure 4). This new technique would produce more cost-effective outcomes than conventional systems (e.g., helicopters, satellites) and could reach inaccessible locations.
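As a rough illustration of the color-index idea (this is not the published FFDI, which is considerably more sophisticated; the threshold values here are invented for demonstration), a classifier might flag strongly red-dominant pixels as flame-like:

```python
def flame_like(r, g, b, ratio=1.4):
    """Toy color-index test: flames and glowing embers are strongly
    red-dominant, so flag pixels where red clearly exceeds green and
    blue and is itself bright. Thresholds are illustrative only."""
    return r > ratio * g and r > ratio * b and r > 120

print(flame_like(250, 90, 40))   # saturated orange-red pixel -> True
print(flame_like(40, 120, 50))   # vegetation-green pixel -> False
```

Running such a per-pixel index over aerial frames yields candidate fire regions, which is the essence of adapting vegetation classification to flame and smoke tonalities.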

Figure 4: Fire detection with Forest Fire Detection Index (FFDI) in different scenes.
Source: UPM, 2016.

 

Marine debris detection by drones is another great application. The location and extent of the problem can be determined through drone footage, and action plans for clean-ups can be developed.

Research conducted by the Duke University Marine Lab has been detecting marine debris on beaches around the world. They indicate that marine debris impacts water quality and harms wildlife (e.g., whales, sea birds, seals and sea turtles) that might confuse floating plastic with food. You can read a bit more about their research and its importance for conservation here.

Drones are also being used extensively for wildlife monitoring. Through drone footage, researchers around the world have been able to detect and map wildlife and habitat use, estimate densities, evaluate population status, detect rare behaviors, and combat poaching, among other applications. One of the main benefits of using a drone, instead of helicopters or airplanes or having researchers in the area, is the lower disturbance it may cause to wildlife.

A research team from Monash University is using drones for seabird monitoring on remote islands in northwestern Australia (Figure 5). After some tests, the researchers were able to determine the altitude (~75 meters) at which the drone would not cause any disturbance to the birds. Results from projects like this should be used in the future for approaching the species safely.

Figure 5: Photograph taken by a drone of a crested tern colony on a remote island in Australia.
Source: Conservation Drones, 2014.

 

Drones are also being used to combat elephant and rhino poaching in Africa. They are being deployed to predict, trace, track, and catch poaching suspects, with the aim of reducing the number of animals killed for their tusks and horns and for the illegal wildlife trade. You can read more about this topic here, and see drones applied against these illegal practices in this video.

As if the innovation of this device alone were not enough, drones are also being used to carry other tools. A good example is the collection of whale breath samples by attaching Petri dishes or sterile sponges to the underside of the drone.

Collecting samples of the whales’ exhaled breath allows many health-monitoring applications, such as the analysis of viral and bacterial loads, DNA, and hormones, and the detection of environmental toxins in their bodies. This non-invasive physiological tool, known as “Snotbot”, allows sample collection without close approaches and with minimal or no disturbance of the animals. The following video describes this amazing project:

It is inspiring to look at all of these wonderful applications of drones in conservation research. Our GEMM Lab team is already applying this great tool in the field and is hoping to support the conservation of wildlife.

 

 

References

Conservation Drones. 2014. Conservation Drones for Seabird Monitoring. Available at: https://conservationdrones.org/2014/05/05/conservation-drones-for-seabird-monitoring/

Conservation Drones. 2016. Tree cover analyses in Tanzania in collaboration with Envirodrone. Available at: https://conservationdrones.org/2016/09/17/tree-cover-analyses-in-tanzania-in-collaboration-with-envirodrone/

Cruz H, Eckert M, Meneses J and Martínez JF. 2016. Efficient Forest Fire Detection Index for Application in Unmanned Aerial Systems (UASs). Sensors 16(893):1-16.

Elsevier. 2015. Drones Could Make Forest Conservation Monitoring Significantly Cheaper: new study published in Biological Conservation wins Elsevier’s Atlas award for September 2015. Available at: https://www.elsevier.com/about/press-releases/research-and-journals/drones-could-make-forest-conservation-monitoring-significantly-cheaper

NPR. 2015. Eyes In The Sky: Foam Drones Keep Watch On Rain Forest Trees. Available at: http://www.npr.org/sections/goatsandsoda/2015/05/19/398765759/eyes-in-the-sky-styrofoam-drones-keep-watch-on-rainforest-trees

Take Part. 2016. Drones Uncover Illegal Logging in Critical Monarch Butterfly Reserve. Available at: http://www.takepart.com/article/2016/06/22/drones-uncover-illegal-logging-monarch-butterfly-habitat

UPM. 2016. New automatic forest fire detection system by using surveillance drones. Available at: http://www.upm.es/internacional/UPM/UPM_Channel/News/dc52fff26abf7510VgnVCM10000009c7648aRCRD

Zahawi RA, Dandois JP, Holl KD, Nadwodny D, Reid JL and Ellis EC. 2015. Using lightweight unmanned aerial vehicles to monitor tropical forest recovery. Biological Conservation 186:287–295.

 

The five senses of fieldwork

By Leila Lemos, PhD student

 

This summer was full of emotions for me: I finally started my first fieldwork season after almost a year of classes and saw my first gray whale (love at first sight!).

During the fieldwork we use a small research vessel (we call it “Red Rocket”) along the Oregon coast to collect data for my PhD project. We are collecting gray whale fecal samples to analyze hormone variations; acoustic data to assess ambient noise changes at different locations and also variations before, during and after events like the “Halibut opener”; GoPro recordings to evaluate prey availability; photographs in order to identify each individual whale and assess body and skin condition; and video recordings through UAS (aka “drone”) flights, so we can measure the whales and classify them as skinny/fat, calf/juvenile/adult and pregnant/non-pregnant.

However, in order to collect all of these data, we need to first find the whales. This is when we use our first sense: vision. We are always looking at the horizon searching for a blow to come up and once we see it, we safely approach the animal and start watching the individual’s behavior and taking photographs.

If the animal is surfacing regularly to allow a successful drone overflight, we stay with the whale and launch the UAS in order to collect photogrammetry and behavior data.

Each team member performs different functions on the boat, as seen in the figure below.

Figure 1: UAS image showing each team member’s function on the boat at the moment just after the UAS launch.

 

While one member pilots the boat, another operates the UAS. Another team member is responsible for taking photos of the whales so we can match individuals with the UAS videos. And the last team member puts the calibration board of known length in the water, so that we can later calculate the exact size of each pixel at various UAS altitudes, which allows us to accurately measure whale lengths. Team members also alternate between these and other functions.
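The scaling step behind this can be sketched in a few lines (the numbers below are made up for illustration, not our actual field data): the board’s known length divided by its length in pixels gives the size of one pixel at that altitude, and a whale’s length in pixels multiplied by that pixel size gives its length in meters.

```python
def pixel_size_m(board_length_m, board_length_px):
    """Meters per pixel, from a calibration board of known length
    as it appears in a UAS image taken at a given altitude."""
    return board_length_m / board_length_px

def whale_length_m(whale_length_px, gsd):
    """Whale length in meters, given its length in pixels and the
    pixel size (gsd, meters per pixel) at the same altitude."""
    return whale_length_px * gsd

# Hypothetical example: a 1.0 m board spanning 80 px implies a pixel
# size of 0.0125 m; a whale spanning 960 px then measures 12.0 m.
gsd = pixel_size_m(1.0, 80)
length = whale_length_m(960, gsd)
```

Because pixel size changes with altitude, the board must be measured at the same altitudes used for the whale overflights.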

Sometimes we put the UAS in the air and no whales are at the surface, or we can’t find any. These animals only stay at the surface for a short time, so working with whales can be really challenging. UAS batteries only last 15-20 minutes, and we need to make the most of that time. All team members need to help the UAS pilot find whales, and that is when, besides vision, we use hearing too. The sound of a whale’s respiration (blow) can be very loud, especially when whales are close. Once we find a whale, we call out its location to the UAS pilot: “whale at 2 o’clock, 30 meters from the boat!” and the pilot moves in for an overflight.

The opposite – too many whales around – can also happen. While we are observing one individual or searching for it in one direction, we may hear a blow from another whale right behind us, and that’s the signal for us to look for other individuals too.

But now you might be asking yourself: “ok, I agree with vision and hearing, but what about the other three senses? Smell? Taste? Touch?” Believe it or not, this happens. Sometimes whales surface pretty close to the boat and blow. If the wind is in our direction – ARGHHHH – we smell it and even taste it (after the first time you learn to close your mouth!). Not a smell I recommend.

Fecal samples are responsible for the 5th sense: touch!

Once we identify that the whale pooped, we approach the fecal plume in order to collect as much fecal matter as possible (Fig.2).

Figure 2: A: the poop is identified; B: the boat approaches the fecal plume, which floats at the surface for ~30 seconds; C: one team member remains at the bow of the boat to indicate where the feces are; D: another team member collects them with a fine-mesh net. (Filmed under NOAA/NMFS permit #16111 to John Calambokidis.)

 

After collecting the poop we transfer all of it from the net to a small jar that we then keep cool in an ice chest until we arrive back at the lab and put it in the freezer. So, how do we transfer the poop to the jar? By touching it! We put the jar inside the net and transfer each poop spot to the jar with the help of water pressure from a squeeze bottle full of ambient salt water.

Figure 3: Two gray whales swimming around kelp forests. (Filmed under NOAA/NMFS permit #16111 to John Calambokidis.)

 

That’s how we use our senses to study the whales. We also use an underwater sensory system (a GoPro) to see what the whales are feeding on.

GoPro video of mysid swarms that we recorded near feeding gray whales in Port Orford in August 2016:

Our fieldwork is wrapping up this week, and I can already say that it has been a success. The challenging Oregon weather allowed us to work on 25 days: 6 days in Port Orford and 19 days in the Newport and Depoe Bay region, totaling 141 hours and 50 minutes of effort. We saw 195 whales during 97 different sightings and collected 49 fecal samples. We also performed 67 UAS flights, 34 drifter deployments (to collect acoustic data), and 34 GoPro deployments.

It is incredible to see how much data we obtained! Now the second part of the challenge begins: putting all of these data together and making sense of the results. My next steps are:

– photo-identification analysis;

– body and skin condition scoring of individuals;

– photogrammetry analysis;

– analysis of the GoPro videos to characterize prey;

– hormone analysis laboratory training in November at the Seattle Aquarium.

 

For now, enjoy some pictures and a video we collected during the fieldwork this summer. It was hard to choose my favorite pictures from 11,061 photos and a video from 13 hours and 29 minutes of recording, but I finally did! Enjoy!

Figure 4: Gray whale breaching in Port Orford on August 27th. (Photo by Leila Lemos; Taken under NOAA/NMFS permit #16111 to John Calambokidis).

 

Figure 5: Rainbow formation through sunlight refraction on the water droplets of a gray whale individual’s blow in Newport on September 15th. (Photo by Leila Lemos; Taken under NOAA/NMFS permit #16111 to John Calambokidis).

 

Likely gray whale nursing behavior (Taken under NOAA/NMFS permit #16111 to John Calambokidis):

Unmanned Aircraft Systems: keep your distance from wildlife!

By Leila Lemos, Ph.D. Student, Department of Fisheries and Wildlife, OSU

Unmanned aircraft systems (UAS) or “drones” are becoming commonly used to observe natural landscapes and wildlife. These systems can provide important information regarding habitat conditions, distribution and abundance of populations, and health, fitness and behavior of the individuals (Goebel et al. 2015, Durban et al. 2016).

The benefits of UAS use by researchers and wildlife managers are varied, and include reduced error in population estimates, reduced observer fatigue, increased observer safety, increased survey effort, and access to remote settings and harsh environments (Koski et al. 2010, Vermeulen et al. 2013, Goebel et al. 2015, Smith et al. 2016). Importantly, data gathered by UAS can provide information needed for the conservation and management of several species. Although it is often assumed that wildlife incur minimal disturbance from UAS, due to their reduced noise compared to the traditional aircraft used for wildlife monitoring (Acevedo-Whitehouse et al. 2010), the impacts of UAS on most wildlife populations are currently unexplored.

Several studies have tried to characterize the effects of UAS flights on animals, and so far there is no evidence of behavioral disturbance. For instance, Vermeulen et al. (2013) watched for reaction or warning behavior in a group of elephants while a UAS passed over the individuals ten times at altitudes of 100 and 300 meters, and no disturbance was recorded. Furthermore, Acevedo-Whitehouse et al. (2010) reported that six species of large cetaceans (Bryde’s whale, fin whale, sperm whale, humpback whale, blue whale and gray whale) did not display avoidance behavior when approached by a UAS for blow sampling, suggesting that the system caused minimal distress (negative stress) to the individuals.

However, the fact that we cannot visually detect an effect in an animal does not mean that a stress response is not occurring. One study analyzed the effects of UAS flights on the movements and heart rates of American black bears in northwestern Minnesota (Ditmer et al. 2015). All bears, including an individual that was hibernating, responded to UAS flights with increased heart rates (up to 123 beats per minute above the pre-flight baseline). In contrast, no behavioral response by the bears was recorded (Figure 1).

Figure 1: (A) Movement rates (meters per hour) of an adult female black bear with cubs prior to, during, and after a UAS flight (gray bar); (B) The corresponding heart rate (beats per minute) of the adult female black bear. Source: Modified from Figure 1 from Ditmer et al. 2015.

 

Therefore, behavioral analysis alone may not be able to describe the complete effects of UAS on wildlife, and it is important to consider other possible stress responses of wildlife.

Regarding marine mammals, only a few studies have systematically documented the effects of UAS on these animals. Smith et al. (2016) reviewed these studies and identified the main factors influencing behavioral disturbance as (1) noise and visual stimuli (from the UAS or its shadow), and (2) the flight altitude of the UAS. Thus, studies that approach marine mammals closely with UAS (e.g., blow sampling in cetaceans) should monitor carefully for behavioral reactions, because noise levels and visual stimuli will likely be increased.

Fortunately, when UAS are used over cetaceans and sirenians (manatees and dugongs), the air-water interface acts as a barrier to sound, so these animals are unlikely to be acoustically disturbed by UAS. However, acoustic detection and response are still possible when an animal’s ears are exposed above the water during a surfacing event.

The best way to minimize stress responses in wildlife is to use caution while operating UAS at any altitude. According to the National Oceanic and Atmospheric Administration (NOAA), “UAS can also be disruptive to both people and animals if not used safely, appropriately, or responsibly”. Therefore, since 2012 the Federal Aviation Administration (FAA) has required UAS operators in the United States to have a certified and registered aircraft, a licensed pilot, and operational approval, known as a Section 333 Exemption (note: in late August 2016, the Section 333 process will be replaced by a revision to Part 107). These authorizations require an airworthiness statement or certificate and a registered aircraft. Public entities, like Oregon State University, operate under a certificate of authorization (COA). As a public entity, OSU certifies its own aircraft and sets standards for UAS operators. These permit requirements discourage illegal operations and improve safety.

Regarding marine mammals, all UAS operators should also be aware of the Marine Mammal Protection Act (MMPA) of 1972. This law makes it illegal to harass marine mammals in the wild, where harassment includes any disruption of behavioral patterns, including, but not limited to, migration, breathing, nursing, breeding, feeding, or sheltering. A close UAS approach has the potential to harass marine mammals, so federal guidelines recommend keeping a safe distance from these animals in the wild. The required vertical distance is 1,000 ft for most marine mammals, increasing to a 1,500 ft buffer for endangered animals such as North Atlantic right whales (http://www.nmfs.noaa.gov/pr/uas.html). Therefore, NOAA evaluates all scientific research that uses UAS within 1,000 ft of marine mammals to ensure that the benefits outweigh the possible hazards, and distributes research permits accordingly.

Of course, with new technology the rules are always changing. In fact, last week the Department of Transportation (DOT) and the FAA finalized the first operational rules for routine commercial use of small UAS. These new guidelines aim to support new innovations in order to spur job growth, advance critical scientific research and save lives, and are designed to minimize risks to other aircraft and people and property on the ground. These new regulations include several requirements (e.g., height and speed restrictions) and hopefully allow for a streamlined system that enables beneficial and exciting wildlife research.

For my PhD project we are using UAS to collect aerial images from gray whales in order to describe behavioral patterns and apply a photogrammetry methodology. Through these methods we will determine the overall body condition and health of the individuals for comparison to variable ambient ocean noise levels. This project is conducted in collaboration with the NOAA Pacific Marine Environmental Lab.

Since October 2015, we have conducted 31 over-flights of gray whales using our UAS (DJI Phantom 3) and no behavioral disturbance has been observed. When over the whale(s) we generally fly between 25 and 40 m above the animals. We have a FAA certified UAS operator and fly under our NOAA/NMFS permit 16111. Prior to each flight we ensure that the weather conditions are safe, the whales are behaving normally, and that no on-lookers from shore or other boats will be disturbed.

Here is a video showing the launch and retrieval of the UAS system, our research vessel, the surrounding Oregon coastline beauty and gray whale individuals. The video includes some interesting footage of a gray whale foraging over a shallow reef, indicating that this UAS flight did not disturb the animal’s natural behavior patterns.

We all have the responsibility to help keep wildlife safe. Here in the GEMM Lab, we commit to using UAS safely and responsibly, and aim to use this new and exciting technology to continue our efforts to better protect and understand marine mammals.

 

References

Acevedo‐Whitehouse K, Rocha‐Gosselin A and Gendron D. 2010. A novel non‐invasive tool for disease surveillance of free‐ranging whales and its relevance to conservation programs. Anim. Conserv. 13(2):217–225.

Ditmer MA, Vincent JB, Werden LK, Tanner JC, Laske TG, Iaizzo PA, Garshelis DL and Fieberg JR. 2015. Bears Show a Physiological but Limited Behavioral Response to Unmanned Aerial Vehicles. Current Biology 25:2278–2283.

Durban JW, Moore MJ, Chiang G, Hickmott LS, Bocconcelli A, Howes G, Bahamonde PA, Perryman WL and Leroi DJ. 2016. Photogrammetry of blue whales with an unmanned hexacopter. Marine Mammal Science. DOI: 10.1111/mms.12328.

Goebel ME, Perryman WL, Hinke JT, Krause DJ, Hann NA, Gardner S and LeRoi DJ. 2015. A small unmanned aerial system for estimating abundance and size of Antarctic predators. Polar Biol. 38(5):619-630.

Koski WR, Abgrall P and Yazvenko SB. 2010. An inventory and evaluation of unmanned aerial systems for offshore surveys of marine mammals. J. Cetacean Res. Manag. 11(3):239–247.

NOAA. Unmanned Aircraft Systems: Responsible Use to Help Protect Marine Mammals. Available at: http://www.nmfs.noaa.gov/pr/uas.html. Accessed: 06/12/2016.

Smith CE, Sykora-Bodie ST, Bloodworth B, Pack SM, Spradlin TR and LeBoeuf NR. 2016. Assessment of known impacts of unmanned aerial systems (UAS) on marine mammals: data gaps and recommendations for researchers in the United States. J. Unmanned Veh. Syst. 4:1–14.

Vermeulen C, Lejeune P, Lisein J, Sawadogo P and Bouché P. 2013. Unmanned aerial survey of elephants. PLoS One. 8(2):e54700.