New Technology Allows Humans To "See" The World Through Animal Eyes
By Madison Dapcevich
Most animals have completely different visual systems to humans. For many species, it is unclear how they see complex visual information or color patterns in nature. The University of Queensland
Editor's note: A previous version of this article incorrectly identified the bee/human viewpoints in the image below.
Ever wonder what a honeybee sees as it darts through fields of flowers? You may soon find out, thanks to innovative new software that gives humans a chance to “see” the world through the eyes of animals.
Until now, it has been difficult to understand how animals perceive the environment around them. Most animals have visual systems that differ from our own, so it is unclear how many species see the complex visual information or color patterns that drive their behavior. That’s why researchers from the Universities of Queensland and Exeter have developed Quantitative Color Pattern Analysis (QCPA) to allow people to process and visualize environmental information in much the same way as animals do.
“The framework first strips away all of the information from the digital images which wouldn't be visible to the animal, leaving only the colors and details that would be visible from a given viewing distance,” study author Jolyon Troscianko told IFLScience.
“Next, the image is broken down into a smaller number of distinct colors, and various mathematical models are then used to quantify the arrangement, complexity, and intensity of these colors. Importantly, all these steps make use of our understanding of the limits of animal color vision, and can even include things like ultraviolet vision.”
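The steps Troscianko describes, discarding detail the animal could not resolve, reducing the image to a small set of distinct colors, and then quantifying the result, can be sketched in a few lines. This is a toy illustration only, not the actual QCPA code: the box blur, the per-channel quantization, and the crude "distinct color count" statistic below are simplified stand-ins for the real framework's visual-acuity and color-vision models.

```python
import numpy as np

def simplify_image(img, n_colors=4, blur_radius=1):
    """Toy sketch of the QCPA-style steps (illustrative, not the real code):
    (1) blur to mimic loss of detail at a viewing distance,
    (2) quantize to a small set of distinct color levels,
    (3) report a crude pattern statistic (distinct colors remaining)."""
    blurred = np.asarray(img, dtype=float)
    # Step 1: repeated 5-point box blur discards fine detail.
    for _ in range(blur_radius):
        padded = np.pad(blurred, ((1, 1), (1, 1), (0, 0)), mode="edge")
        blurred = (
            padded[:-2, 1:-1] + padded[2:, 1:-1] +
            padded[1:-1, :-2] + padded[1:-1, 2:] + padded[1:-1, 1:-1]
        ) / 5.0
    # Step 2: snap each channel to the nearest of n_colors evenly spaced levels.
    levels = np.linspace(0, 255, n_colors)
    quantized = levels[np.abs(blurred[..., None] - levels).argmin(axis=-1)]
    # Step 3: count the distinct colors that survive the simplification.
    distinct = {tuple(px) for px in quantized.reshape(-1, 3)}
    return quantized, len(distinct)

# Example: a half-black, half-white patch blends at the boundary after blurring.
img = np.zeros((6, 6, 3))
img[:, 3:] = 255
quant, n_found = simplify_image(img)
```

The real framework replaces each of these stand-ins with a model grounded in the viewer's actual visual acuity and receptor sensitivities, including ultraviolet where relevant.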
QCPA calibrates digital photography and visual modeling to represent what an animal might see. A digital camera uses an array of light sensors, with each pixel's response determined by the amount of light it receives at a given wavelength, explained study co-author Cedric van den Berg.
“If we know how the RGB receptors in the camera respond to light, we can use that information to reconstruct pixel by pixel how much light at what wavelengths we had present in an image when it was taken,” said van den Berg, adding that once this calibration has been done, the software can then start to model how light would behave if it entered an animal's visual system.
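The calibration van den Berg describes can be illustrated with a minimal sketch: undo the camera's nonlinear response to recover an estimate of the light in the scene, then map that estimate into an animal's receptor space. Everything here is a placeholder assumption, the gamma value and the 3x3 mapping matrix are invented for the example, not measured camera or animal values.

```python
import numpy as np

# Typical display/camera gamma (an assumption for this sketch).
GAMMA = 2.2

# Hypothetical matrix mapping linearized camera RGB to an animal's three
# photoreceptor responses; the values are placeholders, not real data.
CAMERA_TO_RECEPTOR = np.array([
    [0.6, 0.3, 0.1],
    [0.2, 0.7, 0.1],
    [0.1, 0.2, 0.7],
])

def rgb_to_receptor(rgb):
    """Undo the camera's nonlinear response (crude inverse gamma), then
    project the linear light estimate into the animal's receptor space."""
    linear = (np.asarray(rgb, dtype=float) / 255.0) ** GAMMA
    return CAMERA_TO_RECEPTOR @ linear
```

In the real system this mapping is built from the measured spectral sensitivities of the camera and of the animal's photoreceptors, which is what allows the software to reconstruct, pixel by pixel, how much light at what wavelengths was present when the image was taken.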
A field of bluebells from the perspective of a bee (left) and a human (right). Jolyon Troscianko
The technology, which the researchers say offers an “unprecedented quantitative and qualitative level” of analysis, is described in the journal Methods in Ecology and Evolution. It is used to study the design and function of color patterns in nature: how animal and plant color patterns appear against their natural background, and how an animal might perceive the relationship between the two. Existing methods are limited in that color and pattern are rarely analyzed in conjunction with each other, the authors noted, yet that combination is particularly crucial to how animals interact with their environment.
“‘Differential blending’ is an example of this, where an animal matches some of the background colors (but not all). Where sections of matching color occur at the edge of the animal, this interferes with the animal's outline, which is one of the main features used to identify animals,” said Troscianko.
The system can be used in almost any habitat and with a variety of digital imaging processes, from regular cameras to more complex imaging systems. The researchers note that their free and easy-to-use online platform allows for citizen scientists to use the program for a variety of applications, such as identifying how an animal’s camouflage works in order to improve land management strategies.
Cedric van den Berg uses a prototype camera system to take calibrated pictures of a nudibranch mollusc at Lizard Island on the Great Barrier Reef. Emily Guevara and Cedric van den Berg