New Tool Detects Photoshop Shenanigans in Fashion Photos

80beats | By Veronique Greenwood | Nov 30, 2011 1:28 AM



An image analyzed by the researchers, before retouching, after retouching, with an overlay that shows the strongest retouching in red, and with two facial overlays showing other measures of retouching.

What's the News: It's not news that in the age of Photoshop, celebrities and models in magazines have started to look like perfect aliens crash-landed among us ugly Earthlings. But though it's sometimes obvious when a photo editor has gone too far (witness the Ralph Lauren her-head's-bigger-than-her-pelvis debacle), the gap between what real people look like and what magazines and other media regularly show has grown distressingly wide without most people consciously noticing it, creating a sea of misinformation that may contribute to body-image disorders. An analytical tool developed by Dartmouth scientists, though, picks up and quantifies those alterations, potentially providing a useful metric for policymakers looking to set boundaries on how much limb-stretching, torso-trimming, face-smoothing alteration is appropriate.

How the Heck:

  • The tool rates altered images on the basis of geometric changes, like stretching and shrinking, and photometric changes, like airbrushing and heightened colors.

  • To test their system, the researchers had 390 volunteers from Amazon's Mechanical Turk service analyze 468 photos before and after retouching, most of them from the sites of retouchers advertising their services. Each photo got a rating (1 to 5) that described how much reworking the image had undergone.

  • The researchers then had their system rate the images on the basis of statistical measures like facial distortion and local image similarity, which measures the photometric change between different areas of a picture.

  • In most cases, it corroborated the ratings of the human volunteers. Interestingly, the few examples where the results didn't jibe reveal that humans do some complex processing that machines do not when looking at faces: the volunteers picked up on minute changes to facial structure more strongly than the tool did.
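The photometric side of a before/after comparison like this can be illustrated with a structural-similarity (SSIM-style) score, which drops as smoothing and airbrushing change an image's local statistics. This is a minimal sketch for intuition only, not the authors' published metric; the `ssim` helper and the synthetic "photos" are assumptions for demonstration.

```python
import numpy as np

def ssim(a: np.ndarray, b: np.ndarray) -> float:
    """Simplified global SSIM between two grayscale images in [0, 1].
    Returns 1.0 for identical images; lower values mean more alteration."""
    c1, c2 = 0.01 ** 2, 0.03 ** 2              # small stabilizing constants
    mu_a, mu_b = a.mean(), b.mean()
    var_a, var_b = a.var(), b.var()
    cov = ((a - mu_a) * (b - mu_b)).mean()
    return ((2 * mu_a * mu_b + c1) * (2 * cov + c2)) / (
        (mu_a ** 2 + mu_b ** 2 + c1) * (var_a + var_b + c2))

rng = np.random.default_rng(0)
before = rng.random((64, 64))                  # stand-in "before" photo
# Crude stand-in for airbrushing: average each pixel with two neighbors.
after = (before + np.roll(before, 1, axis=0) + np.roll(before, 1, axis=1)) / 3

print(f"unaltered score: {ssim(before, before):.3f}")  # 1.000
print(f"retouched score: {ssim(before, after):.3f}")   # below 1.0
```

A production metric, like the one in the paper, would combine scores such as this with geometric measures (e.g., how far facial landmarks moved) into a single rating comparable to the volunteers' 1-to-5 scale.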

What's the Context:

Reference: Kee and Farid. "A perceptual metric for photo retouching." PNAS, published online before print November 28, 2011. doi: 10.1073/pnas.1110747108

Image courtesy of PNAS
