
Exciting: Google's new algorithm retouches pictures just like a professional photographer

Wouldn’t the world be so much simpler if we could take a picture with our phones and it instantly became the best version of itself – the dull colors, crooked horizon, one closed eye... all fixed by one easy app or a smart algorithm?

Researchers from MIT and Google have recently demonstrated an algorithm capable of automatically retouching photos just like a professional photographer using professional editing software.

Although the technology is very complex under the hood, all we need to know is this: we take a photo, the algorithm processes it and identifies exactly which changes will make the picture look better than ever – increasing contrast, calculating the right brightness level and so on – and the whole thing takes less than 20 milliseconds.

“That’s 50 times a second,” says Michael Gharbi, an MIT doctoral student and lead author of the paper. Gharbi’s algorithm edits and transforms photos so fast that you probably won’t even see the first version of the picture – to us, it simply looks perfect from the start.

Gharbi started working with researchers from Google in 2016 to explore how neural networks might learn to mimic specific photographic styles. The work follows a similar line of research from 2015, when German researchers built a neural network that could imitate the styles of painters like Van Gogh and Picasso. The idea, Gharbi says, is to make it easier to produce professional-grade images without opening an editing app.

Gharbi’s algorithm identifies specific features within an image and applies the appropriate improvements to each. “Usually every pixel gets the same transformation,” he says. “It becomes more interesting when you have images that need to be retouched in specific areas. The algorithm might learn, for example, to automatically brighten a face in a selfie with a sunny background. You could train the network to increase the saturation of water or bump up the green in trees when it recognizes a landscape photo.”
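
To make the idea of region-specific retouching a bit more concrete, here is a minimal Python sketch – not the researchers' actual method, and the face rectangle and numbers below are purely hypothetical placeholders – that brightens only one area of a photo instead of applying the same transformation to every pixel:

```python
# Minimal sketch (not the paper's method): apply a local adjustment to one
# region of an image instead of transforming every pixel the same way.
# The face box below is a hypothetical stand-in for a real face detector.
import numpy as np

def brighten_region(image, box, gain=1.4, feather=15):
    """Brighten only the pixels inside `box` (x0, y0, x1, y1), with a soft edge."""
    x0, y0, x1, y1 = box
    mask = np.zeros(image.shape[:2], dtype=np.float32)
    mask[y0:y1, x0:x1] = 1.0
    # Feather the mask so the adjustment blends smoothly into the background.
    kernel = np.ones(feather) / feather
    mask = np.apply_along_axis(lambda r: np.convolve(r, kernel, mode="same"), 0, mask)
    mask = np.apply_along_axis(lambda r: np.convolve(r, kernel, mode="same"), 1, mask)
    out = image.astype(np.float32) * (1.0 + (gain - 1.0) * mask[..., None])
    return np.clip(out, 0, 255).astype(np.uint8)

# Example: brighten a (hypothetical) face region in a backlit selfie.
selfie = np.random.randint(0, 255, (480, 640, 3), dtype=np.uint8)
retouched = brighten_region(selfie, box=(220, 120, 420, 360))
```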

Gharbi’s algorithm is capable of all this because the researchers trained it, over and over again, on pairs of original and professionally retouched photos. It had to study more than 5,000 professionally edited pictures to learn what separates the “bad” from the “good”.
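
For the curious, here is a toy Python sketch of that training idea – learning an edit from pairs of originals and retouched targets. It is only an illustration under heavily simplified assumptions (a made-up dataset and a trivially simple “model”), not the network described in the paper:

```python
# Toy sketch of the training idea (not the actual network): learn an edit from
# pairs of (original, professionally retouched) values by minimizing the
# difference between the model's output and the retouched target.
import numpy as np

rng = np.random.default_rng(0)

# Stand-in "dataset": originals and targets as RGB values in [0, 1].
originals = rng.random((100, 3))                  # hypothetical training pixels
targets = np.clip(originals * 1.2 + 0.05, 0, 1)   # pretend a pro brightened them

gain, offset = np.ones(3), np.zeros(3)            # a trivially simple "model"
lr = 0.5
for step in range(200):
    pred = originals * gain + offset
    err = pred - targets                          # distance from the pro edit
    gain -= lr * 2 * np.mean(err * originals, axis=0)
    offset -= lr * 2 * np.mean(err, axis=0)

print(gain, offset)  # roughly recovers the brightening used to build the targets
```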

The most impressive aspect for an everyday user is that the software can run on mobile phones. “The key to making it fast and run in real time is to not process all the pixels in an image,” Gharbi says. Thanks to this optimization, a mobile device has no trouble running the software.
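
Here is a rough, simplified illustration of that speed trick in Python – analyzing only a small thumbnail and then applying the result to the full-resolution photo. It is an assumption-laden sketch of the general idea, not the researchers' implementation:

```python
# Rough illustration (assumed, simplified): estimate the adjustment on a small
# thumbnail, then apply it to every pixel of the full-resolution image, so the
# expensive analysis never has to touch all the pixels.
import numpy as np

def auto_contrast(full_res, thumb_stride=16):
    # Analyze only a coarse thumbnail (every 16th pixel in each direction).
    thumb = full_res[::thumb_stride, ::thumb_stride].astype(np.float32)
    lo, hi = np.percentile(thumb, 1), np.percentile(thumb, 99)
    # Apply the stretch computed on the thumbnail to the full-resolution image.
    out = (full_res.astype(np.float32) - lo) * 255.0 / max(hi - lo, 1e-6)
    return np.clip(out, 0, 255).astype(np.uint8)

photo = np.random.randint(30, 200, (3000, 4000, 3), dtype=np.uint8)
enhanced = auto_contrast(photo)
```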

Although this feature is still in the research phase, it has already given many selfie junkies shivers of excitement. Demand for this kind of fast, automatic retouching keeps growing day by day, and we cannot wait to see it finally launch!
