Would you want your phone to tell you what makes a pretty picture?
Google's latest Neural Image Assessment system (NIMA) uses AI to scan the pictures on your phone for quality and then help choose the most attractive ones.
The system uses deep convolutional neural networks, a type of computing system loosely modelled on the biological neural networks of the brain, to scan phone photos for both technical and aesthetic qualities.
Google hopes to develop the system into an app that suggests improvements such as tweaks to brightness and contrast in real time, and even offers tips to improve the framing and 'aesthetic beauty and emotional appeal' of images.
Trained on data reflecting what human judges generally select as good images in photo contests, the algorithm rates photos on technical elements such as blurriness, highlights and use of shadows.
Once it is officially released for use on phones and computers, the system will also judge photos based on more subjective elements of attractiveness, such as aesthetic beauty or emotional appeal.
Ideally, each photo will be compared to a reference image of a similar subject and style.
But if no such reference image is available, Google will use statistical data to gauge what image humans are most likely to prefer.
The algorithm gives every photo a composite score from 1 to 10 and suggests edits, such as adjustments to brightness and exposure.
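In Google's published description of NIMA, the network does not output a single number directly: it predicts a probability distribution over the ten possible ratings, and the composite score is the mean of that distribution. A minimal sketch of that final step, using a hypothetical predicted distribution in place of real network output:

```python
# Hypothetical probability distribution over ratings 1..10,
# standing in for the output of a trained NIMA-style network.
predicted = [0.01, 0.02, 0.05, 0.10, 0.18, 0.24, 0.20, 0.12, 0.06, 0.02]

# Composite score: the mean of the distribution, i.e. each rating
# weighted by its predicted probability.
mean_score = sum(rating * p for rating, p in enumerate(predicted, start=1))
print(f"Composite score: {mean_score:.2f}")  # prints "Composite score: 6.04"
```

Keeping the full distribution rather than a single score also lets the system express how strongly judges would agree on a photo, not just the average verdict.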