Wednesday, 26 July 2017


Researchers have trained a neural network to scour Google Street View (which of course is not limited to urban environs) and frame what it believes to be æsthetic scenes, applying the cropping, lighting and composition adjustments that it acquired in the learning process. The coda to this experiment was to subject the photographs to a sort of human-juried “Turing test.” The judges, who were not told that a machine had selected and perfected the images, rated nearly half of them as the work of a professional. Chew more of the scenery over at Twisted Sifter at the link up top, learn more about the exercise in deep learning, and wonder about its implications.