With the new algorithm, the system can apply a range of styles in real time, so your viewfinder displays the enhanced image as you shoot.
Admit it: most of the images you capture on your phone need retouching before you share, print or save them. The reason is that lighting conditions in your shots, whether of a subject or a landscape, are usually not up to the mark, so you end up with images that are either too dark or overexposed. To get that perfect shot, you would need a high-end camera with a professional-grade sensor, and those cost a bomb. What if we told you that future Android smartphones could come with automatic photo-enhancing software right out of the box?
Yes, it will soon be here. Google and MIT researchers have together made a breakthrough in speeding up high dynamic range (HDR) image processing so that the result appears on your display without any lag. The system computes the enhancement on a low-resolution copy of the photo and then applies the effects back to the full-resolution image. As a result, it needs only around a tenth of the processing power of present solutions, which operate on the image at its full resolution.
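The trick of doing the heavy work at low resolution can be sketched roughly as follows. This is a toy illustration, not Google's implementation: the `enhance` function, the block-averaging downsample and the gamma-like brightening curve are all made up for the example; in the real system a trained neural network decides the enhancement.

```python
import numpy as np

def enhance(image, scale=8):
    """Toy sketch of the low-resolution trick for a colour image:
    estimate a per-pixel brightening gain on a downsampled copy, then
    upsample that gain map and apply it to the full-resolution image."""
    h = image.shape[0] // scale * scale
    w = image.shape[1] // scale * scale
    img = image[:h, :w].astype(np.float64)
    # Downsample by block averaging; the real system runs its expensive
    # model on this small image instead of the full one.
    small = img.reshape(h // scale, scale, w // scale, scale, -1).mean(axis=(1, 3))
    # Hypothetical enhancement: a gamma-like curve that lifts dark regions.
    target = np.sqrt(small / 255.0) * 255.0
    gain = target / np.maximum(small, 1.0)
    # Upsample the cheap gain map (nearest-neighbour here) and apply it,
    # so the costly step never touches the full-size image.
    gain_full = np.repeat(np.repeat(gain, scale, axis=0), scale, axis=1)
    return np.clip(img * gain_full, 0, 255).astype(np.uint8)
```

Because the gain map is computed on an image that is `scale * scale` times smaller, the expensive per-pixel analysis shrinks by the same factor, which is where the roughly tenfold saving comes from.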
Jon Barron, a researcher at Google, states that the system will be applied to smartphones to create real-time photographic experiences without draining the battery or causing lag. The AI program was trained using machine learning on around 5,000 photos, each retouched by five different professional photographers. These photos were fed to the system so that it could learn to improve photos in different ways, adjusting brightness and colour saturation as needed. Google has been exploring similar techniques for Google Street View.