This is one of those sites you’re going to want to try yourself. Take any black and white image, feed it to the algorithm, and watch as it spits out its best guess at a color version, which is often quite convincing.
The site uses a deep learning algorithm developed by Richard Zhang, Phillip Isola, and Alexei Efros of UC Berkeley, trained on one million images. It currently fools humans only 20% of the time, but that’s still a significantly higher rate than previous attempts, and further training should only push that number up. Imagine a time when Photoshop can colorize a photo in one step, leaving the end-user to just tweak a few hues here and there. Such a capability would be huge for restoring old family photos and the like.
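To get a feel for what the algorithm is doing under the hood, here’s a minimal sketch of the general colorization pipeline: the network only ever sees the lightness channel of the image and has to guess the two color channels, which are then merged back in. The model object `colorize_net` and its `predict_ab` method are hypothetical stand-ins for the researchers’ actual network, which they have published separately; this is just the shape of the process, not their implementation.

```python
# Minimal sketch of a colorization pipeline, assuming a hypothetical
# pretrained model `colorize_net` that predicts the a/b color channels
# of the Lab color space from a grayscale (lightness) input.
import numpy as np
from skimage import color, io

def colorize(path, colorize_net):
    gray = io.imread(path, as_gray=True)   # grayscale image, values in [0, 1]
    L = gray * 100.0                        # lightness channel in Lab space

    # The network sees only the lightness channel and guesses the two
    # color channels; `predict_ab` is a hypothetical method returning
    # an array of shape (H, W, 2).
    ab = colorize_net.predict_ab(L)

    lab = np.dstack([L, ab])                # reassemble the full Lab image
    return color.lab2rgb(lab)               # convert back to RGB for display
```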
I played with it a bit today, and the results were rather interesting. It struggles a bit with skin tones, which may be down to dataset bias (the team may have trained it more heavily on landscape-style images). Still, the results are often fairly good, and definitely close enough that they could be tweaked to believability without much effort.