This is an early test of my latest algorithms for producing a fully automated painting. There were two inputs: a single photograph and a stroke map made by human input. The stroke map was a quick human trace of the image, later used to flavor the type and style of the robot's strokes.

The photograph was reduced to 8 colors by k-means clustering. This color-reduced image served as a paint map, hinting at where colors belonged and helping to detect edges. Interestingly, though, no strokes were explicitly mapped out. Instead, a camera took a picture of the canvas, which was compared to the k-means-clustered image to create a delta map: the numeric difference between the RGB values seen on the canvas and those of the image being painted. The algorithms then looked for the largest delta in the map and applied a stroke of the appropriate color at that coordinate. A photograph of the canvas was retaken and the delta map recalculated. The process then repeated: find the largest delta, apply a stroke, retake the photo, and so on. This creative feedback loop ran for 2 days to produce the painting seen here.

If you watch the time-lapse video, you can see the progression of the delta map over the course of those 2 days. Blue indicates areas that need to be darkened, while red indicates areas that need lightening. At times the mapping is warped, and at other times the algorithm failed to produce a map for unknown reasons.

Though I have used this approach before, this is the first painting that attempted to combine this algorithm with k-means clustering and a stroke map. I also describe part of the process briefly in the video made by America's Greatest Makers.
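For readers curious about the mechanics, the feedback loop described above can be sketched in a few dozen lines. This is a minimal simulation, not the robot's actual code: the function names are hypothetical, the k-means step is a basic NumPy implementation standing in for whatever clustering the real system uses, and a "stroke" is simulated as a small square patch of the target color rather than a physical brush movement.

```python
import numpy as np

def kmeans_palette(img, k=8, iters=10, seed=0):
    """Reduce an RGB image to k colors with basic k-means.
    (Hypothetical stand-in for the clustering step; the exact
    method used by the robot is not specified.)"""
    rng = np.random.default_rng(seed)
    pixels = img.reshape(-1, 3).astype(float)
    centers = pixels[rng.choice(len(pixels), k, replace=False)]
    for _ in range(iters):
        # Assign each pixel to its nearest cluster center.
        dists = np.linalg.norm(pixels[:, None] - centers[None], axis=2)
        labels = dists.argmin(axis=1)
        # Move each center to the mean of its assigned pixels.
        for j in range(k):
            members = pixels[labels == j]
            if len(members):
                centers[j] = members.mean(axis=0)
    reduced = centers[labels].reshape(img.shape)
    return np.clip(np.round(reduced), 0, 255).astype(np.uint8)

def delta_map(canvas, target):
    """Per-pixel magnitude of the RGB difference between the
    canvas photo and the color-reduced target image."""
    return np.linalg.norm(canvas.astype(float) - target.astype(float), axis=2)

def paint_step(canvas, target, radius=2):
    """Find the largest delta and 'apply a stroke' there: here a
    stroke is just a small patch filled with the target color."""
    delta = delta_map(canvas, target)
    y, x = np.unravel_index(delta.argmax(), delta.shape)
    canvas[max(0, y - radius):y + radius + 1,
           max(0, x - radius):x + radius + 1] = target[y, x]
    return canvas, (y, x)

# Simulated run: random "photo", blank white canvas,
# then repeat: photograph, compute delta, stroke the worst spot.
rng = np.random.default_rng(1)
photo = rng.integers(0, 256, (32, 32, 3), dtype=np.uint8)
target = kmeans_palette(photo, k=8)
canvas = np.full_like(photo, 255)
for _ in range(200):
    canvas, _ = paint_step(canvas, target)
```

In the real system the "retake photo" step replaces the direct array read here, which is why warping and mapping failures can creep in: the loop is only as good as the camera's registration of canvas pixels to image pixels.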