Painting Robots and the Artificial Intelligence behind their Creativity – cloudpainter

CloudPainter is a project I created to explore augmented creativity. For the past decade I have been experimenting with AI by making painting robots, which I program to make as many independent aesthetic decisions as possible. Most of the time the paintings come out pretty decent, but occasionally there is a failure. One of my favorite memories is of a live demo for a film crew during which a robot started spraying magenta paint all over the dining room wall. Despite the occasional spectacular failure, the robots sometimes hit on something interesting, something that gives insight into human creativity. CloudPainter’s entries in the 2017 Robot Art Challenge are examples of a process that I believe has hit on something interesting, and I am excited to share it. These images, and how they were made, raise important questions about the nature of art and human creativity.


Briefly described, this year’s entry makes significant strides in its ability to create and apply abstraction. Last year’s RobotArt entries were notable in that the robot created and painted its own original compositions from start to finish. The only decision a human made was the decision to create a portrait. Everything else, from taking multiple photographs, selecting a favorite image, cropping it into an original composition, and selecting a palette, to painting it, was done without human intervention. This year, CloudPainter takes it a step further by not only repeating the entire aesthetic decision-making process, but also abstracting the original composition in a purposeful manner. It achieves this with a Deep Learning technique called Style Transfer, in which the robot’s original composition is reimagined in the style of a reference piece of art. In some cases it references art that the robot made itself, in others a famous painting, and in others a painting by Dante, Hunter, or Pindar. Regardless of the style CloudPainter chooses to draw inspiration from, the effects are dramatic: the resulting image looks like it was abstracted in the style of the referenced artwork. More details of how this is done, and an animation showing Style Transfer being applied to three images, can be seen in the accompanying video.
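At the heart of Style Transfer is the idea that the "style" of an image can be summarized by the correlations between feature channels of a convolutional network (a Gram matrix), and that a generated image can be optimized until its Gram matrices match those of the reference artwork. The sketch below shows only that core calculation in NumPy, with random arrays standing in for real network activations; the function names and shapes are illustrative assumptions, not CloudPainter's actual pipeline.

```python
import numpy as np

def gram_matrix(features):
    """Correlations between feature channels.

    features: (channels, height, width) activation map from a conv layer.
    Returns a (channels, channels) Gram matrix, normalized by map size.
    """
    c, h, w = features.shape
    f = features.reshape(c, h * w)
    return f @ f.T / (h * w)

def style_loss(generated_feats, style_feats):
    """Mean squared difference between Gram matrices.

    Style transfer minimizes this (together with a content loss)
    by gradient descent on the pixels of the generated image.
    """
    g_gen = gram_matrix(generated_feats)
    g_sty = gram_matrix(style_feats)
    return float(np.mean((g_gen - g_sty) ** 2))

# Toy check: random "activations" standing in for real conv features.
rng = np.random.default_rng(0)
a = rng.standard_normal((8, 16, 16))
b = rng.standard_normal((8, 16, 16))
print(style_loss(a, a))      # identical features -> 0.0
print(style_loss(a, b) > 0)  # different features -> positive loss
```

In a full implementation the features would come from a pretrained network and the loss would be back-propagated to the image; this fragment only shows why matching Gram matrices transfers texture and color statistics rather than exact content.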


In addition to applying meaningful abstraction, CloudPainter has also begun to learn from its previous artworks. The robot has long been collecting brushstroke data on the hundreds of paintings it has created over the years. The geometry, color, timing, and overall composition of each stroke have been stored in several databases that now hold detailed descriptions of how all the paintings were created. This year, instead of just recording strokes, the robot began referencing its database for guidance on how to execute new strokes. The details of how its historical strokes are used are highly experimental and vary with each painting. The important thing to understand conceptually is that the robot is now trying to learn from its history, and with every stroke it continues to add more data to that history. The robot is trying to get better with each new painting by repeating stroke techniques that have worked in previous paintings. In some cases it even attempts to learn from recreations it has done of classical masterpieces. CloudPainter is trying to hone its technique in a manner similar to a human art student. Though experimental, it is beginning to work, and there is a more in-depth explanation of how in the accompanying video.
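One simple way such a stroke database could be consulted is a nearest-neighbor lookup: given the properties of the stroke about to be painted, retrieve the most similar past stroke and reuse parts of its execution. The record schema, field names, and similarity metric below are hypothetical stand-ins for whatever CloudPainter actually stores; the sketch just illustrates the retrieve-and-reuse idea.

```python
import math

# Hypothetical schema: one record per historical brushstroke, with the
# geometry (normalized path points), RGB color, and execution time.
stroke_db = [
    {"path": [(0.1, 0.2), (0.4, 0.5)], "color": (200, 30, 90), "duration": 1.2},
    {"path": [(0.6, 0.1), (0.7, 0.8)], "color": (30, 30, 200), "duration": 0.8},
]

def color_distance(c1, c2):
    """Euclidean distance in RGB space (a crude but simple metric)."""
    return math.dist(c1, c2)

def nearest_stroke(target_color, db):
    """Return the past stroke whose color is closest to target_color."""
    return min(db, key=lambda s: color_distance(s["color"], target_color))

# A new stroke in a magenta-ish color reuses timing from the closest
# past stroke rather than starting from scratch.
best = nearest_stroke((190, 40, 100), stroke_db)
print(best["duration"])  # 1.2
```

A real system would match on geometry and composition as well as color, but even this minimal lookup captures the feedback loop described above: every painted stroke becomes training data for the next one.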


CloudPainter has seen a lot of success and failure since Pindar built his first simple plotter in 2005. As his children have grown alongside this art project and become more involved, the robotics have become more sophisticated. The hardware now comprises two Arrick Robotics based plotters, a pair of 7Bot robotic arms, and multiple custom 3D-printed paint heads. The software takes advantage of a host of algorithms that they have been fortunate enough to learn from other roboticists and artificial intelligence pioneers. Some of the highlights of their artistic journey over the past several years have been multiple talks including TEDxFoggyBottom, several short features, and invitations to conferences like SXSL (South By South Lawn at the White House) and Elastic{ON}. They enjoy sharing their art and freely give all of their code and brushstroke data to anyone who inquires; they even make it available on their public websites to people who don’t inquire. For additional details, their code, and a more in-depth explanation of their art, visit their website and code repository.

