The 2018 RobotArt Competition
I am an artist based in Vancouver, BC, Canada. I paint cityscapes and landscapes using acrylic paint on canvas and paper. I have been looking for a way to merge my interest in technology with my art practice, and the competition gave me a starting point to figure it out… Here is how I took on the challenge:
Above are my six submissions to the 2018 competition. Every brush stroke in these paintings was made with the robotic arm that I am holding.
My process of getting a robot to paint:
I have been consistently painting and selling my paintings for the past 12 years. I paint landscapes and cityscapes of the Pacific Northwest and my international travels, in acrylic on canvas and paper. When I started the robotics project, I first needed to figure out which robot would suit my needs and then understand its capabilities: accuracy, repeatability, and most importantly how to control it. My ultimate goal is to build my own linkage (I have worked in the bike industry designing full-suspension linkages, so I hope to build a large painting linkage…), but I decided the best way to start was with an off-the-shelf system.
Early Paintings – November 2017
The first robotic arm I purchased had miserable repeatability, with a ±5 mm tolerance in all directions. This created many challenges, especially in the z-direction: not great for putting a marker or a pencil consistently to the page. The robot would slam the marker into the page or wander off the painting area. Since a large paintbrush with long bristles is forgiving of a wide range of z heights, my very first paintings were random straight lines made with a larger brush.
My early paintings (above) were random brush strokes because my first robot could not reliably hit programmed locations (let alone paint a straight line). The varying thickness of the brushstrokes in both pieces is due to the inaccuracy of the robot!
Importance of the Under-Painting
When painting in acrylic in my regular art practice, I paint in layers and use a hair dryer so each layer of paint is completely dry and not mixed with the earlier brush strokes. I brought this layering philosophy to programming my robot from the beginning. In my process, it is critical to have an under-painting to get depth in a painting (more important to me than adding the ability to mix paint on a palette). So the first things I programmed the robot to do were to clean the brush thoroughly and to wait for layers to dry before dipping the brush into the next color. In the videos I posted for the competition you will see the robot waiting up to 7 minutes between colors.
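As a rough sketch (illustrative, not my actual control code), the layering loop looks something like this. The `robot` object and its `clean_brush`, `dip`, and `paint` commands are stand-ins for the real arm's interface, and the 420-second default matches the ~7-minute waits in the videos:

```python
import time

def paint_layers(layers, robot, dry_seconds=420):
    """Paint a list of (colour, strokes) layers in order.

    Before each colour the brush is cleaned thoroughly; between
    layers the loop pauses so the previous layer can fully dry
    before the next one goes on top.
    """
    for i, (colour, strokes) in enumerate(layers):
        robot.clean_brush()          # thorough clean before each colour
        if i > 0:
            time.sleep(dry_seconds)  # wait for the previous layer to dry
        robot.dip(colour)
        for stroke in strokes:
            robot.paint(stroke)
```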
Mountain Range (6″ x 6″, acrylic on canvas) has a bright blue under-painting that you can see in progress on the left. The blue peeks through in the final painting (right).
Progression from wobbly lines to painterly curves
Eventually my first robot died; its servo motors vibrated badly. My replacement arm uses stepper motors and is far more precise: ±0.2 mm. Since I had already started down the path of random lines, I continued on that programming path.
Due to the accuracy of the new robot, the straight lines it painted were too straight. Definitely not painterly! So I spent significant time working out the trigonometry to draw a curve from an input of two x,y coordinates and the angle of the curve. These values, along with a stroke color, are saved into a CSV file when I am creating a brush stroke set.
The most significant programming time was spent altering my Python code to create curves between two points (right) instead of straight lines (left). Artists paint from the joints of their wrists, so most brush strokes have a curve to them. Recreating that curve was a critical step in giving my robotic painting process a painterly look.
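For the curious, here is a sketch of one way to do that trigonometry (the function name and sampling details are illustrative, not my production code): treat the stroke as a circular arc through the two endpoints, where the included angle sets how much the stroke bulges, so an angle of 0 degrades to a straight line.

```python
import math

def arc_points(x0, y0, x1, y1, angle_deg, n=20):
    """Sample n points along a circular arc from (x0, y0) to (x1, y1).

    angle_deg is the included angle of the arc: 0 gives a straight
    line, larger magnitudes give a more pronounced curve, and the
    sign flips which side of the chord the stroke bulges toward.
    """
    if abs(angle_deg) < 1e-9:
        # Degenerate case: a straight line between the two points.
        return [(x0 + (x1 - x0) * i / (n - 1),
                 y0 + (y1 - y0) * i / (n - 1)) for i in range(n)]
    theta = math.radians(angle_deg)
    chord = math.hypot(x1 - x0, y1 - y0)
    # Radius of the circle whose chord subtends the given angle.
    r = chord / (2 * math.sin(theta / 2))
    # Arc centre: offset from the chord midpoint along its normal.
    mx, my = (x0 + x1) / 2, (y0 + y1) / 2
    nx, ny = -(y1 - y0) / chord, (x1 - x0) / chord
    cx = mx + nx * r * math.cos(theta / 2)
    cy = my + ny * r * math.cos(theta / 2)
    # Sweep from the start angle to the end angle around the centre.
    a0 = math.atan2(y0 - cy, x0 - cx)
    a1 = math.atan2(y1 - cy, x1 - cx)
    sweep = math.atan2(math.sin(a1 - a0), math.cos(a1 - a0))
    rad = math.hypot(x0 - cx, y0 - cy)
    return [(cx + rad * math.cos(a0 + sweep * i / (n - 1)),
             cy + rad * math.sin(a0 + sweep * i / (n - 1)))
            for i in range(n)]
```

These sampled points can then be sent to the arm one by one as waypoints along the stroke.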
Finding Subject Matter for the Robot
Once I had designed the brush cleaning and dipping setup (a lot of trial and error) using random curved-line paintings, I started creating brush stroke sets based on 8-bit .png files (for the 8 colors, as part of the competition). It took many, many paintings to figure out how to optimize the brush stroke set based on the image file. This also changes with each image, just as every painting by hand has a different approach. With each painting I am always optimizing the code.
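As a much-simplified illustration of turning an indexed-color image into strokes (my real optimization is more involved and changes with each image), this sketch converts each horizontal run of same-colored pixels into one straight stroke. The grid-of-palette-indices input and the scale factor are hypothetical:

```python
def runs_to_strokes(pixels, scale=1.0):
    """Convert horizontal runs of same-coloured pixels into strokes.

    pixels is a grid (list of rows) of palette indices, with 0 meaning
    blank canvas. Each run of a non-zero colour becomes one stroke
    tuple (x0, y0, x1, y1, colour), scaled from pixel units to
    canvas units by `scale`.
    """
    strokes = []
    for y, row in enumerate(pixels):
        x = 0
        while x < len(row):
            colour = row[x]
            start = x
            # Advance to the end of this run of identical colour.
            while x < len(row) and row[x] == colour:
                x += 1
            if colour != 0:
                strokes.append((start * scale, y * scale,
                                (x - 1) * scale, y * scale, colour))
    return strokes
```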
When considering what to paint for the Robot Art Competition, I looked carefully at the brush strokes the robot created. I initially tried landscape paintings, as per my normal art practice, but had dismal results: landscapes require a lot of blending, and a human hand and eye reviewing the work, to achieve a thoughtful sense of depth, and they work best on bigger canvases than my robot has the reach for. Florals immediately made sense for the composition, as depth is not required and I could get bold paintings with minimal colors in a very small space (6″ x 6″, to be exact). I also loved the idea of a very technical piece of equipment painting a very human subject like flowers.
I chose floral still life as the main subject matter of my 2018 submission based on the capabilities of the robot, the limits of the competition… and, well, people like flowers 🙂
The Process of Painting Flowers
Each floral painting is organized into four layers, which I sketch out in Photoshop with 8 colors: the under-painting, the backdrop, the flowers, and a final layer of tweaks based on how the brush stroke set looks mocked up on my screen. I may also re-attempt the painting after I have seen it in person (my first few florals were extremely abstract).
Schematic of the process I use to create a brush stroke set to paint
For the abstract (non-floral) pieces I started to have fun with a feature of the robot: the ability to add calculations. I began programming the brush stroke size, angle, and length variables based on the stroke's color or its position on the canvas. This way of using the robot gives me SO many ideas!
Abstract painting submitted to the competition: going left to right, the brush stroke angle goes from 0 degrees to -90 degrees. The robot's ability to use math is a pandora's box of ideas for painting abstract patterns and shapes.
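A minimal sketch of that calculated-stroke idea (the canvas size, spacing, and stroke length here are hypothetical values in mm, not my real settings): each stroke's angle is a linear function of its x position, sweeping from 0° at the left edge toward -90° at the right.

```python
import math

def angled_strokes(canvas=150.0, spacing=10.0, length=8.0):
    """Generate short strokes on a grid across the canvas.

    Each stroke's angle depends on its x position: 0 degrees at the
    left edge, approaching -90 degrees at the right edge. Returns
    tuples (x0, y0, x1, y1, angle_deg) centred on the grid points.
    """
    strokes = []
    y = spacing / 2
    while y < canvas:
        x = spacing / 2
        while x < canvas:
            angle = -90.0 * x / canvas          # linear in x position
            dx = length / 2 * math.cos(math.radians(angle))
            dy = length / 2 * math.sin(math.radians(angle))
            strokes.append((x - dx, y - dy, x + dx, y + dy, angle))
            x += spacing
        y += spacing
    return strokes
```

The same pattern works for any of the stroke variables; swapping the formula to depend on color index or y position gives a completely different abstract composition from the same loop.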
Vote for your favorite artwork
A portion of the judging is done by public voting. I am up against universities with much bigger networks than I have on my own… If you have a moment, I would very much appreciate it if you could click on your favorite Joanne Hastie robot painting below and vote for me via your Facebook page. Each painting's page on their website shows more photos of the painting, a link to a YouTube video of it being created, and the link to vote.
My 2018 submissions to the 3rd annual Robot Art Competition. Vote for your favorite by clicking here
Getting a robotic arm to paint was fun, challenging, and exhausting; on average each entry took about 15 hours of work (excluding the hours spent on trigonometry and learning Python before I could even start considering a painting… then the post-painting video editing for the competition). My art practice (by hand) has improved as well, because this project made me focus on simplifying my work: getting back to basics, creating art with a tool of very limited capability, and defining my brush strokes, i.e. how to paint an area versus outline an area.
Thank you for taking a look at my work and reading this far. Thank you to Robot Art Competition for giving me a deadline to attempt this challenge!! If you have any questions or comments please reach out to me by checking out my website and contact page.
If you enjoy reading about my adventures in robotic painting – I have more stories posted in “Painting Variables” on my website.
Blog Posts about the Robotic Arm
Each week I add more complexity to the robotic painting project. It has been fun as I can leave the robot painting while I work on my own paintings (the ones I do myself by hand). This weekend I achieved several strides, which gave me so many more ideas... All...
Can a robot create a painting? I attend a monthly artist business group in Vancouver; when I told them about my idea of a painting robot, the reaction was confusion: “Why would I paint pictures of robots?”. I quickly learned to rephrase this idea: I actually meant that...