Parks x Needle is a series of human-robot tufted maps of New York City parks’ wildlife zones

Parks x Needle is an homage to some of my favorite things: yarn, parks, and robots. This project was done in collaboration with Owen Trueblood as part of the 2022 Ground Truth IRL exhibition hosted by the Data Through Design (DxD) collective.

How It Was Made

There were four major steps in producing these textile park visualizations:

  1. Hacking the tufting gun so that it can be controlled programmatically

  2. Generating tool paths for the tufting gun to tuft the shape of a park from each borough in New York City

  3. Controlling a UR5 robot arm and tufting gun to stitch the shapes of the wildlife zones

  4. Punch-needling by hand, inspired by my park visits, to fill in the boundaries tufted by the robot

Hacking the Tufting Gun

Tufting is a textile manufacturing process where a needle shoves yarn into fabric and the fabric’s backing holds the yarn in place. It can be done by hand with a special large hollow needle called a punch needle, or with a machine called a tufting gun.

We first prototyped by mounting a punch needle on the UR5 robot arm, but decided to work with a tufting gun instead for speed. We wanted a way to control the tufting gun either from the robot controller or a computer. To do so, Owen replaced the motor controller, added an Arduino, exposed the Arduino’s USB connection, and added a standard stereo 3.5mm audio jack as the IO port. The USB connection was used to update the firmware, the robot’s digital output was used to turn the tufting gun on or off, and its analog output was used to set the speed of the tufting gun.

Tufting gun before our modifications

Tufting gun with motor controller and microcontroller added to allow the robot to control it

Generating Tool Paths

For the final pieces, I wanted the robot to draw the shapes of the parks, which I would later fill in with colors I saw in the parks. To do so, we used the NYC Parks Forever Wild dataset from Open Data NY, which maps the ecologically important wildlife zones across 138 parks in NYC. For parsing the data and generating the tool paths, I used software called Houdini. I felt comfortable with Houdini because I’ve worked in the animation industry for the past five years, where it is heavily used for creating procedural models, crowd simulations, and special effects.

Houdini provides a node-based programming environment for manipulating geometry, and its off-the-shelf nodes allow for common operations like moving geometry around, simulating particles, adding noise, etc. But there are also nodes that let you write arbitrary programs to create any kind of operation you might want. By combining some common geometry nodes and custom scripts, I made a Houdini node network that did the following:

  1. Create a canvas polygon out of horizontal rows to represent the 16’ x 20’ monk’s cloth to be tufted on.

  2. Create Houdini polygon primitives from the NYC Parks Forever Wild GeoJSON data using vvzen’s Houdini-Geospatial-Tools.

  3. Transform the park shape primitive to fit in the canvas.

  4. Extrude the park shape to find intersections with the canvas polygon.

  5. Group the points on the canvas that intersect with the park shape.

  6. Assign attributes to each point describing the direction of the tool tip. I wanted the robot to move row by row from left to right, so I assigned the direction values while walking along the rows of the canvas polygon.

  7. Create lines based on the point attributes’ state changes. These correspond to the lines that will be tufted by the robot. The robot motion planner takes in a series of points and does the inverse kinematics for us, so we just needed the endpoints of these lines.

  8. Export the endpoints, along with a command for whether the tufting gun needs to go down to tuft or come up to stop, as a comma-separated values (CSV) file (see the sketch after this list).
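To make steps 6 through 8 concrete, here is a rough plain-Python sketch of the same row-scan logic, written outside of Houdini. The row and column spacings, the point_inside_park test, and the toolpath.csv layout are placeholders I made up for illustration; in the actual network these come from the canvas polygon, the extrusion intersection groups, and the point attributes described above.

```python
import csv

# Placeholder canvas: a grid of row points standing in for the canvas polygon.
ROW_SPACING = 10   # mm between tufted rows (made-up value)
COL_SPACING = 5    # mm between sample points along a row (made-up value)
ROWS, COLS = 40, 80

def point_inside_park(x, y):
    """Stand-in for the intersection/grouping done in steps 4 and 5."""
    return (x - 200) ** 2 + (y - 200) ** 2 < 150 ** 2  # a dummy circular "park"

segments = []  # (x_start, y_start, x_end, y_end) for each line to be tufted

# Steps 6-7: scan each row left to right and turn every continuous run of
# "inside the park" points into one line segment (the state change).
for r in range(ROWS):
    y = r * ROW_SPACING
    start_x = None
    for c in range(COLS):
        x = c * COL_SPACING
        if point_inside_park(x, y):
            if start_x is None:
                start_x = x                          # entering the park shape
        elif start_x is not None:
            segments.append((start_x, y, x, y))      # leaving the park shape
            start_x = None
    if start_x is not None:                          # run reaches the end of the row
        segments.append((start_x, y, (COLS - 1) * COL_SPACING, y))

# Step 8: export endpoints plus a down/up command for the tufting gun.
with open("toolpath.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["x", "y", "command"])
    for x0, y0, x1, y1 in segments:
        writer.writerow([x0, y0, "down"])   # move here, lower needle, gun on
        writer.writerow([x1, y1, "up"])     # tuft to here, gun off, lift needle
```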

Controlling the Machines

With the CSV of generated tool paths in hand, the next step was to use it to control the robot arm and the tufting gun. For this, we wrote a Python script to parse the CSV and send signals to the robot arm to move to a position or power the tufting gun. To communicate with the robot, we used Owen’s Python wrapper around RoboDK and robolink. RoboDK is a simulator for industrial robot programming, and robolink is a Python module that interfaces with RoboDK.
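As a rough illustration of what that control loop can look like, here is a minimal sketch written against the standard RoboDK/robolink Python API rather than Owen’s wrapper. The CSV layout matches the export sketch above, and the digital output channel, tool orientation, and Z heights are assumptions for the example, not the project’s exact values.

```python
import csv
from math import pi

# Import paths vary by RoboDK version; older installs use
# `from robolink import ...` and `from robodk import transl, rotx` instead.
from robodk.robolink import Robolink, ITEM_TYPE_ROBOT
from robodk.robomath import transl, rotx

RDK = Robolink()                          # connect to a running RoboDK instance
robot = RDK.Item('UR5', ITEM_TYPE_ROBOT)  # the UR5 defined in the RoboDK station

GUN_DO = 0    # digital output channel wired to the tufting gun (assumed)
SAFE_Z = 50   # retract height above the cloth, in mm (assumed)
TUFT_Z = 0    # height at which the needle engages the cloth (assumed)

def pose_at(x, y, z):
    """Tool pose over the canvas, needle pointing straight down (assumed frame)."""
    return transl(x, y, z) * rotx(pi)

with open("toolpath.csv") as f:
    reader = csv.reader(f)
    next(reader)  # skip the header row
    for x, y, command in reader:
        x, y = float(x), float(y)
        if command == "down":
            robot.MoveL(pose_at(x, y, SAFE_Z))  # travel to the start of the line
            robot.MoveL(pose_at(x, y, TUFT_Z))  # lower the needle into the cloth
            robot.setDO(GUN_DO, 1)              # turn the tufting gun on
        else:  # "up"
            robot.MoveL(pose_at(x, y, TUFT_Z))  # tuft along the cloth to the endpoint
            robot.setDO(GUN_DO, 0)              # turn the tufting gun off
            robot.MoveL(pose_at(x, y, SAFE_Z))  # retract clear of the cloth
```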

Park-Inspired Punch-Needling

In the spirit of the exhibit’s theme, “Ground Truth IRL”, I visited parks in all five boroughs of New York City to experience and understand what lies within these boundaries of ecologically important wildlife zones. This was also during the holiday season of 2021, when my friends had gone back to their respective homes while I couldn’t return to Korea due to COVID travel restrictions. With no friends or family around, I turned to nature.

At first glance, the parks in winter seemed so bare, gray, and cold because the trees had shed all their foliage. But with more time and attention, other parts of the forests started to come into sight. There were vibrant pink thorns and vines in the midst of dead leaves, golden ochre terminal buds at the ends of branches gearing up for new growth, and so much green manure from Canada geese.

I could have made a geographically accurate map of these sights or a statistically accurate color palette. But for the final piece that I was punch-needling by hand, I chose to focus on and highlight the sights that struck me the most from my personal park excursions. While the public data gave the shape or context of the pieces, the color and feeling were dictated by my personal experience.

As an example, for Prospect Park, I wanted to showcase the dead gingko leaves, the evergreen ivies, and the red ice rescue ladder. The gingko leaves reminded me of my home in Korea, where we had many gingko trees, and the ivies being the only green thing in the park and the rescue ladder being an ice ladder felt like poetic depictions of winter to me.

Work In Progress shot of Prospect Park, Brooklyn.

Exhibiting

Parks x Needle was part of Data Through Design’s 2022 exhibit, Ground Truth IRL. There were 11 other projects presented in this exhibit, including woven bar charts dyed with food scraps that visualize food insecurity in NY, an installation examining discount stores in NY, and an experimental video exploring the unseen traces of children in the OpenCity data.

For the exhibit, we presented five pieces, each representing a park from one of the five boroughs of New York.

We also brought the robot tufting system for a live demo, which allowed the audience to watch the process, ask questions, and learn how the pieces were made.

To see more technical details, check out the Robotic Tufting System project log on Hackaday.io.
