
Ryan Schenk


Synynyms is an award-winning tool that I wrote for visualizing the scientific names of species. You see, most species don’t just have one scientific name, they typically have many, which are sometimes hotly contested! Synynyms graphs the popularity of each of these names through time, as can be seen in the graphs below of the American Bison, which changed names from Bos bison to Bison bison in the early 20th Century.

Synynyms ties together the APIs of three important biodiversity informatics projects. The names and taxonomic synonyms themselves come from the Encyclopedia of Life, those names are parsed into their components by the Global Names Index, and most importantly of all, the publication occurrences come from the Biodiversity Heritage Library.

Because the datasets are large and take a long time to download, I wrote this tool in Node.js to facilitate realtime graphing in the client. As vast amounts of data stream in from the web services, the Node server parses and analyzes the data, then sends it to the client over a WebSocket, allowing the user to watch the graphs build themselves in realtime. The graphs act as their own progress bars. On the front end, I used Backbone to keep the code sane, and Raphael to draw the graphs themselves.
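The heart of that pipeline can be sketched as a small fold over incoming records. This is illustrative only, not Synynyms' actual code; the record shape and callback are assumptions:

```javascript
// As occurrence records stream in from a web service, fold them into
// per-name, per-year counts and emit only the delta, so the client
// can extend its graph in realtime. (Record shape is hypothetical.)
function makeAggregator(emit) {
  const counts = {}; // name -> { year -> running count }
  return function onRecord(record) {
    const { name, year } = record;
    counts[name] = counts[name] || {};
    counts[name][year] = (counts[name][year] || 0) + 1;
    // In the real app each delta would be pushed over a WebSocket;
    // here we just hand it to a callback.
    emit({ name, year, count: counts[name][year] });
  };
}

// Usage: each emitted delta is one graph-point update.
const updates = [];
const onRecord = makeAggregator((u) => updates.push(u));
onRecord({ name: 'Bos bison', year: 1890 });
onRecord({ name: 'Bison bison', year: 1920 });
onRecord({ name: 'Bison bison', year: 1920 });
```

Because the client only ever receives increments, the graph grows point by point as the data arrives, which is exactly what makes it double as a progress bar.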


Update! I was interviewed on NPR about Breedlight! I go into detail about the Breedlight project, how it works, and the artistic motivation behind it, as well as my thoughts on art, design, and engineering. Please have a listen!

Breedlight is a series of sculptural lamps that I’ve been working on, whose design is dictated by a genome that I created. Breedlight lamps are capable of reproducing with each other, passing on their traits to future generations of interior lighting.

Breedlights at TEDxWoodsHole

Each lamp has a unique genetic code—a DNA of sorts—that describes its size and shape. Although I make the lamps, I do not design them. I breed them from other Breedlight lamps! Breeding happens inside software that I wrote for this purpose, which can run on the web and in mobile phones.

Each lamp’s traits are contained in a single chromosome, shown above, composed of 20 genes. These 20 genes encode the size and shape of the Breedlight lamp. Its height, width, and the prevalence and shape of its curves are all described by these genes. Breedlight lamps with two different shapes will have correspondingly different chromosomes to manifest those physical differences.

I chose the chromosome model to enable two lamps to reproduce with each other, passing on their genetic material and physical traits to their offspring. This works very much like in real life: to reproduce, each parent lamp splits its chromosome, independently assorts the genes, and contributes half its genetic material to the offspring. The offspring receives a half-chromosome from each parent, combining them to form a full chromosome. Laws that I’ve encoded in the genome determine how the traits inherited from each parent combine and contribute to the offspring’s physical form.
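As a sketch, assuming a chromosome is simply an array of 20 genes (the real gene encoding and combination laws aren't shown here), the reproduction step described above might look like:

```javascript
// Hypothetical sketch of Breedlight reproduction: each parent
// contributes half of its 20 genes, chosen by independent
// assortment, and the two halves combine into a full chromosome.

function contributeHalf(chromosome) {
  // Shuffle the gene loci (Fisher-Yates), then keep half of them.
  const loci = chromosome.map((_, i) => i);
  for (let i = loci.length - 1; i > 0; i--) {
    const j = Math.floor(Math.random() * (i + 1));
    [loci[i], loci[j]] = [loci[j], loci[i]];
  }
  return new Set(loci.slice(0, chromosome.length / 2));
}

function breed(parentA, parentB) {
  // Loci in A's half-chromosome take their gene from A;
  // the remaining loci are filled from B's contribution.
  const fromA = contributeHalf(parentA);
  return parentA.map((gene, i) => (fromA.has(i) ? gene : parentB[i]));
}
```

Because the split is random, repeated breedings of the same two parents produce the sibling variation described below.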

Consider the example breedings above, showing the children of lamp A♥B on the left, and B♥C on the right. Note that lamp B, which features a bulge in the middle, is in both breedings. As you can see, most of the children in both of these breedings inherit the central bulge from lamp B. The children on the left tend to be wider and squatter, which they inherit from lamp A; the children on the right tend to be taller and skinnier, which they inherit from lamp C. These heritable traits are the genes at work, but because the chromosomes are split randomly, there is variation among the children.

A Breedlight’s genetic information is encoded into a QR code, which functions like its DNA. This code can be scanned by a smartphone to launch the Breedlight software, which I wrote to run on mobile phones and in the browser. The software reads the chromosomes of one or more Breedlights, renders the shapes, and allows a customer to breed any two lamps by tapping or clicking on them.

On the backend, the software renders vector outlines of a Breedlight in SVG. I use this file to cut out the forms that the Breedlights are built upon. If I had access to a CNC machine, I could programmatically generate toolpaths from this file, and send it out to be cut; unfortunately, I don’t have a CNC machine, so the forms get cut by yours truly and a jigsaw.
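A minimal sketch of that rendering step, with a hypothetical profile (the real gene-to-shape mapping isn't reproduced here):

```javascript
// Turn a lamp profile (a list of points tracing one side of the
// form) into an SVG path that a CNC toolpath generator, or a
// jigsaw-wielding human, can follow. The points are placeholders.
function profileToSvg(points, width, height) {
  const d = points
    .map((p, i) => `${i === 0 ? 'M' : 'L'} ${p.x} ${p.y}`)
    .join(' ');
  return (
    `<svg xmlns="http://www.w3.org/2000/svg" ` +
    `width="${width}" height="${height}">` +
    `<path d="${d}" fill="none" stroke="black"/></svg>`
  );
}
```
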

From there, I place the form in a jig that I made, and construct the lamps in the traditional manner of a Japanese chochin lantern. This process results in some fantastic-looking lamps, but it is very labor intensive and, when making one-off designs like I am, wasteful of materials. Future developments for Breedlight will be focused on making the construction process efficient, but more importantly on ensuring the materials truly embody the artistic sensibilities of the project. These lamps are organisms, given life by a computational environment, and I am actively seeking out materials and processes that express these traits to the fullest extent.

Meta Detergent Logo

My coworker made a little app called Meta Detergent. I fell in love with the name, and drew him a little illustration to use as his logo.


As I work with scientific data, I am always seeking different ways of displaying it. Recently, I’ve been trying to create juxtapositions, trying to display scientific data in very “unscientific” ways. I built Taxatron as the ultimate extension of this thinking: a kinetic sculpture that displays graphs of scientific data with a 24’ long piece of purple silk chiffon.

The purple silk represents a timeline from 1750 to 2000. It displays a histogram of the number of new species discovered per year in any particular grouping of organisms (called “taxa”). In the image above, we can see the number of new species of octopus discovered every year. There are two major spikes in the early Twentieth Century, which correspond to two major published volumes on mollusks. In the video below, we can see the number of new turtles, birds, and rodents discovered per year since Linnaeus first described species in 1750. The silk moves beautifully as it transitions from one taxon to another.

Taxatron Prototype from Ryan Schenk on Vimeo.

The chiffon creates an entirely new setting for the scholarly data that Taxatron displays. Taxatron recontextualizes “hard science” data into something soft, translucent, and easily mutable. Taken out of the safety of a hard glass computer screen, or the comfort of a poster session, the data is vulnerable, suspended for viewers to inspect from any angle, and even to be buffeted by a strong wind.

Lately, there has been much ado about “transparency” in regards to data. Taxatron takes this charge literally. Light easily passes through the cloth, allowing viewers to visually interact with each other, and allowing the projected metadata to radiate through the cloth, wholly becoming part of the medium. The material selection also plays a pivotal role in the movement of the graphs, effortlessly transforming from one shape to another, and raising important questions about the permanence or fleetingness of scientific discovery and the organisms that inhabit our planet.

Taxatron was displayed at the TDWG 2010 conference in Woods Hole, and again at TEDxWoodsHole. Receptions at both events allowed scientists and nonscientists alike to view, discuss, and even touch scholarly data. I think my favorite part of the event was hearing the librarians and archivists from the MBLWHOI Library correlate spikes in the histograms with the influential monographs that described those species.

From a technical standpoint, the chiffon is suspended from an array of 15 stepper motors, which are mounted to a housing hung from the ceiling. With the help of an electrical engineer friend, I created custom circuit boards that contain the driver circuit, stepper motor mount, and a photo switch used to calibrate the system. The chiffon is suspended from a strand of thread, attached to a small pulley mounted on the stepper motor.

Inside the housing, I fabricated mounting brackets from sheet metal to hold the stepper motor boards in place. All the steppers are wired to an Arduino Mega, which runs the show. I wrote firmware for the Arduino to drive the motors given commands from the serial port. Every time the sculpture powers on, it calibrates itself with the photo switches on each motor board; this allows me to use any length of string to suspend the cloth, and the sculpture will scale the measurements accordingly.

The stepper motors can move the silk into a wide variety of shapes by spooling the thread around a pulley, thus shortening or lengthening the thread suspending each segment of cloth.
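The geometry behind that is simple; here is a back-of-the-envelope version (the pulley diameter and steps-per-revolution are assumptions, not measured values from the sculpture):

```javascript
// Turning the pulley by one step spools a fixed length of thread,
// so a target length change maps directly to a step count.
const STEPS_PER_REV = 20;      // assumed motor resolution
const PULLEY_DIAMETER_MM = 10; // hypothetical pulley size

function stepsForLength(mm) {
  const circumference = Math.PI * PULLEY_DIAMETER_MM;
  const mmPerStep = circumference / STEPS_PER_REV;
  return Math.round(mm / mmPerStep);
}

// One full circumference of thread is one full revolution:
// stepsForLength(Math.PI * 10) → 20
```
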

Taxatron hung at TDWG 2010 and TEDxWoodsHole 2010; unfortunately, both events were extremely short, and I was not able to push the limits of what Taxatron can do because of its temporary location. As you can tell, I barely had time to snap a few photos. I am looking for a home for Taxatron that would allow it to hang for an extended period of time, and allow me to fully develop the projection software to transform the cloth into a unique and truly memorable experience.

Blowing in the Wind

I recently entered two pieces into the Falmouth Artist Guild’s April show, Blowing in the Wind.

The first piece is a cascade of laser cut felt feathers, called Feather Vane.

Feather Vane

The second piece is a slightly more conceptual mobile. It’s a large wood and paper sail that holds a piece of charcoal on the end.

Drawing Mobile

It is very delicately balanced so the charcoal just barely skitters across a paper-covered table. As the wind blows the sail around, it traces light marks on the paper.

Drawing Mobile's Traces Part 2

As the exhibition progresses, the charcoal drawing will build up layer-by-layer. This is the progress after one week. Looking great!

Drawing Mobile's Traces Up close

You can see more photos in the Flickr set.

Driving a $0.40 Stepper With Arduino

I recently bought 250 stepper motors (part number 21-02485-03) for a sculpture that I’m working on, for the low price of 40 cents apiece! Even if you don’t buy them in bulk, they are still really cheap.

There are several driver circuits available for this motor, which were very helpful in figuring out the strange wiring inside this motor. However, none of the above drivers played nicely with the Arduino Stepper Library, which uses Tom Igoe’s stepper driver circuit. It took an evening to figure out how to connect this stepper to Tom’s driver and the Arduino stepper library, so I thought I would post it here.

The first step is to cut the white wire, which turns the stepper from a weird hybrid into a normal bipolar stepper. The links above will explain this in more detail if you’re curious.

The next step is to match the four wires on the motor to Tom Igoe’s circuit and the Arduino stepper library. These use the numbers 1-4 to describe the wires coming out of the motor, and order matters. On the 21-02485-03, wires 1-4 correspond to the Red, Blue, Yellow, and Black wires respectively.

With this knowledge we can easily hook up our stepper to Tom’s circuit and start using the Arduino Stepper Library immediately. Here’s a diagram of Tom Igoe’s 4-wire circuit, with the Arduino code after it. The IC used is an L293D H-bridge; note that the notch in the IC faces to the right (a bit hard to tell).


/*
  Drives the 21-02485-03 Stepper Motor attached
  to Tom Igoe's four-wire driver circuit
*/

#include <Stepper.h>

// Change these to the pin numbers of each color wire
#define YELLOW  8
#define RED     9
#define BLACK   10
#define BLUE    11

#define SPEED 200 // RPM

#define LED_PIN 13

// Create an instance of the Stepper class, specifying
// the number of steps of the motor and the pins it's
// attached to.
// Order matters here when giving it the pins.
Stepper stepper(20, RED, BLUE, YELLOW, BLACK);

void setup() {
  // set the speed of the motor
  stepper.setSpeed(SPEED);
  pinMode(LED_PIN, OUTPUT);
}

void loop() {
  // one revolution forward with the LED on, then one back
  digitalWrite(LED_PIN, HIGH);
  stepper.step(20);
  digitalWrite(LED_PIN, LOW);
  stepper.step(-20);
}

LigerCat–Literature and Genomics Resource Catalogue–is a Web application that provides a big-picture view of the National Library of Medicine’s Medline database. It allows researchers to browse the metadata of hundreds of thousands, even millions, of biomedical journal articles simultaneously. It also has the coolest name of any scientific application published in the peer-reviewed literature.

It is a Ruby on Rails application that I wrote while at the Marine Biological Laboratory. I designed and implemented the entire application stack from the ground up. The data is split between a MySQL database and a Redis cluster; the Redis cluster, which stores hundreds of millions of values, was the largest in the world at the time it was built. To compute queries on demand, it has a large, horizontally scalable processing cluster, whose workers pull tasks from an AMQP work queue and process them in parallel. Using this architecture, I processed each of the 1.9 million species in the Encyclopedia of Life through LigerCat, analyzing the metadata of tens of millions of scholarly articles, in a matter of days.

The basic idea behind LigerCat is that tagging is nothing new. The Web 2.0 folks got the idea from librarians, who have been tagging literature for many years. Librarians tag things using a “controlled vocabulary,” which is a set of tags curated and maintained by some authoritative body. For instance, scientific articles indexed by the National Library of Medicine are tagged with a controlled vocabulary called Medical Subject Headings (MeSH), which has over 20,000 tags in the set.

The Articles search, which is selected by default, allows the user to query the PubMed article database. You can search for a topic, a person, or an organism, and LigerCat will download all the results and build a MeSH tag cloud from the articles your search returned.
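In sketch form, building a cloud from search results could look like the following. The weighting here is a plain count scaled into a font size; LigerCat's actual weighting scheme may differ:

```javascript
// Count how many of the returned articles carry each MeSH heading,
// then scale the counts into font sizes for the cloud. The article
// shape ({ mesh: [...] }) is an assumption for illustration.
function meshCloud(articles, minPx = 10, maxPx = 32) {
  const counts = {};
  for (const article of articles)
    for (const term of article.mesh)
      counts[term] = (counts[term] || 0) + 1;
  const max = Math.max(...Object.values(counts));
  return Object.entries(counts).map(([term, n]) => ({
    term,
    count: n,
    px: Math.round(minPx + (maxPx - minPx) * (n / max)),
  }));
}
```
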

In addition to designing the databases and writing the application code, I also designed and coded the UI. LigerCat has extensive client-side interaction that I developed with MooTools. The page below allows users to visually construct queries to the PubMed repository by clicking on interface elements. As the user clicks, LigerCat queries PubMed in realtime, providing instant on-screen feedback. The publication timeline at the bottom is fully interactive and is implemented entirely in CSS, no JavaScript (admittedly pedantic). I have been contacted by Dryad, Agricola, and Elsevier, who wanted to use LigerCat’s interface to display their metadata. Because I designed LigerCat’s processing system around the Strategy and Command design patterns, placing LigerCat on top of another literature corpus is as easy as writing two Strategy classes.

LigerCat can be cited as:

LigerCat: using “MeSH Clouds” from journal, article, or gene citations to facilitate the identification of relevant biomedical literature. Sarkar IN, Schenk R, Miller H, Norton CN. AMIA Annu Symp Proc. 2009 Nov 14;2009:563-7.

Washing Laser Cut Felt

I had some of my mobile designs laser cut out of wool felt through Ponoko, a fantastic service that makes laser cutting almost laughably easy. The mobile pieces came out looking great, but there were two problems.

When you laser cut felt, it smells very strongly like burning hair (go figure), and it leaves a charred halo around the edges. Some people recommend dry cleaning to remove the smell and char, but that was too expensive and environmentally unfriendly for me. I experimented with a couple ways of cleaning the felt, and this is what worked best for me.

  1. Make a mixture of cold water and dish soap

  2. Soak the felt for 15-30 minutes

  3. While still submerged, go around the edges with an electric toothbrush

  4. Rinse with cold water

  5. Pat dry on a tea towel, then leave on the counter to air dry

This method is quick and easy, and it works. Just remember to use an old toothbrush head.

Photos after the jump

Putting a Dent in the National Debt

I recently read an article on CNN about a little-known law that allows the US federal government to accept contributions to pay down the country’s debt.

The article contains several quotes from random passers-by in New York City, who were asked if they would consider donating. One person said the following:

“I think I could give $10 to $20. And if everyone could do that it would make a good dent in the debt.”

This person clearly does not understand orders of magnitude; if everyone could give $10-$20, we would make a dent the size of four-hundredths of one percent in the national debt.
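The arithmetic, using round 2010-era figures (both numbers are approximations; the spreadsheet linked below has the exact values used in the graphics):

```javascript
// Roughly 310 million people each giving $15, against a national
// debt of roughly $13 trillion.
const DEBT = 13e12;
const POPULATION = 310e6;

function dentPercent(perPerson) {
  return ((perPerson * POPULATION) / DEBT) * 100;
}

// dentPercent(15) ≈ 0.036 — about four-hundredths of one percent
```
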

To give a better understanding of the numbers involved, I have created the following four graphics, showing exactly how much of a “dent” a personal contribution of $15, $100, $1000, and $10,000 from every US citizen would make in the national debt.

If you would like the raw data, these graphics were generated from this Google Spreadsheet.

Donating $10-$20 to The National Debt

Donating $100 to The National Debt

Donating $1000 to The National Debt

Donating $10,000 to The National Debt

Of course it’s really not that simple. The more astute of you will notice a fallacy in the above graphics. You see, donations to the federal government are tax deductible. If I donate $10,000 to the national debt today, that lowers my taxable income by $10,000, which in turn lowers my tax burden, thus skewing the numbers in the graphics above.

If anyone out there would like to add some tax functions into my Google Spreadsheet, we could avoid this fallacy of overlooking secondary consequences! Somebody get on that!