Category Archives: Projects

Design Fiction Course on Environmental Interaction – NUS

Spring 2017

The first course I am teaching at NUS is New Media 4225: Design Fiction. This semester’s theme centered on the future of humanity’s interactions with the natural world.

This class explores the concept of “Design Fiction,” where science-fiction-style storyworlds are shared through the creation of interactive, futuristic objects. These novel designs aim to suspend disbelief and invite audiences into the possible future worlds expressed by these interactive objects. The ideas and storyworlds surrounding the designs provoke audiences and encourage discussion about current issues that may lead to such futures. This semester’s theme focused on “Frontiers of Ecological Interaction.” Students were tasked with exploring contemporary technologies and issues arising between humanity and the natural world. They then designed interactive objects, performances, and storyworlds that highlight key concerns arising in the near future.

View Print Documentation of the class projects here:

Design Fiction Spring 2017 by Andrew Quitmeyer on Scribd

CNM Blog review of class final presentations

Student Showcase: An Uncertain Future Visualised and Explained Through Fiction

“Digital Survivalism” – TV spinoff of Digital Naturalism

I will be starring in a new television series created for Discovery Networks called “Hacking the Wild.” It’s based loosely on my work during “Hiking Hacks,” where I build interactive electronics entirely in the wilderness. The show airs on the Science Channel in the US starting February 15, 2017. This first season features work inspired by my PhD research into Digital Naturalism, put into play as “survival” tools for building technology in the wilderness. We spent six months filming the show in locations ranging from tropical islands to swamps to glaciers!

Watch episodes here!

https://www.sciencechannelgo.com/hacking-the-wild/

Transcontinental Hiking/Hack

From June 26th to July 5th, I have organized an expedition across Panama. The main goal is to design digital-biological field technology entirely in situ. The context in which a technology is made drives its design. Conventional development of digital technologies, however, typically occurs in climate controlled laboratory surroundings, and not the harsh environments of many biological field sites (like the Panamanian Rainforest).

This trip will help us find new ways to create novel devices for scientific exploration, hack existing devices, and share our biological-technological discoveries while cut off from the luxuries of standard electronics workshops.

Along the way we will also be critically analyzing the effect that these technologies have upon the different scientific surveys and investigations we will carry out during this transcontinental transect.

We will be fully immersed in the strange world of the other creatures, which will hopefully empower our designs for understanding them.

Images from a prototype Hiking / Hack with Signalfire artist residency

 The Crew

Peter Marting

Peter has been participating in Andy’s Digital Naturalism research since the beginning. He’s a true naturalist dedicated to understanding life in the wild. He’s developed mad hacking skills over the years in order to explore his Azteca ants even further, collaborating with Andy to make devices like the Flick-o-matic and artificial Cecropia trees. He’s also a musician in the band Ptarmigan.

Superpowers

  • Ant Enthusiasm
  • Bird Calls
  • Hymenopteran Stings

 

May Dixon

May Dixon is an all-star bat scientist. She manages Rachel Page’s research lab in Panama and has been leading projects about novel learning behaviors in bats. She is about to start her PhD at UT Austin.

Superpowers

  • Science
  • Mammals and the Tropics
  • First Aid

Ummat Somjee

Ummat studies heliconia beetles and holds encyclopedic knowledge of many behavioral systems in the tropics and the arctic. He is an experienced backpacker and a professional-grade mountain climber.

Superpowers

  • Fast Hands
  • Extreme Climbing
  • Insect Sex

Erin Welsh

 

Erin is a graduate student at the University of Illinois studying the potential impact of climate change on off-host tick ecology in the neotropics. She has been working in the jungles of Panama for the past two years.

Superpowers

  • Tropical Infectious Diseases
  • Trivia
  • Tick Wrangling

Nate Walsh

 

Nate Walsh is a professional writer and excellent communicator of the oddities of many cultural and social interactions.

http://www.natewalsh.com/

Superpowers

  • Writing
  • Scary Memory
  • Mild Masochism

Harmon Pollock

 

Harmon is a roboticist currently working at Northwestern. Along with his excellent skills in all aspects of physical computing, he has also been on many challenging (sometimes solo) expeditions into backcountry areas.

http://www.dhpollock.com/

Superpowers

  • Hardware Hacking
  • Duct Tape Hacking
  • Carry Stuff that’s not quite as heavy as Andy.

Mary Tsang

http://www.diysect.com/

Mary studied Biology and Art at Carnegie Mellon in Pittsburgh, where she picked up a knack for growing hydroponic kale and building installations inspired by a 1950s space-age aesthetic. With an undying love for neotropical rainforests, she has traveled to Central America and back several times, mostly for researching frogs.

Superpowers

  • Videography
  • Catching Frogs
  • Bio-hacking-tweaking-punking

Andrew Quitmeyer

Andy will be leading this expedition. He loves inventing and building new things but hates being indoors. This is why this project came to be!

Superpowers

  • Carry Heavy Things
  • Hacking
  • Sewing

Background

 


More details soon!

 

Here is the announcement / application

Dissertation Music Video

One of the main side projects for my 2014 tenure in the rainforest will be to shoot a music video which somehow shares the thesis of my PhD research. I was enthralled when I first started my PhD three years ago and Dr. Becky Arundale pointed me to the “Dance Your PhD” project. I knew I would have to do it, but when I started checking out the currently submitted works, many seemed to be just some sort of interpretive dancing against audio backdrops of pop songs, with captions that explained the thesis. This is still SUPER COOL and super great when people find more fun and accessible ways to share their work. I knew, however, that what I really wanted to do was a fully realized music video performance to explain it all. I had to make a custom song that actually explained some of the philosophy of the research itself, and then shoot a music video to back up these ideas.

 

I’m not unfamiliar with creating music videos about a central thesis. Whether it was about the high incidence of beverage spillage amongst pimp cups, or the fact that I have lots of amazing things in my basement which I have to get rid of, I have always loved the tight structure of music videos for expressing any sort of odd concept.

 

From the most cynical point of view, I realized that people just want to see captivating images synced up with fun sounds. If you can make something fun with decent visual and audio rhythms, you can hold people’s attention long enough to try to share some sort of idea.

 

I’m also not unfamiliar with filming music in the jungle. Last year, Peter’s band all came down to visit, and we held a concert for all the animals in the jungle. Adapting such an anthropocentric event to the forest made for a fun time and a compelling concert video.

The Song

Philosophy

I started the lyrics to the song during the Gamboa 2013 field trip. I had been reading an article about superorganisms and the autopoiesis of control systems. This was the first time I had encountered the term poiesis in its original sense, referring to creation and production. Meanwhile, I had been thinking about how my research stood against typical scientific technological endeavors. One thought I had was that the overall goal of many technological works is to separate and distinguish ourselves from the other facets of the world. We then dissect these “others” in order to dominate them. The target of my research, on the other hand, is to find technological means of granting greater agency to our surrounding environment. I want to dissolve the specialness of humans by finding ways for them to connect with the other pulsing creatures and environments forming the big body of the earth.

I had a vision of humans as little more than a simple appendage, like an arm, on the earth’s body. This body part has developed cancerous ideas of individuality and is currently attempting to cure itself by making saws to cut itself off from its own body. I thought that a large philosophical target for Digital Naturalism, then, should be technologies which extend the reach of ourselves as these body parts and generate connections back to the other body parts. Our role in the body of the earth can develop into responsive nervous systems connecting the disparate appendages, eventually helping the body function in tighter unison rather than staging a hopeless coup against it. Thus I had the basic principles behind a song with the awkward title: “Poetic Appendage.”

 

Creation

I would jot down a couple of lyrics that I felt should belong somewhere in the song. I would then pick some of these phrases and go on jogs and shout them around to develop different verses, guided by the rhythmic structure of my running. I then started teaching myself how to better switch around chords on a guitar, and started to take these verses from the jogging sessions and add melodies. I eventually refined and developed a basic rhythmic structure with a standard frame of Verse-Chorus, Verse-Chorus, Verse-Chorus, Bridge, Verse, End, and simple chords to go along. Luckily, I then managed to bump into an old friend, Chris Gonzales. Chris is a recording mastermind. He has sound cards larger than my computer, and special boxes in the recording studio he built to give the room perfectly warm audio qualities. We had worked together before on a raunchy song about sex with E.T. (this will be released with the full album after my PhD), and he seemed happy at the idea of working with me again on another musical project.

Chris’s interest was really super awesome for me, and I am really grateful he wanted to work with me. We met up and I shared the song and rudimentary philosophy and ideas behind my PhD, and he took to it right away and started thinking of ways to have the music and the recorded quality of the music represent the rhetoric and narrative of the ideas behind the song.

 

He used his fantastic guitar and drum skills to then get the core skeleton of the song together. Then I recorded the lyrics. Despite my usual demeanor, there’s something about singing that makes me INCREDIBLY SELF CONSCIOUS to the point where it is hard to actually sing. But battling this affliction, I think, is one of the main reasons that I keep forcing myself to do it (jpom.bandcamp.com). I still feel weird about hearing my voice on the song.

 

Finally, we took lots of Peter’s recorded sounds of Panama (petermarting.bandcamp.com), and I edited these clips of howler monkeys, cicadas, tropical birds, and frogs into the song. I also added some funky arpeggiated synths to the mix so that all three main actors in the concept of the song (Humans: guitar, drums, vox; Animals: jungle sounds; Digital: synths) would all be jumping around together.

 

The Song

{Is still currently being refined, check back later to hear it!}

Here’re the lyrics and chords to play it yourself!

 

The Video

{Is also still currently being refined, here’s some descriptions of the planning for it so far though.}


The verses will be shot in split screens. This will enable me to do fun dancing and stuff on the human side, and share cool nature videography on the creature side, while supporting the thesis about technology and separation from the creatures. The aesthetic of the choruses is based on the “Tiny Planet” spherical video panoramas taken by GERMAN GUY INSERT NAME. I had to build a modified version of his six-GoPro camera holder to use with GoPro Hero2s (not rich enough for six Hero3s). Unfortunately, one of the GoPros I bought off eBay had a faulty button, and I tried to fix it, but it just broke again – lame!


Arboreal Ant Sensor: Main Project Summer 2014

Background: Funding

After a couple years of trying to pitch lofty, abstract concepts for funding from the Smithsonian Tropical Research Institute, I changed up my strategy with great success! I went from rejected proposals such as “Give me money so we can do large-scale exploratory performances with crazy technology in the jungle so that people can feel what it’s like to be an ant” to “Let me build an ant sensor,” and surprisingly enough it got funded 🙂

As with all things in life, I realized that the reason I was having trouble getting funded last year was due to communication problems. No one should really be expected to just flat-out accept big, vague ideas that they have no real connection to. So after doing lots of exploratory “Digital Naturalism” stuff last year, I decided it would be a good idea to take one of the projects that developed from this process and present a very concrete idea that could gain traction among the scientists on the review board.

The basic concept

In the lab, we can track lots of ants with cameras pointed at the colony in a nice two-dimensional plane. In the field, however, the ants live on arbitrary geometric shapes – up the bark of a round tree and onto frilly leaves, for example.

While working with Peter, we had been adapting the computer-vision techniques from the lab to field sites with little success. However, after our work designing, utilizing, and performing with the technology together in the jungle, we were able to start analyzing our problem from the ground up.

 

For Peter’s experiments, we realized that we didn’t need all the data that the lab tools were working to collect, like ant position, unique ID, and orientation. Instead, what we could really use would just be something that told us the mere fact that an ant was there or not. In the little time that I had last year, I made a really simple ant-detector prototype. An LED gives steady illumination to a point on the tree bark, and a photoresistor gives a reading of how much light gets reflected back. When an ant walks in front of the area where this simple sensor is pointing, it reflects the light differently and gives a different reading.
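The core logic is just a threshold on how far the reflected-light reading drifts from a bare-bark baseline. Here is a minimal sketch of that idea in Python; the real device was a simple Arduino circuit, and the names, ADC values, and threshold here are all illustrative assumptions:

```python
# Hypothetical sketch of the ant-detector logic. The real prototype was
# an Arduino with an LED and photoresistor; values here are made up.

def detect_ant(reading, baseline, threshold=60):
    """Flag an ant when the reflected-light reading deviates from the
    bare-bark baseline by more than `threshold`."""
    return abs(reading - baseline) > threshold

# Calibrate on bare bark, then watch the stream of photoresistor readings.
baseline = 512                        # e.g. a 10-bit ADC value for bare bark
readings = [510, 514, 620, 630, 509]  # an ant crosses during the middle samples
events = [detect_ant(r, baseline) for r in readings]
```

In practice the baseline would need periodic recalibration, since sunlight filtering through the canopy shifts the ambient reading far more than any ant does.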

This early prototype showed lots of promise. Once refined, we could potentially build cheap, sub-$10 sensors that could be attached in arrays to arbitrary surfaces in the jungle. This would be a different means of tracking the insect movements, with its own bonuses and limitations.

Camera Tracking:
  • Single unit – expensive
  • Rich potential information: speed, unique ID, orientation, multi-ant interactions
  • Single location, 2-dimensional

Modular Sensor Tracking:
  • Multiple cheap units
  • Minimal information from a single source: ant present, yes or no
  • Multiple locations

All Potential Technologies

Since I have learned the lesson over and over that everything will go wrong, and most things you assume will work will not, I came up with several contingency plans of different technologies which could also potentially work.

Reflected Light

Building more sophisticated versions in keeping with the original LED + Photoresistor sensor.

Modulated Light / Proximity Detection

This is the next step up from the original idea. The output light is pulsed, so the sensor knows exactly when to expect readings (cutting down on noise). Depending on how these readings come back, fancy sensors like the VCNL4000 (https://www.adafruit.com/products/466) can actually give distance.
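The noise-rejection trick behind modulated light is synchronous sampling: read the sensor with the LED off (ambient only), then with it on, and keep only the difference. A minimal sketch, with `read_sensor` and `set_led` as hypothetical stand-ins for real hardware calls:

```python
# Illustrative sketch of modulated-light sensing: pulse the LED and
# subtract the ambient reading so sunlight and flicker cancel out.
# `read_sensor` and `set_led` are hypothetical stand-ins for real
# hardware calls (e.g. analogRead / digitalWrite on an Arduino).

def synchronous_sample(read_sensor, set_led, n=8):
    """Average n (on - off) pairs; the difference is only the light
    our own LED reflected back, not the ambient background."""
    total = 0
    for _ in range(n):
        set_led(False)
        ambient = read_sensor()
        set_led(True)
        lit = read_sensor()
        total += lit - ambient
    return total / n
```

Chips like the VCNL4000 do exactly this kind of pulsed-emitter subtraction in hardware, which is what makes them usable outdoors.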

Optical Mouse Sensor

Right before I left for Panama, Sparkfun started selling optical mouse sensors (https://www.sparkfun.com/products/12907). These chips are SUPER CHEAP ($1), and are actually high-frame-rate cameras designed to detect changes of movement in their visual fields. I ordered a bunch, and will try to see if I can rig them up to monitor patches of bark or leaves for the movements of any passing ants.

Electric Field Proximity Sensing

This option could be cool because, if successful, we could potentially detect ants within the trees themselves. This type of technology uses emitted electrical fields and senses any changes in the field strength coming back. Joshua Smith tested out a lot of this technology back in the 90s with lots of success with humans (we are big, conductive blobs of water). For detecting ants, this might not work at all (they are small, dry, and barely conductive). But if all other methods fail, this could be a cool thing to resort to.

http://en.wikipedia.org/wiki/Electric_field_proximity_sensing

 

Input / Output Examples

In June 2013 I held a small workshop to demonstrate simple devices that scientists could use in the field for sensing or acting within environments.

 

Robotic Woodpecker / Flick-O-Matic – In my Gamboa 2012 field session, Peter and I were discussing means of calibrating his initial assays, and also creating new ways of interacting with the plant-ants beyond what we could do as simple humans kicking and poking the trees. As an early, very exploratory project, I built a simple, Arduino-controlled robotic woodpecker. […]

Bio-Inspired Design

In Fall 2012 I was able to join Georgia Tech’s Center for Biologically Inspired Design to participate in their interdisciplinary course. The experimental class brings together biologists, engineers, and physical scientists who seek to facilitate research and education for innovative products and techniques based on biologically inspired design solutions. The participants of CBID believe that science and technology are increasingly hitting the limits of approaches based on traditional disciplines, and that biology may serve as an untapped resource for design methodology, with concept-testing having occurred over millions of years of evolution.

Projects

  • Pascobots – Desert Ant navigation and maple seed dispersion inspires a system design for rapid environmental surveying
  • Fresh Kicks – Bio-Inspired design
  • Several bio-provocative “found objects”

 


Ants’ Secret Code – Reveal

Earlier, I posted a video of leafcutter ants claiming that it contained a secret code. Well it’s true! Here’s how to crack the code, and how I encoded my messages in the first place.

 

Deciphering

The astute observer may take note that the ants carrying leaves only travel in one direction (towards the nest). In fact, this is the entire underpinning of the code. When I presented the puzzle to my lab, the response closest to correct came from Prof. Tucker Balch, who stated that the first thing he would do is “chart the number of leaf carriers visible in each frame over time and look for patterns in that time series.” Good thinking, Tucker!

The first step is to create a signal out of the leaf-carrying ants. To do this, one can simply take the green channel of the image and adjust a threshold until just the leaves are selected. To get more exacting data, you could try applying additional filters like blurs, dilations, etc. on top of this thresholding. You can even use professional video-compositing software like After Effects and “key out” the green color. These additional improvements are not really necessary, however; you can stay pretty crude.

Next, because there might be some extra foliage around the edges, you will want to crop to a region of interest just around the ants.

Example image targeting just the green leaves from the video.

The video should now be entirely white (255) in areas where the leaves are present, and entirely black everywhere else. I then made a simple script that tallies up all the white pixels (detections) present in every frame and saves all this data as a CSV. When I pop open this CSV file in an open-source equivalent to Microsoft Excel and chart the results, I get something that looks like this:
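The original script isn’t posted here, but the threshold-and-tally pipeline described above can be sketched in a few lines of numpy. The green threshold, the leaf test, and the output filename are illustrative assumptions; with real footage you would pull frames from the video with something like OpenCV:

```python
# Minimal numpy sketch of the pipeline: threshold the green channel of
# each frame, count the "leaf" pixels, and dump one row per frame to CSV.
import csv
import numpy as np

def leaf_pixels(frame, green_min=150):
    """Count pixels whose green channel clears the threshold and
    dominates red and blue (i.e. looks like a carried leaf)."""
    r, g, b = frame[..., 0], frame[..., 1], frame[..., 2]
    mask = (g >= green_min) & (g > r) & (g > b)
    return int(mask.sum())

def write_signal(frames, path="leaf_signal.csv"):
    """Write frame index and leaf-pixel count for each frame."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["frame", "leaf_pixels"])
        for i, frame in enumerate(frames):
            writer.writerow([i, leaf_pixels(frame)])
```

Charting the `leaf_pixels` column over `frame` gives the pulse train that the rest of the decoding works from.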

Ahh, that looks like it might contain some sort of signal. Now’s the time for the cryptographic skills. Your first intuition should be that Andy isn’t that big on cryptography, and will probably just use the first temporal coding sequence that comes to his mind: Morse code. If one takes the slightly wider pulses to be dashes and the slightly narrower ones to be dots, you can pull out this pattern:

--. -   -.-. .--. .-..

Or translated from Morse to English: GT CPL

The Georgia Tech Computational Perception Laboratory (where I work).
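The final translation step is a plain table lookup. A tiny decoder, covering just the letters needed for this message (single spaces separate letters, three spaces separate words; these separator conventions are my assumption, not from the original script):

```python
# Decode a dot/dash pulse pattern back into text. The table covers only
# the letters needed for the "GT CPL" message.
MORSE = {
    "--.": "G", "-": "T", "-.-.": "C", ".--.": "P", ".-..": "L",
}

def decode(morse):
    """Letters separated by single spaces, words by three spaces."""
    words = []
    for word in morse.split("   "):
        words.append("".join(MORSE[letter] for letter in word.split()))
    return " ".join(words)
```

Running `decode("--. -   -.-. .--. .-..")` recovers the hidden message.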

Yay! I also have some additional videos where the ants say a couple other messages like “Digital Media” and, of course, “Hello World.” I even made a special message to the class of my cool Biology teacher sister.  I will post them here when they are ready.

Both videos from this puzzle say exactly the same message; it’s just that the first video was recorded further down the stream, which gave lossier data, so more human visual intuition was required. The reason this data was lossier will be explained below. Additional props go to DM student Rebecca Rolfe, who uncovered the unintentional rebus of the video: “Soon there will be no leaves left” (Get it? Get it? The ants are carrying all the leaves to the right…).

Encoding

How did the ants know how to communicate this message? Well they probably didn’t.

Earlier in the summer, I wanted to test leafcutter responses to temporary barriers. It turns out that if the barrier is only there for a short amount of time (<1 minute), the ants will just sort of pool up behind it instead of walking around (note: this is not true of other ants, like army ants).

To get more precise results, I built a simple servo device, controlled by an Arduino, which was attached to a Fluon-coated plate. While I was cutting off and re-enabling the flow of ants, I realized I could also program this device to send ant-based messages in this fashion. Thus, after lots of experimentation and a long, hot day sitting in the jungle with my ant tollbooth, I found a workable formula for sending dashes and dots, and made the servo go up and down corresponding to whichever message I wanted to send.
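The encoding side can be sketched as a message-to-servo schedule: each dot opens the barrier briefly, each dash opens it longer, with gaps in between so the pulses stay distinct. The actual timings took that long hot day to find; the numbers below are purely illustrative:

```python
# Hypothetical timing sketch for the ant tollbooth. Each dot lifts the
# barrier briefly, each dash lifts it longer. All durations are made up;
# the real values came from field experimentation.
MORSE = {"G": "--.", "T": "-", "C": "-.-.", "P": ".--.", "L": ".-.."}

DOT, DASH, GAP = 5, 15, 5  # seconds of barrier-up / barrier-down

def schedule(message):
    """Return a list of (barrier_up, seconds) steps for the servo."""
    steps = []
    for ch in message:
        for symbol in MORSE[ch]:
            steps.append((True, DOT if symbol == "." else DASH))
            steps.append((False, GAP))
        steps.append((False, GAP * 2))  # extra pause between letters
    return steps
```

An Arduino loop would then just walk this list, raising or lowering the servo and delaying for each step.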

Of course, I wasn’t 100% certain that it worked until I got back home and analyzed it myself!

Antmongous

–Update: If anyone knows of places that would like to have an interactive, cross-species exhibit like this, please contact me at Andrew.Quitmeyer|||gmail.com —

Antmongous is an embodied, interactive exhibit that will be distributed throughout the Castleberry Hill area and will encourage participant exploration by emergently provoking individuals or groups to follow ant-designated pathways in real time.

CASTLEBERRY HILL FOR ANTS

First, we shall replicate the networked layout of the Flux festival location as a 1/175 (ant-scale) abstracted model made from inexpensive laser-cut acrylic (see image). Since the core area of the Flux festival takes place in a roughly 300m x 400m geographical area, the ant-sized model will be approximately 1.7m x 2.3m, or the size of a large table.

The colony will be recursively loaded into the miniature model, with the Queen and brood placed in the region of the model corresponding to its location in the real world. Pathways such as roads, alleys, and building interiors will be featured in the model as areas accessible to the ants. The model’s restricted areas (like rooftops or sides of buildings) will be elevated and coated in Teflon paint (Fluon) to make sure that the ants only have access to the same places as the human audience.

TRACK ANTS

I have created open-source software for visually analyzing the positions and movements of ants within our laboratory environment. This same software can be adapted for real-time tracking of the ants in the model.

PROJECT ANTS

Using ants’ positions within the model, we will make their presence felt in the corresponding locations in the actual world.

Positional data will be transmitted wirelessly to a mesh network of inexpensive, battery-powered XBee microcontrollers. The XBees, in turn, will switch lights lining the sidewalks on and off, corresponding to the presence or absence of an ant in the analogous location. Thus, an ant walking through the model’s virtual intersection of Bradbury and Fair Street illuminates lamps in the actual location.
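The mapping from an ant’s tracked position to a lamp is just a scale-up and a nearest-node lookup. A toy sketch, using the 1/175 scale from above; the lamp positions and nearest-node rule are illustrative assumptions:

```python
# Toy sketch of model-to-festival mapping. The 1/175 scale comes from
# the model description; lamp coordinates are hypothetical.

def model_to_world(x_m, y_m, scale=175):
    """Scale an ant's model position (meters) up to festival meters."""
    return x_m * scale, y_m * scale

def nearest_lamp(pos, lamps):
    """Pick the lamp node closest to the ant's real-world position."""
    x, y = pos
    return min(lamps, key=lambda l: (l[0] - x) ** 2 + (l[1] - y) ** 2)
```

In the installation, the tracking software would run this for each ant on each update and radio the chosen lamp’s node over the XBee mesh.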

In this way, the ants can be felt crawling throughout the village. An advantage of mesh networks is that, unlike a string of Christmas lights, a problem at one node will not affect the others.

 

AUDIENCE EXPERIENCE

The dual audience (humans+ants) will symbiotically interact throughout the night.

Food sources (agar paste) will be placed in various locations within the ants’ model. In the corresponding real-world locations, we will also hide prize packages. The humans and ants will collectively discover rewards.

During searching and foraging stages, humans will “feel” the passage of ants wandering about the city as pulsing trails of light. Arrivals to the festival who know nothing of the underlying mechanics may feel the urge to follow along with these light movement patterns and explore the exhibits at the festival in tandem with the ants.