Seed Pods

By Lawrence Goodman / November / December 2007
About a year and a half ago, Schuyler Maclay ’10 hacked the software that came with his iPod and changed what it displayed while playing music. Instead of the artist’s name and the song title, Maclay’s iPod showed words randomly strung together. It read like Dadaist poetry.

So began Osiris, a start-up by four undergraduates out to change how we view music—literally. Osiris’s software parses the words in the song you’re listening to and displays related images it finds on the Web.


[Photo: Scott Kingsley]
The result is a spontaneous music video that’s different every time you play the song. Or as Nick Greenfield ’09, one of the Osiris team, puts it, the idea is “forging semantic linkages between lyrical content and displayed imagery.” Yes, Osiris began as a project in a modern culture and media class.


Last May, mtvU, MTV’s twenty-four-hour college network, and Internet giant Cisco Systems awarded the Osiris team $25,000 in seed money.

Maclay, Greenfield, and Osiris’s other cofounders—Zachary McCune ’10 and Sebastian Gallese ’10—have used the money to develop a business plan for selling Osiris to dance clubs.

Clubs now either hire video disc jockeys (VJs) or project random images and film loops onto giant screens to create a full-blown multimedia experience. Osiris could do this on the cheap, its inventors believe, and the software would select images specifically linked to the song that’s playing. The students expect to generate $300,000 in sales within a few years.

Here’s how the software works:

When you play a song on your computer, Osiris downloads the lyrics from a variety of Web sites. After weeding out filler words like “the” and “and,” Osiris compiles a list of key words. These then become search terms at Flickr, a Web site that hosts millions of photographs posted by users from around the world. When a word from the lyrics runs through the site’s search engine, it’s matched against the word tags users have added to identify their photos. The search engine returns thousands of related images, which Osiris downloads and stitches together into a video.
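In rough terms, that pipeline could be sketched in a few dozen lines of Python. The sketch below is purely illustrative, not Osiris’s actual code (which isn’t published): it assumes you have a Flickr API key, uses Flickr’s public flickr.photos.search REST method, and leaves out the final step of stitching the downloaded images into a video. The stopword list and helper names are placeholders.

```python
# Illustrative sketch of the pipeline described above -- not Osiris's actual code.
# Assumes a Flickr API key and the public flickr.photos.search REST method.
import re
import requests

STOPWORDS = {"the", "and", "a", "an", "of", "to", "in", "is", "it", "i", "my", "you"}

def keywords(lyrics):
    """Split lyrics into lowercase words and weed out common filler words."""
    words = re.findall(r"[a-z']+", lyrics.lower())
    seen, result = set(), []
    for w in words:
        if w not in STOPWORDS and w not in seen:
            seen.add(w)
            result.append(w)
    return result

def flickr_photo_urls(tag, api_key, per_page=5):
    """Ask Flickr's search API for photos tagged with the given word."""
    resp = requests.get(
        "https://api.flickr.com/services/rest/",
        params={
            "method": "flickr.photos.search",
            "api_key": api_key,
            "tags": tag,
            "per_page": per_page,
            "format": "json",
            "nojsoncallback": 1,
        },
        timeout=10,
    )
    photos = resp.json()["photos"]["photo"]
    # Build direct image URLs from the id/server/secret fields Flickr returns.
    return [
        f"https://live.staticflickr.com/{p['server']}/{p['id']}_{p['secret']}_m.jpg"
        for p in photos
    ]

# Usage: collect a handful of images per keyword, then hand the URLs to whatever
# slideshow or video step comes next.
# urls = [u for word in keywords(lyrics_text)
#           for u in flickr_photo_urls(word, MY_API_KEY)]
```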

[Photo: Scott Kingsley]


“What we’re doing,” says Maclay, “is taking this vast expanse of everything people have put up on Flickr—this huge clutter of photos—and distilling it into a series of visual images.”

To demonstrate, McCune played “Octopus’s Garden” by the Beatles.

The resulting video contained images of an octopus underwater, calamari on a plate, and a cat. Greenfield says photos of cats are always popping up in the videos because “people give their cats all sorts of weird names.” Evidently, this one was named Octopus.

Greenfield says Osiris will become more selective as the software gets better at parsing language. Right now Osiris takes a phrase like “red cat” and splits it apart, searching for both “red” and “cat.” Soon, though, Osiris will be able to tell that the two words belong together and will search for “red cat” as a single phrase.
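A toy version of that idea might look like the following; the function name is made up for illustration, and the tag_mode note refers to Flickr’s public search API rather than anything Osiris is confirmed to do.

```python
# Toy illustration of keeping adjacent lyric words together as two-word phrases
# ("red cat") instead of searching "red" and "cat" separately. Not Osiris's parser.
def phrases(words):
    """Pair each word with its right-hand neighbor so two-word phrases survive."""
    return [f"{a} {b}" for a, b in zip(words, words[1:])]

print(phrases(["red", "cat", "sleeps"]))  # ['red cat', 'cat sleeps']

# On the Flickr side, a phrase can be sent as comma-separated tags with tag_mode="all"
# (e.g. params={"tags": "red,cat", "tag_mode": "all"}) so only photos carrying both
# tags are returned.
```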

Osiris’s inventors consider their software more of a work of art than a commercial product. “Our generation is about remixing the old and creating something new,” says Gallese. “When you mix up these vast data sources, that’s art.”

Find more info on Osiris here.
