This project, which is still untitled, was an attempt to create an ambient display of recent news headlines. It was motivated by my desire to convey information in ways that do not require the user’s full attention. The late Mark Weiser wrote extensively on the topic of ambient information displays. Weiser and others sought methods of accessing information that evoke a sense of calm rather than a stressful feeling of information overload. In theory, this sense of calm is facilitated by making information easily accessible in the user’s periphery. Such peripheral placement of information requires special attention to the preconscious sensory experience, which can be visual but can also be haptic. This project attempts to leverage the human ability to immediately recognize the emotional state of another human face as a means of conveying the emotional content of a given news story. Ideally, this piece would not require the user’s direct engagement; instead, it would provide the emotional gist of a given news story.

The techniques used in this project were inspired by the work of Zachary Lieberman, Golan Levin, and Mood News. Each of these artists was interested in alternative methods of both data visualization and communication. In JJ, Golan Levin used language-analysis software to assess the emotional content of sniffed network traffic, which was then communicated by displaying the appropriate face from a standardized database of emotional faces. Zachary Lieberman created a tool that located faces in images and navigated through a large database of photographs by moving from face to face. Mood News is a feed that organizes BBC headlines based on their emotional content.

For this project, I used Java to download RSS feeds of news stories and ConceptNet to guess the mood of each story. I built my face database from emotional face imagery gathered through Google searches, and the locations of faces within those images were determined using the OpenCV library.
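The headline-gathering step can be sketched in plain Java. The project itself used the Rome library to parse feeds; to keep the illustration self-contained, the sketch below uses only the JDK’s built-in DOM parser to pull item titles out of an RSS 2.0 document. The class name `RssTitles` is hypothetical, not the project’s actual code.

```java
import javax.xml.parsers.DocumentBuilder;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.NodeList;
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.List;

public class RssTitles {
    // Extract the <title> text of each <item> in an RSS 2.0 document.
    // In the real project, Rome's SyndFeedInput handled feed parsing;
    // this stdlib version only illustrates the idea.
    public static List<String> extractTitles(String rssXml) throws Exception {
        DocumentBuilder db = DocumentBuilderFactory.newInstance().newDocumentBuilder();
        Document doc = db.parse(
                new ByteArrayInputStream(rssXml.getBytes(StandardCharsets.UTF_8)));
        List<String> titles = new ArrayList<>();
        NodeList items = doc.getElementsByTagName("item");
        for (int i = 0; i < items.getLength(); i++) {
            NodeList t = ((Element) items.item(i)).getElementsByTagName("title");
            if (t.getLength() > 0) {
                titles.add(t.item(0).getTextContent());
            }
        }
        return titles;
    }
}
```

Each extracted title would then be handed to ConceptNet for mood guessing.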
The whole package, including the animation, was created in Jitter. A sample of the raw animation is below.

When a new story appeared in the RSS feed, its content was analyzed and the appropriate mixture of emotional faces was then displayed.
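One way to turn a mood guess into such a mixture is to normalize the per-mood scores into blend weights that sum to 1.0, which can then drive a crossfade between face images in Jitter. The sketch below is a minimal illustration under that assumption; the mood categories and the `FaceMixer` name are hypothetical, not the project’s actual code.

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class FaceMixer {
    // Normalize raw mood scores (e.g. from a ConceptNet-style mood guess)
    // into blend weights summing to 1.0, suitable for crossfading face
    // images. Negative scores are clamped to zero; if all scores are
    // zero, the weights fall back to a uniform blend.
    public static Map<String, Double> blendWeights(Map<String, Double> moodScores) {
        double total = 0.0;
        for (double s : moodScores.values()) {
            total += Math.max(0.0, s);
        }
        Map<String, Double> weights = new LinkedHashMap<>();
        for (Map.Entry<String, Double> e : moodScores.entrySet()) {
            double s = Math.max(0.0, e.getValue());
            weights.put(e.getKey(),
                    total > 0 ? s / total : 1.0 / moodScores.size());
        }
        return weights;
    }
}
```

A story scored happy = 1.0 and sad = 3.0 would thus be rendered as a 25/75 blend of happy and sad faces.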

[qt:http://christopherbaker.net/wp-content/uploads/anim.mov http://christopherbaker.net/wp-content/uploads/anim_poster1.mov 320 256]

For purposes of critique and discussion, I included printed copies of my code (below).



Inspiration

  • Mood News : A project that orders current BBC headlines based on their emotional content.
  • JJ : A Carnivore client by Golan Levin that attempts to characterize the emotional content of sniffed network data.
  • Picturing Family : An image navigation system created by Zachary Lieberman.

Technical Resources Used

  • OpenCV : A computer vision library written in C/C++.
  • Rome : An RSS/Atom library for Java.
  • ConceptNet : (Hugo Liu, et al.) A library capable of text analysis, gisting, mood guessing and more.
  • Max/MSP/Jitter : Multimedia software.