HCI Interventions for Science Communication

On Monday, April 23 at 11:30 AM I will be presenting a case study at the CHI 2018 conference in Montreal, Canada.

Abstract
In this paper we describe the practices used by alternate reality game (ARG) designers to engage fans with the issues and effects of global climate change under the scientific guidance of key non-profit organizations. Our multiple case study is based on three projects: Future Coast (2014), the Disaster Resilience Journal (2014), and Techno Medicine Wheel (2007 – ongoing). Our analysis derives from interviews with each ARG designer and postmortem observations of each game's narrative structure. The findings provide HCI practitioners with a list of best practices related to the designers' use of narrative style and physical location to support fan engagement. These practices advance the goals of non-profit organizations (NPOs) through science communication using popular media forms.

Paper: Moulder, V., Boschman, L., Wakkary, R., Neustaedter, C. & Kobayashi, H. (2018) HCI Interventions for Science Communication. Case Study, ACM Conference on Human Factors in Computing Systems (CHI), ACM Press, Montreal, Quebec, Canada.

Related Research: Moulder, V. A. (2016) Transcoding Place Through Digital Media. PhD Dissertation, School of Interactive Arts and Technology, Simon Fraser University, Surrey, BC, Canada.

Monitoring Environmental Health: The Call Sensing Device

Come Visit Us! On Monday, April 23 at 6 PM we will be exhibiting the Kobayashi Lab's Call Sensing Device concept for animal wearables at the CHI 2018 conference in Montreal.

The animal Call Sensing Device explores the use of power-saving components in animal wearables designed to monitor radiation in the 11 km exclusion zone around the Fukushima Daiichi Nuclear Power Plant in Japan.

The challenges of monitoring animals living in the exclusion zone include designing power-saving components, autonomous shared networks, and the methods to transmit data to our base camp. To address these challenges, our human computer biosphere interaction (HCBI) design research team is studying devices that respond to animal calls. HCBI design research requires technologists and scientists to work together to observe animals and then design devices that respond to their behavior and calls. The HCBI storyboard below demonstrates how the animal call sensing device works within a deer herd living in the exclusion zone. 



The Kobayashi Lab conducts human computer biosphere interaction (HCBI) design research to enhance environmental health monitoring. Since 2011, Dr. Hill H. Kobayashi has monitored the 11 km exclusion zone around the Fukushima Daiichi Nuclear Power Plant in Japan. This research has informed new parameters for monitoring the biosphere after a catastrophic disaster. In collaboration with earth and animal scientists, the lab uses exploratory methods to develop HCI interventions for animal wearables and ubiquitous sensors that use sophisticated power-saving and information-processing methods to transfer data.
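As a rough sketch of that duty-cycling idea (not the lab's actual firmware), a wearable might stay asleep most of the time and power its radio only when a call-like sound crosses a threshold. The `read_microphone` and `transmit` functions and the threshold value below are hypothetical stand-ins:

```python
import random
import time

CALL_THRESHOLD = 0.6   # hypothetical normalized amplitude threshold
SLEEP_SECONDS = 5      # sleep between brief listening windows to save power

def read_microphone():
    """Stand-in for the wearable's acoustic sensor; returns a 0-1 level."""
    return random.random()

def transmit(payload):
    """Stand-in for the low-power radio link back to base camp."""
    print("transmitting:", payload)

def run(cycles=10):
    for _ in range(cycles):
        level = read_microphone()
        if level >= CALL_THRESHOLD:
            # Wake the radio only when a call-like event is detected,
            # so battery is spent on data worth sending.
            transmit({"event": "call", "level": round(level, 2),
                      "time": time.time()})
        time.sleep(SLEEP_SECONDS)  # duty cycle: stay asleep most of the time

if __name__ == "__main__":
    run()
```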


Human computer biosphere interaction is based on the Tsunagari communication concept, developed to foster a sense of closeness between family members living apart in Japan (Itoh, Miyajima & Watanabe, 2002). The Family Planter is a specific application within the Tsunagari system that allows family members to exchange non-verbal cues over a network. HCBI adapts the Family Planter application to enable non-linguistic and non-verbal interactions between humans and other species (plants and animals) over physical distance.

HCI is a discipline concerned with the design, evaluation, and implementation of interactive computing systems for human use and the study of major phenomena surrounding them (Hewett et al., 1992). The Kobayashi Lab’s research aim is to extend HCI and human computer pet interaction (HCPI) to explore human computer biosphere interaction (Kobayashi, 2012). The conceptual relationships between HCI, HCPI, and HCBI are illustrated in Figure 1.
Figure 1. Human Computer Biosphere Interaction (HCBI) © 2018 Kobayashi Lab
Center for Spatial Information Science, The University of Tokyo, Chiba, Japan


Fukushima Audio Census

Fukushima Audio Census (2017) was an interactive artwork designed for the CHI 2017 Art Program. Live audio was transmitted from strategically placed microphones in a contaminated forest within the exclusion zone, located 10 kilometers from the Fukushima Daiichi Nuclear Power Plant in Japan. The artwork invited conference attendees to listen to forest sounds, retrieve past recordings, and talk with experts in the field of ecological neutrality, creating a community among listeners at the conference, researchers, and creatures within the exclusion zone.


Fukushima Audio Census (2017) CHI 2017 Exhibit, Denver, USA

Live Sound from Fukushima was originally designed to help scientists such as Ishida Ken place portable digital recording devices to capture vocalizations of specific animals in the wild (Ishida, 2013). The system comprises microphones and transmission stations strategically placed in a contaminated forest within the exclusion zone, located 10 km from the Fukushima Daiichi Nuclear Power Plant (F1NPS). The artwork is based on Live Sound from Fukushima by researchers Hill Kobayashi and Hiromi Kudo at the Center for Spatial Information Science (CSIS) at The University of Tokyo, Chiba, Japan.

The artwork comprises two separate subsystems: the Field Encoding System, which digitizes live sounds from within the forest, and the Streaming/Archiving System, which delivers live sound via the Internet and archives the sound data as files. The technical architecture and operational implications of the system have been discussed previously (Kobayashi, 2010). This project is part of a larger collaboration with scientists to collect, share, and analyze soundscape data from over 500 locations in the exclusion zone.
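As an illustration only (the real architecture is documented in the papers cited above), the archiving half might reduce to a loop like the following, where `capture_chunk` is a hypothetical stand-in for the Field Encoding System's digitized feed:

```python
import datetime
import wave

SAMPLE_RATE = 44100
CHUNK_SECONDS = 60        # archive granularity, chosen arbitrarily here

def capture_chunk():
    """Stand-in for one minute of digitized forest audio (16-bit mono PCM)."""
    return b"\x00\x00" * (SAMPLE_RATE * CHUNK_SECONDS)

def archive(chunks=3):
    for i in range(chunks):
        pcm = capture_chunk()
        # Name each file by capture time so past recordings can be
        # retrieved later, as the exhibit allowed attendees to do.
        stamp = datetime.datetime.now().strftime("%Y%m%d_%H%M%S")
        with wave.open(f"census_{stamp}_{i:03d}.wav", "wb") as f:
            f.setnchannels(1)
            f.setsampwidth(2)
            f.setframerate(SAMPLE_RATE)
            f.writeframes(pcm)

if __name__ == "__main__":
    archive()
```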

Talking Poles, Public Artwork

Commissioned by the City of Surrey, BC, Canada, our team built the Talking Poles. This public artwork is a larger-than-life sensing device that plays back residents' words, music, and laughter. In 2009, we began fabricating the Poles and planned design tactics for engaging with local residents. The Poles are steel structures decorated with a vinyl mural displaying the thematic imagery. The electronic components are powered by a solar panel located in the top cone of each pole. Audio recordings of local residents' voices range in length from 30 to 90 seconds and play back from an MP3 player housed inside a metal cone above each Pole. Click on Oldhands and Acharya Dwivedi to hear examples of the audio recordings.

Working with people from the area, we chose two themes – Love and Peace. These words are displayed in ten languages on the Poles' surfaces. Pedestrians approaching a Pole trigger a sensor, activating audio recordings of local residents sending messages to future generations. To involve participants, we contacted a number of local community groups, schools, and spiritual leaders; developed iconography with a university visual art class; worked with high school design students; and organized a World Drumming Day event at a First Nations housing co-op. To engage residents who walked along the Greenway, we designed a Talking Pole prototype and placed it on location. With printed brochures in hand, we stood in front of the prototype inviting people to sit at a table and talk with us. Documentation of the design process is archived on the City of Surrey's Public Art Program web site.
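The interaction loop can be pictured with a short sketch. The polling rate, cooldown value, and filenames below are assumptions for illustration, not documentation of the installed electronics:

```python
import random
import time

RECORDINGS = ["oldhands.mp3", "acharya_dwivedi.mp3"]  # illustrative names
COOLDOWN_SECONDS = 90   # recordings run 30-90 s, so avoid overlapping playback

def pedestrian_detected():
    """Stand-in for the pole's proximity sensor."""
    return random.random() < 0.1

def play(recording):
    """Stand-in for the MP3 player housed in the pole's cone."""
    print("playing", recording)

def run(seconds=30):
    last_played = float("-inf")
    end = time.time() + seconds
    while time.time() < end:
        if pedestrian_detected() and time.time() - last_played > COOLDOWN_SECONDS:
            play(random.choice(RECORDINGS))
            last_played = time.time()
        time.sleep(0.5)  # poll the sensor twice a second

if __name__ == "__main__":
    run()
```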

> An example of the pole's interactivity can be viewed in the Talking Poles Revisited video produced by Lorna Boschman.

Publication: Moulder, V., Boschman, L., and Wakkary, R. (2011) The Talking Poles: Public Art Based in Social Design. Extended Abstracts of CHI 2011, Vancouver, BC, ACM Press, pp. 201-209.

Vancouver’s First Algorave

The International Symposium on Electronic Art (ISEA) 2015 was held in Vancouver, BC, Canada. As the Performances and Partner Events Chair, I was involved in curating the city's first Algorave. An Algorave is an event where people create music generated from algorithms, often using live coding techniques. Artists produce these sounds by writing music-making algorithms in systems such as IXI Lang, Overtone, Pure Data, Max/MSP, SuperCollider, Impromptu, Beads, Fluxus, and Tidal. That's right! Artists code live, and you can watch their compositions appear on screen in real time.
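Live-coding systems such as Tidal work at a much higher level, but the core idea (music as the output of an algorithm) fits in a few lines of plain Python. This hypothetical sketch renders four bars of a four-on-the-floor kick pattern to a WAV file:

```python
import math
import struct
import wave

RATE = 44100
BPM = 130
BEAT = int(RATE * 60 / BPM)   # samples per quarter note

def kick(n):
    """A simple synthesized kick: a 60 Hz sine with an exponential decay."""
    t = n / RATE
    return math.sin(2 * math.pi * 60 * t) * math.exp(-18 * t)

samples = []
for _ in range(16):            # four bars of four-on-the-floor
    for n in range(BEAT):
        samples.append(int(kick(n) * 0.8 * 32767))

with wave.open("algorave_sketch.wav", "wb") as f:
    f.setnchannels(1)
    f.setsampwidth(2)
    f.setframerate(RATE)
    f.writeframes(struct.pack("<%dh" % len(samples), *samples))
```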

The movement originated in the computer music research community and became popular in the UK and across Europe. Alex McLean (Slub) and Ade Ward, the co-fathers of the scene, presented the Generative Manifesto at the Institute of Contemporary Arts on 23 August 2000. The document listed these key points:

  1. Attention to detail – that only hand made generative music can allow (code allows you to go deeper into creative structures)
  2. Realtime output and compositional control – we hate to wait (it is inconceivable to expect non-realtime systems to exhibit signs of life)
  3. Construct and explore new sonic environments with echoes from our own. (art reflects human narrative, code reflects human activity)
  4. Open process, open minds – we have nothing to hide (code is unambiguous, it can never hide behind obscurity. We seek to abolish obscurity in the arts)
  5. Only use software applications written by ourselves – software dictates output, we dictate software (authorship cannot be granted to those who have not authored!)

(top left) Alex McLean gave a from-scratch "blank slate" live coding performance, improvising percussive techno and occasionally bringing in and remixing elements from his Peak Cut EP. Listen to At last. (middle) Marinos Giannoukakis performed the secret life of Burton as part of an anthology of real-time narratives named "X short stories". Watch musicaUniversalis danceVer. (right) Norah Lorway wrote algorithmic, procedural techno with a twist of acid house. Listen to Huddersfield Algorave Excerpt.

(bottom left) Shawn Lawson & Ryan Ross Smith performed Sarlacc, an audio-visual piece featuring visuals live-coded in an OpenGL fragment shader and reactive to incoming audio frequencies parsed by band. Check out Kessel Run. (middle) Homage to Studio Vision Pro was performed by the Hamster Ate My Garageband (Oliver Bown), and (right) Arne Eigenfeldt performed Beats by GESMI.
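"Parsed by band" presumably means splitting the audio spectrum into frequency ranges and driving the visuals from each range's energy. A minimal sketch of that analysis, using numpy rather than whatever the performers actually used:

```python
import numpy as np

RATE = 44100
BANDS = ((20, 250), (250, 2000), (2000, 8000))   # low / mid / high, in Hz

def band_energies(frame, bands=BANDS):
    """Split one audio frame's spectrum into per-band energies."""
    spectrum = np.abs(np.fft.rfft(frame))
    freqs = np.fft.rfftfreq(len(frame), d=1.0 / RATE)
    return [float(spectrum[(freqs >= lo) & (freqs < hi)].sum())
            for lo, hi in bands]

# A 440 Hz test tone should land almost entirely in the mid band.
t = np.arange(2048) / RATE
print(band_energies(np.sin(2 * np.pi * 440 * t)))
```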

 
ISEA 2015 Art Event: Friday, August 14, 2015, 10:00 PM – 2:00 AM, Fortune Sound Club (147 E Pender St, Vancouver, BC V6A 1T5, Canada).