Related publications:

Palmer, M (2012) 'Designing human computer interfaces', chapter in Moschini and Lane (eds), The Digital Media Handbook: Theory and Reflective Practice, Harlow, Essex: Pearson Education (forthcoming).

Reiser, M (2006) 'Learning to play the instrument: how to encourage improvisation in interactive environments', peer-reviewed conference paper, Mindplay conference, London.

Contact:

Michaela Palmer at mic.palmer@uwe.ac.uk

Contents:

Physical computing: an introduction to physical computing, and why it is an interesting field of work.

Collaborative projects: some larger-scale collaborative projects that turned into dance performances and art installations.

Students' generative projects: key examples of completed project supervisions which explored an aspect of generative computing.

Students' interfacing projects: key examples of completed project supervisions, including some weird and wonderful new interfaces.

Physical computing


My practice very often involves physical computing and human computer interfacing. Physical computing is an approach that bases interface design on how human beings express themselves physically - through sound, heat, hormones, touch, movement, etc.

Interfacing with the world

In physical computing, sensors are used to trace these human expressions, and micro-controllers are employed to convert human activities into electrical signals a computer can interpret and react to. This process is the basis for the development of many exciting new interfaces.
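To make this concrete, here is a minimal sketch of the receiving end of that process: a micro-controller streams sensor readings over a serial connection, and a small program on the computer converts them into a control value. This is an illustration rather than code from any project on this page; the port name, baud rate and value range are assumptions (typical of an Arduino analogue input).

```python
# Minimal sketch: read sensor values streamed by a micro-controller
# over a serial port. Assumes the pyserial package and a device that
# prints one integer reading (0-1023) per line, e.g. an Arduino.
import serial

PORT = "/dev/ttyUSB0"   # assumed port name; on Windows it might be "COM3"
BAUD = 9600

with serial.Serial(PORT, BAUD, timeout=1) as link:
    while True:
        line = link.readline().decode("ascii", errors="ignore").strip()
        if not line.isdigit():
            continue  # timeout or noise; keep listening
        reading = int(line)
        # Map the raw reading to a normalised 0.0-1.0 control value
        # that could drive sound, colour or any other parameter.
        control = reading / 1023.0
        print(f"sensor: {reading:4d}  ->  control: {control:.3f}")
```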

Human computer interfacing (not synonymous with human computer interaction) means developing communication devices that make interaction between two or more participants possible. Like all good communication devices, they work best when they interfere as little as possible with the actual conversation taking place.

There are many different kinds of human computer interfaces. Today, nearly every web site, software application or ticket machine makes use of graphical user interfaces (GUIs).

Yet despite their popularity, GUIs are sometimes not the best possible interfaces for a given situation, as they may not allow for the most intuitive form of interaction, or they may disadvantage some user groups. This makes it necessary to investigate other, non-graphical kinds of interfaces that prioritise senses other than the visual.

Haptic, speech-based, movement-based or gestural interfaces like the ones featured on this page often require unusual and innovative design approaches. It is an exciting area in which to conduct research, prototype development and user testing.


Collaborative projects

Two larger-scale collaborative projects, completed before I became almost completely absorbed in biofeedback performance and PhD research.


 Still from Dance Technology collaboration

The Dance Technology project was a collaboration with choreographer Claire Keating, at the invitation of Essexdance. It took place in Chelmsford in 2004.

Claire wanted to work with the idea of dancers playing with their own shadows, and so I developed an application (Shado tracker) in the programming environment Max/msp, which could track the location of dance performers on stage using a camera.

Whilst dancing, the performers would move through several zones marked out on the floor. This triggered sounds, as well as the playback of some shadow play sequences recorded earlier.
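For readers curious about the mechanics, the zone-triggering logic can be sketched quite simply. The original Shado tracker was a Max/msp patch; the Python/OpenCV fragment below is a hypothetical reconstruction of the idea, with made-up zone coordinates: locate the performer via background subtraction, then check which floor zone their centroid falls into.

```python
# Sketch of camera-based zone triggering, in the spirit of the
# Shado tracker patch described above (the original was a Max/msp
# patch). Assumes OpenCV and a camera at index 0; zones are invented.
import cv2

ZONES = {  # stage zones as (x, y, w, h) rectangles in camera pixels
    "zone_a": (0, 200, 200, 280),
    "zone_b": (220, 200, 200, 280),
}

subtractor = cv2.createBackgroundSubtractorMOG2()
camera = cv2.VideoCapture(0)

while True:
    ok, frame = camera.read()
    if not ok:
        break
    mask = subtractor.apply(frame)  # foreground = moving performer
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if contours:
        # Take the largest moving blob as the performer.
        blob = max(contours, key=cv2.contourArea)
        x, y, w, h = cv2.boundingRect(blob)
        cx, cy = x + w // 2, y + h // 2
        for name, (zx, zy, zw, zh) in ZONES.items():
            if zx <= cx <= zx + zw and zy <= cy <= zy + zh:
                print(f"performer in {name}: trigger sound/playback")
    if cv2.waitKey(1) == 27:  # Esc to quit
        break

camera.release()
```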

The audience on the other side of the projection screen would only see the shadows of the dancers, and the projections of the sequences recorded earlier. This overlay of present and past would give the performers room to play, and make for intriguing viewing.

At the end of the project I created an audio/visual mixer, a software application that produced kaleidoscopic images of some of the dance sequences filmed earlier. Players could load sequences, rotate them, multiply them or pulse them in sync with the music.



Livespace was made in collaboration with Lucie Hernandez, Jane Vippond, Carl Rohumaa and Alex Schembri for Watermans Art Gallery, London, 2003:

A large-scale audio-visual installation, Livespace focused on natural as well as human activity cycles. The aim was to make the gallery space come 'alive' by capturing, interpreting and representing data from sources in the immediate natural and man-made environment.

Microphones and cameras collected live data from both inside and outside the gallery, which was then processed and played back as a series of images with real-time surround sound.

The images were collated by a server and played back in a montage style, which allowed visitors to see the present and the recent past as a time lapse. The microphones picked up natural sounds (weather, bird sounds) as well as human sounds (road and air traffic, gallery activities), which provided an immersive listening experience as a surround sound mix.
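The capture-and-montage cycle at the heart of Livespace can be sketched as follows. This is an illustrative Python/OpenCV fragment, not the installation's actual server code; the grid size and capture interval are assumptions.

```python
# Sketch of the montage idea behind Livespace: keep a rolling buffer
# of periodically captured frames and tile them, so viewers see the
# present alongside the recent past. Grid size and capture interval
# are illustrative values, not the installation's.
import time
from collections import deque
import cv2

GRID = 3              # 3x3 montage
INTERVAL = 5.0        # seconds between captured frames
TILE = (320, 240)     # size of each tile

frames = deque(maxlen=GRID * GRID)
camera = cv2.VideoCapture(0)
last_capture = 0.0

while True:
    ok, frame = camera.read()
    if not ok:
        break
    now = time.time()
    if now - last_capture >= INTERVAL:
        frames.append(cv2.resize(frame, TILE))
        last_capture = now
    if len(frames) == frames.maxlen:
        # Tile the buffered frames row by row into one montage image.
        rows = [cv2.hconcat(list(frames)[r * GRID:(r + 1) * GRID])
                for r in range(GRID)]
        cv2.imshow("montage: present and recent past", cv2.vconcat(rows))
    if cv2.waitKey(30) == 27:  # Esc to quit
        break

camera.release()
```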

Livespace, uploaded by Michaela Palmer on Vimeo.


Students' generative projects


In 2012 undergraduate student James Llewellyn created Melody Maker, a software application developed in JUCE which uses a genetic algorithm to initiate and evolve melodies. James then interfaced the application with the d-touch system, a visual marker recognition system that enables the development of low-cost tangible user interfaces. With Melody Maker, even a novice can playfully create and refine new melodies and then export them as MIDI files.
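For the curious, the genetic-algorithm core of such a system can be sketched compactly. The fragment below is illustrative only, not James's JUCE implementation: melodies are lists of MIDI note numbers, and the deliberately naive fitness function simply rewards stepwise motion.

```python
# Illustrative genetic algorithm for evolving melodies, in the spirit
# of Melody Maker (the real application was written in JUCE).
# Melodies are lists of MIDI note numbers; the fitness function is a
# deliberately naive stand-in that rewards smooth, stepwise movement.
import random

LENGTH, POPULATION, GENERATIONS = 16, 30, 200
NOTE_RANGE = range(60, 73)  # C4 to C5

def random_melody():
    return [random.choice(NOTE_RANGE) for _ in range(LENGTH)]

def fitness(melody):
    # Penalise large leaps: smoother melodies score higher.
    leaps = sum(abs(a - b) for a, b in zip(melody, melody[1:]))
    return -leaps

def crossover(a, b):
    cut = random.randrange(1, LENGTH)
    return a[:cut] + b[cut:]

def mutate(melody, rate=0.1):
    return [random.choice(NOTE_RANGE) if random.random() < rate else n
            for n in melody]

population = [random_melody() for _ in range(POPULATION)]
for _ in range(GENERATIONS):
    population.sort(key=fitness, reverse=True)
    parents = population[:POPULATION // 2]   # keep the fitter half
    children = [mutate(crossover(random.choice(parents),
                                 random.choice(parents)))
                for _ in range(POPULATION - len(parents))]
    population = parents + children

print("best melody (MIDI notes):", max(population, key=fitness))
```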

Melody Maker's d-touch tangible user interface


Students' interfacing projects


Wii-based action painting application

Many innovative applications have been prototyped by my past students. These include an Action Painting software application by postgraduate student Angelos Panayides, which interfaced a motion-sensing Nintendo Wiimote with the programming environment Max/msp/jitter.

A camera tracks the players in real time, so they can observe their actions as well as the results of their actions (see image above). This makes for a fascinating and immersive play experience.
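The essential mapping, from a stream of motion samples to paint strokes on a canvas, can be sketched independently of the Wiimote hardware. The Python fragment below is a hypothetical stand-in (the original was built in Max/msp/jitter): a random walk substitutes for real sensor data, and gesture speed drives stroke width and colour.

```python
# Sketch of the action-painting mapping: motion samples become brush
# strokes on a canvas. A random walk stands in for real Wiimote
# readings (the student project used Max/msp/jitter). Requires Pillow.
import random
from PIL import Image, ImageDraw

WIDTH, HEIGHT = 800, 600
canvas = Image.new("RGB", (WIDTH, HEIGHT), "white")
brush = ImageDraw.Draw(canvas)

x, y = WIDTH // 2, HEIGHT // 2
for _ in range(500):
    # Synthetic "motion": each step mimics one sensor sample.
    nx = max(0, min(WIDTH, x + random.randint(-40, 40)))
    ny = max(0, min(HEIGHT, y + random.randint(-40, 40)))
    speed = abs(nx - x) + abs(ny - y)
    # Faster gestures make thicker, redder strokes.
    brush.line([(x, y), (nx, ny)],
               fill=(min(255, 55 + speed * 3), 40, 40),
               width=1 + speed // 10)
    x, y = nx, ny

canvas.save("action_painting.png")
```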


colour tracking

Patrick Shuttleworth developed a Colour Tracking Interface (see image above, tracking a red and a blue light source). His aim was to develop an intuitive system that enabled users to control audio and visual parameters through their body movements.

The core of the enquiry was to investigate the technical and creative potential of colour tracking control, and how it might be applied in real-life situations. Target users for colour tracking include installation artists, DJs, musicians and sound engineers. The system was implemented in Max/msp/jitter.
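In outline, colour tracking comes down to thresholding each video frame in HSV colour space and taking the centroid of the matching region. The Python/OpenCV sketch below illustrates the idea for a single colour; the HSV thresholds are assumptions, and the original system was, as noted, built in Max/msp/jitter.

```python
# Sketch of colour tracking: threshold a video frame in HSV space and
# report the centroid of the matching region, which could then be
# mapped to audio or visual parameters. The HSV ranges are assumptions
# tuned for a bright blue light source.
import cv2
import numpy as np

BLUE_LO = np.array([100, 150, 150])  # assumed HSV lower bound
BLUE_HI = np.array([130, 255, 255])  # assumed HSV upper bound

camera = cv2.VideoCapture(0)
while True:
    ok, frame = camera.read()
    if not ok:
        break
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, BLUE_LO, BLUE_HI)
    moments = cv2.moments(mask)
    if moments["m00"] > 0:  # some matching pixels were found
        cx = moments["m10"] / moments["m00"]
        cy = moments["m01"] / moments["m00"]
        # Normalise to 0.0-1.0 so the position can drive any parameter.
        h, w = mask.shape
        print(f"blue source at ({cx / w:.2f}, {cy / h:.2f})")
    if cv2.waitKey(1) == 27:  # Esc to quit
        break

camera.release()
```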



Nicholas Allan created a Music Visualiser, for which he built a number of separate graphical objects, each reacting independently to different audio attributes. In contrast to many ready-made visualisers today, this one was programmed to be highly autonomous, reacting automatically to MIDI values derived from audio files.

The system is extensible, but to offer future users a starting point, Nicholas prepared a number of reactive animation sequences suitable for different musical contexts. The project was implemented in Quartz Composer.
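The underlying data flow, deriving a control value from an audio attribute and handing it to independent graphical objects, can be sketched as below. This Python fragment is illustrative: it computes a per-window loudness value from a hypothetical mono 16-bit WAV file, where the real project used Quartz Composer driven by MIDI values.

```python
# Sketch of the visualiser's data flow: derive a control value per
# audio window (RMS loudness here) and map it to graphical parameters.
# Assumes a mono 16-bit WAV file named "track.wav" (hypothetical).
import wave
import numpy as np

WINDOW = 1024  # samples per analysis window

with wave.open("track.wav", "rb") as wav:
    rate = wav.getframerate()
    samples = np.frombuffer(wav.readframes(wav.getnframes()),
                            dtype=np.int16).astype(np.float64)

for start in range(0, len(samples) - WINDOW, WINDOW):
    window = samples[start:start + WINDOW]
    rms = np.sqrt(np.mean(window ** 2)) / 32768.0  # 0.0-1.0 loudness
    # Each graphical object could map this value independently,
    # e.g. to scale, brightness or rotation speed.
    size = 10 + 90 * rms
    print(f"t={start / rate:6.2f}s  loudness={rms:.3f}  size={size:5.1f}")
```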


beat sneakers

Hugo Bishop, Piran Watson, Adam Khadaroo and Louis Catlett prototyped a Portable Drum Kit, which tracked the player's foot taps via sensors in the shoe soles, and their leg taps via removable sensors on the trouser legs.

The advantage the Portable Drum Kit offered was that players no longer had to carry actual drums about, but could still practise with drum sounds. One particular problem this group overcame was latency, which often arises in the construction of low-cost sensor-based interfaces.
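One common low-latency tactic, sketched below on synthetic data, is to trigger as soon as the signal crosses a threshold and then ignore input for a short refractory period, rather than smoothing the signal through buffers. I should stress this is an illustration of the general technique, not the group's actual solution; the threshold and timing values are assumptions.

```python
# Sketch of low-latency tap detection for a sensor-based drum trigger:
# fire as soon as the signal crosses a threshold, then ignore input for
# a short refractory period to avoid double triggers. Synthetic data
# stands in for real sensor readings; all values are assumptions.
import random

THRESHOLD = 0.6   # normalised signal level that counts as a hit
REFRACTORY = 5    # samples to ignore after a trigger (debounce)

def detect_taps(signal):
    taps, hold = [], 0
    for i, level in enumerate(signal):
        if hold > 0:
            hold -= 1          # still inside the refractory window
        elif level >= THRESHOLD:
            taps.append(i)     # trigger immediately: no buffering delay
            hold = REFRACTORY
    return taps

# Synthetic sensor stream: low-level noise with two sharp "taps".
stream = [random.uniform(0.0, 0.2) for _ in range(50)]
stream[10] = stream[11] = 0.9
stream[30] = 0.8
print("taps detected at samples:", detect_taps(stream))
```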
