Research
Horizon & TAS Hub Dance projects
We all have bodies, and when we interact with robots we may become even more aware of our own. This awareness can attune us to the physical contact we have with the robot, so that we notice the sensation of skin on metal or other hard material. The exchange of heat, pressure, vibration and so on can also prompt a heightened sensorial awareness that teaches us more about our body’s capabilities and limitations. At the root of this is the need for a trusting relationship, which requires that the interaction be safe and avoid unintended harm. But how might that trust be extended or rethought when the interaction is between robots and skilled movers, in this case dancers? More particularly, what more do we learn when skilled dancers use a prosthesis or wheelchair? Dancers are not only expert movers; their expertise is informed by their knowledge of the body/machine interface, so they already have a bodily awareness of what their body can do, and of what more (or less) it can do when joined with a mechanical device.
Moreover, dancers are naturally risk takers, but what happens when their dancing “partner” is a robot? Is there a shared responsibility in moving together? What is the ‘language’ of the robots’ touch, the sensation of physical contact with a human body? How can robots be more responsive, sensitive and alert to contact with ‘live’ bodies? And what might we learn about human-robot interaction that can be translated to other contexts? If robots are programmed to always avoid collisions, how might the dancer influence thinking about the creative potential of ‘collision’?
The aim of this project was to answer these and other questions by exploring the idea of deeply embodied trust in autonomous systems. It employed body-based methods, including contact improvisation and soma design, to examine the machine-body interface and to reimagine bodily contact and collisions with robots as creative and expressive, as well as trustworthy. Contact improvisation is a technique that dancers develop to work in relation with another dancer, involving supporting and bearing weight. The practice ranges from very quiet and subtle contact through to more athletic exchanges of weight, including rolling, lifting and weight catching, and it develops skills in resolving ‘accidents’ through mutual awareness.
Four workshops were held, employing a co-design methodology with dancers with a diverse range of experience. The project approach ensured that the dancers were fully part of the research process, with their perspectives and experience directly informing the design of human-robot interaction, potentially enabling robots to behave in more creative ways whilst incorporating new modes of synchronising movement to address safety and ergonomics.
By involving professional dance artists, and therefore skilled movers, the project aimed to reveal more about the nature of touch, physical contact, sensorial awareness and embodiment more generally, as well as broader themes such as autonomy, responsibility and body-based trust. These insights are transferable to thinking about rehabilitation technologies and assistive devices.
This TAS project developed from earlier work introduced by Horizon Digital Economy Research (see below), which involved many of the same team and laid the foundations for this interdisciplinary project combining expertise from dance, computer science and engineering. In the earlier research, disabled dancers performed improvised movement sequences that were captured and used to drive a computational design algorithm; the results were algorithmically mapped onto the shapes of prosthetic limb covers that could be 3D printed. The research generated questions about agency, appropriation and ownership, and challenged normative thinking in the context of digital technologies. Many of these issues, particularly how normative design models might be challenged through a process of embodied exploration, inform the methods we are developing here.
The project team consisted of Sarah Whatley and Kate Marsh from Coventry University’s Centre for Dance Research; Steve Benford, Feng Zhou, Praminda Caleb-Solly and Paul Tennent from the University of Nottingham’s School of Computer Science; Rachael Garrett from KTH Royal Institute of Technology, Sweden; and artists associated with Candoco Dance Company.
Expressive Personalisation of Consumer Products through Dance
Assistive technologies are often medicalised in their design. Horizon’s Consumer Product campaign project ‘Expressive Personalisation of Consumer Products through Dance’ explored how people could use their creative and expressive skills to interact with algorithms to personalise products such as prosthetics, mobility aids, tableware, furniture and other home products.
The project team was led by Paul Tennent, Assistant Professor in Computer Science, and consisted of Steve Benford (Professor of Computer Science), Praminda Caleb-Solly (Professor of Embodied Intelligence), Ian Ashcroft (Engineer, Mechanical and Manufacturing Engineering), Virginia Portillo (Research Fellow) and Feng Zhou (Early Career Researcher), all at the University of Nottingham. Also joining the team were Sarah Whatley (Professor of Dance and Director of the Centre for Dance Research) and Kate Marsh (Assistant Professor and Early Career Researcher at the Centre for Dance Research), both at Coventry University.