About PLEA
PLEA is an affective robotic head developed to analyze and employ behaviours using a form of biomimicry.
PLEA is on a quest to find compassion with an unknown buddy.
PLEA analyzes several sources of social signals from those who interact with it: facial expressions of emotion, the level of loudness in the room, the intensity of body movements of people moving around the installation, and sentiment analysis of the speech it hears. The system interprets these social signals to generate hypotheses and produce non-verbal expressions using information visualization techniques. The robot can reason about the emotional cues of the person interacting with it and respond accordingly. PLEA currently has limited verbal capabilities, but in future robots of this type will be able to speak.
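As a rough illustration of this kind of pipeline (a hedged sketch, not PLEA's actual implementation), the snippet below fuses hypothetical multimodal signals into a valence/arousal estimate that selects a non-verbal expression. All names, weights and thresholds here are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class SocialSignals:
    face_valence: float      # -1 (negative) .. 1 (positive), from facial expression analysis
    loudness: float          # 0 .. 1, normalised room loudness
    movement: float          # 0 .. 1, intensity of body movement near the installation
    speech_sentiment: float  # -1 .. 1, sentiment score of transcribed speech

def estimate_affect(s: SocialSignals) -> tuple[float, float]:
    """Fuse the modalities into a (valence, arousal) estimate.

    Weighted averaging is a placeholder for whatever reasoning
    PLEA actually performs over its social-signal hypotheses.
    """
    valence = 0.6 * s.face_valence + 0.4 * s.speech_sentiment
    arousal = 0.5 * s.loudness + 0.5 * s.movement
    return valence, arousal

def choose_expression(valence: float, arousal: float) -> str:
    """Map the affect estimate to a non-verbal facial expression."""
    if valence > 0.3:
        return "smile" if arousal < 0.5 else "excited"
    if valence < -0.3:
        return "concerned" if arousal < 0.5 else "alarmed"
    return "neutral"

# Example: a visitor smiling and speaking positively in a quiet room.
signals = SocialSignals(face_valence=0.8, loudness=0.2, movement=0.3, speech_sentiment=0.6)
print(choose_expression(*estimate_affect(signals)))  # -> "smile"
```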
PLEA is a proof of concept, and its presence at the Art AI Festival illustrates the latest developments in affective robots capable of interacting with humans. The system records the responses of the person during interaction (no personal data that identifies individuals is gathered). In future, it will use the interaction data it has collected from visitors to develop more advanced reasoning capabilities through a controlled learning experiment.
You can see a short video about the installation on our YouTube channel here.
Background
PLEA is based on the latest research into human cognition, cognitive robotics, information visualization and human-robot interaction, using that research as the basis for developing new robot reasoning and interaction strategies.
The computational architecture used in PLEA is a context-to-data interpreter that endows the machine with the capability to ‘reason’ from constantly changing perspectives. It uses a framework that integrates human, robot and environmental data to adapt in real time. This means the robot makes decisions based on newly acquired information that is still incomplete at the moment it decides to behave in a particular way. The robot’s built-in functionalities give it a degree of situational embodiment, self-explainability and context-driven interaction. PLEA represents a new way of thinking about robots, moving away from raw sensed data toward contextual anticipation.
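To make the context-to-data idea concrete, here is a minimal sketch under assumed, hypothetical names (this is an illustration only, not the AMICORC architecture): each new observation is interpreted through the current context, and the context is in turn updated by that interpretation, so decisions are always made on an incomplete but continuously refined picture.

```python
# Hypothetical sketch of a context-to-data loop. Each observation is read
# through the lens of the current context, and the context is updated by
# the resulting interpretation. Names and fusion logic are illustrative.

class Context:
    def __init__(self):
        # Fused view of human, robot and environmental state.
        self.state = {"human": {}, "robot": {}, "environment": {}}

    def interpret(self, source: str, observation: dict) -> dict:
        """Read an observation through the current context."""
        prior = self.state.get(source, {})
        # Placeholder fusion: new data overrides, prior knowledge fills gaps.
        return {**prior, **observation}

    def update(self, source: str, interpretation: dict) -> None:
        self.state[source] = interpretation

def step(context: Context, source: str, observation: dict) -> dict:
    """One iteration: decide with incomplete information, then adapt."""
    interpretation = context.interpret(source, observation)
    context.update(source, interpretation)
    return interpretation

ctx = Context()
step(ctx, "environment", {"loudness": 0.7})
print(step(ctx, "human", {"expression": "smile"}))
```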
This installation has been supported in part by the Croatian Science Foundation under the project ‘Affective Multimodal Interaction based on Constructed Robot Cognition – AMICORC (UIP-2020-02-7184)’. It is also a joint endeavour of the Faculty of Mechanical Engineering and Naval Architecture at the University of Zagreb (Prof Tomislav Stipančić) and the Institute of Creative Technologies, De Montfort University (Prof Tracy Harwood and Visiting Prof Duska Rosenberg) to build a robot empath that may be used in care settings.
What is Biomimicry?
Biomimicry is the emulation of the models, systems, and elements of nature for the purpose of solving complex human problems. The terms ‘biomimetics’ and ‘biomimicry’ are derived from Ancient Greek – βίος (bios), life, and μίμησις (mīmēsis), imitation, from μιμεῖσθαι (mīmeisthai), to imitate, from μῖμος (mimos), actor. A closely related field is bionics.
Living organisms have evolved well-adapted structures and materials over geological time through natural selection. Biomimetics has given rise to new technologies inspired by biological solutions at scales from the macroscopic to the nanoscale. Humans have long looked to nature for answers: it has solved many different engineering problems, such as self-healing, tolerance of and resistance to environmental exposure, self-assembly, flight and harnessing solar energy. One of the earliest examples of biomimicry was the study of birds to enable human flight. Although he never succeeded in creating a ‘flying machine’, Leonardo da Vinci (1452–1519) was a keen observer of the anatomy and flight of birds, and made numerous notes and sketches of his observations. The Wright brothers, who succeeded in flying the first heavier-than-air aircraft in 1903, allegedly derived inspiration from observations of pigeons in flight.
PLEA uses humans as its inspiration, attempting to consider the same factors in its decision-making that a human would: environmental conditions, the presence of others, movement and so on all inform its response mechanisms.
Participate in Research into PLEA!
Are you over 40 years old and based in Leicester? Please read on…
We will be carrying out data collection on people’s experiences with PLEA at 72 Church Gate, Leicester between 21 and 24 February 2022 (inclusive).
The research aims to evaluate how well PLEA is able to communicate using only its emotional expressions, and WE ARE LOOKING FOR RESEARCH PARTICIPANTS!
Specifically, the research is attempting to better understand engagement and experiences with the robot head, emotional responses to PLEA, and perceptions of PLEA’s use in (future) social care settings.
PLEA’s inventor, Prof Tomislav Stipančić (University of Zagreb, Croatia), along with DMU Emeritus Prof Duska Rosenberg and Prof Tracy Harwood (director of the Art AI Festival), will be running a data collection event over 4 days whilst PLEA is live at 72 Church Gate, Leicester (during February and March).
The 4 days and times the researcher will be on site to do the data collection are:
- Monday 21 Feb, 11am-5.30pm
- Tuesday 22 Feb, 11am-5.30pm
- Wednesday 23 Feb, 11am-5.30pm
- Thursday 24 Feb, 11am-5.30pm
Unfortunately, we are unable to pay expenses for attending the venue, and we are not providing compensation to participants – participation is entirely voluntary.
Participation takes a maximum of ONE HOUR, during which you will be given a task that involves communicating with PLEA. This will be followed by an interview about the experience, lasting approximately 30 minutes.
We hope all our research participants will find it an interesting project, demonstrating the latest artificial intelligence technologies for communicating in human-like ways.
To book a time slot to participate, please follow the Eventbrite link HERE or at the top of this web page.