BuddyBot is a robot designed to help children feel less alone when they are confined to a hospital bed or have to spend time in isolation. Jon Gislason and I built the robot for our Master's thesis at the IT University of Copenhagen. Our objective was to alleviate feelings of loneliness by creating an interactive friend that wasn't preprogrammed with a set of phrases, but could instead cuddle and give some indication of presence, similar to that of a pet.
For a robot, BuddyBot doesn't move very much; only the tail and the head are equipped with motors. In addition, it can make nonsensical sounds (inspired by R2-D2 from Star Wars) and change the colour of the sphere that forms its 'face'. We wanted to encourage children to project their own imagined character or personality onto BuddyBot, which is why the robot is faceless: they could draw or glue on whatever face they found appropriate. Despite seeming fairly passive, the robot is packed with sensors that monitor several aspects of the child's activity and wellbeing. Our intention was to provide an alternative to being wired up to hospital equipment; instead, a bed-bound child could be wired up to a BuddyBot.
We also developed the Companion app, an Android smartphone application that collects data from the robot. The Companion app also provides an alternative mode of communication between the child and the robot through a selection of games and activities, such as the 'pain meter', a graphical ruler for indicating pain using smileys, or feeding the BuddyBot avatar the way one would feed a virtual pet like a Tamagotchi.
There were many considerations throughout the development process, and many different directions we could have taken this system. We received a lot of support from friends in terms of design (Arezoo Clark Sørensen), sewing (Amina Audrey Louborg), construction (Emil Clausen), and sound (Thor Axilgaard), from financial backers like the Awesome Foundation, and of course from all the hospital staff, nurses, doctors and children who took part in our study. While the support is and was very much appreciated, it was not the right time for me to continue working on the project, because I had become uncertain whether such a system was truly beneficial or whether it would actually increase the number of children who are alone in hospitals. Not because BuddyBot would fail to alleviate loneliness, but because it would no longer seem necessary to physically check in on a child once a digital system was in place to do just that.
The same 'worst-case scenario' could occur in a future where an increasing number of assisted-living technologies become available to the growing population of the old and infirm. We have to remember that there are two sides to every story, and that no technology is good or bad in and of itself; that is why I try to keep in mind the importance of being responsible about the direction our research can take us. Of course, there are no guarantees, because devices, ideas or concepts can be brought forth by others who are less inclined to consider the potential consequences of their inventions.
Some might say it is better that a mindful person or group of persons is in charge of this type of research; others may claim that whatever we do, someone will invent a better idiot. I believe in personal responsibility, and I think it is worthwhile to take a moment and consider some of the potential misuses our ideas might be vulnerable to before they are released into the world.
Sometimes I have the feeling that we become so caught up in the possibility of manifesting and materialising our ideas that we forget about their very real consequences for the world we live in; at other times it seems that the egocentric drive to make money, achieve our ambitions or harness power completely overshadows our ability to consider the ramifications that the things we make or do may have on other people.
All in all, I still quite like the idea of creating a little snuggle bot that can be used when you're alone and a little scared, but I would probably not target vulnerable children: they need all the human contact they can get, and it would weigh too heavily on my conscience if BuddyBot were suddenly taking that away from them.