Want to build an artificial brain? Try building an embodied robot. It makes sense: to embody an AI system is to give it a body to be embodied in. The advantages, challenges and problems of artificial embodied cognition are examined in a recent Frontiers in Psychology article (citation below).
We are given useful definitions of terms that are often used interchangeably elsewhere. Cognition is ‘grounded’ in the physical properties of the world. ‘Embodied’ cognition, on top of grounding, is shaped by the physical constraints of the body and its sensorimotor interactions. ‘Situated’ cognition, on top of embodiment, is context-specific.
Then we are given a general walk through the ideas surrounding embodiment in both AI and biology. The five authors have written a fictional conversation between someone with a computational bent and someone with a biological bent, both interested in embodiment. A good case is made that neurobiology can assist AI and that AI can assist neurobiology if they collaborate. The end point of this dialogue is the agreement that robotics is the way to investigate embodiment.
… a necessary complement to all these methodologies is to increasingly adopt cognitive robotics as their experimental platform, rather than designing models of isolated phenomena, or relaxing too many constraints about sensorimotor processing and embodiment. Indeed, it seems to me that cognitive robotics offer a key advantage to the aforementioned methodologies, because it emphasizes almost all of the components of grounded models: the importance of embodiment, the loop among perceptual, motor and cognitive skills, and the mutual dependence of cognition and sensorimotor processes.
They follow that with a list of challenges that an embodied robot presents:
Challenge 1: Taking a developmental viewpoint to explore why and how embodied cognition could have originated. Through evolution, individual development and learning, how does embodiment come to be?
Challenge 2: Exploring the (causal) influence of embodied phenomena for cognitive processes. Amodal cognition models have to be abandoned. The sensory, cognitive and motor processes ‘leak’ into one another.
Challenge 3: Specifying the time course of activation for embodied concepts. Sensory, cognitive and motor processing is not sequential but overlap in time.
Challenge 4: Developing embodied computational models of symbolic and linguistic operations. Symbolic manipulation, which loosely includes reasoning and abstract thinking, predication, conceptual combination, language and communication, must be re-thought in terms other than traditional symbolic processing.
Challenge 5: Realizing situated and complete architectures without losing contact with data. There must be general integration of real perception to real action so as to model internal needs and motivation. But as the robots have different physical ways of sensing and acting, the results will not be directly comparable to biological systems. Compromise will be needed.
Challenge 6: Realizing realistic social scenarios for studying collaborative, competitive, communicative, and cultural abilities. There will need to be some level of robot society, or of robot-human interaction, to model this type of embodiment.
Much as I enjoyed and learned from this paper, I was a little disappointed by the lack of any mention of consciousness (of course, you have to realize that I am a consciousness nut). There was a general idea that sensory modalities had to be integrated and to share structure with motor maps/codes, but there was no discussion of whether this was likely to be a product of consciousness or not (or even maybe sometimes). There was mention of prediction, but again not of the predictive nature of consciousness. An introspective modality was mentioned, but not how a system could introspect without consciousness. Phrases like feed-forward, feed-back and lateral activation were used, but with no hint that the neurological signature of consciousness is just such waves of activation sweeping forward, sideways and back. The authors may have had one of several reasons for this omission: that it would over-complicate the discussion; that it would be difficult to say what the equivalent of human consciousness would be in a robot; that it was a taboo word in their conversations; or that embodied cognition might be possible without it.
Pezzulo, G., Barsalou, L., Cangelosi, A., Fischer, M., McRae, K., & Spivey, M. (2011). The Mechanics of Embodiment: A Dialog on Embodiment and Computational Modeling. Frontiers in Psychology, 2. DOI: 10.3389/fpsyg.2011.00005
Mirolli and Parisi discuss “mental life” from an embodied perspective here (paragraph 4.6):
http://laral.istc.cnr.it/mirolli/papers/MirolliParisiInPressTowardsVygotskyanCognitiveRobotics.pdf
JK:
Thank you very much for the link to that very interesting paper. I have no doubt that a linguistic ability would add immensely to the cognitive ability of an embodied robot and would be essential for a socially embedded one. However, the paper does not address head-on the need, or non-need, for consciousness.
I think consciousness is essential for anything above a minimal level of cognition, but this is a guess for which I have practically no evidence; hence my interest in what might be achieved in embodied robots with and without consciousness, or a surrogate by another name.
By consciousness I am thinking of a world model with a self at the center at a point in time. Such an entity is useful in many ways: it forces the results of perception in differing modalities to compromise on a best-fit scenario to be shared by all perceptive-cognitive-motor processes; it can forecast the near future in order to avoid motor plan conflicts; it constitutes ‘experience’, which is used for episodic and linguistic memory, and through memory fragments for the construction of imaginings of various kinds and of an autobiographic history; it allows a particular type of learning, learning from experience (there are other types); and it is part of the control of, and access to, short-term memory and the focus of attention.
Language (even the most inclusive definition of it) applies only to humans and a few other animals. Consciousness on the other hand (again being as inclusive as possible and not demanding self-consciousness for example) applies to a great many animals, perhaps almost all. Animals would no doubt need to have consciousness to have language, but certainly do not need language to be conscious.
Thank you again for the link. Janet