It is important that an autonomous mobile robot is able to make sense of the dynamic world around it. The fundamental question of how a robot perceives its whereabouts in known and unknown environments has attracted considerable attention within the mobile robot research community. In particular, mapping, localization, and their integration in the form of simultaneous localization and mapping (SLAM) have been vigorously pursued as potential solutions to this problem. At AISL, we have been studying these problems over the last few years.

We are currently studying the SLAM problem on a humanoid robot (NAO). We are pursuing two approaches: (i) integration of SLAM with augmented reality, and (ii) SLAM via bio-inspired techniques. Both approaches are expected to reduce the computational effort compared to mathematical model-based SLAM and should also be useful in dynamic environments.

Autonomous navigation is regarded as an essential attribute of intelligent robots (mobile platforms or humanoids). However, operating in the human world is equally important. More and more robots are finding their way into human environments such as factories, universities, hospitals, and homes. Perceiving different environments and learning specialized tasks within them present a host of challenging problems. We are studying perception problems including, but not limited to, face detection, mood detection, and their integration with navigation strategies.
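As an illustration of the kind of perception task involved, the sketch below shows face detection on a single camera frame using OpenCV's bundled Haar cascade detector. This is only one common off-the-shelf approach, shown for context; it is not necessarily the specific method used in our work.

```python
# Minimal face-detection sketch using OpenCV's pre-trained Haar cascade.
# Illustrative only; not necessarily the approach used at AISL.
import cv2

# Load the frontal-face Haar cascade that ships with OpenCV.
cascade_path = cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
face_detector = cv2.CascadeClassifier(cascade_path)

def detect_faces(frame):
    """Return (x, y, w, h) bounding boxes for faces in a BGR image frame."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    return face_detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

if __name__ == "__main__":
    # Grab one frame from the default camera and report any detections.
    capture = cv2.VideoCapture(0)
    ok, frame = capture.read()
    capture.release()
    if ok:
        for (x, y, w, h) in detect_faces(frame):
            print(f"Face at x={x}, y={y}, size {w}x{h}")
```

On a robot, the resulting bounding boxes would typically feed into higher-level behaviors, for example steering the navigation strategy toward or around detected people.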