Undergraduate Student Projects

Adapting Full Body Synergies:

Completed by: Ezgi Tekdemir

Term Completed: Fall 2017

The Adapting Full Body Synergies project analyzes how humans represent high-dimensional movements as a combination of as few dimensions as possible. The representation is considered accurate if the retained motion components account for a large percentage of the total variance. The aim is to see whether the reconstruction of a restricted movement, using the same number of dimensions as the original movement, converges to the reconstruction of the original movement as the subject adapts to the restriction. The research addresses human learning and adaptation of movements with reduced dimensionality.
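This kind of synergy extraction is commonly done with principal component analysis (PCA). A minimal sketch, assuming the motion is stored as a frames × joint-angles matrix (the data here is a synthetic stand-in, not the project's recordings):

```python
import numpy as np

# Hypothetical motion data: 200 frames x 30 joint angles (synthetic stand-in).
rng = np.random.default_rng(0)
motion = rng.normal(size=(200, 30)) @ rng.normal(size=(30, 30))

# Center the data and compute principal components via SVD.
centered = motion - motion.mean(axis=0)
U, S, Vt = np.linalg.svd(centered, full_matrices=False)

# Fraction of the total variance explained by each component.
variance = S**2 / np.sum(S**2)

# Number of synergies needed to account for 90% of the variance.
k = int(np.searchsorted(np.cumsum(variance), 0.90)) + 1

# Reconstruct the motion from only the first k components.
reconstruction = U[:, :k] @ np.diag(S[:k]) @ Vt[:k] + motion.mean(axis=0)
rel_error = np.linalg.norm(motion - reconstruction) / np.linalg.norm(motion)
```

Comparing `rel_error` for the original and the restricted movement, reconstructed with the same `k`, is one way to quantify the convergence the project looks for.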


Serve:

Completed by: Abdullah Muaz Ekici and Özer Biber

Term Completed: Fall 2017

The Serve project is the implementation of a system that enables interaction with Baxter through speech. Baxter acts according to given commands: it understands them via speech recognition and acts in the environment using object detection.
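The speech-to-action pipeline can be sketched as a simple dispatch from recognized phrases to robot actions. The command phrases and handler functions below are hypothetical placeholders; the actual speech recognition and object detection components are not shown:

```python
# Minimal command-dispatch sketch: map recognized phrases to robot actions.
# Phrases and handlers are illustrative placeholders, not the project's API.

def pick_up(obj):
    return f"picking up {obj}"

def wave(_arg):
    return "waving"

COMMANDS = {
    "pick up": pick_up,
    "wave": wave,
}

def dispatch(transcript):
    """Match the start of a recognized transcript to a command handler."""
    transcript = transcript.lower().strip()
    for phrase, handler in COMMANDS.items():
        if transcript.startswith(phrase):
            # The remainder of the utterance names the target object.
            return handler(transcript[len(phrase):].strip())
    return "unknown command"
```

In the real system the transcript would come from a speech recognizer and the handlers would drive Baxter's arms, with object detection resolving the named target to a location.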

Previous Completed Undergraduate Student Projects

NAO Robot Avatar (Selected as "Best Undergraduate Project" of Spring 2017):

Completed by: Yunus Seker and Mehmet Özdemir

Term Completed: Spring 2017

Video: https://youtu.be/LZD0s43UvNE

In this project, the students implemented a system that enables seeing through NAO's eyes and moving with NAO's body. NAO's motions are copied by an adapted whole-body tracking system, and the robot's camera images are displayed on a head-mounted display. This system enables full embodiment and supports a very fruitful research direction: utilizing robot avatars to understand the underlying mechanisms of human sensorimotor processes by changing different aspects of the embodiment.

Facial Expressions and Object Tracking for Baxter Robot:

Completed by: Bilgehan Nal

Term Completed: Summer 2017 as Summer Internship

The aim of the project is to equip Baxter with a face, including a mouth, eyes, and eyebrows, for displaying different facial expressions. In addition, the robot is given the ability to track its own end effector with its eyes and head, or, with the help of its sonar sensors, an object around it. The face is controlled through an application for PC or Android phone.

HRI: Robot control and communication through speech:
Completed by: Bilgehan Nal

Term Completed: Summer 2017 as Summer Internship

The aim of this project is to integrate existing speech processing and synthesis tools for communication with the Baxter robot. English and Turkish are used in communication, both in assigning tasks and in getting information from the robot. The robot's voice-based communication skills are reinforced with various interfaces, including emotional faces displayed on the tablet.