Human activity recognition from object interaction in domestic scenarios
Tutor / director / evaluator: Aranda López, Juan
Document type: Official Master's Final Project
Access conditions: Restricted access by decision of the author
This work describes the recognition of human activity based on the interaction between people and objects in domestic settings, specifically in a kitchen. To achieve this aim, a procedure and the necessary equipment must be established. The procedure, in simplified form, consists of capturing images of the area where the activity takes place with a colour (RGB) camera, and processing those images to recognize the objects present and their locations. The interaction with the objects is classified into five possible action types (unchanged, add, remove, move, and indeterminate), which are used to estimate the probability of the human activity being performed at that moment. Regarding the technological tools employed, the system runs on Ubuntu as the general operating system, with ROS (Robot Operating System) as the framework, OpenCV (Open Source Computer Vision) for the vision algorithms, and the Python programming language.

The development starts with segmentation using the "difference image" method, which obtains the area that the objects occupy in the image. Object recognition is carried out by distinguishing objects according to their colour histograms. Positioning is obtained through each object's centroid, applying the corresponding homography to transform from the image coordinate system to real-world coordinates. By comparing the historical and the new information about the objects, the actions that have been performed are determined. As a final stage, the relevant objects are filtered on the basis of the actions carried out and compared with the objects defined for the accomplishment of each activity; the result is the probability that each activity is being executed.
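The pipeline described above (difference-image segmentation, colour-histogram recognition, centroid positioning via a homography) can be sketched as follows. The thesis itself uses OpenCV; this NumPy-only sketch with a synthetic 8×8 scene, an identity homography, and a hypothetical stored histogram model only illustrates the sequence of steps, not the author's implementation.

```python
import numpy as np

def segment_difference(background, frame, thresh=30):
    """Difference-image segmentation: pixels that changed between the
    stored background and the current frame form the object mask."""
    diff = np.abs(frame.astype(int) - background.astype(int))
    return diff.max(axis=-1) > thresh  # change in any colour channel

def centroid(mask):
    """Image-plane centroid (x, y) of the segmented region."""
    ys, xs = np.nonzero(mask)
    return xs.mean(), ys.mean()

def to_world(H, pt):
    """Apply a 3x3 homography to map image coordinates onto the plane
    of the work surface (real-world coordinates)."""
    v = H @ np.array([pt[0], pt[1], 1.0])
    return v[0] / v[2], v[1] / v[2]

def hist_intersection(h_new, h_model):
    """Normalized histogram intersection: 1.0 means an exact colour match."""
    return np.minimum(h_new, h_model).sum() / h_new.sum()

# Synthetic scene: black background; a red 2x2 "object" appears in the frame.
bg = np.zeros((8, 8, 3), dtype=np.uint8)
frame = bg.copy()
frame[2:4, 5:7] = [200, 0, 0]

mask = segment_difference(bg, frame)
cx, cy = centroid(mask)          # object centroid in image coordinates
H = np.eye(3)                    # identity homography, for the sketch only
wx, wy = to_world(H, (cx, cy))   # centroid in "world" coordinates

# Colour-histogram recognition: compare the new object's histogram against
# a stored model (here a hypothetical model that matches exactly).
obj = frame[mask]
h_new, _ = np.histogram(obj[:, 0], bins=4, range=(0, 256))
h_model = h_new.copy()
score = hist_intersection(h_new, h_model)
```

A change of the mask between consecutive frames, combined with this match score, is what lets the system label the interaction as one of the five action types (e.g. a newly appearing, recognized object maps to "add").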
File: TFM Carlos Flores Vazquez 13062014.pdf (Report, 1.612 MB, restricted access)