Automatic flagging of offensive video content using Deep Learning
Tutor / director / evaluator: Tarrés Ruiz, Francisco
Document type: Master thesis
Rights access: Open Access
Thanks to our visual system, it takes no effort for us humans to tell apart a cat from an eagle, recognize our family's faces, or read a sign. These are, however, hard problems to solve with a computer: the difference lies in how the human brain and computers process images. With the rise of the Internet and mobile smartphones, the amount of visual content available online has grown well beyond what manual analysis can handle. Offensive classification of images is one of the major tasks in the semantic analysis of visual content.

In the last few years, the field of machine learning has made tremendous progress on these difficult problems. In particular, a kind of model called a deep convolutional neural network (CNN) can achieve reasonable performance on hard visual recognition tasks, matching or exceeding human performance in some domains. CNNs are now being used to tackle one of the core problems in computer vision: image classification.

In this master thesis, Automatic flagging of offensive video content using Deep Learning, Deep Learning is the key enabler for addressing the offensive video classification challenges posed by the Internet age. Deep Learning is a new paradigm that aims to overcome the limitations of current approaches, which are complex and require manual intervention. We design a system that automatically analyses video files and detects violent and/or adult content using a Deep Learning framework. The classification is based on a prior segmentation of the video files, from which the most representative shot key frames are extracted. The extracted frames are then classified by a deep neural network. This project includes the training and testing of the system: the training process consists of finding or creating a database of images and adapting the parameters of the neural network.
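The pipeline described above (segment the video into shots, extract a key frame per shot, classify each key frame) can be sketched in a few lines. The sketch below is illustrative only: it uses mean absolute frame difference as a simple shot-boundary signal and a hypothetical stand-in classifier in place of the trained CNN; the thesis's actual segmentation method and network are not specified here, and the threshold values are assumptions.

```python
import numpy as np


def detect_shot_keyframes(frames, threshold=30.0):
    """Pick one key frame per detected shot.

    Uses mean absolute pixel difference between consecutive frames
    as a simple shot-boundary signal (a common baseline; the thesis
    may use a more sophisticated segmentation method).
    """
    keyframes = [0]  # first frame always starts a shot
    for i in range(1, len(frames)):
        diff = np.mean(np.abs(frames[i].astype(float) -
                              frames[i - 1].astype(float)))
        if diff > threshold:  # large change -> shot boundary
            keyframes.append(i)
    return keyframes


def classify_frame(frame):
    """Hypothetical stand-in for the trained CNN classifier:
    flags a frame when its mean red-channel intensity is high.
    A real system would run the frame through the network instead."""
    return "flagged" if frame[..., 0].mean() > 128 else "safe"


# Synthetic clip: 10 dark frames, then a hard cut to 10 bright red frames.
dark = np.zeros((10, 8, 8, 3), dtype=np.uint8)
red = np.zeros((10, 8, 8, 3), dtype=np.uint8)
red[..., 0] = 200
clip = np.concatenate([dark, red])

keys = detect_shot_keyframes(clip)
labels = {i: classify_frame(clip[i]) for i in keys}
print(keys, labels)  # two shots detected; second key frame is flagged
```

In a real deployment the synthetic clip would be replaced by frames decoded from a video file, and `classify_frame` by a forward pass through the trained CNN.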