Integration of a Neurosignals System to Detect Human Expressions in the Multimedia Material Analysis


Authors

Luz Ángela Moreno-Cueva
César Augusto Peña-Cortés
Herney González-Sepúlveda

Abstract

This paper presents advances in the integration of a low-cost commercial device for capturing a user's neural signals. The goal is to record the expressions of a person consuming multimedia material. Everyone displays various expressions while watching TV, movies, advertisements, or other external stimuli. Notable examples include clenching one's teeth during suspense scenes, jerking one's head back when an object seems to fly out of the screen during 3D movies, looking away during horror scenes, smiling during emotional advertising, laughing during humorous scenes, or even falling asleep when interest is lost. The general idea of this system is to capture all of these expressions, together with affective signals such as levels of attention, frustration, and meditation, so that expert designers of multimedia material can analyze and improve their products. Experimental evidence is presented showing the good performance of the system.

Keywords:

Article Details

Licence

All articles included in the Revista Facultad de Ingeniería are published under the Creative Commons (BY) license.

Authors must complete, sign, and submit the Review and Publication Authorization Form of the manuscript provided by the Journal; this form should contain all the originality and copyright information of the manuscript.

The authors who publish in this Journal accept the following conditions:

a. The authors retain the copyright and transfer the right of first publication to the Journal, with the work registered under the Creative Commons Attribution license, which allows third parties to use the published material as long as they mention the authorship of the work and its first publication in this Journal.

b. Authors can make other independent and additional contractual agreements for the non-exclusive distribution of the version of the article published in this Journal (e.g., include it in an institutional repository or publish it in a book), provided they clearly indicate that the work was first published in this Journal.

c. Authors are allowed and encouraged to publish their work on the Internet (for example, on institutional or personal pages) before and during the review and publication process, as this can lead to productive exchanges and a greater and faster dissemination of the published work.

d. The Journal authorizes the total or partial reproduction of the content of the publication, as long as the source is cited, that is, the name of the Journal, name of the author(s), year, volume, publication number, and pages of the article.

e. The ideas and statements issued by the authors are their responsibility and in no case bind the Journal.

