Michael Kai Petersen’s research focuses on cognitive modeling of media, fusing latent semantics and neuroimaging with aspects of affective computing, an approach that might be used for personalized media search or to turn smartphones combined with mobile EEG into brain-machine interfaces. He has 30 years of experience in digital media engineering and has been associated with the Technical University of Denmark (DTU) since 2004, where he received his PhD degree in 2010 and was subsequently appointed Assistant Professor in Cognitive Systems and Head of Studies for the Digital Media Engineering MSc program at DTU Informatics.
Combining a technical and creative background, he holds an M.Mic. master’s degree in mobile internet communication from DTU Photonics (2004). He was previously trained in digital sound engineering, working as a producer for DR, the Danish Broadcasting Corporation, and Steeple Chase from 1980 to 1993, after graduating from the soloist class of the Royal Danish Academy of Music in 1982. As an entrepreneur he founded three start-up companies between 1993 and 2003, and has produced more than 100 CD albums for labels including Chandos and Naxos.
Coupling a wireless EEG headset with a smartphone offers new opportunities to capture brain imaging data reflecting our everyday social behavior in a mobile context. However, processing the data on a portable device requires novel approaches to analyzing and interpreting significant patterns in order to make them available for runtime interaction. Applying a Bayesian approach to reconstruct the neural sources, we demonstrate the ability to distinguish among emotional responses reflected in different scalp potentials when viewing pleasant and unpleasant pictures compared to neutral content. Rendering the activations in a 3D brain model on a smartphone may not only facilitate differentiation of emotional responses but also provide an intuitive interface for touch-based interaction, allowing both for modeling the mental state of users and for building novel bio-feedback applications.
Michael Kai Petersen, Carsten Stahlhut, Arkadiusz Stopczynski, Jakob Eg Larsen and Lars Kai Hansen: smartphones_get_emotional_-_mind_reading_images_and_reconstructing_the_neural_sources.pdf - paper presented at ACII 2011 Affective Computing and Intelligent Interaction, Memphis, Tennessee, USA, October 2011
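The Bayesian source reconstruction mentioned in the abstract can be illustrated with a minimal sketch: under a Gaussian prior on the sources and Gaussian observation noise, the MAP estimate reduces to a Tikhonov-regularized minimum-norm inverse. The dimensions, lead-field matrix and noise levels below are invented for illustration and do not reproduce the paper’s actual model or data:

```python
import numpy as np

# Hypothetical setup: 14 scalp electrodes (as on a consumer EEG headset)
# and 500 cortical source locations. The lead-field matrix L maps source
# activity to scalp potentials; here it is random for demonstration.
rng = np.random.default_rng(0)
n_channels, n_sources = 14, 500
L = rng.standard_normal((n_channels, n_sources))

# Simulate a sparse source activation and the resulting noisy scalp data.
s_true = np.zeros(n_sources)
s_true[[10, 200]] = [1.0, -0.5]
y = L @ s_true + 0.1 * rng.standard_normal(n_channels)

def bayesian_min_norm(L, y, noise_var=0.01, prior_var=1.0):
    """MAP source estimate under a Gaussian source prior and Gaussian noise,
    equivalent to a Tikhonov-regularized minimum-norm inverse."""
    lam = noise_var / prior_var
    gram = L @ L.T + lam * np.eye(L.shape[0])
    return L.T @ np.linalg.solve(gram, y)

s_hat = bayesian_min_norm(L, y)
print(s_hat.shape)  # one estimated amplitude per cortical source
```

Solving the regularized system in channel space (14 x 14) rather than source space (500 x 500) keeps the inverse cheap enough to be plausible on a portable device, which is the constraint the abstract highlights.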
Neuroimaging studies have over the past decades established that language is grounded in sensorimotor areas of the brain. The same neuronal circuits seem involved whether we literally pick up a ball or in a phrase refer to grasping an idea. However, recent findings have demonstrated that not only leg-, hand- and face-related verbs but also emotional action verbs activate premotor systems in the brain. Hypothesizing that the force and spatial parameters which define action-based language might also be reflected in the latent semantics of words, we select motor and emotion-related verbs and apply latent semantic analysis, multidimensional scaling, hierarchical clustering and network graph analysis to quantify their interaction and identify parameters of force and spatial differentiation which we propose cognitively relate emotions to sensorimotor action schemas.
Michael Kai Petersen, Lars Kai Hansen: on_an_emotional_node_modeling_sentiment_in_graphs_of_action_verbs2.pdf
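The latent semantic analysis step of the pipeline above can be sketched in miniature. The verbs and the tiny term-document co-occurrence matrix below are invented stand-ins for a real corpus, and the truncated SVD here stands in for the full LSA/MDS/clustering analysis of the paper:

```python
import numpy as np

# Toy co-occurrence matrix for a handful of motor and emotion verbs
# (rows: verbs, columns: hypothetical text contexts). Counts are invented
# purely for illustration.
verbs = ["grasp", "kick", "smile", "fear", "grab"]
X = np.array([
    [3, 0, 1, 0],   # grasp
    [0, 4, 0, 1],   # kick
    [1, 0, 3, 1],   # smile
    [0, 1, 2, 3],   # fear
    [4, 0, 1, 0],   # grab
], dtype=float)

# Latent semantic analysis: a truncated SVD projects the verbs into a
# low-rank latent space where shared usage patterns become geometric
# proximity, which clustering or graph analysis can then operate on.
U, S, Vt = np.linalg.svd(X, full_matrices=False)
k = 2
latent = U[:, :k] * S[:k]          # verb coordinates in 2-D latent space

def cosine(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

# Verbs with similar contexts ("grasp"/"grab") should land close together,
# while dissimilar usage ("grasp"/"fear") should yield lower similarity.
sim_grasp_grab = cosine(latent[0], latent[4])
sim_grasp_fear = cosine(latent[0], latent[3])
print(f"grasp~grab: {sim_grasp_grab:.2f}, grasp~fear: {sim_grasp_fear:.2f}")
```

The cosine similarities in the latent space would then feed the multidimensional scaling, hierarchical clustering and network graphs the abstract describes.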
Michael Kai Petersen is part of the milab research environment focused on personalized context-awareness, and teaches courses in the Digital Media Engineering master program on building collective intelligence from metadata, self-tracking, social network analysis, and cognitive aspects of human-computer interaction, providing a foundation for user experience engineering.