This school is project-oriented

A significant amount of time will be dedicated to applying the knowledge acquired during the school - and hopefully before - to carry out pilot studies using different tools and technologies. All registered students will work in small teams - including senior participants - to complete a project by the end of the school. Projects can be chosen from a list of predefined projects (see pdf below), or proposals can be made on the first day of the school. Everybody is invited to bring their own equipment and tools. These projects are meant to realize new ideas, not to continue work already performed.
School Awards: a prize will be given for the best project. See PROJECTS 2015


1. Neurobiological and computational mechanisms of social SMCs in joint action

Everyday activities such as playing tennis or moving a heavy piece of furniture require coordinated sensory-motor interaction between multiple individuals. An important aspect of joint goal-oriented behaviour is the ability to predict and anticipate the motor intentions of a partner and act accordingly by estimating their consequences (Keller et al., 2014). From a computational perspective, this capacity has been ascribed to a predictive control framework built on the notions of feedback and feedforward control (Wolpert, Doya, & Kawato, 2003; Maffei et al., 2014). In this project we will explore the architectural features of an adaptive control system suitable for acquiring the sensory-motor contingencies that stem from the interaction between two self-balancing robots. We will do this by applying a model of the cerebellar microcircuit (Herreros and Verschure, 2013), capable of acquiring short-term sensory-motor predictions, to a joint action task that we will define during the course of the project. The final goal is to present a demo to be integrated into the final socSMC performance.

Student tasks:
- design a socSMC scenario
- design a control architecture from the provided material
- implement the control system on a self-balancing robot

Skills required:
- basic understanding of control concepts (feedback control, feedforward control)
- Python and Arduino (C++) programming

Supervisors: Ivan Herreros, Giovanni Maffei


STUDENTS:  Ben Hawker, Andrei Robu, Omar Zahra


Keller, P. E., Novembre, G., & Hove, M. J. (2014). Rhythm in joint action: psychological and neurophysiological mechanisms for real-time interpersonal coordination. Phil. Trans. R. Soc. B, 369(1658), 20130394.

Wolpert, D. M., Doya, K., & Kawato, M. (2003). A unifying computational framework for motor control and social interaction. Philosophical Transactions of the Royal Society of London B: Biological Sciences, 358(1431), 593-602.

Herreros, I., & Verschure, P. F. M. J. (2013). Nucleo-olivary inhibition balances the interaction between the reactive and adaptive layers in motor control. Neural Networks, 47, 64-71.

Maffei, G., Sanchez-Fibla, M., Herreros, I., & Verschure, P. F. (2014, July). The role of a cerebellum-driven perceptual prediction within a robotic postural task. In International Conference on Simulation of Adaptive Behavior (pp. 76-87). Springer International Publishing.


2.  Social NeckWork and robots

One or two Creepy Worms (Barcelona) are moved by up to 3×17 Xsens sensor units distributed among one to six students; each participant controls, for example, one segment of a worm. Participants have to explore how they can act together to achieve certain movement and/or sound patterns through individual or interpersonally coordinated movements. Feedback will be available in the visual and the auditory modality. In the transformation condition, student motion is transformed into worm motion, and the mapping between student(s) and the referenced worm segment and/or sound can be changed within a wide range.

Social NeckWork® I
– Social exploration of interpersonally coordinated movements and their transformation to another species (creepy worm)
(I) Real-time visualization: Students are asked to become familiar with the Xsens system and to track certain movement features. Real-time visualization is a fascinating tool to play with and to create new kinds of movements. Even dual-agent constellations can be tracked and used to explore interpersonal features (distance, facing, etc.).
(II) Reassembled motion: The exploration of the system can be carried out over a few days, including more exotic trials such as splitting the 17-sensor full-body equipment across two or more actors (one upper body, one lower body; one left body side, the other right body side; etc.), resulting in reassembled movement patterns of the animated avatar based on the partial movements of two or more students. Whether the system offers such possibilities, or whether this would exceed the technical limitations of the Xsens system, has to be explored beforehand.
(III) Sonified motion: In a second step, sound and basic musical structures can be added to individual as well as interpersonally coupled behavioral patterns; common metric patterns will evoke acoustic pulse and rhythm, for example. The exploration of acoustic/musical couplings should also be carried out by the students over several days. For these kinematic-acoustic/musical couplings a software interface has to be developed in Barcelona, and devices (hardware? software?) for sound synthesis and modulation should be available.
(IV) Movement transformation: A third step may consist of transforming individual as well as joint motion to another species, the so-called creepy worm. Student motion will move the worm(s), and sonification will be coupled to the students' or the worm's kinematics. A possible scenario: four students are each tracked by a neck sensor, each moving just one segment of the worm. Many different constellations can be realized, offering a lot of creative space.

Supervisors: Clement Moulinfrier, Marti Sanchez, G. Schmitz

STUDENTS: Alfred Effenberg, Tonghun Hwang

3. Incorporating art and social interaction

1. Painting together: two people will copy a famous work of art together (Dalí would fit the location very nicely) while wearing eye trackers. This will be repeated for several trials. Later on, the data will be analysed to see how the viewing behavior changed - do we split the picture in two halves? Does everyone cover certain objects? We will arrive at a measure of social interaction as well as some nice art produced by the participants.
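For example, the question of whether the painters split the picture could be quantified directly from the two fixation streams. A minimal sketch, assuming fixation x-coordinates normalized to [0, 1]:

```python
# Quantifying spatial division of labour between two painters from
# their fixation positions. The normalized-coordinate format is an
# illustrative assumption.

def left_half_share(fixation_xs):
    """fixation_xs: horizontal fixation coordinates normalized to [0, 1].
    Returns the fraction of fixations landing in the left half."""
    if not fixation_xs:
        return 0.0
    return sum(1 for x in fixation_xs if x < 0.5) / len(fixation_xs)


def division_of_labour(painter_a_xs, painter_b_xs):
    """0 = both look at the same half; 1 = perfect left/right split."""
    return abs(left_half_share(painter_a_xs) - left_half_share(painter_b_xs))
```

Tracking this measure across trials would show whether a division of labour emerges as the pair keeps painting together.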

2. Drawing with the eyes: Two people will play together. One will be presented with pictures that he/she has to communicate to the other by gaze only. The other will only see certain measures of the eye-tracking data (fixations, saccades, ...) live and has to guess what the first one is seeing. This project investigates how we change our more or less reflexive behavior when we want to communicate with one another.
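For instance, fixations can be extracted from raw gaze samples with a standard dispersion-based (I-DT) detector. The thresholds and sample format below are illustrative; a real eye-tracker SDK may ship its own event detection:

```python
# Dispersion-based (I-DT) fixation detection: a window of gaze samples
# counts as a fixation while its spatial spread stays below a threshold.
# Thresholds and the (x, y)-tuple sample format are assumptions.

def _dispersion(window):
    xs = [p[0] for p in window]
    ys = [p[1] for p in window]
    return (max(xs) - min(xs)) + (max(ys) - min(ys))


def detect_fixations(samples, max_dispersion=1.0, min_samples=4):
    """samples: list of (x, y) gaze points at a fixed sampling rate.
    Returns (start_index, end_index, centroid_x, centroid_y) tuples."""
    fixations = []
    i = 0
    while i <= len(samples) - min_samples:
        j = i + min_samples
        if _dispersion(samples[i:j]) <= max_dispersion:
            # grow the window while the gaze stays within the threshold
            while j < len(samples) and _dispersion(samples[i:j + 1]) <= max_dispersion:
                j += 1
            xs = [p[0] for p in samples[i:j]]
            ys = [p[1] for p in samples[i:j]]
            fixations.append((i, j - 1, sum(xs) / len(xs), sum(ys) / len(ys)))
            i = j
        else:
            i += 1
    return fixations
```

The fixation centroids are exactly the kind of reduced signal the guessing partner could be shown live.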

Student tasks:

Supervisor: Anna Lisa Gert (lab of Peter König).


STUDENTS: Marta Exposito, Chanez Mettouchi, Markus Bajones, Gerçek ÇİÇEK

4. Characterization of a human analog of the rodent Vicarious Trial and Error (VTE) using immersive VR technologies

In the rodent literature, a highly reliable behavioral marker of deliberation has been termed Vicarious Trial and Error (VTE). When rats come to a decision point, they sometimes pause and look back and forth as if thinking over the choice (Muenzinger, 1938).
Modern neuroscience has confirmed that VTE reflects an 'indecision underlying deliberation' process, since the sequential firing of hippocampal place cells depicts future trajectories and current goals of the animal while it is engaged in VTE behaviour (Redish, 2016).
A human analog of VTE has been described during volitional sampling of an environment in a 2D spatial memory task through the phenomenon of "revisiting" items (Voss et al., 2011); however, an embodied analog of VTE has not previously been described in humans.
Virtual Reality offers a great opportunity to study human behaviour under ecologically valid and highly controlled experimental conditions. We therefore propose to investigate a human analog of the rodent VTE in a fully embodied spatial arena and to characterize its relationship to volitional sampling and spatial memory performance. We aim to identify a behavioral marker of deliberation in humans and determine how reliably it predicts performance in memory and decision making.

Technical description
A 100 m2 tracking system in combination with a head-mounted display will allow us to create a fully customizable virtual world, increasing participants' sense of presence. Head and body movements will be recorded for analysis.

Student tasks:
- Design a simple virtual reality scenario
- Run experiments with subjects
- Data analysis

Requirements: statistical analysis; basic C# is a plus; knowledge of Unity 3D is a plus
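One simple, illustrative way to operationalize a VTE-like event from the recorded head and body tracking is to count head-turn reversals while the body is nearly stationary, i.e. "pausing and looking back and forth". All thresholds below are assumptions for the sketch:

```python
# Counting alternating head sweeps (sign changes of yaw velocity above a
# threshold) while translational speed is low, as a candidate VTE marker.
# Thresholds and the frame-based data format are illustrative assumptions.

def vte_score(yaw, speed, yaw_rate_thresh=0.5, speed_thresh=0.1):
    """yaw: head yaw angle per frame (radians); speed: body speed per frame.
    Returns the number of above-threshold head-turn direction reversals
    that occur while the body is (nearly) stationary."""
    reversals = 0
    prev_dir = 0
    for t in range(1, len(yaw)):
        rate = yaw[t] - yaw[t - 1]
        if speed[t] > speed_thresh or abs(rate) < yaw_rate_thresh:
            continue                      # moving, or head not really turning
        direction = 1 if rate > 0 else -1
        if prev_dir and direction != prev_dir:
            reversals += 1                # head swept the other way
        prev_dir = direction
    return reversals
```

Such a score, computed around decision points in the virtual arena, could then be related to subsequent memory and choice performance.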

Supervisors: Daniel Pacheco, Marti Sánchez-Fibla.


References: Redish, A. D. (2016). Vicarious trial and error. Nature Reviews Neuroscience, 17(3), 147-159.

Voss, J. L., Warren, D. E., Gonsalves, B. D., Federmeier, K. D., Tranel, D., & Cohen, N. J. (2011). Spontaneous revisitation during visual exploration as a link among strategic behavior, learning, and the hippocampus. Proceedings of the National Academy of Sciences, 108(31), E402-E409.

Voss, J. L., Gonsalves, B. D., Federmeier, K. D., Tranel, D., & Cohen, N. J. (2011). Hippocampal brain-network coordination during volitional exploratory behavior enhances learning. Nature neuroscience, 14(1), 115-120.

Muenzinger, K. F. (1938). Vicarious trial and error at a point of choice: I. A general survey of its relation to learning efficiency. The Pedagogical Seminary and Journal of Genetic Psychology, 53(1), 75-86.

5. Identifying a temporal structure of distal/proximal motor variability in motor learning

A major goal of stroke rehabilitation is to understand the main principles underlying motor learning and motor recovery in order to optimize rehabilitation practice. Previous literature suggests that training can modulate spontaneous motor variability, allowing for diversity in sensorimotor feedback at the first stages of learning. What was previously thought to be motor noise may therefore be a mechanism for optimizing the learning process and overcoming the exploitation/exploration trade-off (Wu et al., 2014). These principles may be useful for motor rehabilitation, and their implementation could be transferred to wearable technologies, offering a unique set of advantages to the rehabilitation field. A number of studies have explored the potential of wearable devices for the quantification of motor performance and recovery (Wang et al., 2014; Ballester et al., 2015).

In this project we propose to implement a method for the quantification of distal/proximal motor variability using wearable motion sensors, and to investigate the relationship between the temporal structure of distal and proximal motor variability and motor learning. We will use the Rehabilitation Gaming System-Wear, a wearable system for the continuous monitoring of arm use in healthy subjects. The device consists of a pair of bracelets and a smartphone; each bracelet includes a coin-sized Bluetooth board with an integrated accelerometer, a vibrating motor, an ultra-bright RGB LED, a battery, and a wristband. To explore the proposed hypothesis, we will conduct behavioral experiments using this setup. The outcome of this study can directly inform the design of these systems and their application in the clinical context.

Student tasks:
- Literature review on motor recovery and motor learning.
- Defining the experimental protocol.
- Collecting kinesthetic data from healthy subjects with the RGS wearables.
- Implementing a method for the quantification of motor variability both at distal and proximal arm segments.
- Data analysis and reporting.
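As a starting point, trial-to-trial variability of an accelerometer trace (computed separately for the distal wrist sensor and the proximal upper-arm sensor) can be quantified very simply, for example as the across-trial standard deviation averaged over the trial. The data layout is an assumption for illustration; the project would refine this toward measures of temporal structure:

```python
# A baseline measure of motor variability: for aligned repetitions of a
# movement, take the standard deviation across trials at each time point
# and average it over the trial. The trials-by-samples layout of
# acceleration magnitudes is an illustrative assumption.

def trial_variability(trials):
    """trials: list of equal-length lists of acceleration magnitudes.
    Returns the mean across time of the across-trial standard deviation."""
    n_trials = len(trials)
    n_samples = len(trials[0])
    total = 0.0
    for t in range(n_samples):
        values = [trial[t] for trial in trials]
        mean = sum(values) / n_trials
        var = sum((v - mean) ** 2 for v in values) / n_trials
        total += var ** 0.5
    return total / n_samples
```

Comparing this quantity between distal and proximal segments, and tracking how it evolves over practice, is one way to relate variability to learning.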

Supervisors: Belen Rubio, Martina Maier, Klaudia Grechuta


STUDENTS: Mircea Stoica, Gabriel Axel, Michele Crabolu

References:  Wu, H. G., Miyamoto, Y. R., Castro, L. N. G., Ölveczky, B. P., & Smith, M. A. (2014). Temporal structure of motor variability is dynamically regulated and predicts motor learning ability. Nature neuroscience, 17(2), 312-321.

Wang, Q., Chen, W., & Markopoulos, P. (2014, June). Literature review on wearable systems in upper extremity rehabilitation. In IEEE-EMBS International Conference on Biomedical and Health Informatics (BHI) (pp. 551-555). IEEE. doi: 10.1109/BHI.2014.6864424

Ballester, B. R., Lathe, A., Duarte, E., Duff, A., & Verschure, P. F. A wearable bracelet device for promoting arm use in stroke patients.

6. Integration of human dance with the behavior of robots and VR agents

Our task will be to come up with a creative way of integrating the movements of one or two human dancers on stage with the behavior of the robots, the VR agents, and the general stage environment (e.g. sound, light). As a minimal example solution, we have prepared a Matlab script that calculates in real time how correlated (vs. uncorrelated or anti-correlated) the movement of two hands is - a measure that could be computed within or across agents in the performance and used in turn to drive other elements of the performance.
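The provided script is in Matlab; a Python sketch of the same core computation, a sliding-window Pearson correlation between two one-dimensional movement signals (e.g. the vertical hand position of two agents), might look like the following. The window length is an arbitrary illustrative choice:

```python
# Sliding-window Pearson correlation between two movement signals.
# Values near +1 mean correlated movement, near -1 anti-correlated,
# near 0 uncorrelated. Window length is an illustrative parameter.

def sliding_correlation(a, b, window=10):
    """Returns one correlation value per window position."""
    out = []
    for i in range(len(a) - window + 1):
        wa, wb = a[i:i + window], b[i:i + window]
        ma = sum(wa) / window
        mb = sum(wb) / window
        cov = sum((x - ma) * (y - mb) for x, y in zip(wa, wb))
        va = sum((x - ma) ** 2 for x in wa)
        vb = sum((y - mb) ** 2 for y in wb)
        denom = (va * vb) ** 0.5
        out.append(cov / denom if denom else 0.0)
    return out
```

The resulting stream of values could be mapped onto sound or light parameters to close the loop between dancers and stage.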

- Decide which measures/sensors to use in order to characterize human motion and feed it into the general performance (week 1, with Annika Lübbert & experts on movement from Hannover)
- Decide which coupling measures to use to characterize entrainment with other agents in the performance (week 1, with Annika Lübbert & experts on information-theoretic measures from Hertfordshire)
- Adapt the Matlab script accordingly & integrate it with the (central program of the) performance (Martí Sanchez)
- Work on the artistic dance performance (week 2, with dancer Dayana Hristova and people who work on the performance script)

Requirements (to be covered within the group, not necessarily every individual):
- Experience with programming software, and/or integrating hardware into a program
- Interest or experience with dance / creative movement
- Interest in quantifying interaction dynamics
- Interest in movement tracking and real-time interactive systems

Supervisors: Dayana Hristova, Annika Lübbert, Petar, Pedro Gonzales



7. REEM-C and Humanoid robots

• Implement a coordination task with the REEM-C
• ROS environment: simulation and real robot
• Usage of the PAL whole-body control module
• Tutorial on Wednesday afternoon in Polivalente

• Additionally, we will take a look at the NAO SDK

Supervisors: Marti Sanchez, Jordi Ysard

STUDENTS: Yeşim KALKAN,  Phuong Nguyen 


8. Sonification of interactive behavior in robots and intelligent spaces.

Nowadays, the parametric and deliberate manipulation of sound diffusion in acoustic space is a well-known way of listening to music in a theatre, cinema, or living room; the widespread 5.1 surround technology is a familiar example. Unlike this canonical system, our proposal is to set up an eight-speaker system (an octophonic field) to investigate how the listener's perception behaves when virtual sound sources are presented and manipulated. That is, our goal is to immerse the listener in a sound projection created by audible sound trajectories.
In such a configuration, how does human perception behave when low-frequency drones, soundscapes of urban traffic, insect noise, birdsong, or even a text are spread across eight channels?
How can this technique be useful for the scientific sonification of large amounts of data, such as those presented in BrainX3?
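As a rough illustration of the core idea, pairwise equal-power amplitude panning can place a virtual source between the two nearest speakers of an evenly spaced ring; moving the azimuth over time then traces an audible trajectory. The ring layout and gain law below are illustrative assumptions, not the project's final design:

```python
# Pairwise equal-power amplitude panning on an eight-speaker ring:
# the two speakers adjacent to the source azimuth get sine/cosine
# gains, all others are silent. Evenly spaced ring is an assumption.
import math

def ring_gains(azimuth_deg, n_speakers=8):
    """Per-speaker amplitude gains for a virtual source at the given
    azimuth (degrees) on an evenly spaced speaker ring."""
    spacing = 360.0 / n_speakers
    pos = (azimuth_deg % 360.0) / spacing        # position in speaker units
    left = int(pos) % n_speakers                 # lower neighbouring speaker
    right = (left + 1) % n_speakers
    frac = pos - int(pos)                        # 0 = exactly on `left`
    gains = [0.0] * n_speakers
    gains[left] = math.cos(frac * math.pi / 2)   # equal-power crossfade:
    gains[right] = math.sin(frac * math.pi / 2)  # gains[l]^2 + gains[r]^2 = 1
    return gains
```

Sweeping `azimuth_deg` sample block by sample block would move the virtual source around the listener while keeping the total radiated power constant.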

Supervisors: Jonatas Manzolli, Artemis Moroni