Updated Activity Analysis, 2.1

An updated version of the Activity Analysis toolbox is up on GitHub now. The changes to the functions came from working with a wider range of psychophysiological measurements and events. There is now an option to assess the coordination score for a response collection using the distributions of local rank in activity levels, instead of the distributions of activity levels themselves. For more on Activity Analysis generally, check out this post on the paper that was published earlier this year, including a link to the full-text PDF.

This addition may seem like a tiny adjustment, and for many kinds of response events it doesn’t substantially change alignment assessments. When considering the coordination of events that occur at a fairly consistent rate over time, such as inspiration onsets, the distributions of activity levels and of local ranks give the same kinds of coordination scores. However, for response events that change their rate of occurrence over the course of a piece of music, like skin conductance increases, the distribution of the activity level time series obscures moments of exceptional alignment during quieter stretches. In such cases, the local rank does a better job of capturing anomalous alignments.
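To make the distinction concrete, here is a minimal sketch of the local rank idea in Python (illustrative code of my own, not the toolbox’s implementation; the function name and window parameter are hypothetical):

```python
import numpy as np

def local_rank(activity, half_window):
    """Rank each activity-level sample within a sliding time window.

    Returns values in (0, 1]: the fraction of samples in the local
    window that the current sample equals or exceeds. A modest peak in
    a quiet stretch can earn a high local rank even when the global
    activity-level distribution would bury it.
    """
    n = len(activity)
    ranks = np.empty(n)
    for t in range(n):
        lo, hi = max(0, t - half_window), min(n, t + half_window + 1)
        ranks[t] = np.mean(activity[lo:hi] <= activity[t])
    return ranks

# activity: count of participants with an event in each time frame
activity = np.array([0, 1, 0, 2, 1, 0, 0, 1, 5, 6, 4, 2, 1, 0, 2, 0])
print(local_rank(activity, half_window=4))
```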

This rank-based coordination score proved necessary when I was testing unrelated response collections of facial sEMG signals and skin conductance. The old calculation generated scores that were too low, producing fewer false positives than the nominal rate, while the adjusted statistic behaved just as it should.
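That sanity check can be sketched roughly like so: run a coordination test on collections of genuinely unrelated event trains and confirm that the rejection rate sits near the nominal 5%. The test below is a toy stand-in for the toolbox’s localActivityTest, comparing peak activity level against a circular-shift null:

```python
import numpy as np

rng = np.random.default_rng(0)

def coordination_pvalue(events, n_shifts=200):
    """Toy coordination test: compare the peak activity level of a
    response collection against peaks from circularly shifted
    (decoupled) versions of the same events."""
    observed = events.sum(axis=0).max()
    null_peaks = np.empty(n_shifts)
    for i in range(n_shifts):
        shifted = np.stack([np.roll(row, rng.integers(row.size))
                            for row in events])
        null_peaks[i] = shifted.sum(axis=0).max()
    return np.mean(null_peaks >= observed)

# Calibration: on independent event trains, a well-behaved test should
# reject at roughly the nominal 5% rate.
n_runs, false_positives = 100, 0
for _ in range(n_runs):
    unrelated = rng.random((20, 600)) < 0.05  # 20 participants, 600 frames
    if coordination_pvalue(unrelated) < 0.05:
        false_positives += 1
print(false_positives / n_runs)
```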

Besides this change to the localActivityTest function outputs, a few other functions have been tweaked and the demos have been amended to work with these changes. Lastly, Demo_2 now includes coordination assessments on a number of psychophysiological signals recorded during 24 listenings to a fun piece of music by a single participant.

It all should work just fine in MATLAB.

I’m looking forward to releasing this version in Python in the not-too-distant future, too.

ICMPC15 – The Audience’s Breath

This year’s ICMPC recorded all of the talks in support of virtual attendance. Here is my long talk (20 min + Q&A) on coordination in respiration between audience members.

The Audience’s Breath: Collective Respiratory Coordination in Response to Music

F. Upham, H. Egermann, and S. McAdams

Abstract

Performers have used respiratory metaphors to describe an audience’s engagement with a performance. We refer to an audience holding their collective breath, or sighing with a release of tension. Significant regularities in respiratory phase have been measured in participants’ responses over multiple listenings to some recorded music (Sato, Ohsuga, & Moriya, 2012), but this fleeting alignment has not yet been measured in audiences at live concerts.

Aims

With recordings of respiration from audience members at live performances, we aim to evaluate whether there is measurable respiratory alignment between them to some or all pieces. If there is coordination, we consider which phase of the respiratory cycle shows the highest degree of alignment and how this could relate to audience members’ experience of the music performed.

Method

Respiration data from two audiences were evaluated using new techniques in respiratory phase detection and measurement of coordination. In the first audience, 40 participants sat amongst a larger group in an experiment-led concert of chamber music that included three pieces of contrasting genres. The second audience was composed of 48 participants who were presented with solo flute music, some recorded and some played live. Half of this group continuously reported the unexpectedness of the music while the remaining half reported their felt emotional responses through handheld devices.

Five components of the respiratory phase were evaluated for coordination using activity analysis with parameters tuned to each: Inspiration Onset, High Inspiration Flow interval, Expiration Onset, High Expiration Flow interval, and Post-Expiration Pause. These phases relate to the mechanics of respiration and the sensory consequences of air exchange.

Results

Significant coordination in respiratory phase components was observed between audience members for most stimuli, but the most coordinated phases varied from piece to piece. High Inspiration and High Expiration Flow intervals were significantly coordinated more often than onsets. Post-Expiration Pauses, which would count instances of breath holding, were only coordinated in one piece. Fewer than half of the participants engaged in phase alignment concurrently; however, numerous instances relate well to developing theories of respiration/cognition interactions, including differences in the alignment patterns of participants per rating task.

Conclusions

Audiences engage in measurable collective respiratory coordination with live performances and recorded music through simultaneous inspirations and expirations. However, these behaviours are performed by only a subset of participants at a time. This inter-participant difference is consistent with the results of repeated-response experiments, in which only some participants have shown respiratory coordination with their own previous listenings. The fact that different phases of respiration showed coordination underlines the possibility that multiple mechanisms, such as embodied listening, attention, and hearing facilitation, may encourage adjustments in audience members’ respiratory sequences toward alignment.

References

Sato, T. G., Ohsuga, M., & Moriya, T. (2012). Increase in the timing coincidence of a respiration event induced by listening repeatedly to the same music track. Acoustical Science and Technology, 33(4), 255–261.

Materials

The slides are also available for download.
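As a rough illustration of the five phase components named in the Method section above, here is one way they might be pulled from a respiratory flow signal in Python. The thresholds and definitions are placeholders of my own, not the tuned parameters used in the study:

```python
import numpy as np

def phase_components(flow, high_frac=0.5, pause_frac=0.1):
    """Sketch: derive the five phase components from a respiratory flow
    signal (positive = air moving in). Thresholds are illustrative
    placeholders. Returns five boolean masks, one sample per frame."""
    insp_onset = np.append((flow[:-1] <= 0) & (flow[1:] > 0), False)  # Inspiration Onset
    exp_onset = np.append((flow[:-1] >= 0) & (flow[1:] < 0), False)   # Expiration Onset
    high_insp = flow > high_frac * flow.max()   # High Inspiration Flow interval
    high_exp = flow < high_frac * flow.min()    # High Expiration Flow interval
    # Post-Expiration Pause: near-zero flow during the expiratory half-cycle
    near_zero = np.abs(flow) < pause_frac * np.abs(flow).max()
    in_expiration = np.cumsum(exp_onset) > np.cumsum(insp_onset)
    pause = near_zero & in_expiration
    return insp_onset, exp_onset, high_insp, high_exp, pause

# Toy flow trace: a slow sinusoid standing in for breathing at 0.25 Hz
t = np.linspace(0, 60, 6000)
masks = phase_components(np.sin(2 * np.pi * 0.25 * t))
print([int(m.sum()) for m in masks])
```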

From Matlab to Python

After years of inexplicable failure, I’ve finally gotten numpy and scipy to play nice on my computer. (Anaconda finally installed properly once I moved to Yosemite; who knows what was breaking the system before…) So now I can finally start converting my MATLAB toolbox for Activity Analysis to open-source Python and, in the process, share analyses online with IPython notebooks. Who’s excited? Well, I am. I think it will make the calculations and inferences easier to follow, particularly for those who aren’t inclined to download a toolbox and run a demo on their own machines.

Music and coordinated experience in time: Back to Activity Analysis

There are two comically extreme positions on how music (or really any stimulus) affects observers. At one end is the position that all of our experiences are equivalent, dictated by the common signal; at the other, individual subjectivities make our impressions and reactions irreconcilable. In studying how people respond to music, it’s obvious that the reality lies somewhere in the middle: parts of our experience can match those of others, though differences and conflicts persist. I’ve spent years developing this thing called activity analysis to explore and grade the distance between absolute agreement and complete disarray in the responses measured across people sharing a common experience.

As people attend to a time-varying stimulus (like music), their experience develops moment by moment, with changes prompted by events in the action observed. What activity analysis offers is a means of exploring and statistically assessing how strongly the shared music coordinates these changes in response. So if we are tracking smiles in an audience during a concert, we can evaluate the probability that those smiles are prompted by specific moments in the performance, and from there have some expectation of how another audience may respond.
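Here is a toy version of that logic in Python (simplified for illustration, not the toolbox’s actual procedure): count smiles per frame to get the activity level, build a null distribution by circularly shifting each person’s responses, and flag the frames where the observed count exceeds what decoupled responses produce:

```python
import numpy as np

rng = np.random.default_rng(1)

# events[i, t] = True when participant i is smiling in frame t
events = rng.random((30, 400)) < 0.05
events[:, 150:156] |= rng.random((30, 6)) < 0.5  # a shared funny moment

activity = events.sum(axis=0)  # activity level: smiles counted per frame

# Null hypothesis: responses are uncoupled from the timeline. Circular
# shifts keep each person's event rate and rhythm but break alignment.
null_peaks = np.array([
    np.stack([np.roll(row, rng.integers(row.size))
              for row in events]).sum(axis=0).max()
    for _ in range(500)
])
threshold = np.quantile(null_peaks, 0.95)
print("frames with likely shared prompting:",
      np.flatnonzero(activity > threshold))
```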

If everyone agreed with each other, this would not be necessary, and if nothing were common between listeners’ experiences, it would not be possible. Instead, empirical data appear to wander in between, and with that variation comes the opportunity to study the factors nudging inter-response agreement one way or the other. We’ve seen extreme coherence, as in a crowd singing together at the top of their lungs in a stadium saturated with amplified sound, while polite but disoriented disengagement is a common response to someone else’s favourite music. We need to test the many theories on why so many different responses (and distributions of responses) arise from shared experiences, and Activity Analysis can help with that. Finally.

Here is hoping I can get back to sharing examples of what this approach to collections of continuous responses makes possible. The data and analyses have waited too long already.

New paper: Tension and local activity

I’ve got a new paper out with Mary Farbood (first author) about ratings of musical tension in response to an interesting example of the Romantic lied: Schubert’s Morgengruss. The link is not to the performance we worked with, which was the Pears and Britten recording, but I like this interpretation too.

My contribution is the comparison between verses: identifying significant moments of tension rating increases and decreases that differed between verses, and discussing how those differences might relate to the singer’s articulation, contrast between successive verses, and other factors often overlooked in continuous parametrizations of musical stimuli. Besides displaying some of what activity analysis can do numerically, it was also fun to put on my music theory hat to interpret what might be influencing listeners’ continuous ratings of tension.
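The first step of that analysis can be sketched roughly like this: continuous ratings are reduced to increase and decrease events, which activity analysis can then assess for coordination within and between verses. The threshold below is illustrative, not the value used in the paper:

```python
import numpy as np

def rating_change_events(ratings, threshold=0.02):
    """Toy conversion of continuous tension ratings (participants x
    frames, scaled 0-1) into increase and decrease events. The
    threshold is illustrative, not a parameter from the paper."""
    diffs = np.diff(ratings, axis=1)
    return diffs > threshold, diffs < -threshold

rng = np.random.default_rng(2)
ratings = np.clip(0.5 + np.cumsum(rng.normal(0, 0.02, (20, 300)), axis=1), 0, 1)
increases, decreases = rating_change_events(ratings)
# each event collection can now go through the coordination test
print(int(increases.sum(axis=0).max()), int(decreases.sum(axis=0).max()))
```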