
Multimodal Analyses Overview

In AMI, signal-processing and machine-learning algorithms are being developed to detect patterns in the multimodal data generated within a meeting.

Meeting Agenda

Patterns may be used to track a meeting's progress in relation to its agenda. Agenda detection may be based on verbal input, using word frequency to identify (see the sketch after this list):

  • that a key concept has been raised,
  • that an agenda item has been concluded, along with the action items associated with it,
  • when there is discussion (as opposed to someone addressing the group with information), and
  • that there is disagreement.
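
As a rough illustration of word-frequency-based agenda detection, the Python sketch below slides a window over transcript utterances and reports which agenda item, if any, dominates each window. The keyword lists, window size, and threshold are illustrative assumptions, not part of the AMI system.

    from collections import Counter

    # Hypothetical agenda-item keyword lists; in practice these might come
    # from the written agenda or from learned topic models.
    AGENDA_KEYWORDS = {
        "remote control design": {"remote", "control", "buttons", "design"},
        "budget": {"cost", "budget", "price", "euros"},
    }

    def detect_agenda_item(utterances, window=3, threshold=2):
        """Slide a window over the transcript and report which agenda item,
        if any, dominates each window according to keyword frequency."""
        results = []
        for start in range(0, len(utterances), window):
            text = " ".join(utterances[start:start + window]).lower()
            counts = Counter(text.split())
            scores = {item: sum(counts[w] for w in words)
                      for item, words in AGENDA_KEYWORDS.items()}
            best_item, best_score = max(scores.items(), key=lambda kv: kv[1])
            results.append(best_item if best_score >= threshold else None)
        return results

    # Toy transcript of six utterances.
    transcript = [
        "let's talk about the remote control design",
        "the buttons should be larger",
        "I agree, larger buttons on the remote",
        "okay, moving on to the budget",
        "the price must stay under twelve euros",
        "that cost seems reasonable",
    ]
    print(detect_agenda_item(transcript))  # ['remote control design', 'budget']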

Meeting Participant Roles

Patterns may also be based on non-verbal communication such as gestures and body language. The integration of verbal and non-verbal cues can be used to recognise meeting dialog acts and to estimate participant influence levels.
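
The heuristic sketch below illustrates one simple way verbal and non-verbal cues could be fused into a per-participant influence estimate. The feature set and weights are assumptions made for illustration only; they are not the AMI feature definitions or models.

    from dataclasses import dataclass

    @dataclass
    class ParticipantFeatures:
        """Per-participant features over a meeting segment (illustrative only)."""
        speaking_time_s: float   # verbal: total time holding the floor, in seconds
        interruptions: int       # verbal: times the participant took the turn mid-utterance
        gestures: int            # non-verbal: hand/arm gestures detected
        times_looked_at: int     # non-verbal: how often others' gaze targeted this participant

    def influence_score(p: ParticipantFeatures) -> float:
        """Combine verbal and non-verbal cues into a single influence estimate.
        The weights are arbitrary placeholders, not learned values."""
        return (0.4 * p.speaking_time_s / 60.0   # minutes of floor time
                + 0.3 * p.interruptions
                + 0.1 * p.gestures
                + 0.2 * p.times_looked_at)

    participants = {
        "PM": ParticipantFeatures(420.0, 6, 15, 40),
        "UI": ParticipantFeatures(180.0, 1, 22, 12),
    }
    ranked = sorted(participants, key=lambda name: influence_score(participants[name]),
                    reverse=True)
    print(ranked)  # participants ordered from most to least influential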

Summaries

Patterns are also used to produce condensed versions of a meeting. In this category of demonstrations we illustrate several different approaches to summarization.
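
One common family of approaches is extractive summarization, which keeps the most informative utterances verbatim. The sketch below scores utterances by the total inverse-document-frequency weight of their words, so short backchannels such as "yeah" score low; it is an illustrative baseline, not one of the AMI summarizers.

    import math
    from collections import Counter

    def extractive_summary(utterances, k=3):
        """Keep the k utterances with the highest total inverse-document-frequency
        weight, returned in their original order."""
        docs = [u.lower().split() for u in utterances]
        n = len(docs)
        df = Counter(w for doc in docs for w in set(doc))  # document frequency per word
        def score(doc):
            return sum(math.log(n / df[w]) for w in doc)
        top = sorted(range(n), key=lambda i: score(docs[i]), reverse=True)[:k]
        return [utterances[i] for i in sorted(top)]

    meeting = [
        "okay let's get started",
        "the remote control needs a recognisable shape and bright colours",
        "yeah",
        "the battery should be rechargeable to keep long-term costs down",
        "uh-huh",
        "we agreed on a curved case with rubber buttons for the final design",
    ]
    print(extractive_summary(meeting, k=3))  # the three content-bearing utterances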

The Role of the AMI Meeting Corpus

At this time, all demonstrations and research are based on the AMI Meeting Corpus; in the future, meeting content from other sources, including real-time sources, will be supported.

 