

Personalised Media Broadcasting

Research in the context of the My-eDirector-2012 FP7 EU project

Under construction (Latest update May 26, 2011)

Nowadays there is a trend towards personalised delivery of multimedia streams, with applications in video-on-demand and personalised coverage of sports events [1]. A platform that empowers viewers to direct their own coverage of large athletics events should enable them to set up their own personal virtual director to manage camera coverage of an event and orchestrate event viewing according to their own preferences. For example, broadcasting of athletics tends to be biased towards track events [2]; personalised content delivery can change that for viewers who wish to focus on different content. Such a platform employs video processing to automatically extract video metadata, which are then used for reasoning about incidents of interest. Subsequently, viewers' preferences are matched to the detected incidents in order to issue recommendations on which camera to view. Finally, dynamic adaptation to the end-user environment, including terminal and network capabilities, is also important.
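The matching of viewers' preferences to detected incidents can be sketched as follows. This is an illustrative example only, not the project's actual implementation; the data layout, field names and scoring rule are assumptions.

```python
# Hypothetical sketch: match a viewer's sport preferences to the incidents
# detected by the reasoning system, and recommend the camera covering the
# best-scoring incident. All names and weights are illustrative.

def recommend_camera(incidents, preferences):
    """Score each detected incident by the viewer's preference weight for
    its sport, scaled by the detection confidence, and return the camera
    covering the highest-scoring incident."""
    best_camera = None
    best_score = float("-inf")
    for incident in incidents:
        # Sports absent from the preference list get weight 0.
        score = preferences.get(incident["sport"], 0.0) * incident["confidence"]
        if score > best_score:
            best_score = score
            best_camera = incident["camera"]
    return best_camera

incidents = [
    {"sport": "long_jump", "camera": "cam3", "confidence": 0.9},
    {"sport": "100m", "camera": "cam1", "confidence": 0.7},
]
preferences = {"long_jump": 1.0, "100m": 0.5}  # higher weight = more preferred
print(recommend_camera(incidents, preferences))  # → cam3
```

Here the long jump incident scores 1.0 × 0.9 = 0.9 versus 0.5 × 0.7 = 0.35 for the sprint, so the long jump camera is recommended.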

In the context of the FP7 EU project My-eDirector-2012, there are two types of camera proposals for personalised content delivery. In the case of between-sports camera proposals, the view is changed between two sports upon the occurrence of an incident in a sport ranked higher in the user's preferences than the one currently being delivered. In within-sport camera proposals, the camera is changed to offer a better view of the evolution of the sport, or of the incident of interest. Both camera proposal types are supported by accumulating metadata from processing the video streams and reasoning on incidents of interest to the viewers. The core of the reasoning system proposed in My-eDirector-2012 comprises a Knowledge Base:
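The between-sports switching rule described above can be sketched as a simple comparison of preference ranks; the sport names and rank values below are assumptions for illustration, not part of the actual system.

```python
# Hypothetical sketch of the between-sports camera-proposal rule: switch
# only when an incident occurs in a sport ranked higher in the viewer's
# preference list than the sport currently being delivered.

def should_switch(current_sport, incident_sport, preference_rank):
    """preference_rank maps sport -> rank, where 1 is most preferred.
    Unknown sports are treated as least preferred (infinite rank)."""
    return preference_rank.get(incident_sport, float("inf")) < \
           preference_rank.get(current_sport, float("inf"))

rank = {"pole_vault": 1, "long_jump": 2, "100m": 3}
print(should_switch("100m", "pole_vault", rank))       # → True: switch views
print(should_switch("pole_vault", "long_jump", rank))  # → False: keep current view
```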

KB reasoning in My-eDirector-2012

Automatic video metadata extraction includes tracking the way the cameras of the broadcaster move and tracking the athletes as they compete [3],[4]. Sound from sports venues is also used, since crowd excitement is an important cue for an incident of interest, but audio tracking is not attempted since there is no individual source to track. Details on the person tracking systems employed in My-eDirector-2012 can be found in the book Audio-Visual Person Tracking: A Practical Approach (Imperial College Press, London, UK, Summer 2011). The camera motion estimation is performed by projectively matching subsequent frames, as detailed in the accompanying video presentation.
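As a rough illustration of frame-to-frame camera motion estimation, the sketch below estimates a global translation between two frames via phase correlation. Note this is a simplified, translational stand-in for the projective (homography-based) matching actually used in the project, and the synthetic frames are made up for the example.

```python
# Minimal sketch: estimate the global shift between two consecutive frames
# using phase correlation. This recovers only translation, a simplification
# of the projective frame matching described in the text.

import numpy as np

def estimate_shift(frame_a, frame_b):
    """Estimate the integer (dy, dx) circular shift taking frame_a to frame_b."""
    cross = np.fft.fft2(frame_b) * np.conj(np.fft.fft2(frame_a))
    corr = np.fft.ifft2(cross / (np.abs(cross) + 1e-12)).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    h, w = frame_a.shape
    # Map peaks in the upper half-range to negative shifts.
    if dy > h // 2:
        dy -= h
    if dx > w // 2:
        dx -= w
    return int(dy), int(dx)

rng = np.random.default_rng(0)
frame_a = rng.random((64, 64))                          # synthetic "frame"
frame_b = np.roll(frame_a, shift=(3, -5), axis=(0, 1))  # camera shifted by (3, -5)
print(estimate_shift(frame_a, frame_b))  # → (3, -5)
```

For a circular shift the normalised cross-power spectrum reduces to a pure phase term, so its inverse FFT is a delta at the shift; real projective matching additionally recovers rotation, zoom and perspective.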

An example of incident detection in long jump, utilising face tracks from a frontal camera and camera shift from a profile camera [5], is shown next. The detected incidents are compared to the manual annotations provided by a sports information system, typically employed in large-scale sports events.

KB reasoning in My-eDirector-2012

Automatic detection of the phases in a long jump event: the vision-processing metrics used are the accumulated face-track likelihood (red) and the weighted average of camera shift (blue). The resulting incident reasoning about athlete presence, start of run and landing is depicted by green horizontal bars. Typical screenshots of the three phases are also overlaid in the figure. The manual annotations, from the appearance of the athlete to the measurement of the jump distance, are depicted in red. The ground truth, i.e. the actual moment the run begins, is marked by vertical dashed black lines.
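The phase reasoning depicted in the figure can be caricatured as thresholding the two metrics per frame. This is an illustrative sketch only; the actual system's thresholds, signals and reasoning rules are not published here, and all values below are invented.

```python
# Illustrative sketch (not the project's actual algorithm): label long-jump
# phases from two per-frame metrics, the accumulated face-track likelihood
# of a frontal camera and the camera shift of a profile camera.

def detect_phases(face_likelihood, camera_shift,
                  presence_thr=0.5, shift_thr=1.0):
    """Per frame: 'none' while no strong face track exists, 'present' once
    the athlete's face is tracked, 'run' when the profile camera also pans
    significantly to follow the athlete."""
    phases = []
    for likelihood, shift in zip(face_likelihood, camera_shift):
        if likelihood < presence_thr:
            phases.append("none")
        elif abs(shift) < shift_thr:
            phases.append("present")
        else:
            phases.append("run")
    return phases

face = [0.1, 0.7, 0.9, 0.9, 0.8]   # hypothetical face-track likelihoods
shift = [0.0, 0.2, 0.3, 2.5, 3.0]  # hypothetical profile-camera shifts
print(detect_phases(face, shift))  # → ['none', 'present', 'present', 'run', 'run']
```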


[1] Ch. Patrikakis, A. Pnevmatikakis, P. Chippendale, M. Nunes, R. Cruz, S. Poslad, Z. Wang, N. Papaoulakis and P. Papageorgiou, 'Direct Your Personal Coverage of Large Athletic Events', IEEE Multimedia, DOI 10.1109/MMUL.2010.69, November 2010.

[2] S. Poslad, A. Pnevmatikakis, M. Nunes, E. Garrido Ostermann, P. Chippendale, P. Brightwell and Ch. Patrikakis, 'Directing Your Own Live and Interactive Sports Channel', 10th International Workshop on Image Analysis for Multimedia Interactive Services (WIAMIS 2009), London, UK, May 2009.

[3] A. Pnevmatikakis, N. Katsarakis, P. Chippendale, Cl. Andreatta, S. Messelodi, C. Modena and F. Tobia, 'Tracking for Context Extraction in Athletic Events', International Workshop on Social, Adaptive and Personalized Multimedia Interaction and Access (SAPMIA 2010), ACM Multimedia, Florence, Italy, October 2010.

[4] P. Chippendale and A. Pnevmatikakis, 'Sports Indexing Through Camera and Content Understanding', 7th European Conference on Visual Media Production (CVMP 2010), London, UK, 17-18 November 2010.

[5] N. Katsarakis and A. Pnevmatikakis, 'Event Detection in Athletics for Personalized Sports Content Delivery', 10th International Workshop on Image Analysis for Multimedia Interactive Services (WIAMIS 2009), London, UK, May 2009.

Affiliated with Aalborg University-CTiF, Harvard Kennedy School of Government © ATHENS INFORMATION TECHNOLOGY