Article | Selected Papers from the CLARIN 2014 Conference, October 24-25, 2014, Soesterberg, The Netherlands | Sharing Multimodal Data: A Novel Metadata Session Profile for Multimodal Corpora

Title:
Sharing Multimodal Data: A Novel Metadata Session Profile for Multimodal Corpora
Author:
Farina Freigang: Faculty of Technology, Center of Excellence “Cognitive Interaction Technology” (CITEC), Bielefeld University, Bielefeld, Germany
Matthias A. Priesters: Faculty of Technology, Center of Excellence “Cognitive Interaction Technology” (CITEC), Bielefeld University, Bielefeld, Germany / Human Technology Centre (HumTec), Natural Media Lab, RWTH Aachen University, Aachen, Germany
Rie Nishio: Institute of German Sign Language and Communication of the Deaf, University of Hamburg, Hamburg, Germany
Kirsten Bergmann: Faculty of Technology, Center of Excellence “Cognitive Interaction Technology” (CITEC), Bielefeld University, Bielefeld, Germany
Download:
Full text (pdf)
Year:
2015
Conference:
Selected Papers from the CLARIN 2014 Conference, October 24-25, 2014, Soesterberg, The Netherlands
Issue:
116
Article no.:
003
Pages:
25-35
No. of pages:
11
Publication type:
Abstract and Fulltext
Published:
2015-08-26
ISBN:
978-91-7685-954-4
Series:
Linköping Electronic Conference Proceedings
ISSN (print):
1650-3686
ISSN (online):
1650-3740
Publisher:
Linköping University Electronic Press, Linköpings universitet



In the natural sciences and the humanities, scientific data management, in particular the categorisation of data and the publication of (meta)data, is becoming ever more relevant. A new focus in corpus-based research is multimodal data. However, metadata profiles for multimodal data are rare and do not meet the needs of researchers searching for particular data. In this paper, we present a novel metadata session profile for describing data collections that contain modalities beyond text and speech. The profile is based on experience gained during the work on three different corpora comprising communicative speech-gestural behaviour as well as sign language data. The profile is aimed at creating metadata for individual recording sessions and is technically implemented in the CMDI format. Furthermore, it is designed to be paired with an existing profile for media corpora, which was extended for multimodal data.
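CMDI records are component-based XML documents, so a session-level profile of the kind the abstract describes amounts to a structured XML record per recording session. As a rough illustration only, the following Python sketch assembles a minimal CMDI-style session record with the standard-library ElementTree module; the element names (`Session`, `Modalities`, and so on) and the example values are illustrative placeholders, not the actual components of the profile presented in the paper.

```python
import xml.etree.ElementTree as ET

def build_session_record(session_id, date, modalities, languages):
    """Build a minimal CMDI-style session record.

    Element names and values are illustrative placeholders,
    not the actual components of the published profile.
    """
    root = ET.Element("CMD", {"CMDVersion": "1.2"})
    components = ET.SubElement(root, "Components")
    session = ET.SubElement(components, "Session")
    ET.SubElement(session, "SessionId").text = session_id
    ET.SubElement(session, "RecordingDate").text = date
    mods = ET.SubElement(session, "Modalities")
    for modality in modalities:
        ET.SubElement(mods, "Modality").text = modality
    langs = ET.SubElement(session, "Languages")
    for language in languages:
        ET.SubElement(langs, "Language").text = language
    return root

# Hypothetical example session: speech-gestural data in German.
record = build_session_record(
    "session-001", "2014-05-12",
    ["speech", "gesture"], ["deu"],
)
xml_string = ET.tostring(record, encoding="unicode")
print(xml_string)
```

In a real CMDI deployment the record would additionally carry a header pointing to the registered profile definition and would validate against a profile-specific XML schema; this sketch only shows the session-per-record structure.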

Keywords: metadata profile; multimodal data; multimodal corpora; gesture; sign language; CMDI; ISOcat; CLARIN

References:

Paul Boersma and David Weenink. 2001. Praat, a system for doing phonetics by computer. Glot International, 5(9–10):341–345.

Daan Broeder, Thierry Declerck, Erhard Hinrichs, Stelios Piperidis, Laurent Romary, Nicoletta Calzolari, and Peter Wittenburg. 2008. Foundation of a component-based flexible registry for language resources and technology. In Proceedings of LREC 2008, Sixth International Conference on Language Resources and Evaluation, pages 1433–1436.

Daan Broeder, Menzo Windhouwer, Dieter van Uytvanck, Twan Goosen, and Thorsten Trippel. 2012. CMDI: a Component Metadata Infrastructure. In Proceedings of the workshop on Describing LRs with Metadata (LREC 2012).

Onno Crasborn and Thomas Hanke. 2003a. Additions to the IMDI metadata set for sign language corpora. Agreements at an ECHO workshop, May 8–9, 2003, Radboud University, Nijmegen. http://www.ru.nl/publish/pages/522090/signmetadata_oct2003.pdf.

Onno Crasborn and Thomas Hanke. 2003b. Metadata for sign language corpora. Background document for an ECHO workshop, May 8–9, 2003, Radboud University, Nijmegen. http://sign-lang.ruhosting.nl/echo/docs/ECHO_Metadata_SL.pdf.

Onno Crasborn and Menzo Windhouwer. 2012. ISOcat data categories for signed language resources. In Eleni Efthimiou, Georgios Kouroupetroglou, and Stavroula-Evita Fotinea, editors, Gestures in embodied communication and human-computer interaction, pages 118–128. Springer.

Folkert de Vriend, Daan Broeder, Griet Depoorter, Laura van Eerten, and Dieter van Uytvanck. 2013. Creating & testing CLARIN metadata components. Language Resources and Evaluation, 47(4):1315–1326.

Farina Freigang and Kirsten Bergmann. 2013. Towards metadata descriptions for multimodal corpora of natural communication data. In Proceedings of the workshop on Multimodal Corpora: Beyond Audio and Video, (IVA 2013).

Maria Gavrilidou, Penny Labropoulou, Elina Desipri, Stelios Piperidis, Haris Papageorgiou, Monica Monachini, Francesca Frontini, Thierry Declerck, Gil Francopoulo, Victoria Arranz, and Valerie Mapelli. 2012. The META-SHARE metadata schema for the description of language resources. In Proceedings of LREC 2012, Eighth International Conference on Language Resources and Evaluation.

Thomas Hanke, Lutz König, Sven Wagner, and Silke Matthes. 2010. DGS Corpus & Dicta-Sign: The Hamburg studio setup. In Proceedings of the 4th Workshop on the Representation and Processing of Sign Languages: Corpora and Sign Language Technologies (LREC 2010), pages 106–110.

Julius Hassemer. 2015. Towards a theory of Gesture Form Analysis: Principles of gesture conceptualisation, with empirical support from motion-capture data. Ph.D. thesis, RWTH Aachen University.

Michael Kipp. 2001. Anvil – A generic annotation tool for multimodal dialogue. In Proceedings of the 7th European Conference on Speech Communication and Technology (Eurospeech), pages 1367–1370.

Andy Lücking, Kirsten Bergmann, Florian Hahn, Stefan Kopp, and Hannes Rieser. 2013. Data-based analysis of speech and gesture: The Bielefeld Speech and Gesture Alignment Corpus (SaGA) and its applications. Journal on Multimodal User Interfaces, 7(1–2):5–18.

Silke Matthes, Thomas Hanke, Anja Regen, Jakob Storz, Satu Worseck, Eleni Efthimiou, Athanasia-Lida Dimou, Annelies Braffort, John Glauert, and Eva Safar. 2012. Dicta-Sign – Building a Multilingual Sign Language Corpus. In Proceedings of the 5th Workshop on the Representation and Processing of Sign Languages: Interactions between Corpus and Lexicon (LREC 2012).

David McNeill. 1992. Hand and mind: What gestures reveal about thought. University of Chicago Press, Chicago and London.

Thomas Schmidt. 2002. Exmaralda – ein System zur Diskurstranskription auf dem Computer. Arbeiten zur Mehrsprachigkeit, Serie B (34). Hamburg: SFB Mehrsprachigkeit.

Peter Withers. 2012. Metadata Management with Arbil. In Proceedings of the workshop on Describing LRs with Metadata (LREC 2012).

Peter Wittenburg, Hennie Brugman, Albert Russel, Alex Klassmann, and Han Sloetjes. 2006. ELAN: a professional framework for multimodality research. In Proceedings of LREC 2006, Fifth International Conference on Language Resources and Evaluation.


