Published – yet another first: we scored the first article in the first issue of the new @multimedia journal by MDPI. It has been online for a month now, but yikes, it was a busy month :)
Recordings of Metaverse (XR) experiences can be made entirely within the technical space of a computer. As part of our research, we explored the opportunities this recordable data brings to Multimedia Information Retrieval.
Thanks again to my co-authors Prof. Dr.-Ing. Stefan Wagenpfeil, Ingo Frommholz, and Matthias Hemmje for your support.
Metaverse Recordings (MVRs), screen recordings of user experiences in virtual environments, represent a mostly underexplored field. This article addresses the integration of MVR and Multimedia Information Retrieval (MMIR). Unlike conventional media, MVRs can include additional streams of structured data, such as Scene Raw Data (SRD) and Peripheral Data (PD), which capture graphical rendering states and user interactions. We explore the technical facets of recording in the Metaverse, detailing diverse methodologies and their implications for MVR-specific Multimedia Information Retrieval. Our discussion not only highlights the unique opportunities of MVR content analysis but also examines the challenges MVRs pose to conventional MMIR paradigms. We address two key challenges: the semantic gap that arises when existing content analysis tools are applied to MVRs, and the high computational cost and limited recall of video-based feature extraction. We present a model for MVR structure, a prototype recording system, and an evaluation framework to assess retrieval performance. We collected a set of 111 MVRs to study and evaluate these intricacies. Our findings show that SRD and PD provide significant, low-cost contributions to retrieval accuracy and scalability, and support the case for integrating structured interaction data into future MMIR architectures.
DOI: 10.3390/multimedia1010002