CREATING AN INTERACTIVE MUSICAL EXPERIENCE FOR A CONCERT HALL

Authors

  • Svitlana Antoshchuk
  • Mykyta Kovalenko
  • Jürgen Sieck

DOI:

https://doi.org/10.47839/ijc.17.3.1034

Keywords:

gesture recognition, pose estimation, computer vision, machine learning, Augmented Reality, virtual instrument.

Abstract

In this paper we introduce a Kinect-based posture recognition approach that classifies the user's pose and gesture and matches them to a set of predefined musical instruments. The efficiency of the approach is then demonstrated with two applications. The Virtual Orchestra system combines pose and gesture recognition with Augmented Reality technology to add a virtual musical instrument to the scene, both visually and audibly: the visual representation of the instrument is placed into the user's hands and the sound of the corresponding instrument is played. In addition, the user can control the intensity and the pitch of the sound by changing the speed of their hand or finger movements. The Magic Mirror game, the second application, was developed for the Berlin Concert Hall; it uses the posture recognition approach to introduce visitors to classical music pieces and familiarize them with various classical musical instruments.
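
To make the described behaviour concrete, the sketch below shows one possible realization of the two mechanisms the abstract mentions: matching an observed body pose to a predefined instrument, and mapping hand speed to sound intensity and pitch. This is an illustration only, not the authors' implementation: the joint set, the template coordinates, and the mapping constants are all invented, and a simple 1-nearest-neighbour matcher stands in for whatever classifier the paper actually uses.

    # Illustrative sketch only (not the paper's code): a 1-nearest-neighbour
    # pose matcher over hand positions, plus a speed-to-sound mapping. The
    # joint set, template coordinates and constants below are assumptions.
    import math

    # Per-instrument "playing posture" templates: hand positions relative to
    # the torso centre, in metres (values invented for illustration).
    TEMPLATES = {
        "violin": {"left_hand": (-0.35, 0.25, 0.30), "right_hand": (0.20, 0.10, 0.35)},
        "flute":  {"left_hand": (-0.20, 0.15, 0.30), "right_hand": (0.25, 0.15, 0.30)},
        "drums":  {"left_hand": (-0.30, -0.10, 0.40), "right_hand": (0.30, -0.10, 0.40)},
    }

    def classify_pose(joints):
        """Pick the instrument whose template hands are closest (Euclidean)
        to the observed hands: a plain 1-nearest-neighbour decision."""
        def score(name):
            template = TEMPLATES[name]
            return sum(math.dist(joints[j], template[j]) for j in template)
        return min(TEMPLATES, key=score)

    def speed_to_sound(prev_pos, cur_pos, dt, base_pitch_hz=440.0):
        """Map hand speed (m/s) to a volume in [0, 1] and a pitch in Hz, so
        faster movement plays louder and slightly higher (constants assumed)."""
        speed = math.dist(prev_pos, cur_pos) / dt
        volume = min(1.0, speed / 2.0)                   # saturates at 2 m/s
        pitch_hz = base_pitch_hz * (1.0 + 0.1 * volume)  # up to +10 %
        return volume, pitch_hz

    # Example: hands raised roughly into the violin template, then one
    # 0.05 m hand movement between two frames at 30 fps.
    observed = {"left_hand": (-0.33, 0.24, 0.31), "right_hand": (0.21, 0.12, 0.33)}
    print(classify_pose(observed))                          # -> "violin"
    print(speed_to_sound((0.0, 0.0, 0.0), (0.05, 0.0, 0.0), dt=1 / 30))

In a real system, a distance threshold on the winning score would let the matcher report "no instrument" for unrelated poses and keep it from over-triggering.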

Published

2018-09-30

How to Cite

Antoshchuk, S., Kovalenko, M., & Sieck, J. (2018). CREATING AN INTERACTIVE MUSICAL EXPERIENCE FOR A CONCERT HALL. International Journal of Computing, 17(3), 143-152. https://doi.org/10.47839/ijc.17.3.1034

Issue

Vol. 17 No. 3 (2018)

Section

Articles