Augmented Reality Authoring for Cultural Heritage Sites using the Sitsim AR Editor
Gunnar Liestøl, University of Oslo, Norway, Tomas Stenarson, CodeGrind AB, Sweden
Abstract
As Augmented Reality (AR) has become more popular in museum mediation and storytelling on Cultural Heritage (CH) sites, the need for easy-to-use tools to develop applications has become evident. This paper provides a background to the how-to session introducing the Sitsim AR Editor, an editor for putting together applications in the form of situated simulations (indirect augmented reality). The Sitsim AR Editor is an add-on to the Unity 3-D engine, a game developer platform often used for non-game purposes, and is based on ten years of experience with app development for museums and CH sites, including Forum Romanum in Rome, the Acropolis in Athens, Ancient Phalasarna on Crete, Viking settlements and burial sites in Norway, the baroque Old Town of Narva in Estonia, and the D-Day landings site on Omaha Beach in Normandy. The how-to session is organized as follows: 1. Introduction to the Sitsim AR platform as a storytelling device for museums and CH sites. 2. Introduction to the Sitsim AR Editor, including how to import, position, scale and orient 3-D terrains and objects, create content links and build an application. The Sitsim AR Editor is currently macOS-based.

Keywords: Sitsim AR Editor, Augmented Reality, sitsim, situated simulations, storytelling, cultural heritage
Introduction
Over the past decade, Augmented Reality (AR) applications on smartphones and tablets have become more and more popular, both in museums and at cultural heritage sites. Many museums and sites have allocated funds and developed apps related to their exhibitions and collections, and many more would like to do so. However, designing and producing good AR applications is not a straightforward enterprise: the rhetoric and grammar of AR communication and storytelling are still in their infancy. Designs must be customized and adapted to each museum’s and site’s individual needs, as well as to the special characteristics and requirements of the topics and subject matters they curate and convey. As a consequence, it is not always a good solution to outsource application development to specialized tech or digital design companies. On the other hand, it is often demanding to build the necessary competencies and skills in-house for any museum or cultural heritage institution, especially those with limited resources. Independent of funding and existing skills and competencies, there is always a need for software tools that are easy to use and ready to meet the requirements of the task in question.
In this paper, we will present the Sitsim AR Editor (Situated Simulation Augmented Reality Editor), a Unity-based tool to help create compelling applications for use at outdoor museums and sites. (Unity is a generic game developer environment frequently used for non-gaming applications and contexts.) We will describe the editor’s history, how it differs from other available editing tools, and its basic qualities. Further, we will clarify what it can and cannot do, how a typical workflow is organized, how the Sitsim AR Editor is composed as a Unity add-on, and its technical requirements. Finally, we will present possible future features that could be included in the editor, such as indoor positioning.
Figure 1: A typical scene that can be compiled with the Sitsim AR Editor: the AR situated simulation (sitsim) from Ancient Phalasarna on Crete. The visitor is positioned at the archaeological site of Phalasarna in the middle of its ancient harbour, which today lies 6.5 m above sea level due to an earthquake and land rise in 365 AD. The application displays the reconstructed quay and fortified harbour as it may have looked in 333 BC, with moored trireme vessels. In the lower left part of the screen, one can see a spatially positioned hypertext link (the grey sign ‘Quay’), which when activated gives access to background information based on the archaeological excavations and other sources. Note the indirect Augmented Reality approach, where the digitally reconstructed environment takes up the whole screen (Photo: G. Liestøl).
Tools for Creating Augmented Reality Applications
As AR applications have increased in popularity, so has the availability of editors and development kits. With the general availability and dissemination of smartphones and tablets over the past decade, we have also witnessed the introduction of a variety of AR editors that do not require advanced skills to operate. Solutions like Layar and Wikitude offered easy-to-use editing capabilities early on. Today, there is a variety of editors available, which the Cultural Heritage and Museum sectors may take advantage of. Several presentations and evaluations of various tools are available (Ding, 2017; Herpich et al., 2017).
However, there are obvious limitations and problems related to most of these solutions when it comes to the development of applications for use in museums and cultural heritage environments. Most AR editors are aimed at developing mixed reality (MR) applications, where the digital information and 3-D graphics exist as a layer on top of a live video feed from the device’s camera, which displays the immediate environment on the screen.
This mixing of video and graphics produces a profound compatibility problem: the video is 2-D, while the digital graphics in this context are 3-D. As a consequence, visual paradoxes emerge. Let us say a visitor is using an application in a museum and points the AR-enabled device at an object in the exhibition, for example a white marble statue, and overlays it with its original color painting—a frequent solution experimented with in many museums. If another visitor now walks by, between the device and the object, this person will be spatially represented between the video representation of the statue and its colored coating, a situation that obviously reduces the value of the experience one initially intended to improve. Although this problem can be partly compensated for, the two levels of representation can never be fully integrated with current solutions (MacIntyre et al., 2001).
As a consequence, we have looked to, and practiced, another form of mobile AR, known in general as Indirect Augmented Reality (Wither et al., 2011), where there is no mixing of real-time video with a 3-D graphics layer; instead, the full screen shows the digitally reconstructed environment (see figure 1). The mixing of the real and the virtual then takes place between the full screen of the device and the physical surroundings themselves. The Sitsim AR Editor is thus a tool for compiling Indirect Augmented Reality applications.
The Sitsim Platform
Since 2008, we have, at the University of Oslo (Norway) and in collaboration with CodeGrind AB (Sweden) and Tag of Joy (Lithuania), developed, tested, assessed and continually improved an AR publication platform based on the Unity 3-D engine for the design and production of situated simulations. Over the years, we have created prototypes and published applications (both iOS and Android) in various settings: Mission Dolores in San Francisco, Forum Romanum and Via Appia Antica in Rome, the Acropolis in Athens, Ancient Phalasarna on Crete, Viking settlements and burial sites in Norway, the baroque Old Town of Narva in Estonia, and the D-Day landings site on Omaha Beach in Normandy, France (Liestøl et al., 2011; Liestøl, 2018). We have also explored location-based simulations of the future, such as the new National Museum in Oslo, vegetation change and global warming on the west coast of Norway, a dystopian view of climate change in the year 2222, and land rise relative to Stone Age settlements in the south of Norway (Liestøl & Morrison, 2015; Bjørkli et al., 2018).
These projects have been presented at a variety of conferences in the Americas, Asia and Europe. When sharing the results of our research and development, we often receive the same question: can we get access to the platform or the code you are using? We have considered this request for a long time, and we have finally had the opportunity to do something about it. As part of the EU-Interreg project CINE (Connected Culture and Natural Heritage in a Northern Environment), we have had the chance to prepare an easy-to-use editor for creating AR apps based on the situated simulation platform. The Sitsim AR Editor is created as a customized add-on to the Unity cross-platform game engine.
Figure 2: From a situated simulation about the ancient Panathenaia procession on the Acropolis Hill in Athens. To the left, the sequence displaying the Panathenaia procession has been paused and the available hypertext links are visible. In the photo to the right, a link about the kithara, a musical instrument, has been activated and displays a PDF with information about the instrument in various modes (text, image and audio).
What is the Sitsim AR Editor?
The Sitsim AR Editor consists of two main parts:
- Frameworks and code that provide the foundation for a Sitsim application
- Unity 3-D add-on functionality to guide the user through the process of creating a Sitsim application
The frameworks contain functionality for accessing the device’s sensors (GPS, gyroscope, magnetometer, etc.) in order to align the position and orientation of a virtual camera with those of the real device. They also provide a basic and extendable UI for accessing both standard Sitsim features like “snapshot” and “bird’s view” and features implemented by the user. The concept of spatially distributed hypertext links (balloon links) with access to different types of information, documents, and events is also a central part of the frameworks’ functionality.
The Unity 3-D add-on functionality is basically a wizard workflow that guides the user through the process of configuring the Sitsim application, preparing and adding terrain, adding balloons with content, and building the final Sitsim application.
Basic Features of the Sitsim AR Editor
The first and foremost feature of the Sitsim framework is to read the device’s sensors (GPS, gyroscope, accelerometer and magnetometer) and perform the necessary calculations to determine how the player/“avatar” is moving and how its head is oriented in the simulated environment, in parallel with the real surroundings. The Sitsim AR Editor simplifies the configuration of the parameters that control these calculations.
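To make the general approach concrete, the following is a minimal, simplified Unity sketch of this kind of sensor-driven alignment. It is not the Sitsim framework’s code: the class name, the reference coordinates and the flat-earth conversion constants are assumptions made for illustration only.

```csharp
// Illustrative sketch (not the Sitsim implementation): GPS readings are converted into
// local movement in metres around a reference point, and the gyroscope attitude drives
// the virtual camera's orientation.
using UnityEngine;

public class SensorAlignmentSketch : MonoBehaviour
{
    public Camera virtualCamera;
    public double originLatitude = 35.511;    // example reference point of the site
    public double originLongitude = 23.566;

    void Start()
    {
        Input.gyro.enabled = true;
        Input.location.Start();               // requires location permission on the device
    }

    void Update()
    {
        if (Input.location.status == LocationServiceStatus.Running)
        {
            LocationInfo fix = Input.location.lastData;
            // Approximate conversion from degrees to metres around the reference point.
            double metersPerDegLat = 111320.0;
            double metersPerDegLon = 111320.0 * System.Math.Cos(originLatitude * System.Math.PI / 180.0);
            float north = (float)((fix.latitude - originLatitude) * metersPerDegLat);
            float east = (float)((fix.longitude - originLongitude) * metersPerDegLon);
            transform.position = new Vector3(east, 0f, north);   // move the player/"avatar"
        }

        // Common mapping of the device attitude into Unity's left-handed coordinate system.
        Quaternion q = Input.gyro.attitude;
        virtualCamera.transform.localRotation =
            Quaternion.Euler(90f, 0f, 0f) * new Quaternion(q.x, q.y, -q.z, -q.w);
    }
}
```

A real implementation would also have to smooth the noisy GPS signal and use the magnetometer to anchor the heading to true north, which is presumably part of what the editor’s configurable parameters address.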
The Sitsim AR Editor also guides the user through the process of adding a terrain to the simulated environment, having it resized, positioned, and oriented correctly to align with the real world.
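The essence of that alignment step can be sketched as a small Unity component. The field names and example values below are placeholders for illustration, not the editor’s actual parameters:

```csharp
// Hypothetical illustration: aligning an imported terrain mesh with the real world
// by scaling, rotating and positioning it relative to a reference point.
using UnityEngine;

public class TerrainAlignmentSketch : MonoBehaviour
{
    public float realWorldSizeMeters = 500f;    // measured extent of the real site
    public float modelSizeUnits = 50f;          // extent of the imported mesh in Unity units
    public float bearingDegrees = 12f;          // rotation so the model faces true north
    public Vector3 originOffset = Vector3.zero; // offset from the scene's reference point

    void Start()
    {
        float scale = realWorldSizeMeters / modelSizeUnits;  // 1 Unity unit = 1 metre after scaling
        transform.localScale = Vector3.one * scale;
        transform.rotation = Quaternion.Euler(0f, bearingDegrees, 0f);
        transform.position = originOffset;
    }
}
```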
Through the Balloon Editor, which is a part of the Sitsim AR Editor, the user can easily add, edit, and remove spatially positioned hypertext links. The current implementation of the Sitsim AR Editor only supports links to detailed 3-D objects and Web pages, but the Sitsim framework supports additional types of targets, and the editor will support all of them with future updates.
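A balloon link can be thought of as a small data structure combining a world position, a label, and a target. The sketch below is a hypothetical illustration of that idea, covering the two target types the editor currently supports; it is not the Sitsim framework’s actual data model:

```csharp
// Hypothetical sketch of a spatially positioned hypertext link ("balloon").
using UnityEngine;

public enum BalloonTargetType { WebPage, DetailedObject }

[System.Serializable]
public class BalloonLink
{
    public string label = "Quay";                         // text shown on the balloon sign
    public Vector3 worldPosition;                         // position in the simulated terrain
    public BalloonTargetType targetType = BalloonTargetType.WebPage;
    public string url;                                    // used when targetType is WebPage
    public GameObject detailedObject;                     // used when targetType is DetailedObject

    public void Activate()
    {
        if (targetType == BalloonTargetType.WebPage)
            Application.OpenURL(url);                     // open background information
        else if (detailedObject != null)
            detailedObject.SetActive(true);               // reveal the detailed 3-D object
    }
}
```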
In addition to controlling the player/“avatar” and having balloon links, the final Sitsim application has the following features out of the box:
Snapshot
Want to share your experience of indirect AR with others? Use Snapshot to take a photo of the real environment and overlay a snapshot of the simulated environment. The resulting image can be shared over several social networks as well as saved to the camera roll.
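Capturing the simulated half of such a snapshot could look roughly like the sketch below; compositing it with the device-camera photo and sharing it to social networks would go through platform-specific plugins. The class name and file path are illustrative assumptions, not the Sitsim code:

```csharp
// Minimal sketch: capture the rendered simulated view so it can later be composed
// with a photo of the real environment and shared.
using System.Collections;
using UnityEngine;

public class SnapshotSketch : MonoBehaviour
{
    // Call with StartCoroutine(CaptureSimulatedView()) after the user taps the Snapshot button.
    public IEnumerator CaptureSimulatedView()
    {
        yield return new WaitForEndOfFrame();                   // wait until rendering is finished
        Texture2D simulated = ScreenCapture.CaptureScreenshotAsTexture();
        byte[] png = simulated.EncodeToPNG();                   // encode for saving/sharing
        string path = System.IO.Path.Combine(Application.persistentDataPath, "sitsim_snapshot.png");
        System.IO.File.WriteAllBytes(path, png);                // camera-roll export and social
                                                                // sharing require a native plugin
    }
}
```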
Camera Zoom
The real world often contains obstacles that make it impossible to get close to certain objects and areas; Camera Zoom works like a pair of binoculars in the simulated environment.
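A binocular-style zoom is typically achieved by narrowing the virtual camera’s field of view, as in this illustrative sketch (parameter names and values are assumptions, not the Sitsim implementation):

```csharp
// Illustrative sketch: zooming by interpolating the virtual camera's field of view.
using UnityEngine;

public class CameraZoomSketch : MonoBehaviour
{
    public Camera virtualCamera;
    public float normalFov = 60f;   // default vertical field of view in degrees
    public float zoomedFov = 15f;   // narrower field of view = higher magnification

    // zoomLevel ranges from 0 (no zoom) to 1 (full zoom)
    public void SetZoom(float zoomLevel)
    {
        virtualCamera.fieldOfView = Mathf.Lerp(normalFov, zoomedFov, Mathf.Clamp01(zoomLevel));
    }
}
```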
Bird’s View
Depending on the layout of the simulated environment, it can be hard for a user to get an overview of the area. With Bird’s View, the virtual camera is sent straight up from the player’s/“avatar’s” position to an altitude controlled by the user.
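Conceptually, this amounts to translating the virtual camera upwards from the avatar and pointing it down, as in the following sketch (again an illustration with assumed names, not the Sitsim code):

```csharp
// Illustrative sketch: Bird's View sends the virtual camera straight up from the
// avatar's position to a user-controlled altitude, looking down at the site.
using UnityEngine;

public class BirdsViewSketch : MonoBehaviour
{
    public Transform virtualCamera;
    public Transform avatar;
    public float altitude = 100f;   // metres above the avatar, controlled by the user

    public void EnterBirdsView()
    {
        virtualCamera.position = avatar.position + Vector3.up * altitude;
        virtualCamera.rotation = Quaternion.LookRotation(Vector3.down, avatar.forward);
    }

    public void ExitBirdsView()
    {
        virtualCamera.position = avatar.position;   // return to the ground-level view
        virtualCamera.rotation = avatar.rotation;
    }
}
```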
Future updates
The Sitsim framework contains many features that are not yet supported by the Sitsim AR Editor; these will be added in future updates.
Additional target types for balloons:
- Audio clips
- Simple text
- PDFs
- Scripts/behaviours implemented by the user

Localisation:
- Support for multiple languages (in the resulting Sitsim application)
As the Sitsim framework evolves, new features will be supported by the Sitsim AR Editor:
- Indoor positioning
- Multiple platforms (e.g., Android)
Conclusion
We hope that the Sitsim AR Editor will benefit digital designers and curators whenever they wish to augment museums, cultural heritage sites, or any location where alternative versions and simulations of the current environment are wanted. The Sitsim AR Editor gives users a running start in augmenting sites with storytelling by creating Sitsim applications. With the advanced feature set of Unity 3-D at hand, there are almost no limits to what can be built on top of the Sitsim foundation.
References
Bjørkli, B., Ledas, Š., Liestøl, G., Stenarson, T., Uleberg, E. (2018). “Archaeology and Augmented Reality. Visualizing Stone Age Sea Level on Location,” Oceans of Data. Proceedings of the 44th Conference on Computer Applications and Quantitative Methods in Archaeology. Oxford: Archaeopress Publishing Ltd., 2018, 367-377.
Ding, M. (2017). Augmented Reality in Museums. Heinz College, Carnegie Mellon University (Arts Management & Technology Laboratory). Consulted January 14, 2019.
Herpich, F., Guarese, R. L. M., & Tarouco, L. M. R. (2017). “A Comparative Analysis of Augmented Reality Frameworks Aimed at the Development of Educational Applications.” Creative Education, 8, 1433-1451. Available at: https://doi.org/10.4236/ce.2017.89101
Liestøl, G., Rasmussen, T., & Stenarson, T. (2011). “Mobile Innovation: Designing & Evaluating Situated Simulations.” Digital Creativity, Vol. 22, No. 3, 172–84. Abingdon: Routledge, Taylor & Francis Group.
Liestøl, G. & Morrison, A. (2015). “The power of place and perspective: sensory media and situated simulations in urban design.” Mobility and Locative Media: Mobile Communication in Hybrid Spaces (Changing Mobilities). New York: Routledge, 207-22.
Liestøl, G. (2018). “Storytelling with mobile augmented reality on Omaha Beach: Design considerations when reconstructing an historical event in situ.” Museums and the Web 2018. Published February 16, 2018. Consulted January 16, 2019.
MacIntyre, B., Lohse, M., Bolter, J. D. & Moreno, E. (2001). “Ghosts in the Machine: Integrating 2-D Video Actors into a 3-D AR System.” Proceedings of the International Symposium on Mixed Reality, 8.
Wither, J., Tsai, Y.-T., & Azuma, R. (2011). “Indirect Augmented Reality.” Computers & Graphics, 35, 810–22.
Cite as:
Liestøl, Gunnar and Stenarson, Tomas. "Augmented Reality Authoring for Cultural Heritage Sites using the Sitsim AR Editor." MW19: MW 2019. Published January 14, 2019. Consulted .
https://mw19.mwconf.org/paper/augmented-reality-authoring-with-the-sitsim-ar-editor/