Virtual Dioramas Inside and Outside Museums with the AR Perpetual Garden App

Maria Harrington, University of Central Florida, USA


The AR Perpetual Garden App is an immersive augmented reality (AR) application for use inside and outside museums to extend the learning impact of real dioramas and gardens. Using data visualizations and bioacoustics grounded in scientific data sets, the design enhances the perceptual experience to create interactive virtual dioramas. Two contrasting scenarios, “Woodland in Balance” and “Woodland out of Balance,” are shown to instantly communicate the complex narrative of a trophic cascade. This design extends prior art in Virtual Nature research into AR, using similar methods. Additional information, including the curator’s interpretive narrative and facts on a linked website, is easily accessible at the click of a button. The AR Perpetual Garden App was developed in part as an international collaboration between the Harrington Lab at the University of Central Florida, Powdermill Nature Reserve at the Carnegie Museum of Natural History, and the MultiMediaTechnology program of the Salzburg University of Applied Sciences, Austria. Undergraduate and graduate students were involved in the production. Download the AR Perpetual Garden App from the Carnegie Museum app stores.

Keywords: Augmented Reality, Virtual Reality, Mixed Reality, Dioramas, Gardens, Informal Learning, Virtual Nature

Figure 1: AR Perpetual Garden App, showing the “Woodland in Balance” scenario visualization interacting with the White-tailed Deer and Predator Diorama Exhibit inside


This paper describes the iterative design and development process used by artists, scientists, and programmers to ensure high information accuracy in the creation of the application. Critical to this process was a multidisciplinary partnership between digital media producers in universities and institutional experts to ensure high information quality in all of its modalities. The paper focuses on a unique collaborative and iterative production process used by a university researcher, her students, and museum personnel to design, develop, and produce a useful and educationally impactful product. Special attention is given to the multidisciplinary review and approval of all content by the scientific stakeholders (the director, the museum’s botanist, and the research center’s gardener) to ensure that all facts, information, and visual and audio representations of information were faithfully reproduced. The process may be used as a framework for creating immersive, multi-modal, interactive AR applications when high information accuracy is required, especially where it is important to achieve educational objectives and informal learning outcomes.

Such resulting applications may be thought of as virtual dioramas, developed with similar care and attention to detail in information and visual representation as the original dioramas. Much as the artists and scientists of the past worked together to ensure accuracy in the construction of real dioramas, the current collaboration jointly created a virtual diorama true to the spirit of the past, with the same attention to accuracy and detail needed to produce digital media artifacts infused with knowledge. Such interactive tools that open free exploration of scientifically accurate information are important for use in discovery-based informal learning environments, unlocking the knowledge stored in real and virtual exhibits.

Design Overview

At the Carnegie Museum of Natural History and its biological research center, Powdermill Nature Reserve, several resources existed for use in the project. Leveraging existing resources across centers created economic efficiencies for the organization and consistency for visitors, with increased layers of meaning. One diorama, the “White-tailed Deer and Predators Diorama,” offered an ideal context for the curator’s narrative to demonstrate the invisible causal factors at play. The outdoor wildflower garden offered an additional opportunity to link the outside to the inside for a continuum of learning opportunities. The outdoor garden context invited an implementation innovation: using immersive AR to show all of the flowers in bloom, independent of season, and to demonstrate the biodiversity in its full force for the novice visitor. This allowed the visitor to see through the eyes of an expert scientist and experience a woodland in full bloom as the scientist imagines it, without needing the scientist’s deep knowledge, especially when the flowers were no longer in season. The last resource was a website representing a digital plant atlas, The Virtual Garden Timeline, with facts on native flowers and plants and a library of 3-D, AR- and VR-enabled plant models. These resources offered context and information for the design solution to fully solve a complex, multidimensional problem space (Figures 1-3).

The curator’s narrative was the basis for the story embedded in the design. It provided the guidelines for the two visualized scenarios and all information requirements, such as the lists of plants, insects, and birds required to drive the content and the creation of the media. The main educational concepts of the narrative focused on the data visualization needed to contrast the “Woodland in Balance” scenario with the “Woodland out of Balance” scenario, which differ due to deer population impacts on the ecosystem. The “Woodland in Balance” scenario demonstrates a woodland not overpopulated by deer, with many flowers and plants and many insects and bird sounds. The “Woodland out of Balance” scenario shows a degraded woodland with fewer flowers and plants and, thus, a reduced number of insects and bird sounds, for contrast. The design produced two virtual dioramas in immersive AR as spatial and informational overlays, communicating complex and rich data and knowledge quickly and efficiently while offering interaction to access deeper concepts and knowledge.

This new project extended into AR the construction methods used in visualizing plant population data sets from The Virtual Trillium Trail Project and in concurrent research conducted in The Virtual UCF Arboretum Project. Prior findings established confidence in using virtual models of nature for learning, demonstrating that when the content was the same, including the visual fidelity of the presentation, learning was the same (Harrington, 2011). A second study investigated the factors of the virtual nature model and showed a statistically significant interaction between graphical fidelity and navigational freedom on both fact inquiry and learning gains. The statistical design used a two-way ANOVA with a sample of 64 volunteers, randomly assigned, to allow for precise understanding of the effects of visual fidelity and of navigational freedom. The highest learning gains were observed in the High Visual Fidelity and High Navigational Freedom condition: Fact Inquiry (M=40.75 facts/hour, SD=24.02) and Knowledge Gained (M=37.44% pretest-to-posttest gain, SD=13.88), indicating that both high-fidelity graphics and freedom of choice in navigation are important design considerations in virtual learning environments (Harrington, 2012). Thus, the application design factors of graphical fidelity and free choice in navigation are important for educational VR application design, especially for use in informal settings, and independent of other design choices. These findings informed the design of the AR Perpetual Garden App as a data visualization application open to free choice in the exploration of educational information.

The idea represented an exciting design challenge ideally suited for the new features available in immersive AR technology. The newly released AR SDKs for Apple (ARKit) and Google Android (ARCore) made the development of immersive AR environments easy to execute with the Unity game engine. Deployment on AR Android smartphones, iPhones, and iPads for wide public use in homes and schools made the timing ideal to build a prototype in early May 2018. The apps were released by November 2018 for both Android and Apple devices in the Google Play and iTunes stores.

The main design challenges were:

  • How to leverage multiple existing resources into one solution
  • How to truthfully communicate the educational information to the novice without overwhelming them with data, facts, and complexity
  • How to create a simple, easy-to-use, elegant solution for all visitors
  • How to deploy the program on any device the public might bring to the experience
  • How to change visitor perception and understanding in an instant, truthfully, accurately, and with increased intrinsic motivation to learn more


Figure 2: AR Perpetual Garden App, showing the “Woodland in Balance” scenario visualization interacting with the outside


Figure 3: AR Perpetual Garden App, “Woodland out of Balance” scenario

Educational Content Embedded and Accessible

Many times, a curator is not available to tell the story, and signage is often not useful in attracting or engaging visitors, even with explanatory visuals (Figure 4). The app gives the public the opportunity to explore, interact with, and experience this difficult concept in a perceptually engaging manner.

The main educational concept is a causal chain reaction, referred to as a trophic cascade (Nuttle et al., 2011). This is difficult for people to understand because the reality of a woodland in balance was lost decades ago. Many people do not know what woodlands looked like prior to deer overpopulation and do not spend time in the woods, especially during the brief and easily missed time when the flowers are in bloom. This translates into a public that does not know what a balanced woodland looks like, so they do not feel the loss of a reality they don’t know, don’t understand, and cannot imagine. Such poor woodlands still appear green, so they seem healthy, because the observers do not know what is missing. Immersive AR presented the design opportunity to show, not tell, the story and to enable interactivity to access more knowledge.

Figure 4: The “Trophic Cascade” illustration, from Nuttle et al., 2011

Curator’s Narrative

The curator’s narrative and interpretive story is the explanation of the facts and concepts, implemented with an audio file that plays when selected. The sound level was set high enough for a user to hear but not so loud that it would interfere with the gallery experience of other visitors. Here is part of the curator’s narrative:

People see the forest as composed of trees, but actually about 90% of the plant diversity is in the understory near the forest floor. These woodland shrubs and herbs are what generate the fruits and leaves that insects and birds rely upon.

Today, we have too many deer, because we controlled all the big predators, and game management is often directed at increasing deer for the benefit of hunters. Overbrowsing by deer eliminates the wildflowers, causing what is called a ‘trophic cascade,’ meaning that animals feeding at a basic level have an effect through higher levels.

As the deer eat the plants, the insects that rely on the same plants have no food and disappear. Then the birds that feed on those insects disappear.

A healthy woodland in balance, without too many deer, has abundant flowers, small insects, and birds in the canopy. With too many deer, a woodland is out of balance, and a trophic cascade eliminates many understory plants, reducing insects and birds.


Plant Facts

Interactivity was implemented with a button link in the app that opens a complementary website, The Virtual Garden Timeline (Figure 5), creating easy access to about 150 plant and flower species (Figure 6). Each overview links to a plant fact page for detailed information (Figure 8), and about 25 species have AR- and VR-ready 3-D models embedded in the page (Figures 9-12). The site supports search and navigation of educational information through keyword search, meta tags on garden location and flower bloom times, and navigation by category. Additionally, all flowers and plants used in the AR application are also available for closer inspection on AR-enabled smartphones and with inexpensive VR headsets, such as Google Cardboard, through the website.

Lastly, there is an interactive timeline displaying thumbnail images of each of the plants, clustered by garden location and bloom time in a 3-D interactive runway visualization of space and time (Figure 7). This feature allows for spatial-temporal exploration over the entire year and places the information into context as a whole system. Such visualization supports a contextual understanding beyond a 2-D website: it shows a garden in bloom over time and supports the goals of planning an English garden with native species, one that results in color all year and sustains pollinator populations. Each plant on the timeline links back to its plant fact page on the website for details. Thus, multiple pathways through the digital media create a web of knowledge, accessible through intentionally interconnected facts and free navigation, to encourage learning activity.
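The site’s faceted navigation by garden location and bloom time can be sketched as a simple metadata filter. The records and field names below are hypothetical stand-ins, not the site’s actual data model (the live site implements this with WordPress meta tags):

```python
# Illustrative sketch of filtering a plant atlas by garden location and
# bloom window, as The Virtual Garden Timeline does with meta tags.
# Records and field names are hypothetical, not the site's actual schema.

plants = [
    {"name": "Trillium grandiflorum", "garden": "woodland", "bloom": (4, 5)},
    {"name": "Mertensia virginica",   "garden": "woodland", "bloom": (4, 5)},
    {"name": "Iris versicolor",       "garden": "wetland",  "bloom": (5, 7)},
]

def in_bloom(plant, month):
    """True if the month falls within the plant's bloom window."""
    start, end = plant["bloom"]
    return start <= month <= end

def filter_plants(records, garden=None, month=None):
    """Narrow the plant list by garden location and/or bloom month."""
    result = records
    if garden is not None:
        result = [p for p in result if p["garden"] == garden]
    if month is not None:
        result = [p for p in result if in_bloom(p, month)]
    return result

may_woodland = filter_plants(plants, garden="woodland", month=5)
```

Combining independent facets this way is what lets a visitor ask, for example, “what blooms in the woodland garden in May,” without the site pre-building every possible page.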


User and Stakeholder Profiles

The primary users are museum visitors of any age and ability, parents and children, teachers and students, and the staff of the museum. Such diversity in the user profile demanded that the app operate without any training, have simple and obvious functionality, and be accessible to pre-readers and very young children. Additionally, consideration had to be made for other museum visitors, who may not wish to have their experience impacted by others who use the app.

The stakeholders were the museum personnel. The primary stakeholder was the director of the Powdermill location with the requirement that the app work on museum-owned iPads, in addition to both AR Android and AR Apple devices for the public to use in schools and at home. Two locations were important, one inside to annotate and extend the learning opportunities of the real diorama, and one outside in a garden to feature the real plants. The vision was to create an AR-perpetual garden, so visitors at any time of the year could see the flowers and plants in the garden in bloom and learn about the connections between the outside exhibit and the inside diorama.

The Carnegie Museum was a stakeholder. The web team, technical IT operations team, and the marketing team were involved. The web team had standardized on the WordPress engine, and that platform became a requirement for developing and maintaining The Virtual Garden Timeline. The technical operations team had experience in developing and launching Android and Apple apps. The marketing team had branding and graphical style-guide requirements, and contact information and links for press provided additional requirements. Furthermore, the legal requirements for accessibility impacted the design and implementation requirements. Other requirements for privacy and terms-of-service notices were gathered and implemented.

Visualization and Bioacoustics Data

The two visualization scenarios, “Woodland in Balance” and “Woodland out of Balance” (Figures 2-3), were developed as Unity game levels. Each was scaled to a 3 meter by 3 meter virtual plot to match the real world. Both Unity levels visualized the total plant population densities and distributions per scenario, including the plant species, the number of each species representative of the real plot, and 3-D plant models correctly scaled in height and breadth to the real plants. The plant data per plot came from photographs and data from long-term ecological studies of the Appalachian forests (Knight et al., 2009; Kalisz et al., 2014; Harrington and Wenzel, unpublished data).
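The data-driven layout of a scenario can be sketched as sampling plant placements from per-species densities. The density values below are illustrative stand-ins, not the published plot data, and the real app performs this layout in Unity:

```python
import random

# Sketch of seeding a 3 m x 3 m virtual plot from per-species density
# data (plants per square meter). These density values are illustrative
# only; the real figures came from long-term Appalachian plot studies.
DENSITIES = {
    "Trillium grandiflorum": 4.0,
    "Viola sororia": 6.0,
    "Tiarella cordifolia": 2.0,
}
PLOT_SIZE_M = 3.0  # the virtual plot matches real-world scale

def layout_plot(densities, plot_size, seed=42):
    """Return (species, x, y) placements; count = density * plot area."""
    rng = random.Random(seed)
    area = plot_size * plot_size
    placements = []
    for species, density in densities.items():
        for _ in range(round(density * area)):
            x, y = rng.uniform(0, plot_size), rng.uniform(0, plot_size)
            placements.append((species, x, y))
    return placements

plot = layout_plot(DENSITIES, PLOT_SIZE_M)  # 36 + 54 + 18 = 108 instances
```

Random dispersion is the simplest choice; a degraded scenario would use a sparser density table, which is what makes the two visualizations directly comparable.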

Data visualizations of the past showed dots displayed on a 2-D map as abstract representations of the flowers and plants, leaving the viewer cold. The immersive AR visualization recreates the beauty of the real flowers and plants to emotionally and aesthetically engage the visitor in an experience close to that of the real world. The 3-D models were created to capture all salient visual information, so each plant model is an accurate and truthful representation of each plant. Combining all plants in a field of view results in a stunningly beautiful array of flowers, made even more so when the viewer understands that it is not fiction or fantasy, but fact and truth that they see.

To reinforce the visualization of plant data scenarios, ambient sounds representative of the number and type of insects and birds for each scenario were derived from data estimates of the two woodland scenarios (Nuttle et al., 2011). The insects and birds do not become quieter in the “Woodland out of Balance” scenario; there are fewer species and fewer of each. The bioacoustics derived from data for each scenario increase or decrease in song and diversity, showing contrast in context. Again, data drives the implementation of the digital media to enhance communication and education for greater impact and engagement.
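The mapping from abundance data to soundtrack can be sketched as deriving per-track gain weights from species counts. The counts below are illustrative, not the data estimates from Nuttle et al. (2011):

```python
# Sketch of deriving ambient-mix weights from estimated species abundance
# for each scenario. The counts are illustrative stand-ins, not the
# published estimates.
BALANCED = {"Ovenbird": 6, "Veery": 4, "Hooded warbler": 4, "Tree cricket": 20}
DEGRADED = {"Ovenbird": 2, "Tree cricket": 8}

def mix_weights(counts):
    """Relative gain per species track, proportional to abundance."""
    total = sum(counts.values())
    return {species: n / total for species, n in counts.items()}

balanced_mix = mix_weights(BALANCED)   # richer: four species tracks
degraded_mix = mix_weights(DEGRADED)   # sparser: two species tracks
```

The contrast the visitor hears comes from the degraded scenario having fewer tracks and fewer calls per track, not from simply lowering the master volume.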

User Interface

The app has a menu bar anchored to the bottom of the screen. It scales to fit the many different screen sizes of AR-ready Android and Apple devices. The menu has four buttons: “Story,” “Woodland in Balance,” “Woodland out of Balance,” and “Plant Info.” The “Story” button plays the curator’s narrative as an audio file, which continues to play while the user explores the other buttons. In testing, this feature did not appear to distract users; in fact, it appeared to complement and reinforce the visuals, consistent with theories on learning and multimedia. The “Woodland in Balance” and “Woodland out of Balance” buttons act as a toggle, allowing the user to easily switch back and forth between the two scenario visualizations for easy comparison. The “Plant Info” button opens the complementary website, The Virtual Garden Timeline, which allows easy exploration of plant and flower facts. Consistent mapping of button color to function helps users associate color with active states.
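The menu’s behavior reduces to a small state model: the two scenario buttons swap one active visualization, while the story audio runs independently. A minimal sketch (names are illustrative, not the app’s actual code):

```python
# Minimal sketch of the four-button menu logic. The scenario buttons act
# as a toggle on a single active visualization; the "Story" narration
# keeps playing while the user switches. Names are illustrative.
class AppMenu:
    SCENARIOS = ("Woodland in Balance", "Woodland out of Balance")

    def __init__(self):
        self.active_scenario = self.SCENARIOS[0]
        self.story_playing = False

    def press(self, button):
        if button == "Story":
            self.story_playing = True       # narration continues during exploration
        elif button in self.SCENARIOS:
            self.active_scenario = button   # swap the visualization only

menu = AppMenu()
menu.press("Story")
menu.press("Woodland out of Balance")  # story keeps playing
```

Keeping narration state orthogonal to the scenario toggle is what lets the audio reinforce, rather than interrupt, the visual comparison.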

After downloading and installing the app, the user must allow the computer vision features of the SDK to find a flat surface. When one is found, the system indicates success with visual markers on the ground, and the user may tap the screen to plant virtual flowers. This flat surface could be in the museum gallery, annotating the real diorama, or outside in the real garden.
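This detect-then-tap sequence is a simple gated flow. A sketch under assumed state names (the real apps use ARKit/ARCore plane detection inside Unity):

```python
# Sketch of the AR placement flow: taps are ignored until plane detection
# reports a flat surface; the first valid tap anchors the virtual garden.
# State names are illustrative, not the apps' actual implementation.
class PlacementFlow:
    def __init__(self):
        self.state = "scanning"

    def plane_detected(self):
        """Called when the SDK finds a flat surface (markers shown)."""
        if self.state == "scanning":
            self.state = "surface_found"

    def tap(self):
        """Plant the virtual garden if a surface is available."""
        if self.state == "surface_found":
            self.state = "garden_placed"
            return True
        return False  # no surface yet, or garden already placed

flow = PlacementFlow()
tapped_too_early = flow.tap()   # ignored: still scanning
flow.plane_detected()
placed = flow.tap()             # succeeds: surface available
```

Gating the tap on detection prevents the garden from being anchored in mid-air before the SDK has established a ground plane.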

Because AR adjusts for external environmental lighting conditions, the app is best used in full interior light, not dark or dimmed lighting, and in full or part sun outside. The interior of the Powdermill site is bright, and the outside garden is in part sun. Schools are typically well lit as well, so the app should work well in those environments.

Figure 5: The Virtual Garden Timeline main page


Figure 6: Plant list (partial)


Figure 7: Plant timeline (partial)


Figure 8: Plant fact page (partial)


Figure 9: AR, VR, 3-D plant model, Trillium (Trillium grandiflorum), demonstrating rotation and zoom


Figure 10: AR, VR, 3-D plant model, Trillium (Trillium grandiflorum), demonstrating rotation and zoom


Figure 11: AR, VR, 3-D plant model, Trillium (Trillium grandiflorum), demonstrating rotation and zoom


Figure 12: AR, VR, 3-D plant model, Trillium (Trillium grandiflorum), demonstrating rotation and zoom


Figure 13: AR, VR, 3-D plant model, Trillium (Trillium grandiflorum), demonstrating rotation and zoom

Design, Development, and Review Processes

The process was highly iterative, participatory, and uniquely focused on the data and information quality, the accuracy and fidelity of the visualizations, and the bioacoustics. Quality in the application was achieved through an iterative creation, review, and validation process among artists, scientists, and programmers.

Team Roles and Responsibilities

The principal investigator, an assistant professor who also served as project manager, organized and managed all collaborative interactions, work activities, work packages, schedules, communications, audits, and change control reviews. A major effort was mentoring undergraduate students in paid internships.

Two undergraduate students in the game design program developed the 3-D plant models first to aesthetic standards and then in iterative scientific reviews with the museum botanist, facilitated by the professor. Their main responsibility was to create the 25 plant and flower 3-D models to scale, with accurate morphology, using the botanically correct visual details. The models’ visual fidelity, aesthetics, and botanical details were carefully balanced with the technical constraints of AR, VR, and game-ready technical requirements. The students worked closely with their professor and the museum botanist to understand the details for both visual and scientific accuracy required for each model to ensure that the visual information accurately reflected the plant—every stamen, pistil, leaf, and petal had to match the real plant. In this careful process of co-creation, the students discovered that violets have spurs.

The website developer was an undergraduate student in the web design program, who was responsible for the integration of all digital media on plant and flower facts and concepts for the website in The Virtual Garden Timeline. She worked closely with the museum web team to ensure that their production and legal requirements were met.

A museum director provided the educational lesson in the form of the curator’s narrative, assisted in identifying plant images and selecting correct insect and bird audio sounds, set and reviewed the list of plant species, and determined the virtual plant plot densities and distributions per scenario. He worked closely with the Powdermill horticulturist to approve factual data in the website content. He worked closely with the principal investigator to approve composition and layouts for each scenario and to compose and audit the bioacoustics.

The Powdermill horticulturist provided detailed information about each plant on the website. There are about 150 plants available for educational application, each with its own set of facts. The plant facts on the website guided the creation of the 3-D models.

The museum botanist reviewed each of the 3-D plant models for botanical accuracy, provided feedback, and approved each model in an iterative audit, review, and feedback process. Corrections were requested visually, using annotated images to mark errors and highlight the changes needed. The art team posted models on a website to facilitate this review.

The co-investigator, professor, and technical supervisor in Austria managed all coordination and communications with the programmers, work activities, and work packages and facilitated software development and product release of both AR apps.

The programmers of the MultiMediaTechnology program at the Salzburg University of Applied Sciences in Austria set up two development environments, one integrating mobile, Apple ARKit, and Unity, and the other integrating mobile, Android ARCore, and Unity. They programmed and compiled all code, tested, and released both apps. They used Unity to lay out the virtual plots rendered for the AR app, working closely with the principal investigator, with review and approval of the layouts by the director.

Technology and Tools Used in Development

Photographs of flowers and plants from The Virtual Trillium Trail Project were reused. Each image texture was used to create game-ready model textures for the 3-D plants. When such textures were not available, the Powdermill staff provided new photographs. The photographs were processed with Photoshop, Quixel SUITE 2.0, and Substance Painter. Maya 2018 was used to create the 3-D plant and flower models. Unity was used to lay out the virtual plots rendered for the AR app.

The story for the curator’s interpretive narrative was recorded and edited in Adobe Audition CC. The sound file was packaged with the app to allow the user to click on a button to play. The insects and bird sounds were purchased by species type from an online sound service, and the sounds were mixed by population proportions in Adobe Audition CC for an ambient soundtrack that could be used in the app.

The WordPress engine was the platform of choice for the website, as it allowed multiple team members to create, edit, and publish content across a geographically dispersed area and, therefore, supported the virtual team. It was also consistent with the museum web support team’s practices and allowed for integration with other museum websites and long-term maintenance. The timeline visualization was created in a third-party system. The 3-D plant and flower models were posted on SketchFab for review and publication and then embedded into the website.

Google Drive was used to manage production processes, edits and reviews, and art asset status communications with the team. It was also used to store and share art assets, from photographs to 3-D model folders, creating one point of access for source images, work files, game-ready assets, and SketchFab assets.

Creation, Review, and Approval Processes

In the summer of 2017, the principal investigator worked with the three undergraduates, two 3-D artists and one web designer, to organize all photographs and ensure correct matching of plant names to photographs. These files were reviewed with the Powdermill horticulturist for approval. At the same time, the horticulturist provided all plant and flower facts as data. The web designer published all content related to each plant and flower, organized by garden location and time of bloom, into the timeline application. She customized some of the code to provide a menu supporting easy navigation by garden location and season of flower bloom and to match the museum’s website style guide.

The 3-D artists created the plant and flower 3-D models. All 3-D plant models required multiple iterations and feedback from the professor in order to approximate the reality of the real plant. Once complete, a model was published on the SketchFab website for sharing, review, communication, and feedback from the museum’s botanist and director. The botanist used screenshots of the 3-D model to annotate corrections (Figure 14). This review and revision process with the botanist was repeated until she approved the model for release. She also provided photographs and links to material communicating the salient attributes required to ensure each virtual plant accurately expressed the plant visually. Quality control of the educational material, reviewing correct features, counts of stamens and pistils, and locations of stems, leaves, and petals in relation to other plant structures, was an important part of the process.

List of 3-D AR and VR plant models:

  1. Rue anemone, wildflower (Anemonella thalictroides)
  2. Bluets (Houstonia caerulea)
  3. Greenbrier, catbrier (Smilax rotundifolia)
  4. Common Blue Violet (Viola sororia)
  5. Great Solomon’s Seal (Polygonatum canaliculatum)
  6. Wake Robin (Trillium erectum)
  7. Halberd-Leaved Violet (Viola hastata)
  8. Bishop’s Cap, Miterwort (Mitella diphylla)
  9. Cinnamon Fern (Osmunda cinnamomea)
  10. Wild Sweet William (Phlox divaricata)
  11. Christmas Fern (Polystichum acrostichoides)
  12. Black Cohosh (Cimicifuga, Actaea racemosa)
  13. Spring Beauty (Claytonia virginica)
  14. Intermediate Wood Fern (Dryopteris intermedia)
  15. Blue Flag Iris (Iris versicolor)
  16. Trillium (Trillium grandiflorum)
  17. May Apple (Podophyllum peltatum)
  18. Wild Geranium (Geranium maculatum)
  19. Wood Poppy (Stylophorum diphyllum)
  20. Cream Violet (Viola striata)
  21. Large Flowered Bellwort (Uvularia grandiflora)
  22. Foamflower (Tiarella cordifolia)
  23. Virginia Bluebells (Mertensia virginica)
  24. Wild Ginger (Asarum canadense)
  25. Jack in the Pulpit (Arisaema triphyllum)
Figure 14: Example of botanist’s annotations and feedback

Virtual Plant Plot Scenario Development Process

In May 2018, the programmer created two Unity levels, one for each scenario visualization, according to the plant population data sets (Figure 15). The scale was set to represent real-world measurements in a virtual plot sized 3 meters by 3 meters. The plant population densities and dispersion were estimated to include all plants in the AR digital library, not to match any one single plot or photograph identically. In general, the virtual plot is a representation generalizable to similar locations in Appalachian forests but not precise to any one location. The first prototype was created and tested (Figure 16), and then UX modifications and museum branding requirements were added (Figures 1-3).

Figure 15: Unity level layout of the virtual plant plot


Figure 16: First prototype, May 2018


Bioacoustics Scenario Development Process

In June 2018, the principal investigator and the director composed ambient audio for the application, which was integrated into the Android AR app at that time. Sounds of the appropriate insects and birds were obtained from an online stock audio library and then combined in a soundtrack representing the data for each of the two scenarios. Each completed ambient bioacoustics soundtrack was paired to the correct visualization for a redundancy gain effect.

List of insects:

  1. Tree cricket (Orthoptera, Gryllidae, Oecanthinae)
  2. Katydid (Orthoptera, Tettigoniidae, Neoconocephalus)

List of birds:

  1. Blue Jay (Cyanocitta cristata)
  2. Ovenbird (Seiurus aurocapilla)
  3. Black-throated green warbler (Setophaga virens)
  4. Hooded warbler (Setophaga citrina)
  5. Veery (Catharus fuscescens)

Final AR App Product Launch

The Android AR app went through multiple revisions, testing, and refinement over the summer. It was released as a beta version on the Google Play store in October 2018, and the final release was in November 2018. The Apple iTunes AR app was released on October 25, 2018.

The AR Perpetual Garden App can be downloaded from the Carnegie Museum app stores:

Apple iTunes:

Google Play:

The Virtual Garden Timeline website:

AR Perpetual Garden App video:


The design of the immersive AR representation of this concept is presented in several modalities and offers several levels of informational access. The first is a perceptually engaging data visualization of an immersive AR environment prior to a trophic cascade, and the second is a contrasting environment after a trophic cascade. The two immersive AR realities can be experienced in context with the real diorama and the garden for instant comparisons. Hence, the viewer can compare before and after views of the trophic cascade in perceptually powerful ways. Visitors can see through the eyes of the expert scientist and gain instant understanding, like an epiphany. This presentation is intended to spark awe and wonder, curiosity, and a desire to learn, and to motivate independent investigation into the reasons why the two presentations are so radically different. In this way, the virtual dioramas echo the intent of the original, real dioramas to inspire learning. However, unlike real dioramas, the virtual dioramas are interactive and multi-modal, supporting individuals in their personalized exploration and discovery of knowledge. They facilitate comparisons and challenge the viewer to visually relate the story back to the real diorama’s story. Outside, in a very green garden, the app shows the radical difference in the number and diversity of flowers possible, making the real garden look dull and lacking in comparison. Users see, hear, and experience the differential, making the experience highly salient at perceptual, cognitive, and emotional levels. The app intentionally creates a teachable moment with beauty for all ages.



Dr. John W. Wenzel, Director, Powdermill Nature Reserve, Carnegie Museum of Natural History, Pittsburgh, PA USA,

Dr. Markus Tatzgern, Professor, MultiMediaTechnology program of the Salzburg University of Applied Sciences, Austria,

Bonnie L. Isaac, Botanist, Carnegie Museum of Natural History

Martha Oliver, Horticulturist, Powdermill Nature Reserve, Carnegie Museum of Natural History

Zack Bledsoe, BA Digital Media, Game Design, UCF

Chris Jones, BA Digital Media, Game Design, UCF

Alexandra Guffey, BA Digital Media, Web Design, UCF

Tom Langer, MS graduate student, MultiMediaTechnology program of the Salzburg University of Applied Sciences, Austria

Radomir Dinic, Junior Researcher, MultiMediaTechnology program of the Salzburg University of Applied Sciences, Austria

Martin Tiefengrabner, MultiMediaTechnology program of the Salzburg University of Applied Sciences, Austria


Harrington, M. C. R. (2011). “Empirical Evidence of Priming, Transfer, Reinforcement, and Learning in the Real and Virtual Trillium Trail.” IEEE Transactions on Learning Technologies 4 (2), 175-186. Available at:

Harrington, M. C. R. (2012). “The Virtual Trillium Trail and the empirical effects of Freedom and Fidelity on discovery-based learning.” Virtual Reality 16 (2), 105-120.

Kalisz, S., Spigler, R. B., & Horvitz, C.C. (2014). “In a long-term experimental demography study, excluding ungulates reversed invader’s explosive population growth rate and restored natives.” Proceedings of the National Academy of Sciences of the United States of America 111 (12), 4501-4506. Available at:

Knight, T. M., Smith, L., Dunn, J., Davis, J., & Kalisz, S. (2009). “Exotic plant invasions are facilitated by deer overabundance in a Pennsylvania forest.” Natural Areas Journal 29 (2), 110-116. Available at:

Nuttle, T., Yerger, E. H., Stoleson, S. H., & Ristau, T. E. (2011). “Legacy of top-down herbivore pressure ricochets back up multiple trophic levels in forest canopies over 30 years.” Ecosphere 2 (1), 1-11. Available at:



Cite as:
Harrington, Maria. "Virtual Dioramas Inside and Outside Museums with the AR Perpetual Garden App." MW19: MW 2019. Published February 14, 2019. Consulted .