Your students can create immersive and interactive scenes to transport audiences to new worlds. Students can immerse themselves in new cultures, explore ecosystems, and much more, all from your classroom. The following instructions focus on how students can use Reality Composer to create a virtual scene or setting in augmented reality.
Overview of Reality Composer:
Reality Composer is a free professional tool intended for software developers who wish to build AR apps; however, it can also be a great tool for students who wish to create their own immersive AR experiences. While originally intended to be integrated into apps, experiences created in Reality Composer can be exported as USDZ files that can be freely viewed on iOS devices with the built-in Quick Look feature (no downloads required) and can even be embedded into Apple apps, like Keynote, Pages, Numbers, and Freeform.
Integration Examples:
Reality Composer has applications ranging from simple scenes to advanced simulations and modeling. This post is intended for those who are new to Reality Composer and discusses how to create an immersive scene. Immersive scenes can transport an audience into an experience that enables them to see, hear, and participate in a location where they otherwise may not be able to go. Here are three different examples of immersive scenes that my students have created:
Spanish Festivals and Holidays: Students researched a specific holiday or festival and created a scene highlighting what one may see, hear (music, greetings, sayings, etc.), and do when at that festival.
Biomes: Students researched a biome and then created a scene that showcases what one may see and hear while visiting that biome.
Survival Scenarios: In an English Language Arts class, in response to the novel Unbroken by Laura Hillenbrand, students showcased a survival scenario featuring a remote location; survivors with unique skills, experiences, and weaknesses; and available supplies.
Pre-Production:
Step 1 Research: In this phase, students should conduct research to gain the background knowledge needed to create the scene. For instance, in the Biomes example above, students had to find out the location, flora and fauna, and climate of a biome.
Step 2 Visuals: Students will need to plan what visuals they will include in their scene and whether each one will be a 2D image or a 3D design. Additionally, it should be noted which visuals will be originally created and which will be curated. Original visuals can also be planned with a rough sketch.
Step 3 Audio: If using audio, students will need to plan what will be heard in their scene. If recording voice, a script should be prepared. If using music or sound effects, these will need to be planned, and it should be specified whether they will be originally created or curated.
Step 4 Behaviors: If incorporating behaviors, like movement, students will need to plan this and specify which Reality Composer behavior(s) will be used. A list of the behavior presets and their descriptions is available in Reality Composer.
Note: If using curated or “found” assets (visuals or audio files), make sure that your students comply with applicable copyright/fair use laws and your organization’s policies.
Production:
Visuals:
Visuals can be either 2D images or 3D artifacts, and either originally created or curated.
2D images: These can be added to a Reality Composer scene. It is recommended to remove image backgrounds in Keynote before doing so.
Here is a video on how to add an image from Keynote to Photos without a background.
Below is a video on how to add a 2D image to Reality Composer:
https://www.youtube.com/embed/S52d-iEb85w?showinfo=0&enablejsapi=1
3D Artifacts:
3D Artifacts can be added from the included Object Library.
Alternatively, students can create their own 3D artifacts in the Tinkercad app for iPad. Tinkercad is a free app intended for those who are new to 3D design, and projects can be exported as USDZ files and imported into Reality Composer.
Audio:
Reality Composer accepts a variety of audio file formats, like MP3 and WAV, and files can be originally created or curated. GarageBand can be a useful tool for creating music, but, if seeking to capture simple sound effects and recordings, I recommend using the native Voice Memos app.
Behaviors:
Motion and movement can be added to Reality Composer using behaviors. I recommend starting with basic behaviors, like Tap and Add Force.
https://www.youtube.com/embed/jHn-cPuD6mU?showinfo=0&enablejsapi=1
Post-Production:
The joy of creating immersive scenes in Reality Composer comes with sharing them with others. Projects in Reality Composer can be exported as USDZ files and then embedded within popular Apple apps, like Keynote, Pages, Numbers, and Freeform. I would recommend creating a document in Freeform, inviting all of your students to collaborate, and then allowing them to insert their completed USDZ files into that document. Here are instructions on how to embed a USDZ file in a Numbers or Freeform document.
Once the files are shared and accessible to all, remember to set aside time to celebrate the work that has been completed and to let your community explore and respond to the experiences.
August 15, 2024
Thanks for sharing Christopher. I'm fairly new to Reality Composer and want to use it with my students. I found some handy tips in your post. Great job and thanks again.