ScannerVerse is an iOS and Android app we created with the Unity game engine. The game interacts with QR codes that advance a story and display 3D animated cutscenes. The first story was created for churches: the Parable of the Good Samaritan, converted into an animated adventure. In the Bible story, three people refuse to help a traveler who has been beaten by bandits. In the app, kids search for QR codes hidden around the church that correspond to those three people. After that, they are sent on a quest to find the hidden QR code for the traveler. He is badly injured, so they must track down the hidden QR codes for bandages, a drink of water, and a horse to transport him. Each QR code triggers an animated cutscene on the device, and a quest system manages the user’s progress, preventing them from jumping ahead in the story if they find QR codes out of order.
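The ordering guard at the heart of such a quest system can be sketched in a few lines of C#. This is an illustrative model, not the app’s actual code; the class and step names are hypothetical:

```csharp
using System;
using System.Collections.Generic;

// Hypothetical sketch of a quest system that ignores QR codes
// scanned out of order. Step IDs stand in for the codes' payloads.
public class QuestProgress
{
    private readonly List<string> _steps;   // ordered step IDs for the story
    private int _nextIndex = 0;             // first step not yet completed

    public QuestProgress(IEnumerable<string> steps)
    {
        _steps = new List<string>(steps);
    }

    // Returns true (and advances) only when the scanned code is the
    // next step in the story; earlier or later codes are ignored.
    public bool TryAdvance(string scannedCode)
    {
        if (_nextIndex < _steps.Count && _steps[_nextIndex] == scannedCode)
        {
            _nextIndex++;
            return true;
        }
        return false;
    }

    public bool IsComplete => _nextIndex >= _steps.Count;
}
```

A scanned code that matches a later step simply does nothing, which is what keeps users from jumping ahead in the story.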
Background AI
The game has a fair number of people in the background to add extra life to the scenes. These background characters are driven by the GameCreator plugin, which lets the “extras” wander the scene on their own. They know to avoid obstacles, and they can be constrained to a particular region or allowed to roam freely across the map. As they wander, they are aware of some of their surroundings; for example, a character can detect when another friendly character is nearby and initiate an animated wave.
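The wandering itself is handled by the plugin and Unity’s navigation system, but the decision logic (pick a destination inside the allowed region, wave when a friend is close) can be modeled engine-free. Everything below is a hypothetical sketch, not the app’s code:

```csharp
using System;

// Minimal 2D vector so the sketch runs outside Unity.
public struct Vec2
{
    public float X, Y;
    public Vec2(float x, float y) { X = x; Y = y; }
    public static float Distance(Vec2 a, Vec2 b) =>
        (float)Math.Sqrt((a.X - b.X) * (a.X - b.X) + (a.Y - b.Y) * (a.Y - b.Y));
}

// Hypothetical brain for a background "extra": wander within a
// circular region and wave when a friendly character comes close.
public class WanderBrain
{
    private readonly Random _rng;
    private readonly Vec2 _regionCenter;
    private readonly float _regionRadius;  // constrain roaming to this circle
    private readonly float _waveRange;     // wave when a friend is this close

    public WanderBrain(Vec2 regionCenter, float regionRadius, float waveRange, int seed = 0)
    {
        _regionCenter = regionCenter;
        _regionRadius = regionRadius;
        _waveRange = waveRange;
        _rng = new Random(seed);
    }

    // Pick the next wander destination inside the allowed region.
    public Vec2 NextDestination()
    {
        double angle = _rng.NextDouble() * 2 * Math.PI;
        double dist = _rng.NextDouble() * _regionRadius;
        return new Vec2(
            _regionCenter.X + (float)(Math.Cos(angle) * dist),
            _regionCenter.Y + (float)(Math.Sin(angle) * dist));
    }

    // Trigger a wave animation when another friendly character is nearby.
    public bool ShouldWaveAt(Vec2 self, Vec2 friend) =>
        Vec2.Distance(self, friend) <= _waveRange;
}
```

In the actual game, the destination would feed a pathfinding agent (which handles the obstacle avoidance), and the wave check would fire an animation trigger.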
QR Codes
The QR codes are matched up with cutscenes in the app. A special starter code is used to choose the active story (for when new stories are added to the app). After that, the app knows how to progress the user through the story. If the user forgets what they’re supposed to be doing, the Quests page shows all of the steps they have unlocked in the current story: completed steps trigger their completion cutscene, and in-progress steps trigger their instruction scene.
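One way to sketch that scene selection, using a plain status enum and a naming convention that is entirely hypothetical (the app’s real implementation may differ):

```csharp
using System;

// Hypothetical step statuses for the Quests page.
public enum StepStatus { Locked, InProgress, Completed }

public static class QuestsPage
{
    // Completed steps replay their completion cutscene; in-progress
    // steps replay their instruction scene; locked steps show nothing.
    public static string SceneFor(string stepId, StepStatus status) => status switch
    {
        StepStatus.Completed => stepId + "_completion",
        StepStatus.InProgress => stepId + "_instruction",
        _ => null,
    };
}
```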
Testing
We created a text console to enable rapid testing of the cutscenes, transitions, animations, and barcodes. Unity uses C# as its primary programming language, and C# supports introspection (reflection) and attributes (its equivalent of decorators). With a simple attribute on a method, the console can expose that method as a text command and discover which arguments it requires. This allowed us to test the actual code that would be triggered by a real barcode with a console command like “barcode tutorial”.
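A minimal sketch of that attribute-plus-reflection pattern in plain C# (all names here are illustrative, not the app’s actual console API):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using System.Reflection;

// Attribute that marks a method as a console command.
[AttributeUsage(AttributeTargets.Method)]
public class ConsoleCommandAttribute : Attribute
{
    public string Name { get; }
    public ConsoleCommandAttribute(string name) { Name = name; }
}

// Example command set; "Barcode" stands in for the real handler.
public static class Commands
{
    [ConsoleCommand("barcode")]
    public static string Barcode(string codeId) => "Triggered cutscene for: " + codeId;
}

public class DebugConsole
{
    private readonly Dictionary<string, MethodInfo> _commands = new Dictionary<string, MethodInfo>();

    // Scan a type for methods tagged [ConsoleCommand] and register them.
    public void Register(Type type)
    {
        foreach (var method in type.GetMethods(BindingFlags.Public | BindingFlags.Static))
        {
            var attr = method.GetCustomAttribute<ConsoleCommandAttribute>();
            if (attr != null) _commands[attr.Name] = method;
        }
    }

    // Parse "barcode tutorial" into a command name plus string
    // arguments, then invoke the registered method with them.
    public object Run(string line)
    {
        var parts = line.Split(' ');
        var method = _commands[parts[0]];
        object[] args = parts.Skip(1).Cast<object>().ToArray();
        return method.Invoke(null, args);
    }
}
```

Because the console invokes the same method a scanned barcode would, typing “barcode tutorial” exercises the real code path rather than a test double.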
Unity
We aren’t going for perfect lip-syncing, so the animation system is gesture-based. We built a custom character animation controller that blends a persistent state (e.g., talking) with instantaneous triggers that play one-shot animations (e.g., jump, shrug, thumbs up). Once a one-shot animation completes, the controller reverts to the persistent animation state.
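The revert rule can be modeled engine-free; in the real controller this logic sits on top of Unity’s animation system, and the names below are hypothetical:

```csharp
using System;

// Sketch of the controller's revert rule: one-shot animations play
// over a persistent state and hand control back when they finish.
public class GestureAnimator
{
    public string PersistentState { get; private set; }
    public string Current { get; private set; }

    public GestureAnimator(string initialState)
    {
        PersistentState = initialState;
        Current = initialState;
    }

    // Change the looping base state (e.g. "idle" -> "talking").
    public void SetPersistent(string state)
    {
        bool oneShotPlaying = Current != PersistentState;
        PersistentState = state;
        if (!oneShotPlaying) Current = state;  // don't interrupt a one-shot
    }

    // Fire an instantaneous trigger (e.g. "jump", "shrug", "thumbsUp").
    public void PlayOneShot(string clip) => Current = clip;

    // Called when the one-shot clip finishes: blend back to the base state.
    public void OnOneShotFinished() => Current = PersistentState;
}
```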
We also created many custom C# components for Unity: components that read barcodes detected in a live video stream, manage game state, handle user-interface transitions, and more.
Voice Acting
All of the scripted text has accompanying voice acting. After finalizing the script, we recruited people to voice the characters, giving them tips for good recording quality. Each person’s raw audio then went through noise reduction, loudness normalization, and noise gating to deliver crisp audio files. These audio clips were then matched up with chunks of the written script, so the user sees the text appear at roughly the same time the voice actor delivers the line.
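The timing side of that matching can be sketched simply: if each script chunk has its own clip, a chunk’s on-screen time is the cumulative duration of the clips before it. A hypothetical helper:

```csharp
using System;
using System.Collections.Generic;

// Hypothetical sketch of lining up script chunks with audio:
// each chunk's text appears when its clip starts in the sequence.
public static class ScriptSync
{
    // Returns the playback start time (in seconds) of each chunk,
    // given the duration of the clip recorded for each chunk.
    public static List<float> ChunkStartTimes(IEnumerable<float> clipDurations)
    {
        var starts = new List<float>();
        float t = 0f;
        foreach (float duration in clipDurations)
        {
            starts.Add(t);
            t += duration;
        }
        return starts;
    }
}
```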
Publishing
This app will be available soon on the App Store and Google Play.