What might interaction look like in the future? Our exhibit simulates a dystopian metaverse in which beauty carries an inflated value of social capital. With the rise and normalisation of face-altering applications that make us look "better" on social media, we extend this trend into the metaverse. In our role-playing exhibit, the metaverse holds avatars that enter the digital realm to high standards. When users first enter, an avatar is auto-generated from their face. Admins of the metaverse then assess their "objective" beauty score, which determines how much of the metaverse they can access. The admins are tasked with ensuring that everyone who enters fits the standard, so as not to bring shame to this new digital community. Admins may therefore suggest that users enhance their avatar to move closer to the beauty standard and unlock more of the metaverse. Users are forced to choose between embracing their true selves and succumbing to the pressure to appear "perfect". Around the room are various facial features (hair, eyes, nose, and mouth) that the user can change; depending on what the user selects, their beauty score increases or decreases. Users who succumb to the standards are greeted by a landscape of eerily similar-looking avatars: they have successfully joined the higher society of the metaverse, but at the cost of their uniqueness. Conversely, users who choose to remain themselves enter a gloomy metaverse with few features.
Technical Description
We use a combination of Vuforia, Unity, and physical forms to create the intended experience. Vuforia's object tracking detects when a physical object has been placed and identifies which object it is; on a successful detection, it sends a key press to Unity, which applies the corresponding change to the avatar. Unity renders the avatar itself and the character-creation user interface that the audience interacts with. Avatars are assembled from pre-created assets drawn in Adobe Illustrator: various features are layered on top of a blank face canvas to produce a variety of faces. A "progress check" lets users clearly see which item was tracked and whether the requirements for the metaverse experience have been met. In the physical exhibit, a webcam placed under an acrylic slab faces upwards; when a user places a box on the slab, the object is declared tracked.
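To make the tracking-to-Unity handoff concrete, here is a minimal sketch of one "feature station", assuming Vuforia Engine 10+'s ObserverBehaviour API. The class name FeatureStation and the UnityEvent wiring are our own illustration (the exhibit itself bridges the two components with a simulated key press), not the exhibit's actual code.

```csharp
using UnityEngine;
using UnityEngine.Events;
using Vuforia;

// Attach to a Vuforia target representing one physical box.
// When Vuforia first reports the box as TRACKED, the wired UnityEvent
// fires once, standing in for the key press sent to the rest of the scene.
public class FeatureStation : MonoBehaviour
{
    [SerializeField] private UnityEvent onPlaced; // wired in the Inspector

    private ObserverBehaviour observer;
    private bool wasTracked;

    private void Awake()
    {
        observer = GetComponent<ObserverBehaviour>();
        observer.OnTargetStatusChanged += HandleStatusChanged;
    }

    private void OnDestroy()
    {
        if (observer != null)
            observer.OnTargetStatusChanged -= HandleStatusChanged;
    }

    private void HandleStatusChanged(ObserverBehaviour behaviour, TargetStatus status)
    {
        bool tracked = status.Status == Status.TRACKED;
        if (tracked && !wasTracked)
            onPlaced.Invoke(); // object just placed on the acrylic slab
        wasTracked = tracked;
    }
}
```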
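The layered-sprite avatar and the running beauty score could then look like the following sketch; all feature ids, score deltas, and the unlock threshold are hypothetical placeholders rather than the exhibit's real values. In the Inspector, each station's onPlaced event above would be wired to ApplyFeature with that station's feature id.

```csharp
using System;
using System.Collections.Generic;
using UnityEngine;

// Layers pre-drawn Illustrator sprites over a blank face canvas and keeps
// the running beauty score. Ids, deltas, and the threshold are placeholders.
public class AvatarBuilder : MonoBehaviour
{
    [Serializable]
    public class FeatureOption
    {
        public string id;            // e.g. "hair_01" (hypothetical id)
        public SpriteRenderer layer; // hair, eyes, nose, or mouth layer
        public Sprite sprite;        // the pre-created Illustrator asset
        public int scoreDelta;       // how this choice moves the beauty score
    }

    [SerializeField] private List<FeatureOption> options;
    [SerializeField] private int unlockThreshold = 10; // hypothetical value

    // The most recent choice applied to each face layer.
    private readonly Dictionary<SpriteRenderer, FeatureOption> current =
        new Dictionary<SpriteRenderer, FeatureOption>();

    public int BeautyScore { get; private set; }

    // The "progress check": has the avatar met the standard to enter?
    public bool MeetsStandard => BeautyScore >= unlockThreshold;

    public void ApplyFeature(string id)
    {
        FeatureOption choice = options.Find(o => o.id == id);
        if (choice == null) return;

        if (current.TryGetValue(choice.layer, out FeatureOption prev))
        {
            if (prev == choice) return;      // same box re-detected; ignore
            BeautyScore -= prev.scoreDelta;  // undo the replaced feature
        }

        choice.layer.sprite = choice.sprite; // swap the artwork on that layer
        BeautyScore += choice.scoreDelta;
        current[choice.layer] = choice;
    }
}
```

Tracking the previous choice per layer keeps the score honest when a user swaps the same feature repeatedly, which matters because Vuforia will re-report a box as tracked every time it is placed back on the slab.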