Amy arrives at the Cafe. Her smart glasses identify the Cafe's network and send her a notification inviting her to connect.
She walks into the Cafe and connects her glasses to the Cafe network. Through the glasses, she can see the Cafe's virtual marketing content.
She finds a good table and is ready to order. She is not willing to stand in the long line, so she chooses to order through her smart glasses.
She has a great experience reading through the virtual menu with her smart glasses and smart wristband.
After a few minutes, she decides to check out with two items through her glasses.
She holds her phone close to her smart glasses, so the camera on her glasses can scan her personal payment code.
She chooses "Deliver to table", so she does not need to leave the seat.
After a few minutes, one of the Cafe robots delivers the coffee to her table.
My goal is to map out how users interact with the virtual Cafe menu and what the user interface looks like.
In this simple scenario, I combine several universal gestures and user interaction paradigms for smart glasses + smart wristband; a minimal sketch of the drag interaction follows the list below.
Select a module and drag it to a different position.
Select a button.
Pull the screen closer to make it look larger.
Push the screen backward to make it look smaller.
Slide the screen or a section in the screen horizontally.
Slide the screen or a section in the screen vertically.
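As a rough illustration of the first paradigm, here is a minimal Unity-style C# sketch of a menu module that can be pinched and dragged to a new position. The component and method names are hypothetical; whatever gesture layer the glasses or wristband provide would be responsible for calling them.

```csharp
using UnityEngine;

// Illustrative sketch only: a menu module that can be grabbed and dragged.
// A hand-tracking or wristband gesture layer (not shown) would call these
// methods when a pinch starts, moves, and ends on this module.
public class DraggableModule : MonoBehaviour
{
    private bool dragging;
    private Vector3 grabOffset;

    // Called when a pinch begins on this module.
    public void BeginDrag(Vector3 pinchPoint)
    {
        dragging = true;
        grabOffset = transform.position - pinchPoint;
    }

    // Called every frame while the pinch is held, with the current pinch point.
    public void UpdateDrag(Vector3 pinchPoint)
    {
        if (dragging)
            transform.position = pinchPoint + grabOffset;
    }

    // Called when the pinch is released.
    public void EndDrag() => dragging = false;
}
```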
This demo was built in Figma with the DraftXR plugin.
I created two versions of the mockup in Figma: dark mode and light mode.
This was my test in Spline (a 3D design tool), trying to demo the 3D version of the design.
The tech stack I applied is Unity + the Oculus Interaction SDK with hand tracking. I rebuilt the 3D UI in the Unity Engine and used OVR hand tracking to put together a short demo. The video was recorded on an Oculus Quest.
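For context, this is roughly the kind of hand-tracking hook the demo relies on: a minimal sketch, assuming the OVRHand component from the Oculus Integration package, that treats an index-finger pinch as a "select". It is not the exact code from the demo.

```csharp
using UnityEngine;

// Minimal sketch: read the pinch state from an OVRHand (Oculus Integration)
// and treat the rising edge of an index-finger pinch as a "select" gesture.
public class PinchSelect : MonoBehaviour
{
    [SerializeField] private OVRHand hand;   // e.g. the right-hand OVRHand under the OVRCameraRig
    private bool wasPinching;

    private void Update()
    {
        if (hand == null || !hand.IsTracked)
            return;

        bool isPinching = hand.GetFingerIsPinching(OVRHand.HandFinger.Index);

        // Fire once when the pinch starts, not on every frame it is held.
        if (isPinching && !wasPinching)
            Debug.Log("Pinch detected - treat as select");

        wasPinching = isPinching;
    }
}
```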
Moving from 2D to 3D prototypes pushed me to design 3D interactions in depth. This 10-second demo cost me more than 10 hours of working through several technical obstacles, along with countless rounds of testing and experimenting, such as the steps below.
I also built a version with the XR Interaction Toolkit, but later found that OVR hand tracking performed more intuitively for this demo.
Import the Oculus XR Plugin and other required packages into the project.
Set and adjust the position and angle between the camera rig and the 3D menu in the scene (see the placement sketch after this list).
Build up a basic interaction flow using Unity Events and OVR's sample components (a sketch combining this with a simple button-press animation also follows the list).
Animate buttons and 3D models in Unity.
Import external 3D assets and modify materials. 3D modeling by Megan Alcock and Okotaru.
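For the camera rig / 3D menu placement step, this is a minimal sketch of one way to park the menu at a comfortable spot in front of the headset at startup; the distance and height values are illustrative, not the ones tuned in the demo.

```csharp
using UnityEngine;

// Sketch: place the 3D menu a fixed distance in front of the user's view,
// slightly below eye level, and face it toward the user. Values are illustrative.
public class PlaceMenuInFront : MonoBehaviour
{
    [SerializeField] private Transform centerEye;     // e.g. the OVRCameraRig's centerEyeAnchor
    [SerializeField] private float distance = 0.8f;   // metres in front of the headset
    [SerializeField] private float heightOffset = -0.1f;

    private void Start()
    {
        // Flatten the view direction so the menu stays upright even if the user looks down.
        Vector3 forwardFlat = Vector3.ProjectOnPlane(centerEye.forward, Vector3.up).normalized;
        transform.position = centerEye.position + forwardFlat * distance + Vector3.up * heightOffset;
        transform.rotation = Quaternion.LookRotation(forwardFlat, Vector3.up);
    }
}
```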
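And for the interaction flow and button animation steps, a sketch of a UnityEvent-driven menu button with a simple scale-based press animation. The gesture layer (for example the pinch detection above) would call Press(); the field names and timing values are assumptions for illustration.

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.Events;

// Sketch: a menu button driven by Unity Events. A gesture or poke source calls
// Press(); what happens next (e.g. adding an item to the order) is wired up in
// the Inspector through onPressed. A short scale animation gives a "press" feel.
public class MenuButton : MonoBehaviour
{
    public UnityEvent onPressed;                       // hook menu actions here in the Inspector
    [SerializeField] private float pressedScale = 0.9f;
    [SerializeField] private float pressTime = 0.1f;   // seconds for each half of the animation

    private Vector3 originalScale;

    private void Awake() => originalScale = transform.localScale;

    public void Press()
    {
        StopAllCoroutines();
        StartCoroutine(PressAnimation());
        onPressed?.Invoke();
    }

    private IEnumerator PressAnimation()
    {
        Vector3 pressed = originalScale * pressedScale;

        // Scale down, then back up.
        for (float t = 0f; t < pressTime; t += Time.deltaTime)
        {
            transform.localScale = Vector3.Lerp(originalScale, pressed, t / pressTime);
            yield return null;
        }
        for (float t = 0f; t < pressTime; t += Time.deltaTime)
        {
            transform.localScale = Vector3.Lerp(pressed, originalScale, t / pressTime);
            yield return null;
        }
        transform.localScale = originalScale;
    }
}
```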