With passthrough vision and barcode recognition, Quest Locator turns your surroundings into an interactive nutritional guide. Mixed-reality panels present product information, including ingredients, nutritional values, health impact assessments and sustainability indicators. AI-generated explanations adapt to user preferences, enabling intuitive comparisons and informed decisions.
Our Goal
Process and Outcome
Tools and Resources
- The application was built with Unity 6000.0.46f1, using C# and the Visual Studio editor integration.
- To target the Meta Quest 3 we used the Meta XR SDK, which also allowed us to define our own gestures.
- A repository was created on GitHub.
- Communication took place via Discord, Mattermost and Zoom.
- For planning we used Miro and Notion.
- Designs, including self-made icons, were created with Photoshop and Paint.NET.
- The Open Food Facts API is used to retrieve product information by barcode (see the sketch after this list).
- The Deutsche Gesellschaft für Ernährung e.V. (DGE) provides the reference values for daily nutrient intake for different activity levels, ages and body types.
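
To illustrate the product lookup mentioned above, here is a minimal sketch of querying the public Open Food Facts read API from Unity with a scanned barcode. The class name, the coroutine structure and the way the JSON response is consumed are assumptions for illustration, not the project's actual implementation.

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.Networking;

// Minimal sketch of a product lookup against the public Open Food Facts
// read API. The class name and the handling of the JSON response are
// illustrative assumptions, not the project's actual code.
public class ProductLookup : MonoBehaviour
{
    private const string ApiUrl = "https://world.openfoodfacts.org/api/v0/product/{0}.json";

    // Called with the barcode string produced by the scanner, e.g. an EAN-13.
    public IEnumerator FetchProduct(string barcode)
    {
        string url = string.Format(ApiUrl, barcode);
        using (UnityWebRequest request = UnityWebRequest.Get(url))
        {
            yield return request.SendWebRequest();

            if (request.result != UnityWebRequest.Result.Success)
            {
                Debug.LogWarning($"Lookup for {barcode} failed: {request.error}");
                yield break;
            }

            // The response contains the product name, ingredient list, nutriment
            // values and scores; a real implementation would deserialize it into
            // a model class instead of just logging the raw JSON.
            Debug.Log(request.downloadHandler.text);
        }
    }
}
```

A panel controller could then start the coroutine with `StartCoroutine(FetchProduct(code))` once the scanner has decoded a barcode.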
Product/Outcome
We succeeded in building an application for the Meta Quest that uses its passthrough camera. The camera scans product barcodes reliably, and the app generates movable mixed-reality panels with information about ingredients, nutritional values and environmental impact. You can also enter personal details to check how much of your daily nutrient allowance a product covers (a small sketch of this calculation follows below). If you don't understand a certain ingredient, you can click it to get an AI-generated explanation at a difficulty level of your choice, from simple to scientific. For first-time VR users there is an easy-to-understand tutorial. The interface offers a light and a dark mode, and every panel can be closed individually or all at once with a clear-all button.
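
To illustrate the daily-allowance check, the following sketch shows the underlying arithmetic: a product's nutriment amounts (Open Food Facts typically reports them per 100 g) are scaled to the consumed amount and compared against the user's DGE reference values. The type and method names, and the serving-size handling, are hypothetical and not taken from the project.

```csharp
// Illustrative sketch of the daily-allowance check; names are hypothetical.
public static class AllowanceCalculator
{
    // Scales a per-100-g nutriment value to the amount actually consumed.
    public static double AmountInServing(double valuePer100g, double servingGrams)
    {
        return valuePer100g * servingGrams / 100.0;
    }

    // Returns the share of the user's daily reference value (e.g. from the DGE
    // tables for their age, sex and activity level) covered by that amount, in percent.
    public static double PercentOfDailyAllowance(double amount, double dailyReferenceValue)
    {
        if (dailyReferenceValue <= 0.0)
        {
            return 0.0;
        }
        return amount / dailyReferenceValue * 100.0;
    }
}

// Example: 25 g of sugar per 100 g, a 40 g serving and a 50 g daily reference
// value give 10 g of sugar, i.e. 20 % of the daily allowance.
```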
Team