Dr. Trust connects doctors and patients by increasing transparency during difficult medical conversations (such as pre-surgery debriefs), ultimately bolstering patient confidence in the medical system and helping healthcare save more lives.
- Devpost: https://devpost.com/software/dr-trust
- YouTube: https://youtu.be/b7dJazhTQVk?si=TBphu_llmTRP2jBX
- Steven Le (Stanford) - Backend Unity with Meta's Presence Platform and Normcore's multiplayer solution.
- Vikram Agrawal (USC) - Frontend Unity with Bezi Integration
- Jessica Sheng (USC) - Design with Blender + Bezi, and Narrative
- Coco Xiong (USC) - Design with Bezi, and Video Editing + Narrative
- Amanda Leong (USC) - Project Manager + Medical Lead + Researcher + Director + Narrative Lead
Dr. Trust is a mixed reality experience that can be used both in person and remotely over a multiplayer internet connection. To run it, you will need:
- Two Meta Quest 3s (or Pros) - one for the doctor and one for the patient
- Optional: additional Quest 3/Pro headsets to bring in a patient's family members (even if they're remote)
- Oculus Touch Controllers (hand tracking will be supported in a future update - controllers offer the required level of precision for now)
If you're simply using Dr. Trust, you just need to load our APK onto your Quest headset (via the Meta Quest Developer Hub or by sideloading).
If you'd like to modify our app on your own computer, you'll need the following software and libraries:
- Unity
- Normcore - realtime multi-user sessions
- OpenXR
- Unity XR Interaction Toolkit
- Unity AR Foundation
- glTFast - rendering glTF and GLB models

Also be sure to open "Build Settings" and switch the target platform to "Android", since Meta Quest is based on Android.
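The platform switch and APK build can also be scripted from the Unity Editor. Below is a minimal sketch, not our shipped build pipeline; the menu path, scene path, and output path are placeholder assumptions you would adapt to your project:

```csharp
// Assumed location: Assets/Editor/QuestBuild.cs (editor-only script).
using UnityEditor;
using UnityEngine;

public static class QuestBuild
{
    [MenuItem("Build/Build Quest APK")] // hypothetical menu entry
    public static void BuildApk()
    {
        // Same effect as Build Settings > Android > Switch Platform.
        EditorUserBuildSettings.SwitchActiveBuildTarget(
            BuildTargetGroup.Android, BuildTarget.Android);

        var options = new BuildPlayerOptions
        {
            scenes = new[] { "Assets/Scenes/Main.unity" }, // assumed scene path
            locationPathName = "Builds/DrTrust.apk",       // assumed output path
            target = BuildTarget.Android,
            options = BuildOptions.None
        };

        var report = BuildPipeline.BuildPlayer(options);
        Debug.Log($"Build finished: {report.summary.result}");
    }
}
```

Running the menu item produces an `.apk` you can then drag into the Meta Quest Developer Hub.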
- Launch the app
- Press and hold the Oculus/Meta button to bring the surgical table in front of you
- To interact with an organ, highlight it with the ray, and press the grab (middle finger) button. It will snap to a floating position once released
- To annotate the body outline, cast a ray onto the body, then hit the trigger (index finger) button. It will paint a red line on the body
- To clear all annotations, select the trash bin at the foot of the surgical table with your right controller
- Press 'A' to select your role ("Doctor" or "Patient") and presence ("Physical" or "Remote"); you'll be given corresponding avatars and name tags, and be able to hear others if remote
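The "snap to a floating position" behavior above can be sketched with the XR Interaction Toolkit. This is an illustrative component, not our actual code; the class name and the choice to freeze the Rigidbody on release are assumptions:

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Hypothetical sketch: an organ that holds its pose in mid-air
// after being released from a ray grab.
[RequireComponent(typeof(XRGrabInteractable), typeof(Rigidbody))]
public class FloatingOrgan : MonoBehaviour
{
    Rigidbody body;

    void Awake()
    {
        body = GetComponent<Rigidbody>();
        // Fires when the grab (middle finger) button is released.
        GetComponent<XRGrabInteractable>().selectExited.AddListener(OnReleased);
    }

    void OnReleased(SelectExitEventArgs _)
    {
        // Zero out motion and disable physics so the organ "snaps"
        // to wherever it was let go.
        body.velocity = Vector3.zero;
        body.angularVelocity = Vector3.zero;
        body.useGravity = false;
        body.isKinematic = true;
    }
}
```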
- Install the Meta Quest Developer Hub
- Go to "Build Settings" and build an APK
- Drag the .apk file into the Meta Quest Developer Hub
- Put on your headset and launch the app
BEZI!!! <3 Bezi made our lives so much easier with such an intuitive way of designing our work. Before learning about it at this hackathon, we'd either struggle with conceptualizing 3D in Figma, painstakingly use a 3D engine like Unity for spatial design, or just give up and sketch on paper. Bezi made our 2 designers feel like 5 or 6, despite this being their first time using it. Thank you Julian and Daniel for the great workshop and consistent support!
Sebastian Yang, thank you so much for your amazing dedication to our cause. We went from dissecting bodies on a beach to increasing trust and transparency in surgery.
Steve Lukas, you're the one who lights the path for the younger generation. We love your humor, humility, and motivation to seamlessly integrate XR into a specific purpose of life.
Working on Dr. Trust has been such an incredible and purposeful joy. So the biggest thank you goes to MIT Reality Hack and all the amazing, hard-working event coordinators who made this all possible! An event of this scale can't be easy, and we're so grateful to you for making it happen <3
