diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/.gitignore b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/.gitignore new file mode 100644 index 00000000..ef31309b --- /dev/null +++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/.gitignore @@ -0,0 +1,80 @@ +# Built application files +*.apk +# *.aar +*.ap_ +*.aab + +# Files for the ART/Dalvik VM +*.dex + +# Java class files +*.class + +# Generated files +bin/ +gen/ +out/ +# Uncomment the following line in case you need and you don't have the release build type files in your app +# release/ + +# Gradle files +.gradle/ +build/ + +# Local configuration file (sdk path, etc) +local.properties + +# Proguard folder generated by Eclipse +proguard/ + +# Log Files +*.log + +# Android Studio Navigation editor temp files +.navigation/ + +# Android Studio captures folder +captures/ + +# IntelliJ +.idea/* + +# Keystore files +# Uncomment the following lines if you do not want to check your keystore files in. +#*.jks +#*.keystore + +# External native build folder generated in Android Studio 2.2 and later +.externalNativeBuild +.cxx/ + +# Google Services (e.g. APIs or Firebase) +# google-services.json + +# Freeline +freeline.py +freeline/ +freeline_project_description.json + +# fastlane +fastlane/report.xml +fastlane/Preview.html +fastlane/screenshots +fastlane/test_output +fastlane/readme.md + +# Version control +vcs.xml + +# lint +lint/intermediates/ +lint/generated/ +lint/outputs/ +lint/tmp/ +# lint/reports/ + +# Android Profiling +*.hprof + +# Holds api keys +secrets.properties diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/LICENSE b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/LICENSE new file mode 100644 index 00000000..ccb93119 --- /dev/null +++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/LICENSE @@ -0,0 +1,21 @@ +MIT License + +Copyright (c) Meta Platforms, Inc. and its affiliates. + +Permission is hereby granted, free of charge, to any person obtaining a copy +of this software and associated documentation files (the "Software"), to deal +in the Software without restriction, including without limitation the rights +to use, copy, modify, merge, publish, distribute, sublicense, and/or sell +copies of the Software, and to permit persons to whom the Software is +furnished to do so, subject to the following conditions: + +The above copyright notice and this permission notice shall be included in all +copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR +IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, +FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE +AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER +LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, +OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE +SOFTWARE. 
\ No newline at end of file
diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/README.md b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/README.md
new file mode 100644
index 00000000..06ba0c72
--- /dev/null
+++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/README.md
@@ -0,0 +1,194 @@
+# Meta Spatial Scanner for Smart Things
+
+Application developed by Javier González Peregrín for demonstration and learning purposes in a
+Master's Thesis (TFM) - Master in Computer Engineering at the University of Granada (UGR).
+
+## Brief Description
+-----------------
+This application is a Mixed Reality example combining the Meta Spatial SDK (VR/AR), object detection
+(ML Kit/MediaPipe), and Smart Home functionality. It detects objects in the scene, creates
+information panels in 3D space, and persists/manages entities associated with smart devices.
+
+The project combines scene logic (entities/components/systems), a feature layer (object detection,
+MRUK/room mapping), and a small integration with a Smart Home backend/service. For proper operation,
+the environment must be correctly configured.
+
+The project employs clean architecture and dependency injection (Koin).
+
+## Table of Contents
+------------------------
+
+1. [Main Description](#1-main-description)
+2. [How to Run the App](#2-how-to-run-it)
+   1. [Requirement: Home Assistant / Smart Home API](#21-home-assistant-requirement)
+   2. [Add new Smart Thing](#22-add-new-smart-thing)
+3. [App Architecture (Clean Architecture Applied)](#3-app-architecture-clean-arch-on-app-and-features)
+4. [Dependency Injection with Koin](#4-dependency-injection-with-koin)
+5. [Technical Notes and Troubleshooting](#5-technical-notes-and-troubleshooting)
+
+# 1. Main Description
+-------------------
+The app is based on the Meta Spatial SDK and provides:
+
+- Scene: 3D scene with entities and interactive panels.
+- Detection: object detection on camera (ML Kit / MediaPipe) and creation of representative entities
+  in the scene.
+- Anchoring: capability to "anchor" (MRUK) detected objects in real space and persist their
+  position/orientation.
+- Components: a `FollowHead` component that makes panels face the user's headset.
+- Persistence: local storage (Room) to remember the pose (`Pose`) of each added device.
+
+**Main Use Cases:**
+
+- Detection: detect an object in the camera and show an anchored information panel in the scene.
+- Storage: save the panel/device pose in SQLite (Room) to restore it later.
+- Management: add, update, and remove persisted objects from the UI or code.
+
+An example of how the different features and Home Assistant interact:
+![img.png](documentation/images/img.png)
+![img.png](documentation/images/clean_arch_features.png)
+![img.png](documentation/images/samsung_state_media_player.png)
+
+# 2. How to run it
+-----------------
+**Prerequisites:**
+
+- Hardware: Meta Quest 3 or higher with the Meta Spatial development environment configured.
+- Environment: JDK 17, NDK 29.0.14206865.
+- Android: Android SDK / Android Studio (recommended) with the API versions indicated in
+  `app/build.gradle.kts`.
+- Server: a Home Assistant server to perform queries on real devices.
+
+**Configure Sensitive Variables:**
+
+The build reads `HTTP_API` and `HOME_ASSISTANT_TOKEN` from `local.properties` via
+`gradleLocalProperties` (see `app/build.gradle.kts`). Place the URL and token there if the flow
+requires the Smart Home API.
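+
+A minimal `local.properties` sketch (the values below are placeholders; use your own Home Assistant
+URL and long-lived access token):
+
+```properties
+# hypothetical example values; adjust to your own setup
+HTTP_API=http://192.168.1.50:8123/
+HOME_ASSISTANT_TOKEN=<your-long-lived-access-token>
+```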
+
+Also update `app/src/main/res/xml/network_security_config.xml` with the server's known IP.
+
+**Useful Commands (from repo root):**
+
+```bash
+# Clean and compile the app
+./gradlew :app:clean :app:assembleDebug
+
+# Install on connected device (if configured)
+./gradlew :app:installDebug
+```
+
+You can also open the project in Android Studio and run/debug it as usual.
+
+## **[User Manual](documentation/MANUAL.md)**
+
+
+## 2.1 Home Assistant Requirement
+-------------------------------------
+Some functionalities depend on a Smart Home service (API) to obtain device metadata or execute
+actions. The app expects that:
+
+- Service: you have an accessible Smart Home service (e.g., Home Assistant or a mock REST API).
+- Configuration: you pass the URL/token to the build via the `HTTP_API` and `HOME_ASSISTANT_TOKEN`
+  variables in `local.properties`.
+
+Without this service, device-related functionality may degrade to a local (mock) mode, but object
+detection and AR/VR interaction will continue to work locally.
+
+The main types of supported smart devices are:
+
+- Lights: smart lights.
+- Plugs: smart plugs.
+- Media: smart media players.
+- Weather: weather stations.
+
+## 2.2 Add new Smart Thing
+
+Add the new type to the Domain interface, then implement how it is fetched in the Domain Mapper,
+including its attributes (if needed).
+
+It is highly recommended to use one of the existing Domains as a template, or even to reuse one if
+the new device only differs in a few attributes.
+
+You can
+check [Adding a Weather Station Smart Thing](https://github.com/javigp2002/spatial_sdk/pull/10) as
+an example.
+
+
+# 3. App Architecture (clean arch on app and features)
+---------------------------------------------------
+The project follows a structure with separation of concerns and a "clean architecture" approach for
+features:
+
+- App: `app/` - main layer and configuration (Main Activity, systems/components registration,
+  Spatial SDK configuration).
+- Features: `feature/*` - each feature contains its own separation:
+  - Domain: `domain/` - models and contracts (repository interfaces, use cases if applicable).
+  - Datasource: `datasource/` - concrete implementations (network, local/Room). Concrete mappers and
+    repositories are found here.
+  - Presentation: `presentation/` or `ui/` - UI/Compose elements (where applicable) and scene entity
+    managers.
+
+**Practical Example in this Repo:**
+
+- Feature: `feature/MRUKSidePanelRaycasterFeature` - responsible for saving/retrieving MRUK
+  entities (scene anchors). It has:
+  - Domain: `domain/repository/IMRUKObjectsRepository` (high-level contract).
+  - Datasource: implementation `MRUKObjectsRepositoryImpl`, which orchestrates entity creation in
+    the scene and local persistence.
+  - Local: `datasource/local` - Room implementation (`MrukEntity`, `MrukDao`, `MrukDatabase`,
+    `MrukLocalDatasource`).
+
+- Detection: `feature/objectdetection` - detection logic with ML Kit/MediaPipe and panel creation.
+  It maintains its own datasource and repository layer to separate responsibilities.
+
+**Benefits of this Approach:**
+
+- Testing: domain logic is easy to test by mocking repositories.
+- Substitution: implementations can be replaced (e.g., using another backend or persistence) without
+  breaking UI logic.
+
+# 4. Dependency Injection with Koin
+--------------------------------
+The project uses Koin for dependency injection (DI). Main modules and singletons are registered in
+`DiApplication.kt`.
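+
+As a rough sketch of that bootstrap (assuming a standard Koin setup; the real bindings live in
+`DiApplication.kt` and may differ):
+
+```kotlin
+import android.app.Application
+import org.koin.android.ext.koin.androidContext
+import org.koin.core.context.startKoin
+import org.koin.dsl.module
+
+// Hypothetical app module: bind the repository contract to its implementation
+val appModule = module {
+    single<IMRUKObjectsRepository> { MRUKObjectsRepositoryImpl(get(), get()) }
+}
+
+class DiApplication : Application() {
+    override fun onCreate() {
+        super.onCreate()
+        // Start the Koin container and register the module at app launch
+        startKoin {
+            androidContext(this@DiApplication)
+            modules(appModule)
+        }
+    }
+}
+```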
+
+**What is registered in Koin (examples):**
+
+- Repositories: feature repositories (e.g., `IMRUKObjectsRepository` → `MRUKObjectsRepositoryImpl`).
+- Datasources: concrete datasources (e.g., `MrukLocalDatasource`, which wraps `MrukDao`).
+- Network: network clients (e.g., `SmartHomeApi`) and utilities.
+
+**How to extend/add bindings:**
+
+- Add: add an entry in `appModule` within `DiApplication.kt`.
+- Use: use `get()` to resolve the necessary dependencies in the implementation.
+
+**Short Usage Example in Code:**
+
+```kotlin
+class SomeFeature(private val repo: IMRUKObjectsRepository) {
+    suspend fun add(model: MrukRaycastModel) {
+        repo.addMRUKObject(model)
+    }
+}
+
+// in Koin module:
+// single { MRUKObjectsRepositoryImpl(get(), get()) }
+```
+
+# 5. Technical Notes and Troubleshooting
+----------------------------------
+
+- Room: the `app` module includes Room (dependencies and KSP).
+- Components: there is a `FollowHead` component and a `FollowHeadSystem` that makes panels face
+  the headset.
+- Resources: some component IDs are generated from `app/src/main/components/*.xml`. If changes are
+  introduced in those schemas, rebuild the project to regenerate classes/IDs.
+- Dependencies: check `libs.versions.toml` and `app/build.gradle.kts` if Meta Spatial SDK versions
+  need updating.
+
+# Contact / Contributions
+-------------------------
+This work started as a fork
+of [Spatial Scanner](https://github.com/meta-quest/Meta-Spatial-SDK-Samples), with changes made on
+top of it.
\ No newline at end of file
diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/.gitignore b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/.gitignore
new file mode 100644
index 00000000..b5a3d267
--- /dev/null
+++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/.gitignore
@@ -0,0 +1,2 @@
+/build
+/src/main/assets/scenes
\ No newline at end of file
diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/build.gradle.kts b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/build.gradle.kts
new file mode 100644
index 00000000..b1dbdb5d
--- /dev/null
+++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/build.gradle.kts
@@ -0,0 +1,173 @@
+import com.android.build.gradle.internal.cxx.configure.gradleLocalProperties
+import java.util.Properties
+
+plugins {
+  alias(libs.plugins.android.application)
+  alias(libs.plugins.jetbrains.kotlin.android)
+  alias(libs.plugins.meta.spatial.plugin)
+  alias(libs.plugins.jetbrains.kotlin.plugin.compose)
+  id("com.google.devtools.ksp")
+}
+
+val httpApi: String = gradleLocalProperties(rootDir, providers).getProperty("HTTP_API")
+val homeAssistantToken: String =
+    gradleLocalProperties(rootDir, providers).getProperty("HOME_ASSISTANT_TOKEN")
+
+android {
+  namespace = "com.meta.pixelandtexel.scanner"
+  compileSdk = 35
+
+  defaultConfig {
+    applicationId = "com.meta.pixelandtexel.scanner"
+    minSdk = 29
+    //noinspection ExpiredTargetSdkVersion
+    targetSdk = 32
+    versionCode = 12
+    versionName = "1.0"
+
+    testInstrumentationRunner = "androidx.test.runner.AndroidJUnitRunner"
+
+    // Update the ndkVersion to the right version for your app
+    ndkVersion = "28.0.13004108"
+
+    // Pass the Home Assistant configuration to BuildConfig
+    buildConfigField("String", "HTTP_API", "\"$httpApi\"")
+    buildConfigField("String", "HOME_ASSISTANT_TOKEN", "\"$homeAssistantToken\"")
+  }
+
+  packaging {
+    resources.excludes.add("META-INF/LICENSE")
+    resources.excludes.add("META-INF/INDEX.LIST")
resources.excludes.add("META-INF/io.netty.versions.properties") + } + + lint { abortOnError = false } + + buildTypes { + release { + isMinifyEnabled = false + proguardFiles(getDefaultProguardFile("proguard-android-optimize.txt"), "proguard-rules.pro") + } + } + buildFeatures { + compose = true + buildConfig = true + } + compileOptions { + sourceCompatibility = JavaVersion.VERSION_17 + targetCompatibility = JavaVersion.VERSION_17 + } + kotlinOptions { jvmTarget = "17" } + + androidResources { noCompress.addAll(listOf(".tflite", ".lite", ".caffemodel")) } + buildToolsVersion = "35.0.0" + ndkVersion = "29.0.14206865" +} + +dependencies { + implementation(libs.androidx.core.ktx) + testImplementation(libs.junit) + androidTestImplementation(libs.androidx.junit) + androidTestImplementation(libs.androidx.espresso.core) + + implementation(libs.androidx.appcompat) + implementation(libs.androidx.constraintlayout) + implementation(libs.androidx.navigation.compose) + + // Jetpack Compose + + implementation(platform(libs.androidx.compose.bom)) + androidTestImplementation(platform(libs.androidx.compose.bom)) + // material design 3 + implementation(libs.androidx.material3) + // Android Studio Preview support + implementation(libs.androidx.ui.tooling.preview) + debugImplementation(libs.androidx.ui.tooling) + // View Model support + implementation(libs.androidx.lifecycle.runtime.ktx) + implementation(libs.androidx.lifecycle.viewmodel.compose) + // Activity integration + implementation(libs.androidx.activity.compose) + + // Meta Spatial SDK libs + implementation(libs.meta.spatial.sdk.base) + implementation(libs.meta.spatial.sdk.compose) + implementation(libs.meta.spatial.sdk.isdk) + implementation(libs.meta.spatial.sdk.toolkit) + implementation(libs.meta.spatial.sdk.uiset) + implementation(libs.meta.spatial.sdk.vr) + implementation(libs.meta.spatial.sdk.mruk) + + // Mediapipe CV object detection + implementation(libs.google.mediapipe.tasks.vision) + // ML Kit object detection + implementation(libs.google.mlkit.object1.detection) + implementation(libs.google.mlkit.object1.detection.custom) + // Open CV + implementation(libs.opencv) + + // For Markdown formatting in Jetpack Compose + implementation(libs.compose.markdown) + + implementation(libs.retrofit) + implementation(libs.retrofit2.kotlinx.serialization.converter) + implementation(libs.google.gson) + implementation ("com.squareup.retrofit2:converter-gson:2.9.0") + + + // Http server for video streaming + implementation(libs.ktor.server.core) + implementation(libs.ktor.server.netty) + implementation(libs.aws.bedrockruntime) + + // Koin for Android + implementation(libs.koin.android) + + implementation ("com.google.code.gson:gson:2.13.2") + + // database + implementation(libs.androidx.room.runtime) + ksp(libs.androidx.room.compiler) +} + +afterEvaluate { tasks.named("assembleDebug") { dependsOn("export") } } + +// function to load properties from a .properties file +fun getLocalProperty(key: String, project: Project): String { + val properties = Properties() + val propertiesFile = project.rootProject.file("secrets.properties") + if (propertiesFile.exists()) { + propertiesFile.inputStream().use { inputStream -> properties.load(inputStream) } + } + return properties.getProperty(key, "") +} + +val projectDir = layout.projectDirectory +val sceneDirectory = projectDir.dir("scenes") + +spatial { + allowUsageDataCollection.set(true) + scenes { + // if you have installed Meta Spatial Editor somewhere else, update the file path. 
+ // cliPath.set("/Applications/Meta Spatial Editor.app/Contents/MacOS/CLI") + exportItems { + item { + projectPath.set(sceneDirectory.file("Main.metaspatial")) + outputPath.set(projectDir.dir("src/main/assets/scenes")) + } + } + hotReload { + appPackage.set("com.meta.pixelandtexel.scanner") + appMainActivity.set(".MainActivity") + assetsDir.set(File("src/main/assets")) + } + shaders { + sources.add( + // replace with your shader directory + projectDir.dir("src/shaders") + ) + } + } +} diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/proguard-rules.pro b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/proguard-rules.pro new file mode 100644 index 00000000..481bb434 --- /dev/null +++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/proguard-rules.pro @@ -0,0 +1,21 @@ +# Add project specific ProGuard rules here. +# You can control the set of applied configuration files using the +# proguardFiles setting in build.gradle. +# +# For more details, see +# http://developer.android.com/guide/developing/tools/proguard.html + +# If your project uses WebView with JS, uncomment the following +# and specify the fully qualified class name to the JavaScript interface +# class: +#-keepclassmembers class fqcn.of.javascript.interface.for.webview { +# public *; +#} + +# Uncomment this to preserve the line number information for +# debugging stack traces. +#-keepattributes SourceFile,LineNumberTable + +# If you keep the line number information, uncomment this to +# hide the original source file name. +#-renamesourcefileattribute SourceFile \ No newline at end of file diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/scenes/Composition/Main.metaspatialcomp b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/scenes/Composition/Main.metaspatialcomp new file mode 100644 index 00000000..c5d13f3f --- /dev/null +++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/scenes/Composition/Main.metaspatialcomp @@ -0,0 +1,12 @@ +scene: docref:Main.scene +entities: + com.meta.models.AssetRoot: + - components: + com.meta.components.AssetMetadata: + {} + com.meta.components.AssetRoot: + defaultScene: ref:Scene + scenes: + - ref:Scene +metadata: + version: 1.35 \ No newline at end of file diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/scenes/Composition/Main.scene b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/scenes/Composition/Main.scene new file mode 100644 index 00000000..8f066667 --- /dev/null +++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/scenes/Composition/Main.scene @@ -0,0 +1,260 @@ +entities: + com.meta.models.Scene: + - components: + com.meta.components.Name: + {} + com.meta.components.Scene: + nodes: + - ref:WelcomePanel + - ref:CameraControlsPanel + - ref:ShowSmartButtonPanel + - ref:SmartThingButtonDeletePanel + tag: Scene + com.meta.models.SceneNode: + - components: + com.meta.components.Animatable: + {} + com.meta.components.Name: + name: CameraControlsPanel + com.meta.components.PointerNodeInverseComponent: + {} + com.meta.components.SceneNode: + rotation.format: Euler + scale: + - 0.0399999991 + - 0.0399999991 + - 1 + componentVersion: 1 + com.meta.components.Visibility: + {} + com.meta.pixelandtexel.scanner.WristAttached: + faceUser: true + rotation: + - -30 + - -55 + - -90 + position: + - 0.0799999982 + - -0.0399999991 + - -0.0199999996 + com.meta.spatial.toolkit.Panel: + panel: "@layout/ui_camera_controls_view" + 
com.meta.spatial.toolkit.PanelDimensions: + {} + tag: CameraControlsPanel + - components: + com.meta.components.Animatable: + {} + com.meta.components.Name: + name: WelcomePanel + com.meta.components.PointerNodeInverseComponent: + {} + com.meta.components.SceneNode: + rotation.data: + - 0 + - 3.14159274 + - 0 + - 0 + rotation.format: Euler + scale: + - 0.370000005 + - 0.400000006 + - 1 + translation: + - 0 + - 1.70000005 + - -1 + componentVersion: 1 + com.meta.components.Visibility: + visible: true + com.meta.spatial.toolkit.Grabbable: + type: 1 + com.meta.spatial.toolkit.Panel: + panel: "@integer/welcome_panel_id" + com.meta.spatial.toolkit.PanelDimensions: + {} + tag: WelcomePanel + - components: + com.meta.components.Animatable: + {} + com.meta.components.Name: + name: min + com.meta.components.PointerNodeInverseComponent: + {} + com.meta.components.SceneNode: + rotation.format: Euler + scale: + - 1 + - 1 + - 1 + translation: + - -0.423324108 + - -0.812217712 + - 0 + componentVersion: 1 + com.meta.components.Visibility: + visible: true + - components: + com.meta.components.Animatable: + {} + com.meta.components.Name: + name: max + com.meta.components.PointerNodeInverseComponent: + {} + com.meta.components.SceneNode: + rotation.format: Euler + scale: + - 1 + - 1 + - 1 + translation: + - 0.422205925 + - 0.823295593 + - 0.772341907 + componentVersion: 1 + com.meta.components.Visibility: + visible: true + - components: + com.meta.components.Animatable: + {} + com.meta.components.Name: + name: ShowSmartButtonPanel + com.meta.components.PointerNodeInverseComponent: + {} + com.meta.components.SceneNode: + rotation.format: Euler + scale: + - 0.0399999991 + - 0.0399999991 + - 1 + componentVersion: 1 + com.meta.components.Visibility: + {} + com.meta.pixelandtexel.scanner.WristAttached: + position: + - 0.0700000003 + - -0.0799999982 + - -0.0299999993 + rotation: + - -30 + - -55 + - -90 + faceUser: true + com.meta.spatial.toolkit.Panel: + panel: "@layout/ui_show_smart_things_button_view" + com.meta.spatial.toolkit.PanelDimensions: + {} + tag: ShowSmartButtonPanel + - components: + com.meta.components.Animatable: + {} + com.meta.components.Name: + name: min + com.meta.components.PointerNodeInverseComponent: + {} + com.meta.components.SceneNode: + rotation.format: Euler + scale: + - 1 + - 1 + - 1 + translation: + - -0.501837969 + - -0.297210693 + - 0.00251603499 + componentVersion: 1 + com.meta.components.Visibility: + visible: true + - components: + com.meta.components.Animatable: + {} + com.meta.components.Name: + name: max + com.meta.components.PointerNodeInverseComponent: + {} + com.meta.components.SceneNode: + rotation.format: Euler + scale: + - 1 + - 1 + - 1 + translation: + - 0.506229162 + - 0.302787781 + - 0.0970481634 + componentVersion: 1 + com.meta.components.Visibility: + visible: true + - components: + com.meta.components.Animatable: + {} + com.meta.components.Name: + name: min + com.meta.components.PointerNodeInverseComponent: + {} + com.meta.components.SceneNode: + rotation.format: Euler + scale: + - 1 + - 1 + - 1 + translation: + - -0.220475674 + - -0.4633255 + - 0 + componentVersion: 1 + com.meta.components.Visibility: + visible: true + - components: + com.meta.components.Animatable: + {} + com.meta.components.Name: + name: max + com.meta.components.PointerNodeInverseComponent: + {} + com.meta.components.SceneNode: + rotation.format: Euler + scale: + - 1 + - 1 + - 1 + translation: + - 0.21383369 + - 0.453491211 + - 0.056432277 + componentVersion: 1 + com.meta.components.Visibility: + 
visible: true + - components: + com.meta.components.Animatable: + {} + com.meta.components.Name: + name: SmartThingButtonDeletePanel + com.meta.components.PointerNodeInverseComponent: + {} + com.meta.components.SceneNode: + rotation.format: Euler + scale: + - 0.0399999991 + - 0.0399999991 + - 1 + componentVersion: 1 + com.meta.components.Visibility: + {} + com.meta.pixelandtexel.scanner.WristAttached: + position: + - 0.0700000003 + - -0.109999999 + - -0.0299999993 + rotation: + - -30 + - -55 + - -100 + faceUser: true + com.meta.spatial.toolkit.Panel: + panel: "@layout/ui_delete_smart_things_button_view" + com.meta.spatial.toolkit.PanelDimensions: + {} + tag: SmartThingButtonDeletePanel +metadata: + version: 1.35 \ No newline at end of file diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/scenes/LICENSE.md b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/scenes/LICENSE.md new file mode 100644 index 00000000..f734505f --- /dev/null +++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/scenes/LICENSE.md @@ -0,0 +1,137 @@ +Meta Platform Technologies SDK License Agreement +Effective date: October 25, 2022 + +Copyright © Meta Platform Technologies, LLC and its affiliates. All rights reserved. + +The text of this may be found at: https://developer.oculus.com/licenses/oculussdk/ + +This Meta Platforms Technologies SDK License Agreement, formerly known as the Oculus SDK License Agreement, (“Agreement”) is a legal agreement between you and MPT (defined below) governing your use of our MPT Software Development Kit, formerly known as the Oculus Software Developer Kit. MPT Software Development Kit means any application programming interfaces (“APIs”), tools, plugins, code, technology, specification, documentation, Platform Services, and/or content made available by us to others, including app developers and content providers (collectively, the “SDK”). + +By downloading or using our SDK, you are agreeing to this Agreement along with other applicable terms and conditions such as the additional terms or documents accompanying the SDK and the Terms of Service, and acknowledging our Privacy Policy (collectively, the “Terms”). If you use the SDK as an interface to, or in conjunction with other MPT products or services, then the terms for those other products or services also apply. + +Here, "MPT" means Meta Platforms Technologies, LLC, formerly known as Facebook Technologies, LLC (formerly known as Oculus VR, LLC), a Delaware limited liability company with its principal place of business at 1 Hacker Way, Menlo Park, California 94025, United States unless set forth otherwise. We may also refer to MPT as "we", "our", or "us" in this Agreement. + +You may not use the SDK and may not accept this Agreement if (1) you are a person with whom MPT is prohibited from transacting business under applicable law, or (2) you are a person barred from using or receiving the SDK by MPT or under the applicable laws of the United States or other countries including the country in which you are resident or from which you use the SDK. If you are using the SDK on behalf of an entity, you represent and warrant that you have authority to bind that entity to this Agreement and by accepting this Agreement, you are doing so on behalf of that entity (and all references to "you" in this Agreement refer to that entity). 
+ +This Agreement requires the resolution of most disputes between you and MPT by binding arbitration on an individual basis; class actions and jury trials are not permitted. + +1. License Grant +1.1 License. Subject to the Terms and the restrictions set forth in this Agreement, MPT hereby grants you a limited, royalty-free, non-exclusive, non-transferrable, non-sublicensable (except as otherwise set forth in this Agreement), revocable copyright license (“License”) during the term of this Agreement to use and reproduce the SDK solely to develop, test, and/or distribute your Application (defined below) and to enable you and/or your end users to access MPT features through your Application. You may only use the SDK to develop Applications in connection with MPT approved hardware and software products (“MPT Approved Products”) unless the documentation accompanying the SDK expressly authorizes broader use such as with other third-party platforms. + +1.1.1 If the SDK includes any libraries, sample source code, or other materials that we make available specifically for incorporation in your Application (as indicated by applicable documentation), you may incorporate those materials and reproduce and distribute them as part of your Application, including by distributing those materials to third parties contributing to your Application. + +1.1.2 The SDK may include other content (e.g., sample code) that is for demonstration, reference, or other purposes and is subject to terms and conditions included with such materials. Such materials will be clearly marked in the applicable documentation. Absent such additional terms and conditions, you may modify, distribute, and sublicense any sample source made available as part of the SDK pursuant to this Agreement and the Terms. + +1.1.3 The SDK may include MPT content that is subject to your additional right to display the content to your end users through the use of the corresponding SDK, as contemplated by the documentation accompanying such SDK. For example, the SDK may include avatars that you may display to your end users. + +1.2 General Restrictions. The License grant in this Section is solely for the purpose of developing, testing, and promoting your engines, tools, applications, content, games and demos, or other products and features (collectively, “Application”) and providing you and/or your end users access to MPT services and features through your Application as contemplated by applicable documentation accompanying the SDK. 
You may not (or allow those acting on your behalf to): + +1.2.1 modify or create derivative works from any SDK or its component (other than sample source code described in this Section or expressly authorized by the documents accompanying the SDK); + +1.2.2 misrepresent or mask either your identity or your Application's identity when using the SDK or developer accounts; + +1.2.3 attempt to circumvent any limitations implemented within or documented with the SDK (e.g., limiting the number of requests you may make, end users you may serve, or processing abstracted gaze data or abstracted facial expressions data for prohibited uses under the Developer Data Use Policy); + +1.2.4 reverse engineer, decompile, disassemble, or otherwise attempt to extract the source code from the SDK, except to the extent that applicable law expressly permits such actions despite this limitation; + +1.2.5 alter, restrict, or interfere with the normal operation or functionality of the SDK, the MPT hardware or software, or MPT Approved Products, including, but not limited to: (a) the behavior of the “Meta Quest button” and “XBox button” implemented by the MPT system software; (b) any on-screen messages or information; (c) the behavior of the proximity sensor in the MPT hardware implemented by the MPT system software; (d) any MPT hardware or software security features; (e) any end user's settings; and (f) Health and Safety Warnings; + +1.2.6 use the SDK or your Application in a manner that violates: (a) the Developer Data Use Policy (where applicable); (b) the Content Guidelines, or other applicable terms and policies made available on our Developer Policy portal; (c) any rights of MPT, its affiliates or third parties; (d) applicable laws (such as laws regarding import, export, privacy, health & safety); or (e) other terms of service with MPT or its affiliates; + +1.2.7 remove, obscure, or alter any Terms or any links to or notices of those Terms; or + +1.2.8 use or redistribute the SDK or any portion thereof in any manner that would cause the SDK (or any portion thereof) or MPT to become subject to the terms of any open source license or other restrictions. + +1.3 Distribution and Sublicense Restrictions. The redistribution and sublicense rights under this Section are further subject to the following restrictions: (1) redistribution of sample source code or other materials must include the following copyright notice: “Copyright © Meta Platform Technologies, LLC and its affiliates. All rights reserved;” and (2) if the sample source code or other materials include a "License" or "Notice" text file, you must provide a copy of the License or Notice file with the sample code. + +1.4 Privacy and Security. + +1.4.1 You are responsible for the data collection, processing and disclosure by your Application and agree to comply with all applicable privacy and data protection laws, as well as our applicable terms and policies, particularly the Developer Data Use Policy. You represent and warrant that you have provided robust and sufficiently prominent notice to users regarding (i) data processing that includes, at a minimum, that third parties, including MPT and its affiliates, may collect or receive information from your Application, and (ii) any other information required to be disclosed to users by applicable privacy and data protection laws. 
You represent and warrant that you will not back up or make available to the Cloud Backup feature any information that you know or reasonably should know (a) is from or about children under the age of 13, or (b) includes data concerning health, financial information, or other categories of sensitive information (including any information defined as special or sensitive under applicable laws, regulations, and applicable industry guidelines). + +For purposes of the GDPR, you acknowledge and agree that you are a separate and independent controller of the Developer User Data (as defined in the MPT Developer Data Use Policy) and Meta Platforms Ireland Ltd. (an affiliate of MPT) is a separate and independent controller for any processing of personal data, except as provided in Section 1.4.3, including the MPT User Data (as defined in the MPT Developer Data Use Policy). The parties do not and will not process Developer User Data or MPT User Data as joint controllers. Each party shall comply with the obligations that apply to it as a controller under the GDPR, and each party shall be individually and separately responsible for its own compliance. + +1.4.3 Notwithstanding the foregoing, where (a) we process Developer User Data that contains personal data to (i) store, host or otherwise backup Developer User Data through Cloud Backup; (ii) provide and operate the Spatial Audio VoIP API; (iii) or provide other services described in the Data Processing Terms (the “Services”), and (b) (i) our processing of such personal data is subject to the GDPR, you instruct Meta Platforms Ireland Ltd. to process such personal data in order to provide the Services pursuant to this Agreement and the Data Processing Terms, which are incorporated herein by reference, including for product improvement for your benefit and to anonymize the data so that it is no longer Personal Data for the purposes of GDPR, and you understand and agree that we may retain such anonymized metadata for our own legitimate purposes, including improvement and development of the Services; or (ii) such personal data is considered “personal information,” “personal data,” “personally identifiable information” or similar terms and is subject to the California Consumer Privacy Act of 2018 (“CCPA”), or other applicable privacy and data protection laws (excluding the GDPR), we will only retain, use and disclose such personal data for the purposes of providing those Services to you and improving those Services on your behalf or as otherwise permitted by the CCPA and such other applicable privacy and data protection laws. + +1.4.4 “Personal data,” “controller,” “processor,” and “process” in this Section 1.4 have the meanings set out in the Data Processing Terms. + +1.5 You have no obligations under this Agreement to license or make available your Application to MPT, its affiliates, or any third parties. Nothing in this Agreement obligates MPT or its affiliates to enable you or any of your Applications to access, interact with, or retrieve or publish content to the Meta VR platform or any other MPT platforms or service. However, MPT and/or its affiliates may require you to agree to additional terms as a condition of providing you with such platform services in connection with your use of the SDK. You acknowledge and agree that MPT and its affiliates may develop products or services that may compete with your Application or any other products or services of yours. + +1.6 Experimental Features. 
From time to time, MPT may, in its sole discretion, make available to you as part of the MPT Software Development Kit, certain experimental, test or beta software, APIs or features on a limited or test basis (“Experimental Features”). Experimental Features can only be used for experimental or testing purposes and cannot be incorporated into a production build unless (i) the Experimental Feature has been released or included in MPT software production builds or (ii) otherwise permitted by MPT in writing. Your use of any Experimental Feature is voluntary. You agree that all use of any Experimental Feature is at your sole risk. You agree that once you use an Experiment Feature, your content, data and/or systems may be affected, and you may be unable to revert back to a prior version of the same or similar feature. Additionally, if such reversion is possible, you may not be able to return or restore data created or transferred using the Experimental Feature back to the prior version. The Experimental Features may not work in the same way as a final production version. MPT and its affiliates make no representations or warranties that the Experimental Features will function or be free from errors. The Experimental Features are provided on an “as is” basis and may contain errors or inaccuracies that could cause failures, corruption or loss of data and information from any connected device or service. MPT and its affiliates have no obligation to correct bugs, defects, or errors or otherwise support or maintain Experimental Features. MPT and its affiliates may discontinue, update, modify or remove access to any Experimental Feature at any time in its sole discretion, and may not release a final version of an Experimental Feature in its sole discretion. + +1.7 Medical & HIPAA Use Restrictions. The SDK in association with MPT Approved Products is not intended to be a medical device unless otherwise specified in writing by MPT. The use of the SDK is not intended to create, and should not be interpreted as creating, any obligations to MPT under the Health Insurance Portability and Accountability Act of 1996, the Health Information Technology for Economic and Clinical Health Act, or their implementing regulations, as amended (collectively “HIPAA”), and MPT makes no representations that the SDK satisfies HIPAA requirements. If you are or become a “Covered Entity”, “Business Associate”, “Subcontractor”, or a “Workforce” member of a Covered Entity, Business Associate, or Subcontractor (as those terms are defined at 45 C.F.R. § 160.103), you agree not to use the SDK to create, receive, maintain or transmit any “Protected Health Information” (as that term is defined at 45 C.F.R. § 160.103) in any manner that would make MPT (or any of its affiliates and subsidiaries) your or any third party’s Business Associate or Subcontractor Business Associate. You are solely responsible for any applicable HIPAA compliance obligations. + +2. Meta VR Platform Services +MPT and/or its affiliates makes certain Platform Services (defined below) available to you to include and enable in your Application on our Platform. An Application that enables or includes any Platform Service must implement the Meta VR Platform Framework (defined below) with the Application. Once your Application has been authorized for use of the Platform Services, you are not required to update your Application to include new Platform Services that MPT and/or its affiliates may make available as part of the Meta VR Platform Framework. 
For more information, please visit https://developer.oculus.com. + +2.1 For the purpose of this Section, + +2.1.1 “Application Services” means services provided by MPT and/or its affiliates associated with the Platform, including, but not limited to, in-app purchasing, multiplayer matchmaking, friends, leader boards, achievements, Virtual Reality Real Time Systems (“VERTS”), voice over IP and Cloud Backup, which list may be changed from time to time in MPT's or its affiliates’ sole discretion. + +2.1.2 "Meta VR Platform Framework" means the suite of Meta VR platform services, including, but not limited to, the MPT file distribution and update system (enabling distribution and updates of Applications by MPT and/or its affiliates, including through generated activation Keys), entitlement system, and account authentication, which list may be changed from time to time in MPT or its affiliates’ sole discretion. + +2.1.3 "Platform" means the virtual, mixed, and augmented reality platform made available by MPT and/or its affiliates, including, but not limited to, the user experience, user interface, store, and social features, usable on hardware approved by MPT or its affiliates or any third-party device or operating system, including, but not limited to, iOS, Android, Windows, OS X, Linux, and Windows Mobile. + +2.1.4 "Platform Services" means the Meta VR Platform Framework and the Application Services. + +2.2 Key Provision and Redemption. If you request that MPT generate activation keys for your Application on the Platform ("Keys") and MPT agrees, you hereby grant MPT and its affiliates (1) the right to generate Keys for you and (2) a license to make available, reproduce, distribute, perform, and display the Application to end users who have submitted a Key to MPT or its affiliates. MPT agrees to authenticate and make the Application available to any end user supplying a valid Key (or have its affiliates do so) (unless the Application has been removed or withdrawn). + +2.3 Platform Services Requirements. You will not make any use of any API, software, code or other item or information supplied by MPT or its affiliates in connection with the Platform Services other than to enhance the functionality of your Application. In particular, you must not (nor enable others to): (1) defame, abuse, harass, stalk, or threaten others, or to promote or facilitate any prohibited or illegal activities; (2) enable any functionality in your Application that would generate excessive traffic over the MPT network or servers that would negatively impact other users' experience, or otherwise interfere with or restrict the operation of the Platform Services, or MPT or its affiliates’ servers or networks providing the Platform Services; (3) remove, obscure, or alter any license terms, policies or terms of service or any links to or notices thereto provided by MPT or its affiliates; or (4) violate any rights of MPT, its affiliates, or any third parties. Notwithstanding anything to the contrary set forth in this Agreement, you may not sublicense any software, firmware or other item or information supplied by MPT or its affiliates in connection with the Platform Services for use by a third party, unless expressly authorized by MPT or its affiliates to do so. You agree not to use (or encourage the use of) the Platform Services for mission critical, life saving or ultra-hazardous activities. MPT or its affiliates may suspend operation of or remove any Application that does not comply with the restrictions in this Agreement. 
+ +2.4 Changes to Platform or Platform Services. MPT and/or its affiliates may change the Platform or the functionality of the Platform Services at any time, including discontinuing some of the functionality of the Platform Services, and your continued use of the Platform or Platform Services or use of any modified or additional Platform Services is conditioned upon your adherence to the terms of this Agreement, as modified by MPT or its affiliates from time to time. + +3. Intellectual Property +3.1 Ownership. As between you and MPT, MPT and/or its affiliates or licensors own all rights, title, and interest, including all Intellectual Property Rights (defined below), in and to the SDK (including associated MPT content and sample code) and all derivatives thereof. MPT reserves all rights not expressly granted under the License. As between you and MPT, you and/or your licensors own all rights, title, and interest in and to your Application, (excluding our SDK), including all Intellectual Property Rights. “Intellectual Property Rights” means any and all worldwide rights under applicable laws of patent, copyright, trade secret, trademark, rights of publicity and privacy, and other proprietary rights. + +3.2 Third-Party Materials. Our SDK may include third-party software offered under an open source license or third-party content subject to a separate third-party agreement. To the extent any of such third-party terms conflicts with this Agreement, such third-party terms will control solely with respect to such third-party software or content. + +3.3 Feedback. If you provide comments, suggestions, recommendations, ideas, know-how or other feedback about our SDK or any other MPT or affiliate product or service, we (and our affiliates and those we allow) may use such information for any purposes without obligation to you and all intellectual property and other proprietary rights in any such feedback are deemed (and hereby) licensed to MPT (with the right to sublicense through multiple tiers) for any purpose on a perpetual, irrevocable, worldwide, paid-up,and royalty-free basis and may be used or disclosed for any purpose. + +3.4 Brand Attribution. This Agreement does not grant you or any third party permission to use our trade names, trademarks, service marks, logos, domain names, and other distinctive brand features (collectively, “Brand Features”) except as required for reasonable and customary use in describing the origin of the SDK or reproduction of the copyright notice as required under the License grant. You will not use our SDK or make any statement regarding the SDK or your Application which suggests partnership with, sponsorship by, or endorsement by MPT, its affiliates or any of their employees, contractors, contributors, licensors, affiliates, or partners without our prior written permission. + +4. Confidentiality +4.1 Confidentiality. Our communications to you and our SDK may contain MPT confidential information, which includes information that is marked confidential or that would normally be considered confidential under the circumstances. If you receive any such information, you will not disclose it to any third party without MPT prior written consent. MPT confidential information does not include information that you independently developed, that was rightfully given to you by a third party without a confidentiality obligation with regard to such information, or that becomes public through no fault of your own. 
You may disclose MPT confidential information when compelled to do so by law if you provide us reasonable prior notice, unless a court order prohibits such notice. + +5. Termination +5.1 Termination. The term of this Agreement will begin on the date on which you click accept, download, or use the SDK or any of its components and will continue until terminated as set forth in this Agreement. MPT reserves the right to terminate this Agreement with you, or to discontinue or suspend the SDK or any portion or feature or your access thereto in the event you breach any material provisions of this Agreement or the Terms, without liability or other obligation to you. + +5.2. Discontinuation of SDK. MPT reserves the right to discontinue all or part of the SDK at any time, in our sole discretion, without notice to you, and without liability or other obligation to you. This Agreement will terminate automatically and without notice to you in the event that the SDK is discontinued in its entirety. + +5.3 Effect of Termination. Upon termination of this Agreement, you will immediately stop using, distributing, or otherwise making available the SDK and all Applications that incorporate the SDK or any of its components, cease all use of the Brand Features, and destroy or return any cached or stored content, software, or other materials obtained through our SDK. + +5.4 Surviving Provisions. When this Agreement terminates, those terms that by their nature are intended to continue indefinitely will continue to apply, including, but not limited to, Section 3 (Intellectual Property), Section 4 (Confidentiality), Section 5 (Termination), Section 6 (Liability) and Section 7 (General Provisions). + +6. Liability +6.1 Indemnification. Unless prohibited by applicable law, you will indemnify and (at MPT’s option), defend MPT, its affiliates and subsidiaries, and the agents, licensors, contributors, directors, officers, employees, suppliers, and distributors thereof (collectively, “MPT Parties”) against all liabilities, damages, losses, costs, fees (including legal fees), and expenses relating to any allegation or third-party legal proceeding arising from: (1) your use of the SDK, or any negligence or misconduct, by you or your employees, agents, vendors, or contractors (collectively “Developer Parties”); (2) any Developer Parties’ violation of this Agreement, Terms, or any applicable law and regulation; (3) any use of your Application; or (4) Developer User Data or MPT User Data (each defined in the Developer Data Use Policy). + +6.2 WARRANTIES. EXCEPT AS EXPRESSLY SET OUT IN THE TERMS, THE SDK IS PROVIDED “AS IS” WITHOUT ANY SPECIFIC PROMISES, REPRESENTATIONS, GUARANTEES OR WARRANTIES, WHETHER EXPRESS, IMPLIED OR STATUTORY, INCLUDING, BUT NOT LIMITED TO, ANY COMMITMENTS ABOUT THE CONTENT ACCESSED THROUGH THE SDK, THE SPECIFIC FUNCTIONS OF THE SDK OR OUR PLATFORM SERVICES, OR THEIR RELIABILITY, AVAILABILITY, OR ABILITY TO MEET YOUR NEEDS. THE MPT PARTIES HEREBY DISCLAIM ANY IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY, NON-INFRINGEMENT AND FITNESS FOR A PARTICULAR PURPOSE. SOME JURISDICTIONS DO NOT PERMIT THE EXCLUSION OR LIMITATION OF IMPLIED WARRANTIES, SO YOU MAY HAVE ADDITIONAL RIGHTS. + +6.3 LIMITATION OF LIABILITY. 
TO THE EXTENT PERMITTED BY APPLICABLE LAW, MPT PARTIES WILL NOT BE RESPONSIBLE FOR LOST PROFITS, BUSINESS OR GOODWILL, REVENUES, OR DATA; FINANCIAL LOSSES; OR INDIRECT, SPECIAL, CONSEQUENTIAL, EXEMPLARY, OR PUNITIVE DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) ARISING AS A RESULT OF THIS AGREEMENT, USE OF THE SDK OR ANY MODIFIED SAMPLE CODE EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. YOU AGREE THAT YOUR REMEDIES UNDER THIS AGREEMENT ARE LIMITED SOLELY TO THE RIGHT TO COLLECT MONEY DAMAGES, IF ANY, AND YOU HEREBY WAIVE YOUR RIGHT TO SEEK INJUNCTIVE RELIEF OR OTHER EQUITABLE RELIEF. IF YOU ARE A CALIFORNIA RESIDENT, YOU AGREE TO WAIVE CALIFORNIA CIVIL CODE § 1542, WHICH SAYS: “A GENERAL RELEASE DOES NOT EXTEND TO CLAIMS THAT THE CREDITOR OR RELEASING PARTY DOES NOT KNOW OR SUSPECT TO EXIST IN HIS OR HER FAVOR AT THE TIME OF EXECUTING THE RELEASE, WHICH IF KNOWN BY HIM OR HER WOULD HAVE MATERIALLY AFFECTED HIS OR HER SETTLEMENT WITH THE DEBTOR OR RELEASED PARTY.” TO THE EXTENT PERMITTED BY LAW, THE CUMULATIVE, AGGREGATE LIABILITY OF MPT PARTIES, FOR ANY AND ALL CLAIMS ARISING UNDER THIS AGREEMENT OR ITS SUBJECT MATTER SHALL NOT EXCEED THE GREATER OF ONE HUNDRED US DOLLARS ($100) OR THE AMOUNT YOU HAVE PAID US IN THE PAST TWELVE MONTHS. IN ALL CASES, MPT PARTIES WILL NOT BE LIABLE FOR ANY EXPENSE, LOSS, OR DAMAGE THAT IS NOT REASONABLY FORESEEABLE. + +7. General Provisions +7.1 Updates. We may need to update this Agreement from time to time, including to accurately reflect the access or uses of our SDK, and so we encourage you to check this Agreement regularly. By continuing to access or use our SDK after any notice of an update to this Agreement, you agree to be bound by them. Any updates to the Disputes section of this Agreement will apply only to disputes that arise after notice of the update takes place. If you do not agree to the updated terms, please stop all access or use of our SDK. You cannot sidestep your compliance obligations under an updated version of this Agreement by developing against an older release of the SDK or relying on the older Agreement and all updates to your application are subject to the modified Agreement. + +7.2 Authorization. You hereby grant MPT and its contractors and affiliates the authorization reasonably necessary for MPT to exercise its rights and perform its obligations under this Agreement, including a limited, royalty-free, non-exclusive license to use, perform, and display the Application you provide to MPT for testing, evaluation, and approval purposes. + +7.3 General Provisions. You and MPT are independent contractors with regard to each other. This Agreement does not create any third-party beneficiary rights or any agency, partnership, employment, or joint venture. We are not liable for failure or delay in performance to the extent caused by circumstances beyond our reasonable control. If you do not comply with this Agreement, and MPT does not take action right away or does not enforce any provision of this Agreement, this inaction or lack of enforcement will not act as a waiver by MPT of any rights that it may have (such as taking action in the future) or in any way affect the validity of this Agreement or parts thereof. 
If a particular provision of this Agreement is deemed unenforceable, it will be deemed modified to the minimum extent necessary to render it enforceable and most nearly reflect the intent of the original provision, and all other provisions in this Agreement shall remain in full force and effect. You may not assign or delegate this Agreement or any obligations under this Agreement without our advance written consent. Any such prohibited attempted assignment will be void. MPT may assign or delegate this Agreement and any of its rights or obligations under this Agreement without your consent or notice to you. This Agreement shall bind the parties and their respective heirs, successors, and permitted assigns. This Agreement is the entire agreement between you and MPT relating to the subject matter herein and supersede any prior or contemporaneous agreements on such subject matter. + +7.4 Dispute Resolution. + +7.4.1 If you reside outside the US or your business is located outside the US: You agree that any claim, cause of action, or dispute you have against us that arises out of or relates to any access or use of the SDK must be resolved exclusively in the U.S. District Court for the Northern District of California or a state court located in San Mateo County, that you submit to the personal jurisdiction of either of these courts for the purpose of litigating any such claim, and that the laws of the State of California will govern this Agreement and any such claim, without regard to conflict of law provisions. + +7.4.2 If you reside in the US or your business is located in the US: You and we agree to arbitrate any claim, cause of action, or dispute between you and us that arises out of or relates to any access or use of the SDK for business or commercial purposes (“commercial claim”). This provision does not cover any commercial claims relating to violations of your or our intellectual property rights, including, but not limited to, copyright infringement, patent infringement, trademark infringement, violations of the brand guidelines, violations of your or our confidential information or trade secrets, or efforts to interfere with our products or engage with our products in unauthorized ways (for example, automated ways). + +7.4.3 We and you agree that, by entering into this arbitration provision all parties are waiving their respective rights to a trial by jury or to participate in a class or representative action. THE PARTIES AGREE THAT EACH MAY BRING COMMERCIAL CLAIMS AGAINST THE OTHER ONLY IN ITS INDIVIDUAL CAPACITY, AND NOT AS A PLAINTIFF OR CLASS MEMBER IN ANY PURPORTED CLASS, REPRESENTATIVE, OR PRIVATE ATTORNEY GENERAL PROCEEDING. You may bring a commercial claim only on your own behalf and cannot seek relief that would affect other parties. If there is a final judicial determination that any particular commercial claim (or a request for particular relief) cannot be arbitrated in accordance with this paragraph’s limitations, then only that commercial claim (or only that request for relief) may be brought in court. All other commercial claims (or requests for relief) remain subject to this paragraph. + +7.4.4 The Federal Arbitration Act governs the interpretation and enforcement of this arbitration provision. All issues are for an arbitrator to decide, except that only a court may decide issues relating to the scope or enforceability of this arbitration provision or the interpretation of the prohibition of class and representative actions. 
+ +7.4.5 If any party intends to seek arbitration of a dispute, that party must provide the other party with notice in writing. + +7.4.6 The arbitration will be governed by the AAA’s Commercial Arbitration Rules (“AAA Rules”), as modified by this Agreement, and will be administered by the AAA. If the AAA is unavailable, the parties will agree to another arbitration provider or the court will appoint a substitute. The arbitrator will not be bound by rulings in other arbitrations in which you are not a party. To the fullest extent permitted by applicable law, any evidentiary submissions made in arbitration will be maintained as confidential in the absence of good cause for its disclosure. The arbitrator’s award will be maintained as confidential only to the extent necessary to protect either party’s trade secrets or proprietary business information or to comply with a legal requirement mandating confidentiality. Each party will be responsible for paying any AAA filing, administrative and arbitrator fees in accordance with AAA Rules, except that we will pay for your filing, administrative, and arbitrator fees if your commercial claim for damages does not exceed $75,000 and is non-frivolous (as measured by the standards set forth in Federal Rule of Civil Procedure 11(b)). + +7.4.7 If you do not wish to be bound by this provision (including its waiver of class and representative claims), you must notify us as set forth below within 30 days of the first acceptance date of any version of this Agreement containing an arbitration provision. Your notice to us under this subsection must be submitted to the address here: Meta Platforms Technologies, LLC, 1 Hacker Way, Menlo Park, California 94025 + +7.4.8 All commercial claims between us, whether subject to arbitration or not, will be governed by California law, excluding California’s conflict of laws rules, except to the extent that California law is contrary to or preempted by federal law. + +7.4.9 If a commercial claim between you and us is not subject to arbitration, you agree that the claim must be resolved exclusively in the U.S. District Court for the Northern District of California or a state court located in San Mateo County, and that you submit to the personal jurisdiction of either of these courts for the purpose of litigating any such claim. + +7.4.10 If any provision of this dispute resolution provision is found unenforceable, that provision will be severed and the balance of the dispute resolution provision will remain in full force and effect. 
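The scene and component files that follow declare the app's custom components (`TrackedObject`, `ViewLocked`, `WristAttached`) alongside the built-in Spatial SDK toolkit components, each with typed attributes. As a quick orientation aid, here is a minimal sketch that walks that schema and prints each component with its attributes; it assumes the `org.json` parser bundled with Android, and the `main()` harness and file path are illustrative only, not part of the project.

```kotlin
// Sketch: summarize the component schema in app/scenes/components.json.
// Assumes org.json (bundled with Android); path and harness are illustrative.
import org.json.JSONObject
import java.io.File

fun summarizeComponents(json: String) {
    val root = JSONObject(json)
    val components = root.getJSONArray("components")
    for (i in 0 until components.length()) {
        val component = components.getJSONObject(i)
        println(component.getString("name"))
        // Some components (e.g. TrackedBody) declare an empty attribute list.
        val attributes = component.optJSONArray("attributes") ?: continue
        for (j in 0 until attributes.length()) {
            val attribute = attributes.getJSONObject(j)
            val type = attribute.getJSONObject("type").getString("attributeType")
            println("  ${attribute.getString("keyString")}: $type")
        }
    }
}

fun main() {
    summarizeComponents(File("app/scenes/components.json").readText())
}
```

Run against the schema below, this prints entries such as `com.meta.pixelandtexel.scanner.TrackedObject` with its `objectId: Int` attribute, which is a quick way to audit what each custom and toolkit component exposes.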
diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/scenes/Main.localsettings b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/scenes/Main.localsettings new file mode 100644 index 00000000..cf3bbb96 --- /dev/null +++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/scenes/Main.localsettings @@ -0,0 +1,7 @@ +data: + tabs: + - Composition + - Phone + - TV + - FridgeAnimFix + current_tab: Composition \ No newline at end of file diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/scenes/Main.metaspatial b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/scenes/Main.metaspatial new file mode 100644 index 00000000..891bcfe3 --- /dev/null +++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/scenes/Main.metaspatial @@ -0,0 +1,7 @@ +entities: + com.meta.models.TargetModel: + - components: + com.meta.components.TargetComponent: + platform: horizon_os +metadata: + version: 1.29 \ No newline at end of file diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/scenes/components.json b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/scenes/components.json new file mode 100644 index 00000000..0c2bf720 --- /dev/null +++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/scenes/components.json @@ -0,0 +1,726 @@ +{ + "components": [ + { + "name": "com.meta.pixelandtexel.scanner.TrackedObject", + "attributes": [ + { + "name": "objectId", + "keyString": "objectId", + "type": { + "attributeType": "Int" + }, + "description": "The detected object id. " + } + ], + "description": "\n A component representing an object detected using ML and tracked across frames.\n" + }, + { + "name": "com.meta.pixelandtexel.scanner.ViewLocked", + "attributes": [ + { + "name": "position", + "keyString": "position", + "type": { + "attributeType": "Vector3" + }, + "description": "The position offset to apply to the object, relative to the head. " + }, + { + "name": "rotation", + "keyString": "rotation", + "type": { + "attributeType": "Vector3" + }, + "description": "The rotation offset in euler angles to apply to the object, relative to the head. " + }, + { + "name": "fillView", + "keyString": "fillView", + "type": { + "attributeType": "Boolean" + }, + "description": "Whether or not to add a z offset so that the panel fills the camera view across the width of the panel. " + } + ], + "description": "\n A component which locks the position and rotation of an entity which has a Panel component to the user\u0027s view.\n" + }, + { + "name": "com.meta.pixelandtexel.scanner.WristAttached", + "attributes": [ + { + "name": "position", + "keyString": "position", + "type": { + "attributeType": "Vector3" + }, + "description": "The position offset to apply to the object, relative to the hand. " + }, + { + "name": "rotation", + "keyString": "rotation", + "type": { + "attributeType": "Vector3" + }, + "description": "The rotation offset in euler angles to apply to the object, relative to the hand. " + }, + { + "name": "side", + "keyString": "side", + "type": { + "attributeType": "com.meta.pixelandtexel.scanner.HandSide" + } + }, + { + "name": "faceUser", + "keyString": "faceUser", + "type": { + "attributeType": "Boolean" + }, + "description": "Whether or not to orient the entity such that it faces the user (ignores the rotation offset). 
" + } + ], + "description": "\n A component which positions and orients the entity on the user\u0027s wrist.\n" + }, + { + "name": "com.meta.spatial.toolkit.Animated", + "attributes": [ + { + "name": "pausedTime", + "keyString": "pausedTime", + "type": { + "attributeType": "Float" + }, + "description": "Paused location/time (sec) within animation track " + }, + { + "name": "playbackState", + "keyString": "playbackState", + "type": { + "attributeType": "com.meta.spatial.toolkit.PlaybackState" + }, + "description": "State of the animation (playing or paused) " + }, + { + "name": "playbackType", + "keyString": "playbackType", + "type": { + "attributeType": "com.meta.spatial.toolkit.PlaybackType" + }, + "description": "The type of animation playback to be used " + }, + { + "name": "startTime", + "keyString": "startTime", + "type": { + "attributeType": "Long" + }, + "description": "World time at which animation started (ms since epoch) " + }, + { + "name": "track", + "keyString": "track", + "type": { + "attributeType": "Int" + }, + "description": "which animation track of the glTF to play " + } + ], + "description": "\n Plays animation for a glTF asset. Also configures different settings for animation.\n @param startTime World time at which animation started (ms since epoch)\n @param pausedTime Paused location/time (sec) within animation track\n @param playbackState Playback state of the animated entity\n @param playbackType Playback type of the animated entity\n @param track the animation track of the glTF to play\n" + }, + { + "name": "com.meta.spatial.toolkit.Audio", + "attributes": [ + { + "name": "audio", + "keyString": "audio", + "type": { + "attributeType": "String" + }, + "description": "The Uri String of the audio file to be used " + }, + { + "name": "volume", + "keyString": "volume", + "type": { + "attributeType": "Float" + } + } + ], + "description": "\n Audio component that can be attached to an entity. This component will play audio spatially from the entity.\n @param file The Uri String of the audio file to be used\n @param volume The volume of the audio to be played, default is 1.0f\n @property audio The Uri String of the audio file to be used\n @property volume The volume of the audio to be played, default is 1.0f \n" + }, + { + "name": "com.meta.spatial.toolkit.AvatarAttachment", + "attributes": [ + { + "name": "type", + "keyString": "type", + "type": { + "attributeType": "String" + }, + "description": "Which part of the avatar the entity is meant to represent (i.e. head, body, right_hand, left_controller) " + } + ], + "description": "\n Defines what part of an Avatar the entity is meant to represent. (head, body, controller, etc.)\n" + }, + { + "name": "com.meta.spatial.toolkit.AvatarBody", + "attributes": [ + { + "name": "head", + "keyString": "head", + "type": { + "attributeType": "Entity" + } + }, + { + "name": "isPlayerControlled", + "keyString": "isPlayerControlled", + "type": { + "attributeType": "Boolean" + } + }, + { + "name": "leftHand", + "keyString": "leftHand", + "type": { + "attributeType": "Entity" + } + }, + { + "name": "rightHand", + "keyString": "rightHand", + "type": { + "attributeType": "Entity" + } + }, + { + "name": "root", + "keyString": "root", + "type": { + "attributeType": "Entity" + } + } + ], + "description": "\n AvatarBody is a component that represents the body of an avatar. It contains references to the\n head, left hand, right hand, and root of the avatar. 
It also contains a boolean flag to indicate\n whether the avatar is player controlled.\n \n @param head The head Entity of the avatar.\n @param leftHand The left hand Entity of the avatar.\n @param rightHand The right hand Entity of the avatar.\n @param root The root Entity of the avatar.\n @param isPlayerControlled A boolean flag to indicate whether the avatar is player controlled.\n" + }, + { + "name": "com.meta.spatial.toolkit.Box", + "attributes": [ + { + "name": "max", + "keyString": "max", + "type": { + "attributeType": "Vector3" + }, + "description": "\n The relative offset of the top corner (furthest in the +x, +y, +z direction) of the box from\n the center\n" + }, + { + "name": "min", + "keyString": "min", + "type": { + "attributeType": "Vector3" + }, + "description": "\n The relative offset of the bottom corner (furthest in the -x, -y, -z direction) of the box from\n the center\n" + } + ], + "description": "\n Defines the dimensions of a box shape by the relative offset of two opposite corners. A box of\n max\u003dVector3(1,1,1) and min\u003dVector3(-1,-1,-1) will result in a 2x2x2 box. This is to be used with\n the `mesh://box` `Mesh` URI.\n \n @property min The relative offset of the bottom corner (furthest in the -x, -y, -z direction)\n @property max The relative offset of the top corner (furthest in the +x, +y, +z direction)\n" + }, + { + "name": "com.meta.spatial.toolkit.Controller", + "attributes": [ + { + "name": "buttonState", + "keyString": "buttonState", + "type": { + "attributeType": "Int" + }, + "description": "The current state of the buttons being pressed represented by integer bits " + }, + { + "name": "changedButtons", + "keyString": "changedButtons", + "type": { + "attributeType": "Int" + }, + "description": "Which buttons (represented by integer bits) have been changed (pressed or unpressed since the last frame) " + }, + { + "name": "directTouchButtonState", + "keyString": "directTouchButtonState", + "type": { + "attributeType": "Int" + }, + "description": "The state of the direct touch buttons " + }, + { + "name": "directTouchEnabled", + "keyString": "directTouchEnabled", + "type": { + "attributeType": "Boolean" + } + }, + { + "name": "isActive", + "keyString": "isActive", + "type": { + "attributeType": "Boolean" + } + }, + { + "name": "laserEnabled", + "keyString": "laserEnabled", + "type": { + "attributeType": "Boolean" + } + }, + { + "name": "type", + "keyString": "type", + "type": { + "attributeType": "com.meta.spatial.toolkit.ControllerType" + }, + "description": "What type of controller it is: 0-\u003econtroller 1-\u003ehand 2-\u003eeye " + } + ], + "description": "\n Represents Controller Data and properties that can be used to facilitate input.\n \n @param buttonState The current state of the buttons being pressed represented by integer bits\n @param changedButtons Which buttons (represented by integer bits) have been changed (pressed or\n unpressed)\n @param isActive Whether the controller is active or not\n @param type What type of controller it is: 0-\u003econtroller 1-\u003ehand\n @param directTouchEnabled Whether direct touch is enabled or not\n @param directTouchButtonState The state of the direct touch buttons\n @param laserEnabled Whether the laser is enabled or not\n" + }, + { + "name": "com.meta.spatial.toolkit.CreatorVisibility", + "attributes": [ + { + "name": "state", + "keyString": "state", + "type": { + "attributeType": "com.meta.spatial.toolkit.CreatorVisibilityState" + }, + "description": "\n Different states of CreatorVisibility, such 
as only visible to the creator, or only invisible\n to the creator\n" + } + ], + "description": "\n Use this Component to hide an Entity from every user except for the creator\n of the Entity or hide the entity from only the creator. This *MUST* be used\n with dynamically created Entities, Entities created using scene.xml do not\n have a \"creator\" and so will not be hidden.\n \n This Component is currently only designed for simple use like having an object or panel that\n only the creator can see. If you need more complex visibility logic you will need to write\n your own Visibility Component and System, feel free to use this one as a reference.\n \n @param state The state of the visibility, see CreatorVisibilityState for more info\n" + }, + { + "name": "com.meta.spatial.toolkit.Dome", + "attributes": [ + { + "name": "radius", + "keyString": "radius", + "type": { + "attributeType": "Float" + }, + "description": "The radius of the dome in meters " + } + ], + "description": "\n Defines the dimensions of a dome shape by a radius. This can be used for skyboxes.\n \n @param radius The radius of the dome in meters\n @property radius The radius of the dome in meters\n" + }, + { + "name": "com.meta.spatial.toolkit.Followable", + "attributes": [ + { + "name": "active", + "keyString": "active", + "type": { + "attributeType": "Boolean" + }, + "description": "Whether entity is actively following or not " + }, + { + "name": "maxAngle", + "keyString": "maxAngle", + "type": { + "attributeType": "Float" + }, + "description": "\n Maximum Y angle offset a followable will keep from target, 0 is straight ahead, positive values\n are up, negative values are down\n" + }, + { + "name": "minAngle", + "keyString": "minAngle", + "type": { + "attributeType": "Float" + }, + "description": "\n Minimum Y angle offset a followable will keep from target, 0 is straight ahead, positive values\n are up, negative values are down\n" + }, + { + "name": "offset", + "keyString": "offset", + "type": { + "attributeType": "Pose" + }, + "description": "Pose offset to keep from target. Defining a quaternion will rotate the Entity " + }, + { + "name": "speed", + "keyString": "speed", + "type": { + "attributeType": "Float" + }, + "description": "How fast followable tracks to its desired location. Float value is a percent of default speed " + }, + { + "name": "target", + "keyString": "target", + "type": { + "attributeType": "Entity" + }, + "description": "Target Entity to follow " + }, + { + "name": "tolerance", + "keyString": "tolerance", + "type": { + "attributeType": "Float" + }, + "description": "This is the change in distance needed to start moving " + }, + { + "name": "type", + "keyString": "type", + "type": { + "attributeType": "com.meta.spatial.toolkit.FollowableType" + }, + "description": "The type of behavior an object has when following (faces user, pivots on y axis, etc.) " + } + ], + "description": "\n Followable is a component that enables an entity to stay in front of another entity. Followable\n will track the orientation of the parent and move itself to stay in front.\n \n @param target Target Entity to follow\n @param offset Pose offset to keep from target. 
Defining a quaternion will rotate the Entity\n @param minAngle Minimum Y angle offset a followable will keep from target, 0 is straight ahead,\n positive values are up, negative values are down\n @param maxAngle Maximum Y angle offset a followable will keep from target, 0 is straight ahead,\n positive values are up, negative values are down\n @param type The behavior an object has when following (faces user, pivots on y axis, etc.)\n @param tolerance This is the change in distance needed to start moving\n @param speed How fast followable tracks to its desired location. Float value is a percent of\n default speed\n @param active Whether entity is actively following or not\n" + }, + { + "name": "com.meta.spatial.toolkit.Grabbable", + "attributes": [ + { + "name": "enabled", + "keyString": "enabled", + "type": { + "attributeType": "Boolean" + }, + "description": "Defines whether the object can be grabbed or not. " + }, + { + "name": "isGrabbed", + "keyString": "isGrabbed", + "type": { + "attributeType": "Boolean" + }, + "description": "Whether the object is currently grabbed or not " + }, + { + "name": "maxHeight", + "keyString": "maxHeight", + "type": { + "attributeType": "Float" + }, + "description": "the maximum height an object can be held when grabbed " + }, + { + "name": "minHeight", + "keyString": "minHeight", + "type": { + "attributeType": "Float" + }, + "description": "the minimum height an object can be held when grabbed " + }, + { + "name": "type", + "keyString": "type", + "type": { + "attributeType": "com.meta.spatial.toolkit.GrabbableType" + }, + "description": "The type of behavior an object has when grabbed (faces user, pivots on y axis, etc.) " + } + ], + "description": "\n Grabbable is a component that allows an object to be grabbed by a controller. It requires the\n Mesh Component to be present.\n \n @param enabled Defines whether the object can be grabbed or not.\n @param type The type of behavior an object has when grabbed (faces user, pivots on y axis, etc.)\n @param isGrabbed Whether the object is currently grabbed or not\n @param minHeight the minimum height an object can be held when grabbed\n @param maxHeight the maximum height an object can be held when grabbed\n" + }, + { + "name": "com.meta.spatial.toolkit.Hittable", + "attributes": [ + { + "name": "hittable", + "keyString": "hittable", + "type": { + "attributeType": "com.meta.spatial.toolkit.MeshCollision" + }, + "description": "The type of behavior the object can be hit using. " + } + ], + "description": "\n Defines whether an object is hittable or not.\n \n @param hittable The type of behavior the object can be hit using\n @property hittable The type of behavior the object can be hit using\n" + }, + { + "name": "com.meta.spatial.toolkit.Panel", + "attributes": [ + { + "name": "hittable", + "keyString": "hittable", + "type": { + "attributeType": "com.meta.spatial.toolkit.MeshCollision" + }, + "description": "hittable hit test type for the panel, @see MeshCollision " + }, + { + "name": "panelRegistrationId", + "keyString": "panel", + "type": { + "attributeType": "Int" + }, + "description": "panelRegistrationId The id of the panel. This is used to identify the @PanelRegistration when creating the panel. " + } + ], + "description": "\n Panel is a component that can be attached to a scene object to indicate that it is a panel.\n @param panelRegistrationId The id of the panel. 
This is used to identify the @PanelRegistration when creating the panel.\n @param hittable hit test type for the panel, @see MeshCollision\n" + }, + { + "name": "com.meta.spatial.toolkit.PanelClickState", + "attributes": [ + { + "name": "clickStateInternal", + "keyString": "clickStateInternal", + "type": { + "attributeType": "Int" + }, + "description": "The current state of the panel click. " + } + ], + "description": "\n PanelClickState is a component that is used to track the state of a panel click. It is used\n in @see PanelClickSystem, and passes the state to the @see Panel component.\n \n @param clickState The current state of the panel click.\n @constructor Creates a new PanelClickState component with the specified click state.\n" + }, + { + "name": "com.meta.spatial.toolkit.PanelDimensions", + "attributes": [ + { + "name": "dimensions", + "keyString": "dimensions", + "type": { + "attributeType": "Vector2" + } + } + ], + "description": "\n PanelDimensions is a component that holds the dimensions of a panel. This is used to override the\n height and width of a panel when creating the panel. Without it, the panel\u0027s height and width\n will be the values from @see PanelRegistration.\n \n @param dimensions The dimensions of the panel.\n @property dimensions The dimensions of the panel.\n" + }, + { + "name": "com.meta.spatial.toolkit.Plane", + "attributes": [ + { + "name": "depth", + "keyString": "depth", + "type": { + "attributeType": "Float" + } + }, + { + "name": "width", + "keyString": "width", + "type": { + "attributeType": "Float" + } + } + ], + "description": "\n Defines the dimensions of a horizontal plane.\n \n @param width The width of the plane\n @param depth The depth of the plane\n @property width The width of the plane\n @property depth The depth of the plane\n @constructor Creates a plane with default width and depth of 1.0f\n" + }, + { + "name": "com.meta.spatial.toolkit.Quad", + "attributes": [ + { + "name": "max", + "keyString": "max", + "type": { + "attributeType": "Vector2" + } + }, + { + "name": "min", + "keyString": "min", + "type": { + "attributeType": "Vector2" + } + } + ], + "description": "\n A Quad is a 2D shape that can be used to represent a 2D surface in 3D space. It is defined by its\n minimum and maximum coordinates.\n \n @param min The minimum coordinates of the Quad.\n @param max The maximum coordinates of the Quad.\n @property min The minimum coordinates of the Quad.\n @property max The maximum coordinates of the Quad.\n @constructor Creates a new Quad with the given minimum and maximum coordinates.\n" + }, + { + "name": "com.meta.spatial.toolkit.RoundedBox", + "attributes": [ + { + "name": "max", + "keyString": "max", + "type": { + "attributeType": "Vector3" + }, + "description": "\n The relative offset of the top corner (furthest in the +x, +y, +z direction) of the box from\n the center\n" + }, + { + "name": "min", + "keyString": "min", + "type": { + "attributeType": "Vector3" + }, + "description": "\n The relative offset of the bottom corner (furthest in the -x, -y, -z direction) of the box from\n the center\n" + }, + { + "name": "radius", + "keyString": "radius", + "type": { + "attributeType": "Vector3" + }, + "description": "\n The radii of the rounded edges of the box, where the radii correspond to the edges along that\n plane (i.e. radius.x corresponds to the edges running along the x axis)\n" + } + ], + "description": "\n Defines the dimensions of a box shape with rounded edges by the relative offset of two opposite\n corners and a Vector3 of radii to modify the roundedness of the edges\n" + }, + { + "name": "com.meta.spatial.toolkit.ScenePlane", + "attributes": [ + { + "name": "extents", + "keyString": "extents", + "type": { + "attributeType": "Vector3" + }, + "description": "The extents of the plane. " + }, + { + "name": "name", + "keyString": "name", + "type": { + "attributeType": "String" + }, + "description": "The name of the plane " + }, + { + "name": "offset", + "keyString": "offset", + "type": { + "attributeType": "Vector3" + }, + "description": "The offset of the plane. " + }, + { + "name": "type", + "keyString": "type", + "type": { + "attributeType": "String" + }, + "description": "The type of the plane " + } + ], + "description": "\n A ScenePlane is a 2D plane in the scene. It has a name, type, offset, and extents.\n \n @param name The name of the plane.\n @param type The type of the plane.\n @param offset The offset of the plane.\n @param extents The extents of the plane.\n" + }, + { + "name": "com.meta.spatial.toolkit.SceneVolume", + "attributes": [ + { + "name": "extents", + "keyString": "extents", + "type": { + "attributeType": "Vector3" + }, + "description": "The extents of the volume. " + }, + { + "name": "name", + "keyString": "name", + "type": { + "attributeType": "String" + }, + "description": "The name of the volume " + }, + { + "name": "offset", + "keyString": "offset", + "type": { + "attributeType": "Vector3" + }, + "description": "The offset of the volume. " + }, + { + "name": "type", + "keyString": "type", + "type": { + "attributeType": "String" + }, + "description": "The type of the volume " + } + ], + "description": "\n A SceneVolume is a 3D volume that can be used to represent a physical object in the scene.\n \n @param name The name of the volume, identifier for the volume.\n @param type The type of the volume; it is used to determine whether an Anchor belongs to the volume\n or not.\n @param offset The offset of the volume; all Entities belonging to the volume will be offset by\n this value.\n @param extents The extents of the volume\n" + }, + { + "name": "com.meta.spatial.toolkit.Sphere", + "attributes": [ + { + "name": "radius", + "keyString": "radius", + "type": { + "attributeType": "Float" + }, + "description": "The radius of the sphere in meters " + } + ], + "description": "\n The component that defines the dimensions of a sphere by the radius. To be used with the\n `mesh://sphere` `Mesh` URI.\n \n @param radius The radius of the sphere in meters\n @property radius The radius of the sphere in meters\n @constructor Creates a sphere with the given radius\n" + }, + { + "name": "com.meta.spatial.toolkit.SupportsLocomotion", + "attributes": [], + "description": "\n This component is used to indicate that an entity supports locomotion. 
When added to an entity\n with a mesh, it allows for default locomotion on the mesh\n" + }, + { + "name": "com.meta.spatial.toolkit.TrackedBody", + "attributes": [] + } + ], + "enums": [ + { + "name": "com.meta.pixelandtexel.scanner.HandSide", + "values": [ + "LEFT", + "RIGHT" + ] + }, + { + "name": "com.meta.spatial.toolkit.ControllerType", + "values": [ + "CONTROLLER", + "HAND", + "EYES" + ] + }, + { + "name": "com.meta.spatial.toolkit.CreatorVisibilityState", + "values": [ + "CREATOR_ONLY_VISIBLE", + "CREATOR_ONLY_INVISIBLE" + ] + }, + { + "name": "com.meta.spatial.toolkit.FollowableType", + "values": [ + "FACE", + "PIVOT_Y" + ] + }, + { + "name": "com.meta.spatial.toolkit.GrabbableType", + "values": [ + "FACE", + "PIVOT_Y" + ] + }, + { + "name": "com.meta.spatial.toolkit.MeshCollision", + "values": [ + "NoCollision", + "LineTest", + "LineTest_IgnoreVisible" + ] + }, + { + "name": "com.meta.spatial.toolkit.PlaybackState", + "values": [ + "PLAYING", + "PAUSED" + ] + }, + { + "name": "com.meta.spatial.toolkit.PlaybackType", + "values": [ + "LOOP", + "CLAMP", + "BOUNCE", + "REVERSE_LOOP" + ] + } + ] +} diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/scenes/config.json b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/scenes/config.json new file mode 100644 index 00000000..a6c6e869 --- /dev/null +++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/scenes/config.json @@ -0,0 +1,6 @@ +{ + "spatial.editor.customComponentXmlsPath": [ + "../src/main/components/" + ], + "spatial.editor.libraryComponentXmlsPath": "../build/generated/components" +} diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/androidTest/java/com/meta/pixelandtexel/scanner/ExampleInstrumentedTest.kt b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/androidTest/java/com/meta/pixelandtexel/scanner/ExampleInstrumentedTest.kt new file mode 100644 index 00000000..1c31a0c6 --- /dev/null +++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/androidTest/java/com/meta/pixelandtexel/scanner/ExampleInstrumentedTest.kt @@ -0,0 +1,24 @@ +// (c) Meta Platforms, Inc. and affiliates. Confidential and proprietary. + +package com.meta.pixelandtexel.scanner + +import androidx.test.ext.junit.runners.AndroidJUnit4 +import androidx.test.platform.app.InstrumentationRegistry +import org.junit.Assert.* +import org.junit.Test +import org.junit.runner.RunWith + +/** + * Instrumented test, which will execute on an Android device. + * + * See [testing documentation](http://d.android.com/tools/testing). + */ +@RunWith(AndroidJUnit4::class) +class ExampleInstrumentedTest { + @Test + fun useAppContext() { + // Context of the app under test. 
+ val appContext = InstrumentationRegistry.getInstrumentation().targetContext + assertEquals("com.meta.pixelandtexel.scanner", appContext.packageName) + } +} diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/AndroidManifest.xml b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/AndroidManifest.xml new file mode 100644 index 00000000..7f61ff28 --- /dev/null +++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/AndroidManifest.xml @@ -0,0 +1,123 @@ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/assets/LICENSE.md b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/assets/LICENSE.md new file mode 100644 index 00000000..f734505f --- /dev/null +++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/assets/LICENSE.md @@ -0,0 +1,137 @@ +Meta Platform Technologies SDK License Agreement +Effective date: October 25, 2022 + +Copyright © Meta Platform Technologies, LLC and its affiliates. All rights reserved. + +The text of this may be found at: https://developer.oculus.com/licenses/oculussdk/ + +This Meta Platforms Technologies SDK License Agreement, formerly known as the Oculus SDK License Agreement, (“Agreement”) is a legal agreement between you and MPT (defined below) governing your use of our MPT Software Development Kit, formerly known as the Oculus Software Developer Kit. MPT Software Development Kit means any application programming interfaces (“APIs”), tools, plugins, code, technology, specification, documentation, Platform Services, and/or content made available by us to others, including app developers and content providers (collectively, the “SDK”). + +By downloading or using our SDK, you are agreeing to this Agreement along with other applicable terms and conditions such as the additional terms or documents accompanying the SDK and the Terms of Service, and acknowledging our Privacy Policy (collectively, the “Terms”). If you use the SDK as an interface to, or in conjunction with other MPT products or services, then the terms for those other products or services also apply. + +Here, "MPT" means Meta Platforms Technologies, LLC, formerly known as Facebook Technologies, LLC (formerly known as Oculus VR, LLC), a Delaware limited liability company with its principal place of business at 1 Hacker Way, Menlo Park, California 94025, United States unless set forth otherwise. We may also refer to MPT as "we", "our", or "us" in this Agreement. + +You may not use the SDK and may not accept this Agreement if (1) you are a person with whom MPT is prohibited from transacting business under applicable law, or (2) you are a person barred from using or receiving the SDK by MPT or under the applicable laws of the United States or other countries including the country in which you are resident or from which you use the SDK. If you are using the SDK on behalf of an entity, you represent and warrant that you have authority to bind that entity to this Agreement and by accepting this Agreement, you are doing so on behalf of that entity (and all references to "you" in this Agreement refer to that entity). + +This Agreement requires the resolution of most disputes between you and MPT by binding arbitration on an individual basis; class actions and jury trials are not permitted. + +1. License Grant +1.1 License. 
Subject to the Terms and the restrictions set forth in this Agreement, MPT hereby grants you a limited, royalty-free, non-exclusive, non-transferrable, non-sublicensable (except as otherwise set forth in this Agreement), revocable copyright license (“License”) during the term of this Agreement to use and reproduce the SDK solely to develop, test, and/or distribute your Application (defined below) and to enable you and/or your end users to access MPT features through your Application. You may only use the SDK to develop Applications in connection with MPT approved hardware and software products (“MPT Approved Products”) unless the documentation accompanying the SDK expressly authorizes broader use such as with other third-party platforms. + +1.1.1 If the SDK includes any libraries, sample source code, or other materials that we make available specifically for incorporation in your Application (as indicated by applicable documentation), you may incorporate those materials and reproduce and distribute them as part of your Application, including by distributing those materials to third parties contributing to your Application. + +1.1.2 The SDK may include other content (e.g., sample code) that is for demonstration, reference, or other purposes and is subject to terms and conditions included with such materials. Such materials will be clearly marked in the applicable documentation. Absent such additional terms and conditions, you may modify, distribute, and sublicense any sample source made available as part of the SDK pursuant to this Agreement and the Terms. + +1.1.3 The SDK may include MPT content that is subject to your additional right to display the content to your end users through the use of the corresponding SDK, as contemplated by the documentation accompanying such SDK. For example, the SDK may include avatars that you may display to your end users. + +1.2 General Restrictions. The License grant in this Section is solely for the purpose of developing, testing, and promoting your engines, tools, applications, content, games and demos, or other products and features (collectively, “Application”) and providing you and/or your end users access to MPT services and features through your Application as contemplated by applicable documentation accompanying the SDK. 
You may not (or allow those acting on your behalf to): + +1.2.1 modify or create derivative works from any SDK or its component (other than sample source code described in this Section or expressly authorized by the documents accompanying the SDK); + +1.2.2 misrepresent or mask either your identity or your Application's identity when using the SDK or developer accounts; + +1.2.3 attempt to circumvent any limitations implemented within or documented with the SDK (e.g., limiting the number of requests you may make, end users you may serve, or processing abstracted gaze data or abstracted facial expressions data for prohibited uses under the Developer Data Use Policy); + +1.2.4 reverse engineer, decompile, disassemble, or otherwise attempt to extract the source code from the SDK, except to the extent that applicable law expressly permits such actions despite this limitation; + +1.2.5 alter, restrict, or interfere with the normal operation or functionality of the SDK, the MPT hardware or software, or MPT Approved Products, including, but not limited to: (a) the behavior of the “Meta Quest button” and “XBox button” implemented by the MPT system software; (b) any on-screen messages or information; (c) the behavior of the proximity sensor in the MPT hardware implemented by the MPT system software; (d) any MPT hardware or software security features; (e) any end user's settings; and (f) Health and Safety Warnings; + +1.2.6 use the SDK or your Application in a manner that violates: (a) the Developer Data Use Policy (where applicable); (b) the Content Guidelines, or other applicable terms and policies made available on our Developer Policy portal; (c) any rights of MPT, its affiliates or third parties; (d) applicable laws (such as laws regarding import, export, privacy, health & safety); or (e) other terms of service with MPT or its affiliates; + +1.2.7 remove, obscure, or alter any Terms or any links to or notices of those Terms; or + +1.2.8 use or redistribute the SDK or any portion thereof in any manner that would cause the SDK (or any portion thereof) or MPT to become subject to the terms of any open source license or other restrictions. + +1.3 Distribution and Sublicense Restrictions. The redistribution and sublicense rights under this Section are further subject to the following restrictions: (1) redistribution of sample source code or other materials must include the following copyright notice: “Copyright © Meta Platform Technologies, LLC and its affiliates. All rights reserved;” and (2) if the sample source code or other materials include a "License" or "Notice" text file, you must provide a copy of the License or Notice file with the sample code. + +1.4 Privacy and Security. + +1.4.1 You are responsible for the data collection, processing and disclosure by your Application and agree to comply with all applicable privacy and data protection laws, as well as our applicable terms and policies, particularly the Developer Data Use Policy. You represent and warrant that you have provided robust and sufficiently prominent notice to users regarding (i) data processing that includes, at a minimum, that third parties, including MPT and its affiliates, may collect or receive information from your Application, and (ii) any other information required to be disclosed to users by applicable privacy and data protection laws. 
You represent and warrant that you will not back up or make available to the Cloud Backup feature any information that you know or reasonably should know (a) is from or about children under the age of 13, or (b) includes data concerning health, financial information, or other categories of sensitive information (including any information defined as special or sensitive under applicable laws, regulations, and applicable industry guidelines). + +For purposes of the GDPR, you acknowledge and agree that you are a separate and independent controller of the Developer User Data (as defined in the MPT Developer Data Use Policy) and Meta Platforms Ireland Ltd. (an affiliate of MPT) is a separate and independent controller for any processing of personal data, except as provided in Section 1.4.3, including the MPT User Data (as defined in the MPT Developer Data Use Policy). The parties do not and will not process Developer User Data or MPT User Data as joint controllers. Each party shall comply with the obligations that apply to it as a controller under the GDPR, and each party shall be individually and separately responsible for its own compliance. + +1.4.3 Notwithstanding the foregoing, where (a) we process Developer User Data that contains personal data to (i) store, host or otherwise backup Developer User Data through Cloud Backup; (ii) provide and operate the Spatial Audio VoIP API; (iii) or provide other services described in the Data Processing Terms (the “Services”), and (b) (i) our processing of such personal data is subject to the GDPR, you instruct Meta Platforms Ireland Ltd. to process such personal data in order to provide the Services pursuant to this Agreement and the Data Processing Terms, which are incorporated herein by reference, including for product improvement for your benefit and to anonymize the data so that it is no longer Personal Data for the purposes of GDPR, and you understand and agree that we may retain such anonymized metadata for our own legitimate purposes, including improvement and development of the Services; or (ii) such personal data is considered “personal information,” “personal data,” “personally identifiable information” or similar terms and is subject to the California Consumer Privacy Act of 2018 (“CCPA”), or other applicable privacy and data protection laws (excluding the GDPR), we will only retain, use and disclose such personal data for the purposes of providing those Services to you and improving those Services on your behalf or as otherwise permitted by the CCPA and such other applicable privacy and data protection laws. + +1.4.4 “Personal data,” “controller,” “processor,” and “process” in this Section 1.4 have the meanings set out in the Data Processing Terms. + +1.5 You have no obligations under this Agreement to license or make available your Application to MPT, its affiliates, or any third parties. Nothing in this Agreement obligates MPT or its affiliates to enable you or any of your Applications to access, interact with, or retrieve or publish content to the Meta VR platform or any other MPT platforms or service. However, MPT and/or its affiliates may require you to agree to additional terms as a condition of providing you with such platform services in connection with your use of the SDK. You acknowledge and agree that MPT and its affiliates may develop products or services that may compete with your Application or any other products or services of yours. + +1.6 Experimental Features. 
From time to time, MPT may, in its sole discretion, make available to you as part of the MPT Software Development Kit, certain experimental, test or beta software, APIs or features on a limited or test basis (“Experimental Features”). Experimental Features can only be used for experimental or testing purposes and cannot be incorporated into a production build unless (i) the Experimental Feature has been released or included in MPT software production builds or (ii) otherwise permitted by MPT in writing. Your use of any Experimental Feature is voluntary. You agree that all use of any Experimental Feature is at your sole risk. You agree that once you use an Experiment Feature, your content, data and/or systems may be affected, and you may be unable to revert back to a prior version of the same or similar feature. Additionally, if such reversion is possible, you may not be able to return or restore data created or transferred using the Experimental Feature back to the prior version. The Experimental Features may not work in the same way as a final production version. MPT and its affiliates make no representations or warranties that the Experimental Features will function or be free from errors. The Experimental Features are provided on an “as is” basis and may contain errors or inaccuracies that could cause failures, corruption or loss of data and information from any connected device or service. MPT and its affiliates have no obligation to correct bugs, defects, or errors or otherwise support or maintain Experimental Features. MPT and its affiliates may discontinue, update, modify or remove access to any Experimental Feature at any time in its sole discretion, and may not release a final version of an Experimental Feature in its sole discretion. + +1.7 Medical & HIPAA Use Restrictions. The SDK in association with MPT Approved Products is not intended to be a medical device unless otherwise specified in writing by MPT. The use of the SDK is not intended to create, and should not be interpreted as creating, any obligations to MPT under the Health Insurance Portability and Accountability Act of 1996, the Health Information Technology for Economic and Clinical Health Act, or their implementing regulations, as amended (collectively “HIPAA”), and MPT makes no representations that the SDK satisfies HIPAA requirements. If you are or become a “Covered Entity”, “Business Associate”, “Subcontractor”, or a “Workforce” member of a Covered Entity, Business Associate, or Subcontractor (as those terms are defined at 45 C.F.R. § 160.103), you agree not to use the SDK to create, receive, maintain or transmit any “Protected Health Information” (as that term is defined at 45 C.F.R. § 160.103) in any manner that would make MPT (or any of its affiliates and subsidiaries) your or any third party’s Business Associate or Subcontractor Business Associate. You are solely responsible for any applicable HIPAA compliance obligations. + +2. Meta VR Platform Services +MPT and/or its affiliates makes certain Platform Services (defined below) available to you to include and enable in your Application on our Platform. An Application that enables or includes any Platform Service must implement the Meta VR Platform Framework (defined below) with the Application. Once your Application has been authorized for use of the Platform Services, you are not required to update your Application to include new Platform Services that MPT and/or its affiliates may make available as part of the Meta VR Platform Framework. 
For more information, please visit https://developer.oculus.com. + +2.1 For the purpose of this Section, + +2.1.1 “Application Services” means services provided by MPT and/or its affiliates associated with the Platform, including, but not limited to, in-app purchasing, multiplayer matchmaking, friends, leader boards, achievements, Virtual Reality Real Time Systems (“VERTS”), voice over IP and Cloud Backup, which list may be changed from time to time in MPT's or its affiliates’ sole discretion. + +2.1.2 "Meta VR Platform Framework" means the suite of Meta VR platform services, including, but not limited to, the MPT file distribution and update system (enabling distribution and updates of Applications by MPT and/or its affiliates, including through generated activation Keys), entitlement system, and account authentication, which list may be changed from time to time in MPT or its affiliates’ sole discretion. + +2.1.3 "Platform" means the virtual, mixed, and augmented reality platform made available by MPT and/or its affiliates, including, but not limited to, the user experience, user interface, store, and social features, usable on hardware approved by MPT or its affiliates or any third-party device or operating system, including, but not limited to, iOS, Android, Windows, OS X, Linux, and Windows Mobile. + +2.1.4 "Platform Services" means the Meta VR Platform Framework and the Application Services. + +2.2 Key Provision and Redemption. If you request that MPT generate activation keys for your Application on the Platform ("Keys") and MPT agrees, you hereby grant MPT and its affiliates (1) the right to generate Keys for you and (2) a license to make available, reproduce, distribute, perform, and display the Application to end users who have submitted a Key to MPT or its affiliates. MPT agrees to authenticate and make the Application available to any end user supplying a valid Key (or have its affiliates do so) (unless the Application has been removed or withdrawn). + +2.3 Platform Services Requirements. You will not make any use of any API, software, code or other item or information supplied by MPT or its affiliates in connection with the Platform Services other than to enhance the functionality of your Application. In particular, you must not (nor enable others to): (1) defame, abuse, harass, stalk, or threaten others, or to promote or facilitate any prohibited or illegal activities; (2) enable any functionality in your Application that would generate excessive traffic over the MPT network or servers that would negatively impact other users' experience, or otherwise interfere with or restrict the operation of the Platform Services, or MPT or its affiliates’ servers or networks providing the Platform Services; (3) remove, obscure, or alter any license terms, policies or terms of service or any links to or notices thereto provided by MPT or its affiliates; or (4) violate any rights of MPT, its affiliates, or any third parties. Notwithstanding anything to the contrary set forth in this Agreement, you may not sublicense any software, firmware or other item or information supplied by MPT or its affiliates in connection with the Platform Services for use by a third party, unless expressly authorized by MPT or its affiliates to do so. You agree not to use (or encourage the use of) the Platform Services for mission critical, life saving or ultra-hazardous activities. MPT or its affiliates may suspend operation of or remove any Application that does not comply with the restrictions in this Agreement. 
+ +2.4 Changes to Platform or Platform Services. MPT and/or its affiliates may change the Platform or the functionality of the Platform Services at any time, including discontinuing some of the functionality of the Platform Services, and your continued use of the Platform or Platform Services or use of any modified or additional Platform Services is conditioned upon your adherence to the terms of this Agreement, as modified by MPT or its affiliates from time to time. + +3. Intellectual Property +3.1 Ownership. As between you and MPT, MPT and/or its affiliates or licensors own all rights, title, and interest, including all Intellectual Property Rights (defined below), in and to the SDK (including associated MPT content and sample code) and all derivatives thereof. MPT reserves all rights not expressly granted under the License. As between you and MPT, you and/or your licensors own all rights, title, and interest in and to your Application, (excluding our SDK), including all Intellectual Property Rights. “Intellectual Property Rights” means any and all worldwide rights under applicable laws of patent, copyright, trade secret, trademark, rights of publicity and privacy, and other proprietary rights. + +3.2 Third-Party Materials. Our SDK may include third-party software offered under an open source license or third-party content subject to a separate third-party agreement. To the extent any of such third-party terms conflicts with this Agreement, such third-party terms will control solely with respect to such third-party software or content. + +3.3 Feedback. If you provide comments, suggestions, recommendations, ideas, know-how or other feedback about our SDK or any other MPT or affiliate product or service, we (and our affiliates and those we allow) may use such information for any purposes without obligation to you and all intellectual property and other proprietary rights in any such feedback are deemed (and hereby) licensed to MPT (with the right to sublicense through multiple tiers) for any purpose on a perpetual, irrevocable, worldwide, paid-up,and royalty-free basis and may be used or disclosed for any purpose. + +3.4 Brand Attribution. This Agreement does not grant you or any third party permission to use our trade names, trademarks, service marks, logos, domain names, and other distinctive brand features (collectively, “Brand Features”) except as required for reasonable and customary use in describing the origin of the SDK or reproduction of the copyright notice as required under the License grant. You will not use our SDK or make any statement regarding the SDK or your Application which suggests partnership with, sponsorship by, or endorsement by MPT, its affiliates or any of their employees, contractors, contributors, licensors, affiliates, or partners without our prior written permission. + +4. Confidentiality +4.1 Confidentiality. Our communications to you and our SDK may contain MPT confidential information, which includes information that is marked confidential or that would normally be considered confidential under the circumstances. If you receive any such information, you will not disclose it to any third party without MPT prior written consent. MPT confidential information does not include information that you independently developed, that was rightfully given to you by a third party without a confidentiality obligation with regard to such information, or that becomes public through no fault of your own. 
You may disclose MPT confidential information when compelled to do so by law if you provide us reasonable prior notice, unless a court order prohibits such notice. + +5. Termination +5.1 Termination. The term of this Agreement will begin on the date on which you click accept, download, or use the SDK or any of its components and will continue until terminated as set forth in this Agreement. MPT reserves the right to terminate this Agreement with you, or to discontinue or suspend the SDK or any portion or feature or your access thereto in the event you breach any material provisions of this Agreement or the Terms, without liability or other obligation to you. + +5.2. Discontinuation of SDK. MPT reserves the right to discontinue all or part of the SDK at any time, in our sole discretion, without notice to you, and without liability or other obligation to you. This Agreement will terminate automatically and without notice to you in the event that the SDK is discontinued in its entirety. + +5.3 Effect of Termination. Upon termination of this Agreement, you will immediately stop using, distributing, or otherwise making available the SDK and all Applications that incorporate the SDK or any of its components, cease all use of the Brand Features, and destroy or return any cached or stored content, software, or other materials obtained through our SDK. + +5.4 Surviving Provisions. When this Agreement terminates, those terms that by their nature are intended to continue indefinitely will continue to apply, including, but not limited to, Section 3 (Intellectual Property), Section 4 (Confidentiality), Section 5 (Termination), Section 6 (Liability) and Section 7 (General Provisions). + +6. Liability +6.1 Indemnification. Unless prohibited by applicable law, you will indemnify and (at MPT’s option), defend MPT, its affiliates and subsidiaries, and the agents, licensors, contributors, directors, officers, employees, suppliers, and distributors thereof (collectively, “MPT Parties”) against all liabilities, damages, losses, costs, fees (including legal fees), and expenses relating to any allegation or third-party legal proceeding arising from: (1) your use of the SDK, or any negligence or misconduct, by you or your employees, agents, vendors, or contractors (collectively “Developer Parties”); (2) any Developer Parties’ violation of this Agreement, Terms, or any applicable law and regulation; (3) any use of your Application; or (4) Developer User Data or MPT User Data (each defined in the Developer Data Use Policy). + +6.2 WARRANTIES. EXCEPT AS EXPRESSLY SET OUT IN THE TERMS, THE SDK IS PROVIDED “AS IS” WITHOUT ANY SPECIFIC PROMISES, REPRESENTATIONS, GUARANTEES OR WARRANTIES, WHETHER EXPRESS, IMPLIED OR STATUTORY, INCLUDING, BUT NOT LIMITED TO, ANY COMMITMENTS ABOUT THE CONTENT ACCESSED THROUGH THE SDK, THE SPECIFIC FUNCTIONS OF THE SDK OR OUR PLATFORM SERVICES, OR THEIR RELIABILITY, AVAILABILITY, OR ABILITY TO MEET YOUR NEEDS. THE MPT PARTIES HEREBY DISCLAIM ANY IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY, NON-INFRINGEMENT AND FITNESS FOR A PARTICULAR PURPOSE. SOME JURISDICTIONS DO NOT PERMIT THE EXCLUSION OR LIMITATION OF IMPLIED WARRANTIES, SO YOU MAY HAVE ADDITIONAL RIGHTS. + +6.3 LIMITATION OF LIABILITY. 
TO THE EXTENT PERMITTED BY APPLICABLE LAW, MPT PARTIES WILL NOT BE RESPONSIBLE FOR LOST PROFITS, BUSINESS OR GOODWILL, REVENUES, OR DATA; FINANCIAL LOSSES; OR INDIRECT, SPECIAL, CONSEQUENTIAL, EXEMPLARY, OR PUNITIVE DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) ARISING AS A RESULT OF THIS AGREEMENT, USE OF THE SDK OR ANY MODIFIED SAMPLE CODE EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. YOU AGREE THAT YOUR REMEDIES UNDER THIS AGREEMENT ARE LIMITED SOLELY TO THE RIGHT TO COLLECT MONEY DAMAGES, IF ANY, AND YOU HEREBY WAIVE YOUR RIGHT TO SEEK INJUNCTIVE RELIEF OR OTHER EQUITABLE RELIEF. IF YOU ARE A CALIFORNIA RESIDENT, YOU AGREE TO WAIVE CALIFORNIA CIVIL CODE § 1542, WHICH SAYS: “A GENERAL RELEASE DOES NOT EXTEND TO CLAIMS THAT THE CREDITOR OR RELEASING PARTY DOES NOT KNOW OR SUSPECT TO EXIST IN HIS OR HER FAVOR AT THE TIME OF EXECUTING THE RELEASE, WHICH IF KNOWN BY HIM OR HER WOULD HAVE MATERIALLY AFFECTED HIS OR HER SETTLEMENT WITH THE DEBTOR OR RELEASED PARTY.” TO THE EXTENT PERMITTED BY LAW, THE CUMULATIVE, AGGREGATE LIABILITY OF MPT PARTIES, FOR ANY AND ALL CLAIMS ARISING UNDER THIS AGREEMENT OR ITS SUBJECT MATTER SHALL NOT EXCEED THE GREATER OF ONE HUNDRED US DOLLARS ($100) OR THE AMOUNT YOU HAVE PAID US IN THE PAST TWELVE MONTHS. IN ALL CASES, MPT PARTIES WILL NOT BE LIABLE FOR ANY EXPENSE, LOSS, OR DAMAGE THAT IS NOT REASONABLY FORESEEABLE. + +7. General Provisions +7.1 Updates. We may need to update this Agreement from time to time, including to accurately reflect the access or uses of our SDK, and so we encourage you to check this Agreement regularly. By continuing to access or use our SDK after any notice of an update to this Agreement, you agree to be bound by them. Any updates to the Disputes section of this Agreement will apply only to disputes that arise after notice of the update takes place. If you do not agree to the updated terms, please stop all access or use of our SDK. You cannot sidestep your compliance obligations under an updated version of this Agreement by developing against an older release of the SDK or relying on the older Agreement and all updates to your application are subject to the modified Agreement. + +7.2 Authorization. You hereby grant MPT and its contractors and affiliates the authorization reasonably necessary for MPT to exercise its rights and perform its obligations under this Agreement, including a limited, royalty-free, non-exclusive license to use, perform, and display the Application you provide to MPT for testing, evaluation, and approval purposes. + +7.3 General Provisions. You and MPT are independent contractors with regard to each other. This Agreement does not create any third-party beneficiary rights or any agency, partnership, employment, or joint venture. We are not liable for failure or delay in performance to the extent caused by circumstances beyond our reasonable control. If you do not comply with this Agreement, and MPT does not take action right away or does not enforce any provision of this Agreement, this inaction or lack of enforcement will not act as a waiver by MPT of any rights that it may have (such as taking action in the future) or in any way affect the validity of this Agreement or parts thereof. 
If a particular provision of this Agreement is deemed unenforceable, it will be deemed modified to the minimum extent necessary to render it enforceable and most nearly reflect the intent of the original provision, and all other provisions in this Agreement shall remain in full force and effect. You may not assign or delegate this Agreement or any obligations under this Agreement without our advance written consent. Any such prohibited attempted assignment will be void. MPT may assign or delegate this Agreement and any of its rights or obligations under this Agreement without your consent or notice to you. This Agreement shall bind the parties and their respective heirs, successors, and permitted assigns. This Agreement is the entire agreement between you and MPT relating to the subject matter herein and supersedes any prior or contemporaneous agreements on such subject matter. + +7.4 Dispute Resolution. + +7.4.1 If you reside outside the US or your business is located outside the US: You agree that any claim, cause of action, or dispute you have against us that arises out of or relates to any access or use of the SDK must be resolved exclusively in the U.S. District Court for the Northern District of California or a state court located in San Mateo County, that you submit to the personal jurisdiction of either of these courts for the purpose of litigating any such claim, and that the laws of the State of California will govern this Agreement and any such claim, without regard to conflict of law provisions. + +7.4.2 If you reside in the US or your business is located in the US: You and we agree to arbitrate any claim, cause of action, or dispute between you and us that arises out of or relates to any access or use of the SDK for business or commercial purposes (“commercial claim”). This provision does not cover any commercial claims relating to violations of your or our intellectual property rights, including, but not limited to, copyright infringement, patent infringement, trademark infringement, violations of the brand guidelines, violations of your or our confidential information or trade secrets, or efforts to interfere with our products or engage with our products in unauthorized ways (for example, automated ways). + +7.4.3 We and you agree that, by entering into this arbitration provision, all parties are waiving their respective rights to a trial by jury or to participate in a class or representative action. THE PARTIES AGREE THAT EACH MAY BRING COMMERCIAL CLAIMS AGAINST THE OTHER ONLY IN ITS INDIVIDUAL CAPACITY, AND NOT AS A PLAINTIFF OR CLASS MEMBER IN ANY PURPORTED CLASS, REPRESENTATIVE, OR PRIVATE ATTORNEY GENERAL PROCEEDING. You may bring a commercial claim only on your own behalf and cannot seek relief that would affect other parties. If there is a final judicial determination that any particular commercial claim (or a request for particular relief) cannot be arbitrated in accordance with this paragraph’s limitations, then only that commercial claim (or only that request for relief) may be brought in court. All other commercial claims (or requests for relief) remain subject to this paragraph. + +7.4.4 The Federal Arbitration Act governs the interpretation and enforcement of this arbitration provision. All issues are for an arbitrator to decide, except that only a court may decide issues relating to the scope or enforceability of this arbitration provision or the interpretation of the prohibition of class and representative actions. 
+ +7.4.5 If any party intends to seek arbitration of a dispute, that party must provide the other party with notice in writing. + +7.4.6 The arbitration will be governed by the AAA’s Commercial Arbitration Rules (“AAA Rules”), as modified by this Agreement, and will be administered by the AAA. If the AAA is unavailable, the parties will agree to another arbitration provider or the court will appoint a substitute. The arbitrator will not be bound by rulings in other arbitrations in which you are not a party. To the fullest extent permitted by applicable law, any evidentiary submissions made in arbitration will be maintained as confidential in the absence of good cause for their disclosure. The arbitrator’s award will be maintained as confidential only to the extent necessary to protect either party’s trade secrets or proprietary business information or to comply with a legal requirement mandating confidentiality. Each party will be responsible for paying any AAA filing, administrative, and arbitrator fees in accordance with AAA Rules, except that we will pay for your filing, administrative, and arbitrator fees if your commercial claim for damages does not exceed $75,000 and is non-frivolous (as measured by the standards set forth in Federal Rule of Civil Procedure 11(b)). + +7.4.7 If you do not wish to be bound by this provision (including its waiver of class and representative claims), you must notify us as set forth below within 30 days of the first acceptance date of any version of this Agreement containing an arbitration provision. Your notice to us under this subsection must be submitted to the address here: Meta Platforms Technologies, LLC, 1 Hacker Way, Menlo Park, California 94025 + +7.4.8 All commercial claims between us, whether subject to arbitration or not, will be governed by California law, excluding California’s conflict of laws rules, except to the extent that California law is contrary to or preempted by federal law. + +7.4.9 If a commercial claim between you and us is not subject to arbitration, you agree that the claim must be resolved exclusively in the U.S. District Court for the Northern District of California or a state court located in San Mateo County, and that you submit to the personal jurisdiction of either of these courts for the purpose of litigating any such claim. + +7.4.10 If any provision of this dispute resolution provision is found unenforceable, that provision will be severed and the balance of the dispute resolution provision will remain in full force and effect. 
diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/assets/models/mediapipe/efficientdet_lite0.tflite b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/assets/models/mediapipe/efficientdet_lite0.tflite new file mode 100644 index 00000000..74fa351d Binary files /dev/null and b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/assets/models/mediapipe/efficientdet_lite0.tflite differ diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/assets/models/mlkit/efficientnet-tflite-lite4-uint8-v1.tflite b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/assets/models/mlkit/efficientnet-tflite-lite4-uint8-v1.tflite new file mode 100755 index 00000000..58ca8604 Binary files /dev/null and b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/assets/models/mlkit/efficientnet-tflite-lite4-uint8-v1.tflite differ diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/assets/models/mlkit/mobile_object_labeler_v1.tflite b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/assets/models/mlkit/mobile_object_labeler_v1.tflite new file mode 100755 index 00000000..b118ad39 Binary files /dev/null and b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/assets/models/mlkit/mobile_object_labeler_v1.tflite differ diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/assets/models/mlkit/mobilenet-v1-tflite-1-0-224-quantized-metadata-v1.tflite b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/assets/models/mlkit/mobilenet-v1-tflite-1-0-224-quantized-metadata-v1.tflite new file mode 100755 index 00000000..8cf2048f Binary files /dev/null and b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/assets/models/mlkit/mobilenet-v1-tflite-1-0-224-quantized-metadata-v1.tflite differ diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/assets/models/mlkit/nasnet-tflite-mobile-metadata-v1.tflite b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/assets/models/mlkit/nasnet-tflite-mobile-metadata-v1.tflite new file mode 100755 index 00000000..65e59578 Binary files /dev/null and b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/assets/models/mlkit/nasnet-tflite-mobile-metadata-v1.tflite differ diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/assets/models/opencv/deploy.prototxt b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/assets/models/opencv/deploy.prototxt new file mode 100644 index 00000000..88021a94 --- /dev/null +++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/assets/models/opencv/deploy.prototxt @@ -0,0 +1,3102 @@ +name: "MobileNet-SSD" +input: "data" +input_shape { + dim: 1 + dim: 3 + dim: 300 + dim: 300 +} +layer { + name: "conv0" + type: "Convolution" + bottom: "data" + top: "conv0" + param { + lr_mult: 0.1 + decay_mult: 0.1 + } + convolution_param { + num_output: 32 + bias_term: false + pad: 1 + kernel_size: 3 + stride: 2 + weight_filler { + type: "msra" + } + } +} +layer { + name: "conv0/bn" + type: "BatchNorm" + bottom: "conv0" + top: "conv0" + param { + lr_mult: 0 + decay_mult: 0 + } + param { + lr_mult: 0 + decay_mult: 0 + } + param { + lr_mult: 0 + decay_mult: 0 + } +} +layer { + name: "conv0/scale" + type: "Scale" + bottom: "conv0" + top: "conv0" + 
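+ # Caffe's BatchNorm layer only normalizes its input; the Scale layer that follows each
+ # BatchNorm supplies the learned per-channel scale (filler value 1) and bias (bias_filler value 0).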
param { + lr_mult: 0.1 + decay_mult: 0.0 + } + param { + lr_mult: 0.2 + decay_mult: 0.0 + } + scale_param { + filler { + value: 1 + } + bias_term: true + bias_filler { + value: 0 + } + } +} +layer { + name: "conv0/relu" + type: "ReLU" + bottom: "conv0" + top: "conv0" +} +layer { + name: "conv1/dw" + type: "Convolution" + bottom: "conv0" + top: "conv1/dw" + param { + lr_mult: 0.1 + decay_mult: 0.1 + } + convolution_param { + num_output: 32 + bias_term: false + pad: 1 + kernel_size: 3 + group: 32 + #engine: CAFFE + weight_filler { + type: "msra" + } + } +} +layer { + name: "conv1/dw/bn" + type: "BatchNorm" + bottom: "conv1/dw" + top: "conv1/dw" + param { + lr_mult: 0 + decay_mult: 0 + } + param { + lr_mult: 0 + decay_mult: 0 + } + param { + lr_mult: 0 + decay_mult: 0 + } +} +layer { + name: "conv1/dw/scale" + type: "Scale" + bottom: "conv1/dw" + top: "conv1/dw" + param { + lr_mult: 0.1 + decay_mult: 0.0 + } + param { + lr_mult: 0.2 + decay_mult: 0.0 + } + scale_param { + filler { + value: 1 + } + bias_term: true + bias_filler { + value: 0 + } + } +} +layer { + name: "conv1/dw/relu" + type: "ReLU" + bottom: "conv1/dw" + top: "conv1/dw" +} +layer { + name: "conv1" + type: "Convolution" + bottom: "conv1/dw" + top: "conv1" + param { + lr_mult: 0.1 + decay_mult: 0.1 + } + convolution_param { + num_output: 64 + bias_term: false + kernel_size: 1 + weight_filler { + type: "msra" + } + } +} +layer { + name: "conv1/bn" + type: "BatchNorm" + bottom: "conv1" + top: "conv1" + param { + lr_mult: 0 + decay_mult: 0 + } + param { + lr_mult: 0 + decay_mult: 0 + } + param { + lr_mult: 0 + decay_mult: 0 + } +} +layer { + name: "conv1/scale" + type: "Scale" + bottom: "conv1" + top: "conv1" + param { + lr_mult: 0.1 + decay_mult: 0.0 + } + param { + lr_mult: 0.2 + decay_mult: 0.0 + } + scale_param { + filler { + value: 1 + } + bias_term: true + bias_filler { + value: 0 + } + } +} +layer { + name: "conv1/relu" + type: "ReLU" + bottom: "conv1" + top: "conv1" +} +layer { + name: "conv2/dw" + type: "Convolution" + bottom: "conv1" + top: "conv2/dw" + param { + lr_mult: 0.1 + decay_mult: 0.1 + } + convolution_param { + num_output: 64 + bias_term: false + pad: 1 + kernel_size: 3 + stride: 2 + group: 64 + #engine: CAFFE + weight_filler { + type: "msra" + } + } +} +layer { + name: "conv2/dw/bn" + type: "BatchNorm" + bottom: "conv2/dw" + top: "conv2/dw" + param { + lr_mult: 0 + decay_mult: 0 + } + param { + lr_mult: 0 + decay_mult: 0 + } + param { + lr_mult: 0 + decay_mult: 0 + } +} +layer { + name: "conv2/dw/scale" + type: "Scale" + bottom: "conv2/dw" + top: "conv2/dw" + param { + lr_mult: 0.1 + decay_mult: 0.0 + } + param { + lr_mult: 0.2 + decay_mult: 0.0 + } + scale_param { + filler { + value: 1 + } + bias_term: true + bias_filler { + value: 0 + } + } +} +layer { + name: "conv2/dw/relu" + type: "ReLU" + bottom: "conv2/dw" + top: "conv2/dw" +} +layer { + name: "conv2" + type: "Convolution" + bottom: "conv2/dw" + top: "conv2" + param { + lr_mult: 0.1 + decay_mult: 0.1 + } + convolution_param { + num_output: 128 + bias_term: false + kernel_size: 1 + weight_filler { + type: "msra" + } + } +} +layer { + name: "conv2/bn" + type: "BatchNorm" + bottom: "conv2" + top: "conv2" + param { + lr_mult: 0 + decay_mult: 0 + } + param { + lr_mult: 0 + decay_mult: 0 + } + param { + lr_mult: 0 + decay_mult: 0 + } +} +layer { + name: "conv2/scale" + type: "Scale" + bottom: "conv2" + top: "conv2" + param { + lr_mult: 0.1 + decay_mult: 0.0 + } + param { + lr_mult: 0.2 + decay_mult: 0.0 + } + scale_param { + filler { + value: 1 + } + 
bias_term: true + bias_filler { + value: 0 + } + } +} +layer { + name: "conv2/relu" + type: "ReLU" + bottom: "conv2" + top: "conv2" +} +layer { + name: "conv3/dw" + type: "Convolution" + bottom: "conv2" + top: "conv3/dw" + param { + lr_mult: 0.1 + decay_mult: 0.1 + } + convolution_param { + num_output: 128 + bias_term: false + pad: 1 + kernel_size: 3 + group: 128 + #engine: CAFFE + weight_filler { + type: "msra" + } + } +} +layer { + name: "conv3/dw/bn" + type: "BatchNorm" + bottom: "conv3/dw" + top: "conv3/dw" + param { + lr_mult: 0 + decay_mult: 0 + } + param { + lr_mult: 0 + decay_mult: 0 + } + param { + lr_mult: 0 + decay_mult: 0 + } +} +layer { + name: "conv3/dw/scale" + type: "Scale" + bottom: "conv3/dw" + top: "conv3/dw" + param { + lr_mult: 0.1 + decay_mult: 0.0 + } + param { + lr_mult: 0.2 + decay_mult: 0.0 + } + scale_param { + filler { + value: 1 + } + bias_term: true + bias_filler { + value: 0 + } + } +} +layer { + name: "conv3/dw/relu" + type: "ReLU" + bottom: "conv3/dw" + top: "conv3/dw" +} +layer { + name: "conv3" + type: "Convolution" + bottom: "conv3/dw" + top: "conv3" + param { + lr_mult: 0.1 + decay_mult: 0.1 + } + convolution_param { + num_output: 128 + bias_term: false + kernel_size: 1 + weight_filler { + type: "msra" + } + } +} +layer { + name: "conv3/bn" + type: "BatchNorm" + bottom: "conv3" + top: "conv3" + param { + lr_mult: 0 + decay_mult: 0 + } + param { + lr_mult: 0 + decay_mult: 0 + } + param { + lr_mult: 0 + decay_mult: 0 + } +} +layer { + name: "conv3/scale" + type: "Scale" + bottom: "conv3" + top: "conv3" + param { + lr_mult: 0.1 + decay_mult: 0.0 + } + param { + lr_mult: 0.2 + decay_mult: 0.0 + } + scale_param { + filler { + value: 1 + } + bias_term: true + bias_filler { + value: 0 + } + } +} +layer { + name: "conv3/relu" + type: "ReLU" + bottom: "conv3" + top: "conv3" +} +layer { + name: "conv4/dw" + type: "Convolution" + bottom: "conv3" + top: "conv4/dw" + param { + lr_mult: 0.1 + decay_mult: 0.1 + } + convolution_param { + num_output: 128 + bias_term: false + pad: 1 + kernel_size: 3 + stride: 2 + group: 128 + #engine: CAFFE + weight_filler { + type: "msra" + } + } +} +layer { + name: "conv4/dw/bn" + type: "BatchNorm" + bottom: "conv4/dw" + top: "conv4/dw" + param { + lr_mult: 0 + decay_mult: 0 + } + param { + lr_mult: 0 + decay_mult: 0 + } + param { + lr_mult: 0 + decay_mult: 0 + } +} +layer { + name: "conv4/dw/scale" + type: "Scale" + bottom: "conv4/dw" + top: "conv4/dw" + param { + lr_mult: 0.1 + decay_mult: 0.0 + } + param { + lr_mult: 0.2 + decay_mult: 0.0 + } + scale_param { + filler { + value: 1 + } + bias_term: true + bias_filler { + value: 0 + } + } +} +layer { + name: "conv4/dw/relu" + type: "ReLU" + bottom: "conv4/dw" + top: "conv4/dw" +} +layer { + name: "conv4" + type: "Convolution" + bottom: "conv4/dw" + top: "conv4" + param { + lr_mult: 0.1 + decay_mult: 0.1 + } + convolution_param { + num_output: 256 + bias_term: false + kernel_size: 1 + weight_filler { + type: "msra" + } + } +} +layer { + name: "conv4/bn" + type: "BatchNorm" + bottom: "conv4" + top: "conv4" + param { + lr_mult: 0 + decay_mult: 0 + } + param { + lr_mult: 0 + decay_mult: 0 + } + param { + lr_mult: 0 + decay_mult: 0 + } +} +layer { + name: "conv4/scale" + type: "Scale" + bottom: "conv4" + top: "conv4" + param { + lr_mult: 0.1 + decay_mult: 0.0 + } + param { + lr_mult: 0.2 + decay_mult: 0.0 + } + scale_param { + filler { + value: 1 + } + bias_term: true + bias_filler { + value: 0 + } + } +} +layer { + name: "conv4/relu" + type: "ReLU" + bottom: "conv4" + top: "conv4" +} 
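+# Each convN stage repeats the MobileNet depthwise-separable pattern seen above: a 3x3
+# depthwise convolution (group equal to its channel count) followed by a 1x1 pointwise
+# convolution, each wrapped with BatchNorm + Scale + ReLU.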
+layer { + name: "conv5/dw" + type: "Convolution" + bottom: "conv4" + top: "conv5/dw" + param { + lr_mult: 0.1 + decay_mult: 0.1 + } + convolution_param { + num_output: 256 + bias_term: false + pad: 1 + kernel_size: 3 + group: 256 + #engine: CAFFE + weight_filler { + type: "msra" + } + } +} +layer { + name: "conv5/dw/bn" + type: "BatchNorm" + bottom: "conv5/dw" + top: "conv5/dw" + param { + lr_mult: 0 + decay_mult: 0 + } + param { + lr_mult: 0 + decay_mult: 0 + } + param { + lr_mult: 0 + decay_mult: 0 + } +} +layer { + name: "conv5/dw/scale" + type: "Scale" + bottom: "conv5/dw" + top: "conv5/dw" + param { + lr_mult: 0.1 + decay_mult: 0.0 + } + param { + lr_mult: 0.2 + decay_mult: 0.0 + } + scale_param { + filler { + value: 1 + } + bias_term: true + bias_filler { + value: 0 + } + } +} +layer { + name: "conv5/dw/relu" + type: "ReLU" + bottom: "conv5/dw" + top: "conv5/dw" +} +layer { + name: "conv5" + type: "Convolution" + bottom: "conv5/dw" + top: "conv5" + param { + lr_mult: 0.1 + decay_mult: 0.1 + } + convolution_param { + num_output: 256 + bias_term: false + kernel_size: 1 + weight_filler { + type: "msra" + } + } +} +layer { + name: "conv5/bn" + type: "BatchNorm" + bottom: "conv5" + top: "conv5" + param { + lr_mult: 0 + decay_mult: 0 + } + param { + lr_mult: 0 + decay_mult: 0 + } + param { + lr_mult: 0 + decay_mult: 0 + } +} +layer { + name: "conv5/scale" + type: "Scale" + bottom: "conv5" + top: "conv5" + param { + lr_mult: 0.1 + decay_mult: 0.0 + } + param { + lr_mult: 0.2 + decay_mult: 0.0 + } + scale_param { + filler { + value: 1 + } + bias_term: true + bias_filler { + value: 0 + } + } +} +layer { + name: "conv5/relu" + type: "ReLU" + bottom: "conv5" + top: "conv5" +} +layer { + name: "conv6/dw" + type: "Convolution" + bottom: "conv5" + top: "conv6/dw" + param { + lr_mult: 0.1 + decay_mult: 0.1 + } + convolution_param { + num_output: 256 + bias_term: false + pad: 1 + kernel_size: 3 + stride: 2 + group: 256 + #engine: CAFFE + weight_filler { + type: "msra" + } + } +} +layer { + name: "conv6/dw/bn" + type: "BatchNorm" + bottom: "conv6/dw" + top: "conv6/dw" + param { + lr_mult: 0 + decay_mult: 0 + } + param { + lr_mult: 0 + decay_mult: 0 + } + param { + lr_mult: 0 + decay_mult: 0 + } +} +layer { + name: "conv6/dw/scale" + type: "Scale" + bottom: "conv6/dw" + top: "conv6/dw" + param { + lr_mult: 0.1 + decay_mult: 0.0 + } + param { + lr_mult: 0.2 + decay_mult: 0.0 + } + scale_param { + filler { + value: 1 + } + bias_term: true + bias_filler { + value: 0 + } + } +} +layer { + name: "conv6/dw/relu" + type: "ReLU" + bottom: "conv6/dw" + top: "conv6/dw" +} +layer { + name: "conv6" + type: "Convolution" + bottom: "conv6/dw" + top: "conv6" + param { + lr_mult: 0.1 + decay_mult: 0.1 + } + convolution_param { + num_output: 512 + bias_term: false + kernel_size: 1 + weight_filler { + type: "msra" + } + } +} +layer { + name: "conv6/bn" + type: "BatchNorm" + bottom: "conv6" + top: "conv6" + param { + lr_mult: 0 + decay_mult: 0 + } + param { + lr_mult: 0 + decay_mult: 0 + } + param { + lr_mult: 0 + decay_mult: 0 + } +} +layer { + name: "conv6/scale" + type: "Scale" + bottom: "conv6" + top: "conv6" + param { + lr_mult: 0.1 + decay_mult: 0.0 + } + param { + lr_mult: 0.2 + decay_mult: 0.0 + } + scale_param { + filler { + value: 1 + } + bias_term: true + bias_filler { + value: 0 + } + } +} +layer { + name: "conv6/relu" + type: "ReLU" + bottom: "conv6" + top: "conv6" +} +layer { + name: "conv7/dw" + type: "Convolution" + bottom: "conv6" + top: "conv7/dw" + param { + lr_mult: 0.1 + decay_mult: 0.1 + } + 
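+ # Backbone convolutions train with reduced learning-rate multipliers (lr_mult 0.1/0.2);
+ # the mbox confidence heads further below use the full rate (lr_mult 1.0/2.0).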
convolution_param { + num_output: 512 + bias_term: false + pad: 1 + kernel_size: 3 + group: 512 + #engine: CAFFE + weight_filler { + type: "msra" + } + } +} +layer { + name: "conv7/dw/bn" + type: "BatchNorm" + bottom: "conv7/dw" + top: "conv7/dw" + param { + lr_mult: 0 + decay_mult: 0 + } + param { + lr_mult: 0 + decay_mult: 0 + } + param { + lr_mult: 0 + decay_mult: 0 + } +} +layer { + name: "conv7/dw/scale" + type: "Scale" + bottom: "conv7/dw" + top: "conv7/dw" + param { + lr_mult: 0.1 + decay_mult: 0.0 + } + param { + lr_mult: 0.2 + decay_mult: 0.0 + } + scale_param { + filler { + value: 1 + } + bias_term: true + bias_filler { + value: 0 + } + } +} +layer { + name: "conv7/dw/relu" + type: "ReLU" + bottom: "conv7/dw" + top: "conv7/dw" +} +layer { + name: "conv7" + type: "Convolution" + bottom: "conv7/dw" + top: "conv7" + param { + lr_mult: 0.1 + decay_mult: 0.1 + } + convolution_param { + num_output: 512 + bias_term: false + kernel_size: 1 + weight_filler { + type: "msra" + } + } +} +layer { + name: "conv7/bn" + type: "BatchNorm" + bottom: "conv7" + top: "conv7" + param { + lr_mult: 0 + decay_mult: 0 + } + param { + lr_mult: 0 + decay_mult: 0 + } + param { + lr_mult: 0 + decay_mult: 0 + } +} +layer { + name: "conv7/scale" + type: "Scale" + bottom: "conv7" + top: "conv7" + param { + lr_mult: 0.1 + decay_mult: 0.0 + } + param { + lr_mult: 0.2 + decay_mult: 0.0 + } + scale_param { + filler { + value: 1 + } + bias_term: true + bias_filler { + value: 0 + } + } +} +layer { + name: "conv7/relu" + type: "ReLU" + bottom: "conv7" + top: "conv7" +} +layer { + name: "conv8/dw" + type: "Convolution" + bottom: "conv7" + top: "conv8/dw" + param { + lr_mult: 0.1 + decay_mult: 0.1 + } + convolution_param { + num_output: 512 + bias_term: false + pad: 1 + kernel_size: 3 + group: 512 + #engine: CAFFE + weight_filler { + type: "msra" + } + } +} +layer { + name: "conv8/dw/bn" + type: "BatchNorm" + bottom: "conv8/dw" + top: "conv8/dw" + param { + lr_mult: 0 + decay_mult: 0 + } + param { + lr_mult: 0 + decay_mult: 0 + } + param { + lr_mult: 0 + decay_mult: 0 + } +} +layer { + name: "conv8/dw/scale" + type: "Scale" + bottom: "conv8/dw" + top: "conv8/dw" + param { + lr_mult: 0.1 + decay_mult: 0.0 + } + param { + lr_mult: 0.2 + decay_mult: 0.0 + } + scale_param { + filler { + value: 1 + } + bias_term: true + bias_filler { + value: 0 + } + } +} +layer { + name: "conv8/dw/relu" + type: "ReLU" + bottom: "conv8/dw" + top: "conv8/dw" +} +layer { + name: "conv8" + type: "Convolution" + bottom: "conv8/dw" + top: "conv8" + param { + lr_mult: 0.1 + decay_mult: 0.1 + } + convolution_param { + num_output: 512 + bias_term: false + kernel_size: 1 + weight_filler { + type: "msra" + } + } +} +layer { + name: "conv8/bn" + type: "BatchNorm" + bottom: "conv8" + top: "conv8" + param { + lr_mult: 0 + decay_mult: 0 + } + param { + lr_mult: 0 + decay_mult: 0 + } + param { + lr_mult: 0 + decay_mult: 0 + } +} +layer { + name: "conv8/scale" + type: "Scale" + bottom: "conv8" + top: "conv8" + param { + lr_mult: 0.1 + decay_mult: 0.0 + } + param { + lr_mult: 0.2 + decay_mult: 0.0 + } + scale_param { + filler { + value: 1 + } + bias_term: true + bias_filler { + value: 0 + } + } +} +layer { + name: "conv8/relu" + type: "ReLU" + bottom: "conv8" + top: "conv8" +} +layer { + name: "conv9/dw" + type: "Convolution" + bottom: "conv8" + top: "conv9/dw" + param { + lr_mult: 0.1 + decay_mult: 0.1 + } + convolution_param { + num_output: 512 + bias_term: false + pad: 1 + kernel_size: 3 + group: 512 + #engine: CAFFE + weight_filler { + type: "msra" + } + 
} +} +layer { + name: "conv9/dw/bn" + type: "BatchNorm" + bottom: "conv9/dw" + top: "conv9/dw" + param { + lr_mult: 0 + decay_mult: 0 + } + param { + lr_mult: 0 + decay_mult: 0 + } + param { + lr_mult: 0 + decay_mult: 0 + } +} +layer { + name: "conv9/dw/scale" + type: "Scale" + bottom: "conv9/dw" + top: "conv9/dw" + param { + lr_mult: 0.1 + decay_mult: 0.0 + } + param { + lr_mult: 0.2 + decay_mult: 0.0 + } + scale_param { + filler { + value: 1 + } + bias_term: true + bias_filler { + value: 0 + } + } +} +layer { + name: "conv9/dw/relu" + type: "ReLU" + bottom: "conv9/dw" + top: "conv9/dw" +} +layer { + name: "conv9" + type: "Convolution" + bottom: "conv9/dw" + top: "conv9" + param { + lr_mult: 0.1 + decay_mult: 0.1 + } + convolution_param { + num_output: 512 + bias_term: false + kernel_size: 1 + weight_filler { + type: "msra" + } + } +} +layer { + name: "conv9/bn" + type: "BatchNorm" + bottom: "conv9" + top: "conv9" + param { + lr_mult: 0 + decay_mult: 0 + } + param { + lr_mult: 0 + decay_mult: 0 + } + param { + lr_mult: 0 + decay_mult: 0 + } +} +layer { + name: "conv9/scale" + type: "Scale" + bottom: "conv9" + top: "conv9" + param { + lr_mult: 0.1 + decay_mult: 0.0 + } + param { + lr_mult: 0.2 + decay_mult: 0.0 + } + scale_param { + filler { + value: 1 + } + bias_term: true + bias_filler { + value: 0 + } + } +} +layer { + name: "conv9/relu" + type: "ReLU" + bottom: "conv9" + top: "conv9" +} +layer { + name: "conv10/dw" + type: "Convolution" + bottom: "conv9" + top: "conv10/dw" + param { + lr_mult: 0.1 + decay_mult: 0.1 + } + convolution_param { + num_output: 512 + bias_term: false + pad: 1 + kernel_size: 3 + group: 512 + #engine: CAFFE + weight_filler { + type: "msra" + } + } +} +layer { + name: "conv10/dw/bn" + type: "BatchNorm" + bottom: "conv10/dw" + top: "conv10/dw" + param { + lr_mult: 0 + decay_mult: 0 + } + param { + lr_mult: 0 + decay_mult: 0 + } + param { + lr_mult: 0 + decay_mult: 0 + } +} +layer { + name: "conv10/dw/scale" + type: "Scale" + bottom: "conv10/dw" + top: "conv10/dw" + param { + lr_mult: 0.1 + decay_mult: 0.0 + } + param { + lr_mult: 0.2 + decay_mult: 0.0 + } + scale_param { + filler { + value: 1 + } + bias_term: true + bias_filler { + value: 0 + } + } +} +layer { + name: "conv10/dw/relu" + type: "ReLU" + bottom: "conv10/dw" + top: "conv10/dw" +} +layer { + name: "conv10" + type: "Convolution" + bottom: "conv10/dw" + top: "conv10" + param { + lr_mult: 0.1 + decay_mult: 0.1 + } + convolution_param { + num_output: 512 + bias_term: false + kernel_size: 1 + weight_filler { + type: "msra" + } + } +} +layer { + name: "conv10/bn" + type: "BatchNorm" + bottom: "conv10" + top: "conv10" + param { + lr_mult: 0 + decay_mult: 0 + } + param { + lr_mult: 0 + decay_mult: 0 + } + param { + lr_mult: 0 + decay_mult: 0 + } +} +layer { + name: "conv10/scale" + type: "Scale" + bottom: "conv10" + top: "conv10" + param { + lr_mult: 0.1 + decay_mult: 0.0 + } + param { + lr_mult: 0.2 + decay_mult: 0.0 + } + scale_param { + filler { + value: 1 + } + bias_term: true + bias_filler { + value: 0 + } + } +} +layer { + name: "conv10/relu" + type: "ReLU" + bottom: "conv10" + top: "conv10" +} +layer { + name: "conv11/dw" + type: "Convolution" + bottom: "conv10" + top: "conv11/dw" + param { + lr_mult: 0.1 + decay_mult: 0.1 + } + convolution_param { + num_output: 512 + bias_term: false + pad: 1 + kernel_size: 3 + group: 512 + #engine: CAFFE + weight_filler { + type: "msra" + } + } +} +layer { + name: "conv11/dw/bn" + type: "BatchNorm" + bottom: "conv11/dw" + top: "conv11/dw" + param { + lr_mult: 0 + 
decay_mult: 0 + } + param { + lr_mult: 0 + decay_mult: 0 + } + param { + lr_mult: 0 + decay_mult: 0 + } +} +layer { + name: "conv11/dw/scale" + type: "Scale" + bottom: "conv11/dw" + top: "conv11/dw" + param { + lr_mult: 0.1 + decay_mult: 0.0 + } + param { + lr_mult: 0.2 + decay_mult: 0.0 + } + scale_param { + filler { + value: 1 + } + bias_term: true + bias_filler { + value: 0 + } + } +} +layer { + name: "conv11/dw/relu" + type: "ReLU" + bottom: "conv11/dw" + top: "conv11/dw" +} +layer { + name: "conv11" + type: "Convolution" + bottom: "conv11/dw" + top: "conv11" + param { + lr_mult: 0.1 + decay_mult: 0.1 + } + convolution_param { + num_output: 512 + bias_term: false + kernel_size: 1 + weight_filler { + type: "msra" + } + } +} +layer { + name: "conv11/bn" + type: "BatchNorm" + bottom: "conv11" + top: "conv11" + param { + lr_mult: 0 + decay_mult: 0 + } + param { + lr_mult: 0 + decay_mult: 0 + } + param { + lr_mult: 0 + decay_mult: 0 + } +} +layer { + name: "conv11/scale" + type: "Scale" + bottom: "conv11" + top: "conv11" + param { + lr_mult: 0.1 + decay_mult: 0.0 + } + param { + lr_mult: 0.2 + decay_mult: 0.0 + } + scale_param { + filler { + value: 1 + } + bias_term: true + bias_filler { + value: 0 + } + } +} +layer { + name: "conv11/relu" + type: "ReLU" + bottom: "conv11" + top: "conv11" +} +layer { + name: "conv12/dw" + type: "Convolution" + bottom: "conv11" + top: "conv12/dw" + param { + lr_mult: 0.1 + decay_mult: 0.1 + } + convolution_param { + num_output: 512 + bias_term: false + pad: 1 + kernel_size: 3 + stride: 2 + group: 512 + #engine: CAFFE + weight_filler { + type: "msra" + } + } +} +layer { + name: "conv12/dw/bn" + type: "BatchNorm" + bottom: "conv12/dw" + top: "conv12/dw" + param { + lr_mult: 0 + decay_mult: 0 + } + param { + lr_mult: 0 + decay_mult: 0 + } + param { + lr_mult: 0 + decay_mult: 0 + } +} +layer { + name: "conv12/dw/scale" + type: "Scale" + bottom: "conv12/dw" + top: "conv12/dw" + param { + lr_mult: 0.1 + decay_mult: 0.0 + } + param { + lr_mult: 0.2 + decay_mult: 0.0 + } + scale_param { + filler { + value: 1 + } + bias_term: true + bias_filler { + value: 0 + } + } +} +layer { + name: "conv12/dw/relu" + type: "ReLU" + bottom: "conv12/dw" + top: "conv12/dw" +} +layer { + name: "conv12" + type: "Convolution" + bottom: "conv12/dw" + top: "conv12" + param { + lr_mult: 0.1 + decay_mult: 0.1 + } + convolution_param { + num_output: 1024 + bias_term: false + kernel_size: 1 + weight_filler { + type: "msra" + } + } +} +layer { + name: "conv12/bn" + type: "BatchNorm" + bottom: "conv12" + top: "conv12" + param { + lr_mult: 0 + decay_mult: 0 + } + param { + lr_mult: 0 + decay_mult: 0 + } + param { + lr_mult: 0 + decay_mult: 0 + } +} +layer { + name: "conv12/scale" + type: "Scale" + bottom: "conv12" + top: "conv12" + param { + lr_mult: 0.1 + decay_mult: 0.0 + } + param { + lr_mult: 0.2 + decay_mult: 0.0 + } + scale_param { + filler { + value: 1 + } + bias_term: true + bias_filler { + value: 0 + } + } +} +layer { + name: "conv12/relu" + type: "ReLU" + bottom: "conv12" + top: "conv12" +} +layer { + name: "conv13/dw" + type: "Convolution" + bottom: "conv12" + top: "conv13/dw" + param { + lr_mult: 0.1 + decay_mult: 0.1 + } + convolution_param { + num_output: 1024 + bias_term: false + pad: 1 + kernel_size: 3 + group: 1024 + #engine: CAFFE + weight_filler { + type: "msra" + } + } +} +layer { + name: "conv13/dw/bn" + type: "BatchNorm" + bottom: "conv13/dw" + top: "conv13/dw" + param { + lr_mult: 0 + decay_mult: 0 + } + param { + lr_mult: 0 + decay_mult: 0 + } + param { + lr_mult: 0 + 
decay_mult: 0 + } +} +layer { + name: "conv13/dw/scale" + type: "Scale" + bottom: "conv13/dw" + top: "conv13/dw" + param { + lr_mult: 0.1 + decay_mult: 0.0 + } + param { + lr_mult: 0.2 + decay_mult: 0.0 + } + scale_param { + filler { + value: 1 + } + bias_term: true + bias_filler { + value: 0 + } + } +} +layer { + name: "conv13/dw/relu" + type: "ReLU" + bottom: "conv13/dw" + top: "conv13/dw" +} +layer { + name: "conv13" + type: "Convolution" + bottom: "conv13/dw" + top: "conv13" + param { + lr_mult: 0.1 + decay_mult: 0.1 + } + convolution_param { + num_output: 1024 + bias_term: false + kernel_size: 1 + weight_filler { + type: "msra" + } + } +} +layer { + name: "conv13/bn" + type: "BatchNorm" + bottom: "conv13" + top: "conv13" + param { + lr_mult: 0 + decay_mult: 0 + } + param { + lr_mult: 0 + decay_mult: 0 + } + param { + lr_mult: 0 + decay_mult: 0 + } +} +layer { + name: "conv13/scale" + type: "Scale" + bottom: "conv13" + top: "conv13" + param { + lr_mult: 0.1 + decay_mult: 0.0 + } + param { + lr_mult: 0.2 + decay_mult: 0.0 + } + scale_param { + filler { + value: 1 + } + bias_term: true + bias_filler { + value: 0 + } + } +} +layer { + name: "conv13/relu" + type: "ReLU" + bottom: "conv13" + top: "conv13" +} +layer { + name: "conv14_1" + type: "Convolution" + bottom: "conv13" + top: "conv14_1" + param { + lr_mult: 0.1 + decay_mult: 0.1 + } + convolution_param { + num_output: 256 + bias_term: false + kernel_size: 1 + weight_filler { + type: "msra" + } + } +} +layer { + name: "conv14_1/bn" + type: "BatchNorm" + bottom: "conv14_1" + top: "conv14_1" + param { + lr_mult: 0 + decay_mult: 0 + } + param { + lr_mult: 0 + decay_mult: 0 + } + param { + lr_mult: 0 + decay_mult: 0 + } +} +layer { + name: "conv14_1/scale" + type: "Scale" + bottom: "conv14_1" + top: "conv14_1" + param { + lr_mult: 0.1 + decay_mult: 0.0 + } + param { + lr_mult: 0.2 + decay_mult: 0.0 + } + scale_param { + filler { + value: 1 + } + bias_term: true + bias_filler { + value: 0 + } + } +} +layer { + name: "conv14_1/relu" + type: "ReLU" + bottom: "conv14_1" + top: "conv14_1" +} +layer { + name: "conv14_2" + type: "Convolution" + bottom: "conv14_1" + top: "conv14_2" + param { + lr_mult: 0.1 + decay_mult: 0.1 + } + convolution_param { + num_output: 512 + bias_term: false + pad: 1 + kernel_size: 3 + stride: 2 + weight_filler { + type: "msra" + } + } +} +layer { + name: "conv14_2/bn" + type: "BatchNorm" + bottom: "conv14_2" + top: "conv14_2" + param { + lr_mult: 0 + decay_mult: 0 + } + param { + lr_mult: 0 + decay_mult: 0 + } + param { + lr_mult: 0 + decay_mult: 0 + } +} +layer { + name: "conv14_2/scale" + type: "Scale" + bottom: "conv14_2" + top: "conv14_2" + param { + lr_mult: 0.1 + decay_mult: 0.0 + } + param { + lr_mult: 0.2 + decay_mult: 0.0 + } + scale_param { + filler { + value: 1 + } + bias_term: true + bias_filler { + value: 0 + } + } +} +layer { + name: "conv14_2/relu" + type: "ReLU" + bottom: "conv14_2" + top: "conv14_2" +} +layer { + name: "conv15_1" + type: "Convolution" + bottom: "conv14_2" + top: "conv15_1" + param { + lr_mult: 0.1 + decay_mult: 0.1 + } + convolution_param { + num_output: 128 + bias_term: false + kernel_size: 1 + weight_filler { + type: "msra" + } + } +} +layer { + name: "conv15_1/bn" + type: "BatchNorm" + bottom: "conv15_1" + top: "conv15_1" + param { + lr_mult: 0 + decay_mult: 0 + } + param { + lr_mult: 0 + decay_mult: 0 + } + param { + lr_mult: 0 + decay_mult: 0 + } +} +layer { + name: "conv15_1/scale" + type: "Scale" + bottom: "conv15_1" + top: "conv15_1" + param { + lr_mult: 0.1 + decay_mult: 0.0 
+ } + param { + lr_mult: 0.2 + decay_mult: 0.0 + } + scale_param { + filler { + value: 1 + } + bias_term: true + bias_filler { + value: 0 + } + } +} +layer { + name: "conv15_1/relu" + type: "ReLU" + bottom: "conv15_1" + top: "conv15_1" +} +layer { + name: "conv15_2" + type: "Convolution" + bottom: "conv15_1" + top: "conv15_2" + param { + lr_mult: 0.1 + decay_mult: 0.1 + } + convolution_param { + num_output: 256 + bias_term: false + pad: 1 + kernel_size: 3 + stride: 2 + weight_filler { + type: "msra" + } + } +} +layer { + name: "conv15_2/bn" + type: "BatchNorm" + bottom: "conv15_2" + top: "conv15_2" + param { + lr_mult: 0 + decay_mult: 0 + } + param { + lr_mult: 0 + decay_mult: 0 + } + param { + lr_mult: 0 + decay_mult: 0 + } +} +layer { + name: "conv15_2/scale" + type: "Scale" + bottom: "conv15_2" + top: "conv15_2" + param { + lr_mult: 0.1 + decay_mult: 0.0 + } + param { + lr_mult: 0.2 + decay_mult: 0.0 + } + scale_param { + filler { + value: 1 + } + bias_term: true + bias_filler { + value: 0 + } + } +} +layer { + name: "conv15_2/relu" + type: "ReLU" + bottom: "conv15_2" + top: "conv15_2" +} +layer { + name: "conv16_1" + type: "Convolution" + bottom: "conv15_2" + top: "conv16_1" + param { + lr_mult: 0.1 + decay_mult: 0.1 + } + convolution_param { + num_output: 128 + bias_term: false + kernel_size: 1 + weight_filler { + type: "msra" + } + } +} +layer { + name: "conv16_1/bn" + type: "BatchNorm" + bottom: "conv16_1" + top: "conv16_1" + param { + lr_mult: 0 + decay_mult: 0 + } + param { + lr_mult: 0 + decay_mult: 0 + } + param { + lr_mult: 0 + decay_mult: 0 + } +} +layer { + name: "conv16_1/scale" + type: "Scale" + bottom: "conv16_1" + top: "conv16_1" + param { + lr_mult: 0.1 + decay_mult: 0.0 + } + param { + lr_mult: 0.2 + decay_mult: 0.0 + } + scale_param { + filler { + value: 1 + } + bias_term: true + bias_filler { + value: 0 + } + } +} +layer { + name: "conv16_1/relu" + type: "ReLU" + bottom: "conv16_1" + top: "conv16_1" +} +layer { + name: "conv16_2" + type: "Convolution" + bottom: "conv16_1" + top: "conv16_2" + param { + lr_mult: 0.1 + decay_mult: 0.1 + } + convolution_param { + num_output: 256 + bias_term: false + pad: 1 + kernel_size: 3 + stride: 2 + weight_filler { + type: "msra" + } + } +} +layer { + name: "conv16_2/bn" + type: "BatchNorm" + bottom: "conv16_2" + top: "conv16_2" + param { + lr_mult: 0 + decay_mult: 0 + } + param { + lr_mult: 0 + decay_mult: 0 + } + param { + lr_mult: 0 + decay_mult: 0 + } +} +layer { + name: "conv16_2/scale" + type: "Scale" + bottom: "conv16_2" + top: "conv16_2" + param { + lr_mult: 0.1 + decay_mult: 0.0 + } + param { + lr_mult: 0.2 + decay_mult: 0.0 + } + scale_param { + filler { + value: 1 + } + bias_term: true + bias_filler { + value: 0 + } + } +} +layer { + name: "conv16_2/relu" + type: "ReLU" + bottom: "conv16_2" + top: "conv16_2" +} +layer { + name: "conv17_1" + type: "Convolution" + bottom: "conv16_2" + top: "conv17_1" + param { + lr_mult: 0.1 + decay_mult: 0.1 + } + convolution_param { + num_output: 64 + bias_term: false + kernel_size: 1 + weight_filler { + type: "msra" + } + } +} +layer { + name: "conv17_1/bn" + type: "BatchNorm" + bottom: "conv17_1" + top: "conv17_1" + param { + lr_mult: 0 + decay_mult: 0 + } + param { + lr_mult: 0 + decay_mult: 0 + } + param { + lr_mult: 0 + decay_mult: 0 + } +} +layer { + name: "conv17_1/scale" + type: "Scale" + bottom: "conv17_1" + top: "conv17_1" + param { + lr_mult: 0.1 + decay_mult: 0.0 + } + param { + lr_mult: 0.2 + decay_mult: 0.0 + } + scale_param { + filler { + value: 1 + } + bias_term: true + 
bias_filler { + value: 0 + } + } +} +layer { + name: "conv17_1/relu" + type: "ReLU" + bottom: "conv17_1" + top: "conv17_1" +} +layer { + name: "conv17_2" + type: "Convolution" + bottom: "conv17_1" + top: "conv17_2" + param { + lr_mult: 0.1 + decay_mult: 0.1 + } + convolution_param { + num_output: 128 + bias_term: false + pad: 1 + kernel_size: 3 + stride: 2 + weight_filler { + type: "msra" + } + } +} +layer { + name: "conv17_2/bn" + type: "BatchNorm" + bottom: "conv17_2" + top: "conv17_2" + param { + lr_mult: 0 + decay_mult: 0 + } + param { + lr_mult: 0 + decay_mult: 0 + } + param { + lr_mult: 0 + decay_mult: 0 + } +} +layer { + name: "conv17_2/scale" + type: "Scale" + bottom: "conv17_2" + top: "conv17_2" + param { + lr_mult: 0.1 + decay_mult: 0.0 + } + param { + lr_mult: 0.2 + decay_mult: 0.0 + } + scale_param { + filler { + value: 1 + } + bias_term: true + bias_filler { + value: 0 + } + } +} +layer { + name: "conv17_2/relu" + type: "ReLU" + bottom: "conv17_2" + top: "conv17_2" +} +layer { + name: "conv11_mbox_loc" + type: "Convolution" + bottom: "conv11" + top: "conv11_mbox_loc" + param { + lr_mult: 0.1 + decay_mult: 0.1 + } + param { + lr_mult: 0.2 + decay_mult: 0.0 + } + convolution_param { + num_output: 12 + kernel_size: 1 + weight_filler { + type: "msra" + } + bias_filler { + type: "constant" + value: 0.0 + } + } +} +layer { + name: "conv11_mbox_loc_perm" + type: "Permute" + bottom: "conv11_mbox_loc" + top: "conv11_mbox_loc_perm" + permute_param { + order: 0 + order: 2 + order: 3 + order: 1 + } +} +layer { + name: "conv11_mbox_loc_flat" + type: "Flatten" + bottom: "conv11_mbox_loc_perm" + top: "conv11_mbox_loc_flat" + flatten_param { + axis: 1 + } +} +layer { + name: "conv11_mbox_conf" + type: "Convolution" + bottom: "conv11" + top: "conv11_mbox_conf" + param { + lr_mult: 1.0 + decay_mult: 1.0 + } + param { + lr_mult: 2.0 + decay_mult: 0.0 + } + convolution_param { + num_output: 63 + kernel_size: 1 + weight_filler { + type: "msra" + } + bias_filler { + type: "constant" + value: 0.0 + } + } +} +layer { + name: "conv11_mbox_conf_perm" + type: "Permute" + bottom: "conv11_mbox_conf" + top: "conv11_mbox_conf_perm" + permute_param { + order: 0 + order: 2 + order: 3 + order: 1 + } +} +layer { + name: "conv11_mbox_conf_flat" + type: "Flatten" + bottom: "conv11_mbox_conf_perm" + top: "conv11_mbox_conf_flat" + flatten_param { + axis: 1 + } +} +layer { + name: "conv11_mbox_priorbox" + type: "PriorBox" + bottom: "conv11" + bottom: "data" + top: "conv11_mbox_priorbox" + prior_box_param { + min_size: 60.0 + aspect_ratio: 2.0 + flip: true + clip: false + variance: 0.1 + variance: 0.1 + variance: 0.2 + variance: 0.2 + offset: 0.5 + } +} +layer { + name: "conv13_mbox_loc" + type: "Convolution" + bottom: "conv13" + top: "conv13_mbox_loc" + param { + lr_mult: 0.1 + decay_mult: 0.1 + } + param { + lr_mult: 0.2 + decay_mult: 0.0 + } + convolution_param { + num_output: 24 + kernel_size: 1 + weight_filler { + type: "msra" + } + bias_filler { + type: "constant" + value: 0.0 + } + } +} +layer { + name: "conv13_mbox_loc_perm" + type: "Permute" + bottom: "conv13_mbox_loc" + top: "conv13_mbox_loc_perm" + permute_param { + order: 0 + order: 2 + order: 3 + order: 1 + } +} +layer { + name: "conv13_mbox_loc_flat" + type: "Flatten" + bottom: "conv13_mbox_loc_perm" + top: "conv13_mbox_loc_flat" + flatten_param { + axis: 1 + } +} +layer { + name: "conv13_mbox_conf" + type: "Convolution" + bottom: "conv13" + top: "conv13_mbox_conf" + param { + lr_mult: 1.0 + decay_mult: 1.0 + } + param { + lr_mult: 2.0 + decay_mult: 0.0 
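+ # For conv13_mbox_conf below, num_output 126 = 6 prior boxes x 21 classes (20 object
+ # classes + background); the paired conv13_mbox_loc above uses 24 = 6 prior boxes x 4 box offsets.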
+ } + convolution_param { + num_output: 126 + kernel_size: 1 + weight_filler { + type: "msra" + } + bias_filler { + type: "constant" + value: 0.0 + } + } +} +layer { + name: "conv13_mbox_conf_perm" + type: "Permute" + bottom: "conv13_mbox_conf" + top: "conv13_mbox_conf_perm" + permute_param { + order: 0 + order: 2 + order: 3 + order: 1 + } +} +layer { + name: "conv13_mbox_conf_flat" + type: "Flatten" + bottom: "conv13_mbox_conf_perm" + top: "conv13_mbox_conf_flat" + flatten_param { + axis: 1 + } +} +layer { + name: "conv13_mbox_priorbox" + type: "PriorBox" + bottom: "conv13" + bottom: "data" + top: "conv13_mbox_priorbox" + prior_box_param { + min_size: 105.0 + max_size: 150.0 + aspect_ratio: 2.0 + aspect_ratio: 3.0 + flip: true + clip: false + variance: 0.1 + variance: 0.1 + variance: 0.2 + variance: 0.2 + offset: 0.5 + } +} +layer { + name: "conv14_2_mbox_loc" + type: "Convolution" + bottom: "conv14_2" + top: "conv14_2_mbox_loc" + param { + lr_mult: 0.1 + decay_mult: 0.1 + } + param { + lr_mult: 0.2 + decay_mult: 0.0 + } + convolution_param { + num_output: 24 + kernel_size: 1 + weight_filler { + type: "msra" + } + bias_filler { + type: "constant" + value: 0.0 + } + } +} +layer { + name: "conv14_2_mbox_loc_perm" + type: "Permute" + bottom: "conv14_2_mbox_loc" + top: "conv14_2_mbox_loc_perm" + permute_param { + order: 0 + order: 2 + order: 3 + order: 1 + } +} +layer { + name: "conv14_2_mbox_loc_flat" + type: "Flatten" + bottom: "conv14_2_mbox_loc_perm" + top: "conv14_2_mbox_loc_flat" + flatten_param { + axis: 1 + } +} +layer { + name: "conv14_2_mbox_conf" + type: "Convolution" + bottom: "conv14_2" + top: "conv14_2_mbox_conf" + param { + lr_mult: 1.0 + decay_mult: 1.0 + } + param { + lr_mult: 2.0 + decay_mult: 0.0 + } + convolution_param { + num_output: 126 + kernel_size: 1 + weight_filler { + type: "msra" + } + bias_filler { + type: "constant" + value: 0.0 + } + } +} +layer { + name: "conv14_2_mbox_conf_perm" + type: "Permute" + bottom: "conv14_2_mbox_conf" + top: "conv14_2_mbox_conf_perm" + permute_param { + order: 0 + order: 2 + order: 3 + order: 1 + } +} +layer { + name: "conv14_2_mbox_conf_flat" + type: "Flatten" + bottom: "conv14_2_mbox_conf_perm" + top: "conv14_2_mbox_conf_flat" + flatten_param { + axis: 1 + } +} +layer { + name: "conv14_2_mbox_priorbox" + type: "PriorBox" + bottom: "conv14_2" + bottom: "data" + top: "conv14_2_mbox_priorbox" + prior_box_param { + min_size: 150.0 + max_size: 195.0 + aspect_ratio: 2.0 + aspect_ratio: 3.0 + flip: true + clip: false + variance: 0.1 + variance: 0.1 + variance: 0.2 + variance: 0.2 + offset: 0.5 + } +} +layer { + name: "conv15_2_mbox_loc" + type: "Convolution" + bottom: "conv15_2" + top: "conv15_2_mbox_loc" + param { + lr_mult: 0.1 + decay_mult: 0.1 + } + param { + lr_mult: 0.2 + decay_mult: 0.0 + } + convolution_param { + num_output: 24 + kernel_size: 1 + weight_filler { + type: "msra" + } + bias_filler { + type: "constant" + value: 0.0 + } + } +} +layer { + name: "conv15_2_mbox_loc_perm" + type: "Permute" + bottom: "conv15_2_mbox_loc" + top: "conv15_2_mbox_loc_perm" + permute_param { + order: 0 + order: 2 + order: 3 + order: 1 + } +} +layer { + name: "conv15_2_mbox_loc_flat" + type: "Flatten" + bottom: "conv15_2_mbox_loc_perm" + top: "conv15_2_mbox_loc_flat" + flatten_param { + axis: 1 + } +} +layer { + name: "conv15_2_mbox_conf" + type: "Convolution" + bottom: "conv15_2" + top: "conv15_2_mbox_conf" + param { + lr_mult: 1.0 + decay_mult: 1.0 + } + param { + lr_mult: 2.0 + decay_mult: 0.0 + } + convolution_param { + num_output: 126 + 
kernel_size: 1 + weight_filler { + type: "msra" + } + bias_filler { + type: "constant" + value: 0.0 + } + } +} +layer { + name: "conv15_2_mbox_conf_perm" + type: "Permute" + bottom: "conv15_2_mbox_conf" + top: "conv15_2_mbox_conf_perm" + permute_param { + order: 0 + order: 2 + order: 3 + order: 1 + } +} +layer { + name: "conv15_2_mbox_conf_flat" + type: "Flatten" + bottom: "conv15_2_mbox_conf_perm" + top: "conv15_2_mbox_conf_flat" + flatten_param { + axis: 1 + } +} +layer { + name: "conv15_2_mbox_priorbox" + type: "PriorBox" + bottom: "conv15_2" + bottom: "data" + top: "conv15_2_mbox_priorbox" + prior_box_param { + min_size: 195.0 + max_size: 240.0 + aspect_ratio: 2.0 + aspect_ratio: 3.0 + flip: true + clip: false + variance: 0.1 + variance: 0.1 + variance: 0.2 + variance: 0.2 + offset: 0.5 + } +} +layer { + name: "conv16_2_mbox_loc" + type: "Convolution" + bottom: "conv16_2" + top: "conv16_2_mbox_loc" + param { + lr_mult: 0.1 + decay_mult: 0.1 + } + param { + lr_mult: 0.2 + decay_mult: 0.0 + } + convolution_param { + num_output: 24 + kernel_size: 1 + weight_filler { + type: "msra" + } + bias_filler { + type: "constant" + value: 0.0 + } + } +} +layer { + name: "conv16_2_mbox_loc_perm" + type: "Permute" + bottom: "conv16_2_mbox_loc" + top: "conv16_2_mbox_loc_perm" + permute_param { + order: 0 + order: 2 + order: 3 + order: 1 + } +} +layer { + name: "conv16_2_mbox_loc_flat" + type: "Flatten" + bottom: "conv16_2_mbox_loc_perm" + top: "conv16_2_mbox_loc_flat" + flatten_param { + axis: 1 + } +} +layer { + name: "conv16_2_mbox_conf" + type: "Convolution" + bottom: "conv16_2" + top: "conv16_2_mbox_conf" + param { + lr_mult: 1.0 + decay_mult: 1.0 + } + param { + lr_mult: 2.0 + decay_mult: 0.0 + } + convolution_param { + num_output: 126 + kernel_size: 1 + weight_filler { + type: "msra" + } + bias_filler { + type: "constant" + value: 0.0 + } + } +} +layer { + name: "conv16_2_mbox_conf_perm" + type: "Permute" + bottom: "conv16_2_mbox_conf" + top: "conv16_2_mbox_conf_perm" + permute_param { + order: 0 + order: 2 + order: 3 + order: 1 + } +} +layer { + name: "conv16_2_mbox_conf_flat" + type: "Flatten" + bottom: "conv16_2_mbox_conf_perm" + top: "conv16_2_mbox_conf_flat" + flatten_param { + axis: 1 + } +} +layer { + name: "conv16_2_mbox_priorbox" + type: "PriorBox" + bottom: "conv16_2" + bottom: "data" + top: "conv16_2_mbox_priorbox" + prior_box_param { + min_size: 240.0 + max_size: 285.0 + aspect_ratio: 2.0 + aspect_ratio: 3.0 + flip: true + clip: false + variance: 0.1 + variance: 0.1 + variance: 0.2 + variance: 0.2 + offset: 0.5 + } +} +layer { + name: "conv17_2_mbox_loc" + type: "Convolution" + bottom: "conv17_2" + top: "conv17_2_mbox_loc" + param { + lr_mult: 0.1 + decay_mult: 0.1 + } + param { + lr_mult: 0.2 + decay_mult: 0.0 + } + convolution_param { + num_output: 24 + kernel_size: 1 + weight_filler { + type: "msra" + } + bias_filler { + type: "constant" + value: 0.0 + } + } +} +layer { + name: "conv17_2_mbox_loc_perm" + type: "Permute" + bottom: "conv17_2_mbox_loc" + top: "conv17_2_mbox_loc_perm" + permute_param { + order: 0 + order: 2 + order: 3 + order: 1 + } +} +layer { + name: "conv17_2_mbox_loc_flat" + type: "Flatten" + bottom: "conv17_2_mbox_loc_perm" + top: "conv17_2_mbox_loc_flat" + flatten_param { + axis: 1 + } +} +layer { + name: "conv17_2_mbox_conf" + type: "Convolution" + bottom: "conv17_2" + top: "conv17_2_mbox_conf" + param { + lr_mult: 1.0 + decay_mult: 1.0 + } + param { + lr_mult: 2.0 + decay_mult: 0.0 + } + convolution_param { + num_output: 126 + kernel_size: 1 + weight_filler { + 
type: "msra" + } + bias_filler { + type: "constant" + value: 0.0 + } + } +} +layer { + name: "conv17_2_mbox_conf_perm" + type: "Permute" + bottom: "conv17_2_mbox_conf" + top: "conv17_2_mbox_conf_perm" + permute_param { + order: 0 + order: 2 + order: 3 + order: 1 + } +} +layer { + name: "conv17_2_mbox_conf_flat" + type: "Flatten" + bottom: "conv17_2_mbox_conf_perm" + top: "conv17_2_mbox_conf_flat" + flatten_param { + axis: 1 + } +} +layer { + name: "conv17_2_mbox_priorbox" + type: "PriorBox" + bottom: "conv17_2" + bottom: "data" + top: "conv17_2_mbox_priorbox" + prior_box_param { + min_size: 285.0 + max_size: 300.0 + aspect_ratio: 2.0 + aspect_ratio: 3.0 + flip: true + clip: false + variance: 0.1 + variance: 0.1 + variance: 0.2 + variance: 0.2 + offset: 0.5 + } +} +layer { + name: "mbox_loc" + type: "Concat" + bottom: "conv11_mbox_loc_flat" + bottom: "conv13_mbox_loc_flat" + bottom: "conv14_2_mbox_loc_flat" + bottom: "conv15_2_mbox_loc_flat" + bottom: "conv16_2_mbox_loc_flat" + bottom: "conv17_2_mbox_loc_flat" + top: "mbox_loc" + concat_param { + axis: 1 + } +} +layer { + name: "mbox_conf" + type: "Concat" + bottom: "conv11_mbox_conf_flat" + bottom: "conv13_mbox_conf_flat" + bottom: "conv14_2_mbox_conf_flat" + bottom: "conv15_2_mbox_conf_flat" + bottom: "conv16_2_mbox_conf_flat" + bottom: "conv17_2_mbox_conf_flat" + top: "mbox_conf" + concat_param { + axis: 1 + } +} +layer { + name: "mbox_priorbox" + type: "Concat" + bottom: "conv11_mbox_priorbox" + bottom: "conv13_mbox_priorbox" + bottom: "conv14_2_mbox_priorbox" + bottom: "conv15_2_mbox_priorbox" + bottom: "conv16_2_mbox_priorbox" + bottom: "conv17_2_mbox_priorbox" + top: "mbox_priorbox" + concat_param { + axis: 2 + } +} +layer { + name: "mbox_conf_reshape" + type: "Reshape" + bottom: "mbox_conf" + top: "mbox_conf_reshape" + reshape_param { + shape { + dim: 0 + dim: -1 + dim: 21 + } + } +} +layer { + name: "mbox_conf_softmax" + type: "Softmax" + bottom: "mbox_conf_reshape" + top: "mbox_conf_softmax" + softmax_param { + axis: 2 + } +} +layer { + name: "mbox_conf_flatten" + type: "Flatten" + bottom: "mbox_conf_softmax" + top: "mbox_conf_flatten" + flatten_param { + axis: 1 + } +} +layer { + name: "detection_out" + type: "DetectionOutput" + bottom: "mbox_loc" + bottom: "mbox_conf_flatten" + bottom: "mbox_priorbox" + top: "detection_out" + include { + phase: TEST + } + detection_output_param { + num_classes: 21 + share_location: true + background_label_id: 0 + nms_param { + nms_threshold: 0.45 + top_k: 100 + } + code_type: CENTER_SIZE + keep_top_k: 100 + confidence_threshold: 0.25 + } +} diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/assets/models/opencv/mobilenet_iter_73000.caffemodel b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/assets/models/opencv/mobilenet_iter_73000.caffemodel new file mode 100644 index 00000000..253e5012 Binary files /dev/null and b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/assets/models/opencv/mobilenet_iter_73000.caffemodel differ diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/assets/museum_lobby.env b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/assets/museum_lobby.env new file mode 100644 index 00000000..18a6cb82 Binary files /dev/null and b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/assets/museum_lobby.env differ diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/components/FollowHead.xml 
b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/components/FollowHead.xml new file mode 100644 index 00000000..b9a43748 --- /dev/null +++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/components/FollowHead.xml @@ -0,0 +1,18 @@ +Component that makes an entity look at a specified target entity's head. diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/components/Outlined.xml b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/components/Outlined.xml new file mode 100644 index 00000000..60acd294 --- /dev/null +++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/components/Outlined.xml @@ -0,0 +1,11 @@ +A component that draws an outline around an object at a specified size. diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/components/TrackedObject.xml b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/components/TrackedObject.xml new file mode 100644 index 00000000..ee714ec8 --- /dev/null +++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/components/TrackedObject.xml @@ -0,0 +1,11 @@ +A component representing an object detected using ML and tracked across frames. diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/components/ViewLocked.xml b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/components/ViewLocked.xml new file mode 100644 index 00000000..a5cacee1 --- /dev/null +++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/components/ViewLocked.xml @@ -0,0 +1,21 @@ +A component which locks the position and rotation of an entity which has a Panel component to the user's view. diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/components/WristAttached.xml b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/components/WristAttached.xml new file mode 100644 index 00000000..c9414350 --- /dev/null +++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/components/WristAttached.xml @@ -0,0 +1,26 @@ +A component which positions and orients the entity on the user's wrist.
diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/DiApplication.kt b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/DiApplication.kt new file mode 100644 index 00000000..ce759501 --- /dev/null +++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/DiApplication.kt @@ -0,0 +1,51 @@ +package com.meta.pixelandtexel.scanner + +import android.app.Application +import com.meta.pixelandtexel.scanner.datasource.network.networkModule +import com.meta.pixelandtexel.scanner.android.datasource.repository.SmartHomeRepositoryImpl +import com.meta.pixelandtexel.scanner.android.domain.repository.SmartHomeRepository +import com.meta.pixelandtexel.scanner.android.domain.usecases.GetDeviceInfoUsecase +import com.meta.pixelandtexel.scanner.android.domain.usecases.GetDevicesOfASmarthomeType +import com.meta.pixelandtexel.scanner.android.domain.usecases.UseActionDevice +import com.meta.pixelandtexel.scanner.feature.mrukraycasting.datasource.MRUKObjectsRepositoryImpl +import com.meta.pixelandtexel.scanner.feature.mrukraycasting.domain.repository.IMRUKObjectsRepository +import com.meta.pixelandtexel.scanner.feature.objectdetection.datasource.detector.IObjectDetectorHelper +import com.meta.pixelandtexel.scanner.feature.objectdetection.datasource.detector.MLKitObjectDetector +import com.meta.pixelandtexel.scanner.feature.objectdetection.domain.repository.detection.IObjectDetectionRepository +import com.meta.pixelandtexel.scanner.feature.objectdetection.datasource.repository.ObjectDetectionRepository +import com.meta.pixelandtexel.scanner.feature.objectdetection.datasource.repository.DisplayedEntityRepository +import com.meta.pixelandtexel.scanner.feature.objectdetection.domain.repository.display.IDisplayedEntityRepository +import com.meta.pixelandtexel.scanner.feature.mrukraycasting.datasource.local.dbModule +import org.koin.android.ext.koin.androidContext +import org.koin.core.context.GlobalContext.startKoin +import org.koin.dsl.module + +// Each definition is bound to its interface so call sites can resolve dependencies by +// abstraction, e.g. get<IDisplayedEntityRepository>() in MainActivity. +val appModule = module { + single<IDisplayedEntityRepository> { DisplayedEntityRepository() } + single<IObjectDetectorHelper> { MLKitObjectDetector() } + + single<IObjectDetectionRepository> { ObjectDetectionRepository(get(), get()) } + + single<IMRUKObjectsRepository> { MRUKObjectsRepositoryImpl(get()) } + single<SmartHomeRepository> { SmartHomeRepositoryImpl(get()) } + + // use cases are lightweight and stateless, so a fresh instance per injection is fine + factory { UseActionDevice(get()) } + factory { GetDevicesOfASmarthomeType(get()) } + factory { GetDeviceInfoUsecase(get()) } +} + +class DiApplication : Application() { + override fun onCreate() { + super.onCreate() + + startKoin { + androidContext(this@DiApplication) + modules(networkModule, dbModule, appModule) + } + } +} \ No newline at end of file diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/MainActivity.kt b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/MainActivity.kt new file mode 100644 index 00000000..32aa206b --- /dev/null +++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/MainActivity.kt @@ -0,0 +1,437 @@ +package com.meta.pixelandtexel.scanner + +import android.content.pm.PackageManager +import android.os.Bundle +import android.util.Log +import android.widget.ImageButton +import androidx.activity.OnBackPressedDispatcher +import androidx.activity.OnBackPressedDispatcherOwner +import 
androidx.activity.compose.LocalOnBackPressedDispatcherOwner +import androidx.compose.runtime.CompositionLocalProvider +import androidx.core.app.ActivityCompat +import androidx.core.content.ContextCompat +import androidx.core.net.toUri +import androidx.lifecycle.Lifecycle +import com.meta.pixelandtexel.scanner.android.views.smarthome.selection.DeviceSelectionScreen +import com.meta.pixelandtexel.scanner.android.views.smarthome.selection.DeviceSelectionViewModel +import com.meta.pixelandtexel.scanner.android.views.welcome.WelcomeScreen +import com.meta.pixelandtexel.scanner.ecs.OutlinedSystem +import com.meta.pixelandtexel.scanner.ecs.WristAttachedSystem +import com.meta.pixelandtexel.scanner.feature.mrukraycasting.MRUKSidePanelRaycasterFeature +import com.meta.pixelandtexel.scanner.feature.objectdetection.ObjectDetectionFeature +import com.meta.pixelandtexel.scanner.feature.objectdetection.domain.repository.display.IDisplayedEntityRepository +import com.meta.pixelandtexel.scanner.feature.objectdetection.domain.camera.enums.CameraStatus +import com.meta.pixelandtexel.scanner.feature.objectdetection.model.RaycastRequestModel +import com.meta.pixelandtexel.scanner.services.settings.SettingsService +import com.meta.spatial.compose.ComposeFeature +import com.meta.spatial.compose.composePanel +import com.meta.spatial.compose.panelViewLifecycleOwner +import com.meta.spatial.core.Entity +import com.meta.spatial.core.Pose +import com.meta.spatial.core.Quaternion +import com.meta.spatial.mruk.SurfaceType +import com.meta.spatial.core.SendRate +import com.meta.spatial.core.SpatialFeature +import com.meta.spatial.mruk.MRUKFeature +import com.meta.spatial.mruk.MRUKLoadDeviceResult +import com.meta.spatial.okhttp3.OkHttpAssetFetcher +import com.meta.spatial.runtime.LayerConfig +import com.meta.spatial.runtime.NetworkedAssetLoader +import com.meta.spatial.runtime.PanelShapeLayerBlendType +import com.meta.spatial.runtime.ReferenceSpace +import com.meta.spatial.toolkit.AppSystemActivity +import com.meta.spatial.toolkit.PanelRegistration +import com.meta.spatial.vr.LocomotionSystem +import com.meta.spatial.vr.VRFeature +import kotlinx.coroutines.CoroutineScope +import kotlinx.coroutines.Dispatchers +import kotlinx.coroutines.Job +import kotlinx.coroutines.launch +import org.koin.android.ext.android.get +import java.io.File + + +class MainActivity : ActivityCompat.OnRequestPermissionsResultCallback, AppSystemActivity() { + companion object { + private const val TAG = "MainActivity" + + private const val PERMISSIONS_REQUEST_CODE = 1000 + private val PERMISSIONS_REQUIRED = arrayOf("horizonos.permission.HEADSET_CAMERA", "com.oculus.permission.USE_SCENE") + const val MAX_DISTANCE = Float.MAX_VALUE + } + + // used for scene inflation + private var gltfxEntity: Entity? = null + private val activityScope = CoroutineScope(Dispatchers.Main) + + private lateinit var permissionsResultCallback: (granted: Boolean) -> Unit + + // button for toggling the scanning + private var scanControlsBtn: ImageButton? = null + + // our main scene entities + private var welcomePanelEntity: Entity? 
= null
+
+    // our main services for detecting objects, displaying helpful tips, and displaying
+    // pre-assembled panel content for select objects (with 3D models)
+    private lateinit var objectDetectionFeature: ObjectDetectionFeature
+    private lateinit var mrukFeature: MRUKFeature
+    private lateinit var mrukSidePanelRaycasterFeature: MRUKSidePanelRaycasterFeature
+
+    lateinit var entityRepository: IDisplayedEntityRepository
+
+    override fun registerFeatures(): List<SpatialFeature> {
+        objectDetectionFeature =
+            ObjectDetectionFeature(
+                this,
+                onStatusChanged = ::onObjectDetectionFeatureStatusChanged,
+            )
+
+        mrukFeature = MRUKFeature(this, systemManager)
+        mrukSidePanelRaycasterFeature = MRUKSidePanelRaycasterFeature(this)
+        return listOf(
+            VRFeature(this),
+            ComposeFeature(),
+            objectDetectionFeature,
+            mrukFeature,
+            mrukSidePanelRaycasterFeature
+        )
+    }
+
+    override fun onCreate(savedInstanceState: Bundle?) {
+        super.onCreate(savedInstanceState)
+
+        SettingsService.initialize(this)
+
+        NetworkedAssetLoader.init(
+            File(applicationContext.cacheDir.canonicalPath),
+            OkHttpAssetFetcher()
+        )
+
+        // repository used for extra object detection handling and usability
+        entityRepository = get()
+
+        // register systems/components; locomotion is not needed in this experience
+        systemManager.unregisterSystem<LocomotionSystem>()
+
+        componentManager.registerComponent(WristAttached.Companion, SendRate.DEFAULT)
+        systemManager.registerSystem(WristAttachedSystem())
+
+        componentManager.registerComponent(Outlined.Companion, SendRate.DEFAULT)
+        systemManager.registerSystem(OutlinedSystem(this))
+
+        loadGLXF().invokeOnCompletion {
+            val composition = glXFManager.getGLXFInfo("scanner_app_main_scene")
+
+            // wait for the system manager to initialize so we can get the underlying scene objects
+            welcomePanelEntity = composition.getNodeByName("WelcomePanel").entity
+        }
+    }
+
+    override fun onSceneReady() {
+        super.onSceneReady()
+
+        // set the reference space to enable re-centering
+        scene.setReferenceSpace(ReferenceSpace.STAGE)
+
+        requestPermissions { permissionsGranted ->
+            loadScene(permissionsGranted)
+        }
+
+        scene.enablePassthrough(true)
+    }
+
+    private fun loadScene(scenePermissionsGranted: Boolean) {
+        if (scenePermissionsGranted) {
+            loadSceneFromDevice()
+        } else {
+            Log.d(TAG, "Permissions denied. The scene cannot be loaded from the device.")
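+            // Without the scene permission there is no MRUK room data to anchor panels to.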
+        }
+    }
+
+    private fun loadSceneFromDevice() {
+        val future = mrukFeature.loadSceneFromDevice(requestSceneCaptureIfNoDataFound = true)
+
+        future.whenComplete { result: MRUKLoadDeviceResult, _ ->
+            Log.d(TAG, "Scene loaded from device with result: $result")
+
+            if (result != MRUKLoadDeviceResult.SUCCESS) {
+                Log.w(TAG, "Failed to load the scene from the device: $result")
+            }
+        }
+    }
+
+    override fun registerPanels(): List<PanelRegistration> {
+        return listOf(
+            PanelRegistration(R.integer.welcome_panel_id) {
+                config {
+                    themeResourceId = R.style.PanelAppThemeTransparent
+                    includeGlass = false
+                    layoutWidthInDp = 368f
+                    width = 0.368f
+                    height = 0.404f
+                    layerConfig = LayerConfig()
+                    layerBlendType = PanelShapeLayerBlendType.MASKED
+                    enableLayerFeatheredEdge = true
+                    effectShader = "customPanel.frag" // just for demonstration purposes
+                }
+                composePanel {
+                    setContent {
+                        CompositionLocalProvider(
+                            LocalOnBackPressedDispatcherOwner provides
+                                object : OnBackPressedDispatcherOwner {
+                                    override val lifecycle: Lifecycle
+                                        get() = this@MainActivity.panelViewLifecycleOwner.lifecycle
+
+                                    override val onBackPressedDispatcher: OnBackPressedDispatcher
+                                        get() = OnBackPressedDispatcher()
+                                }
+                        ) {
+                            WelcomeScreen {
+                                welcomePanelEntity?.destroy()
+                                welcomePanelEntity = null
+                            }
+                        }
+                    }
+                }
+            },
+            PanelRegistration(R.layout.ui_show_smart_things_button_view) {
+                config {
+                    themeResourceId = R.style.PanelAppThemeTransparent
+                    includeGlass = false
+                    layoutWidthInDp = 80f
+                    width = 0.04f
+                    height = 0.04f
+                    layerConfig = LayerConfig()
+                    layerBlendType = PanelShapeLayerBlendType.MASKED
+                    enableLayerFeatheredEdge = true
+                }
+                panel {
+                    val showSmartThingsButton =
+                        rootView?.findViewById<ImageButton>(R.id.show_smart_things_btn)
+                            ?: throw RuntimeException("Missing show smart things button")
+
+                    showSmartThingsButton.setOnClickListener {
+                        welcomePanelEntity?.destroy()
+                        welcomePanelEntity = null
+                        stopScanning()
+                        loadScene(true)
+
+                        activityScope.launch {
+                            mrukSidePanelRaycasterFeature.getAllSmartThings()
+                        }
+                    }
+                }
+            },
+            PanelRegistration(R.layout.ui_delete_smart_things_button_view) {
+                config {
+                    themeResourceId = R.style.PanelAppThemeTransparent
+                    includeGlass = false
+                    layoutWidthInDp = 80f
+                    width = 0.04f
+                    height = 0.04f
+                    layerConfig = LayerConfig()
+                    layerBlendType = PanelShapeLayerBlendType.MASKED
+                    enableLayerFeatheredEdge = true
+                }
+                panel {
+                    val deleteBtn =
+                        rootView?.findViewById<ImageButton>(R.id.delete_btn)
+                            ?: throw RuntimeException("Missing delete button")
+
+                    deleteBtn.setOnClickListener {
+                        welcomePanelEntity?.destroy()
+                        welcomePanelEntity = null
+                        stopScanning()
+                        loadScene(true)
+
+                        activityScope.launch {
+                            mrukSidePanelRaycasterFeature.deleteAllSmartThingEntities()
+                        }
+                    }
+                }
+            },
+            PanelRegistration(R.layout.ui_camera_controls_view) {
+                config {
+                    themeResourceId = R.style.PanelAppThemeTransparent
+                    includeGlass = false
+                    layoutWidthInDp = 80f
+                    width = 0.04f
+                    height = 0.04f
+                    layerConfig = LayerConfig()
+                    layerBlendType = PanelShapeLayerBlendType.MASKED
+                    enableLayerFeatheredEdge = true
+                }
+                panel {
+                    scanControlsBtn =
+                        rootView?.findViewById<ImageButton>(R.id.camera_play_btn)
+                            ?: throw RuntimeException("Missing camera play/pause button")
+
+                    scanControlsBtn?.setOnClickListener {
+                        welcomePanelEntity?.destroy()
+                        welcomePanelEntity = null
+
+                        when (objectDetectionFeature.status) {
+                            CameraStatus.PAUSED -> {
+                                // first ask for permission if we haven't already
+                                if (!hasPermissions()) {
+                                    this@MainActivity.requestPermissions { granted ->
+                                        if (granted) {
+                                            startScanning()
+                                        }
+                                    }
+
+                                    return@setOnClickListener
+                                }
+
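+                                // permission was already granted, so start the camera right away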
+                                startScanning()
+                            }
+
+                            CameraStatus.SCANNING -> {
+                                stopScanning()
+                            }
+                        }
+                    }
+                }
+            },
+            PanelRegistration(R.integer.info_panel_id) {
+                config {
+                    themeResourceId = R.style.PanelAppThemeTransparent
+                    includeGlass = false
+                    layoutWidthInDp = 632f
+                    width = 0.632f
+                    height = 0.644f
+                    layerConfig = LayerConfig()
+                    layerBlendType = PanelShapeLayerBlendType.MASKED
+                    enableLayerFeatheredEdge = true
+                }
+                composePanel {
+                    stopScanning()
+                    val displayInfo = entityRepository.newViewModelData ?: return@composePanel
+                    setContent {
+                        val viewmodel = DeviceSelectionViewModel(displayInfo.data.type, get())
+                        DeviceSelectionScreen(
+                            viewModel = viewmodel,
+                            onOptionSelected = { device ->
+                                entityRepository.deleteEntity(displayInfo.entityId)
+                                val spawnPose =
+                                    getPanelHitSpawnPosition(displayInfo.data.raycastInfo)
+                                if (spawnPose != null) {
+                                    activityScope.launch {
+                                        mrukSidePanelRaycasterFeature.addSmartThing(
+                                            device,
+                                            spawnPose
+                                        )
+                                    }
+                                } else {
+                                    Log.w(
+                                        TAG,
+                                        "Could not get a spawn position for the detected object."
+                                    )
+                                }
+                            }
+                        )
+                    }
+                }
+            },
+        )
+    }
+
+    private fun getPanelHitSpawnPosition(raycastModel: RaycastRequestModel): Pose? {
+        val currentRoom = mrukFeature.getCurrentRoom()
+        if (currentRoom == null) {
+            Log.w(TAG, "Cannot raycast, no current room available.")
+            return null
+        }
+
+        val hit = mrukFeature.raycastRoom(
+            currentRoom.anchor.uuid,
+            origin = raycastModel.headPosition,
+            direction = raycastModel.direction,
+            maxDistance = MAX_DISTANCE,
+            SurfaceType.PLANE_VOLUME,
+        ) ?: return null
+        return Pose(hit.hitPosition, Quaternion.lookRotation(hit.hitNormal.normalize()))
+    }
+
+    /** Activates the object detection feature's scanning, which turns on the user's camera. */
+    private fun startScanning() {
+        objectDetectionFeature.scan()
+    }
+
+    /** Stops the object detection and the device camera. */
+    private fun stopScanning() {
+        objectDetectionFeature.pause()
+    }
+
+    /**
+     * Executed when the object detection feature's scanning status has changed.
+     *
+     * @param newStatus The new [CameraStatus] camera scanning status
+     */
+    private fun onObjectDetectionFeatureStatusChanged(newStatus: CameraStatus) {
+        scanControlsBtn?.setBackgroundResource(
+            when (newStatus) {
+                CameraStatus.PAUSED -> R.drawable.escaneo
+                CameraStatus.SCANNING -> com.meta.spatial.uiset.R.drawable.ic_pause_circle_24
+            }
+        )
+    }
+
+    override fun onPause() {
+        stopScanning()
+        super.onPause()
+    }
+
+    private fun loadGLXF(): Job {
+        gltfxEntity = Entity.Companion.create()
+        return activityScope.launch {
+            glXFManager.inflateGLXF(
+                "apk:///scenes/Composition.glxf".toUri(),
+                rootEntity = gltfxEntity!!,
+                keyName = "scanner_app_main_scene",
+            )
+        }
+    }
+
+    private fun hasPermissions() =
+        PERMISSIONS_REQUIRED.all {
+            ContextCompat.checkSelfPermission(this, it) == PackageManager.PERMISSION_GRANTED
+        }
+
+    private fun requestPermissions(callback: (granted: Boolean) -> Unit) {
+        permissionsResultCallback = callback
+
+        if (hasPermissions()) {
+            Log.d(TAG, "Permissions were already granted. Skipping the request.")
+            callback(true)
+        } else {
+            Log.d(TAG, "Permissions not granted. Requesting them from the user...")
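+            // the user's response arrives asynchronously in onRequestPermissionsResult below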
+            ActivityCompat.requestPermissions(this, PERMISSIONS_REQUIRED, PERMISSIONS_REQUEST_CODE)
+        }
+    }
+
+    override fun onRequestPermissionsResult(
+        requestCode: Int,
+        permissions: Array<String>,
+        grantResults: IntArray,
+    ) {
+        super.onRequestPermissionsResult(requestCode, permissions, grantResults)
+
+        if (requestCode == PERMISSIONS_REQUEST_CODE) {
+            val allGranted = grantResults.all { it == PackageManager.PERMISSION_GRANTED }
+            if (allGranted) {
+                Log.d(TAG, "All permissions were granted by the user.")
+                permissionsResultCallback.invoke(true)
+            } else {
+                Log.w(TAG, "At least one permission was denied.")
+                permissionsResultCallback.invoke(false)
+            }
+        }
+    }
+}
\ No newline at end of file
diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/android/datasource/dto/DeviceResponseDto.kt b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/android/datasource/dto/DeviceResponseDto.kt
new file mode 100644
index 00000000..b1273374
--- /dev/null
+++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/android/datasource/dto/DeviceResponseDto.kt
@@ -0,0 +1,17 @@
+package com.meta.pixelandtexel.scanner.android.datasource.dto
+
+import com.google.gson.annotations.SerializedName
+import kotlinx.serialization.Serializable
+
+data class DeviceListResponseDto(
+    @SerializedName("devices")
+    val devices: List<SmartDeviceDto>
+)
+
+@Serializable
+data class SmartDeviceDto(
+    @SerializedName("name")
+    val name: String,
+    @SerializedName("entities")
+    val entities: List<String>
+)
diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/android/datasource/dto/ThingStateResponseDto.kt b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/android/datasource/dto/ThingStateResponseDto.kt
new file mode 100644
index 00000000..6e205854
--- /dev/null
+++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/android/datasource/dto/ThingStateResponseDto.kt
@@ -0,0 +1,58 @@
+package com.meta.pixelandtexel.scanner.android.datasource.dto
+
+import com.google.gson.annotations.SerializedName
+
+data class ThingsResponseDto(
+    @SerializedName("entity_id") val entityId: String,
+    val state: String,
+    @SerializedName("attributes") val attributes: Attributes?,
+    @SerializedName("last_changed") val lastChanged: String?,
+    @SerializedName("last_reported") val lastReported: String?,
+    @SerializedName("last_updated") val lastUpdated: String?,
+    val context: Context?
+)
+
+data class Attributes(
+    @SerializedName("device_class") val deviceClass: String? = null,
+    @SerializedName("friendly_name") val friendlyName: String? = null,
+    @SerializedName("state_class") val stateClass: String? = null,
+    @SerializedName("unit_of_measurement") val unitOfMeasurement: String? = null,
+    @SerializedName("volume_level") val volumeLevel: Float? = null,
+    @SerializedName("is_volume_muted") val isVolumeMuted: Boolean? = null,
+    @SerializedName("source_list") val source: List<String>? = null,
+    @SerializedName("min_color_temp_kelvin") val minColorTempKelvin: Int? = null,
+    @SerializedName("max_color_temp_kelvin") val maxColorTempKelvin: Int? = null,
+    @SerializedName("min_mireds") val minMireds: Int? = null,
+    @SerializedName("max_mireds") val maxMireds: Int? = null,
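+    // additional light attributes (effects, color modes, brightness, color values)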
+    @SerializedName("effect_list") val effectList: List<String>? = null,
+    @SerializedName("supported_color_modes") val supportedColorModes: List<String>? = null,
+    @SerializedName("effect") val effect: String? = null,
+    @SerializedName("color_mode") val colorMode: String? = null,
+    @SerializedName("brightness") val brightness: Int? = null,
+    @SerializedName("color_temp_kelvin") val colorTempKelvin: Int? = null,
+    @SerializedName("color_temp") val colorTemp: Int? = null,
+    @SerializedName("hs_color") val hsColor: List<Float>? = null,
+    @SerializedName("rgb_color") val rgbColor: List<Int>? = null,
+    @SerializedName("xy_color") val xyColor: List<Float>? = null,
+    @SerializedName("supported_features") val supportedFeatures: Int? = null,
+    @SerializedName("temperature") val temperature: Double? = null,
+    @SerializedName("dew_point") val dewPoint: Double? = null,
+    @SerializedName("temperature_unit") val temperatureUnit: String? = null,
+    @SerializedName("humidity") val humidity: Int? = null,
+    @SerializedName("cloud_coverage") val cloudCoverage: Double? = null,
+    @SerializedName("uv_index") val uvIndex: Double? = null,
+    @SerializedName("pressure") val pressure: Double? = null,
+    @SerializedName("pressure_unit") val pressureUnit: String? = null,
+    @SerializedName("wind_bearing") val windBearing: Double? = null,
+    @SerializedName("wind_speed") val windSpeed: Double? = null,
+    @SerializedName("wind_speed_unit") val windSpeedUnit: String? = null,
+    @SerializedName("visibility_unit") val visibilityUnit: String? = null,
+    @SerializedName("precipitation_unit") val precipitationUnit: String? = null,
+    @SerializedName("attribution") val attribution: String? = null,
+)
+
+data class Context(
+    val id: String,
+    @SerializedName("parent_id") val parentId: String? = null,
+    @SerializedName("user_id") val userId: String? = null
+)
\ No newline at end of file
diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/android/datasource/mapper/DeviceMapper.kt b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/android/datasource/mapper/DeviceMapper.kt
new file mode 100644
index 00000000..befd4271
--- /dev/null
+++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/android/datasource/mapper/DeviceMapper.kt
@@ -0,0 +1,28 @@
+package com.meta.pixelandtexel.scanner.android.datasource.mapper
+
+import com.meta.pixelandtexel.scanner.android.datasource.dto.DeviceListResponseDto
+import com.meta.pixelandtexel.scanner.android.datasource.dto.SmartDeviceDto
+import com.meta.pixelandtexel.scanner.models.devices.Device
+import com.meta.pixelandtexel.scanner.models.devices.ThingEntity
+
+object DeviceMapper {
+    fun map(dto: DeviceListResponseDto): List<Device> {
+        return dto.devices.map { mapDevice(it) }
+    }
+
+    private fun mapDevice(dto: SmartDeviceDto): Device {
+        return Device(
+            name = dto.name,
+            entityList = dto.entities.mapNotNull { mapEntity(it) }
+        )
+    }
+
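+    /**
+     * Maps a raw entity id (e.g. "light.kitchen") to a [ThingEntity], or returns null
+     * when the id's domain prefix is not one of the smart home domains the app models.
+     */
+    private fun mapEntity(entityId: String): ThingEntity? 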
{ + val domain = DomainMapper.fromEntityId(entityId) ?: return null + + return ThingEntity( + id = entityId, + domain = domain + ) + } +} \ No newline at end of file diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/android/datasource/mapper/DomainMapper.kt b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/android/datasource/mapper/DomainMapper.kt new file mode 100644 index 00000000..795bbac2 --- /dev/null +++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/android/datasource/mapper/DomainMapper.kt @@ -0,0 +1,119 @@ +package com.meta.pixelandtexel.scanner.android.datasource.mapper + +import com.meta.pixelandtexel.scanner.android.datasource.dto.Attributes +import com.meta.pixelandtexel.scanner.models.devices.domain.Domain +import com.meta.pixelandtexel.scanner.models.devices.domain.DomainServices +import com.meta.pixelandtexel.scanner.models.devices.domain.LightAttributes +import com.meta.pixelandtexel.scanner.models.devices.domain.LightDomain +import com.meta.pixelandtexel.scanner.models.devices.domain.MediaPlayerAttributes +import com.meta.pixelandtexel.scanner.models.devices.domain.MediaPlayerDomain +import com.meta.pixelandtexel.scanner.models.devices.domain.SensorDomain +import com.meta.pixelandtexel.scanner.models.devices.domain.SwitchDomain +import com.meta.pixelandtexel.scanner.models.devices.domain.WeatherDomain +import com.meta.pixelandtexel.scanner.models.devices.domain.WeatherDomainAttributes + +object DomainMapper { + + fun fromEntityId(entityId: String): Domain? { + val domainString = entityId.substringBefore(".", missingDelimiterValue = "unknown") + + return when (domainString) { + "switch" -> SwitchDomain( + value = false, + services = listOf(DomainServices.TURN_OFF, DomainServices.TURN_ON) + ) + + "sensor" -> SensorDomain( + value = "", + services = emptyList() + ) + + "binary_sensor" -> SensorDomain( + value = "OFF", + services = emptyList() + ) + + "media_player" -> MediaPlayerDomain( + value = false, + services = listOf( + DomainServices.TURN_OFF, DomainServices.TURN_ON, DomainServices.VOLUME_SET, + DomainServices.VOLUME_MUTE, DomainServices.MEDIA_PLAY + ), + attributes = MediaPlayerAttributes( + volumeLevel = null, + isMuted = null, + source = null + ) + ) + "light" -> LightDomain( + value = false, + services = listOf(DomainServices.TURN_OFF, DomainServices.TURN_ON), + attributes = LightAttributes() + ) + + "weather" -> WeatherDomain( + value = "", + services = emptyList(), + attributes = WeatherDomainAttributes() + ) + + else -> null + } + } + + fun fromOtherDomainNewValue(domain: Domain, newValue: String, attributes: Attributes?): Domain { + return when (domain) { + is SwitchDomain -> { + val newValueBoolean = + newValue.equals("ON", ignoreCase = true) || newValue.equals( + "true", + ignoreCase = true + ) + domain.copy(value = newValueBoolean) + } + + is SensorDomain -> domain.copy(value = newValue) + + is MediaPlayerDomain -> { + val newValueBoolean = + newValue.equals("ON", ignoreCase = true) || newValue.equals( + "true", + ignoreCase = true + ) + domain.copy( + value = newValueBoolean, + attributes = MediaPlayerAttributes( + volumeLevel = attributes?.volumeLevel, + isMuted = attributes?.isVolumeMuted, + source = attributes?.source + ) + ) + } + + is LightDomain -> { + val newValueBoolean = + newValue.equals("ON", ignoreCase = true) || newValue.equals( + "true", + ignoreCase = 
true
+                    )
+                domain.copy(
+                    value = newValueBoolean,
+                    attributes = LightAttributes(
+                        brightness = attributes?.brightness,
+                        colorTempKelvin = attributes?.colorTempKelvin,
+                        hsColor = attributes?.hsColor,
+                        minColorTempKelvin = attributes?.minColorTempKelvin,
+                        maxColorTempKelvin = attributes?.maxColorTempKelvin,
+                    )
+                )
+            }
+
+            is WeatherDomain -> {
+                domain.copy(
+                    value = newValue,
+                    attributes = WeatherDomainAttributes.fromAttributes(attributes ?: Attributes())
+                )
+            }
+        }
+    }
+}
\ No newline at end of file
diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/android/datasource/repository/SmartHomeRepositoryImpl.kt b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/android/datasource/repository/SmartHomeRepositoryImpl.kt
new file mode 100644
index 00000000..1411cfdd
--- /dev/null
+++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/android/datasource/repository/SmartHomeRepositoryImpl.kt
@@ -0,0 +1,106 @@
+package com.meta.pixelandtexel.scanner.android.datasource.repository
+
+import com.meta.pixelandtexel.scanner.datasource.network.SmartHomeApi
+import com.meta.pixelandtexel.scanner.android.datasource.mapper.DeviceMapper
+import com.meta.pixelandtexel.scanner.android.datasource.mapper.DomainMapper
+import com.meta.pixelandtexel.scanner.android.domain.repository.SmartHomeRepository
+import com.meta.pixelandtexel.scanner.models.devices.Device
+import com.meta.pixelandtexel.scanner.models.devices.ThingEntity
+import com.meta.pixelandtexel.scanner.models.devices.domain.SensorDomain
+
+class SmartHomeRepositoryImpl(
+    private val api: SmartHomeApi
+) : SmartHomeRepository {
+
+    override suspend fun getDevices(): List<Device> {
+        // Home Assistant Jinja template that groups entities by device and returns
+        // {"devices": [{"name": ..., "entities": [...]}, ...]} as JSON.
+        val template = """
+            {%- set javi_entities = states | selectattr("entity_id", "search") | map(attribute="entity_id") | list -%}
+            {% set devices = javi_entities | map("device_id") | unique | reject("eq", None) | list -%}
+            {% set ns = namespace(devices=[]) -%}
+            {% for device in devices -%}
+                {% set entities = device_entities(device) | list -%}
+                {% if entities -%}
+                    {% set device_name = device_attr(device, "name_by_user") or device_attr(device, "name") or device -%}
+                    {% set ns.devices = ns.devices + [{"name": device_name, "entities": entities | sort}] -%}
+                {% endif -%}
+            {% endfor -%}
+            {{ {"devices": ns.devices} | to_json }}
+        """.trimIndent()
+
+        try {
+            val response = api.postTemplate(
+                body = mapOf("template" to template)
+            )
+
+            if (response.isSuccessful) {
+                val responseBody = response.body()
+                if (responseBody != null) {
+                    return DeviceMapper.map(responseBody)
+                }
+            }
+
+            return emptyList()
+        } catch (e: Exception) {
+            e.printStackTrace()
+            return emptyList()
+        }
+    }
+
+    override suspend fun getThingEntities(thingEntities: List<ThingEntity>): List<ThingEntity> {
+        try {
+            val updatedEntities = thingEntities.map { entity ->
+                val response = api.getEntityState(entity.id)
+                val responseBody = response.body() ?: return@map entity
+
+                var newState = responseBody.state
+                if (entity.domain is SensorDomain) {
+                    val unit = responseBody.attributes?.unitOfMeasurement ?: ""
+                    newState = "$newState $unit"
+                }
+
+                entity.copy(
+                    domain = DomainMapper.fromOtherDomainNewValue(
+                        entity.domain,
+                        newState,
+                        responseBody.attributes
+                    )
+                )
+            }
+            return updatedEntities
+        } catch (e: Exception) {
+            e.printStackTrace()
+            return emptyList()
+        }
+    }
+
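+    /**
+     * Calls a Home Assistant service on an entity, e.g. ("light.kitchen", "turn_on",
+     * "brightness" to 200). The optional [newValue] pair carries an attribute name and
+     * its new value; the entity ids and values above are illustrative only.
+     */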
+    override suspend fun getActionForThing(
+        thingId: String,
+        action: String,
+        newValue: Pair<String, Any>?
+    ): Boolean {
+        return try {
+            val thingDomain = thingId.substringBefore(".", missingDelimiterValue = "")
+            val service = action.lowercase()
+            val bodyMap = mutableMapOf<String, Any>("entity_id" to thingId)
+            if (newValue != null) {
+                bodyMap[newValue.first] = newValue.second
+            }
+
+            val response = api.postActionToDeviceDomain(
+                device = thingDomain,
+                action = service,
+                body = bodyMap.toMap()
+            )
+
+            response.isSuccessful
+        } catch (e: Exception) {
+            e.printStackTrace()
+            false
+        }
+    }
+}
\ No newline at end of file
diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/android/domain/repository/SmartHomeRepository.kt b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/android/domain/repository/SmartHomeRepository.kt
new file mode 100644
index 00000000..bb2f3a1b
--- /dev/null
+++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/android/domain/repository/SmartHomeRepository.kt
@@ -0,0 +1,17 @@
+package com.meta.pixelandtexel.scanner.android.domain.repository
+
+import com.meta.pixelandtexel.scanner.models.devices.Device
+import com.meta.pixelandtexel.scanner.models.devices.ThingEntity
+
+interface SmartHomeRepository {
+    suspend fun getDevices(): List<Device>
+
+    suspend fun getThingEntities(thingEntities: List<ThingEntity>): List<ThingEntity>
+
+    suspend fun getActionForThing(
+        thingId: String,
+        action: String,
+        newValue: Pair<String, Any>?
+    ): Boolean
+}
\ No newline at end of file
diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/android/domain/usecases/GetDeviceInfoUsecase.kt b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/android/domain/usecases/GetDeviceInfoUsecase.kt
new file mode 100644
index 00000000..e49f2ebb
--- /dev/null
+++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/android/domain/usecases/GetDeviceInfoUsecase.kt
@@ -0,0 +1,13 @@
+package com.meta.pixelandtexel.scanner.android.domain.usecases
+
+import com.meta.pixelandtexel.scanner.android.domain.repository.SmartHomeRepository
+import com.meta.pixelandtexel.scanner.models.devices.ThingEntity
+
+class GetDeviceInfoUsecase(
+    private val repository: SmartHomeRepository
+) {
+    suspend fun run(thingEntities: List<ThingEntity>): List<ThingEntity> {
+        return repository.getThingEntities(thingEntities)
+    }
+}
\ No newline at end of file
diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/android/domain/usecases/GetDeviceOfASmarthomeType.kt b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/android/domain/usecases/GetDeviceOfASmarthomeType.kt
new file mode 100644
index 00000000..90ddbfa8
--- /dev/null
+++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/android/domain/usecases/GetDeviceOfASmarthomeType.kt
@@ -0,0 +1,14 @@
+package com.meta.pixelandtexel.scanner.android.domain.usecases
+
+import com.meta.pixelandtexel.scanner.android.domain.repository.SmartHomeRepository
+import com.meta.pixelandtexel.scanner.models.devices.Device
+
+class GetDevicesOfASmarthomeType(
+    private val repository: SmartHomeRepository
+) {
+    suspend fun run(): List<Device> {
+        val devices = repository.getDevices()
+        return devices
+    }
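+
+    // Intentionally thin: the clean architecture keeps UI code free of repository details.
+}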
\ No newline at end of file
diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/android/domain/usecases/UseActionDevice.kt b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/android/domain/usecases/UseActionDevice.kt
new file mode 100644
index 00000000..ee0d0a9a
--- /dev/null
+++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/android/domain/usecases/UseActionDevice.kt
@@ -0,0 +1,17 @@
+package com.meta.pixelandtexel.scanner.android.domain.usecases
+
+import com.meta.pixelandtexel.scanner.android.domain.repository.SmartHomeRepository
+
+class UseActionDevice(
+    private val repository: SmartHomeRepository
+) {
+    suspend fun run(thingId: String, action: String, newValue: Any?, attribute: String?): Boolean {
+        val newValuePair = if (newValue != null && attribute != null) {
+            Pair(attribute, newValue)
+        } else {
+            null
+        }
+        return repository.getActionForThing(thingId, action, newValuePair)
+    }
+}
\ No newline at end of file
diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/android/viewmodels/WelcomeViewModel.kt b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/android/viewmodels/WelcomeViewModel.kt
new file mode 100644
index 00000000..448383e3
--- /dev/null
+++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/android/viewmodels/WelcomeViewModel.kt
@@ -0,0 +1,29 @@
+package com.meta.pixelandtexel.scanner.android.viewmodels
+
+import androidx.compose.runtime.State
+import androidx.compose.runtime.mutableStateOf
+import androidx.lifecycle.ViewModel
+import com.meta.pixelandtexel.scanner.android.views.welcome.Routes
+import com.meta.pixelandtexel.scanner.services.settings.SettingsKey
+import com.meta.pixelandtexel.scanner.services.settings.SettingsService
+
+class WelcomeViewModel(
+    initialRoute: String = Routes.EMPTY // for @Preview
+) : ViewModel() {
+    private val _route = mutableStateOf(initialRoute)
+    val route: State<String> = _route
+
+    fun checkShouldShowNotice() {
+        val hasUserAcceptedNotice = SettingsService.get(SettingsKey.ACCEPTED_NOTICE, false)
+
+        if (hasUserAcceptedNotice) {
+            navTo(Routes.CAMERA_CONTROLS_INTRO)
+        } else {
+            navTo(Routes.NOTICE)
+        }
+    }
+
+    fun navTo(dest: String) {
+        _route.value = dest
+    }
+}
diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/android/views/components/Panel.kt b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/android/views/components/Panel.kt
new file mode 100644
index 00000000..6446cd9c
--- /dev/null
+++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/android/views/components/Panel.kt
@@ -0,0 +1,38 @@
+package com.meta.pixelandtexel.scanner.android.views.components
+
+import androidx.compose.foundation.background
+import androidx.compose.foundation.layout.Box
+import androidx.compose.foundation.layout.padding
+import androidx.compose.runtime.Composable
+import androidx.compose.ui.Modifier
+import androidx.compose.ui.draw.clip
+import androidx.compose.ui.tooling.preview.Preview
+import androidx.compose.ui.unit.dp
+import com.meta.spatial.uiset.theme.LocalColorScheme
+import 
com.meta.spatial.uiset.theme.LocalShapes +import com.meta.spatial.uiset.theme.SpatialTheme + +@Composable +fun Panel( + outerPadding: Boolean = true, + content: @Composable () -> Unit, +) { + Box( + modifier = + Modifier + .clip(LocalShapes.current.large) + .background( + brush = LocalColorScheme.current.panel, + shape = LocalShapes.current.large + ) + .padding(if (outerPadding) 20.dp else 0.dp) + ) { + content.invoke() + } +} + +@Preview(widthDp = 516, heightDp = 414) +@Composable +private fun PanelPreview() { + SpatialTheme { Panel {} } +} diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/android/views/components/smart/EntityRow.kt b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/android/views/components/smart/EntityRow.kt new file mode 100644 index 00000000..0df119b2 --- /dev/null +++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/android/views/components/smart/EntityRow.kt @@ -0,0 +1,125 @@ +package com.meta.pixelandtexel.scanner.android.views.components.smart + +import androidx.compose.foundation.layout.Arrangement +import androidx.compose.foundation.layout.Row +import androidx.compose.foundation.layout.fillMaxWidth +import androidx.compose.foundation.layout.padding +import androidx.compose.foundation.layout.size +import androidx.compose.material.icons.Icons +import androidx.compose.material.icons.filled.Build +import androidx.compose.material.icons.filled.PlayArrow +import androidx.compose.material3.Card +import androidx.compose.material3.CardDefaults +import androidx.compose.material3.CircularProgressIndicator +import androidx.compose.material3.Icon +import androidx.compose.material3.IconButton +import androidx.compose.material3.MaterialTheme +import androidx.compose.material3.Slider +import androidx.compose.material3.Switch +import androidx.compose.material3.Text +import androidx.compose.runtime.Composable +import androidx.compose.runtime.LaunchedEffect +import androidx.compose.runtime.getValue +import androidx.compose.runtime.mutableFloatStateOf +import androidx.compose.runtime.mutableStateOf +import androidx.compose.runtime.remember +import androidx.compose.runtime.setValue +import androidx.compose.ui.Alignment +import androidx.compose.ui.Modifier +import androidx.compose.ui.graphics.vector.ImageVector +import androidx.compose.ui.text.font.FontWeight +import androidx.compose.ui.unit.dp +import com.meta.pixelandtexel.scanner.utils.mytheme.MyPaddings + +@Composable +fun EntityRow( + title: String, + isUpdating: Boolean, + actualValue: Any, + onSwitchToggle: ((Boolean) -> Unit)? = null, + onSliderChange: ((Float) -> Unit)? = null, + buttonIcon: ImageVector = Icons.Default.PlayArrow, + onButtonToggled: (() -> Unit)? 
= null,
+    minMaxSlider: ClosedFloatingPointRange<Float> = 0f..1f,
+    enabled: Boolean = true
+) {
+    Card(
+        elevation = CardDefaults.cardElevation(defaultElevation = 2.dp),
+        colors = CardDefaults.cardColors(
+            containerColor = MaterialTheme.colorScheme.surfaceVariant
+        )
+    ) {
+        Row(
+            modifier = Modifier
+                .fillMaxWidth()
+                .padding(MyPaddings.M),
+            verticalAlignment = Alignment.CenterVertically,
+            horizontalArrangement = Arrangement.SpaceBetween
+        ) {
+            Row(
+                horizontalArrangement = Arrangement.spacedBy(MyPaddings.S)
+            ) {
+                Icon(
+                    imageVector = Icons.Default.Build,
+                    contentDescription = null,
+                )
+                Text(
+                    text = title,
+                    style = MaterialTheme.typography.bodyLarge,
+                    fontWeight = FontWeight.SemiBold
+                )
+            }
+
+            if (isUpdating) {
+                CircularProgressIndicator(
+                    modifier = Modifier.size(24.dp),
+                    strokeWidth = 2.dp
+                )
+            } else {
+                if (actualValue is Boolean) {
+                    var switchValue by remember { mutableStateOf(actualValue) }
+                    LaunchedEffect(actualValue) {
+                        switchValue = actualValue
+                    }
+                    Switch(
+                        checked = switchValue,
+                        onCheckedChange = { isChecked ->
+                            onSwitchToggle?.invoke(isChecked)
+                            switchValue = isChecked
+                        },
+                        enabled = enabled
+                    )
+                } else if (actualValue is Float) {
+                    var sliderPosition by remember { mutableFloatStateOf(actualValue) }
+                    LaunchedEffect(actualValue) {
+                        sliderPosition = actualValue
+                    }
+
+                    Slider(
+                        value = sliderPosition,
+                        onValueChangeFinished = {
+                            onSliderChange?.invoke(sliderPosition)
+                        },
+                        onValueChange = { volume ->
+                            sliderPosition = volume
+                        },
+                        valueRange = minMaxSlider,
+                        enabled = enabled
+                    )
+                } else if (onButtonToggled != null) {
+                    IconButton(
+                        onClick = {
+                            onButtonToggled.invoke()
+                        }
+                    ) {
+                        Icon(
+                            imageVector = buttonIcon,
+                            contentDescription = null,
+                            tint = MaterialTheme.colorScheme.primary
+                        )
+                    }
+                }
+            }
+        }
+    }
+}
\ No newline at end of file
diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/android/views/components/smart/InfoColumn.kt b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/android/views/components/smart/InfoColumn.kt
new file mode 100644
index 00000000..4f03c619
--- /dev/null
+++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/android/views/components/smart/InfoColumn.kt
@@ -0,0 +1,65 @@
+package com.meta.pixelandtexel.scanner.android.views.components.smart
+
+import androidx.compose.foundation.layout.Column
+import androidx.compose.foundation.layout.Spacer
+import androidx.compose.foundation.layout.height
+import androidx.compose.foundation.layout.padding
+import androidx.compose.material3.Card
+import androidx.compose.material3.CardDefaults
+import androidx.compose.material3.MaterialTheme
+import androidx.compose.material3.Text
+import androidx.compose.runtime.Composable
+import androidx.compose.ui.Alignment
+import androidx.compose.ui.Modifier
+import androidx.compose.ui.text.font.FontWeight
+import androidx.compose.ui.tooling.preview.Preview
+import androidx.compose.ui.unit.dp
+import androidx.compose.ui.unit.sp
+import com.meta.pixelandtexel.scanner.utils.mytheme.MyPaddings
+
+@Composable
+fun InfoColumn(
+    label: String,
+    value: String?,
+    modifier: Modifier = Modifier
+) {
+    Card(
+        elevation = CardDefaults.cardElevation(defaultElevation = 2.dp),
+        colors = CardDefaults.cardColors(
+            containerColor = MaterialTheme.colorScheme.surfaceVariant
+        ),
+        modifier = modifier.padding(MyPaddings.M)
+    ) {
+        Column(
+            horizontalAlignment = 
Alignment.CenterHorizontally, + modifier = modifier.padding(MyPaddings.M) + ) { + Text( + text = label, + style = MaterialTheme.typography.bodyMedium, + color = MaterialTheme.colorScheme.onSurfaceVariant, + maxLines = 2, + + ) + Spacer(modifier = Modifier.height(4.dp)) + Text( + text = value ?: "--", + style = MaterialTheme.typography.bodyLarge, + fontWeight = FontWeight.SemiBold, + fontSize = 16.sp, + color = MaterialTheme.colorScheme.primary, + + ) + } + } +} + +@Preview +@Composable +private fun InfoColumnPreview() { + InfoColumn( + label = "Temperature", + value = "22°C" + ) +} \ No newline at end of file diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/android/views/components/smart/LightComposable.kt b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/android/views/components/smart/LightComposable.kt new file mode 100644 index 00000000..62c4a2e6 --- /dev/null +++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/android/views/components/smart/LightComposable.kt @@ -0,0 +1,63 @@ +package com.meta.pixelandtexel.scanner.android.views.components.smart + +import androidx.compose.foundation.layout.Arrangement +import androidx.compose.foundation.layout.Column +import androidx.compose.foundation.layout.fillMaxWidth +import androidx.compose.foundation.layout.padding +import androidx.compose.material3.Card +import androidx.compose.material3.CardDefaults +import androidx.compose.runtime.Composable +import androidx.compose.ui.Modifier +import androidx.compose.ui.unit.dp +import com.meta.pixelandtexel.scanner.models.devices.domain.LightDomain + +@Composable +fun LightComposable( + modifier: Modifier = Modifier, + lightDomain: LightDomain, + onStateChange: (Boolean) -> Unit, + onKelvinChange: (Float) -> Unit, + onBrightnessChange: (Float) -> Unit +) { + val attributes = lightDomain.attributes + val isLightOn = lightDomain.value + + Card( + modifier = modifier.fillMaxWidth(), + elevation = CardDefaults.cardElevation(defaultElevation = 4.dp) + ) { + Column( + modifier = Modifier.padding(16.dp), + verticalArrangement = Arrangement.spacedBy(16.dp) + ) { + EntityRow( + title = "Light State", + isUpdating = false, + actualValue = lightDomain.value, + onSwitchToggle = onStateChange, + ) + + if (attributes.minColorTempKelvin != null && attributes.maxColorTempKelvin != null && lightDomain.attributes.colorTempKelvin != null) { + EntityRow( + title = "Temperature (K)", + isUpdating = false, + actualValue = lightDomain.attributes.colorTempKelvin.toFloat(), + onSliderChange = onKelvinChange, + minMaxSlider = attributes.minColorTempKelvin.toFloat()..attributes.maxColorTempKelvin.toFloat(), + enabled = isLightOn + ) + } + + if (attributes.brightness != null) { + EntityRow( + title = "Brightness", + isUpdating = false, + actualValue = lightDomain.attributes.brightness.toFloat(), + onSliderChange = onBrightnessChange, + minMaxSlider = 0f..255f, + enabled = isLightOn + ) + } + } + } +} diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/android/views/components/smart/MediaPlayerComposable.kt b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/android/views/components/smart/MediaPlayerComposable.kt new file mode 100644 index 00000000..437740ff --- /dev/null +++ 
b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/android/views/components/smart/MediaPlayerComposable.kt @@ -0,0 +1,97 @@ +package com.meta.pixelandtexel.scanner.android.views.components.smart + +import androidx.compose.foundation.layout.Arrangement +import androidx.compose.foundation.layout.Column +import androidx.compose.foundation.layout.padding +import androidx.compose.material3.Card +import androidx.compose.material3.MaterialTheme +import androidx.compose.material3.Text +import androidx.compose.runtime.Composable +import androidx.compose.ui.Modifier +import androidx.compose.ui.tooling.preview.Preview +import com.meta.pixelandtexel.scanner.models.devices.domain.DomainServices +import com.meta.pixelandtexel.scanner.models.devices.domain.MediaPlayerAttributes +import com.meta.pixelandtexel.scanner.models.devices.domain.MediaPlayerDomain +import com.meta.pixelandtexel.scanner.utils.mytheme.MyPaddings + +@Composable +fun MediaPlayerComposable( + modifier: Modifier = Modifier, + title: String = "Media Player", + mediaPlayerDomain: MediaPlayerDomain, + onStartChange: ((Boolean) -> Unit)? = null, + onMuteChange: ((Boolean) -> Unit)? = null, + onVolumenChange: ((Float) -> Unit)? = null, + onPlayChange: ((Boolean) -> Unit)? = null, +) { + Card(modifier = modifier) { + Column( + verticalArrangement = Arrangement.spacedBy(MyPaddings.S) + ) { + Text( + text = title, + style = MaterialTheme.typography.headlineSmall, + modifier = Modifier.padding(MyPaddings.M) + ) + + EntityRow( + title = "Actual State", + isUpdating = false, + actualValue = mediaPlayerDomain.value, + onSwitchToggle = onStartChange + ) + + if (mediaPlayerDomain.attributes.volumeLevel != null) { + EntityRow( + title = "Volume Level", + isUpdating = false, + actualValue = mediaPlayerDomain.attributes.volumeLevel, + onSliderChange = onVolumenChange, + ) + } + + if (mediaPlayerDomain.attributes.isMuted != null) { + EntityRow( + title = "Mute", + isUpdating = false, + actualValue = mediaPlayerDomain.attributes.isMuted, + onSwitchToggle = onMuteChange + ) + } + + + EntityRow( + title = "Play/Pause", + isUpdating = false, + actualValue = 0, + onButtonToggled = { + onPlayChange?.invoke(true) + } + ) + + } + } +} + +@Preview +@Composable +fun MediaPlayerComposablePreview() { + val thingEntity = MediaPlayerDomain( + value = true, + services = listOf( + DomainServices.TURN_OFF, DomainServices.TURN_ON, DomainServices.VOLUME_SET, + DomainServices.VOLUME_MUTE, DomainServices.MEDIA_PLAY + ), + attributes = MediaPlayerAttributes( + volumeLevel = 0.5f, + isMuted = false, + source = listOf("TV", "HDMI") + ) + ) + MediaPlayerComposable( + mediaPlayerDomain = thingEntity, + onMuteChange = { isChecked -> /* Handle switch toggle */ }, + onVolumenChange = { volume -> /* Handle slider change */ }, + onStartChange = { isChecked -> /* Handle switch toggle */ }, + ) +} \ No newline at end of file diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/android/views/components/smart/SensorGrid.kt b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/android/views/components/smart/SensorGrid.kt new file mode 100644 index 00000000..ec70ed33 --- /dev/null +++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/android/views/components/smart/SensorGrid.kt @@ -0,0 +1,52 @@ +package 
com.meta.pixelandtexel.scanner.android.views.components.smart
+
+import androidx.compose.foundation.layout.Arrangement
+import androidx.compose.foundation.layout.Column
+import androidx.compose.foundation.layout.Row
+import androidx.compose.foundation.layout.fillMaxWidth
+import androidx.compose.foundation.layout.padding
+import androidx.compose.material3.HorizontalDivider
+import androidx.compose.material3.MaterialTheme
+import androidx.compose.material3.Text
+import androidx.compose.runtime.Composable
+import androidx.compose.ui.Modifier
+import com.meta.pixelandtexel.scanner.android.views.smarthome.dynamic_smart_thing.state.EntityUiModel
+import com.meta.pixelandtexel.scanner.models.devices.domain.SensorDomain
+import com.meta.pixelandtexel.scanner.utils.mytheme.MyPaddings
+
+const val MAX_COLUMNS = 3
+
+@Composable
+fun SensorGrid(
+    sensors: List<EntityUiModel>,
+    modifier: Modifier = Modifier
+) {
+    if (sensors.isNotEmpty()) {
+        Column(modifier = modifier.padding(MyPaddings.S)) {
+            HorizontalDivider(
+                modifier = Modifier.padding(vertical = MyPaddings.M)
+            )
+            Text(
+                text = "Sensors",
+                style = MaterialTheme.typography.titleMedium,
+                color = MaterialTheme.colorScheme.primary,
+            )
+
+            sensors.chunked(MAX_COLUMNS).forEach { rowEntities ->
+                Row(
+                    modifier = Modifier
+                        .fillMaxWidth(),
+                    horizontalArrangement = Arrangement.SpaceBetween
+                ) {
+                    rowEntities.forEach { entity ->
+                        val domain = entity.domain as SensorDomain
+                        InfoColumn(
+                            label = entity.name,
+                            value = domain.value,
+                        )
+                    }
+                }
+            }
+        }
+    }
+}
diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/android/views/components/smart/TitleRow.kt b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/android/views/components/smart/TitleRow.kt
new file mode 100644
index 00000000..95c4313f
--- /dev/null
+++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/android/views/components/smart/TitleRow.kt
@@ -0,0 +1,72 @@
+package com.meta.pixelandtexel.scanner.android.views.components.smart
+
+import androidx.compose.foundation.layout.Arrangement
+import androidx.compose.foundation.layout.Column
+import androidx.compose.foundation.layout.Row
+import androidx.compose.foundation.layout.fillMaxWidth
+import androidx.compose.foundation.layout.padding
+import androidx.compose.material.icons.filled.Close
+import androidx.compose.material3.HorizontalDivider
+import androidx.compose.material3.Icon
+import androidx.compose.material3.IconButton
+import androidx.compose.material3.MaterialTheme
+import androidx.compose.material3.Text
+import androidx.compose.runtime.Composable
+import androidx.compose.ui.Alignment
+import androidx.compose.ui.Modifier
+import androidx.compose.ui.graphics.vector.ImageVector
+import androidx.compose.ui.text.font.FontWeight
+import androidx.compose.ui.tooling.preview.Preview
+import androidx.compose.ui.unit.dp
+import com.meta.pixelandtexel.scanner.utils.mytheme.MyPaddings
+
+@Composable
+fun TitleComposable(
+    title: String,
+    modifier: Modifier = Modifier,
+    topButtonIcon: ImageVector,
+    topButtonContentDescription: String = "Close Button",
+    onClickTopButton: () -> Unit = { }
+) {
+    Column {
+        Row(
+            modifier = modifier
+                .padding(MyPaddings.M)
+                .fillMaxWidth(),
+            verticalAlignment = Alignment.CenterVertically,
+            horizontalArrangement = Arrangement.SpaceBetween
+        ) {
+            Text(
+                text = title,
+                style = MaterialTheme.typography.titleLarge.copy(
fontWeight = FontWeight.Bold + ), + color = MaterialTheme.colorScheme.primary, + modifier = Modifier.padding(vertical = MyPaddings.S) + ) + + IconButton( + onClick = onClickTopButton + ) { + Icon( + imageVector = topButtonIcon, + contentDescription = topButtonContentDescription, + tint = MaterialTheme.colorScheme.error + ) + } + + } + HorizontalDivider( + thickness = 3.dp, + ) + } +} + +@Preview +@Composable +fun TitleRowPreview() { + TitleComposable( + title = "Smart Device", + topButtonIcon = androidx.compose.material.icons.Icons.Default.Close + ) +} \ No newline at end of file diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/android/views/components/smart/WeatherGrid.kt b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/android/views/components/smart/WeatherGrid.kt new file mode 100644 index 00000000..8dc6dbb4 --- /dev/null +++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/android/views/components/smart/WeatherGrid.kt @@ -0,0 +1,89 @@ +package com.meta.pixelandtexel.scanner.android.views.components.smart + +import androidx.compose.foundation.layout.Arrangement +import androidx.compose.foundation.layout.Column +import androidx.compose.foundation.layout.Row +import androidx.compose.foundation.layout.fillMaxWidth +import androidx.compose.foundation.layout.padding +import androidx.compose.material3.HorizontalDivider +import androidx.compose.material3.MaterialTheme +import androidx.compose.material3.Text +import androidx.compose.runtime.Composable +import androidx.compose.ui.Modifier +import com.meta.pixelandtexel.scanner.models.devices.domain.WeatherDomain +import com.meta.pixelandtexel.scanner.utils.mytheme.MyPaddings + +@Composable +fun WeatherGrid( + domain: WeatherDomain, + modifier: Modifier = Modifier +) { + Column(modifier = modifier.padding(MyPaddings.S)) { + HorizontalDivider( + modifier = Modifier.padding(vertical = MyPaddings.M) + ) + Text( + text = "Sensors", + style = MaterialTheme.typography.titleMedium, + color = MaterialTheme.colorScheme.primary, + ) + + InfoColumn( + label = "Weather Condition", + value = domain.value, + ) + + Row( + modifier = Modifier + .fillMaxWidth(), + horizontalArrangement = Arrangement.SpaceBetween + ) { + if (domain.attributes.temperature != null) { + InfoColumn( + label = "Temperature", + value = "${domain.attributes.temperature} ${domain.attributes.temperatureUnit ?: "°C"}", + ) + } + if (domain.attributes.humidity != null) { + InfoColumn( + label = "Humidity", + value = "${domain.attributes.humidity}%", + ) + } + + if (domain.attributes.windSpeed != null) { + InfoColumn( + label = "Wind Speed", + value = "${domain.attributes.windSpeed} ${domain.attributes.windSpeedUnit ?: "m/s"}", + ) + } + } + + Row( + modifier = Modifier + .fillMaxWidth(), + horizontalArrangement = Arrangement.SpaceBetween + ) { + if (domain.attributes.pressure != null) { + InfoColumn( + label = "Pressure", + value = "${domain.attributes.pressure} ${domain.attributes.pressureUnit ?: "hPa"}", + ) + } + if (domain.attributes.uvIndex != null) { + InfoColumn( + label = "UV Index", + value = domain.attributes.uvIndex.toString(), + ) + } + if (domain.attributes.cloudCoverage != null) { + InfoColumn( + label = "Cloud Coverage", + value = "${domain.attributes.cloudCoverage}%", + ) + } + } + } +} + + diff --git 
a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/android/views/smarthome/dynamic_smart_thing/DynamicSmartThingScreen.kt b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/android/views/smarthome/dynamic_smart_thing/DynamicSmartThingScreen.kt new file mode 100644 index 00000000..7bb99071 --- /dev/null +++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/android/views/smarthome/dynamic_smart_thing/DynamicSmartThingScreen.kt @@ -0,0 +1,211 @@ +package com.meta.pixelandtexel.scanner.android.views.smarthome.dynamic_smart_thing + +import androidx.compose.foundation.layout.Arrangement +import androidx.compose.foundation.layout.Column +import androidx.compose.foundation.layout.fillMaxWidth +import androidx.compose.foundation.layout.padding +import androidx.compose.foundation.lazy.LazyColumn +import androidx.compose.foundation.lazy.items +import androidx.compose.material.icons.Icons +import androidx.compose.material.icons.filled.Close +import androidx.compose.material3.Button +import androidx.compose.material3.Card +import androidx.compose.material3.CardDefaults +import androidx.compose.material3.MaterialTheme +import androidx.compose.material3.Text +import androidx.compose.runtime.Composable +import androidx.compose.runtime.LaunchedEffect +import androidx.compose.runtime.collectAsState +import androidx.compose.runtime.getValue +import androidx.compose.ui.Modifier +import androidx.compose.ui.tooling.preview.Preview +import androidx.compose.ui.unit.dp +import com.meta.pixelandtexel.scanner.android.views.components.smart.EntityRow +import com.meta.pixelandtexel.scanner.android.views.components.smart.LightComposable +import com.meta.pixelandtexel.scanner.android.views.components.smart.MediaPlayerComposable +import com.meta.pixelandtexel.scanner.android.views.components.smart.SensorGrid +import com.meta.pixelandtexel.scanner.android.views.components.smart.TitleComposable +import com.meta.pixelandtexel.scanner.android.views.components.smart.WeatherGrid +import com.meta.pixelandtexel.scanner.android.views.smarthome.dynamic_smart_thing.state.EntityUiModel +import com.meta.pixelandtexel.scanner.models.devices.Device +import com.meta.pixelandtexel.scanner.models.devices.domain.AttributeServices +import com.meta.pixelandtexel.scanner.models.devices.domain.DomainServices +import com.meta.pixelandtexel.scanner.models.devices.domain.LightDomain +import com.meta.pixelandtexel.scanner.models.devices.domain.MediaPlayerDomain +import com.meta.pixelandtexel.scanner.models.devices.domain.SensorDomain +import com.meta.pixelandtexel.scanner.models.devices.domain.SwitchDomain +import com.meta.pixelandtexel.scanner.models.devices.domain.WeatherDomain +import com.meta.pixelandtexel.scanner.utils.mytheme.MyPaddings + +@Composable +fun DynamicSmartThingScreen( + device: Device, + viewModel: DynamicSmartThingViewmodel, + modifier: Modifier = Modifier +) { + fun onSwitchToggled( + entity: EntityUiModel, + newValue: Boolean, + ) { + var domainService = DomainServices.TURN_ON + if (!newValue) { + domainService = DomainServices.TURN_OFF + } + viewModel.onActionExecuted(entity, newValue, domainService) + } + + val uiState by viewModel.uiState.collectAsState() + LaunchedEffect(device) { + viewModel.initialize(device) + } + + val sensors = + uiState.entities.filter { it.domain is SensorDomain || it.domain is WeatherDomain } + val controllers = 
uiState.entities.filter { it.domain !is SensorDomain } + + Card( + modifier = modifier + .fillMaxWidth() + .padding(MyPaddings.M), + elevation = CardDefaults.cardElevation(defaultElevation = 4.dp) + ) { + Column( + modifier = Modifier + .padding(MyPaddings.M) + ) { + TitleComposable( + title = uiState.deviceName, + topButtonIcon = Icons.Default.Close, + onClickTopButton = viewModel.onCloseSmartThing + ) + + LazyColumn( + verticalArrangement = Arrangement.spacedBy(MyPaddings.S) + ) { + if (controllers.isNotEmpty()) { + item { + Text( + text = "Controls", + style = MaterialTheme.typography.titleMedium, + color = MaterialTheme.colorScheme.primary, + modifier = Modifier.padding(vertical = MyPaddings.S) + ) + } + } + + items( + items = controllers, + key = { it.id } + ) { entity -> + if (entity.domain is SwitchDomain) { + EntityRow( + title = entity.name, + isUpdating = entity.isUpdating, + actualValue = entity.domain.value, + onSwitchToggle = { newValue -> + onSwitchToggled(entity, newValue) + }, + ) + } else if (entity.domain is MediaPlayerDomain) { + MediaPlayerComposable( + title = entity.name, + mediaPlayerDomain = entity.domain, + onStartChange = { newValue -> + onSwitchToggled(entity, newValue) + }, + onMuteChange = { newValue -> + viewModel.onActionExecuted( + entity, + newValue, + DomainServices.VOLUME_MUTE, + AttributeServices.IS_VOLUME_MUTED + ) + }, + onVolumenChange = { newValue -> + viewModel.onActionExecuted( + entity, + newValue, + DomainServices.VOLUME_SET, + AttributeServices.VOLUME_LEVEL + ) + }, + onPlayChange = { newValue -> + viewModel.onActionExecuted( + entity, + true, + DomainServices.MEDIA_PLAY, + ) + }, + ) + } else if (entity.domain is LightDomain) { + LightComposable( + lightDomain = entity.domain, + onStateChange = { newValue -> + onSwitchToggled(entity, newValue) + }, + onKelvinChange = { newValue -> + viewModel.onActionExecuted( + entity, + newValue, + DomainServices.TURN_ON, + AttributeServices.COLOR_TEMP_KELVIN + ) + }, + onBrightnessChange = { newValue -> + viewModel.onActionExecuted( + entity, + newValue, + DomainServices.TURN_ON, + AttributeServices.BRIGHTNESS + ) + }, + ) + } + } + + if (sensors.isNotEmpty()) { + item { + if (sensors[0].domain is WeatherDomain) { + WeatherGrid( + domain = sensors[0].domain as WeatherDomain, + modifier = Modifier + .padding(top = MyPaddings.L) + ) + } else { + SensorGrid( + sensors = sensors, + modifier = Modifier + .padding(top = MyPaddings.L) + ) + } + } + } + + + + item { + Button( + onClick = { viewModel.onDisconnectDevice() } + ) { + Text(text = "Disconnect Common Device") + } + } + } + } + } + + +} + + +@Preview(widthDp = 400, heightDp = 100) +@Composable +fun EntityRowPreview() { + EntityRow( + title = "Living Room Light", + isUpdating = false, + actualValue = 0.2f, + onSwitchToggle = {}, + onSliderChange = {} + ) +} \ No newline at end of file diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/android/views/smarthome/dynamic_smart_thing/DynamicSmartThingViewmodel.kt b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/android/views/smarthome/dynamic_smart_thing/DynamicSmartThingViewmodel.kt new file mode 100644 index 00000000..d44c825f --- /dev/null +++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/android/views/smarthome/dynamic_smart_thing/DynamicSmartThingViewmodel.kt @@ -0,0 +1,102 @@ +package 
com.meta.pixelandtexel.scanner.android.views.smarthome.dynamic_smart_thing
+
+import androidx.lifecycle.ViewModel
+import androidx.lifecycle.viewModelScope
+import com.meta.pixelandtexel.scanner.android.domain.usecases.GetDeviceInfoUsecase
+import com.meta.pixelandtexel.scanner.android.domain.usecases.UseActionDevice
+import com.meta.pixelandtexel.scanner.android.views.smarthome.dynamic_smart_thing.state.EntityUiModel
+import com.meta.pixelandtexel.scanner.android.views.smarthome.dynamic_smart_thing.state.SmartDeviceUiState
+import com.meta.pixelandtexel.scanner.models.devices.Device
+import com.meta.pixelandtexel.scanner.models.devices.ThingEntity
+import com.meta.pixelandtexel.scanner.models.devices.domain.AttributeServices
+import com.meta.pixelandtexel.scanner.models.devices.domain.DomainServices
+import kotlinx.coroutines.delay
+import kotlinx.coroutines.flow.MutableStateFlow
+import kotlinx.coroutines.flow.StateFlow
+import kotlinx.coroutines.flow.update
+import kotlinx.coroutines.launch
+
+class DynamicSmartThingViewmodel(
+    private val getDeviceInfoUsecase: GetDeviceInfoUsecase,
+    private val useActionDevice: UseActionDevice,
+    val onCloseSmartThing: () -> Unit = {},
+    val onDisconnectDevice: () -> Unit = {}
+) : ViewModel() {
+    companion object {
+        private const val WAIT_FOR_NEXT_REQUEST_MS = 5000L
+        private const val DELAY_FOR_UPDATING_FROM_API_MS = 500L
+    }
+
+    private val _uiState = MutableStateFlow(SmartDeviceUiState())
+    val uiState: StateFlow<SmartDeviceUiState> = _uiState
+
+    fun initialize(device: Device) {
+        val newState = SmartDeviceUiState(
+            deviceName = device.name,
+            entities = device.entityList.sortedBy { entity -> entity.domain.toString() }
+                .map { entity ->
+                    EntityUiModel.fromThingEntityWithoutDeviceName(
+                        entity,
+                        device.name
+                    )
+                }
+        )
+
+        viewModelScope.launch {
+            updateAllEntitiesState(newState)
+            updateStatePeriodically()
+        }
+    }
+
+    suspend fun updateStatePeriodically() {
+        while (true) {
+            updateAllEntitiesState(_uiState.value)
+            delay(WAIT_FOR_NEXT_REQUEST_MS)
+        }
+    }
+
+    fun onActionExecuted(
+        entity: EntityUiModel,
+        newValue: Any,
+        action: DomainServices,
+        attribute: AttributeServices? = null
+    ) {
+        viewModelScope.launch {
+            val entityId = entity.id
+
+            if (action !in entity.domain.services) {
+                return@launch
+            }
+            useActionDevice.run(
+                thingId = entityId,
+                action = action.serviceName,
+                newValue = newValue,
+                attribute = attribute?.serviceName
+            )
+
+            updateAllEntitiesState(_uiState.value)
+        }
+    }
+
+    private suspend fun updateAllEntitiesState(newState: SmartDeviceUiState) {
+        val listThings = newState.entities.map { ThingEntity(id = it.id, domain = it.domain) }
+        delay(DELAY_FOR_UPDATING_FROM_API_MS)
+
+        val updatedEntities = getDeviceInfoUsecase.run(listThings)
+
+        val newEntities = updatedEntities.map { entity ->
+            EntityUiModel.fromThingEntityWithoutDeviceName(
+                thingEntity = entity,
+                deviceName = newState.deviceName
+            )
+        }
+        _uiState.update { currentState ->
+            currentState.copy(
+                entities = newEntities,
+                deviceName = newState.deviceName
+            )
+        }
+    }
+}
\ No newline at end of file
diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/android/views/smarthome/dynamic_smart_thing/state/EntityUiModel.kt b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/android/views/smarthome/dynamic_smart_thing/state/EntityUiModel.kt
new file mode 100644
index 00000000..695f356f
--- /dev/null
+++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/android/views/smarthome/dynamic_smart_thing/state/EntityUiModel.kt
@@ -0,0 +1,26 @@
+package com.meta.pixelandtexel.scanner.android.views.smarthome.dynamic_smart_thing.state
+
+import com.meta.pixelandtexel.scanner.models.devices.ThingEntity
+import com.meta.pixelandtexel.scanner.models.devices.domain.Domain
+
+data class EntityUiModel(
+    val id: String,
+    val name: String,
+    val domain: Domain,
+    val isUpdating: Boolean = false
+) {
+    companion object {
+        fun fromThingEntityWithoutDeviceName(
+            thingEntity: ThingEntity,
+            deviceName: String
+        ): EntityUiModel {
+            val name =
+                thingEntity.id.substringAfterLast('.').replace('_', ' ').replace(deviceName, "")
+            return EntityUiModel(
+                id = thingEntity.id,
+                name = if (name.isBlank()) deviceName else name.trim(),
+                domain = thingEntity.domain
+            )
+        }
+    }
+}
\ No newline at end of file
diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/android/views/smarthome/dynamic_smart_thing/state/SmartDeviceUiState.kt b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/android/views/smarthome/dynamic_smart_thing/state/SmartDeviceUiState.kt
new file mode 100644
index 00000000..8c671ce9
--- /dev/null
+++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/android/views/smarthome/dynamic_smart_thing/state/SmartDeviceUiState.kt
@@ -0,0 +1,7 @@
+package com.meta.pixelandtexel.scanner.android.views.smarthome.dynamic_smart_thing.state
+
+data class SmartDeviceUiState(
+    val deviceName: String = "",
+    val isLoading: Boolean = false,
+    val entities: List<EntityUiModel> = emptyList()
+)
\ No newline at end of file
diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/android/views/smarthome/selection/DeviceSelectionScreen.kt b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/android/views/smarthome/selection/DeviceSelectionScreen.kt
new
file mode 100644 index 00000000..f50703db --- /dev/null +++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/android/views/smarthome/selection/DeviceSelectionScreen.kt @@ -0,0 +1,62 @@ +package com.meta.pixelandtexel.scanner.android.views.smarthome.selection + +import androidx.compose.foundation.layout.Arrangement +import androidx.compose.foundation.layout.Column +import androidx.compose.foundation.layout.fillMaxWidth +import androidx.compose.foundation.layout.padding +import androidx.compose.material3.Button +import androidx.compose.material3.Card +import androidx.compose.material3.CardDefaults +import androidx.compose.material3.CircularProgressIndicator +import androidx.compose.material3.MaterialTheme +import androidx.compose.material3.Text +import androidx.compose.runtime.Composable +import androidx.compose.runtime.LaunchedEffect +import androidx.compose.runtime.collectAsState +import androidx.compose.runtime.getValue +import androidx.compose.ui.Alignment +import androidx.compose.ui.Modifier +import androidx.compose.ui.unit.dp +import com.meta.pixelandtexel.scanner.models.devices.Device +import com.meta.pixelandtexel.scanner.utils.mytheme.MyPaddings + +@Composable +fun DeviceSelectionScreen( + viewModel: DeviceSelectionViewModel, + onOptionSelected: (Device) -> Unit +) { + val options by viewModel.options.collectAsState() + + LaunchedEffect(Unit) { + viewModel.loadOptions() + } + + Card( + modifier = Modifier.padding(MyPaddings.M), + elevation = CardDefaults.cardElevation(defaultElevation = 8.dp) + ) { + Column( + modifier = Modifier.padding(MyPaddings.L), + verticalArrangement = Arrangement.spacedBy(12.dp), + horizontalAlignment = Alignment.CenterHorizontally + ) { + Text( + text = "Which device do you want to pair?", + style = MaterialTheme.typography.headlineSmall + ) + + if (options.isEmpty()) { + CircularProgressIndicator() + } else { + options.forEach { option -> + Button( + onClick = { onOptionSelected(option) }, + modifier = Modifier.fillMaxWidth() + ) { + Text(text = option.name) + } + } + } + } + } +} \ No newline at end of file diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/android/views/smarthome/selection/DeviceSelectionScreenViewModel.kt b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/android/views/smarthome/selection/DeviceSelectionScreenViewModel.kt new file mode 100644 index 00000000..c096a949 --- /dev/null +++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/android/views/smarthome/selection/DeviceSelectionScreenViewModel.kt @@ -0,0 +1,32 @@ +package com.meta.pixelandtexel.scanner.android.views.smarthome.selection + +import androidx.lifecycle.ViewModel +import androidx.lifecycle.viewModelScope +import com.meta.pixelandtexel.scanner.android.domain.usecases.GetDevicesOfASmarthomeType +import com.meta.pixelandtexel.scanner.models.devices.Device +import com.meta.pixelandtexel.scanner.models.smarthomedata.TypeSmartHomeInfo +import kotlinx.coroutines.flow.MutableStateFlow +import kotlinx.coroutines.flow.StateFlow +import kotlinx.coroutines.flow.asStateFlow +import kotlinx.coroutines.launch + +class DeviceSelectionViewModel( + private val displayInfo: TypeSmartHomeInfo = TypeSmartHomeInfo.UNKNOWN, + private val getDevicesOfASmarthomeType: GetDevicesOfASmarthomeType +) : ViewModel() { + + private val _options = 
MutableStateFlow<List<Device>>(emptyList())
+    val options: StateFlow<List<Device>> = _options.asStateFlow()
+
+    private val _deviceType = MutableStateFlow(TypeSmartHomeInfo.UNKNOWN)
+    val deviceType: StateFlow<TypeSmartHomeInfo> = _deviceType.asStateFlow()
+
+    fun loadOptions() {
+        _deviceType.value = displayInfo
+
+        viewModelScope.launch {
+            _options.value = getDevicesOfASmarthomeType.run()
+        }
+    }
+}
\ No newline at end of file
diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/android/views/welcome/CameraControlsView.kt b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/android/views/welcome/CameraControlsView.kt
new file mode 100644
index 00000000..dd7c32fb
--- /dev/null
+++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/android/views/welcome/CameraControlsView.kt
@@ -0,0 +1,149 @@
+// (c) Meta Platforms, Inc. and affiliates. Confidential and proprietary.
+
+package com.meta.pixelandtexel.scanner.android.views.welcome
+
+import androidx.compose.animation.AnimatedVisibility
+import androidx.compose.animation.core.tween
+import androidx.compose.animation.fadeIn
+import androidx.compose.animation.fadeOut
+import androidx.compose.foundation.Image
+import androidx.compose.foundation.background
+import androidx.compose.foundation.layout.Arrangement
+import androidx.compose.foundation.layout.Box
+import androidx.compose.foundation.layout.Column
+import androidx.compose.foundation.layout.fillMaxSize
+import androidx.compose.foundation.layout.fillMaxWidth
+import androidx.compose.foundation.layout.padding
+import androidx.compose.material3.Text
+import androidx.compose.runtime.Composable
+import androidx.compose.runtime.LaunchedEffect
+import androidx.compose.runtime.getValue
+import androidx.compose.runtime.mutableStateOf
+import androidx.compose.runtime.remember
+import androidx.compose.runtime.setValue
+import androidx.compose.ui.Alignment
+import androidx.compose.ui.Modifier
+import androidx.compose.ui.graphics.Color
+import androidx.compose.ui.res.painterResource
+import androidx.compose.ui.res.stringResource
+import androidx.compose.ui.tooling.preview.Preview
+import androidx.compose.ui.unit.dp
+import com.meta.pixelandtexel.scanner.R
+import com.meta.pixelandtexel.scanner.android.views.components.Panel
+import com.meta.spatial.uiset.button.PrimaryButton
+import com.meta.spatial.uiset.theme.SpatialTheme
+import kotlinx.coroutines.delay
+
+@Composable
+fun CameraControlsView(onContinue: (() -> Unit)) {
+    var visible by remember { mutableStateOf(true) }
+
+    // toggle visibility every 2 seconds
+    LaunchedEffect(Unit) {
+        while (true) {
+            delay(2000L)
+            visible = !visible
+        }
+    }
+
+    Panel(outerPadding = false) {
+        Column(modifier = Modifier.fillMaxSize()) {
+            Box(modifier = Modifier
+                .weight(1f)
+                .background(Color(0x88000000))
+                .fillMaxWidth()) {
+                Column(
+                    verticalArrangement = Arrangement.Center,
+                    horizontalAlignment = Alignment.CenterHorizontally,
+                    modifier = Modifier.fillMaxSize(),
+                ) {
+                    AnimatedVisibility(
+                        visible = visible,
+                        enter = fadeIn(tween(durationMillis = 800)),
+                        exit = fadeOut(tween(durationMillis = 800)),
+                    ) {
+                        Box(
+                            contentAlignment = Alignment.TopCenter,
+                            modifier = Modifier.fillMaxSize()
+                        ) {
+                            Image(
+                                painter = painterResource(R.drawable.hand_swipe),
+                                contentDescription = "",
+                                modifier = Modifier
+                                    .fillMaxSize()
+                                    .padding(top = 40.dp, bottom = 10.dp),
+                            )
+                            Image(
+                                painter =
painterResource(R.drawable.arrow_over_3x), + contentDescription = "", + modifier = + Modifier + .fillMaxSize() + .padding(top = 25.dp, bottom = 100.dp, end = 10.dp), + ) + } + } + } + Column( + verticalArrangement = Arrangement.Center, + horizontalAlignment = Alignment.CenterHorizontally, + modifier = Modifier.fillMaxSize(), + ) { + AnimatedVisibility( + visible = !visible, + enter = fadeIn(tween(durationMillis = 800)), + exit = fadeOut(tween(durationMillis = 800)), + ) { + Box( + contentAlignment = Alignment.TopCenter, + modifier = Modifier.fillMaxSize() + ) { + Image( + painter = painterResource(R.drawable.hand_open), + contentDescription = "", + modifier = Modifier + .fillMaxSize() + .padding(vertical = 10.dp), + ) + } + } + } + } + Column( + horizontalAlignment = Alignment.CenterHorizontally, + verticalArrangement = Arrangement.SpaceBetween, + modifier = Modifier + .weight(0.95f) + .fillMaxWidth() + .padding(20.dp), + ) { + Text( + stringResource(R.string.camera_controls_title), + style = SpatialTheme.typography.headline3Strong, + color = Color.White, + ) + Text( + stringResource(R.string.camera_controls_1), + style = SpatialTheme.typography.body2, + color = Color.White, + ) + Text( + stringResource(R.string.camera_controls_2), + style = SpatialTheme.typography.body2, + color = Color.White, + ) + PrimaryButton( + stringResource(R.string.btn_got_it), + onClick = onContinue, + expanded = true + ) + } + } + } +} + +@Preview(widthDp = 368, heightDp = 404) +@Composable +private fun CameraControlsViewPreview() { + CameraControlsView {} +} diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/android/views/welcome/NoticeView.kt b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/android/views/welcome/NoticeView.kt new file mode 100644 index 00000000..e4aca083 --- /dev/null +++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/android/views/welcome/NoticeView.kt @@ -0,0 +1,49 @@ +// (c) Meta Platforms, Inc. and affiliates. Confidential and proprietary. + +package com.meta.pixelandtexel.scanner.android.views.welcome + +import androidx.compose.foundation.layout.Arrangement +import androidx.compose.foundation.layout.Column +import androidx.compose.foundation.layout.fillMaxSize +import androidx.compose.runtime.Composable +import androidx.compose.ui.Alignment +import androidx.compose.ui.Modifier +import androidx.compose.ui.res.stringResource +import androidx.compose.ui.text.TextStyle +import androidx.compose.ui.tooling.preview.Preview +import androidx.compose.ui.unit.dp +import com.meta.pixelandtexel.scanner.R +import com.meta.pixelandtexel.scanner.android.views.components.Panel +import com.meta.spatial.uiset.button.PrimaryButton +import com.meta.spatial.uiset.theme.SpatialColor +import com.meta.spatial.uiset.theme.SpatialTheme +import dev.jeziellago.compose.markdowntext.MarkdownText + +@Composable +fun NoticeView(onLinkClicked: ((String) -> Unit)? 
= null, onContinue: (() -> Unit)) { + Panel { + Column(verticalArrangement = Arrangement.Center, modifier = Modifier.fillMaxSize()) { + Column( + verticalArrangement = Arrangement.spacedBy(30.dp), + horizontalAlignment = Alignment.CenterHorizontally, + ) { + MarkdownText( + stringResource(R.string.notice), + style = SpatialTheme.typography.body1.merge(TextStyle(color = SpatialColor.white100)), + onLinkClicked = onLinkClicked, + ) + PrimaryButton( + stringResource(R.string.btn_continue), + onClick = onContinue, + expanded = true + ) + } + } + } +} + +@Preview(widthDp = 368, heightDp = 404) +@Composable +private fun InterstitialViewPreview() { + NoticeView {} +} diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/android/views/welcome/WelcomeScreen.kt b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/android/views/welcome/WelcomeScreen.kt new file mode 100644 index 00000000..c93d6a3b --- /dev/null +++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/android/views/welcome/WelcomeScreen.kt @@ -0,0 +1,63 @@ +// (c) Meta Platforms, Inc. and affiliates. Confidential and proprietary. + +package com.meta.pixelandtexel.scanner.android.views.welcome + +import androidx.compose.runtime.Composable +import androidx.compose.runtime.LaunchedEffect +import androidx.compose.runtime.getValue +import androidx.compose.ui.tooling.preview.Preview +import androidx.lifecycle.viewmodel.compose.viewModel +import androidx.navigation.NavHostController +import androidx.navigation.compose.NavHost +import androidx.navigation.compose.composable +import androidx.navigation.compose.rememberNavController +import com.meta.pixelandtexel.scanner.android.viewmodels.WelcomeViewModel +import com.meta.pixelandtexel.scanner.services.settings.SettingsKey +import com.meta.pixelandtexel.scanner.services.settings.SettingsService +import com.meta.spatial.uiset.theme.SpatialTheme + +object Routes { + const val EMPTY = "EMPTY" + const val NOTICE = "NOTICE" + const val CAMERA_CONTROLS_INTRO = "CAMERA_CONTROLS_INTRO" +} + +@Composable +fun WelcomeScreen( + vm: WelcomeViewModel = viewModel(), + navController: NavHostController = rememberNavController(), + dismissPanel: (() -> Unit)? 
= null,
+) {
+    val route by vm.route
+
+    LaunchedEffect(route) { navController.navigate(route) { launchSingleTop = true } }
+
+    LaunchedEffect(null) { vm.checkShouldShowNotice() }
+
+    SpatialTheme {
+        NavHost(navController = navController, startDestination = route) {
+            composable(Routes.EMPTY) {
+                // purposefully empty
+            }
+            composable(Routes.NOTICE) {
+                NoticeView {
+                    SettingsService.set(SettingsKey.ACCEPTED_NOTICE, true)
+                    vm.navTo(Routes.CAMERA_CONTROLS_INTRO)
+                }
+            }
+            composable(Routes.CAMERA_CONTROLS_INTRO) { CameraControlsView { dismissPanel?.invoke() } }
+        }
+    }
+}
+
+@Preview(widthDp = 368, heightDp = 404)
+@Composable
+private fun WelcomeScreenPreviewInterstitial() {
+    WelcomeScreen(WelcomeViewModel(Routes.NOTICE))
+}
+
+@Preview(widthDp = 368, heightDp = 404)
+@Composable
+private fun WelcomeScreenPreviewWelcome() {
+    WelcomeScreen(WelcomeViewModel(Routes.CAMERA_CONTROLS_INTRO))
+}
diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/datasource/network/NetworkModule.kt b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/datasource/network/NetworkModule.kt
new file mode 100644
index 00000000..d70d6ef2
--- /dev/null
+++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/datasource/network/NetworkModule.kt
@@ -0,0 +1,60 @@
+package com.meta.pixelandtexel.scanner.datasource.network
+
+import com.meta.pixelandtexel.scanner.BuildConfig
+import okhttp3.Interceptor
+import okhttp3.OkHttpClient
+import org.koin.dsl.module
+import retrofit2.Retrofit
+import retrofit2.converter.gson.GsonConverterFactory
+import java.util.concurrent.TimeUnit
+
+interface TokenProvider {
+    fun bearerToken(): String
+}
+
+class StaticTokenProvider(
+    private val token: String
+) : TokenProvider {
+    override fun bearerToken(): String = "Bearer $token"
+}
+
+const val BASE_URL_GET = BuildConfig.HTTP_API
+
+private fun authInterceptor(tokenProvider: TokenProvider) = Interceptor { chain ->
+    val request = chain.request()
+        .newBuilder()
+        .addHeader("Authorization", tokenProvider.bearerToken())
+        .addHeader("Content-Type", "application/json")
+        .build()
+    chain.proceed(request)
+}
+
+fun provideHttpClient(tokenProvider: TokenProvider): OkHttpClient {
+    return OkHttpClient.Builder()
+        .addInterceptor(authInterceptor(tokenProvider))
+        .readTimeout(60, TimeUnit.SECONDS)
+        .connectTimeout(60, TimeUnit.SECONDS)
+        .build()
+}
+
+fun provideRetrofit(
+    okHttpClient: OkHttpClient,
+): Retrofit {
+    return Retrofit.Builder()
+        .baseUrl(BASE_URL_GET)
+        .client(okHttpClient)
+        .addConverterFactory(GsonConverterFactory.create())
+        .build()
+}
+
+fun provideService(retrofit: Retrofit): SmartHomeApi =
+    retrofit.create(SmartHomeApi::class.java)
+
+val networkModule = module {
+    // bind the provider under its interface so provideHttpClient(get()) can resolve it
+    single<TokenProvider> { StaticTokenProvider(token = BuildConfig.HOME_ASSISTANT_TOKEN) }
+    single { provideHttpClient(get()) }
+    single { provideRetrofit(get()) }
+    single { provideService(get()) }
+}
\ No newline at end of file
diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/datasource/network/SmartHomeApi.kt b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/datasource/network/SmartHomeApi.kt
new file mode 100644
index 00000000..a18e05bb
--- /dev/null
+++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/datasource/network/SmartHomeApi.kt
@@ -0,0 +1,30 @@
+package com.meta.pixelandtexel.scanner.datasource.network
+
+import com.meta.pixelandtexel.scanner.android.datasource.dto.DeviceListResponseDto
+import com.meta.pixelandtexel.scanner.android.datasource.dto.ThingsResponseDto
+import retrofit2.Response
+import retrofit2.http.Body
+import retrofit2.http.GET
+import retrofit2.http.POST
+import retrofit2.http.Path
+
+interface SmartHomeApi {
+
+    @POST("services/{domain}/{action}")
+    suspend fun postActionToDeviceDomain(
+        @Path("domain") domain: String,
+        @Path("action") action: String,
+        @Body body: Map<String, Any>
+    ): Response<List<ThingsResponseDto>>
+
+    @POST("template")
+    suspend fun postTemplate(
+        @Body body: Map<String, String>
+    ): Response<DeviceListResponseDto>
+
+    @GET("states/{entity_id}")
+    suspend fun getEntityState(
+        @Path("entity_id") entityId: String
+    ): Response<ThingsResponseDto>
+}
\ No newline at end of file
diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/ecs/OutlinedSystem.kt b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/ecs/OutlinedSystem.kt
new file mode 100644
index 00000000..2c8aec5f
--- /dev/null
+++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/ecs/OutlinedSystem.kt
@@ -0,0 +1,158 @@
+// (c) Meta Platforms, Inc. and affiliates. Confidential and proprietary.
+
+package com.meta.pixelandtexel.scanner.ecs
+
+import com.meta.pixelandtexel.scanner.Outlined
+import com.meta.pixelandtexel.scanner.R
+import com.meta.spatial.core.Entity
+import com.meta.spatial.core.Query
+import com.meta.spatial.core.SystemBase
+import com.meta.spatial.core.Vector2
+import com.meta.spatial.core.Vector3
+import com.meta.spatial.core.Vector4
+import com.meta.spatial.runtime.BlendMode
+import com.meta.spatial.runtime.DepthTest
+import com.meta.spatial.runtime.SceneMaterial
+import com.meta.spatial.runtime.SceneMaterialAttribute
+import com.meta.spatial.runtime.SceneMaterialDataType
+import com.meta.spatial.runtime.SceneMesh
+import com.meta.spatial.runtime.SceneTexture
+import com.meta.spatial.runtime.SortOrder
+import com.meta.spatial.runtime.StereoMode
+import com.meta.spatial.toolkit.AppSystemActivity
+import com.meta.spatial.toolkit.Scale
+import com.meta.spatial.toolkit.SceneObjectSystem
+import com.meta.spatial.uiset.theme.SpatialColor
+
+/**
+ * System that creates and manages 9-slice outlines for entities. Observes entities with an
+ * [Outlined] component and generates a quad mesh with a custom 9-slice material to render an
+ * outline around them.
+ *
+ * @param activity The current [AppSystemActivity] context.
+ */
+class OutlinedSystem(activity: AppSystemActivity) : SystemBase() {
+    companion object {
+        private const val TAG: String = "OutlinedSystem"
+
+        // material params for the 9-slice shader and texture
+
+        // the size of the input texture
+        private val SLICE_TEX_SIZE = Vector2(96f, 96f)
+
+        // the slice size of the image – left, top, right, bottom
+        private val SLICE_SIZE = Vector4(32f, 32f, 32f, 32f)
+
+        // the pixels per unit multiplier – increasing this scales the outline width
+        private const val PPU_MULTIPLIER = 1.6f
+    }
+
+    // key is the entity id
+    private var outlinedObjects = HashMap<Long, OutlinedObjectInfo>()
+    private val outlineDrawable = activity.getDrawable(R.drawable.rounded_box_outline)!!
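+
+    // Worked example of the 9-slice inputs (illustrative values, not from the sample): for a
+    // tracked object whose outline quad measures 0.5 x 0.3 scene units, the material receives
+    //   sliceParams = Vector4(0.5f, 0.3f, 96f, 96f)  // quad width/height, texture width/height
+    //   sliceSize   = Vector4(32f, 32f, 32f, 32f)    // left, top, right, bottom insets in px
+    // so the 32 px corners of the 96 px texture stay unstretched at any quad size, and only
+    // the interior of the outline image is scaled.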
+
+    /**
+     * Queries for entities with an [Outlined] component, and for each such entity, creates and
+     * applies an outline mesh and material.
+     */
+    override fun execute() {
+        val query = Query.where { changed(Outlined.id) }
+        for (entity in query.eval()) {
+            val completable =
+                systemManager.findSystem<SceneObjectSystem>().getSceneObject(entity)
+
+            completable?.thenAccept {
+                val outlinedComponent = entity.getComponent<Outlined>()
+                val scale = Vector3(outlinedComponent.size.x, outlinedComponent.size.y, 1f)
+
+                // setup our mesh and material
+
+                val material = createNewOutlineMaterial(scale.x, scale.y)
+
+                val quadMesh =
+                    SceneMesh.quad(Vector3(-0.5f, -0.5f, 0f), Vector3(0.5f, 0.5f, 0f), material)
+                it.setSceneMesh(quadMesh, "trackedObjectQuad")
+
+                entity.setComponent(Scale(scale))
+
+                // add to our map of outlined objects
+
+                outlinedObjects[entity.id] =
+                    OutlinedObjectInfo(
+                        entity,
+                        material,
+                    )
+            }
+        }
+    }
+
+    /**
+     * Called when an entity is being deleted from the system. Removes the outline information
+     * associated with the entity and destroys its outline material to free up resources.
+     *
+     * @param entity The entity being deleted.
+     */
+    override fun delete(entity: Entity) {
+        super.delete(entity)
+
+        val removed = outlinedObjects.remove(entity.id)
+        removed?.outlineMaterial?.destroy()
+    }
+
+    /**
+     * Creates a new [SceneMaterial] configured for rendering a 9-slice outline.
+     *
+     * @param width The width of the quad on which the outline will be rendered.
+     * @param height The height of the quad on which the outline will be rendered.
+     * @return A new [SceneMaterial] instance for the outline.
+     */
+    private fun createNewOutlineMaterial(width: Float, height: Float): SceneMaterial {
+        return SceneMaterial.custom(
+            "9slice",
+            arrayOf(
+                SceneMaterialAttribute("sliceTex", SceneMaterialDataType.Texture2D),
+                SceneMaterialAttribute("sliceParams", SceneMaterialDataType.Vector4),
+                SceneMaterialAttribute("sliceSize", SceneMaterialDataType.Vector4),
+                SceneMaterialAttribute("tintColor", SceneMaterialDataType.Vector4),
+                SceneMaterialAttribute("stereoParams", SceneMaterialDataType.Vector4),
+            ),
+        )
+            .apply {
+                setBlendMode(BlendMode.TRANSLUCENT)
+                setSortOrder(SortOrder.TRANSLUCENT)
+                setDepthTest(DepthTest.ALWAYS)
+                setStereoMode(StereoMode.None)
+
+                setAttribute("sliceTex", SceneTexture(outlineDrawable))
+
+                // x = quad width, y = quad height, z = texture width, w = texture height
+                setAttribute(
+                    "sliceParams",
+                    Vector4(width, height, SLICE_TEX_SIZE.x, SLICE_TEX_SIZE.y)
+                )
+                // slice size in order: left, top, right, bottom
+                setAttribute("sliceSize", SLICE_SIZE)
+
+                // rgb, a = pixels per unit multiplier
+                setAttribute(
+                    "tintColor",
+                    Vector4(
+                        SpatialColor.b70.red,
+                        SpatialColor.b70.green,
+                        SpatialColor.b70.blue,
+                        PPU_MULTIPLIER,
+                    ),
+                )
+            }
+    }
+
+    /**
+     * Data class to hold information about an outlined entity.
+     *
+     * @property outlineEntity The [Entity] that has an outline.
+     * @property outlineMaterial The [SceneMaterial] used for rendering the entity's outline.
+     */
+    private data class OutlinedObjectInfo(
+        val outlineEntity: Entity,
+        val outlineMaterial: SceneMaterial,
+    )
+}
diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/ecs/WristAttachedSystem.kt b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/ecs/WristAttachedSystem.kt
new file mode 100644
index 00000000..8d82165e
--- /dev/null
+++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/ecs/WristAttachedSystem.kt
@@ -0,0 +1,136 @@
+// (c) Meta Platforms, Inc. and affiliates. Confidential and proprietary.
+
+package com.meta.pixelandtexel.scanner.ecs
+
+import com.meta.pixelandtexel.scanner.HandSide
+import com.meta.pixelandtexel.scanner.WristAttached
+import com.meta.pixelandtexel.scanner.utils.MathUtils.fromSequentialPYR
+import com.meta.spatial.core.Entity
+import com.meta.spatial.core.Pose
+import com.meta.spatial.core.Quaternion
+import com.meta.spatial.core.Query
+import com.meta.spatial.core.SystemBase
+import com.meta.spatial.core.Vector3
+import com.meta.spatial.toolkit.AvatarBody
+import com.meta.spatial.toolkit.Transform
+import com.meta.spatial.toolkit.Visible
+
+/**
+ * Manages entities that are designated to be attached to the user's wrists, responsible for
+ * updating the position, rotation, and visibility of such entities based on the user's hand and
+ * head movements.
+ */
+class WristAttachedSystem : SystemBase() {
+    companion object {
+        private const val TAG: String = "WristAttachedSystem"
+    }
+
+    private val wristAttachedEntities = mutableListOf<Entity>()
+
+    /**
+     * Finds new wrist-attached entities, retrieves the current transforms of the player's head and
+     * hands, and then updates the pose and visibility of each tracked wrist-attached entity.
+     * Visibility is determined by whether the entity (and by extension, the user's palm) is facing
+     * towards the user's head, and if the head is looking at the palm.
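+     *
+     * The facing checks below compare dot products against fixed thresholds: the head counts as
+     * looking at the hand when the head-forward and head-to-hand vectors agree to within roughly
+     * 30 degrees (dot > 0.85f), while the palm only needs to face the head loosely (dot > 0.4f).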
+     */
+    override fun execute() {
+        findNewEntities()
+
+        // get our head and hands/controllers transforms
+
+        val playerBody = getAvatarBody()
+        if (
+            !playerBody.head.hasComponent<Transform>() ||
+                !playerBody.leftHand.hasComponent<Transform>() ||
+                !playerBody.rightHand.hasComponent<Transform>()
+        ) {
+            // Failed to find transform components on avatar body parts; controllers may be
+            // disconnected and hands out of view
+            return
+        }
+
+        val headTransform = playerBody.head.getComponent<Transform>()
+        val leftHandTransform = playerBody.leftHand.getComponent<Transform>()
+        val rightHandTransform = playerBody.rightHand.getComponent<Transform>()
+
+        // now process existing entities
+
+        for (entity in wristAttachedEntities) {
+            val comp = entity.getComponent<WristAttached>()
+
+            val handTransform =
+                when (comp.side) {
+                    HandSide.LEFT -> leftHandTransform
+                    HandSide.RIGHT -> rightHandTransform
+                }
+
+            // calculate the new pose for the attached entity
+
+            val quatOffset =
+                Quaternion.fromSequentialPYR(comp.rotation.x, comp.rotation.y, comp.rotation.z)
+            val rotation = handTransform.transform.q.times(quatOffset)
+
+            // use the offset rotation as our basis orientation for translation
+            val position = handTransform.transform.t + rotation.times(comp.position)
+
+            val pose = Pose(position, if (comp.faceUser) headTransform.transform.q else rotation)
+            entity.setComponent(Transform(pose))
+
+            // hide the entity if the palm isn't facing the user's head
+
+            val vHeadFwd = headTransform.transform.forward()
+            val vAnchorFwd = rotation.times(Vector3.Forward)
+            val vHeadToAnchor = (position - headTransform.transform.t).normalize()
+
+            val lookingAtHand = vHeadFwd.dot(vHeadToAnchor) > 0.85f
+            val handFacingHead = vAnchorFwd.dot(vHeadToAnchor) > 0.4f
+            entity.setComponent(Visible(lookingAtHand && handFacingHead))
+        }
+    }
+
+    /**
+     * Handles the deletion of an entity from the system, removing the entity from the internal
+     * list of tracked wrist-attached entities.
+     *
+     * @param entity The entity to be deleted.
+     */
+    override fun delete(entity: Entity) {
+        super.delete(entity)
+
+        wristAttachedEntities.remove(entity)
+    }
+
+    /**
+     * Finds new entities that should be managed by this system, querying for local entities that
+     * have both [WristAttached] and [Transform] components, and adds them to the
+     * [wristAttachedEntities] list.
+     */
+    private fun findNewEntities() {
+        val query =
+            Query.where { has(WristAttached.id, Transform.id) and changed(WristAttached.id) }
+        for (entity in query.eval()) {
+            if (wristAttachedEntities.contains(entity)) {
+                continue
+            }
+
+            if (!entity.isLocal()) {
+                continue
+            }
+
+            wristAttachedEntities.add(entity)
+        }
+    }
+
+    /**
+     * Retrieves the [AvatarBody] component for the local, player-controlled avatar.
+     *
+     * @return The [AvatarBody] component of the player's avatar.
+     * @throws NoSuchElementException if no local, player-controlled avatar body is found.
+     */
+    private fun getAvatarBody(): AvatarBody {
+        return Query.where { has(AvatarBody.id) }
+            .eval()
+            .filter { it.isLocal() && it.getComponent<AvatarBody>().isPlayerControlled }
+            .first()
+            .getComponent<AvatarBody>()
+    }
+}
diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/mrukraycasting/MRUKSidePanelRaycasterFeature.kt b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/mrukraycasting/MRUKSidePanelRaycasterFeature.kt
new file mode 100644
index 00000000..f5f4daeb
--- /dev/null
+++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/mrukraycasting/MRUKSidePanelRaycasterFeature.kt
@@ -0,0 +1,170 @@
+package com.meta.pixelandtexel.scanner.feature.mrukraycasting
+
+import android.os.Bundle
+import com.meta.pixelandtexel.scanner.DiApplication
+import com.meta.pixelandtexel.scanner.FollowHead
+import com.meta.pixelandtexel.scanner.R
+import com.meta.pixelandtexel.scanner.android.domain.repository.SmartHomeRepository
+import com.meta.pixelandtexel.scanner.android.views.smarthome.dynamic_smart_thing.DynamicSmartThingScreen
+import com.meta.pixelandtexel.scanner.android.views.smarthome.dynamic_smart_thing.DynamicSmartThingViewmodel
+import com.meta.pixelandtexel.scanner.feature.mrukraycasting.domain.model.MrukRaycastModel
+import com.meta.pixelandtexel.scanner.feature.mrukraycasting.domain.repository.IMRUKObjectsRepository
+import com.meta.pixelandtexel.scanner.feature.mrukraycasting.system.FollowHeadSystem
+import com.meta.pixelandtexel.scanner.models.devices.Device
+import com.meta.spatial.compose.composePanel
+import com.meta.spatial.core.ComponentRegistration
+import com.meta.spatial.core.Pose
+import com.meta.spatial.core.SendRate
+import com.meta.spatial.core.SpatialFeature
+import com.meta.spatial.core.SystemBase
+import com.meta.spatial.runtime.LayerConfig
+import com.meta.spatial.runtime.PanelShapeLayerBlendType
+import com.meta.spatial.toolkit.AppSystemActivity
+import com.meta.spatial.toolkit.PanelRegistration
+import kotlinx.coroutines.CoroutineScope
+import kotlinx.coroutines.Dispatchers
+import kotlinx.coroutines.cancel
+import kotlinx.coroutines.launch
+import org.koin.android.ext.android.get
+
+/**
+ * A Spatial SDK Feature which anchors interactive panels for paired smart devices at positions in
+ * the user's room, spawning a panel per device and persisting each pose so the panels can be
+ * restored in place across sessions.
+ */
+class MRUKSidePanelRaycasterFeature(
+    private val activity: AppSystemActivity,
+) : SpatialFeature {
+    companion object {
+        private const val TAG = "MRUKSidePanelRaycasterFeature"
+    }
+
+    private val subscriptionScope = CoroutineScope(Dispatchers.Main)
+
+    private var di: DiApplication = activity.application as DiApplication
+    private val mrukObjectRepository: IMRUKObjectsRepository
+    private val smartHomeRepository: SmartHomeRepository
+
+    init {
+        mrukObjectRepository = di.get()
+        smartHomeRepository = di.get()
+    }
+
+    override fun onCreate(savedInstanceState: Bundle?) {
+        activity.registerPanel(
+            PanelRegistration(R.integer.object_panel_id) {
+                config {
+                    themeResourceId = R.style.PanelAppThemeTransparent
+                    includeGlass = false
+                    layoutWidthInDp = 632f
+                    width = 0.632f
+                    height = 0.644f
+                    layerConfig = LayerConfig()
+                    layerBlendType = PanelShapeLayerBlendType.MASKED
+                    enableLayerFeatheredEdge = true
+                }
+                composePanel {
+                    try {
+                        val device =
+                            mrukObjectRepository.lastAddedObjectDevice ?: return@composePanel
+                        mrukObjectRepository.lastAddedObjectDevice = null
+                        setContent {
+                            val viewmodel = DynamicSmartThingViewmodel(
+                                di.get(),
+                                di.get(),
+                                onCloseSmartThing = { removeSmartThing(device.name) },
+                                onDisconnectDevice = { disconnectSmartThing(device.name) })
+                            DynamicSmartThingScreen(
+                                device = device,
+                                viewModel = viewmodel,
+                            )
+                        }
+                    } finally {
+                        subscriptionScope.launch {
+                            mrukObjectRepository.unlock()
+                        }
+                    }
+                }
+            },
+        )
+    }
+
+    suspend fun addSmartThing(device: Device, spawnPose: Pose) {
+        mrukObjectRepository.addMRUKObject(
+            MrukRaycastModel(
+                device = device,
+                pose = spawnPose
+            )
+        )
+    }
+
+    fun removeSmartThing(deviceName: String) {
+        CoroutineScope(Dispatchers.IO).launch {
+            mrukObjectRepository.deleteMRUKObject(deviceName)
+        }
+    }
+
+    fun disconnectSmartThing(deviceName: String) {
+        CoroutineScope(Dispatchers.IO).launch {
+            mrukObjectRepository.deleteFromDatabase(deviceName)
+        }
+    }
+
+    suspend fun getAllSmartThings() {
+        val mrukObjects = mrukObjectRepository.getAllMRUKObjects()
+        val devices = smartHomeRepository.getDevices()
+        mrukObjects.forEach { mrukObject ->
+            val alreadyExists =
+                mrukObjectRepository.mrukEntities.containsKey(mrukObject.device.name)
+            if (alreadyExists) return@forEach
+
+            val device = devices.find { it.name == mrukObject.device.name }
+            if (device != null) {
+                mrukObjectRepository.addMRUKObject(
+                    MrukRaycastModel(
+                        device = device,
+                        pose = mrukObject.pose
+                    )
+                )
+            }
+        }
+    }
+
+    suspend fun deleteAllSmartThingEntities() {
+        mrukObjectRepository.deleteAllMRUKObjects()
+    }
+
+    override fun systemsToRegister(): List<SystemBase> {
+        val systems = mutableListOf<SystemBase>()
+        systems.add(FollowHeadSystem())
+        return systems
+    }
+
+    override fun componentsToRegister(): List<ComponentRegistration> {
+        return listOf(
+            ComponentRegistration.createConfig(
+                FollowHead.Companion,
+                SendRate.DEFAULT,
+            ),
+        )
+    }
+
+    override fun onSceneReady() {}
+
+    override fun onPauseActivity() {
+        super.onPauseActivity()
+    }
+
+    override fun onDestroy() {
+        subscriptionScope.cancel()
+        super.onDestroy()
+    }
+}
diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/mrukraycasting/datasource/MRUKObjectsRepositoryImpl.kt b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/mrukraycasting/datasource/MRUKObjectsRepositoryImpl.kt
new file mode 100644
index 00000000..f8b84bae
--- /dev/null
+++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/mrukraycasting/datasource/MRUKObjectsRepositoryImpl.kt
@@ -0,0 +1,115 @@
+package com.meta.pixelandtexel.scanner.feature.mrukraycasting.datasource
+
+import android.net.Uri
+import com.meta.pixelandtexel.scanner.FollowHead
+import com.meta.pixelandtexel.scanner.feature.mrukraycasting.domain.model.MrukRaycastModel
+import com.meta.pixelandtexel.scanner.feature.mrukraycasting.domain.model.ObjectEntityModel
+import com.meta.pixelandtexel.scanner.feature.mrukraycasting.domain.repository.IMRUKObjectsRepository
+import com.meta.pixelandtexel.scanner.models.devices.Device
+import com.meta.pixelandtexel.scanner.feature.mrukraycasting.datasource.local.MrukLocalDatasource
+import com.meta.spatial.core.Entity
+import com.meta.spatial.core.Pose
+import com.meta.spatial.core.Vector3
+import com.meta.spatial.toolkit.Box
+import com.meta.spatial.toolkit.Grabbable
+import com.meta.spatial.toolkit.GrabbableType
+import com.meta.spatial.toolkit.Mesh
+import com.meta.spatial.toolkit.Transform
+import com.meta.spatial.toolkit.Visible
+import com.meta.spatial.toolkit.createPanelEntity
+import com.meta.pixelandtexel.scanner.R
+import kotlinx.coroutines.Dispatchers
+import kotlinx.coroutines.coroutineScope
+import kotlinx.coroutines.launch
+import kotlinx.coroutines.sync.Mutex
+import kotlinx.coroutines.withContext
+
+class MRUKObjectsRepositoryImpl(
+    private val localDatasource: MrukLocalDatasource,
+) : IMRUKObjectsRepository {
+
+    override val mutex: Mutex = Mutex()
+    override var lastAddedObjectDevice: Device? = null
+
+    override val mrukEntities: HashMap<String, ObjectEntityModel> = HashMap()
+
+    override suspend fun unlock() {
+        mutex.unlock()
+    }
+
+    override suspend fun addMRUKObject(addObject: MrukRaycastModel): Boolean {
+        if (!mrukEntities.containsKey(addObject.device.name)) {
+            localDatasource.save(addObject)
+
+            val newMeshPose = Pose(addObject.pose.t, addObject.pose.q)
+
+            val boxEntity = Entity.create(
+                listOf(
+                    Mesh(Uri.parse("mesh://box")),
+                    Box(Vector3(.1f, .1f, 0.1f)),
+                    Transform(newMeshPose),
+                    Visible(true)
+                )
+            )
+
+            coroutineScope {
+                launch(Dispatchers.Default) {
+                    mutex.lock()
+                    lastAddedObjectDevice = addObject.device
+
+                    val panelEntity = Entity.createPanelEntity(
+                        R.integer.object_panel_id,
+                        Transform(newMeshPose * Pose(Vector3(0f, 0.5f, 0f))),
+                        Grabbable(type = GrabbableType.PIVOT_Y),
+                        FollowHead(true),
+                    )
+
+                    val objectEntity = ObjectEntityModel(
+                        objectEntity = boxEntity,
+                        panelEntity = panelEntity
+                    )
+                    mrukEntities[addObject.device.name] = objectEntity
+                }
+            }
+
+            return true
+        } else {
+            return false
+        }
+    }
+
+    override suspend fun deleteFromDatabase(objectId: String): Boolean {
+        val deleteMRUKObject = deleteMRUKObject(objectId)
+        localDatasource.delete(objectId)
+        return deleteMRUKObject
+    }
+
+    override suspend fun deleteMRUKObject(objectId: String): Boolean {
+        return if (mrukEntities.containsKey(objectId)) {
+            val objectEntityModel = mrukEntities[objectId]
+            objectEntityModel?.objectEntity?.destroy()
+            objectEntityModel?.panelEntity?.destroy()
+            mrukEntities.remove(objectId)
+
+            true
+        } else {
+            false
+        }
+    }
+
+    override suspend fun deleteAllMRUKObjects(): Boolean {
+        mrukEntities.forEach { (_, objectEntityModel) ->
+            objectEntityModel.objectEntity.destroy()
+            objectEntityModel.panelEntity.destroy()
+        }
+        mrukEntities.clear()
+        return true
+    }
+
+    override suspend fun getAllMRUKObjects(): List<MrukRaycastModel> {
+        return withContext(Dispatchers.IO) {
+            localDatasource.getAll()
+        }
+    }
+}
\ No newline at end of file
diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/mrukraycasting/datasource/local/DBModule.kt b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/mrukraycasting/datasource/local/DBModule.kt
new file mode 100644
index 00000000..dda17abb
--- /dev/null
+++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/mrukraycasting/datasource/local/DBModule.kt
@@ -0,0 +1,19 @@
+package com.meta.pixelandtexel.scanner.feature.mrukraycasting.datasource.local
+
+import androidx.room.Room.databaseBuilder
+import com.meta.pixelandtexel.scanner.R
+import org.koin.android.ext.koin.androidContext
+import org.koin.dsl.module
+
+val dbModule = module {
+    single {
+        databaseBuilder(
+            androidContext(),
+            MrukDatabase::class.java,
+            androidContext().getString(R.string.db_name)
+        ).build()
+    }
+
+    single { get<MrukDatabase>().mrukDao() }
+    single { MrukLocalDatasource(get()) }
+}
\ No newline at end of file
diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/mrukraycasting/datasource/local/MrukDao.kt b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/mrukraycasting/datasource/local/MrukDao.kt
new file mode 100644
index 00000000..37dbf4ab
--- /dev/null
+++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/mrukraycasting/datasource/local/MrukDao.kt
@@ -0,0 +1,26 @@
+package com.meta.pixelandtexel.scanner.feature.mrukraycasting.datasource.local
+
+import androidx.room.Dao
+import androidx.room.Insert
+import androidx.room.OnConflictStrategy
+import androidx.room.Query
+import androidx.room.Delete
+
+@Dao
+interface MrukDao {
+    @Insert(onConflict = OnConflictStrategy.REPLACE)
+    suspend fun insert(entity: MrukEntity)
+
+    @Query("SELECT * FROM mruk_objects WHERE id = :id")
+    suspend fun getById(id: String): MrukEntity?
+
+    @Query("SELECT * FROM mruk_objects")
+    suspend fun getAll(): List<MrukEntity>
+
+    @Delete
+    suspend fun delete(entity: MrukEntity)
+
+    @Query("DELETE FROM mruk_objects WHERE id = :id")
+    suspend fun deleteById(id: String)
+}
diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/mrukraycasting/datasource/local/MrukDatabase.kt b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/mrukraycasting/datasource/local/MrukDatabase.kt
new file mode 100644
index 00000000..2a797782
--- /dev/null
+++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/mrukraycasting/datasource/local/MrukDatabase.kt
@@ -0,0 +1,10 @@
+package com.meta.pixelandtexel.scanner.feature.mrukraycasting.datasource.local
+
+import androidx.room.Database
+import androidx.room.RoomDatabase
+
+@Database(entities = [MrukEntity::class], version = 1)
+abstract class MrukDatabase : RoomDatabase() {
+    abstract fun mrukDao(): MrukDao
+}
diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/mrukraycasting/datasource/local/MrukEntity.kt b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/mrukraycasting/datasource/local/MrukEntity.kt
new file mode 100644
index 00000000..bd0b5557
--- /dev/null
+++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/mrukraycasting/datasource/local/MrukEntity.kt
@@ -0,0 +1,22 @@
+package com.meta.pixelandtexel.scanner.feature.mrukraycasting.datasource.local
+
+import androidx.room.ColumnInfo
+import androidx.room.Entity
+import androidx.room.PrimaryKey
+
+@Entity(tableName = "mruk_objects")
+data class MrukEntity(
+    @PrimaryKey
+    @ColumnInfo(name = "id")
+    val id: String,
+
+    @ColumnInfo(name = "q_w") val q_w: Float,
+    @ColumnInfo(name = "q_x") val q_x: Float,
+    @ColumnInfo(name = "q_y") val q_y: Float,
+    @ColumnInfo(name = "q_z") val q_z: Float,
+
+    @ColumnInfo(name = "v_x") val v_x: Float,
+    @ColumnInfo(name = "v_y") val v_y: Float,
+    @ColumnInfo(name = "v_z") val v_z: Float,
+)
diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/mrukraycasting/datasource/local/MrukLocalDatasource.kt b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/mrukraycasting/datasource/local/MrukLocalDatasource.kt
new file mode 100644
index 00000000..18af5f1c
--- /dev/null
+++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/mrukraycasting/datasource/local/MrukLocalDatasource.kt
@@ -0,0 +1,40 @@
+package com.meta.pixelandtexel.scanner.feature.mrukraycasting.datasource.local
+
+import com.meta.pixelandtexel.scanner.feature.mrukraycasting.domain.model.MrukRaycastModel
+import com.meta.pixelandtexel.scanner.models.devices.Device
+import com.meta.spatial.core.Pose
+import com.meta.spatial.core.Quaternion
+import com.meta.spatial.core.Vector3
+
+class MrukLocalDatasource(private val dao: MrukDao) {
+    suspend fun save(model: MrukRaycastModel) {
+        val pose = model.pose
+        val q = pose.q
+        val t = pose.t
+
+        val entity = MrukEntity(
+            id = model.device.name,
+            q_w = q.w,
+            q_x = q.x,
+            q_y = q.y,
+            q_z = q.z,
+            v_x = t.x,
+            v_y = t.y,
+            v_z = t.z
+        )
+
+        dao.insert(entity)
+    }
+
+    suspend fun getAll(): List<MrukRaycastModel> {
+        return dao.getAll().map { e ->
+            val q = Quaternion(e.q_w, e.q_x, e.q_y, e.q_z)
+            val t = Vector3(e.v_x, e.v_y, e.v_z)
+            MrukRaycastModel(Device(e.id, emptyList()), Pose(t, q))
+        }
+    }
+
+    suspend fun delete(id: String) {
+        dao.deleteById(id)
+    }
+}
diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/mrukraycasting/domain/model/MrukRaycastModel.kt b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/mrukraycasting/domain/model/MrukRaycastModel.kt
new file mode 100644
index 00000000..4a42a0d5
--- /dev/null
+++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/mrukraycasting/domain/model/MrukRaycastModel.kt
@@ -0,0 +1,9 @@
+package com.meta.pixelandtexel.scanner.feature.mrukraycasting.domain.model
+
+import com.meta.pixelandtexel.scanner.models.devices.Device
+import com.meta.spatial.core.Pose
+
+data class MrukRaycastModel(
+    val device: Device,
+    val pose: Pose
+)
\ No newline at end of file
diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/mrukraycasting/domain/model/ObjectEntityModel.kt b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/mrukraycasting/domain/model/ObjectEntityModel.kt
new file mode 100644
index 00000000..47c32edf
--- /dev/null
+++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/mrukraycasting/domain/model/ObjectEntityModel.kt
@@ -0,0 +1,8 @@
+package com.meta.pixelandtexel.scanner.feature.mrukraycasting.domain.model
+
+import com.meta.spatial.core.Entity
+
+data class ObjectEntityModel(
+    val objectEntity: Entity,
+    val panelEntity: Entity
+)
\ No newline at end of file
diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/mrukraycasting/domain/repository/IMRUKObjectsRepository.kt b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/mrukraycasting/domain/repository/IMRUKObjectsRepository.kt
new file mode 100644
index 00000000..9fb3f4f6
--- /dev/null
+++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/mrukraycasting/domain/repository/IMRUKObjectsRepository.kt
@@ -0,0 +1,18 @@
+package com.meta.pixelandtexel.scanner.feature.mrukraycasting.domain.repository
+
+import com.meta.pixelandtexel.scanner.feature.mrukraycasting.domain.model.MrukRaycastModel
+import com.meta.pixelandtexel.scanner.feature.mrukraycasting.domain.model.ObjectEntityModel
+import com.meta.pixelandtexel.scanner.models.devices.Device
+import kotlinx.coroutines.sync.Mutex
+
+interface IMRUKObjectsRepository {
+    var lastAddedObjectDevice: Device?
+    val mrukEntities: HashMap<String, ObjectEntityModel>
+    val mutex: Mutex
+    suspend fun addMRUKObject(addObject: MrukRaycastModel): Boolean
+    suspend fun deleteFromDatabase(objectId: String): Boolean
+    suspend fun deleteMRUKObject(objectId: String): Boolean
+    suspend fun deleteAllMRUKObjects(): Boolean
+    suspend fun getAllMRUKObjects(): List<MrukRaycastModel>
+    suspend fun unlock()
+}
\ No newline at end of file
diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/mrukraycasting/system/FollowHeadSystem.kt b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/mrukraycasting/system/FollowHeadSystem.kt
new file mode 100644
index 00000000..ea14e229
--- /dev/null
+++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/mrukraycasting/system/FollowHeadSystem.kt
@@ -0,0 +1,34 @@
+package com.meta.pixelandtexel.scanner.feature.mrukraycasting.system
+
+import com.meta.pixelandtexel.scanner.FollowHead
+import com.meta.pixelandtexel.scanner.RotationMode
+import com.meta.spatial.core.Pose
+import com.meta.spatial.core.Quaternion
+import com.meta.spatial.core.Query
+import com.meta.spatial.core.SystemBase
+import com.meta.spatial.toolkit.Transform
+
+class FollowHeadSystem : SystemBase() {
+    override fun execute() {
+        val q = Query.where { has(FollowHead.id, Transform.id) }
+
+        val headPose: Pose = getScene().getViewerPose()
+
+        for (entity in q.eval()) {
+            val targetPose = headPose
+
+            val eyePose = entity.getComponent<Transform>().transform
+            val followHead = entity.getComponent<FollowHead>()
+
+            val rotationOfEntity = when (followHead.rotationMode) {
+                RotationMode.FULL -> Quaternion.lookRotation((eyePose.t - targetPose.t))
+                RotationMode.Y_ROTATION -> Quaternion.lookRotationAroundY((eyePose.t - targetPose.t))
+            }
+
+            eyePose.q = rotationOfEntity
+
+            entity.setComponent(Transform(eyePose))
+        }
+    }
+}
\ No newline at end of file
diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/objectdetection/ObjectDetectionFeature.kt b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/objectdetection/ObjectDetectionFeature.kt
new file mode 100644
index 00000000..24830bde
--- /dev/null
+++
b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/objectdetection/ObjectDetectionFeature.kt @@ -0,0 +1,363 @@ +package com.meta.pixelandtexel.scanner.feature.objectdetection + +import android.animation.Keyframe +import android.animation.ObjectAnimator +import android.animation.PropertyValuesHolder +import android.os.Bundle +import android.view.View +import android.view.View.GONE +import android.widget.TextView +import com.meta.pixelandtexel.scanner.DiApplication +import com.meta.pixelandtexel.scanner.R +import com.meta.pixelandtexel.scanner.TrackedObject +import com.meta.pixelandtexel.scanner.ViewLocked +import com.meta.pixelandtexel.scanner.feature.objectdetection.domain.repository.display.IDisplayedEntityRepository +import com.meta.pixelandtexel.scanner.feature.objectdetection.domain.system.TrackedObjectSystem +import com.meta.pixelandtexel.scanner.feature.objectdetection.domain.system.ViewLockedSystem +import com.meta.pixelandtexel.scanner.feature.objectdetection.domain.camera.CameraController +import com.meta.pixelandtexel.scanner.feature.objectdetection.domain.camera.enums.CameraStatus +import com.meta.pixelandtexel.scanner.feature.objectdetection.domain.camera.models.CameraProperties +import com.meta.pixelandtexel.scanner.feature.objectdetection.android.views.android.CameraPreview +import com.meta.pixelandtexel.scanner.feature.objectdetection.android.views.android.GraphicOverlay +import com.meta.pixelandtexel.scanner.feature.objectdetection.android.views.android.ISurfaceProvider +import com.meta.pixelandtexel.scanner.feature.objectdetection.domain.repository.detection.IObjectDetectionRepository +import com.meta.spatial.core.ComponentRegistration +import com.meta.spatial.core.Entity +import com.meta.spatial.core.SendRate +import com.meta.spatial.core.SpatialFeature +import com.meta.spatial.core.SystemBase +import com.meta.spatial.core.Vector3 +import com.meta.spatial.runtime.LayerConfig +import com.meta.spatial.runtime.PanelConfigOptions +import com.meta.spatial.runtime.PanelShapeLayerBlendType +import com.meta.spatial.toolkit.AppSystemActivity +import com.meta.spatial.toolkit.Hittable +import com.meta.spatial.toolkit.MeshCollision +import com.meta.spatial.toolkit.PanelRegistration +import com.meta.spatial.toolkit.Transform +import com.meta.spatial.toolkit.createPanelEntity +import kotlinx.coroutines.CoroutineScope +import kotlinx.coroutines.Dispatchers +import kotlinx.coroutines.cancel +import kotlinx.coroutines.delay +import kotlinx.coroutines.launch +import org.koin.android.ext.android.get + +/** + * A Spatial SDK Feature which uses the device camera feed and a CV object detection model to + * discover objects in the user's surroundings, assign them labels and persistent ids, and track + * their position over time. + * + **/ + +class ObjectDetectionFeature( + private val activity: AppSystemActivity, + private val onStatusChanged: ((CameraStatus) -> Unit)? = null, + private val spawnCameraViewPanel: Boolean = false, +) : SpatialFeature { + companion object { + private const val TAG = "ObjectDetectionFeature" + } + + // our core services + private val cameraController: CameraController + + // systems + private lateinit var viewLockedSystem: ViewLockedSystem + private lateinit var trackedObjectSystem: TrackedObjectSystem + + // status ui + private var cameraStatusRootView: View? = null + private var cameraStatusText: TextView? 
= null
+  private var _cameraStatus: CameraStatus = CameraStatus.PAUSED
+  val status: CameraStatus
+    get() = _cameraStatus
+
+  private lateinit var cameraStatusEntity: Entity
+
+  // debug ui
+  private var cameraPreviewView: CameraPreview? = null
+  private var graphicOverlayView: GraphicOverlay? = null
+  private lateinit var cameraViewEntity: Entity
+
+  private val subscriptionScope = CoroutineScope(Dispatchers.Main)
+
+  private var di: DiApplication = activity.application as DiApplication
+  private var displayRepository: IDisplayedEntityRepository
+  private val detectionRepository: IObjectDetectionRepository
+
+  init {
+    cameraController = CameraController(activity)
+    cameraController.onCameraPropertiesChanged += ::onCameraPropertiesChanged
+
+    displayRepository = di.get()
+    detectionRepository = di.get()
+
+    subscriptionScope.launch {
+      detectionRepository.detectionState.collect { state ->
+        if (state == null) return@collect
+        try {
+          // hop onto the main dispatcher; the tracked-object system is only touched there
+          if (state.foundObjects.isNotEmpty()) {
+            CoroutineScope(Dispatchers.Main).launch {
+              trackedObjectSystem.onObjectsFound(state.foundObjects)
+            }
+          }
+          if (state.updatedObjects.isNotEmpty()) {
+            CoroutineScope(Dispatchers.Main).launch {
+              trackedObjectSystem.onObjectsUpdated(state.updatedObjects)
+            }
+          }
+          if (state.lostObjectIds.isNotEmpty()) {
+            CoroutineScope(Dispatchers.Main).launch {
+              trackedObjectSystem.onObjectsLost(state.lostObjectIds)
+            }
+          }
+        } finally {
+          state.finally()
+        }
+      }
+    }
+
+    subscriptionScope.launch {
+      cameraController.imageState.collect {
+        if (it == null) {
+          return@collect
+        }
+
+        detectionRepository.processImage(it.image, it.width, it.height, it.finally)
+      }
+    }
+  }
+
+  override fun onCreate(savedInstanceState: Bundle?) {
+    activity.registerPanel(
+        PanelRegistration(R.layout.ui_camera_view) {
+          // size our panel to the camera's output size
+          val cameraOutputSize = cameraController.cameraOutputSize
+
+          config {
+            themeResourceId = R.style.PanelAppThemeTransparent
+            includeGlass = false
+            layoutWidthInPx = cameraOutputSize.width
+            layoutHeightInPx = cameraOutputSize.height
+            // use the default texel density
+            width = cameraOutputSize.width / (PanelConfigOptions.EYEBUFFER_WIDTH * 0.5f)
+            height = cameraOutputSize.height / (PanelConfigOptions.EYEBUFFER_HEIGHT * 0.5f)
+            layerConfig = LayerConfig()
+            layerBlendType = PanelShapeLayerBlendType.MASKED
+            enableLayerFeatheredEdge = true
+          }
+          panel {
+            cameraPreviewView = rootView?.findViewById(R.id.preview_view)
+            graphicOverlayView = rootView?.findViewById(R.id.graphic_overlay)
+
+            // if the ui view is marked as gone in the view, change it to null so it isn't sent
+            // to the camera controller
+            if (cameraPreviewView?.visibility == GONE) {
+              cameraPreviewView = null
+            }
+
+            // start the camera automatically after initialization
+            this@ObjectDetectionFeature.scan()
+          }
+        }
+    )
+
+    activity.registerPanel(
+        PanelRegistration(R.layout.ui_camera_status_view) {
+          config {
+            themeResourceId = R.style.PanelAppThemeTransparent
+            includeGlass = false
+            layoutWidthInDp = 100f
+            width = 0.1f
+            height = 0.05f
+            layerConfig = LayerConfig()
+            layerBlendType = PanelShapeLayerBlendType.MASKED
+            enableLayerFeatheredEdge = true
+          }
+          panel {
+            cameraStatusRootView = rootView
+            cameraStatusText =
+                rootView?.findViewById(R.id.camera_status)
+                    ?: throw RuntimeException("Missing camera status text view")
+          }
+        }
+    )
+  }
+
+  override fun systemsToRegister(): List<SystemBase> {
+    val systems = mutableListOf<SystemBase>()
+    // setup our systems, and subscribe to relevant events
+
+    viewLockedSystem = ViewLockedSystem()
+    cameraController.onCameraPropertiesChanged += viewLockedSystem::onCameraPropertiesChanged
+    systems.add(viewLockedSystem)
+
+    // only use the trackedObjectSystem to draw the outlines and labels of detected objects if
+    // we aren't displaying the camera debug view
+    if (!spawnCameraViewPanel) {
+      trackedObjectSystem = TrackedObjectSystem(activity, detectionRepository)
+      cameraController.onCameraPropertiesChanged += trackedObjectSystem::onCameraPropertiesChanged
+
+      systems.add(trackedObjectSystem)
+    }
+
+    return systems
+  }
+
+  override fun componentsToRegister(): List<ComponentRegistration> {
+    return listOf(
+        ComponentRegistration.createConfig<ViewLocked>(ViewLocked.Companion, SendRate.DEFAULT),
+        ComponentRegistration.createConfig<TrackedObject>(
+            TrackedObject.Companion,
+            SendRate.DEFAULT,
+        ),
+    )
+  }
+
+  override fun onSceneReady() {
+    // create the camera status panel entity
+    cameraStatusEntity =
+        Entity.createPanelEntity(
+            R.layout.ui_camera_status_view,
+            Transform(),
+            Hittable(MeshCollision.NoCollision),
+            ViewLocked(Vector3(-0.16f, 0.15f, 0.7f), Vector3(0f), false),
+        )
+  }
+
+  /**
+   * Start the device camera, and wait to receive any camera frames for CV analysis. If we haven't
+   * initialized the camera controller yet, first do that, awaiting the onCameraPropertiesChanged
+   * call to resume scanning.
+   */
+  fun scan() {
+    if (cameraController.isInitialized) {
+      cameraController.start(
+          surfaceProviders = listOfNotNull(cameraPreviewView as? ISurfaceProvider),
+      )
+      updateCameraStatus(CameraStatus.SCANNING)
+      return
+    }
+
+    cameraController.initialize()
+  }
+
+  /**
+   * Pause the device camera and any object detection CV logic. Wait for the camera session and
+   * inferencing to end before clearing the last results from the cache and overlay.
+   *
+   * @param immediate Whether to force the immediate clearing of the cache/overlay.
+   */
+  fun pause(immediate: Boolean = false) {
+    if (!cameraController.isInitialized || !cameraController.isRunning) {
+      return
+    }
+
+    cameraController.stop()
+    updateCameraStatus(CameraStatus.PAUSED)
+
+    if (immediate) {
+      detectionRepository.clear()
+      graphicOverlayView?.clear()
+      trackedObjectSystem.clear()
+    } else {
+      // wait for the camera to stop, and then clear the results
+      CoroutineScope(Dispatchers.Main).launch {
+        delay(100L)
+        detectionRepository.clear()
+        graphicOverlayView?.clear()
+        trackedObjectSystem.clear()
+      }
+    }
+  }
+
+  /**
+   * Callback for when the device camera is initialized, and its properties are processed.
+   * Typically only executed once, when the user first starts scanning and accepts the camera
+   * access permissions.
+   *
+   * @param properties The device camera properties, encapsulated in a [CameraProperties].
+   */
+  private fun onCameraPropertiesChanged(properties: CameraProperties) {
+    // start immediately since we aren't spawning an overlay panel
+    if (!spawnCameraViewPanel) {
+      scan()
+      return
+    }
+
+    if (::cameraViewEntity.isInitialized) {
+      return
+    }
+
+    // create our panel here, so we can size it to the camera output
+    val offsetPose = properties.getHeadToCameraPose()
+    cameraViewEntity =
+        Entity.createPanelEntity(
+            R.layout.ui_camera_view,
+            Transform(),
+            Hittable(MeshCollision.NoCollision),
+            ViewLocked(offsetPose.t, offsetPose.q.toEuler(), true),
+        )
+  }
+
+  /**
+   * Updates the camera status label displayed to the user at the top left of their view.
+   *
+   * @param newStatus The new [CameraStatus] status to display
+   */
+  private fun updateCameraStatus(newStatus: CameraStatus) {
+    if (_cameraStatus == newStatus) {
+      return
+    }
+
+    when (newStatus) {
+      CameraStatus.PAUSED -> {
+        cameraStatusText?.setText(R.string.camera_status_off)
+      }
+
+      CameraStatus.SCANNING -> {
+        cameraStatusText?.setText(R.string.camera_status_on)
+      }
+    }
+
+    // play a subtle pulse animation when the status changes
+    cameraStatusRootView?.let {
+      val durationMs = 250L
+      val kf0 = Keyframe.ofFloat(0f, 1f)
+      val kf1 = Keyframe.ofFloat(0.5f, 1.5f)
+      val kf2 = Keyframe.ofFloat(1f, 1f)
+      val pvhScaleX = PropertyValuesHolder.ofKeyframe("scaleX", kf0, kf1, kf2)
+      val pvhScaleY = PropertyValuesHolder.ofKeyframe("scaleY", kf0, kf1, kf2)
+      ObjectAnimator.ofPropertyValuesHolder(it, pvhScaleX, pvhScaleY).apply {
+        duration = durationMs
+        start()
+      }
+    }
+
+    _cameraStatus = newStatus
+    onStatusChanged?.invoke(newStatus)
+  }
+
+  override fun onPauseActivity() {
+    pause()
+    super.onPauseActivity()
+  }
+
+  override fun onDestroy() {
+    pause(true)
+    cameraController.dispose()
+
+    if (::cameraViewEntity.isInitialized) {
+      cameraViewEntity.destroy()
+    }
+    if (::cameraStatusEntity.isInitialized) {
+      cameraStatusEntity.destroy()
+    }
+
+    subscriptionScope.cancel()
+    super.onDestroy()
+  }
+}
diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/objectdetection/android/viewmodels/ObjectLabelViewModel.kt b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/objectdetection/android/viewmodels/ObjectLabelViewModel.kt
new file mode 100644
index 00000000..f02976df
--- /dev/null
+++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/objectdetection/android/viewmodels/ObjectLabelViewModel.kt
@@ -0,0 +1,16 @@
+// (c) Meta Platforms, Inc. and affiliates. Confidential and proprietary.
+
+package com.meta.pixelandtexel.scanner.feature.objectdetection.android.viewmodels
+
+import androidx.compose.runtime.State
+import androidx.compose.runtime.mutableStateOf
+import androidx.lifecycle.ViewModel
+
+class ObjectLabelViewModel(objectName: String = "") : ViewModel() {
+  private val _name = mutableStateOf(objectName)
+  val name: State<String> = _name
+
+  fun updateName(value: String) {
+    _name.value = value.replaceFirstChar { it.uppercaseChar() }
+  }
+}
diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/objectdetection/android/views/ObjectLabelScreen.kt b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/objectdetection/android/views/ObjectLabelScreen.kt
new file mode 100644
index 00000000..351f56b9
--- /dev/null
+++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/objectdetection/android/views/ObjectLabelScreen.kt
@@ -0,0 +1,84 @@
+// (c) Meta Platforms, Inc. and affiliates. Confidential and proprietary.
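A note on `ObjectLabelViewModel` above: its only transformation is capitalizing the first character of whatever label the detector reports. A minimal behavior sketch (illustrative only, not a test that ships with this repo):

```kotlin
// Illustrative: updateName uppercases only the first character.
val vm = ObjectLabelViewModel()
vm.updateName("television")
check(vm.name.value == "Television")
vm.updateName("TV remote")
check(vm.name.value == "TV remote") // first char already uppercase; unchanged
```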
+ +package com.meta.pixelandtexel.scanner.feature.objectdetection.android.views + +import androidx.compose.foundation.background +import androidx.compose.foundation.layout.Arrangement +import androidx.compose.foundation.layout.Box +import androidx.compose.foundation.layout.Column +import androidx.compose.foundation.layout.PaddingValues +import androidx.compose.foundation.layout.fillMaxSize +import androidx.compose.foundation.layout.height +import androidx.compose.foundation.layout.padding +import androidx.compose.foundation.layout.wrapContentWidth +import androidx.compose.foundation.shape.CutCornerShape +import androidx.compose.material3.Button +import androidx.compose.material3.ButtonColors +import androidx.compose.material3.Text +import androidx.compose.runtime.Composable +import androidx.compose.runtime.getValue +import androidx.compose.ui.Alignment +import androidx.compose.ui.Modifier +import androidx.compose.ui.draw.clip +import androidx.compose.ui.graphics.Color +import androidx.compose.ui.tooling.preview.Preview +import androidx.compose.ui.unit.dp +import androidx.lifecycle.viewmodel.compose.viewModel +import com.meta.pixelandtexel.scanner.feature.objectdetection.android.viewmodels.ObjectLabelViewModel +import com.meta.spatial.uiset.theme.SpatialColor +import com.meta.spatial.uiset.theme.SpatialTheme + +/** + * A simple Composable that renders the label or category of a detected object on a panel, which is + * positioned under the outline of the object, wrapped in a button composable to listen for clicks. + * + * @param vm [ObjectLabelViewModel] view model containing the label string to display. + * @param onClick Callback for if the user selects this composable. + */ +@Composable +fun ObjectLabelScreen(vm: ObjectLabelViewModel = viewModel(), onClick: (() -> Unit)? = null) { + val name by vm.name + + SpatialTheme { + Column( + verticalArrangement = Arrangement.Center, + horizontalAlignment = Alignment.CenterHorizontally, + ) { + Button( + onClick = { onClick?.invoke() }, + colors = + ButtonColors( + containerColor = Color.Transparent, + contentColor = Color.Transparent, + disabledContainerColor = Color.Transparent, + disabledContentColor = Color.Transparent, + ), + shape = CutCornerShape(0.dp), + contentPadding = PaddingValues(0.dp), + modifier = Modifier.fillMaxSize(), + ) { + Box( + contentAlignment = Alignment.Center, + modifier = + Modifier + .background( + color = SpatialColor.b70, + shape = SpatialTheme.shapes.large + ) + .clip(SpatialTheme.shapes.large) + .padding(horizontal = 20.dp) + .height(40.dp) + .wrapContentWidth(), + ) { + Text(name, color = SpatialColor.white100, style = SpatialTheme.typography.body1) + } + } + } + } +} + +@Preview(widthDp = 516, heightDp = 414) +@Composable +private fun ObjectLabelScreenPreview() { + ObjectLabelScreen(ObjectLabelViewModel("Name")) +} diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/objectdetection/android/views/android/CameraPreview.kt b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/objectdetection/android/views/android/CameraPreview.kt new file mode 100644 index 00000000..222ca4c2 --- /dev/null +++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/objectdetection/android/views/android/CameraPreview.kt @@ -0,0 +1,73 @@ +// (c) Meta Platforms, Inc. and affiliates. Confidential and proprietary. 
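A note on `ObjectLabelScreen` above: the outer Button is fully transparent with zero padding, so the visible pill is the inner Box while the Button supplies the click target for the whole panel. A minimal host sketch, assuming a plain Compose host (in the app the screen is mounted on a Spatial SDK panel instead; the label and handler here are placeholders):

```kotlin
import androidx.compose.runtime.Composable
import androidx.compose.runtime.remember
import com.meta.pixelandtexel.scanner.feature.objectdetection.android.viewmodels.ObjectLabelViewModel
import com.meta.pixelandtexel.scanner.feature.objectdetection.android.views.ObjectLabelScreen

@Composable
fun ObjectLabelDemo() {
    // "Television" is an assumed label; any detector output works here.
    val vm = remember { ObjectLabelViewModel("Television") }
    ObjectLabelScreen(vm) {
        // hypothetical handler: open an info panel for the selected object
    }
}
```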
+
+package com.meta.pixelandtexel.scanner.feature.objectdetection.android.views.android
+
+import android.content.Context
+import android.util.AttributeSet
+import android.util.Log
+import android.view.Surface
+import android.view.SurfaceHolder
+import android.view.SurfaceView
+import android.view.ViewGroup
+
+/**
+ * An Android ViewGroup implementing the [ISurfaceProvider] pattern, which adds a Surface to
+ * itself and can be added to a panel view and used to preview the camera feed.
+ */
+class CameraPreview : ViewGroup, ISurfaceProvider {
+  companion object {
+    private const val TAG = "Camera"
+  }
+
+  private val surfaceView = SurfaceView(context)
+
+  private var _surface: Surface? = null
+  override val surface: Surface?
+    get() = _surface
+
+  constructor(context: Context) : super(context)
+
+  constructor(context: Context, attrs: AttributeSet) : super(context, attrs)
+
+  init {
+    surfaceView.holder.addCallback(
+        object : SurfaceHolder.Callback {
+          override fun surfaceCreated(holder: SurfaceHolder) {
+            Log.d(TAG, "surfaceCreated")
+            _surface = holder.surface
+          }
+
+          override fun surfaceChanged(
+              holder: SurfaceHolder,
+              format: Int,
+              width: Int,
+              height: Int
+          ) {
+            Log.w(TAG, "surfaceChanged: $format, ${width}x$height")
+          }
+
+          override fun surfaceDestroyed(holder: SurfaceHolder) {
+            Log.w(TAG, "surfaceDestroyed")
+            _surface?.release()
+            _surface = null
+          }
+        }
+    )
+
+    addView(surfaceView)
+  }
+
+  override fun onMeasure(widthMeasureSpec: Int, heightMeasureSpec: Int) {
+    super.onMeasure(widthMeasureSpec, heightMeasureSpec)
+
+    val width = MeasureSpec.getSize(widthMeasureSpec)
+    val height = MeasureSpec.getSize(heightMeasureSpec)
+    setMeasuredDimension(width, height)
+
+    measureChild(surfaceView, widthMeasureSpec, heightMeasureSpec)
+  }
+
+  override fun onLayout(changed: Boolean, left: Int, top: Int, right: Int, bottom: Int) {
+    surfaceView.layout(left, top, right, bottom)
+  }
+}
diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/objectdetection/android/views/android/GraphicOverlay.kt b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/objectdetection/android/views/android/GraphicOverlay.kt
new file mode 100644
index 00000000..a7dbf032
--- /dev/null
+++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/objectdetection/android/views/android/GraphicOverlay.kt
@@ -0,0 +1,211 @@
+// (c) Meta Platforms, Inc. and affiliates. Confidential and proprietary.
+
+package com.meta.pixelandtexel.scanner.feature.objectdetection.android.views.android
+
+import android.annotation.SuppressLint
+import android.content.Context
+import android.graphics.Canvas
+import android.graphics.Color
+import android.graphics.Paint
+import android.graphics.Rect
+import android.graphics.RectF
+import android.os.SystemClock
+import android.text.TextPaint
+import android.util.AttributeSet
+import android.view.View
+import com.meta.pixelandtexel.scanner.feature.objectdetection.datasource.detector.models.DetectedObject
+import com.meta.pixelandtexel.scanner.feature.objectdetection.utils.NumberSmoother
+import kotlin.math.floor
+
+/**
+ * A custom [View] that is meant to be rendered over an [android.view.Surface] displaying the
+ * device camera feed, and draws object detection results from the processed camera frames.
+ * Takes a list of [DetectedObject] instances and draws them as rectangular bounding boxes with
+ * labels. Also displays a provided statistics string and calculates/shows the rate of result
+ * updates per second.
+ */
+class GraphicOverlay : View {
+  private var results: List<DetectedObject>? = null
+  private var stats: String? = null
+  private var xScale: Float = 1f
+  private var yScale: Float = 1f
+
+  private var lastResultsUpdateTimeMs = 0L
+  private val smoothedUpdatesPerSec = NumberSmoother()
+
+  private var boxPaint = Paint()
+  private var textBackgroundPaint = Paint()
+  private var textPaint = TextPaint()
+
+  constructor(context: Context) : super(context)
+
+  constructor(context: Context, attrs: AttributeSet) : super(context, attrs)
+
+  init {
+    // setup our paints
+    boxPaint.color = Color.BLUE
+    boxPaint.strokeWidth = 2f
+    boxPaint.style = Paint.Style.STROKE
+
+    textBackgroundPaint.color = Color.BLUE
+    textBackgroundPaint.style = Paint.Style.FILL
+    textBackgroundPaint.textSize = 16f
+
+    textPaint.color = Color.WHITE
+    textPaint.style = Paint.Style.FILL
+    textPaint.textSize = 16f
+  }
+
+  /**
+   * Clears all bounding boxes and labels of detected objects, plus the stats String, and
+   * invalidates the View so that draw is called with no results.
+   */
+  fun clear() {
+    results = null
+    stats = null
+
+    invalidate()
+  }
+
+  /**
+   * Updates the detection results to be drawn and triggers a view redraw.
+   *
+   * @param newResults A list of [DetectedObject] instances representing the latest detection
+   *   results from the model.
+   * @param imageWidth The width of the source image from which the `newResults` were derived, used
+   *   to calculate the horizontal scaling factor.
+   * @param imageHeight The height of the source image from which the `newResults` were derived,
+   *   used to calculate the vertical scaling factor.
+   */
+  fun drawResults(newResults: List<DetectedObject>, imageWidth: Int, imageHeight: Int) {
+    results = newResults
+    xScale = width.toFloat() / imageWidth
+    yScale = height.toFloat() / imageHeight
+
+    invalidate()
+  }
+
+  /**
+   * Updates the statistics string to be displayed and triggers a view redraw.
+   *
+   * @param newStats A [String] containing the new statistics or informational text to be displayed
+   *   on the view.
+   */
+  fun drawStats(newStats: String) {
+    stats = newStats
+
+    updateResultsTiming()
+
+    invalidate()
+  }
+
+  /** Updates our local timing stat for how often updates are sent. */
+  private fun updateResultsTiming() {
+    // keep track of how often we're receiving updates
+    val nowMs = SystemClock.uptimeMillis()
+    val elapsedMs = nowMs - lastResultsUpdateTimeMs
+    if (elapsedMs > 0) {
+      smoothedUpdatesPerSec.update(floor(1000f / elapsedMs))
+    }
+    lastResultsUpdateTimeMs = nowMs
+  }
+
+  /**
+   * Overrides [draw] to render object detection results and statistics onto the canvas. Draws
+   * rectangular bounding boxes around the detected objects, with labels underneath. Also draws a
+   * statistics String, if supplied, plus a local timing stat.
+   *
+   * @param canvas The [Canvas] instance on which all drawing operations will be performed.
+ */ + @SuppressLint("DefaultLocale") + override fun draw(canvas: Canvas) { + super.draw(canvas) + + // for debugging, draw borders around this entire view + canvas.drawRect(RectF(4f, 4f, width.toFloat() - 4, height.toFloat() - 4), boxPaint) + + // draw boxes around all of our results, with a category and category score + + results + ?.map { Rect(it.bounds.left, it.bounds.top, it.bounds.right, it.bounds.bottom) } + ?.forEachIndexed { i, rect -> + val top = rect.top * yScale + val bottom = rect.bottom * yScale + val left = rect.left * xScale + val right = rect.right * xScale + val centerX = left + (right - left) / 2 + + // draw the box + + val drawableRect = RectF(left, top, right, bottom) + canvas.drawRect(drawableRect, boxPaint) + + // create text to display alongside detected objects + + var drawableText = + results!![i].label + " :: " + String.format("%.2f", results!![i].confidence) + if (results!![i].id != null) { + drawableText = "${results!![i].id} :: $drawableText" + } + + // draw rect behind display text + + val bounds = Rect() + textBackgroundPaint.getTextBounds(drawableText, 0, drawableText.length, bounds) + + val textWidth = bounds.width() + val textHeight = bounds.height() + val padding = 4 + canvas.drawRect( + centerX - textWidth / 2 - padding, + bottom, + (centerX - textWidth / 2) + textWidth + padding, + bottom + textHeight + padding, + textBackgroundPaint, + ) + + // draw text for detected object + + canvas.drawText( + drawableText, + centerX - textWidth / 2, + bottom + bounds.height(), + textPaint, + ) + } + + // draw our inference stats near the top left + + if (stats != null) { + val resultsUpdatesPerSec = smoothedUpdatesPerSec.getSmoothedNumber().toInt() + val drawableStats = "$resultsUpdatesPerSec results/sec\n${stats!!}" + + drawString(canvas, drawableStats, 150f, 100f, textPaint) + } + } + + /** + * Convenience function for drawing strings with new lines. + * + * @param canvas The [Canvas] on which to draw the text. + * @param text The [String] of text to be drawn. If it contains newline characters, it will be + * rendered as multiple lines. + * @param x The x-coordinate for the starting position of the text (left edge). + * @param y The y-coordinate for the baseline of the first line of text. + * @param paint The [TextPaint] object that defines the text's appearance. The text size from this + * paint is used to calculate line spacing for multi-line text. + */ + private fun drawString(canvas: Canvas, text: String, x: Float, y: Float, paint: TextPaint) { + var dy = y + if (text.contains("\n")) { + val texts = text.split("\n") + for (txt in texts) { + canvas.drawText(txt, x, dy, paint) + dy += paint.textSize.toInt() + } + } else { + canvas.drawText(text, x, y, paint) + } + } +} diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/objectdetection/android/views/android/ISurfaceProvider.kt b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/objectdetection/android/views/android/ISurfaceProvider.kt new file mode 100644 index 00000000..165db424 --- /dev/null +++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/objectdetection/android/views/android/ISurfaceProvider.kt @@ -0,0 +1,15 @@ +// (c) Meta Platforms, Inc. and affiliates. Confidential and proprietary. 
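The overlay above maps detector coordinates into view coordinates with two independent scale factors (view size divided by source-image size), so boxes stay aligned even when the panel's aspect ratio differs from the camera frame's. A worked example of the same mapping (the numbers are illustrative):

```kotlin
// Source frame 640x480 drawn on a 1280x960 overlay -> both scales are 2.0
val xScale = 1280f / 640f
val yScale = 960f / 480f

// A detection with bounds (100, 50)-(200, 150) in image space draws at:
val left = 100 * xScale   // 200f
val top = 50 * yScale     // 100f
val right = 200 * xScale  // 400f
val bottom = 150 * yScale // 300f
```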
+ +package com.meta.pixelandtexel.scanner.feature.objectdetection.android.views.android + +import android.view.Surface + +/** + * A contract for components that can provide an Android [Surface]. Used by + * CameraController to facilitate displaying the camera video feed on panels. + */ +interface ISurfaceProvider { + val surface: Surface? + val surfaceAvailable: Boolean + get() = surface != null +} diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/objectdetection/datasource/detector/IObjectDetectorHelper.kt b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/objectdetection/datasource/detector/IObjectDetectorHelper.kt new file mode 100644 index 00000000..c0995b90 --- /dev/null +++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/objectdetection/datasource/detector/IObjectDetectorHelper.kt @@ -0,0 +1,28 @@ +package com.meta.pixelandtexel.scanner.feature.objectdetection.datasource.detector + +import android.media.Image +import com.meta.pixelandtexel.scanner.feature.objectdetection.datasource.detector.models.DetectedObjectsResult + +/** + * Defines the contract for an object detection helper. Implementations of this interface are + * responsible for processing an image and identifying objects within it. + */ +interface IObjectDetectorHelper { + /** + * Initiates the object detection process on the provided image. + * + * This method should perform detection asynchronously. The `finally` lambda must be called once + * the detection process for this specific image is complete, regardless of success or failure. + * This is crucial for managing processing state, such as releasing resources or allowing the next + * frame to be processed. + * + * @param image The [Image] to be processed for object detection. + * @param width The width of the image. + * @param height The height of the image. + * @param finally A callback function that must be invoked when the detection for this image has + * finished (successfully or unsuccessfully). + */ + + fun detect(image: Image, width: Int, height: Int, finally: (DetectedObjectsResult?) 
-> Unit) + +} diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/objectdetection/datasource/detector/MLKitObjectDetector.kt b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/objectdetection/datasource/detector/MLKitObjectDetector.kt new file mode 100644 index 00000000..cd34fd8d --- /dev/null +++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/objectdetection/datasource/detector/MLKitObjectDetector.kt @@ -0,0 +1,89 @@ +package com.meta.pixelandtexel.scanner.feature.objectdetection.datasource.detector + +import android.media.Image +import android.os.SystemClock +import android.util.Log +import com.google.mlkit.common.model.LocalModel +import com.google.mlkit.vision.common.InputImage +import com.google.mlkit.vision.objects.ObjectDetection +import com.google.mlkit.vision.objects.ObjectDetector +import com.google.mlkit.vision.objects.custom.CustomObjectDetectorOptions +import com.meta.pixelandtexel.scanner.feature.objectdetection.datasource.detector.models.DetectedObjectsResult + +/** + * A stateless worker that utilizes Google's ML Kit to perform object detection. + * + * This class has a single responsibility: to process an `Image` and return the + * detection result through a callback. It does not manage any state related to + * concurrency or stream processing. + * + */ +class MLKitObjectDetector() : IObjectDetectorHelper { + companion object { + private const val TAG = "MLKitObjectDetector" + private const val MODEL_EFFICIENTNET = "mlkit/mobile_object_labeler_v1.tflite" + // Other models... + } + + private var objectDetector: ObjectDetector? = null + + init { + val localModel = LocalModel.Builder().setAssetFilePath("models/$MODEL_EFFICIENTNET").build() + try { + val customOptions = + CustomObjectDetectorOptions.Builder(localModel) + .setDetectorMode(CustomObjectDetectorOptions.STREAM_MODE) + .setMaxPerObjectLabelCount(1) + .setClassificationConfidenceThreshold(0.75f) + .enableMultipleObjects() + .enableClassification() + .build() + objectDetector = ObjectDetection.getClient(customOptions) + } catch (e: Exception) { + Log.e(TAG, "Failed to initialize ML Kit Detector", e) + } + } + + /** + * Performs object detection on a single image. + * + * @param image The Image to be processed. + * @param width The width of the image. + * @param height The height of the image. + * @param finally A callback function that will be invoked with the [DetectedObjectsResult] + * or null if detection failed. + */ + override fun detect( + image: Image, + width: Int, + height: Int, + finally: (DetectedObjectsResult?) -> Unit + ) { + if (objectDetector == null) { + Log.w(TAG, "objectDetector not initialized, cannot detect.") + finally(null) + return + } + + val startTime = SystemClock.uptimeMillis() + val iImage = InputImage.fromMediaImage(image, 0) + + objectDetector!! 
+        .process(iImage)
+        .addOnSuccessListener { detectedObjects ->
+          val finishTimeMs = SystemClock.uptimeMillis()
+          val inferenceTime = finishTimeMs - startTime
+          val result =
+              DetectedObjectsResult.fromMLKitResults(
+                  detectedObjects,
+                  inferenceTime,
+                  width,
+                  height
+              )
+          finally(result)
+        }
+        .addOnFailureListener { e ->
+          Log.e(TAG, "Detection failed", e)
+          finally(null)
+        }
+  }
+}
\ No newline at end of file
diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/objectdetection/datasource/detector/OpenCVObjectDetector.kt b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/objectdetection/datasource/detector/OpenCVObjectDetector.kt
new file mode 100644
index 00000000..1126b754
--- /dev/null
+++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/objectdetection/datasource/detector/OpenCVObjectDetector.kt
@@ -0,0 +1,262 @@
+// (c) Meta Platforms, Inc. and affiliates. Confidential and proprietary.
+
+package com.meta.pixelandtexel.scanner.feature.objectdetection.datasource.detector
+
+import android.content.Context
+import android.graphics.Rect
+import android.media.Image
+import android.os.SystemClock
+import com.meta.pixelandtexel.scanner.feature.objectdetection.datasource.detector.models.DetectedObjectsResult
+import com.meta.pixelandtexel.scanner.feature.objectdetection.datasource.detector.models.JavaCamera2Frame
+import java.io.File
+import java.io.FileOutputStream
+import kotlinx.coroutines.CoroutineScope
+import kotlinx.coroutines.Dispatchers
+import kotlinx.coroutines.launch
+import kotlinx.coroutines.withContext
+import org.opencv.android.OpenCVLoader
+import org.opencv.core.Mat
+import org.opencv.core.MatOfByte
+import org.opencv.core.Scalar
+import org.opencv.core.Size
+import org.opencv.dnn.Dnn
+import org.opencv.dnn.Net
+import org.opencv.imgproc.Imgproc
+
+/**
+ * Implementation adapted from:
+ * https://github.com/opencv/opencv/blob/master/samples/android/mobilenet-objdetect/src/org/opencv/samples/opencv_mobilenet/MainActivity.java
+ */
+class OpenCVObjectDetector(context: Context) : IObjectDetectorHelper {
+  companion object {
+    private const val TAG = "ObjectDetection"
+  }
+
+  private var modelBuffer: MatOfByte? = null
+  private var configBuffer: MatOfByte? = null
+
+  private lateinit var net: Net
+
+  private var resultsListener = null
+
+  private val classNames: List<String> =
+      listOf(
+          "background",
+          "aeroplane",
+          "bicycle",
+          "bird",
+          "boat",
+          "bottle",
+          "bus",
+          "car",
+          "cat",
+          "chair",
+          "cow",
+          "diningtable",
+          "dog",
+          "horse",
+          "motorbike",
+          "person",
+          "pottedplant",
+          "sheep",
+          "sofa",
+          "train",
+          "tvmonitor",
+      )
+
+  init {
+    try {
+      val success = OpenCVLoader.initLocal()
+      if (!success) {
+        throw RuntimeException("Failed to initialize OpenCV")
+      }
+
+      // initialize the OpenCV model net
+      modelBuffer =
+          getAssetMatOfBytes("models/opencv/mobilenet_iter_73000.caffemodel", context)
+      configBuffer = getAssetMatOfBytes("models/opencv/deploy.prototxt", context)
+
+      if (modelBuffer == null || configBuffer == null) {
+        throw RuntimeException("Failed to read model or config")
+      }
+
+      net = Dnn.readNet("caffe", modelBuffer, configBuffer)
+    } catch (e: Exception) {
+      e.printStackTrace()
+    }
+  }
+
+  /**
+   * Initiates the object detection process on the provided image.
+   * Converts the image to an OpenCV Mat in RGB format, calls [getDetections], and processes the
+   * resulting Mat to extract the detected results.
+   *
+   * @param image The [Image] to be processed for object detection.
+   * @param width The width of the image.
+   * @param height The height of the image.
+   * @param finally A callback function that must be invoked when the detection for this image has
+   *   finished (successfully or unsuccessfully).
+   */
+  fun detect(image: Image, width: Int, height: Int, finally: () -> Unit) {
+    val startTime = SystemClock.uptimeMillis()
+
+    // convert image to a CvCameraViewFrame and get the Mat
+    val cvFrame = JavaCamera2Frame(image)
+    val frameSrc = cvFrame.rgba()
+    val frame = Mat()
+
+    // convert to 3 channel
+    Imgproc.cvtColor(frameSrc, frame, Imgproc.COLOR_RGBA2RGB)
+
+    CoroutineScope(Dispatchers.IO).launch {
+      var detections = getDetections(frame)
+
+      val finishTimeMs = SystemClock.uptimeMillis()
+      val inferenceTime = finishTimeMs - startTime
+
+      val rows = frame.rows()
+      val cols = frame.cols()
+
+      // each detection is a row of 7 values:
+      // [batchId, classId, confidence, left, top, right, bottom]
+      detections = detections.reshape(1, detections.total().toInt() / 7)
+
+      val cvDetectedObjects = mutableListOf<CvDetectedObject>()
+
+      for (i in 0 until detections.rows()) {
+        val confidence = detections.get(i, 2)[0]
+        if (confidence < 0.5) {
+          continue
+        }
+
+        val classId = detections.get(i, 1)[0].toInt()
+        val label = classNames[classId]
+
+        val left = (detections.get(i, 3)[0] * cols).toInt()
+        val top = (detections.get(i, 4)[0] * rows).toInt()
+        val right = (detections.get(i, 5)[0] * cols).toInt()
+        val bottom = (detections.get(i, 6)[0] * rows).toInt()
+        val bounds = Rect(left, top, right, bottom)
+
+        cvDetectedObjects.add(CvDetectedObject(bounds, label, confidence))
+      }
+
+      val result =
+          DetectedObjectsResult.fromOpenCVResults(
+              cvDetectedObjects,
+              inferenceTime,
+              width,
+              height
+          )
+      // resultsListener?.onObjectsDetected(result, image) // not done via inheritance here
+
+      frame.release()
+      cvFrame.release()
+
+      finally() // calls image.close()
+    }
+  }
+
+  /**
+   * Creates the input blob from the provided image Mat, and forwards the input to the neural
+   * network for processing.
+   *
+   * @param frame The [Mat] representing the image frame for processing.
+   * @return The [Mat] containing the results from the neural network processing.
+   */
+  private suspend fun getDetections(frame: Mat): Mat {
+    return withContext(Dispatchers.IO) {
+      // set our new input
+      val size = Size(640.0, 640.0)
+      val mean = Scalar(127.5)
+      val inScaleFactor = 1.0 / 127.5
+      val blob = Dnn.blobFromImage(frame, inScaleFactor, size, mean, false, false)
+
+      net.setInput(blob)
+
+      // compute the detections
+      net.forward()
+    }
+  }
+
+  /**
+   * Some model loading for OpenCV must be done by passing an absolute path to the file. This
+   * function reads the file data from the assets bundle, writes it to a temporary file, and
+   * returns that absolute path.
+   *
+   * @param assetFilePath The path to the asset file within the assets directory (e.g.,
+   *   "models/my_model.caffemodel").
+   * @param context The Android [Context] used to access the application's assets and cache
+   *   directory.
+   * @return The absolute path to the created temporary file.
+ */ + private fun getAssetTempPath(assetFilePath: String, context: Context): String { + // extract our file name and extension from the path + val fileNameWithExtension = assetFilePath.substringAfterLast('/') + val fileName = fileNameWithExtension.substringBeforeLast('.', fileNameWithExtension) + val extension = + if (fileNameWithExtension.contains('.')) { + fileNameWithExtension.substringAfterLast('.') + } else { + "" + } + + val inputStream = context.assets.open(assetFilePath) + val suffix = if (extension.isNotEmpty()) ".$extension" else null + val tempFile = File.createTempFile(fileName, suffix, context.cacheDir) + + FileOutputStream(tempFile).use { outputStream -> inputStream.copyTo(outputStream) } + + return tempFile.absolutePath + } + + /** + * Reads the bytes of the model from the assets bundle, and returns a [MatOfByte] of that model. + * + * @param assetFilePath The path to the asset file within the assets directory (e.g., + * "models/my_model.caffemodel"). + * @param context The Android [Context] used to access the application's assets and cache + * directory. + * @return An OpenCV [MatOfByte] object representing the bytes of the model read from the assets + * bundle. + */ + private fun getAssetMatOfBytes(assetFilePath: String, context: Context): MatOfByte? { + val buffer: ByteArray + try { + val inputStream = context.assets.open(assetFilePath) + val size = inputStream.available() + buffer = ByteArray(size) + inputStream.read(buffer) + inputStream.close() + } catch (e: Exception) { + e.printStackTrace() + return null + } + + return MatOfByte(*buffer) + } + + override fun detect( + image: Image, + width: Int, + height: Int, + finally: (DetectedObjectsResult?) -> Unit + ) { + TODO("Not yet implemented") + } + + /** + * Represents an object detected by an OpenCV Net, encapsulating info about a single detected + * object, including its location, classification, and the confidence level of the detection. + * + * @property bounds The rectangular bounding box delineating the detected object within an image + * or frame. + * @property label The classification label assigned to the detected object (e.g., "cat", "car", + * "text"). + * @property confidence A score, typically between 0.0 and 1.0, indicating the detector's + * certainty about the classification and localization of the object. + */ + data class CvDetectedObject(val bounds: Rect, val label: String, val confidence: Double) +} diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/objectdetection/datasource/detector/models/DetectedObject.kt b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/objectdetection/datasource/detector/models/DetectedObject.kt new file mode 100644 index 00000000..35d83d4d --- /dev/null +++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/objectdetection/datasource/detector/models/DetectedObject.kt @@ -0,0 +1,29 @@ +// (c) Meta Platforms, Inc. and affiliates. Confidential and proprietary. + +package com.meta.pixelandtexel.scanner.feature.objectdetection.datasource.detector.models + +import android.graphics.PointF +import android.graphics.Rect + +/** + * Represents an object detected by a model, including its location, bounding box, descriptive + * label, detection confidence, and an optional identifier. 
+ *
+ * @property point A [PointF] representing the location of the detected object, in the processed
+ *   image.
+ * @property bounds A [Rect] defining the bounding box that encloses the detected object within the
+ *   image.
+ * @property label A [String] providing a human-readable name or category for the detected object
+ *   (e.g., "cat", "car", "person").
+ * @property confidence A [Float] value indicating the detector's certainty about the correctness
+ *   of the detection, in the range of [0.0, 1.0].
+ * @property id An optional [Int] used to uniquely identify or track an instance of the detected
+ *   object across multiple frames or detections. Defaults to null if no specific ID is assigned.
+ */
+data class DetectedObject(
+    val point: PointF,
+    val bounds: Rect,
+    val label: String,
+    val confidence: Float,
+    val id: Int? = null,
+)
diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/objectdetection/datasource/detector/models/DetectedObjectsResult.kt b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/objectdetection/datasource/detector/models/DetectedObjectsResult.kt
new file mode 100644
index 00000000..304ccf2c
--- /dev/null
+++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/objectdetection/datasource/detector/models/DetectedObjectsResult.kt
@@ -0,0 +1,144 @@
+// (c) Meta Platforms, Inc. and affiliates. Confidential and proprietary.
+
+package com.meta.pixelandtexel.scanner.feature.objectdetection.datasource.detector.models
+
+import android.graphics.PointF
+import androidx.core.graphics.toRect
+import com.google.mediapipe.tasks.components.containers.Detection
+import com.meta.pixelandtexel.scanner.feature.objectdetection.datasource.detector.OpenCVObjectDetector
+import com.meta.pixelandtexel.scanner.models.smarthomedata.plugStringList
+
+/**
+ * Represents the result of an object detection, encapsulating a list of detected objects, the time
+ * taken for inference, and the dimensions of the input image on which detection was performed.
+ *
+ * @property objects A list of [DetectedObject] instances found in the image.
+ * @property inferenceTime The time taken for the object detection model to process the image, in
+ *   milliseconds.
+ * @property inputImageWidth The width of the image that was processed for object detection.
+ * @property inputImageHeight The height of the image that was processed for object detection.
+ */
+data class DetectedObjectsResult(
+    val objects: List<DetectedObject>,
+    val inferenceTime: Long,
+    val inputImageWidth: Int,
+    val inputImageHeight: Int,
+) {
+  companion object {
+    private val neededObjects =
+        setOf("spotlight", "lampshade", "switch", "modem", "radiator", "space heater")
+
+    /**
+     * Creates a [DetectedObjectsResult] instance from MediaPipe's object detection results.
+     *
+     * This function converts a list of [Detection] objects into a standardized
+     * [DetectedObjectsResult]. It extracts bounding boxes, labels, and confidence scores. Objects
+     * without categories are filtered out.
+     *
+     * @param mpDetectedObjects A list of detection results from the MediaPipe object detector.
+     * @param inferenceTime The time taken for inference, in milliseconds.
+     * @param inputImageWidth The width of the input image.
+     * @param inputImageHeight The height of the input image.
+     * @return A [DetectedObjectsResult] containing the processed detection information.
+     */
+    fun fromMPResults(
+        mpDetectedObjects: List<Detection>,
+        inferenceTime: Long,
+        inputImageWidth: Int,
+        inputImageHeight: Int,
+    ): DetectedObjectsResult {
+      val detectedObjects =
+          mpDetectedObjects.mapNotNull {
+            if (it.categories().isEmpty()) {
+              return@mapNotNull null
+            }
+
+            val rect = it.boundingBox().toRect()
+            val point = PointF(it.boundingBox().centerX(), it.boundingBox().centerY())
+            val label = it.categories()[0].categoryName()
+            val confidence = it.categories()[0].score()
+
+            DetectedObject(point, rect, label, confidence)
+          }
+
+      return DetectedObjectsResult(
+          detectedObjects,
+          inferenceTime,
+          inputImageWidth,
+          inputImageHeight,
+      )
+    }
+
+    /**
+     * Creates a [DetectedObjectsResult] instance from ML Kit's object detection results,
+     * transforming a list of [com.google.mlkit.vision.objects.DetectedObject] objects from ML Kit
+     * into a standardized [DetectedObjectsResult]. It extracts bounding boxes, labels, confidence
+     * scores, and tracking IDs. Objects without labels are filtered out.
+     *
+     * @param mlkitDetectedObjects A list of detected object results from ML Kit object detection.
+     * @param inferenceTime The time taken for inference, in milliseconds.
+     * @param inputImageWidth The width of the input image.
+     * @param inputImageHeight The height of the input image.
+     * @return A [DetectedObjectsResult] containing the processed detection information.
+     */
+    fun fromMLKitResults(
+        mlkitDetectedObjects: List<com.google.mlkit.vision.objects.DetectedObject>,
+        inferenceTime: Long,
+        inputImageWidth: Int,
+        inputImageHeight: Int,
+    ): DetectedObjectsResult {
+      val detectedObjects =
+          mlkitDetectedObjects.mapNotNull {
+            if (it.labels.isEmpty()) {
+              return@mapNotNull null
+            }
+
+            val point = PointF(it.boundingBox.exactCenterX(), it.boundingBox.exactCenterY())
+            val label = it.labels[0].text
+            val confidence = it.labels[0].confidence
+
+            DetectedObject(point, it.boundingBox, label, confidence, it.trackingId)
+          }
+
+      return DetectedObjectsResult(
+          detectedObjects,
+          inferenceTime,
+          inputImageWidth,
+          inputImageHeight,
+      )
+    }
+
+    /**
+     * Creates a [DetectedObjectsResult] instance from custom OpenCV object detection results,
+     * converting a list of [OpenCVObjectDetector.CvDetectedObject] into a standardized
+     * [DetectedObjectsResult], extracting bounding boxes, labels, and confidence scores.
+     *
+     * @param cvDetectedObjects A list of custom detected object results from our OpenCV-based
+     *   detector.
+     * @param inferenceTime The time taken for inference, in milliseconds.
+     * @param inputImageWidth The width of the input image.
+     * @param inputImageHeight The height of the input image.
+     * @return A [DetectedObjectsResult] containing the processed detection information.
+     */
+    fun fromOpenCVResults(
+        cvDetectedObjects: List<OpenCVObjectDetector.CvDetectedObject>,
+        inferenceTime: Long,
+        inputImageWidth: Int,
+        inputImageHeight: Int,
+    ): DetectedObjectsResult {
+      val detectedObjects =
+          cvDetectedObjects.map {
+            val point = PointF(it.bounds.exactCenterX(), it.bounds.exactCenterY())
+
+            DetectedObject(point, it.bounds, it.label, it.confidence.toFloat())
+          }
+
+      return DetectedObjectsResult(
+          detectedObjects,
+          inferenceTime,
+          inputImageWidth,
+          inputImageHeight,
+      )
+    }
+  }
+}
diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/objectdetection/datasource/detector/models/JavaCamera2Frame.java b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/objectdetection/datasource/detector/models/JavaCamera2Frame.java
new file mode 100644
index 00000000..42deea21
--- /dev/null
+++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/objectdetection/datasource/detector/models/JavaCamera2Frame.java
@@ -0,0 +1,133 @@
+// (c) Meta Platforms, Inc. and affiliates. Confidential and proprietary.
+
+package com.meta.pixelandtexel.scanner.feature.objectdetection.datasource.detector.models;
+
+import android.media.Image;
+
+import java.nio.ByteBuffer;
+
+import org.opencv.android.CameraBridgeViewBase;
+import org.opencv.core.CvType;
+import org.opencv.core.Mat;
+import org.opencv.imgproc.Imgproc;
+
+/**
+ * Copied from the OpenCV Java SDK - JavaCamera2View.java. The constructor expects the Image to be
+ * ImageFormat.YUV_420_888
+ */
+public class JavaCamera2Frame implements CameraBridgeViewBase.CvCameraViewFrame {
+    @Override
+    public Mat gray() {
+        Image.Plane[] planes = mImage.getPlanes();
+        int w = mImage.getWidth();
+        int h = mImage.getHeight();
+        assert (planes[0].getPixelStride() == 1);
+        ByteBuffer y_plane = planes[0].getBuffer();
+        int y_plane_step = planes[0].getRowStride();
+        mGray = new Mat(h, w, CvType.CV_8UC1, y_plane, y_plane_step);
+        return mGray;
+    }
+
+    @Override
+    public Mat rgba() {
+        Image.Plane[] planes = mImage.getPlanes();
+        int w = mImage.getWidth();
+        int h = mImage.getHeight();
+        int chromaPixelStride = planes[1].getPixelStride();
+
+        if (chromaPixelStride == 2) { // Chroma channels are interleaved
+            assert (planes[0].getPixelStride() == 1);
+            assert (planes[2].getPixelStride() == 2);
+            ByteBuffer y_plane = planes[0].getBuffer();
+            int y_plane_step = planes[0].getRowStride();
+            ByteBuffer uv_plane1 = planes[1].getBuffer();
+            int uv_plane1_step = planes[1].getRowStride();
+            ByteBuffer uv_plane2 = planes[2].getBuffer();
+            int uv_plane2_step = planes[2].getRowStride();
+            Mat y_mat = new Mat(h, w, CvType.CV_8UC1, y_plane, y_plane_step);
+            Mat uv_mat1 = new Mat(h / 2, w / 2, CvType.CV_8UC2, uv_plane1, uv_plane1_step);
+            Mat uv_mat2 = new Mat(h / 2, w / 2, CvType.CV_8UC2, uv_plane2, uv_plane2_step);
+            long addr_diff = uv_mat2.dataAddr() - uv_mat1.dataAddr();
+            if (addr_diff > 0) {
+                assert (addr_diff == 1);
+                Imgproc.cvtColorTwoPlane(y_mat, uv_mat1, mRgba, Imgproc.COLOR_YUV2RGBA_NV12);
+            } else {
+                assert (addr_diff == -1);
+                Imgproc.cvtColorTwoPlane(y_mat, uv_mat2, mRgba, Imgproc.COLOR_YUV2RGBA_NV21);
+            }
+            return mRgba;
+        } else { // Chroma channels are not interleaved
+            byte[] yuv_bytes = new byte[w * (h + h / 2)];
+            ByteBuffer y_plane = planes[0].getBuffer();
+            ByteBuffer u_plane = planes[1].getBuffer();
+            ByteBuffer v_plane = planes[2].getBuffer();
+
+            int yuv_bytes_offset = 0;
+
int y_plane_step = planes[0].getRowStride(); + if (y_plane_step == w) { + y_plane.get(yuv_bytes, 0, w * h); + yuv_bytes_offset = w * h; + } else { + int padding = y_plane_step - w; + for (int i = 0; i < h; i++) { + y_plane.get(yuv_bytes, yuv_bytes_offset, w); + yuv_bytes_offset += w; + if (i < h - 1) { + y_plane.position(y_plane.position() + padding); + } + } + assert (yuv_bytes_offset == w * h); + } + + int chromaRowStride = planes[1].getRowStride(); + int chromaRowPadding = chromaRowStride - w / 2; + + if (chromaRowPadding == 0) { + // When the row stride of the chroma channels equals their width, we can copy + // the entire channels in one go + u_plane.get(yuv_bytes, yuv_bytes_offset, w * h / 4); + yuv_bytes_offset += w * h / 4; + v_plane.get(yuv_bytes, yuv_bytes_offset, w * h / 4); + } else { + // When not equal, we need to copy the channels row by row + for (int i = 0; i < h / 2; i++) { + u_plane.get(yuv_bytes, yuv_bytes_offset, w / 2); + yuv_bytes_offset += w / 2; + if (i < h / 2 - 1) { + u_plane.position(u_plane.position() + chromaRowPadding); + } + } + for (int i = 0; i < h / 2; i++) { + v_plane.get(yuv_bytes, yuv_bytes_offset, w / 2); + yuv_bytes_offset += w / 2; + if (i < h / 2 - 1) { + v_plane.position(v_plane.position() + chromaRowPadding); + } + } + } + + Mat yuv_mat = new Mat(h + h / 2, w, CvType.CV_8UC1); + yuv_mat.put(0, 0, yuv_bytes); + Imgproc.cvtColor(yuv_mat, mRgba, Imgproc.COLOR_YUV2RGBA_I420, 4); + return mRgba; + } + } + + public JavaCamera2Frame(Image image) { + super(); + mImage = image; + mRgba = new Mat(); + mGray = new Mat(); + } + + @Override + public void release() { + mRgba.release(); + mGray.release(); + } + + private Image mImage; + private Mat mRgba; + private Mat mGray; +} diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/objectdetection/datasource/repository/DetectionRepository.kt b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/objectdetection/datasource/repository/DetectionRepository.kt new file mode 100644 index 00000000..e159ed93 --- /dev/null +++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/objectdetection/datasource/repository/DetectionRepository.kt @@ -0,0 +1,159 @@ +package com.meta.pixelandtexel.scanner.feature.objectdetection.datasource.repository + +import android.graphics.Rect +import android.media.Image +import android.util.Log +import com.meta.pixelandtexel.scanner.feature.objectdetection.datasource.detector.models.DetectedObject +import com.meta.pixelandtexel.scanner.feature.objectdetection.utils.ImageUtils.getBitmap +import kotlin.collections.mutableListOf +import kotlinx.coroutines.flow.MutableStateFlow +import kotlinx.coroutines.flow.StateFlow +import java.util.concurrent.atomic.AtomicBoolean +import com.meta.pixelandtexel.scanner.R +import com.meta.pixelandtexel.scanner.feature.objectdetection.datasource.detector.IObjectDetectorHelper +import com.meta.pixelandtexel.scanner.feature.objectdetection.domain.repository.detection.DetectionState +import com.meta.pixelandtexel.scanner.feature.objectdetection.domain.repository.detection.IObjectDetectionRepository +import com.meta.pixelandtexel.scanner.feature.objectdetection.domain.repository.display.IDisplayedEntityRepository +import com.meta.pixelandtexel.scanner.feature.objectdetection.model.RaycastRequestModel +import 
com.meta.pixelandtexel.scanner.feature.objectdetection.utils.math.MathUtils.area
+import com.meta.pixelandtexel.scanner.feature.objectdetection.utils.math.MathUtils.intersection
+import com.meta.pixelandtexel.scanner.models.smarthomedata.SmartHomeInfoRequest
+import com.meta.pixelandtexel.scanner.models.smarthomedata.getEnumFromString
+import java.util.concurrent.atomic.AtomicReference
+
+/**
+ * Manages the entire object detection workflow, including state reconciliation. This repository
+ * coordinates with a detector worker to process images, reconciles the results against a cache,
+ * and emits the final set of found, updated, and lost objects.
+ *
+ * @param detector The worker implementation that performs the actual detection.
+ */
+class ObjectDetectionRepository(
+    private val detector: IObjectDetectorHelper,
+    private val displayRepository: IDisplayedEntityRepository
+) : IObjectDetectionRepository {
+  companion object {
+    private const val TAG = "ObjectDetectionRepo"
+  }
+
+  private val isDetecting = AtomicBoolean(false)
+  private val objectInfoRequest = AtomicReference<Pair<Int, RaycastRequestModel>?>(null)
+
+  private val cachedObjectIds = mutableSetOf<Int>()
+  private val cachedObjects = mutableListOf<DetectedObject>()
+
+  private val _detectionState = MutableStateFlow<DetectionState?>(null)
+  override val detectionState: StateFlow<DetectionState?> = _detectionState
+
+  override fun processImage(image: Image, width: Int, height: Int, `finally`: () -> Unit) {
+    // only one frame is processed at a time; drop frames while the detector is busy
+    if (!isDetecting.compareAndSet(false, true)) {
+      Log.v(TAG, "Frame dropped, detector busy.")
+      finally()
+      return
+    }
+
+    detector.detect(image, width, height) { result ->
+      try {
+        if (result != null) {
+          val (found, updated, lost) = reconcileDetectedObjects(result.objects)
+          cachedObjects.clear()
+          cachedObjects.addAll(result.objects.filter { cachedObjectIds.contains(it.id) })
+
+          checkRequestImageForObject(image)
+
+          _detectionState.value = DetectionState(image, found, updated, lost, finally)
+        } else {
+          finally()
+        }
+      } finally {
+        isDetecting.set(false)
+      }
+    }
+  }
+
+  private fun checkRequestImageForObject(image: Image) {
+    objectInfoRequest.getAndSet(null)?.let { (id, raycastModelPose) ->
+      val obj = cachedObjects.firstOrNull { it.id == id }
+      val bmp = obj?.let { image.getBitmap(it.bounds) }
+
+      if (obj != null && bmp != null) {
+        val typeSmartHome = getEnumFromString(obj.label)
+        displayRepository.createGenericInfoPanel(
+            R.integer.info_panel_id,
+            SmartHomeInfoRequest(
+                typeSmartHome,
+                raycastModelPose,
+            ),
+        )
+      }
+    }
+  }
+
+  private fun reconcileDetectedObjects(
+      incomingObjects: List<DetectedObject>
+  ): Triple<List<DetectedObject>, List<DetectedObject>, List<Int>> {
+    val incomingIds = incomingObjects.mapNotNull { it.id }.toSet()
+    // ids present in both the cache and this frame
+    val existingIds = cachedObjectIds intersect incomingIds
+    // ids appearing for the first time
+    val foundIds = incomingIds subtract existingIds
+    // cached ids that no longer appear
+    val lostIds = cachedObjectIds subtract existingIds
+
+    val newIds = existingIds union foundIds
+    val newDetectedObjects = incomingObjects.filter { newIds.contains(it.id) }
+
+    // treat heavily overlapping boxes as duplicates and report them as lost
+    val trimmedIds = getOverlappingObjects(newDetectedObjects)
+    val finalNewIds = newIds subtract trimmedIds
+
+    val finalFoundIds = foundIds subtract trimmedIds
+    val finalExistingIds = existingIds subtract trimmedIds
+    val finalLostIds = lostIds union trimmedIds
+
+    val found = incomingObjects.filter { finalFoundIds.contains(it.id) }
+    val updated = incomingObjects.filter { finalExistingIds.contains(it.id) }
+
+    cachedObjectIds.clear()
+    cachedObjectIds.addAll(finalNewIds)
+
+    return Triple(found, updated, finalLostIds.toList())
+  }
+
+  private fun getOverlappingObjects(incomingObjects: List<DetectedObject>): Set<Int> {
+    if
(incomingObjects.isEmpty()) {
+      return setOf()
+    }
+    val trimmed = mutableSetOf<Int>()
+    for (i in incomingObjects.indices) {
+      val objA = incomingObjects[i]
+      // compare tracking ids (not list indices) against the trimmed set
+      if (objA.id == null || trimmed.contains(objA.id)) continue
+      for (q in incomingObjects.indices) {
+        if (i == q) continue
+        val objB = incomingObjects[q]
+        if (objB.id == null || trimmed.contains(objB.id)) continue
+        val intersection = Rect()
+        if (!objA.bounds.intersection(objB.bounds, intersection)) continue
+
+        // one box fully contains the other: drop the contained one
+        if (intersection == objA.bounds) {
+          trimmed.add(objA.id)
+          continue
+        }
+        if (intersection == objB.bounds) {
+          trimmed.add(objB.id)
+          continue
+        }
+
+        // otherwise drop the smaller box when at least half of it overlaps
+        val areaA = objA.bounds.area()
+        val areaB = objB.bounds.area()
+        val intersectionArea = intersection.area()
+
+        if (areaA <= areaB && intersectionArea >= areaA / 2f) {
+          trimmed.add(objA.id)
+        } else if (areaB <= areaA && intersectionArea >= areaB / 2f) {
+          trimmed.add(objB.id)
+        }
+      }
+    }
+    return trimmed
+  }
+
+  override fun clear() {
+    cachedObjectIds.clear()
+    cachedObjects.clear()
+  }
+
+  override fun requestInfoForObject(id: Int, pose: RaycastRequestModel) {
+    objectInfoRequest.set(id to pose)
+  }
+}
\ No newline at end of file
diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/objectdetection/datasource/repository/DisplayEntities.kt b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/objectdetection/datasource/repository/DisplayEntities.kt
new file mode 100644
index 00000000..8bcf478e
--- /dev/null
+++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/objectdetection/datasource/repository/DisplayEntities.kt
@@ -0,0 +1,90 @@
+package com.meta.pixelandtexel.scanner.feature.objectdetection.datasource.repository
+
+import com.meta.pixelandtexel.scanner.FollowHead
+import com.meta.pixelandtexel.scanner.RotationMode
+import com.meta.pixelandtexel.scanner.feature.objectdetection.domain.repository.display.IDisplayedEntityRepository
+import com.meta.pixelandtexel.scanner.models.EntityData
+import com.meta.pixelandtexel.scanner.models.smarthomedata.SmartHomeInfoRequest
+import com.meta.pixelandtexel.scanner.utils.MathUtils.fromAxisAngle
+import com.meta.spatial.core.Entity
+import com.meta.spatial.core.Pose
+import com.meta.spatial.core.Quaternion
+import com.meta.spatial.core.Vector3
+import com.meta.spatial.toolkit.Grabbable
+import com.meta.spatial.toolkit.GrabbableType
+import com.meta.spatial.toolkit.Transform
+import com.meta.spatial.toolkit.createPanelEntity
+import kotlin.math.PI
+
+class DisplayedEntityRepository : IDisplayedEntityRepository {
+  companion object {
+    private const val INFO_PANEL_WIDTH = 0.632f
+  }
+
+  private var nextId = 0
+
+  override var newViewModelData: EntityData?
= null + get() { + val data = field + field = null + return data + } + override val entitiesHashMap: HashMap = HashMap() + + override fun createGenericInfoPanel( + panelId: Int, // R.integer.info_panel_id + data: SmartHomeInfoRequest + ): Entity { + val spawnPose = getPanelSpawnPosition( + Pose(data.raycastInfo.headPosition, data.raycastInfo.rotation), + INFO_PANEL_WIDTH + ) + val nextId = this.nextId++ + + this.newViewModelData = EntityData(nextId, data) + + val entity = Entity.createPanelEntity( + panelId, + Transform(spawnPose), + Grabbable(type = GrabbableType.PIVOT_Y), + FollowHead(lookAtHead = true, rotationMode = RotationMode.FULL) + ) + entitiesHashMap[nextId] = entity + return entity + } + + + override fun deleteEntity(entityId: Int) { + entitiesHashMap.get(entityId) + ?.let { entity -> + entity.destroy() + entitiesHashMap.remove(entityId) + } + } + + + /** + * Calcula la pose del panel. + * Lógica movida desde MainActivity. + */ + private fun getPanelSpawnPosition( + rightEdgePose: Pose, + panelWidth: Float, + zDistance: Float = 1f, + ): Pose { + // get angle based on arc length of panel width / 2 at z distance + val angle = (panelWidth / 2) / zDistance + + // rotate the pose forward direction by angle to get the new forward direction + val newFwd = + Quaternion.Companion.fromAxisAngle(Vector3.Companion.Up, angle * 180f / PI.toFloat()) + .times(rightEdgePose.forward()) + .normalize() + + // apply offset to lower the panel to eye height + val position = rightEdgePose.t - Vector3(0f, 0.1f, 0f) + newFwd * zDistance + val rotation = Quaternion.Companion.lookRotationAroundY(newFwd) + + return Pose(position, rotation) + } +} \ No newline at end of file diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/objectdetection/domain/camera/CameraController.kt b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/objectdetection/domain/camera/CameraController.kt new file mode 100644 index 00000000..2ec16070 --- /dev/null +++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/objectdetection/domain/camera/CameraController.kt @@ -0,0 +1,496 @@ +// (c) Meta Platforms, Inc. and affiliates. Confidential and proprietary. 
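+// A quick sanity check of the getPanelSpawnPosition() math in DisplayedEntityRepository above
+// (a hedged sketch, not app code): the yaw offset uses the small-angle approximation
+// angle ~= arcLength / radius = (panelWidth / 2) / zDistance, in radians, converted to degrees
+// before building the quaternion. For the defaults used there:
+//
+//   val angleDeg = (0.632f / 2f) / 1f * 180f / PI.toFloat() // ~= 18.1 degrees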
+ +package com.meta.pixelandtexel.scanner.feature.objectdetection.domain.camera + +import android.annotation.SuppressLint +import android.content.Context +import android.graphics.ImageFormat +import android.hardware.camera2.CameraCaptureSession +import android.hardware.camera2.CameraCaptureSession.CaptureCallback +import android.hardware.camera2.CameraCharacteristics +import android.hardware.camera2.CameraDevice +import android.hardware.camera2.CameraManager +import android.hardware.camera2.params.OutputConfiguration +import android.hardware.camera2.params.SessionConfiguration +import android.media.Image +import android.media.ImageReader +import android.os.Handler +import android.os.HandlerThread +import android.util.Log +import android.util.Size +import android.view.Surface +import com.meta.pixelandtexel.scanner.feature.objectdetection.domain.camera.enums.CameraEye +import com.meta.pixelandtexel.scanner.feature.objectdetection.domain.camera.models.CameraProperties +import com.meta.pixelandtexel.scanner.feature.objectdetection.utils.Event1 +import com.meta.pixelandtexel.scanner.feature.objectdetection.android.views.android.ISurfaceProvider +import com.meta.spatial.core.Quaternion +import com.meta.spatial.core.Vector2 +import com.meta.spatial.core.Vector3 +import java.util.concurrent.Executor +import java.util.concurrent.ExecutorService +import java.util.concurrent.Executors +import java.util.concurrent.atomic.AtomicBoolean +import kotlin.coroutines.resume +import kotlin.coroutines.resumeWithException +import kotlin.coroutines.suspendCoroutine +import kotlinx.coroutines.CoroutineScope +import kotlinx.coroutines.Dispatchers +import kotlinx.coroutines.delay +import kotlinx.coroutines.flow.MutableStateFlow +import kotlinx.coroutines.flow.StateFlow +import kotlinx.coroutines.launch + +data class ScanNewImageData( + val image: Image, + val width: Int, + val height: Int, + val finally: () -> Unit +) + +/** + * Manages the device camera lifecycle and provides access to camera frames and properties. + * + * This class handles camera initialization, querying available cameras (specifically targeting + * left/right eye cameras based on metadata), and retrieving intrinsic and extrinsic properties. It + * manages background threads for camera operations and image acquisition. + * + * The controller can start a camera capture session, directing the output to provided surfaces (via + * [ISurfaceProvider]) and/or an internal [ImageReader]. Frames from the [ImageReader] are delivered + * via the [ImageAvailableListener] interface, suitable for tasks like computer vision processing. + * It supports averaging the poses of both eye cameras for a central viewpoint. + * + * Adapted from: + * https://github.com/android/camera-samples/blob/main/Camera2Basic/app/src/main/java/com/example/android/camera2/basic/fragments/CameraFragment.kt + * + * @param context The application context, used to access the [CameraManager]. + * @param cameraEye The primary camera eye ([CameraEye.LEFT] or [CameraEye.RIGHT]) to use. 
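+ *
+ * A minimal lifecycle sketch (hedged; permission handling and the `previewSurface`
+ * [ISurfaceProvider] instance are assumed to exist in the caller's code):
+ * ```
+ * val controller = CameraController(context)
+ * controller.initialize()                  // after the camera permission is granted
+ * controller.start(listOf(previewSurface)) // begin the capture session
+ * // collect controller.imageState to receive frames, then, when done for good:
+ * controller.dispose()
+ * ```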
+ */
+class CameraController(
+    private val context: Context,
+    private val cameraEye: CameraEye = CameraEye.LEFT,
+) {
+  companion object {
+    private const val TAG = "Camera"
+
+    private const val CAMERA_IMAGE_FORMAT = ImageFormat.YUV_420_888
+    private const val CAMERA_SOURCE_KEY = "com.meta.extra_metadata.camera_source"
+    private const val CAMERA_POSITION_KEY = "com.meta.extra_metadata.position"
+  }
+
+  private var _isRunning = AtomicBoolean(false)
+
+  private val _imageState = MutableStateFlow<ScanNewImageData?>(null)
+  val imageState: StateFlow<ScanNewImageData?> = _imageState
+
+  val isRunning: Boolean
+    get() = _isRunning.get()
+
+  val isInitialized: Boolean
+    get() = ::cameraProperties.isInitialized
+
+  private lateinit var cameraManager: CameraManager
+  private val cameraEyeIds = HashMap<CameraEye, String>()
+  private val cameraEyeCharacteristics = HashMap<CameraEye, CameraCharacteristics>()
+
+  private lateinit var cameraProperties: CameraProperties
+  val onCameraPropertiesChanged = Event1<CameraProperties>()
+
+  // threads for camera handling and image frame acquisition
+  private lateinit var cameraExecutor: ExecutorService
+  private lateinit var imageReaderThread: HandlerThread
+  private lateinit var imageReaderHandler: Handler
+
+  // objects initialized per session
+  private var camera: CameraDevice? = null
+  private var session: CameraCaptureSession? = null
+  private var imageReader: ImageReader? = null
+
+  // whether a frame is still being processed for object detection
+  private val isProcessingFrame = AtomicBoolean(false)
+
+  // the output size from our camera characteristics
+  val cameraOutputSize: Size
+    get() = cameraProperties.resolution
+
+  /**
+   * Initializes the resources and services needed for reading camera frame images, and assembles
+   * the properties of the camera, which are used throughout the app. Call this after the user has
+   * accepted permissions to use the device camera.
+   *
+   * @throws RuntimeException If the camera manager failed to find any device cameras, or failed to
+   *   fetch the camera for the requested eye.
+   */
+  fun initialize() {
+    // start our background threads
+
+    cameraExecutor = Executors.newSingleThreadExecutor()
+    imageReaderThread =
+        HandlerThread("ImageReaderThread").apply {
+          start()
+          imageReaderHandler = Handler(this.looper)
+        }
+
+    // get our camera manager
+
+    cameraManager = context.getSystemService(Context.CAMERA_SERVICE) as CameraManager
+
+    if (cameraManager.cameraIdList.isEmpty()) {
+      throw RuntimeException("Failed to get system camera")
+    }
+
+    // get our camera characteristics
+
+    Log.d(TAG, "Found camera ids: ${cameraManager.cameraIdList.joinToString()}")
+
+    for (id in cameraManager.cameraIdList) {
+      val characteristics = cameraManager.getCameraCharacteristics(id)
+
+      val position =
+          characteristics.get(CameraCharacteristics.Key(CAMERA_POSITION_KEY, Int::class.java))
+      val eye =
+          when (position) {
+            0 -> CameraEye.LEFT
+            1 -> CameraEye.RIGHT
+            else -> CameraEye.UNKNOWN
+          }
+
+      // store the characteristics of each camera
+      cameraEyeIds[eye] = id
+      cameraEyeCharacteristics[eye] = characteristics
+    }
+
+    if (cameraEyeIds[cameraEye] == null) {
+      throw RuntimeException("Failed to get camera for ${cameraEye.name} eye")
+    }
+
+    cameraProperties = getCameraProperties(cameraEye)
+
+    onCameraPropertiesChanged.invoke(cameraProperties)
+  }
+
+  /**
+   * Starts the camera session and begins reading image frames. Call this after calling
+   * [initialize]. Frames are published through [imageState].
+   *
+   * @param surfaceProviders List of [ISurfaceProvider] objects to display the camera feed.
+   * @throws RuntimeException if the camera hasn't been initialized yet, or not all surfaces became
+   *   ready in time.
+   */
+  fun start(
+      surfaceProviders: List<ISurfaceProvider> = listOf(),
+  ) {
+    if (!isInitialized) {
+      throw RuntimeException("Camera not initialized")
+    }
+
+    if (surfaceProviders.isEmpty() && _imageState.subscriptionCount.value <= 0) {
+      Log.w(TAG, "No reason to start camera")
+      return
+    }
+
+    if (_isRunning.get()) {
+      Log.w(TAG, "Camera controller already running")
+      return
+    }
+
+    CoroutineScope(Dispatchers.Main).launch {
+      var elapsed = 0L
+      while (surfaceProviders.any { !it.surfaceAvailable }) {
+        Log.w(TAG, "Waiting for the camera preview surface to become available...")
+
+        // wait for the surface to become available
+        delay(10L)
+        elapsed += 10L
+
+        if (elapsed >= 10000L) {
+          throw RuntimeException("Timeout while waiting for surface(s)")
+        }
+      }
+
+      startInternal(surfaceProviders)
+    }
+  }
+
+  /**
+   * The internal start function, called after all surface providers are ready, which performs the
+   * actual camera session initialization and image reader setup.
+   *
+   * @param surfaceProviders The list of [ISurfaceProvider] surface providers onto which to display
+   *   the device camera feed.
+   */
+  private suspend fun startInternal(
+      surfaceProviders: List<ISurfaceProvider>,
+  ) {
+    try {
+      _isRunning.set(true)
+
+      // open our device camera
+
+      val id = cameraEyeIds[cameraEye]
+      camera = openCamera(cameraManager, id!!, cameraExecutor)
+
+      // assemble our targets
+      val targets = surfaceProviders.map { it.surface!! }.toMutableList()
+
+      // initialize our image reader for per-frame object detection
+      imageReader =
+          ImageReader.newInstance(
+              cameraOutputSize.width,
+              cameraOutputSize.height,
+              CAMERA_IMAGE_FORMAT,
+              2,
+          )
+      targets.add(imageReader!!.surface)
+
+      // create and start our session with the open camera and list of target surfaces
+
+      session = createCameraPreviewSession(camera!!, targets, cameraExecutor)
+
+      val captureRequestBuilder =
+          camera!!.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW).apply {
+            targets.forEach { addTarget(it) }
+          }
+
+      session!!.setSingleRepeatingRequest(
+          captureRequestBuilder.build(),
+          cameraExecutor,
+          object : CaptureCallback() {},
+      )
+
+      // setup our image reader to receive frames
+
+      imageReader?.setOnImageAvailableListener(
+          { reader ->
+            if (isProcessingFrame.get()) {
+              // still processing our last image
+              return@setOnImageAvailableListener
+            }
+
+            val image = reader.acquireLatestImage() ?: return@setOnImageAvailableListener
+
+            // Log.d(TAG, "Image available: ${image.format}, ${image.width}x${image.height}")
+            isProcessingFrame.set(true)
+
+            _imageState.value =
+                ScanNewImageData(image, image.width, image.height) {
+                  image.close()
+                  isProcessingFrame.set(false)
+                }
+          },
+          imageReaderHandler,
+      )
+    } catch (e: Exception) {
+      e.printStackTrace()
+      this.stop()
+    }
+  }
+
+  /**
+   * Asynchronously opens a camera device with the specified [cameraId].
+   *
+   * @param manager The [CameraManager] system service instance.
+   * @param cameraId The unique identifier of the camera to open.
+   * @param executor The [Executor] on which camera state callbacks will be invoked.
+   * @return The opened [CameraDevice] upon successful connection.
+   * @throws RuntimeException If the camera encounters an error during the opening process.
+   */
+  @SuppressLint("MissingPermission")
+  private suspend fun openCamera(
+      manager: CameraManager,
+      cameraId: String,
+      executor: Executor,
+  ): CameraDevice = suspendCoroutine { cont ->
+    Log.d(TAG, "openCamera")
+
+    manager.openCamera(
+        cameraId,
+        executor,
+        object : CameraDevice.StateCallback() {
+          override fun onOpened(camera: CameraDevice) {
+            Log.d(TAG, "camera onOpened")
+
+            cont.resume(camera)
+          }
+
+          override fun onDisconnected(camera: CameraDevice) {
+            Log.d(TAG, "camera onDisconnected")
+
+            this@CameraController.stop()
+          }
+
+          override fun onError(camera: CameraDevice, error: Int) {
+            Log.d(TAG, "camera onError")
+
+            val msg =
+                when (error) {
+                  ERROR_CAMERA_DEVICE -> "Fatal (device)"
+                  ERROR_CAMERA_DISABLED -> "Device policy"
+                  ERROR_CAMERA_IN_USE -> "Camera in use"
+                  ERROR_CAMERA_SERVICE -> "Fatal (service)"
+                  ERROR_MAX_CAMERAS_IN_USE -> "Maximum cameras in use"
+                  else -> "Unknown"
+                }
+            val ex = RuntimeException("Camera $cameraId error: ($error) $msg")
+            Log.e(TAG, ex.message, ex)
+
+            cont.resumeWithException(ex)
+          }
+        },
+    )
+  }
+
+  /**
+   * Creates a camera capture session for preview.
+   *
+   * @param device The [CameraDevice] for which the session is to be created.
+   * @param targets A list of [Surface] objects to be used as output targets for the camera
+   *   preview.
+   * @param executor The [Executor] on which the [CameraCaptureSession.StateCallback] will be
+   *   invoked.
+   * @return The configured [CameraCaptureSession].
+   * @throws RuntimeException if the camera session configuration fails.
+   */
+  private suspend fun createCameraPreviewSession(
+      device: CameraDevice,
+      targets: List<Surface>,
+      executor: Executor,
+  ): CameraCaptureSession = suspendCoroutine { cont ->
+    Log.d(TAG, "createCameraPreviewSession")
+
+    device.createCaptureSession(
+        SessionConfiguration(
+            SessionConfiguration.SESSION_REGULAR,
+            targets.map { OutputConfiguration(it) },
+            executor,
+            object : CameraCaptureSession.StateCallback() {
+              override fun onConfigured(session: CameraCaptureSession) {
+                Log.d(TAG, "CameraCaptureSession::onConfigured")
+
+                cont.resume(session)
+              }
+
+              override fun onConfigureFailed(session: CameraCaptureSession) {
+                Log.e(TAG, "CameraCaptureSession::onConfigureFailed")
+                val exc = RuntimeException("Camera ${device.id} session configuration failed")
+                Log.e(TAG, exc.message, exc)
+
+                cont.resumeWithException(exc)
+              }
+            },
+        )
+    )
+  }
+
+  /**
+   * Retrieves and processes the properties for the camera corresponding to the specified
+   * [CameraEye].
+   *
+   * @param eye The specific [CameraEye] (e.g., left or right) for which to retrieve properties.
+   * @return A [CameraProperties] object containing detailed information about the specified
+   *   camera.
+   * @throws NullPointerException if the [CameraCharacteristics] for the given [eye] are not found
+   *   in the `cameraEyeCharacteristics` map.
+   * @throws RuntimeException if essential camera intrinsic properties (pose translation, rotation,
+   *   calibration, or sensor size) cannot be queried from the [CameraCharacteristics].
+   */
+  private fun getCameraProperties(eye: CameraEye): CameraProperties {
+    val characteristics = cameraEyeCharacteristics[eye]!!
+
+    // camera source (always 0) and position (0: left camera; 1: right camera from user's
+    // perspective)
+
+    val source =
+        characteristics.get(CameraCharacteristics.Key(CAMERA_SOURCE_KEY, Int::class.java))
+    val position =
+        characteristics.get(CameraCharacteristics.Key(CAMERA_POSITION_KEY, Int::class.java))
+    Log.d(TAG, "Using camera source $source with position $position")
+
+    // output formats: 256 (JPEG), 34 (PRIVATE), 35 (YUV_420_888)
+
+    val formats =
+        characteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP)!!.outputFormats
+    Log.d(TAG, "Found camera output formats: ${formats.joinToString()}")
+
+    // output sizes: 320x240, 640x480, 800x600, 1280x960
+
+    val sizes =
+        characteristics
+            .get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP)!!
+            .getOutputSizes(CAMERA_IMAGE_FORMAT)
+    Log.d(TAG, "Found camera output sizes: ${sizes.joinToString()}")
+
+    // physical properties for screen-to-world calculations
+    // https://github.com/oculus-samples/Unity-PassthroughCameraApiSamples/blob/main/Assets/PassthroughCameraApiSamples/PassthroughCamera/Scripts/PassthroughCameraUtils.cs
+
+    val translation = characteristics.get(CameraCharacteristics.LENS_POSE_TRANSLATION)
+    val rotation = characteristics.get(CameraCharacteristics.LENS_POSE_ROTATION)
+    val intrinsicsArr = characteristics.get(CameraCharacteristics.LENS_INTRINSIC_CALIBRATION)
+    val sensorSizePx =
+        characteristics.get(CameraCharacteristics.SENSOR_INFO_PRE_CORRECTION_ACTIVE_ARRAY_SIZE)
+
+    if (translation == null || rotation == null || intrinsicsArr == null || sensorSizePx == null) {
+      throw RuntimeException("Failed to query camera intrinsics")
+    }
+    Log.d(TAG, "$sensorSizePx ${intrinsicsArr.joinToString()} $translation $rotation")
+
+    var quat = Quaternion(rotation[3], -rotation[0], -rotation[1], rotation[2]).inverse()
+    quat = quat.times(Quaternion(180f, 0f, 0f))
+
+    val props =
+        CameraProperties(
+            eye,
+            Vector3(translation[0], translation[1], -translation[2]),
+            quat,
+            Vector2(intrinsicsArr[0], intrinsicsArr[1]),
+            Vector2(intrinsicsArr[2], intrinsicsArr[3]),
+            Size(sensorSizePx.right, sensorSizePx.bottom),
+        )
+
+    return props
+  }
+
+  /**
+   * Stops the camera session, closing the opened camera and stopping the image reader. Call this
+   * when you want to pause or stop the camera, but possibly resume later.
+   */
+  fun stop() {
+    Log.d(TAG, "stop")
+
+    _isRunning.set(false)
+
+    // wait for any object detection service to finish performing an inference with the image
+    CoroutineScope(Dispatchers.Main).launch {
+      delay(100L)
+      imageReader?.close()
+      imageReader = null
+    }
+
+    session?.close()
+    session = null
+
+    camera?.close()
+    camera = null
+  }
+
+  /**
+   * Stops the camera session by calling [stop], and then disposes of the background thread
+   * resources used for the camera session and image reader. Call this when you want to stop the
+   * camera, and don't plan on restarting it again.
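+   *
+   * For example, from the host activity (a sketch; the lifecycle wiring is an assumption):
+   * ```
+   * override fun onDestroy() {
+   *   cameraController.dispose()
+   *   super.onDestroy()
+   * }
+   * ```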
+ */ + fun dispose() { + Log.d(TAG, "dispose") + + stop() + + if (::cameraExecutor.isInitialized) { + cameraExecutor.shutdown() + } + if (::imageReaderThread.isInitialized) { + imageReaderThread.quitSafely() + } + } + +} diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/objectdetection/domain/camera/enums/CameraEye.kt b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/objectdetection/domain/camera/enums/CameraEye.kt new file mode 100644 index 00000000..bc04257f --- /dev/null +++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/objectdetection/domain/camera/enums/CameraEye.kt @@ -0,0 +1,9 @@ +// (c) Meta Platforms, Inc. and affiliates. Confidential and proprietary. + +package com.meta.pixelandtexel.scanner.feature.objectdetection.domain.camera.enums + +enum class CameraEye { + UNKNOWN, + LEFT, + RIGHT, +} diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/objectdetection/domain/camera/enums/CameraStatus.kt b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/objectdetection/domain/camera/enums/CameraStatus.kt new file mode 100644 index 00000000..684d7d19 --- /dev/null +++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/objectdetection/domain/camera/enums/CameraStatus.kt @@ -0,0 +1,8 @@ +// (c) Meta Platforms, Inc. and affiliates. Confidential and proprietary. + +package com.meta.pixelandtexel.scanner.feature.objectdetection.domain.camera.enums + +enum class CameraStatus { + PAUSED, + SCANNING, +} diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/objectdetection/domain/camera/models/CameraProperties.kt b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/objectdetection/domain/camera/models/CameraProperties.kt new file mode 100644 index 00000000..6b09d154 --- /dev/null +++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/objectdetection/domain/camera/models/CameraProperties.kt @@ -0,0 +1,99 @@ +// (c) Meta Platforms, Inc. and affiliates. Confidential and proprietary. + +package com.meta.pixelandtexel.scanner.feature.objectdetection.domain.camera.models + +import android.util.Size +import com.meta.pixelandtexel.scanner.feature.objectdetection.domain.camera.enums.CameraEye +import com.meta.pixelandtexel.scanner.feature.objectdetection.utils.math.MathUtils +import com.meta.pixelandtexel.scanner.feature.objectdetection.utils.math.Plane +import com.meta.pixelandtexel.scanner.feature.objectdetection.utils.math.Ray +import com.meta.spatial.core.Pose +import com.meta.spatial.core.Quaternion +import com.meta.spatial.core.Vector2 +import com.meta.spatial.core.Vector3 + +/** + * Represents the intrinsic and extrinsic properties of a camera. This class stores information + * about the camera's position, orientation, lens characteristics, and image resolution. It also + * provides utility methods for calculations related to screen points and camera rays. 
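+ *
+ * The underlying model is a standard pinhole camera: a pixel (u, v) maps to the camera-space
+ * ray ((u - cx) / fx, (v' - cy) / fy, 1), where (fx, fy) is [focalLength], (cx, cy) is
+ * [principalPoint], and v' = resolution.height - v flips the y axis (screen coordinates grow
+ * downward, camera space grows upward).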
+ * + * Adapted from: + * https://github.com/oculus-samples/Unity-PassthroughCameraApiSamples/blob/main/Assets/PassthroughCameraApiSamples/PassthroughCamera/Scripts/PassthroughCameraUtils.cs + * + * @property eye Specifies which [CameraEye] eye the camera properties correspond to (e.g. LEFT or + * RIGHT). + * @property translation The [Vector3] position of the camera in 3D space. + * @property rotation The [Quaternion] orientation of the camera. + * @property focalLength The focal length of the camera lens, typically in pixels, for both x and y + * axes in a [Vector2]. + * @property principalPoint The optical center of the image, typically in pixels, in a [Vector2]. + * @property resolution The [Size] width and height of the camera's image sensor or output image, in + * pixels. + */ +class CameraProperties( + val eye: CameraEye, + val translation: Vector3, + val rotation: Quaternion, + val focalLength: Vector2, + val principalPoint: Vector2, + val resolution: Size, +) { + val fov: Float + + init { + // calculate the horizontal field of view in degrees + + val left = Vector2(0f, resolution.height.toFloat() / 2f) + val right = Vector2(resolution.width.toFloat(), resolution.height.toFloat() / 2f) + val leftSidePointInCamera = screenPointToRayInCamera(left) + val rightSidePointInCamera = screenPointToRayInCamera(right) + fov = leftSidePointInCamera.angleBetweenDegrees(rightSidePointInCamera) + } + + /** + * Retrieves the pose (position and rotation) of the camera relative to the head. + * + * @return A [Pose] object representing the camera's translation and rotation. + */ + fun getHeadToCameraPose(): Pose { + return Pose(translation, rotation) + } + + /** + * Converts a 2D point on the screen (in pixel coordinates) to a normalized 3D ray originating + * from the camera's optical center and passing through that screen point, represented in the + * camera's coordinate system. The y-coordinate of the screen point is assumed to be from the + * top-left corner. + * + * @param screenPoint The [Vector2] 2D coordinates (x, y) of the point on the screen. + * @return A normalized [Vector3] representing the direction of the ray in camera space. The ray's + * z-component is 1f, indicating it points forward from the camera. + */ + fun screenPointToRayInCamera(screenPoint: Vector2): Vector3 { + val direction = + Vector3( + x = (screenPoint.x - principalPoint.x) / focalLength.x, + y = ((resolution.height - screenPoint.y) - principalPoint.y) / focalLength.y, + z = 1f, + ) + .normalize() + return direction + } + + /** + * Projects a 2D screen point (in pixel coordinates) onto a virtual 3D plane located at a + * specified distance in front of the camera. + * + * @param screenPoint The 2D coordinates (x, y) of the point on the screen, in a [Vector2]. + * @param viewDistance The distance of the virtual view plane from the camera origin along the + * camera's forward axis. + * @return A [Vector3] representing the 3D coordinates of the projected point on the view plane, + * in the camera's coordinate system. + */ + fun screenPointToPointOnViewPlane(screenPoint: Vector2, viewDistance: Float): Vector3 { + val viewPlane = Plane(Vector3.Forward * viewDistance, -Vector3.Forward) + val direction = screenPointToRayInCamera(screenPoint) + val intersection = MathUtils.rayPlaneIntersection(Ray(Vector3(0f), direction), viewPlane) + return intersection!! 
+  }
+}
diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/objectdetection/domain/repository/detection/DetectionState.kt b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/objectdetection/domain/repository/detection/DetectionState.kt
new file mode 100644
index 00000000..3ee34718
--- /dev/null
+++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/objectdetection/domain/repository/detection/DetectionState.kt
@@ -0,0 +1,12 @@
+package com.meta.pixelandtexel.scanner.feature.objectdetection.domain.repository.detection
+
+import android.media.Image
+import com.meta.pixelandtexel.scanner.feature.objectdetection.datasource.detector.models.DetectedObject
+
+data class DetectionState(
+    val image: Image,
+    val foundObjects: List<DetectedObject>,
+    val updatedObjects: List<DetectedObject>,
+    val lostObjectIds: List<Int>,
+    val finally: () -> Unit,
+)
\ No newline at end of file
diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/objectdetection/domain/repository/detection/IDetectionRepository.kt b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/objectdetection/domain/repository/detection/IDetectionRepository.kt
new file mode 100644
index 00000000..cc61f13a
--- /dev/null
+++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/objectdetection/domain/repository/detection/IDetectionRepository.kt
@@ -0,0 +1,41 @@
+package com.meta.pixelandtexel.scanner.feature.objectdetection.domain.repository.detection
+
+import android.media.Image
+import com.meta.pixelandtexel.scanner.feature.objectdetection.model.RaycastRequestModel
+import kotlinx.coroutines.flow.StateFlow
+
+/**
+ * Defines the contract for a repository that manages the object detection workflow.
+ *
+ * This interface abstracts the logic for processing image frames and exposing the detection
+ * results, allowing for different implementations to be used interchangeably.
+ */
+interface IObjectDetectionRepository {
+
+  /**
+   * A [StateFlow] that emits the latest [DetectionState] result. Consumers can collect this flow
+   * to receive updates when a new object has been detected. A `null` value indicates no current
+   * detection result.
+   */
+  val detectionState: StateFlow<DetectionState?>
+
+  /**
+   * Tries to process a new image frame for object detection.
+   *
+   * Implementations of this method should handle concurrency, such as dropping frames if the
+   * detection process is already busy.
+   *
+   * @param image The image to process.
+   * @param width The width of the image.
+   * @param height The height of the image.
+   * @param finally A callback that must be executed to release the image resource, regardless of
+   *   whether detection was successful or the frame was dropped.
+   */
+  fun processImage(image: Image, width: Int, height: Int, `finally`: () -> Unit)
+
+  fun clear()
+
+  fun requestInfoForObject(id: Int, pose: RaycastRequestModel)
+}
\ No newline at end of file
diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/objectdetection/domain/repository/display/IDisplayEntities.kt b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/objectdetection/domain/repository/display/IDisplayEntities.kt
new file mode 100644
index 00000000..fbf8c276
--- /dev/null
+++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/objectdetection/domain/repository/display/IDisplayEntities.kt
@@ -0,0 +1,35 @@
+package com.meta.pixelandtexel.scanner.feature.objectdetection.domain.repository.display
+
+import com.meta.pixelandtexel.scanner.models.EntityData
+import com.meta.pixelandtexel.scanner.models.smarthomedata.SmartHomeInfoRequest
+import com.meta.spatial.core.Entity
+
+/**
+ * Defines the contract for a repository that manages the creation and data handling of displayed
+ * entities in the scene, such as information panels.
+ */
+interface IDisplayedEntityRepository {
+  /**
+   * Holds the data for the next information panel to be displayed. This data is consumed when a
+   * new panel entity is created.
+   */
+  var newViewModelData: EntityData?
+
+  val entitiesHashMap: HashMap<Int, Entity>
+
+  /**
+   * Creates a generic information panel entity at a calculated position. It also stores the
+   * provided data to be consumed by the panel's composable.
+   *
+   * @param panelId The resource ID for the panel registration.
+   * @param data The information to be displayed on the panel.
+   * @return The newly created panel [Entity].
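+   *
+   * A hedged call-site sketch (the [SmartHomeInfoRequest] is assumed to be built elsewhere):
+   * ```
+   * val panel = displayRepository.createGenericInfoPanel(R.integer.info_panel_id, request)
+   * // later, keyed by the id stored in entitiesHashMap:
+   * displayRepository.deleteEntity(entityId)
+   * ```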
+ */ + fun createGenericInfoPanel( + panelId: Int, + data: SmartHomeInfoRequest, + ): Entity + + fun deleteEntity(entityId: Int) + +} \ No newline at end of file diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/objectdetection/domain/system/TrackedObjectSystem.kt b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/objectdetection/domain/system/TrackedObjectSystem.kt new file mode 100644 index 00000000..0ce319dd --- /dev/null +++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/objectdetection/domain/system/TrackedObjectSystem.kt @@ -0,0 +1,579 @@ +package com.meta.pixelandtexel.scanner.feature.objectdetection.domain.system + +import android.graphics.Rect +import android.net.Uri +import com.meta.pixelandtexel.scanner.R +import com.meta.pixelandtexel.scanner.TrackedObject +import com.meta.pixelandtexel.scanner.feature.objectdetection.android.viewmodels.ObjectLabelViewModel +import com.meta.pixelandtexel.scanner.feature.objectdetection.domain.camera.models.CameraProperties +import com.meta.pixelandtexel.scanner.feature.objectdetection.datasource.detector.models.DetectedObject +import com.meta.pixelandtexel.scanner.feature.objectdetection.utils.IPoolable +import com.meta.pixelandtexel.scanner.feature.objectdetection.utils.ObjectPool +import com.meta.pixelandtexel.scanner.feature.objectdetection.android.views.ObjectLabelScreen +import com.meta.pixelandtexel.scanner.feature.objectdetection.domain.repository.detection.IObjectDetectionRepository +import com.meta.pixelandtexel.scanner.feature.objectdetection.model.RaycastRequestModel +import com.meta.pixelandtexel.scanner.feature.objectdetection.utils.math.MathUtils +import com.meta.pixelandtexel.scanner.feature.objectdetection.utils.math.MathUtils.copy +import com.meta.pixelandtexel.scanner.feature.objectdetection.utils.math.MathUtils.toVector2 +import com.meta.pixelandtexel.scanner.feature.objectdetection.utils.math.Plane +import com.meta.pixelandtexel.scanner.feature.objectdetection.utils.math.Ray +import com.meta.spatial.compose.composePanel +import com.meta.spatial.core.Entity +import com.meta.spatial.core.Pose +import com.meta.spatial.core.Quaternion +import com.meta.spatial.core.Query +import com.meta.spatial.core.SystemBase +import com.meta.spatial.core.Vector2 +import com.meta.spatial.core.Vector3 +import com.meta.spatial.core.Vector4 +import com.meta.spatial.runtime.BlendMode +import com.meta.spatial.runtime.ButtonBits +import com.meta.spatial.runtime.DepthTest +import com.meta.spatial.runtime.HitInfo +import com.meta.spatial.runtime.InputListener +import com.meta.spatial.runtime.LayerConfig +import com.meta.spatial.runtime.PanelShapeLayerBlendType +import com.meta.spatial.runtime.SceneMaterial +import com.meta.spatial.runtime.SceneMaterialAttribute +import com.meta.spatial.runtime.SceneMaterialDataType +import com.meta.spatial.runtime.SceneMesh +import com.meta.spatial.runtime.SceneObject +import com.meta.spatial.runtime.SceneTexture +import com.meta.spatial.runtime.SortOrder +import com.meta.spatial.runtime.StereoMode +import com.meta.spatial.toolkit.AppSystemActivity +import com.meta.spatial.toolkit.Hittable +import com.meta.spatial.toolkit.Material +import com.meta.spatial.toolkit.Mesh +import com.meta.spatial.toolkit.MeshCollision +import com.meta.spatial.toolkit.PanelRegistration +import com.meta.spatial.toolkit.Quad 
+import com.meta.spatial.toolkit.Scale
+import com.meta.spatial.toolkit.SceneObjectSystem
+import com.meta.spatial.toolkit.Transform
+import com.meta.spatial.toolkit.Visible
+import com.meta.spatial.toolkit.createPanelEntity
+import com.meta.spatial.toolkit.reparentChildInWorldCoordinates
+import com.meta.spatial.uiset.theme.SpatialColor
+
+/**
+ * Manages the lifecycle and rendering of tracked objects detected in the scene – handles creating,
+ * updating, and removing visual representations (quads with outlines and labels) for objects
+ * detected in the device camera feed. Also handles user interaction with these tracked objects.
+ *
+ * @property activity The main activity context, used for accessing resources and registering
+ *   panels.
+ * @property detectionRepository The repository used to request info for clicked objects.
+ * @property fov The field of view of the camera, used in projection calculations.
+ * @property headToCameraOffset The transformation from the head's pose to the camera's pose.
+ * @property screenPointToRayInCamera A function that converts a 2D screen point to a 3D ray in
+ *   camera space.
+ * @property screenPointToPointOnViewPlane A function that converts a 2D screen point to a 3D point
+ *   on the view plane at a specified distance.
+ */
+class TrackedObjectSystem(
+    activity: AppSystemActivity,
+    private val detectionRepository: IObjectDetectionRepository,
+    private var fov: Float = 72f,
+    private var headToCameraOffset: Pose = Pose(),
+    private var screenPointToRayInCamera: ((Vector2) -> Vector3) = { _ -> Vector3.Forward },
+    private var screenPointToPointOnViewPlane: ((Vector2, Float) -> Vector3) = { _, _ ->
+      Vector3.Forward
+    },
+) : SystemBase() {
+  companion object {
+    private const val TAG: String = "TrackedObjectSystem"
+
+    private const val Z_DIST: Float = 2f
+    private const val LABEL_PANEL_Y_OFFSET = 0.07f
+
+    // material params for the 9-slice shader and texture
+
+    // the size of the input texture
+    private val SLICE_TEX_SIZE = Vector2(96f, 96f)
+
+    // the slice size of the image – left, top, right, bottom
+    private val SLICE_SIZE = Vector4(32f, 32f, 32f, 32f)
+
+    // the pixels per unit multiplier – increasing this scales the outline width
+    private const val PPU_MULTIPLIER = 1f
+  }
+
+  private val trackedObjectPool = ObjectPool(::createNewTrackedObj)
+
+  // key is the detected object id, not the entity id
+  private var trackedObjects = HashMap<Int, TrackedObjectInfo>()
+
+  private val outlineDrawable = activity.getDrawable(R.drawable.rounded_box_outline)!!
+
+  private var lastTime = System.currentTimeMillis()
+
+  private var currentlyClickedObjectId: Int? = null
+
+  init {
+    activity.registerPanel(
+        PanelRegistration(R.integer.object_label_panel_id) { entity ->
+          config {
+            themeResourceId = R.style.PanelAppThemeTransparent
+            includeGlass = false
+            fractionOfScreen = 0.25f
+            width = 0.5f
+            height = 0.1f
+            layerConfig = LayerConfig()
+            layerBlendType = PanelShapeLayerBlendType.MASKED
+            enableLayerFeatheredEdge = true
+          }
+          composePanel {
+            val info =
+                trackedObjects.values.firstOrNull { it.labelPanelEntity.id == entity.id }
+                    ?: throw RuntimeException("Failed to find tracked object for panel")
+
+            setContent { ObjectLabelScreen(info.uiVM) { onTrackedObjectClicked(info.entity) } }
+          }
+        }
+    )
+  }
+
+  /**
+   * Updates the positions and appearances of tracked objects, and processes new object detections.
+   */
+  override fun execute() {
+    // calculate our delta time
+    val currentTime = System.currentTimeMillis()
+    val dt = ((currentTime - lastTime) / 1000f).coerceAtMost(0.1f)
+    lastTime = currentTime
+
+    findNewObjects()
+
+    val headPose = getScene().getViewerPose()
+
+    // update the positions of all tracked objects
+
+    trackedObjects.forEach { (_, info) ->
+      // our first frame with this pooled item; reveal it and re-enable collision
+
+      if (info.shouldTeleport) {
+        info.entity.setComponent(Visible(true))
+        info.entity.setComponent(Hittable(MeshCollision.LineTest))
+        info.labelPanelEntity.setComponent(Visible(true))
+      }
+
+      val targetPose = getObjectPoseForRay(headPose, info.cameraRayToObject)
+      val targetScale = getObjectScaleForBounds(headPose, info.cameraFrameBounds)
+
+      // snap to target pose/scale on first frame, or smoothly lerp there otherwise
+
+      val newPose =
+          if (info.shouldTeleport) {
+            targetPose
+          } else {
+            val currentPose = info.entity.getComponent<Transform>().transform
+            currentPose.lerp(targetPose, dt * 4f)
+          }
+      val newScale =
+          if (info.shouldTeleport) {
+            targetScale
+          } else {
+            val currentScale = info.entity.getComponent<Scale>().scale
+            currentScale.lerp(targetScale, dt * 4f)
+          }
+
+      info.entity.setComponent(Transform(newPose))
+      info.entity.setComponent(Scale(newScale))
+
+      // update the shader to maintain a consistent outline width
+
+      val sliceParams = Vector4(newScale.x, newScale.y, SLICE_TEX_SIZE.x, SLICE_TEX_SIZE.y)
+      info.outlineMaterial.setAttribute("sliceParams", sliceParams)
+
+      // update the label panel location
+
+      val panelY = -newScale.y / 2f - LABEL_PANEL_Y_OFFSET
+      info.labelPanelEntity.setComponent(Transform(Pose(Vector3(0f, panelY, 0f))))
+
+      info.shouldTeleport = false
+    }
+  }
+
+  /**
+   * Queries for newly created entities with [TrackedObject] components and sets up their mesh,
+   * material, and input listeners.
+   */
+  private fun findNewObjects() {
+    val query = Query.where { changed(TrackedObject.id) }
+    for (entity in query.eval()) {
+      val completable = systemManager.findSystem<SceneObjectSystem>().getSceneObject(entity)
+
+      completable?.thenAccept {
+        // setup our mesh and material
+        val trackedObjectComp = entity.getComponent<TrackedObject>()
+
+        val id = trackedObjectComp.objectId
+
+        val quadMesh =
+            SceneMesh.quad(
+                Vector3(-0.5f, -0.5f, 0f),
+                Vector3(0.5f, 0.5f, 0f),
+                trackedObjects[id]!!.outlineMaterial,
+            )
+        it.setSceneMesh(quadMesh, "trackedObjectQuad")
+
+        // add our on click listener
+        it.addInputListener(
+            object : InputListener {
+              override fun onInput(
+                  receiver: SceneObject,
+                  hitInfo: HitInfo,
+                  sourceOfInput: Entity,
+                  changed: Int,
+                  buttonState: Int,
+                  downTime: Long,
+              ): Boolean {
+                val selectButtons: Int =
+                    ButtonBits.ButtonA or
+                        ButtonBits.ButtonX or
+                        ButtonBits.ButtonTriggerR or
+                        ButtonBits.ButtonTriggerL
+
+                val isButtonPressed = (selectButtons and buttonState and changed) != 0
+                val isButtonHeld = (selectButtons and buttonState) != 0
+
+                if (isButtonPressed && currentlyClickedObjectId == null) {
+                  val trackedObjectComp = receiver.entity!!.getComponent<TrackedObject>()
+                  currentlyClickedObjectId = trackedObjectComp.objectId
+                  onTrackedObjectClicked(receiver.entity!!)
+                }
+
+                if (!isButtonHeld) {
+                  currentlyClickedObjectId = null
+                }
+
+                return true
+              }
+            }
+        )
+      }
+    }
+  }
+
+  /**
+   * Handles the click event on a tracked object by calculating the interaction pose based on the
+   * object's position and the user's head position.
+   *
+   * @param entity The entity representing the clicked tracked object.
+   */
+  private fun onTrackedObjectClicked(entity: Entity) {
+    val comp = entity.getComponent<TrackedObject>()
+    val headPose = getScene().getViewerPose()
+    val headPosition = headPose.t
+
+    val transformComp = entity.getComponent<Transform>()
+    val pose = transformComp.transform
+
+    val direction = (pose.t - headPosition).normalize()
+
+    val scaleComp = entity.getComponent<Scale>()
+    val scale = scaleComp.scale
+    val position = pose.times(Vector3.Right * (scale.x / 2))
+
+    // zero out any pitch to calculate our direction vector
+    position.y = headPosition.y
+
+    val rotation = Quaternion.lookRotationAroundY(position - headPosition)
+
+    detectionRepository.requestInfoForObject(
+        comp.objectId,
+        RaycastRequestModel(headPosition, direction, rotation),
+    )
+  }
+
+  /**
+   * Creates a new [TrackedObjectInfo] instance for the [trackedObjectPool] to generate new objects
+   * when the pool is empty. Sets up the entity, its label panel, and the outline material.
+   *
+   * @return A new [TrackedObjectInfo] instance.
+   */
+  private fun createNewTrackedObj(): TrackedObjectInfo {
+    val material =
+        SceneMaterial.custom(
+                "9slice",
+                arrayOf(
+                    SceneMaterialAttribute("sliceTex", SceneMaterialDataType.Texture2D),
+                    SceneMaterialAttribute("sliceParams", SceneMaterialDataType.Vector4),
+                    SceneMaterialAttribute("sliceSize", SceneMaterialDataType.Vector4),
+                    SceneMaterialAttribute("tintColor", SceneMaterialDataType.Vector4),
+                    SceneMaterialAttribute("stereoParams", SceneMaterialDataType.Vector4),
+                ),
+            )
+            .apply {
+              setBlendMode(BlendMode.TRANSLUCENT)
+              setSortOrder(SortOrder.TRANSLUCENT)
+              setDepthTest(DepthTest.ALWAYS)
+              setStereoMode(StereoMode.None)
+
+              setAttribute("sliceTex", SceneTexture(outlineDrawable))
+
+              // x = quad width, y = quad height, z = texture width, w = texture height
+              setAttribute("sliceParams", Vector4(1f, 1f, SLICE_TEX_SIZE.x, SLICE_TEX_SIZE.y))
+              // slice size in order: left, right, top, bottom
+              setAttribute("sliceSize", SLICE_SIZE)
+
+              // rgb, a = pixels per unit multiplier
+              setAttribute(
+                  "tintColor",
+                  Vector4(
+                      SpatialColor.b70.red,
+                      SpatialColor.b70.green,
+                      SpatialColor.b70.blue,
+                      PPU_MULTIPLIER,
+                  ),
+              )
+            }
+
+    val entity =
+        Entity.create(
+            Transform(),
+            Scale(1f),
+            TrackedObject(),
+            Mesh(Uri.parse("mesh://quad")),
+            Quad(),
+            Material(),
+            Visible(false),
+            Hittable(MeshCollision.NoCollision),
+        )
+    val labelPanelEntity =
+        Entity.createPanelEntity(
+            R.integer.object_label_panel_id,
+            Transform(),
+            Visible(false),
+            Hittable(MeshCollision.NoCollision),
+        )
+
+    // wait until panel is created to do this?
+    reparentChildInWorldCoordinates(entity, labelPanelEntity)
+
+    return TrackedObjectInfo(entity, labelPanelEntity, material)
+  }
+
+  /**
+   * Called when new objects are detected in the camera's video feed. For each detected object, it
+   * takes or creates a [TrackedObjectInfo] instance, updates its properties, and adds it to the
+   * [trackedObjects] map.
+   *
+   * @param objects A list of [DetectedObject] instances representing the newly found objects.
+   */
+  fun onObjectsFound(objects: List<DetectedObject>) {
+    for (obj in objects) {
+      if (obj.id == null) {
+        continue
+      }
+
+      val ray = screenPointToRayInCamera(obj.point.toVector2())
+
+      // create (or recycle) a new entity to represent the tracked object
+      val trackedObj = trackedObjectPool.take()
+      trackedObj.update(obj.id, ray, obj.bounds, obj.label)
+
+      trackedObjects[obj.id] = trackedObj
+    }
+
+    // Log.d(TAG, "Added ${objects.size} tracked objects; new size: ${trackedObjects.size}")
+  }
+
+  /**
+   * Called when existing tracked objects are updated by the object detector. Updates the camera
+   * ray and bounding box for each corresponding [TrackedObjectInfo].
+   *
+   * @param objects A list of [DetectedObject] instances with updated information.
+   */
+  fun onObjectsUpdated(objects: List<DetectedObject>) {
+    for (obj in objects) {
+      if (obj.id == null || !trackedObjects.keys.contains(obj.id)) {
+        continue
+      }
+
+      val ray = screenPointToRayInCamera(obj.point.toVector2())
+
+      // update the camera ray to detected object
+
+      trackedObjects[obj.id]!!.cameraRayToObject = ray
+      trackedObjects[obj.id]!!.cameraFrameBounds = obj.bounds
+    }
+
+    // Log.d(TAG, "Updated ${objects.size} tracked objects; new size: ${trackedObjects.size}")
+  }
+
+  /**
+   * Called when objects are lost or no longer detected by the object detector. It removes the
+   * corresponding [TrackedObjectInfo] instances from [trackedObjects] and returns them to the
+   * [trackedObjectPool].
+   *
+   * @param objectIds A list of IDs for the objects that were lost.
+   */
+  fun onObjectsLost(objectIds: List<Int>) {
+    for (id in objectIds) {
+      // remove the tracked object, and recycle the associated entity
+
+      val trackedObject = trackedObjects[id] ?: continue
+
+      trackedObjectPool.put(trackedObject)
+      trackedObjects.remove(id)
+    }
+
+    // Log.d(TAG, "Removed ${objectIds.size} tracked objects; new size: ${trackedObjects.size}")
+  }
+
+  /**
+   * Calculates the world pose (position and rotation) for a tracked object based on a ray from the
+   * camera. The object is positioned on a view plane at a fixed Z distance and oriented to face
+   * the camera.
+   *
+   * @param headPose The current pose of the user's head.
+   * @param ray The direction ray from the camera to the object in camera space.
+   * @return The calculated [Pose] for the object in world space.
+   */
+  private fun getObjectPoseForRay(headPose: Pose, ray: Vector3): Pose {
+    val cameraPose = headPose.times(headToCameraOffset)
+    val worldRayDir = cameraPose.q.times(ray).normalize()
+
+    // find position by intersecting direction to view plane
+    val viewPlane = Plane(cameraPose.forward() * Z_DIST, -cameraPose.forward())
+    val position = MathUtils.rayPlaneIntersection(Ray(cameraPose.t, worldRayDir), viewPlane)!!
+
+    // orient this to face the user
+    val rotation = cameraPose.q
+
+    return Pose(position, rotation)
+  }
+
+  /**
+   * Calculates the world scale for a tracked object based on its screen-space bounding box.
+   * Projects the corners of the bounding box onto the view plane and uses the distances between
+   * these projected points to determine the width and height.
+   *
+   * @param headPose The current [Pose] of the user's head.
+   * @param bounds The screen-space bounding box ([Rect]) of the detected object.
+   * @return The calculated [Vector3] scale for the object.
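+   *
+   * In effect: with the top-left (tl), top-right (tr), and bottom-left (bl) corners projected
+   * onto the view plane, width = |tr - tl| and height = |tl - bl|, so the quad's world scale
+   * matches the object's apparent size at [Z_DIST].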
+ */ + private fun getObjectScaleForBounds(headPose: Pose, bounds: Rect): Vector3 { + val cameraPose = headPose.times(headToCameraOffset) + val viewPlane = Plane(cameraPose.forward() * Z_DIST, -cameraPose.forward()) + + // find 3 points of the bounds projected into world space relative to the camera pose + + val topLeft = + screenPointToPointOnViewPlane( + Vector2(bounds.left.toFloat(), bounds.top.toFloat()), + Z_DIST + ) + val tlWorldRayDir = cameraPose.q.times(topLeft).normalize() + val tlPosition = + MathUtils.rayPlaneIntersection(Ray(cameraPose.t, tlWorldRayDir), viewPlane)!! + + val topRight = + screenPointToPointOnViewPlane( + Vector2(bounds.right.toFloat(), bounds.top.toFloat()), + Z_DIST + ) + val trWorldRayDir = cameraPose.q.times(topRight).normalize() + val trPosition = + MathUtils.rayPlaneIntersection(Ray(cameraPose.t, trWorldRayDir), viewPlane)!! + + val bottomLeft = + screenPointToPointOnViewPlane( + Vector2(bounds.left.toFloat(), bounds.bottom.toFloat()), + Z_DIST, + ) + val blWorldRayDir = cameraPose.q.times(bottomLeft).normalize() + val blPosition = + MathUtils.rayPlaneIntersection(Ray(cameraPose.t, blWorldRayDir), viewPlane)!! + + // use the distance between those corners to calculate the new panel scale + + val height = (tlPosition - blPosition).length() + val width = (trPosition - tlPosition).length() + + return Vector3(width, height, 1f) + } + + /** + * Called when the camera configuration changes, and updates camera-related properties used by the + * system. + * + * @param properties The new [CameraProperties] containing FOV, head-to-camera offset, and + * screen-to-ray conversion functions. + */ + fun onCameraPropertiesChanged(properties: CameraProperties) { + fov = properties.fov + headToCameraOffset = properties.getHeadToCameraPose() + screenPointToRayInCamera = properties::screenPointToRayInCamera + screenPointToPointOnViewPlane = properties::screenPointToPointOnViewPlane + } + + fun clear(){ + trackedObjects.forEach { + (_, trackedObjectInfo) -> + trackedObjectPool.put(trackedObjectInfo) + } + + trackedObjects.clear() + } + + /** + * Data class holding information about a single tracked object – including its entity, label + * panel entity, material, UI view model, and state related to its position and appearance. + * Implements [IPoolable] to be used with [ObjectPool]. + * + * @property entity The main entity representing the tracked object (the quad). + * @property labelPanelEntity The entity for the label panel associated with this object. + * @property outlineMaterial The [SceneMaterial] used for the object's outline. + * @property uiVM The [ObjectLabelViewModel] for the object's label UI. + * @property cameraRayToObject The ray from the camera to the object in camera space. + * @property cameraFrameBounds The bounding box of the object in the camera frame. + * @property targetPose The target pose for the object in the next frame. + * @property targetScale The target scale for the object in the next frame. + */ + private data class TrackedObjectInfo( + val entity: Entity, + val labelPanelEntity: Entity, + val outlineMaterial: SceneMaterial, + val uiVM: ObjectLabelViewModel = ObjectLabelViewModel(), + var cameraRayToObject: Vector3 = Vector3.Forward, + var cameraFrameBounds: Rect = Rect(), + var targetPose: Pose = Pose(), + var targetScale: Vector3 = Vector3(0f), + ) : IPoolable { + var shouldTeleport = false + + /** + * Updates the [TrackedObjectInfo] with new data from a [DetectedObject]. 
Sets the object ID, + * camera ray, bounding rectangle, and display name. Marks [shouldTeleport] as true to indicate + * it's a new or re-activated object. + * + * @param id The ID of the detected object. + * @param ray The ray from the camera to the object. + * @param rect The bounding box of the object in the camera frame. + * @param name The display name for the object's label. + */ + fun update(id: Int, ray: Vector3, rect: Rect, name: String) { + cameraRayToObject.copy(ray) + cameraFrameBounds = rect + + entity.setComponent(TrackedObject(id)) + uiVM.updateName(name) + + shouldTeleport = true + } + + /** + * Resets the [TrackedObjectInfo] to a default state when it's returned to the pool. Hides the + * entity and its label panel, and disables collision. + */ + override fun reset() { + entity.setComponent(Visible(false)) + entity.setComponent(Hittable(MeshCollision.NoCollision)) + labelPanelEntity.setComponent(Visible(false)) + } + } +} diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/objectdetection/domain/system/ViewLockedSystem.kt b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/objectdetection/domain/system/ViewLockedSystem.kt new file mode 100644 index 00000000..bb5e3261 --- /dev/null +++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/objectdetection/domain/system/ViewLockedSystem.kt @@ -0,0 +1,144 @@ +// (c) Meta Platforms, Inc. and affiliates. Confidential and proprietary. + +package com.meta.pixelandtexel.scanner.feature.objectdetection.domain.system + +import android.util.Log +import com.meta.pixelandtexel.scanner.ViewLocked +import com.meta.pixelandtexel.scanner.feature.objectdetection.domain.camera.models.CameraProperties +import com.meta.pixelandtexel.scanner.feature.objectdetection.utils.math.MathUtils +import com.meta.spatial.core.Entity +import com.meta.spatial.core.Pose +import com.meta.spatial.core.Quaternion +import com.meta.spatial.core.Query +import com.meta.spatial.core.SystemBase +import com.meta.spatial.runtime.PanelSceneObject +import com.meta.spatial.toolkit.Panel +import com.meta.spatial.toolkit.SceneObjectSystem +import com.meta.spatial.toolkit.Transform + +/** + * Manages entities that are "view-locked," meaning their position and orientation are relative to + * the user's head pose. This system ensures these entities remain in a fixed position within the + * user's field of view, optionally adjusting their distance to fill the view if they are panels. + * + * @property fov The current field of view of the camera in degrees. Defaults to 72f. + */ +class ViewLockedSystem(private var fov: Float = 72f) : SystemBase() { + companion object { + private const val TAG: String = "ViewLockedSystem" + } + + // key is the entity id + private var viewLockedEntities = HashMap() + + override fun execute() { + findNewEntities() + processEntities() + } + + /** + * Queries the scene for new entities that have a [ViewLocked] component and a [Transform] + * component. For each new entity, it retrieves its scene object and initializes its + * [ViewLockedInfo]. If the entity has a [Panel] component and `fillView` is true in its + * [ViewLocked] component, it calculates the appropriate distance to make the panel fill the view. 
+ */ + private fun findNewEntities() { + val q = Query.where { has(ViewLocked.id, Transform.id) and changed(ViewLocked.id) } + for (entity in q.eval()) { + if (viewLockedEntities.contains(entity.id)) { + continue + } + + Log.d(TAG, "Found view locked entity ${entity.id}") + + val completable = + systemManager.findSystem().getSceneObject(entity) ?: continue + + completable.thenAccept { sceneObject -> + val viewLockedComp = entity.getComponent() + + var distance = 0f + var panelWidth = 0f + + if (entity.hasComponent() && viewLockedComp.fillView) { + val panelSceneObject = sceneObject as PanelSceneObject + val shapeConfig = panelSceneObject.getPanelShapeConfig() + + panelWidth = shapeConfig!!.width + distance = MathUtils.panelDistanceForSize(fov, shapeConfig.width) + } + + // initialize our view locked info + viewLockedEntities[entity.id] = ViewLockedInfo(entity, panelWidth, distance) + } + } + } + + /** + * Processes all currently tracked view-locked entities, retrieving the user's head transform and + * updating each entity's transform to be relative to the head pose. + */ + private fun processEntities() { + val headPose = getScene().getViewerPose() + + viewLockedEntities.forEach { (_, info) -> + val viewLockedComp = info.entity.getComponent() + + val quat = + Quaternion( + viewLockedComp.rotation.x, + viewLockedComp.rotation.y, + viewLockedComp.rotation.z, + ) + val newPose = headPose.times(Pose(viewLockedComp.position, quat)) + newPose.t += newPose.forward() * info.distance + + info.entity.setComponent(Transform(newPose)) + } + } + + /** + * Called when the camera configuration changes, and updates camera-related properties used by the + * system. Updates the system's field of view (FOV) and recalculates the distance for view- locked + * entities that are configured to fill the view. It also updates the position and rotation + * offsets based on the new head-to-camera pose if `fillView` is enabled. + * + * @param properties The new [CameraProperties] containing FOV, head-to-camera offset, and + * screen-to-ray conversion functions. + */ + fun onCameraPropertiesChanged(properties: CameraProperties) { + fov = properties.fov + + // update each ViewLockedInfo distance with the calculated value using the new fov + + viewLockedEntities.forEach { (_, info) -> + val viewLockedComp = info.entity.getComponent() + + // recalculate the z offset with the new fov + if (viewLockedComp.fillView) { + // use the new pose offset values + val offsetPose = properties.getHeadToCameraPose() + viewLockedComp.position = offsetPose.t + viewLockedComp.rotation = offsetPose.q.toEuler() + + info.entity.setComponent(viewLockedComp) + info.distance = MathUtils.panelDistanceForSize(fov, info.panelWidth) + } + } + } + + /** + * Data class to store information about a view-locked entity. + * + * @property entity The view-locked [Entity] itself. + * @property panelWidth The width of the panel, if the entity is a panel and `fillView` is true. + * Used for FOV-based distance calculations. Defaults to 0f. + * @property distance The calculated or specified distance from the head to the entity. Can be + * dynamically updated, for example, when FOV changes. 
+   */
+  private data class ViewLockedInfo(
+      val entity: Entity,
+      val panelWidth: Float,
+      var distance: Float
+  )
+}
diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/objectdetection/model/RaycastRequestModel.kt b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/objectdetection/model/RaycastRequestModel.kt
new file mode 100644
index 00000000..a941adae
--- /dev/null
+++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/objectdetection/model/RaycastRequestModel.kt
@@ -0,0 +1,10 @@
+package com.meta.pixelandtexel.scanner.feature.objectdetection.model
+
+import com.meta.spatial.core.Quaternion
+import com.meta.spatial.core.Vector3
+
+data class RaycastRequestModel(
+    val headPosition: Vector3,
+    val direction: Vector3,
+    val rotation: Quaternion
+)
\ No newline at end of file
diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/objectdetection/utils/Event.kt b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/objectdetection/utils/Event.kt
new file mode 100644
index 00000000..fa53f138
--- /dev/null
+++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/objectdetection/utils/Event.kt
@@ -0,0 +1,20 @@
+package com.meta.pixelandtexel.scanner.feature.objectdetection.utils
+
+class Event1<T> {
+  private val observers = mutableSetOf<(T) -> Unit>()
+
+  operator fun plusAssign(observer: (T) -> Unit) {
+    observers.add(observer)
+  }
+
+  operator fun minusAssign(observer: (T) -> Unit) {
+    observers.remove(observer)
+  }
+
+  operator fun invoke(value: T) {
+    for (observer in observers) {
+      observer(value)
+    }
+  }
+}
+
diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/objectdetection/utils/ImageUtils.kt b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/objectdetection/utils/ImageUtils.kt
new file mode 100644
index 00000000..0fe495d7
--- /dev/null
+++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/objectdetection/utils/ImageUtils.kt
@@ -0,0 +1,269 @@
+// (c) Meta Platforms, Inc. and affiliates. Confidential and proprietary.
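The `Event1` class above is a minimal single-argument observer/event helper. A usage sketch (the names here are illustrative, not part of the repository):

```kotlin
// Hypothetical consumer of Event1<T> (illustrative only)
val onObjectDetected = Event1<String>()

fun wireUp() {
  val logObserver: (String) -> Unit = { name -> println("detected: $name") }
  onObjectDetected += logObserver // subscribe
  onObjectDetected("television")  // raise: every observer receives the value
  onObjectDetected -= logObserver // unsubscribe
}
```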
+ +package com.meta.pixelandtexel.scanner.feature.objectdetection.utils + +import android.graphics.Bitmap +import android.graphics.ImageFormat +import android.graphics.Rect +import android.media.Image +import androidx.core.graphics.createBitmap +import java.nio.ByteBuffer + +object ImageUtils { + /** Returns a RGB888 ByteBuffer representation of a YUV_420_888 Image */ + fun Image.getByteBuffer(): ByteBuffer { + if (this.format != ImageFormat.YUV_420_888) { + throw IllegalArgumentException( + "Unsupported format ${this.format}; expected ${ImageFormat.YUV_420_888}" + ) + } + + val width = this.width + val height = this.height + + // RGB values (3 bytes per pixel) + val rgbBuffer = ByteBuffer.allocateDirect(width * height * 3) + + // retrieve the image planes + + val planes = this.planes + val yBuffer = planes[0].buffer + val uBuffer = planes[1].buffer + val vBuffer = planes[2].buffer + + // get row and pixel strides for each plane + + val yRowStride = planes[0].rowStride + val yPixelStride = planes[0].pixelStride + val uRowStride = planes[1].rowStride + val uPixelStride = planes[1].pixelStride + val vRowStride = planes[2].rowStride + val vPixelStride = planes[2].pixelStride + + // loop over every pixel in the Y plane + + for (row in 0 until height) { + for (col in 0 until width) { + // calculate the index for Y value + + val yIndex = row * yRowStride + col * yPixelStride + val y = yBuffer.get(yIndex).toInt() and 0xFF + + // U and V are subsampled by 2 (YUV 4:2:0) + + val uvRow = row / 2 + val uvCol = col / 2 + val uIndex = uvRow * uRowStride + uvCol * uPixelStride + val vIndex = uvRow * vRowStride + uvCol * vPixelStride + val u = uBuffer.get(uIndex).toInt() and 0xFF + val v = vBuffer.get(vIndex).toInt() and 0xFF + + // convert YUV to RGB using standard formulas + + val c = y - 16 + val d = u - 128 + val e = v - 128 + + var r = (298 * c + 409 * e + 128) shr 8 + var g = (298 * c - 100 * d - 208 * e + 128) shr 8 + var b = (298 * c + 516 * d + 128) shr 8 + + // clamp the values to the [0, 255] range + + r = r.coerceIn(0, 255) + g = g.coerceIn(0, 255) + b = b.coerceIn(0, 255) + + // write the RGB values into the ByteBuffer sequentially + + rgbBuffer.put(r.toByte()) + rgbBuffer.put(g.toByte()) + rgbBuffer.put(b.toByte()) + } + } + + // reset the ByteBuffer's position to the beginning + rgbBuffer.rewind() + + return rgbBuffer + } + + /** Returns a ARGB_8888 Bitmap representation of a YUV_420_888 Image */ + fun Image.getBitmap(): Bitmap { + if (this.format != ImageFormat.YUV_420_888) { + throw IllegalArgumentException( + "Unsupported format ${this.format}; expected ${ImageFormat.YUV_420_888}" + ) + } + + val width = this.width + val height = this.height + + // init output structures + + val bitmap = createBitmap(width, height, Bitmap.Config.ARGB_8888) + val pixels = IntArray(width * height) + + // retrieve the image planes + + val planes = this.planes + val yBuffer = planes[0].buffer + val uBuffer = planes[1].buffer + val vBuffer = planes[2].buffer + + // get row and pixel strides for each plane + + val yRowStride = planes[0].rowStride + val yPixelStride = planes[0].pixelStride + val uRowStride = planes[1].rowStride + val uPixelStride = planes[1].pixelStride + val vRowStride = planes[2].rowStride + val vPixelStride = planes[2].pixelStride + + // index for the 'pixels' array + var pixelIndex = 0 + + // loop over every pixel in the Y plane + + for (row in 0 until height) { + for (col in 0 until width) { + // calculate the index for Y value + + val yIndex = row * yRowStride + col * yPixelStride + val y 
= yBuffer.get(yIndex).toInt() and 0xFF + + // U and V are subsampled by 2 (YUV 4:2:0) + + val uvRow = row / 2 + val uvCol = col / 2 + val uIndex = uvRow * uRowStride + uvCol * uPixelStride + val vIndex = uvRow * vRowStride + uvCol * vPixelStride + val u = uBuffer.get(uIndex).toInt() and 0xFF + val v = vBuffer.get(vIndex).toInt() and 0xFF + + // convert YUV to RGB using standard formulas + + val c = y - 16 + val d = u - 128 + val e = v - 128 + + var r = (298 * c + 409 * e + 128) shr 8 + var g = (298 * c - 100 * d - 208 * e + 128) shr 8 + var b = (298 * c + 516 * d + 128) shr 8 + + // clamp the values to the [0, 255] range + + r = r.coerceIn(0, 255) + g = g.coerceIn(0, 255) + b = b.coerceIn(0, 255) + + // combine into ARGB_8888 format (fully opaque alpha) and store in IntArray + + pixels[pixelIndex++] = (0xFF shl 24) or (r shl 16) or (g shl 8) or b + } + } + + // set our bitmap pixels + + bitmap.setPixels(pixels, 0, width, 0, 0, width, height) + + return bitmap + } + + /** + * Returns a ARGB_8888 Bitmap representation of a YUV_420_888 Image, cropped to the Rect. + * + * @param crop The [Rect] representing the area from the source image to crop from. + * @return + */ + fun Image.getBitmap(crop: Rect): Bitmap { + if (this.format != ImageFormat.YUV_420_888) { + throw IllegalArgumentException( + "Unsupported format ${this.format}; expected ${ImageFormat.YUV_420_888}" + ) + } + + // validate our bounds + + require(!crop.isEmpty) { "Bounds empty" } + require( + crop.left >= 0 && crop.top >= 0 && crop.right <= this.width && crop.bottom <= this.height + ) { + "Bounds not within image dimensions" + } + require(crop.width() > 0 && crop.height() > 0) { "Bounds must be non zero" } + + val width = crop.width() + val height = crop.height() + + // init output structures + + val bitmap = createBitmap(width, height, Bitmap.Config.ARGB_8888) + val pixels = IntArray(width * height) + + // retrieve the image planes + + val planes = this.planes + val yBuffer = planes[0].buffer + val uBuffer = planes[1].buffer + val vBuffer = planes[2].buffer + + // get row and pixel strides for each plane + + val yRowStride = planes[0].rowStride + val yPixelStride = planes[0].pixelStride + val uRowStride = planes[1].rowStride + val uPixelStride = planes[1].pixelStride + val vRowStride = planes[2].rowStride + val vPixelStride = planes[2].pixelStride + + // index for the 'pixels' array + var pixelIndex = 0 + + // loop over every pixel in the Y plane + + for (row in crop.top until crop.bottom) { + for (col in crop.left until crop.right) { + // calculate the index for Y value + + val yIndex = row * yRowStride + col * yPixelStride + val y = yBuffer.get(yIndex).toInt() and 0xFF + + // U and V are subsampled by 2 (YUV 4:2:0) + + val uvRow = row / 2 + val uvCol = col / 2 + val uIndex = uvRow * uRowStride + uvCol * uPixelStride + val vIndex = uvRow * vRowStride + uvCol * vPixelStride + val u = uBuffer.get(uIndex).toInt() and 0xFF + val v = vBuffer.get(vIndex).toInt() and 0xFF + + // convert YUV to RGB using standard formulas + + val c = y - 16 + val d = u - 128 + val e = v - 128 + + var r = (298 * c + 409 * e + 128) shr 8 + var g = (298 * c - 100 * d - 208 * e + 128) shr 8 + var b = (298 * c + 516 * d + 128) shr 8 + + // clamp the values to the [0, 255] range + + r = r.coerceIn(0, 255) + g = g.coerceIn(0, 255) + b = b.coerceIn(0, 255) + + // combine into ARGB_8888 format (fully opaque alpha) and store in IntArray + + pixels[pixelIndex++] = (0xFF shl 24) or (r shl 16) or (g shl 8) or b + } + } + + // set our bitmap pixels + + 
bitmap.setPixels(pixels, 0, width, 0, 0, width, height) + + return bitmap + } +} diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/objectdetection/utils/NumberSmoother.kt b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/objectdetection/utils/NumberSmoother.kt new file mode 100644 index 00000000..b2e602b1 --- /dev/null +++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/objectdetection/utils/NumberSmoother.kt @@ -0,0 +1,55 @@ +// (c) Meta Platforms, Inc. and affiliates. Confidential and proprietary. + +package com.meta.pixelandtexel.scanner.feature.objectdetection.utils + +/** + * A simple class that smoothes a number over time, useful for tracking rapidly changing numbers + * like frame rates. Applies exponential smoothing over time. + * + * @property alpha How much smoothing to apply to the number. + */ +class NumberSmoother(private val alpha: Float = 0.1f) { + private var smoothed: Float = 0.0f + private var firstUpdate: Boolean = true + + /** + * Updates the smoothed value with a new integer input, first converting to a float. + * + * @param current The new integer value to incorporate into the smoothed average. + */ + fun update(current: Int) { + update(current.toFloat()) + } + + /** + * Updates the smoothed value with a new long input, first converting to a float. + * + * @param current The new long value to incorporate into the smoothed average. + */ + fun update(current: Long) { + update(current.toFloat()) + } + + /** + * Updates the smoothed value with a new float input using exponential smoothing. If this is the + * first call to an `update` method for this instance, the `smoothed` value is initialized + * directly with the `current` value. + * + * @param current The new float value to incorporate into the smoothed average. + */ + fun update(current: Float) { + if (firstUpdate) { + smoothed = current + firstUpdate = false + } else { + smoothed = alpha * current + (1 - alpha) * smoothed + } + } + + /** + * Retrieves the current smoothed number. + * + * @return The latest calculated smoothed float value. + */ + fun getSmoothedNumber(): Float = smoothed +} diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/objectdetection/utils/ObjectPool.kt b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/objectdetection/utils/ObjectPool.kt new file mode 100644 index 00000000..7d9d3b0d --- /dev/null +++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/objectdetection/utils/ObjectPool.kt @@ -0,0 +1,58 @@ +// (c) Meta Platforms, Inc. and affiliates. Confidential and proprietary. + +package com.meta.pixelandtexel.scanner.feature.objectdetection.utils + +/** + * Meant to be used in tandem with [ObjectPool], implement this interface to support an object + * pooling pattern, with support for resetting objects as they are returned to the object pool. + */ +interface IPoolable { + /** Implement this to reset your type before it is returned to the object pool. */ + fun reset() +} + +/** + * A barebones class implementing an object pooling or caching pattern. 
Useful for structures that are frequently added and
+ * deleted, as it reuses allocated memory instead of repeatedly allocating and releasing.
+ *
+ * @param T A structure implementing the [IPoolable] pattern, which will be the pooled object type
+ *   specified for this object pool instance.
+ * @param initialSize The initial size to pre-populate the object pool, if any.
+ * @property factory The factory function which is used to generate new pooled object instances.
+ */
+class ObjectPool<T : IPoolable>(private val factory: () -> T, initialSize: Int = 0) {
+  private val pool = ArrayDeque<T>()
+
+  val availableObjects: Int
+    get() = pool.size
+
+  init {
+    require(initialSize >= 0) { "Invalid initial size" }
+
+    // optional, initial population of pool
+    repeat(initialSize) { pool.addLast(factory()) }
+  }
+
+  /**
+   * Takes an instance of the [IPoolable] object type from the pool, creating one if necessary.
+   *
+   * @return The [IPoolable] instance.
+   */
+  fun take(): T {
+    return if (pool.isNotEmpty()) {
+      pool.removeLast()
+    } else {
+      factory()
+    }
+  }
+
+  /**
+   * Returns the instance to the object pool, first resetting it.
+   *
+   * @param obj The [IPoolable] instance to return.
+   */
+  fun put(obj: T) {
+    obj.reset()
+    pool.addLast(obj)
+  }
+}
diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/objectdetection/utils/math/MathUtils.kt b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/objectdetection/utils/math/MathUtils.kt
new file mode 100644
index 00000000..e65680f9
--- /dev/null
+++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/objectdetection/utils/math/MathUtils.kt
@@ -0,0 +1,115 @@
+// (c) Meta Platforms, Inc. and affiliates. Confidential and proprietary.
+
+package com.meta.pixelandtexel.scanner.feature.objectdetection.utils.math
+
+import android.graphics.PointF
+import android.graphics.Rect
+import com.meta.spatial.core.Vector2
+import com.meta.spatial.core.Vector3
+import kotlin.math.PI
+import kotlin.math.abs
+import kotlin.math.max
+import kotlin.math.min
+import kotlin.math.tan
+
+object MathUtils {
+  const val EPSILON = 1e-9
+  const val DEG_TO_RAD = PI / 180f
+
+  /**
+   * Calculates the distance at which a panel must be spaced from the camera (at the specified
+   * camera horizontal field of view and panel width) in order for the panel to exactly fill the
+   * field of view, width-wise.
+   *
+   * @param fov The camera's horizontal field of view, in degrees.
+   * @param size The width of the panel.
+   * @return The distance at which to place the panel.
+   */
+  fun panelDistanceForSize(fov: Float, size: Float): Float {
+    val rad = fov * DEG_TO_RAD
+    val distance = (size / 2f) / tan(rad / 2f)
+    return distance.toFloat()
+  }
+
+  /**
+   * Tests whether or not a ray intersects with a plane, and returns the intersection point if so.
+   *
+   * @param ray The ray to test the intersection.
+   * @param plane The plane against which to test the intersection.
+   * @return The 3D point representing the intersection, or null if there is no intersection.
+   */
+  fun rayPlaneIntersection(ray: Ray, plane: Plane): Vector3?
{
+    // direction vectors are too close to 0
+    if (
+        plane.normal.lengthSquared() < EPSILON * EPSILON ||
+            ray.direction.lengthSquared() < EPSILON * EPSILON
+    ) {
+      return null
+    }
+
+    val denom = plane.normal.dot(ray.direction)
+
+    // check if ray is parallel to plane
+    if (abs(denom) < EPSILON) {
+      return null
+    }
+
+    // ray intersects; calculate distance parameter t
+    val originToPlanePoint = plane.point - ray.origin
+    val t = plane.normal.dot(originToPlanePoint) / denom
+
+    // check if intersection is behind ray's origin
+    if (t < -EPSILON) {
+      return null
+    }
+
+    val intersectionPoint = ray.origin + ray.direction * t
+    return intersectionPoint
+  }
+
+  fun Vector3.lengthSquared(): Float {
+    return this.dot(this)
+  }
+
+  fun Vector3.copy(other: Vector3): Vector3 {
+    this.x = other.x
+    this.y = other.y
+    this.z = other.z
+    return this // for chaining
+  }
+
+  fun PointF.toVector2(): Vector2 {
+    return Vector2(this.x, this.y)
+  }
+
+  /**
+   * Computes the overlapping intersection between this Rect and another, setting the result to the
+   * intersection, and returning a boolean representing whether or not there was any intersection.
+   *
+   * @param other The other [Rect] against which to test the intersection.
+   * @param result The resulting [Rect] intersection overlap between the two rectangles.
+   * @return Whether or not there was any intersection.
+   */
+  fun Rect.intersection(other: Rect, result: Rect): Boolean {
+    val intersectLeft = max(this.left, other.left)
+    val intersectTop = max(this.top, other.top)
+    val intersectRight = min(this.right, other.right)
+    val intersectBottom = min(this.bottom, other.bottom)
+
+    val intersection = Rect(intersectLeft, intersectTop, intersectRight, intersectBottom)
+    result.copy(intersection)
+
+    return !result.isEmpty
+  }
+
+  fun Rect.copy(other: Rect) {
+    this.left = other.left
+    this.top = other.top
+    this.right = other.right
+    this.bottom = other.bottom
+  }
+
+  fun Rect.area(): Int {
+    return this.width() * this.height()
+  }
+}
diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/objectdetection/utils/math/Plane.kt b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/objectdetection/utils/math/Plane.kt
new file mode 100644
index 00000000..04bcd9a7
--- /dev/null
+++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/objectdetection/utils/math/Plane.kt
@@ -0,0 +1,7 @@
+// (c) Meta Platforms, Inc. and affiliates. Confidential and proprietary.
+
+package com.meta.pixelandtexel.scanner.feature.objectdetection.utils.math
+
+import com.meta.spatial.core.Vector3
+
+data class Plane(var point: Vector3, var normal: Vector3)
diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/objectdetection/utils/math/Ray.kt b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/objectdetection/utils/math/Ray.kt
new file mode 100644
index 00000000..955bacfd
--- /dev/null
+++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/feature/objectdetection/utils/math/Ray.kt
@@ -0,0 +1,7 @@
+// (c) Meta Platforms, Inc. and affiliates. Confidential and proprietary.
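As a quick sanity check of `rayPlaneIntersection` above: a ray cast straight down from one meter above a floor plane should hit the origin. A sketch using the `Ray` and `Plane` types from this module (illustrative, not part of the repository):

```kotlin
// Illustrative check, not part of the repository
val ray = Ray(origin = Vector3(0f, 1f, 0f), direction = Vector3(0f, -1f, 0f))
val floor = Plane(point = Vector3(0f, 0f, 0f), normal = Vector3(0f, 1f, 0f))
// denom = -1, t = 1, so the hit point is origin + direction * 1 = (0, 0, 0)
val hit = MathUtils.rayPlaneIntersection(ray, floor)
```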
+
+package com.meta.pixelandtexel.scanner.feature.objectdetection.utils.math
+
+import com.meta.spatial.core.Vector3
+
+data class Ray(var origin: Vector3, var direction: Vector3)
diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/models/EntityData.kt b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/models/EntityData.kt
new file mode 100644
index 00000000..8d516e22
--- /dev/null
+++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/models/EntityData.kt
@@ -0,0 +1,8 @@
+package com.meta.pixelandtexel.scanner.models
+
+import com.meta.pixelandtexel.scanner.models.smarthomedata.SmartHomeInfoRequest
+
+data class EntityData(
+    val entityId: Int,
+    val data: SmartHomeInfoRequest
+)
diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/models/devices/Device.kt b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/models/devices/Device.kt
new file mode 100644
index 00000000..4cb39b2a
--- /dev/null
+++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/models/devices/Device.kt
@@ -0,0 +1,6 @@
+package com.meta.pixelandtexel.scanner.models.devices
+
+data class Device(
+    val name: String,
+    val entityList: List<ThingEntity>
+)
diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/models/devices/ThingEntity.kt b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/models/devices/ThingEntity.kt
new file mode 100644
index 00000000..29221304
--- /dev/null
+++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/models/devices/ThingEntity.kt
@@ -0,0 +1,8 @@
+package com.meta.pixelandtexel.scanner.models.devices
+
+import com.meta.pixelandtexel.scanner.models.devices.domain.Domain
+
+data class ThingEntity(
+    val id: String,
+    val domain: Domain
+)
diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/models/devices/domain/AttributeServices.kt b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/models/devices/domain/AttributeServices.kt
new file mode 100644
index 00000000..8713a65b
--- /dev/null
+++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/models/devices/domain/AttributeServices.kt
@@ -0,0 +1,8 @@
+package com.meta.pixelandtexel.scanner.models.devices.domain
+
+enum class AttributeServices(val serviceName: String) {
+    IS_VOLUME_MUTED("is_volume_muted"),
+    VOLUME_LEVEL("volume_level"),
+    COLOR_TEMP_KELVIN("color_temp_kelvin"),
+    BRIGHTNESS("brightness"),
+}
\ No newline at end of file
diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/models/devices/domain/Domain.kt b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/models/devices/domain/Domain.kt
new file mode 100644
index 00000000..eafdde75
--- /dev/null
+++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/models/devices/domain/Domain.kt
@@ -0,0 +1,6 @@
+package com.meta.pixelandtexel.scanner.models.devices.domain
+
+sealed interface Domain {
+    val value: Any
+    val services: List<DomainServices>
+}
\ No newline at end of file
diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/models/devices/domain/DomainServices.kt b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/models/devices/domain/DomainServices.kt
new file mode 100644
index 00000000..23559168
--- /dev/null
+++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/models/devices/domain/DomainServices.kt
@@ -0,0 +1,9 @@
+package com.meta.pixelandtexel.scanner.models.devices.domain
+
+enum class DomainServices(val serviceName: String) {
+    TURN_ON("turn_on"),
+    TURN_OFF("turn_off"),
+    VOLUME_SET("volume_set"),
+    VOLUME_MUTE("volume_mute"),
+    MEDIA_PLAY("media_play"),
+}
\ No newline at end of file
diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/models/devices/domain/LightDomain.kt b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/models/devices/domain/LightDomain.kt
new file mode 100644
index 00000000..d2f6ec75
--- /dev/null
+++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/models/devices/domain/LightDomain.kt
@@ -0,0 +1,27 @@
+package com.meta.pixelandtexel.scanner.models.devices.domain
+
+
+data class LightAttributes(
+    val minColorTempKelvin: Int? = null,
+    val maxColorTempKelvin: Int? = null,
+    val minMireds: Int? = null,
+    val maxMireds: Int? = null,
+    val effectList: List<String>? = null,
+    val supportedColorModes: List<String>? = null,
+    val effect: String? = null,
+    val colorMode: String? = null,
+    val brightness: Int? = null,
+    val colorTempKelvin: Int? = null,
+    val colorTemp: Int? = null,
+    val hsColor: List<Float>? = null,
+    val rgbColor: List<Int>? = null,
+    val xyColor: List<Float>? = null,
+    val friendlyName: String? = null,
+    val supportedFeatures: Int? = null
+)
+
+data class LightDomain(
+    override val value: Boolean,
+    override val services: List<DomainServices>,
+    val attributes: LightAttributes
+) : Domain
\ No newline at end of file
diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/models/devices/domain/MediaPlayerDomain.kt b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/models/devices/domain/MediaPlayerDomain.kt
new file mode 100644
index 00000000..1640816d
--- /dev/null
+++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/models/devices/domain/MediaPlayerDomain.kt
@@ -0,0 +1,14 @@
+package com.meta.pixelandtexel.scanner.models.devices.domain
+
+
+data class MediaPlayerAttributes(
+    val volumeLevel: Float?,
+    val isMuted: Boolean?,
+    val source: List<String>?,
+)
+
+data class MediaPlayerDomain(
+    override val value: Boolean,
+    override val services: List<DomainServices>,
+    val attributes: MediaPlayerAttributes
+) : Domain
diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/models/devices/domain/SensorDomain.kt b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/models/devices/domain/SensorDomain.kt
new file mode 100644
index 00000000..43c28a3d
--- /dev/null
+++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/models/devices/domain/SensorDomain.kt
@@ -0,0 +1,6 @@
+package com.meta.pixelandtexel.scanner.models.devices.domain
+
+data class SensorDomain(
+    override val value: String,
+    override val services: List<DomainServices>
+) : Domain
diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/models/devices/domain/SwitchDomain.kt b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/models/devices/domain/SwitchDomain.kt
new file mode 100644
index 00000000..11edad50
--- /dev/null
+++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/models/devices/domain/SwitchDomain.kt
@@ -0,0 +1,6 @@
+package com.meta.pixelandtexel.scanner.models.devices.domain
+
+data class SwitchDomain(
+    override val value: Boolean,
+    override val services: List<DomainServices>
+) : Domain
diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/models/devices/domain/WeatherDomain.kt b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/models/devices/domain/WeatherDomain.kt
new file mode 100644
index 00000000..57bd5b5a
--- /dev/null
+++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/models/devices/domain/WeatherDomain.kt
@@ -0,0 +1,52 @@
+package com.meta.pixelandtexel.scanner.models.devices.domain
+
+import com.meta.pixelandtexel.scanner.android.datasource.dto.Attributes
+
+
+data class WeatherDomainAttributes(
+    val temperature: Double? = null,
+    val dewPoint: Double? = null,
+    val temperatureUnit: String? = null,
+    val humidity: Int? = null,
+    val cloudCoverage: Double? = null,
+    val uvIndex: Double? = null,
+    val pressure: Double? = null,
+    val pressureUnit: String? = null,
+    val windBearing: Double? = null,
+    val windSpeed: Double? = null,
+    val windSpeedUnit: String? = null,
+    val visibilityUnit: String? = null,
+    val precipitationUnit: String? = null,
+    val attribution: String? = null,
+    val friendlyName: String? = null,
+    val supportedFeatures: Int? = null,
+) {
+    companion object {
+        fun fromAttributes(attributes: Attributes): WeatherDomainAttributes {
+            return WeatherDomainAttributes(
+                temperature = attributes.temperature,
+                dewPoint = attributes.dewPoint,
+                temperatureUnit = attributes.temperatureUnit,
+                humidity = attributes.humidity,
+                cloudCoverage = attributes.cloudCoverage,
+                uvIndex = attributes.uvIndex,
+                pressure = attributes.pressure,
+                pressureUnit = attributes.pressureUnit,
+                windBearing = attributes.windBearing,
+                windSpeed = attributes.windSpeed,
+                windSpeedUnit = attributes.windSpeedUnit,
+                visibilityUnit = attributes.visibilityUnit,
+                precipitationUnit = attributes.precipitationUnit,
+                attribution = attributes.attribution,
+                friendlyName = attributes.friendlyName,
+                supportedFeatures = attributes.supportedFeatures
+            )
+        }
+    }
+}
+
+data class WeatherDomain(
+    override val value: String,
+    override val services: List<DomainServices> = emptyList(),
+    val attributes: WeatherDomainAttributes
+) : Domain
diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/models/smarthomedata/SmartHomeInfoRequest.kt b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/models/smarthomedata/SmartHomeInfoRequest.kt
new file mode 100644
index 00000000..80bcf294
--- /dev/null
+++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/models/smarthomedata/SmartHomeInfoRequest.kt
@@ -0,0 +1,8 @@
+package com.meta.pixelandtexel.scanner.models.smarthomedata
+
+import com.meta.pixelandtexel.scanner.feature.objectdetection.model.RaycastRequestModel
+
+data class SmartHomeInfoRequest(
+    val type: TypeSmartHomeInfo,
+    val raycastInfo: RaycastRequestModel
+)
\ No newline at end of file
diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/models/smarthomedata/TypeSmartHomeInfo.kt b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/models/smarthomedata/TypeSmartHomeInfo.kt
new file mode 100644
index 00000000..ce0d82b0
--- /dev/null
+++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/models/smarthomedata/TypeSmartHomeInfo.kt
@@ -0,0 +1,44 @@
+package com.meta.pixelandtexel.scanner.models.smarthomedata
+
+enum class TypeSmartHomeInfo {
+    LIGHT,
+    PLUG,
+    UNKNOWN
+}
+
+fun getEnumFromString(type: String): TypeSmartHomeInfo {
+    if (lightStringList.any { it.equals(type, ignoreCase = true) }) {
+        return TypeSmartHomeInfo.LIGHT
+    }
+    if (plugStringList.any { it.equals(type, ignoreCase = true) }) {
+        return TypeSmartHomeInfo.PLUG
+    }
+
+    return TypeSmartHomeInfo.UNKNOWN
+}
+
+
+val lightStringList = listOf(
+    "light",
+    "lamp",
+    "bulb",
+    "ceiling light",
+    "floor lamp",
+    "table lamp",
+    "chandelier",
+    "sconce",
+    "spotlight",
+    "downlight"
+)
+
+val plugStringList = listOf(
+    "plug",
+    "power outlet",
+    "socket",
+    "electrical outlet",
+    "wall socket",
+    "power point",
+    "extension cord",
+    "power strip",
+    "switch",
+)
\ No newline at end of file
diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/services/settings/SettingsKey.kt
b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/services/settings/SettingsKey.kt
new file mode 100644
index 00000000..10ea7154
--- /dev/null
+++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/services/settings/SettingsKey.kt
@@ -0,0 +1,14 @@
+// (c) Meta Platforms, Inc. and affiliates. Confidential and proprietary.
+
+package com.meta.pixelandtexel.scanner.services.settings
+
+/**
+ * The keys used for accessing settings from [SettingsService]. Each enum constant represents a
+ * specific setting and holds the actual string key used for storing and retrieving the setting
+ * value in [android.content.SharedPreferences].
+ *
+ * @property value The string representation of the settings key.
+ */
+enum class SettingsKey(val value: String) {
+  ACCEPTED_NOTICE("accepted_user_notice")
+}
diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/services/settings/SettingsService.kt b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/services/settings/SettingsService.kt
new file mode 100644
index 00000000..f169112a
--- /dev/null
+++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/services/settings/SettingsService.kt
@@ -0,0 +1,113 @@
+// (c) Meta Platforms, Inc. and affiliates. Confidential and proprietary.
+
+package com.meta.pixelandtexel.scanner.services.settings
+
+import android.content.Context
+import android.content.SharedPreferences
+import android.util.Log
+import androidx.core.content.edit
+
+/**
+ * Manages application settings using SharedPreferences with an in-memory caching layer, providing
+ * methods to initialize, retrieve, and store settings of various primitive types. Ensures that
+ * SharedPreferences is initialized only once and uses a cache to improve performance by reducing
+ * direct access to SharedPreferences.
+ */
+object SettingsService {
+  private const val TAG: String = "SettingsService"
+
+  private var initialized = false
+
+  private lateinit var prefs: SharedPreferences
+
+  // cached versions of our key/values
+  private var cache = HashMap<SettingsKey, Any>()
+
+  /**
+   * Initializes the SettingsService with the application context, setting up the SharedPreferences
+   * instance used for storing settings. Ensures that initialization only occurs once. If already
+   * initialized, the method will simply return.
+   *
+   * @param context The application context used to access SharedPreferences.
+   */
+  fun initialize(context: Context) {
+    if (initialized) {
+      return
+    }
+
+    prefs = context.getSharedPreferences("settings_preferences", Context.MODE_PRIVATE)
+
+    initialized = true
+  }
+
+  /**
+   * Retrieves a setting value for the given key, first checking an in-memory cache for the value.
+   * If the value is found in the cache and matches the expected type, it is returned. Otherwise,
+   * the function attempts to retrieve the value from SharedPreferences. If retrieved from
+   * SharedPreferences, the value is then stored in the cache for future access.
+   *
+   * Supported types are [String], [Int], [Boolean], [Long], and [Float].
+   *
+   * @param T The type of the setting value.
+   * @param key The [SettingsKey] identifying the setting.
+   * @param default The default value to return if the key is not found or if there's a type
+   *   mismatch.
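+   *
+   * Example (illustrative): `val accepted = get(SettingsKey.ACCEPTED_NOTICE, false)` returns the
+   * cached Boolean when present, otherwise the value stored in SharedPreferences.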
+   * @return The setting value associated with the key, or the default value if not found or on
+   *   type mismatch.
+   * @throws Exception if the requested type is not supported.
+   */
+  fun <T : Any> get(key: SettingsKey, default: T): T {
+    Log.d(TAG, "Fetching prefs '${key.value}' of type ${default::class.simpleName}")
+
+    val cachedValue = cache[key]
+
+    // value in cache AND correct type
+    if (cachedValue != null && cachedValue::class == default::class) {
+      @Suppress("UNCHECKED_CAST")
+      return cachedValue as T
+    }
+
+    // value not in cache OR type mismatch; try to get from prefs
+    val newValue =
+        when (default) {
+          is String -> prefs.getString(key.value, default)
+          is Int -> prefs.getInt(key.value, default)
+          is Boolean -> prefs.getBoolean(key.value, default)
+          is Long -> prefs.getLong(key.value, default)
+          is Float -> prefs.getFloat(key.value, default)
+          else -> throw Exception("Unsupported value type ${default::class.simpleName}")
+        }
+    cache[key] = newValue!!
+
+    @Suppress("UNCHECKED_CAST")
+    return newValue as T
+  }
+
+  /**
+   * Sets a setting value for the given key, persisting it to SharedPreferences and updating the
+   * in-memory cache.
+   *
+   * Supported types for the value are [String], [Int], [Boolean], [Long], and [Float].
+   *
+   * @param T The type of the setting value.
+   * @param key The [SettingsKey] identifying the setting.
+   * @param value The value to be stored for the setting.
+   * @throws Exception if the type of the value is not supported.
+   */
+  fun <T : Any> set(key: SettingsKey, value: T) {
+    Log.d(TAG, "Setting prefs '${key.value}' of type ${value::class.simpleName} to: $value")
+
+    prefs.edit {
+      when (value) {
+        is String -> putString(key.value, value)
+        is Int -> putInt(key.value, value)
+        is Boolean -> putBoolean(key.value, value)
+        is Long -> putLong(key.value, value)
+        is Float -> putFloat(key.value, value)
+        else -> throw Exception("Unsupported value type ${value::class.simpleName}")
+      }
+    }
+
+    cache[key] = value
+  }
+}
diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/utils/HttpUtils.kt b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/utils/HttpUtils.kt
new file mode 100644
index 00000000..22c41931
--- /dev/null
+++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/utils/HttpUtils.kt
@@ -0,0 +1,30 @@
+// (c) Meta Platforms, Inc. and affiliates. Confidential and proprietary.
+
+package com.meta.pixelandtexel.scanner.utils
+
+import android.util.Log
+import java.net.NetworkInterface
+import java.util.Collections
+
+object HttpUtils {
+  fun getIPAddress(): String?
{
+    try {
+      val interfaces: List<NetworkInterface> =
+          Collections.list(NetworkInterface.getNetworkInterfaces())
+
+      for (networkInterface in interfaces) {
+        val addresses = networkInterface.inetAddresses
+
+        for (address in addresses) {
+          if (!address.isLoopbackAddress && address.isSiteLocalAddress) {
+            Log.i("HttpUtils", "ip address: ${address.hostAddress}")
+            return address.hostAddress
+          }
+        }
+      }
+    } catch (e: Exception) {
+      e.printStackTrace()
+    }
+    return null
+  }
+}
diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/utils/MRUKUtils.kt b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/utils/MRUKUtils.kt
new file mode 100644
index 00000000..feaf4e78
--- /dev/null
+++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/utils/MRUKUtils.kt
@@ -0,0 +1,12 @@
+package com.meta.pixelandtexel.scanner.utils
+
+import com.meta.spatial.core.Entity
+import com.meta.spatial.core.SystemManager
+import com.meta.spatial.toolkit.PlayerBodyAttachmentSystem
+
+fun getRightController(systemManager: SystemManager): Entity? {
+  return systemManager
+      .tryFindSystem<PlayerBodyAttachmentSystem>()
+      ?.tryGetLocalPlayerAvatarBody()
+      ?.rightHand
+}
\ No newline at end of file
diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/utils/MathUtils.kt b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/utils/MathUtils.kt
new file mode 100644
index 00000000..74e347fb
--- /dev/null
+++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/utils/MathUtils.kt
@@ -0,0 +1,62 @@
+// (c) Meta Platforms, Inc. and affiliates. Confidential and proprietary.
+
+package com.meta.pixelandtexel.scanner.utils
+
+import com.meta.spatial.core.Bound3D
+import com.meta.spatial.core.Quaternion
+import com.meta.spatial.core.Vector3
+import kotlin.math.PI
+import kotlin.math.cos
+import kotlin.math.sin
+
+object MathUtils {
+  /**
+   * Constructs a quaternion from an axis-angle representation of a rotation.
+   *
+   * @param axis The 3D [Vector3] axis around which to perform the rotation.
+   * @param angleDegrees The angle in degrees of rotation to perform around the axis.
+   * @return The [Quaternion] representing the rotation.
+   */
+  fun Quaternion.Companion.fromAxisAngle(axis: Vector3, angleDegrees: Float): Quaternion {
+    val angleRadians = angleDegrees * PI / 180f
+    val halfAngle = angleRadians / 2
+    val sinHalfAngle = sin(halfAngle).toFloat()
+
+    return Quaternion(
+            cos(halfAngle).toFloat(),
+            axis.x * sinHalfAngle,
+            axis.y * sinHalfAngle,
+            axis.z * sinHalfAngle,
+        )
+        .normalize()
+  }
+
+  /**
+   * Creates a [Quaternion] representing a rotation in 3D space that is the combination of the
+   * supplied rotations in degrees around the pitch, yaw, and roll axes, in that order.
+   *
+   * @param pitchDeg The angle in degrees to apply to the rotation around the x axis.
+   * @param yawDeg The angle in degrees to apply to the rotation around the y axis.
+   * @param rollDeg The angle in degrees to apply to the rotation around the z axis.
+   * @return The [Quaternion] representing the rotation around the axes in sequential order.
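+   *
+   * For example, `fromSequentialPYR(0f, 90f, 0f)` reduces to a pure 90° yaw rotation about the y
+   * axis, since the pitch and roll factors are identity quaternions.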
+ */ + fun Quaternion.Companion.fromSequentialPYR( + pitchDeg: Float, + yawDeg: Float, + rollDeg: Float, + ): Quaternion { + return Quaternion.fromAxisAngle(Vector3.Right, pitchDeg) + .times(Quaternion.fromAxisAngle(Vector3.Up, yawDeg)) + .times(Quaternion.fromAxisAngle(Vector3.Forward, rollDeg)) + .normalize() + } + + fun Bound3D.isValid(): Boolean { + return this.min.x.isFinite() && + this.min.y.isFinite() && + this.min.z.isFinite() && + this.max.x.isFinite() && + this.max.y.isFinite() && + this.max.z.isFinite() + } +} diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/utils/mytheme/AppTextStyles.kt b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/utils/mytheme/AppTextStyles.kt new file mode 100644 index 00000000..eb6089f2 --- /dev/null +++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/utils/mytheme/AppTextStyles.kt @@ -0,0 +1,30 @@ +package com.meta.pixelandtexel.scanner.utils.mytheme + +import androidx.compose.ui.graphics.Color +import androidx.compose.ui.text.TextStyle +import androidx.compose.ui.text.font.FontWeight +import androidx.compose.ui.text.style.TextDecoration +import androidx.compose.ui.unit.sp + + +object AppTextStyles { + val Title = TextStyle( + fontSize = 28.sp, + fontWeight = FontWeight.Bold + ) + val Subtitle = TextStyle( + fontSize = 18.sp, + fontWeight = FontWeight.Medium + ) + + + val Body = TextStyle( + fontSize = 14.sp, + fontWeight = FontWeight.Normal + ) + val Url = TextStyle( + fontSize = 14.sp, + color = Color.Blue, + textDecoration = TextDecoration.Underline + ) +} \ No newline at end of file diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/utils/mytheme/MyPaddings.kt b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/utils/mytheme/MyPaddings.kt new file mode 100644 index 00000000..f2b14f99 --- /dev/null +++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/java/com/meta/pixelandtexel/scanner/utils/mytheme/MyPaddings.kt @@ -0,0 +1,11 @@ +package com.meta.pixelandtexel.scanner.utils.mytheme + +import androidx.compose.ui.unit.dp + +object MyPaddings { + val XS = 4.dp + val S = 8.dp + val M = 16.dp + val L = 24.dp + val XL = 32.dp +} diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/res/drawable/arrow_over_3x.png b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/res/drawable/arrow_over_3x.png new file mode 100644 index 00000000..a3f9d087 Binary files /dev/null and b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/res/drawable/arrow_over_3x.png differ diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/res/drawable/connected_dev.png b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/res/drawable/connected_dev.png new file mode 100644 index 00000000..f2b920f7 Binary files /dev/null and b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/res/drawable/connected_dev.png differ diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/res/drawable/delete_smart_things_view.png b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/res/drawable/delete_smart_things_view.png new file mode 100644 index 00000000..029b08a7 
Binary files /dev/null and b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/res/drawable/delete_smart_things_view.png differ diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/res/drawable/escaneo.png b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/res/drawable/escaneo.png new file mode 100644 index 00000000..83ef7e67 Binary files /dev/null and b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/res/drawable/escaneo.png differ diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/res/drawable/hand_open.png b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/res/drawable/hand_open.png new file mode 100644 index 00000000..ec5106db Binary files /dev/null and b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/res/drawable/hand_open.png differ diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/res/drawable/hand_swipe.png b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/res/drawable/hand_swipe.png new file mode 100644 index 00000000..eaa27cf5 Binary files /dev/null and b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/res/drawable/hand_swipe.png differ diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/res/drawable/layout_bg.xml b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/res/drawable/layout_bg.xml new file mode 100644 index 00000000..2a5dbde6 --- /dev/null +++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/res/drawable/layout_bg.xml @@ -0,0 +1,12 @@ + + + + + + + diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/res/drawable/object_label_bg.xml b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/res/drawable/object_label_bg.xml new file mode 100644 index 00000000..a98debe3 --- /dev/null +++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/res/drawable/object_label_bg.xml @@ -0,0 +1,11 @@ + + + + + + diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/res/drawable/placeholder.png b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/res/drawable/placeholder.png new file mode 100644 index 00000000..ec8e3b7e Binary files /dev/null and b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/res/drawable/placeholder.png differ diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/res/drawable/rounded_box_outline.png b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/res/drawable/rounded_box_outline.png new file mode 100644 index 00000000..bc64707b Binary files /dev/null and b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/res/drawable/rounded_box_outline.png differ diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/res/drawable/scan_svgrepo_com.xml b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/res/drawable/scan_svgrepo_com.xml new file mode 100644 index 00000000..8bd34b01 --- /dev/null +++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/res/drawable/scan_svgrepo_com.xml @@ -0,0 +1,59 @@ + + + + + + + + + + + + + + + diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/res/drawable/status_ellipse_3x.png 
b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/res/drawable/status_ellipse_3x.png new file mode 100644 index 00000000..730d56e2 Binary files /dev/null and b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/res/drawable/status_ellipse_3x.png differ diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/res/layout/ui_camera_controls_view.xml b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/res/layout/ui_camera_controls_view.xml new file mode 100644 index 00000000..3072a217 --- /dev/null +++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/res/layout/ui_camera_controls_view.xml @@ -0,0 +1,24 @@ + + + + + + + + diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/res/layout/ui_camera_status_view.xml b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/res/layout/ui_camera_status_view.xml new file mode 100644 index 00000000..f9e9192e --- /dev/null +++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/res/layout/ui_camera_status_view.xml @@ -0,0 +1,46 @@ + + + + + + + + + + + + + + + diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/res/layout/ui_camera_view.xml b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/res/layout/ui_camera_view.xml new file mode 100644 index 00000000..89e159f8 --- /dev/null +++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/res/layout/ui_camera_view.xml @@ -0,0 +1,29 @@ + + + + + + + + + diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/res/layout/ui_delete_smart_things_button_view.xml b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/res/layout/ui_delete_smart_things_button_view.xml new file mode 100644 index 00000000..ecdd2271 --- /dev/null +++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/res/layout/ui_delete_smart_things_button_view.xml @@ -0,0 +1,24 @@ + + + + + + + + diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/res/layout/ui_show_smart_things_button_view.xml b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/res/layout/ui_show_smart_things_button_view.xml new file mode 100644 index 00000000..383cc609 --- /dev/null +++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/res/layout/ui_show_smart_things_button_view.xml @@ -0,0 +1,21 @@ + + + + + + + + diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/res/values/colors.xml b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/res/values/colors.xml new file mode 100644 index 00000000..1d240821 --- /dev/null +++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/res/values/colors.xml @@ -0,0 +1,3 @@ + + + diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/res/values/constants.xml b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/res/values/constants.xml new file mode 100644 index 00000000..65a9a01c --- /dev/null +++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/res/values/constants.xml @@ -0,0 +1,16 @@ + + + + 0 + 1 + 2 + 3 + + + 4 + 5 + 6 + 7 + + 10 + diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/res/values/strings.xml 
b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/res/values/strings.xml new file mode 100644 index 00000000..34e63b44 --- /dev/null +++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/res/values/strings.xml @@ -0,0 +1,56 @@ + + + + + Briefly describe the {{object_name}} in this image. Specifically, provide technical details or information about its construction, but do not guess details which are not apparent in the photo. + + + + Paused + Scanning + + + Got It + Continue + Dismiss + Generate Objects + + + Play/pause button + Help button + Refrigerator + Dot + + + + This app has been implemented to be able to scan and identify objects in your environment using Mixed Reality capabilities. No information is sent to any external servers. + + Camera Controls + + Look at your left palm to activate camera scanner controls. + + + Press play to begin the scan, and pause to end scanning. + + + + Select Object + Select an object to be able to configure it for Mixed Reality. + No Objects Detected + Try moving to a different room or location, or select one of the generated objects. + Find Curated Objects + Look for one of the curated objects: a phone, fridge, or television. Or select one of the generated objects. + Help + + You can scan your surroundings to identify objects with the play button. Select a detected object to learn more about it, or choose from generated objects. + + + + mruk_db + Delete Smart Things Button + Show Smart Things Button\n + diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/res/values/styles.xml b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/res/values/styles.xml new file mode 100644 index 00000000..6557bb71 --- /dev/null +++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/res/values/styles.xml @@ -0,0 +1,18 @@ + + + + + + diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/res/xml/network_security_config.xml b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/res/xml/network_security_config.xml new file mode 100644 index 00000000..ddabe07e --- /dev/null +++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/res/xml/network_security_config.xml @@ -0,0 +1,6 @@ + + + + 192.168.1.216 + + \ No newline at end of file diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/res/xml/objects.xml b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/res/xml/objects.xml new file mode 100644 index 00000000..1d240821 --- /dev/null +++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/main/res/xml/objects.xml @@ -0,0 +1,3 @@ + + + diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/shaders/9slice.frag b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/shaders/9slice.frag new file mode 100644 index 00000000..2461a765 --- /dev/null +++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/shaders/9slice.frag @@ -0,0 +1,107 @@ +/** + * Displays a texture on a surface, implementing 9-slice scaling + * (https://en.wikipedia.org/wiki/9-slice_scaling). 
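+ *
+ * In brief: the source texture is divided into a 3x3 grid; the four corner regions draw at their
+ * native size, the edge regions stretch along one axis, and the center stretches in both, which
+ * keeps borders crisp as the surface is resized.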
+ * + * Learn more about using custom shader with Meta's Spatial SDK here: + * https://developers.meta.com/horizon/documentation/spatial-sdk/spatial-sdk-custom-shaders + */ + +#version 400 +#extension GL_ARB_separate_shader_objects: enable +#extension GL_ARB_shading_language_420pack: enable + +#include + +layout (std140, set = 3, binding = 0) uniform MaterialUniform { + // .xy = quad width and height, .zw = texture width and height + vec4 sliceParams; + // slice size in order: left, right, top, bottom + vec4 sliceSize; + // .rgb = tint color, .w = pixels-per-unit multiplier + vec4 tintColor; + vec4 stereoParams; +} g_MaterialUniform; + +layout (set = 3, binding = 1) uniform sampler2D sliceTex; + +layout (location = 0) in struct { + vec4 color; + vec2 albedoCoord; + vec2 roughnessMetallicCoord; + vec2 emissiveCoord; + vec2 occlusionCoord; + vec2 normalCoord; + vec3 lighting; + vec3 worldNormal; + vec3 worldPosition; +} vertexOut; + +layout (location = 0) out vec4 outColor; + +void main() { + // default panel resolution (PanelConfigOptions.EYEBUFFER_WIDTH * 0.5f) + float pixelsPerUnit = 1032.0 * g_MaterialUniform.tintColor.a; + + vec2 uTextureSize = g_MaterialUniform.sliceParams.zw; + vec2 uOutputSize = g_MaterialUniform.sliceParams.xy * pixelsPerUnit; + vec4 uSliceSize = g_MaterialUniform.sliceSize.xyzw; + + // convert slice sizes to UVs + vec2 texelSize = 1.0 / uTextureSize; + float left = uSliceSize.x * texelSize.x; + float right = uSliceSize.y * texelSize.x; + float top = uSliceSize.z * texelSize.y; + float bottom = uSliceSize.w * texelSize.y; + + // thresholds for dividing into 3x3 regions in UV space + float leftLimit = uSliceSize.x; + float rightLimit = uOutputSize.x - uSliceSize.y; + float topLimit = uSliceSize.z; + float bottomLimit = uOutputSize.y - uSliceSize.w; + + // normalized position in output space + vec2 uv = vertexOut.albedoCoord; + vec2 finalUV; + + // pixel position + float x = uv.x * uOutputSize.x; + float y = uv.y * uOutputSize.y; + + // horizontal slicing + + if (x < leftLimit) { + finalUV.x = x / leftLimit * left; + } + else if (x > rightLimit) { + finalUV.x = 1.0 - (uOutputSize.x - x) / (uOutputSize.x - rightLimit) * right; + } + else { + float stretchWidth = uOutputSize.x - uSliceSize.x - uSliceSize.y; + float innerX = (x - uSliceSize.x) / stretchWidth; + float texWidth = 1.0 - left - right; + finalUV.x = left + innerX * texWidth; + } + + // vertical slicing + + if (y < topLimit) { + finalUV.y = y / topLimit * top; + } + else if (y > bottomLimit) { + finalUV.y = 1.0 - (uOutputSize.y - y) / (uOutputSize.y - bottomLimit) * bottom; + } + else { + float stretchHeight = uOutputSize.y - uSliceSize.z - uSliceSize.w; + float innerY = (y - uSliceSize.z) / stretchHeight; + float texHeight = 1.0 - top - bottom; + finalUV.y = top + innerY * texHeight; + } + + // our final sampling and tinting + + vec4 pixel = texture(sliceTex, finalUV); + vec3 linearTint = srgb_to_linear(g_MaterialUniform.tintColor.rgb); // convert to linear color + vec3 color = pixel.rgb * linearTint; + + outColor = vec4(color, pixel.a); +} diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/shaders/9slice.vert b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/shaders/9slice.vert new file mode 100644 index 00000000..3d5e414b --- /dev/null +++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/shaders/9slice.vert @@ -0,0 +1,47 @@ +/** + * Learn more about using custom shader with Meta's Spatial SDK here: + * 
https://developers.meta.com/horizon/documentation/spatial-sdk/spatial-sdk-custom-shaders + */ + +#version 430 +#extension GL_ARB_separate_shader_objects: enable +#extension GL_ARB_shading_language_420pack: enable + +#include +#include + +layout (location = 0) out struct { + vec4 color; + vec2 albedoCoord; + vec2 roughnessMetallicCoord; + vec2 emissiveCoord; + vec2 occlusionCoord; + vec2 normalCoord; + vec3 lighting; + vec3 worldNormal; + vec3 worldPosition; +} vertexOut; + +layout (std140, set = 3, binding = 0) uniform MaterialUniform { + vec4 sliceParams; + vec4 sliceSize; + vec4 tintColor; + vec4 stereoParams; +} g_MaterialUniform; + +vec2 stereo(vec2 uv) { + return getStereoPassId() * g_MaterialUniform.stereoParams.xy + uv * g_MaterialUniform.stereoParams.zw; +} + +void main() { + App2VertexUnpacked app = getApp2VertexUnpacked(); + + vec4 wPos4 = g_PrimitiveUniform.worldFromObject * vec4(app.position, 1.0f); + vertexOut.albedoCoord = stereo(app.uv); + vertexOut.worldPosition = wPos4.xyz; + vertexOut.worldNormal = normalize((transpose(g_PrimitiveUniform.objectFromWorld) * vec4(app.normal, 0.0f)).xyz); + + gl_Position = getClipFromWorld() * wPos4; + + postprocessPosition(gl_Position); +} diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/shaders/customPanel.frag b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/shaders/customPanel.frag new file mode 100644 index 00000000..1ec27a64 --- /dev/null +++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/shaders/customPanel.frag @@ -0,0 +1,22 @@ +/** + * Learn more about using custom shaders with Meta's Spatial SDK here: + * https://developers.meta.com/horizon/documentation/spatial-sdk/spatial-sdk-custom-shaders + */ + +#version 430 +#extension GL_ARB_separate_shader_objects: enable +#extension GL_ARB_shading_language_420pack: enable + +layout (location = 0) out vec4 outColor; +layout (location = 0) in vec2 otc; + +layout (binding = 0) uniform texture2D tex; +layout (binding = 1) uniform sampler samp; + +void main() { + vec4 pixel = texture(sampler2D(tex, samp), otc); + + // Add any additional processing to the panel texture here + + outColor = pixel; +} diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/test/java/com/meta/pixelandtexel/scanner/ExampleUnitTest.kt b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/test/java/com/meta/pixelandtexel/scanner/ExampleUnitTest.kt new file mode 100644 index 00000000..768f4313 --- /dev/null +++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/app/src/test/java/com/meta/pixelandtexel/scanner/ExampleUnitTest.kt @@ -0,0 +1,18 @@ +// (c) Meta Platforms, Inc. and affiliates. Confidential and proprietary. + +package com.meta.pixelandtexel.scanner + +import org.junit.Assert.* +import org.junit.Test + +/** + * Example local unit test, which will execute on the development machine (host). + * + * See [testing documentation](http://d.android.com/tools/testing). + */ +class ExampleUnitTest { + @Test + fun addition_isCorrect() { + assertEquals(4, 2 + 2) + } +} diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/build.gradle.kts b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/build.gradle.kts new file mode 100644 index 00000000..2ec645db --- /dev/null +++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/build.gradle.kts @@ -0,0 +1,9 @@ +// Top-level build file where you can add configuration options common to all sub-projects/modules.
+plugins { + alias(libs.plugins.android.application) apply false + alias(libs.plugins.jetbrains.kotlin.android) apply false + + // Keep the serialization plugin in sync with the Kotlin version (2.1.20 in libs.versions.toml); + // a mismatched serialization compiler plugin fails the build. + id("org.jetbrains.kotlin.plugin.serialization") version "2.1.20" apply false + id("com.google.devtools.ksp") version "2.1.20-1.0.31" apply false + +} diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/documentation/MANUAL.md b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/documentation/MANUAL.md new file mode 100644 index 00000000..33dfe667 --- /dev/null +++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/documentation/MANUAL.md @@ -0,0 +1,80 @@ +# User manual + +The design decisions taken throughout this project, applied with standard software engineering practice, have resulted in a functional application that anyone with the required hardware and services can use. + +# Tech configuration + +## Home Assistant + +This project combines Mixed Reality, home automation, and Artificial Intelligence, so the following resources must be in place before using the application. + +- Server: run a Home Assistant server on a compatible device or in a Docker container. +- Network: find the server's IP address on your local network and add it to `local.properties`. +- Token: once Home Assistant is configured and its IP is known, open its web application. In the `Profile > Security` section, create a _Long-Lived Access Token_ for API access. This token must also be added to `local.properties`. + +![img.png](images/home_assistant_token.png) + +Finally, update the IP in the file `app/src/main/res/xml/network_security_config.xml`. The `local.properties` file must look like this: + +``` +HTTP_API=http://192.168.X.X:8123/api +HTTP_TOKEN=YOUR_LONG_LIVED_ACCESS_TOKEN +``` + +(Note that `secrets.properties.example` names the token key `HOME_ASSISTANT_TOKEN`; use whichever key `app/build.gradle.kts` actually reads.) + +## Multimodal System + +To place virtual objects correctly, the application needs a model of the environment. When starting the application, you must either scan the environment with the device or have a previous scan ready. + +The scan is performed from the Meta Quest 3 settings, in the space configuration section. + +Meta presents the final scan for validation before continuing. +![img.png](images/conf_espacio_1.png) +![img.png](images/conf_espacio.png) + +# Application Usage + +After configuration, start the application on the Meta Quest 3. It will request permission to access the network (for communication with Home Assistant) and the environment. Once these are accepted, the application starts. + +The application is controlled with the hands: the user interacts with panels and activates buttons directly. + +When the left palm faces the headset, three buttons with different functions appear over it; they are activated with the opposite hand. +![img.png](images/botones_mano.png) +![img.png](images/accion_mano_izda.png) + +## Scan button + +Represented by a scanner icon, this button rescans the environment to recognize interactable objects. When pressed, transparent panels appear over recognizable objects. +![img.png](images/interaccion%20con%20luz.png) + +An information panel appears next to the identified object so that the corresponding smart device can be assigned to it. +![img.png](images/seleccionParaparear.png) + +The recognized object remains associated with the selected smart device and is saved in memory, enabling interaction through Mixed Reality; the sketch below shows what such an interaction looks like at the HTTP level.
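Pairing ultimately results in HTTP requests against the Home Assistant REST API, authenticated with the long-lived token configured earlier. The calls below are a minimal illustration of that interaction, not code from this project; the entity id `light.living_room` is a placeholder for one of your own entities:

```
# Read the current state of an entity
curl -H "Authorization: Bearer YOUR_LONG_LIVED_ACCESS_TOKEN" \
     http://192.168.X.X:8123/api/states/light.living_room

# Toggle the same entity through a service call
curl -X POST \
     -H "Authorization: Bearer YOUR_LONG_LIVED_ACCESS_TOKEN" \
     -H "Content-Type: application/json" \
     -d '{"entity_id": "light.living_room"}' \
     http://192.168.X.X:8123/api/services/light/toggle
```

If the request succeeds, Home Assistant returns the entity state as JSON, which is the same information the application renders in its spatial panels.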
+ +To unlink or hide the panels, use the buttons integrated into each panel: + +![img.png](images/botones_panel.png) + +## Load Smart Things button + +Retrieves all panels already paired with Smart Things and shows them in the environment, so they can be interacted with without performing a new scan. + +## Clear Smart Things button + +Removes the panels associated with Smart Things from view, while maintaining the internal association between the object and the smart device. + diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/documentation/images/accion_mano_izda.png b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/documentation/images/accion_mano_izda.png new file mode 100644 index 00000000..d9498c7d Binary files /dev/null and b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/documentation/images/accion_mano_izda.png differ diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/documentation/images/botones_mano.png b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/documentation/images/botones_mano.png new file mode 100644 index 00000000..d9f52ace Binary files /dev/null and b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/documentation/images/botones_mano.png differ diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/documentation/images/botones_panel.png b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/documentation/images/botones_panel.png new file mode 100644 index 00000000..0bda4cf1 Binary files /dev/null and b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/documentation/images/botones_panel.png differ diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/documentation/images/clean_arch_features.png b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/documentation/images/clean_arch_features.png new file mode 100644 index 00000000..8553bcbf Binary files /dev/null and b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/documentation/images/clean_arch_features.png differ diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/documentation/images/conf_espacio.png b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/documentation/images/conf_espacio.png new file mode 100644 index 00000000..64a1037d Binary files /dev/null and b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/documentation/images/conf_espacio.png differ diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/documentation/images/conf_espacio_1.png b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/documentation/images/conf_espacio_1.png new file mode 100644 index 00000000..6db61b06 Binary files /dev/null and b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/documentation/images/conf_espacio_1.png differ diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/documentation/images/home_assistant_token.png b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/documentation/images/home_assistant_token.png new file mode 100644 index 00000000..3b0211f8 Binary files /dev/null and b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/documentation/images/home_assistant_token.png differ diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/documentation/images/img.png b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/documentation/images/img.png new file mode 100644 index 00000000..b4b4002f Binary files /dev/null and
b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/documentation/images/img.png differ diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/documentation/images/interaccion con luz.png b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/documentation/images/interaccion con luz.png new file mode 100644 index 00000000..13db74a0 Binary files /dev/null and b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/documentation/images/interaccion con luz.png differ diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/documentation/images/samsung_state_media_player.png b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/documentation/images/samsung_state_media_player.png new file mode 100644 index 00000000..da321753 Binary files /dev/null and b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/documentation/images/samsung_state_media_player.png differ diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/documentation/images/seleccionParaparear.png b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/documentation/images/seleccionParaparear.png new file mode 100644 index 00000000..20af82aa Binary files /dev/null and b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/documentation/images/seleccionParaparear.png differ diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/gradle.properties b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/gradle.properties new file mode 100644 index 00000000..92aab9b0 --- /dev/null +++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/gradle.properties @@ -0,0 +1,28 @@ +# Project-wide Gradle settings. +# IDE (e.g. Android Studio) users: +# Gradle settings configured through the IDE *will override* +# any settings specified in this file. +# For more details on how to configure your build environment visit +# http://www.gradle.org/docs/current/userguide/build_environment.html +# Specifies the JVM arguments used for the daemon process. +# The setting is particularly useful for tweaking memory settings. +org.gradle.jvmargs=-Xmx2048m -Dfile.encoding=UTF-8 +# When configured, Gradle will run in incubating parallel mode. +# This option should only be used with decoupled projects. More details, visit +# http://www.gradle.org/docs/current/userguide/multi_project_builds.html#sec:decoupled_projects +# org.gradle.parallel=true +# AndroidX package structure to make it clearer which packages are bundled with the +# Android operating system, and which are packaged with your app's APK +# https://developer.android.com/topic/libraries/support-library/androidx-rn +android.useAndroidX=true +# Kotlin code style for this project: "official" or "obsolete": +kotlin.code.style=official +# Enables namespacing of each library's R class so that its R class includes only the +# resources declared in the library itself and none from the library's dependencies, +# thereby reducing the size of the R class for that library +android.nonTransitiveRClass=true + +# Set this flag to 'false' to disable @AddTrace annotation processing and +# automatic monitoring of HTTP/S network requests +# for all build variants at compile time. 
+firebasePerformanceInstrumentationEnabled=false diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/gradle/libs.versions.toml b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/gradle/libs.versions.toml new file mode 100644 index 00000000..422224a7 --- /dev/null +++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/gradle/libs.versions.toml @@ -0,0 +1,100 @@ +[versions] +koinAndroid = "3.5.6" +retrofit = "2.9.0" +retrofit2KotlinxSerializationConverter = "1.0.0" +roomRuntime = "2.8.4" +spatialsdk = "0.8.0" +agp = "8.6.1" +kotlin = "2.1.20" +coreKtx = "1.16.0" +junit = "4.13.2" +junitVersion = "1.2.1" +espressoCore = "3.6.1" +composeActivity = "1.10.1" +navigationCompose = "2.8.2" +lifecycleRuntimeKtx = "2.8.6" +composeBom = "2025.05.00" +appcompat = "1.7.0" +composeMarkdown = "0.5.7" +constraintlayout = "2.2.1" +mediapipe = "0.20230731" +mlkit = "17.0.2" +opencv = "4.10.0" +aws = "1.3.3" +gson = "2.11.0" +ktorServer = "3.1.2" +hiltAndroid = "2.51.1" +hiltNavigationCompose = "1.2.0" + + +[libraries] +# Meta Spatial SDK Libraries +androidx-room-compiler = { module = "androidx.room:room-compiler", version.ref = "roomRuntime" } +androidx-room-runtime = { module = "androidx.room:room-runtime", version.ref = "roomRuntime" } +koin-android = { module = "io.insert-koin:koin-android", version.ref = "koinAndroid" } +meta-spatial-sdk-base = { group = "com.meta.spatial", name = "meta-spatial-sdk", version.ref = "spatialsdk" } +meta-spatial-sdk-animation = { group = "com.meta.spatial", name = "meta-spatial-sdk-animation", version.ref = "spatialsdk" } +meta-spatial-sdk-castinputforward = { group = "com.meta.spatial", name = "meta-spatial-sdk-castinputforward", version.ref = "spatialsdk" } +meta-spatial-sdk-compose = { group = "com.meta.spatial", name = "meta-spatial-sdk-compose", version.ref = "spatialsdk" } +meta-spatial-sdk-datamodelinspector = { group = "com.meta.spatial", name = "meta-spatial-sdk-datamodelinspector", version.ref = "spatialsdk" } +meta-spatial-sdk-hotreload = { group = "com.meta.spatial", name = "meta-spatial-sdk-hotreload", version.ref = "spatialsdk" } +meta-spatial-sdk-isdk = { group = "com.meta.spatial", name = "meta-spatial-sdk-isdk", version.ref = "spatialsdk" } +meta-spatial-sdk-mruk = { group = "com.meta.spatial", name = "meta-spatial-sdk-mruk", version.ref = "spatialsdk" } +meta-spatial-sdk-ovrmetrics = { group = "com.meta.spatial", name = "meta-spatial-sdk-ovrmetrics", version.ref = "spatialsdk" } +meta-spatial-sdk-physics = { group = "com.meta.spatial", name = "meta-spatial-sdk-physics", version.ref = "spatialsdk" } +meta-spatial-sdk-spatialaudio = { group = "com.meta.spatial", name = "meta-spatial-sdk-spatialaudio", version.ref = "spatialsdk" } +meta-spatial-sdk-toolkit = { group = "com.meta.spatial", name = "meta-spatial-sdk-toolkit", version.ref = "spatialsdk" } +meta-spatial-sdk-uiset = { group = "com.meta.spatial", name = "meta-spatial-sdk-uiset", version.ref = "spatialsdk" } +meta-spatial-sdk-vr = { group = "com.meta.spatial", name = "meta-spatial-sdk-vr", version.ref = "spatialsdk" } + +# Core Android Libraries +androidx-core-ktx = { group = "androidx.core", name = "core-ktx", version.ref = "coreKtx" } +junit = { group = "junit", name = "junit", version.ref = "junit" } +androidx-junit = { group = "androidx.test.ext", name = "junit", version.ref = "junitVersion" } +androidx-espresso-core = { group = "androidx.test.espresso", name = "espresso-core", version.ref = "espressoCore" } + +# Compose Libraries 
+androidx-lifecycle-runtime-ktx = { group = "androidx.lifecycle", name = "lifecycle-runtime-ktx", version.ref = "lifecycleRuntimeKtx" } +androidx-lifecycle-viewmodel-compose = { group = "androidx.lifecycle", name = "lifecycle-viewmodel-compose", version.ref = "lifecycleRuntimeKtx" } +androidx-activity-compose = { group = "androidx.activity", name = "activity-compose", version.ref = "composeActivity" } +androidx-navigation-compose = { group = "androidx.navigation", name = "navigation-compose", version.ref = "navigationCompose" } +androidx-compose-bom = { group = "androidx.compose", name = "compose-bom", version.ref = "composeBom" } +androidx-ui = { group = "androidx.compose.ui", name = "ui" } +androidx-ui-graphics = { group = "androidx.compose.ui", name = "ui-graphics" } +androidx-ui-tooling = { group = "androidx.compose.ui", name = "ui-tooling" } +androidx-ui-tooling-preview = { group = "androidx.compose.ui", name = "ui-tooling-preview" } +androidx-ui-test-manifest = { group = "androidx.compose.ui", name = "ui-test-manifest" } +androidx-ui-test-junit4 = { group = "androidx.compose.ui", name = "ui-test-junit4" } +androidx-appcompat = { group = "androidx.appcompat", name = "appcompat", version.ref = "appcompat" } +androidx-material3 = { group = "androidx.compose.material3", name = "material3" } +compose-markdown = { group = "com.github.jeziellago", name = "compose-markdown", version.ref = "composeMarkdown" } +androidx-hilt-android = { group = "com.google.dagger", name = "hilt-android", version.ref = "hiltAndroid" } +androidx-hilt-android-compiler = { group = "com.google.dagger", name = "hilt-android-compiler", version.ref = "hiltAndroid" } +androidx-hilt-navigation-compose = { module = "androidx.hilt:hilt-navigation-compose", version.ref = "hiltNavigationCompose" } + +# Android UI +androidx-constraintlayout = { group = "androidx.constraintlayout", name = "constraintlayout", version.ref = "constraintlayout" } + +# AWS Llama invocation +aws-bedrockruntime = { group = "aws.sdk.kotlin", name = "bedrockruntime", version.ref = "aws" } +google-gson = { group = "com.google.code.gson", name = "gson", version.ref = "gson" } + +# Ktor Netty embedded server +ktor-server-core = { group = "io.ktor", name = "ktor-server-core", version.ref = "ktorServer" } +ktor-server-netty = { group = "io.ktor", name = "ktor-server-netty", version.ref = "ktorServer" } + +# CV +google-mediapipe-tasks-vision = { group = "com.google.mediapipe", name = "tasks-vision", version.ref = "mediapipe" } +google-mlkit-object1-detection = { group = "com.google.mlkit", name = "object-detection", version.ref = "mlkit" } +google-mlkit-object1-detection-custom = { group = "com.google.mlkit", name = "object-detection-custom", version.ref = "mlkit" } +opencv = { group = "org.opencv", name = "opencv", version.ref = "opencv" } +retrofit = { module = "com.squareup.retrofit2:retrofit", version.ref = "retrofit" } +retrofit2-kotlinx-serialization-converter = { module = "com.jakewharton.retrofit:retrofit2-kotlinx-serialization-converter", version.ref = "retrofit2KotlinxSerializationConverter" } + + +[plugins] +android-application = { id = "com.android.application", version.ref = "agp" } +android-library = { id = "com.android.library", version.ref = "agp" } +jetbrains-kotlin-android = { id = "org.jetbrains.kotlin.android", version.ref = "kotlin" } +jetbrains-kotlin-plugin-compose = { id = "org.jetbrains.kotlin.plugin.compose", version.ref = "kotlin"} +meta-spatial-plugin = { id = "com.meta.spatial.plugin", version.ref = "spatialsdk" } diff --git 
a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/gradle/wrapper/gradle-wrapper.jar b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/gradle/wrapper/gradle-wrapper.jar new file mode 100644 index 00000000..e6441136 Binary files /dev/null and b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/gradle/wrapper/gradle-wrapper.jar differ diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/gradle/wrapper/gradle-wrapper.properties b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/gradle/wrapper/gradle-wrapper.properties new file mode 100644 index 00000000..b82aa23a --- /dev/null +++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/gradle/wrapper/gradle-wrapper.properties @@ -0,0 +1,7 @@ +distributionBase=GRADLE_USER_HOME +distributionPath=wrapper/dists +distributionUrl=https\://services.gradle.org/distributions/gradle-8.7-bin.zip +networkTimeout=10000 +validateDistributionUrl=true +zipStoreBase=GRADLE_USER_HOME +zipStorePath=wrapper/dists diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/gradlew b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/gradlew new file mode 100755 index 00000000..1aa94a42 --- /dev/null +++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/gradlew @@ -0,0 +1,249 @@ +#!/bin/sh + +# +# Copyright © 2015-2021 the original authors. +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# https://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. +# + +############################################################################## +# +# Gradle start up script for POSIX generated by Gradle. +# +# Important for running: +# +# (1) You need a POSIX-compliant shell to run this script. If your /bin/sh is +# noncompliant, but you have some other compliant shell such as ksh or +# bash, then to run this script, type that shell name before the whole +# command line, like: +# +# ksh Gradle +# +# Busybox and similar reduced shells will NOT work, because this script +# requires all of these POSIX shell features: +# * functions; +# * expansions «$var», «${var}», «${var:-default}», «${var+SET}», +# «${var#prefix}», «${var%suffix}», and «$( cmd )»; +# * compound commands having a testable exit status, especially «case»; +# * various built-in commands including «command», «set», and «ulimit». +# +# Important for patching: +# +# (2) This script targets any POSIX shell, so it avoids extensions provided +# by Bash, Ksh, etc; in particular arrays are avoided. +# +# The "traditional" practice of packing multiple parameters into a +# space-separated string is a well documented source of bugs and security +# problems, so this is (mostly) avoided, by progressively accumulating +# options in "$@", and eventually passing that to Java. +# +# Where the inherited environment variables (DEFAULT_JVM_OPTS, JAVA_OPTS, +# and GRADLE_OPTS) rely on word-splitting, this is performed explicitly; +# see the in-line comments for details. +# +# There are tweaks for specific operating systems such as AIX, CygWin, +# Darwin, MinGW, and NonStop. 
+# +# (3) This script is generated from the Groovy template +# https://github.com/gradle/gradle/blob/HEAD/subprojects/plugins/src/main/resources/org/gradle/api/internal/plugins/unixStartScript.txt +# within the Gradle project. +# +# You can find Gradle at https://github.com/gradle/gradle/. +# +############################################################################## + +# Attempt to set APP_HOME + +# Resolve links: $0 may be a link +app_path=$0 + +# Need this for daisy-chained symlinks. +while + APP_HOME=${app_path%"${app_path##*/}"} # leaves a trailing /; empty if no leading path + [ -h "$app_path" ] +do + ls=$( ls -ld "$app_path" ) + link=${ls#*' -> '} + case $link in #( + /*) app_path=$link ;; #( + *) app_path=$APP_HOME$link ;; + esac +done + +# This is normally unused +# shellcheck disable=SC2034 +APP_BASE_NAME=${0##*/} +# Discard cd standard output in case $CDPATH is set (https://github.com/gradle/gradle/issues/25036) +APP_HOME=$( cd "${APP_HOME:-./}" > /dev/null && pwd -P ) || exit + +# Use the maximum available, or set MAX_FD != -1 to use that value. +MAX_FD=maximum + +warn () { + echo "$*" +} >&2 + +die () { + echo + echo "$*" + echo + exit 1 +} >&2 + +# OS specific support (must be 'true' or 'false'). +cygwin=false +msys=false +darwin=false +nonstop=false +case "$( uname )" in #( + CYGWIN* ) cygwin=true ;; #( + Darwin* ) darwin=true ;; #( + MSYS* | MINGW* ) msys=true ;; #( + NONSTOP* ) nonstop=true ;; +esac + +CLASSPATH=$APP_HOME/gradle/wrapper/gradle-wrapper.jar + + +# Determine the Java command to use to start the JVM. +if [ -n "$JAVA_HOME" ] ; then + if [ -x "$JAVA_HOME/jre/sh/java" ] ; then + # IBM's JDK on AIX uses strange locations for the executables + JAVACMD=$JAVA_HOME/jre/sh/java + else + JAVACMD=$JAVA_HOME/bin/java + fi + if [ ! -x "$JAVACMD" ] ; then + die "ERROR: JAVA_HOME is set to an invalid directory: $JAVA_HOME + +Please set the JAVA_HOME variable in your environment to match the +location of your Java installation." + fi +else + JAVACMD=java + if ! command -v java >/dev/null 2>&1 + then + die "ERROR: JAVA_HOME is not set and no 'java' command could be found in your PATH. + +Please set the JAVA_HOME variable in your environment to match the +location of your Java installation." + fi +fi + +# Increase the maximum file descriptors if we can. +if ! "$cygwin" && ! "$darwin" && ! "$nonstop" ; then + case $MAX_FD in #( + max*) + # In POSIX sh, ulimit -H is undefined. That's why the result is checked to see if it worked. + # shellcheck disable=SC2039,SC3045 + MAX_FD=$( ulimit -H -n ) || + warn "Could not query maximum file descriptor limit" + esac + case $MAX_FD in #( + '' | soft) :;; #( + *) + # In POSIX sh, ulimit -n is undefined. That's why the result is checked to see if it worked. + # shellcheck disable=SC2039,SC3045 + ulimit -n "$MAX_FD" || + warn "Could not set maximum file descriptor limit to $MAX_FD" + esac +fi + +# Collect all arguments for the java command, stacking in reverse order: +# * args from the command line +# * the main class name +# * -classpath +# * -D...appname settings +# * --module-path (only if needed) +# * DEFAULT_JVM_OPTS, JAVA_OPTS, and GRADLE_OPTS environment variables. 
+ +# For Cygwin or MSYS, switch paths to Windows format before running java +if "$cygwin" || "$msys" ; then + APP_HOME=$( cygpath --path --mixed "$APP_HOME" ) + CLASSPATH=$( cygpath --path --mixed "$CLASSPATH" ) + + JAVACMD=$( cygpath --unix "$JAVACMD" ) + + # Now convert the arguments - kludge to limit ourselves to /bin/sh + for arg do + if + case $arg in #( + -*) false ;; # don't mess with options #( + /?*) t=${arg#/} t=/${t%%/*} # looks like a POSIX filepath + [ -e "$t" ] ;; #( + *) false ;; + esac + then + arg=$( cygpath --path --ignore --mixed "$arg" ) + fi + # Roll the args list around exactly as many times as the number of + # args, so each arg winds up back in the position where it started, but + # possibly modified. + # + # NB: a `for` loop captures its iteration list before it begins, so + # changing the positional parameters here affects neither the number of + # iterations, nor the values presented in `arg`. + shift # remove old arg + set -- "$@" "$arg" # push replacement arg + done +fi + + +# Add default JVM options here. You can also use JAVA_OPTS and GRADLE_OPTS to pass JVM options to this script. +DEFAULT_JVM_OPTS='"-Xmx64m" "-Xms64m"' + +# Collect all arguments for the java command: +# * DEFAULT_JVM_OPTS, JAVA_OPTS, JAVA_OPTS, and optsEnvironmentVar are not allowed to contain shell fragments, +# and any embedded shellness will be escaped. +# * For example: A user cannot expect ${Hostname} to be expanded, as it is an environment variable and will be +# treated as '${Hostname}' itself on the command line. + +set -- \ + "-Dorg.gradle.appname=$APP_BASE_NAME" \ + -classpath "$CLASSPATH" \ + org.gradle.wrapper.GradleWrapperMain \ + "$@" + +# Stop when "xargs" is not available. +if ! command -v xargs >/dev/null 2>&1 +then + die "xargs is not available" +fi + +# Use "xargs" to parse quoted args. +# +# With -n1 it outputs one arg per line, with the quotes and backslashes removed. +# +# In Bash we could simply go: +# +# readarray ARGS < <( xargs -n1 <<<"$var" ) && +# set -- "${ARGS[@]}" "$@" +# +# but POSIX shell has neither arrays nor command substitution, so instead we +# post-process each arg (as a line of input to sed) to backslash-escape any +# character that might be a shell metacharacter, then use eval to reverse +# that process (while maintaining the separation between arguments), and wrap +# the whole thing up as a single "set" statement. +# +# This will of course break if any of these variables contains a newline or +# an unmatched quote. +# + +eval "set -- $( + printf '%s\n' "$DEFAULT_JVM_OPTS $JAVA_OPTS $GRADLE_OPTS" | + xargs -n1 | + sed ' s~[^-[:alnum:]+,./:=@_]~\\&~g; ' | + tr '\n' ' ' + )" '"$@"' + +exec "$JAVACMD" "$@" diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/gradlew.bat b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/gradlew.bat new file mode 100644 index 00000000..de157914 --- /dev/null +++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/gradlew.bat @@ -0,0 +1,94 @@ +@REM (c) Meta Platforms, Inc. and affiliates. Confidential and proprietary. + +@rem +@rem Copyright 2015 the original author or authors. +@rem +@rem Licensed under the Apache License, Version 2.0 (the "License"); +@rem you may not use this file except in compliance with the License. 
+@rem You may obtain a copy of the License at +@rem +@rem https://www.apache.org/licenses/LICENSE-2.0 +@rem +@rem Unless required by applicable law or agreed to in writing, software +@rem distributed under the License is distributed on an "AS IS" BASIS, +@rem WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +@rem See the License for the specific language governing permissions and +@rem limitations under the License. +@rem + +@if "%DEBUG%"=="" @echo off +@rem ########################################################################## +@rem +@rem Gradle startup script for Windows +@rem +@rem ########################################################################## + +@rem Set local scope for the variables with windows NT shell +if "%OS%"=="Windows_NT" setlocal + +set DIRNAME=%~dp0 +if "%DIRNAME%"=="" set DIRNAME=. +@rem This is normally unused +set APP_BASE_NAME=%~n0 +set APP_HOME=%DIRNAME% + +@rem Resolve any "." and ".." in APP_HOME to make it shorter. +for %%i in ("%APP_HOME%") do set APP_HOME=%%~fi + +@rem Add default JVM options here. You can also use JAVA_OPTS and GRADLE_OPTS to pass JVM options to this script. +set DEFAULT_JVM_OPTS="-Xmx64m" "-Xms64m" + +@rem Find java.exe +if defined JAVA_HOME goto findJavaFromJavaHome + +set JAVA_EXE=java.exe +%JAVA_EXE% -version >NUL 2>&1 +if %ERRORLEVEL% equ 0 goto execute + +echo. 1>&2 +echo ERROR: JAVA_HOME is not set and no 'java' command could be found in your PATH. 1>&2 +echo. 1>&2 +echo Please set the JAVA_HOME variable in your environment to match the 1>&2 +echo location of your Java installation. 1>&2 + +goto fail + +:findJavaFromJavaHome +set JAVA_HOME=%JAVA_HOME:"=% +set JAVA_EXE=%JAVA_HOME%/bin/java.exe + +if exist "%JAVA_EXE%" goto execute + +echo. 1>&2 +echo ERROR: JAVA_HOME is set to an invalid directory: %JAVA_HOME% 1>&2 +echo. 1>&2 +echo Please set the JAVA_HOME variable in your environment to match the 1>&2 +echo location of your Java installation. 1>&2 + +goto fail + +:execute +@rem Setup the command line + +set CLASSPATH=%APP_HOME%\gradle\wrapper\gradle-wrapper.jar + + +@rem Execute Gradle +"%JAVA_EXE%" %DEFAULT_JVM_OPTS% %JAVA_OPTS% %GRADLE_OPTS% "-Dorg.gradle.appname=%APP_BASE_NAME%" -classpath "%CLASSPATH%" org.gradle.wrapper.GradleWrapperMain %* + +:end +@rem End local scope for the variables with windows NT shell +if %ERRORLEVEL% equ 0 goto mainEnd + +:fail +rem Set variable GRADLE_EXIT_CONSOLE if you need the _script_ return code instead of +rem the _cmd.exe /c_ return code! +set EXIT_CODE=%ERRORLEVEL% +if %EXIT_CODE% equ 0 set EXIT_CODE=1 +if not ""=="%GRADLE_EXIT_CONSOLE%" exit %EXIT_CODE% +exit /b %EXIT_CODE% + +:mainEnd +if "%OS%"=="Windows_NT" endlocal + +:omega diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/secrets.properties.example b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/secrets.properties.example new file mode 100644 index 00000000..3394d6e8 --- /dev/null +++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/secrets.properties.example @@ -0,0 +1,2 @@ +HTTP_API=http://...:xxxx/api/ +HOME_ASSISTANT_TOKEN= \ No newline at end of file diff --git a/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/settings.gradle.kts b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/settings.gradle.kts new file mode 100644 index 00000000..5fb8a35b --- /dev/null +++ b/Showcases/meta_spatial_home_assistant/meta_spatial_scanner/settings.gradle.kts @@ -0,0 +1,22 @@ +// (c) Meta Platforms, Inc. and affiliates. Confidential and proprietary. 
+ +pluginManagement { + repositories { + google() + gradlePluginPortal() + mavenCentral() + } +} + +dependencyResolutionManagement { + repositoriesMode.set(RepositoriesMode.FAIL_ON_PROJECT_REPOS) + repositories { + google() + mavenCentral() + maven { url = uri("https://jitpack.io") } + } +} + +rootProject.name = "Scanner App" + +include(":app")
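A closing usage note: modules consume the version catalog in `gradle/libs.versions.toml` through Gradle's generated type-safe accessors (dashes in entry names become dots). The fragment below is an illustrative sketch, not the project's actual `app/build.gradle.kts`:

```
// Illustrative module build script consuming the catalog above
plugins {
    alias(libs.plugins.android.application)
    alias(libs.plugins.jetbrains.kotlin.android)
}

dependencies {
    implementation(libs.meta.spatial.sdk.base)  // meta-spatial-sdk-base
    implementation(libs.meta.spatial.sdk.mruk)  // room mapping (MRUK)
    implementation(libs.koin.android)           // dependency injection
    implementation(libs.androidx.room.runtime)  // local pose persistence
}
```

With the wrapper scripts included in this change, a typical build-and-install cycle is `./gradlew :app:assembleDebug` followed by installing the resulting APK on the headset with `adb install`.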