
Add Orientation Sensor support #2440

Open
MrFr33man123 opened this issue Apr 8, 2022 · 13 comments · May be fixed by #4634
Labels
enhancement (New feature or request), sensor-tracking

Comments

@MrFr33man123

Describe the solution you'd like
An extra sensor that is made available.
It should output an orientation like face down, face up, horizontal, and so on, like a die.

Describe alternatives you've considered, if any
Alternatively, I think one could expose the raw degree angles for each axis (X, Y, Z), but that would be a pain to use.

Additional context
Example: You want to automate something that is only triggered if you wake up at night. If the state was face up, the phone was charging, and the time was between 10pm and 5am (you were probably asleep), then turn on some light in nightlight mode.

Or in the car: if connected to a specific Bluetooth device, in an upright orientation (phone holder), and leaving the home zone,
=> send a notification to a partner's phone with the text "I'm leaving".

MrFr33man123 added the enhancement (New feature or request) label on Apr 8, 2022
@dshokouhi
Member

Looking more into this request: I think we can get by with reporting the rotation angle of the device, which can translate to portrait/landscape based on the 4 data points (0, 90, 180, 270). The other part of this request should be doable already: the proximity sensor should be able to tell the user whether something is blocking the sensor, so on a flat surface that can determine whether the device is lying face down. That might be a bit of a stretch; if it is not good enough here, then yes, we will need to create a sensor with raw values from the accelerometer so users can translate them for their own use case.
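A minimal sketch of that mapping, in plain Kotlin so it runs outside an Android runtime (the actual implementation in the app may differ): Android's `Display.rotation` reports one of four constants, `Surface.ROTATION_0` through `Surface.ROTATION_270`, which are the integers 0 to 3.

```kotlin
// Sketch only: the Surface.ROTATION_* constants are hardcoded as 0..3
// here so this compiles without the Android SDK.
fun rotationToDegrees(rotation: Int): Int = when (rotation) {
    0 -> 0      // Surface.ROTATION_0
    1 -> 90     // Surface.ROTATION_90
    2 -> 180    // Surface.ROTATION_180
    3 -> 270    // Surface.ROTATION_270
    else -> -1  // unknown rotation value
}

// Whether a given angle means portrait or landscape depends on the
// device's natural orientation: phones are usually portrait at 0,
// while many tablets are landscape at 0.
fun isRotatedFromNatural(degrees: Int): Boolean = degrees == 90 || degrees == 270
```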

@tschaedi

Any update on this?
I'm interested in this new sensor, too. It would allow me to make the Tasker app obsolete, which I currently use for updating the device orientation. My use case is similar to the one above: I bring my phone(s) and the lighting in the house to "sleep mode" if it is after 10pm and the phone is charging and face down.

@Hish15

Hish15 commented Sep 10, 2024

I would use this to give more power to NFC tags. Here is an example:
To control a motorized blind, scanning the NFC tag with the phone pointing upward makes it go up, and pointing downward makes it go down.

@dshokouhi
Member

Hey guys, I have submitted a PR adding 3 sensors containing the requested data. It would be great to get some end-user testing on this to make sure it works as one would expect. In particular, if anyone has a foldable to test on, that would be great.

You can grab the debug APK from the artifact in https://github.com/home-assistant/android/actions/runs/10854606133

Extract the file and install the full or minimal debug APK. It installs side by side with the production app and has an easily identifiable red icon. I recommend using a unique device name so it is easy to delete after the test.

The 3 sensors:

  • Orientation of the device screen (should update immediately when the screen is on and the orientation changes)
  • Rotation angle of the device, relative to the device's natural orientation. This tells you whether the device is horizontal; it will vary between phones and tablets, since a phone is 0 in portrait while a tablet can be 0 in landscape.
  • Is face down - this one requires more testing, as it relies on the device's accelerometer rather than a native API
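For context, a face-down check from the accelerometer alone can be sketched like this. This is a simplified illustration, not the PR's actual code: the z axis of `SensorEvent.values` points out of the screen, so gravity reads roughly +9.81 m/s² when the device lies face up and roughly -9.81 when it lies face down.

```kotlin
import kotlin.math.abs

const val GRAVITY_MS2 = 9.81f

// Simplified sketch: z is close to -g when the phone lies face down on
// a surface. The tolerance is an illustrative value, not from the PR.
fun isFaceDown(z: Float, toleranceMs2: Float = 2.0f): Boolean =
    abs(z + GRAVITY_MS2) < toleranceMs2
```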

Please let me know how it works for you! You may need to grant the app background access if you don't see updates continuing in the background.

@tschaedi

Hi,

Thanks a lot. Unfortunately I can't test it, since I don't use HA at the moment.

@MrFr33man123
Author

I tried it with a Huawei Mate 20 Pro.
The landscape/portrait sensor worked.
The degree sensor was limited by the OS: only 0, 90, and 270 were shown, no 180.
Only those numbers, no exact degrees.
Face down, or just on its side, wasn't working at all.

@dshokouhi
Member

dshokouhi commented Sep 18, 2024

The degree sensor was limited by the OS: only 0, 90, and 270 were shown, no 180.

OK, that's to be expected.

Only those numbers, no exact degrees.

That is also to be expected; that's all the API reports.

Face down, or just on its side, wasn't working at all.

Not sure what you mean by "just on its side", as that's not a real state. The face-down sensor only updates once every 15 minutes OR when any other sensor is updated. I assume you did not wait long enough for an update?

Sensors that rely solely on the accelerometer will not update immediately, as that would be a big battery drain. Pair those sensors with another sensor, like the interactive sensor, and they should update more quickly, as one would expect.
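The throttling described above can be sketched as a simple gate. This is a hypothetical helper for illustration, not the app's actual scheduler: at most one update per interval, unless another sensor's update forces an early refresh.

```kotlin
// Hypothetical update gate illustrating the 15-minute minimum interval
// with an override when another sensor triggers an update.
class UpdateGate(private val intervalMs: Long) {
    private var lastUpdateMs = -intervalMs  // so the first call always passes

    fun shouldUpdate(nowMs: Long, forcedByOtherSensor: Boolean = false): Boolean {
        if (forcedByOtherSensor || nowMs - lastUpdateMs >= intervalMs) {
            lastUpdateMs = nowMs
            return true
        }
        return false
    }
}
```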

@MrFr33man123
Author

Oh yeah, I didn't wait 15 minutes. I will try again today.

@dshokouhi
Member

To speed up your testing, enable the "interactive" sensor; then you can turn the screen on/off and it will force an update faster :)

@MrFr33man123
Author

[Screenshot: sensor states in io.homeassistant.companion.android.debug]
It works as expected. Sadly, my use case is not matched, as I need some check that the phone is lying on a flat surface face up.

@dshokouhi
Member

As I need some check that the phone is lying on a flat surface face up.

I believe we may be able to do that, so a sensor that has up, down, and none states? 🤔 We may need to give this some consideration and see what makes sense. Thanks for testing it again, and for the feedback!
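One way such an up/down/none state could be derived from the accelerometer (a hedged sketch, not a committed design): the device is roughly flat when gravity sits almost entirely on the z axis, and the sign of z distinguishes face up from face down. The threshold values here are illustrative assumptions.

```kotlin
import kotlin.math.abs

// Hypothetical three-state classifier: "up" / "down" when the device is
// roughly flat, "none" otherwise. 9.81 is standard gravity in m/s^2;
// the tolerance is an arbitrary illustrative value.
fun faceState(x: Float, y: Float, z: Float, tolerance: Float = 2.0f): String {
    val flat = abs(x) < tolerance && abs(y) < tolerance  // gravity mostly on z
    return when {
        flat && abs(z - 9.81f) < tolerance -> "up"
        flat && abs(z + 9.81f) < tolerance -> "down"
        else -> "none"
    }
}
```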

@MrFr33man123
Author

Well, thank you for developing this! That's awesome. And yeah, one consideration may be the activity sensor from Android: still, versus not still (like in hand).

@dshokouhi
Member

And yeah, one consideration may be the activity sensor from Android: still, versus not still (like in hand).

"In hand" is not a real state that sensor provides. https://developers.google.com/android/reference/com/google/android/gms/location/DetectedActivity#constant-summary

You can probably infer "device in hand" using something like the interactive sensor paired with the light sensor, and maybe proximity.
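The combination suggested above could look something like this. Purely an illustrative heuristic: the parameter names and the 5-lux threshold are assumptions, not app APIs or recommended values.

```kotlin
// Hypothetical "in hand" heuristic combining three existing sensor
// states: screen interactive, proximity not covered, and some ambient
// light reaching the light sensor.
fun likelyInHand(interactive: Boolean, proximityNear: Boolean, lux: Float): Boolean =
    interactive && !proximityNear && lux > 5f
```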
