
API to get directions from navigation app running on Android Auto #451

Open
BodoMinea opened this issue Mar 31, 2021 · 7 comments
@BodoMinea
Contributor

Hello everyone,

It's hard to find any documentation regarding the head-unit side of things in Android Auto, but I am sure what I want is possible, at least through the "official" channel. I saw on a factory system in a Ford that while Waze was running on Android Auto on the main screen, a small secondary screen in the dashboard showed the next turn instruction (e.g. turn left in .... and an arrow indicating it).

Do you know of any API or other kind of interface through which I could get this kind of info out on the Raspberry Pi side? I am only interested in harvesting this information, so that I can write my own software that does something with it (display it on a secondary screen for my instrument panel, project it as a HUD at the base of the windshield, etc.).

Just a random guess, but maybe it has something to do with NDS - https://nds-association.org/tomtom-go-navigation-app-2/ (I see no mention of Google there, but it seems like the right research direction).

Any help or link to relevant docs is very much appreciated.

@Milek7

Milek7 commented May 10, 2021

For docs, see here https://milek7.pl/.stuff/galdocs/
There's a head-unit integration guide found in the abyss of Google cache, which might provide a high-level overview of AA features. It is a bit outdated (version 1.3, while the current version seems to be 1.6). It seems that at least GenericNotificationService (13), WifiProjectionService (14) and multi-step NavigationManeuvers with lane assist were added since then. For navigation, look on page 82. In the protobuf protocol it is described by various Navigation* messages. Look in protos.proto, which is extracted from the desktop-head-unit emulator from the Android SDK. That binary is not stripped, which is quite useful for reverse engineering.
This doesn't seem to be implemented in aasdk though.
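
As a rough sketch of what the consuming side could look like in Python (assuming you compiled protos.proto with protoc --python_out=. protos.proto; the NavigationStatus name below is just a guess, check the generated protos_pb2.py for the actual Navigation* message names):

    import protos_pb2

    def decode_nav_payload(payload: bytes):
        # Parse one raw payload into the generated class; raises DecodeError
        # if the bytes are not actually this message type.
        msg = protos_pb2.NavigationStatus()
        msg.ParseFromString(payload)
        return msg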

Besides, you can just experiment with the Google emulator; it supports the instrumentcluster option and displays all that navigation info in a separate window: https://developer.android.com/training/cars/testing#dhu-commands
(As for running the emulator, the docs tell you to use adb, but that's not actually required. You can start the debug head-unit server on the phone and just connect the emulator to it directly, e.g. socat TCP-LISTEN:5277 TCP:phoneip:5277.

Or you can use the accessory USB interface as it is intended for production usage, using a proxy like this: https://gist.github.com/Milek7/7f70d29287be6eaef5440c516fb2f123. It will create a unix domain socket, and you can use it with desktop-head-unit via socat TCP-LISTEN:5277 UNIX:unixsock)

@matt2005
Contributor

Sounds great, I'll take a look when I get my laptop repaired.

@matt2005 matt2005 self-assigned this May 10, 2021
@matt2005 matt2005 added the enhancement New feature or request label May 10, 2021
@BodoMinea
Contributor Author

Wow, that's so cool, thanks @Milek7!! It's exactly how I expected this to work, but for some reason I didn't know the right combination of words to search for.

I found some more info here: https://source.android.com/devices/automotive/displays/cluster_api

Have you tried fiddling around with these commands on a Raspberry Pi that is already running the head-unit emulator for Crankshaft?

I have experience working with sockets / protocol buffers, but I'm a bit at a loss on how to start with this one. Maybe I could run Crankshaft's emulator with some parameters to grab the second screen output for debugging? But I have no clue where the relevant files are in the filesystem.

I am attempting to run this alongside the main Crankshaft setup on my Raspberry Pi 4 (currently feeding the main screen through the RCA jack to my car's stock screen) and attach a small HDMI display that I'll mount to my cluster.

@BodoMinea
Contributor Author

I was able to start tinkering with this for a bit. Not on the Crankshaft-NG rpi but on my laptop.

I grabbed the head-unit emulator exe from here: https://dl.google.com/android/repository/desktop-head-unit-windows_r02.0.rc1.zip
Modified one of the sample config files to have instrumentcluster = true
Executed socat TCP-LISTEN:5277 TCP:phoneip:5277 while the headunit server was enabled on the phone.
Executed desktop-head-unit.exe -c path/to/config.ini

Android Auto started up, I opened up Google Maps, navigated to an address aaand... it is kinda working!

[screenshot: instrument cluster debug window]
So now I have the debug window showing information that conforms to the protos.proto format. Does anyone have any idea how I could extract this information in another way...? Have the protocol buffers dumped to a file or a unix/TCP socket rather than shown in this secondary window? This would be the next step in interfacing this with my HUD/second display (for which I need to render something based on this data).
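
To illustrate what I mean, something along these lines (untested sketch) would be a drop-in for my socat command that also appends both directions of the raw stream to files. It only captures the raw transport bytes, so decoding them into the protos.proto messages would still be a separate step, and the phone address below is just a placeholder:

    import socket
    import threading

    LISTEN = ("0.0.0.0", 5277)
    PHONE = ("192.168.1.50", 5277)  # placeholder: phone running the debug head-unit server

    def pump(src, dst, dump_path):
        # Forward bytes one way and append a copy to a dump file.
        with open(dump_path, "ab") as dump:
            while True:
                data = src.recv(4096)
                if not data:
                    break
                dump.write(data)
                dst.sendall(data)

    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind(LISTEN)
    srv.listen(1)
    dhu, _ = srv.accept()                 # desktop-head-unit.exe connects here
    phone = socket.create_connection(PHONE)
    threading.Thread(target=pump, args=(dhu, phone, "dhu_to_phone.bin"), daemon=True).start()
    pump(phone, dhu, "phone_to_dhu.bin")  # blocks until the session ends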

@Milek7

Milek7 commented Oct 24, 2021

I added some info to my website: https://milek7.pl/.stuff/galdocs/readme.md
The Wireshark dissector might be useful, but you would need to add the navigation message mapping to it.

@BodoMinea
Contributor Author

Thank you so much for this valuable resource! So, if my understanding is correct, I could use Wireshark to study which packets carry the data I'm interested in, and then, for the implementation, use the approach described above to proxy the AAP USB communication and intercept the info I need for my secondary display.
Does that sound right?
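
And just to sketch the display end of it, this is roughly what I'd do once I have decoded data (the fields here are made-up stand-ins for whatever the real Navigation* messages carry, purely for illustration):

    from dataclasses import dataclass

    @dataclass
    class NextTurn:                # hypothetical stand-in for a decoded maneuver
        maneuver: str              # e.g. "TURN_LEFT"
        distance_meters: int
        road: str

    def hud_line(turn: NextTurn) -> str:
        # Turn one maneuver into a single line of text for the HUD/second display.
        arrows = {"TURN_LEFT": "<-", "TURN_RIGHT": "->", "STRAIGHT": "^"}
        return f"{arrows.get(turn.maneuver, '?')} {turn.distance_meters} m onto {turn.road}"

    print(hud_line(NextTurn("TURN_LEFT", 350, "Main St")))   # "<- 350 m onto Main St"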

@OneB1t

OneB1t commented Oct 13, 2023

Hello guys, I want to be able to get this type of data on a MIB2 VW head unit to show it on the virtual cockpit. Is there some easy way to do it on a "production system"? Here is my code: https://github.com/OneB1t/VcMOSTRenderMqb

There is GAL running on top of MIB and I have Python available on the unit.
