Satellite Tracking

For my capstone this semester my team and I are creating an app to help amateur radio enthusiasts track satellites so they can make contact. None of us have had any prior experience with satellite tracking so this has been quite the learning experience.

The premise of the app is to use augmented reality to display the positions of the satellites as 3D models overlaid on the device's camera view.

My portion of the project was figuring out the AR side of things and how to render the models in the app. At first I thought it would be as simple as getting the satellite's real-world location and plugging it in as the model's location in AR. Boy, was I wrong.

Typical AR apps use plane detection and taps to place objects in the scene. The AR framework then keeps track of each object based on its internal plane detection. Unfortunately, that won't work in our situation. Luckily, the framework I chose keeps track of a world origin set at the time the scene loads. That is useful because I can place objects relative to that world origin. With that problem solved, I had to move on to converting real-world coordinates into coordinates the device could understand.

In my search for how to do this I came across a coordinate system called Earth-centered, Earth-fixed (ECEF). ECEF coordinates are x, y, z Cartesian coordinates measured from the center of the Earth. The equations and constants used to convert latitude, longitude, and altitude to ECEF coordinates are well established and vetted.
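The standard conversion uses the WGS84 ellipsoid constants. Here's a minimal Python sketch of it (the function name is my own; the post doesn't show the team's actual code):

```python
import math

# WGS84 ellipsoid constants
A = 6378137.0              # semi-major axis, meters
F = 1 / 298.257223563      # flattening
E2 = F * (2 - F)           # first eccentricity squared

def geodetic_to_ecef(lat_deg, lon_deg, alt_m):
    """Convert geodetic latitude/longitude (degrees) and altitude
    (meters) to ECEF x, y, z coordinates (meters)."""
    lat = math.radians(lat_deg)
    lon = math.radians(lon_deg)
    # Prime vertical radius of curvature at this latitude
    n = A / math.sqrt(1 - E2 * math.sin(lat) ** 2)
    x = (n + alt_m) * math.cos(lat) * math.cos(lon)
    y = (n + alt_m) * math.cos(lat) * math.sin(lon)
    z = (n * (1 - E2) + alt_m) * math.sin(lat)
    return x, y, z
```

As a sanity check, a point at latitude 0, longitude 0, altitude 0 should land on the equator at one semi-major axis (about 6,378,137 m) from Earth's center along the x axis.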

Once you have the device's and the satellite's positions converted to ECEF, the next step is to get the satellite's position in a frame of reference relative to the device. Luckily there is also a coordinate system that is useful for this. East, North, Up (ENU) is another Cartesian coordinate system, typically used in radar tracking of airplanes. From a reference point you can get the location of an object with East being positive x, North being positive y, and Up being positive z. With the device as the reference point, I now had a way to plot the models in the AR view. The only catch is that I cannot ensure the device will always be pointed north when the scene is launched and the origin placed.
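The ECEF-to-ENU step is a standard rotation using the reference point's latitude and longitude. A minimal sketch, assuming you already have both positions in ECEF (the function name is illustrative):

```python
import math

def ecef_to_enu(x, y, z, x0, y0, z0, lat0_deg, lon0_deg):
    """Express the ECEF point (x, y, z) in an East-North-Up frame
    centered on the reference ECEF point (x0, y0, z0), whose geodetic
    latitude and longitude are (lat0_deg, lon0_deg)."""
    lat0 = math.radians(lat0_deg)
    lon0 = math.radians(lon0_deg)
    dx, dy, dz = x - x0, y - y0, z - z0
    east = -math.sin(lon0) * dx + math.cos(lon0) * dy
    north = (-math.sin(lat0) * math.cos(lon0) * dx
             - math.sin(lat0) * math.sin(lon0) * dy
             + math.cos(lat0) * dz)
    up = (math.cos(lat0) * math.cos(lon0) * dx
          + math.cos(lat0) * math.sin(lon0) * dy
          + math.sin(lat0) * dz)
    return east, north, up
```

As a check: a point 1,000 m directly above the reference point should come out as roughly (0, 0, 1000) in ENU.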

Solving this latest problem requires a rotation of the coordinate system about the up axis. Using the device's bearing at the time the scene is loaded, I can rotate the coordinate system by the difference between the device bearing and north to line the ENU coordinates up with the device. With all of the coordinate transformations complete, we can move on to rendering models in our scene. That will be a topic for next week.
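The rotation itself is a plain 2D rotation of the horizontal components. A sketch of the idea; the sign convention is an assumption, since which way "positive heading" goes depends on the AR framework:

```python
import math

def rotate_about_up(east, north, bearing_deg):
    """Rotate the horizontal (east, north) components of an ENU vector
    about the up axis by bearing_deg degrees. Whether the device's
    compass bearing should be applied with a plus or minus sign depends
    on the framework's heading convention -- flip it if objects appear
    mirrored about north."""
    a = math.radians(bearing_deg)
    e = east * math.cos(a) - north * math.sin(a)
    n = east * math.sin(a) + north * math.cos(a)
    return e, n
```

With a 90-degree rotation, a vector pointing due east ends up pointing due north, which is an easy way to verify the direction of rotation matches what your framework expects.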
