I've done some searching and research on this and come back confused. I want to measure the latency of a video broadcast system at various distances (for FPV drones). On the bench it's easy to do with one Teensy: an LED and a light sensor are connected to the board, the LED is placed in front of the camera lens, and the sensor is taped to the OLED display. Pushing a button tells the Teensy to turn on the LED and capture the time; the Teensy then waits for the sensor to see the light on the display and captures that time as well. The difference between the two is the "on-bench latency" (after the system is initially calibrated). This measurement is used to evaluate the efficiency and speed of the video transmitter and receiver.
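For reference, the bench setup amounts to something like the sketch below. The pin numbers, the analog threshold, and the use of micros() for timestamping are placeholders of mine, not fixed parts of the setup:

    // Single-Teensy bench latency measurement (sketch; pin assignments
    // and the light-detection threshold are placeholders to tune).
    const int BUTTON_PIN = 2;     // pushbutton to ground, internal pullup
    const int LED_PIN    = 3;     // LED placed in front of the camera lens
    const int SENSOR_PIN = A0;    // light sensor taped to the OLED display
    const int THRESHOLD  = 512;   // tune for the sensor and display brightness

    void setup() {
      pinMode(BUTTON_PIN, INPUT_PULLUP);
      pinMode(LED_PIN, OUTPUT);
      Serial.begin(115200);
    }

    void loop() {
      if (digitalRead(BUTTON_PIN) == LOW) {         // button pressed
        digitalWrite(LED_PIN, HIGH);
        unsigned long tOn = micros();               // time the LED turned on
        while (analogRead(SENSOR_PIN) < THRESHOLD)  // wait for light on the OLED
          ;                                         // busy-wait; fine on the bench
        unsigned long tSeen = micros();
        Serial.print("Latency (ms): ");
        Serial.println((tSeen - tOn) / 1000.0);
        digitalWrite(LED_PIN, LOW);
        delay(500);                                 // simple debounce / settle time
      }
    }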
What I want to do next is measure the latency at fixed distances, like 100 meters, 200 meters, etc. The solution I've thought of is to use two Teensys: the first is connected to the light sensor, while the second is connected to the LED, and the two can be separated by any given distance. For this to work I need to initially synchronize the "clock" on the two Teensys. The anticipated readings are between 30 ms and 100 ms, so I need to be accurate to within perhaps 3 milliseconds (more or less). Is there an easy way to do this, and do you think it would be accurate enough?
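The kind of synchronization I have in mind: before separating the boards, join them with a sync wire (plus common ground) and zero a software clock on both at the same pulse edge. A rough sketch of the shared piece that would run on each board, where SYNC_PIN and the wiring are just my placeholders:

    // Shared timing code for both Teensys (sketch; assumes SYNC_PIN on
    // each board is joined by a wire, plus common ground, before the
    // boards are separated).
    const int SYNC_PIN = 4;
    unsigned long epochMicros = 0;   // micros() value captured at the sync edge

    void waitForSync() {
      pinMode(SYNC_PIN, INPUT);
      while (digitalRead(SYNC_PIN) == LOW)  // one board (or a button) drives
        ;                                   // this line HIGH to start the epoch
      epochMicros = micros();               // both boards see the same edge
    }

    unsigned long syncedMillis() {
      // ms since the shared sync pulse; note micros() rolls over after
      // roughly 71 minutes, so keep test runs shorter than that
      return (micros() - epochMicros) / 1000;
    }

With both clocks zeroed to the same edge, the LED board would log syncedMillis() when it lights the LED and the sensor board would log it on detection; the latency is the difference between the two logged values, and the accuracy question then comes down to how far the two crystals drift apart after the sync pulse.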