GMDSS Trainer
Subelement A: RADAR Principles – 10 Key Topics – 10 Exam Questions – 8 Drawings – Topic 2: Distance and Time

Question 8-2A2

Element 8 (RADAR)

One RADAR mile is how many microseconds?

Explanation
A "RADAR mile" is the time it takes a radio signal to travel one *nautical* mile to a target and return to the source. Since the signal travels out and back, the total distance covered is two nautical miles. Radio waves travel at the speed of light, which is approximately 161,874 nautical miles per second. To find the time for a RADAR mile:

1. **Total distance:** 2 nautical miles (one out, one back).
2. **Speed:** 161,874 nautical miles per second.
3. **Time = Distance ÷ Speed:** 2 nautical miles ÷ 161,874 nautical miles/second ≈ 0.00001235 seconds.

Converting this to microseconds (1 second = 1,000,000 microseconds): 0.00001235 seconds × 1,000,000 µs/second ≈ 12.35 µs. Therefore, approximately 12.35 microseconds is the correct value for one RADAR mile. Option A, 6.2 µs, represents the approximate one-way travel time for one nautical mile, not the round trip. The other options are incorrect values for this calculation.
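The round-trip arithmetic above can be sketched in a few lines of Python. The constants used here (speed of light in m/s, meters per nautical mile) are standard physical values rather than figures from the question pool:

```python
# Round-trip time for one nautical mile: the "RADAR mile".
SPEED_OF_LIGHT_M_S = 299_792_458   # speed of light in vacuum, m/s
METERS_PER_NMI = 1_852             # meters in one nautical mile

def radar_mile_us() -> float:
    """Time for a radio pulse to travel 1 nmi out and 1 nmi back, in µs."""
    round_trip_m = 2 * METERS_PER_NMI            # out and back: 2 nmi total
    seconds = round_trip_m / SPEED_OF_LIGHT_M_S  # time = distance / speed
    return seconds * 1_000_000                   # convert seconds to µs

print(f"{radar_mile_us():.3f} µs")   # ≈ 12.355 µs
print(f"{radar_mile_us() / 2:.2f} µs one-way")  # ≈ 6.18 µs (cf. option A)
```

Halving the result reproduces the one-way time of roughly 6.2 µs mentioned for option A, which is a handy sanity check.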

Ready to test your knowledge?