FCC Exam Question: 8-2A2
One RADAR mile is how many microseconds?
Explanation: A "RADAR mile" refers to the time it takes for a radio signal to travel one *nautical* mile and return to the source. Since the signal travels out and back, the total distance covered for a "RADAR mile" is two nautical miles. Radio waves travel at the speed of light, which is approximately 161,874 nautical miles per second. To find the time for a RADAR mile: 1. **Total distance:** 2 nautical miles (one out, one back). 2. **Speed:** 161,874 nautical miles/second. 3. **Time = Distance / Speed** Time = 2 nautical miles / 161,874 nautical miles/second Time ≈ 0.00001235 seconds Converting this to microseconds (1 second = 1,000,000 microseconds): 0.00001235 seconds * 1,000,000 µs/second ≈ 12.35 µs. Therefore, 12.34 microseconds is the correct value for a RADAR mile. Option A, 6.2 µs, represents the approximate one-way travel time for one nautical mile, not the round trip. The other options are incorrect values for this calculation.