Subelement A: RADAR Principles – 10 Key Topics – 10 Exam Questions – 8 Drawings – Topic 2: Distance and Time
Question 8-2A2
Element 8 (RADAR): One RADAR mile is how many microseconds?
Explanation
A "RADAR mile" refers to the time it takes for a radio signal to travel one *nautical* mile and return to the source. Since the signal travels out and back, the total distance covered for a "RADAR mile" is two nautical miles.
Radio waves travel at the speed of light, which is approximately 161,874 nautical miles per second.
To find the time for a RADAR mile:
1. **Total distance:** 2 nautical miles (one out, one back).
2. **Speed:** 161,874 nautical miles/second.
3. **Time = Distance / Speed**
Time = 2 nautical miles / 161,874 nautical miles/second
Time ≈ 0.00001236 seconds
Converting this to microseconds (1 second = 1,000,000 microseconds):
0.00001236 seconds × 1,000,000 µs/second ≈ 12.36 µs.
Therefore, approximately 12.36 microseconds is the correct value for a RADAR mile. Option A, 6.2 µs, represents the approximate one-way travel time for one nautical mile, not the round trip. The other options are incorrect values for this calculation.
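As a quick arithmetic check, here is a minimal Python sketch of the same reasoning. It assumes the rounded speed-of-light figure of 161,874 nautical miles per second used above, and the function name is just for illustration; the same calculation generalizes to targets at other ranges (as in questions 8-2A4 and 8-2A5).

```python
# Round-trip echo time for a RADAR target, using the rounded
# speed-of-light figure from the explanation above.
SPEED_OF_LIGHT_NM_PER_S = 161_874  # nautical miles per second (approx.)

def echo_time_us(range_nm: float) -> float:
    """Round-trip travel time, in microseconds, for a target at
    range_nm nautical miles (the signal goes out and comes back)."""
    round_trip_nm = 2 * range_nm
    seconds = round_trip_nm / SPEED_OF_LIGHT_NM_PER_S
    return seconds * 1_000_000  # seconds -> microseconds

print(f"1 RADAR mile ≈ {echo_time_us(1):.2f} µs")    # ≈ 12.36 µs
print(f"5 nm target  ≈ {echo_time_us(5):.1f} µs")    # ≈ 61.8 µs
print(f"10 nm target ≈ {echo_time_us(10):.1f} µs")   # ≈ 123.6 µs
```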
Related Questions
8-29D6 Select the statement which is most correct regarding a raster scan display.
8-2A1 A radio wave will travel a distance of three nautical miles in:
8-2A3 RADAR range is measured by the constant:
8-2A4 If a target is 5 miles away, how long does it take for the RADAR echo to be received back at the antenna?
8-2A5 How long would it take for a RADAR pulse to travel to a target 10 nautical miles away and return to the RADAR receiver?