FCC Exam Question: 8-2A5

How long would it take for a RADAR pulse to travel to a target 10 nautical miles away and return to the RADAR receiver?

A. 12.34 microseconds.
B. 1.234 microseconds.
C. 123.4 microseconds.
D. 10 microseconds.
Correct Answer: C

Explanation: Radar ranging fundamentally relies on the constant propagation speed of radio waves, which travel at the speed of light (approximately 162,000 nautical miles per second, or $3 \times 10^8$ meters/second, in air). To determine a target's distance, a radar system measures the time it takes for a pulse to travel from the transmitter, reflect off the target, and return to the receiver. This means the pulse travels twice the distance to the target.

1. **Total distance:** The pulse travels 10 nautical miles *to* the target and 10 nautical miles *back* to the receiver, for a total of 20 nautical miles.
2. **Radar timing constant:** Radio waves cover 1 nautical mile in approximately 6.17 microseconds one way ($1852 \text{ m} \div 3 \times 10^8 \text{ m/s}$), so a round trip takes about 12.34 microseconds per nautical mile.
3. **Calculation:** For a target 10 nautical miles away: $10 \text{ NM} \times 12.34 \text{ microseconds/NM} = 123.4 \text{ microseconds}$, matching option C.

Option A represents the round-trip time for just one nautical mile, while B and D are of entirely wrong magnitudes.
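As a quick sanity check, here is a minimal Python sketch of the same arithmetic, assuming the approximate free-space speed of light ($3 \times 10^8$ m/s) and the exact definition 1 NM = 1852 m:

```python
# Round-trip time for a radar pulse to a target 10 NM away.
# Assumptions: c = 3.0e8 m/s (approximate speed of light), 1 NM = 1852 m.
C = 3.0e8          # speed of light, meters per second
NM_TO_M = 1852.0   # meters per nautical mile

range_nm = 10
round_trip_m = 2 * range_nm * NM_TO_M   # pulse travels out AND back
time_us = round_trip_m / C * 1e6        # seconds -> microseconds

print(f"{time_us:.1f} microseconds")    # -> 123.5 microseconds, closest to option C
```

The small difference from the exam's 123.4 microseconds comes from rounding the per-mile constant; either way, only option C is in the right range.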

Pass Your FCC Exam!

Study offline, track your progress, and simulate real exams with the GMDSS Trainer app.


Includes Elements 1, 3, 6, 7R, 8, and 9.