FCC Exam Question: 8-2A6

What is the distance in nautical miles to a target if it takes 308.5 microseconds for the RADAR pulse to travel from the RADAR antenna to the target and back?

A. 12.5 nautical miles.
B. 25 nautical miles.
C. 50 nautical miles.
D. 2.5 nautical miles.
Correct Answer: B

Explanation: RADAR determines distance by measuring the time a radio pulse takes to travel from the antenna to a target and return. Because the measured time covers a round trip, the one-way distance uses half of it. Radio waves travel at approximately the speed of light, 300,000,000 meters per second (3 x 10^8 m/s).

Step 1 - one-way travel time: 308.5 µs / 2 = 154.25 µs = 154.25 x 10^-6 s.
Step 2 - one-way distance: Distance = Speed x Time = (3 x 10^8 m/s) x (154.25 x 10^-6 s) = 46,275 meters.
Step 3 - convert to nautical miles: one nautical mile is 1,852 meters, so 46,275 m / 1,852 m/NM ≈ 24.99 NM, which rounds to 25 nautical miles.

Option A (12.5 NM) results from halving the distance a second time after already halving the travel time. Option C (50 NM) results from using the full round-trip time without halving it. Option D (2.5 NM) is off by a factor of ten, most likely from a misplaced decimal point.
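
The same arithmetic can be verified with a short script. This is a minimal sketch for checking the math, not part of the exam material; the constant and function names are illustrative.

```python
# Radar range from a round-trip pulse time.
SPEED_OF_LIGHT_M_S = 3e8          # propagation speed of radio waves, m/s
METERS_PER_NAUTICAL_MILE = 1852   # definition of the nautical mile

def radar_range_nm(round_trip_time_s: float) -> float:
    """Return the one-way distance to the target in nautical miles."""
    one_way_time_s = round_trip_time_s / 2           # pulse travels out and back
    distance_m = SPEED_OF_LIGHT_M_S * one_way_time_s
    return distance_m / METERS_PER_NAUTICAL_MILE

print(radar_range_nm(308.5e-6))  # ≈ 24.99, rounding to 25 nautical miles
```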
