FCC Exam Question: 6A401

How would you determine the cathode bias resistance necessary to provide correct grid bias for an amplifier?

A. Bias is equal to IR drop across the cathode resistance
B. Measure with a regular multimeter
C. Both A & B
D. None of the above
Correct Answer: A

Explanation: Cathode bias is a form of self-bias in vacuum tube amplifiers. The cathode resistor is placed in series with the cathode, and the total cathode current (primarily plate current, plus screen current if applicable) flows through it. According to Ohm's Law, this current creates a voltage drop across the resistor, which makes the cathode positive with respect to ground.

Since the grid is typically connected to ground via a grid leak resistor, the grid sits negative relative to the cathode by precisely this voltage drop. Thus, the bias voltage equals the IR drop across the cathode resistance (V_bias = I_cathode * R_cathode). To determine the necessary resistance, calculate R = V_bias / I_cathode for your desired operating point.

Option B is incorrect because a multimeter can *measure* an existing bias voltage or resistance, but it cannot *determine* the resistance value needed to *achieve* a specific bias; that requires calculation based on Ohm's Law.
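As a quick sketch of the calculation above (the 10 V / 50 mA operating point is a hypothetical example, not from the exam):

```python
def cathode_bias_resistor(bias_volts, cathode_current_amps):
    """R = V_bias / I_cathode, per Ohm's Law."""
    return bias_volts / cathode_current_amps

def resistor_dissipation(bias_volts, cathode_current_amps):
    """P = V * I; size the resistor's wattage rating with headroom."""
    return bias_volts * cathode_current_amps

# Hypothetical operating point: 10 V of bias, 50 mA total cathode current
r = cathode_bias_resistor(10.0, 0.050)      # 200 ohms
p = resistor_dissipation(10.0, 0.050)       # 0.5 W dissipated in the resistor
print(r, p)
```

In practice you would also pick a standard resistor value near the computed one and a wattage rating of at least twice the calculated dissipation.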

Pass Your FCC Exam!

Study offline, track your progress, and simulate real exams with the GMDSS Trainer app.


Includes Elements 1, 3, 6, 7R, 8, and 9.