Cathode-ray tube (CRT) monitors, which were the dominant display type before the widespread adoption of LCD and LED panels, operated at relatively high voltages. The exact requirements varied with the size and type of the tube, but CRT monitors generally needed voltages in the thousands of volts to function properly.
The main voltage requirements in a CRT monitor include:
Anode Voltage (High Voltage or HV): This is the high voltage applied to the CRT's anode, also called the accelerating anode or high-voltage anode. It accelerates the electrons emitted by the cathode so they strike the phosphor-coated screen with enough energy to produce light. Anode voltages typically ranged from a few kilovolts for small monochrome tubes to 20–30 kV for large color monitors, depending on the tube's size and design.
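The link between anode voltage and beam energy is simple arithmetic: each electron gains a kinetic energy of e·V joules when it crosses the accelerating gap. A minimal sketch of that calculation follows; the 25 kV input is an assumed example within the range above, not a measured spec, and at these energies a relativistic correction is already worthwhile:

```python
import math

# Physical constants (SI units)
E_CHARGE = 1.602176634e-19     # elementary charge, C
M_ELECTRON = 9.1093837015e-31  # electron rest mass, kg
C = 299_792_458.0              # speed of light, m/s

def electron_speed(anode_volts: float) -> float:
    """Speed (m/s) of an electron accelerated from rest through anode_volts.

    Uses the relativistic relation: gamma = 1 + eV / (m c^2),
    then v = c * sqrt(1 - 1/gamma^2).
    """
    gamma = 1.0 + (E_CHARGE * anode_volts) / (M_ELECTRON * C**2)
    return C * math.sqrt(1.0 - 1.0 / gamma**2)

# Assumed example: a 25 kV anode, typical of a color tube
v = electron_speed(25_000)
print(f"{v:.3e} m/s ({v / C:.0%} of c)")
```

At 25 kV the beam electrons reach roughly 30% of the speed of light, which is why the relativistic form is used rather than the classical v = sqrt(2eV/m).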
Cathode Heater Voltage (Filament Voltage): CRTs have a cathode that emits electrons when heated. The heater runs at a far lower voltage than the anode, commonly 6.3 V (a standard inherited from vacuum-tube practice) or around 12 V, and serves only to warm the cathode so it emits electrons.
Grid and Focus Voltages: CRTs also have control grids and focus electrodes that run at intermediate voltages, well below the anode potential. The control grid sets the electron beam's intensity (and thus brightness), while the focus electrode shapes the beam into a sharp spot on the screen.
It's important to note that working with high voltages can be dangerous: a CRT's anode can hold a substantial charge even after the monitor is unplugged, so these devices should only be opened by people trained in electronics and high-voltage safety. Additionally, due to advances in display technology, CRT monitors have largely been replaced by LCD and OLED displays in modern devices.