In the world of electrical power devices, there is a "Distribution Voltage" that your utility is providing to you, and there is a "Utilization Voltage" that your devices are designed to work on. Here in the US (we don't know where you are), 220V was an old original standard going back to the 1920s. Some time around the 1930s, as part of one of Roosevelt's "New Deal" programs called the "REA" (Rural Electrification Act), power lines were run out to farms and small communities all across the country. To avoid having the REA workers carry different products for different utility voltages, a standard was established that has since been codified as the ANSI (American National Standards Institute) Distribution Voltages that power utilities are SUPPOSED to adhere to.

Distribution voltage levels have changed over the years. For residential single-phase distribution, the official voltage is actually 240VAC. Partially because of this, the unofficial "Utilization Voltage" level has been 230V for decades, with a tolerance of +-10%, meaning devices are supposed to be designed to accept anything from 207V to 253V. Distribution voltages are required to stay within +-5% maximum deviation, while Utilization Voltages are supposed to accept +-10%. They are not the same values, because it is EXPECTED that there will be a "voltage drop" between the utility transformer and the point at which the device connects, due to the resistance of the wire between them.

Generically, it's all referred to as the "nominal" voltage, with 220, 230, and 240 all being relatively close enough. It's actually somewhat rare for it to actually BE 220V. But because old habits die hard, and because SOME utilities never actually changed, "220V" is a common term still used all over the place.
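The tolerance arithmetic above is easy to check yourself. Here's a minimal sketch (the function name is just for illustration, not from any standard):

```python
def tolerance_band(nominal: float, pct: float) -> tuple[float, float]:
    """Return the (low, high) voltage limits for a nominal value and a +/- percentage."""
    delta = nominal * pct / 100.0
    return (nominal - delta, nominal + delta)

# Utilization voltage: 230V +/-10% -> devices should accept roughly 207V to 253V
print(tolerance_band(230, 10))  # (207.0, 253.0)

# Distribution voltage: 240V +/-5% at the utility side
print(tolerance_band(240, 5))   # (228.0, 252.0)
```

Note that the 240V distribution band (228-252V) sits inside the 230V utilization band (207-253V), which is exactly why the two tolerances differ: the extra margin on the device side absorbs the wiring voltage drop.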