Old 06-03-2007, 07:17 AM   #30
linknoid
Superior Inhabitant
 
Join Date: Sep 2002
Posts: 74
Electricity propagates as an electromagnetic wave. There is an electric component and a magnetic component. The moving electrons induce a magnetic field, the changing magnetic field induces an electric field, and this interplay propagates energy through whatever medium is being used.
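
To put some rough numbers on that, here's a quick Python sketch of the magnetic field strength around a long straight wire. The 1000 amp current and the distances are just values I made up for illustration, not real line figures:

# Rough sketch: field around a long straight wire, B = mu0 * I / (2 * pi * r).
# The current and distances below are made-up example numbers.
import math

MU0 = 4e-7 * math.pi          # permeability of free space, T*m/A
current = 1000.0              # amps flowing in the line (hypothetical)

for r in (1.0, 10.0, 100.0):  # distance from the wire, meters
    b = MU0 * current / (2 * math.pi * r)
    print(f"B at {r:>5.0f} m: {b * 1e6:.2f} microtesla")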

Now if you put something conductive near the electromagnetic field surrounding the power lines, part of the changing magnetic field induces a current in that conductor instead of in the power line, in this case the fluorescent bulbs. So the main conductor, the power line, has energy bled off of it.
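
Here's a rough back-of-the-envelope sketch of how much voltage that coupling can induce in a rectangular wire loop laid out near a 60 Hz line. All the dimensions and the current are made-up example values:

# Rough sketch: peak EMF induced in a rectangular loop lying flat near an
# AC power line.  Flux through the loop is (mu0 * I * L / 2pi) * ln(b/a),
# and by Faraday's law the peak EMF is omega times the peak flux.
# Every number below is a hypothetical example value.
import math

MU0 = 4e-7 * math.pi     # T*m/A
i_peak = 1000.0          # peak line current, amps
freq = 60.0              # line frequency, Hz
length = 2.0             # loop side parallel to the line, meters
a, b = 10.0, 12.0        # near and far edges of the loop from the line, meters

flux_peak = MU0 * i_peak * length / (2 * math.pi) * math.log(b / a)
emf_peak = 2 * math.pi * freq * flux_peak
print(f"peak induced EMF: {emf_peak * 1000:.1f} mV")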

The fluorescent bulbs then turn that induced current into light. However, if there were nowhere for the energy to go, the electromagnetic field induced in an external object would be stored and returned to the power line when the current reverses direction (60 cycles per second in the U.S., 50 in England). So in that case the energy isn't lost, it gets fed back into the power line. However, this effect does throw the voltage and current out of sync with each other, which does bad things for efficiency if you don't correct for the problem by pushing them back in sync again.
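
To see the out-of-sync problem in numbers, here's a little sketch of how a phase shift between voltage and current splits the apparent power into the part that's actually delivered and the part that just sloshes back and forth each cycle. The voltage, current, and phase angle are arbitrary example values:

# Rough sketch: real vs. reactive power for a given phase shift.
# All input numbers are hypothetical.
import math

v_rms = 120.0            # volts
i_rms = 10.0             # amps
phase_deg = 30.0         # phase angle between voltage and current, degrees

apparent = v_rms * i_rms                                  # volt-amperes
real = apparent * math.cos(math.radians(phase_deg))       # watts actually delivered
reactive = apparent * math.sin(math.radians(phase_deg))   # VAR, fed back to the line

print(f"apparent power: {apparent:.0f} VA")
print(f"real power:     {real:.0f} W  (power factor {real / apparent:.2f})")
print(f"reactive power: {reactive:.0f} VAR")

Pushing them back in sync is what power factor correction (capacitor banks and the like) is for.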

This induction effect also means that we wouldn't get 100% efficient energy transfer by changing our power lines to superconductors. I once read that the main power line losses are through induction, not resistance, so even though superconductors have no resistance, we wouldn't really save much by switching to them.
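
For a sense of scale, resistive loss is just I squared times R, so that's the only piece a superconductor would eliminate; anything lost to coupling with the surroundings would still be there. A quick sketch with made-up numbers:

# Rough sketch: resistive loss on a line is I^2 * R.  A superconductor sets
# R to zero, but losses from induction/coupling to nearby objects remain.
# The resistance, current, and power figures below are all hypothetical.
line_resistance = 5.0     # ohms over the whole run
current = 500.0           # amps
power_sent = 400e6        # watts entering the line

resistive_loss = current ** 2 * line_resistance
print(f"resistive loss: {resistive_loss / 1e6:.2f} MW "
      f"({100 * resistive_loss / power_sent:.2f}% of the power sent)")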

Hopefully this explanation wasn't too confusing to those who haven't studied physics/electrical engineering, and not too inaccurate for those who have.