I'm curious to know: at what frequency is a signal in a pair of wires considered to be EM waves travelling through it? I know that Ethernet is a changing voltage (electric field) at frequencies of ~30 MHz, while WiFi travelling through a pair is an EM wave at a frequency of 2.4 GHz. At what point does it change to EM?
Ethernet is only spec'd to 250MHz for CAT6. That said, coaxial installed for CATV is only rated to 1GHz (less the cable and more the ubiquitous passive taps), so neither ought to carry 2.4GHz WiFi unmodified, although with CATV I've seen it work for sufficiently relaxed definitions of working.
At any frequency greater than 0 Hz (DC), structures of two or more conductors can support transmission of TEM electromagnetic waves. At microwave frequencies and above it becomes more efficient to switch to a waveguide (a conductive tube).
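To make the waveguide contrast concrete: a hollow rectangular waveguide has a lower cutoff frequency below which it propagates nothing at all, whereas a two-conductor TEM line works all the way down to DC. A quick sketch of the dominant-mode cutoff (the WR-90 X-band guide dimensions are just an illustrative choice, not something from the thread):

```python
# Cutoff frequency of the dominant TE10 mode in a rectangular waveguide:
#   f_c = c / (2 * a), where a is the broad interior wall dimension.
# Below f_c the guide carries no propagating wave; a two-conductor (TEM)
# line has no such lower cutoff. WR-90 is an assumed example guide.

C = 2.998e8  # speed of light in vacuum, m/s

def te10_cutoff_hz(a_m: float) -> float:
    """TE10 cutoff frequency (Hz) for broad wall width a_m in metres."""
    return C / (2 * a_m)

wr90_a = 22.86e-3  # WR-90 broad wall width, metres
print(f"WR-90 TE10 cutoff: {te10_cutoff_hz(wr90_a) / 1e9:.2f} GHz")
```

For WR-90 this lands around 6.6 GHz, which is why that guide is used for X-band (8–12 GHz) rather than anything lower.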
Wouldn't any wire with electric charges in motion be considered to have EM waves? I don't think there is a minimum frequency per se, at least not a relevant one.
Well, to counter that point: WiFi has to go through coax so that the EM waves it carries don't induce currents in nearby conductors. But a standard Ethernet (100 Mbit) cable, i.e. unshielded twisted pair, can be placed right up against metal if you like.
Got it! Ethernet uses balanced pairs, which by themselves give some common-mode rejection, and twisting them greatly reduces the harmful effects when one conductor sits closer to the EM interference than the other. Twisted balanced pair has a lot of loss above ~1 GHz, at which point confining the field inside a coaxial cable helps. Ethernet stays under 250 MHz while WiFi is at 2.4 GHz and up, hence the different cables. As another parent mentions, all changing currents produce EM waves; it's just that the losses matter much more as the frequency increases.
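One concrete reason loss grows with frequency is the skin effect: current crowds into a thin surface layer whose depth shrinks as 1/√f, so the effective conductor resistance rises. A rough sketch using the textbook skin-depth formula for copper (the comparison frequencies are just the ones mentioned in this thread):

```python
import math

# Skin depth: delta = sqrt(rho / (pi * f * mu)), with mu ~= mu_0 for copper.
# Smaller delta means current flows in a thinner shell, so higher resistance
# per unit length and more conductor loss at higher frequency.
RHO_CU = 1.68e-8           # copper resistivity, ohm*m
MU_0 = 4 * math.pi * 1e-7  # vacuum permeability, H/m

def skin_depth_m(f_hz: float) -> float:
    """Depth (m) at which current density falls to 1/e of its surface value."""
    return math.sqrt(RHO_CU / (math.pi * f_hz * MU_0))

for f in (30e6, 250e6, 2.4e9):  # Ethernet signalling, Cat6 limit, WiFi
    print(f"{f / 1e6:7.0f} MHz -> skin depth {skin_depth_m(f) * 1e6:5.2f} um")
```

Going from 30 MHz to 2.4 GHz (an 80x jump) shrinks the skin depth by roughly a factor of nine, and dielectric losses climb with frequency too, which is why the cable construction matters so much more at WiFi frequencies.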
The following page helped me understand it as well.