# Radio Basics for RFID, Part 1

The following is excerpted from Chapter 3, "Radio Basics for UHF RFID," of the book __The RF in RFID: Passive UHF RFID in Practice__ by Daniel M. Dobkin, published by Newnes (www.newnespress.com).

While this book excerpt from __The RF in RFID: Passive UHF RFID in Practice__ focuses on RFID applications, it is an excellent primer on RF basics. This part covers electromagnetic waves, signal voltage, and power.

Part 2 covers modulation and multiplexing.

Part 3 covers backscatter radio links and introduces link budgets.

Part 4 reveals how to determine the link budget.

Part 5 focuses on the effect of antenna gain on range.

Part 6 covers antenna polarization.

Part 7 covers antenna propagation.

**Electromagnetic Waves**

Recent estimates by cosmological folks suggest that around 95% of the mass in the universe is composed of dark matter and more recently minted dark energy, about which essentially nothing is known. Dark matter and dark energy don't appear to interact with our alternately glowing and dusty stuff except through gravitational means. Folks made of dark matter (if such were to exist) couldn't watch reruns of American Idol even if you forced them: they don't have any means of interacting with the broadcast signal and probably don't want to pay for cable.

For those condemned to the world of baryons and leptons, electromagnetic waves are a fact of life. In most textbooks on electromagnetic theory, you'll wade through Maxwell's equations and possibly laborious arguments about mysterious exchanges between the electric and magnetic fields launching self-supporting structures with little Poynting vectors pointing out of them: all true but unnecessarily obscure. Before we go on to the mundane tasks of introducing the relevant terminology and technology of radio, let's share a little secret, implicit but not readily apparent in the standard texts, which the author has found considerably simplifies his view of electromagnetic radiation. It goes like this: every charge and current in the universe contributes to the potential everywhere else, delayed by the time a signal takes to travel at the speed of light.

To expand a bit: every object in the world that has an electric charge creates an *electrostatic potential*, which falls inversely as the distance. The potential sensed at some distance *r* corresponds to what the charged object was doing at an earlier time, (r/c) ago, because signals move at the speed of light, c = 3x10^{8} m/s. The total electric potential in the space between your nose and the pages of the book you're reading depends on the amount of charge on the fur of a cat in Bulgaria (or Wisconsin, if you happen to be in Dobrich).
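As a quick numerical sketch of this "retarded" delay (the distances below are illustrative assumptions, not values from the text):

```python
# A minimal sketch of the propagation delay r/c: the potential sensed at
# distance r reflects what the charge was doing a time r/c earlier.
C = 3.0e8  # speed of light, m/s

def retarded_delay(r_m: float) -> float:
    """Time (s) by which the sensed potential lags the source at distance r_m."""
    return r_m / C

print(retarded_delay(0.3))    # a book ~0.3 m from your nose: ~1 ns
print(retarded_delay(2.0e6))  # a cat ~2,000 km away: ~6.7 ms
```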

However, we almost never care, because electric charge comes in two flavors, positive and negative, and the amount of energy associated with an isolated charge of only one type is enormous: a microgram of hydrogen, split into its constituent protons and electrons and separated by 1 m, could support a mass of 8 million kilograms against the gravitational attraction of the entire earth. So in almost every case, adjacent to each electron with a negative charge is a proton with a positive charge, such that the two cancel, and have no net effect on your cellphone conversation. Electric currents similarly give rise to a magnetic vector potential in the direction of the current flow, which again exists everywhere with amplitude decreasing with distance, at a correspondingly delayed time.
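The "8 million kilograms" figure can be sanity-checked from standard physical constants (the constants and the back-of-envelope setup here are assumptions added for illustration):

```python
# Rough check of the claim that a microgram of hydrogen, split into protons
# and electrons separated by 1 m, could support ~8 million kg against gravity.
K = 8.99e9            # Coulomb constant, N·m²/C²
E_CHARGE = 1.602e-19  # elementary charge, C
M_PROTON = 1.67e-27   # proton mass, kg (hydrogen ≈ one proton per atom)
G = 9.81              # gravitational acceleration, m/s²

n_atoms = 1e-9 / M_PROTON      # atoms in one microgram of hydrogen
q = n_atoms * E_CHARGE         # total charge of each separated cloud, C
force = K * q * q / 1.0**2     # Coulomb attraction at 1 m separation, N
supported_mass = force / G     # mass whose weight this force could balance, kg

print(f"{supported_mass:.2e} kg")  # on the order of 8 million kg
```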

Similar arguments show that most currents don't have any effect on distant objects. If a current is flowing in one direction, with no compensating countercurrent, charge must be accumulating somewhere, leading after a while to enormous energies (voltages). Most electric currents flow in a balanced loop: the potential from current flowing up cancels that from current flowing down, and again no net effect results on distant observers. These points are made pictorially in **Figure 3.1**, where we also introduce a bit of the mathematical terminology associated with the subject.

At first glance, we're left with no potentials and no waves, but, of course, this is not correct. For example, we can run an uncompensated current for a little while before charge accumulation causes too much voltage to build up and then turn it around. This uncompensated current will lead to a detectable signal at a distance. In addition, cancellation will often fail to be exact when the charges and currents are changing in time because of the slight differences in delays due to the finite size of the region over which the currents flow. For example, if in **Figure 3.2**, the loop current is suddenly turned on all around the loop at some time *t* = 0, the potential from the downward-flowing current arrives at r just a bit sooner than that from the upward-flowing current. Cancellation fails, and an observer sees some resulting potential: *radiation* has occurred.
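To get a feel for how badly cancellation can fail, compare the near-to-far arrival-time difference across a loop with the period of the current. The frequency and loop size below are assumed, UHF-scale example values, not figures from the text:

```python
# Sketch: the size of the cancellation failure depends on the loop's extent
# relative to the period of the current flowing on it.
C = 3.0e8   # speed of light, m/s
f = 915e6   # assumed carrier frequency, Hz (UHF RFID band)
d = 0.164   # assumed loop extent, m (about half a wavelength at 915 MHz)

period = 1.0 / f                  # ~1.09 ns
dt = d / C                        # arrival-time difference, near vs. far side
phase_deg = 360.0 * dt / period   # that difference expressed as phase

print(f"time offset {dt*1e9:.2f} ns = {phase_deg:.0f} degrees of phase")
# ~180 degrees: the two contributions reinforce rather than cancel.
```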

This leads to our second key observation:

It should be apparent that, for an antenna to work, something has to change: radiation is the result of the transient failure of delayed signals to cancel each other. To create a continuous signal, currents flowing on an antenna must continuously change without actually getting anywhere: that is, currents and charges are usually periodic functions of time, alternately increasing and decreasing but returning to the same state again and again after the same interval. Periodic functions have a *period*—a time duration over which the signal is exactly repeated—and a frequency, conventionally measured in hertz (Hz) and equal to (1/period). Thus, a signal that repeats itself every second has a frequency of 1 Hz.
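A small sketch of the period/frequency relationship, using an assumed example signal:

```python
import math

# Sketch: a periodic current with period T repeats exactly; frequency = 1/T.
T = 0.5        # assumed period, seconds
f = 1.0 / T    # frequency, Hz: a signal repeating every half second is 2 Hz

def current(t: float) -> float:
    """Example periodic antenna current (unit amplitude, assumed)."""
    return math.sin(2 * math.pi * f * t)

# The value at time t and at time t + T agree to floating-point precision:
t = 0.123
print(abs(current(t) - current(t + T)) < 1e-12)  # True
```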

The sine and cosine are archetypal periodic functions, widely used in science and electrical engineering; in electrical engineering these are often combined into a complex exponential function, which absorbs both frequency and delay (phase) into one expression: e^{ix} = cos(x)+i sin(x), where i is the imaginary unit √(-1).
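Euler's formula, e^{ix} = cos(x) + i sin(x), can be checked numerically in a couple of lines (the phase value here is an arbitrary example):

```python
import cmath
import math

# Sketch: the complex exponential packs cosine and sine into one expression.
x = 0.7  # an arbitrary phase, radians
lhs = cmath.exp(1j * x)                    # e^{ix}
rhs = complex(math.cos(x), math.sin(x))    # cos(x) + i·sin(x)
print(abs(lhs - rhs) < 1e-12)  # True: the two forms agree
```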

We should note that instead of arranging the currents on an antenna so as to frustrate cancellation at a distance, we can place the observer (the receiving antenna) so close to the transmitting antenna that cancellation is defeated simply because some currents on the transmitting antenna are close to the receiving antenna and have a larger effect than those more distant. This sort of interaction is known as *near-field coupling* or, alternatively, as *inductive coupling*. We can think of inductive coupling as being fundamentally about differences in distance between different parts of an antenna, whereas radiation is usually more closely related to differences in propagation time (phase) from one part of an antenna to another.

Armed with an antenna carrying a periodic current, we can create electromagnetic waves, propagating at the speed of light and falling in amplitude inversely with the distance (**Figure 3.3**).

The waves induce a voltage in the receiving circuit, periodic with the same frequency as the transmitted signal, whose magnitude is inversely proportional to the distance between the transmitter and receiver. Using harmonic notation, the delay in time of Figure 3.1 becomes a phase offset by the wavenumber *k* multiplied by the distance *r*. (The absolute phase is often not readily observable or controllable in practical radio systems, so we can generally drop this term.) It is this voltage we make use of to transmit information—in the case of RFID, from a reader to a tag and back. How should we measure and describe it?
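A sketch of this received signal in harmonic notation: the magnitude falls as 1/r and the time delay appears as a phase offset k·r. The carrier frequency and the unit reference amplitude are assumed example values:

```python
import cmath
import math

# Sketch of the far-field received signal: amplitude ∝ 1/r, phase offset k·r.
C = 3.0e8               # speed of light, m/s
f = 915e6               # assumed carrier frequency, Hz (UHF RFID band)
k = 2 * math.pi * f / C # wavenumber, rad/m

def received(r: float, a0: float = 1.0) -> complex:
    """Complex received amplitude at distance r (a0: hypothetical 1 m reference)."""
    return (a0 / r) * cmath.exp(-1j * k * r)

# Doubling the distance halves the magnitude; the phase advances by k·r:
v1, v2 = received(1.0), received(2.0)
print(abs(v1) / abs(v2))  # ≈ 2
```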