Perfect Wi-Fi energy harvester gathers scant bounty

I am pretty nutty when it comes to energy efficiency. It upsets me to know that if the morons who built my house had used the latest information available at the time, my house would require almost no heating. One thing that really gets me excited is the prospect of reusing waste energy. I like the idea of taking energy that would otherwise be destined to diffuse out into the environment and turning it into something useful.

As such, it was inevitable that a paper on recovering microwave energy would catch my eye. And, yes, I shall inflict it on you, too. Unfortunately, harvesting Wi-Fi radiation doesn’t seem like it will win us very much. But before we get to that, let’s take a look at the very cool ideas behind the harvester.

Stopping reflections

The basic idea behind harvesting Wi-Fi radiation is a very old one: just construct a circuit that absorbs all that microwave energy. Let’s take a very artificial situation: imagine a microwave traveling along a bit of coaxial cable. A coaxial cable consists of a central conducting wire enclosed in a cylinder of non-conducting dielectric material, all wrapped in a conductor. The power from the microwaves is not transmitted “in” the central wire. It is actually in the electric and magnetic fields in the dielectric. These propagate as waves down the cable at a speed determined in part by the properties of the dielectric material.

When the wave hits the end of the cable, it has a problem. Right at the boundary between the air and the dielectric material, the wave has to instantaneously change from one speed to another. If all of the power in the microwave were to be transmitted out of the end of the cable, the electric field would have to have two different values at the same time and place, which cannot happen. So, the wave is reflected from the end and returns back up the cable (possibly destroying the transmitter in the process).

If we don’t want the power to be reflected, we have to terminate the cable in such a way that it looks (from the microwave’s perspective) like the cable just continues on indefinitely. This is called matching, and it is an essential concept in microwave engineering, optics, and, actually, pretty much everywhere in physics and engineering.

In the case of a coaxial cable, the dielectric is usually chosen such that a 50 Ω resistor will match the properties of the cable. So, if I put a 50 Ω resistor between the outside conductive coating and the central wire, all of the microwave energy will be absorbed by the resistor.
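
If you want to see matching in numbers, here is a minimal sketch. It uses the standard transmission-line result that the reflection coefficient at a termination is (Z_load − Z_0) / (Z_load + Z_0), with its square giving the fraction of power bounced back up the cable; the specific load values are just illustrative.

```python
# A minimal sketch of impedance matching, using the standard
# transmission-line result: the reflection coefficient at the load is
# gamma = (Z_load - Z_0) / (Z_load + Z_0), and |gamma|^2 is the fraction
# of incident power reflected back up the cable.

def reflected_power_fraction(z_load: complex, z_0: float = 50.0) -> float:
    """Fraction of incident power reflected at the cable's termination."""
    gamma = (z_load - z_0) / (z_load + z_0)
    return abs(gamma) ** 2

print(reflected_power_fraction(1e9))   # open end: ~1.0, total reflection
print(reflected_power_fraction(50.0))  # matched 50-ohm load: 0.0
print(reflected_power_fraction(75.0))  # 75-ohm load: 0.04, slight mismatch
```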

For a coaxial cable, or, in fact, for any transmission line, designing an electrical circuit that matches the properties of the line is not that difficult. And, in fact, the antenna on your mobile phone has exactly this sort of circuit: the antenna and its terminating circuit need to match each other and they need to match the properties of free space propagation as closely as possible. A good match means you can absorb a lot of radiation from a small antenna.

Wi-Fi wastage

A Wi-Fi transmitter faces the same constraints as a receiver. But its energy doesn’t just go to the receiving antenna; it spreads out over a much broader area. This means that most of the energy is lost. If we were to place the right sort of antennas around the place, we could recover a lot of that energy, right? Well, it’s kind of tricky.

First of all, these sorts of receivers would have to be built into the walls of the house or apartment, which means that, unlike the antennas on devices, they cannot be adjusted for optimum reception. Meanwhile, the Wi-Fi signal arrives from every direction, and the polarization (the spatial orientation of the electric field relative to the direction the wave is traveling) could be anything. Antennas are sensitive to both direction and polarization.

Then there is the fact that the energy is spread very thin. At the source, we might have something like 10 mW of radiated power. But move 10 meters away, and the power passing through your body would amount to 10 to 20 microwatts. So the losses just pile up. Distance is a problem, and, if your antenna is only good for one polarization, you lose half of that immediately. Then you can add any losses in the circuitry that takes the microwave energy and down-converts it to a DC supply. It starts to look like a very difficult problem.
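
Those numbers are easy to check. Here is a rough sketch, assuming the transmitter radiates equally in all directions (isotropically, which real antennas don’t quite do) and guessing a body cross-section of about 1.5 square meters:

```python
import math

def power_through_area(p_tx_watts: float, distance_m: float, area_m2: float) -> float:
    """Power passing through an area at some distance from an isotropic source."""
    power_density = p_tx_watts / (4 * math.pi * distance_m ** 2)  # W/m^2
    return power_density * area_m2

# 10 mW source, 10 m away, ~1.5 m^2 of body cross-section:
p = power_through_area(10e-3, 10.0, 1.5)
print(f"{p * 1e6:.0f} microwatts")  # ~12 microwatts
```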

Going meta on antennas

To get around these types of problems, a trio of researchers suggested an antenna network that tries to minimize these losses.

The first step was to remove polarization dependence. The way that they did this was to design a patch antenna (an antenna that is a flat patch of metal) that would respond optimally to vertically and horizontally polarized microwaves. Although the antennas respond to both polarizations, the physical location of the wire that couples the antenna to the rest of the circuitry determines which polarization provides power. Antennas that have wires attached at the side are sensitive to horizontally polarized light, while antennas with a wire attached at the top are sensitive to vertically polarized light.

(I use the word “attached” in the loosest possible sense here. The connecting wires can themselves be thought of as antennas, with the patch antenna re-radiating the received power to the wire.)

To create an energy harvester based on this design, the researchers created a grid of antennas. Odd-numbered columns of antennas were set up to receive vertically polarized light, while even-numbered columns were set up to receive horizontally polarized light.

Now, you might think that this is silly, because you lose half the power from each polarization. But this is not the case. The antennas all talk to each other. The column of antennas that is coupled at the top still receives both polarizations. The vertically polarized microwaves are passed on to the wires located near the top of each antenna. The horizontally polarized microwaves are passed to the antennas in adjacent columns, where they are then passed on to the wires located near the side of each antenna. With the right design, all of the incident power can be passed to the power conversion circuit.
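​
You can think of the grid as doing simple polarization bookkeeping. Here is a toy model (mine, not the researchers’): for a linearly polarized wave arriving at an angle θ from vertical, the vertically coupled columns see a cos²θ share of the power and the horizontally coupled columns see sin²θ, so between them nothing is lost:

```python
import math

def split_by_polarization(p_incident: float, theta_deg: float):
    """Split incident power between vertical and horizontal couplings."""
    theta = math.radians(theta_deg)
    p_vertical = p_incident * math.cos(theta) ** 2
    p_horizontal = p_incident * math.sin(theta) ** 2
    return p_vertical, p_horizontal

for angle in (0, 30, 45, 90):
    v, h = split_by_polarization(1.0, angle)
    print(f"{angle:3d} deg: vertical={v:.2f}, horizontal={h:.2f}, total={v + h:.2f}")
```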

So the antenna looks, basically, like a grid of metal patches sitting on a non-conducting material. And, like our coaxial cable from earlier, the power that the antenna gathers is stored in the fields that are in the dielectric material. That means we want a dielectric material that absorbs as little power as possible. The amount of power absorbed by the dielectric is often measured by something called the loss tangent. The researchers searched for—and found—a material that has a loss tangent about 100 times smaller than that of the materials most commonly used in printed circuit boards.
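
To get a feel for why the loss tangent matters, here is a hedged sketch using the textbook low-loss approximation, where the attenuation constant is roughly π·√εr·tanδ divided by the free-space wavelength (in nepers per meter) and power decays as exp(−2αd). The εr and tanδ values below are generic textbook numbers, not the paper’s:

```python
import math

def power_surviving(tan_delta: float, distance_m: float,
                    freq_hz: float = 2.4e9, eps_r: float = 4.0) -> float:
    """Fraction of power surviving a run through a lossy dielectric."""
    lambda_0 = 3e8 / freq_hz                                   # free-space wavelength
    alpha = math.pi * math.sqrt(eps_r) * tan_delta / lambda_0  # nepers per meter
    return math.exp(-2 * alpha * distance_m)

# A typical circuit-board dielectric versus a material 100x better, over 10 cm:
print(f"typical board: {power_surviving(0.02, 0.10):.1%} survives")
print(f"low-loss:      {power_surviving(0.0002, 0.10):.1%} survives")
```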

Don’t let reality near my model

Now, of course, in models, the antenna array absorbs 100 percent of all known Wi-Fi radiation (well, 2.4 GHz Wi-Fi radiation). But how does it work in reality? There the story gets more complicated. If the energy sent to the coupling wires is directly measured, then the energy transfer is about 97 percent, which is pretty darn good.

But we want to use that energy, and that is where things take a dive. If the wires are directly coupled to load resistors (so we turn the Wi-Fi energy into heat), then it still works okay, with 92 percent of the radiation getting absorbed by the load resistors. The five percent loss is due to absorption in the dielectric material as the energy is transmitted to the load resistors.

The real losses start when the energy is transformed from microwave power to a usable DC electrical signal. Even in models, only 80 percent efficiency is possible here. In experiments, the researchers managed about 70 percent. Normally, I’d take 70 percent, but not on this occasion. The problem is that you only get 70-percent conversion efficiency when the incident power is quite high. In fact, the researchers tested incident powers (and this is the total power falling on the antenna grid, not radiated power) of 1 to 10 mW. At 1 mW, the conversion efficiency was about 30 percent. Scaling the vaguely linear trend (on a log scale) to real-world use of a 100 mW transmitter located some 10 m from the antenna, the incident power is on the order of a microwatt. That corresponds to a conversion efficiency of about five percent, which doesn’t really seem so good.
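
The “order of a microwatt” figure is easy to sanity-check. Assume an isotropic 100 mW transmitter and guess at the grid’s effective collection area (the specific apertures below are my assumption, not numbers from the paper):

```python
import math

def incident_power(p_tx_w: float, distance_m: float, area_m2: float) -> float:
    """Power collected by an area at some distance from an isotropic source."""
    return p_tx_w / (4 * math.pi * distance_m ** 2) * area_m2

for area in (0.01, 0.05, 0.1):  # hypothetical collection areas, m^2
    p = incident_power(0.1, 10.0, area)
    print(f"area {area:5.2f} m^2 -> {p * 1e6:.1f} microwatts")
```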

The problem, according to the researchers, lies in the power conversion network. There are some losses in transmitting the power to the point where the microwave power is turned into direct current. But there are even bigger losses in the diodes. Diodes only allow current to flow in one direction, so a network of diodes can turn the oscillating microwave field, where the voltage swings from negative to positive every few nanoseconds, into a voltage that is only positive.

Diodes are not perfect: they take time to switch, and they require that the applied voltage reach a certain value before they allow current to flow. The result is that a lot of the microwave power is not converted but is instead lost as heat, because it sits below that threshold.
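
Here is a crude numeric illustration of that threshold effect. It is not the paper’s rectifier circuit; it just models an idealized full-wave rectifier that only passes the part of a sinusoid above a forward voltage, with a Schottky-ish 0.3 V threshold as my assumed value:

```python
import math

def rectified_fraction(v_amp: float, v_f: float, n: int = 100_000) -> float:
    """Fraction of input power surviving an idealized thresholded rectifier."""
    p_in = p_out = 0.0
    for i in range(n):
        v = v_amp * math.sin(2 * math.pi * i / n)
        p_in += v * v
        if abs(v) > v_f:                # the diode conducts only above threshold
            p_out += (abs(v) - v_f) ** 2
    return p_out / p_in

print(f"strong 3.0 V signal: {rectified_fraction(3.0, 0.3):.0%}")  # ~77% survives
print(f"weak 0.5 V signal:   {rectified_fraction(0.5, 0.3):.0%}")  # ~10% survives
```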

I’m pretty sure that this is fundamental to how diodes operate, so while those losses can be reduced somewhat, I don’t think we are going to get diodes that are an order of magnitude more efficient any time soon. On the other hand, I think the authors might argue that it doesn’t matter too much. You see, because the antennas all couple to each other, the grid can be made bigger so that the total power incident on the grid is large enough to reach peak efficiency.

I’m not sure that this works either. At a 10m distance, the antenna grid has to cover one entire wall of a room. That is the intended purpose, of course. Unfortunately, that is where other problems kick in. At present, the transmission of power from the individual antennas to the diodes costs us about five percent of the total power. But, power losses scale with distance. And, in our real-world scenario, where antennas are spread across an entire wall, the distances increase by a factor of about 40.
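
To put a hedged number on that scaling argument: if the loss behaves like a fixed attenuation per unit length (a simplification on my part; the real loss mechanism may differ), then the 95 percent that survives the short run becomes 0.95 raised to the 40th power over the longer one:

```python
t_short = 0.95          # 95 percent survives the original run
t_long = t_short ** 40  # same per-length loss over 40x the distance
print(f"{t_long:.0%} of the power survives")  # roughly 13 percent
```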

The conclusion is that the antenna design is really cool. It has the advantage that it works no matter its orientation with respect to the Wi-Fi transmitter and no matter the polarization of the incoming radiation. But the antenna has to be coupled to non-ideal components, and those really make it difficult to imagine getting it to work in the real world.

I want my half Watt back

If we could, would it be worth it? “Yes,” thinks my energy-efficiency-obsessed brain, “of course it would.” But once the rest of my brain has soaked in coffee for a while, things look a lot less worthwhile.

According to the specifications on my base stations, the transmitted power is a maximum of 100 to 200 mW per channel. I have one dual-band and one tri-band base station, or about 800 mW of radio frequency power being emitted (maximum). In terms of my power bill, this corresponds to about 0.02 kWh per day. Now, the power absorbed by the connected devices is negligible. My computer is currently reporting -54 dBm, which corresponds to just under four nanowatts. So, let’s assume that all the Wi-Fi transmitted power is available for recovery.

This means that recovering the energy from the microwave radiation emitted by my Wi-Fi base stations would save me about two bucks per year. Put differently, I’d harvest 0.02 kWh out of my total daily (electrical) energy consumption, which in winter is about 19 kWh per day.
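
For anyone who wants to check that arithmetic: the electricity price is my assumption (roughly $0.25 per kWh); the rest follows from the dBm definition, P(mW) = 10^(dBm/10), and the base stations’ rated output:

```python
def dbm_to_watts(dbm: float) -> float:
    """Convert a dBm reading to watts: P(mW) = 10**(dBm / 10)."""
    return 10 ** (dbm / 10) / 1000.0

print(f"-54 dBm = {dbm_to_watts(-54) * 1e9:.1f} nanowatts")  # ~4 nW

p_tx_w = 0.8                      # ~800 mW of radiated RF, worst case
kwh_per_day = p_tx_w * 24 / 1000  # ~0.019 kWh per day
price_per_kwh = 0.25              # assumed electricity price, USD
print(f"{kwh_per_day:.3f} kWh/day -> "
      f"${kwh_per_day * 365 * price_per_kwh:.2f}/year")  # about two bucks
```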

That doesn’t mean this is utterly useless, though. It could be valuable for wireless power delivery. Microwaves can be focused to relatively small areas. And, with some computation, a transmitter can use the multipath interference in most environments to transmit the power efficiently to a small target area, while still keeping the power density in the rest of the environment relatively low (so no one is walking through radiation beams of 100 W). Under these circumstances, an antenna system that is highly flexible and highly efficient makes the whole idea a bit simpler to implement. And, provided that the conversion efficiency was reasonable, I think most of us would like that.

A second use might be to help create better Wi-Fi networks. Most of the problems associated with Wi-Fi networks are due to interference, either multipath interference from your own Wi-Fi transmitter or channel competition among neighbors. Now, a way to fix that might be to place these boards (without the power conversion electronics) at strategic locations around the home, where they’d absorb stray signals rather than reflecting them, blocking some of this interference. The advantage they would have over a simple sheet of aluminum foil is that antennas like this have a larger effective area than their physical size. That means that, in some situations, a few square meters of foil might be replaced by a smaller antenna.

Are these two use cases enough motivation to develop this system? I’m not sure. Nevertheless, I bet these antenna designs turn up in a device somewhere.

Scientific Reports, 2017, DOI: 10.1038/s41598-017-15298-5
