Consider the devices we use every day: from smartphones and smartwatches to, increasingly, electric vehicles, electronics are becoming as mobile as the people who use them. We rely on our devices to be charged at all times, ready to use whenever we need them. Yet as it stands, we still must plug in our phones, our electric cars, and our smartwatches, tethering ourselves to cords and cables, triggering range anxiety, and leaving us obsessing over the remaining juice in our devices.
Wouldn’t it be nicer to have a user experience in which our electronics are charged autonomously and seamlessly, without any conscious human intervention, even while the devices are in use? The physical act of “plugging in” should be, and soon will be, an outdated concept, thanks to technology that delivers energy to our devices without burdensome cords.
Wireless charging is not a new concept, but the industry has been in flux over the past decade or so as different forms of the technology enter the arena, each promising to eliminate the last tether. Those excited for what’s to come are left confused about which claims reflect wireless-charging reality and which are just myth.
Before we set out to clarify, let’s look at the history of wireless charging for context. Inductive charging, the first generation of wireless-charging technology, was very limiting: it provided only low power transfer and almost zero spatial freedom. Still, it gave wireless charging its first form factors, such as the electric toothbrush. More recently, power transfer via RF fields has been developed to deliver energy over greater distances, but it is not very efficient at doing so.