This one blew the top of my head off, and I don’t throw that comment around lightly.

It begins with two simple tenets: (1) technological advancements have doubled our computing power roughly every 1.5 years since the ’70s, and (2) the electrical efficiency of that computing has been doubling at about the same rate (i.e., improving by a factor of roughly 100 every decade). These are the linchpins of today’s mobile-oriented society: If it weren’t for them, we wouldn’t have the laptops, cell phones, tablets, and handheld game consoles we know today.
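
(For the curious, here’s the quick arithmetic behind that factor-of-100 figure; just a sketch in Python, nothing more.)

```python
# Back-of-the-envelope check: how much does efficiency improve per decade
# if it doubles every 1.5 years?
doubling_time_years = 1.5
years_per_decade = 10

doublings_per_decade = years_per_decade / doubling_time_years  # ~6.67 doublings
growth_per_decade = 2 ** doublings_per_decade                  # ~100x

print(f"{doublings_per_decade:.2f} doublings per decade")
print(f"~{growth_per_decade:.0f}x improvement per decade")
```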

We can already see the road ahead: We want these devices to be ever-faithful, ever-active instruments at our disposal. And for them to always be there when we need them, they can’t constantly be dying. As it stands, we’re tethered to our power cables. The alternative? Well, no one likes a burdensome machine, with fat batts weighing us down and bulking up our pockets. This is the obstacle between us and the next level of mobile technology: those big ole batts.

If, however, energy efficiency continues to double every year and a half, it won’t be long until this pesky issue gives way, too. And that’s reason to be excited. MIT’s Technology Review cited this example:

“…if a modern-day MacBook Air operated at the energy efficiency of computers from 1991, its fully charged battery would last all of 2.5 seconds. Similarly, the world’s fastest supercomputer, Japan’s 10.5-petaflop Fujitsu K, currently draws an impressive 12.7 megawatts. That is enough to power a middle-sized town. But in theory, a machine equaling the K’s calculating prowess would, inside of two decades, consume only as much electricity as a toaster oven. Today’s laptops, in turn, will be matched by devices drawing only infinitesimal power.”
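If you want to sanity-check that “toaster oven” figure yourself, here’s the back-of-the-envelope math in Python, under the same doubling-every-1.5-years assumption (the 1,200–1,500 W toaster figure is my own ballpark, not the article’s):

```python
# Rough sanity check of the "toaster oven" projection: if efficiency doubles
# every 1.5 years, how much power would a machine matching the K computer's
# performance draw after two decades?
k_power_watts = 12.7e6        # Fujitsu K's draw, as cited in the article
doubling_time_years = 1.5
horizon_years = 20

efficiency_gain = 2 ** (horizon_years / doubling_time_years)  # ~10,000x
future_power_watts = k_power_watts / efficiency_gain

print(f"Efficiency gain over {horizon_years} years: ~{efficiency_gain:,.0f}x")
print(f"Projected draw: ~{future_power_watts:,.0f} W "
      f"(a toaster oven pulls roughly 1,200-1,500 W)")
```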

Remember this term: Ultra-low-power computing. This is what is coming, and it will change the game.

The energy-efficient systems-on-a-chip (SoCs) we know now are just the tip of the iceberg. For an idea of what ultra-low-power computing could look like, consider the University of Washington’s Joshua R. Smith and his battery-free wireless sensor. It uses so little electricity (50 microwatts) that it needs neither a battery nor a wall outlet; it simply harvests energy out of thin air (specifically, from television and radio signals), powering itself to deliver data from a weather station to an in-home unit, which displays the info every five seconds.
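To put that 50-microwatt figure in perspective, here’s a quick sketch; the AA-battery comparison is my own rough assumption, not something from Smith’s work.

```python
# Ballpark energy budget for a 50-microwatt sensor that reports every
# 5 seconds (those two figures come from the article; the battery
# comparison below is an illustrative assumption).
sensor_power_watts = 50e-6        # 50 microwatts average draw
report_interval_s = 5

energy_per_report_joules = sensor_power_watts * report_interval_s
print(f"Energy per reading: {energy_per_report_joules * 1e6:.0f} microjoules")

# For scale: a single AA battery stores on the order of 10,000 joules,
# so at this draw it could in principle run the sensor for years.
aa_battery_joules = 10_000        # rough figure for an alkaline AA
runtime_years = aa_battery_joules / sensor_power_watts / (3600 * 24 * 365)
print(f"Hypothetical AA runtime at 50 uW: ~{runtime_years:.0f} years")
```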

That’s not the only potential source. Researchers are hard at work on photovoltaic charging (from ambient light), kinetic harvesters (to pull energy from movement), and thermoelectric power sources (which scavenge waste heat). None of these has ever really proven usable in the mass market, but with processing advancements delivering excellent performance on less and less energy, such charging scenarios become feasible for a much wider range of technologies, as the rough comparison below illustrates.
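The question is simply whether the average harvested power can keep up with the average draw of a mostly-sleeping device. Every number below is a hypothetical ballpark chosen for illustration, not a measured value.

```python
# Illustrative feasibility check: can an ambient source keep a duty-cycled
# device alive? All numbers are hypothetical, chosen only to show the math.
harvested_power_uw = {
    "indoor light (small PV cell)": 100.0,
    "kinetic (wrist motion)": 50.0,
    "ambient RF (TV/radio)": 10.0,
}

active_power_uw = 5_000.0    # hypothetical draw while sensing/transmitting
sleep_power_uw = 1.0         # hypothetical draw while asleep
duty_cycle = 0.001           # active 0.1% of the time

avg_draw_uw = active_power_uw * duty_cycle + sleep_power_uw * (1 - duty_cycle)
print(f"Average draw: {avg_draw_uw:.1f} uW")

for source, supply_uw in harvested_power_uw.items():
    verdict = "sustains it" if supply_uw >= avg_draw_uw else "falls short"
    print(f"{source}: {supply_uw:.0f} uW -> {verdict}")
```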

Someday, could we see devices that never run out of juice? It’s more than possible. Not that there aren’t challenges to overcome, such as coping with the natural fluctuations of airborne and other ambient power sources, but it appears to be where we’re heading. In fact, we could well see such technology become mainstream within our lifetime.
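Here’s a minimal sketch of one common way around that fluctuation problem: bank surplus harvested energy in a small store (say, a supercapacitor) and draw it down when the ambient supply dips. All of the numbers are hypothetical.

```python
# Minimal sketch of energy buffering: store surplus harvested energy and
# spend it when the ambient source dips. Hypothetical numbers throughout.
import random

random.seed(0)
buffer_joules = 0.5            # hypothetical storage (e.g., a supercapacitor)
buffer_max = 1.0
device_draw_watts = 50e-6      # steady 50 uW load

uptime_steps = 0
for step in range(3600):       # simulate one hour in 1-second steps
    harvested_watts = random.uniform(0, 120e-6)   # fluctuating 0-120 uW supply
    buffer_joules += (harvested_watts - device_draw_watts) * 1.0
    buffer_joules = min(max(buffer_joules, 0.0), buffer_max)
    if buffer_joules > 0:
        uptime_steps += 1

print(f"Device stayed powered {uptime_steps / 3600:.0%} of the hour")
```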

For more information about ultra-low-power computing, check out the source link. If you’re a science geek, or interested in the business implications of these technologies, it makes for a fascinating (if heady) read.

[via Technology Review]