
Machine learning could give us the next big boost in battery life

by Eric Frederiksen | September 24, 2017 1:00 pm PDT

Batteries are at the core of modern life. Sure, they’ve been a big deal for a long time – the Energizer bunny is iconic for a good reason – but now more than ever batteries dictate our lives. They power our laptops, watches, and phones, and they’re starting to power our cars and homes. It wouldn’t be a wild idea to say that battery technology is the next big jump we’re waiting for. Advancements in smartphones and computers are slowing down, and battery weight and size are holding electric cars back. But when it comes to those small devices, one of the biggest problems is also one of the oldest, and machine learning might be coming to the rescue.

With hardware and batteries, there's a clear incentive to improve. All the bezel-less phones we're seeing this year look downright magical, and they often sport displays that are brighter and more efficient than ever. It's easy to sell that stuff, and it's easy to justify buying it, assuming you don't have to sell your organs to afford one of those phones. But one of the biggest drains on battery life has also been one of the hardest to solve: inefficient software.

Over the last 20 years, the code that runs our favorite applications has become immensely more complex and incredibly difficult to optimize. When people were developing Atari games and the like, they had so little space for their code, and so little power to run it, that optimization was a necessity. Now we have processing power to spare, so that discipline is no longer necessary – but all that unoptimized code is still a huge drag on our batteries.

Artificial intelligence is coming to the rescue, though. Academic blog The Conversation points to a couple of studies that use machine learning to analyze code and software modules and estimate their energy consumption. Once these systems are thoroughly trained, they should be able to look at a piece of code and suggest alternatives that perform the same function while drawing less power from the hardware it runs on. The coolest part is that the programmer won't have to do anything: the process would be applied after the code is compiled into machine code – the 1s and 0s – so the optimization happens after the fact.
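To make the idea a little more concrete, here's a minimal sketch of how an energy-estimation model of this general kind could work. It is not taken from the studies themselves: the features, sample numbers, and simple linear model are all illustrative assumptions, standing in for whatever richer representations the real research uses.

```python
# Minimal sketch (illustrative, not the researchers' method): estimate a
# compiled module's energy use from coarse features of its machine code,
# such as instruction-type counts, then use the estimates to pick the
# cheaper of two candidate compilations.
import numpy as np

# Assumed features per compiled module: counts of arithmetic, memory-access,
# and branch instructions pulled from a disassembly.
feature_names = ["arith_ops", "memory_ops", "branch_ops"]

# Hypothetical training data: instruction counts and measured energy (joules)
# for a handful of profiled modules.
X = np.array([
    [1200,  300, 150],
    [ 400, 1100,  90],
    [2500,  600, 400],
    [ 800,  200, 700],
], dtype=float)
y = np.array([0.9, 1.4, 2.1, 1.1])  # measured energy per run, in joules

# Fit a plain least-squares linear model: energy ~ X @ weights.
weights, *_ = np.linalg.lstsq(X, y, rcond=None)

def estimate_energy(instruction_counts):
    """Predict energy (joules) for a module from its instruction counts."""
    return float(np.array(instruction_counts, dtype=float) @ weights)

# Compare two candidate compilations of the same function and keep the one
# the model expects to be cheaper -- the kind of decision an automated
# optimizer could make after compilation, with no programmer involvement.
candidate_a = [900, 500, 120]
candidate_b = [700, 800, 100]
best = min((candidate_a, candidate_b), key=estimate_energy)
print("Estimated energy A:", estimate_energy(candidate_a))
print("Estimated energy B:", estimate_energy(candidate_b))
print("Pick:", "A" if best is candidate_a else "B")
```

The appeal of working at the machine-code level is exactly what the article describes: the substitution can happen after compilation, so developers don't have to change how they write software.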

There's still a long way to go – this is still active research. Getting the energy-consumption estimates just right takes a lot of work, especially considering the variety of hardware and code libraries out there. The researchers working on the project, however, have seen energy-usage reductions of 40 to 70% on some preliminary tasks. It's possible that broader applications of the technology won't show quite so much improvement, but even smaller gains would translate into hours of extra battery life.
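For a sense of what those percentages could mean in practice, here's a back-of-envelope calculation. Every number in it is an assumption chosen for illustration – a roughly 12 Wh phone battery, an average draw of 1.5 W, and compute making up about 30% of that draw – not figures from the article or the studies.

```python
# Back-of-envelope sketch with made-up but plausible numbers: how a cut in
# compute energy translates into extra hours of battery life.
battery_wh = 12.0      # battery capacity in watt-hours (assumed)
total_draw_w = 1.5     # average power draw in watts (assumed)
compute_share = 0.30   # fraction of draw attributable to compute (assumed)

def battery_hours(compute_energy_reduction):
    """Battery life in hours after cutting compute energy by the given fraction."""
    new_draw = total_draw_w * (1 - compute_share * compute_energy_reduction)
    return battery_wh / new_draw

baseline = battery_wh / total_draw_w
for reduction in (0.0, 0.20, 0.40, 0.70):
    hours = battery_hours(reduction)
    print(f"{int(reduction * 100)}% compute-energy cut: "
          f"{hours:.1f} h (+{hours - baseline:.1f} h)")
```

Under those assumptions, even a 20% cut adds the better part of an hour, and the 40–70% reductions reported in the preliminary work would add one to two hours per charge.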

Combined with other developing technologies like the 'battery-free' cellphone, and with existing, improving features like the ambient display tech we're seeing on phones now, the cellphones of five or ten years from now could be leaps and bounds more efficient even as they get more powerful.


Eric Frederiksen

Eric Frederiksen has been a gamer since someone made the mistake of letting him play their Nintendo many years ago, pushing him to beg for his own,...
