Saving on the Edge: Efficiently Allocating Energy in Mobile Edge Computing for Augmented Reality Applications
Some may think people expended too much energy playing Pokémon Go during last year’s craze around the augmented reality game. It turns out the technology may have, too. Augmented reality games such as Pokémon Go require a level of processing that quickly drains our phone batteries. With this in mind, researchers have proposed a model that exploits the inherently collaborative nature of mobile augmented reality applications to make mobile edge computing more energy efficient.
Once implemented, the new approach could substantially extend the battery life of devices running augmented reality applications. In simulations, the researchers’ proposed system used 63 percent less energy than augmented reality applications currently consume.
Augmented reality applications require significant computational power because they must respond quickly to users’ actions and continuously update the state of the augmented reality environment. Furthermore, when people play the same augmented reality application in the same location, considerable power is needed to receive and process data without delay so that every user sees the same augmented reality scene in real time.
In previous mobile edge computing designs, each mobile device would individually offload the energy-intensive processing of augmented reality applications over a mobile network, compiling its data and routing it through wireless access points to cloudlet processors (small-scale cloud data centers that use less energy because they sit physically closer to mobile users). The close proximity of cloudlets reduces network congestion and lets application data be processed quickly.
However, even with a cloudlet and mobile edge computing in place, having each user offload its data separately still demands significant transmission and processing energy from every mobile device.
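To make the cost of this per-user offloading concrete, here is a minimal sketch of a toy transmission-energy model (energy = transmit power times airtime), with hypothetical rates, powers, and data sizes; it is illustrative only and not the researchers’ model.

```python
# Illustrative only: a toy energy model for the conventional setup in which every
# device uploads its own copy of the AR scene data to the cloudlet and downloads
# the processed result on its own. All values are hypothetical.

def transmission_energy_joules(bits: float, rate_bps: float, power_watts: float) -> float:
    """Energy = power x airtime, where airtime = bits / rate."""
    return power_watts * (bits / rate_bps)

def separate_offloading_energy(users, scene_bits, result_bits):
    """Total energy when each user uploads and downloads independently."""
    total = 0.0
    for u in users:
        total += transmission_energy_joules(scene_bits, u["uplink_rate_bps"], u["tx_power_w"])
        total += transmission_energy_joules(result_bits, u["downlink_rate_bps"], u["rx_power_w"])
    return total

if __name__ == "__main__":
    # Three players sharing the same AR scene; each sends ~4 Mb and receives ~2 Mb.
    players = [
        {"uplink_rate_bps": 2e6, "downlink_rate_bps": 8e6, "tx_power_w": 1.0, "rx_power_w": 0.4},
        {"uplink_rate_bps": 1e6, "downlink_rate_bps": 6e6, "tx_power_w": 1.0, "rx_power_w": 0.4},
        {"uplink_rate_bps": 3e6, "downlink_rate_bps": 9e6, "tx_power_w": 1.0, "rx_power_w": 0.4},
    ]
    print(f"Separate offloading: {separate_offloading_energy(players, 4e6, 2e6):.2f} J")
```

In this model every device pays the full upload and download cost itself, so the total grows linearly with the number of users even though they are all transmitting the same scene data.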
The new approach from the New Jersey Institute of Technology efficiently allocates resources in mobile edge computing by using the shared data of augmented reality applications to lower the energy usage of all devices in the network.
The software selects the users with the best connections in the mobile network to upload the shared information of an augmented reality scene on behalf of all local users. The selected devices route that information to a cloudlet processor. After the upload, the cloudlet transmits the processed data to all users in the network at once over a shared downlink, instead of uploading and transmitting the data separately for each user.
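A rough sketch of this collaborative idea, using the same kind of toy airtime-energy model as above (hypothetical names and parameters, not the NJIT implementation): the best-connected device uploads the shared scene data once, and the cloudlet multicasts a single copy of the processed result to everyone.

```python
# Rough sketch of collaborative offloading: one well-connected device uploads the
# shared AR scene data, and the processed result is multicast once to all users.
# The model and every parameter below are hypothetical, for illustration only.

def airtime_energy(bits: float, rate_bps: float, power_watts: float) -> float:
    """Energy = power x (bits / rate)."""
    return power_watts * (bits / rate_bps)

def collaborative_offloading_energy(users, scene_bits, result_bits):
    # The device with the fastest uplink carries the shared upload for everyone.
    uploader = max(users, key=lambda u: u["uplink_rate_bps"])
    upload = airtime_energy(scene_bits, uploader["uplink_rate_bps"], uploader["tx_power_w"])

    # The multicast rate is limited by the slowest downlink so every user can decode;
    # each device only spends receive energy on that single shared transmission.
    multicast_rate = min(u["downlink_rate_bps"] for u in users)
    download = sum(airtime_energy(result_bits, multicast_rate, u["rx_power_w"]) for u in users)
    return upload + download

def separate_offloading_energy(users, scene_bits, result_bits):
    """Baseline for comparison: every user uploads and downloads on its own."""
    return sum(
        airtime_energy(scene_bits, u["uplink_rate_bps"], u["tx_power_w"])
        + airtime_energy(result_bits, u["downlink_rate_bps"], u["rx_power_w"])
        for u in users
    )

if __name__ == "__main__":
    players = [
        {"uplink_rate_bps": 2e6, "downlink_rate_bps": 8e6, "tx_power_w": 1.0, "rx_power_w": 0.4},
        {"uplink_rate_bps": 1e6, "downlink_rate_bps": 6e6, "tx_power_w": 1.0, "rx_power_w": 0.4},
        {"uplink_rate_bps": 3e6, "downlink_rate_bps": 9e6, "tx_power_w": 1.0, "rx_power_w": 0.4},
    ]
    scene_bits, result_bits = 4e6, 2e6
    base = separate_offloading_energy(players, scene_bits, result_bits)
    collab = collaborative_offloading_energy(players, scene_bits, result_bits)
    print(f"Separate: {base:.2f} J, collaborative: {collab:.2f} J "
          f"({100 * (1 - collab / base):.0f}% less)")
```

Because the shared scene data crosses the uplink only once and the result is sent only once on the downlink, the per-user cost in this toy model shrinks as more users join the same scene, which mirrors the intuition behind the researchers’ approach.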
As users access augmented reality applications, these uploads and downloads continue to occur. The chart below represents the difference in energy usage between the proposed system and a system where each user separately offloads data.
This method produced 63 percent energy savings in simulation, but the researchers believe the system could deliver even greater savings than their simulations recorded.
“While the variation in energy savings from our allocation method calls for further study, it is fair to expect that the gains should increase with the number of users present within the same area,” said Osvaldo Simeone, Professor of Information Engineering at King’s College London, previously at the New Jersey Institute of Technology. “This is because a larger number of users would increase the opportunities for collaborative uploading and downloading.”
With the study completed only a few months ago, the researchers are looking to refine their proposed model. Simultaneously, they are working to uncover the best methods for testing the implementation of the system.
Once the work is complete, companies creating augmented reality applications, as well as those looking for ways to improve their energy efficiency, could benefit from these performance gains.
According to TechCrunch, by 2021 mobile augmented reality could lead the charge in the $108 billion market for virtual and augmented reality. Since one of the biggest obstacles for mobile augmented reality has been battery life, this energy efficient approach could help bring that prediction to life.