Researchers from MIT and the MIT-IBM Watson AI Lab have developed a machine-learning accelerator that resists cyber attacks while protecting sensitive user data. Health and fitness apps that rely on large machine-learning models can provide useful insights, but they are often sluggish and energy-hungry because data must shuttle between a smartphone and a central memory server.
Engineers typically speed up this process with hardware that reduces the need to move data back and forth. Although this simplifies the computation, it also leaves the device vulnerable to cyber attacks that can put the privacy of the user's data at risk.
To combat this issue, the researchers developed a set of optimizations that provide strong security while only slightly slowing the device. The technique does not reduce the accuracy of computations and is particularly beneficial for demanding AI applications such as autonomous driving and augmented and virtual reality.
However, implementing the chip would make a device somewhat more expensive and less energy-efficient, a trade-off that lead author Maitreyi Ashok considers worthwhile for ensuring security. Ashok stresses the importance of keeping security at the forefront of the design process, which avoids costly additions at a later stage.
To guard against hacking threats, the team took a three-fold approach: they split the data in the in-memory compute (IMC) chip into random pieces, so an attacker probing any single piece gains no useful information; they used a lightweight cipher to encrypt the model stored in off-chip memory, decrypting it only when necessary; and they generated the cipher's decryption key directly on the chip itself, rather than storing it, to further improve security.
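The first of these countermeasures is a form of masking: a secret value is replaced by several random shares that only reveal the original when all of them are combined. The article does not specify the chip's actual splitting scheme, so the sketch below is a generic additive-sharing illustration; the function names, the three-share count, and the 16-bit word size are all assumptions made for the example, not details of the MIT design.

```python
import secrets

MOD = 2 ** 16  # toy word size chosen for the illustration


def split_into_shares(value, n_shares=3, mod=MOD):
    """Split a value into n random additive shares.

    Any subset of fewer than n shares is uniformly random,
    so an attacker who observes a single share learns nothing
    about the original value.
    """
    shares = [secrets.randbelow(mod) for _ in range(n_shares - 1)]
    # The final share is whatever makes the total sum back to the value.
    shares.append((value - sum(shares)) % mod)
    return shares


def recombine(shares, mod=MOD):
    """Recover the original value by summing all shares."""
    return sum(shares) % mod


weight = 12345  # e.g. one model weight held inside the IMC array
shares = split_into_shares(weight)
assert recombine(shares) == weight
```

Each individual share is indistinguishable from random noise; only the full set reconstructs the secret, which is the property that frustrates probing attacks on any one location.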
Despite these groundbreaking advances, the added security reduces the accelerator's energy efficiency, and the larger chip area it requires could increase fabrication costs. The team therefore intends to reduce the chip's energy consumption and size so it can be deployed on a broader scale. The research was funded in part by the National Science Foundation, the MIT-IBM Watson AI Lab, and a MathWorks Engineering Fellowship.