
Ultra-efficient machine learning transistor cuts AI energy use by 99%

Oct 18, 2023 - newatlas.com
Researchers at Northwestern University have developed a microtransistor that is 100 times more energy-efficient than current technology, potentially enabling machine learning to run directly on mobile and wearable devices. The device, built from molybdenum disulfide and carbon nanotubes, can be quickly reconfigured to perform multiple steps in the data processing chain, unlike traditional silicon transistors. In tests, just two of the microtransistors analyzed and classified different types of heartbeats with 95% accuracy, using 1% of the energy that current machine learning approaches would require.

The new technology could allow small, battery-powered devices to run machine learning models on their own sensor data, providing quicker results and keeping personal data local, private, and secure. However, it is unclear when the technology will be ready for production, or whether it could scale to larger machine learning and AI systems. The research was published in the journal Nature Electronics.

Key takeaways:

  • Researchers at Northwestern University have developed a new microtransistor that is 100 times more efficient than current technology, potentially enabling machine learning to be performed on mobile and wearable devices.
  • The new device is built from two-dimensional sheets of molybdenum disulfide and one-dimensional carbon nanotubes, allowing for quick tuning and reconfiguration.
  • In tests, the microtransistors were able to correctly classify abnormal heartbeats with 95% accuracy using just two of these devices and 1% of the energy that current machine learning approaches would require.
  • This technology could allow for faster, more efficient data processing on portable devices, keeping personal data local and secure, and significantly reducing energy consumption and associated emissions in the field of AI.
