A revolution in AI computing
For those who missed it: today's machine learning and artificial intelligence systems are based on the von Neumann architecture, in which the logic that processes information is separate from the memory where data is stored. This separation means large amounts of power and energy are consumed moving data back and forth between the two.
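The data-movement cost described above can be sketched with a deliberately simplified toy model (not the actual CRAM design): in a von Neumann-style machine every operand travels from memory to a separate processor, while an in-memory approach performs the operation where the data already lives.

```python
# Toy model, for illustration only: count "transfers" over the memory-to-CPU
# bus when summing an array, contrasting a von Neumann-style flow with an
# idealized in-memory computation. The transfer counts are hypothetical
# bookkeeping, not measurements of any real hardware.

def von_neumann_sum(memory):
    """Every element is moved from memory to the processor before use."""
    transfers = 0
    total = 0
    for value in memory:
        transfers += 1   # read: memory -> processor
        total += value
    transfers += 1       # write the result back: processor -> memory
    return total, transfers

def in_memory_sum(memory):
    """Computation happens inside the memory array: no bus traffic."""
    return sum(memory), 0  # zero transfers in this idealized model

data = list(range(1000))
vn_total, vn_transfers = von_neumann_sum(data)
im_total, im_transfers = in_memory_sum(data)
assert vn_total == im_total  # same answer, very different data movement
print(f"von Neumann transfers: {vn_transfers}, in-memory transfers: {im_transfers}")
```

In this toy model the result is identical either way; what changes is how many times data crosses the memory-processor boundary, which is where the energy savings claimed for in-memory computing come from.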
Additionally, the scientists say the project builds on more than 20 years of research and pioneering work by engineering professor Jian-Ping Wang on MTJ (magnetic tunnel junction) nanodevices.
Up to 2,500x energy savings
Ulya Karpuzcu, a co-author of the paper published in the journal Nature, said: "As an extremely energy-efficient, digital-based in-memory computing substrate, CRAM is very flexible in that computation can be performed at any location in the memory array. Accordingly, we can reconfigure CRAM to best match the performance needs of a diverse set of AI algorithms." Karpuzcu also noted that CRAM is more energy-efficient than the traditional building blocks of today's AI systems.
The team behind the work, which includes researchers from the University of Minnesota, is now working with leaders in the semiconductor industry to finalize the project and produce hardware that advances AI functionality. That said, this will not be a quick process: the researchers still need to overcome challenges in scalability, manufacturing, and integration with existing silicon.
Source link: https://www.donanimhaber.com/yapay-zekada-enerji-devrimi-yenilikci-cram-teknolojisi-geliyor–180022