A carbon-nanotube-based tensor processing unit
Nature Electronics (IF 33.7) Pub Date: 2024-07-22, DOI: 10.1038/s41928-024-01211-2
Jia Si, Panpan Zhang, Chenyi Zhao, Dongyi Lin, Lin Xu, Haitao Xu, Lijun Liu, Jianhua Jiang, Lian-Mao Peng, Zhiyong Zhang

The growth of data-intensive computing tasks requires processing units with higher performance and energy efficiency, but these requirements are increasingly difficult to achieve with conventional semiconductor technology. One potential solution is to combine developments in devices with innovations in system architecture. Here we report a tensor processing unit (TPU) that is based on 3,000 carbon nanotube field-effect transistors and can perform energy-efficient convolution operations and matrix multiplication. The TPU is constructed with a systolic array architecture that allows parallel 2-bit integer multiply–accumulate operations. A five-layer convolutional neural network based on the TPU can perform MNIST image recognition with an accuracy of up to 88% for a power consumption of 295 µW. We use an optimized nanotube fabrication process that offers a semiconductor purity of 99.9999% and ultraclean surfaces, leading to transistors with high on-current densities and uniformity. Using system-level simulations, we estimate that an 8-bit TPU made with nanotube transistors at a 180 nm technology node could reach a main frequency of 850 MHz and an energy efficiency of 1 tera-operations per second per watt.
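The systolic-array dataflow mentioned in the abstract can be sketched in software. The following is a minimal conceptual simulation, assuming an output-stationary array with unsigned 2-bit operands; the actual dataflow, signedness, and accumulator width of the reported chip are not specified in the abstract, so these details are illustrative assumptions, and the function name is hypothetical.

```python
import numpy as np

def systolic_matmul_2bit(A, B):
    """Simulate matrix multiplication on an output-stationary systolic
    array with 2-bit unsigned integer operands (values 0..3).

    This is a conceptual sketch of the parallel multiply-accumulate
    (MAC) dataflow, not the paper's implementation: each 'cycle'
    below corresponds to one wavefront step in which every processing
    element (PE) would fire one MAC in parallel in hardware.
    """
    A = np.asarray(A, dtype=np.int64)
    B = np.asarray(B, dtype=np.int64)
    assert A.min() >= 0 and A.max() <= 3, "operands must fit in 2 bits"
    assert B.min() >= 0 and B.max() <= 3, "operands must fit in 2 bits"
    n, k = A.shape
    k2, m = B.shape
    assert k == k2, "inner dimensions must match"

    # Accumulators stay fixed in each PE (output-stationary);
    # operands stream past one shared-dimension index per cycle.
    acc = np.zeros((n, m), dtype=np.int64)
    for t in range(k):
        # In hardware, all n*m PEs perform acc[i][j] += a*b this cycle.
        acc += np.outer(A[:, t], B[t, :])
    return acc
```

For example, `systolic_matmul_2bit` agrees with an ordinary matrix product while making the cycle-by-cycle accumulation explicit; the number of wavefront steps grows with the shared dimension, while the per-step work is spread across all PEs.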




Updated: 2024-07-22