3D stacked finFETs

At the upcoming 2018 IEEE International Electron Devices Meeting (IEDM), Imec is expected to present a paper on a 3D stacked finFET architecture. IEDM is slated for Dec. 1-5 in San Francisco.

Imec’s technology is based on what the R&D organization calls sequential integration. Another R&D organization, Leti, calls it 3D monolithic integration.

Regardless, the idea is that you form one layer of transistors on a substrate. Then, you form another layer of transistors on top of the first layer. The two layers, which are stacked, are connected using tiny interconnects.

At a recent event, Imec and Soitec demonstrated a sequential 3D front-end integration process. At IEDM, Imec will report on a 3D stacked finFET architecture with a 45nm fin pitch and a 110nm gate pitch.

The 3D architecture uses Imec’s sequential integration process, which enables a precise alignment between the top and bottom layers. The junction-less devices on the top layer were fabricated using low-temperature (≤525°C) processes, according to Imec.

Imec evaluated various gate stacks for the technology. Ultimately, it chose a combination of TiN/TiAl/TiN/HfO2 with a LaSiOx dipole inserted into the stack. “The combination demonstrated good threshold voltage tuning, reliability and low-temperature performance,” according to an abstract from the paper.

Neural networks

Deep neural networks (DNNs) will also be a hot topic at IEDM. DNNs are used in machine learning for tasks such as image and speech recognition.

At the event, IBM will describe a synaptic cell based on a new electrochemical random access memory (ECRAM) device. ECRAM is based on a lithium ion intercalation formula in tungsten oxide.

ECRAMs have demonstrated high switching symmetry and linearity, good data retention and up to 1,000 discrete conductance levels for multi-level operation in large memory arrays, according to the IEDM abstract.

IBM has demonstrated high-speed programming using 5ns pulse widths with a switching energy of 1fJ. MNIST image-recognition simulations based on experimental data showed 96% accuracy.
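Why symmetry and linearity matter can be illustrated with a simple model: a synaptic weight stored as one of 1,000 discrete conductance levels, raised or lowered in equal steps per programming pulse. The level count comes from the IEDM abstract; the normalized conductance range and the ideal linear update model below are illustrative assumptions, not IBM’s device physics.

```python
# Illustrative model of an analog synaptic weight stored as one of
# 1,000 discrete conductance levels (level count per the IEDM abstract;
# the ideal linear, symmetric update model is an assumption).
N_LEVELS = 1000
G_MIN, G_MAX = 0.0, 1.0  # normalized conductance range (assumed)
STEP = (G_MAX - G_MIN) / (N_LEVELS - 1)

def potentiate(level, pulses=1):
    """Raise the conductance level by one step per pulse (ideal linearity)."""
    return min(level + pulses, N_LEVELS - 1)

def depress(level, pulses=1):
    """Lower the conductance level by the same step size (ideal symmetry)."""
    return max(level - pulses, 0)

def conductance(level):
    """Map a discrete level to its normalized conductance value."""
    return G_MIN + level * STEP

level = 500
level = potentiate(level, 10)  # 10 up-pulses
level = depress(level, 10)     # 10 down-pulses
assert level == 500            # symmetric updates cancel exactly
```

With a real device, asymmetry or nonlinearity would leave a residual error after such an up/down sequence, which is why high switching symmetry is a headline result for training workloads.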

In a separate paper, IBM researchers will describe a technology called “projected” phase-change memory (Proj-PCM). The technology is said to achieve high (8-bit) precision for the scalar multiplication mathematics needed for AI-related computations.
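What 8-bit precision means for scalar multiplication can be sketched in software: each operand is mapped to a signed 8-bit integer, the integers are multiplied, and the product is rescaled. The quantization scheme below (symmetric scaling with clamping) is an illustrative assumption for showing the precision budget, not the Proj-PCM paper’s method.

```python
# Sketch of 8-bit scalar multiplication, the precision level the
# Proj-PCM paper targets. The symmetric quantization scheme here is
# an illustrative assumption, not the paper's technique.
def quantize8(x, scale):
    """Map a real value to a signed 8-bit integer, clamping to [-128, 127]."""
    q = round(x / scale)
    return max(-128, min(127, q))

def scalar_multiply_8bit(a, b, scale_a, scale_b):
    """Multiply two values at 8-bit precision, then rescale the result."""
    qa = quantize8(a, scale_a)
    qb = quantize8(b, scale_b)
    return (qa * qb) * (scale_a * scale_b)

exact = 0.8 * -0.5
approx = scalar_multiply_8bit(0.8, -0.5, 1 / 127, 1 / 127)
assert abs(exact - approx) < 0.01  # error bounded by the 8-bit grid
```

The residual error is set by the quantization step, which is the trade-off analog in-memory multipliers must manage to be useful for AI-related computations.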

Separately, at the event, Arizona State University, Notre Dame and Georgia Institute of Technology will describe an in-memory computing architecture.

The technology is based on an analog synaptic cell using ferroelectric FET (FeFET) technology. The group proposes an analog synaptic weight cell using two MOSFETs and one FeFET transistor (2T-1FeFET) to handle both the training and inferencing functions.

The researchers validated the cell’s performance using the MNIST and CIFAR-10 training datasets, achieving accuracies of ~97.3% and ~87%, respectively, according to the abstract from IEDM.

More papers

At IEDM, Samsung will describe more details about its 3nm gate-all-around (GAA) technology. Samsung calls this a Multi-Bridge-Channel architecture.

AIST, meanwhile, will present a paper on a superjunction silicon carbide (SiC) device. AIST has achieved a low on-resistance with a blocking voltage ≥600 V. “It is an 1,170 V superjunction SiC device that achieved an Ron of 0.63 mΩcm2,” according to the IEDM abstract.