TOKYO -- For all but tech geeks and gamers, the name Nvidia may not ring a bell. But the American chipmaker is no "new kid in town"; it is already a celebrity in the tech community.

Though still relatively young, Nvidia has already joined hands with some of the world's top automakers, including Japan's Toyota Motor and Germany's Daimler and Audi. It has also partnered with leading electric-car maker Tesla. Last month, news broke that Japanese technology conglomerate SoftBank Group had acquired around $4 billion worth of Nvidia shares, sending the share price sharply higher on May 24.

Nvidia CEO Jensen Huang

In a keynote speech at a recent gathering of the company's software developers, CEO Jensen Huang explained the three things that made the "big bang of modern AI [artificial intelligence]" possible: a big evolution in deep learning, enormous amounts of data, and the use of GPUs, or graphics processing units. Huang sounded confident that his company sits at the forefront of the market for AI-oriented semiconductors.

GPU maker

As a chipmaker, Nvidia is not big. In its most recent quarter, it booked $1.93 billion in sales -- less than 15% of the $14.8 billion posted by industry leader Intel, and similar to the sales of Sony's chip unit, which now focuses almost entirely on image sensors.

Of that $1.93 billion, $1.56 billion, or roughly 80%, came from GPUs -- chips that render 3-D computer graphics (Pixar Animation Studios is a customer). GPU sales to data center operators were especially strong, nearly tripling from the year-earlier quarter. Sales to the gaming sector, which includes GPUs for personal computers, were also brisk, helped by Japan's Nintendo putting Nvidia chips inside its Switch game console.

Nvidia introduced its first product, the NV1, in 1995. In 1997, the Riva 128 became the company's first hit, giving Nvidia a strong foothold as a GPU maker. Many vendors existed at that time, but after a number of mergers and reorganizations, only two survive: Nvidia and AMD.

Yet Nvidia faces a tough future. In the past, virtually all computers carried a dedicated graphics chip. Now, Intel's CPUs, or central processing units, can handle graphics on their own -- not well enough for demanding 3-D games, but well enough for everyday use. As a result, most of today's computers do without a separate graphics chip. And the PC market itself is shrinking.

Realizing its situation, Nvidia set its sights on two markets: mobile and high-performance computing, or HPC.

Short-lived mobile success

For mobile devices, Nvidia developed the Tegra, which combines its own graphics technology with a CPU core from U.K. chip designer Arm Holdings. The Tegra first made its way into a device in 2011 and was subsequently adopted by numerous makers, including Google, which used it in the Nexus 7 tablet.

But the good times ended when rival Qualcomm of the U.S. introduced Snapdragon, a processor that comes with a communications chip, which Nvidia lacked. Nvidia struggled as its chip failed to satisfy market needs. Snapdragon became an industry standard.

Most smartphones have Snapdragons inside. Those that do not, like the iPhone, use proprietary chips such as Apple's A series; Samsung Electronics has its Exynos line, and Huawei has Kirin. One of the few independent exceptions is MediaTek, a Taiwanese chipmaker that supplies chips to emerging smartphone makers.

Chips for AI

As for high-performance computing, the sector began drawing attention in 2008, when the Tokyo Institute of Technology used Nvidia chips for its Tsubame supercomputer.

Nvidia next developed the GPGPU, or general-purpose GPU, which can process information other than graphics. The key was its Cuda software platform, which lets developers harness the chip's many parallel cores -- originally built to render 3-D graphics -- to crunch large amounts of data at once.
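Cuda itself exposes a C-like programming language for the GPU's cores. As a rough illustration of the data-parallel pattern it enables, here is a Python sketch in which NumPy's whole-array operations stand in for the GPU (an assumption for illustration only -- this runs on the CPU):

```python
import numpy as np

# A bulk numerical job: scale and shift every element of a large vector.
data = np.arange(100_000, dtype=np.float64)

# Sequential version: one element at a time, as a plain CPU loop would do it.
out_loop = np.empty_like(data)
for i in range(data.size):
    out_loop[i] = data[i] * 2.0 + 1.0

# Data-parallel version: one operation expressed over the whole array --
# the pattern a GPGPU spreads across thousands of cores simultaneously.
out_vec = data * 2.0 + 1.0

# Both formulations compute the same result.
assert np.allclose(out_loop, out_vec)
```

The second formulation is what makes GPUs attractive beyond graphics: the same independent operation applied to every element of a large data set.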

Initially, there was no market for the GPGPU. Then artificial intelligence came along.

Deep learning, the self-teaching process that enables artificial intelligence, involves processing large amounts of information all at once -- exactly what GPGPUs were designed for. Many deep learning development tools -- such as Google's TensorFlow and Preferred Networks' Chainer -- have since been designed with GPGPUs in mind.

With Nvidia's GPGPU, the graphics board in an ordinary computer can run these AI development tools, allowing developers to work on their own machines in the early stages of a project.
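The workloads these tools hand to the chip are dominated by large matrix operations. A toy sketch of a single neural-network layer shows why (plain NumPy on the CPU here, standing in for the GPU-backed libraries named above -- the layer sizes are arbitrary assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

# A toy one-layer network: deep learning reduces largely to matrix
# multiplications like this, which map naturally onto a GPU's parallel cores.
batch = rng.standard_normal((64, 128))    # 64 input samples, 128 features each
weights = rng.standard_normal((128, 10))  # layer mapping 128 features to 10 outputs

logits = batch @ weights                  # one big, highly parallel matrix multiply
activations = np.maximum(logits, 0.0)     # ReLU nonlinearity, applied element-wise

print(activations.shape)  # prints (64, 10)
```

A real network stacks many such layers and repeats the computation millions of times during training, which is why the same silicon that renders game graphics took over AI development.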

This versatility has made Nvidia No. 1 in the AI chip market.

Intel fighting back with FPGA

Meanwhile, a newer mobile chip, the Tegra X1, has found its way into a hot-selling product -- the Nintendo Switch.

And the old Tegra has found a second life aboard Drive PX, an AI-based platform for vehicles. Nvidia offers several products that enable cruise control, autonomous driving and other acts of magic.

The company offers hardware for deep learning, software development tools and a host of other technology.

Nvidia's advantage in AI is not guaranteed. Makers of FPGAs, or field programmable gate arrays -- chips whose circuit structure can be freely altered -- are looking to take on GPU producers.

One claimed advantage of FPGAs is that, unlike GPUs, whose circuitry is fixed, they can be reconfigured to handle an AI workload as a whole -- and, for some tasks, at higher speed.

Xilinx, a major American FPGA maker, says that when partner companies used its FPGAs for genome sequencing and for the kind of deep learning needed for speech recognition, the chips proved faster and consumed less power than GPUs.

Intel acquired U.S. firm Altera in late 2015 for its FPGA products. Intel dominates general-purpose processors but lags Nvidia and AMD in GPUs. For AI-oriented chips, it is fighting back with the Xeon Phi, a processor designed for high-performance computing, and has teamed up with Google to optimize TensorFlow for it.

But the Xeon Phi lineup is not as diverse as Nvidia's GPU range -- even the low-end processors are priced relatively high, at around $2,200 -- and it includes no AI chips for vehicles. By acquiring Altera, Intel hopes to fill that void.