HiPEAC18 keynote speaker Dileep Bhandarkar has been working in the semiconductor industry for almost 50 years, during a period of extraordinary growth in computing technologies. Now Vice President, Technology at Qualcomm Datacenter Technologies, he draws on this wealth of experience to maintain a leading position in technology development for next-generation server platforms. We caught up with Dileep prior to the HiPEAC conference in Manchester to find out how he got into electrical engineering, why technology development tends to be evolutionary rather than revolutionary, and what the future holds for semiconductor development.

1. What first got you interested in electrical engineering, and what's kept your interest over the years?

I grew up in India and my father used to build audio amplifiers at home for playing music. That got me interested in electrical engineering. The field of electronics was quite nascent in India in the mid-1960s. I joined a BTech program at the Indian Institute of Technology Bombay in 1965. Most of my friends had selected chemical or mechanical engineering courses, which were more popular. If it had not been for my father’s influence, I would not have selected electrical engineering. It turned out to be a great decision! Fortunately, by the time I got to my final two years, the school had added a specialization in electronics. I left India upon graduating in 1970 to start a PhD programme at Carnegie Mellon University. The rest is history.

2. What are you most proud of in your career to date?

I am proud to have had a long and productive career and to still be doing leading work in my field more than 47 years after getting my first degree. In 1997, I was elected an IEEE Fellow for contributions and technical leadership in the design of complex and reduced instruction set architectures and in computer system performance analysis. In 1998, I was recognized as a distinguished alumnus of the Indian Institute of Technology Bombay. I continue to work on some exciting projects at Qualcomm.

3. What are the most surprising developments you've seen in semiconductor technology over your career? How does revolutionary technology tend to fare against evolutionary in your experience, and why?

I started my career in 1973 at Texas Instruments, working on magnetic bubble memories. We thought that this new technology would surpass and replace hard disk drives. I quickly learned that the evolution of an incumbent technology is difficult to beat. The semiconductor node back then was 5 or 6 microns, and we were very worried about what would happen to lithography methods when we got to submicron geometries. As it has turned out, semiconductor technology has evolved using optical lithography all the way down to 10 nm today. Moore’s Law has endured over the last 40+ years through incremental improvements in materials, device structures, and manufacturing processes.

The investment and knowledge embodied in evolutionary technologies make it much easier to make incremental changes. Revolutionary technologies in all fields have a high bar to overcome, especially in reaching high volume at low cost and high reliability.

4. How do you see microprocessor development evolving over the next ten years?

Microprocessor development has evolved greatly over the last two decades. Single-thread performance growth has slowed down, and the focus has shifted to multicore throughput performance. Energy efficiency has lagged, and the future requires a new approach to central processing unit (CPU) design that emphasizes energy efficiency. At Qualcomm, we are taking advantage of early access to fab process technology driven by mobile phone volumes. Qualcomm Datacenter Technologies is uniquely positioned to leverage mobile growth and drive datacentre energy efficiency leadership.

The industry has transitioned from the single core era to the multicore era, and is now transitioning to the era of application-specific accelerators.

5. Cloud computing has already revolutionized the way that most of us live and work. What do you think the computer architectures of future data centres will look like?

For decades, we have been able to take advantage of Moore’s Law to improve single-thread performance and reduce power and cost with each generation of semiconductor technology. While technology has continued to advance since the end of Dennard scaling more than 10 years ago, the advances have slowed down. Server performance increases have relied on increasing core counts and power budgets.

At the same time, workloads have changed in the era of cloud computing. Scaling out is becoming more important than scaling up. Domain-specific architectures have started to emerge to improve the energy efficiency of emerging workloads like deep learning.

My talk at the HiPEAC conference will provide a historical perspective and discuss emerging trends driving the development of modern processors for the data centre.

Find out more about Dileep's career history by watching this video.

Follow the HiPEAC conference on social media: #HiPEAC18