While the industry loves to lump “R&D” together, and we see it in every tech company’s P&L, research and development are very different. Research consists of high-risk, market-making investments and discoveries that are unattached to products. Development applies that research, along with others’ IP, to create end products or services, and it is far less risky. Very few companies do true research, and Intel has had a heritage in research for decades.
One of the most exciting aspects of working as a tech analyst is, quite frankly, being one of the first to learn of these new, research-driven, cutting-edge technologies coming down the pipeline in the not-so-distant future—from the expected to the truly mind-boggling. As such, I always look forward to Intel Labs disclosures when the company shines a light on its research initiatives to advance computing. This year’s Labs Day event, held digitally, of course, featured the theme “In Pursuit of 1000X: Disruptive Research for the Next Decade in Computing”—music to a tech geek’s ears.
The illuminating keynote for the event featured Intel Labs’ managing director, Rich Uhlig; the director of Intel’s Neuromorphic Computing Lab, Mike Davies; and the senior principal engineer in Intel’s photonics lab, James Jaussi. The event covered various topics, including neuromorphic computing, quantum computing, integrated photonics, confidential computing and machine programming. Of these, I found the conversations around neuromorphic computing and silicon photonics the most compelling. Today I wanted to share what I learned at the event and define the two concepts for the unacquainted (let’s face it—probably most people).
One thing that gets overlooked is that, as far as we’ve come with technology, we still haven’t even begun to touch the brain’s capabilities in some areas. During the keynote, Davies compared an autonomous drone with a cockatiel (a small parrot, native to Australia and famous worldwide as a pet) in terms of energy efficiency. The latest racing drones consume 18 watts of power, which means they must recharge every 10-20 minutes. Additionally, according to Davies, these drones can barely autonomously navigate through a pre-programmed sequence of gates at a walking pace. On the other hand, a cockatiel can fly 22 miles per hour, forage for food, communicate with other cockatiels, and even learn to speak a few English words—all with a 2-gram birdbrain running off of an infinitesimal 50 milliwatts of power. We need to be able to get on that level.
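To put those two power figures side by side, here is the back-of-the-envelope math. The 18-watt and 50-milliwatt numbers are from Davies’ keynote; the ratio is my own arithmetic, not a figure Intel quoted.

```python
# Comparing the power figures from Davies' keynote.
# The drone and cockatiel wattages are his; the ratio is simple division.
drone_power_w = 18.0       # latest racing drone, per the keynote
cockatiel_power_w = 0.050  # 50 milliwatts for a 2-gram birdbrain

ratio = drone_power_w / cockatiel_power_w
print(f"The cockatiel's brain runs on 1/{ratio:.0f} of the drone's power")
# -> The cockatiel's brain runs on 1/360 of the drone's power
```

A roughly 360-fold efficiency gap, before even counting the bird’s far richer behavioral repertoire, is the kind of distance neuromorphic computing is trying to close.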
Neuromorphic computing, in a nutshell, refers to the futuristic concept of building computer systems capable of mimicking neurological brain structure. A neuromorphic chip seeks to approximate the way brains learn in real time from external stimuli. Today’s AI is limited in that it hinges on deterministic, literal interpretations of events. The next generation of AI needs to be able to address unique situations on the fly. The world is a dynamic, uncertain place, and AI must handle any unusual situation that comes its way (especially when the decision could mean the difference between life and death, as in autonomous vehicles).
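For readers who want a concrete picture: neuromorphic chips like Loihi are built around spiking neurons rather than conventional logic. Intel’s actual neuron model and tooling are not public in this article, so the sketch below is a generic textbook leaky integrate-and-fire neuron, purely for illustration of the idea that computation happens as sparse spikes rather than continuous number-crunching.

```python
# Toy leaky integrate-and-fire (LIF) neuron -- a generic illustration of
# the spiking model neuromorphic chips are built around. This is NOT
# Intel's Loihi implementation or API, just the textbook concept.

def simulate_lif(inputs, leak=0.9, threshold=1.0):
    """Return the spike train produced by a stream of input currents."""
    v = 0.0       # membrane potential
    spikes = []
    for current in inputs:
        v = v * leak + current   # integrate input; potential leaks each step
        if v >= threshold:       # fire when the potential crosses threshold
            spikes.append(1)
            v = 0.0              # reset after a spike
        else:
            spikes.append(0)
    return spikes

# A weak steady input makes the neuron fire only occasionally,
# while a strong burst makes it fire immediately.
print(simulate_lif([0.3] * 10))   # -> [0, 0, 0, 1, 0, 0, 0, 1, 0, 0]
print(simulate_lif([1.2, 0.0, 0.0]))  # -> [1, 0, 0]
```

Because neurons sit idle (and consume almost nothing) until a spike arrives, this event-driven style is what gives neuromorphic hardware its dramatic energy-efficiency potential.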
This is a complex undertaking. It requires a rethinking of the fundamental traditional computing model across software, architecture, circuits and more. It’s a big job, but the potential applications for such technology make it worth the effort. Robotics, autonomous vehicles, optimization, and voice, gesture and object recognition are just a few areas that stand to be revolutionized by true neuromorphic computing capabilities.
Intel Labs began researching neuromorphic computing in 2015, followed by the announcement of its first neuromorphic research chip, Loihi, in 2017. In 2018, Intel launched the Intel Neuromorphic Research Community, or INRC, which it calls a “collaborative research effort” between academic, governmental and industrial teams around the world to tackle the most significant obstacles facing neuromorphic computing.
Intel Labs Day 2020 provided an INRC status update, including the announcement of new, high-profile members, Lenovo, Logitech and Mercedes-Benz, who were brought on board to explore potential business use cases for neuromorphic computing. We also learned that the INRC now includes over 100 members—impressive growth for a two-year-old organization.
Additionally, Intel showcased a crop of benchmark results for Loihi gleaned from INRC member tests. As with any up-and-coming technology, it is essential to show the industry that neuromorphic computing models can keep up with or, preferably, beat traditional computers. For example, Accenture found that Loihi could recognize voice commands at a similar accuracy level as a standard GPU. And get this—it was a thousand times more energy-efficient than the GPU and responded 200 milliseconds faster. Additionally, the INRC says it is making progress leveraging Loihi’s ability to self-learn individualized human gestures—a communication medium that has proven difficult for conventional AI to comprehend. According to Intel, Loihi can learn new gestures using a neuromorphic camera after only being exposed to them a couple of times. That’s not too far off from our own learning abilities.
Image retrieval is another area where Loihi shows some impressive results—retail industry researchers have found that Loihi can deliver image feature vectors three times more efficiently than a typical CPU in terms of energy consumption. Additionally, Intel and researchers have shown that Loihi can solve optimization and search issues one hundred times faster than CPUs and over 1,000 times more efficiently. In the realm of robotics, Rutgers researchers found that their Loihi-powered robotic navigation solutions and micro-drone control applications demand 75 times less power than conventional GPUs, without sacrificing any performance.
In short, there’s a lot to be excited about in Intel’s neuromorphic computing efforts. Intel has a crucial advantage over other potential competitors in that it already has research muscle devoted to the field in Intel Labs. Another big differentiator, in my view, is the INRC community around Loihi. Intel understands that no one company will be able to single-handedly deliver on the promises of neuromorphic computing and is wise to marshal a broad, diverse range of voices to help drive the technology forward and build out the ecosystem.
Another issue Intel Labs is seeking to address is a looming I/O power wall and bandwidth gap. Today’s electrical I/O is approaching its limits as the bandwidth demand for computing continues to skyrocket—a situation Intel says will likely limit performance scaling in the near future. The answer to this, as Intel Labs sees it, is silicon photonics—the integration of photonics with high-value, low-cost silicon (in short, transmitting data by light). Intel says this will enable much faster data transfer, and over longer distances to boot.
Intel Labs has been hard at work on what it considers to be the building blocks for silicon photonics, and it shared several developments at Intel Labs Day 2020. One of the significant steps forward is Intel’s successful miniaturization of a silicon modulator, which traditionally has been too big and expensive to put on a compute package. These “micro-ring” modulators are more than 1,000 times smaller than standard modulators. Another finding of Intel Labs is that, contrary to industry assumptions, silicon has light detection capability in the 1.3-1.6 µm wavelength range. This breakthrough stands to lower the cost of building out photonics infrastructure.
Intel Labs also showcased its integrated semiconductor optical amplifier, an essential component of silicon photonics, and its integrated multi-wavelength lasers—both built from the same materials. Multi-wavelength lasers rely on a technique called wavelength division multiplexing, in which multiple data channels, each carried on its own wavelength, share the same optical waveguide. This approach boosts bandwidth density, since one can transmit more data over the same beam of light. It’s also important to note that without these lasers, one cannot build an optical amplifier. Notably, Intel is the only company to have demonstrated both of these technologies.
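The bandwidth-density payoff of wavelength division multiplexing comes down to simple multiplication: N wavelength channels on one waveguide carry N times the data. The channel count and per-channel rate below are hypothetical round numbers for illustration, not figures Intel disclosed at the event.

```python
# Illustrative WDM arithmetic. The channel count and per-channel data
# rate here are hypothetical examples, not Intel-disclosed specs.
channels = 8                 # wavelengths multiplexed onto one waveguide
rate_per_channel_gbps = 32   # data rate carried by each wavelength

aggregate_gbps = channels * rate_per_channel_gbps
print(f"{channels} wavelengths x {rate_per_channel_gbps} Gbps each = "
      f"{aggregate_gbps} Gbps on a single waveguide")
# -> 8 wavelengths x 32 Gbps each = 256 Gbps on a single waveguide
```

Adding wavelengths multiplies capacity without adding fibers or pins, which is exactly why multi-wavelength lasers matter for the I/O power wall Intel described.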
Lastly, Intel Labs discussed how it could achieve lower power, higher bandwidth and a reduced pin count by combining photonics and CMOS silicon with advanced packaging techniques. Notably, Intel is the only company in the industry that can claim multi-wavelength lasers, semiconductor optical amplifiers, all-silicon photodetectors, and micro-ring modulators on a single technology platform tightly integrated with CMOS silicon. This, according to Intel, paves the way for the scaling of integrated photonics.
Among analysts, media and investors alike, the tech conversation generally focuses on addressing current and near-term challenges and situations. This is not unreasonable—these are the most pressing issues. That said, Intel Labs demonstrates that, with a dedicated science and research arm, you can stay on top of the present and still keep a focused eye on the long game. While the rest of the company focuses on driving profit in the here and now, Intel Labs can devote its 700-strong team of scientists and researchers to postulating, experimenting and developing the solutions to tomorrow’s technological problems. The division is a strategic, long-term asset that should do much to ensure Intel’s place at the forefront of wherever technology goes in the next decade. While more short-sighted companies may be scratching their heads in a few years over some new, unforeseen technological hurdle, Intel Labs will likely be standing at the ready saying, “Don’t worry, we’ve got this.”
Note: Moor Insights & Strategy writers and editors may have contributed to this article.
Moor Insights & Strategy, like all research and analyst firms, provides or has provided paid research, analysis, advising, or consulting to many high-tech companies in the industry, including 8×8, Advanced Micro Devices, Amazon, Applied Micro, ARM, Aruba Networks, AT&T, AWS, A-10 Strategies, Bitfusion, Blaize, Box, Broadcom, Calix, Cisco Systems, Clear Software, Cloudera, Clumio, Cognitive Systems, CompuCom, Dell, Dell EMC, Dell Technologies, Diablo Technologies, Digital Optics, Dreamchain, Echelon, Ericsson, Extreme Networks, Flex, Foxconn, Frame (now VMware), Fujitsu, Gen Z Consortium, Glue Networks, GlobalFoundries, Google (Nest-Revolve), Google Cloud, HP Inc., Hewlett Packard Enterprise, Honeywell, Huawei Technologies, IBM, Ion VR, Inseego, Infosys, Intel, Interdigital, Jabil Circuit, Konica Minolta, Lattice Semiconductor, Lenovo, Linux Foundation, MapBox, Marvell, Mavenir, Marseille Inc, Mayfair Equity, Meraki (Cisco), Mesophere, Microsoft, Mojo Networks, National Instruments, NetApp, Nightwatch, NOKIA (Alcatel-Lucent), Nortek, Novumind, NVIDIA, Nuvia, ON Semiconductor, ONUG, OpenStack Foundation, Oracle, Poly, Panasas, Peraso, Pexip, Pixelworks, Plume Design, Poly, Portworx, Pure Storage, Qualcomm, Rackspace, Rambus, Rayvolt E-Bikes, Red Hat, Residio, Samsung Electronics, SAP, SAS, Scale Computing, Schneider Electric, Silver Peak, SONY, Springpath, Spirent, Splunk, Sprint, Stratus Technologies, Symantec, Synaptics, Syniverse, Synopsys, Tanium, TE Connectivity, TensTorrent, Tobii Technology, T-Mobile, Twitter, Unity Technologies, UiPath, Verizon Communications, Vidyo, VMware, Wave Computing, Wellsmith, Xilinx, Zebra, Zededa, and Zoho which may be cited in blogs and research.