Basic Research

Basic Research Plays an Outsized Role in Technology Breakthroughs

Standing on the shoulders of Bell Labs and Xerox PARC, NTT Research is laying the foundation for significant advances in quantum computing, cryptography and medicine.

As an institution focused solely on pure, basic research, NTT Research stands in rare company. While at one time it was not unusual for large companies to invest significantly in research, in recent years that model has largely gone by the wayside. NTT Research believes it’s time to revive the practice, as only deep research will bring to fruition dramatic, game-changing technological gains.

The most famous example of research driven by large companies is Bell Labs, which grew out of research operations at the Bell Telephone Company (which later became AT&T) and Western Electric in the late 1800s. Bell Telephone Laboratories was officially launched on Jan. 1, 1925.

Government research entities soon followed. In 1915, before the United States entered World War I, President Woodrow Wilson created the Naval Consulting Board to pursue research that would be helpful to the U.S. Navy. Companies such as General Electric and Western Electric focused on sonar and early radio technology, Ford built military boats and trucks, and DuPont researched synthetic fuels, fabrics and more.

Eventually, other government labs came on the scene, including the Air Force Research Laboratory (AFRL) and Naval Information Warfare Systems Command (NAVWAR). Today, 17 national laboratories are under the auspices of the Dept. of Energy while the Defense Advanced Research Projects Agency (DARPA) funds research at various third-party labs.

Industry Research Investment: Bell Labs and PARC

Industry investment in research likewise continues, but it’s not typically focused on “pure” or basic research.

Bell Labs and PARC are notable exceptions. Now known as Nokia Bell Labs, the Bell Labs Core Research unit remains, along with Bell Labs Solutions Research. They have diverse research programs, ranging from robotics and artificial intelligence to networking, all with the same goal: “devising the technologies that will have the most sustained impact on the service providers, enterprises and industries Nokia serves.”

PARC, which originally stood for Palo Alto Research Center, is probably the research institution most similar to Bell Labs: it was established by Xerox in 1970 to explore new information technologies aimed at creating the “office of the future.” In 2002 PARC became an independent subsidiary of Xerox and dropped the “Xerox” from its name to become just PARC. Its technologies strayed from Xerox’s core photocopier business to include programming languages (Smalltalk), personal computers (Alto), networking (Ethernet) and the beginnings of natural language processing, to name a few.

Why We Need Basic Research

It is upon the foundation of these institutions that NTT decided to build NTT Research as an entity focused on pure research.

NTT believes we need basic research because truly ground-breaking technologies aren’t developed overnight, or from some single “Aha!” moment – although such moments may contribute.

Rather, breakthrough technologies develop from years of research, which involves plenty of trial and error. Researchers continually build on what came before to eventually create something truly different – and useful.

To get a sense for how important this endeavor is, let’s look at a few developments that can trace their origins to basic research.

The Transistor

The transistor was created in 1947 by Bell Labs researchers John Bardeen and Walter Brattain. They made the first “point contact” transistor, a three-terminal device consisting of two point contacts between metal and a semiconductor. William Shockley later developed a junction transistor that was easier to fabricate.

In 1956, Bardeen, Brattain and Shockley won the Nobel Prize in Physics for their work on the transistor. But it wasn’t until decades after Bardeen and Brattain’s original invention that the full promise of the transistor was realized. It proved to be a foundational concept in electronics, leading the way to solid-state electronics, integrated circuits, memory chips and microprocessors.

Global Positioning System (GPS)

The global positioning system that we all rely on today for all manner of navigation got its start in 1957, when researchers at the Applied Physics Laboratory (APL) at Johns Hopkins University were developing a way to track the Soviet Sputnik satellite. They found that the frequency of radio signals transmitted by Sputnik increased as it approached and decreased as it moved away, a shift known in physics as the Doppler effect.
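The effect the APL researchers exploited can be sketched in a few lines. This is a toy illustration of the classical line-of-sight Doppler shift, not APL's actual tracking software; the 8 km/s orbital speed is a rough figure for a low-Earth-orbit satellite.

```python
# Classical line-of-sight Doppler shift: for a transmitter moving at
# speed v toward the receiver, the received frequency rises to
# f_rx = f_tx * c / (c - v); a negative v (receding) lowers it.

C = 299_792_458.0  # speed of light, m/s

def received_frequency(f_tx_hz: float, v_mps: float) -> float:
    """Received frequency; v_mps > 0 means the transmitter is approaching."""
    return f_tx_hz * C / (C - v_mps)

# Sputnik transmitted near 20.005 MHz; a low-Earth orbit is roughly 8 km/s.
f_tx = 20.005e6
approaching = received_frequency(f_tx, 8_000.0)    # higher than f_tx
receding = received_frequency(f_tx, -8_000.0)      # lower than f_tx
print(f"shift while approaching: {approaching - f_tx:+.1f} Hz")
print(f"shift while receding:   {receding - f_tx:+.1f} Hz")
```

Measuring how that shift changes over a pass is what let the researchers work out the satellite's orbit, and later invert the idea: a known orbit plus a measured shift locates the receiver.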

Inspired by this discovery, in 1958 APL researchers, with funding from the newly formed Advanced Research Projects Agency (ARPA, a federal government precursor to DARPA), developed Transit, the world’s first global satellite navigation system. In 1960 the first Transit navigation satellites were launched to provide navigation for military and commercial users, including Navy missile submarines. By 1968 a constellation of 36 satellites was orbiting the earth.

“The system’s surveying capabilities – generally accurate to tens of meters – contributed to improving the accuracy of maps of the Earth’s land areas by nearly two orders of magnitude,” according to DARPA.

Transit was in operation until 1996, when the Defense Department replaced it with the current Global Positioning System (GPS).

The Internet

The Internet is another technological achievement that evolved out of ARPA. In the 1960s, ARPA computer scientists built ARPANET, a communications network to connect the agency’s computers. It used a data transmission system called packet switching, developed by Lawrence Roberts, who based it on the prior work of other computer scientists. The technology involves breaking data up into fixed-size packets for transmission, then reassembling them into coherent messages on the receiving end.
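The core idea can be sketched in a few lines. This is a conceptual illustration of packetizing and reassembly, not the ARPANET protocol; the 8-byte packet size is an arbitrary choice for the demo.

```python
# Sketch of the packet-switching idea: a message is split into
# individually numbered packets that may arrive out of order (each
# potentially taking a different route), then reassembled by sequence
# number at the receiver.
import random

PACKET_SIZE = 8  # bytes of payload per packet (arbitrary for this demo)

def packetize(message: bytes) -> list[tuple[int, bytes]]:
    """Break a message into (sequence_number, payload) packets."""
    return [(seq, message[i:i + PACKET_SIZE])
            for seq, i in enumerate(range(0, len(message), PACKET_SIZE))]

def reassemble(packets: list[tuple[int, bytes]]) -> bytes:
    """Order packets by sequence number and concatenate their payloads."""
    return b"".join(payload for _, payload in sorted(packets))

message = b"Packets may take different routes across the network."
packets = packetize(message)
random.shuffle(packets)              # simulate out-of-order delivery
assert reassemble(packets) == message
```

What TCP/IP added on top of this basic scheme was the addressing and delivery machinery that let packets flow between independently run networks.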

In the 1970s scientists Robert Kahn and Vinton Cerf, often credited as the “fathers of the Internet,” developed the protocols that enabled packets to be addressed and sent far and wide, the Transmission Control Protocol (TCP) and the Internet Protocol (IP). It was TCP/IP that enabled packets to flow between individual networks – at the time mostly belonging to academic and government entities.

A further breakthrough came in 1989 when computer scientist Tim Berners-Lee invented the World Wide Web while working at CERN, the European Organization for Nuclear Research. CERN needed a way to communicate among the 17,000 scientists from over 100 countries that contributed to its work, most of them working at universities and laboratories in their home countries.

“The basic idea of the WWW was to merge the evolving technologies of computers, data networks and hypertext into a powerful and easy to use global information system,” according to CERN.

The Web snowballed relatively quickly beyond its academic roots, driven in no small part by the Mosaic browser, released in early 1993 by the National Center for Supercomputing Applications (NCSA) at the University of Illinois. That same year, CERN made Web source code available for free. By late 1993, there were over 500 known Web servers. Today, it’s impossible to count, but the number is likely in the hundreds of millions.

Ethernet

Around the same time the Internet was taking shape, so was a local-area networking (LAN) technology that is now ubiquitous: Ethernet.

Ethernet was created in 1973 by a team at Xerox PARC led by Bob Metcalfe. They were charged with developing a network to link PARC computers while overcoming some limitations of existing networking systems. Those LANs could only connect 16 systems, all of which had to be in the same room, and used a cable that was about 1.5 inches thick, according to a Computerworld interview with Metcalfe.

The solution Metcalfe’s team came up with operated at a top speed of 2.94M bps, about 10,000 times faster than what existed at the time.

In 1979 Metcalfe founded 3Com Corp. to commercialize Ethernet, but he was hardly alone. LAN network operating system vendors such as Banyan and IBM had their own networking schemes, dubbed VINES and Token Ring, respectively. So did minicomputer makers, such as Digital Equipment Corporation with DECnet.

But Ethernet proved more resilient and flexible, able to adapt to ever-increasing speeds. In 1983 it was adopted by the IEEE as the 802.3 standard, at 10M bps. While that was an improvement over Metcalfe’s original 2.94M bps, it proved to be just a starting point. Over the years Ethernet speeds would climb from 10M bps to 100M bps, 1G bps and even 100G bps.

NTT Research: Searching for the Next Frontier

Founded in 2019 to focus on long-term, groundbreaking research, NTT Research aims to make its own mark on the technology landscape. It comes from a rich lineage of groundbreaking research and development, as the NTT R&D Lab in Tokyo boasts more than 1,600 patents.

The intent of NTT Research is to bring basic research to the world from its base in Sunnyvale, Calif., in the heart of Silicon Valley. Like Bell Labs and PARC, the company pursues research that is not immediately tied to the core business of its parent NTT or any of its subsidiaries.

It’s also no accident that NTT Research is based in Silicon Valley, an area rich with academic and commercial institutions conducting cutting edge research. NTT Research actively seeks to partner with other research organizations locally and around the globe, and has dozens of such partnerships in place, not unlike CERN.

Currently NTT Research is focused on three core areas, each of which has its own research lab:

  • Physics & Informatics (PHI)
  • Cryptography and Information Security (CIS)
  • Medical & Health Informatics (MEI)

In each focus area, NTT Research hires leading researchers in the field and gives them significant autonomy to pursue their particular areas of interest. Following are some of the labs’ current research projects.

PHI Lab: Optical Computing

The PHI Lab’s mission is to build simple, efficient and practical ways to solve real-world problems with today’s computing systems. The lab is rethinking “computation” within the fundamental principles of quantum physics and brain science.

It fosters an environment for physicists, computer scientists, brain scientists and electrical engineers to collaborate in creating computing hardware and software that extend the limits of Moore’s Law.

Moore’s Law is reaching its limits with respect to traditional electronics for basic physics reasons, including transistor size limits and interconnect bottlenecks. NTT Research sees photonics as offering a solution to extend Moore’s Law and create next-generation computers capable of supporting complex applications such as deep neural networks.

Optical neural networks are one such solution NTT Research is pursuing, including the coherent Ising machine (CIM). A CIM is a network of photonic systems that shows promise for speeding up the solution of complex combinatorial optimization problems, such as the traveling salesman problem, by several orders of magnitude. (Learn more about NTT Research’s work on the CIM.)
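To see the kind of problem a CIM targets, here is a tiny classical sketch of the Ising formulation it physically minimizes. The coupling matrix below is made up for the demo, and the exhaustive search shown is exactly what becomes infeasible at scale; the CIM is a photonic device, not this algorithm.

```python
# A CIM seeks spin assignments s_i in {-1, +1} that minimize an Ising
# energy H = -sum over pairs of J_ij * s_i * s_j. Positive J_ij rewards
# aligning spins i and j; negative J_ij rewards anti-alignment.
from itertools import product

# Hypothetical couplings for 4 spins.
J = {(0, 1): 1.0, (1, 2): -1.0, (2, 3): 1.0, (0, 3): -1.0}

def energy(spins: tuple[int, ...]) -> float:
    """Ising energy of one spin configuration."""
    return -sum(j * spins[a] * spins[b] for (a, b), j in J.items())

# Brute force works for 4 spins (16 configurations) but grows as 2^n;
# the promise of a CIM is finding good configurations when this
# exponential blow-up makes exhaustive search impossible.
best = min(product((-1, 1), repeat=4), key=energy)
print("lowest-energy spins:", best, "energy:", energy(best))
```

Many combinatorial problems, including the traveling salesman problem, can be rewritten as a choice of couplings J so that the lowest-energy spin configuration encodes the answer.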

CIS Lab: Next-Generation Cryptography

The Cryptography and Information Security Lab sees cryptography as essential to a Smart World. The ways in which technology touches our lives will only continue to expand, creating a need for new ways to deliver security and privacy protection. The CIS Lab focuses on foundational research problems in cryptography to deliver long term impact.

Research areas the CIS Lab is exploring include:

Attribute-based encryption: Encryption is currently an all-or-nothing proposition: either you have the correct key to unlock a given dataset or you don’t. Attribute-based encryption (ABE) introduces the idea that multiple keys can exist for a given dataset, or ciphertext, enabling different users to access different parts of the underlying data. With ABE, users can decide who gets access to what types of content, leading to much more granular access control.

Homomorphic encryption, which enables calculations to be performed on encrypted data without exposing the underlying data in the clear. This may be useful in research involving sensitive data, such as medical records. If the records are homomorphically encrypted, researchers can perform calculations on them, such as analytics, and get useful output without ever decrypting the underlying data.
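The property can be demonstrated with a deliberately simple case: textbook (unpadded) RSA happens to be multiplicatively homomorphic. This is only a toy illustration of "computing on ciphertexts without decrypting" — the schemes studied in homomorphic-encryption research are entirely different (typically lattice-based), and unpadded RSA with tiny primes is insecure.

```python
# Textbook RSA is multiplicatively homomorphic:
# Enc(a) * Enc(b) mod n decrypts to a * b. The party doing the
# multiplication never sees a or b in the clear.
p, q = 61, 53                        # tiny primes, for illustration only
n = p * q                            # public modulus
e = 17                               # public exponent
d = pow(e, -1, (p - 1) * (q - 1))    # private exponent (Python 3.8+)

def encrypt(m: int) -> int:
    return pow(m, e, n)

def decrypt(c: int) -> int:
    return pow(c, d, n)

a, b = 7, 6
product_ct = (encrypt(a) * encrypt(b)) % n  # multiply ciphertexts only...
assert decrypt(product_ct) == a * b         # ...yet the plaintexts multiplied
```

A fully homomorphic scheme extends this to both addition and multiplication, which is enough to evaluate arbitrary computations, such as analytics over encrypted medical records, on data that is never decrypted.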

Functional encryption, which enables a user to reveal a specific subset of an encrypted dataset, but no more. It could be used to determine if an encrypted email message is spam, for example, without ever revealing the contents of the email. Similarly, functional encryption could enable law enforcement authorities to examine a series of encrypted surveillance photos to determine if a specific person is included in them, without unveiling other image content.

Post-quantum encryption. Quantum computers are under development and may be on the scene by 2030. Because they will be able to easily crack today’s most secure encryption schemes, data that is encrypted today and expected to remain sensitive into the future must be protected with quantum capabilities in mind. Work at NTT Research aims to deliver encryption that will stand up in a post-quantum world.

MEI Lab: Bio Digital Twins and Nanotechnology

The vision of the Medical & Health Informatics (MEI) Lab is to facilitate improved health outcomes by advancing medical and health sciences. It is focused on promoting the application of a more Personalized, Preventive, Predictive, and Participatory (P4) practice of medicine and wellness.

Research projects the MEI Lab is pursuing include:

The Bio Digital Twin: Digital twin technology is well-known in IT circles, as it enables elements of the physical world to be modeled in software – anything from machines and cars to entire buildings. Bio Digital Twin technology applies the same concept to the human body, by creating software models of the body for health care providers. The models enable providers to test therapies and drugs in minutes, and without the risk of doing any harm because the “patient” is merely a software model. NTT Research’s first target is a Cardiovascular Bio-Digital Twin, which will address acute cardiac conditions, notably myocardial infarction and acute heart failure. The goal is to enable doctors to use the digital twins to quickly home in on the best treatment plan for each individual patient.

Nanotechnology: The MEI Lab is also researching bio-electronics and bio-interfaces with the goal of creating functional, implantable cardiac micro-physiological systems (MPS), such as the Heart-on-a-Chip. The research promises significant advances in materials science, including soft, flexible, stretchable materials including nano-carbon, nano-fiber and conductive polymers that are friendly to human bodies. Research is also underway on implementation of implantable electrodes for use as sensors, stimulators and surgical tools.

Learn More About NTT Research’s Various Fields of Study

  • ABE
  • CIM
  • Bio Digital Twin
  • Nanotechnology
  • PQE

View presentations on the latest research from each lab as presented at Upgrade 2021, the NTT Research Summit.
PHI Lab: https://ntt-research.com/2021-phi-summit-index-page/
CIS Lab: https://ntt-research.com/2021-cis-summit-index-page/
MEI Lab: https://ntt-research.com/2021-mei-summit-index-page/

Work with NTT Research: Explore our Careers page to learn whether an amazing opportunity awaits you.