“About a decade ago,” explains George Mason University economist Tyler Cowen, “the techno-optimist narrative began to fall apart.” Some of that was due to unmet promises or just impatience on the part of the public. But this year, Cowen says, he’s seeing a more positive story. The successes on this year’s list (COVID vaccines based on messenger RNA (mRNA), AI that can predict protein folding, gene editing with CRISPR, new lithium-metal batteries for electric vehicles, and advances in solar energy) have prompted an introductory article in the MIT Technology Review titled “Are you ready to be a techno-optimist again?”

The list is more explanation than prognostication because, of the 10 breakthrough technologies, we already have eight; only two require an added “when to expect it” note. The two yet to arrive are deliverable lithium-metal batteries to replace lithium-ion (in four years) and something called “data trusts” (in two to three years). Already part of the landscape are mRNA vaccines; computers that compose articles and reports; digital contact tracing; hyper-accurate GPS positioning; remote business, school, and medicine; multiskilled AI; TikTok’s recommendation algorithms; and green hydrogen.

1. Messenger RNA Vaccines
Work on using mRNA in vaccines began some 20 years ago with Katalin Karikó and Drew Weissman. Instead of putting live or inactivated viruses in a vaccine, the mRNA approach uses “the short-lived middleman molecule that, in our cells, conveys copies of genes to where they can guide the making of proteins”—proteins that trigger our immune response by signaling the presence of disease. The protein’s blueprint is derived directly from the coronavirus, but unlike the actual virus, it can’t make a person sick.

The vaccines developed from this technique for COVID-19 have an almost unprecedented efficacy of more than 90%. And beyond those successes, Weissman adds, “Messenger RNA has an incredible future.” Researchers are already looking to reprogram the technique to deal with HIV, herpes, infant respiratory virus, and malaria, and possibly to produce a universal flu vaccine.

2. GPT-3

Developed by OpenAI in San Francisco, Calif., the Generative Pre-trained Transformer 3 (GPT-3) is the biggest language model ever built. It’s an “algorithm that uses deep learning, trained on the text of thousands of books and most of the internet, to string words and phrases together. When it was launched in 2020, its ability to mimic human-written text with uncanny realism seemed to many like a milestone on the road to true machine intelligence.”

The latest iteration, GPT-3, has a much larger neural network and training set than its predecessor, with 175 billion parameters that can be adjusted during training. “It gives the impression it can write anything: fan fiction, philosophical polemics, and even code.” The editors provide a measure of its value with the advice “If you want to know the state of deep learning today, look at GPT-3.”
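GPT-3’s transformer architecture is far too large to sketch here, but the core task it is trained on, predicting a plausible next word from what came before and sampling to string words and phrases together, can be illustrated with a deliberately tiny bigram model. This is not GPT-3’s method, just the next-word-prediction idea, and the corpus is invented:

```python
import random
from collections import defaultdict

def train_bigram(text):
    """Count word-to-next-word transitions in a training corpus."""
    model = defaultdict(list)
    words = text.split()
    for a, b in zip(words, words[1:]):
        model[a].append(b)
    return model

def generate(model, start, length=8, seed=0):
    """Sample a continuation one word at a time, as a language model does."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(length):
        nxt = model.get(out[-1])
        if not nxt:  # no known continuation; stop
            break
        out.append(rng.choice(nxt))
    return " ".join(out)

corpus = "the model predicts the next word and the next word follows the context"
model = train_bigram(corpus)
print(generate(model, "the"))
```

GPT-3 replaces these raw counts with a deep neural network conditioned on far more than one preceding word, which is what lets it sustain coherent prose over whole paragraphs.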

3. Data Trusts
This is one of only two items in the list that are in planning, and it’s expected to be available in about two to three years. Data trusts will help guarantee personal security and privacy. Just as legal trusts have people and laws in place to protect trust owners’ assets, a data trust would have trustees to look after the data and data rights of groups of individuals. “And just as doctors have a duty to act in the interest of their patients, data trustees would have a legal duty to act in the interest of the beneficiaries.”

There are governments that have already enacted laws to protect individual privacy—Europe’s General Data Protection Regulation, for instance—but much more needs to be done to create, protect, and disseminate privacy rights. In addition to data trusts, there are groups also looking into other measures, including data cooperatives and data unions.

4. Lithium-Metal Batteries
The changeover to electric vehicles has been slowed by current lithium-ion batteries, which are costly and heavy, lack sufficient power, and recharge slowly. As a result, electric vehicles make up a meager 2% of new car sales. Lithium-ion batteries can also burst into flames in a collision because of their liquid electrolytes.

Now, after two decades of research and development, a new lithium-metal battery, with a solid-state electrolyte, is almost ready to replace lithium-ion batteries. One producer, QuantumScape, staged a December 2020 demonstration of a battery that can be charged to 80% of its capacity in 15 minutes, promises a driving range of 450 miles, should last for hundreds of thousands of miles, and works well even in freezing temperatures. The target date for release of the lighter, more powerful lithium-metal batteries is 2025.

5. Digital Contact Tracing
You might recall an SF TechNotes posting in June 2020 titled “Tracing the Virus with Bluetooth.” It described a collaboration between Google and Apple to enable a broad, Bluetooth-based contact tracing platform. Well, that didn’t work out. The technology was ready, but the will to deploy it, to set up services that operate across state or international borders, to ensure sufficient testing, and to convince users that their devices would keep them anonymous, ultimately failed.

The data could have been very valuable, but the power of Big Data was ignored. The editors point out that although “COVID exposure notifications didn’t live up to the hype...there’s still a lot to learn from their rollout.”

6. Hyper-Accurate Positioning
The global positioning system (GPS) was one of the first applications for the newly launched satellite systems, and “since 1993, at least 24 GPS satellites have been orbiting the earth and constantly broadcasting their positions.” Since then, the GPS device in your car or on your phone has been able to tell you your location to within five to 10 meters almost immediately, by trilateration: measuring its distance to at least four of the satellites using their signals’ travel times. In 2020, China completed its BeiDou Navigation Satellite System, which comprises a total of 55 satellites and is accurate to within 1.5 to 2 meters. And the next positioning system may be based on the quantum properties of matter, not satellite signals.
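The positioning math itself fits in a few lines. The following toy sketch (noise-free, with made-up satellite coordinates, and ignoring the receiver clock bias that real receivers must also solve for) recovers a receiver’s position from its ranges to four satellites by linearizing the range equations against the first satellite:

```python
import numpy as np

# Hypothetical satellite positions (meters, Earth-centered frame) and a
# hypothetical true receiver position; none of these are real ephemeris data.
sats = np.array([
    [15600e3,  7540e3, 20140e3],
    [18760e3,  2750e3, 18610e3],
    [17610e3, 14630e3, 13480e3],
    [19170e3,   610e3, 18390e3],
])
receiver = np.array([-41e3, -16e3, 6370e3])

# Measured ranges (noise-free here for illustration).
ranges = np.linalg.norm(sats - receiver, axis=1)

def trilaterate(sats, ranges):
    """Subtract the first range equation from the others, leaving a linear
    system 2*(s_i - s_1).x = (|s_i|^2 - |s_1|^2) - (r_i^2 - r_1^2)."""
    s1, r1 = sats[0], ranges[0]
    A = 2 * (sats[1:] - s1)
    b = (np.sum(sats[1:] ** 2, axis=1) - s1 @ s1) - (ranges[1:] ** 2 - r1 ** 2)
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos

print(np.round(trilaterate(sats, ranges)))  # close to the true receiver position
```

In a real receiver the ranges are inferred from signal travel time, and the unknown offset of the receiver’s cheap clock adds a fourth unknown, which is why at least four satellites are required in practice.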

Science journalist Ling Xin explains, “When atoms are cooled down to just above absolute zero, they reach a quantum state that is particularly sensitive to outside forces. Thus, if we know an object’s initial position and can measure the changes in the atoms (with the help of a laser beam), we can calculate the object’s movements and find its real-time location.” This would be practical where GPS or BeiDou don’t work—in deep space, under the oceans, or as a backup navigation system for autonomous cars. Xin points out, “A very early version of a quantum positioning system, developed by ColdQuanta in Boulder, Colorado, is now operating on the International Space Station.”
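A cold-atom interferometer is far beyond a code snippet, but the dead-reckoning principle Xin describes, integrating measured acceleration forward from a known starting point, can be sketched in one dimension (this is the navigation math only, not a model of the quantum sensor):

```python
import numpy as np

def dead_reckon(acc, dt, x0=0.0, v0=0.0):
    """Integrate sampled acceleration twice (trapezoidal rule):
    acceleration -> velocity -> position."""
    v = v0 + np.concatenate(([0.0], np.cumsum((acc[1:] + acc[:-1]) / 2 * dt)))
    x = x0 + np.concatenate(([0.0], np.cumsum((v[1:] + v[:-1]) / 2 * dt)))
    return x, v

# Example: constant 1 m/s^2 acceleration for 10 s, starting at rest.
t = np.linspace(0, 10, 1001)
acc = np.ones_like(t)
x, v = dead_reckon(acc, dt=t[1] - t[0])
print(round(v[-1], 2), round(x[-1], 2))  # ≈ 10.0 m/s and ≈ 50.0 m
```

The reason cold atoms matter is error growth: any bias in the measured acceleration is integrated twice, so position error grows quadratically with time, and only an extremely precise accelerometer keeps dead reckoning usable for long.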

7. Remote Everything
In its simplest terms, “The pandemic set off a global experiment in virtual living that will continue to shape our lives for years to come.” Remote working, learning online, and remote healthcare all have moved us toward “the one thing that’s going to stay with us post-COVID—we’re going to center our lives around our homes.”

8. Multiskilled AI
Computer-vision systems today can detect and interpret images, and neural networks can imitate the way we hear, speak, write, and reason, but such systems are built to do only one of these things at a time. Now, with advances in language processing algorithms like GPT-3, there are efforts to combine language abilities with sensing capabilities to create bimodal models, or visual-language AI.

Children learn early how to associate words with sights and sounds, and the combination helps create a unified sophisticated model of their world. OpenAI has released two visual-language models for its GPT-3. MIT Technology Review senior AI reporter Karen Hao explains, “In the long run, multimodal breakthroughs could help overcome some of AI’s biggest limitations.”
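One common way to combine the two modalities, used by CLIP-style visual-language systems, is to embed images and text into a shared vector space and match them by similarity. The sketch below uses tiny hand-made vectors standing in for learned embeddings; it illustrates the shared-space idea, not OpenAI’s actual models:

```python
import numpy as np

# Pretend "encoders" have already mapped each modality into the same
# 3-dimensional space. In a real system these vectors are learned.
image_embeddings = {
    "photo_of_dog": np.array([0.9, 0.1, 0.0]),
    "photo_of_car": np.array([0.0, 0.2, 0.9]),
}
text_embeddings = {
    "a dog": np.array([0.8, 0.2, 0.1]),
    "a car": np.array([0.1, 0.1, 0.95]),
}

def cosine(a, b):
    """Cosine similarity between two vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def best_caption(image_name):
    """Pick the caption whose embedding lies closest to the image's embedding."""
    img = image_embeddings[image_name]
    return max(text_embeddings, key=lambda t: cosine(img, text_embeddings[t]))

print(best_caption("photo_of_dog"))  # a dog
```

Because both modalities share one space, the same similarity lookup works in either direction: find captions for an image, or find images matching a text query.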

9. TikTok Recommendation Algorithms
You might wonder why a social app is on this list. It’s the success of the recommendation algorithm for the “For You” page on TikTok that put it here. Since its launch in China in 2016, the app has been downloaded 2.6 billion times, with 100 million users in the United States. The editors explain, “While other social media platforms favor viral content with mass appeal, TikTok’s algorithms have proved especially adept at plugging creators into niche communities that share interests, hobbies, or a particular identity.” This magic ingredient is improving over time, and “competitors like Instagram, Snapchat, and Triller have sped up attempts to copy what it is that makes their rival’s recommendations so good.”
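TikTok’s actual system is proprietary, but the editors’ contrast between mass-appeal ranking and interest-matched ranking can be illustrated with a toy example (all data invented):

```python
# Each video has tags and a raw view count.
videos = {
    "dance_hit":  {"tags": {"dance", "music"},    "views": 900},
    "knit_tips":  {"tags": {"knitting", "crafts"}, "views": 40},
    "craft_tour": {"tags": {"crafts", "diy"},      "views": 55},
}
user_interests = {"knitting", "crafts"}

# Popularity ranking: whatever has the most views wins.
by_popularity = max(videos, key=lambda v: videos[v]["views"])

# Niche ranking: whatever overlaps most with this user's interests wins.
by_interest = max(videos, key=lambda v: len(videos[v]["tags"] & user_interests))

print(by_popularity, by_interest)  # dance_hit knit_tips
```

The second ranking is what surfaces a 40-view knitting video to the right viewer, which is the behavior the editors credit for plugging creators into niche communities.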

10. Green Hydrogen
Hydrogen as a fuel provides about three times the energy of the same weight of gasoline or diesel. Today, most manufactured hydrogen is made by reacting methane with steam at high temperatures. But you can also make “green hydrogen” by splitting water into its two component elements using electrolysis (passing electricity through the water). The hydrogen is green when that electricity comes from renewable sources, and when it is used as a substitute for fossil fuels (coal, petroleum, or natural gas), it produces water vapor instead of climate-warming carbon dioxide.

The problem with it is the amount of energy needed for the electrolysis, which makes it much more expensive than conventional hydrogen fuels. But it’s important to note that the cost of green hydrogen is half what it was 10 years ago. “And as the cost of wind and solar power continues to drop, and economies of scale around green hydrogen production kick in, it could get a lot cheaper.” And that leads the editors to surmise, “If that happens, green hydrogen has the potential to become a core fuel for a decarbonized future.”
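A back-of-the-envelope calculation shows why the electricity price dominates green hydrogen’s cost. The figure below is a typical published estimate, not from the article: an electrolyzer consumes roughly 50 kWh of electricity per kilogram of hydrogen produced (the hydrogen itself stores about 33 kWh/kg; the rest is conversion loss):

```python
# Assumed electrolyzer demand, kWh of electricity per kg of H2 (illustrative).
KWH_PER_KG_H2 = 50.0

def hydrogen_cost_per_kg(electricity_usd_per_kwh):
    """Electricity cost component of producing 1 kg of green hydrogen."""
    return KWH_PER_KG_H2 * electricity_usd_per_kwh

# Falling wind/solar prices translate directly into cheaper hydrogen.
for price in (0.05, 0.03, 0.01):
    print(f"${price:.2f}/kWh -> ${hydrogen_cost_per_kg(price):.2f}/kg H2")
```

At $0.05/kWh the electricity alone costs $2.50 per kilogram, which is why ever-cheaper renewable power is the lever the editors point to.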

Europe is leading the way for green hydrogen with “hydrogen valleys.” These are “regional projects that situate electrolysis plants where they can serve multiple industrial purposes.” These large projects exist in Germany, the Netherlands, Italy, Spain, France, and Britain, as well as Canada, Australia, Japan, and China. But not yet in the U.S., where discussions about the alternative fuel have barely begun.


This March/April MIT Technology Review issue is dubbed The Progress Issue, and the editors appropriately include a short piece by David Rotman on what we have learned about breakthroughs since the first year of the list. Included in that first list were brain-machine interfaces, flexible transistors, data mining, digital-rights management, biometrics, natural-language processing, microphotonics, untangling code, robot design, and microfluidics. Rotman offers four lessons that have emerged since that 2001 inaugural list of technologies to be watched.

Progress is often slow. The idea of brain-machine interfaces was in the first list, and since then, neuroscientists have struggled to manage the complexity of neural circuits and systems with only modest gains.

Sometimes it takes a crisis. The most dramatic example of this type of catalyst is the COVID-19 pandemic.

Be careful what you wish for. Biometrics produced the first facial recognition applications, which today have raised a raft of privacy problems. So many, in fact, that the European Union is currently discussing a possible ban on facial recognition in Europe.

The trajectory of progress matters. List one includes data mining, and Rotman notes, “Thanks to ever increasing computational power, the exploding size of databases, and closely related advances in artificial intelligence, data mining (the term is now often interchangeable with AI) rules the business world. It’s the lifeblood of big tech companies, from Google and its subsidiary YouTube to Amazon and Facebook.… Yet these great successes mask an underlying failure that became particularly evident during the pandemic. We have not exploited the power of big data in areas that matter most.…We could have learned so much more about how the virus spreads, how it evolves, how to treat it, and how to allocate resources, potentially saving countless lives. We didn’t seem to have a clue about how to collect the data we needed.”

There are summaries of the MIT Technology Review’s breakthrough lists over the 20 years, available by year, online at 2020 | MIT Technology Review. Whether you approach the lists with a jubilant or jaundiced eye, as indicators of where we have been and where we’re going, they’re very interesting.
