Exponentials represent unprecedented opportunities as well as existential threats. Explore six with far-reaching, transformative impact.
In our Technology Trends 2014 report, we took a look at “exponentials” for the first time. In collaboration with faculty at Singularity University, a leading research institution based in the heart of Silicon Valley whose founders include Cisco, Google, and others, we explored innovations that are accelerating faster than the pace of Moore’s law: technologies whose performance relative to cost (and size) doubles every 12 to 18 months. The rapid growth of exponentials has significant implications. Powerful technologies, including quantum computing, artificial intelligence (AI), robotics, additive manufacturing, and synthetic or industrial biology, are ushering in new and disruptive competitive risks and opportunities for enterprises that have historically enjoyed dominant positions in their industries. In this year’s Technology Trends report, we once again discuss exponentials to build awareness and share new knowledge about their trajectory and potential impact.
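In concrete terms, if a technology’s price-performance doubles every T months (a simple formulation of the report’s definition), its trajectory follows:

```latex
% Price-performance under a fixed doubling period T (12 to 18 months here)
P(t) = P_0 \cdot 2^{\,t/T}
```

At T = 18 months, five years of progress multiplies price-performance by roughly 10; at T = 12 months, by 32. The same five years under the two-year doubling traditionally associated with Moore’s law yields only about a sixfold gain.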
There is significant hype surrounding exponentials, and we caution organizations not to pursue each exponential as the next new “shiny object.” For corporate executives crafting three- to five-year strategic plans, such a narrow focus could overshadow the broader issue: Significant change in industry landscapes will likely require much more than a technology strategy. The goal should be as much about developing organizational capabilities, particularly around innovation intent, to navigate the pace of change as about understanding the implications of any individual breakthrough.

A new type of “exponential entrepreneur” is emerging as exponentials usher in new players and alter market dynamics. This new breed uses crowdsourcing, crowdfunding, and cloud solutions to scale quickly. An army of entrepreneurs, enjoying diminishing barriers to entry and with an appetite for significant risk, poses new competition for established organizations. At the same time, these entrepreneurs can provide new opportunities for market partnerships and alliances. The disruption theory of Singularity University co-founder Dr. Peter Diamandis is playing out across industries: As technologies become digitized, they can dematerialize existing products and solutions; those offerings then have the potential to become democratized, threatening the market positions and demonetizing the margins of existing players.

Exponentials are not solely the domain of research labs, start-ups, and incubators. Large global organizations can also harness these forces with dramatic effect. But doing so will likely require bold and imaginative thinking, discipline, and a commitment to reshaping business as usual.
Enough attention has been paid to innovation to debunk myths of spontaneous, breakthrough eureka moments. As Peter Drucker famously said, “Innovation is work rather than genius.”1 How do large enterprises, agencies, and organizations invoke and harness these forces? By establishing a deliberate, measurable approach to identifying, experimenting with, and investing in new ideas, while also cultivating an environment for organic, creative, entrepreneurial-minded innovation. Leading companies are crafting innovation strategies that encompass four dimensions: trend sensing, ecosystems, experimentation, and edge scaling.
In trend sensing, organizations find ways to stay on top of new developments in technology and to identify and understand the exponential forces and sustaining advances affecting established fields. Establishing a culture of curiosity and learning helps, but it likely won’t be enough, given the pace of change and the complexity of emerging fields. To scope and scan, or “sense,” and then participate in emerging technology and business trends, companies should consider several concurrent approaches. First, companies can leverage their existing set of partners, vendors, and alliances to get the pulse of their closest collaborators, for example, by holding joint innovation workshops to understand the variables directly affecting the business. This can help CIOs and other strategists tap into new thinking and into legacy partners’ roadmaps, which can, in turn, spur new ideas. It can also start the process of collaboration within traditional circles while identifying and mobilizing change agents.
In addition, many organizations are establishing internal research and development functions to explicitly monitor advances and imagine impacts to the business. Scenario planning techniques are then being used to turn these due diligence and discovery activities into potential real-life business opportunities or threats. For example, one retail company has hired science fiction writers to explore the future of retail.2 Other organizations have incorporated demos, prototypes, and visual storyboards to bring R&D-originated concepts to life—using “show” instead of “tell” to break out of incremental thinking. Some leading companies are also forging new relationships and developing a broader ecosystem with non-traditional stakeholders—such as start-ups, scientists, incubators, venture investor communities, academia, and research bodies—which can lead to a wide range of fresh perspectives.
New relationships indicate shifting ecosystems—the communal networks in which companies partner, compete, collaborate, grow, and survive. Historically, many organizations have created ecosystems characterized by traditional relationships—for instance, partnerships with long-standing vendors, suppliers, and customers. Or they might have forged alliances with complementary organizations to collaborate on sales and marketing efforts. With these partnerships come well-established operating protocols that foster consistency and predictability. For example, long-standing supplier and manufacturer ecosystems have been prewired for efficiency, with little margin for error.
When innovation is linearly paced, these ecosystems are effective. However, as the pace of change from disruptive technologies increases exponentially, traditional notions of an ecosystem may need to be redefined and, in many cases, flipped on their head. The mind-set for new ecosystems should be agile and adaptable instead of prewired and static. For companies that are accustomed to linear change but are planning for exponential change, the requirements gap may prove to be significant.
Consider an established automotive manufacturer with an ecosystem evolved from, and centered on, traditional manufacturing techniques to produce consistent products for a fairly constant set of customers. Then suddenly, due to exponentials such as AI, robotics, and sensors, the important parts of the automobile shift from engine performance to in-car technologies such as self-driving controls and augmented reality infotainment systems. Instead of manufacturing techniques, on-board mini-supercomputers, analytics, and customer experiences become the competitive differentiators. At the same time, crowdsourcing is disrupting the consumer automobile market through services such as Uber, Zipcar, and Lyft. This shift in perspective will likely need to be accompanied by a new set of partners working alongside a company’s existing alliances and strategic partners: players from relevant adjacencies, such as a new breed of entrepreneurs, start-ups, venture capitalists, scientists, and engineers, that have not traditionally been on an organization’s radar.
In the third dimension, experimentation, organizations rewire planning, funding, and delivery models so they can explore new concepts and ideas. Most large enterprises take a somewhat linear approach to investing, predicated on thorough business cases built around concrete ROI and fixed timelines. An exponential organization instead fosters a culture of critical-mass experimentation, embedding it into the process of learning and growing: failing fast and cheap, yet moving forward. Doblin3 has defined 10 distinct types of innovation, many oriented around organizational structure, business model, channel, or customer experience.4 Strategies include:

- Think beyond incremental impacts, and don’t constrain ideas to new products and business offerings.
- In early stages, forgo exhaustive business cases, and focus instead on framing scenarios around impact, feasibility, and risk.
- Consider co-investment models for emerging concepts that allow vendors and partners to more collectively shoulder the risk.
A new technology that scales quickly from one to a million users has become a common and straightforward phenomenon. Scaling an organization at exponential speed, however, is quite another matter. Organizational growth is usually linear—incremental and slow. In recent years, however, a new breed of exponential organizations (ExOs) such as Waze and WhatsApp have experienced dramatic growth trajectories and achieved multibillion-dollar valuations in just a few years.5
Unlike traditional businesses that combine assets and workforces within organizational boundaries and sell access to them, ExOs leverage the world around them, such as other people’s cars and spare rooms. In our benchmarks, they outperform their competitors at least tenfold. A traditional consumer product company, for example, takes an average of 300 days to move a product from idea to store shelves. Quirky, an ExO that crowdsources new product ideas, can move products through the same process in 29 days.6
Most of the key drivers of this new breed of ExOs can be boiled down to two acronyms: SCALE and IDEAS. SCALE comprises the external mechanisms that ExOs use to fuel their growth:

- Staff on demand
- Community and crowd
- Algorithms
- Leveraged assets
- Engagement
IDEAS comprises the internal control mechanisms that ExOs use to increase their organizational velocity:

- Interfaces
- Dashboards
- Experimentation
- Autonomy
- Social technologies
There are currently between 60 and 80 ExOs delivering extraordinary results.7 Their rapid rise portends a new way of building and operating businesses. Established enterprises should consider these changes sooner rather than later.
Enterprises should learn to scale at the edge. Even executives who are conscious of looming disruptive forces may, because of linear biases, underestimate exponentials’ pace of change. In the early years, exponentials have little discernible impact; their effect will likely be dwarfed by that of incremental changes to existing forces. Core businesses will likely continue to require care and feeding, even if they are under threat of longer-term disruption. On top of that, executives who attempt large-scale internal change and transformation may face a great wall of resistance. John Seely Brown and John Hagel have conducted extensive research on achieving innovation at an institutional level.8 Their recommendation is to establish new teams on the fringes, or “edges,” of the organization to foster exponential innovation.9 Design principles and strategic levers for the full life cycle of scaling edges are described in figure 1.
Exponentials will continue to change the landscape for many industries in the next decade. Beyond understanding the technologies coming online, organizations—particularly those that have been evolving more linearly—need to understand the broader context of their industries, markets, and business models.
A continuum exists across sectors and geographies. At one extreme lies unprecedented opportunity for innovation and growth, as work is reshaped, customers are engaged through different technologies, and competitive fabrics are reimagined. At the other lies the existential threat of disruption. These are conditions in which start-ups and entrepreneurial market makers can thrive. Large enterprises that have historically dominated their industries can also thrive in these conditions, but only by recognizing the opportunities at hand (and their potential impacts), taking deliberate action, and evolving into exponential organizations.

In this report, we provide a high-level introduction to six exponentials: artificial intelligence, robotics, additive manufacturing, quantum computing, industrial biology, and cyber security. Each is being propelled by significant investments and research across the public and private sectors. Our goal is to drive awareness by providing a snapshot of what each is, why it matters, and where it is going. Executives should consider how to embrace exponentials to innovate and disrupt their enterprises, agencies, and organizations before they themselves are disrupted. Exponentials are deceptive by nature: The doubling effect seems minute in the early phases, but the impact then appears abruptly and powerfully as successive doublings compound. By taking a more proactive stance on innovation as an organizational competency, companies have an opportunity to better understand and harness exponentials as building blocks so as not to be caught unaware, or unprepared.
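The deceptiveness of that doubling is easy to demonstrate with a toy calculation (a sketch in Python, with illustrative numbers rather than a forecast of any particular technology):

```python
# Why exponential growth looks deceptive: a quantity starting at 0.01%
# of some market threshold and doubling each period stays negligible
# for most of its life, then crosses the threshold almost all at once.
level = 0.0001  # 0.01% of the threshold
for period in range(1, 15):
    level *= 2
    print(f"period {period:2d}: {level:8.2%} of threshold")
```

After 10 of the 14 periods, the quantity is still barely 10 percent of the threshold; it crosses 100 percent only in the final period.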
Although artificial intelligence (AI) may still conjure up futuristic visions of cyborgs and androids, it is becoming embedded in everyday life, and supporting significant strides in everything from medical systems to transportation. AI is augmenting human intelligence—enabling individual and group decision makers to utilize torrents of data in evidence-based decisions.
Since the first AI meeting in 1956, the field has developed along three vectors. The first, machine learning, emulates the brain’s ability to identify patterns. The second, knowledge engineering, seeks to apply expert knowledge to narrow-domain, task-specific problem solving. The third is reverse engineering of the brain: Tools such as electroencephalograms (EEGs) and functional magnetic resonance imaging are used to identify which parts of the brain perform specialized tasks, and researchers then attempt to replicate those tasks in software and hardware using similar principles of operation.
Although IBM’s Watson has captured much of the AI attention, it is not the only recent breakthrough. In 2011, for example, the DARPA-sponsored SyNAPSE program developed a neuromorphic, or cognitive computing, chip that replicates some neural processes with 262,000 programmable synapses. In 2014, IBM announced the TrueNorth architecture for neuromorphic computing and a chip that has 256 million configurable synapses and uses less power than a hearing aid. These chips could eventually outperform today’s supercomputers. They also represent a dramatic increase in the processing power of a low-power chip and a significant shift in the architecture that can support AI.
Deep Learning is a powerful general-purpose form of machine learning that is moving into the mainstream. It uses multilayered variants of neural networks to perform high-level abstractions such as voice or image pattern recognition. Pioneered in 2006 by Geoffrey Hinton, a professor at the University of Toronto and researcher at Google, Deep Learning is proving its mettle across a diverse spectrum of challenges, from new drug development to translating human conversations from English to Chinese. Google is using Deep Learning in its Android phones to recognize voice commands and on its social network to identify and tag images. Facebook is exploring Deep Learning as a means to target ads and identify faces and objects.
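As a minimal sketch of the underlying mechanics (a two-layer network learning XOR in plain NumPy, not any production Deep Learning framework), the essentials are layered matrix multiplications, a nonlinearity, and error-driven weight updates:

```python
import numpy as np

# Minimal two-layer neural network learning XOR by gradient descent.
# Deep Learning stacks many more layers, but the mechanics are the same.
rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1, b1 = rng.normal(size=(2, 8)), np.zeros(8)   # hidden layer
W2, b2 = rng.normal(size=(8, 1)), np.zeros(1)   # output layer
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

for step in range(5000):
    h = np.tanh(X @ W1 + b1)                # forward pass
    out = sigmoid(h @ W2 + b2)
    grad_out = out - y                      # backward pass (cross-entropy)
    grad_h = (grad_out @ W2.T) * (1 - h**2)
    W2 -= 0.1 * (h.T @ grad_out); b2 -= 0.1 * grad_out.sum(0)
    W1 -= 0.1 * (X.T @ grad_h);   b1 -= 0.1 * grad_h.sum(0)

print(out.round(3).ravel())  # approaches [0, 1, 1, 0]
```

Image and speech systems replace the four-row toy input with millions of examples and the eight hidden units with many layers of thousands, but the training loop is recognizably the same.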
DeepMind, a company recently acquired by Google, has announced a Neural Turing Machine. This computing architecture uses a form of recurrent neural network that can process end to end, from a sensory perception data stream to interpretation and action. DeepMind’s systems have been used to learn games from scratch, in some cases demonstrating better-than-human-level performance.
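To make “recurrent” concrete, here is a bare-bones sketch of a recurrent step (far simpler than DeepMind’s architecture, with illustrative sizes and untrained random weights): the network carries a hidden state forward, so each output depends on everything seen so far.

```python
import numpy as np

# A bare-bones recurrent cell: the hidden state h summarizes the
# sequence so far. Real systems learn W_x, W_h, W_y from data.
rng = np.random.default_rng(1)
W_x = rng.normal(size=(3, 5))   # input -> hidden
W_h = rng.normal(size=(5, 5))   # hidden -> hidden (the recurrence)
W_y = rng.normal(size=(5, 2))   # hidden -> output

def rnn_step(x, h):
    """Consume one input vector, update the state, emit an output."""
    h = np.tanh(x @ W_x + h @ W_h)
    return h, h @ W_y

h = np.zeros(5)
for x in rng.normal(size=(4, 3)):   # a toy sequence of four inputs
    h, out = rnn_step(x, h)
print(out)  # depends on the whole sequence, via h
```

A Neural Turing Machine augments this kind of cell with a trainable external memory it can read from and write to, which is what lets it learn simple procedures end to end.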
AI is expected to impact the world of work significantly. It can augment humans in complex work requiring creativity and judgment, and likely will increasingly substitute for routine labor. We are beginning to see task assistants and associate systems that, with the right interface, allow humans to delegate work to a computer.
Authored in collaboration with Neil Jacobstein, Artificial Intelligence & Robotics co-chair, Singularity University
In addition to Jacobstein’s role at Singularity University, he is a distinguished visiting scholar in the Stanford University Media X Program and chairman of the Institute for Molecular Manufacturing. Jacobstein has served as a technical consultant on AI research and development for a variety of industry, nonprofit, and government organizations.
Although mankind has been seeking to create mechanical devices that can perform simple and complex tasks for millennia, AI and exponential improvements in technology are bringing what were once futuristic visions into the mainstream of business and society.
Replacing menial tasks was the first foray, and many organizations introduced robotics into their assembly line, warehouse, and cargo bay operations. Since those initial efforts, the use of robotics has marched steadily forward. Amazon, for example, has largely automated its fulfillment centers, with robots picking, packing, and shipping products in more than 18 million square feet of warehouses.10 Traditional knowledge work is the next frontier: The real-time gathering and interpretation of data is likely to be taken over by machines. Essentially, almost every job will be affected by robotics at some point.
Robotics replacing existing jobs, however, is only part of the picture. The International Federation of Robotics estimates that these devices will create between 900,000 and 1.5 million new jobs between 2012 and 2016. Between 2017 and 2020, the use of robotics will generate as many as 2 million additional positions.11 A major factor in robotics-driven job growth is the simple fact that the combination of humans and machines can often produce better results than can either on their own.
An example is the 2009 emergency landing of a US Airways jet in the Hudson River. Confronted with complete engine failure, the pilot had to assess several risky options ranging from gliding to a nearby airstrip to landing in the river. AI could fly the plane, and one day it may be able to land one on water. The decision to do so, however, may always require an experienced pilot.
More prosaic examples are steadily emerging in business. Marlin Steel, for example, once employed minimum-wage workers to perform the dangerous task of bending long metal wires. Now robots do the work, substantially increasing the company’s output and allowing it to grow demand by reducing prices. As a result, workers can earn more by maintaining and supervising a growing number of robots.12
Robotics should be on many companies’ radar, and businesses should anticipate workplace tension as robots are introduced. To ease the tension, companies should start by using robots for repetitive, unpleasant work. Business leaders should then identify jobs that robotics will replace over the next 10 years and use attrition and training to prepare employees for new roles. The challenge for businesses, and for society as a whole, is to drive job creation as robotics makes many jobs redundant. It certainly can be done.
Authored in collaboration with Dan Barry, Artificial Intelligence & Robotics co-chair, Singularity University
Dan Barry is a former NASA astronaut and a veteran of three space flights, four spacewalks, and two trips to the International Space Station. He is a physician and engineer (MD, PhD) and his research interests include robotics, signal processing with an emphasis on joint time-frequency methods, and human adaptation to extreme environments.
The roots of additive manufacturing (AM) go back to 19th-century experiments in topography and photosculpture. More than a century later, the technology, commonly referred to as 3D printing, holds the potential to eliminate some long-standing trade-offs between cheap and good by erasing the cost of complexity. Using AM, whole objects, including those with moving parts, can be created layer by layer, significantly lowering assembly costs. By applying the exact amount of material required for each layer, AM can also reduce waste in production processes. And by virtue of its ability to handle a broad range of geometric configurations, AM opens the door to radically new manufacturing designs. Because of its sweeping potential, AM is moving from a rapid prototyping tool to end-product manufacturing: Companies such as General Electric, Boeing, and Diametal have used AM to build end-user products, with positive results such as faster production runs and more durable products.13
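A deliberately simplified sketch of the slicing step at the core of AM toolchains (a sphere used as a stand-in for arbitrary geometry) shows how layer-by-layer construction sidesteps assembly: each layer is just a cross-section to be traced with material.

```python
import math

# Toy "slicer": compute the cross-section of a sphere at each layer.
# Real AM software does this for arbitrary shapes, then fills each
# slice with material. Dimensions are illustrative.
radius_mm, layer_mm = 10.0, 1.0

for i in range(int(2 * radius_mm / layer_mm) + 1):
    z = -radius_mm + i * layer_mm          # height of this layer
    r = math.sqrt(max(radius_mm**2 - z**2, 0.0))
    print(f"layer {i:2d} @ z={z:+6.2f} mm: circle of radius {r:5.2f} mm")
```

Because complexity lives in the slice geometry rather than in tooling, a part with internal channels or moving assemblies costs the printer little more than a solid block of the same size.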
Deloitte research found that many companies using AM typically travel along one of four paths based on how much, or how little, they choose to impact their supply chains or products.14 The first path, stasis, is often the entry point as businesses seek low-risk opportunities to improve current products and supply chains. For example, jewelry companies are using AM to create assembly jigs, and aerospace manufacturers are printing parts used in chroming and coating processes.15 Along the second path, enterprises are turning to AM to transform supply chains. For example, hearing aid producers are using the high level of customization available with AM to reduce the back and forth between the doctor’s office and manufacturer.
Product innovation is the third path, and businesses are increasingly using AM to customize products. Footwear companies, for example, are starting to use the technology to manufacture running shoes based on customer biomechanics. Along the fourth path, companies alter both supply chains and products in pursuit of new business models. The bathroom fixture manufacturer Symmons is a case in point. It is interacting directly with its supply chain customers to design and create new custom-made fixtures.16
The choice of path should be anchored in a business case that compares AM against traditional manufacturing methods from the perspective of both direct costs and indirect factors. In terms of direct expense, AM can significantly reduce the costs of tooling. In one example, an aircraft manufacturer used AM to produce brackets that reduced the weight of an airliner by 22 pounds.17 Although the reduction may seem slight, it saved customers more than $400,000 annually in fuel costs per plane.18 Businesses should consider such value to their customers when evaluating AM for their organization.
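Using the report’s own figures, the implied value of each pound removed is straightforward to derive, and it is exactly the kind of indirect factor such a business case should capture:

```python
# Implied annual value per pound of weight removed, per plane,
# computed from the figures cited above.
annual_fuel_savings_usd = 400_000   # per plane, per year
weight_saved_lb = 22
print(f"${annual_fuel_savings_usd / weight_saved_lb:,.0f} per pound per year")
# -> $18,182 per pound per year
```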
Authored in collaboration with Mark Cotteleer, research director, Deloitte Services LP
Mark Cotteleer is a research director with Deloitte Services LP. His research focuses on issues related to performance and performance improvement and covers a wide range of topics including advanced manufacturing, supply chain, distribution, and business analytics.
With news of 3D printing projects appearing almost daily, it is easy to assume that the technology is a “big bang” disruption with overnight achievements. That is not the case. Like many disruptions, 3D printing took decades to develop. It finally took center stage because of Chuck Hull’s indefatigable vision.
Hull invented 3D printing in the 1980s and founded 3D Systems to commercialize the technology. For the first 20 years, development was slow, extraordinarily expensive, and saddled with complicated user interface challenges. In the early 2000s, despite its first-mover advantage, the company nearly went bankrupt. Today, 3D Systems’ market cap is more than $3 billion.19
Hull’s vision girded his confidence to fight the many battles encountered on the road to disruption. Like many entrepreneurs, Hull had to cope with investors, board members, employees, and customers who continually doubted the new idea. When a company is creating something disruptive and new, many people won’t believe it until they can hold it. And even then, they might be skeptical about its market potential.
Hull’s commitment, courage, and capability helped him surmount these challenges and execute his vision. Conviction is needed to endure the experience of repeatedly failing and learning, and, ultimately, convincing and disrupting. It arguably underpins what Apple Inc. famously dubbed “the crazy ones.” In its 1997 television commercial, the company pointed to Bob Dylan, Martin Luther King Jr., and Thomas Edison as exemplars whose visions and convictions fell outside the status quo and ultimately changed it.
Vision and conviction aren’t always enough by themselves, however. They need fluid organizations to achieve their impact. Businesses likely won’t be able to keep pace with exponentials, much less disrupt markets, if they stick to an incremental approach to innovation siloed within individual departments. Exponential technologies, including these six, are simply too intertwined and advancing too rapidly for status quo organizations to keep pace.
3D printing, for example, is part of the larger exponential, robotics. Robots, in turn, are being endowed with AI, which will likely move them far beyond stocking shelves to running driverless cars and even performing surgery. By 2020, the Internet of Things exponential will connect more than 50 billion devices to the Internet.20 IoT devices and trillions of sensors will connect to machines with sophisticated artificial intelligence. Finally, infinite computing power will combine with AI to transform the field of synthetic biology to create everything from new foods to new vaccines.
Vision breeds confidence, which, in turn, breeds conviction. Both are crucial to surmounting the obstacles of introducing the new. In an age of daunting uncertainty, being the first to disrupt the status quo should be a top agenda item. If you don’t disrupt yourself, a competitor eventually will.
Computing, as we typically understand it, reflects our experience of the physical world and the mathematics behind it. Beneath that experience, however, is the complex and counterintuitive mathematical reality of quantum mechanics. Some 30 years ago, Paul Benioff, a scientist at Argonne National Laboratory, posed the idea that if computing could be based on quantum physics, it could gain unimaginable power and capability.21
The potential power of quantum computing stems from its ability to free computing from sequential binary operations. Even the most powerful conventional computers rely on combinations of 0s and 1s, with computations performed discretely. Quantum computing, on the other hand, encodes information as quantum bits, or qubits. Qubits exploit the reality of quantum physics, in which subatomic particles can exist in multiple states concurrently. As a result, qubits can represent not just 0 or 1 but superpositions of both at the same time. Thus, instead of a sequence of individual calculations, a quantum computer can, in effect, explore countless computations simultaneously. The potential is profound. By some estimates, quantum computers could solve problems that would take conventional machines millions of years to figure out.22 In theory, they could perform some calculations that conventional computers would take longer than the life of the universe to complete.
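In standard notation (a textbook formulation, not specific to any vendor’s hardware), a single qubit is a weighted combination of both classical states, and a register of n qubits carries exponentially many amplitudes at once:

```latex
% A qubit superposes the basis states |0> and |1>
|\psi\rangle = \alpha|0\rangle + \beta|1\rangle,
\qquad |\alpha|^2 + |\beta|^2 = 1

% An n-qubit register holds 2^n complex amplitudes simultaneously
|\Psi\rangle = \sum_{x \in \{0,1\}^n} c_x |x\rangle,
\qquad \sum_{x} |c_x|^2 = 1
```

Measurement collapses the register to a single n-bit outcome, which is why quantum speedups depend on algorithms, such as Shor’s factoring algorithm, that interfere the amplitudes so that correct answers dominate.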
Although no one has yet created a practical, general-purpose quantum computer, D-Wave, based near Vancouver, Canada, is pursuing an alternate model of quantum computing called quantum annealing.
Google is also investing in the technology. Google has been experimenting with D-Wave’s computers since 2009 and recently opened its own labs to build chips similar to those of D-Wave.23 Whether or not these efforts pan out is an open question. However, the attention of technology industry giants underscores the lure.
Given the mounting volume and importance of big data, quantum computing’s ability to solve complex mathematical problems opens astonishing vistas in everything from predicting the weather to developing new drugs. That same ability, however, is also a source of angst. Quantum computers may be able to factor extremely large numbers quite readily, and the difficulty of factoring lies at the heart of virtually all public-key encryption systems. Current cryptography is deemed safe because it could take lifetimes for even the most powerful conventional computers to crack the code. A quantum computer could conceivably crack it in minutes. Were that to happen, all financial transactions and authentications moving across the Internet could suddenly become vulnerable.
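A toy example shows why (tiny illustrative primes; real keys use moduli of 2,048 bits or more, far beyond trial division): anyone who can factor the public modulus can reconstruct the private key directly.

```python
# Toy RSA: factoring the public modulus n breaks the private key.
p, q = 61, 53
n, e = p * q, 17                     # public key: (n, e)
phi = (p - 1) * (q - 1)
d = pow(e, -1, phi)                  # private exponent (Python 3.8+)

cipher = pow(42, e, n)               # encrypt the message 42
assert pow(cipher, d, n) == 42       # decrypt with the private key

# An attacker who factors n recovers the key without being told p or q:
p2 = next(f for f in range(2, n) if n % f == 0)
q2 = n // p2
d2 = pow(e, -1, (p2 - 1) * (q2 - 1))
assert pow(cipher, d2, n) == 42      # cracked
```

Trial division is hopeless at real key sizes, but Shor’s algorithm on a sufficiently large quantum computer would perform the factoring step in polynomial time, which is the scenario driving interest in quantum-resistant cryptography.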
Authored in collaboration with Brad Templeton, Networks & Computing chair, Singularity University
Brad Templeton is a developer of and commentator on self-driving cars, software architect, board member of the Electronic Frontier Foundation, Internet entrepreneur, futurist lecturer, and writer and observer of cyberspace issues. He is noted as a speaker and writer covering copyright law, political and social issues related to computing and networks, and the emerging technology of automated transportation.
Unlocking the complex logic of the genome has been a quest of scientists and businesses for decades. In the past, the high cost of experimentation has prevented organizations from pursuing industrial-strength genomics. The budget to first sequence the human genome, for example, was almost $3 billion.24
Today, however, digital technologies are fueling the field of industrial and digital biology, ushering in a renaissance in the ability to manipulate DNA, splice genes, and control genomes. Broad access to hardware, data, and tools is bringing costs down quickly. With CRISPRs (clustered regularly interspaced short palindromic repeats), for instance, researchers can edit a gene for less than $1,000. On the not-so-distant horizon, desktop applications may drop that expense to a few dollars.
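The sense in which biology is becoming an information technology can be made concrete with a toy sketch (illustration only, not a lab protocol): the Cas9 enzyme commonly used in CRISPR editing cuts next to a short “NGG” motif called a PAM, so enumerating candidate target sites in sequenced DNA is, computationally, plain string scanning.

```python
# Toy scan for SpCas9 target sites: a 20-base guide region followed
# by an "NGG" PAM (N = any base). Illustration only, not a lab tool.
dna = "ATGCGTACGTTAGCCGATCGAGGTTACGATCGATCGGGCCATAGCTAGG"

def pam_sites(seq, guide_len=20):
    """Yield (position, guide candidate) for every NGG PAM site."""
    for i in range(guide_len, len(seq) - 2):
        if seq[i + 1 : i + 3] == "GG":        # N-G-G at i, i+1, i+2
            yield i, seq[i - guide_len : i]    # 20 bases upstream

for pos, guide in pam_sites(dna):
    print(f"PAM at {pos}: guide candidate {guide}")
```

The wet-lab work remains nontrivial, but target selection, design, and verification increasingly happen in software, which is what keeps pushing the cost curve down.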
Costs have not been the sole barrier to the advancement of genomics, however. The public’s concerns over genetic engineering have constrained research frontiers despite the value of many applications. As early as the 1990s, genomics saved Hawaiian papayas from near extinction. Similar efforts are underway to protect US orange trees from a deadly disease that could turn a staple of the American breakfast, orange juice, into a luxury item. Companies such as NatureWorks and Calysta are also making notable progress in bioplastics, creating environmentally sound plastics and fibers.
The tide of public opinion may turn as digital biology drives significant strides in human health. Such strides are already underway. Pharmaceutical companies are increasingly bringing genetically engineered drugs to market that target previously almost intractable conditions. Companies are also making advances in creating and controlling microbes. Soap, for instance, could be replaced with a microbial spray that contains good microbes to control harmful ones. Microbes can also be ingested to control weight and improve moods.
Digital biology’s greatest impact on health may likely occur at the intersection of medical science and big data. That intersection holds the potential of genetic maps of our body’s systems and wearable devices that sense when something is wrong in those systems. Google recently launched a major research project that will underpin the creation of such maps. In 2014, the company announced its Baseline Study that is crowdsourcing a genetic picture of human health to help medical professionals identify biomarkers of diseases early in their development. The project is collecting genetic and molecular data from 175 people, and Google plans to expand the research to include thousands of participants.25
Google is using wearables to capture data such as how people metabolize food and how their hearts beat—for example, smart contact lenses that measure glucose levels. Companies such as Germany’s Bragi are starting to commercialize wearables with similar capabilities. Bragi is developing tiny wireless earphones that not only store 4GB of audio files, but also monitor heart rate and oxygen consumption. Bragi’s goal is to turn earphones into a platform with an overall sensor system for the body.
Industrial biology’s combination of individual genetic maps and sensors that detect what may be amiss will drive vibrant new ecosystems. Technology, pharmaceutical, health care, and electronics companies may find themselves collaborating in new ways to bring industrial biology’s potential to bear on the public’s health.
Authored in collaboration with Raymond McCauley, Biotechnology & Bioinformatics chair, Singularity University
Raymond McCauley is a scientist, engineer, and entrepreneur working at the forefront of biotechnology. Raymond explores how applying technology to life—biology, genetics, medicine, agriculture—is affecting every one of us. In addition to his role at SU, McCauley is co-founder and chief architect of BioCurious.
Cybercrime has traditionally been a two-dimensional problem: individuals hiding behind computer screens and hacking into the world’s information systems. That’s about to change as cybercrime goes 3D. Today, the steady advance of exponential technologies, including robotics, artificial intelligence, additive manufacturing, and industrial biology, is adding a third dimension to the risks we face: our physical space.
Consider robotics. Drones, or robotically controlled aircraft, can deliver packages to our doors, but they can also carry firearms and explosives. Indeed, a few years ago, the FBI arrested an al Qaeda affiliate who was planning to use remote-controlled drones to drop explosives on American government buildings.26
Industrial biology is also on the cusp of becoming a threat. Advances in the field are moving rapidly, with some research estimating the market will grow at a CAGR of 32.6 percent from 2013 to 2019.27 Biology is rapidly becoming an information technology, and as a result, genetic engineering will likely soon become a desktop application. When it does, bad actors may be able to create their own bio-viruses, such as weaponized flu strains: computer-designed viruses that could permeate our physical world.
Perhaps the more immediate threat on the horizon is the growth of the Internet of Things. As we transition from Internet Protocol version 4 to version 6,28 our global information grid is expected to explode in size. If today’s Internet is the metaphorical size of a golf ball, tomorrow’s will be the size of the sun. That means that nearly every car, computer, appliance, toy, thermostat, and piece of office equipment could be connected and online, and potentially vulnerable to hacking from anywhere on the planet.
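The arithmetic behind the metaphor is easy to check: IPv4’s 32-bit addresses top out near 4.3 billion, already fewer than one per person, while IPv6’s 128-bit space is larger by a factor of 2^96.

```python
# Address-space arithmetic behind the IPv4-to-IPv6 transition.
ipv4 = 2**32     # 4,294,967,296 addresses
ipv6 = 2**128    # ~3.4e38 addresses

print(f"IPv4 addresses: {ipv4:,}")
print(f"IPv6 addresses: {ipv6:.3e}")
print(f"IPv6 / IPv4   : {ipv6 // ipv4:.3e}")   # 2**96, ~7.9e28
```

That is enough address space for every car, appliance, sensor, and toy to have its own routable identity, which is precisely what makes both universal connectivity and a universal attack surface plausible.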
Recently, for example, researchers and hackers have claimed that they can access a plane’s satellite communications system during commercial flights via Wi-Fi or the entertainment console.29 Internet-connected thermostats at the US Chamber of Commerce were breached.30 Webcams, insulin pumps, automobiles, and refrigerators have all been demonstrated as hackable in certain situations.31
The most effective means of combating cyber threats remain elusive, and any successful approach will likely require a combination of technical, legal, entrepreneurial, and public policy collaboration. Traditional law enforcement efforts are often closed systems, siloed within individual countries; they struggle to achieve global scale, while criminal networks easily subvert national boundaries in the information age. One development demonstrating potential is the use of crowdsourcing for public safety and security. In Latin America, for example, citizens are helping the government fight narcotics-related murders by mapping the activities of drug dealers.32 Ultimately, the effective fight against cybercrime may well rest on the efforts of a global crowd helping to root it out. Organizations should realize that while no system is hacker-proof, they can take steps to create a more secure, vigilant, and resilient enterprise information infrastructure.
Authored in collaboration with Marc Goodman, chair for Policy, Law, and Ethics, Singularity University
Marc Goodman is a global strategist, author, and consultant focused on the disruptive impact of advancing technologies on security, business, and international affairs. Goodman’s latest book on cybercrime, to be released on February 24, 2015, is Future Crimes: Everything Is Connected, Everyone Is Vulnerable and What We Can Do About It. Over the past 20 years, he has built his expertise in next-generation security threats such as cybercrime, cyber terrorism, and information warfare working with both industry and government globally.