From breakthroughs and new innovations to established technologies, these are the inventions, gadgets, and trends that shaped the last decade.
It’s been a busy decade in the tech space. New innovations emerged and older ones finally matured in ways that have had a major impact. The 2010s brought us the rise of 3D printing, the rebirth of VR, and an explosion in AI technologies. The health industry was all about wearables. And a digital currency gold rush made us rethink encryption.
As we prepare to enter the 2020s, let’s take a look back at how far we’ve come.
Here are the 15 technologies, gadgets, and trends that had the biggest impact on the industry, and our lives, in the last decade.
3D Printing
A technology first developed in the 1980s has become as common a phrase in manufacturing as injection molding or CNC machining. 3D printing has grown from a novel way to create tchotchkes and plastic parts into a serious technology with applications ranging from automotive and aerospace to medicine. It has become a serious option for prototyping and small-scale production, and the rise of new materials, and even metal 3D printing, has expanded its applications. We may only be a generation or two away from seeing patients with 3D-printed organs in their bodies.
(Image source: Airwolf 3D)
Artificial Intelligence (AI)
You couldn't open a newspaper in the 2010s without seeing some sort of AI-related headline. Whether it was IBM Watson winning at Jeopardy!, fears of robots taking jobs, or the rise of autonomous vehicles, the last 10 years have put AI on everyone's mind like never before. AI has the potential to transform nearly every industry on the planet, and already has in many cases. And the growing ethical and moral concerns around the technology only further demonstrate that it's here to stay.
Bitcoin and Blockchain
Bitcoin went from the currency of choice for Internet drug dealers to sparking a full-on gold rush as investors looked to cash in on its skyrocketing value. But the best thing Bitcoin did this decade was bring new attention to the technology underneath it: blockchain. Increased interest in blockchain has led to implementations in cybersecurity, manufacturing, fintech, and even video games. Blockchain made us rethink security, automation, and accountability, and it is going to be a key component of the ever-expanding Internet of Things going forward.
(Image source: Pixabay)
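The tamper-evidence that makes blockchain attractive for security rests on a simple idea: each block stores the hash of the block before it, so editing any earlier record invalidates every later link. Here is a minimal sketch in Python; it is purely illustrative, not any production blockchain, and omits mining, signatures, and distributed consensus:

```python
import hashlib
import json

def block_hash(block: dict) -> str:
    """Hash a block's contents (sorted keys keep the hash order-stable)."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def make_block(data: str, prev_hash: str) -> dict:
    """A toy block: just a payload plus the hash of the previous block."""
    return {"data": data, "prev_hash": prev_hash}

def chain_is_valid(chain: list) -> bool:
    """Each block must reference the hash of the block before it."""
    for prev, curr in zip(chain, chain[1:]):
        if curr["prev_hash"] != block_hash(prev):
            return False
    return True

# Build a three-block chain.
genesis = make_block("genesis", "0" * 64)
b1 = make_block("alice pays bob 5", block_hash(genesis))
b2 = make_block("bob pays carol 2", block_hash(b1))
chain = [genesis, b1, b2]
print(chain_is_valid(chain))   # True

# Tampering with an earlier block breaks every later link.
genesis["data"] = "genesis (edited)"
print(chain_is_valid(chain))   # False
```

Changing one character in the first block changes its hash, which no longer matches what the next block recorded, which is why a shared ledger built this way is so hard to rewrite quietly.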
Collaborative Robots (Cobots)
Robots have worked alongside humans for a long time, but never like they have in recent years. The rise of collaborative robots (cobots) brought machines into factories that can work right next to human workers without the need for safety cages. The now-defunct Rethink Robotics created arguably the most memorable cobot with Baxter (shown), but several major robotics companies, including Boston Dynamics, Fanuc, and Universal Robots, have gotten into the game.
Cobots also sparked a lot of debate about their impact on jobs and the economy. But those concerns haven't slowed their growth. You'd be hard-pressed to find an industrial robotics company today without at least one cobot in its portfolio.
(Image source: Rethink Robotics)
The Digital Twin
The rise of the Internet of Things and Industry 4.0 has brought with it new ways of thinking about design and manufacturing. None has been more praised than the digital twin. Consumer electronics, automobiles, even entire factories can be modeled in virtual space, providing real-time insights into design and production workflows without the expense of physical prototyping. Add VR and AR to the mix and engineers get an added layer of immersion and visualization.
(Image source: B&R)
GPUs
Chip technology overall has come a long way in the last decade, but no chip has come further than the GPU. Spearheaded by chipmakers including Nvidia (especially Nvidia), AMD, and Intel, along with board makers such as Asus, GPUs grew from their specialized role as graphics processors into a key enabler of the high-end computing needed for AI. Even autonomous cars have leveraged GPUs to handle their computing needs.
It used to be that only serious video gamers cared about the quality of their GPU. Now any company, engineer, or even hobbyist developing hardware that leverages AI has to take a serious look at GPUs as a solution.
(Image source: Nvidia)
The Internet of Things / Industry 4.0
There was a time when going on about how "everything is connected" might have made you sound like a conspiracy theorist. Now it makes you sound more like an IT professional. From factory automation; to devices in our homes like thermostats, locks, and cameras; even to our cars – pretty much anything that could have wireless or Internet connectivity added to it got it.
Sure, some use cases were more valuable than others, but the rapid growth of the IoT made one thing certain: the future is connected. And whether you prefer cloud-based solutions or handling things on the edge, no device is going to be an island ever again. As staggering as it may sound, the march toward 1 trillion connected devices is no exaggeration.
LiDAR
You need a lot of technologies to create an autonomous vehicle – AI, radar, even thermal sensors – but LiDAR is what really put self-driving cars on the road. It's not enough on its own and needs to work alongside other sensors, but engineers have found the technology, traditionally used in fields like meteorology and surveying, to be absolutely crucial in allowing autonomous vehicles to recognize their surroundings, including humans and animals in the road.
(Image source: Innoviz)
Lithium-Ion Batteries
The key innovators behind lithium-ion batteries received a long-overdue Nobel Prize in 2019. That's no surprise: there's no avoiding just how significant an impact lithium-ion has had, particularly in recent years. New battery technologies have made electric vehicles an attractive option for everyday consumers, and new battery chemistries and configurations are making our devices lighter and thinner with every generation. Researchers are always looking for better alternatives, but lithium-ion established itself as the heavyweight king of batteries in the last 10 years, and it doesn't look ready to relinquish that title anytime soon.
(Image source: Johan Jarnestad/The Royal Swedish Academy of Sciences)
The Mars Rovers
We learned more about the Red Planet than ever before thanks to NASA's Mars exploration rovers. The rovers Spirit and Opportunity (shown) first landed on Mars in 2004 and have since brought scientists incredible insights about our neighboring planet – including that Mars was once wetter and had conditions that could have sustained microbial life. The knowledge gained from both will surely carry on as NASA continues to plot a manned mission to Mars in the coming decades. Spirit's mission ended in 2011, while Opportunity operated for an unprecedented 15 years, its mission finally declared complete in early 2019. And we'll always remember Opportunity's last communication to NASA, poetically interpreted as, "My battery is low and it's getting dark."
(Image source: NASA)
Open Source
Open source used to be a dirty word for developers and consumers. The perception was that anything open source was likely to be insecure, shoddily put together, and lacking any long-term support. But open source has proven to be a viable option for developers, and a valuable tool. Microsoft and IBM both made big investments in open source with the acquisitions of GitHub and Red Hat, respectively.
We've even seen real growth in open-source hardware for the first time. The open-source RISC-V chip architecture has seen an ever-growing ecosystem of companies emerge around it in recent years, all aimed at changing the way we build and use processors.
The Raspberry Pi
You can't mention DIY electronics without thinking of the Raspberry Pi. Since its introduction in 2012, the single-board computer has gone from a go-to platform for hobbyists and makers to a serious development platform for engineers working in IoT and even AI. Even if you use another single-board computer, or a microcontroller board like the Arduino, for your projects, we all owe a debt to the Raspberry Pi for bringing electrical engineering a bit closer to home.
(Image source: Raspberry Pi Foundation)
Smartphones
It doesn't matter whether you prefer iOS, Android, or another option, there's no denying the enormous impact smartphones have had on our lives. Smartphones have grown into full-fledged computing platforms – enabling entirely new business models ranging from digital health to mobile VR. The gaming market in particular has enjoyed huge returns thanks to the computing power offered by today's smartphones.
(Image source: Apple)
VR, AR, MR, and XR (The new realities)
Virtual reality has had a lot of starts and stops over the decades. But thanks to the Oculus Rift and other headsets such as the HTC Vive, VR is finally delivering on its promise. Ten years ago, if you had asked anyone whether they used VR in their workflow, they might have laughed. Today, it's becoming more and more commonplace.
The rise of augmented reality (AR), mixed reality (MR), and extended reality (XR) has sparked even more use cases in both the consumer and enterprise space. Pokemon Go showed us that consumers will value AR for entertainment, but plenty of big names, including Microsoft, Google, and HP, brought the technology into the enterprise space as well.
(Image source: HP)
Wearables
The 2010s saw technology grow from something we carry into an accessory we can actually wear. From consumer-focused products like the Apple Watch, Samsung Galaxy Gear, and the Fitbit, to serious medical devices like the AliveCor ECG, intended to help track and diagnose heart conditions, wearables found their way onto millions of bodies. There was certainly a wearables bubble that has since burst, but the digital health sector owes much of its success to wearables. And Google's recent major acquisition of Fitbit shows the tech industry believes there's more to wearables than being a high-tech fashion statement.
(Image source: Fitbit)
Chris Wiltz is a Senior Editor at Design News covering emerging technologies including AI, VR/AR, blockchain, and robotics.