The Top Consumer Electronic Articles of 2019

According to Statista, the consumer electronics industry is expected to grow by around 2.2 percent between 2018 and 2019. While this rate is significantly lower than in previous years, it still shows a strong upward trend.

Design News covered all of the leading growth areas in both consumer electronics and product manufacturing trends. Below are 12 of the editors’ top picks for 2019.

Image: Vectorfusionart/Adobe Stock

Which companies were naughty—and which were nice—in 2019?

It’s time to draft the naughty and nice lists for 2019 and see which companies deserve to be rewarded and which ones will get a lump of coal in their stocking.

Image source: Thor Balkhed

Complete integrated circuits fabricated using printing press

The breakthrough eliminates the need to use multiple manufacturing methods to create an integrated circuit with more than 100 organic transistors.

Image Source: Adam Traidman, Turkey

What does every engineer want for the holidays?

Skip the presents and go for the (engineering) experience.

During the holiday season, one tends to think of presents. But today’s designers, manufacturers and sellers tell us the product is but a commodity and what we really want is the experience.

10 Technologies That Can Make You Into a Superhero

There are technologies that exist today that aren’t far off from what you’ve seen in superhero movies and comic books.

Image source: Altair

Save Your Crash Test Dummy

The blend of physics-based testing and data-based simulation can dramatically shorten the time-to-market of new designs. Add simulation, but don’t throw away the dummy.

Image Source: Netflix

8 Popular Products You Didn’t Know Were Built with Open Source

A popular streaming service, video game consoles, and mobile messaging all owe a debt to FreeBSD.

Image Source: Cisco

The 7 Best LoRaWAN Devices on the Market

Whether you’re building a DIY project or attempting to manufacture something for market, there are ready-made LoRaWAN gateway products to support your efforts.

New Material Could Transform How Electronics Are Built

A new family of crystal materials can serve a dual purpose in electron movement in electronic devices, potentially changing how they will be designed in the future.

Image source: Clarke Lab/Harvard John A. Paulson School of Engineering and Applied Sciences

Method for Soft Actuation Eyed for New Devices, Robots

A new method for actuation can change the shape of a flat sheet of elastomer with rapid and reversible action for new designs in robotics and other applications.

Image source: NürnbergMesse

3 Trends from Embedded World 2019

Embedded World revealed a number of trends that we can expect to see in the mass markets over the next six months to two years.

Image source: UW/Sam Million-Weaver

Electronic Bandage Can Speed Wound Healing 

An e-bandage dramatically speeds wound-healing using electrical energy harvested from a patient’s body.

(Image source: The Laboratory of Monica Craciun, University of Exeter)

Graphene-Based Electronic Fibers for Wearable Textiles

Graphene could be used to incorporate electronics directly into fabric for next-generation smart textiles.

John Blyler is a Design News senior editor, covering the electronics and advanced manufacturing spaces. With a BS in Engineering Physics and an MS in Electrical Engineering, he has years of hardware-software-network systems experience as an editor and engineer within the advanced manufacturing, IoT and semiconductor industries. John has co-authored books related to system engineering and electronics for IEEE, Wiley, and Elsevier.

19 in 2019: Best Electronic Bits from the Design News Vault

Check out the best curated technical content from the editors at Design News.

For electronic developers, 2019 was another stellar year. In addition to ongoing challenges and successes in areas such as embedded systems, chip and board hardware systems and systems-of-systems (like satellites), there were new materials, evolving design approaches and insights to be considered. Here are the best stories that covered these issues from the 2019 Design News archives.

  1. Top 10 2019 engineering, science and technology awards – Each year reveals further advances in the disciplines of technology, engineering and science. This year, luminaries were honored for their work in cosmology, photonics, GPS systems, video processing, semiconductors, brain neurons and more.
  2. Who’s left to make chip development tools? – Electronic design automation (EDA) refers to the software tools used for designing electronic systems, such as system-on-chip (SoC) integrated circuits and printed circuit boards. The tools work in a design-verification flow that chip designers use to analyze and develop semiconductor chips. Here’s a look at the remaining major EDA tool companies after years of consolidation.
  3. Complete integrated circuits fabricated using printing press – Researchers have for the first time printed complete integrated circuits with more than 100 organic transistors, a breakthrough in the quest to use printing to create complex next-generation electronic and computing devices. The feat eliminates the need to use multiple manufacturing methods to create such a circuit.
  4. 2 game-changing trends that will define embedded systems in the 2020s – The last decade has seen an amazing advancement in embedded system development techniques, tools and technologies. The next decade has the potential to dramatically change the way that products and embedded systems are developed.
  5. Developing an embedded software build pipeline – One interesting fact that I’ve noticed about embedded software development is that development processes and techniques tend to lag the general software industry. Developing a more sophisticated build pipeline can have dramatic effects on the embedded software development life cycle.
  6. 8 criteria to evaluate when selecting an RTOS – Real-time operating systems (RTOS) are finding their way into nearly two-thirds of all applications. Cost is a factor. But there are more important things to consider when choosing a real-time operating system.
  7. Old 3G battle shifts to 5G struggle – The old 3G battle between the communication and computational industries has been replaced with the 5G struggle between nations and between the sub-6 GHz and mmWave global spectrum bands.
  8. Internet of Space or Space Junk? – When bad movies make good predictions, and how to lessen the Kessler Syndrome with everything from AI to space harpoons.
  9. Did Edison Really Lose a Non-Existent ‘Current War?’ – The recent movie, The Current War, dramatizes the struggles between Edison, Westinghouse, and Tesla to bring electrical power to the US. But was the “war” actually fabricated?
  10. 3 Do’s and Don’ts for Medical Device Software Development – Medical devices are one of the fastest growing areas of embedded hardware and software development. Here are some successful strategies – and potential pitfalls – gleaned from real-world medical device development projects.
  11. Microorganisms Provide Solar Energy Independent of Using Solar Cells – The concept of solar energy usually inspires images of long rows of solar panels lined up in a vast field. Researchers in Sweden achieved production-potential amounts of butanol using carbon dioxide and sunlight.
  12. Beware the Hidden Costs of a Free Embedded RTOS – If you’re basing your selection of a real-time operating system (RTOS) solely on initial cost, then you may be in for a rude awakening.
  13. 8 RISC-V Companies to Watch – The open source nature of RISC-V promises to enable companies to create custom chip hardware specifically tailored to their products and devices. These eight companies are developing their own RISC-V technologies and are committing to helping third parties do the same to help push adoption of the open-source chip architecture.
  14. New Material Could Transform How Electronics Are Built – A new family of crystal materials can serve a dual purpose in electron movement in electronic devices, potentially changing how they will be designed in the future.
  15. Biocompatible Transistor Invented for New Devices – Researchers have developed what they said is the first biocompatible ion-driven transistor fast enough to enable real-time signal sensing and stimulation of brain signals.
  16. Efficient Fabrication Method Achieved for Nano-Sized Processors – A new rapid fabrication method for nano-scale semiconductors could help advance the design of next-generation processors.
  17. The Biggest Embedded Software Issue Is … – There are many different problems and challenges that embedded software developers are facing today. One of the biggest and least spoken about issues is that too many developers are writing software code without considering what could go wrong.
  18. Smart Manufacturing Expert Says It’s Time to Embrace Fuzziness – Combining fuzzy sensing technologies with artificial intelligence, manufacturers can learn more about their enterprise for less cost.
  19. 2019 Will Be the Year of Open Source – After decades of being looked at as more of a subculture (or arguably counter-culture) in the larger technology landscape, open source is finally getting its due. From software to hardware, we saw more activity in open source than ever before in 2018.

Best AI Stories of 2019
(Image source: Adobe Stock)

We’ve picked our favorite AI-related stories from 2019.

The 10 greatest issues AI needs to face

While we celebrate the positive impacts of artificial intelligence, let’s not forget there’s also a lot to be concerned about.

The Apple Card Is the Most High-Profile Case of AI Bias Yet

Apple Card users have alleged that its credit decision algorithm discriminates against women.

How AI at the Edge Is Defining Next-Generation Hardware Platforms

Moving AI from the cloud to the edge was a big trend in 2019. Chris Cheng, distinguished technologist on the hardware machine learning team at Hewlett Packard, takes a look at some of the latest research being done on AI inference at the edge.

(Image source: OpenAI)

OpenAI’s Robot Hand Taught Itself How to Solve a Rubik’s Cube

Rubik’s Cube Solving Robot Hand Sparks Debate in the AI Community

Using novel neural networks, OpenAI enabled a robotic hand to teach itself how to solve a Rubik’s Cube. But the feat has also sparked a debate among engineers and AI experts on social media.

What’s the State of Emotional AI?

Artificial intelligence that can recognize human emotions – emotional AI – has been gaining momentum. But something’s missing. How long until we’ll see it in our devices and cars?

(Image source: TuSimple)

UPS Has Invested in Autonomous Trucks After Ongoing Tests

TuSimple’s Autonomous Trucks Are Being Tested by the USPS

In 2019, TuSimple entered into partnerships with UPS and the US Postal Service to test self-driving trucks for hauling mail and freight.

The New Raspberry Pi 4 Is All About AI and Embedded IoT

The Raspberry Pi has grown from a hobbyist machine to an IoT developer platform capable of even handling machine learning applications. Here’s our hands-on look.

A Look at the US/China Battle for AI Leadership

The US and China are waging a behind-the-scenes war over who will emerge as the global powerhouse of artificial intelligence. Where do each country’s strengths and weaknesses lie?

There’s a Diversity Crisis in the AI Industry

A lack of racial and gender diversity at the companies creating AI ties closely with issues of bias and racial discrimination in artificial intelligence algorithms, according to a new NYU study.

(Image source: Pixabay)

Can Trump’s New Initiative Make American AI Great Again?

A look at President Trump’s executive order aimed at accelerating America’s lead in artificial intelligence.

AI Could Make Quantum Computers a Reality

New research is examining the use of artificial intelligence to handle the calculations necessary for quantum computers to function.

2019’s 10 Best Books for Engineers and Technologists

Engineers will find something of interest in these selections, from Heaviside and Silicon Valley, to sustainable manufacturing, organs-on-a-chip, and more.

  • Don’t know what to get the engineer in your life? Here’s a mix of easily understood, yet engaging, books combined with a few hardcore technical works. All of these books were published in 2019, except for two older titles that remain worthy of note today.

  • The Forgotten Genius of Oliver Heaviside: A Maverick of Electrical Science

    By: Basil Mahon

    Publisher: Prometheus

    With the release of the film The Current War, it’s easy to forget the contributions of Oliver Heaviside. While the “current war” focused on the competition between Edison, Westinghouse, and Tesla to bring electricity to all of America, Heaviside (a contemporary of Edison and Westinghouse) was focused on electrical engineering technology to help bring mass communication to the country.

    Heaviside gave us the unit step function (remember calculus class?), coaxial cable, and the small coils placed in series with every telephone line to improve the signal by providing inductive loading.
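
For those who don’t remember that calculus class, the unit step function (the Heaviside function) is just a mathematical switch – zero before time zero, one after. In the common convention:

```latex
H(t) =
\begin{cases}
0, & t < 0 \\
1, & t \ge 0
\end{cases}
```

Multiplying a signal by H(t - t_0) models flipping that signal on at time t_0, the idealization at the heart of Heaviside’s operational calculus.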

    From the publisher:

    “This biography of Oliver Heaviside profiles the life of an underappreciated genius and describes his many contributions to electrical science, which proved to be essential to the future of mass communications. Oliver Heaviside (1850–1925) may not be a household name, but he was one of the great pioneers of electrical science: His work led to huge advances in communications and became the bedrock of the subject of electrical engineering as it is taught and practiced today. His achievements include creating the mathematical tools that were to prove essential to the proper understanding and use of electricity, finding a way to rid telephone lines of the distortion that had stifled progress, and showing that electrical power doesn’t flow in a wire but in the space alongside it.

    At first his ideas were thought to be weird, even outrageous, and he had to battle long and hard to get them accepted. Yet by the end of his life he was awarded the first Faraday Medal. This story will restore long-overdue recognition to a scientist whose achievements in many ways were as crucial to our modern age as those of Edison and Tesla.”

  • Make, Think, Imagine: Engineering the Future of Civilization

    By: John Browne

    Publisher: Pegasus Books

    From the publisher:

    “Today’s unprecedented pace of change leaves many people wondering what new technologies are doing to our lives. Has social media robbed us of our privacy and fed us with false information? Are the decisions about our health, security and finances made by computer programs inexplicable and biased? Will these algorithms become so complex that we can no longer control them? Are robots going to take our jobs? Will better health care lead to an aging population which cannot be cared for? Can we provide housing for our ever-growing urban populations? And has our demand for energy driven the Earth’s climate to the edge of catastrophe? John Browne argues that we need not and must not put the brakes on technological advance. Civilization is founded on engineering innovation; all progress stems from the human urge to make things and to shape the world around us, resulting in greater freedom, health and wealth for all. Drawing on history, his own experiences and conversations with many of today’s great innovators, he uncovers the basis for all progress and its consequences, both good and bad. He argues compellingly that the same spark that triggers each innovation can be used to counter its negative consequences. This book provides a blueprint for how we can keep moving towards a brighter future.”

  • The Code: Silicon Valley and the Remaking of America

    By: Margaret O’Mara

    Publisher: Penguin

    Margaret O’Mara worked in the White House of Bill Clinton and Al Gore in the earliest days of the commercial Internet. There she saw firsthand how deeply intertwined Silicon Valley was with the federal government – and always had been – and how shallow the common understanding of the secrets of the Valley’s success actually was.

    In this work, she tells the story of mavericks and visionaries, but also of powerful institutions creating the framework for innovation, from the Pentagon to Stanford University. It is also a story of a community that started off remarkably homogeneous and tight-knit and stayed that way, and whose belief in its own mythology has deepened into a collective hubris that has led to astonishing triumphs as well as devastating second-order effects.

  • The Design of Coffee: An Engineering Approach

    By: William Ristenpart, Tonya Kuhl

    Publisher: CreateSpace Independent Publishing Platform

    Here’s another work that was published a few years ago but is relevant this year for its emphasis on cross-discipline collaboration, a trend noted in the chemistry industry.

    From the publisher:

    “[This book] provides a non-mathematical introduction to chemical engineering, as illustrated by the roasting and brewing of coffee. Hands-on coffee experiments demonstrate key engineering principles, including material balances, chemical kinetics, mass transfer, fluid mechanics, conservation of energy, and colloidal phenomena. The experiments lead to an engineering design competition where contestants strive to make the best tasting coffee using the least amount of energy – a classic engineering optimization problem, but one that is both fun and tasty! 

    Anybody with access to a sink, electricity, and inexpensive coffee roasting and brewing equipment can do these experiments, either as part of a class or with your friends at home. The Design of Coffee will help you understand how to think like an engineer – and how to make excellent coffee!”

  • Human Compatible: AI and the Problem of Control

    By: Stuart Russell

    Publisher: Viking

    From the publisher:

    “Creating superior intelligence would be the biggest event in human history. Unfortunately, according to the world’s pre-eminent AI expert, it could also be the last. In this book on the biggest question facing humanity, the author explains why he has come to consider his own discipline an existential threat to his own species, and lays out how we can change course before it’s too late. There is no one better placed to assess the promise and perils of the dominant technology of the future than Russell, who has spent decades at the forefront of AI research. Through brilliant analogies, he explains how AI actually works, how it has an enormous capacity to improve our lives – but why we must ensure that we never lose control of machines more powerful than we are. Here Russell shows how we can avert the worst threats by reshaping the foundations of AI to guarantee that machines pursue our objectives, not theirs.”

  • Organ-on-a-Chip: Engineered Microenvironments for Safety and Efficacy Testing

    By: Julia Hoeng (Editor), David Bovard (Editor), Manuel Peitsch (Editor)

    Publisher: Academic Press/Elsevier

    From the publisher:

    “[This book] contains chapters from world-leading researchers in the field of organ on a chip development and applications, with perspectives from life sciences, medicine, physiology and engineering. The book details the field, with sections covering the major organ systems and currently available technologies, platforms and methods. As readers may also be interested in creating biochips, materials and engineering best practice, these topics are also described. Users will learn about the limitations of 2D in-vitro models and the available 3D in-vitro models (what benefits they offer and some examples). Finally, the MOC section shows how the organ on a chip technology can be adapted to improve the physiology of in-vitro models.”

  • Sustainable Engineering Products and Manufacturing Technologies

    By: Kaushik Kumar (Editor), Divya Zindani (Editor), J. Paulo Davim (Editor)

    Publisher: Academic Press/Elsevier

    From the publisher:

    “[This book] provides the reader with a detailed look at the latest research into technologies that reduce the environmental impacts of manufacturing. All points where engineering decisions can influence the environmental sustainability of a product are examined, including the sourcing of non-toxic, sustainable raw materials, how to choose manufacturing processes that use energy responsibly and minimize waste, and how to design products to maximize reusability and recyclability. The subject of environmental regulation is also addressed, with references to both the US and EU and the future direction of legislation.

    “Finally, sustainability factors are investigated alongside other product considerations, such as quality, price, manufacturability and functionality, to help readers design processes and products that are economically viable and environmentally friendly.”

  • Introductory Electrical Engineering With Math Explained in Accessible Language

    By: Magno Urbano

    Publisher: Wiley

    From the publisher:

    “[This work] offers a text that explores the basic concepts and principles of electrical engineering. The author explains the underlying mathematics involved in electrical engineering through the use of examples that help with an understanding of the theory. The text contains clear explanations of the mathematical theory that is needed to understand every topic presented, which will aid students in engineering courses who may lack the necessary basic math knowledge.”

    “Designed to break down complex math concepts into understandable terms, the book incorporates several math tricks and techniques, such as matrix determinants and multiplication. The author also explains how certain mathematical formulas are derived. In addition, the text includes tables of integrals and other tables to help, for example, find resistors’ and capacitors’ values. The author provides the accessible language, examples, and images that make the topic accessible and understandable.”

  • What Is Data Engineering?

    By: Lewis Gavin

    Publisher: O’Reilly Media, Inc.

    From the publisher:

    “The demand for data scientists is well-known, but when it comes time to build solutions based on data, your company also needs data engineers—people with strong data warehousing and programming backgrounds. In fact, whether you’re powering self-driving cars or creating music playlists, this field has emerged as one of the most important in modern business. In this report, Lewis Gavin explores key aspects of data engineering and presents a case study from Spotify that demonstrates the tremendous value of this role.”

  • Lithium-Ion Battery Failures in Consumer Electronics

    By: Ashish Arora, Sneha Arun Lele, Noshirwan Medora, Shukri Souri 

    Publisher: Artech House

    From the publisher:

    “This comprehensive resource caters to system designers who are looking to incorporate lithium-ion (Li-ion) batteries in their applications. Detailed discussion of the various system considerations that must be addressed at the design stage to reduce the risk of failures in the field is presented. The book includes technical details of all state-of-the-art Li-ion energy storage subsystems and their requirements and provides a system designer a single resource detailing all of the common issues navigated when using Li-ion batteries to reduce the risk of field failures.

    “The book details the various industry standards that are applicable to the subsystems of Li-ion energy storage systems and how the requirements of these standards may impact the design of their system. Checklists are included to help readers evaluate their own battery system designs and identify gaps in the designs that increase the risk of field failures. The book is packed with numerous examples of issues that have caused field failures and how a proper design/assembly process could have reduced the risk of these failures.”

5 Trends That Will Guide Automotive Technology in 2020

Here are five trends that will be playing a key role in making cars safer and more efficient in the years to come.

  • Auto manufacturers have no option other than to realign their strategies in order to accommodate the looming revolution. Connected and electric cars are already on our roads. And the reality of fully-autonomous cars is coming closer and closer. Technology is helping auto companies to not only modernize their manufacturing processes but also to gather, manage, and analyze data. There’s also tons of data being generated by vehicles themselves. All of this data will soon be the guiding factor for the automotive industry going forward.

    Here are five trends that will be playing a key role in making rides smoother, safer, and more efficient.

  • 1.) Vehicle VPNs and automotive cybersecurity

    We might not quite be there yet, but we are certainly on the verge of fully adopting autonomous vehicles. There has been a lot of talk surrounding self-driving vehicles, especially in regard to their safety and security. But the promise of connected and autonomous vehicles, and vehicle-to-everything (V2X) communication, also opens up new avenues for hackers to attack our cars.

    Virtual Private Networks (VPNs), which allow users to create secure and private connections across even public networks, have been around for some time now. They even allow you to appear online as if you’re in another country. They have been successfully deployed by consumers and businesses as well as in many high-risk cybersecurity situations, including safeguarding government data.

    With the rise of connected vehicles, it is now clear that car owners and manufacturers are going to be adopting VPNs and other cybersecurity solutions to protect their connected and autonomous cars from cybersecurity threats.

    (Image source: Microchip Technology)

  • 2.) Multimodal mobility

    New options like ridesharing, e-scooters, and electric bikes are transforming the way we think about transportation. Powerful tools have made Big Data collection and analysis seamless. When this data is harnessed under a public-private partnership, it starts to bring flexible, multimodal mobility solutions to life. We are already witnessing this partnership change the travel and tourism industry through white-label journey planning apps. Going forward, urban transportation will get more efficient, streamlined, and, in the long run, sustainable thanks to the adoption of multimodal mobility.

    (Image source: VeoRide)

  • 3.) AI that understands drivers and passengers

    Real-time Big Data analysis enables vehicles to recognize user preferences and automatically adjust their settings in order to make rides more comfortable and customized. Image recognition and processing technologies are also being integrated into cars as a way of training vehicles to identify their owners and users without the need for car keys. Systems like the one being developed by Affectiva can even recognize the emotional states of drivers and passengers. Deep learning is already helping fleet operators monitor drivers remotely. Farther into the future, AI and brain-to-vehicle technologies will also be instrumental in the actualization of driverless car technology.

    (Image source: Affectiva)

  • 4.) Vehicle-to-everything (V2X) communication

    Decision making on our roads is now based on real-time, accurate, and well-analyzed data thanks to the Internet of Things (IoT). V2X technology is bringing connected cars to our roads that will have the ability to capture and digest data from other vehicles and infrastructure, and then act upon that data in order to make our roads safer and more efficient. IoT connectivity will allow vehicles to assess the effectiveness of different features such as their braking and steering systems, perform predictive maintenance, and even update their firmware and software without human intervention. Experts agree that V2X will get a big boost from the emergence of 5G as well.

    (Image source: NXP Semiconductors)

  • 5.) More sensors on the road

    Cars are already packed with sensors, and more and more advanced sensors such as LiDAR and even thermal imaging are being built into autonomous cars. But more sensors will also be coming to our roads. Road-scanning systems will use sensors and cameras to scan the road ahead, identifying any possible imperfections or hitches. Smart vehicles will then use that information to adjust their routes accordingly. WaveSense, a Boston-based sensor company, for example, is using ground-penetrating radar to help vehicles map topography.

    (Image source: WaveSense)

As a child, Ariana Merrill loved to figure out how cars worked, and this has translated into her love and passion for mechanical engineering. For the past 12 years, Ariana has been helping communities thrive through careful monitoring and innovation of electrical and mechanical systems. Ariana is also a tech enthusiast living in New Jersey. She is a computer science and engineering graduate who specialized in artificial intelligence, and she loves to write about how AI is reshaping industries.

10 Semi/Electronic Device Tech Reveals from IEEE IEDM 2019

The 2019 IEEE IEDM event reveals latest-node chips, chiplets, memories for AI, the densest thin-film batteries, 400Gbits/s silicon photonics, quantum computing tools and much more.

  • The theme for this year’s 65th IEEE International Electron Devices Meeting (IEDM) was “Innovative Devices for an Era of Connected Intelligence.” As in previous years, major semiconductor players and international research organizations (e.g., imec, CEA-Leti, UC universities and others) presented their latest detailed technologies for processors, memories, interfaces and power devices. Additionally, the event included quantum computing advances, medical uses and other newer areas of application.

    Here are 10 of the major semiconductor “reveals” at the show for 2019.

  • Leading Edge 5nm Chip with Super Dense Memory

    Moore’s Law may be hitting the wall, but it’s not dead yet. TSMC unveiled a complete 5nm technology platform that advanced silicon chip scaling (miniaturization) to the next process node. Reaching the 5nm node milestone was due in part to advances in lithography and improvements in process and packaging techniques.

    TSMC researchers described a 5nm CMOS process optimized for both mobile and high-performance computing. It offered nearly twice the logic density and a 15% speed gain or 30% power reduction over the company’s 7nm process. The process optimization incorporated extensive use of EUV lithography to replace immersion lithography at key points in the manufacturing process.

    TSMC’s 5nm platform also featured FinFETs and high-density SRAM cells. The SRAM could be optimized for low-power or high-performance applications, and the researchers say the high-density version was the highest-density SRAM ever reported. The researchers say high-volume production was targeted for 1H20.

  • Quantum computing 

    Great strides have been made in quantum computing. At the Semicon West/Electronic System Design (ESD) 2019 conference, IBM displayed its IBM Q Experience, a cloud-based quantum computer available for free to anyone with a web browser and an internet connection.

    Creating a quantum computer has been an amazing technological achievement, but like any computer it needs software. Imec – the international Flemish R&D nanoelectronics organization – presented the first step toward developing a systematic approach to the design of quantum computing devices.

    EDA chip design software such as TCAD is necessary to produce highly accurate models of semiconductor devices and their operation. To date, no analogous tools exist to model qubits, the basis of quantum computing, because the field is so new and complex. If these design tools did exist, the development of quantum computers could take place much more quickly.

    The Imec team has taken a step to create such a software framework using multiphysics simulation methods to develop a comprehensive design methodology for qubits built in silicon. They modeled device electrostatics, stress, micro-magnetics, band structure and spin dynamics. Based on the results of these studies, they say that single-electron qubits in quantum dots can be induced and optimized in silicon MOSFETs with thin (<20nm) gate oxides. The researchers also discussed critical aspects of their methodology, the parameters they modeled, and next steps.

  • 3D Chiplets

    Intel presented a novel 3D heterogeneous integration process for chiplet creation. It is seen as an evolution of Moore’s Law, a way to keep the scaling, size and cost benefits continuing into the foreseeable future.

    Chiplets are a type of advanced packaging that offers a different way to integrate multiple dies into a package or system. There are a number of ways to make chiplets, but all use a library of modular chips – like Lego building blocks. These modular chips are assembled in a package that connects them using a die-to-die interconnect scheme.

    There are many other approaches to combining chip dies, e.g., 2.5D integration, in which dies are stacked on top of an interposer. But the hope with a chiplet approach is that it’s a faster and less expensive way to assemble various types of third-party chips like processors, memory, interfaces and the like.

    Here are the details: Intel believes that heterogeneous 3D integration will drive scaling. CMOS technology requires both NMOS and PMOS devices. Intel researchers used 3D sequential stacking architecture to combine these different devices. They first built Si FinFET NMOS transistors on a silicon wafer. On a separate Si wafer they fabricated a single-crystalline Ge film for use as a buffer layer. They flipped the second wafer, bonded it to the first, annealed them both to produce a void-free interface, cleaved the second wafer away except for the Ge layer, and then built gate-all-around (GAA) Ge-channel PMOS devices on top of it. The researchers say these results show that heterogeneous 3D integration is promising for CMOS logic in highly scaled technology nodes.

    This image shows a schematic and a cross-section of a fully processed 3D CMOS transistor structure achieved by this process; in the middle is a thickness contour map of the Ge transfer layer, showing good uniformity; and at right is a 3D cross-sectional view of the completed 3D CMOS chip showing Ge-channel GAA transistors on top of Si FinFET NMOS transistors.

  • AI That Doesn’t Forget

    Embedded STT-MRAM and other non-volatile memories (NVMs) are getting a lot of attention lately. NVM devices retain their memory even after the power is removed. Embedded STT-MRAM is one NVM that shows particular promise in the embedded memory space for cache memory in IoT and AI applications.

    At IEDM 2019, TSMC described a versatile 22nm STT-MRAM technology for AI while Intel talked about STT-MRAMs for use in L4 cache applications.

    In STT-MRAM writing, an electric current is polarized by aligning the spin direction of the electrons flowing through a magnetic tunnel junction (MTJ) element. Data writing is performed by using the spin-polarized current to change the magnetic orientation of the information storage layer in the MTJ element. Intel improved the process and stack for L4 cache applications. STT-MRAM technology for L4 cache requires tighter bitcell pitches, which translate into smaller MTJ sizes and reduced available write current.

  • Organ Forceps With a Special Touch

    Our internal organs are slippery because they’re covered with blood and other body fluids, so grasping and pulling them with forceps can be challenging. Although contact-force sensors have been placed on the tips of forceps used in diagnostic laparoscopic and robotic surgeries, there currently is no way to know if they are slipping, other than visually via a monitor, which has limited usefulness. A Kagawa University team described a highly sensitive slip-sensing imager (sub-mm resolution) and novel algorithm that can, in effect, give forceps a sense of touch. The idea is to use the device to visualize the spatial distribution of the grasping force across the organ’s surface. The center of that distributed load is calculated, and as the forceps are moved the algorithm relates any corresponding movements of the load center to slippage. Built on an SOI wafer, the device’s force-sensor pixels consist of a 20 µm-thick piezoelectric silicon diaphragm (400 µm diameter) with a center contact, and with a force detection circuit integrated on the diaphragm. The diaphragm acts as a strain gauge as it flexes due to varying grasping force.
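
To make that load-center idea concrete, here is a minimal sketch of the concept (an illustration only, not the Kagawa team’s actual algorithm; the array size and slip threshold are hypothetical): compute the force-weighted centroid of the grasping-force map for each frame, and flag slip when the centroid moves more than a threshold between frames.

```python
import numpy as np

def force_centroid(force_map):
    """Center of the distributed grasping load: force-weighted mean pixel position."""
    total = force_map.sum()
    rows, cols = np.indices(force_map.shape)
    return ((rows * force_map).sum() / total,
            (cols * force_map).sum() / total)

def detect_slip(frames, threshold_px=0.5):
    """Yield True whenever the load center moves more than threshold_px between frames."""
    prev = None
    for frame in frames:
        center = force_centroid(frame)
        if prev is not None:
            yield np.hypot(center[0] - prev[0], center[1] - prev[1]) > threshold_px
        prev = center

# Example: a contact patch sliding one pixel per frame across a 32x32 sensor array.
frames = []
for t in range(5):
    f = np.zeros((32, 32))
    f[10 + t, 12:18] = 1.0  # the grasped organ's contact patch, drifting frame to frame
    frames.append(f)
print(list(detect_slip(frames)))  # [True, True, True, True] -- slip on every step
```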

  • Impedance Sensor for Fingerprint Imaging

    Researchers led by Cornell discussed the monolithic integration of a piezoelectric aluminum nitride (AlN) resonator into a CMOS-controlled, GHz ultrasonic impedance sensor/imager. The device measures changes in surface properties such as surface oxidation, materials, liquid viscosity and others, and is meant for use in wearable, IoT and smartphone systems to detect fingerprints with high resolution, determine tissue states, and for other applications. This is the first time monolithic fabrication – all in one chip or die – has been successfully demonstrated, and it led to small, power-efficient GHz sensing arrays with improved performance vs. the standard two-chip heterogeneous integration approach, thanks to less parasitic coupling and a higher signal-to-noise ratio.

  • Thin-Film Battery Goes High-Density

    The miniaturization of power sources hasn’t kept pace with the miniaturization of electronics. Although integrated electrochemical capacitors offer high power density, high frequency response and novel form factors, their low energy densities are of limited value for MEMS and autonomous device applications that require long periods between charging. CEA-Leti researchers discussed a thin-film battery (TFB) with the highest areal energy density yet reported (890 µAh/cm²) and high power density (450 µAh/cm²). Built on silicon wafers using UV photolithography and etching for the successive deposition and patterning of each layer, the thin-film battery integrates a 20 µm-thick LiCoO2 cathode in a Li-free anode configuration. It showed good cycling behavior over 100 cycles, and the fact it was built using a wafer-level process opens up the possibility to tightly integrate this battery technology with future electronic devices.

  • Physically Unclonable Function (PUF) for Mobile and Smart Devices

    The spread of networked mobile devices and smart gadgets in the IoT landscape has created an urgent need to protect them with lightweight and low-power cryptographic solutions. A physically unclonable function (PUF) is a hardware-intrinsic security primitive, or basic programming element. UC Santa Barbara researchers discussed an ultra-low-power PUF that operates on the varying electrical resistances and current leakages that arise from intrinsic process variations in ReRAM crossbar arrays. The team built 4K-ReRAM passive crossbar circuit arrays fabricated with a CMOS-compatible process suitable for back-end-of-the-line (BEOL) integration. The arrays allow for an extremely large number of challenge-response pairs (a common cryptographic protocol), as well as 4x better density vs. other ReRAM architectures plus a ~100x improvement in power efficiency and more robust security metrics.
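
The challenge-response idea itself is easy to model. The toy sketch below is an illustration of the principle, not the UCSB design: each chip’s process variation is represented by a fixed random conductance array, a challenge selects pairs of cells, and each response bit records which cell of the pair conducts more. A verifier that enrolled the genuine chip can later authenticate it, while a clone with different variation fails.

```python
import numpy as np

class ToyPUF:
    """Toy PUF: manufacturing variation modeled as fixed random conductances."""
    def __init__(self, chip_seed, cells=4096):
        # Each physical chip gets its own stable, uncontrollable fingerprint.
        self.g = np.random.default_rng(chip_seed).normal(1.0, 0.05, cells)

    def respond(self, challenge):
        """challenge: (n, 2) array of cell-index pairs -> n-bit response string."""
        bits = self.g[challenge[:, 0]] > self.g[challenge[:, 1]]
        return "".join("1" if b else "0" for b in bits)

# Enrollment: the verifier records challenge-response pairs from the real chip.
rng = np.random.default_rng(0)
challenge = rng.integers(0, 4096, size=(16, 2))   # 16 pairs -> 16-bit response
device = ToyPUF(chip_seed=7)
stored_response = device.respond(challenge)

# Authentication: the genuine chip reproduces the response; a clone almost surely won't.
print(device.respond(challenge) == stored_response)               # True
print(ToyPUF(chip_seed=8).respond(challenge) == stored_response)  # False (w.h.p.)
```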

  • Silicon photonics

    Data races around within data centers at very high speed via optical fiber, using silicon photonic (light-based) interfaces that operate at 100 Gb/s. But cloud data center traffic is growing at nearly 30% per year, and there soon will be a need to increase the data rates. A STMicroelectronics-led team described a new silicon photonics technology platform built on 300mm Silicon-on-Insulator (SOI) wafers, yielding devices that operate at 400Gbits/s (each device has 4 channels, each of which operates at 100Gbits/s, for a total of 400Gbits/s).

    Optical coupling and polarization management are key requirements, and their devices incorporate a 60 GHz high-speed photodiode and a high-speed phase modulator. They also built devices with a supplementary SiN waveguide layer for higher coupling efficiency, to meet evolving data-transmission requirements. The researchers say the photonics platform has the potential to meet the requirements of applications other than data centers, too, such as automotive.

    The image shows the chip-on-board assembly of an analog front-end (AFE) function implemented in a 400G-DR4 optical transceiver using the technology, alongside PAM4 signal eye diagrams at 106 Gbits/s per channel, used to measure high-speed signal quality.

  • 5G and beyond

    One of the challenges for chip makers is how to integrate III-V materials with silicon to make ultra-fast devices for 5G and other uses that are compatible with conventional CMOS technology. III-V compound semiconductors are obtained by combining group III elements (essentially Al, Ga, In) with group V elements (essentially N, P, As, Sb). This gives us 12 possible combinations; the most important ones are probably GaAs, InP, GaP and GaN.
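
That combination count is just 3 × 4; a quick enumeration makes it explicit (the workhorse compounds named above all appear in the list):

```python
from itertools import product

group_iii = ["Al", "Ga", "In"]
group_v = ["N", "P", "As", "Sb"]

# 3 group-III elements x 4 group-V elements = 12 binary III-V compounds
binaries = [a + b for a, b in product(group_iii, group_v)]
print(len(binaries))  # 12
print(binaries)       # includes GaAs, InP, GaP and GaN
```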

    IoT and 5G applications typically use sensors that transmit wireless data to an edge or cloud network. This requires a combination of RF capabilities with a small form factor and low operating power. A promising approach to achieve this combination is to create single chips that combine the capabilities of silicon CMOS with those of III-V devices, such as gallium nitride (GaN) and indium gallium arsenide (InGaAs). The unique properties of III-V compounds make them well suited for optoelectronics (LEDs) and communications (5G).

    At IEDM, Intel described how low-leakage, high-k dielectric enhancement mode GaN NMOS and Si PMOS transistors were built monolithically on a 300mm Si substrate. The goal was to combine GaN’s high-frequency/-temperature/-power attributes with silicon CMOS circuitry’s digital signal processing, logic, memory and analog capabilities, to create compact devices for next-generation solutions for power delivery, RF and system-on-chip (SoC) applications. The researchers say both device types demonstrated excellent performance across a range of electrical specifications.

    III-V materials offer higher electron mobilities than silicon, and HBTs made from them are very fast transistors often used for RF and other high-frequency applications. A key goal is to build them on 300mm silicon wafers instead of other substrates, to take advantage of silicon’s lower manufacturing costs. A team led by imec described how they used a unique nano-ridge engineering technique to build GaAs/InGaP HBTs on a 300mm silicon substrate.

Edge Computing: Key Industrial Automation Trend in 2020

Factory connectivity and communications have become cornerstone technology trends for automation and control engineers in the last 10 years as the development of the Industrial Internet of Things (IIoT) has emerged as a corporate objective. But as we head into 2020, edge computing is evolving into a unifying force for machine designers implementing “computing at the edge” architectures that provide performance and security in a world offering a wide range of communication solutions.

New edge computing architectures are leveraging edge nodes and gateways to connect IoT devices and subsystems with different types of data centers (private, public or hybrid). Edge nodes perform local processing and storage operations. (Image source: Industrial Internet Consortium)

A new white paper from the Industrial Internet Consortium, “The Edge Computing Advantage,” explores not only the business benefits of edge computing but also how it has become a keystone in the IIoT’s evolution in the smart factory.

The authors of the white paper conclude that edge computing has grown steadily as a way to extend the technology of data centers closer to the physical devices within the factory. Cloud computing offers flexibility, scale and the benefits of connecting systems, but those advantages need to be balanced against increased security risks.

Emergence of edge computing paradigm

Given that many industrial facilities have maintained a so-called “airgap” between plants and the Internet (by not being physically connected to the Internet), edge computing has continued to emerge as a natural fit. The benefits: better use of bandwidth on factory networks, reduced latency and variation of data, and the use of local data and computation, which improves privacy, reliability, resiliency and safety.

Along with these practical benefits, edge computing technology itself is providing a flexible approach that uses a fully distributed computing model between IoT devices and layers of edge nodes that provide communications to the data center.

According to the white paper, “the topology of the network enables IoT systems to make use of layers of edge nodes and gateways to interconnect IoT devices and connected subsystems with various types of data centers. The cloud is the ‘highest-order’ resource, and is usually implemented in large, protected data centers. It may be public, private or a hybrid to process and store data for specific vertical applications. Edge nodes perform local processing and storage operations.”

IT and OT convergence

Efficient, reliable and maintainable Industrial IoT data handling presents significant challenges because the data management solutions that exist today have been designed mainly for information technology (IT) applications. A customized solution is required to fill the gap between IT and OT (operations technology) applications.

A wide range of suppliers are providing intelligent IoT gateways to help build seamless data processing solutions that bridge this gap. Gateways are being used to mass-deploy IoT devices in the field, acquire data and route it on-demand to a centralized system, other devices or a remote site. The use of edge nodes along with traditional routers, gateways and firewalls provides both storage and computation capabilities that are distributed across devices, nodes and the data center itself.
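
As a rough sketch of that division of labor (illustrative only; the endpoint URL and the sensor read are hypothetical stand-ins, not drawn from the IIC paper), an edge node might aggregate raw readings locally and push only a compact summary upstream:

```python
import json
import random
import statistics
import time
from urllib import request

CLOUD_ENDPOINT = "https://example.com/ingest"  # hypothetical data-center API

def read_sensor():
    """Stand-in for a fieldbus/OPC UA read from a plant-floor device."""
    return 20.0 + random.gauss(0.0, 0.5)

def edge_cycle(window=60):
    """Process locally at the edge; forward one summary instead of raw samples."""
    samples = [read_sensor() for _ in range(window)]  # raw data stays on the plant network
    summary = {
        "ts": time.time(),
        "mean": statistics.fmean(samples),
        "max": max(samples),
        "alarm": max(samples) > 25.0,  # latency-critical check decided at the edge
    }
    req = request.Request(
        CLOUD_ENDPOINT,
        data=json.dumps(summary).encode(),
        headers={"Content-Type": "application/json"},
    )
    request.urlopen(req)  # one small upload conserves factory bandwidth

if __name__ == "__main__":
    edge_cycle()
```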

Opportunities and challenges

The white paper concludes with a discussion of both the opportunities and challenges that this new computing paradigm is creating. What’s expected in 2020 is a continuation of the “blurred lines from the edge to the data center, as cloud-computing and edge-computing architectural models merge and emerge.”

To read the full IIC white paper, view this PDF.

Al Presher is a veteran contributing writer for Design News, covering automation and control, motion control, power transmission, robotics, and fluid power.

January 28-30: North America’s largest chip, board, and systems event, DesignCon, returns to Silicon Valley for its 25th year! The premier educational conference and technology exhibition, this three-day event brings together the brightest minds across the high-speed communications and semiconductor industries, who are looking to engineer the technology of tomorrow. DesignCon is your rocket to the future. Ready to come aboard? Register to attend!

The 10 Greatest Issues AI Needs to Face
  • There are a lot of reasons to be excited about artificial intelligence. AI is transforming industries in innovative ways and even enabling entirely new business models to emerge.

    But there are also a lot of reasons to be cautious about AI. The 2019 AI Now Report, created by the AI Now Institute, takes a look back on the social impact of AI in 2019, and some of the most important issues surrounding the technology as it moves forward. The AI Now Institute is a non-profit, interdisciplinary research institute “dedicated to understanding the social implications of AI technologies.”

    “This year we saw a wave of pushback, as community groups, researchers, policymakers, and workers demanded a halt to risky and dangerous AI,” the report says.

    As AI moves into the next decade, we’ve outlined some of the most important issues AI will have to grapple with in the coming years.

  • 1.) Algorithmic bias is already affecting us

    As more and more AI algorithms are implemented into decision-making processes in everything from real estate to healthcare, it is important for developers to be aware of the inherent biases within the datasets they use to train AI.

    Apple’s Apple Card service recently came under fire from customers – including Apple co-founder Steve Wozniak – over allegations that the service’s approval system was assigning lower credit limits to female customers.

    Experts agree it will likely be impossible to completely safeguard systems against bias, but steps can be taken to mitigate the impact of bias.

    (Image source: Apple)

  • 2.) Facial recognition is watching us

    Facial recognition is already here and being widely deployed throughout the world. In China, facial recognition technology has become a part of surveillance and security systems and even allows customers to use their faces to access ATMs.

    While there is an argument for convenience and security, there are also widespread privacy and ethics concerns around using AI facial recognition. The city of Detroit is facing pushback over plans to add facial recognition to its Project Green Light – a camera system that allows police departments to monitor businesses and intersections in real time.

    In 2019, the cities of Oakland, Calif., Somerville, Mass., and San Francisco passed ordinances banning municipal use of facial recognition technology.

    By contrast, however, the Department of Homeland Security (DHS) announced that it plans to issue a proposed regulation that could require all travelers, including US citizens, to submit to face and other biometric scans at airports and other ports of entry.

    Regarding the DHS announcement, ACLU Senior Policy Analyst Jay Stanley had this to say:

    “Time and again, the government told the public and members of Congress that US citizens would not be required to submit to this intrusive surveillance technology as a condition of traveling. This new notice suggests that the government is reneging on what was already an insufficient promise.”

    (Image source: teguhjati pras from Pixabay)

  • 3.) Deepfakes are a reality

    If you want to see the power of deepfakes, you only need to browse YouTube channels like Ctrl Shift Face.

    This isn’t a special effect. With enough data (including images and audio), AI algorithms can actually reconstruct and superimpose individuals’ faces onto existing video footage. It makes for some entertaining viral videos, but there are wider, more frightening implications for deepfakes, as they can be used to create fraudulent videos of political figures, celebrities, and even private citizens. Left unchecked, deepfakes could become a powerful tool for the spread of misinformation.

    (Image source: Ctrl Shift Face)

  • 4.) Algorithms are ruining our social media experience

    Have you ever watched one video on YouTube or liked a post on Facebook or other social media only to be sent down a rabbit hole of increasingly questionable recommendations? That’s not an accident – that’s AI trying to predict what you’ll “like.” And by “like” we mean it’s trying to figure out what content you’re most likely to engage with – and that often means offending or shocking you. Algorithmic issues are being blamed for both a rise in the quantity of extremist content on social media as well as its proliferation. Google, Facebook, and others have pledged to search for ways to tamp down on the spread of dangerous and extremist content as well as misinformation. But many would argue the damage has already been done.
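
Mechanically, such a feed is little more than content sorted by a model’s predicted engagement, as in this deliberately crude sketch (illustrative only; no platform’s actual code or weights):

```python
# Each candidate post carries a model-predicted probability of engagement.
posts = [
    {"title": "Measured local news update", "p_engage": 0.04},
    {"title": "Cute animal compilation",    "p_engage": 0.22},
    {"title": "Outrage-bait hot take",      "p_engage": 0.31},  # shock scores highest
]

# The ranker optimizes predicted engagement -- not accuracy or user well-being --
# and every click becomes new training data that reinforces the pattern.
feed = sorted(posts, key=lambda p: p["p_engage"], reverse=True)
print([p["title"] for p in feed])
```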

    (Image source: Pixelkult from Pixabay)

  • 5.) AI is a powerful tool for hacking

    Automation is meant to handle the dirty, dangerous, and repetitive tasks humans can’t or don’t want to perform, right? Well, the benefits go both ways. More and more malicious hackers are leveraging AI technology to assist with sophisticated cybersecurity attacks. A well-trained algorithm can attack a target with a level of speed and efficiency that one hacker, or even a larger group of hackers, would find difficult to match. Fortunately, cybersecurity companies like XM Cyber are fighting fire with fire, using machine learning algorithms to safeguard networks and sensitive systems as well.

    (Image source: XM Cyber)

  • 6.) AI developers lack diversity

    Issues with AI can be correlated with a lack of racial and gender diversity among the engineers and developers being hired at the top technology companies working on AI. The AI Now Institute has found that Black and Latinx workers are substantially underrepresented in the tech workforce, and women are particularly underrepresented as AI researchers.

    “Rather than recognizing the scale and systemic nature of the problem, tech companies have responded to mounting evidence of bias and misuse by primarily focusing on narrow diversity solutions,” the AI Now Institute said. “They have also attempted technical debiasing, working to ‘fix’ algorithms and diversify data sets, even though these approaches have proven insufficient and raise serious privacy and consent concerns. Notably, neither approach addresses underlying structural inequalities.”

    (Image source: PixLoger from Pixabay)

  • 7.) AI isn’t green

    As engineers come to terms with the realities of climate change and the need to develop greener technologies, AI is having its own energy crisis. The massive amount of compute power required for AI also comes with a massive energy bill.

    “As a whole, the industry’s energy dependence is on an exponential trajectory, with best estimates showing that its 2020 global footprint amounts to 3–3.6 percent of global greenhouse emissions, more than double what the sector produced in 2007,” the AI Now Institute said. “This is comparable to that of the aviation industry, and larger than that of Japan, which is the fifth biggest polluter in the world.”

    Tech companies are already implementing renewable energy sources and other means to make data centers more efficient. But the emergence of 5G and other advanced networking technologies only threatens to make the problem worse before it gets better. “In the worst-case scenario, this footprint could increase to 14 percent of global emissions by 2040,” the Institute warned.

    (Image source: Free-Photos from Pixabay)

  • 8.) AI helps privatize public infrastructure

    “Troubling partnerships between government and private tech companies also emerged as a trend this year, especially those that extended surveillance from public environments into private spaces like private properties and the home,” the AI Now Institute said.

    In 2019, the city of Detroit established the “Neighborhood Real-Time Intelligence Program,” a $9 million, state- and federally funded initiative that would expand the city’s Project Green Light surveillance system to 500 intersections, in addition to the 500 businesses where it is already deployed, and add facial recognition technology to the system. The city has reported reduced crime in areas covered by Project Green Light, but that hasn’t stopped privacy advocates from protesting the technology.

    In 2018, Amazon came under fire for offering to let police departments use its facial recognition software. According to the AI Now Institute, the company has also negotiated with more than 700 US police departments to give police access to video from Ring smart home cameras when the footage can help with a criminal investigation.

    (Image source: Pixabay)

  • 9.) Automation impacts people of color and the poor the most

    The debate about automation and labor likely won’t ever stop. But the narrative is taking new shape as more data emerges about the specific groups affected by rapid, AI-driven automation.

    Depending on whom you ask, automation will either be a boon to the economy and personal productivity, or it will usher in a dystopian nightmare in which humans struggle to meet basic needs while robots handle all of the jobs.

    “Both narratives are predicated on the assumption that automation in the workplace is inevitable and that automated systems are capable of performing tasks that had previously been the work of humans. What is missing from both conflicting narratives is the more nuanced prediction of who will be harmed and who will benefit from labor automation in the years to come,” the AI Now Institute said.

    The 2019 AI Now Report predicts that Black, Latinx, and low-wage workers in the US will be disproportionately impacted by increased levels of automation.

    (Image source: mohamed_hassan from Pixabay)

  • 10.) AI is removing the ‘human’ from human resources

    More and more companies are using AI to manage and oversee workers, and AI is even being used in the hiring process. Amazon, for example, uses an AI system to set shifting performance goals for its warehouse workers. Each worker is assigned a productivity “rate” to hit each day, based on their prior performance and the overall goals of the warehouse. A simplified sketch of this kind of rate-setting logic appears at the end of this item.

    “If a worker falls behind, they are subject to disciplinary action. In many warehouses, termination is an automated process (not unlike being “kicked off” a gig-economy platform),” the AI Now Institute said. “According to Abdi Muse, an organizer with Amazon warehouse workers in Minneapolis, if workers fall behind the algorithmically set productivity rate three times in one day, they are fired, however long they may have worked for the company, and irrespective of the personal circumstances that led to their ‘mistakes.’ ”

    “The introduction of AI-enabled labor-management systems raises significant questions about worker rights and safety. The use of these systems—from Amazon warehouses to Uber and InstaCart—pools power and control in the hands of employers and harms mainly low-wage workers (who are disproportionately people of color) by setting productivity targets linked to chronic injuries, psychological stress, and even death and by imposing unpredictable algorithmic wage cuts that undermine economic stability.”

    (Image source: iosphere / Freedigitalphotos.net)
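
    The following is a simplified, hypothetical sketch of the automated rate-setting and three-strikes enforcement the report describes. The function names and numbers are invented for illustration; this is not Amazon’s actual system.

        # Hypothetical sketch of algorithmic labor management: the daily target
        # ratchets up with past output, and repeated misses trigger termination.
        def daily_rate(prior_outputs, warehouse_target):
            # Target is the higher of the worker's own average and the site goal.
            personal_avg = sum(prior_outputs) / len(prior_outputs)
            return max(personal_avg, warehouse_target)

        def review_day(hourly_output, rate):
            misses = sum(1 for units in hourly_output if units < rate)
            # Three misses in one day ends in automated termination.
            return "terminated" if misses >= 3 else "ok"

        rate = daily_rate(prior_outputs=[58, 61, 60], warehouse_target=62)
        print(rate, review_day([63, 59, 61, 60], rate))  # -> 62 terminated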

keynotes-worth-seeing-at-designcon-2020

What do these topics have in common?

  1. The Future of Fiber Optic Communications: Datacenter and Mobile
  2. Design for Security: The Next Frontier of Smart Silicon
  3. Microchips in Space: How Device Design Enables Amazing Astronomy

The answer is that all of them use microchips and microsystems, but in very different ways and for different reasons.

In the first, complex systems-on-chip (SoCs) are integrated with fiber optics to enable dizzyingly fast connections between processors, memory, and interfaces in data centers and mobile devices across the world.

With so much going on in the world of fiber optic communications, it’s important for designers to keep up to date on the basic engineering issues. One catalyst for this interest is growth: the global fiber optics market is predicted to grow from 5 billion USD in 2018 to 9 billion USD by the end of 2025, which, as the quick calculation below shows, implies a compound annual growth rate of nearly 9 percent.
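The implied growth rate falls out of the standard CAGR formula; the short calculation below is simply that arithmetic applied to the forecast figures above.

    # Implied compound annual growth rate (CAGR) from the market forecast:
    # CAGR = (end / start) ** (1 / years) - 1
    start, end, years = 5e9, 9e9, 2025 - 2018  # USD, 2018 -> 2025
    cagr = (end / start) ** (1 / years) - 1
    print(f"Implied CAGR: {cagr:.1%}")  # about 8.8% per year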

In his upcoming keynote at DesignCon 2020, Chris Cole, VP of Advanced Development at II-VI, will discuss past trends and new developments in fiber optics for datacenter and mobile applications. Two ongoing trends are the replacement of copper wiring with fiber optics in the data center and the replacement of direct detection with coherent detection in optical systems.

Cole will also explain the major power and density limitations in communications, along with new technologies like silicon photonics (SiPh) and co-packaging. Silicon photonics is the study of the optical properties of silicon, a group-IV semiconductor, and of how the material can be used to generate, manipulate, and detect light. Silicon is already prevalent in photodetectors and solar cells, among other technologies.

To learn more, visit: The Future of Fiber Optic Communications: Datacenter and Mobile

Image Source: Imec

vote-for-the-2020-engineer-of-the-year

Now is the time to cast your vote for the DesignCon 2020 Engineer of the Year. This award is given out each year during the DesignCon event and seeks to recognize the best of the best in engineering and new product advancements at the chip, board, or system level, with a special emphasis on signal integrity and power integrity.

Editors of Design News and the staff of DesignCon would like to offer hearty congratulations to the finalists. For this year’s award, the winner (or his/her representative) will be able to direct a $1,000 donation to any secondary educational institution in the United States. The details on each nominee are provided below, drawn from their published biographies and from the person(s) who made the nomination. Please cast your vote by following this link.

Voting closes at noon Pacific Time on Friday, December 27. The winner will be announced at DesignCon 2020, January 28-30, at the Santa Clara Convention Center, Santa Clara, CA.

The six finalists for the 2020 DesignCon Engineer of the Year Award are profiled below; each profile includes the finalist’s bio and community activity.

Cast your vote for the 2020 Engineer of the Year by noon PT, December 27.

See the Official Rules of the Engineer of the Year Award

Please click here to learn more about DesignCon and register to attend

Jay Diepenbrock

Consultant, SIRF Consultants LLC

Joseph C. (Jay) Diepenbrock holds an Sc.B. (EE) from Brown University and an MSEE from Syracuse University. He worked in a number of development areas at IBM, including IC, analog, and RF circuit design and backplane design. He then moved to IBM’s Integrated Supply Chain, working on the electrical specification, testing, and modeling of connectors and cables, and was IBM’s subject matter expert on high-speed cables. After a long career at IBM, he joined Lorom America as Senior Vice President, High Speed Engineering, and led the Lorom Signal Integrity team, supporting its high-speed product development. He left Lorom in 2015 and is now a signal integrity consultant with SIRF Consultants, LLC.

The holder of 12 patents, the author of 30 publications, and a recognized expert in SI, Jay is currently the technical editor of the IEEE P370 standard and has worked on numerous other industry standards. He is a Senior Member of the IEEE and was an EMC Society Distinguished Lecturer. Jay has a steadfast commitment to solid engineering and to communicating and teaching about it; he regularly contributes to industry discourse and education at events and in trade publications. He has had a distinguished career in high-speed product development, including backplane design, high-speed connectors and cables, and signal integrity consulting. Beyond that, Jay actively volunteers his time for disaster and humanitarian relief around the world, including as part of the IEEE MOVE truck, which provides emergency communications during and after a disaster. He truly uses his engineering skills to make the world a better place.

Jay is a long-time, active member of the DesignCon Technical Program Committee.

This year at DesignCon, Jay will be presenting the tutorial “Introduction to the IEEE P370 Standard & Its Applications for High Speed Interconnect Characterization” and speaking in the panel “Untangling Standards: The Challenges Inside the Box.”

Cast your vote for the 2020 Engineer of the Year by noon PT, December 27.

Vladimir Dmitriev-Zdorov

Senior Key Expert, EBS Product Development, Mentor, A Siemens Business

Dr. Vladimir Dmitriev-Zdorov has developed a number of advanced models and novel simulation methods used in Mentor products. His current work includes the development of efficient methods of circuit/system simulation in the time and frequency domains, transformation and analysis of multi-port systems, and statistical and time-domain analysis of SERDES links. He received his Ph.D. and D.Sc. degrees (1986, 1998) for his work on circuit and system simulation methods. The results have been published in numerous papers and conference proceedings, including at DesignCon, where several of his papers, such as “BER- and COM-Way of Channel-Compliance Evaluation: What Are the Sources of Differences?” and “A Causal Conductor Roughness Model and Its Effect on Transmission Line Characteristics,” have received the Best Paper Award. He holds 9 patents.

Vladimir is an active member of the DesignCon Technical Program Committee.

This year at DesignCon, Vladimir will be presenting the technical session, “How to Enforce Causality of Standard & “Custom” Metal Roughness Models” and on the panel “Stump the SI/PI Experts.”

Cast your vote for the 2020 Engineer of the Year by noon PT, December 27.

Tim Hollis

Fellow, Micron Technology

Tim Hollis is a distinguished member of the Micron Technology technical staff and an advanced signaling R&D lead. His main focus is identifying and directing forward-looking projects for the SI R&D team to pursue and driving a cross-functional working group intended to provide forward-looking technical guidance to upper management.

Tim has shown outstanding technical leadership in solving numerous challenges involving high-speed DDR memory interfaces, for both computing and graphics applications. He has contributed papers to DesignCon and received a Best Paper Award in 2018 as lead author of “16Gb/s and Beyond with Single-Ended I/O in High-Performance Graphics Memory.” His 85 patents reflect his innovative mind and his prodigious contributions to technology.

Tim received a BS in Electrical Engineering from the University of Utah and a Ph.D. in Electrical Engineering from Brigham Young University.

Cast your vote for the 2020 Engineer of the Year by noon PT, December 27.

Istvan Novak

Principal SI and PI Engineer, Samtec

Istvan Novak is a Principal Signal and Power Integrity Engineer at Samtec, working on advanced signal and power integrity designs. Prior to 2018, he was a Distinguished Engineer at SUN Microsystems, later Oracle, where he worked on new technology development and on advanced power distribution and signal integrity design and validation methodologies for SUN’s successful workgroup server families. He introduced the industry’s first 25um power-ground laminates for large rigid computer boards and worked with component vendors to create a series of low-inductance and controlled-ESR bypass capacitors. He also served as SUN’s representative on the Copper Cable and Connector Workgroup of InfiniBand and was engaged in the methodologies, designs, and characterization of power-distribution networks from silicon to DC-DC converters. He is a Life Fellow of the IEEE with twenty-five patents to his name, is the author of two books on power integrity, teaches signal and power integrity courses, and maintains a popular SI/PI website.

Istvan has in many cases single-handedly helped the test and measurement industry develop completely new instruments and methods of measurement; new VNA types, scope probes, and methodologies are on the market today thanks to his efforts and his openness to helping others. He was responsible for the power distribution and high-speed signal integrity designs of SUN’s V880, V480, V890, V490, V440, T1000, T2000, T5120, and T5220 midrange server families. Last, but not least, Istvan has been a tremendous contributor to the SI List, educating and helping engineers across the world with their SI/PI problems. He is an active member of the DesignCon Technical Program Committee, sharing his expertise by participating in the review of content for multiple tracks. He has been a tutor at the University of Oxford, Oxford, UK for the past 10 years, has been a faculty member at CEI Europe AB since 1991, and served as Vice Dean of Faculty and Associate Professor at the Technical University of Budapest.

At DesignCon 2020, Istvan will be participating in the technical session, “Current Distribution, Resistance & Inductance in Power Connectors,” and the panel, “Stump the SI/PI Experts.”

Cast your vote for the 2020 Engineer of the Year by noon PT, December 27.

Michael Schnecker

Business Development Manager, Rohde & Schwarz

Michael Schnecker’s experience in the test and measurement industry includes applications, sales, and product development, with a specialization in signal integrity applications using oscilloscopes and other instruments. Prior to joining Rohde & Schwarz, Mike held positions at LeCroy and Tektronix. While at LeCroy, he was responsible for the deployment of the SDA series of serial data analyzers.

Mike has more than two decades of experience working with oscilloscope measurements. His background in both the time and frequency domains gives him unique insight into the challenges engineers face when testing high-speed systems for both power and signal integrity, and daily interaction with engineers in the industry has taught him to explain complex measurement science to engineers at any level. He also holds several patents, including methods and apparatus for analyzing serial data streams and for coherent interleaved sampling. Mike is recognized as a thought leader and an exceptional mentor in the signal and power integrity community.

Mike has a BS from Lehigh University and an MS from Georgia Tech, both in electrical engineering. 

This year at DesignCon, Mike will be presenting the tutorial “Signal Integrity: Measurements & Instrumentation” and speaking in the technical session “Real-Time Jitter Analysis Using Hardware Based Clock Recovery & Serial Pattern Trigger.”

Cast your vote for the 2020 Engineer of the Year by noon PT, December 27.

Yuriy Shlepnev

President and Founder, Simberian

Yuriy Shlepnev is President and Founder of Simberian Inc., where he develops the Simbeor electromagnetic signal integrity software. He received an M.S. degree in radio engineering from Novosibirsk State Technical University in 1983 and a Ph.D. degree in computational electromagnetics from the Siberian State University of Telecommunications and Informatics. He was the principal developer of an electromagnetic simulator for Eagleware Corporation and a leading developer of electromagnetic software for the simulation of signal and power distribution networks at Mentor Graphics. The results of his research are published in multiple papers and conference proceedings.

Yuriy conceived and brought to market a state-of-the-art electromagnetic field solver tool suite; he is considered an expert in his field and regularly posts teaching videos. He is a senior member of the IEEE AP, MTT, EMC, and CPMT societies, a Fellow of Kong’s Electromagnetics Academy, and a member of the Applied Computational Electromagnetics Society (ACES).

Yuriy is active in the DesignCon Technical Program Committee and has served as a track co-chair in the past. At DesignCon this year, he will be presenting the tutorial “Design Insights from Electromagnetic Analysis & Measurements of PCB & Packaging Interconnects Operating at 6- to 112-Gbps & Beyond” and speaking in the technical session “Machine Learning Applications for COM Based Simulation of 112Gb Systems.”

Cast your vote for the 2020 Engineer of the Year by noon PT, December 27.

Learn more about DesignCon and register to attend