10-semi-electronic-device-tech-reveals-from-ieee-iedm-2019

2019 IEEE IEDM event reveals latest node chips, chiplets, memories for AI, densest thin-film batteries, 400Gbits/s silicon photonics, quantum computing tools and much more.

  • The theme for this year’s 65th IEEE International Electron Devices Meeting (IEDM) was, “Innovative Devices for an Era of Connected Intelligence.” As in previous years, major semiconductor players and international research organizations (e.g., imec, CEA-Leti, UC universities and others) presented the latest detailed technology for processors, memories, interfaces and power devices. Additionally, the event included quantum computing advances, medical uses and other newer areas of application.

    Here are 10 of the major semiconductor “reveals” at the show for 2019.

  • Leading Edge 5nm Chip with Super Dense Memory

    Moore’s Law may be hitting the wall but it’s not dead yet. TSMC unveiled a complete 5nm technology platform that advanced silicon chip scaling (miniaturization) to the next process node. Reaching the 5nm node milestone was due in part to advances in lithography and improvements in process and packaging techniques.

    TSMC researchers described a 5nm CMOS process optimized for both mobile and high-performance computing. It offered nearly twice the logic density and a 15% speed gain or 30% power reduction over the company’s 7nm process. The process optimization incorporated extensive use of EUV lithography to replace immersion lithography at key points in the manufacturing process.

    TSMC’s 5nm platform also featured FinFETs and high-density SRAM cells. The SRAM could be optimized for low-power or high-performance applications, and the researchers say the high-density version was the highest-density SRAM ever reported. High-volume production was targeted for the first half of 2020.

  • Quantum computing 

    Great strides have been made in quantum computing. At the Semicon West/Electronic System Design (ESD) 2019 conference, IBM displayed its IBM Q Experience, a cloud-based quantum computer available for free to anyone with a web browser and an internet connection.

    Creating a quantum computer has been an amazing technological achievement, but like any computer it needs software. Imec – the international Flemish R&D nanoelectronics organization – presented the first step toward developing a systematic approach to the design of quantum computing devices.

    EDA chip design software such as TCAD is necessary to produce highly accurate models of semiconductor devices and their operation. To date, no analogous tools exist to model qubits, the basis of quantum computing, because the field is so new and complex. If these design tools did exist, the development of quantum computers could take place much more quickly.

    The Imec team has taken a step toward creating such a software framework, using multiphysics simulation methods to develop a comprehensive design methodology for qubits built in silicon. They modeled device electrostatics, stress, micro-magnetics, band structure and spin dynamics. Based on the results of these studies, they say that single-electron qubits in quantum dots can be induced and optimized in silicon MOSFETs with thin (<20nm) gate oxides. The researchers also discussed critical aspects of their methodology, the parameters they modeled, and next steps.
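    To make the spin-dynamics piece of such a methodology concrete, below is a minimal, illustrative sketch (in Python) of one calculation a qubit design flow has to perform: integrating the dynamics of a single electron spin driven on resonance, which yields the Rabi oscillations a designer would use to set gate times. This is a toy model under assumed parameters, not Imec's tool chain.

```python
import numpy as np

# Toy spin-dynamics step for a single-electron spin qubit (illustrative only;
# the 1 MHz Rabi frequency is an assumed drive strength, not a published figure).
hbar = 1.054571817e-34            # reduced Planck constant, J*s
f_rabi = 1.0e6                    # assumed Rabi frequency of the drive, Hz
omega_r = 2 * np.pi * f_rabi

# Rotating-frame Hamiltonian of an on-resonance driven spin: H = (hbar*omega_r/2)*sigma_x
sigma_x = np.array([[0, 1], [1, 0]], dtype=complex)
H = 0.5 * hbar * omega_r * sigma_x

psi = np.array([1.0, 0.0], dtype=complex)   # qubit starts in |0>
dt = 1.0e-9                                  # 1 ns integration step

for n in range(2001):
    if n % 250 == 0:
        print(f"t = {n*dt*1e6:4.2f} us   P(|1>) = {abs(psi[1])**2:.3f}")
    # first-order step of i*hbar*d(psi)/dt = H*psi, renormalized to limit drift
    psi = psi - 1j * dt / hbar * (H @ psi)
    psi /= np.linalg.norm(psi)
```

    Running the loop shows the excited-state probability rising to ~1 after a 0.5 µs pi pulse and returning to ~0 at 1 µs, which is the kind of output a designer would use to choose pulse lengths.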

  • 3D Chiplets

    Intel presented a novel 3D heterogeneous integration process for chiplet creation. It is seen as an evolution of Moore’s Law, a way to keep the scaling, size and cost benefits continuing into the foreseeable future.

    Chiplets are a type of advanced packaging which offers a different way to integrate multiple dies into a package or system. There are a number of ways to make chiplets, but all use a library of modular chips – like Lego building blocks. These modular chips are assembled in a package that connects them using a die-to-die interconnect scheme.

    There are other approaches to combining chip dies, e.g., 2.5D integration, in which dies are stacked on top of an interposer. But the hope with a chiplet approach is that it’s a faster and less expensive way to assemble various types of third-party chips like processors, memory, interfaces and the like.

    Here are the details: Intel believes that heterogeneous 3D integration will drive scaling. CMOS technology requires both NMOS and PMOS devices. Intel researchers used 3D sequential stacking architecture to combine these different devices. They first built Si FinFET NMOS transistors on a silicon wafer. On a separate Si wafer they fabricated a single-crystalline Ge film for use as a buffer layer. They flipped the second wafer, bonded it to the first, annealed them both to produce a void-free interface, cleaved the second wafer away except for the Ge layer, and then built gate-all-around (GAA) Ge-channel PMOS devices on top of it. The researchers say these results show that heterogeneous 3D integration is promising for CMOS logic in highly scaled technology nodes.

    This image shows a schematic and a cross-section of a fully processed 3D CMOS transistor structure achieved by this process; in the middle is a thickness contour map of the Ge transfer layer, showing good uniformity; and at right is a 3D cross-sectional view of the completed 3D CMOS chip showing Ge-channel GAA transistors on top of Si FinFET NMOS transistors.

  • AI That Doesn’t Forget

    Embedded STT-MRAM and other non-volatile memories (NVMs) are getting a lot of attention lately. NVM devices retain their memory even after the power is removed. Embedded STT-MRAM is one NVM that shows particular promise in the embedded memory space for cache memory in IoT and AI applications.

    At IEDM 2019, TSMC described a versatile 22nm STT-MRAM technology for AI while Intel talked about STT-MRAMs for use in L4 cache applications.

    In STT-MRAM writing, an electric current is polarized by aligning the spin direction of the electrons flowing through a magnetic tunnel junction (MTJ) element. Data writing is performed by using the spin-polarized current to change the magnetic orientation of the information storage layer in the MTJ element. Intel improved the process and stack for L4 cache applications. STT-MRAM technology for L4 cache requires tighter bitcell pitches, which translate into smaller MTJ sizes and reduced available write current.
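    The trade-off between MTJ size and write current can be illustrated with a toy bitcell model. The sketch below is purely illustrative: the critical current density, junction areas and drive currents are assumed numbers, not Intel's or TSMC's published device parameters.

```python
from dataclasses import dataclass

# Toy STT-MRAM bitcell: the free layer flips only if the spin-polarized drive
# current exceeds a critical switching current that scales with junction area.
JC = 3.0e6  # assumed critical switching current density, A/cm^2

@dataclass
class MTJ:
    area_cm2: float          # junction area
    state: int = 0           # 0 = parallel (low R), 1 = antiparallel (high R)

    def critical_current(self) -> float:
        return JC * self.area_cm2

    def write(self, bit: int, drive_current: float) -> bool:
        """Attempt to write `bit`; return True if the write succeeded."""
        if drive_current >= self.critical_current():
            self.state = bit
            return True
        return False         # insufficient spin torque: state unchanged

# Shrinking the bitcell pitch for L4 cache shrinks both the MTJ and the
# available write current, so the write margin must be re-balanced.
big   = MTJ(area_cm2=3.0e-11)   # roughly a (55 nm)^2 junction
small = MTJ(area_cm2=1.0e-11)   # roughly a (32 nm)^2, tighter-pitch junction

print(big.write(1, drive_current=100e-6))    # 100 uA vs ~90 uA needed -> True
print(small.write(1, drive_current=20e-6))   # 20 uA vs ~30 uA needed  -> False
```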

  • Organ Forceps With a Special Touch

    Our internal organs are slippery because they’re covered with blood and other body fluids, so grasping and pulling them with forceps can be challenging. Although contact-force sensors have been placed on the tips of forceps used in diagnostic laparoscopic and robotic surgeries, there currently is no way to know if they are slipping, other than visually via a monitor, which has limited usefulness. A Kagawa University team described a highly sensitive slip-sensing imager (sub-mm resolution) and novel algorithm that can, in effect, give forceps a sense of touch. The idea is to use the device to visualize the spatial distribution of the grasping force across the organ’s surface. The center of that distributed load is calculated, and as the forceps are moved the algorithm relates any corresponding movements of the load center to slippage. Built on an SOI wafer, the device’s force-sensor pixels consist of a 20µm–thick piezoelectric silicon diaphragm (400µm diameter) with a center contact, and with a force detection circuit integrated on the diaphragm. The diaphragm acts as a strain gauge as it flexes due to varying grasping force.
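    The load-center idea can be sketched in a few lines of code. The following is an illustration of the concept, not the Kagawa University team's actual algorithm: compute the centroid of the grasping-force image from the sensor pixels, then flag slip when that centroid drifts between frames.

```python
import numpy as np

def load_center(force_map: np.ndarray) -> np.ndarray:
    """Return the (row, col) centroid of a 2D grasping-force image."""
    total = force_map.sum()
    rows, cols = np.indices(force_map.shape)
    return np.array([(rows * force_map).sum() / total,
                     (cols * force_map).sum() / total])

def detect_slip(prev_map, curr_map, threshold_px=0.5):
    """Report slip when the load center moves more than threshold_px pixels."""
    shift = np.linalg.norm(load_center(curr_map) - load_center(prev_map))
    return shift > threshold_px, shift

# Example: a Gaussian-like force blob that drifts one pixel between frames.
y, x = np.mgrid[0:16, 0:16]
frame0 = np.exp(-((x - 8.0)**2 + (y - 8.0)**2) / 8.0)
frame1 = np.exp(-((x - 9.0)**2 + (y - 8.0)**2) / 8.0)

slipping, shift = detect_slip(frame0, frame1)
print(f"load-center shift = {shift:.2f} px, slip detected: {slipping}")
```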

  • Impedance Sensor for Fingerprint Imaging

    Researchers led by Cornell discussed the monolithic integration of a piezoelectric aluminum nitride (AlN) resonator into a CMOS-controlled, GHz ultrasonic impedance sensor/imager. The device measures changes in surface properties such as surface oxidation, materials, liquid viscosity and others, and is meant for use in wearable, IoT and smartphone systems to detect fingerprints with high resolution, determine tissue states, and for other applications. This is the first time monolithic fabrication – all in one chip or die –  has been successfully demonstrated, and it led to small, power-efficient GHz sensing arrays with improved performance vs. the standard two-chip heterogeneous integration approach, thanks to less parasitic coupling and a higher signal-to-noise ratio.

  • Thin-Film Battery Goes High-Density

    The miniaturization of power sources hasn’t kept pace with the miniaturization of electronics. Although integrated electrochemical capacitors offer high power density, high frequency response and novel form factors, their low energy densities are of limited value for MEMS and autonomous device applications that require long periods between charging. CEA-Leti researchers discussed a thin-film battery (TFB) with the highest areal energy density yet reported (890 µAh/cm²) and high power density (450 µAh/cm²). Built on silicon wafers using UV photolithography and etching for the successive deposition and patterning of each layer, the thin-film battery integrates a 20µm-thick LiCoO2 cathode in a Li-free anode configuration. It showed good cycling behavior over 100 cycles, and the fact that it was built using a wafer-level process opens up the possibility of tightly integrating this battery technology with future electronic devices.

  • Physically Unclonable Function (PUF) for Mobile and Smart Devices

    The spread of networked mobile devices and smart gadgets in the IoT landscape has created an urgent need to protect them with lightweight and low-power cryptographic solutions. A physically unclonable function (PUF) is a hardware-intrinsic security primitive, or basic programming element. UC Santa Barbara researchers discussed an ultra-low-power PUF that operates on the varying electrical resistances and current leakages that arise from intrinsic process variations in ReRAM crossbar arrays. The team built 4K-ReRAM passive crossbar circuit arrays fabricated with a CMOS-compatible process suitable for back-end-of-the-line (BEOL) integration. The arrays allow for an extremely large number of challenge-response pairs (a common cryptographic protocol), as well as 4x better density vs. other ReRAM architectures, plus a ~100x improvement in power efficiency and more robust security metrics.
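    To make the challenge-response idea concrete, here is a minimal, hypothetical emulation of a ReRAM-crossbar PUF. The array size, lognormal variation model and thresholding rule are assumptions for illustration, not details of the UCSB device.

```python
import numpy as np

# Toy ReRAM-crossbar PUF: process variation gives every cell a slightly
# different conductance; a challenge selects which rows are biased, and the
# response is the pattern of column currents thresholded against their median.
rng = np.random.default_rng(seed=7)      # stands in for fixed fabrication randomness
N = 64                                   # a 64 x 64 slice of a 4K-cell array
G = rng.lognormal(mean=0.0, sigma=0.3, size=(N, N))   # cell conductances (a.u.)

def response(G: np.ndarray, challenge_bits: np.ndarray) -> np.ndarray:
    """Map an N-bit challenge (which rows are driven) to an N-bit response."""
    col_currents = challenge_bits @ G            # summed column read currents
    return (col_currents > np.median(col_currents)).astype(int)

challenge = rng.integers(0, 2, size=N)
r_device = response(G, challenge)

# A physically different ("cloned") array answers the same challenge differently,
# which is the property that makes the function unclonable.
G_clone = rng.lognormal(mean=0.0, sigma=0.3, size=(N, N))
r_clone = response(G_clone, challenge)
print("response bits differing between device and clone:",
      int((r_device != r_clone).sum()), "of", N)
```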

  • Silicon photonics

    Data races around within data centers at very high speed over optical fiber, using silicon photonic (light-based) interfaces that operate at 100 Gb/s. But cloud data center traffic is growing at nearly 30% per year, and there soon will be a need to increase the data rates. A STMicroelectronics-led team described a new silicon photonics technology platform built on 300mm silicon-on-insulator (SOI) wafers, yielding devices that operate at 400 Gbits/s (each device has four channels, each of which operates at 100 Gbits/s).

    Optical coupling and polarization management are key requirements, and their devices incorporate a 60 GHz high-speed photodiode and a high-speed phase modulator. They also built devices with a supplementary SiN waveguide layer for higher coupling efficiency, to meet evolving data-transmission requirements. The researchers say the photonics platform has the potential to meet the requirements of applications other than data centers, too, such as automotive.

    The image shows the chip-on-board assembly of an analog front-end (AFE) function implemented in a 400G-DR4 optical transceiver using the technology, along with PAM4 signal eye diagrams at 106 Gbits/s per channel, used to measure high-speed signal quality.

  • 5G and beyond

    One of the challenges for chip makers is how to integrate III-V materials with silicon to make ultra-fast devices for 5G and other uses that are compatible with conventional CMOS technology. III-V compound semiconductors are obtained by combining group III elements (essentially Al, Ga, In) with group V elements (essentially N, P, As, Sb). This gives 12 possible combinations; the most important ones are probably GaAs, InP, GaP and GaN.

    IoT and 5G applications typically use sensors that transmit wireless data to an edge or cloud network. This requires a combination of RF capabilities with a small form factor and low operating power. A promising approach to achieve this combination is to create single chips that combine the capabilities of silicon CMOS with those of III-V devices, such as gallium nitride (GaN) and indium gallium arsenide (InGaAs). The unique properties of III-V compounds make them well suited for optoelectronics (LEDs) and communications (5G).

    At IEDM, Intel described how low-leakage, high-k dielectric, enhancement-mode GaN NMOS and Si PMOS transistors were built monolithically on a 300mm Si substrate. The goal was to combine GaN’s high-frequency/-temperature/-power attributes with silicon CMOS circuitry’s digital signal processing, logic, memory and analog capabilities, to create compact devices for next-generation solutions for power delivery, RF and system-on-chip (SoC) applications. The researchers say both device types demonstrated excellent performance across a range of electrical specifications.

    III-V materials offer higher electron mobilities than silicon, and HBTs made from them are very fast transistors often used for RF and other high-frequency applications. A key goal is to build them on 300mm silicon wafers instead of other substrates, to take advantage of silicon’s lower manufacturing costs. A team led by imec described how they used a unique nano-ridge engineering technique to build GaAs/InGaP HBTs on a 300mm silicon substrate.


John Blyler is a Design News senior editor, covering the electronics and advanced manufacturing spaces. With a BS in Engineering Physics and an MS in Electrical Engineering, he has years of hardware-software-network systems experience as an editor and engineer within the advanced manufacturing, IoT and semiconductor industries. John has co-authored books related to system engineering and electronics for IEEE, Wiley, and Elsevier.

keynotes-worth-seeing-at-designcon-2020

What do these topics have in common?

  1. The Future of Fiber Optic Communications: Datacenter and Mobile
  2. Design for Security: The Next Frontier of Smart Silicon
  3. Microchips in Space: How Device Design Enables Amazing Astronomy

The answer is that all use microchips and microsystems but in very different ways and for differing motivations.

In the first one, complex systems-on-chip (SoCs) are integrated with fiber optics to enable dizzyingly fast connections between processors, memory storage, and interfaces in data centers and mobile devices across the world.

With so much going on in the world of fiber optic communications, it’s important for designers to keep up to date with the basic engineering issues. The catalyst for this interest is that the global fiber optics market is predicted to grow from 5 billion USD in 2018 to 9 billion USD by the end of 2025.

In his upcoming keynote at DesignCon 2020, Chris Cole, VP of Advanced Development at II-VI, will discuss past trends and new developments in fiber optics for datacenter and mobile applications. Two ongoing trends are the replacement of copper wires by fiber optics in the data center and the replacement of direct detection by coherent detection in optical systems.

Cole will also explain the major limitations of power and density in communications, and new technologies like Silicon Photonics (SiPh) and co-packaging. Silicon photonics involves the study of optical properties of the group-IV semiconductor and how it can be used to generate, manipulate and detect light. Silicon is prevalent in photodetectors and solar cells, among other technologies.

To learn more, visit: The Future of Fiber Optic Communications: Datacenter and Mobile

vote-for-the-2020-engineer-of-the-year

Now is the time to cast your vote for the DesignCon 2020 Engineer of the Year. This award is given out each year during the DesignCon event and seeks to recognize the best of the best in engineering and new product advancements at the chip, board, or system level, with a special emphasis on signal integrity and power integrity.

Editors of Design News and the staff of DesignCon would like to offer hearty congratulations to the finalists. For this year’s award, the winner (or his/her representative) will be able to direct a $1,000 donation to any secondary educational institution in the United States. The details on each nominee are below as provided in their published biographies and by the person/s who made the nomination. Please cast your vote by following this link.

Voting closes at noon Pacific Time on Friday, December 27. The winner will be announced at DesignCon 2020, January 28-30, at the Santa Clara Convention Center, Santa Clara, CA.

The six finalists for the 2020 DesignCon Engineer of the Year Award are profiled below, with each finalist’s bio and community activity:

Cast your vote for the 2020 Engineer of the Year by noon PT, December 27.

See the Official Rules of the Engineer of the Year Award

Please click here to learn more about DesignCon and register to attend

Jay Diepenbrock

Consultant, SIRF Consultants LLC

Joseph C. (Jay) Diepenbrock holds an Sc.B. (EE) from Brown University and an MSEE from Syracuse University. He worked in a number of development areas in IBM, including IC, analog and RF circuit, and backplane design. He then moved to IBM’s Integrated Supply Chain, working on the electrical specification, testing, and modeling of connectors and cables, and was IBM’s subject matter expert on high-speed cables. After a long career at IBM, he joined Lorom America as Senior Vice President, High Speed Engineering, and led the Lorom Signal Integrity team, supporting its high-speed product development. He left Lorom in 2015 and is now a signal integrity consultant with SIRF Consultants, LLC.

With 12 patents and 30 publications to his name, and recognized as an expert in SI, Jay is currently the technical editor of the IEEE P370 standard and has worked on numerous other industry standards. He is a Senior Member of the IEEE and was an EMC Society Distinguished Lecturer. Jay has a steadfast commitment to solid engineering and to communicating and teaching about it. He regularly contributes to industry discourse and education at events and in trade publications. He has made a distinguished career in high-speed product development, including backplane design, high-speed connectors and cables, and signal integrity consulting. Beyond that, Jay actively volunteers his time for disaster and humanitarian relief around the world, including being part of the IEEE MOVE truck, which provides emergency communications during and after a disaster. He truly uses his engineering skills to make the world a better place.

Jay is a long-time, active member of the DesignCon Technical Program Committee.

This year at DesignCon, Jay will be presenting the tutorial “Introduction to the IEEE P370 Standard & Its Applications for High Speed Interconnect Characterization” and speaking in the panel “Untangling Standards: The Challenges Inside the Box.”

Cast your vote for the 2020 Engineer of the Year by noon PT, December 27.

Vladimir Dmitriev-Zdorov

Senior Key Expert, EBS Product Development, Mentor, A Siemens Business

Dr. Vladimir Dmitriev-Zdorov has developed a number of advanced models and novel simulation methods used in Mentor products. His current work includes development of efficient methods of circuit/system simulation in the time and frequency domains, transformation and analysis of multi-port systems, and statistical and time-domain analysis of SERDES links. He received his Ph.D. and D.Sc. degrees (1986, 1998) based on his work on circuit and system simulation methods. The results have been published in numerous papers and conference proceedings, including DesignCon. Several of his DesignCon papers, such as “BER- and COM-Way of Channel-Compliance Evaluation: What are the Sources of Differences?” and “A Causal Conductor Roughness Model and its Effect on Transmission Line Characteristics,” have received the Best Paper Award. He holds 9 patents.

Vladimir is an active member of the DesignCon Technical Program Committee.

This year at DesignCon, Vladimir will be presenting the technical session, “How to Enforce Causality of Standard & “Custom” Metal Roughness Models” and on the panel “Stump the SI/PI Experts.”

Cast your vote for the 2020 Engineer of the Year by noon PT, December 27.

Tim Hollis

Fellow, Micron Technology

Tim Hollis is a distinguished member of the Micron Technology technical staff and an advanced signaling R&D lead. His main focus is identifying and directing forward-looking projects for the SI R&D team to pursue, and driving a cross-functional working group intended to provide forward-looking technical guidance to upper management.

Tim has shown outstanding technical leadership in solving numerous challenges with regard to high-speed DDR memory interfaces, for both computing and graphics applications. He has contributed papers to DesignCon and received a Best Paper Award in 2018 as lead author for “16Gb/s and Beyond with Single-Ended I/O in High-Performance Graphics Memory.” His 85 patents reflect his innovative mind and his prodigious contributions to technology.

Tim received a BS in Electrical Engineering from the University of Utah and a Ph.D. in Electrical Engineering from Brigham Young University.

Cast your vote for the 2020 Engineer of the Year by noon PT, December 27.

Istvan Novak

Principal SI and PI Engineer, Samtec

Istvan Novak is a Principal Signal and Power Integrity Engineer at Samtec, working on advanced signal and power integrity designs. Prior to 2018 he was a Distinguished Engineer at Sun Microsystems, later Oracle. He worked on new technology development, advanced power distribution and signal integrity design and validation methodologies for Sun’s successful workgroup server families. He introduced the industry’s first 25µm power-ground laminates for large rigid computer boards, and worked with component vendors to create a series of low-inductance and controlled-ESR bypass capacitors. He also served as Sun’s representative on the Copper Cable and Connector Workgroup of InfiniBand, and was engaged in the methodologies, designs and characterization of power-distribution networks from silicon to DC-DC converters. He is a Life Fellow of the IEEE with twenty-five patents to his name, author of two books on power integrity, teaches signal and power integrity courses, and maintains a popular SI/PI website.

Istvan has in many cases single-handedly helped the test and measurement industry develop completely new instruments and methods of measurement. New VNA types, scope probes, and methodologies are on the market today thanks to Istvan’s efforts and openness to help others. He was responsible for the power distribution and high-speed signal integrity designs of Sun’s V880, V480, V890, V490, V440, T1000, T2000, T5120 and T5220 midrange server families. Last, but not least, Istvan has been a tremendous contributor to the SI List, educating and helping engineers across the world with their SI/PI problems. Istvan is an active member of the DesignCon Technical Program Committee, sharing his expertise by participating in the review of content for multiple tracks. He is an IEEE Fellow and has been a tutor at the University of Oxford, UK for the past 10 years. He has also been a faculty member at CEI Europe AB since 1991 and served as Vice Dean of Faculty and Associate Professor at the Technical University of Budapest.

At DesignCon 2020, Istvan will be participating in the technical session, “Current Distribution, Resistance & Inductance in Power Connectors,” and the panel, “Stump the SI/PI Experts.”

Cast your vote for the 2020 Engineer of the Year by noon PT, December 27.

Michael Schnecker

Business Development Manager, Rohde & Schwarz

Michael Schnecker’s experience in the test and measurement industry includes applications, sales and product development, with a specialization in signal integrity applications using oscilloscopes and other instruments. Prior to joining Rohde & Schwarz, Mike held positions at LeCroy and Tektronix. While at LeCroy, he was responsible for the deployment of the SDA series of serial data analyzers.

Mike has more than two decades of experience working with oscilloscope measurements. His background in time and frequency domains provides him with unique insight into the challenges engineers face when testing high-speed systems for both power and signal integrity. Interacting with engineers in the industry daily has allowed Mike to master the ability to explain complex measurement science to engineers at any level. He also holds several patents, including methods and apparatus for analyzing serial data streams as well as coherent interleaved sampling. Thus, Mike is recognized as a thought leader and exceptional mentor in the signal and power integrity community.

Mike has a BS from Lehigh University and an MS from Georgia Tech, both in electrical engineering. 

This year at DesignCon, Mike will be presenting the tutorial “Signal Integrity: Measurements & Instrumentation” and speaking in the technical session “Real-Time Jitter Analysis Using Hardware Based Clock Recovery & Serial Pattern Trigger.”

Cast your vote for the 2020 Engineer of the Year by noon PT, December 27.

Yuriy Shlepnev

President and Founder, Simberian

Yuriy Shlepnev is President and Founder of Simberian Inc., where he develops Simbeor electromagnetic signal integrity software. He received his M.S. degree in radio engineering from Novosibirsk State Technical University in 1983, and his Ph.D. degree in computational electromagnetics from the Siberian State University of Telecommunications and Informatics. He was the principal developer of an electromagnetic simulator for Eagleware Corporation and a leading developer of electromagnetic software for simulation of signal and power distribution networks at Mentor Graphics. The results of his research are published in multiple papers and conference proceedings.

Yuriy conceived and brought to market a state-of-the-art electromagnetic field solver tool suite, is considered an expert in his field, and regularly posts teaching videos. He is a senior member of the IEEE AP, MTT, EMC, and CPMT societies. He is also a Fellow of Kong’s Electromagnetics Academy and a member of the Applied Computational Electromagnetics Society (ACES).

Yuriy is active in the Technical Program Committee for DesignCon and has served as a track co-chair in the past. At DesignCon this year he will be presenting the tutorial “Design Insights from Electromagnetic Analysis & Measurements of PCB & Packaging Interconnects Operating at 6- to 112-Gbps & Beyond” and speaking in the technical session “Machine Learning Applications for COM Based Simulation of 112Gb Systems.”

Cast your vote for the 2020 Engineer of the Year by noon PT, December 27.

Learn more about DesignCon and register to attend

the-9-most-disruptive-tech-trends-of-2019

What were the breakthrough technologies for 2019? The answer depends on who you ask. Several common themes have emerged, such as cobots, emerging energy sources, AI, and cybersecurity breaches. Let’s consider each in more detail.

1.) Robotics – collaborative robots (or cobots)

(Image source: OpenAI and Dactyl)

Remember Dum-E (short for dummy) from the first Iron Man movie? Dum-E was a cobot that helped Tony Stark create his flying robotic suit. It was a scaled-down, more human, interactive version of the traditional industrial-grade manufacturing-line arm robots.

Cobots are designed to work collaboratively alongside humans with a gentle touch, i.e., to not smash fingers or step on the toes of their work buddies. Doing so requires that cobots be much more aware of their location in relation to humans, via sensing and perception technologies. To achieve this goal, one company, Veo Robotics, uses a variety of 3D sensors placed around the robot’s workcell to aid in location awareness. The company’s sensors add an extra measure of safety by automatically slowing down the movement of the industrial cobots whenever a human co-worker comes close.
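A hedged sketch of that slow-down logic is below (the distances, thresholds and linear ramp are assumptions for illustration, not Veo Robotics’ actual parameters or code): the controller finds the closest sensed point to the robot and scales the commanded speed accordingly.

```python
import numpy as np

def speed_scale(points_m: np.ndarray, robot_xyz: np.ndarray,
                stop_dist=0.5, full_speed_dist=2.0) -> float:
    """Return a 0..1 speed multiplier from the nearest detected point (meters)."""
    if points_m.size == 0:
        return 1.0                                   # nothing sensed: full speed
    d_min = np.linalg.norm(points_m - robot_xyz, axis=1).min()
    if d_min <= stop_dist:
        return 0.0                                   # protective stop
    if d_min >= full_speed_dist:
        return 1.0
    return (d_min - stop_dist) / (full_speed_dist - stop_dist)  # linear ramp

robot = np.array([0.0, 0.0, 1.0])
nearby_person = np.array([[1.2, 0.3, 1.1], [1.3, 0.2, 0.9]])    # assumed sensor returns
print(f"speed multiplier: {speed_scale(nearby_person, robot):.2f}")
```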

To help supplement actual human activity, cobots are becoming more dexterous and moving beyond merely picking components on an assembly line. Robots need greater dexterity to pick up objects that have moved even slightly beyond their programmed parameters. Cobots cannot yet grasp any object just by looking at it, but they can now learn to manipulate an object on their own. 

OpenAI, a nonprofit company, recently introduced Dactyl, a dexterous robotic arm that taught itself to flip a toy building block in its fingers. Dactyl uses neural network software to learn how to grasp and turn the block within a simulated environment before the hand tries it out for real. According to the company, they’ve been able to train neural networks to solve the Rubik’s Cube Problem using reinforcement learning and Kociemba’s algorithm for picking the solution steps.
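Purely as an illustration of the division of labor described above – a classical, Kociemba-style planner choosing the face turns, and a learned policy executing each turn with the hand – here is a stub-level sketch. Both pieces are stand-ins; this is not OpenAI’s code or API.

```python
from typing import List

def plan_moves(cube_state: str) -> List[str]:
    """Stand-in for a Kociemba-style two-phase solver that maps a cube state
    to a face-turn sequence; a real planner returns moves like R, U', F2."""
    return ["R", "U'", "F2"]           # hypothetical output for a scrambled cube

def execute_move(move: str) -> bool:
    """Stand-in for the learned manipulation policy, which in the real system
    turns camera observations into finger-joint commands for the robot hand."""
    print(f"hand executes {move}")
    return True                        # pretend the physical turn succeeded

def solve(cube_state: str) -> None:
    # The planner supplies the solution steps; the learned policy executes them.
    for move in plan_moves(cube_state):
        if not execute_move(move):
            break                      # a real system would re-observe and re-plan

solve("scrambled-cube-state")
```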

magic-leap-is-hoping-enterprise-will-be-its-salvation
Magic Leap is rolling out a suite of solutions targeted at enterprise users. But is it too little too late? (Image source: Magic Leap)

When the Magic Leap One first rolled out, we wondered if the headset would be better suited to enterprise applications, rather than consumers. Now it looks like the company is finally ready to see if engineers and designers will embrace its product.

Thinking of all the hype that once surrounded Magic Leap compared to where the company is now, it’s easy to recall the scene of Obi-Wan confronting Anakin in Revenge of the Sith. Magic Leap was supposed to be the chosen one – the company that would make a quantum leap in extended reality (XR) technology.

What has emerged in the year since the “spatial computing” company released its flagship headset – the Magic Leap One – is less of a vision of a bold, new future and more of an emerging cautionary tale.

Magic Leap has announced it will now be offering a suite of services and applications targeted specifically at enterprise, as well as a new headset to go with it, the Magic Leap 1 – an updated version of its Magic Leap One Creator Edition.

“Today’s announcement heralds the arrival of a new chapter for spatial computing with an advanced technology platform for enterprises across all industry sectors,” Omar Khan, chief product officer at Magic Leap, said in a press statement. “Our innovative partners are leading the charge by developing groundbreaking solutions that will transform their businesses and customer experiences. Together, we are rewriting the rules of business with spatial solutions that will yield greater efficiencies, deeper engagement, and significant new business opportunities for all stakeholders.”

This new chapter that Khan speaks of looks less like a new innovation and more like a re-branding. For the price of $2,995, a couple hundred more dollars than the headset alone, Magic Leap’s Enterprise Suite offers customers the Magic Leap 1 along with access to Device Manager – a cloud-based support and metric analytics system. However, there are no significant new hardware upgrades to the Magic Leap 1, nor any features that might pull potential customers away from other options like the HoloLens or convince skeptics to add AR to their workflow.

Though it refers to its technology as spatial computing (a term you’d think NASA would have adopted decades ago if it really meant anything), Magic Leap is offering the expected benefits associated with augmented reality for enterprise: digital twin and 3D visualizations; education and training; and remote collaboration. None of that is at all bad in and of itself, but you have to ask if Magic Leap is late to the party at this point.

The company has however managed to secure a good number of partners for its enterprise venture. Big names like McLaren, JetBlue Travel, Deloitte, and NTT DOCOMO, among others, have already “committed to bringing spatial computing to their companies and customers,” according to Magic Leap. There’s even a hyperloop company, Hyperloop TT, committed to using Magic Leap for remote demonstrations of its transportation technology.

Magic Leap is also touting a healthy ecosystem of enterprise app developers including Across Realities, Arvizio, Eon Reality, Immersion Analytics, and PTC, that will be rolling out apps for everything from design and virtual prototyping to remote collaboration in the coming months.

Look before you leap

This latest announcement from Magic Leap could excite engineers and developers who have been itching to get their hands on the headset for applications outside of entertainment. However, it also comes at an embattled time for the company.

A recent report by The Information paints Magic Leap less as the next Microsoft or Apple and more like the next WeWork or Theranos (though such a comparison is a tad unfair given Magic Leap has released an actual, working product).

Upon its initial release, Magic Leap’s CEO, Rony Abovitz, aimed for the company to sell one million headset units in its first year. That number was later significantly downgraded to 100,000 by the company’s more conservative executives. The latest sales figures, according to The Information, reveal the company has only sold about 6,000 units to date.

Magic Leap has responded to The Information piece in a statement, calling the article “clickbait.”

“The Information’s reporting is littered with inaccuracies and misleading statements, and erroneously portrays Magic Leap’s operations, internal plans, and overall strategy,” a company statement released to GamesIndustry.biz said.

Magic Leap did not return Design News’ request for further comment.

Where’s the magic?

What’s most surprising is that Magic Leap didn’t target its hardware at enterprise sooner – as in, upon its initial release (or even before). The company heavily marketed itself as a next-wave entertainment product. But the hype fizzled when reports emerged that the company’s hardware was nothing revolutionary and was more in line with current market trends. Early demo videos of Magic Leap’s technology were also discovered to have been created by special FX houses, and not on the company’s actual hardware.

A lightweight headset with limited head tracking ability seems like a much better fit for engineers and designers working in 3D CAD than for a gamer looking to play an action-packed first person shooter in their living room.

Magic Leap begged to differ. Outside of a partnership with CAD software company OnShape, the company really had no enterprise-focused content for the Magic Leap One on its initial release.

Did we mention the headset also costs $2,295? Surely there are early adopters willing to pay that price, but any savvy consumer knows they can put together a respectable VR PC rig with a headset from Oculus, HTC, or HP for almost half that price – and enjoy a healthy library of gaming content to boot.

Now, if the disappointing sales figures are to be believed, Magic Leap has found itself at a crossroads. It is not the consumer darling it pledged itself to be, but it also has a lot of ground to gain in enterprise with companies like Microsoft, Google, Vuzix, HTC, HP, and ThirdEye Gen having already offered enterprise AR, VR, and mixed reality (MR) hardware for a while now.

The company is also falling behind on the hardware front. Magic Leap said its second-generation headset – the Magic Leap Two – will offer new features like 5G connectivity. But insiders have speculated that it is years away from releasing a new headset. Meanwhile Microsoft’s Hololens 2 is available in limited release and is expected to go wide next year. And Qualcomm has already announced a 2020 release for its new XR2 platform for developing AR and MR hardware, which will offer 5G capability. Niantic, the company behind Pokemon Go, is reportedly working on its own AR glasses based on the XR2.

Magic Leap’s consumer ambitions were not misguided. There is still no singular AR product that has taken over the consumer market. What Magic Leap has been aiming to do was become a general-purpose AR headset – to do for augmented and extended reality what consoles like the Sony Playstation or Nintendo Switch do for gaming. More broadly, the Magic Leap One is meant to spark an entirely new device category along the lines of the PC or smartphone.

But maybe consumers don’t want that product? So far the most successful consumer deployments of AR have come on the software end – where games like Pokemon Go have leveraged existing smartphone hardware. On the enterprise end even offerings like ABB’s Ability Remote Insights services remain hardware agnostic.

More and more, the market seems to be saying that customers want AR for specific, niche applications (like enterprise).

In another example of niche demand, earlier this year, Tilt Five, a startup founded by Jeri Ellsworth, an AR entrepreneur and former R&D engineer at Valve, launched a successful Kickstarter campaign for its augmented reality system targeted solely at holographic table top gaming.

While a system like Tilt Five’s doesn’t offer the horsepower of something like the Magic Leap 1, it does offer another attractive incentive for consumers – a price point expected to be around $300-350. For the price of one Magic Leap system you can outfit your entire immediate family with Tilt Five glasses.

The AR market is still in flux – both in enterprise and consumer – with no one big name leading the pack yet. AR technology itself also still has some technical issues to iron out – most notably optics-related issues. With the right partnerships in place, Magic Leap could establish a firm enough foothold to keep itself afloat long enough to course correct and release a next-generation hardware platform. But time isn’t on Magic Leap’s side – particularly in an increasingly crowded enterprise space and with the company having already fallen short of so many lofty promises.

Chris Wiltz is a Senior Editor at  Design News covering emerging technologies including AI, VR/AR, blockchain, and robotics.


January 28-30: North America’s largest chip, board, and systems event, DesignCon, returns to Silicon Valley for its 25th year! The premier educational conference and technology exhibition, this three-day event brings together the brightest minds across the high-speed communications and semiconductor industries, who are looking to engineer the technology of tomorrow. DesignCon is your rocket to the future. Ready to come aboard? 

Register to attend!

who’s-left-to-make-chip-development-tools?

Here’s a look at the remaining major EDA tool companies after years of consolidation.

  • The EDA market continues to consolidate. At the 2019 Design Automation Conference (DAC), Rich Valera from Needham and Company noted that since the collapse of the tech bubble in the early 2000s, the EDA market has been all about consolidation.

    “Many larger-scale private companies, including multiple ‘near IPOs’ – e.g., Denali, Tensilica, Apache, and Spyglass (Atrenta) – have been bought before going public in the last 15 years,” explained Valera. “It goes without saying that the EDA industry has become very concentrated – one could argue an oligopoly – with most of the revenue driven by three major companies.”

    And that list does not include many of the more recent consolidations:

    • Cadence’s acquisition of AWR from National Instruments
    • Inphi Corp’s acquisition of the majority of eSilicon
    • Synopsys’ acquisition of the DINI Group, QTronic GmbH and certain assets of eSilicon
    • Dassault Systèmes’ acquisition of CST

    The number of private EDA startup company exits through acquisitions or going public (IPOs) has been declining, which is probably attributable to fewer companies being formed. Additionally, the time to exit for startup EDA companies has generally been well over 10 years. This is a long time for most startups and their investors, which may explain the modest amount of venture capital funding flowing into EDA.

  • It would seem that the main EDA tool vendors – Synopsys, Cadence and Mentor Graphics (recently acquired by Siemens PLM) – have formed an oligopoly. According to Valera, one might expect to see less competition, reduced investment and a push to maximize profits – say, as opposed to growing into new markets. This has not been the case. Rather, the combined Cadence/Synopsys research and development (R&D) budget has been on a generally upward trend over the last 10 years, which is a positive sign for job growth.

    The three major EDA companies have realized healthy growth thanks to their movement into new application areas like autonomous vehicle electronics, and to ongoing advancement and roll-outs in industrial and commercial IoT, AI and edge-cloud computing.

    What about the other EDA tool vendors? According to Crunchhub, there are 132 organizations listed as semiconductor EDA companies, not including fabs like TSMC and OEMs like Intel. But we don’t need to consider all EDA companies to understand what makes up this industry. Instead, let’s consider the top 8 EDA tool providers.

  • Synopsys

    In 1986, a small synthesis startup called Optimal Solutions was created by a team of engineers from GE Microelectronics Center in Research Triangle Park, N.C. The team included Dr. de Geus, who would later become the CEO. Shortly thereafter, the company moved to Mountain View, Calif., to become Synopsys (for SYNthesis and OPtimization SYStems). Their first task was to focus on commercializing an automated logic synthesis “Design Compiler” tool. Today, Synopsys has a suite of chip design and verification tools plus verification intellectual property (IP).

    One of the significant announcements from Synopsys in 2019 was the completion of its acquisition of the DINI Group, an FPGA-based boards and solutions company. SoC designers are deploying FPGA-based prototyping platforms to enable rapid software development in automotive, artificial intelligence (AI), 5G, and high-performance computing (HPC) applications.

    DINI’s FPGA boards are frequently used to create a complete logic prototyping system that can emulate up to 130 million ASIC gates with over 20 FPGAs.

  • Cadence Design Systems

    Two small startups that emerged in the early 1980s – Solomon Design Automation and ECAD – grew and merged to form Cadence Design Systems in 1988. Shortly thereafter, Cadence bought Gateway Design Automation, a developer of the Verilog hardware description language. A year later Cadence put Verilog into the public domain, and it became the most widely used hardware description language. In the ensuing years, Cadence pushed into the custom/analog design automation tool market and later into IC layout automation.

    Today, Cadence offers a broad portfolio of tools to address an array of challenges related to custom IC / Analog / RF Design, digital, IC package, and PCB design and system-level verification.

    One of the more interesting announcements in 2019 was the introduction of a complete electrical-thermal co-simulation solution spanning ICs to physical enclosures. The thermal solver is integrated with the company’s IC, package and PCB implementation platforms. Design insights from the solver help design teams detect and mitigate thermal issues early in the design process, thus reducing electronic system development iterations.

  • Mentor Graphics (A Siemens PLM Company)

    Mentor Graphics was founded in 1981 by a small group of engineers in Oregon. All had left Tektronix to form Mentor Graphics, one of the first commercial EDA companies, along with Daisy Systems and Valid Logic Systems. Mentor Graphics was also the first EDA company that had its software run on a non-proprietary hardware system, i.e., the Apollo Computer workstations.

    Today, the company offers chip design, PCB design, systems, automotive, CAE Simulation and Test and Embedded tools. Mentor is involved in EDA, printed circuit board and system-of-system level design.

    One of the announcements this year was in the area of high-level synthesis (HLS) for edge computing networks. The challenge is that moving machine learning to the edge imposes critical requirements on power and performance. Off-the-shelf solutions like CPUs or GPUs are, respectively, too slow or too expensive, and even generic machine learning accelerators can be overbuilt and are not optimal for power. That’s why HLS tools can help create new power/memory-efficient hardware architectures to meet machine learning hardware demands at the edge.

  • ANSYS

    Ansys was founded in 1970 by John Swanson. In 1996, the company went public. During the next five years, Ansys made numerous acquisitions to gain additional technology for fluid dynamics, electronics design, and other physics analysis.

    The company develops and markets engineering simulation software used to design products and semiconductors, as well as to create simulations that test a product’s durability, temperature distribution, fluid movements, and electromagnetic properties.

    As an example of the company’s simulation capabilities, TURBOTECH is using Ansys fluids tools to potentially redesign aeronautical propulsion. TURBOTECH is developing an energy storage system capable of powering the hybrid-electric aircraft of the future. The idea is to develop regenerative-cycle turbogenerators based on small turbines that recover energy from exhaust gases to reduce fuel consumption. By recharging batteries in flight, the turbogenerators are claimed to improve the endurance of electric aircraft by 10x – enabling significant weight and cost savings. The turbogenerators can produce electricity from virtually any type of renewable flammable material, including biofuel, biogas, hydrogen and conventional fuels.

  • Keysight Technologies

    Keysight Technologies can trace its origins back to the original Hewlett-Packard business founded in 1939 by Bill Hewlett and Dave Packard. In 1999, HP spun off Agilent Technologies. In 2014, Agilent spun off Keysight Technologies as a wireless, semiconductor and aerospace test and measurement company.

    Significant news in 2019 includes the partnership with Marvin Test Solutions to develop advanced beamformer integrated circuit (IC) test technology to accelerate the production of high-performance 5G chips and test associated mmWave antenna systems. To ensure reliable and efficient 5G mmWave communications, the performance of the critical elements that form part of the beamformer chips needs to be rigorously tested under linear and nonlinear conditions.

    Also noteworthy is the company’s simulation software that is being used for rapid development, integration and test of sophisticated electronic warfare (EW) systems with real-time RF modeling. Software and hardware simulation systems are needed so engineers can test their EW designs by easily generating specific RF environments.

  • Zuken

    Zuken is a Japanese-based company that started out in CAD systems in 1976. The company’s software is primarily used for designing printed circuit boards (PCBs), Multi-Chip Modules (MCM), and for the engineering of electrotechnical, wiring, wiring harness, pneumatics and hydraulics applications.

    Recently, Zuken moved firmly into the systems-of-systems engineering and model-based systems engineering (MBSE) spaces with the acquisition of Vitech. This acquisition required the approval of the US Department of Defense (DoD) and the Committee on Foreign Investment in the United States (CFIUS). Vitech was a US company with more than 25 years of industry experience in systems engineering.

    In the fall of 2019, Zuken reinforced its presence in the world of digital twins by agreeing to develop system design and manufacturing process interfaces to the Dassault Systèmes (DS) 3DEXPERIENCE platform. Zuken will provide electronic libraries and design data management capabilities within DS’s platform to enable cross-discipline systems engineering and traceability.

    In particular, Zuken’s component management process will permit the transfer, synchronization and authorization of component metadata and related files between the databases of the two companies. Zuken’s integration will enable creation and lifecycle management of electronic systems from the Dassault Systèmes platform.

  • Altium

    Altium was founded in 1985 by Nick Martin as a PCB computer-aided design (CAD) vendor. The company has continued to improve its flagship product, Altium Designer, over the last several decades. Improvements in 2019 provide a faster schematic editor, high-speed design features and an enhanced interactive router for PCB design.

    This year, the company also unveiled a cloud-based application for CAD component management. It may seem unglamorous, but selecting and managing components in the development of a PCB is critical to design and cost.

    The effective creation and reuse of component data in the PCB design process, including footprints, schematic symbols, and 3D models, is critical in meeting tight time-to-market windows. Until now, most PCB designers have created and stored component data in private file systems rather than in a shared, managed, and maintained library. Others have tried to use shared spreadsheets or proprietary databases. These outdated approaches led to multiple re-design cycles due to redundant, inaccurate or outdated component data that is often discovered only late in the product development process, when board designs are sent to manufacturers.

  • Applied Wave Research (AWR)

    Several former companies providing EDA tools (like CST and AWR), FPGA board systems (like DINI) and design services (like eSilicon) have been “removed” from the official list of EDA companies through acquisitions. Yet the brands and products live on, either under the original name or under the flag of the acquiring company. Let’s look at the most recent of these acquired EDA vendors.

    AWR was founded in 1994 to improve the design efficiency for radio frequency and microwave circuit and system design. After several prior acquisitions, AWR was acquired by National Instruments (NI) in 2011. A further acquisition by Cadence was announced in late 2019.

    AWR software is used for radio frequency (RF), microwave and high-frequency analog circuit and system design. Recently, the Italian National Institute for Astrophysics’ Institute of Radio Astronomy (INAF-IRA) used NI AWR software to design the circuitry of the receiver chains for a multi-channel heterodyne receiver antenna for radio astronomy applications operating across the 2.3–8.2 GHz RF band.

    Large-scale surveys using highly sensitive electronics are an essential tool for new discoveries in radio astronomy. INAF designers were challenged to develop, fabricate, and test room-temperature, multi-channel heterodyne receivers needed for radio astronomy applications. AWR software helped in the critical modeling and design of the phased array for reflector observing systems (PHAROS), which uses a super-cooled feed with an analog beamformer.


    John Blyler is a Design News senior editor, covering the electronics and advanced manufacturing spaces. With a BS in Engineering Physics and an MS in Electrical Engineering, he has years of hardware-software-network systems experience as an editor and engineer within the advanced manufacturing, IoT and semiconductor industries. John has co-authored books related to system engineering and electronics for IEEE, Wiley, and Elsevier.

Electronic design automation (EDA) refers to the software tools used for designing electronic systems, such as system-on-chip (SoC) integrated circuits and printed circuit boards. The tools work in a design-verification flow that chip designers use to analyze and develop semiconductor chips.

But the EDA tool market has gone through massive consolidation over the past couple of decades. Which companies are left? We’ll find out.

more-engineering-salaries-at-leading-companies

How does your salary match up? Here’s a sampling of engineering salaries at another 20 top companies.

  • The jobsite Glassdoor has collected research on the major employers for engineers. In the following slides, you can see how the companies stack up for engineers. We’ve included some of the largest engineering employers, and we show a sampling of engineering salaries. Right now, the average engineering salary across all disciplines and all employers is $81,948.

  • Amatel

    $73,648

    The typical Amatel engineer salary is $73,648. Engineer salaries at Amatel can range from $55,304 – $84,330. This estimate is based upon engineer salary reports provided by employees or estimated based upon statistical methods. (Image source: Amatel)

  • Siemens

    $86,959

    The typical Siemens Engineer salary is $86,959. Engineer salaries at Siemens can range from $62,140 – $166,214. This estimate is based on Siemens engineer salary reports provided by employees or estimated based upon statistical methods. (Image Source: Siemens)

  • Wipro

    $82,915

    The typical Wipro engineer salary is $82,915. Engineer salaries at Wipro can range from $66,172 – $114,836. This estimate is based upon Wipro engineer salary reports provided by employees or estimated based upon statistical methods. (Image source: Wipro)

  • Westinghouse

    $77,683

    The typical Westinghouse engineer salary is $77,683. Engineer salaries at Westinghouse can range from $68,562 – $110,462. This estimate is based upon Westinghouse engineer salary reports provided by employees or estimated based upon statistical methods. (Image source: Westinghouse)

  • Bechtel

    $91,668

    The typical Bechtel engineer salary is $91,668. Engineer salaries at Bechtel can range from $62,690 – $124,254. This estimate is based upon Bechtel Engineer salary reports provided by employees or estimated based upon statistical methods. (Image source: Bechtel)

  • Thornton Tomasetti

    $65,699

    The typical Thornton Tomasetti engineer salary is $65,699. Engineer salaries at Thornton Tomasetti can range from $54,200 – $72,107. This estimate is based upon Thornton Tomasetti engineer salary reports provided by employees or estimated based upon statistical methods. (Image source: Thornton Tomasetti)

  • ExxonMobil

    $114,075

    The typical ExxonMobil engineer salary is $114,075. Engineer salaries at ExxonMobil can range from $50,229 – $174,902. This estimate is based upon ExxonMobil engineer salary reports provided by employees or estimated based upon statistical methods. (Image source: ExxonMobil)

  • Intertek

    $64,944

    The typical Intertek engineer salary is $64,944. Engineer salaries at Intertek can range from $50,873 – $84,428. This estimate is based upon Intertek engineer salary reports provided by employees or estimated based upon statistical methods. (Image source: Intertek)

    Toyota North America

    $87,931

    The typical Toyota North America engineer salary is $87,931. Engineer salaries at Toyota North America can range from $63,889 – $116,439. This estimate is based upon Toyota North America engineer salary reports provided by employees or estimated based upon statistical methods. (Image source: Toyota North America)

    Cummins

    $77,299

    The typical Cummins engineer salary is $77,299. Engineer salaries at Cummins can range from $67,594 – $100,893. This estimate is based upon Cummins engineer salary reports provided by employees or estimated based upon statistical methods. (Image source: Cummins)

    Whirlpool

    $71,124

    The typical Whirlpool Corporation engineer salary is $71,124. Engineer salaries at Whirlpool Corporation can range from $53,566 – $94,781. This estimate is based upon Whirlpool Corporation engineer salary reports provided by employees or estimated based upon statistical methods. (Image source: Whirlpool)

    Shell

    $113,441

    The typical Shell engineer salary is $113,441. Engineer salaries at Shell can range from $69,562 – $179,008. This estimate is based upon Shell engineer salary reports provided by employees or estimated based upon statistical methods. (Image source: Shell)

    Micron Technology

    $92,781

    The typical Micron Technology engineer salary is $92,781. Engineer salaries at Micron Technology can range from $68,489 – $120,839. This estimate is based upon Micron Technology engineer salary reports provided by employees or estimated based upon statistical methods. (Image source: Micron Technology)

    US Navy

    $85,424

    The typical US Navy engineer salary is $85,424. Engineer salaries at US Navy can range from $53,873 – $139,410. This estimate is based upon US Navy engineer salary reports provided by employees or estimated based upon statistical methods. (Image source: US Navy)

    Pratt & Whitney

    $79,423

    The typical Pratt & Whitney engineer salary is $79,423. Engineer salaries at Pratt & Whitney can range from $70,269 – $91,417. This estimate is based upon Pratt & Whitney engineer salary reports provided by employees or estimated based upon statistical methods. (Image source: Pratt & Whitney)

    Samsung Austin Semiconductor

    $82,766

    The typical Samsung Austin Semiconductor engineer salary is $82,766. Engineer salaries at Samsung Austin Semiconductor can range from $71,991 – $97,436. This estimate is based upon Samsung Austin Semiconductor engineer salary reports provided by employees or estimated based upon statistical methods. (Image source: Samsung)

    Eaton

    $80,725

    The typical Eaton engineer salary is $80,725. Engineer salaries at Eaton can range from $69,443 – $113,084. This estimate is based upon 30 Eaton engineer salary reports provided by employees or estimated based upon statistical methods. (Image source: Eaton)

    John Deere

    $83,329

    The typical John Deere engineer salary is $83,329. Engineer salaries at John Deere can range from $66,371 – $96,302. This estimate is based upon John Deere engineer salary reports provided by employees or estimated based upon statistical methods. (Image source: John Deere)

    Procter & Gamble

    $91,487

    The typical Procter & Gamble engineer salary is $91,487. Engineer salaries at Procter & Gamble can range from $66,863 – $127,439. This estimate is based upon Procter & Gamble engineer salary reports provided by employees or estimated based upon statistical methods. (Image source: Procter & Gamble)

    Verizon

    $92,889

    The typical Verizon engineer salary is $92,889. Engineer salaries at Verizon can range from $51,020 – $134,905. This estimate is based upon 24 Verizon engineer salary reports provided by employees or estimated based upon statistical methods. (Image source: Verizon)

Rob Spiegel has covered automation and control for 19 years, 17 of them for Design News. Other topics he has covered include supply chain technology, alternative energy, and cyber security. For 10 years, he was owner and publisher of the food magazine Chile Pepper.

January 28-30: North America’s largest chip, board, and systems event, DesignCon, returns to Silicon Valley for its 25th year! The premier educational conference and technology exhibition, this three-day event brings together the brightest minds across the high-speed communications and semiconductor industries, who are looking to engineer the technology of tomorrow. DesignCon is your rocket to the future. Ready to come aboard? Register to attend!

growth-returns-to-semiconductor-and-eda-tools-m&a-markets-–-for-now

2019 was a great time for merger and acquisition business in the semiconductor and electronic design automation tools industries. But what will the future hold?

There was an uptick in M&A activity for the semiconductor space in the first eight months of 2019. This was a welcome change after the market slowed in 2017 and 2018. The combined value of 20-plus M&A agreement announcements for the purchase of chip companies, business units, product lines, intellectual property (IP), and wafer fabs reached over $28 billion between January and the end of August – according to IC Insights’ Fall 2019 McClean Report. This amount does not include transactions between semiconductor capital equipment suppliers, material producers, chip packaging and testing companies, and electronic design automation (EDA) software firms.

M&A activity in the first eight months of 2019 has already surpassed the $25.9 billion total for all of 2018.

Image Source: IC Insights

what-does-every-engineer-want-for-the-holidays?

During the holiday season, one tends to think of presents. But today’s designers, manufacturers and sellers tell us the product is but a commodity and what we really want is the experience.

Engineers and scientists are much like ordinary consumers, except for their interest in experiences that deal with great technical achievements, failures, and the future – technologies that are yet to be. So, rather than a set of catchy products, this list focuses on unique experiences with particular appeal to engineers and scientists.

I. Books 

Reading is an experience like no other in that it can be done by any literate person at almost any time and in any place. Here is a very short list of science- and engineering-related books released in 2019:

> Infinite Powers: The Story of Calculus – The Language of the Universe, by Steven Strogatz (Atlantic Books) 

This is the story of mathematics’ greatest ever idea: calculus. Without it, there would be no computers, no microwave ovens, no GPS, and no space travel. But before it gave modern man almost infinite powers, calculus was behind centuries of controversy, competition, and even death.

Professor Steven Strogatz charts the development of this seminal achievement from the days of Archimedes to today’s breakthroughs in chaos theory and artificial intelligence. Filled with idiosyncratic characters from Pythagoras to Fourier, Infinite Powers is a compelling human drama that reveals the legacy of calculus on nearly every aspect of modern civilization, including science, politics, medicine, philosophy, and much besides.

> Six Impossible Things: The ‘Quanta of Solace’ and the Mysteries of the Subatomic World, by John Gribbin (Icon Books Ltd.) 

Quantum physics is strange. It tells us that a particle can be in two places at once. Indeed, that particle is also a wave, and everything in the quantum world can be described entirely in terms of waves, or entirely in terms of particles, whichever you prefer.

All of this was clear by the end of the 1920s. But to the great distress of many physicists, let alone ordinary mortals, nobody has ever been able to come up with a common sense explanation of what is going on. Physicists have sought ‘quanta of solace’ in a variety of more or less convincing interpretations. Popular science master John Gribbin takes us on a tour through the ‘big six’, from the Copenhagen interpretation via the pilot wave and many worlds approaches.

> Hacking Darwin: Genetic Engineering and the Future of Humanity by Jamie Metzl (Sourcebooks) 

At the dawn of the genetics revolution, our DNA is becoming as readable, writable, and hackable as our information technology. But as humanity starts retooling our own genetic code, the choices we make today will be the difference between realizing breathtaking advances in human well-being and descending into a dangerous and potentially deadly genetic arms race.

Enter the laboratories where scientists are turning science fiction into reality. Look towards a future where our deepest beliefs, morals, religions, and politics are challenged like never before and the very essence of what it means to be human is at play. When we can engineer our future children, massively extend our lifespans, build life from scratch, and recreate the plant and animal world, should we?

Image Source: Sourcebooks

3-painless-tips-for-writing-documentation

Writing documentation is not the most exciting endeavor an engineer can embark on. It’s often boring and time-consuming, and there are so many more interesting things that could be done. It sometimes amazes me that development projects are documented so poorly – if they are documented at all. Documentation is meant to help preserve important concepts and information about the development cycle. This information could be used to get up to speed on the product, make decisions about updates and changes, or even to prove that proper procedures were followed to create the product. Here is a set of tips for developing documentation that decreases the pain factor for developers and improves documentation quality.

Tip #1 – Write the documentation as you develop

The problem with a lot of documentation (which includes code comments) is that it is written after the development is completed. Engineers are often in a hurry due to delivery constraints, so they focus on getting things working first and document second. The problem with this is that there is a good chance the documentation is never written, and if it is, the developer may be writing it weeks or months later, by which time they have forgotten the design decisions that were made. The resulting documentation is often better than nothing, but it lacks the critical steps and thought processes that make it easy to pick up right where the developer left off.

Following through on these tips can speed up the time it takes to develop documentation. (Image source: Samsung Know)

I’ve found that the best documentation, and the quickest way to develop it, is to document as you go. For example, when I am writing documentation that describes how to set up and run a test, I don’t set it up and then go back and try to remember all the steps I took. I literally create the document and, with each step, write down what I did and, more importantly, why I did it. Now when I make a misstep and have to go back and adjust, it’s the perfect opportunity to include a few comments on how to recover the system or what mistakes to avoid.

I’ve also found that by creating the documentation as I go, I can use it to outline what I’m about to do, which helps guide my efforts. I’ve always found that taking the time to think through what I’m going to do gathers my thoughts and seems to make me more productive. This works far better than trying to do something “on the fly.”
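To make the idea concrete, here is a minimal, hypothetical C sketch of what “document as you go” can look like at the code-comment level. The device, register names, and values are invented for illustration – the point is simply that each comment captures why a decision was made at the moment it was made, not weeks later.

```c
/* document_as_you_go.c -- hypothetical sketch (not from a real project)
 * showing comments that record the "why" as the code is written. */
#include <stdbool.h>
#include <stdint.h>
#include <stdio.h>

/* Stubbed register access so the sketch compiles stand-alone; on real
 * hardware these would talk to the sensor over SPI or I2C. */
static uint8_t fake_status = 0x01u;
static uint8_t reg_read(uint8_t reg)               { (void)reg; return fake_status; }
static void    reg_write(uint8_t reg, uint8_t val) { (void)reg; (void)val; }

#define SENSOR_REG_STATUS 0x10u
#define SENSOR_REG_CTRL   0x11u

static bool init_pressure_sensor(void)
{
    /* WHY: the part needs a couple of milliseconds after power-up before it
     * responds, so we poll the ready bit instead of using a fixed delay that
     * proved flaky on the first prototype. */
    for (int retries = 0; retries < 10; retries++) {
        if (reg_read(SENSOR_REG_STATUS) & 0x01u) {
            /* WHY: 0x03 = continuous conversion + 4x oversampling; single-shot
             * mode missed fast transients during bench testing. */
            reg_write(SENSOR_REG_CTRL, 0x03u);
            return true;
        }
    }
    return false; /* let the caller decide how to handle a missing sensor */
}

int main(void)
{
    printf("sensor init %s\n", init_pressure_sensor() ? "ok" : "failed");
    return 0;
}
```

Comments like these take only a few seconds to write while the decision is fresh, but they are exactly the details that are forgotten by the time the documentation is finally written weeks later.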

Tip #2 – Pictures are worth 1,000 words

It’s quite amazing to me that, in a world driven by video, rich images, and photographs, the documentation engineers create is almost entirely text-driven. I can’t tell you how often I’ll come across documentation that includes almost no pictures whatsoever. I was recently working on a project where an engineer sent me a procedure for setting up a toolchain and deploying production code. The entire document was two pages that were not only difficult to follow but were missing steps and contained no pictures or diagrams! The engineer even assumed that the reader would know how to wire the development board up to a sensor without a wiring diagram!

While the text-based version could be used to repeat the original procedure, anyone following it would have to find several external references, review schematics, and make several leaps of faith in order to complete it successfully. What should have been a one-hour process ended up requiring about four hours. If you are following the first tip, which is to write your documentation as you go, taking screenshots of important steps in a procedure or taking a picture with a smartphone only takes about 30 seconds. The result can be documentation that is much clearer and saves the user (which could be the future you) a lot of grief.

Tip #3 – Have a colleague review the documentation

The last tip for us to discuss today, and one that should not be overlooked, is to have a colleague go through your documentation when you are done with it. As engineers, we often make assumptions that someone who comes after us will be thinking the same way that we are or that some information tidbit is obvious. Giving your documentation to a colleague to review will help to ensure that all the required information is included in the document so that if someone comes along later, they will be able to understand the process and reproduce or maintain the system.

A colleague can act as a great sounding board to ensure that everything required is there. For example, I mentioned that I had been given a procedure that didn’t have any images. As I reviewed that procedure, I was able to point out screenshots, diagrams, and images that should be added to the documentation to make it easier for someone to understand what the procedure was and how to replicate it. Having no prior knowledge of the procedure and being forced to repeat it helped me provide critical feedback that resulted in a well-established procedure that is now easy to replicate.

Conclusions

These simple documentation steps might seem obvious, but I know for a fact that there are lots of engineers who do not follow them. I come across lots of projects that are sparsely documented or not documented at all. It may seem obvious to the developer what needs to be done to use a code base, set up an experiment, or whatever. The fact, though, is that it’s often not obvious, and the same developer coming back a year later will often find it takes time to figure out what they were thinking a year ago.

Following through on these tips can speed up the time it takes to develop documentation, and that documentation will also be of higher quality. Take the time this month to start putting these tips into practice and you’ll find that, over time, developing documentation becomes painless (or at least a little less painful).

Jacob Beningo is an embedded software consultant who currently works with clients in more than a dozen countries to dramatically transform their businesses by improving product quality, cost, and time to market. He has published more than 200 articles on embedded software development techniques, is a sought-after speaker and technical trainer, and holds three degrees, including a Master of Engineering from the University of Michigan. Feel free to contact him at [email protected], visit his website, and sign up for his monthly Embedded Bytes Newsletter.
