novel-clean-fuel-cell-eyed-for-next-generation-vehicles

With all the talk and worry about climate change, researchers are seeking myriad alternative energy sources. Among those being eyed are fuel cells, an environmentally friendly replacement for fossil fuels in next-generation vehicles.

Researchers at the University of Waterloo in Ontario, Canada, have now made a breakthrough in this technology with the development of a fuel cell that they said lasts at least 10 times longer than current comparable cells.

This type of performance makes it more economically viable to mass-produce fuel cells, making them cost-comparable to, or even less expensive than, gasoline engines, said Xianguo Li, a professor of mechanical and mechatronics engineering and director of the Fuel Cell and Green Energy Lab at Waterloo, who led the research.

“This is clean energy that could boom,” he said in a press statement.

Professor Xianguo Li of the University of Waterloo with a new fuel cell he and his team designed that could make the clean-energy technology more viable for mass production. (Image source: University of Waterloo)

Making Clean Cars Possible

Fuel cells produce electricity from the chemical reaction that occurs when hydrogen and oxygen combine to make water, making them one of the cleanest sources of energy available.

In hybrid automobiles, fuel cells can power generators that recharge batteries while the vehicles are in operation. While existing fuel cells could already replace gasoline engines, they are not yet practical for mass production because they are too expensive.

Li’s team solved this problem by designing fuel cells that are more durable than current technology, using a power-management strategy that delivers a constant rather than fluctuating amount of electricity, the researchers report in a paper on their work in the journal Applied Energy.

The team achieved this by designing each fuel cell stack to work only at a fixed operating point, or constant output power, and by shortening its active operation time via an on-off switching control, the researchers wrote.

“A hysteresis control strategy of power management is designed to make the active time evenly distributed over the three fuel cell stacks and to reduce the number of on-off switching,” researchers wrote. “The results indicate that the durability of the onboard fuel cells can be increased 11.8, 4.8 and 6.9 times, respectively, for an urban, highway and a combined urban-highway driving cycle.”
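
The on-off scheduling described above can be sketched as a toy controller. This is an illustrative model only, not the researchers' actual implementation: the stack rating (30 kW) and hysteresis band (5 kW) are assumed values, and the "least-used stack first" rule is one simple way to keep active time evenly distributed across the three stacks.

```python
STACK_POWER_KW = 30.0   # assumed fixed operating point per stack
HYSTERESIS_KW = 5.0     # assumed band that suppresses switching chatter

class StackScheduler:
    def __init__(self, n_stacks=3):
        self.runtime = [0.0] * n_stacks   # accumulated active hours per stack
        self.on = [False] * n_stacks
        self.switches = 0                 # total on-off events

    def step(self, demand_kw, dt_hours=1 / 3600):
        n_on = sum(self.on)
        # Turn a stack on only when demand exceeds current capacity plus the
        # hysteresis band; turn one off only when demand drops well below
        # what one fewer stack could supply. The band prevents rapid cycling.
        if demand_kw > n_on * STACK_POWER_KW + HYSTERESIS_KW and n_on < len(self.on):
            idle = [i for i, s in enumerate(self.on) if not s]
            i = min(idle, key=lambda j: self.runtime[j])   # least-used stack first
            self.on[i] = True
            self.switches += 1
        elif n_on > 0 and demand_kw < (n_on - 1) * STACK_POWER_KW - HYSTERESIS_KW:
            active = [i for i, s in enumerate(self.on) if s]
            i = max(active, key=lambda j: self.runtime[j])  # most-used stack off first
            self.on[i] = False
            self.switches += 1
        for i, s in enumerate(self.on):
            if s:
                self.runtime[i] += dt_hours
```

At a steady 50 kW demand, a controller like this brings two stacks online and then holds them there; the hysteresis band is what prevents the rapid cycling that degrades stack life.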

Promising Technology

Key to the design is that the average power demand of real-time driving cycles is only a fraction of the maximum power that the researchers’ fuel cell plug-in hybrid electric vehicles (FC-PHEVs) can provide, they wrote. This significant increase in durability can be used to reduce the current over-design, and hence the cost, of fuel cells, the researchers wrote.

“We have found a way to lower costs and still satisfy durability and performance expectations,” Li said in a press statement. “We’re meeting economic targets while providing zero emissions for a transportation application.”

The team aims to introduce their fuel cells for hybrid vehicles to drive mass production and lower unit costs for the technology in general. One day they hope to see the replacement of both batteries and gas engines entirely with an affordable, safe, dependable, and clean source of electrical power, Li said.

“This is a good first step, a transition to what could be the answer to the internal combustion engine and the enormous environmental harm it does,” he said in the statement.

Elizabeth Montalbano is a freelance writer who has written about technology and culture for more than 20 years. She has lived and worked as a professional journalist in Phoenix, San Francisco and New York City. In her free time she enjoys surfing, traveling, music, yoga and cooking. She currently resides in a village on the southwest coast of Portugal.

sebastian-thrun-is-driving-a-future-where-cars-fly-themselves
Through his company, Kitty Hawk, Sebastian Thrun is working to make personal flying vehicles a reality. Shown above: Flyer, one of the vehicles being developed by the company. (Image source: Kitty Hawk)

What do you get the industry that is disrupting everything? Ask serial entrepreneur and inventor Sebastian Thrun and the answer you’d get is, “more disruption.”

As the CEO of Kitty Hawk, a company with the bold vision of bringing flying cars to consumers, Thrun is imagining a future where autonomous cars have been replaced by personal autonomous planes. “I believe the flying car revolution will disrupt self-driving cars,” he told an audience during his keynote at the 2019 Drive World Conference & Expo. “I believe we will see flying cars at scale before self-driving cars.”

“Don’t Trust the Experts”

Thrun speaks to an audience at Drive World 2019. (Image source: Drive World Conference & Expo)

While he admitted his own predictions may sound fantastical, Thrun has been involved with autonomous vehicles for over a decade. He headed up the team that won the 2005 DARPA Grand Challenge – a 132-mile, off-road autonomous vehicle race. While the race itself is more noted today for its mishaps than any particular innovation, winning the competition put Thrun on the path that has today led him to imagine a future most people would associate with The Jetsons or Back to the Future Part II.

Following his success at the DARPA competition, Thrun was invited to Google to lead the company’s then-fledgling autonomous car program (he founded Google X). “I was the go-to person [at Google] for self-driving cars,” Thrun said.

Google’s ambition was to create a self-driving car that could navigate even the most difficult roads. The company’s founders, Larry Page and Sergey Brin, even personally selected the most difficult routes in Northern California and tasked Thrun and his team with creating a self-driving car that could handle them.

But according to Thrun, being Google’s in-house expert taught him two major lessons: “Don’t trust incumbents” and “don’t trust the experts.”

“Larry [Page] came to me and said, ‘you’re the world expert. Can you start a team?’ And I said it can’t be done,” Thrun recalled regarding Google’s ask for its autonomous cars. When Page asked for a purely technical reason why it wasn’t possible, however, “I couldn’t say all the technical reasons. I had to tell him I know it can’t be done, but there is no technical reason.”

The experience brought Thrun to a realization, “experts know the past, not the future.” He found himself presenting the same reasoning to Google that the traditional automakers had told him about autonomous cars. “We talked to automotive companies, but they didn’t believe it,” he said. “The incumbents are the least interested in disruption.”

If you need any evidence of how reluctant traditional automakers were toward true disruption at the time, Thrun pointed to a 2011 ad campaign for the Dodge Challenger in which the vehicle was touted as, “the leader of the human resistance” against AI-driven cars.

This 2011 Dodge Challenger ad took a jab at the idea of autonomous vehicles. 

“We Are Not the Gatekeepers”

Today, thanks to advances such as deep learning and advanced sensor technologies, autonomous cars are doing things that engineers only a decade ago weren’t sure would be possible. Most major automakers are developing self-driving cars, an entire startup ecosystem has risen up around autonomous vehicles, and autonomous trucks are even being tested on public roads.

And to disrupters that means it’s time to move on to the next thing. For Thrun that’s developing electric vertical take-off and landing (eVTOL) vehicles at Kitty Hawk (Google’s Larry Page is a financial backer of the company). Though the idea of flying cars soaring over our heads every day might feel like science fiction, Thrun firmly believes the advances in autonomous cars lend themselves directly to the development of autonomous flying machines.

He argued that whereas roads are two-dimensional spaces with a limited capacity, the sky is three-dimensional and offers many more benefits in terms of travel efficiency and capacity. By Thrun’s own estimation, the same stretch of road that can hold a few dozen cars would be able to hold upwards of a million flying vehicles of the same size. Adding full autonomy to these vehicles, Thrun said, would also alleviate issues around navigation. He told the Drive World audience the key would be in automating the sort of systems used today in air traffic control.
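
Thrun's claim is essentially an area-versus-volume argument, and a rough back-of-envelope calculation shows the scaling. All of the spacing parameters below are assumptions chosen for illustration; they are not Thrun's figures:

```python
# Rough illustration of why 3D airspace capacity dwarfs 2D road capacity.
# Every parameter here (lane count, vehicle spacing, corridor size) is an
# assumption for illustration, not a real traffic-engineering figure.

def road_capacity(length_m, lanes=4, slot_m=10):
    """Vehicles a road segment can hold: lanes x slots along its length."""
    return lanes * (length_m // slot_m)

def airspace_capacity(length_m, width_m=1000, height_m=1000,
                      slot_m=10, layer_m=10):
    """Vehicles in the block of airspace above the same segment:
    corridors across x slots along x vertical layers."""
    return (width_m // slot_m) * (length_m // slot_m) * (height_m // layer_m)
```

With these numbers, a 1-km road holds 400 vehicles while the airspace above it holds a million: capacity scales with usable volume rather than area.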

Kitty Hawk has yet to release a commercial product to the public, but has been actively testing its vehicles with human pilots. It recently reported that one of its vehicles, Flyer, has already been flown over 25,000 times.

Earlier this year Kitty Hawk entered into a strategic partnership with Boeing to further develop Cora, its two-seat flying vehicle. Kitty Hawk would like Cora to someday function as an autonomous flying taxi that consumers can summon with a simple app similar to Uber or Lyft.

Getting to that vision, Thrun admitted, will be no small feat. But it’s one he certainly believes is reachable as long as technology, regulation, and society come together to make it happen. “Innovation is a matter of society,” he said. “As much as we in Silicon Valley like to believe we are the pinnacle for what’s possible, we are not the gatekeepers. We are just technologists…It’s not just the technologists and engineers that change the world. It’s all of society that makes an innovation successful.”

Chris Wiltz is a Senior Editor at Design News covering emerging technologies including AI, VR/AR, blockchain, and robotics.

The Battery Show: Battery, EV/HV, & Stationary Power All in One Place.

Learn everything you need to know at our in-depth conference program with 70 technical sessions in eight tracks covering topics on battery, electric & hybrid vehicles, and stationary power technologies. 

The Battery Show. Sept. 10-12, 2019, in Novi, MI. Register for the event, hosted by Design News’ parent company Informa.

vr-and-ar-tools-provide-improved-products,-faster-production,-and-lower-costs

The idea of using virtual reality in manufacturing processes is not new. Its value has been proven by designers, engineers, and service technicians over more than two decades. Immersive environments that provide scalable, intuitive representations of prototypes and designs enable realistic collaboration, improve the decision-making process, and increase end-customer confidence. For automotive, aerospace, and equipment manufacturers, using VR visualizations reduces the cost and time of developing physical prototypes, as well as the cost of rework due to design errors. With today’s computing power and advanced design software, detailed building and factory process modeling is easier to develop and has increasing value. VR is now an essential tool for organizations with smart manufacturing and digital transformation initiatives.

The team at the Commonwealth Center for Advanced Manufacturing (CCAM) works with interactive virtual reality. (Image source: CCAM)

The Commonwealth Center for Advanced Manufacturing (CCAM) is one of the only facilities today with an interactive virtual reality (VR) factory model that is an exact representation of the physical machines in the facility. CCAM, a non-profit research and development center, has members from organizations such as Siemens, Airbus, Rolls-Royce, and NASA Langley Research Center, as well as the University of Virginia, Virginia Tech, and others. Its goals are to provide a collaborative environment for the development of innovative manufacturing processes and technologies, and to grow a qualified manufacturing workforce with an industry-driven educational pathway.

CCAM’s “Digital Factory of the Future” helps solve the most complex manufacturing challenges and is used to develop best practices and new processes. Real-time collaboration and intuitive interactivity are enabled by a powerwall from Mechdyne Corporation, which provides an immersive view of the virtual factory. 

CCAM is doing groundbreaking work using immersive visualization to examine processes and performance, according to Matt Stremler, CCAM’s Director of Research. “Visualizing factory dataflow is one of our biggest challenges,” Stremler said. “With a virtual reality factory model, we can work interactively with machine connectivity and the information passing between equipment and controlling processes. The large-scale VR display system we use also provides a useful interface for real time remote operation of robotics in harsh or inaccessible environments.”

CCAM’s virtual reality display system projects 3D images of the factory and/or equipment onto an 8’ x 14’ wall and onto a same-size floor screen. Users wear lightweight shutter glasses fitted with motion trackers. The trackers monitor the user’s position, orientation, and interactions, changing the on-screen images in real time to match the user’s perspective. When virtual images move in real-time response to motion, the user experiences an extremely convincing sense of presence within the virtual factory.

Users can simply walk around or look under virtual equipment as if they were physically in the room. The floor projection adds to the immersive sense of presence when ‘standing’ in the factory model or examining a prototype machine or assembly. The system can also be used for virtual prototyping, assembly testing and training, and the display of traditional two-dimensional documents and PowerPoint presentations when scale can benefit communication and teamwork.

CCAM also uses head-mounted displays (HMD) as an alternative resource for immersive experiences. For the collaborative work, which is part of the organization’s mandate, they find the larger scale VR system preferable for several reasons:

  • It is typically more comfortable for long work sessions
  • Users are not cut off from their real surroundings and can interact with colleagues, take notes, and remain generally aware of what is happening in the room
  • Presentations can be made to larger groups without sharing one or more HMDs

The researchers and partners at CCAM report that interacting with immersive models and visualizations to design and analyze both products and manufacturing processes contributes to faster decision-making and supports programs that help manufacturers control costs and improve safety and quality.

Among the on-going projects at the facility is testing ways to integrate autonomous robots into the Digital Factory of the Future for remote monitoring and tool transport. The organization is also studying how the large-scale immersive display can play a role in cybersecurity informatics and visualizing corporate networks of nodes, switches, routers, firewalls and the flow of data. They believe that for the manufacturing and design enterprise, large-scale immersive visualization provides a perspective that results in new insight into security risks and can uncover Common Vulnerabilities and Exposures (CVE) to help prevent possible data breaches.

Next-Generation Immersive Displays

The virtual factory model helps CCAM serve its mission by providing a deeper understanding of the factory of the future, including dataflows and processes. The large-scale immersive virtual reality system is an excellent interface for collaboration among its academic, government, and industry ecosystem.

Located in a 62,000 square-foot facility in Prince George County, Virginia, CCAM has more than 35 members, including private companies, local public research universities, and government agencies. Its members share the resources, including pooled talent, advanced tools and technologies, and work together to grow talent and build the manufacturing workforce.

The center focuses on automation, robotics, 3D printing, surface engineering, and artificial intelligence, with the goal of providing a collaborative environment where members work together to develop inventive solutions, particularly for industry-sponsored, government-funded, academic research programs that make technology useful in meeting real-world requirements.

CCAM works closely with one of its member companies to get the most benefit from its large-scale display system. That company’s engineering team is expert in helping designers convert highly complex 3D models into virtual reality applications that include interactivity. The team created CCAM’s digital factory using a popular game engine with tools that allowed importing computer-aided design (CAD) data for the equipment and machinery, enabled creation of physics-based, dynamic, interactive virtual representations of parts, assemblies, and the entire facility, and overlaid the real-time information flow.

The CCAM team designs aircraft structural components. (Image source: Mechdyne)

Game engines have become powerful enterprise-level modeling tools, as well.  The trend to integrate VR display systems into next-generation digital factories requires expertise in managing software workflow conversion to interactive VR for a broad range of computer-aided design tools. There are resources available for managing remote collaboration that enable experts from other regions to participate in the virtual design reviews. If organizations do not have the resources to convert CAD to game engines and create virtual environments, there are VR integrators and specialty content organizations who can help designers and engineers develop visualizations, and add elements, such as photorealism, characteristics and behaviors, and complex interactivity.

The large-scale VR system at CCAM is designed to provide a comfortable setting where teams can collaborate for hours to optimize designs, unlock insights, speed products to market, and inform decision makers. CCAM is using next-gen immersive visualization to enable multiple viewers to experience their own accurate perspectives in large-scale VR environments and share highly realistic motion-tracked simulations while wearing lightweight 3D shutter glasses.

To create stereoscopic 3D images, projectors present separate images for right eye and left eye perspectives in a rapidly alternating sequence. The images alternate so quickly, ideally at 60 frames per second per eye, that to the viewer they appear to be on the screen at the same time.  For multiple viewers to each see their own perspective, two images for each user must be on screen at the same time.
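
The multiplexing requirement described above is simple arithmetic: the projector's total frame rate scales with the number of tracked viewers, since each viewer needs two eye images per refresh. A quick sketch (the 60 frames-per-second-per-eye figure comes from the text; the function itself is just illustrative):

```python
# Total frames/second a time-multiplexed stereo projector must deliver:
# each viewer needs a left-eye and a right-eye image at the per-eye rate.

def required_projector_fps(viewers, per_eye_fps=60):
    return viewers * 2 * per_eye_fps
```

One viewer needs 120 frames per second; supporting a second independent perspective doubles that to 240, which is why multi-viewer perspective correction demands specialized high-refresh projection hardware.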

Multiple users viewing stereoscopic 3D images. (Image source: Thomas Motta)

Digital Factory of the Future

Will Powers, CCAM’s president and CEO and the former chief financial officer for aircraft engine maker Rolls-Royce North America, believes that, to be successful, next-generation intelligent factories require automation systems that are contextually aware and can adapt to variability in parts, processes, environments, and people.

Reflecting on the power of using immersive visualization in manufacturing environments, Powers said, “Automation and artificial intelligence are where modern manufacturing is headed. Large-scale immersive VR is an important resource that helps our researchers and members share unique perspectives and better understand the systems and processes under development.”

Digital factories of the future will derive their power from the interconnectivity of computers, machinery, autonomous vehicles, advanced sensor technology, and robotics. The complexity of these interactions can be difficult to conceptualize and can make true collaboration challenging. With its wall-sized immersive display system, CCAM researchers and members can more easily share concepts and make discoveries that are only possible to understand through this type of data visualization.

Throughout the design and manufacturing world, virtual reality and augmented reality are playing an increasingly important role in enabling insights that result in better products, reduced costs, and faster time to market. Embracing these tools drives the advances that make manufacturing organizations successful.

David Gsell is general manager of the Mechdyne Software Business Unit at Mechdyne Corporation.


10-vr-companies-to-watch-in-2019

Here are 10 companies in the enterprise VR space that everyone should be paying attention to for the rest of this year and on into 2020.

  • Enterprise virtual reality is having a busy year in 2019. With new companies, new hardware products, and updates from established companies, we’re seeing more and more applications for VR emerge in commercial spaces ranging from product design to healthcare and even skilled labor and trades. Many companies are aiming to leverage VR to transform the way engineers and designers work forever.

  • Fundamental Surgery

    A good deal of work is being done in the area of applying VR to medical training. But London-based Fundamental VR is adding a new dimension to VR for medical training by developing systems that also offer haptic feedback to users. Through a partnership with Seattle-based HaptX, a maker of innovative haptic gloves, Fundamental VR is currently deploying systems that allow for realistic touch interactions in virtual training environments in medical institutions all over the world.

    (Image source: Fundamental Surgery)

  • HP

    While companies like Oculus (Facebook), HTC, Sony, and Samsung have focused their VR efforts primarily on the consumer market, HP stands out as one of the few big-name tech companies going exclusively after the enterprise VR market. While its flagship VR products – the HP VR Backpack and the recently released HP Reverb headset – can be applied to consumer and entertainment applications, the company has made no bones about being primarily concerned with creating products for design and engineering workflows. The HP Reverb (shown above) in particular is a high-resolution headset designed with engineers and designers in mind, making the resolution and comfort needs of its target audience its highest priority.

    (Image source: HP)

  • HTC

    Long viewed as the main competitor to Oculus in the VR entertainment sphere (particularly in PC gaming), HTC added a new wrinkle to its traditionally consumer-focused VR product portfolio with the release of the HTC Vive Pro. The Vive Pro was HTC’s first VR product targeted at the enterprise market – offering a 78% increase in resolution over the original Vive headset. HTC’s aim with the Vive Pro has been to attract enterprise users interested in virtual collaboration and product design, as well as other applications.

    In 2019 the company released an upgraded version – the Vive Pro Eye (shown above) – with integrated eye-tracking technology from Swedish company Tobii. Using Tobii’s infrared tracking system, the Vive Pro Eye allows users to move about and control VR environments hands-free using only their eye movements.

    (Image source: HTC)

  • Mechdyne

    Iowa-based Mechdyne focuses on large-scale VR systems for smart manufacturing and other enterprise applications. The company deals more specifically in cave automatic virtual environments (CAVEs) – systems that create virtual environments via room-scale projections on walls.

    One of its products, the Powerwall, is a VR display system that projects 3D visualizations onto an 8 x 14-foot wall and onto the floor in front of it. By wearing shutter glasses users can explore virtual environments such as factories at room-scale with the sensation of standing in the actual space. The system lends itself to interactive training as well as data analysis and visualization.

    (Image source: Mechdyne / Thomas Motta)

  • Neurable

    Boston-based Neurable is taking an innovative approach to control schemes for virtual reality – mind control. The company has developed an EEG headset that can attach to a VR HMD such as the HTC Vive to allow users to control applications using only their thoughts. The hope is to not only create easier and more efficient means of control for VR users but to also allow better access for the disabled, provide user insights, and create whole new applications for VR.

    This year Neurable debuted a new software product, Neurable Analytics, that “provides neural insights for objective feedback in human insights, design, and immersive training applications.” The software uses machine learning to classify EEG signals and can be used in applications including market research, product design, and high-consequence training and industrial safety.

    (Image source: Neurable)

  • Oculus

    Facebook-owned Oculus is the company that brought VR back onto the map. In the past, Facebook has offered business packages of the Oculus Rift. But now rumors are circulating that the company may be looking to launch enterprise editions of its VR hardware in the near future.

    In 2019 the company released the Oculus Quest (shown above), its first standalone VR headset with full positional tracking (requiring no wires or a PC). The headset offers resolution and specs comparable to some of the latest PC-tethered headsets and has become one of the best-reviewed standalone units on the market since its release.

    In 2019 the company released the Oculus Rift S to mixed reviews. While the Rift S included next-generation features such as inside-out tracking, it was not the giant leap forward in innovation many were hoping for. However, that still hasn’t stopped many from eagerly anticipating a true follow-up to the original Oculus Rift.

    (Image source: Oculus / Facebook)

  • SE4

    Based out of Tokyo, SE4 specializes in creating software for robots that operate in high-latency environments, where it may be difficult to operate them remotely. Most recently, the company has developed a robot operating system that combines VR, machine learning, and AI to accelerate remote robotic control in applications such as excavation and construction. Rather than training robots in a 2D environment or via tedious programming, SE4’s solution allows robots to be trained via simulation in a 3D virtual environment. A user performs the task in VR, and the AI extrapolates that task into a sort of to-do list that the robot is then able to execute, even remotely.

    The company also has larger ambitions and is targeting its software solution at robots deployed in space. If SE4 has its way, we may someday be using VR to help robots build colonies on Mars.

    (Image source: SE4)

  • VR Electronics

    London-based VR Electronics is the manufacturer of the Teslasuit – a full-body haptic suit that provides wearers with a sensation of touch in VR via electrostimulation. The suit can also capture biometric data, which the company says can be used in personalized experiences as well as training, performance, and healthcare applications by providing feedback on the wearer’s key health indicators, stress levels, and even emotional state.

    The Teslasuit also doubles as a motion capture device and can be used in related applications.

    (Image source: VR Electronics)

  • VRgineers

    There are many VR headset makers on the market, but very few are creating devices specifically aimed at professionals like engineers and product designers. Prague-based VRgineers does just that. In 2018 the company released the VR Hero 5K – a powerful headset with a whopping 5K video resolution (that’s 2.5K per eye). The company soon followed that up with the XTAL (shown above), another 5K headset that added a slew of additional features, including eye tracking and an integrated Leap Motion sensor for tracking hand movement without the need for any external controllers or sensors. The company’s headsets are spec’d and priced at a level targeted at large organizations, where VRgineers is aiming to become a go-to supplier of VR headsets for training, product development, and other engineering applications.

    (Image source: VRgineers)

  • VRSim

    VRSim creates interactive tools aimed at training workers in skilled trades and professions. The latest version of the company’s SimSpray software is an HTC Vive-compatible VR painting tool targeted at the coating and paint industry. SimSpray creates VR simulations applicable to a variety of sectors, including automotive and aerospace, and gives users a realistic experience and feedback, complete with paint finishes and even defects.

    (Image source: VRSim)

Chris Wiltz is a Senior Editor at Design News covering emerging technologies including AI, VR/AR, blockchain, and robotics.

5-reasons-you’ll-need-a-3d-printer-on-mars

3D printing will play a vital role when we get to Mars. Here are five reasons why.

  • The vision of actually setting foot on another planet is closer than it has ever been, thanks to technologies that truly would have seemed like something out of science fiction to even early Apollo astronauts. But as that vision inches closer to someday being a reality, it raises legitimate questions not just about how to get to Mars, but how to live once we’re there.

    According to NASA’s own timeline, a manned mission to Mars is years away and a lot of work has to be done before it could actually happen. But whenever we get to Mars, 3D printing will play a vital role. Here are five reasons why.

  • 1.) 3D-Printed Habitats

    It’s a pretty safe bet that when the first astronauts get to Mars, Marriott or Hilton won’t have beaten them there. They’ll need a place to stay, which they’ll have to build themselves. In May, NASA awarded $500,000 to AI SpaceFactory for its design of a structure that could be 3D-printed from natural materials found on Mars – specifically basalt, a dark volcanic rock in abundance on the red planet’s surface. Basalt would be extracted and mixed with renewable bioplastic resources processed from plants in a hydroponic garden to create common 3D printing filaments like PLA. The result would be a structure providing protection from the extreme temperature swings and intense radiation on Mars.

    (Image source: AI SpaceFactory)

  • 2.) 3D-Printed Food

    After moving into your 3D-printed home and workspace, you’re going to be looking for something to eat. Some of your nutritional needs can be met from the aforementioned hydroponic garden growing in one of the modules in your Martian home. But that can’t meet all your needs, and of course the Martian landscape won’t be providing any help. Packing, storing, and maintaining the freshness of food for a trip that is estimated to take 32 months just to get there – and then staying for perhaps years – is simply impractical. One potential solution is to 3D print at least some of the nutritional requirements. Today, 3D-printed foods are still in the novelty stage. It’s common at trade shows to see 3D-printed chocolates or other candies. But the Holy Grail of 3D-printed food has been to create a meal – chicken, rice, and a vegetable, for example – that would satisfy the nutritional needs specific to any individual. This is a long way off, but such a solution in one form or another would be required to meet the sustenance needs of an extended stay in space.

    Above: A graphic from Ewha Womans University in South Korea shows how food can be 3D-printed to provide people with a healthier, better-balanced diet and promote healthier eating.

    (Image source: Jin-Kyu Rhee, Ewha Womans University)

  • 3.) 3D-Printed Medicine

    Humans get sick and have accidents, so any extended stay in space has to take medical needs into consideration. Research into 3D-printable bioinks is expanding at an accelerating rate. Today, there are projects experimenting with 3D-printed medical applications in low-gravity environments, trying to replicate everything from bone cartilage to skin tissue. In addition, the idea of using 3D printing for personalized medicine (creating prescriptions specific to the individual) has been gaining ground in recent years and would certainly be a necessity in any extended space exploration.

    Above: MIT engineers have 3D-printed stretchy mesh, with customized patterns designed to be flexible yet strong, for use in ankle and knee braces.

    (Image source: Felice Frankel)

  • 4.) 3D-Printed Tools

    No one talks about this when discussing the role of additive manufacturing in space exploration, but the fact is there’ll be lots of equipment in any future Mars colony. And equipment, no matter how well built or how well designed, breaks. So if you’re cruising around the rough Martian landscape in your Mars rover and a strut snaps, how would you replace it? Although they wouldn’t likely be an early deployment, at some point 3D printers capable of producing different materials would eventually make their way to any Moon or Mars colony. Printers that could print in materials compatible with the atmospheric conditions or, ideally, extracted from the planet’s natural resources could solve a lot of maintenance and spare-parts issues.

    Above: A team from Curtin University in Perth, Australia, for example, is creating a 3D-printed toolkit to assist with living and working on Mars.

    (Image source: Curtin University)

  • 5.) Creative Problem Solving with 3D Printing

    Having been in the industry for several years now, what has become obvious to me is that 3D printers, regardless of additive technology or material, are problem-solving tools. Years ago I sold 3D printers to a lab at Pfizer Corporation in Groton, Connecticut. At the time I was very puzzled: why would a pharmaceutical lab want a 3D printer that only prints in PLA? Well, they printed a fish-food dispenser that could properly mix the drug being tested with the right amount of fish food. They printed test tube holders that would hold test tubes in a specific orientation. They printed a tablet for counting pills that was easier to use than the one they had. In other words, they started solving problems that, prior to the printer, they had simply accepted.

    On the Toyota assembly line in Princeton, Indiana, I saw a simple 3D-printed tool that reduced a basic task from four steps to two. That might not sound like much, but multiply it by 4,000 cars a week times 50 weeks. Reducing individual tasks even a little reaps huge benefits in productivity and in reducing worker fatigue. There’s really very little doubt that a 3D printer on Mars, printing tools or parts as they are needed for the hydroponic garden or the rover, would be enormously valuable.

    (Image source: Olav Ahrens Røtne on Unsplash)
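The arithmetic behind the Toyota example above is worth spelling out. Here is a minimal sketch; the time saved per step is purely my own assumption, since the article gives no timing figures:

```python
# Rough savings estimate for the assembly-line example.
cars_per_week = 4_000
weeks_per_year = 50
steps_saved_per_car = 2   # a four-step task reduced to two
seconds_per_step = 5      # assumed value; the article gives no timing

steps_saved = cars_per_week * weeks_per_year * steps_saved_per_car
hours_saved = steps_saved * seconds_per_step / 3600
print(steps_saved)         # 400000 steps eliminated per year
print(round(hours_saved))  # 556 worker-hours saved per year
```

Even with a conservative five seconds per step, one small tool frees up hundreds of worker-hours a year.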

Jack Heslin is the Founder and President of 3DTechTalks, as well as the Head of Business Development for Lazarus3D, a medical 3D printing start-up. 

nasa’s-dart-mission-aims-to-save-earth-from-asteroids

NASA is well along on a new mission that may ultimately save Earth from an event like the one that caused the extinction of the dinosaurs. To Far Side readers, no, it wasn’t because of smoking. It looks like the dinosaurs went through the trauma that Bruce Willis and Ben Affleck try to avoid in the movie Armageddon, when Earth was threatened by an asteroid on a direct path to our planet.

The Double Asteroid Redirection Test (DART) is a planetary-defense-driven test of technologies for preventing a hazardous asteroid from impacting Earth. DART will be the first demonstration of the kinetic impactor technique to change the motion of an asteroid in space.

The DART mission is tasked with hitting the moon of this asteroid. Project managers chose an asteroid with a moon so they could measure the deflection by tracking one body’s orbit around the other. (Image source: NASA)

The germ of the idea for DART came from the need to protect Earth from the potentially cataclysmic event of an asteroid hitting the planet. “The DART project just started with scientists thinking about planetary defense and how you deflect a threat,” Elena Adams, a space systems engineer at the Johns Hopkins University Applied Physics Laboratory, told Design News. “It’s hard to measure the hit on an asteroid that is traveling by itself, so we decided to find an asteroid with a moon. You can measure the moon before the test and after the test. You can see if you hit the moon hard enough to move it from an hour to 52 minutes.”

DART spacecraft. The DRACO (Didymos Reconnaissance & Asteroid Camera for OpNav) imaging instrument is based on the LORRI high-resolution imager from New Horizons. The left view also shows the Radial Line Slot Array (RLSA) antenna with the ROSAs (Roll-Out Solar Arrays) rolled up. The view on the right shows a clearer view of the NEXT-C ion engine. (Image source: NASA)

As well as testing the ability to move a large object in space, the DART mission is also testing a new spacecraft. “We’re demonstrating a NEXT-C engine powered by ions,” said Adams. “DART is the first mission to demonstrate smart navigation, which means we’re going to be guiding ourselves into the asteroid autonomously.”

DART spacecraft with the Roll Out Solar Arrays (ROSA) extended. Each of the two ROSA arrays is 8.6 meters by 2.3 meters. (Image source: NASA)

Once launched, DART will deploy the Roll Out Solar Arrays (ROSA) to provide the solar power needed for the craft’s electric propulsion system. The DART spacecraft will demonstrate the NASA Evolutionary Xenon Thruster – Commercial (NEXT-C) solar electric propulsion system as part of its in-space propulsion. NEXT-C is a next-generation system based on the Dawn spacecraft propulsion system that was developed at NASA’s Glenn Research Center.

As for changing an asteroid’s trajectory, the test will help determine the amount of impact necessary to make a difference in the asteroid’s path. “You want to hit the asteroid and see if you can move it. For a larger object, you would use other techniques such as gravity. Then there is the nuclear option,” said Adams. “There will be follow-up missions. The Planetary Defense Coordination Office coordinates different efforts, not just for the US, but for organizations in Europe and other countries. It’s an ongoing effort.”
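At its core, the kinetic impactor technique is momentum transfer. A back-of-the-envelope sketch – using purely illustrative masses and speeds, not DART’s actual parameters – shows why even a small spacecraft can measurably nudge a small moon:

```python
def impactor_delta_v(impactor_mass_kg, impactor_speed_m_s, target_mass_kg, beta=1.0):
    """Velocity change imparted to the target by momentum transfer.

    beta is the momentum-enhancement factor: beta = 1 models a perfectly
    inelastic hit; ejecta blown off the surface can push beta above 1.
    """
    return beta * impactor_mass_kg * impactor_speed_m_s / target_mass_kg

# Illustrative values: a ~500 kg spacecraft hitting a ~5-billion-kg moonlet at 6 km/s.
dv = impactor_delta_v(500, 6_000, 5e9)
print(f"{dv * 1000:.1f} mm/s")  # 0.6 mm/s
```

A fraction of a millimeter per second sounds tiny, but applied years before a predicted impact, a velocity change of that order accumulates into a large miss distance, which is why hitting the moonlet and measuring its new orbital period is such a useful test.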

The DART mission is directed by NASA and carried out by the Applied Physics Laboratory with support from several NASA centers: the Jet Propulsion Laboratory, Goddard Space Flight Center, Johnson Space Center, Glenn Research Center, and Langley Research Center.

Elena Adams will present the keynote address, Earth Strikes Back with the Double Asteroid Redirection Test, at the Drive World Conference in Santa Clara August 27-29.

Rob Spiegel has covered automation and control for 19 years, 17 of them for Design News. Other topics he has covered include supply chain technology, alternative energy, and cyber security. For 10 years, he was owner and publisher of the food magazine Chile Pepper.

Drive World with ESC Launches in Silicon Valley

This summer (August 27-29), Drive World Conference & Expo launches in Silicon Valley with North America’s largest embedded systems event, Embedded Systems Conference (ESC). The inaugural three-day showcase brings together the brightest minds across the automotive electronics and embedded systems industries who are looking to shape the technology of tomorrow.

Will you be there to help engineer this shift? Register today!

artemis-–-apollo’s-twin-sister-–-aims-for-the-moon

The Greek god Apollo had many siblings, but only one twin, his sister Artemis. Quite fittingly, Artemis is the name given to the NASA mission that will take up the work of the 1960s Apollo program. Even more fitting, in Greek mythology, Artemis rules the Moon.

The Artemis mission is planned in eight stages beginning in 2021. By 2024, the mission will take men and women to the Moon. The later stages will create a place where humans can live a lunar life, even if just on the Gateway vehicle that will become a Moon-orbiting studio apartment.

News about the details of the mission is coming out nearly every day. Just this Monday, NASA revealed its conception of the Artemis Moon lander. In the statement, which included an artist’s rendering of the ascent vehicle, NASA noted it is seeking comments from American companies interested in providing an integrated human landing system to put the first woman and next man on the Moon by 2024.


Artist’s rendering of an ascent vehicle separating from a descent vehicle and departing the lunar surface. (Image source: NASA)

Seeking Private Sector Input

NASA anticipates a three-stage human landing system. The agency is interested in alternative approaches that can meet the same long-term goals of global lunar access and a reusable landing system. The three-stage concept includes a transfer element to carry the crew from the Gateway down to low-lunar orbit, a descent element to take them to the surface, and an ascent element to return them to the Gateway. From there, they would board Orion for the trip back to Earth.

In a statement NASA Administrator, Jim Bridenstine, noted that “The Gateway will be our home base in lunar orbit – it is our command and service module for missions to the surface of the Moon. Using it as a port for the human landing system, its orbit around the Moon will give us access to the entire lunar surface, and a place to refurbish and refuel the landing system. This is no small feat. Building a 21st century landing system takes the best of our government and private-sector teams.”

The Plans for Returning to the Moon

In May, NASA announced it will return to the Moon by 2024. Here are the agency’s details about the program to land men and women on the Moon. In this video, NASA also explains the role nine private companies will have in the mission:

The Artemis Mission in Stages

Artemis 1 (2021):

During Artemis 1, Orion will venture thousands of miles beyond the Moon during an approximately three-week trip. Artemis 1 is designed to test all the hardware and operations before splashing down in the Pacific Ocean off Baja California.

Artemis 2 (2023):

Artemis 2 will be the first mission to the Moon with astronauts on board. Although taking a different trajectory than Artemis 1, this test flight of the Space Launch System (SLS) and Orion will test Orion’s life support systems with four astronauts aboard for a 21-day mission. 

Artemis 3 (2024):

This is where men and women actually return to the Moon. It will be the first crewed lunar landing since Apollo 17 in 1972. It will be a mission lasting less than 30 days, and it will involve a short rendezvous of the Orion capsule with Gateway, the space station in lunar orbit that will have been pieced together during five launches by private space companies under contract to NASA.

Artemis 4 (late 2025):

Astronauts will continue to help build the Gateway’s habitation module and descend to the lunar surface.

Artemis 5 (late 2026):

Astronauts will descend to the lunar surface for various tasks not yet determined.

Artemis 6 (late 2027):

Astronauts will install a robotic arm on Gateway and descend to the lunar surface.

Artemis 7 (early 2028) and Artemis 8 (late 2028):

Astronauts will descend to the lunar surface on one or both of these missions.

Moon Exploration

The first stop on the lunar surface will be the South Pole region, which is believed to contain water ice. It is also potentially rich in other resources, so the area is a good target for future human landings. NASA has studied this region heavily with robots. While this is far from the Apollo landing sites near the lunar equator, NASA’s Lunar Reconnaissance Orbiter has collected precise information offering details about the South Pole’s topography, temperature, and locations of likely frozen water – a necessity for sustainable human exploration.

While water ice is important for sustaining human life, it has other qualities that sustain lunar exploration as well. NASA is considering refueling capabilities to make the landing system reusable, while also working on in-situ resource utilization technologies to make rocket propellants using water ice, and rock and dust from the Moon. Once the ability to harness resources becomes viable, NASA could refuel lunar lander elements with the Moon’s own resources.

Ultimately the Artemis mission may lead to Moon colonization. On a recent press visit to NASA’s Houston facility, agency managers proudly showed off lunar vehicles. They even took a handful of eager journalists on lunar vehicle rides. I felt like a kid at Christmas. When asked if a Moon visit was on the horizon, the NASA host noted, “We’re ready to go as soon as the government pulls the funding trigger.” Well, Houston, that trigger has been pulled.


private-companies-will-lead-the-next-wave-of-space-travel

All across the world, entrepreneurs like Elon Musk, Jeff Bezos, and Richard Branson are taking their boyhood dreams of space travel and turning them into commercial space enterprises. Much of NASA’s work is now done by commercial companies. Indeed, when man heads off to the Moon again over the coming decade, it will be commercial space vehicles – coordinating with NASA – that will carry the human payload.


Companies such as Boeing are developing space vehicles for private space travel. (Image source: Boeing)

NASA is all for it, since commercial enterprises take a good portion of the burden off taxpayer funding. “Jeff Bezos, Elon Musk, and others were inspired by the Apollo missions, and they were frustrated that space technology petered out after that,” Loretta Hall, author of a number of books on space development, such as Space Pioneers: In Their Own Words, told Design News. When manned Moon missions ended with Apollo, these entrepreneurs nurtured dreams of taking the next steps. “Spaceflight didn’t progress beyond the Moon like people expected it to, so these private citizens decided to do it themselves.”

NASA leaders have viewed the emergence of commercial space companies as an encouraging development. “NASA views it as a positive when they have multiple technologies to choose from. In the past, they designed space equipment and chose someone to build it. Now, a lot of the raw research is going on outside NASA,” said Hall.

Space Vendors Become Space OEMs

NASA has always used private companies to develop and produce technology for spaceflight, so private space technology itself is not new. “NASA has always had private companies building their equipment, usually aerospace companies already involved with military production,” said Hall. “Companies such as Lockheed Martin and Boeing were involved.”

However, in the past two decades, private companies have started to produce their own space-bound vehicle technology. “What’s new is NASA is no longer the sole customer for commercial producers,” said Hall. “Private companies are either building for themselves or building rockets for other entities. In the past, for anything related to space, NASA was the only buyer. That’s no longer true.”

The Space Passenger May Fund Space R&D

The concept of commercial space often comes from the idea that not all research and development funding has to come from taxpayers. “Commercial space is going into the technical areas of trying to utilize space resources for commercial purposes, and it’s also going in the direction of space tourism because that becomes a way to finance the more technical pursuits,” said Hall.

Even NASA plans to get into the business of commercializing space. “NASA has announced that beginning next year they’re willing to sell tickets for people to spend time on the International Space Station,” said Hall. “They see the value of commercializing space.”

A number of billionaires – Elon Musk, Jeff Bezos, and Richard Branson among them – want to send private citizens to space. Their respective companies, SpaceX, Blue Origin, and Virgin Galactic, are dedicated to making space travel and space tourism more accessible. Here’s how they plan to do it:

Managing Commercial Space Operations

In the past, NASA developed the program and used vendors to build to spec. In the past 15 years, commercial companies have started to develop their own technology. Now when they work with NASA, it’s under an agreement that NASA can’t disclose the technology. “NASA works cooperatively with anyone who’s trying to develop a space product,” said Hall. “Yet the commercial companies don’t want to share any proprietary technology with NASA unless they have an agreement.”

There are also restrictions on how commercial space companies can operate. For one, don’t expect any non-US conglomerates to buy up any of the commercial space companies. “There is a treaty called ITAR (International Traffic in Arms Regulations) that put restrictions on what technology American companies can share with other companies,” said Hall. “Anything they do is subject to government approval.”

Even with all of the commercial space development, NASA is still developing its own space technology. “NASA is continuing its development of space launch systems,” said Hall. “They have a rocket in development. There is question about whether it is going to be successful. There have been delays, but that’s sort of normal.”


that-small-step-is-still-there-after-50-years

Apollo 11 landing site captured from 24 km (15 miles) above the surface by NASA’s Lunar Reconnaissance Orbiter (LRO). Tracks of the astronauts can be seen between the LM and various other discarded pieces of equipment. (Image source: NASA Goddard/Arizona State University)

The remnants of the footsteps are still there. Fifty years after Neil Armstrong and Buzz Aldrin walked on the surface of the Moon, the evidence of humankind’s first venture off our small blue planet is still visible. The astronauts spent over 21 hours on the lunar surface after their Lunar Module (LM) landed there on July 20, 1969, including more than 2-1/2 hours exploring the surface outside their spacecraft. Then they blasted off using the ascent stage of the LM, leaving the descent stage behind on the surface.

In November of 2009, NASA released images of the Apollo 11 lunar landing site in the Sea of Tranquility. The images, taken by the Lunar Reconnaissance Orbiter (LRO) from just 15 miles above the Moon’s surface, show the discarded descent stage of the LM, as well as tracks created by the astronauts as they moved about in the dust on the surface.

One of the astronauts’ trails leads to the Passive Seismic Experiment Package (PSEP), which was set up to provide the first lunar seismic data. It continued to return data for three weeks after the astronauts left. Also visible in the LRO photo is the Laser Ranging RetroReflector (LRRR), which allows precise laser measurements of the distance between the Earth and Moon. It is still operating to this day, and the discarded cover of the LRRR can be spotted nearby, where it was dropped by one of the astronauts.
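The ranging technique itself is simple time-of-flight arithmetic: bounce a laser pulse off the reflector and halve the round-trip travel time. A quick sketch, using the mean Earth-Moon distance as an illustrative check:

```python
C = 299_792_458  # speed of light, m/s

def distance_from_round_trip(round_trip_s):
    """Distance implied by a laser pulse's round-trip travel time."""
    return C * round_trip_s / 2

# A pulse that returns after about 2.564 s implies roughly the mean
# Earth-Moon distance of ~384,400 km.
d_km = distance_from_round_trip(2.564) / 1_000
print(f"{d_km:,.0f} km")  # 384,334 km
```

In practice, timing the pulse to sub-nanosecond precision is what lets lunar laser ranging resolve the Earth-Moon distance to within centimeters.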

Another trail follows an unplanned excursion near the end of the time spent on the surface. Armstrong ran over to get a look inside Little West crater, about 50 meters (164 feet) from the LM. This was the farthest either astronaut ventured from the landing site. Armstrong and Aldrin’s tracks during their time on the lunar surface cover less area than a city block.

An artist’s illustration of the LRO taking photographs and measurements of the surface of the Moon. (Image source: NASA)

The LRO was launched on June 18, 2009, and entered lunar orbit on June 23, 2009. LRO’s mission is to help identify sites close to potential resources with high scientific value, favorable terrain, and the environment necessary for safe future robotic and human lunar missions. The LRO has also photographed all of the Apollo lunar landing sites, as well as the locations where the various jettisoned Lunar Modules impacted the lunar surface after having returned the astronauts safely to the orbiting Command Module.

According to NASA, the instruments on board the LRO spacecraft return a range of global data, including day-night temperature maps, a global geodetic grid, high-resolution color imaging, and the Moon’s UV albedo. There has been particular emphasis on the polar regions of the Moon, where continuous access to solar illumination may be possible and where frozen water may exist in the permanently shadowed regions at the poles. LRO data sets have been deposited in the Planetary Data System (PDS), a publicly accessible repository of planetary science information.

Because the Moon lacks any atmosphere that would cause erosion, short of a major meteor strike at the landing location, the only degradation of the footprints and equipment remaining on the Moon comes from the impact of micrometeors. In theory, this means the artifacts from the Apollo 11 Moon landing could remain undisturbed for centuries – or at least long enough to become a prime tourist attraction for the inhabitants of a future Moon base.

Senior Editor Kevin Clemens has been writing about energy, automotive, and transportation topics for more than 30 years. He has masters degrees in Materials Engineering and Environmental Education and a doctorate degree in Mechanical Engineering, specializing in aerodynamics. He has set several world land speed records on electric motorcycles that he built in his workshop.

return-to-earth-and-splashdown

SPACE WEEK: Re-entry from space has never been easy, but Apollo 11 successfully returned from the Moon and ended its historic journey on July 24, 1969.

July 22-24

Apollo 11’s three parachutes bring it safely home to a splashdown in the Pacific Ocean. (Image source: NASA)

Returning from the Moon took two days for Apollo 11, during which time two more television transmissions were made by the astronauts.

Re-entry procedures were initiated on July 24, 44 hours after leaving lunar orbit. The Command Module (CM) separated from the Service Module (SM) and was rotated around to a heat-shield-forward position. Because of bad weather in the original Pacific Ocean target area, the landing point was changed by about 250 miles. The CM Columbia entered the Earth’s atmosphere at 12:35 pm, protected from the intense heat of re-entry by the spacecraft’s heat shield.

President Richard Nixon welcomes home the Apollo 11 astronauts (from left: Neil Armstrong, Michael Collins, and Buzz Aldrin). The astronauts were quarantined after their mission to ensure they did not bring back any contamination from the Moon. (Image source: NASA)

As Apollo 11 entered the denser part of the atmosphere, three parachutes were deployed and Columbia splashed down 13 miles away from the USS Hornet recovery ship. Apollo 11’s total flight time to the Moon and back had been 195 hours, 18 minutes, and 35 seconds. After the spacecraft hatch was opened by the recovery crew, the astronauts donned isolation suits to ensure that they wouldn’t spread any possible lunar microbes. President Richard Nixon was on-board the Hornet to congratulate and welcome the astronauts home. These three men had just returned from one of humankind’s most remarkable, challenging, and historic journeys.