Embedded development for medical devices comes with its own unique challenges and considerations. (Image source: ICS)

Medical devices are one of the fastest growing areas of embedded hardware and software development. Increasing hardware performance, reductions in hardware size and cost, and increased demand due to an aging population in most developed countries are all making for a generation of market-driven products that are vastly more powerful, intuitive, connected, and inexpensive when compared with devices from as little as a decade ago.

But medical device development presents an array of challenges that engineers and developers used to traditional embedded development may not be familiar with.

Understanding Design Controls

Weak execution in any key area, like product definition or testing, can result in a vastly delayed time-to-market as the product awaits FDA clearance/approval.

The documentation burden associated with medical devices has a reputation that precedes it. Some developers have haunting recollections of nightmare projects bogged down in a pedantic quagmire of box-checking in the name of regulatory compliance.

But all experiences are not equal. The chief FDA regulation governing medical device development, 21 CFR 820.30, exists solely to ensure a medical device meets its stated intended use (aka user needs). And contrary to the reputation, it does so in a fairly straightforward way:

  1. Define what your users want (the Product Requirement Definition).

  2. Define how your product will provide it (Software Requirements Specification and Design Documentation).

  3. Analyze the risks and make it safe (ISO 14971).

  4. Test and prove that #1 is satisfied by the implementation of #2 (Verification & Validation).
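The thread connecting these four steps is traceability: every user need maps to at least one requirement, and every requirement to at least one verification test. A minimal sketch of that bookkeeping (the IDs and descriptions here are illustrative, not drawn from any standard or real device):

```python
# Hypothetical traceability data: user needs, requirements that trace
# back to a user need, and tests that trace back to a requirement.
user_needs = {"UN-1": "Clinician can silence an alarm"}
requirements = {"REQ-7": ("UN-1", "Alarm silences within 1 s of button press")}
tests = {"TC-31": ("REQ-7", "Measure silence latency over 30 trials")}

def untraced_needs():
    """User needs with no requirement tracing to them (step 1 -> step 2 gap)."""
    covered = {src for src, _ in requirements.values()}
    return [n for n in user_needs if n not in covered]

def untraced_requirements():
    """Requirements with no verification test (step 2 -> step 4 gap)."""
    covered = {src for src, _ in tests.values()}
    return [r for r in requirements if r not in covered]

print(untraced_needs(), untraced_requirements())  # [] [] -> full coverage
```

Any non-empty result flags a gap an auditor would also find, which is why keeping the chain simple and linear pays off.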

The intent is simple; the most egregious complications arise when the process is over-applied.

There’s a substantial documentation requirement embedded in 21 CFR 820.30 called “design controls.” Design controls are only required to be applied in the design of the final product. The controls are broad, spanning definition, documentation, review, approval, update, and implementation.

Design controls contain common elements, but the specifics are defined by the company and to the extent that they can be kept simple and linear, they will be easy to implement and defend.

A key part of keeping them linear is knowing when to turn them on. Design controls applied too early can create loops of rework later in the design process.

Do’s and Don’ts for Successful Medical Device Software Development

Here are some successful strategies (and potential pitfalls) gleaned from real-world medical device development projects:

DO: Use Prototypes to Streamline the Design Process

Leverage the “I’ll-know-it-when-I-see-it” principle to get past the most common roadblocks to product definition. A great prototype will have enough fidelity to enable organizational alignment, establish a common vision, and fine-tune the user experience (UX).

UX is a field of growing importance; the FDA has recognized poor usability as a key contributor to medical device events resulting in patient harm.

A high-fidelity UX prototype consisting of user interfaces, navigation, and a little business logic can be used to define workflow, detect and solve safety issues, and uncover logical inconsistencies well ahead of the labor-intensive design control process.

Conceptually, a good prototype details the boundaries of product features and then relegates the remaining effort to providing filler (i.e., a methodical process of design based on user needs). The alternative is working out design issues under the burden of full design controls.

The difference is akin to ‘paint-by-numbers’ rather than painting a picture that’s changing while you paint it. The productivity differences are profound. But the price for this boon is organizational discipline. Iterate on a prototype until it’s socialized across the organization; only then move into design controls.

DO: Utilize a Build-As-You-Go Quality System

If you’re a startup and don’t have a quality system, the perceived burden can be daunting. There are roughly 29 processes and procedures implicit in an ISO 13485-compliant quality system, but fewer than half of them apply while you’re designing the product.

Don’t try to turn on a full quality system all at once; rather turn on processes as they become relevant. Which processes to turn on and when is situation dependent, but the basics for a startup are likely to include:

  • Design controls

  • Risk management

  • Document control and record management

  • Supplier management

When building your quality system, it’s important to avoid overcommitting, so only describe the processes you’re confident you can execute. When trying to judge what’s essential, keep in mind that a quality system’s primary objective is ensuring that user needs are met by the product design. However, also keep in mind the other maxim that usually accompanies regulatory submissions: if it isn’t documented, it didn’t happen.

DO: Write Product Specifications with the Test in Mind

Another dicey area that typically takes far too long in medical device development is writing the product specifications. The Software Requirements Specification (SRS) is tremendously important because it potentially defines the largest test effort within the product. Every specification written will be painstakingly verified multiple times (with regression).

For a novice, it’s hard to know what should be included in an SRS and what shouldn’t. A key strategy is to focus on what test the specification will motivate. Is that something we need to test? Does it tell us how to test it? Is it clear what passing is?

These questions collectively embody the term “testability.” Writing the SRS with the test in mind encourages a deliberate focus on what’s important to explicitly verify.
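As an illustration of a spec written with the test in mind, consider a hypothetical requirement (the ID, limit, and measurement interface are invented for this sketch, not from any real SRS): it names what to test, how to test it, and what passing means, so the verification check falls out almost mechanically:

```python
# Hypothetical SRS entry, written with the test in mind:
#   SRS-042: "The system shall raise an occlusion alarm within
#             5 seconds of occlusion detection."
# What to test: alarm latency. How: measure detection-to-alarm time.
# What passing is: latency <= 5 seconds.

ALARM_LATENCY_LIMIT_S = 5.0  # pass/fail criterion taken from SRS-042

def verify_srs_042(detection_time_s: float, alarm_time_s: float) -> bool:
    """Return True if the measured alarm latency satisfies SRS-042."""
    latency = alarm_time_s - detection_time_s
    return 0.0 <= latency <= ALARM_LATENCY_LIMIT_S

# Simulated verification records: (detection timestamp, alarm timestamp)
runs = [(10.0, 12.3), (40.0, 44.9), (70.0, 76.1)]
print([verify_srs_042(d, a) for d, a in runs])  # [True, True, False]
```

A vague alternative ("the alarm shall sound promptly") would force the test team to invent the pass criterion later, which is exactly the ambiguity testability is meant to prevent.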

DON’T: Gloss Over the Formal Definition of Your Product

Getting the Product Requirement Definition (PRD) right early in the program is a core leadership challenge. It’s also the biggest opportunity for trimming the distractions in a program. The effort to complete the PRD is frequently underestimated.

An effective PRD will galvanize the organization around one product vision early within the lifecycle and will expedite the product validation process at the end of a program. It will also streamline all the activities in between.

An ineffective PRD will echo through the organization with ambiguous or incomplete requirements that fail to constrain the design, misalign engineering groups, and prove difficult to validate.

DON’T: Turn on Your Quality System Too Late

There’s a pattern that recurs: trying to make a product FDA-compliant after it has already been designed. Every so often a company, usually a startup, finds itself in possession of a near-complete product that wasn’t developed under design controls, which is a huge problem. This is what can happen when a company procrastinates on implementing a quality system.

A medical device that is designed and implemented without formal tracking of design inputs and design outputs isn’t following the FDA’s 21 CFR 820.30.

21 CFR 820.30 is a federal regulation, which means it carries the force of law. The FDA is an enforcement agency of the federal government, so the consequences of noncompliance are severe.

Designing a product and then trying to make it FDA friendly after the fact not only puts a company’s future on questionable ground, but it’s actually less efficient. In this approach, regulatory compliance is entirely overhead to the design process.

Done well, the quality system makes product execution efficient by keeping product objectives in front of you, and explicitly delivering on top-level concerns like product safety and efficacy.

DON’T: Tack on Human Factors at the End

In 2016, the FDA issued new guidance on its expectations for human factors in medical devices. According to the newly minted guidance, the FDA prefers progressive study of use-related hazards as the design evolves, plus a summative test at the end of design as part of the product validation process.

For products whose misuse could result in serious harm, the FDA is “strongly encouraging” a formal report be included with a product’s submission. This Human Factors Engineering/Usability Engineering (HFE/UE) report can be fairly extensive. A small sample of things it should include are:

  • Intended user populations

  • Use environments

  • List of mitigations

  • Discussions of residual use-related risk

  • Evaluation methods used

  • Process used to identify critical tasks

  • Evidence of effectiveness of each risk management measure

With eight sections and 34 sub-sections, the HFE/UE report implies a lengthy process that should run parallel to the design process. Any stage of the analysis is likely to cause product changes. Tacking it on at the end means those changes ripple through the design control process when its burden is at its peak, causing maximum rework. The converse is also true: the earlier the human factors analysis is done, the smaller the impact.

The Takeaway

Medical device development is vastly different from unregulated development, but most of the differences stem from a formalization of good product development practices. The key is to internalize the good practices and not treat the regulatory burden as external to a productive and efficient design process.

Strategic decisions, for instance determining when to turn on design controls, can have a staggering impact on the balance of the program – either chewing up resources with rework or streamlining development with a well-considered product description.

Milton Yarberry is a veteran of product development, architectures and software project management in the medical device space. He is the Director of Medical Programs at Integrated Computer Solutions (ICS), which specializes in custom software development and user experience (UX) design. ICS’ UX-first development approach improves end-user satisfaction and reduces overall program development costs.