
Best practices for decision-making in hardware development

Designers, engineers and decision-makers carry a significant responsibility to make the right decisions. From medical implants to submarines to gas boilers, the consequences of today's choices can be disastrous down the line. Even a trivial object like a vegetable peeler can have flaws capable of causing serious injury.

Throughout the product development process, people face a continual need to make choices. 3103 or 4015 alloy? 1mm or 2mm chamfer? Domestic or overseas manufacture? Some decisions carry more weight than others, but a choice needs to be made nonetheless.

Meanwhile, the chaotic and iterative nature of the development process means making the right choice every time is practically impossible. Depending on the situation, there are a number of best practices and tools available to help reduce the burden on developers' brains. In turn, this allows us to make more logical choices and minimise failure.

Stick to the plan

As the development cycle progresses, more discoveries, more learning and more ideas arise. It can be quite tempting to act on this intuition and implement all these great new features that have presented themselves. The truth, though, is that spontaneous changes and deviations from the brief are usually unwise.

There are exceptions to this rule, such as safety-related changes, or desirable improvements that are considered urgent (e.g. to outperform competing products).

Any serious product development should have a PDS (Product Design Specification) detailed at the outset. It's a live document that encompasses all aspects of the product design in a quantifiable, bullet-style format. Topics include performance, ergonomics, target cost and service life. When it comes to making the right choices, decision-makers should use this document to ensure the outcome is in line with the desired specification.

Let’s take the ‘end-of-life’ requirement as an example: If the designer wants to combine two housing parts with adhesive, it would contradict the specification that requires a 'device that can be disassembled into all individual parts'. In essence, the PDS has decided that for you!

Validated decisions

Ideally, all significant decisions should be based on data - whether that be research, testing or market data. When the stakes are high, keeping the failure potential to a minimum is essential. It sounds obvious, but any decision based on data has better odds of being successful. If 80% of test units incurred corrosion, it's quite clear that a material with better corrosion resistance is required.

You don’t have data? Then get data! Even the most basic experiments can reveal some surprising truths. I think most people would agree that fixing a problem in the field costs around ten times as much as avoiding the fault during development. By data, I'm not exclusively referring to hard numerical datasets, but to any quantitative or qualitative resources.


Almost all decisions have a cost associated with them, whether through the immediate implementation or over the course of the product's life. Commonly, decision-makers must be willing to make a trade-off between budget and solution, as the preferable option usually costs more. The PDS should be referenced here, because it should outline the budget limitations. This highlights the importance of being aware of cost implications and outlining a rational budget at the outset.


Virtually all products with a degree of complexity are developed by a team of people. Whether that be a team of 10 people you employ, casual freelancers online, or a couple of mates from uni - they are all project contributors with a valid opinion.

Sometimes, founders contract a harmful disease called Hopium. It causes them to explode with optimism and perceive their project as almighty and flawless. Now, while this disease is fictitious, the symptoms are not. Optimism is good, but some founders need a wake-up call. Gathering opinions and data from your other contributors is important. This is particularly true for people who are specialists in their domain. Not only does this allow you to make more informed decisions, but getting your team on board with the process ensures a collaborative effort.


Thankfully, there are a number of tools available to support developers in the decision-making process. In the following section I highlight three common methods, which can help to bring the right decision to light. These examples are used mostly within development applications, but there are other more general models that might be worth a look too - like the Vroom-Yetton Model for example.

Pugh Decision Matrix

One of my favourites is the Pugh decision matrix, developed by Stuart Pugh for the design and engineering industry. It’s a smart evaluation tool that provides valuable comparison data.

How to use the Pugh Decision Matrix:

1. Establish the most relevant characteristics (selection criteria) and their importance (weighting).

2. Define a datum – this could be a competitor’s product, a former design or any applicable baseline.

3. Establish each design option, then evaluate against the datum.

The weighting factor is a great feature that helps to put each option into perspective. In the example given, Options A and B score almost identically, but ‘B’ comes out considerably stronger due to its ‘Usability’ score. This tool can be used for holistic choices like which concept to pursue, right down to small decisions like what thickness to use for a gasket.
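The scoring logic behind the matrix is simple enough to sketch in a few lines of Python. Each option is rated -1 (worse), 0 (same) or +1 (better) against the datum per criterion, and each rating is multiplied by that criterion's weight. The criteria, weights and scores below are hypothetical, purely for illustration:

```python
# Hypothetical selection criteria and their weights (importance).
criteria = {"Usability": 3, "Cost": 2, "Durability": 2}

# Each option scored against the datum: -1 = worse, 0 = same, +1 = better.
options = {
    "Option A": {"Usability": 0, "Cost": 1, "Durability": 1},
    "Option B": {"Usability": 1, "Cost": 1, "Durability": 1},
}

def pugh_total(scores, weights):
    """Weighted sum of an option's scores relative to the datum."""
    return sum(weights[c] * scores[c] for c in weights)

for name, scores in options.items():
    print(f"{name}: {pugh_total(scores, criteria)}")
```

Here the identical raw scores diverge once weighted: Option A totals 4 while Option B totals 7, because the 'Usability' advantage carries the highest weight.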

I developed a template for the Pugh Decision Matrix, which can be downloaded as an .xlsx file.


FMEA

Another comprehensive analysis tool is FMEA - Failure Mode and Effects Analysis. At its core, it allows developers to assess parts, subassemblies and systems to identify potential modes of failure. It is typically used for complex products and systems, and for critical applications where failure carries serious repercussions.

FMEA's main function is to evaluate all modes of failure, and this type of systematic analysis allows decisions to be backed up by data. That said, some of this ‘data’ is effectively assumption. Different templates and configurations of the FMEA tool are readily available online – the Wikipedia page has a pretty substantial overview of the topic.
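A common way FMEA turns those assumptions into comparable numbers is the Risk Priority Number: RPN = severity × occurrence × detection, with each factor rated 1-10. Here's a minimal sketch of that ranking in Python - the failure modes and ratings are hypothetical:

```python
# Hypothetical failure modes: (description, severity, occurrence, detection),
# each factor rated 1-10. A harder-to-detect fault gets a HIGHER detection score.
failure_modes = [
    ("Housing seal leaks", 8, 4, 3),
    ("Button spring fatigues", 5, 6, 2),
    ("Battery contact corrodes", 7, 3, 7),
]

def rpn(severity, occurrence, detection):
    """Risk Priority Number: higher values demand attention first."""
    return severity * occurrence * detection

# Rank failure modes by descending RPN to prioritise mitigation effort.
ranked = sorted(failure_modes, key=lambda m: rpn(*m[1:]), reverse=True)
for mode, s, o, d in ranked:
    print(f"{mode}: RPN = {rpn(s, o, d)}")
```

Note how the corroding battery contact tops the list (RPN 147) despite a modest occurrence rating, because it is hard to detect - exactly the kind of insight a gut-feel assessment tends to miss.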

NUF Test

A comparable tool to the Pugh Matrix is the NUF Test – standing for ‘new, useful and feasible’. It is popular in industry for its simple format and three common criteria. The scoring system is a little ambiguous because it doesn’t reference a datum, though the tool could be modified to do so. While limited in scope, it is a quick way to evaluate a lot of ideas.

How to use the NUF test:

1. Establish the challenge.

2. Brainstorm potential solutions.

3. Evaluate each solution based on the NUF criteria. Additional comments can be made too.
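The steps above reduce to a very small amount of arithmetic: rate each idea 1-10 for each of the three criteria and compare totals. A minimal sketch, with hypothetical ideas and scores:

```python
# Hypothetical ideas scored 1-10 against the NUF criteria.
ideas = {
    "Snap-fit housing":     {"new": 4, "useful": 8, "feasible": 9},
    "Self-healing coating": {"new": 9, "useful": 7, "feasible": 3},
}

def nuf_total(scores):
    """Unweighted sum across the three NUF criteria."""
    return scores["new"] + scores["useful"] + scores["feasible"]

# Pick the idea with the highest combined score.
best = max(ideas, key=lambda i: nuf_total(ideas[i]))
print(f"Strongest idea: {best} ({nuf_total(ideas[best])})")
```

The novel coating loses here (19 vs 21) because its low feasibility drags it down - the NUF Test's strength is flagging exciting but impractical ideas early.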

A template for this NUF Test can be downloaded as an .xlsx file.


While developers are faced with challenging choices, there is a set of best practices to tease out the more logical answer. To summarise, the most important considerations are:

  1. Stick to the specification.

  2. Base it on data.

  3. Make it a team effort.

And where necessary, use tools that allow for comparisons between competing solutions. You can't make the right decision every time, but hopefully this info will support you in making more informed choices!

Download links: 1. Pugh Decision Matrix 2. NUF Test

