Bridging The Designer-Engineer Divide

Instead of specialization, we should strive for some overlap. When engineers and designers share experiences, they develop empathy, which leads to clearer communication. This in turn improves outcomes.

Designer Saves The Day (True Story)

Sam was engaged on a renovation project, where the goal was to open up a high-traffic space in a house.

Sam was the “designer”, who drew up a detailed plan, one that would meet permit requirements, to remove a main load-bearing wall. Sam also defined the functional requirements: the beam had to be concealed, the opening had to be this wide, and so on. Mike was the highly skilled “engineer” tasked with making it happen.

As Mike and Sam discussed implementation, Mike made specific decisions about materials and dimensions. A few compromises were made, but everything looked doable, until Mike exposed the floor where the beam’s cross-support would go and discovered that the space contained an HVAC conduit that could not be relocated.

At this point, Mike the engineer sighed and said, “You know, this is a showstopper. I would reconsider removing this wall. We’ll have to move up past the vent, and at that point you’re not really removing that much wall.”

To this, Sam replied, “No, this wall is in the way. We can’t stop now. Let’s just think about it for a moment.”

Very quickly it occurred to Sam that they could build on top of the floor without impacting the vent. Sam described how it could be done. Apparently this option hadn’t occurred to the engineer, because it isn’t usually done that way, and with a grumpy inspector it might not pass inspection. So Sam vetted his idea with an architect, who highlighted the risks of doing that. Fortunately, Mike the engineer jumped in and offered solutions that would mitigate those risks. Sam then relayed the final plan to the inspector, who signed off on it.

And so Sam’s solution was implemented and the project was a success.

Mike was 50X more skilled than Sam. But two heads are better than one. Although Sam was a designer and project manager, he had acquired some basic construction knowledge. That knowledge turned out to be critical to the successful implementation of his design. And not being an expert was actually helpful, because he naturally thought outside the box.

The critical takeaway here was that BOTH the specialized skillsets of the engineer and the architect AND the designer’s basic technical acumen came together to produce the perfect solution.

Now let me tell you a different story.

Engineer Saves The Day (True Story)

Sam was involved in a bathroom renovation project. Since Mike the engineer was busy, Sam decided to enlist the help of a different engineer, named Anton. Anton said he required design drawings even for a small project. “You tell me what you need, and I’ll build it” was his motto.

So Sam decided to contract out the design to a dedicated designer, Julie. Julie came highly recommended. She looked at the existing layout and drew up several recommendations. Sam and Julie agreed on a plan to put the shower by the window, because it didn’t seem to fit anywhere else. Julie then produced detailed drawings for the engineer Anton. But at that point Anton was no longer available.

Luckily, Mike now was. Mike the engineer looked at the plans and immediately said it was a no-go. You can’t put a shower by the window, because obviously the water would go all over the windowsill and make a mess in the long run. Julie and Sam had been so focused on the paper layout that they overlooked the implementation.

So Mike the engineer thought it over for a day or two and proposed a completely different design, which moved everything around in a way that neither Julie nor Sam ever considered. It was a spectacular improvement. A bit more work but definitely worth it. Mike’s design was a rough pencil sketch on the wall. With that in mind, he successfully implemented the new design.

Most often, designers are frustrated that engineers haven’t implemented their designs exactly on the first try. But that isn’t always a reasonable expectation. I don’t always care to align my boxes down to the pixel or get the font colors and sizes perfect. I do always expect the implementer to think critically about what they are building. The relationship need not be antagonistic.

Building Rockets Iteratively

There are many examples of designers and engineers collaborating successfully.

If you get a chance to watch the documentary Cosmodrome, it’s an interesting story about how the Soviets perfected a closed-cycle rocket engine in the 70s. The U.S. thought it impossible and wasn’t even aware of it until the 90s.

The Soviets’ manufacturing and engineering process is a perfect case study in iterative design, prototyping, and collaboration.

The engineers drew up plans, and then scheduled a dozen test flights to iron out flaws. These were FULL flights, and they fully expected the first few rockets to explode. And they did. They even destroyed the launch complex and had to rebuild it to keep testing. Necessity drove these decisions: the Soviets simply didn’t have the right test facilities, so they adapted. Whereas the Americans could test an engine without actually launching it, the Soviets had to do a full launch.

The Soviets learned from each failure and with each test, they refined the engine. This way they achieved something the Americans could not.

Their design method is particularly instructive. Whereas for the Americans the design and build phases were separate, Soviet design engineers handed responsibility for their designs over to build engineers, who took over and were free to iterate on the design to make it work.

“That’s Not My Job”

I started out my career at a consulting firm as a Business Analyst working closely with other analysts and developers. This company called us renaissance consultants, and in fact all our official titles were simply “Consultant”. I remember a training presentation where it was emphasized that we should all do whatever is necessary: “If the trash bin is full, we can’t say ‘That’s not my job.’”

I believe this kind of environment allows bright individuals to thrive. It allows developers who have a flair for client-facing consulting to take active part in client meetings. It allows technically inclined designers/analysts like myself to pick up coding when needed. This allows the team to go beyond the requirements or design “hand-off” and instead work hands-on together on challenges.

Contrast this with the experience all designers have had with *some* developers: developers who convey “it’s not my job” by saying “Done!” without even bothering to load their work in a browser (they send you a link to a page that’s completely broken). Equally frustrating is the reverse experience with designers who don’t consider the mobile experience or feasibility at all.

What The Future Looks Like For IT

There are many trends in the industry that try to address this divide. The trend toward pattern libraries and design systems can help developers design. Abstraction layers like jQuery made JavaScript coding more accessible to designers. Prototyping software like Adobe XD lets designers build sophisticated interactions without any coding and makes it easier to share specs with developers. But the bigger problem is culture.

Still, I think there is room for optimism. As much as there is a trend toward specialization, there is potential for cross-pollination. Modern solutions are simply too complicated for designers to remain oblivious to technology and, I will add, business.

Organizations need to embrace more fluid approaches to specialization. Doing so allows individuals to use 100% of their potential, making them more valuable and more satisfied.

It also sends a message to educational institutions. For example, now that UX is coming into its own, there is a growing danger of creating new silos where none existed. The field was built by people from all sorts of backgrounds, yet it may now be taken up by those who enter it through specialized programs.

If we are not careful, specialization will change things, and I don’t think it will be for the better.

P.S. As I come to UX from a Business Analyst role, I have a similar view of that divide. In fact, a large consultancy I talked to recently mentioned they were experimenting with joining their two departments. They are not alone. But that’s a story for another day.

Solving Problems with User Research, Best Practices, and A/B Testing

What can I do to persuade more people to buy your product online? I tackled this question for 5 years as I ran A/B tests for diverse clients.

I remember one test idea that everyone on the team loved. The client said “That’s the one. That one’s totally going to win.” Well, it didn’t.

The fact is, most A/B test ideas don’t win.

What’s more, interpretation is tough, because there are so many sources of uncertainty: What do we want to improve first? Which of a hundred implementations is a valid test of our hypothesis about the problem? If our implementation does better, how statistically reliable is the result?

Is our hypothesis about the users actually true? Did our idea lose, because our hypothesis is false or because of our implementation? If the idea wins, does that support our hypothesis, or did it win for some completely unrelated reason?
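To make the statistical-reliability question concrete, here is a minimal sketch of the two-proportion z-test often used to judge whether a variant’s conversion rate reliably beats the control’s. All numbers are made up for illustration:

```python
from math import sqrt, erf

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-sided two-proportion z-test: is B's conversion rate
    reliably different from A's?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical test: 200/10,000 conversions (A) vs 230/10,000 (B)
z, p = two_proportion_z(200, 10_000, 230, 10_000)
print(round(z, 2), round(p, 3))
```

With these numbers, a 15% relative lift still yields a p-value well above 0.05 — one more reason a “winning” variant may be nothing of the sort.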

Even if we accept everything about the result in the most optimistic way, is there a bigger problem we don’t even know about? Are we inflating the tires while the car is on fire? 

If you take anything away from this, take this analogy: inflating your car tires while the car is on fire will not solve your real problem.

I believe the most effective means of selling a product and building a reputable brand is to show how the product meets the customer’s needs. This means we have to know what the customer’s problem is. We have to talk to them.

Then if we run an A/B test and lose, we won’t be back to square one. We’ll know our hypothesis is based in reality and keep trying to solve the problem.

Emulating Competitors

“I heard lots of people found gold in this area. I say we start digging there!”

That actually is a smart strategy: knowing about others’ successes helps define the opportunity. That’s how a gold rush happens.

This is why A/B testing blogs are dominated by patterns and best practices. So-and-so gained 50% in sales by removing a form field… that sort of thing. Now don’t get me wrong: you should be doing a lot of those things. Improve your value proposition. Ensure your buttons are noticed. Don’t use tiny fonts that are hard to read. You don’t need to test anything to improve, especially if you focus on obvious usability issues.

So what’s the problem? Well, let’s go back to the gold analogy. Lots of people went broke. They didn’t find any gold where others had or they didn’t find enough:

“The actual reason that so many people walked away from the rush penniless is that they couldn’t find enough gold to stay ahead of their costs.” ~ Tyler Crowe, Sept. 27, 2014, in USA Today

You could be doing a lot of great things, just not doing the RIGHT things.

The good thing is that many people do some research. The problem is they don’t do enough of it, or not directly enough. They are still digging in the wrong place.

“If I had only one hour to solve a problem, I would spend up to two-thirds of that hour in attempting to define what the problem is.” ~ An unknown Yale professor, wrongly attributed to Einstein.

Think about this for a moment: How can you sell something to anyone when you’ve never talked to them or listened to what they have to say?

Product owners often believe they know their customers, but assumptions usually outnumber verifiable facts. Watching session playback can hint at problems. Google Analytics gives a funnel breakdown, but it doesn’t give much insight into a customer’s mind. It’s like trying to diagnose the cause of indigestion without being able to ask the patient what they had for dinner or if they have other more serious health complaints.

The problem is it’s all impersonal, there’s no empathy. There’s no “Oh man, that sucks, I see how that is a problem for you”. It’s more like “Maybe people would like a screenshot there. I guess that might be helpful to somebody”.

Real empathy spurs action. When you can place yourself in your customer’s situation, you know how to go about helping them. If your solution doesn’t work, you can try again, because you know the problem is real rather than a figment of your imagination.

A Pattern Is A Solution To A Problem

Therapist: “Wait, don’t tell me your problem. Let me just list all the advice that has helped my other patients.”

Let’s say some type of visual change has worked on 10 different sites. Let’s call it a pattern.

A pattern works because it solves some problem. So choosing from a library of patterns is choosing the problem you have. You don’t choose Tylenol unless you have a headache or fever. You don’t choose Maalox unless you have indigestion.

If you know what YOUR problem is, you can choose the right patterns to solve it.

If you don’t know the problem, you won’t get far choosing a pattern because of its popularity, how strongly it worked, or how many people it has worked for. That’s like taking a medication you’ve never heard of and seeing what it does for you.

Pattern libraries are great for when you have a problem and want a quick, time-tested way to solve it.

Research Uncovers The Problem: A Short Story

Say you’re a shoe brand. You decide to reach out to people who are on your mailing list but haven’t purchased yet.

So you send out a survey. Within the first day, it becomes clear that many people are avoiding buying your shoes, because they’re not sure about sizing.

You’re shocked, but you shouldn’t be. User research insights are often surprising.

It’s just that you thought you anticipated this by posting precise measurements, a great return policy, and glowing testimonials. If anything, you thought people would mention the price, but no one so far mentioned price.

That’s a big deal for your product strategy. You need to build trust. So you set aside your plans for a full redesign (those fancy carousels on your competitor’s site sure are tempting). You set aside A/B test ideas about the font size of prices, removing fields, and so on.

You tackle the big problem. You do some research and come up with solutions:

  • match sizing to a set of well known brands
  • provide a printable foot template
  • allow people to order two sizes and return one
  • mail out a mock plastic “shoe” free of charge, and so on…

You ask a couple of people to come to the office and try some of your solutions.

Your user testing methodology is simple: First people pick their size based on either the sizing chart or template. Then they see if the real shoe fits.

Result? The matched sizing was very effective in predicting fit. The initial template didn’t work so well, because it’s hard to place a 3D foot in perfect position on a 2D printout. So you come up with a template that folds up at the back and front, simulating a shoe. The users like that much better. In fact, you start working on a cardboard model you can mail cheaply to anyone who requests it.

Now you’re off to testing it in the real world!

You design 2 different foot-sizing comparisons: a pretty one with photos of the top 3 brands, and a long, plain table with 20 different brands. You also create an alternative page that links to the downloadable foot template.

You A/B test these variants over 2 weeks and pick the one that works.
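As an aside on whether 2 weeks is long enough: a common rule of thumb estimates the traffic each variant needs before a lift of a given size becomes detectable (assuming roughly 80% power at a 0.05 significance level; the numbers below are hypothetical):

```python
import math

def samples_per_variant(base_rate: float, absolute_lift: float) -> int:
    """Rule-of-thumb sample size per variant for an A/B test:
    n ~= 16 * p * (1 - p) / delta^2
    (roughly 80% power, 0.05 two-sided significance level)."""
    return math.ceil(16 * base_rate * (1 - base_rate) / absolute_lift ** 2)

# Hypothetical: 2% base conversion, hoping to detect a 0.5-point lift
needed = samples_per_variant(0.02, 0.005)
print(needed)  # roughly 12,500 visitors per variant
```

If the site can’t deliver that much traffic to each variant in 2 weeks, the test needs to run longer or target a bigger improvement.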

(Then you go back to your research and find the next problem.)

You may also like this post about patterns: Compact Navigation Patterns.

If you want to uncover the biggest problems for your customers, I’m happy to help.

15 Jobs-To-Be-Done Interview Techniques

Here are 15 techniques I extracted from the Jobs-To-Be-Done interview Bob Moesta’s team did with a camera customer (link at bottom):

Set expectations

Give an introduction to how long the interview’s going to take and what sorts of things you’re interested in. For example, “even minor details may be important”.

Ask for specific details to jog the customer’s memory

Don’t just ask what the customer bought; ask why that model, which store, what day, what time of day, were they in a rush…

Use humor to put the customer at ease

Intentionally or not, early in the interview the whole team had a good laugh about something the customer said. I think it did a lot to dull the edge of formality.

Discuss pre-purchase experiences

Ask what the customer used before they bought the product and what they would use without it. Dig into any “I wish I had it now” moments prior to the purchase.

Go back to the trigger

Walk back to what triggered the customer to even start thinking about buying the product, and to a time before they ever considered it.

Get detailed about use

Interviewers and the customer talked about how she held the camera, which hand, in which situations she used it, which settings she used, and advantages/disadvantages of the alternatives. You want the customer to remember and imagine the product in their hands. Things like the weight or texture of the product could impact the user experience. Dismiss nothing.

Talk about lifestyle impact

Dig into the ways the product impacted the customer’s lifestyle: things they were/are able or unable to do. For example, they talked about how taking pictures without the camera affected the way she presented her trip photos to her sister. Focus on the “use” rather than the specific “thing”. You can ask “do you like this feature”, but then you want to move to “what does this feature mean to you in terms of what you’re able to do, how it affects your lifestyle, your future decisions”.

Explore product constraints

The interviewers talked about how other decisions and products impacted the purchase decision: for example, the size of the bag that has to fit the camera, and avoiding the slippery slope of requiring additional accessories.

Ask about alternatives

Products don’t exist in isolation. The customer had several other solutions, which serve different, specific purposes. Figure out whether the new product will replace or complement other products.

Point out inconsistencies, such as delays

Interviewers pointed out that the customer waited a long time to buy the product from the initial trigger to making the call after a trip. They asked “Why did you wait so long?”

Talk about the influence of other people

Ask about advice other people gave the customer or how other people may be affected by the decision.

Don’t put words in their mouth

In digesting and summarizing back to the customer, it’s easy to inject your own conclusions and words. Try to elicit attitudes and conclusions from the customer. Lead them to it, but don’t do it for them (a related technique is to start talking and then leave a pregnant pause, so the customer can complete the thought). In one clear case in the camera interview, the interviewers asked a leading question but then promptly noticed this and corrected themselves, saying “Don’t use his words”.

Talk about the outcome

Ask open-ended questions about whether the customer was happy with the purchase and in what ways. Ask about specific post-purchase moments when the customer felt “I’m glad I have it right now”, but focus on how the situation is affected, not the product itself.


Here are some additional techniques I considered after listening to the interview:

Avoid fallacy of the single cause

Don’t push the conversation towards a single cause (see Fallacy of the single cause). Rather than engage in cause reductionism, accept there may be multiple, complex causes.

Let’s say you pose the question: “Joe said that, and so you decided to buy X?” The simple narrative is intuitive, and the subject may be persuaded: “Yes, I guess that is why I decided to buy X.” The events may be true (Joe did say that) yet unconnected. In these cases, it’s important to point out inconsistencies rather than seek confirmation. For example, in the camera interview the interviewer rightly pointed out an inconsistency: “Why did you wait so long to buy X after he said that?” They also often asked “What didn’t you…” Work together to uncover the truth.

Beware planting false memories

Do not reflect back your own sentiments or ideas to the interviewee when clarifying. For example, asking people to confirm something they did not literally say may cause them to confirm a causal relationship that did not happen (other cognitive biases may aid this: pleasing the interviewer, tendency to fall for reductionism). It may plant a subtle attitude that might then be amplified through the course of the interview. Also be careful with “because” statements, as there is some evidence that we are biased to accept such explanations even when they are irrational (see The Power Of The Word Because).

More on the possibility of implanting false memories: Video 1 and Video 2.


Listen to the interview for yourself.