Results by Design: UX Insights for Business Leaders
Description: In this episode, Craig Nishizaki and Michael Woo discuss the challenges of handling unclear requirements in fast-paced projects. Based on their own experience, Craig and Michael unpack how unclear product visions, tight deadlines, and internal miscommunication can derail a project while emphasizing the importance of proper planning, aligning stakeholders, and addressing red flags early.
What You’ll Learn:
- The danger of unclear requirements and moving too fast
- Identifying and responding to yellow and red flags
- Importance of stakeholder alignment in product development
- Tips for balancing speed with quality in project timelines
Interview Participants:
- Craig Nishizaki, Head of Business @ UpTop
- Michael Woo, Director of UX @ UpTop
Transcript
Intro:
Welcome to Results by Design: UX Insights for Business Leaders, the podcast that dives deep into the world of UX design, strategy, and insights. Tune in, take action, and design your way to success.
Craig Nishizaki:
Hi, I'm Craig.
Michael Woo:
And I'm Mike. And we are your hosts for the Results by Design podcast. So how are you doing, Craig? It's been a minute since we sat down together and chatted.
Craig Nishizaki:
I know, I'm doing well. It's been really busy, very exciting times. How are you doing, Mike?
Michael Woo:
So the weather has turned to fall, but I can see the sun is stubbornly hanging around, so I’m very grateful for that.
Craig Nishizaki:
Yeah, the transition from summer to fall in the Seattle area is amazing. And you wake up in the morning and it smells like football. That’s what I always say.
Michael Woo:
I love football, right?
Craig Nishizaki:
That’s right.
Michael Woo:
Yeah. So let's dive into this episode. In a previous episode, we talked about the challenges that hold companies back in their digital transformation and digital initiatives. Today, though, we're going to talk about how unclear requirements are the number one reason that projects fail, what the root causes are, and some ideas on how to solve them. Craig, when we talked about this topic, you had some real-life stories that I thought were very relevant. Would you like to share a story to help frame what we're talking about today?
Craig Nishizaki:
Yeah, I would. This topic around unclear requirements, we've talked about it a number of times. We've even recorded a One Question with UpTop episode where we answered what the number one reason projects fail is. And a story I can share that I think is really relevant: a few years ago, we were brought in to redesign a member portal for a company in the health insurance industry. Their chief operating officer reached out to us. She had been a client of ours previously, and we had a great, trusted relationship with her. So we had an initial discussion to discover and understand the needs for the project. The meeting was with the IT and app development leadership team. That meeting was in July. And as we were talking with them, their stated objective was to design a modern member portal experience, with an MVP to launch by the end of the year and phased releases to follow.
And they provided us with a list of requirements. They listed out the features of the current, or legacy, portal, they included some new features, and then they walked us through a demo of their current system. Coming out of that initial conversation, some of our concerns were that it was a tight timeframe for us to really scope something; we needed the business requirements, the user requirements, the user stories, and we needed to know what the technical constraints were. So we brought that up to them after the meeting, and their response was: we need your statement of work, the budget, scope, and timeline so we can get started right away. The definition of success for the MVP is "feature parity," and I'll put that in quotes, in a modern UI, user interface. We have the user requirements, we'll get them to you. We have the business requirements, we'll get them to you. We have the technical requirements, we'll get those to you. You just need to focus on the design. And so we think about things in yellow flags and red flags, and there were definitely some that we discovered along the way. The big yellow flags were when they said "feature parity" and "you just need to focus on the design."
Michael Woo:
Yeah, sorry to interrupt. So what do you think they meant by feature parity?
Craig Nishizaki:
Well, that's a great question. We asked them that, and their answer was: feature parity is the same feature functionality as the legacy portal. So it has a login, the ability to see your benefits, et cetera. And our learning from this is that the definition of feature parity means different things to different people, so it's critical to dig deeper to understand each party's definition of it. It's similar to saying, I want a car. Well, is that car a Toyota Corolla or is it a Tesla? They're both cars, but they mean different things to different people. The other big yellow flag early on was that we weren't meeting with the other stakeholders: product, marketing, customer experience. The question we had internally was, why weren't they in the meeting with us and the IT team?
Michael Woo:
That’s an interesting message that it sends when you don’t have representation from the various groups in the discovery and scoping conversations.
Craig Nishizaki:
Yeah, for sure it is. It's kind of telling when one group is leading all the conversations and you're not getting that cross-functional conversation and collaboration happening. And then as we progressed in the conversation and in the process, some red flags came out. The back and forth on the details of the statement of work took over a month, but the deadline hadn't moved. So now we're into the end of August, and they're still committed to launching an MVP at the end of the year. So the timeframe is tighter. And during this process of discovery and scoping, the director of IT and the project manager were functioning like gatekeepers. They weren't allowing us to have those conversations with the stakeholders. And as we worked through the details, we thought they were nailed down, but then other details started trickling in.
Michael Woo:
So with all the yellow and red flags that came up, remind me again, why did the team decide to continue on with the project?
Craig Nishizaki:
Oh my gosh, that's a great question. The main reason is we had a relationship with the COO. We had the ability, and I also felt the responsibility, to go back to her with an understanding of their situation. Because what wasn't clicking for us, for our team, was why they were in the situation they were in, just four or five months away from when an MVP was supposed to launch, with everything still very loose. Again, had we not had that relationship at the COO level, at the executive level, we probably would've backed out earlier. But as we kept going, we realized that there was a big disconnect. We got the initial scope agreed to for an MVP redesign that would allow for feature parity.
We set up a kickoff meeting and an alignment workshop where everyone, IT, product, marketing, business leadership, customer experience, was going to be in the room at the same time. And based on what we had been provided as requirements, you and I presented the initial concepts, and the product and marketing team blew up. There was definite friction in that meeting between IT and the product and marketing folks. And what we found out as we were going through this conversation, and ultimately the blowup, was that product and marketing had provided user stories to the IT group that they believed were the requirements. And we had not yet seen those.
Michael Woo:
I remember that time like it was yesterday. It was one of the few times in my life where I walked through design concepts on a call, and afterwards I asked the room, the folks on the call, their thoughts on what they saw, and it was silence. Crickets. And then somebody spoke up and kindly said, this wasn't what we expected, and my heart just dropped. Yeah. Anyway, go ahead.
Craig Nishizaki:
There was a big misalignment, and for us, there's a choice. Do we keep moving forward or do we take a step back? How we function is, we take a step back. So we asked product and marketing about the high-level requirements they had listed out. During that meeting, they showed us, and the realization was that this was a re-imagining of the experience, not a redesign. There was at least six months of work on the design side, and we had one month to get it done. So the next question we asked, we took from the marketing and product folks to the IT people: do you have all the APIs, the data, and the systems in place for these features and functionalities that the product team has in their requirements for the MVP? And their body language and everything, they reacted by saying, you focus on the design, we'll get you the data. And that right there was a big red flag. It's very telling when you can't answer yes or no. Then we asked the head of customer experience what their goal for the redesign was, and he said, we want a better experience for our members. But it wasn't measurable. So we knew there were some big challenges internally that we needed to work through.
Michael Woo:
So before you continue, it sounds like we have this ticking time bomb scenario, right? One, we have a client stakeholder group that is not aligned on expectations. Two, communication back to us is inconsistent and delayed. And lastly, all the while the deadline is getting tighter and tighter. So what happens next?
Craig Nishizaki:
Well, this is again where experience, expertise, and process are really important. We knew time was ticking. So for us, even if we weren't going to pursue working on the engagement beyond this initial piece, we believed the highest value we could provide was clarity to the executive team and the sponsor of this initiative. Because ultimately, if this thing fails, someone's losing their job, and we wanted to at least provide an assessment of what was going on and some recommendations, and help them move forward successfully. So it was time to reset, and we needed to work around the IT organization. We met with the executive sponsor and the COO to dig deeper. We met with the product, marketing, and business folks to gather their requirements and make sure they felt heard, and we found that they didn't feel like they were heard. They felt like they had to push, to force the issue, by creating a list of user scenarios and handing those over to the IT team.
Michael Woo:
I think I know the answer to this, but why did we think it was better that IT was left out of these reset meetings?
Craig Nishizaki:
So you have to dig deeper to understand the situation. We needed to understand the backstory, why they were in the situation they were in, and figure out the highest-value ways for us to help them. And there are two approaches. You can get all the players in the room at the same time to hear what's going on. But if there's a lot of friction, a lot of mistrust if you will, then it's better to divide, hear each side, and then triangulate the discrepancies in the story and the facts. And by doing that, what we found out was that there was definitely an internal disconnect between IT, product, marketing, and business. The IT backend's reliance on third-party systems added a ton of complexity and constraints. There was also a skills and talent gap, with poor project management and turnover. And the developers they had in-house were used to managing legacy systems and third-party systems.
They were not builders. And if you're going to build something new, having a development team and a design team with the makeup to run fast and build, versus be methodical and manage, I think is a big difference. From our perspective, if they didn't accept the bad news once we did our assessment, then we were ready to end our part of the engagement. So as we did our discovery with these different groups and found out these different things, we set up a meeting with the COO to share the findings. We let her know there was no way they would be ready to launch by December. We verified that there was an internal disconnect and friction, and she was open to the conversation around the project management issues. And at that point, she asked us for recommendations on how we could help them.
Michael Woo:
Yeah. So in your opinion, when things start off this badly, what’s a good approach for getting clarity from stakeholders or clients? And in this case, how do you prioritize requirements when there are conflicting inputs?
Craig Nishizaki:
Yeah, I think the approach we took: digging deeper to understand the situation, then meeting with those groups individually so we could gather their requirements, put them all together, and map out where there was alignment and where there were discrepancies. Then we took a design thinking and UX approach to tease out what the vision for this product would be and get alignment on the vision for the MVP as well as the phased approach. During the design process, we designed the North Star, the ultimate desired future state, and then worked back to the MVP. And to help them prioritize their features, we did an impact-versus-effort exercise so they could really understand: hey, I want this feature, but that's going to take a ton of effort, maybe we should deprioritize that one for a future release. That allowed us to build out a roadmap for them, because ultimately everyone needs predictability and a plan. And in this case, they were looking for outside expertise to really help them define and develop that.
Michael Woo:
Yeah. Now that we’re reflecting back on this project, what do you think were the root causes of unclear requirements?
Craig Nishizaki:
Yeah, it's interesting. As we were thinking through this and looking at other projects, I think the root causes of unclear requirements follow a pattern. Oftentimes it's trying to move too fast, a lack of clear product vision, or underestimating the complexity of the solution or the situation you're in. And those all lead to bad or undefined requirements.
Michael Woo:
Those are some really good call-outs, and let's lean into these, because I feel like they happen quite often. When you say moving too fast, lack of clear product vision, and underestimating complexity, can you describe what those mean?
Craig Nishizaki:
Yeah, I'll try my best to describe this. It happens a lot, and they're often interrelated. There are many reasons these patterns can show up. Maybe the product is an idea that's been latched onto at the exec level, but it hasn't been defined well enough, and yet executive leadership wants it to ship by a certain date. That challenges the project team or the product team to figure out: what's the level of effort? How do we know we can get it done in this timeframe if the idea has already been latched onto? Or that trickles down to the deadline being aggressive and there being procrastination upfront. And oftentimes the procrastination is because they don't know where to start, or it's so complex they don't know how to get all the moving parts together, or the product vision isn't clear.
So that's how it's all interrelated. And they may have found that there are technical challenges that are going to hinder their ability to deliver, but the big piece of it is they don't follow the idea of bad news fast. Inside our company, we have a value that if there's bad news, we reveal it fast, so that we can take in the bad news and make decisions, rather than delaying those decisions, which downstream impacts the ability to deliver. Another common scenario is APIs and systems that don't have proper documentation. I think that with agile development, one of the bad habits many developers have developed, if you will, is not documenting or annotating their code, or not documenting what the API does in a clear, concise manner, because they're moving so fast to release that they're not updating those notes properly. So oftentimes you run into situations where APIs, systems, and code aren't documented.
Then I think the most common one I've seen, especially years ago at Microsoft and other large organizations that outsource, or that have an internal dev team resourced to them, is that the dev team is already onboard and waiting. So there's pressure to move, and you start doing work before it's really well thought through. Another example I can share is a health technology startup that brought us in to discuss design and development of a new product. It was a remote lab testing solution. They had created the solution during COVID, and now that the pandemic was winding down, they wanted to extend the product into other markets. It had functionality like registration and profile, lab sample submission, lab testing of the sample, and updates and notifications to the individual following the lab process flow, all those things involved.
And then securing the PII data and communications. And ultimately there would be a consumer experience, a lab experience, and a customer support experience. Our CEO at the time had brought us into this conversation. We had the initial conversations in October of that year and found out that their CEO had committed to investors that the product would be generating revenue in March of the following year, which was only five months away. So again, you start thinking about yellow flags and red flags. The yellow flags on that short timeframe were the unclear product requirements. They had initial requirements, but those were all works in progress, being updated and changed throughout every meeting. The product vision was there, but it wasn't really solidified. It felt like the need they were trying to meet was being driven by a market that wasn't well-defined or validated.
And then, to accelerate development, someone at the exec level mandated that a certain tech platform be used to build the consumer app on. As we did research, we found that it wasn't widely used, and the number of developers with experience working with it was limited. It also turned out the platform came out of the same university where this project idea was generated; there was some connection at the exec level and the board level for both. Ultimately, that platform was very limited in its functionality, so it would need to be customized to accomplish the stated product requirements. When we went back and relayed that about the tech platform, most of the project team asked for our technology recommendations. As we went down that path, we found there were other third-party backend systems they were using, or planned to use, and those contracts had not been signed yet. So it's five months until commercialization and their backend systems were yet to be confirmed, which went from a yellow flag to a big red flag. And the deadline still hadn't changed. They had been working on this for at least six months prior to meeting with us, but there were still all those loose ends, and that's when it went from yellow flags to red flags in my mind.
Michael Woo:
So you listed some warning signs that indicate a team might be moving too fast, and those included an aggressive deadline, no clear product vision, a predetermined tech solution without full vetting, lack of alignment, changing requirements, no proper documentation for APIs and systems, and cutting research out of the scope. And all while dev was on standby, adding to the pressure. Most consultants wouldn't touch this with a 10-foot pole, wouldn't you say?
Craig Nishizaki:
Yeah. In fact, I told our CEO that all of these things created my biggest concern, which was that we were going to engage and it was going to fail.
Craig Nishizaki:
So I agree with you, most consultants wouldn't touch this with a 10-foot pole.
Michael Woo:
So let me ask you, how do you push back when asked to deliver faster than is realistically possible?
Craig Nishizaki:
Great question. And again, as we go through the discovery and scoping process, there are checkpoints, things that we push back on. Ultimately, if the client doesn't align with this, that's when we walk away. So as part of educating them and pushing back, we provide relevant examples of work and the timeframes those products took to design, develop, and commercialize, as context for their situation. They can see we've done it before and, from that experience, we can tell them how long it should take, along with the effort, the budget, and the timeframe. We educate them on the process it takes for a product to be successful in its development and commercialization. And oftentimes that's because we may be talking to somebody on the product side who isn't as technical or doesn't have as much experience with the design process.
Or you may be talking to someone on the business side who has some blind spots as well, or someone on the technical side who may have some blind spots. So educating them on the whole process is part of what we do, and then we ask them for the fundamental pieces they should have in place at the point they're at in the product development process. So: do you have your problem statement? Do you have your product vision defined? Do you have the personas, the roles, use cases, and tasks all documented? Do you have your user requirements, your business requirements, your functional requirements? Do you have the technical architecture? Have you already started building out your list of prioritized features that we can then craft into the roadmap? Do you have the roadmap already in place?
And then, do you have the workflows, the user flows, and the wireframes already sussed out? Based on what they have in place, we can then assess the level of effort to get them to where they need to be to be successful. And I think the big lesson, the thing I always say, is that you can't expect to get a fixed scope, timeframe, and budget with loose requirements, right? There are two sides to the relationship of building out the product. If you don't have the requirements defined, the vision defined, the work done, then there's no way to have any predictability in a fixed scope, timeframe, or budget. And the reality is, everyone wants predictability and a plan. They want to know that when they commit budget to this project, it'll get done with high certainty within that budget, within that timeframe, and within the quality expectation they have.
Michael Woo:
Yeah, that is so true. I'm glad you said that. I hear you say that all the time, and I think folks need to really listen to those words, because when things are as ambiguous as they are, I don't know how you could put a price tag on that, right? Or
Craig Nishizaki:
Absolutely
Michael Woo:
Time.
Craig Nishizaki:
Yeah. It leads to mutual mystification, right? One group thinks they're doing one thing, the other group thinks they're doing another, and ultimately that leads to a bad result.
Michael Woo:
Yeah. So do you think one could manage the tension between speed and quality in product development?
Craig Nishizaki:
I do, but I think that to do that, everyone needs to agree that sometimes you have to slow down to speed up. What I mean is, take a step back to do the proper planning that will allow you to create velocity. That means making sure everyone's aligned on the product vision, confirming the expectations and requirements for the MVP and the future releases in the roadmap, and confirming that funding is allocated and approved for the initiative, and being very clear and transparent about that so everyone knows what they have to work with. If you keep running forward fast without the proper planning, you risk building a product that doesn't meet the mark for the end users, doesn't scale technically, breaks, has defects, has security issues, all these things you don't want. And just know that when you make the decision to slow down, it's cheaper and faster to course correct before any code is written. So if you're at the point where you're thinking, hey, I'm going to have to make a tough call and tell the CEO we need to slow down to speed up, remind them that it's cheaper and faster to course correct before any code's written.
Michael Woo:
I love that phrase, slow down to speed up. It's one of my favorites. So true. Craig, are there any other patterns you've seen that may lead to unclear requirements?
Craig Nishizaki:
Yeah, the last one, which again is very common when you're on a tight timeframe and trying to cut things out of the project to get there faster: oftentimes companies and teams skip user research. They'll cut it out because they think, hey, we already know what the user needs, or research takes too long, costs too much, and we're going to find out what we already know. I'll give you another story as an example that will bring light to this. We were brought in to design a North Star concept, a three-year-out vision, for a portal that provides third-party sellers with actionable insights to help them improve their customer satisfaction and profitability. And again, this was on a very tight timeframe, because it fit within the rhythm of the business. They were doing budget planning for the next year, and this envisioning piece becomes part of a business case that gets evaluated for budgeting.
And we understood there was a very tight timeframe, but the caveat was that the product owners already had an approved MVP vision that had been through vetting. So they were going to launch with that in the next budget season, and the concept design would have some flexibility because it was a three-year-out vision. Despite the tight timeframe, we recommended that some lean UX research, user interviews and usability testing, be included in the scope of the envisioning project, but both were ultimately cut by the client. Their rationale was that they had already done a bunch of research internally and felt that was enough to go by, and the time and budget constraints made it easier for them to cut that piece of work out. So before we moved forward, we wanted to get alignment with the project owners on what would happen next.
And the project owners said they would be the subject matter experts for the design and move forward with their assumptions and their user scenarios. Ultimately, the result of that initial envisioning project was that it fell flat. The assumptions behind the actionable insights, the visualizations, and the next steps got challenged by senior leadership, because they were looking at things more granularly and wanted proof: why did you design this visualization this way, and what's the next step this user should take? Through this experience, the project owners realized they needed to gather seller insights, user insights, so they could prove to leadership the rationale behind the design, the visualizations, and the actionable insights. And ultimately, they came back to us for recommendations on how to get there. So we reset the scope. Had they done what we originally recommended, the project would've moved a lot faster and probably cost a lot less. But because research was cut out, this reset had to happen.
Michael Woo:
That’s a great example of the value of UX research in the design process. In your opinion, what are the key indicators that more UX research is needed on a particular project?
Craig Nishizaki:
Yeah, that's a great question. I think if you don't fully understand the problem you're trying to solve, and you can't defend it in a business case to your senior leadership, there's some research that should be done. Or if you don't fully understand your users: who they are, their motivations, their needs, their actions. That's another indication. If there's a lack of consensus on what the users' priorities should be, and you're making design and feature-functionality decisions based on assumptions, that's another indication that research would be helpful. Decisions made on assumptions, opinions, or subjectivity and not data, again, that's another indication. And using existing research that's outdated, or a lack of recent user feedback, especially if market conditions have changed since the research was done. A great example of that is any research that was done, or any market trends that happened, during 2020 to 2021. Well, that was during the pandemic.
Everyone was remote, online shopping took off. Look at Peloton and the explosion of their business during that time, and then what's happened since. The market has changed dramatically. Also, if you're targeting a new market or user segment, or if you're working on a high-risk or complex problem and you don't have the data from user research, that's an indication that you need to get some done. But ultimately, to frame it differently: rather than thinking of research as being expensive, taking too long, and finding out what you already know, I think business leaders need to reframe it and understand that research is really a scoping exercise. If you do the research, ideation becomes obvious. You find the real problem, so the solution becomes evident sooner, and you actually create more focus and reduce features, which then reduces the development scope and accelerates the speed to market. If you frame it that way, then you'd want to do some user research upfront so you can speed up the design and development process. I've been doing all the talking, it seems, but Mike, the question for you is: how can you balance user research with the need to move quickly to market?
Michael Woo:
Yeah, that's a good question. But before I answer it, I was going to say, man, you're really coming up with the phrases today, you're really dropping them. Research is really a scoping exercise. I love it. But to answer your question, how can you balance user research with the need to move quickly to market? I always think it starts with the mindset that some research is better than no research. If you're already practicing some level of research, you're going to have to scale back that ideal activity, but really plan it out. So you can lean on lean UX or guerrilla tactics. Some that we've used in the past, for example: when it came to usability testing, we approached friends and family or other folks in our network. Though it's biased, it can be quick, and you can find folks that fit a particular role if you have them in your network.
You could also approach coworkers, ideally folks that are not close to the project and not working immediately in that realm. And then think about moderated versus unmoderated studies. Obviously, moderated takes a bit more time, since you have to actually be there to conduct the sessions. Look at unmoderated studies, though. They take some initial setup time, but they can get you a good sample size of data when it's all said and done. Those are just a sample of things you can do. Depending on the space itself, you could pop into a cafe or a public space and get participants that way. But again, the point is, some research is better than no research, and I don't like it when people cop out and say, hey, we just don't have time, right? When you do that, you're going to see the results, which, as we've talked about, aren't good. You know what I mean? At least you have some data points to work from.
Craig Nishizaki:
And to add to that, it's in the mindset: are we trying to gather directional feedback and validation, just an indication of whether this is going to work or not? Or are we trying to get something scientific? In most cases, early on, getting directional feedback is enough. Like you said, some research is better than no research.
Michael Woo:
Yep, absolutely. Well, Craig, I appreciate the conversation today. It was a good exercise to walk through some past and current projects to break down why unclear requirements so often show up. Today you've heard some ways to identify and deal with them. We recommend taking what you've learned and seeing if you can identify these challenges at the onset. Or maybe you're in the midst of a project right now that's exhibiting some of these characteristics; see if any of the ideas we've shared can help you, and let us know how it goes, because we'd love to hear about it. So that's it for today. If you liked our conversation, please click subscribe. Join us next time as we explore more innovative approaches to enhance your products and services, optimize customer interactions, and ultimately drive success for your organization. Tune in, take action, and design your way to success.
Outro:
Thanks for tuning in to Results by Design. If you liked this episode, be sure to share it and subscribe to our YouTube channel. We are also playing on all your favorite audio streaming podcast platforms, so stay connected and join us for the next one. Results by Design is brought to you by UpTop. Our mission is to equip business leaders like you with the knowledge and tools needed to leverage UX methods and strategies to achieve tangible business outcomes and create lasting value. Whether you're a seasoned executive or just starting to explore the world of UX, Results by Design is your go-to resource for unlocking the potential of user experience to achieve remarkable results. Tune in, take action, and design your way to success.