How far can we go with automation & AI?
How might we automate the planning process in Duke’s Nuclear power plants?
Guiding an interdisciplinary team in an agile environment through human-centered design thinking methods in an 8-week discovery and refinement process that led from opportunity area to product implementation.
PROBLEM
Duke Energy’s Nuclear fleet was looking for ways to further automate the planning process, a potential cost-savings opportunity area. They had an idea: “automated planning.” But what did that look like? And how much automation could - or should - we do?
OUTCOME
A recommendation to build a product concept called PIN (Plan It Now), an automation engine that interacts with an existing product, PlannerPro - along with the data to support bringing that product back into build. The output from the 8-week process was a plan for moving forward: a long-term vision with leadership buy-in, a big-picture roadmap for Automated Planning, and a roadmap for PlannerPro 2.0 and PIN prioritized by impact, desirability, and feasibility.
METHODS
Quantitative research through surveys
Interviews
Observations
Workshops with SMEs, designers, and technologists
MY ROLE
Design Strategist responsible for leading user research; co-led workshops with a product strategist consultant
Led product owner, product analyst, subject matter experts, IT lead, change manager, and data scientists through Discovery and design process
Coached team in user research and human-centered design
Starting with an idea… but unsure how far we could go.
Leadership wanted Automated Planning. But what did that mean, and what was truly possible? We took the idea through an 8-week process in which we dove into the problem space, captured assumptions, and tested ideas.
What problem were we solving? And who were we impacting?
We were missing a clear problem statement, and we needed one to stay focused on outcomes rather than output. So we had to do a bit of reverse engineering: what problem were we trying to solve, and why was automated planning the best solution for it? We used our current-state knowledge to capture all the end users impacted, their current needs, what barriers existed, and how solving for those would be beneficial. A bit of diverging and converging stimulated good conversation and helped the team craft a problem statement that had leadership buy-in.
What do we know… or think we know?
The team had done some work in this opportunity area in the past, so we went back to the SMEs from that project and had them walk us through a brain dump of their process, their insights, the solution they delivered, and the lessons learned along the way.
Revisiting past work on the planning process, planner’s pain points, and PlannerPro vision
We walked through the new current-state process for this part of the work management cycle, now that the new solution was in place, and remapped the experience as we understood it: which pain points remained, and which new pain points had emerged. This was also a good exercise for better understanding the technical complexities behind the solution, and for capturing any questions or assumptions we had.
Assumptions → Action Plan
We took the questions and assumptions we had uncovered up to this point and assessed each one based on how well known the answer was and the level of risk it posed to the success of our effort. This gave us a guide for moving forward and laid out an initial plan for further investigation of the problem space and opportunity area.
Talking to users to better understand new current state
We needed to better understand where the planning organization stood now that our solution, PlannerPro, was in place. Did we hit the mark? What was missing? What were the outstanding pain points, and had new pain points emerged? Our biggest question was about planners’ concerns and attitudes toward automation. We knew that was a major pivot from our past experience… had their mindset changed? Did we need to manage that better? What were some unknown concerns?
We designed a survey to assess the high-level temperature of the current state, and conducted 15 interviews with planners and supervisors to dive even deeper into our questions.
The survey gave us quantitative data to support some of what we had heard, and more. It verified some of our assumptions but also uncovered surprises we weren’t expecting. The biggest (unhappy) surprise was the overwhelmingly low NPS.
We learned from the survey and the interviews that the negative response was the result of three things: (1) an underdeveloped product, nowhere near feature-complete; (2) instability in the product and unresolved bugs; and (3) the way the product was introduced to the organization, and the lack of ongoing support and training provided.
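For reference, NPS (Net Promoter Score) is computed from 0-10 “how likely are you to recommend” ratings as the percentage of promoters (9-10) minus the percentage of detractors (0-6). A minimal sketch in Python; the sample ratings below are illustrative only, not the actual survey data:

```python
def nps(scores):
    """Net Promoter Score: % promoters (ratings 9-10) minus
    % detractors (ratings 0-6), rounded to a whole number."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores))

# Illustrative ratings only -- not the actual survey results.
sample = [2, 3, 5, 6, 6, 7, 8, 8, 9, 10]
print(nps(sample))  # 2 promoters, 5 detractors out of 10 -> -30
```

A score below zero, as in this sketch, signals that detractors outnumber promoters.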
From ideas, to capabilities, to a vision storyboard
We ran an ideation workshop
Using the storyboard to create user stories that would inform feature development
Mapping out the “magic behind the scenes”
It took a few tries to map out the logic behind automated planning, including which steps needed a human to vet and when. Once that was captured, we mapped out the most complex journey (the one requiring the most human interaction). This allowed us to identify which points of interaction would happen between the user, the existing tools, the new automation engine, and the system of record. With the experience more clearly laid out, we shared it to get further user feedback and make sure we weren’t too far off base.
Prioritizing our capabilities to inform product roadmap
With our experience mapped out and vetted (to a certain extent), we assessed each capability from our user stories in terms of Desirability, Viability, and Feasibility, with an extra layer to capture how each feature would enable true automation (and, as a result, true transformation of the process).
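A scoring matrix like this can be sketched as a simple weighted sum per capability. The capability names, weights, and 1-5 ratings below are hypothetical illustrations, not the actual workshop output; the extra weight on automation enablement reflects the added layer described above:

```python
# Hypothetical assessment dimensions and weights -- illustrative only.
WEIGHTS = {"desirability": 1.0, "viability": 1.0,
           "feasibility": 1.0, "automation_enablement": 1.5}

def score(capability):
    """Weighted sum of a capability's 1-5 ratings, with extra weight
    on how much the capability enables true automation."""
    return sum(capability[dim] * w for dim, w in WEIGHTS.items())

# Illustrative capabilities and ratings, not the real backlog.
capabilities = [
    {"name": "Stabilize PlannerPro", "desirability": 5, "viability": 4,
     "feasibility": 5, "automation_enablement": 3},
    {"name": "Auto-draft work plans", "desirability": 4, "viability": 4,
     "feasibility": 2, "automation_enablement": 5},
]

ranked = sorted(capabilities, key=score, reverse=True)
for c in ranked:
    print(c["name"], score(c))
```

Sorting by the composite score gives a defensible starting order for the roadmap, while leaving room for the team to override the ranking based on dependencies and judgment.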
The assessment helped inform our product roadmap. We took the top item, stabilizing PlannerPro, as a next step. We groomed our list of bugs and feature requests from the research, and the team collaborated with IT to identify what was already on their next release and what would be net-new, the latter being more fuel to support transitioning PlannerPro back into build.
UX Concept
In addition to the backlog prioritization, we also took a first pass at the UX design of these new features in PlannerPro (having identified PlannerPro as the best place to create the experience wherever a human element would be required). Rough drafts were created, and the team made plans to build prototypes to test with end users. (Project in progress as of April 2021.)
Designing the continuous learning experience
The primary insight we uncovered during research was a gaping opportunity to improve the support and continuous learning experience. The research showed that the low usage and low NPS were partly a result of the roll-out experience. So we ran a mini ideation workshop to identify potential ways to improve training, provide support, and help users feel comfortable with the product.
From the research, we created four personas to guide the project team going forward. These personas capture user archetypes that fell roughly along the adoption curve, and we used them as prompts for our brainstorming session. Ideas were prioritized, and next steps will be captured in ongoing work. (Work in progress as of April 2021, at which point I was moved to a new team.)
Lessons Learned
Get better quality input
If I could kick this project off again, I would start it differently. I would ask more questions to get better input going into the workshop: primarily, a clear problem statement, a clear vision, and an understanding of what outcomes we needed from the process. Clarity on leadership’s expectations would have helped drive the team through the first couple of weeks much more quickly.
Get leadership alignment on the input immediately
This goes hand in hand with the first lesson. Although not captured above, the effort took a few twists and turns due to a lack of clear alignment with leadership upfront. Clearer input would have allowed us to move through certain issues far more effectively. Overall, we got where we needed to be and learned a lot along the way, but better alignment could have saved us some trouble and 2-3 weeks of churn.