Product Portfolio, Managing Stakeholders Steven Jones

Create a year-end Product progress report

In an attempt to summarize our collective accomplishments over the past 12 months, I decided to create a simple, 1-page chart that communicates the product advancements and highlights remaining product opportunities.

The Product Decision: Use the familiar customer process as a backdrop for reporting finer-grained enhancements across the entire product suite.

Flickr image source: http://tinyurl.com/gpea4r5

Many people look to the end of the calendar year as a good opportunity for reflection and this is as true in the professional world as it is in the personal one. Twelve consecutive months would seem to provide a sufficient review period for attempting to understand the overall impact of the team's more significant product decisions.

I have previously written here about why I feel the Product Manager has a more pronounced need to establish credibility than most other roles in the organization. It is easy to make grand, sweeping promises but ultimately, a PM and his/her team will be measured (continuously!) by the results they achieve. And even those small victories can be short-lived as the team is pushed to address the next round of challenges.

So, it seems fitting that I take a little time during the last week of the year to revisit those promises I had made and to fairly tally up the "scores".

What drove this decision

In the spirit of transparency, I wanted to conduct an honest assessment of how we did or did not advance the product portfolio and then share that with the entire company. I had established clear objectives at the beginning of the year and now felt an obligation to compare our actual results with those original goals.

I was also hoping to create a new artifact that might even help motivate the teams. Something that said, "Look how far we've come!" If it turned out the way I thought it would, it could become a new collateral piece that gets the Sales team all fired up, sending them a clear message about how much easier it had become to tell and sell our story to customers.

Above all, I wanted to use this progress report as a reminder to the entire company that we have made great strides together along the very roadmap themes I laid out at the beginning of the year. It turned out that it was also useful for pointing out, even foreshadowing, where we'd likely be spending our time next year.

The decision: Use the familiar customer process as a backdrop for reporting finer-grained enhancements across the entire product suite.

Simplified customer business process

For the past few years, the Product team has been refining our understanding of our customers' core business process. We ultimately captured the process model in a single, clear diagram which we have been promoting and reinforcing with each major release so that, by now, everyone in the company was familiar with it.

This representation of the business process had been especially helpful in orienting the product team's efforts and now seemed to offer the ideal structure for lining up the various product initiatives from the past year. So I thought I would use it as the underlying structure for my progress report.

Plan of attack

Being a fan of 1-page artifacts, I set out to build a single chart that I could use to tell the story of the past year, one that would clearly illustrate our highs and our lows, our triumphs and defeats, our victories -- you get the picture.

Collapse the business process to create a familiar backdrop

Flatten the familiar process for context

I was confident that the process model we had been using for our own internal discussions - and which had been repeatedly validated by prospects and customers alike - would provide the right context for my new chart.

The original process diagram had filled the entire page. I tried to retain the familiar shape of the process but compressed it to fit in a much smaller vertical space. I removed the actor icons but kept the text labels for each step, moving them to the bottom of the new diagram to maintain the various transitions in the process.

Create a simple scale for relative comparisons

Horizontal lines set up a scale for comparisons

Next, I needed a way to compare the various product initiatives so that my audience could see the respective value of each project as it related to the customer's business process.

Ultimately I created a 3-point scale with the intentionally provocative labels "Poor", "Par", and "Premier." If I was going to use this chart to provide an honest report on how well we did over the past 12 months, I wanted to have a way to communicate where we had done well, where we had fallen flat, and where we were still struggling with mediocrity.

Plot trajectories for each product initiative

Plot each project as a trajectory

With these structural elements in place, I was now ready to add my data to the chart.

Instead of representing each initiative as a single point on the graph, I chose to roll the initiatives up under more familiar customer-facing features and plot them as parallel vertical lines. Each feature had a bullet to mark where it would have been scored at the beginning of the year. I then extended an arrow from the bullet; the length of each line showed how much we had improved the feature over the course of the past 12 months.

For some features, I stacked more than one arrow if we had taken several passes at them. Some features had no arrows at all, which was meant to show a complete lack of improvement - even more meaningful when they appeared lower on the scale. It was absolutely my intention to highlight these features as ones that would require attention in the months ahead.
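The chart described above can be sketched in a few lines of plotting code. This is a minimal illustration only: the feature names, positions, and scores below are hypothetical placeholders, not data from the article, and it assumes matplotlib is available.

```python
# Sketch of the year-end progress chart: process steps implied along the
# x-axis, the 3-point "Poor"/"Par"/"Premier" scale on the y-axis, and each
# feature drawn as an upward arrow from its start-of-year score to its
# year-end score. All feature names and scores are hypothetical.
import matplotlib
matplotlib.use("Agg")  # render off-screen; no display needed
import matplotlib.pyplot as plt

SCALE = {"Poor": 1, "Par": 2, "Premier": 3}

# Hypothetical data: feature -> (x position in the process, start, end)
features = {
    "Onboarding": (1, "Poor", "Par"),
    "Reporting":  (2, "Par", "Premier"),
    "Billing":    (3, "Poor", "Poor"),  # no arrow: no improvement this year
}

fig, ax = plt.subplots(figsize=(8, 3))
for name, (x, start, end) in features.items():
    y0, y1 = SCALE[start], SCALE[end]
    ax.plot(x, y0, "o", color="black")  # bullet: start-of-year score
    if y1 > y0:                         # arrow only where we improved
        ax.annotate("", xy=(x, y1), xytext=(x, y0),
                    arrowprops=dict(arrowstyle="->"))
    ax.text(x, 0.7, name, ha="center")  # feature label near the baseline

ax.set_yticks(list(SCALE.values()))
ax.set_yticklabels(list(SCALE.keys()))
ax.set_ylim(0.5, 3.5)
ax.set_xticks([])
fig.savefig("progress_report.png")
```

A feature with no arrow (like "Billing" here) reads exactly as the article intends: a bullet sitting alone, low on the scale, flagging next year's work.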

The impact

I had no prior experience creating a chart like this or even using a progress report to roll up the results of a year's worth of product roadmap initiatives. I will admit that it was validating for me to see the results of our collective efforts captured in a single place, though I recognize some obvious shortcomings with my approach:

  • My chart only has upward-pointing arrows - One might get the wrong impression that none of the work we did actually made things worse for our customers - ha!
  • My chart ignores all the non-feature work - Anyone who did not work closely with the Product or Tech teams might get the impression that this is all we accomplished when the truth is that so much more was done to support our growing customer base.
  • My chart only exists because the story is mostly positive - I'm not sure I would have pursued this task if I thought it would have shown us in a bad light.

I'd like to think I'm not purposefully avoiding reporting on any negative outcomes but I knew that I would inevitably focus on those areas that made us look good. In the end, I think this is a fair report and have received similar confirmations from others in the company. I certainly achieved my goal of painting a clear picture for the company and my stakeholders around the improvements we've made from a year's worth of product investments.

Look for more reports from theProductPath around charts, evaluating product portfolios, and managing stakeholders here on PM Decisions.

Product Data, Product Culture Steven Jones

Evaluate the available data for making better product investment decisions

In recognizing that many of the decisions we had made over this past year were not driven by or supported with real evidence, I decided it was time to change how the Product team gathered and used data in our processes.
The Product Decision: Confirm with the team exactly what data we were missing and what we would need to do to get our hands on it.

Metrics and the related analysis are as important to Product teams as they are to any other department. But in my experience, it can be hard to nail down exactly what data your Product team should be monitoring. Because we interact with so many other parts of the organization, we may find ourselves gravitating toward (or flat-out borrowing) what other departments are (or should be) measuring. For example, it may seem obvious for us to latch on to customer conversion metrics from the Sales funnel. But then we also recognize that Customer Success is interacting directly with actual paying customers, so maybe Net Promoter Score is something we should be tracking.

I have not found a single, universal measure that works for every Product team at every stage of its growth. I think good Product teams must continuously evaluate what data they need to help them make the best decisions. It is safe, if not self-evident, to start with metrics that ultimately tie back to and support the organization's high-level goals. But settling for superfluous or vanity metrics would go against the prevailing wisdom, which suggests adopting metrics that are truly actionable.

Product teams must continuously evaluate what data they need to help them make the best decisions.

I was interested in advancing my own organization's capabilities around gathering and using product data. In this article, I will share what came out of the exploratory discussions I initiated with members from the Product, UX, Engineering, Operations, and Data Analytics teams. In a companion article, I will report on how this analysis led to the next set of decisions to improve our overall proficiency.

What drove this decision

If you were to ask me to defend any of the decisions I've made with the teams over the past 10-12 months, I would not be able to reach for charts or graphs that clearly show one or more metrics trending up or down over time. In fact, if pressed, I might rattle off mostly anecdotal evidence like:

  • We've experienced little to no push back from internal stakeholders about this past year's product roadmap
  • There has been a noticeable absence of customer complaints around this year's product releases
  • The Product team and I collectively lack any real regrets in our decisions so far (i.e. no major screw-ups)

Oh, and we closed two of the largest deals in the company's history this past year!

Had we been getting lucky by landing indifferent customers? Had the Product team simply taken the easy path or picked the low-hanging fruit to avoid complexity or confrontation? I don't think so. I think the more likely story is that we had sensibly chosen to tackle the most pressing and glaring issues that were the highest priorities for all parties.

But it is not as though we were operating in a bubble. Along the way, we had certainly been talking directly with end users and also indirectly with the internal folks who themselves talk with customers. But many times we pushed forward without adequate data, and that hindered our ability to understand the impact of the changes we were making to the products.

I knew we could make better decisions if we had more information with which to work.

The decision: Confirm with the team exactly what data we were missing and what we would need to do to get our hands on it.

I want to be clear that this was an internal Product team endeavor driven by a desire to improve our own capabilities. Good or bad, the organization had not reached a point where my team was responsible for reporting KPIs or similar measures to our senior leadership or other internal stakeholders.

But before I could dream about some impressive analytics dashboard that might make Edward Tufte smile, I thought we could start small and work our way up from there.

Plan of attack

There was unanimous agreement that the Product team would benefit from collecting both qualitative and quantitative data. Thanks to an outstanding UX team, we had been making steady progress with capturing and evaluating qualitative data from customer interviews, surveys and the like. That work had indeed driven some major initiatives including the release of our new product earlier this year.

We determined that we would focus instead on complementing what we already had with more cold hard numbers from our own customer databases.

Focus on understanding feature usage to assess the impact of changes

The team concluded that the most obvious place to start would be to get a better grip on which customers were using what features and how. Much of our ongoing work would continue to revolve around revamping the existing platform components to provide a better user and administrative experience for our customers' primary use case.

Our challenge was knowing the size of the impact of making changes. Which customers would be affected? What migrations would be necessary? How much revenue would be at risk?

I shared the following, relevant story with the team to drive home the point:

Earlier this year, I had to push back on an anxious Customer Support team that was, as it turned out, overly worried about an upcoming feature migration I had scheduled for the entire customer base. With no real evidence, they had gotten themselves worked up over what they believed would be a disruptive conversion. After running some simple reports, I concluded that fewer than 50 customers would be affected and most of them would barely notice, as they had not really invested much in the (soon to be) legacy version of the feature. In the end, we pulled off the feature migration without a hitch and never received a single customer complaint.
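The "simple reports" in the story above amount to one query against usage data. Here is a self-contained sketch using an in-memory SQLite database; the table name, columns, and sample rows are all hypothetical stand-ins for whatever a real customer database would hold.

```python
# Sketch of a migration-impact report: count the customers who actively
# use a legacy feature, i.e. the ones a migration could actually disrupt.
# Schema and data are hypothetical illustrations.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE feature_usage (
    customer_id INTEGER, feature TEXT, events_last_90d INTEGER)""")
conn.executemany(
    "INSERT INTO feature_usage VALUES (?, ?, ?)",
    [(1, "legacy_export", 42),   # heavy legacy user
     (2, "legacy_export", 0),    # has the feature but never used it
     (3, "new_export", 15),      # already on the new version
     (4, "legacy_export", 3)],   # light legacy user
)

# Customers with recent activity on the legacy feature
affected = conn.execute(
    """SELECT COUNT(DISTINCT customer_id) FROM feature_usage
       WHERE feature = 'legacy_export' AND events_last_90d > 0"""
).fetchone()[0]
print(f"{affected} customers actively use the legacy feature")  # 2 here
```

Separating "has the feature" from "actually uses the feature" is exactly what defused the Support team's worry: most accounts fall into the first bucket, not the second.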

Update our product hypotheses to include outcome-based metrics

The Product team had been getting better at creating problem hypotheses to help drive individual enhancements and sometimes, even entirely new products. The hypotheses helped us focus on real problems and generally followed this pattern:

 

We believe that [user persona] is struggling to [complete this task, achieve this outcome, ...]

 

I suggested to the team that we look to extend the hypothesis statement to incorporate a measurable result as suggested in this Medium article by Chris Abad:

 

“We believe if we provide [solution] to [customer], it will result in [outcome] as measured by [measurable success metric].”

 

If we could tie measurable outcomes to the problems we were attempting to solve, it would certainly give us some specific metrics to zero in on.
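One lightweight way to make the extended hypothesis pattern stick is to capture it as a small structure so that no hypothesis can be written down without its success metric. This is a sketch under that assumption; the field names and the example values are illustrative, not from the article.

```python
# Sketch: the extended hypothesis statement as a small dataclass, so the
# measurable success metric is a required field, not an afterthought.
# All example values below are hypothetical.
from dataclasses import dataclass

@dataclass
class ProductHypothesis:
    solution: str
    customer: str
    outcome: str
    metric: str  # the measurable success metric

    def statement(self) -> str:
        return (f"We believe if we provide {self.solution} to "
                f"{self.customer}, it will result in {self.outcome} "
                f"as measured by {self.metric}.")

h = ProductHypothesis(
    solution="a bulk-import wizard",
    customer="new account administrators",
    outcome="faster initial setup",
    metric="median time-to-first-report under 1 day",
)
print(h.statement())
```

Because `metric` has no default, constructing a hypothesis without one fails immediately, which nudges the team toward outcomes they can actually measure.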

Help the team with product and design (re-)discovery

I am convinced that the 10-year-old software platform I inherited had accumulated much more code than was necessary to attract and retain customers. So in an attempt to reduce the size of the product set, I committed to devoting some time in every upcoming release to begin paring down the code base.

But we needed to be prudent about how to do this. As the teams looked to rebuild single pages or overhaul entire features, we desperately wanted to know what had to stay and what could be eliminated. We agreed that we would be able to draw good data from our existing customer base to help make smarter design decisions and prevent us from (re-)building stuff that is not needed.

The impact

I felt a little deflated at the end of this week's exercise. In the back of my mind, I had known that we should be operating better, but it did not truly hit home until we assessed our current situation. It was more than a little frustrating that we had so little data to go on.

Ours was not the first Product team at the company, and our predecessors apparently had tried similar efforts in the past. I tracked down a previous Head of Product and learned that this had been a struggle for him as well.

I am not discouraged though and am prepared to lead the team through the next stage in this process. Look for the companion article that describes the results of our push to acquire the product data identified here.

Look for more reports from theProductPath around product data, metrics & analysis, and product culture here on PM Decisions.
