
Your First Portfolio Review Won’t Go As Planned

Published by Dr. Richard Sonnenblick

It doesn’t matter how much you think you’ve prepared, how many run-throughs you’ve done, or how many assurances you have from your project teams that all the data will be there and will be correct. Something will go wrong: a critical piece of data will be missing, the slides will be wrong, an unanswerable question will derail the discussion. Like an opening night on Broadway or the first docking attempt between two orbiting spaceships, there is simply no way to anticipate what might go wrong—there is no substitute for the lessons you learn during your inaugural attempt at a portfolio review.

So what can you do about it? In advance of that first review, less than you would like. Data will arrive late, incomplete, or not at all, and the review will happen anyway, warts and all. But you can embrace this opportunity to learn from the experience and make the next portfolio review better.

Run-Throughs and Pilots are No Substitute for the Real Thing

That’s because there’s no way to rehearse the actual portfolio review. Run-throughs and pilots are a good start, but they’re always inadequate. Beyond the basic unpredictability of the portfolio review process itself, run-throughs often aren’t actual dress rehearsals because, without the pressure of the actual portfolio review, few teams adequately scrub the data on each and every initiative for every run-through. Scrubbing data, looking for inconsistencies, omissions, and outdated information is work—tedious and demanding work.

Without the pressure of an actual portfolio review, project teams won’t take extra pains to ensure all project cost and revenue numbers reflect their latest thinking around development, technology performance, and the market landscape. We’ve seen run-throughs where 10%, 20%, even 50% of projects never reported, simply because other priorities intervened. Thus, an essential opportunity to look at the entire dataset at once was postponed until the actual review.

Only when the real review rolls around, and project teams realize they will be held accountable for every number they provide, will they update and refine project information, potentially introducing a variety of new inconsistencies into the dataset—and rendering the run-through meaningless.

Similarly, without the pressure of a portfolio review, executives won’t sharpen their red pencils, review project status and projections, and ask the hard questions about each initiative in the portfolio—leaving important disconnects between project direction and strategy to be uncovered in the actual review.

Without the pull of a real, all-hands-on-deck portfolio review, the busiest, best-connected employees who know enough to call foul when project business cases don’t hold water won’t get involved. Thus, the business cases won’t be adequately vetted in advance of the actual review date.

The Real Thing is More Exciting than You’d Like

There are sometimes heart-stopping consequences to all these behaviors. Watching companies go through their first systematic review or prioritization process, we’ve seen a lot of scary moments, each of which a simple automated check (sketched after this list) could have flagged early:

Missing projects. Division R&D spending is known to be $115 million, but the complete project set is in, and current-year costs total just $88 million. What happened to the other $27 million in projects?

Missing forecasts. It’s one day before the prioritization meeting, and all of the project teams have finally reported in (whew). But most of the project forecasts came in during the last 24 hours; over half lack revenue forecasts, and two-thirds lack an estimate of net present value. How can prioritization be completed without the agreed-upon financial metrics?

Missing milestones. A review of project milestones suggests that a large fraction of project teams never updated their ‘next milestone’ dates, and many of the current projects show dates in the past.

Phantom revenue. Comparing project revenue by year to project launch dates shows 10% of projects accruing revenue before their launch dates. But there’s no time to go back to the project teams and fix these inconsistencies.
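Every one of these red flags is mechanical enough to catch in code before the review begins. Here is a minimal Python sketch of such checks, run against a toy in-memory project table; the field names, the sample records, and the $115 million division budget are illustrative assumptions, not a prescribed schema.

```python
from datetime import date

# Illustrative project records; every field name and figure here is assumed.
projects = [
    {"name": "Alpha", "cost": 40e6, "npv": 12e6,
     "revenue_by_year": {2025: 5e6, 2026: 9e6}, "launch_year": 2026,
     "next_milestone": date(2023, 11, 1)},
    {"name": "Beta", "cost": 48e6, "npv": None,
     "revenue_by_year": {}, "launch_year": 2027,
     "next_milestone": None},
]

KNOWN_DIVISION_SPEND = 115e6  # known division R&D budget (hypothetical)


def validate(projects, today=None):
    """Run the four sanity checks described above; return readable issues."""
    today = today or date.today()
    issues = []

    # Missing projects: reported costs should account for known division spend.
    reported = sum(p["cost"] for p in projects)
    gap = KNOWN_DIVISION_SPEND - reported
    if gap > 0:
        issues.append(f"${gap / 1e6:.0f}M of known spend is unaccounted for")

    for p in projects:
        # Missing forecasts: flag absent revenue or NPV estimates.
        if not p["revenue_by_year"]:
            issues.append(f"{p['name']}: no revenue forecast")
        if p["npv"] is None:
            issues.append(f"{p['name']}: no net present value estimate")

        # Missing milestones: 'next milestone' absent or already in the past.
        if p["next_milestone"] is None or p["next_milestone"] < today:
            issues.append(f"{p['name']}: next milestone missing or in the past")

        # Phantom revenue: revenue accruing in years before the launch date.
        early = [y for y in p["revenue_by_year"] if y < p["launch_year"]]
        if early:
            issues.append(f"{p['name']}: revenue booked before launch ({early})")

    return issues


for issue in validate(projects):
    print(issue)
```

On this toy dataset, the script flags the $27 million gap, Beta’s missing forecasts, both stale milestones, and Alpha’s pre-launch revenue: exactly the surprises you would rather meet weeks before the review than during it.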

The good news is that these issues are coming to light. You and your team will have an opportunity to fix them before the next review. The bad news is that the show must go on. This review will take place, missing data, incorrect data, and all.

The key is to turn the challenge of the review into an opportunity to motivate better data hygiene moving forward.

Lemonade From Lemons

The first step is to make the feedback loop from the review back into the data collection and data vetting processes an explicit step in the review. From executives on down, everyone must understand that the portfolio review is not a destination, but rather a way station in a long journey. The process is designed to uncover weaknesses in the project reports, and every omission or inconsistency found is a victory, bringing the team one step closer to a world of quality forecasts that help the company fund only the most valuable initiatives.

Don’t stop at identifying the data issues in each case. Charge onwards: conduct forensics on each business-case omission or inconsistency to identify why it arose and how it can be addressed. Why did so many teams not provide a net present value? Why did millions in costs go unallocated to individual projects? Without understanding each inconsistency, and addressing its root causes, you’ll see the same problems in the next review.

The second step is to conduct the current review with the available data, and make as many decisions as possible, even with less-than-perfect data. R&D is fraught with uncertainty, and the data will never be perfect anyway. Furthermore, working with incomplete data will generate a hunger among executives for more complete datasets, backed up by teams that stand accountable for every forecast. That will produce more complete data in the next round—for pure motivation, there’s no substitute for a demand for quality forecasts from the highest levels of the organization.

A real portfolio review also reinforces the relative importance of the different pieces of project information. As executives discuss each project, it will become very clear which pieces of information are essential and which are less so. You’ll have a clear picture of how to refine the data requirements for the next review.

The third step is to establish a clear connection between data quality and completeness and the release of project funding. After this initial review, projects lacking a solid business case at the next review should have their funding reduced to the bare minimum until the business case is established. If the business case isn’t completed within a specified time, all funding should be cut.
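To make that gate concrete, a minimal sketch follows; the six-month grace period and the action labels are assumptions for illustration, not a policy the article prescribes.

```python
def funding_action(has_business_case: bool, months_without_case: int,
                   grace_period_months: int = 6) -> str:
    """Tie the release of project funding to business-case completeness.

    The six-month grace period is an assumed placeholder; substitute
    whatever deadline your governance process actually specifies.
    """
    if has_business_case:
        return "release full funding"
    if months_without_case <= grace_period_months:
        return "reduce funding to the bare minimum"
    return "cut all funding"
```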

[Dashboard screenshot: With the Enrich Analytics Platform, custom dashboards enable effective validation and exploration]

Finally, consider creating a centralized data repository, like the Enrich Analytics Platform (EAP). Keeping all the project data in a central repository makes it easier to identify data quality issues in real time, as the data comes in. You’ll have reports at your fingertips: for instance, a list of projects still relying on last quarter’s cost estimates, or of those with revenue accruing before market launch. Reporting tools that come with a package like EAP enable you to instantly identify which projects are up to date, which are somewhat out of date, and which have been untouched since the last review. You’ll also have information on who has changed which inputs to critical projects, and the net impact of these changes on key project metrics.
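As a rough illustration of those staleness buckets, here is a generic Python sketch; it is deliberately not EAP’s actual API, and the 90-day threshold, the dates, and the field names are all assumptions.

```python
from datetime import date, timedelta

LAST_REVIEW = date(2024, 1, 15)   # date of the previous review (assumed)
STALE_AFTER = timedelta(days=90)  # "somewhat out of date" threshold (assumed)

# Hypothetical update log: project name -> date the project was last edited.
last_updated = {
    "Alpha": date(2024, 6, 1),
    "Beta": date(2024, 2, 20),
    "Gamma": date(2023, 11, 30),
}


def staleness(updated: date, today: date) -> str:
    """Bucket a project by how recently its data was touched."""
    if updated <= LAST_REVIEW:
        return "untouched since last review"
    if today - updated > STALE_AFTER:
        return "somewhat out of date"
    return "up to date"


today = date(2024, 6, 15)  # fixed so the example output is reproducible
for name, updated in last_updated.items():
    print(f"{name}: {staleness(updated, today)}")
```

Classifying by age of last update keeps the report honest: a project team cannot look current simply by having reported once, long ago.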

Take the First Step

Each year, tens of billions of dollars in R&D investments are shaped, allocated, and refined with the help of Enrich’s EAP. Are you interested in learning how we can help streamline your portfolio reviews, turning months of long nights and frustration into a value-enhancing, confidence-affirming exercise for your R&D organization? Contact us, and learn what our clients already know about the value of the Enrich Analytics Platform.


Written by Dr. Richard Sonnenblick, Chief Data Scientist

Dr. Sonnenblick, Chief Data Scientist at Planview, has years of experience working with some of the world’s largest pharmaceutical and life sciences companies. Drawing on the knowledge gained through that work, he has developed insightful prioritization and portfolio review processes, scoring systems, and financial valuation and forecasting methods that improve product forecasts and portfolio analyses. Dr. Sonnenblick holds a Ph.D. and a master’s degree in Engineering and Public Policy from Carnegie Mellon University and a bachelor’s degree in Physics from the University of California, Santa Cruz.