Strategy and Vision Analysis

I was digging through some old files this afternoon and found this sketch from ’95.

vision-analysis-workflow

After I decomposed a senior executive’s Strategy and Vision document to identify its major weaknesses and suggest mitigations, he asked me into his office.  He had put out the document only a few hours before.  Mind you, I had joined the company just a few weeks before I sent the critique; others had warned me that it was a career-limiting move, and that this executive didn’t like criticism.  So when he called me into his office shortly after I sent my assessment, well, you can imagine what I figured: it was going to be one of the shortest careers in the company.

To my surprise and delight, he gushed over my assessment, calling it a brilliant piece of analysis and wishing others in his organization could do the same.  All he could ever get from his subordinates was a weak “Because we’ve always done it that way” or “I just don’t like it,” without any rational explanation.  His next request during the meeting was simple: he asked how I could do such work so quickly.  I pulled out some paper and sketched out my process, describing how I performed each step and how each step fed the next in line.

Today I’m still analyzing strategy and enterprise architecture, design, and construction; though my tools have matured somewhat, the objectives of each step are still the same.

 

I guess the saying “the more things change, the more they stay the same” still rings true.  Though many don’t realize it, most of the new solutions touted today are really the same ones from the past, just masked in newer technology.

 

 


Anti-value and Process Measurement

 Anti-value

The problem with Earned Value (EV) as applied by many PMs and enterprises is that often no value has actually been achieved. What has been done is to expend effort (work) toward a goal that is hoped to achieve value. Too often lately, earned value and effort have been treated as interchangeable; they are not. These are two different concepts.

It is said that “value is in the eye of the beholder.” However, if the beholder is not the ultimate consumer of the effort, I would contend you may or may not have achieved value. It is my assertion that value is in the eye of the consumer external to the working entity.

The example I use is rather down to earth, rather than relying on abstract deliverables. Take an aluminum billet and rough-mill it into the shape of an aircraft wing spar. Many PMs would claim some percentage of EV at this point: “See, we’ve accomplished x percent of the steps towards creating the spar, so we’ve achieved x percent of EV.”

However, if we stop there, can you sell this rough spar to the customer, or to another customer, for the cost of the materials plus the effort employed?  Typically not. More than likely the enterprise would be selling the rough billet at the scrap or salvage rate (the cost of the raw material). So really what has happened is that the enterprise has created anti-value.
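The gap between percent-complete bookkeeping and consumer-facing value can be made concrete with some arithmetic. The numbers below are hypothetical, chosen only to illustrate the wing-spar example; neither the budget nor the scrap price comes from a real engagement.

```python
# Compare the earned value a percent-complete calculation claims against
# what the part could actually be sold for if work stopped now.

def claimed_ev(budget_at_completion: float, pct_steps_done: float) -> float:
    """Classic percent-complete earned value: BAC * fraction of steps done."""
    return budget_at_completion * pct_steps_done

def realizable_value(market_price_if_stopped: float, cost_so_far: float) -> float:
    """Value as the external consumer sees it: what they'd pay, minus what
    we've spent so far. A negative result is anti-value."""
    return market_price_if_stopped - cost_so_far

bac = 10_000.0                           # hypothetical budget to finish the spar
ev = claimed_ev(bac, 0.40)               # PM claims 40% of steps -> $4,000 "earned"
anti = realizable_value(500.0, 4_000.0)  # rough billet only fetches scrap price
print(f"claimed EV: ${ev:,.0f}, realizable value: ${anti:,.0f}")
```

Run with these numbers, the PM books $4,000 of earned value while the enterprise holds a part worth $3,500 less than what was spent on it.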

Process Measurement

In 1985 Dr. Arno Schmackpfeffer et al. published an article in IBM’s Journal of Research and Development, “Integrated Manufacturing Modeling System.” In it, he and his peers asserted there are five primitive activities in a process, including Make, Move, Verify, and Rest. These activities are the basis for creating value.

Five Primitives

At this point many would put forth the argument that only one of the five, Make, creates value. However, that neglects other forms of value-creating activities. These, again, are in the eye of the consumer.

Does “Move” create value? Clearly it must, as people are willing to pay firms to move things for them. Even investment firms use movement to achieve value: arbitrage, moving goods from one location to another to gain value from the price differential between locations.

How about “Rest” (or “Store”)? This activity does nothing but leave an item in place, so what value is in that? Yet how many people lease self-storage space to keep things? There must be value in rest or store, as people are willing to pay for it.

Now what about “Verify”? Surely verify adds no value? With verify, the consumer is looking for assurance that what was supposedly accomplished previously was actually accomplished. Auditors and consultants are examples of service providers that engage in such activities, and enterprises are willing to pay for them.

Summary

I have labeled the section above “Process Measurement” as a correction to a previous blog article (https://briankseitz.wordpress.com/2013/11/11/structure-in-threes-process-value/) to put it in better alignment with my assertion that value is not achieved until someone is willing to “pay” for it.

In 1998 I took the five primitives a little further and developed a quick analysis method for BPR/M engagements. This approach enabled my team to analyze business processes to determine which activities could be eliminated to increase process efficiency and value contribution.
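A minimal sketch of that kind of primitives-based screen might look like the following: tag each activity with its primitive, then flag the activities no external consumer would pay for as candidates for elimination. The activity names and the pays-for judgments here are hypothetical illustrations, not details from the original engagements.

```python
# Quick primitives screen: classify activities, flag unpaid-for ones.
from dataclasses import dataclass

@dataclass
class Activity:
    name: str
    primitive: str       # "make", "move", "verify", or "rest"
    consumer_pays: bool  # would an external consumer pay for this step?

def elimination_candidates(process):
    """Return activities whose output nobody downstream would pay for."""
    return [a.name for a in process if not a.consumer_pays]

process = [
    Activity("rough-mill spar", "make", True),
    Activity("move billet between shops", "move", False),  # internal shuffling
    Activity("final inspection", "verify", True),          # customer requires it
    Activity("wait in WIP queue", "rest", False),          # unpaid storage
]
print(elimination_candidates(process))
```

Note that Move and Rest are not eliminated because of their primitive type; they are eliminated only when no consumer values them, consistent with the eye-of-the-consumer test above.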

Process Analysis

 

Methods: Designing a new Service

Had a great brainstorming session yesterday with a colleague discussing how to apply DSM techniques to the Service Design activities we’re involved in. [Going to have to update my methods PowerPoint again.]  We came up with using DSM to identify and cluster functions across various customer requirements, followed by a simple prioritization using a Risk DSM and Business Impact [Risk × Business Value = Priority].

Brainstorming DSM

The clustering activity is initially based on common functions. These are later categorized into one or more basic causes using fishbone (Ishikawa) diagramming and the 5 Whys: Capability, Capacity, Schedule, and Other.  The context is total throughput (i.e., “The Goal”).  This sets the context for prioritizing which areas to address first.
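The prioritization step can be sketched in a few lines: once the DSM exercise yields clusters, each gets a risk and business-value score and the product ranks them. The cluster names and 1–5 scores below are made up for illustration; only the Risk × Business Value = Priority rule comes from the session described above.

```python
# Rank DSM clusters by Priority = Risk * Business Value, highest first.

clusters = {
    # name: (risk 1-5, business value 1-5) -- hypothetical scores
    "capability gaps":    (4, 5),
    "capacity limits":    (3, 4),
    "schedule conflicts": (2, 3),
}

def prioritize(clusters):
    """Return (name, priority) pairs sorted by descending priority."""
    scored = {name: risk * value for name, (risk, value) in clusters.items()}
    return sorted(scored.items(), key=lambda kv: kv[1], reverse=True)

for name, priority in prioritize(clusters):
    print(f"{name}: {priority}")
```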

 

Afterwards I went to my whiteboard to draft a model of the service I’m working on using Osterwalder’s Business Model Canvas.  Once I summarized the business process models I had created before and extracted the key components, it became obvious that the context needed to change.  The business-within-a-business model context yielded some strange relationship issues.  Once internal supplying and receiving groups became Key Partners, everything snapped to the grid perfectly, even the cost structure and revenue stream categories, which are often hard to identify in internal business models: especially in IT, where chargeback is often applied inconsistently.

Business Model Canvas

Benefits Realization

Been a bit busy stoking the home fires of late.  Attended COFES; amazing conversations as usual.  This month I’ve been focused on several areas of applied business architecture research: IT Portfolio Management, Benefits Management, and Complexity Management.  All three are related to my Structure in Threes project.

  • I continue to develop the portfolio model section by section along with a working prototype.  Started considering the technology and system to offer to the market.
  • Benefits Management this month is really a parallel track: both R&D for ensuring a portfolio action supports enterprise goals, and applied practice for the projects I’m working on at Microsoft.  The past few weeks I’ve been creating a Benefits Dependency Network for one of the subprojects.  I’ll be reviewing and revising that today with stakeholders, as well as creating a draft Benefits Management Plan to help ensure the initiatives realize the promised benefits.  Part of that will be a Results Chain Contribution Matrix, a Benefits Distribution Matrix, and a Stakeholder Management Plan.  Most of these artifacts I’ll recommend to my group for future projects.
  • Complexity Management R&D is part of the BPR/M activities at work as well as Portfolio Management R&D.  Had a great discussion with Dr. Jacek Marczyk about elements of complexity; we’ll have lots more to discuss.  I like his high-level model: Structure Elements × Uncertainty = Complexity.  I had previously separated uncertainty from the equations I was developing: Business Process Complexity = {Information Complexity} × {Activity Complexity}, using BPMN models as the base to calculate each factor.  As of yesterday I revised the calculations from a standard node-count basis to also include the network linkages between nodes in each factor.  Later this week I’ll look at how to include Dr. Marczyk’s perspective of accounting for uncertainty.  I may also expand on that and use some of Courtney, Day, Schoemaker, and Primozic’s research into risk and uncertainty.  They have a lot of good material that could apply to the problem space.
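One rough reading of the revised calculation above can be sketched as follows: score each factor by node count plus network linkages, then combine the factors multiplicatively per Business Process Complexity = {Information Complexity} × {Activity Complexity}. The simple additive node-plus-link scoring is my assumption for illustration, not the author's actual formula, and the counts are invented.

```python
# Combine BPMN-derived factors into a process complexity score.

def factor_complexity(nodes: int, links: int) -> int:
    """A factor's score: node count plus network linkages between nodes."""
    return nodes + links

def process_complexity(info_nodes: int, info_links: int,
                       act_nodes: int, act_links: int) -> int:
    """Business Process Complexity = Information Complexity x Activity Complexity."""
    info = factor_complexity(info_nodes, info_links)
    activity = factor_complexity(act_nodes, act_links)
    return info * activity

# Hypothetical BPMN model: 5 data objects with 7 associations,
# 10 activities with 14 sequence flows.
print(process_complexity(5, 7, 10, 14))  # (5+7) * (10+14) = 288
```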

Today’s Applied Research Agenda

Today’s agenda: R&D around Capacity Management for services and processes.  My presentation on metrics and measurement for processes went well, but I know risk and capacity management are a significant gap in this field.  Somehow everyone has gotten the opinion that Moore’s Law will bail them out; that hope, as the other engineering disciplines now know, leads to unsustainability.  Received an old book I ordered from Amazon Marketplace [Computer Systems Performance Modeling, 1981, Sauer] that covered some of this with regard to computer systems.  Coupled with Meadows’s Limits to Growth (the System Dynamics that Forrester introduced), I believe I can develop an approach to monitor, measure, and manage processes in more than the reactive way that is currently the industry norm.  While it will not likely be Nobel Prize-winning stuff, I think it will help make Microsoft more competitive and responsive to customer needs.

Structure in Threes: IT Business Models


Yesterday, on the way back from a client’s site, I started pondering the intersection of Service Quality, Brand Value, and Business Models with regard to Information Technology.  Traditionally, IT functions have been focused around a product development model, with operations dragged in tow as little more than aftermarket support.  The only difference was that IT had a captive market, or so the unconscious bias appears to be within these organizations.  As the IT industry has aged, other goal expectations were placed upon these organizations: being the gatekeeper to precious information, the controller for allocating technology and resources, and the integrating force between other parts of the organization. Whether it is possible for one function to successfully execute on all three missions is a topic for another post.

These expectations struck a nerve with me as I was reading Step Guide for Building a Great Company.  I have been researching business strategy, models, and processes for most of my career.  As I read through the first few pages I wondered whether IT functions were using the wrong business model.  In the past decade many IT functions have been desperately trying to change their culture to that of a service-oriented business.  Initiatives such as ITIL, COBIT, and SOA seek to inject a service mindset.  I think the objective is a laudable one, having experienced a slightly less-than-customer/service-oriented environment decades ago (a story for another time, over a beer).

However, the typical service model that is put in place is one for a stabilized or mature business: a business with a standardized set of services being optimized, not a service that is constantly evolving, which has been the nature of the IT function over the past several decades: mainframes, PCs/workstations, networks, the Internet, and now cloud and BYOD.  This would not be considered a stable and mature industry, because as the technology keeps changing, new services are constantly being either asked for or developed.

With that in mind, I wonder if the business model IT functions should use is that of a startup, or possibly a bifurcated model in which service groups start out in incubators and move into optimization models as the service matures.  This is very similar to the extensions I developed for another employer with regard to managing business portfolios.  I had based much of the initial work on The Alchemy of Growth: Practical Insights for Building the Enduring Enterprise, adding another horizon, and much more recently figured out how to tie each horizon portfolio together using Real Options and Cluster Analysis.

In this recent expansion, new services in ideation are considered either a new business opportunity or a new technology opportunity.  This determines whether they are a Horizon 3 or Horizon 4 portfolio member.  As the internal market/business develops, or as the technology becomes understandable and stable enough to pilot, these opportunities move into the next portfolio, where different operating rules and metrics are applied to manage them.
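The routing rule in the paragraph above can be sketched as a pair of small functions: a new idea lands in Horizon 3 (business opportunity) or Horizon 4 (technology opportunity), then graduates one portfolio at a time toward an operating line of business as it stabilizes. The function names and the graduate-by-one rule are my assumptions for illustration, not a definitive rendering of the extended model.

```python
# Route ideation-stage services into horizon portfolios and graduate them.

def initial_horizon(is_business_opportunity: bool) -> int:
    """New ideas start in H3 (new business) or H4 (new technology)."""
    return 3 if is_business_opportunity else 4

def next_horizon(current: int, stable_enough_to_pilot: bool) -> int:
    """Graduate one horizon toward operations (H1) when ready to pilot."""
    return max(1, current - 1) if stable_enough_to_pilot else current

h = initial_horizon(is_business_opportunity=False)  # tech idea -> H4
h = next_horizon(h, stable_enough_to_pilot=True)    # matures -> H3
print(h)
```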

The result of using such a model is that the IT function becomes a vertically integrated business incubator, going from Founder & Angel Investor, to Startup & Venture Capitalist, to a stabilized operating line of business.  Which leads to yet another simulation model to build for my clients, and a possible discussion at next April’s Engineering Conference.

Modeling and Simulation

A friend and colleague reached out yesterday to ask about modeling and simulation tools, then went on to ask about consistent notations as a desire for the practice he supports.  Over the span of my career I’ve had to learn various methodologies and notations around BPR/M as both the discipline and its terminology have matured.  During BPR’s big push in the late 70s, courtesy of the USAF’s Mantech program, I was introduced to the field and became a modeler for an aerospace firm.  Dennis Wisnosky, who led the program, gathered a few of us at various prime vendors with the objective of creating a generic model of an aircraft manufacturer.  As part of the program, Softech developed IDEF0, a modeling notation/methodology.  Today there is more distinction between what is a notation and what is a methodology.

During that same year I had to create various diagrams for systems, IT applications, shop-floor processes, and manufacturing processes.  This demand to learn the latest notation continued until I joined IBM, at which time my employer started sending me to various conferences, consortiums, and programs to develop methodologies and notations, owing to my years of building models.  Two years ago, at a Summit meeting, I presented on the topic of Accuracy and Precision.  These topics are closely related, but as another of my mentors, Harrington, pointed out, they are different concepts, and when building models one should understand a few things:

  • The greater the level of detail the more complex and expensive it is to create
  • Greater detail does not always translate to greater understanding or the ability to communicate that understanding

Since then, when I’m asked about modeling and simulation engagements, the first question I ask is: what’s the point of the model or simulation?  If it’s to communicate to executives or other people, that requires one level of detail; if it’s a step-by-step procedure for people to follow, that’s another level; and if it’s for automating a process, that’s still another level.

The next question I typically ask is: what is the target audience’s level of experience with models and various notations?  Show typical line-of-business executives UML and it just won’t cut it; show programmers BPMN and it may or may not be helpful, as they often want more technical detail than process.  Though in my opinion that’s a big flaw in logic, as the process gives a programmer context, which helps drive better design decisions (enter the latest trend of building scenarios and user personas).

Last of my top three scoping questions is usually how much time is planned for the activity and the due date for the deliverables.  With such questions answered, notation and methodology decisions become a little easier.  The only big question left is whether the enterprise has chosen a one-size-fits-all standard or has a taxonomy of notations for specific purposes, with all those involved having the training needed to use them.