Enterprise Architecture Catalog++

Ever dream about having a catalog of the complete design of your enterprise?  One that not only gives you an inventory and status of components, but also gives you the relationships between them to create higher-level objects: Business and Operational Models, Capabilities, Processes.  How about linking in how these play against your business and technical strategies, or assessing those strategies against your ability to achieve them?   Well, the wait is almost over, because I'm almost finished building it!

Several decades ago, when I met John Zachman for the first time, I was impressed by his curiosity about what I was doing at Rockwell.  I was using a CAD/CAM system to design a factory: not only the floor plan, but the systems, applications, information flow, etc.  Several years later, after John had released his brilliance regarding Enterprise Architecture, I got to meet up with him again.  From our talk and his insight I've been working on creating a CAD for Enterprise(TM) system, one that would enable Enterprise Architects to work with Executive Teams to design the enterprise of their dreams the way an architect designs a house.  While the UX/UI will take a while longer, the Minimum Viable Product (MVP) database that the application will rest upon is almost complete, with a usable text-based UI.   I will be considering just a few Testers for this MVP in the next few months, as well as possible partnerships with graphical design tool vendors**.  Details to follow on how to sign up.

**Graphical Design Tool Vendors you can contact me directly now.

Morning’s Ponders and Tool Suite Rationale

Prior to jumping into finishing the technical approach for an RFP response this morning, I spent a little time reflecting on the past few weeks of work.  I'm a big fan of Covey's approach to analyzing your time to gain insights and understand patterns that could help you become more productive and find more enjoyment.  Yes, I said enjoyment.  In a day when everyone talks about work-life balance as though these are separate things, I'm wondering if I'm the only one who gets enjoyment out of my work.

Must work translate into drudgery?

That seems odd.  I'm a woodworker, initially to assist my wife's real estate projects and create items specific to what we need around the home; then it became a hobby.  As I participate in other social media sites around woodworking and makers, the pattern seems the same.  Then I find some others taking it a bit farther and creating businesses around their passion (e.g., Stumpy Nubs, The Wood Whisperer, etc.).  It appears woodworking has seen a resurgence in popularity.  From their online appearance it seems they have a passion for their work.  Maybe I'm reading into what I see in their public appearances and activities, but I continually see signs of real enjoyment in their participation in the craft.  Marc Spagnuolo, The Wood Whisperer, has a science background and he uses it daily to expose the science behind the craft, right down to using the scientific method and experimentation to make such discoveries.  On his video blogs you see him and his wife Nicole banter back and forth.  To me it's clear they are enjoying not only success in their business, but the process.

Which brings me back to this morning's musings.  Do others also enjoy the process of their work like I do?   As I'm about to get back to writing the technical approach, I find myself excited about the process.  I really like, no, love the entire process of discovering new methods and figuring out how to solve problems.  This is probably why I gravitated to Management Consulting and Information Technology.

With that bit of personal insight, like always, after I closed out work last night I went back to working on the next section of my CAD for Enterprise ™ Design Tool suite.  My thinking about this as a worthwhile endeavor is that there are plenty of technology corporations creating tools for what is the equivalent of CAM for Enterprise.  This matches what happened in the physical product industries for decades: lots of industrial automation technology while Architects, Engineers, and Designers continued to use manual methods and slide rules to accomplish their work until the computer technology became mature enough to be applied.  I'm seeing this as a similar pattern.  Last night I did a quick inventory of the "tools" I've built throughout my career to aid or automate various tasks around Enterprise Design: some I.T. oriented, some financial, some business management.  Then I looked at a tool I created in MS Access decades ago, B.A.S.E. ™ (Business Analysis System and Environment); it enabled me to work in multiple functional domains on an engagement and reuse the information.  That goal I've continued to work on throughout my career.  A few weeks ago I had a brief exchange with my mentor regarding integrating various information domains.  With his encouragement and the involvement of others in their respective fields, it looks like I'm close to creating the infrastructure that would support such a tool suite.

In the meantime I continue to create various point tools that will eventually snap in, like the B.A.S.E. ™ product I created, which had a similar idea of point-tool modules.  This, along with my question-based methodology, is the goal I've set out to accomplish.  And yes, the point tools can be used in stand-alone mode; and yes, I have shared these with others over the years (some on Office Templates Online under the brand Intellectual Arbitrage Group, which appears to have been syndicated on multiple sites as free downloads).

Structure in Threes: Organizational Design

Business Structure versus Organizational Structure

One of the interesting issues that came across my desk today as I was discussing a colleague's new venture was the taxonomy and ontology of our conversation.  She wanted to cover multiple concepts in the same conversation, which is a notable goal regarding economy of one's time.  However, it became apparent that the terms being used were being overloaded during the conversation.  Example: in discussing Business Structure and Organizational Structure, both terms were used interchangeably.  However, when I hear the words Business Structure I think of the legal form in which the business is established (Corporation, LLC, Partnership, etc.).  When I hear the term Organizational Structure I consider whether it is centralized or distributed; a partitioning along functional, product, customer, or geographic lines.  As I continue to develop the Business Design Tool (see below), the question becomes how to ensure that the dimensions are orthogonal to each other while retaining the interconnectedness of these dimensions.
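One way to keep such dimensions distinct yet linked is to model each as an independent attribute of a single enterprise record.  A minimal sketch, using hypothetical names rather than anything from the actual tool:

```python
from dataclasses import dataclass
from enum import Enum

class LegalForm(Enum):           # Business Structure: the legal form
    CORPORATION = "corporation"
    LLC = "llc"
    PARTNERSHIP = "partnership"

class OrgPartitioning(Enum):     # Organizational Structure: how work is divided
    FUNCTIONAL = "functional"
    PRODUCT = "product"
    CUSTOMER = "customer"
    GEOGRAPHIC = "geographic"

@dataclass
class Enterprise:
    name: str
    legal_form: LegalForm            # orthogonal dimension 1
    partitioning: OrgPartitioning    # orthogonal dimension 2
    centralized: bool                # orthogonal dimension 3

# The dimensions stay independent: changing the legal form never
# forces a change in the organizational partitioning, yet both
# remain connected through the one Enterprise record.
acme = Enterprise("Acme", LegalForm.LLC, OrgPartitioning.PRODUCT, centralized=False)
```

The design choice is that orthogonality is enforced by keeping each concept in its own type, while interconnectedness comes from their co-location on the same entity.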

Organizational Development Tools

Many of the recent texts define various dimensions such as complexity, market, size, etc.  However, the interconnection is only an infographic.  Perhaps these interconnections are only probabilistic, leaving one with mere heuristics.  It would make an interesting system dynamics study at the Center for Understanding Change when I have some spare time.  In the meantime I continue to develop the Org Design and Modern IT Portfolio Management tools, which are looking more and more like an enhancement to the Business Analysis System and Environment (B.A.S.E.) application built in 1994 on MS Access V1.

B.A.S.E. at that time performed a variety of management consulting analyses.

This application will eventually become the basis for the semi-automated workflow for several of Intellectual Arbitrage Group's practices and services.


Intellectual Arbitrage Group: Website Redesign

Intellectual Arbitrage Group's Office365 Small Business Premium website V0.7

Spent a few minutes each day this week working on Intellectual Arbitrage Group’s new public website.

Office365 Small Business Premium IAG Contact Us Page Design

It has taken a little while to get some time on my schedule to work this activity.  Like other small consulting firms, time to work on marketing, backoffice systems, and practice development is always in short supply when you don't have dedicated people working each task.  Fortunately, using most of the Office365 Small Business Premium templates means I can focus on content and customers instead of presentation.  I only wish it had more flexibility, or useful help on how to customize like WordPress.com offers; what is there is very minimal.  While the library of SharePoint web parts is helpful, the flexibility of these parts is minimal and not very well supported by documentation.

IAG-Office365 Website WP Blog-Webpart

This past Sunday, I asked my DNS host [ZoneEdit] to build new records to forward my URL and mail over to my Office365 site.  Both the registrar [DomainPeople] and the DNS name service [ZoneEdit] were very fast and responsive; I switched over to the new services in less than two hours.  I only hope Microsoft can match that service level, though I didn't see a clear Service Level Agreement (SLA) from Microsoft when I signed up.  But then again, they are still new to services.

Creating Workflow for Modern IT Portfolio Management

On this week's agenda is building out the workflow for the IT Portfolio Management Practice.  Unlike how IBM, DMR, and Microsoft accomplish practice implementation, I plan on creating a semi-automated workflow using SharePoint, MS Access, and Excel.  While using PowerPoint and Word templates may capture content and present it in a "pretty" way, it does nothing to ensure the quality of the output.  That was one of the reasons I created B.A.S.E. years ago.  I had gotten disgusted back then with the quality of analysis peer consultants were performing, choosing to spend all their time on formatting.  I guess I shouldn't complain, as it created a market for me back then: fixing all the poor engagements and projects these people performed.  I've seen lots of "pretty" engagements go bad due to poor analysis and thinking, which creates the ultimate consulting sin in my book: doing harm to the client.  Having a structured process and supporting system may not guarantee perfect results or avoidance of harm, but it sure reduces the probability and provides better visibility to detect such problems.

Projects Past

As part of my office relocation project this month I'm reviewing, purging, and scanning materials from my project archive.  Today I scanned a few of my 1980s projects at Rockwell International and Lockheed.  It's amazing all the advanced R&D these companies gave me to do:  building a PDM for the B-1B program, developing and implementing Product Lifecycle Management, Group Technology based shop floor scheduling, automated archive and retrieval of CAD drawings, predictive analytics of aircraft systems and engines.  I wonder if current executive management will understand the lessons from this era: real mentorship, the discovery projects (e.g., IR&D), and the value of think time.

Rockwell 1983 PLM System, Rockwell 1982 PDM Concept, ETRAP 1982 Rockwell PDM, Rockwell PDM/PLM 1983

It's also amazing how we managed to design and build these systems with little automation and tooling:  modeling and drawing tools such as Visio and ITHINK were not available. I started using a CAD/CAM system (Computervision CADDS III & IV) to diagram and flow chart, as personal computers (no IBM PCs on the market yet) were just starting to become available, which resulted in senior executives asking me to build diagrams for their projects too, rather than using the graphics department.   I started studying some of the works of the IBM gurus then to add to my intellectual toolkit: Ed Yourdon, Gerald Weinberg, Tom DeMarco, Robert Benson, and later John Zachman, etc.   Never did I think I was going to continue in the IT field, join not one but two of the greatest IT companies (IBM and Microsoft), and have as mentors these greats of the industry.  Today I'm still privileged to not only maintain associations with such people but to be asked to collaborate at times.

Old Projects and memories

During my office relocation project I've started sorting, purging, and scanning files from my archive.  As I look back, I've been on or led lots of interesting projects.  I've been fortunate to have the trust of many brilliant mentors and executives, such that they gave me projects to challenge me, or I was too foolish to say no.  Either way it's been a great education and most likely shaped me into the creative thinker and researcher I am today.  This afternoon's excavation brought to the surface one of my favorite projects, which most likely started my journey thinking about language, the enterprise as a biological organism, and eventually the digital nervous system.  I think around that time I presented my "Enterprise Linguistics – a Factor in CIM" paper.

The project was a stretch assignment my mentor at IBM, Mike Kutcher, gave me based upon the group technology scheduling and routing project I had built at Rockwell prior to joining IBM.  I was fascinated by the thought of using various dialects or languages to solve specific engineering/manufacturing problems.  At that time DARPA and the ManTech program were still in operation.  Mike had volunteered me as principal investigator to come up with a neutral manufacturing language.  That brought me in contact with many brilliant people at IBM Research and around the corporation (Dr. David Grossman, Ricardo Conti, and Dr. Arno Schmackpfeffer) to build an intelligent factory: one that would have its devices talk with each other to understand product requirements and determine how to manufacture the product itself.  The components I worked on were the creation of various parts of the device control, scheduling, and manufacturing operation descriptions in a man- and machine-sensible form.  This eventually led me to meeting John Sowa of conceptual graphs renown.  The POC was a success, but unfortunately research directions at DARPA changed, so we never went forward with the full-scale pilot.

Neutral Mfg Language

Discipline Maturity Lifecycle: History Data-Point

Yesterday during my drive home, after listening to an interesting Enterprise Architecture and Strategy presentation, another data point came to mind supporting the idea that all the design disciplines mature in a similar manner.  One of the current enterprise architecture trends, beyond the usual economic pressures and cloud, is enabling enterprise agility.

Enterprise Agility is having the flexibility to change quickly how the organization operates and the services it provides or uses.  This comes at a price.  However, as product industries such as automotive discovered, this cost was minimal compared to the competitive advantage flexibility created.

One only has to look at the history of competition between Toyota and GM during the seventies.  Drs. Ohno and Shingo developed the Toyota Production System, which ran exactly counter to the prevailing wisdom of the day to use economic order quantities / economic production runs (aka traditional mass production).  Dr. Shingo's creation of flexible machining concepts (Single-Minute Exchange of Die) and Ohno's JIT concepts enabled Toyota to reduce the time to create and field new models.  This change from the industry-standard seven years to five years made Toyota more responsive to the market (read: the ability to adjust to the gas crisis).  Suddenly American automotive manufacturers' market share dropped and never regained its former glory.

The value of this flexibility was not lost on executives from other industries.  Today, IT is running to catch up to the other engineering disciplines on designing for flexibility, and with that comes the argument over the value of doing so.  As the IT community matures, it will come to the same conclusion as Systems Engineers: to develop the practices around supporting the -ilities and the competence to balance between these -ilities (i.e., Design of the Total Offering).

Strategies for managing your legacy –Lessons learned from the past


The situation was a common one in the late 70s and early 80s. A company had tens of thousands of drawings or plots generated from a now out-of-service CAD application, making these also a paper legacy. When the new Electronic Engineering Environment was installed, it was theorized and assumed that this legacy data should be incorporated into the new CAD database. Literature from numerous vendors seemed to confirm the possibility of such a goal.

Feasibility analysis determined this was not as economically desirable as initially thought. True you would have all the drawings in a new CAD database. However, the cost of import, conversion, and quality assurance for all the drawings was significant given the quantity of drawings built up in the library over the years. Making this strategy even less desirable was the fact that only a small percentage of these drawings would ever be needed in electronic form for modification or digital simulation. The problem was determining which ones.


The strategy to solve this quandary was to enable the management of the legacy data without having to digitize it, and then import/convert only those drawings needed in digital format just prior to usage. While the CAD manager did not know which drawings were needed well in advance, he did know a week or two prior to usage. This advance notification was just enough time for a just-in-time import and conversion activity.


The solution to implement this strategy was to create a database and a rapid retrieval, import, and conversion process. The database would be an online searchable inventory of all the drawings and plots. The retrieval, import, and conversion process would also enable employees to request product data in legacy format or request import and conversion.
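The inventory-plus-just-in-time-conversion approach described above can be sketched with modern tools.  This is an illustrative stand-in, not the original implementation; the table and column names are assumptions:

```python
import sqlite3

# In-memory stand-in for the searchable drawing inventory.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE drawings (
        drawing_id TEXT PRIMARY KEY,
        location   TEXT,   -- vault, shelf, or file path
        format     TEXT,   -- 'paper', 'plot', or 'cad'
        status     TEXT    -- 'archived', 'requested', 'converted'
    )""")
conn.execute(
    "INSERT INTO drawings VALUES ('DWG-1042', 'Vault B-7', 'paper', 'archived')")

# Just-in-time conversion: flag a drawing a week or two ahead of need,
# then import/convert only that drawing rather than the whole library.
conn.execute(
    "UPDATE drawings SET status = 'requested' WHERE drawing_id = 'DWG-1042'")
row = conn.execute(
    "SELECT format, status FROM drawings WHERE drawing_id = 'DWG-1042'").fetchone()
```

The key point the sketch captures is that the database only tracks identity, location, format, and status; the expensive conversion happens on demand, per drawing.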


The results were dramatic.

  • The database application resulted in better management of existing drawings, as it identified each drawing's ID, location, format, and status, enabling employees to retrieve a drawing faster regardless of storage format or location.
  • By not importing all the legacy drawings into the CAD system, active storage was reduced, resulting in less electronic clutter and improved system performance.
  • Delaying the cost of import, conversion, and quality assurance transformed the activity from a capital investment creating a library of limited use into an operational expense incurred on demand. This reduced the initial cost outlay as well as spread the cost over the years in which digital versions were needed.

Adoption is the problem

I spent a good portion of my commute on the metro last night, and again this morning, thinking about information technology and productivity.    With all the "labor-saving" and productivity tools, you'd think as an industry we'd be fairly smart about incorporating them into our standard work efforts.   However, over the past twenty years of tracking improvement efforts, I've come to the obvious conclusion that there is a big difference between deployment and adoption.

The first is a series of tasks; the latter is a behavioral change.  Often the best ideas and solutions fail.  Not because they were technically wrong, poorly tested, or deployed incorrectly.  These ideas failed because they didn't take the human element of the system into account.  (By system I don't mean only physical/informational technology.)

When I worked on developing and deploying a new process for one of my employers, they had a long history of failure in re-engineering this specific area.  The three previous teams all spent their time building the most technically complex and automated system, then sent out an edict from corporate demanding compliance.   Groups, divisions, and people complied with the letter of the law.  Confused as people were, the system was operated, data was entered, and not one thing changed in how people managed the business.

The initiative became just one more set of reports and busy-work for people to do in addition to getting the real job accomplished.  In effect, management had increased reporting and visibility but lowered productivity.   Over the course of time the effort was seen as a failure and was quietly put to bed by yet other re-engineering efforts.

I was likely to have repeated the same mistakes (and do on occasion too) if it wasn't for being asked by Corporate Management how I would do things differently and how long it would take.   I broke the problem into two pieces: 1) solution building and deployment; 2) adoption planning, execution, and management.  The first took less than a year for version 1.0.  The second took four years in all, during which three versions were developed, deployed, and adopted.

Management asked why so long.  My response was, "Design and deployment are easy; adoption is hard."  With a knowing nod I was given the project.    The details of the design are not really important to the point of this article.  What is of keen importance is figuring out how to move from deployment to adoption.

The tools I used, though rather unorthodox in this context, were concepts like CMMI, Six Sigma, and ISO-9000.  This became my adoption management system.  Along with developing a deployment team across the corporation and an educational curriculum, the basic tools to address adoption were put into place, resulting in a success where another failure had been predicted.  Later on my employer asked me to "box it up" so it could be sold to their customers as well.  However, the consulting teams that attempted to sell and execute did not grasp the difference between deployment and adoption, and so had poor results on engagements, which provides proof of the premise.

Process Management Language reminiscing and thoughts

During my first career stint with IBM, I was working on Computer Integrated Manufacturing.  I was in the role of thought leader, as I had just finished creating several industrial automation projects for the Rockwell B-1B and the Lockheed Skunk Works.  One of the first concepts I put out, a white paper (Body Enterprise), was terribly written, but the ideas were still valid enough to later engage Bill Gates's interest.  It was built around dialects, domain/discipline-specific languages, and semantic mappings and transformations as the basic systems of an Enterprise's body, aka the Digital Nervous System.  In the paper I argued for several dialects or languages: one being a Product Description Language, another a Manufacturing Language, still another a Device Control Language, and yet more, such as a Job or Process Control Language. [I have a diagram in my files somewhere I'll get to scanning in some day.]

The Process Language would have some basic control constructs (decisions and branching); however, you could define action classes and activities, similar to how XML is being used to form dialects (one reason I'm hopeful a process language will emerge someday).  The dialect family would share the same underlying base, which would allow mixing the dialects to create a discipline-specific language.
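The idea of generic control constructs plus domain-specific action vocabularies can be illustrated with a toy XML dialect.  The tag names here (`process`, `sequence`, `decision`, `action`) are hypothetical, not from any real standard:

```python
import xml.etree.ElementTree as ET

# A toy process dialect: generic control constructs (sequence, decision)
# carry the base semantics, while the 'class' and 'activity' attributes
# supply the domain-specific vocabulary (machining, inspection, etc.).
doc = """
<process name="make-bracket">
  <sequence>
    <action class="machine" activity="mill-profile"/>
    <decision test="tolerance-ok">
      <then><action class="inspect" activity="final-check"/></then>
      <else><action class="machine" activity="rework"/></else>
    </decision>
  </sequence>
</process>
"""
root = ET.fromstring(doc)

# Walking the tree in document order recovers the activities regardless
# of which dialect supplied them -- the shared base makes mixing possible.
actions = [a.get("activity") for a in root.iter("action")]
```

A different discipline could reuse the same `sequence`/`decision` skeleton with its own action classes, which is the mixing-of-dialects property described above.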

I brought this basic concept to ISO TC 184/SC4 when we started developing the architecture for STEP.  Bill Danner, Jim Kirkley, and I started redesigning STEP around this framework, pushing the idea that the constructs in the initial parts of STEP were constituents used to build domain-specific languages; in this case, Application Protocols.  In STEP we argued that basic constructs had limited semantics until you placed them into an AP context that finished the semantics.   This gave the entire system the flexibility to grow as engineering needs grow. [Both Kirkley and I received a lot of heat on this for being "too" object-oriented from the DBMS faction in the ISO working groups.  I have a gag gift from the PDES Inc. crowd for working in the architecture group, called the "Mother of all Framework Awards," that I proudly display in my office.]

A small offshoot of STEP created a Process Planning resource part and an AP.  However, regrettably I and others were pulled off the project, as no one in PDES Inc., IBM, or DEC executive management saw any strong value in what is now workflow and process.  Dr. Arno Schmackpfeffer (IBM Research), Dr. Grossman (IBM Mfg Research), Dr. Steven Ray (NIST), and I continued on by ourselves for a while until it was clear we would not get any backing.   Dr. Ray's decision primitives I thought were excellent, and I pushed to get them included in the Process Planning data models.   Some of the work from Dr. Arno Schmackpfeffer (IBM Research) I expanded upon into areas that enabled me to analyze process efficiency and effectiveness.  I had started a consulting practice around this at DMR; however, the timing wasn't right, as they were moving away from consulting into contract programming.

These components, though, are only the infrastructure to support process definition and execution; they are not the process (i.e., the model vs. reality issue I've discussed in other forums such as the Enterprise Architecture Forum).   This is why I make a distinction between Process Definition, Execution, and the I.T. systems that assist in controlling and monitoring the process.   Mapping from Definition to Execution media is a fairly mechanical activity, a known quantity if you will.  One could, and eventually will, take the time to create a compiler to do such.

Creating the Process Definition takes time and effort, as you are typically extracting the structure from human activities, which often have hidden rules, inconsistencies, and capability limitations that cause others to approach the problem in an inefficient or ineffective manner.  The MS Access application (process definition model) I built helps me capture the process content in a structured manner. It has been refined over the years but could still use some more work when I have time.  However, it has made designing or reengineering robust processes much easier than the Visio-chart approach most of the industry uses.  Holosofx was getting close at one time, but veered off toward I.T. workflow applications.

Originally posted June 2009