Tuesday, April 29, 2008

What we learned about EA implementations

(based on many people asking why I recommend the implementation approach I do)

The fact is that most EA implementations fail, i.e. they fail to produce sustainable value and a useful asset: one that can be used to answer questions (e.g. impact analysis), maintain knowledge, support business operations and business change, and so on. Over almost two decades I have seen dozens fail (often undertaken by some of the brightest, most able, and most diligent people I have met).
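To make "answer questions (e.g. impact analysis)" concrete: impact analysis over an EA repository amounts to a transitive traversal of the dependency relationships the model records. The sketch below is illustrative only — the element names and the flat dictionary model are assumptions, not any particular tool's schema.

```python
from collections import deque

# Hypothetical EA repository: each element lists what it directly depends on,
# e.g. the "order-entry" process depends on the "CRM" application,
# which runs on "server-07".
depends_on = {
    "order-entry": ["CRM"],
    "invoicing":   ["CRM", "ERP"],
    "CRM":         ["server-07"],
    "ERP":         ["server-09"],
}

def impact_of(element, depends_on):
    """Return every element transitively affected by a change to `element`."""
    # Invert the dependency map: who is affected when a given element changes?
    affected_by = {}
    for dependant, targets in depends_on.items():
        for target in targets:
            affected_by.setdefault(target, []).append(dependant)
    # Breadth-first traversal upwards through the inverted map.
    seen, queue = set(), deque([element])
    while queue:
        current = queue.popleft()
        for dependant in affected_by.get(current, []):
            if dependant not in seen:
                seen.add(dependant)
                queue.append(dependant)
    return seen

# A change to server-07 ripples up to CRM and both business processes.
print(sorted(impact_of("server-07", depends_on)))
```

A repository of office documents cannot support even this trivial query without a human re-reading every document; a tooled EA asset answers it mechanically.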

I have seen EA initiatives in many types of organisation (forestry, health, insurance, government, finance, health and safety, eCommerce, retail, education, telecommunications, etc.) and using many different generic frameworks and methods (e.g. Zachman, TOGAF) - and the quality of the result does not seem to be associated with either the type of organisation or which of these generic approaches is used.

It is true that in many cases inadequate tooling is the root cause, and EA "consultants" and "experts" are quite happy to continue using office documents and essentially manual approaches, because for service companies it is far more profitable to do the fishing than to sell a rod and teach the person how to fish for themselves.

Where poor tooling is not the cause, what I have usually seen is work on an EA implementation commence:
  • led by people who only understand about 10-15% of what the technology being used can do (and have a partial understanding of its intended use), or led by people who understand the tooling reasonably well but who have very limited domain expertise (i.e. in enterprise architecture, ICT strategy, business/IT transformation, etc.)
  • without a clear understanding of what is sought (requirements)
  • without a clear understanding of how the solution will be implemented (all the components, how design will be undertaken, how operational procedures will be defined etc.)
  • without a well defined (tried and tested) implementation plan
  • without an understanding of the organisation's change impedance issues (the organisational behaviours that impede change) associated with improving touchpoint processes and roles.
Unsurprisingly, these implementations seldom proceed well, and erroneous conclusions are usually drawn from what is delivered (by those who "don't get it" or sometimes "don't want to get it").

If one doesn't start with a clear view of the outcomes being sought, it is unlikely the benefits will be achieved effectively (i.e. if you don't understand the requirements, it is unlikely that the solution you deliver will achieve them).

Therefore what is required upfront is a focus on the outcomes being sought, i.e.:
  • who are the stakeholders and/or beneficiaries?
  • what scenarios of use do they have?
  • what business questions do they need answered?
  • and at roughly what stage of an implementation?
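One way to make the questions above precise enough to drive a design is to capture each one as a structured requirement rather than prose. The sketch below is only an illustration of that idea — the field names and example entries are my assumptions, not part of any published method.

```python
from dataclasses import dataclass

# Illustrative structure only: the fields mirror the four questions above.
@dataclass
class EARequirement:
    stakeholder: str        # who needs the answer (stakeholder/beneficiary)
    scenario: str           # the scenario of use in which they need it
    business_question: str  # the question the EA asset must answer
    stage: int              # rough implementation stage (1 = earliest)

requirements = [
    EARequirement("CIO", "annual budget round",
                  "Which applications duplicate capability?", 1),
    EARequirement("Risk manager", "DR planning",
                  "Which processes depend on data centre A?", 2),
]

# Sorting by stage gives a first-cut implementation sequence:
# answer the stage-1 questions first, while capturing knowledge
# in a form that the later stages can reuse.
for r in sorted(requirements, key=lambda r: r.stage):
    print(f"Stage {r.stage}: {r.stakeholder} - {r.business_question}")
```

The value of this form is that every requirement is testable: at the end of a stage, either the named stakeholder can get the named question answered from the asset, or they cannot.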
Some EA approaches aim to achieve this, but they are so generic (i.e. they make no assumption that suitable tooling will be used) that they cannot be precise enough about the nature of the requirements captured, or about how the design will proceed.

This is why we have developed a detailed implementation approach. It defines all the major areas of work, the key roles, logical milestones, etc.; it is specific to the implementation technology we use and can be instantiated quickly for a specific project.

People often think that there must be a single detailed approach to developing an EA that will suit every organisation. There is a common meta-approach, but the detailed approaches differ based on the organisation and the goal. Different organisations are focused on making different types of changes (e.g. new products or new geographies, mergers, risk reduction (including regulatory compliance, DR, and BCP), cost reduction, etc.), so their focus is naturally different. I have seen EA implementations with many different orientations:
  • Business and/or technology strategy (usually aiming at getting an explicit consensus as to what changes should be initiated and why).
  • Business process redesign (adjusting the way the business operates, as a precursor to looking at technology changes)
  • Application architecture (usually focusing on rationalisation, as a result of merger or unmanaged proliferation)
  • Service architecture (usually focusing on developing approaches for service governance)
  • Integration architecture (often now related to SOA initiatives, but in the past oriented around ESB, EAI initiatives).
  • Security architecture (unfortunately usually an afterthought).
  • Metadata and data architecture (unfortunately usually disconnected from the processes the data exists to support)
  • Rules architecture (usually oriented at understanding how strategies and policies are implemented in processes, procedures and systems)
  • Technology platform architecture (usually focused on getting better utilisation from servers, and/or planning for building extra server capacity)
  • Storage architecture (usually focused on preparing for the addition of extra storage capacity).
  • Standards management (usually oriented at governance)
  • Programme management (usually oriented at understanding how a set of capital initiatives, or business change initiatives relate to how the business seeks to operate in future).
  • Service level management (usually oriented at SLA templates, contracts and delivery exception management)
  • New product or channel definition (usually products that involve systems at the heart of their delivery)
  • Disaster recovery planning (usually leveraging off existing information about processes, applications, platforms and teams)
  • Business continuity planning (as for DR planning)
  • Requirements management (in the context of how the business seeks to operate)
  • Package implementation management (usually focusing on understanding how the business operates and how this relates to the package in question, so that either the package can be changed or the business can change how it does things, or both).
  • Compliance management (understanding how regulatory or legislative compliance is being achieved)
  • IT Skills management (understanding what skills are required for what systems, usually to allow rationalisation or as a risk management exercise).
If one looks around an IT organisation one will see information about all of the above being created, collected, used and disposed of (whether that is the intent or not). It is usually held in a diverse set of office documents (business plans, business cases, project charters/statements of work, technology documents, software design documents and models, infrastructure design documents and models, contracts, requirements documents, business continuity planning documents). This suits each person producing the specific artefact admirably, and keeps lots of staff and consultants employed. Changing this situation takes time and needs to be done incrementally and with purpose.

Often when you ask an organisation what they want to focus on, they will say applications, infrastructure (servers, storage, integration), etc., but be unclear about how they will make intelligent decisions about these things without understanding the business/technology strategy, products/services, and the business processes, information, rules, organisations, etc. that these solutions are designed to support. Or you will get organisations that say they want to focus on all of the above, but struggle to work out what the initial focus should be - so that the stakeholders with the greatest immediate need get answers to their burning questions, while knowledge is gathered and organised in such a way that it is reusable in downstream phases to answer questions that are currently of secondary or tertiary importance (or not yet even thought of).

The art of a successful implementation is prioritising, based on a complete understanding of what the tooling can do, a good understanding of EA best practice, and a clear understanding of the organisation's current goals and challenges.
