Message of the Month: Architecting EA Success

Successful EA programs are rare, and for a not-so-obvious reason: the outcome metrics are seldom defined or, if defined, are hard to measure accurately. One also needs to factor in whether the EA practice is new or has been around for at least a few years. Regardless, there is no simple Architecture Pattern that will assure success, but there are guidelines that can help a lot. I'll share some of them here.

One VP of EA at a very large mortgage company once discussed a new approach/investment for the EA team with her EA Director. The key question she asked about the initiative was: “Will it help us better control risk? If not, we shouldn’t pursue it because that is one of the main reasons to do EA in the first place.” 

Leading risk management books and standards define risk as uncertainty about the outcome of an endeavor. In other words, risk doesn't always mean just the threat of loss or failure; something classified as high risk may actually end on a positive note. The crux of the issue is the uncertainty surrounding the outcome of a situation, investment, capability, program, transformation, timeline, budget, etc. It can also relate to human factors, such as defining the right requirements for complex positions — for an Enterprise Architect, for example!

My key message this month is that EA success cannot happen by accident; it must be designed, implemented, managed, and governed. This may be hard to believe, but many longstanding EA programs I've become aware of in my 20 years in the EA field have never been formally designed and lack even a chartering document and position descriptions, as well as meaningful performance metrics. It is very difficult to start fresh with a legacy EA team because of the risk that someone will be embarrassed by an inability to explain the value of the program, one that may have cost millions of dollars over a few years in human resource commitment alone. Add to that millions of dollars for underused, if used at all, EA tools for modeling and a repository.

However, not being able to communicate clearly about one's EA program doesn't mean it has not been valuable; it is just difficult to trace back to the reasons for its genesis and current design, and to plot a new course toward improvements, including in communicating the value that EA brings. Some Chief Architects have referred me to 10-year-old legacy EA documents as part of my team's research into what useful documentation may already exist. In one case, the legacy documents were actually quite professional. I asked why they were never used. The answer was that a requirement had come down to create the documents, so doing so was a compliance drill. The documents were never intended to be operationalized, and key considerations about methods, modeling, and EA skill sets, roles, and responsibilities were never made explicit. The good news in this particular case was that the small EA team was eager to update everything. They wanted to do so in an actionable way that would clearly communicate the value added by integrating EA into important initiative planning and workflows. And they had support from top leadership to invest in program tools, methods, and training to pursue becoming a butterfly after years of being a moth.

When starting a new EA program with a green field, there are some great accelerators for maturity and effectiveness. These mostly involve addressing up front the items mentioned above: drafting an EA Charter; writing EA Job Descriptions and related Performance Reviews; selecting the right core architects; customizing a method that can be rolled out incrementally to create continuous, measurable value; learning a modeling language or two; and building people and presentation skills to communicate effectively with key stakeholders in a way that works well for them. Many new practices start out by purchasing expensive and complex EA tools before they even know about methods and modeling. This is an anti-pattern: tool purchases should come only after the right methods are defined and the team has mastered at least one metamodel, and then the right EA tool environment should be selected carefully, with expert advice if at all possible. One should never expect to use only one tool for key EA work.

Although it is very difficult to think of EA contributions in terms of outcomes instead of outputs, this is essential and must be tackled early on in the design or redesign of an EA program.

An interesting development we are seeing is companies reaching out for Subject Matter Experts to be available a few hours a week to vet evolving methods and review architecture models for optimization. This type of support can be very inexpensive, although a bit difficult to set up, but the right mentor or mentoring team can considerably accelerate the maturity of the EA practice being assisted.

Finally, if you have an EA program that has been around for a while and you feel it may be at risk in some way, it is time to be objective and clear up the uncertainties about what to focus on. If you are just beginning an EA practice, make architecture training a priority, and remember that the biggest investment in such training is the EA team's time; quality rather than lowest price should drive decisions about what to learn and whom to hire to assist with the training as part of accelerating your time to value.

Authored by Dr. Steve Else, Chief Architect & Principal Instructor