
Contents

1 A Workshop on Analysis and Evaluation of Enterprise Architectures
  1.1 Background
  1.2 Workshop Participants
  1.3 About This Workshop
  1.4 Organization of This Report
2 Definitions and Boundaries
  2.1 Defining Enterprise Architecture
  2.2 Relationship Between Enterprise Architecture and the Enterprise Business
  2.3 Bounding Enterprise Architecture in Practice
3 Enterprise Architecture Design and Documentation Practices
  3.1 Typical Enterprise Architecture Artifacts
  3.2 Enterprise Architecture Artifacts – Depth and Detail
  3.3 Projects Are the Units of Delivery
  3.4 Decisions Are First-Class Artifacts
  3.5 Use of Patterns Guides Projects
4 Evaluation of Enterprise Architectures in Practice
  4.1 Enterprise Architecture Evaluation Criteria
  4.2 Evaluation Context
  4.3 The Role of Quality Attributes in Enterprise Architecture Evaluation
  4.4 Evaluation Methods
  4.5 Federation and Acquisition
5 Summary
  5.1 Workshop Findings
  5.2 Future Work
Appendix A – Survey of Enterprise Architecture Definitions
Appendix B – SEI Enterprise Architecture Analysis and Evaluation Engagement Model
Acronyms and Abbreviations
References


This report is laid out as follows:

• Sections 2 through 4 summarize the workshop discussions. Recurring themes and crosscutting issues emerged, and so this content is organized thematically, rather than question by question.

• Section 5 summarizes the findings of the workshop.

• Appendix A surveys definitions of EA.

• Appendix B outlines how the SEI System of Systems (SoS) Architecture Engagement can be applied to enterprise architectures.


Definitions and Boundaries

This section summarizes discussions about the definition of EA, boundaries of EA, and relationship of EA to the enterprise business.


Defining Enterprise Architecture

Participants preferred to use the definition of architecture from the Institute of Electrical and Electronics Engineers (IEEE) and substitute enterprise for system to create a working definition of EA (IEEE 2000).

• IEEE-1471 Definition of Architecture: "The fundamental organization of a system embodied in its components, their relationships to each other, and to the environment, and the principles guiding its design and evolution."

• Proposed Definition of Enterprise Architecture: "The fundamental organization of an enterprise embodied in its components, their relationships to each other, and to the environment, and the principles guiding its design and evolution."

EA is different from other architecture disciplines (software, system, and system-of-systems) in that the enterprise generally exists before any EA activity is started, and must continue to exist and function as it is being changed. The evolution perspective included in the IEEE definition emphasizes that evolution is an essential consideration in EA in practice.

Appendix A presents a brief survey of other definitions of EA.


Relationship Between Enterprise Architecture and the Enterprise Business

Many of the participants supported the assertion that "enterprise architecture is a strategic activity." One participant asserted that this strategic perspective may distinguish EA from SoS and system architectures.

EA plays a critical role in supporting and informing the strategic decisions made within the organization; however, participants report that EA activities are typically underfunded. This seems to stem from short-term management focus, difficulty collecting return-on-investment data, and difficulty quantifying the value of any strategic activity.

One of the participants showed Figure 1, taken from the book by Ross, Weill, and Robertson (Ross 2006). One participant proposed three operating modes for enterprise architects within the enterprise:

1. At the lowest level, enterprise architects operate in an urgent response mode, reacting to crises as they arise.

2. Next, enterprise architects may operate in a continuous improvement mode, making incremental changes and generally avoiding crises.

3. Finally, enterprise architects may operate in a transformative change mode, collaborating with business leaders to enable new business capabilities and new business models.

In practice, enterprise architects are nearly always operating in all three modes, but the most effective organizations will spend less effort in urgent response and more in transformative change.


Bounding Enterprise Architecture in Practice

The Open Group Architecture Framework (TOGAF) defines EA as comprising four domains (The Open Group 2009):

1. The business architecture defines the business strategy, governance, organization, and key business processes.

2. The data architecture describes the structure of an organization's logical and physical data assets and data management resources.

3. The application architecture provides a blueprint for the individual application systems to be deployed, their interactions, and their relationships to the core business processes of the organization.

4. The technology architecture describes the logical software and hardware capabilities that are required to support the deployment of business, data, and application services. This includes IT infrastructure, middleware, networks, communications, processing, standards, and so on.

Most participants considered "enterprise architecture" to be typically implemented and practiced as "enterprise information systems/information technology (IS/IT) architecture," comprising only the last three domains. An exception among organizations represented at this workshop was the Veterans Health Administration, which is creating an enterprise business architecture that is "IT-agnostic."

The partition between business architecture and the other three domains seems to be driven by lines of authority and control of the business architecture domain. An EA practice is generally hosted by the Chief Information Officer or some other IS/IT-oriented segment of the enterprise.

Organizational relationships and roles between EA-IS/IT and business architecture within an EA vary with the organization's history, culture, and maturity. Business process modeling is typically managed and performed outside of the EA unit, and may be performed with less precision than is required to support enterprise IS/IT architecture. In these cases, enterprise architects may do "just enough" business modeling to support their own needs. An organization that is just starting to document an EA can recover the business architecture from existing or legacy systems (by reverse engineering the business rules codified in software and then applying expert interpretation).

This delineation of EA scope is a more general issue: various organizational structures and lines of authority result in the organization's adding responsibilities to or removing responsibilities from the scope of the EA team.

One participant asserted that "EA is a management discipline," and that the scope should include activities such as outsourcing and managing the supply chain. In the government context, the scope would be extended to include acquisition-related considerations.

For this workshop, participants decided to bound EA to "enterprise information systems/information technology (IS/IT) architecture," comprising the data, application, and technology domains. Business architecture provides context and linkage from the IS/IT architecture to the organization's business goals.


Enterprise Architecture Design and Documentation Practices

This section summarizes discussions addressing the design life cycle and artifacts, to help answer the following questions:

• Where would architecture analysis and evaluation fit into the EA life cycle?

• What artifacts would be available to support analysis and evaluation?


Typical Enterprise Architecture Artifacts

A roadmap is a plan that defines a sequence of architecture states, or transition architectures, that will change the "as-is" EA to a desired target architecture. Participants agreed that this roadmap is the critical EA artifact that relates technology to business goals and ties together the many concurrent projects underway in an organization at any time. A typical roadmap in industrial practice covers three years; it is felt that projecting beyond that time frame is not efficient.

Other artifacts typically developed and maintained included:

• reference models

• strategic plans (for DoD enterprise architectures, these would include campaign plans and posture statements)

• architecture principles

• conceptual architectures and other architecture descriptions, describing baseline, target, and transition architectures


• architecture patterns

In general, these artifacts are a mixture of different types of products (DoDAF and TOGAF products, system architecture descriptions, spreadsheets, Visio diagrams, and outputs of other tools) and reside in different repositories. For the U.S. Army, the Capability Architecture Development and Integration Environment (CADIE) is the primary repository for EA artifacts (products); however, other information relevant to the EA resides in other repositories. Federation of data from multiple repositories is necessary for making meaningful decisions using the EA.

Given that the participants stressed that the primary role of EA is to support enterprise decision making (refer to Section 2.2 above), artifacts need to be selected, developed, and maintained with this role in mind.

During this discussion, it became clear that the terminology used in the DoD, other government agencies, and commercial practice is very different, and this impacts the ability to apply principles across domains. Consistency, or at least a comprehensive mapping, would be a great help to practitioners.


Enterprise Architecture Artifacts – Depth and Detail

All participants agreed that it is easy to get bogged down in EA documentation. Available tools can probe the data network and crawl through the existing systems to extract partial representations of the as-is architecture. It is easy to add more detail to these repositories manually, leading to bloated architecture documentation that is expensive and time consuming to maintain and so often becomes stale. It was noted that each additional level of decomposition requires twice the effort of the previous level. EA documentation carried down to the level of a technical architecture is too detailed to be useful or effective. The appropriate level of documentation comes back to the use of EA to support strategic decision making within the organization: the documentation detail needs to be "just enough" to support the decisions that the organization must make.

Development tooling is necessary in any case. Tools in use ranged from iServer (Visio backed by a database repository) (Orbus Software 2010) for a small enterprise, up to more sophisticated tools such as ARIS (IDS Scheer 2010), Casewise (Casewise 2010), IBM System Architect (IBM 2010), and Troux (Troux Technologies 2010). Even the more sophisticated tools like Troux did not readily support documenting tradeoffs between design alternatives. We will discuss why documenting such tradeoffs is important later in this report.

A critical activity in successful EA teams is communication about the EA with the rest of the enterprise. Several participants noted that the representations produced by the tools (e.g., various Unified Modeling Language [UML] diagrams and entity-relationship diagrams) are not useful for communication outside of the architecture team. One participant talked about keeping the tool repository "in his back pocket" to be used by the EA team for decision making but never shown to stakeholders. Alternative representations that are more "executive-friendly" are needed for communication with stakeholders.

Several participants pointed out that the practice of keeping the EA models up to date with the solution/project architectures and as-built infrastructure varies among organizations. Up-to-date models help new stakeholders, but most decision makers seem to quickly internalize any conformance discrepancies. Within an organization, some models are kept current while others are allowed to lag. In fact, workshop participants who had invested heavily in using these tools to develop the initial capture of the EA expressed the sentiment that while they felt this work was crucial to their efforts, ongoing maintenance of the models was not cost effective. There was agreement that documentation of the invariants and principles should be kept current.


Projects Are the Units of Delivery

The EA roadmap is realized by a number of projects, each producing a project or solution architecture. These projects vary widely in scope and complexity, ranging from simple (for example, a new version of a vendor platform or package) to complex (for example, moving to a new vendor platform or adding a new application). Complexity should not be confused with architectural significance: some "simple" projects are architecturally significant, while some "complex" projects may have little or no architectural impact. The distinction is not always explicit; having a decision model to systematically distinguish between the two types of projects based on architecture significance would be useful.
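Such a decision model could be as simple as a weighted screening checklist. The sketch below is purely illustrative; the criteria, weights, and threshold are assumptions, not anything proposed at the workshop:

```python
# Hypothetical decision model for screening projects by architecture
# significance. Criteria, weights, and threshold are illustrative only.
ARCH_SIGNIFICANCE_CRITERIA = {
    "changes_shared_infrastructure": 3,
    "introduces_new_technology": 2,
    "crosses_line_of_business_boundaries": 2,
    "affects_enterprise_data_model": 3,
    "deviates_from_approved_patterns": 3,
}

def architecture_significance(project_flags, threshold=4):
    """Score a project against the criteria; flag it for EA review
    if the weighted score meets the threshold."""
    score = sum(weight
                for criterion, weight in ARCH_SIGNIFICANCE_CRITERIA.items()
                if project_flags.get(criterion, False))
    return score, score >= threshold

# A "simple" project (a vendor package upgrade) that touches shared
# infrastructure scores 3: noteworthy, but below the review threshold here.
score, significant = architecture_significance(
    {"changes_shared_infrastructure": True})
```

A real model would calibrate the criteria and threshold against the organization's own history of projects that turned out to be architecturally significant.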

A project may also be categorized according to the operating mode that drives it (see Section 2.2 above). Projects may include both re-engineering of existing components and packages, and development of new components and packages.


Decisions Are First-Class Artifacts

The participants favored defining EA in terms of structures determined by components and relationships. However, in considering how to document an EA, there was agreement that documenting decisions as first-class artifacts is more important than documenting structure.

The process of making, evaluating, documenting, and managing the version/configuration of decisions is at the heart of the EA governance process.

One of the participants presented the metamodel that his organization uses for documenting EA, which includes decisions and rationale as explicit elements in the metamodel.
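As an illustration of what treating decisions as first-class artifacts might look like in a repository, the following sketch models a decision together with its rationale, alternatives, tradeoffs, and version history. The field names are assumptions for illustration, not the participant's actual metamodel:

```python
# Illustrative sketch: a decision record with rationale and versioning.
# All names and statuses are hypothetical.
from dataclasses import dataclass, field

@dataclass
class ArchitectureDecision:
    identifier: str
    statement: str              # the decision itself
    rationale: str              # why it was made
    alternatives: list = field(default_factory=list)  # options considered
    tradeoffs: list = field(default_factory=list)     # qualities promoted/diminished
    version: int = 1
    status: str = "proposed"    # proposed -> approved -> superseded

    def supersede(self, new_statement, new_rationale):
        """Version a decision rather than overwriting it, so the
        governance trail stays intact."""
        self.status = "superseded"
        return ArchitectureDecision(
            identifier=self.identifier,
            statement=new_statement,
            rationale=new_rationale,
            alternatives=self.alternatives,
            tradeoffs=self.tradeoffs,
            version=self.version + 1,
        )

decision = ArchitectureDecision(
    identifier="AD-17",
    statement="Adopt a single enterprise service bus",
    rationale="Decouples applications and centralizes routing",
    tradeoffs=["promotes interoperability", "may limit throughput"],
)
revised = decision.supersede(
    "Adopt point-to-point integration for high-volume flows",
    "The bus became a throughput bottleneck",
)
```

The point of the versioning step is that superseded decisions remain queryable, which is what makes the make/evaluate/document/manage cycle described above auditable.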


Use of Patterns Guides Projects

One important type of EA decision is the selection of "patterns." Projects are frequently specified in terms of the selected patterns.

In practice, participants' use of the term patterns was an extension of the typical definition of architecture patterns. A pattern was defined as a reference to an already instantiated set of architecture elements and relationships in their enterprise: an exemplar of how to address a particular bundle of architecture concerns. The pattern also includes technology and platform decisions. Often, little additional analysis or evaluation is performed, since the quality characteristics of the exemplar are already established through its existence in a working implementation.


Evaluation of Enterprise Architectures in Practice

This section discusses how enterprise architectures are evaluated in practice.


Enterprise Architecture Evaluation Criteria

The scope of an EA evaluation should include the baseline ("as-is") architecture, the target ("to-be") architecture, and the EA roadmap. The evaluation is focused on evolution from the baseline to the target architecture.

In current practice, the key evaluation criterion is line-of-sight[1] from roadmap projects and the decisions within the enterprise and solution architectures to the enterprise business goals or capability requirements. The business architecture provides context for the rest of the EA and so can provide this linkage from business goals to architecture.
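A line-of-sight check of this kind can be mechanized as a traceability walk from each roadmap project, through intermediate decisions, to a business goal. The data shapes and names below are hypothetical, intended only to show the idea:

```python
# Illustrative line-of-sight check: every roadmap project should trace,
# directly or through intermediate decisions, to a business goal.
def untraced_projects(projects, traces, goals):
    """traces maps an item (project or decision) to the items it supports;
    a project has line-of-sight if some chain of traces ends at a goal."""
    def reaches_goal(item, seen=()):
        if item in goals:
            return True
        return any(reaches_goal(parent, seen + (item,))
                   for parent in traces.get(item, [])
                   if parent not in seen)
    return [p for p in projects if not reaches_goal(p)]

goals = {"reduce-time-to-market"}
traces = {
    "project-crm-upgrade": ["decision-consolidate-platforms"],
    "decision-consolidate-platforms": ["reduce-time-to-market"],
    "project-data-center-move": [],
}
orphans = untraced_projects(
    ["project-crm-upgrade", "project-data-center-move"], traces, goals)
# orphans holds projects with no line-of-sight to any business goal
```

An evaluation would then examine each orphaned project: either the trace data is incomplete, or the project genuinely lacks alignment with business goals.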


Evaluation Context

Evaluation occurs at multiple levels: entire enterprise, division or line-of-business, and project.

Higher levels are more focused on alignment than on technical issues, while lower levels focus on technical concerns.

Evaluations should consider the

• current (as-is) state, including known open issues, gaps in functionality or quality attributes, and inconsistencies (in solution patterns, for example)

• proposed changes that may impact the architecture, including new capabilities, infrastructure changes, and operational changes

• target (to-be) architecture, including roadmap and tradeoff analysis of architecture options

Current evaluation practice gives little consideration to the target architecture. It may happen that the EA is too large to effectively create a target architecture; in such cases, snapshots of sections of the architecture may suffice. Several participants cited examples of target architectures for large enterprises, so size is not a hard limiting factor. Like the target architecture, the as-is architecture may also be incomplete, which hinders effective evaluation.

In discussions of current EA evaluation practices, there was a clear difference between government and industry contexts. Government scale is often larger, and government organizations seem to have better discipline around practices such as business process management. On the industry side, time-to-market pressures often drive decisions. Budget constraints play a large role in both contexts, affecting EA development and evaluation.

In the government context, the Clinger-Cohen Act and associated regulations provide minimum standards for EA evaluation. In particular:

• The Government Accountability Office (GAO) audits the practices used to create the EA.

[1] Participants used the terms line-of-sight and alignment interchangeably.

• The Office of Management and Budget (OMB) audits how the organization is using the EA to meet business performance goals.

The SEI principle of evaluating an architecture to determine how well the architecture supports business goals seems to complement the OMB standard.

In both commercial and government contexts, an organization generally uses the same governance process for all projects, with no tailoring to account for variation in project scope or complexity. For example, changes to existing capabilities use the same process as introduction of new capabilities.

One particular evaluation context is what one participant termed "disruptive requirements." These are business requirements that have broad architectural significance and may necessitate changes in common infrastructure. Such requirements should be escalated, reviewed, and carefully evaluated.


The Role of Quality Attributes in Enterprise Architecture Evaluation

The participants had varying levels of familiarity with the SEI quality attribute-based approach to architecture. Among participants with more familiarity, there was consensus that current EA practices have insufficient focus on quality attributes, and there is a need to elevate quality attribute concerns and tradeoffs to first-class status in EA development and evaluation.

There are some standards that identify EA quality attributes. For example, Control Objectives for Information and related Technology (COBIT) identifies:

• effectiveness

• efficiency

• confidentiality

• integrity

• reliability

• availability


• compliance

Other typical EA quality attributes would include

• profitability

• affordability

• scalability

• manageability

• alignment

• integration/interoperability

• sustainability


• agility

Presentations of EA artifacts by several participants showed that quality attributes are sometimes alluded to in EA documentation, in project descriptions, or in roadmap annotations, but a need exists to make quality attribute concerns more explicit.

Participants discussed the use of end-to-end business threads to systematically address quality attribute concerns. One participant suggested that end-to-end threads could be used to identify and resolve contradictory architecture decisions. These threads might be constructed by chaining together existing business processes. Discussion ensued about the cost of building end-to-end threads, and there was agreement that more study in this area is necessary. Pilot applications of the approach in multiple contexts could provide insight.
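One way to picture chaining business processes into an end-to-end thread, and using the thread to surface contradictory architecture decisions, is sketched below. The process steps, concerns, and decision annotations are invented for illustration:

```python
# Illustrative sketch: build an end-to-end thread from existing business
# processes, then flag steps that pin the same concern to different decisions.
def build_thread(processes, chain):
    """Concatenate the steps of each named process into one thread."""
    return [step for name in chain for step in processes[name]]

def conflicting_decisions(thread):
    """Report (concern, first_decision, conflicting_decision) triples."""
    seen, conflicts = {}, []
    for step, concern, decision in thread:
        if concern in seen and seen[concern] != decision:
            conflicts.append((concern, seen[concern], decision))
        seen.setdefault(concern, decision)
    return conflicts

processes = {
    "order-capture": [("validate order", "messaging", "use ESB"),
                      ("persist order", "data store", "use RDBMS")],
    "order-fulfill": [("route order", "messaging", "point-to-point")],
}
thread = build_thread(processes, ["order-capture", "order-fulfill"])
conflicts = conflicting_decisions(thread)
# the thread exposes contradictory messaging decisions along the flow
```

The cost question raised by the participants shows up here as the effort of annotating each process step with its concerns and decisions; the chaining and checking themselves are cheap once that data exists.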


Evaluation Methods

In current practice, EA evaluation is not performed systematically within most organizations. Technical evaluations are more often performed at the project/solution level, based on perceived importance and risk. When performed, these evaluations use ad hoc methods relying on the expertise and experience of the reviewers. (One participant characterized the approach as "heads on sticks.")

The participants discussed whether business threads and quality attribute scenarios could be applied to structure and systematize EA evaluations, using the SoS Architecture Evaluation Method, which is based on the ATAM process model (see Figure 2 in Appendix B). Many agreed that the approach would work for most parts of an EA (for example, the TOGAF Application, Data, and Technology Architectures), but not for the business architecture. The return on investment for modeling and analysis in the business architecture domain may not justify performing those activities to the level required for comprehensive evaluation. Work in this area is typically heavy on governance and light on analysis. Evaluations focus on line-of-sight to business goals and depend on the expertise and experience of the reviewers. Although more systematic and perhaps quantitative evaluation methods might be helpful, there are significant technical, cultural, and organizational challenges to adopting them.

Evaluations using the ATAM process model could be performed at the project/solution level, with each project treated as a system of systems, and risks and challenges rolled up to the overall EA level. The process would have to be extended to include evaluation of line-of-sight to business goals. More analysis, including pilot engagements in several contexts, would be needed to validate the usefulness of ATAM-based methods.

There was discussion about whether evaluation approaches for the other three TOGAF architectures (Application Architecture, Data Architecture, and Technology Architecture) would be the same, or if each type of architecture would require different methods. In particular, participants identified Data Architecture as different: in most cases it involves detailed models and design standards, and evaluation needs to encompass the data models, data exchanges, import/export, and the tools to manage the data. In general, there was agreement that the evaluation process might not be different for each type of architecture, but the quality attributes and risk impact analysis would certainly be different.

Participants described varying levels of investment in EA evaluation. One organization would spend up to four days with four reviewers to evaluate a project that was identified as architecturally significant, but the same organization might not review other projects. Another organization spends only three hours with eight reviewers for most projects. Justifying investment in architecture evaluation is an issue in many organizations, so for a method to be successful in practice, it must scale down to short, lighter-weight evaluation scenarios.


Federation and Acquisition

The federation of enterprise architectures is sometimes required, in contexts ranging from military coalitions to post-merger corporate integration. Participants identified this as problematic in practice and suggested that research on evaluation of federated enterprise architectures might produce valuable results.

There was discussion about the role of EA in supporting the acquisition of services, in particular the linkage between EA and contract service-level agreements.Some participants were trying to implement such support, but open questions persisted about how to scale it to the enterprise level and how to include acquired services in the evaluation process.

treated as one or more systems of systems. Again, a pilot study to use the Business Thread Workshop and Enterprise Architecture Evaluation (described in Applying SoS Approaches to Enterprise Architectures in Appendix B) would quantify the cost and benefit of this approach.

This (or another systematic approach to analysis and evaluation) would be an improvement over the state of the EA evaluation practice, which relies on expertise and experience to assess EA designs and plans.

A need also exists to develop decision criteria to determine when to use quality attribute-based design and evaluation methods during the EA life cycle. The attention paid to each project is different; for example, incremental change projects tend to get less attention than transformative change projects. The decision criteria must accommodate this. Checklists or tools for making these decisions in a systematic way would be very helpful.

6. Any EA analysis and evaluation methods should avoid drilling down to software architectures. The methods should include software and system architects as both stakeholders and design collaborators, but EA methods should only look at attributes of the software and system architectures that are significant to the EA.


Future Work

It was suggested that the SEI perform pilot studies to investigate issues that may arise during overlay of the EA Engagement Model (shown in Figure 3 in Appendix B) onto various EA development models, such as the TOGAF Architecture Development Model (ADM) (The Open Group 2009). Issues such as EA evaluation timing and scope, structure of EA evaluation results, and documentation available to support the EA evaluation need to be aligned between the EA development model and the SEI Engagement Model.

These pilot studies should also examine the effort required to introduce and apply methods such as the use of end-to-end business threads for capturing quality attribute concerns to support EA design and evaluation. In addition, the pilot studies should investigate how service-level contracts for acquired services can be included in the evaluation scope.

Methods for characterizing the scope and risk of EA projects are needed. This characterization would inform investment decisions regarding the level of analysis and evaluation that is appropriate for each project.

Development of a comprehensive mapping between the EA terminology used in DoD, other government agencies, and commercial practice is needed to allow application of principles and practices across domains.

Appendix A – Survey of Enterprise Architecture Definitions

enterprise architecture defines how information and technology will support the business operations and provide benefit for the business.

It illustrates the organization's core mission, each component critical to performing that mission, and how each of these components is interrelated. These components include:

− Guiding principles
− Organization structure
− Business processes
− People or stakeholders
− Applications, data, and infrastructure
− Technologies upon which networks, applications, and systems are built

Guiding principles, organization structure, business processes, and people don't sound very technical. That's because enterprise architecture is about more than technology. It is about the entire organization (or enterprise) and identifying all of the bits and pieces that make the organization work.

• (Lapkin 2006, for the Gartner Group) Enterprise architecture is the process of translating business vision and strategy into effective enterprise change by creating, communicating and improving the key principles and models that describe the enterprise's future state and enable its evolution. The scope of the enterprise architecture includes the people, processes, information and technology of the enterprise, and their relationships to one another and to the external environment. Enterprise architects compose holistic solutions that address the business challenges of the enterprise and support the governance needed to implement them.

• (SearchCIO.com 2007) An enterprise architecture (EA) is a conceptual blueprint that defines the structure and operation of an organization. The intent of an enterprise architecture is to determine how an organization can most effectively achieve its current and future objectives (also used by [Platt 2002] at Microsoft MSDN and [Ruest 2006] at IBM DeveloperWorks).

• (Zachman 2008) Architecture is the set of descriptive representations that are required to create an object.


Appendix B – SEI Enterprise Architecture Analysis and Evaluation Engagement Model

Introduction

We believe that EA is critical to achieving business goals and that architectures are shaped by quality attribute requirements (such as those identified in Section 4.3 above).So we consider the following questions:

• How do we ensure that we have correctly and completely translated business goals into quality attribute requirements?

• How do we ensure that these quality attribute requirements are reflected in the tradeoffs and decisions that shaped the EA?

We begin by reviewing the SEI perspective on architecture-centric engineering. Next we discuss how that approach scales from its original software context through systems and systems of systems. We review the SEI methods applicable to systems and systems of systems, and finally propose how those methods can be extended to apply to enterprise architectures.


An Architecture-Centric Perspective

The SEI approach to architecture is grounded in the following tenets:

• Every system has an architecture, regardless of scale.

• Architecture is the appropriate abstraction for reasoning about business or mission goal satisfaction.

• Quality attributes have a dominant influence on a system's architecture.

• Value derived from business and mission goals governs quality attribute tradeoffs.

• Well-founded, cost-effective measurements and analyses are the bases for acquiring confidence about system properties.

• Architectural prescriptions must be demonstrably satisfied by the implementation.

• Architectural decisions made today must appropriately reflect the drivers of system change.

We define the architecture of a computing system as the structures of the system, which comprise software elements, the externally visible properties of those elements, and the relationships between them (Bass 2003). This definition is equally applicable to cases where the architecture is accidental and to cases where the architecture is intentional.

In an intentional architecture, the structures result from decisions made by an architect. Each decision is a tradeoff that promotes some qualities of the system while diminishing other qualities.

The traceability of quality attributes to business or mission goals provides the decision criteria for these tradeoffs. In the case of accidental architectures, decisions may be made by any stakeholder, and tradeoffs are not systematically traceable to business goals.


Scaling to Address Enterprise Architecture

The SEI has extended methods for architecture analysis and evaluation in a relatively direct manner from software up through SoS (Gagliardi 2009). All of these methods are characterized by:

• direct stakeholder participation in specification of quality attribute requirements and in architecture evaluation

• use of concrete scenarios or end-to-end threads to define quality attribute requirements and as the basis for architecture evaluation

• recognition that none of the methods is exhaustive; the results depend on engaging a sufficient diversity of stakeholders to address the most important quality attribute requirements

In considering enterprise architectures, a way to extend these methods was not obvious. Part of the difficulty may be due to definitional mismatch. Appendix A lists a number of definitions of EA, which share the following themes:

• An EA is composed of (or realized by) four "sub-architectures": (1) business architecture, (2) information or data architecture, (3) application architecture, and (4) technology or infrastructure architecture.

• EA refers to both a process and the artifacts produced by the process.

• The elements of an EA include people.

The notions of structure, quality attributes, and tradeoffs are not explicit in most of the discussions of EA. (This and other differences between the genres of software, system, system of systems, and EA were explored in detail at an earlier SEI workshop [Bergey 2009].) Furthermore, the diversity of stakeholders and the number of scenarios or business processes to consider in an SoS architecture or EA can become intractable, risking spotty coverage of quality attribute requirements or leading to a very long process to achieve adequate breadth.

These differences could lead one to conclude that enterprise architectures are fundamentally different from system architectures (Booch 2010). On the other hand, John Zachman, the father of EA, asserted that "Architecture is Architecture is Architecture" (Zachman 2007). This implies that we should be able to apply the principles and practices that have proven effective for analyzing and evaluating software architectures to architectures for systems, systems of systems, and enterprises.

Architecture evaluations based on methods such as the ATAM (Clements 2002) or the SEI System of Systems Architecture Evaluation Method (Gagliardi 2009) begin by identifying business or mission goals for the system. The business goals are then reflected in quality attribute requirements and specified using concrete scenarios. The scenarios are used to analyze the architecture to identify decisions and tradeoffs, and then to determine if the decisions and tradeoffs reflected in the architecture's structures are consistent with the quality attribute requirements.

In attempting to extend existing methods, we began by questioning the primacy of quality attribute requirements as the driver for EA. In looking at the definitions of EA, we considered other perspectives on EA that lead to different evaluation approaches:

• EA is a process. We can evaluate the quality of the process and adherence to the process with methods like CMMI.

• The EA process is carried out by individuals and teams working within an organization. We can evaluate the capability of the people, teams, and organization using a method like the SEI Architecture Capability Assessment (Bass 2009).

• Business processes are a first-order element in EA. We could use an Organizational Coordination Theory perspective to evaluate alignment between the business processes and organizational structures (Bass 2008).

While these approaches may complement an architecture-centric evaluation approach, we take the position that methods building on the ATAM and the SEI System of Systems Architecture Evaluation Method are necessary to adequately evaluate an EA to ensure alignment of the EA to business goals.


Background – SEI Methods for Software-Reliant Systems and Systems of Systems

The following sections provide background on mature SEI methods for analyzing and evaluating software-reliant systems and systems of systems. These methods provide the basis of the EA methods described below.


Quality Attribute Elicitation for Software-Reliant Systems and Systems of Systems

Organizations frequently have difficulty developing quality attribute requirements (Barbacci 2003).The SEI Quality Attribute Workshop was developed to provide a systematic method for quality attribute requirement elicitation (Barbacci 2003).

The method brings together as many as 20 stakeholders and uses scenarios to help them express their quality attribute requirements for the system. It has been used to elicit quality attribute requirements for dozens of software-reliant systems.

The Quality Attribute Workshop (QAW) is a facilitated method that engages system stakeholders early in the system development life cycle to discover the driving quality attributes of a software-intensive system. The QAW is system-centric and stakeholder-focused; it is used before the software architecture has been created. The QAW provides an opportunity to gather stakeholders together to provide input about their needs and expectations with respect to key quality attributes that are of particular concern to them.

As originally developed, the method takes a "bottom-up" brainstorming approach, and the output is simply a list of quality attributes, concerns, and scenarios.In practice, sometimes the intermediate results are structured into a utility tree partway through the workshop, and the elicitation then continues in a more "top-down" fashion.In the case of very large or complex systems, there may be a series of QAWs, each focused on a different slice of functionality.
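As a hedged illustration of the "top-down" restructuring described above, the intermediate utility tree can be represented as a nested mapping from quality attributes to concerns to rated scenarios. The attribute names, scenarios, and (importance, difficulty) ratings below are invented for this sketch, not drawn from the report; the rating convention follows the ATAM's practice of tagging each scenario with its business importance and architectural difficulty.

```python
# Sketch of a utility tree: quality attribute -> concern -> rated scenarios.
# Each scenario carries a hypothetical (business importance, architectural
# difficulty) rating, e.g., (H,M) = high importance, medium difficulty.
utility_tree = {
    "Performance": {
        "latency": ["(H,H) orders are acknowledged within 2 seconds at peak load"],
        "throughput": ["(H,M) the system sustains 500 orders per minute"],
    },
    "Availability": {
        "failover": ["(H,M) a failed node is replaced within 30 seconds"],
    },
}

# Top-down elicitation can then target underpopulated branches of the tree.
sparse = [qa for qa, concerns in utility_tree.items()
          if sum(len(scenarios) for scenarios in concerns.values()) < 2]
# sparse -> ["Availability"]
```

Structuring the brainstormed output this way makes gaps visible, which is one reason the workshop sometimes pivots from bottom-up brainstorming to top-down elicitation partway through.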

The use of scenarios to help stakeholders express quality attribute requirements extends through many of the SEI architecture-centric methods. A basic scenario describes how the system responds to a particular stimulus under a particular operating mode or environment. The basic scenario can be extended to include the stimulus source, a specific response measure, and the system elements involved in the scenario. This scenario-based methodology has proven successful in eliciting actionable quality attribute requirements, so that architects do not have to rely on vague stakeholder requests like "highly available," "low latency," and "user friendly."
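The extended scenario described above can be sketched as a simple record type. This is an illustrative data structure, not an SEI artifact; the field names mirror the scenario parts named in the text, and the example values are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class QualityAttributeScenario:
    """One elicited quality attribute scenario (extended form)."""
    stimulus: str          # the event the system must respond to
    source: str            # who or what generates the stimulus
    environment: str       # operating mode or conditions
    elements: list[str]    # system elements involved in the response
    response: str          # how the system reacts
    response_measure: str  # concrete, testable measure of the response

# Hypothetical example: a concrete measure replaces the vague request
# "low latency" with something an architect can analyze and test.
latency = QualityAttributeScenario(
    stimulus="customer submits an order during peak load",
    source="web storefront user",
    environment="normal operations, 500 concurrent sessions",
    elements=["order service", "payment gateway"],
    response="order is accepted and queued for fulfillment",
    response_measure="95th-percentile acknowledgment under 2 seconds",
)
```

The value of the extended form is that every field is checkable: an evaluation team can ask the architect to show which elements handle the stimulus and whether the response measure is achievable.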

The QAW has been extended to address the needs of military systems of systems. The Mission Thread Workshop is also a facilitated process that brings together SoS stakeholders to both augment existing mission threads with quality attribute considerations that will shape the SoS architecture and identify SoS architectural challenges.


Architecture Evaluation of Software-Reliant Systems and Systems of Systems

The SEI Architecture Tradeoff Analysis Method (ATAM) was originally developed for evaluating the software architecture of a software-reliant system. It brings together a trained evaluation team, the decision makers for the system and architecture, and representatives of the architecture's stakeholders. The facilitated process helps stakeholders ask the right questions to discover potentially problematic architectural decisions.

The ATAM is conceptually depicted in Figure 2. The method begins by identifying business goals for the system. The business goals are then reflected in quality attribute requirements and specified using concrete scenarios. The scenarios are used to analyze the architecture and determine if the decisions and tradeoffs that led to the architecture's structures are consistent with the quality attribute requirements. The method identifies risks (potentially problematic decisions) and nonrisks, and explicitly identifies tradeoffs between quality attributes and sensitivity points (decisions that significantly affect the ability of the architecture to achieve a particular quality attribute response). Risks are summarized into "risk themes" that provide an executive summary of the evaluation and help organize risk mitigation planning.

It is up to the organization developing the architecture and system to decide whether and how to address the risks. The quantification of each risk and associated mitigation costs can only be determined within the business and organizational context where the system is being developed. The SEI Cost Benefit Analysis Method (CBAM) provides a structured method for analyzing alternative courses of action (Clements 2002).

The ATAM was recently extended to evaluate software and systems, that is, the software and associated electrical, mechanical, and other physical elements of the software-reliant system (Gagliardi 2009). For these evaluations, domain experts from the related physical disciplines are given just-in-time training to qualify as evaluation team members, and the quality attribute scope is extended to the physical domains of interest.


4 A mission thread is a sequence of activities and events beginning with an opportunity to detect a threat or element that ought to be attacked and ending with a commander's assessment of damage after an attack.

The principles underlying the ATAM have been further extended to create the SEI System of Systems Architecture Evaluation method, which has been applied in C4ISR contexts (Gagliardi 2009). The method uses mission threads augmented with quality attribute concerns (generated during the Mission Thread Workshops described above) instead of scenarios to express the quality attribute requirements for the SoS. Mission threads are selected for analysis to reflect concerns in several broad categories: operational concerns (tactical operation of the SoS), sustainment concerns (field maintenance, updates, and training), and development concerns (including test, integration, and associated processes and facilities). However, the complexity of SoS architectures, along with the number and breadth of stakeholders, is not conducive to performing exhaustive analysis, and so the success of the method is sensitive to which mission threads are selected for analysis. It is for this reason that the method is targeted to "first pass" risk identification.


Applying SoS Approaches to Enterprise Architectures

The SEI approach to analyzing and evaluating enterprise architectures is based on the methods used for system-of-systems architectures. Figure 3 below shows how the SEI SoS Engagement Model might be extended to apply to enterprise architectures. In particular, the Mission Thread Workshop is modified as the Business Thread Workshop, and the SoS Architecture Evaluation is extended as the Enterprise Architecture Evaluation. In each case, the extensions are straightforward and are described below.


Quality Attribute Elicitation for Enterprise Architectures -Business Thread Workshop

The Business Thread Workshop (BTW) is a facilitated engagement where the EA stakeholders augment a business thread with quality attribute considerations. A business thread is defined as an end-to-end flow through the enterprise, perhaps encompassing multiple business processes. An example might run from customer order placement through a contact center, to order fulfillment, billing and accounts receivable, delivery, return authorization, and receiving and accounts payable. At each step in the flow, quality attribute considerations (throughput, latency, measurability, auditability, etc.) are attached to the step. Additionally, overarching quality attribute concerns are identified.

Based on our experience with Mission Thread Workshops and SoS Architecture Evaluations, there are three broad categories of business threads that may be considered during the BTW:

• Core Business Threads – these threads trace through the core business processes of the enterprise. The thread described above is an example of a Core Business thread.

• Operations Threads – these threads trace through support operations processes. Examples include deployment, migration, day-to-day management, training, and disaster recovery.

• Development Threads – including development, test, and integration.

Additionally, business threads can be categorized as

• "As-Is" – these threads reflect as-is capabilities that must be maintained as the EA changes, for example during integration of an acquired company.

• "To-Be" – these threads reflect well-defined future capabilities that must be supported in a new or evolved architecture.

• "What-If" -these threads are analogous to "Growth Scenarios" in the ATAM, exploring opportunities and testing the limits of the EA.
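The two classifications above can be combined: every business thread has both a category (what part of the enterprise it traces) and a horizon (As-Is, To-Be, or What-If). A minimal sketch of an augmented thread, using invented names and quality attribute annotations rather than any actual workshop output, might look like this:

```python
from dataclasses import dataclass, field
from enum import Enum

class ThreadCategory(Enum):
    CORE_BUSINESS = "core business"
    OPERATIONS = "operations"
    DEVELOPMENT = "development"

class Horizon(Enum):
    AS_IS = "as-is"
    TO_BE = "to-be"
    WHAT_IF = "what-if"

@dataclass
class ThreadStep:
    activity: str
    qa_considerations: list[str] = field(default_factory=list)

@dataclass
class BusinessThread:
    name: str
    category: ThreadCategory
    horizon: Horizon
    steps: list[ThreadStep]
    overarching_concerns: list[str] = field(default_factory=list)

# Hypothetical thread modeled on the order-placement example in the text.
order_to_cash = BusinessThread(
    name="order placement through accounts payable",
    category=ThreadCategory.CORE_BUSINESS,
    horizon=Horizon.AS_IS,
    steps=[
        ThreadStep("order placement via contact center", ["throughput", "latency"]),
        ThreadStep("order fulfillment", ["auditability"]),
        ThreadStep("billing and accounts receivable", ["measurability", "auditability"]),
    ],
    overarching_concerns=["end-to-end traceability"],
)
```

Selecting threads across both dimensions (for example, at least one Operations thread and one What-If thread) is one way to guard against the spotty coverage risk noted earlier.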

The SEI has piloted a series of BTWs with a financial services customer to develop analysis and evaluation scenarios, with generally positive results.

Architecture Evaluation for Enterprise Architectures

The SEI Enterprise Architecture Evaluation method has been developed to identify EA risks. It is identical to the SoS Architecture Evaluation method, except that it uses augmented business threads instead of augmented mission threads.

Like the SoS Architecture Evaluation Method, it is sensitive to the threads chosen
for analysis and to stakeholder participation.It is concerned with stakeholder participation in both (1) the BTWs that augment the business threads with quality attributes concerns and (2) the architecture evaluation itself.

The evaluation is carried out by walking each augmented business thread through the architecture, and having the architect use EA documentation artifacts to demonstrate how the architecture will support the functionality and quality attribute requirements embodied in the augmented thread.In cases where risks indicate that the underlying systems may not adequately satisfy the architecture requirements, then more detailed evaluation of those systems using the System and Software ATAM may be warranted.
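The walk-through described above can be sketched as a simple loop that classifies each (step, concern) pair as a risk or nonrisk. In the real method this judgment comes from the architect demonstrating support with EA documentation; here that demonstration is stood in for by a caller-supplied predicate, and all names and data are hypothetical.

```python
def walk_thread(thread, supports):
    """Walk one augmented business thread through the architecture.

    `thread` is a dict with a name and steps, each step carrying quality
    attribute considerations. `supports` is a predicate standing in for the
    architect's documentation-backed demonstration that a step's concern
    is satisfied.
    """
    risks, nonrisks = [], []
    for step in thread["steps"]:
        for concern in step["qa_considerations"]:
            finding = (thread["name"], step["activity"], concern)
            if supports(step["activity"], concern):
                nonrisks.append(finding)
            else:
                risks.append(finding)  # candidate for deeper system-level analysis
    return risks, nonrisks

thread = {
    "name": "order-to-cash",
    "steps": [
        {"activity": "order placement", "qa_considerations": ["latency"]},
        {"activity": "billing", "qa_considerations": ["auditability"]},
    ],
}
risks, nonrisks = walk_thread(thread, supports=lambda act, c: c != "auditability")
# risks -> [("order-to-cash", "billing", "auditability")]
```

A risk found at this level does not settle the question; as the text notes, it signals that a more detailed evaluation of the underlying systems (e.g., with the System and Software ATAM) may be warranted.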

Since this workshop was held, the SEI has performed an Enterprise Architecture Evaluation to evaluate the "Enterprise Services and Processes" part of an EA. This included the generation of end-to-end business threads, augmented with quality attribute considerations from various stakeholders. The end-to-end thread generation and augmentation was simple and straightforward.

Seven end-to-end threads were developed and augmented in less than half a day. Stakeholders participating in the evaluation felt that these end-to-end threads provided adequate coverage for the entire EA. Included within the scope of this evaluation were the (1) business processes, (2) enterprise services, (3) user interactions, (4) engineering change processes, (5) engineering development, (6) integration, and (7) deployment processes. The evaluation was executed in one day and successfully identified risks, issues, and non-risks for the Enterprise Services and Processes for the EA. The EA evaluation method appears to be very promising for use in evaluating business, data, application, and technology architecture domains.





Figure 1: Enterprise Architecture Ties the Enterprise's Operating Model to the Foundation for Execution Through an Engagement Model (Ross, Weill, & Robertson, 2006)

Figure 2: Architecture Tradeoff Analysis Method (ATAM) Process Flow

Figure 3: SEI Enterprise Architecture Engagement Model



Table 1: Workshop Participants

Name                 Organization
David Cuyler         Sandia National Laboratories
Michael Gagliardi    SEI Architecture Centric Engineering Initiative
Linda Parker Gates   SEI Acquisition Support Program
John Grasso          Federal Railroad Administration
COL Michael Gray     U.S. Army CIO/G-6
John Klein           SEI Architecture Centric Engineering Initiative
Ian Komorowski       Whitney, Bradley, & Brown, Inc.
Donna Marcum         Veterans Health Administration
Plamen Petrov        Blue Cross Blue Shield Association
Todd Tieger          Deloitte & Touche LLP
Jeff Tyree           Capital One
® Carnegie Mellon is registered in the U.S. Patent and Trademark Office by Carnegie Mellon University.® Architecture Tradeoff Analysis Method and ATAM are registered in the U.S. Patent and Trademark Office by Carnegie Mellon University.


Table 2: Workshop Agenda

Day 1: 20 April 2010
Time    Topic

Table 3: Workshop Questions

CMU/SEI-2010-TN-023
Acknowledgments

The authors of this report gratefully acknowledge the participants of the workshop, whose contributions and energy made it possible.

NO WARRANTY

THIS CARNEGIE MELLON UNIVERSITY AND SOFTWARE ENGINEERING INSTITUTE MATERIAL IS FURNISHED ON AN "AS-IS" BASIS. CARNEGIE MELLON UNIVERSITY MAKES NO WARRANTIES OF ANY KIND, EITHER EXPRESSED OR IMPLIED, AS TO ANY MATTER INCLUDING, BUT NOT LIMITED TO, WARRANTY OF FITNESS FOR PURPOSE OR MERCHANTABILITY, EXCLUSIVITY, OR RESULTS OBTAINED FROM USE OF THE MATERIAL. CARNEGIE MELLON UNIVERSITY DOES NOT MAKE ANY WARRANTY OF ANY KIND WITH RESPECT TO FREEDOM FROM PATENT, TRADEMARK, OR COPYRIGHT INFRINGEMENT.

5 Summary

Workshop Findings

At the close of the workshop, the participants agreed on the following key summary points:

1. Organizations can start by scoping EA as IS/IT (or Application, Data, and Technology Architectures, using TOGAF nomenclature), and then open the scope to include Business Architecture. Although it may not be necessary to consider the full scope of the business architecture if EA is limited to the IS/IT architecture, supporting enterprise-wide strategic decision making may require consideration of a broader business architecture scope.
2. Any methods for EA need to accommodate high variability in structure (organization, roles, relationships, etc.) between the IS/IT and the business architecture owners. This variability stems from historical, cultural, and maturity differences between enterprises. Variability includes methods and evaluation results structured for various process frameworks, including

• Capability Maturity Model Integration (CMMI®) framework
• GAO (defines practices)
• OMB EA Assessment Framework (defines EA usage)
• National Association of State Chief Information Officers (NASCIO) (originally used GAO and OMB frameworks, then developed a separate self-assessment framework)
• Gartner Group
• TOGAF
• others

3. Line-of-sight, or alignment, between the business objectives and strategies must be carried throughout the EA (including the business architecture). This alignment must be developed down to an individual project (with documented traceability) and evaluated for "goodness." End-to-end business threads augmented with quality attribute concerns (as described in Quality Attribute Elicitation for Enterprise Architectures – Business Thread Workshop in Appendix B) seem promising for capturing alignment from architecture to business goals. Threads must trace back to business goals or capability objectives.

4. Use of end-to-end business threads at the EA level may be beneficial for capturing quality attribute concerns and requirements to support EA design and evaluation. Ideally, these would be developed incrementally as the EA evolves and then maintained as part of the architecture knowledge base. Introducing this method after the fact, in an "up and running" enterprise with an existing EA, may be prohibitively time consuming and expensive. A pilot study in this area may help clarify the effort required to develop end-to-end business threads.

5. It appears feasible to apply quality attribute principles to drive architecture requirements and evaluation criteria for architecturally significant projects/solutions. These projects would be

® CMMI is registered in the U.S. Patent and Trademark Office by Carnegie Mellon University.

Appendix A – Survey of Enterprise Architecture Definitions

• (Ross, Weill, & Robertson, 2006) Enterprise Architecture is the organizing logic for business processes and IT infrastructure reflecting the integration and standardization requirements of the firm's operating model… The IT unit typically addresses four levels of architecture below the enterprise architecture: business process architecture… data or information architecture… applications architecture… and technology architecture… The term enterprise architecture can be confusing because the IT unit in some companies refers to one of these architectures, or the set of all four architectures, as the enterprise architecture.

• (Enterprise Architecture Research Forum 2009) EA is the continuous practice of describing the essential elements of a socio-technical organization, their relationships to each other and to the environment, in order to understand complexity and manage change.

• (The Open Group 2009) There are four architecture domains that are commonly accepted as subsets of an overall enterprise architecture, all of which TOGAF is designed to support.
1. The business architecture defines the business strategy, governance, organization, and key business processes.
2. The data architecture describes the stru