Quality and Architecture Evaluation
Vrije Universiteit Amsterdam
dr. Remco de Boer
Summary
This document provides an overview of quality and architecture evaluation. It covers the course project assignment, the relationship between software architecture and quality attributes, how quality attributes relate to tactics and design checklists, and several evaluation techniques: architecture analysis, architecture documentation review, ATAM, and architecture metrics, with example questions and scenarios throughout.
Full Transcript
Quality and Architecture: Evaluation
Digital Architecture XM_0127
dr. Remco de Boer, Vrije Universiteit Amsterdam

Project assignments
1. Documentation Roadmap
2. Stakeholder profiles
   - Description
   - Expectations and demands
   - Business Goals
   - Architecturally Significant Requirements
3. Functional View
4. Selected Viewpoints
   - Description
   - Addressed concerns
   - Stakeholders
   - Modeling techniques, notation, structure/metamodel
   - Rationale (for viewpoint selection)
5. Selected Views
   - Primary presentation
   - Element catalog
   - Design decisions and rationale (for this view)
6. Mapping between views
7. Design decisions and rationale "beyond views", i.e. for decisions that apply to more than one view
8. Assessment
   - Selected assessment scenarios
   - Utility tree
   - Analysis of architectural approaches
9. Glossary

Project assignment: step 8 (Assessment)
Prepare a number of scenarios (at least 2 per stakeholder) as a means to assess your architecture. These scenarios should be extracted from your stakeholder profiles and be expressed in the form of quality attribute scenarios. Explain why you selected these scenarios. Restrict yourself to scenarios that together address only a few (say 2 or 3) quality attributes. The chosen scenarios should also be those that challenge the architecture the most.
With the selected scenarios, create a utility tree in which:
- you as a group categorize the scenarios according to the quality attribute(s) and/or quality attribute refinement(s) assessed by each scenario;
- you prioritize all scenarios according to their importance for the stakeholders (L, M, H) and their architecture impact (L, M, H).
Analyze the four highest-ranked scenarios and identify sensitivity points, tradeoff points, risks/non-risks, and possible risk themes related to your architecture.
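The assignment above asks for six-part quality attribute scenarios organized in a utility tree with L/M/H priorities. A minimal sketch of how these could be represented and ranked; all scenario content, class names, and field names are illustrative, not prescribed by the assignment:

```python
# Sketch: six-part QA scenarios (source of stimulus, stimulus, artifact,
# environment, response, response measure) with utility-tree priorities.
from dataclasses import dataclass

RANK = {"H": 3, "M": 2, "L": 1}

@dataclass(frozen=True)
class QAScenario:
    attribute: str         # quality attribute (refinement) being assessed
    source: str            # source of stimulus
    stimulus: str
    artifact: str
    environment: str
    response: str
    response_measure: str  # the part that makes the scenario testable
    importance: str        # L/M/H importance for the stakeholders
    impact: str            # L/M/H architecture impact

scenarios = [
    QAScenario("Performance (response time)", "user", "account update",
               "billing system", "peak load", "transaction completes",
               "< 0.75 s", "H", "M"),
    QAScenario("Security (integrity)", "attacker", "intrusion attempt",
               "patient records", "normal operation", "intrusion reported",
               "within 90 s", "H", "H"),
]

# Highest-ranked scenarios first: stakeholder importance, then impact.
ranked = sorted(scenarios,
                key=lambda s: (RANK[s.importance], RANK[s.impact]),
                reverse=True)
print(ranked[0].attribute)
```

Sorting on the (importance, impact) pair is one simple way to pick the "four highest-ranked scenarios" the assignment asks you to analyze.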
We learned
- How to sharpen stakeholder concerns and quality requirements: with profiles and scenarios
- How to illustrate the way an architecture solution addresses them: with views and viewpoints
- How to systematize design decision-making: with options exploration, capturing assumptions and rationale

Outline
Quality
- On the relationship between quality and architecture
- Expressing quality in a disciplined way: scenarios
- From quality to architecture: tactics and design checklists
Evaluation
- Architecture analysis
- Architecture documentation review (not in Bass book; see report on Canvas)
- ATAM (Architecture Tradeoff Analysis Method)
- Architecture metrics

Architecture influence
- Architecture is critical for the realization of many qualities (they should be designed in, and evaluated)
- Architecture by itself is unable to achieve qualities; the truth is in the (implementation) details
- Qualities are never achieved in isolation; they interact

Problems with quality attributes
- QAs are not testable (e.g., what does "modifiable" mean?)
- The concerns of QAs overlap (is a DoS attack an aspect of availability? security? usability? performance?)
- Each QA community develops its own vocabulary (event vs. attack vs. fault vs. user input)

Perspectives on quality
- Transcendental quality: I can't define it, but I know when I like it
- Product-based quality: related to attributes of the system
- User-based quality: fitness for use; (individual) user preferences
- Manufacturing-based quality: conformance to specs
- Value-based quality: deals with costs and profits
[D. Garvin, Managing Quality, 2009]

ISO 25010 (product characteristics)
- Functional suitability
- Performance efficiency
- Compatibility
- Usability
- Reliability
- Security
- Maintainability
- Portability
[Slide shows the cover page of ISO/IEC FDIS 25010:2010(E)]

Quality attributes not in ISO 25010
- Availability (part of reliability? security?)
- Reusability (part of maintainability?)
- Traceability (part of functional suitability?)
Main quality attributes in Bass (Ch. 4-13)
Availability, Deployability, Energy Efficiency, Integrability, Modifiability, Performance, Safety, Security, Testability, Usability

QA scenarios
[Diagram: the six parts of a quality attribute scenario]
- Source of stimulus
- Stimulus
- Artifact
- Environment
- Response
- Response measure

Decompose QA scenarios into tactics
- Tactics: design decisions that influence the achievement of a quality attribute response
- Given a QA, tactics help the architect during design
- One tactic focuses on one QA (no tradeoffs)
- Tactics are packaged in design patterns; design patterns do include tradeoffs
- Example: given the business goal "time-to-market," the architectural decision would be to increase cohesion and reduce coupling to support parallel development by multiple teams

Example: tactic for availability
- An example tactic is "redundant spare" to increase availability
- This would require "synchronization" as well
- Thus note: tactics can influence/refine other tactics
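The redundant-spare tactic and the synchronization it requires can be sketched in a few lines. This is a toy illustration, not code from the course; the class and method names are assumptions, and the "process on both nodes" style anticipates the active-redundancy pattern discussed next:

```python
# Sketch of the "redundant spare" availability tactic. Synchronization keeps
# the spare's state current, so it can take over when the active node fails.

class Node:
    def __init__(self, name: str):
        self.name = name
        self.state: dict = {}
        self.healthy = True

    def process(self, key: str, value: str) -> None:
        self.state[key] = value

class RedundantPair:
    def __init__(self):
        self.active = Node("primary")
        self.spare = Node("spare")

    def handle(self, key: str, value: str) -> None:
        if not self.active.healthy:
            self.failover()
        self.active.process(key, value)
        self.spare.process(key, value)  # synchronization tactic in action

    def failover(self) -> None:
        # The spare already holds identical state, so the swap is cheap.
        self.active, self.spare = self.spare, self.active

pair = RedundantPair()
pair.handle("order-1", "received")
pair.active.healthy = False           # simulate a fault in the primary
pair.handle("order-2", "received")    # spare takes over with full state
print(pair.active.name)
```

Note how the two tactics refine each other: without the synchronization call, the failover would lose state.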
- Tactics can be combined into "patterns": a pattern to support availability would include a redundancy tactic and a synchronization tactic. Example: "active redundancy," where all nodes receive and process identical inputs
- Tactics can be grouped hierarchically (trees)

Design patterns
- Definition: a design pattern is a recurring structure of communicating components that solves a general design problem within a particular context
- Does not concern the whole system, just a few (interacting) components; also termed micro-architecture
- Examples:
  - Object pool (recycle objects, avoid expensive creation/release)
  - Singleton (ensure a class has only one instance)
  - Adapter (wrap an object in an interface a client expects)
  - Lazy loading (don't load a resource until it is needed)

Patterns and tactics
- Patterns comprise tactics
- E.g., the layers pattern is an amalgam of tactics, such as:
  - Increase semantic coherence, e.g., "hardware dependencies in one layer"
  - Encapsulate/produce an explicit interface to a module
  - Restrict communication, e.g., "can only call from next layer"

Tradeoffs in patterns
A publish-subscribe pattern is used to increase modifiability, but:
- Often is a single point of failure (availability)
- Slows things down (performance)
- Does not provide the possibility to authorize/authenticate clients or servers (security)
- Does not provide testing functionality (testing interface, playback functionality) (testability)

Hierarchy of tactics: energy efficiency
[Diagram: tactics tree]
- Energy Efficiency
  - Monitor resources: Metering, Static classification, ...
  - Allocate resources: Reduce usage, Discovery, ...
  - Reduce resource demand: Manage event arrival, Limit event response, ...
[Evolved from Procaccianti, Lago, and Lewis, Green architectural tactics for the cloud, WICSA 2014]

One tactic leads to another tactic
- Detect faults: in particular, the "ping/echo" tactic aims to determine reachability and the round-trip delay.
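A minimal sketch of the ping/echo tactic: a monitor sends a ping and measures the round-trip delay; a missing, wrong, or late echo flags a fault. The timeout value and the `Component` interface are assumptions for illustration only:

```python
# Ping/echo fault-detection sketch: determine reachability and round-trip delay.
import time

class Component:
    def echo(self) -> str:
        return "echo"

def ping(component: Component, timeout_s: float = 0.5) -> tuple[bool, float]:
    """Return (reachable, round-trip delay in seconds)."""
    start = time.monotonic()
    try:
        reply = component.echo()
    except Exception:
        # No echo at all: treat the component as unreachable.
        return (False, timeout_s)
    delay = time.monotonic() - start
    return (reply == "echo" and delay < timeout_s, delay)

alive, delay = ping(Component())
print(f"reachable={alive}, round-trip={delay:.4f}s")
```

The design issues on the next slide (overhead, ping floods) would show up here as choices about how often to call `ping` and how to rate-limit incoming pings.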
Ping/echo
[Diagram: design issues around the ping/echo tactic]
- How to add ping/echo to the system?
- How to ensure the performance overhead is small enough? (related: the "increase resources" tactic; resource utilization, cost)
- How to prevent a ping flood?
Legend: artifact, tactic, design issue

Checking quality attributes through tactics-based questionnaires
For all tactics for a quality attribute:
- Is the tactic supported in the system?
- Are there any obvious risks with using (or not using) this tactic?
- What are the specific design decisions made to realize the tactic?
- What rationale or assumptions were made in the realization of the tactic?
See section 3.6, and the "Tactics-Based Questionnaire" section in each of the QA chapters (4-13)

Relative cost of error correction
[Chart from Barry Boehm's 1981 book Software Engineering Economics]

Architecture evaluation/analysis
- Assess whether the architecture meets certain quality goals, such as those w.r.t. maintainability, modifiability, reliability, performance
- Mind: the architecture is assessed, while we hope the results will hold for a system yet to be built

Two kinds of questions
- Is this architecture suitable?
- Which of two or more architectures is the most suitable?

[Diagram: analysis of architecture properties is used to predict the qualities of the implemented system]

Analysis techniques
- Questioning techniques: how does the system react to various situations; often make use of scenarios
- Measuring techniques: rely on quantitative measures; architecture metrics, simulation, etc.
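The tactics-based questionnaire described earlier is one questioning technique. A sketch of how its four questions could be recorded per tactic; the tactic name and the answers are illustrative:

```python
# Sketch: recording answers to a tactics-based questionnaire.
questions = [
    "Is the tactic supported in the system?",
    "Are there any obvious risks with using (or not using) this tactic?",
    "What are the specific design decisions made to realize the tactic?",
    "What rationale or assumptions were made in the realization of the tactic?",
]

answers = {
    "ping/echo": [
        "Yes, between the load balancer and each worker node.",
        "Risk: frequent pings add performance overhead; ping floods possible.",
        "Ping every 5 s with a 500 ms timeout.",
        "Assumes network latency stays well under the timeout.",
    ],
}

for tactic, replies in answers.items():
    print(f"Tactic: {tactic}")
    for q, a in zip(questions, replies):
        print(f"  Q: {q}\n  A: {a}")
```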
Nord et al., a structured approach to reviewing ADs
- Establish the purpose of the review (e.g., is the AD ready for a development activity?)
- Establish the subject of the review (which artifacts are needed)
- Build/adopt an appropriate question set; tailor the question set
- Plan the review
- Perform the review
- Analyze and summarize the results

Question set template
- Question set name
- Purpose
- Stakeholders and concerns
- Questions
- Respondents
- Expected answers
- Criticality
- Advice

Example question set
- Name: capturing the right stakeholders and concerns
- Purpose: is the list of stakeholders appropriate; are the concerns complete or over-complete; do stakeholders believe their interests have been captured?
- Stakeholders and concerns: all who have a substantial stake
- Questions: see "Example questions (stakeholders)" below
- Advice: appropriate for an active design review

Example questions (stakeholders)
1) State your stakeholder role and list your concerns
2) Find all places in the AD where your role is listed as being covered
3) Find all places in the AD where your concerns are listed as being addressed
4) Record all concerns of yours that are not covered in the AD, or that are listed unclearly; state the impact of the omission or misunderstanding
5) For each concern, find and record where the concern is addressed; explain why you do or do not believe it is satisfied in the architecture
6) Is there a prioritization of concerns? Do you agree with it?
7) Is an important stakeholder missing?

Example questions (choice of viewpoints)
1) Do the viewpoints frame the concerns of the stakeholders?
2) Is the approach consistent with the development organization's practices/standards?
3) Is the approach consistent with the customer's practices/standards?
4) For each viewpoint, are its models well-defined and clear?
5) What correspondences exist between models within the same viewpoint or across viewpoints?
6) Are all concerns addressed?
7) Can we do with fewer viewpoints?
8) Is the rationale for the viewpoints captured?

Architecture Tradeoff Analysis Method (ATAM)
Designed for:
- Evaluators not familiar with the architecture or business goals
- A system not yet built
- A large number of stakeholders

Benefits
- Financial gains
- Forced preparation
- Captured rationale
- Early detection of problems
- Validation of requirements
- Improved architecture

Participants in ATAM
- Evaluation team
- Project decision makers
- Architecture stakeholders

Phases in ATAM
- Phase 0: partnership and preparation (evaluation team leadership + key decision makers; logistics and arrangements)
- Phase 1: evaluation (evaluation team + decision makers; one to two days)
- Phase 2: evaluation (evaluation team + decision makers + stakeholders; two days)
- Phase 3: follow-up (evaluation team + client)

Steps in ATAM (phases 1 and 2)
Phase 1:
1. Present the method
2. Present business goals (by the project manager of the system)
3. Present the architecture (by the lead architect)
4. Identify architectural approaches (patterns, tactics)
5. Generate the quality attribute utility tree (+ priority, and how difficult)
6. Analyze architectural approaches (scenario walkthrough)
Phase 2:
7. Brainstorm and prioritize scenarios
8. Analyze architectural approaches (as in step 6)
9. Present results (incl. risk themes)

Example utility tree
Root: Utility; branches: QAs, attribute refinements, QA scenarios with [importance, difficulty]:
- Performance
  - Transaction response time
    - [H, M] A user updates a patient's account in response to a change-of-address notification while the system is under peak load, and the transaction completes in less than 0.75 second
    - [L, H] A user updates a patient's account in response to a change-of-address notification while the system is under double the peak load, and the transaction completes in less than 4 seconds
  - Throughput
    - [M, M] At peak load, the system is able to complete 150 transactions per second
- Security
  - Confidentiality
    - [H, L] A physical therapist is allowed to see the part of a patient's record dealing with orthopedic treatment, but not other parts nor any financial information
  - Integrity
    - [H, H] The system resists unauthorized intrusion and reports the intrusion attempt to authorities within 90 seconds

Summary overview: conceptual flow of the ATAM
[Diagram; source: www.sei.cmu.edu]

Important concepts in ATAM
- Sensitivity point: a decision/property critical for a certain quality attribute
- Tradeoff point: a decision/property that affects more than one quality attribute
- Risk: a decision/property that is a potential problem
- Nonrisk: an architectural decision that, upon analysis, is deemed safe
- Risk theme: an overarching theme (from the full set of discovered risks) that identifies systemic weaknesses in the architecture, process, or team
These concepts are overlapping.

Examples
- Sensitivity point (SP): the number of simultaneous DB clients affects the number of transactions per second → the assignment of #clients to a server is a SP w.r.t. the response measure #transactions/sec (performance)
- The level of confidentiality is sensitive to the number of bits of encryption (security)
- The average number of person-days needed for maintenance might be sensitive to the degree of encapsulation (maintainability)
- Tradeoff point: changing the level of encryption → more security, less performance

Examples (cont.)
- Risk: the frequency of heartbeats affects the time needed to detect failed components; assignments that lead to unacceptable values of the response measure are risks
- Risk theme: insufficient attention to backup capability for inability to function in case of HW/SW failures

Lessons learned in previous course editions
- Assessment tends to be perceived as critique
- If the architecture description is too global/not precise enough, the assessment does not tell much
- Likewise if the scenarios are too global

Architecture metrics: analyzability

Analyzability metric
- System breakdown: 1 component is bad; very many components is bad too; somewhere in between is better
- Uniform size of components: the more uniform, the better
- Component Balance metric: SB (system breakdown) * CSU (component size uniformity)

Analyzability metric: system breakdown
[Eric Bouwers et al., Quantifying the analyzability of software architectures, WICSA 2011]

Read
- Quality: Bass et al., chapters 3 (study), 4-13 (read), 14 (skim)
- Evaluation: Bass et al., chapter 21
- A Structured Approach for Reviewing Architecture Documentation (SEI report), on Canvas
- Eric Bouwers et al., Quantifying the analyzability of software architectures, WICSA 2011
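The Component Balance idea (CB = SB * CSU) can be illustrated with toy stand-in formulas; the exact definitions are in Bouwers et al. (WICSA 2011), not reproduced here. This sketch assumes an "ideal" component count and uses 1 minus the Gini coefficient as a uniformity score, both of which are simplifications for illustration:

```python
# Toy illustration of Component Balance: system breakdown (SB) times
# component size uniformity (CSU). Formulas are simplified stand-ins.

def system_breakdown(n_components: int, ideal: int = 6) -> float:
    """1.0 at the assumed ideal count, falling off toward 1 or very many."""
    return max(0.0, 1.0 - abs(n_components - ideal) / ideal)

def gini(sizes: list[float]) -> float:
    """Gini coefficient: 0 for perfectly uniform sizes, near 1 when skewed."""
    sizes = sorted(sizes)
    n, total = len(sizes), sum(sizes)
    cum = sum((i + 1) * s for i, s in enumerate(sizes))
    return (2 * cum) / (n * total) - (n + 1) / n

def component_balance(sizes: list[float]) -> float:
    return system_breakdown(len(sizes)) * (1.0 - gini(sizes))

print(component_balance([100, 100, 100, 100, 100, 100]))  # uniform sizes
print(component_balance([550, 10, 10, 10, 10, 10]))       # one dominant component
```

Even in this toy form, the metric captures the slide's two points: a degenerate breakdown (one huge component) or very non-uniform sizes both pull the score down.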