Model Driven Development – Future or Failure of Software Development?
Ruben Picek, Vjeran Strahonja University of Zagreb, Faculty of Organization and Informatics, Varaždin ruben.picek@foi.hr, vjeran.strahonja@foi.hr

Abstract. This article discusses some issues of the software development paradigm called Model Driven Development (MDD). Its basic idea is to move software development to a higher level of abstraction by using models as primary artifacts, and to transform models into source code with the aid of tools. Currently, there are several approaches to the realization of the MDD paradigm, which should bring many benefits. However, there are still problems to be solved, which will be discussed here. The authors also analyze developers' pros and cons, and give their own opinion on today's open question: will MDD become a failure in trying to deal with the software crisis, like the idea of CASE tools in the 1980s, or will it become the future of software development?

Keywords. MDD, MDD approaches, Software Factories, Software Development

1. Model Driven Development Paradigm
In the last few years, software development has faced many challenges. Requirements of new and existing systems are growing, systems are complex, and it is hard to build them on time and on budget. As an answer to these challenges, a wide spectrum of new approaches has appeared, varying from buzzwords to comprehensive methodologies. One of the most prominent paradigms is Model Driven Development (MDD). MDD represents a set of approaches, theories and methodological frameworks for industrialized software development, based on the systematic use of models as primary artifacts throughout the software development cycle. It targets two roots of the software crisis: complexity and the ability to change.

Let us start with a definition of the MDD paradigm. Its basic idea is to move the development effort from programming to a higher level of abstraction, by using models as primary artifacts and by transforming models into source code or other artifacts. The ultimate objective is automated development (full or partial). According to [13], MDD is a style of software development where models are the primary software artifacts; other artifacts and code are generated from them according to best practices. In [7], MDD is defined as a software engineering approach consisting of the application of models and model technology to raise the level of abstraction at which developers create and evolve software, with the goal of both simplifying and formalizing the various activities and tasks that comprise the software life cycle. According to [2], MDD refers to a set of approaches in which code is automatically or semi-automatically generated from more abstract models, and which employ standard specification languages for describing those models and the transformations between them.

These definitions make clear that the focus of MDD is a shift from programming to modeling. Models are the key artifacts. Currently, models are mostly used as sketches that informally convey some aspects of a system, or as blueprints to describe a detailed design that is then manually implemented [13]. Use of models as documentation and specification is valuable, but it requires strict discipline to ensure that models are kept up to date as implementation progresses. Time constraints often mean that the initial models are not updated during design and implementation, and inaccurate models are harmful. In MDD, models are used not just as sketches or blueprints, but as primary artifacts from which efficient implementations are generated, by transforming models into programming code or other artifacts characteristic of the domain.

As described above, MDD shifts the emphasis of application development away from the platform, enabling developers to design applications independently of platform-level concerns. The platform is the province of developers with platform-specific expertise. Platform expertise is involved as late as possible, i.e. at the transformations, rather than being rediscovered many times during a project. Likewise, decisions about the implementation architecture are directly encoded in the transformation engine rather than documented as architectural decisions [13]. According to Selic [10], the essence of model driven development is about two things. One is abstraction, in terms of how we think about the problem and then how we specify our solutions. The second, which often gets forgotten, is the introduction of more and more automation into software development, specifically by using computer-based tools and integrated environments. MDD-style development should enable automation to go much further. A software development project needs to produce many non-code artifacts; some of these are
completely or partially derivable from models. The following list gives some common examples of artifacts that are generated from models, but you can probably think of others [13]:
- Documentation: In organizations that follow a formal development approach, producing documentation takes a significant amount of development effort. Keeping documentation in line with the implementation is notoriously difficult. When using MDD, documentation is generated from models, ensuring consistency and making information available and navigable to developers.
- Test artifacts: It is possible to generate basic test harnesses, such as JUnit harnesses, from the information contained in software models. If additional test-specific modeling is carried out, for example using the UML Profile for Testing, then complete test cases can be generated.
- Build and deployment scripts: Using their expertise, infrastructure architects can create transformations that generate build and deployment scripts.
- Other models: A system involves many interdependent models at different levels of abstraction (analysis, design, implementation), representing different parts of the system (UI, database, business logic, system administration), different concerns (security, performance, resilience), or different tasks (testing, deployment modeling). To a certain extent, it is possible to generate one model from another, for example moving from an analysis model to a design model, or from an application model to a test model.
- Pattern application: Patterns capture best-practice solutions to recurring problems. Patterns specify characteristics of model elements and relationships between those elements. Generally, a pattern comprises a model together with its implementation. Conversely, a pattern can be used as an element of a model.
According to the above definitions, the heart of the MDD paradigm consists of models, modeling and model transformation.
Modeling and models are the central point of contemporary software development. But one fact related to models has to be emphasized: there is a big difference between what models represent and how they are used. Traditional models are just sketches and blueprints for design. In order to be suitable for MDD, models must satisfy an additional criterion: they must be machine readable. Machine-readability of models is a prerequisite for being able to generate artifacts. There are two types of transformations: model to model (M2M) and model to code (M2C). Automated model transformations are the key to realizing the MDD idea.
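The M2C idea can be sketched in a few lines of Python. The model format and the generated Java shape below are invented for illustration; real MDD tools work with standardized metamodels (such as MOF) and dedicated template engines, but the principle is the same: the machine-readable model, not the code, is the primary artifact.

```python
# A hypothetical machine-readable class model, expressed as plain data.
model = {
    "class": "Customer",
    "attributes": [("name", "String"), ("creditLimit", "int")],
}

def to_java(m):
    """Model-to-code (M2C) transformation: emit a Java class skeleton."""
    lines = [f"public class {m['class']} {{"]
    for attr, jtype in m["attributes"]:
        lines.append(f"    private {jtype} {attr};")
    for attr, jtype in m["attributes"]:
        getter = "get" + attr[0].upper() + attr[1:]
        lines.append(f"    public {jtype} {getter}() {{ return {attr}; }}")
    lines.append("}")
    return "\n".join(lines)

print(to_java(model))
```

Because the generator, not a developer, applies the coding conventions, regenerating after a model change keeps the code consistent with the model.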

2. MDD pros and cons
In this part of the article, the benefits of MDD will be discussed, as well as the problems which arise in their realization.

2.1. Benefits of MDD

According to [13], [12], MDD has the potential to greatly improve current practices in software development. This potential manifests in overcoming the current challenges: reducing the cost of development and increasing the consistency and quality of software. Some advantages of the MDD paradigm are [13], [12]:
- Increased developer productivity: The aim is to speed up and reduce the cost of software development by generating code and artifacts from models. Factoring in the cost of developing (or buying) transformations, and careful planning, are very important.
- Maintainability: Many legacy software applications still run on platforms in which the organization no longer has expertise. MDD leads to a maintainable architecture where changes are made rapidly and consistently, enabling more efficient migration of components onto new technologies. High-level models are kept free of irrelevant implementation detail, making it easier to handle changes in the underlying platform technology and its technical architecture. Ideally, a change of the technical architecture means only a new transformation, which is reapplied to the original models to produce implementation artifacts for the new platform.
- Reuse of legacy: If many components are implemented on the same legacy platform, reverse transformations from the components to UML are needed. A solution may be a migration of components to the new platform, wrappers for access to the legacy components, or integration technologies such as Web services.
- Adaptability: Adding or modifying a business function is straightforward, since the investment in automation has already been made. When adding a new business function, you only develop the behavior specific to that capability. The remaining information needed to generate implementation artifacts is already captured in the transformations.
- Consistency: Manual application of coding practices and architectural decisions is error prone. MDD ensures that artifacts are generated consistently.
- Repeatability: MDD is especially powerful when applied at a program or organization level; the return on investment from developing the transformations increases each time they are reused. Use of proven and tested transformations also increases the predictability of developing new functions, and reduces risk because architectural and technical issues have already been resolved.

- Improved stakeholder communication: Models omit implementation details that are not relevant to understanding the logical behavior of a system. They are much closer to the problem domain and reduce the semantic gap between the concepts understood by stakeholders and the language in which the solution is expressed. This facilitates the delivery of solutions that are better aligned with business objectives.
- Improved design communication: Models aid understanding of systems at the design level and improve discussion and communication about the system. Because models are part of the system definition rather than documentation, they are reliable and up to date.
- Capture of domain knowledge: Projects or organizations often depend on key experts who repeatedly make best-practice decisions. Their expertise can be captured in patterns and transformations, so they don't need to be present for other members of a project. If sufficient documentation accompanies the transformations, the knowledge of the organization is maintained in the patterns and transformations even when experts leave.
- Models as long-term assets: Models are important assets that capture what the information system of an organization is and does. High-level models are resilient to changes at the platform level; they change only when the business requirements change.
- Ability to delay technology decisions: Early phases of application development focus on logical issues. You can delay the choice of a specific technology platform or product until further information is available. In domains with extremely long development cycles (such as air traffic control systems), this is crucial; the target platforms may not even exist when development finishes.

2.2. Current problems

The primary goal of the MDD paradigm is to raise the level of abstraction at which developers operate. It should reduce both the amount of developers' effort and the complexity of the software artifacts that the developers use [7], [6]. Of course, there is always a trade-off between simplification by raising the level of abstraction and oversimplification, where the details needed for any useful transformation are missing. As you can assume, problems are bound to model abstractions at different stages of the software life cycle. The open issue is how to transform a model at one level of abstraction into a model or code at a lower level. In trying to answer this question, new ones arise. How should models be used? Some developers use models only for sketching, others for blueprinting, while the MDD community treats models as a programming language. Which notation and modeling language should be used in order to enable automation? The standardization of modeling notations is
unquestionably an important step towards achieving MDD. Standardization provides developers with uniform modeling notations for a wide range of modeling activities. In the software industry today, the Unified Modeling Language (UML) is the standard language for specifying, visualizing, constructing and documenting the artifacts of software systems. UML represents a collection of best engineering practices which have been proven in the modeling of large and complex systems. Although UML is widely recognized and used as a modeling standard, it has provoked a lot of criticism. Here are some of our critical observations:
- To be able to address so many needs, UML 2.0 has become enormous, ambiguous and unwieldy. It contains some diagrams and constructs that are redundant or infrequently used.
- UML 2.0 lacks a reference implementation and a human-readable semantic account providing an operational semantics, so it is difficult to interpret UML models and to correctly implement UML model transformation tools.
- The lack of semantic precision makes the production of automated MDD tools difficult, because the semantics carries the meaning that is essential to enable automation.
- UML is difficult to learn and adopt.
Is UML suitable as a model programming language? The notion of UML 2.0 as a model programming language is predicated on the belief that the use of higher levels of abstraction will make developers more productive than current programming languages. Fowler [4] discusses whether this opinion is true. He does not believe that graphical programming will succeed just because it is graphical. Indeed, he has seen several graphical programming environments fail, primarily because it was slower to use them than to write code (compare coding an algorithm to drawing a flow chart for it). Even if UML is more productive than programming languages, it will take time to become accepted. Many people, for various reasons, do not use the programming language they consider most productive.
Furthermore, Greenfield et al. [5] argue that although UML 2.0 is a useful modeling language, it is not an appropriate language for MDD, because UML is designed for documenting and not for programming. They promote the use of special-purpose, domain-specific languages (DSLs). Clearly, UML or any other MDD language faces significant hurdles in demonstrating sufficient value to satisfy the needs of all the different kinds of MDD users. According to [7], MDD creates other problems as well: redundancy, rampant round-trip problems, moving complexity rather than reducing it, and the additional expertise that is required.
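To make the DSL argument concrete, here is a hypothetical miniature textual DSL for declaring entities, together with a parser that turns it into an in-memory model. Both the syntax and the model shape are invented for illustration and are not taken from any real tool; the point is only that a special-purpose notation can be far smaller and more precise than a general-purpose language like UML.

```python
dsl_source = """
entity Order
  field id: int
  field total: decimal
"""

def parse(source):
    """Parse the toy entity DSL into a simple in-memory model."""
    entity, fields = None, []
    for line in source.strip().splitlines():
        line = line.strip()
        if line.startswith("entity "):
            entity = line.split()[1]
        elif line.startswith("field "):
            name, ftype = line[len("field "):].split(":")
            fields.append((name.strip(), ftype.strip()))
    return {"entity": entity, "fields": fields}

print(parse(dsl_source))
```

Because such a language covers only one narrow domain, its semantics can be precise enough for full code generation, which is exactly the property the UML critics above find missing.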

3. MDD approaches
Two leading realizations of the MDD paradigm are the Object Management Group (OMG) approach called Model Driven Architecture (MDA), and Microsoft's Software Factories (SF). The basic ideas of these approaches are briefly presented below. According to [7], it is too early to predict which, if any, of the current MDD approaches will be accepted as an industrial standard.

3.1. Model Driven Architecture (MDA)

MDA represents one view of the MDD paradigm. It is not a new OMG specification, but an approach to software development which is enabled by existing OMG specifications, such as UML, MOF and CWM, with the ability to address the complete development lifecycle. Model Driven Architecture is an approach that separates a system's desired functionality from its implementation on a specific technology platform. The result is an architecture that is not tied to any language, platform or vendor [8]. MDA provides an approach for [1]:
- specifying a system independently of the platform that supports it
- specifying platforms
- choosing a particular platform for the system
- transforming the system specification into one for a particular platform.

3.1.1. Principles and Benefits of MDA

Brown [3] highlights four principles underlying the OMG's view of MDA:
- Models expressed in a well-defined notation are a cornerstone for enterprise-scale solutions.
- The building of systems can be organized around a set of models by imposing a series of transformations between models, organized into an architectural framework of layers and transformations.
- A formal underpinning for describing models is a set of metamodels, which facilitates meaningful integration and transformation among models, and is the basis for automation through tools.
- Acceptance and broad adoption of the model-based approach requires industry standards to provide openness to consumers, and to foster competition among vendors.
The three primary benefits of MDA are portability, interoperability and reusability, achieved through an architectural separation of concerns [1], [12]. Further benefits are [8]: productivity, quality, rapid inclusion of new technology, and reduced cost and development time.

3.1.2. Basic MDA Concepts

The concepts that make up the core of MDA are [1]:
System - The context of MDA is the software system, either existing or under construction.

Model - A model is a formal specification of the function, structure and behavior of the system within a given context, and from a specific point of view. A model is often represented by a combination of drawings and text, typically using a formal notation such as UML, augmented with natural language expressions where appropriate.
Model driven - This term describes an approach to software development whereby models are used as the primary source for documenting, analyzing, designing, constructing, deploying and maintaining a system.
Architecture - The architecture of a system is a specification of the parts and connectors of the system, together with the rules for the interaction among them. Within the context of MDA, these parts, connectors and rules are expressed via a set of interrelated models.
Viewpoint - A viewpoint is an abstraction technique for focusing on a particular set of concerns within a system while suppressing all irrelevant details. A viewpoint can be represented via one or more models.
MDA viewpoints - MDA specifies three default viewpoints of the system: computation independent, platform independent and platform specific:
− The computation independent viewpoint focuses on the context and requirements of the system without consideration of its structure or processing.
− The platform independent viewpoint focuses on the operational capabilities of the system, outside the context of a specific platform, by showing only those parts of the complete specification that can be abstracted out of that platform.
− The platform specific viewpoint augments a platform independent viewpoint with details relating to the use of a specific platform.
Platform - A platform is a set of subsystems and technologies that provide a coherent set of functionality through interfaces and usage patterns. Clients of a platform make use of it without concern for its implementation details.
Examples of platforms include operating systems, programming languages, databases, user interfaces, middleware solutions, etc.
Platform independence - Platform independence is a quality that a model may exhibit when it is expressed independently of the features of a particular platform. Independence is a relative indicator of the degree of abstraction that separates one platform from another (i.e. one platform is either more or less abstract than the other).
Platform Model - A platform model describes a set of technical concepts representing the platform's constituent elements and the services it provides. It also specifies constraints on the use of these elements and services by other parts of the system.
Model Transformation - Model transformation is the process of converting one model to another within the same system. The transformation combines the platform independent model with additional information to produce a platform specific model.
Implementation - An implementation is a specification that provides all the information required to construct a system and to put it into operation. It must provide all of the information needed to create an object, and to allow the object to participate in providing an appropriate set of services as part of the system.

3.1.3. MDA Models

Corresponding to the three MDA viewpoints defined above, MDA specifies three default models of the system. These models can perhaps be described more precisely as layers of abstraction, because a set of models can be constructed within any of these three layers, each one corresponding to some viewpoint of the system.
The Computation Independent Model (CIM) is a model of the system from the computation independent viewpoint. The CIM is often referred to as a business or domain model because it uses a vocabulary that is familiar to the subject matter experts (SMEs). It presents exactly what the system is expected to do, but hides all information technology related specifications in order to remain independent of how the system will be (or currently is) implemented. The CIM plays an important role in bridging the gap between domain experts, users and designers.
The Platform Independent Model (PIM) is a model of the system from the platform independent viewpoint. A PIM should exhibit a sufficient degree of independence to enable its mapping to one or more platforms. This is commonly achieved by defining a set of services in a way that abstracts out technical details.
The Platform Specific Model (PSM) is a model of the system from the platform specific viewpoint. A PSM combines the specifications in the PIM with the details required to stipulate how the system uses a particular type of platform.
If the PSM does not include all of the details necessary to produce an implementation on that platform, it is considered abstract (meaning that it relies on other explicit or implicit models which do contain the necessary details).

3.1.4. Model Transformations

Model transformations form a key part of MDA. The real value of MDA lies in the fact that the CIM can be translated to a PIM by a simple mapping. Likewise, the PIM can be translated to a PSM (by a mapping), and the PSM can be translated to code. The key elements are the mappings and the MDA tool or tools that do the translation [8]. Figure 1 shows how the different models are connected to each other. The CIM, in the upper part of the figure, is where everything starts. The CIM is translated to a PIM, and the architect or designer creates the rest of the model. Finally, the PIM is translated to one or more PSMs. Before the translation, the models need to be marked for the translation tool.
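The PIM-to-PSM-to-code chain can be sketched as follows. The type "marks" and the relational PSM shape below are illustrative assumptions; real MDA tools use MOF-based metamodels and QVT-style transformation languages rather than plain dictionaries.

```python
# A platform independent model of an entity, with PIM-level types.
pim = {"entity": "Invoice",
       "attributes": [("number", "String"), ("amount", "Decimal")]}

# Platform marks: how platform-independent types map onto the chosen
# platform (here, a hypothetical SQL target).
sql_marks = {"String": "VARCHAR(255)", "Decimal": "NUMERIC(12,2)"}

def pim_to_psm(pim, marks):
    """Model-to-model (M2M): combine the PIM with platform marks."""
    return {
        "table": pim["entity"].lower(),
        "columns": [(name, marks[t]) for name, t in pim["attributes"]],
    }

def psm_to_sql(psm):
    """Model-to-code (M2C): emit DDL from the platform specific model."""
    cols = ",\n  ".join(f"{n} {t}" for n, t in psm["columns"])
    return f"CREATE TABLE {psm['table']} (\n  {cols}\n);"

print(psm_to_sql(pim_to_psm(pim, sql_marks)))
```

Retargeting the same PIM to another platform would mean supplying a different set of marks and a different M2C generator, leaving the PIM untouched.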

Figure 1. MDA models are connected using model mappings

According to [9], there are two different interpretations of how the MDA vision might be achieved. These two schools are called the elaborationist and the translationist. In the elaborationist approach, once the PIM is created, the tool generates a skeleton of the PSM. The developer then elaborates it by adding further detail. Similarly, the tool generates the final code from the PSM, and this should also be further elaborated. In the translationist approach, the PIM is translated directly into the final code of the system by code generation.

3.2. Software Factories (SF)

A Software Factory (SF) combines the MDD approach with the idea of industrial software development. An SF is a software product line that configures extensible tools (like Microsoft Visual Studio Team System - VSTS), processes, and content by using a software factory template based on a software factory schema, to automate the development and maintenance of variants of an archetypical product by adapting, assembling, and configuring framework-based components [5]. A key characteristic of a software factory is that architects and developers can customize, extend, and adjust it to address the unique needs of a project team or an organization. A software factory comprises three central elements:
- a software factory schema
- a software factory template
- an extensible development environment.
The software factory schema enables categorizing and summarizing of development artifacts. According to [6], we can imagine the software factory schema as a recipe. It lists ingredients, like projects, source code directories, SQL files and configuration files, and explains how they should be combined to create the product. It specifies which DSLs should be used and describes how models based on these DSLs can be transformed into code and other artifacts, or into other models. It describes the product line architecture, and key relationships between components and
frameworks of which it is comprised. In short, the software factory schema describes the artifacts that must be developed to produce a software product.
The software factory template supplies the assets described by the software factory schema but still missing from the project. We must implement the software factory schema, defining the DSLs, patterns, frameworks, samples, scripts and so on, and make them available to software developers. According to [6], the software factory template is like a bag of groceries containing the ingredients listed in the recipe.
The extensible development environment, also called the Integrated Development Environment (IDE), is the place where the assets of the software factory template can be loaded. According to [6], the IDE is like the kitchen where the meal is cooked. When configured with the software factory template, the IDE becomes a software factory for a family of products. This analogy can be taken further: the products are like meals served by a restaurant. Software factory stakeholders are like customers who order meals from the menu. A product specification is like a specific meal order. The product developers are like cooks who prepare the meals described by the orders, and who may modify meal definitions or prepare meals outside the menu. The product line developers are like chefs who decide what will appear on the menu, and what ingredients, processes, and kitchen equipment will be used to prepare them.
Software factories are based on the convergence of key ideas in systematic reuse, development by assembly, model driven development and process frameworks. Many of those ideas are not new. What is new is their synthesis into an integrated and increasingly automated approach [5]. Software factories represent an attempt to learn from other industries facing similar problems, and to apply patterns of automation to existing manual development tasks [6]. The authors' opinion is that the SF vision will be realized over the next few years in the form of software development environments.
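Following the recipe analogy, a software factory schema might loosely be sketched as structured data. Every name below is invented for illustration; a real factory schema is a graph of viewpoints and their relationships, not a flat dictionary.

```python
# A loose sketch of a software factory "recipe" (schema): which
# viewpoints and DSLs the product line uses, and which ingredients
# (artifacts) every product must contain.
factory_schema = {
    "product_line": "OnlineShop",
    "viewpoints": {
        "business_workflow": {"dsl": "WorkflowDSL", "generates": ["workflow code"]},
        "logical_database": {"dsl": "DataDSL", "generates": ["SQL files"]},
    },
    "ingredients": ["projects", "source code directories",
                    "SQL files", "configuration files"],
}

def missing_assets(schema, available):
    """The factory template must supply every ingredient the schema
    lists; report what is still missing from the project."""
    return [i for i in schema["ingredients"] if i not in available]

print(missing_assets(factory_schema, {"projects", "SQL files"}))
```

The schema thus plays the role of the recipe, while the template (the "bag of groceries") is whatever fills the gaps this check reports.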

4. Critical Overview of MDD
MDD is not the first attempt to solve the problems which arise during the software development cycle. It is well known that during the 1980s Computer Aided Software Engineering (CASE) was promised as a panacea for solving the world's software development problems [7]. Traditional CASE tools based on structural methodologies failed [5]. So, the logical questions are: What is different now? Is model-driven development likely to gain widespread adoption as a future way of developing software? Will it make building software faster, better, and cheaper? Or will MDD come to the same ignoble end as CASE? Danner [10] says that this might be déjà vu all over again. He claims that the models are too difficult to create and maintain, and the resulting code requires
extensive modifications to work properly. Or maybe it is just that the culture isn't ready for the paradigm shift. Other open issues are: Will this new technology require special skills that don't yet exist in the software industry? Will developers treat models as a new kind of development language, or will model-driven development be limited to some closed community of those willing and able to master it? Will these models be first-class artifacts throughout the entire life-cycle of an application, or will all the fancy code generators be abandoned as soon as the developers start tweaking the code directly? Some authors [7] believe that MDD has a chance to succeed in today's software industry, but it is still far from a sure bet. They base their opinion on the fact that the use of UML-based tools is growing, many standards have been defined, and technology has evolved since the CASE era. Greenfield [10], one of the authors of SF, thinks that pragmatism is the primary factor that will determine the success or failure of model-driven development in the industry. There is a strong contrast in terms of pragmatism between the two leading approaches, SF and MDA. With MDA, you have two levels of model, the platform-independent model (PIM) and the platform-specific model (PSM). They are based on UML, which is a general-purpose modeling language, and they are related by transformation, since PSMs are generated from PIMs. With SF, by contrast, you have an arbitrary number of viewpoints, such as user interaction, business workflow, or logical database design. In fact, you can define as many viewpoints as necessary to describe the business requirements of the software under development. Each viewpoint is potentially based on a DSL and is tailor-made to address the set of unique concerns of that viewpoint. What, from his perspective, is different compared with the CASE tools era? He thinks it is precisely the pragmatic bottom-up approach taken by SF.
Unlike MDA, which optimistically assumes, as CASE did, that most or all of the software can be generated from models, SF blends modeling with other software development practices to meet the needs of developers in the real world [10]. Kelly [10], a modeling tool developer, says that CASE tools failed because they tried to impose three things on their users: a way of working, a way of modeling, and a way of coding. He claims that perhaps the worst of all the problems is the gap between the generated and the handwritten code. Because vendors must make one tool work for as many people as possible, the generated code can't be tuned to the specific needs of all users. To generate full production code from models, you need both the modeling language and the code generator to be domain specific. The big question then is: what is the cost of building your own modeling language, the modeling tools to support it, and your own code generator? For a long time that cost was measurable
in tens of man-years, far too high to make domain-specific modeling practicable for all but the largest projects. However, there are now tools that make building such support much faster. Selic [10] doesn't believe that CASE actually failed. He thinks that CASE tools were simply an early step in the right direction towards more automation and higher levels of development, so he is quite an optimist when it comes to the future of model driven development. He says that something is standing in the way, and perhaps the biggest impediment is the cultural change that is a prerequisite of the paradigm shift. Almost all of the vendors have admitted that MDD's goals of seamless interoperability of model transformations haven't been fully realized [11]. This is partly due to standards that leave specific implementation details up to commercial vendors, who must release their products into the market to fund product development as the standards evolve. As the standards are put into real-world practice, additional needs and limitations are identified, and extensions or whole new categories are added to the standards family to address those concerns. Even as these standards continue to evolve, the consolidation and disappearance of some offerings have prompted cynical software industry pundits to conclude that MDD is CASE 2.0 [11].

5. Conclusion
The “modest” intent of MDD is to improve software quality, reduce complexity, and improve reuse by enabling developers to work at higher levels of abstraction and to ignore “unnecessary” details. In practice, however, MDD raises a number of significant issues, which have been discussed in this article. MDD is still evolving, and significant ideas for improvement are present: some have been realized, while others are currently immature and need to be addressed in the near future. It is therefore still too early to answer the question: is MDD the failure or the future of software development?

6. References

[1] *** MDA Guide Version 1.0.1, OMG, 2003.
[2] Brown, A. W., Iyengar, S., Johnston, S.: A Rational approach to model-driven development, IBM Systems Journal, Vol. 45, No. 3, 2006, p. 463-480, http://www.research.ibm.com/journal/sj/453/brown.html, (05.01.2007.)
[3] Brown, A.: An introduction to Model Driven Architecture – Part I, 2004, http://www-128.ibm.com/developerworks/rational/library/3100.html, (05.12.2005.)
[4] Fowler, M.: UML As Programming Language, http://martinfowler.com/bliki/UmlAsProgrammingLanguage.html, (25.02.2007.)
[5] Greenfield, J., Short, K., Cook, S., Kent, S.: Software Factories – Assembling Applications with Patterns, Models, Frameworks and Tools, Wiley Publishing, 2004.
[6] Greenfield, J., Short, K.: Moving to Software Factories, 2004, http://blogs.msdn.com/askburton/archive/2004/09/20/232065.aspx, (25.05.2007.)
[7] Hailpern, B., Tarr, P.: Model-driven development: The good, the bad, and the ugly, IBM Systems Journal, Vol. 45, No. 3, 2006, p. 451-461, http://www.research.ibm.com/journal/sj/453/hailpern.html, (05.01.2007.)
[8] Kontio, M.: Architectural manifesto: MDA for the enterprise, 2005, (05.01.2007.)
[9] McNeile, A.: MDA: The Vision with the Hole?, Metamaxim, 2003, http://www.metamaxim.com/download/documents/MDAv1.pdf, (03.10.2006.)
[10] Pierson, H.: ARCast #5, 2007, http://channel9.msdn.com/Showpost.aspx?postid=132943, (09.02.2007.)
[11] Riley, M.: A Special Guide – MDA and UML Tools: CASE 2.0, or the Developer's Dream, 2006, http://www.ddj.com/dept/architect/184415500, (03.05.2007.)
[12] Swithinbank, P., Chessell, M., Gardner, T., Griffin, C., Man, J., Wylie, H., Yusuf, L.: Patterns: Model-Driven Development Using IBM Rational Software Architect, IBM Redbooks, 2005.
[13] Yusuf, L., Chessell, M., Gardner, T.: Implement model-driven development to increase the business value of your IT system, http://www-128.ibm.com/developerworks/library/ar-mdd1/, (14.04.2006.)
