PRIME: toward process-integrated modeling environments
ACM Transactions on Software Engineering and Methodology
Research in process-centered environments (PCEs) has focused on project management support and has neglected method guidance for the engineers performing the (software) engineering process. It has been dominated by the search for suitable process-modeling languages and enactment mechanisms. The consequences of process orientation for computer-based engineering environments, i.e., the interactive tools used during process performance, have been studied much less. In this article, we present the PRIME (Process-Integrated Modeling Environments) framework, which empowers method guidance through process-integrated tools. In contrast to the tools of PCEs, the process-integrated tools of PRIME adjust their behavior according to the current process situation and the method definitions. Process integration of PRIME tools is achieved through (1) the definition of tool models; (2) the integration of the tool models and the method definitions; (3) the interpretation of the integrated environment model by the tools, the process-aware control integration mechanism, and the enactment mechanism; and (4) the synchronization of the tools and the enactment mechanism based on a comprehensive interaction protocol. We sketch the implementation of PRIME as a reusable implementation framework which facilitates the realization of process-integrated tools as well as the process integration of external tools. We define a six-step procedure for building a PRIME-based process-integrated environment (PIE) and illustrate how PRIME facilitates change integration on an easy-to-adapt modeling level.

Issues encountered in building a flexible software development environment
R. Kadia

ABSTRACT This paper presents some of the more significant technical lessons that the Arcadia project has learned about developing effective software development environments. The principal components of the Arcadia-1 architecture are capabilities for process definition and execution, object management, user interface development and management, measurement and evaluation, language processing, and analysis and testing. In simultaneously and cooperatively developing solutions in these areas we learned several key lessons.
Among them: the need to combine and apply heterogeneous componentry, multiple techniques for developing components, the pervasive need for rich type models, the need for supporting dynamism (and at what granularity), the role and value of concurrency, and the role and various forms of event-based control integration mechanisms. These lessons are explored in the paper.

ABSTRACT A concurrent software application, whether running on a single machine or distributed across multiple machines, is composed of tasks that interact (communicate and synchronize) in order to achieve some goal. Developing such concurrent programs so they cooperate effectively is a complex task, requiring that programmers craft their modules, the components from which concurrent applications are built, to meet both functional requirements and communication requirements. Unfortunately, the result of this effort is a module that is difficult to reason about and even more difficult to reuse. Making programmers treat too many diverse issues simultaneously leads to increased development costs and opportunities for error. This suggests the need for ways that a developer may specify control requirements separately from the implementation of functional requirements, but then have this information used automatically when building the component executables. The result is an environment where programmers have increased flexibility in composing software modules into concurrent applications, and in reusing those same modules. This paper describes our research toward a technology for control integration, where we have developed techniques for users to express control objectives for an application and a system that translates those specifications for use in packaging executables.

ABSTRACT We present a domain-independent model of hierarchical software system design and construction that is based on interchangeable software components and large-scale reuse.
The model unifies the conceptualizations of two independent projects, Genesis and Avoca, that are successful examples of software component/building-block technologies and domain modeling. Building-block technologies exploit large-scale reuse, rely on open architecture software, and elevate the granularity of programming to the subsystem level. Domain modeling formalizes the similarities and differences among systems of a domain. We believe our model is a blueprint for achieving software component technologies in many domains.

ABSTRACT The Desert software engineering environment is a suite of tools developed to enhance programmer productivity through increased tool integration. It introduces an inexpensive form of data integration to provide additional tool capabilities and information sharing among tools, uses a common editor to give high-quality semantic feedback and to integrate different types of software artifacts, and builds virtual files on demand to address specific tasks. All this is done in an open and extensible environment capable of handling large software systems.

ABSTRACT Traditional framework-based application development assumes applications are based on single frameworks extended with application-specific code. More recently, it has become clear that application development is often based on multiple frameworks that have to be integrated with one another, as well as with class libraries and with existing legacy components, to fulfill application requirements. But this integration process can lead to serious integration problems, since a framework is generally designed under the assumption that it is fully in control of the event loop. Moreover, a framework is typically designed for extension, not for integration, and without provision for incorporating legacy components. Here, we focus on the integration of multiple frameworks at the code level, avoiding questions about integrating documentation.
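The event-loop conflict described in the framework-integration abstract above can be illustrated with a minimal sketch: two toy frameworks each expect to own the main loop, so one is made the master and the other's pending work is "pumped" once per iteration. All names here (ToyFrameworkA, pump_once, and so on) are invented for illustration and do not come from any real framework.

```python
from collections import deque

class ToyFrameworkA:
    """Master framework: owns the application's main loop."""
    def __init__(self):
        self.events = deque()
        self.handled = []

    def post(self, event):
        self.events.append(event)

    def run(self, pump_other):
        # Integration hook: each iteration yields control so the
        # second framework can process one of its queued events.
        while self.events:
            self.handled.append(self.events.popleft())
            pump_other()

class ToyFrameworkB:
    """Subordinate framework: its own loop is replaced by pump_once()."""
    def __init__(self):
        self.events = deque()
        self.handled = []

    def post(self, event):
        self.events.append(event)

    def pump_once(self):
        # Process at most one pending event, then return control.
        if self.events:
            self.handled.append(self.events.popleft())

a, b = ToyFrameworkA(), ToyFrameworkB()
for i in range(3):
    a.post(f"a{i}")
    b.post(f"b{i}")
a.run(pump_other=b.pump_once)
print(a.handled, b.handled)  # both frameworks' events get processed
```

Real framework integrations face the same choice this sketch makes explicit: which loop is master, and at what granularity control is handed back.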
ABSTRACT We discuss the main methodological and technological issues that have arisen in recent years in the development of the enterprise integrated database of Telecom Italia and, subsequently, in the management of the primary data store for Telecom Italia data warehouse applications. The two efforts, although driven by different needs and requirements, can be regarded as a continuous development of an integrated view of the enterprise data. We review the experience accumulated in the integration of over 50 internal databases, highlighting the benefits and drawbacks of this scenario for data warehousing, and discuss the development of a large dedicated data store to support the analysis of data about customers and phone traffic.

ABSTRACT System integration is the programming function that, during the development of a system, interfaces the separately programmed and tested functions. The inclusion of an integration function in building a programming system facilitates the logical addition of functions to the system, provides an independent view for reporting the progress of the project, and ensures that adequate interfacing of all system components will take place prior to release of the product. This paper describes the advantages to be gained by employing the integration function and the problems that can be circumvented as a result. Among the areas discussed are the setting in which system integration functions, the responsibilities of an integration group, and the goals that system integration can achieve.

ABSTRACT This paper presents an exploratory approach to the development of a tool for integrating existing databases. The intent is to meet specific requirements and to achieve flexibility through the creation of an "open" system. The methodology

ABSTRACT This paper highlights the features and functions of BPSimulator, a simulation analysis tool focused on work flow and business processes.
Based on Systems Modeling Corporation's general-purpose simulation system, Arena, BPSimulator leverages all of the power and flexibility of Arena while providing the business analyst with an intuitive interface and a short learning curve. Like Arena, it is a comprehensive system that addresses all phases of a simulation project, from input data analysis to the analysis of simulation output data.

ABSTRACT Workflow techniques have gained a lot of attention as a means to support business process re-engineering, but also as a means to integrate legacy systems. Most workflow models view the applications as a fixed set of tasks. In this paper we analyse inter-organisational application domains and analyse properties for transactional workflows and systems supporting them. We study a workflow model for the applications where job step instances cannot be fixed in advance. We analyse requirements arising from this kind of environment through a particular application, and introduce special modelling components to support these requirements. We also develop the concept of C-unit to cope with concurrency anomalies and recovery.

ABSTRACT In a research and technology application project at Bellcore, we used multidatabase transactions to model multisystem work flows of telecommunication applications. During the project a prototype scheduler for executing multidatabase transactions was developed. Two of the issues addressed in this project were concurrent execution of multidatabase transactions and their failure recovery. This paper discusses our use of properties of the application and the telecommunication systems to develop simple and efficient solutions to the concurrency control and recovery problems.

ABSTRACT Workflow management aims at the execution of business processes within enterprises of any size and structure. In this paper we present a data distribution strategy for workflow schema data, which have low update rates, and workflow instance data, which have high update rates.
Thus, on the one hand, parallel local access to workflow schema data is possible; on the other hand, synchronization upon updates is avoided. This data placement strategy is based on the organizational requirements of an enterprise, which are modeled by the concept of an (extended) organizational relation.

ABSTRACT This paper describes the ActionWorkflow approach to workflow management technology: a design methodology and associated computer software for the support of work in organizations. The approach is based on theories of communicative activity as language action and has been developed in a series of systems for coordination among users of networked computers. This paper describes the approach, gives an example of its application, and shows the architecture of a workflow management system based on it.

ABSTRACT The design and implementation of a workflow management system is typically a large and complex task. Decisions need to be made about the hardware and software platforms, the data structures, the algorithms, and network interconnection of various

ABSTRACT An interdisciplinary research community needs to address challenging issues raised by applying workflow management technology in information systems. This conclusion results from the NSF workshop on Workflow and Process Automation in Information Systems, which was held at the State Botanical Garden of Georgia during May 8-10, 1996. The workshop brought together active researchers and practitioners from several communities, with significant representation from database and distributed systems, software process and software engineering, and computer-supported cooperative work. The presentations given at the workshop are available in the form of an electronic proceedings at http://lsdis.cs.uga.edu/activities/. This report is the joint work of selected representatives from the workshop and it documents the results of significant group discussions and exchange of ideas.
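The ActionWorkflow approach mentioned in one of the abstracts above rests on a customer/performer conversation loop in the language/action tradition, commonly presented as four phases: proposal, agreement, performance, and satisfaction. The following is a minimal sketch of such a loop as a linear state machine; the phase names follow the published loop, but the class, its API, and the sample speech acts are hypothetical and invented here for illustration.

```python
PHASES = ["proposal", "agreement", "performance", "satisfaction"]

class ConversationForAction:
    """Tracks one customer/performer workflow loop through its phases."""
    def __init__(self, customer, performer, request):
        self.customer = customer
        self.performer = performer
        self.request = request
        self.phase_index = 0  # the loop starts in the proposal phase
        self.log = []         # (actor, phase, speech_act) records

    @property
    def phase(self):
        return PHASES[self.phase_index]

    def advance(self, actor, speech_act):
        """Record a speech act and move the loop to its next phase."""
        self.log.append((actor, self.phase, speech_act))
        if self.phase_index < len(PHASES) - 1:
            self.phase_index += 1

    def completed(self):
        return self.phase == "satisfaction"

cfa = ConversationForAction("customer", "performer", "install phone line")
cfa.advance("customer", "request")             # proposal -> agreement
cfa.advance("performer", "promise")            # agreement -> performance
cfa.advance("performer", "report completion")  # performance -> satisfaction
print(cfa.phase, cfa.completed())  # -> satisfaction True
```

A workflow management system built on this idea composes many such loops, with the completion of one loop triggering requests that open others.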