Subject: Summary of KBSA-5, the 5th Annual Knowledge-Based Software Assistant Conference, Syracuse, New York, 24-28 September 1990

Douglas A. White   white@aivax.rl.af.mil   (315-330-3564)
Louis J. Hoebel    hoebel@aivax.rl.af.mil  (315-330-4833)
Rome Laboratory/C3CA, Griffiss AFB NY 13441-5700

The arrival of practical diagnostic systems based on AI technology in the late 1970's and early 1980's led the Rome Air Development Center (now RL) to investigate the possibility of creating a knowledge-based system that would diagnose software systems and assist in their maintenance. The conclusion of this investigation was that such a diagnostic expert system would be impossible for software because of the dearth of formalized knowledge about software in general and about any software system in particular. This initial negative result, along with the dim prospect for immediate relief through automatic programming, prompted serious consideration of the alternatives. The alternative perceived to have the greatest promise was a knowledge-based system that did not totally automate the software synthesis process, but assisted as users directed the evolving development. Such a system would maintain a complete record of all decisions and activities that occurred in the creation of a software system, and would possess the expertise to automatically perform many of the tedious tasks of program development. Communication among participants in a development would be enhanced by the monitoring and reporting capabilities provided by the knowledge-based system. These beginnings led to the KBSA program, a retreat from pure automatic programming based upon the belief that by retaining the human in the process, many of the unsolved problems encountered in automatic programming may be avoided. The underlying concept of the KBSA is that the processes, rather than the products, of development will be formalized.
This allows a knowledge base to evolve that captures the history of the software development process. The impact on software development is that software will be derived from requirements and specifications through a series of formal transformations. Maintenance will also take place at the requirements and specification level, as it will be largely possible to "replay" the process of implementation. In 1986 the initial KBSA Conference was held by RADC to provide a mechanism for those participating in the KBSA program to exchange technical information and to expose the project to a broader community consisting largely of potential users from the DoD and DoD contractors. In subsequent years the audience was broadened; however, presentations continued to be limited to a restricted community from closely related projects. This year a deliberate effort was made to open the conference to all interested parties, with the goal of fostering cooperation among the multitude of similar research projects that have germinated since the KBSA project started. Although the theme of this year's conference was "Bridging the Gap", referring to the problem of inserting new technology into bureaucratic organizations, the conference in fact continued to serve primarily as a forum for discussion of technology. The success of this year's conference has provided convincing evidence that future conferences should continue to aspire to an openness that will foster cooperation in the pursuit of greatly increased software development productivity through the use of knowledge-based systems technology. As has been customary in previous years, newcomers to the conference (and the KBSA project) were presented with overviews and tutorials which serve to set the context for the remainder of the conference.
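The notion of recording a derivation and "replaying" it at maintenance time can be sketched in miniature. The following toy example is purely illustrative (the names and data structures are hypothetical, not taken from any actual KBSA system): each transformation applied to a specification is logged, and the same derivation is later re-run against a revised specification.

```python
# Toy illustration of recording and replaying a transformational
# derivation. A "specification" here is just a list of requirement
# strings; real systems record far richer development decisions.

def record(history, name, transform):
    """Append a named transformation step to the derivation history."""
    history.append((name, transform))
    return history

def replay(history, spec):
    """Re-derive an implementation by reapplying each recorded step."""
    for name, transform in history:
        spec = transform(spec)
    return spec

history = []
record(history, "refine", lambda s: s + ["sorted output"])
record(history, "optimize", lambda s: [r.upper() for r in s])

# Initial derivation from the original specification.
impl_v1 = replay(history, ["read input"])

# Maintenance happens at the specification level: change the
# requirements, then replay the same recorded derivation.
impl_v2 = replay(history, ["read input", "validate input"])
```

The point of the sketch is only the shape of the idea: the change is made to the specification, while the recorded process re-derives the implementation.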
Attendees on the first afternoon were presented with overviews of the research activities of the Rome Air Development Center (RADC) by Robert Ruberti, the KBSA project by Douglas White, and the RADC Software Life Cycle Support Environment (SLCSE) project by Deborah Cerino, along with a tutorial describing and comparing CASE and KBSA technologies by Melissa Chase of Mitre. SLCSE is a computer-based environment of integrated software tools which supports the various phases and activities of the software life cycle in the context of a "conventional" software engineering environment. The remainder of the week was filled with papers and panel sessions covering a spectrum of topics relevant to the KBSA program. Also included in the conference were a dinner speech by Dr. Geoffrey Fox, director of the Northeast Parallel Architectures Center at Syracuse University, and a Program Design Review of one of the contracts of RADC's KBSA program, the KBSA Concept Demonstration. Information on obtaining this and other KBSA conference proceedings and tech reports is found at the end of this article. More information may also be obtained by contacting the authors, email preferred.

Panel Sessions

Panels addressed the topics of: 1) issues in using object-oriented databases to support KBSA; 2) explanation and dynamic documentation of software; 3) the comparison of CASE technology with KBSA technology; 4) the need for intelligent interfaces to make richly functional systems such as KBSA usable; and 5) bridging the gap, or moving paradigm-changing technology into organizations such as the government where procedures are firmly entrenched.

Panel: Issues in Using Object Oriented Databases to Support KBSA

The panel, "Issues in Using Object Oriented Databases to Support KBSA," was chaired by Penny Muncaster-Jewell of McDonnell Douglas and included Ian Schmidt of Object Design, David Fisher of Incremental Systems, Aaron Larson of Honeywell, Wesley Wilson of Ontologic and David Wells of Texas Instruments.
Ian Schmidt was the initial speaker and provided a general overview of what object-oriented database management systems (OODBMS) are being used to support, the need for them, why they are emerging now, and what OODBMSs are in terms of technology. He claimed that their predominant use is in engineering design systems because of the rapid turnover in product lines. The characteristics of these applications are that they are highly interactive, multiuser environments containing complex objects where rework and revision are common. He described OODBMSs as a fusion of three leading-edge technologies: object orientation, client/server database management, and workgroup technology.

Aaron Larson described what is needed from OODBMSs for KBSA and what issues arise in using a commercial product. He outlined the "standard" capabilities of current systems, which are: object orientation, a database, persistent and non-persistent objects, multiple inheritance, multimethods (association of methods with more than one class) and constraints (to define logical groupings in an abstract way). Important issues are the need to move methods (behavior) into the database, in essence turning it into an execution environment, and database export, which becomes more difficult when you start dealing with networks of objects.

David Fisher introduced his comments by stating that he does not use the words "object oriented" at all, but is certainly talking about systems in which there is a concept of a generalized type system, including things such as classes and types as found in programming languages. He believes that there is a need for efficient management of fine-grained as well as large-grained objects, and this is needed from the perspective of databases in which the data will be programs, requirements, processes and descriptions of these items.

Wes Wilson described the variety of software applications asking for OODBMS technology.
He observed that OODBMS technology is really a technology of operations, as opposed to a technology for MIS, and that it serves as an integration vehicle to allow the automation of the many functions of an organization and the tying together of the whole information network. He asserted that the flexibility of the OODBMS is evidenced by the variety of applications that are deciding that it is the correct underpinning for their developments, and that the common thread is the need for a standard data model.

David Wells discussed implementation details of a DARPA effort to achieve consensus on the internal architecture of an open OODBMS. Although there are many different types and pieces of a complete OODBMS around, building a complete system is presently beyond the resources of an individual organization. DARPA is thus looking to TI to build a framework under which various vendors may be able to build and integrate pieces. He expressed the belief that a chicken-and-egg situation exists: it is impossible to know what features are needed until an attempt is made to put a large application on top of an OODBMS, but no one wants to attempt a large application without a stable platform. He stated that relational database management systems came to the world out of revelation, but that OODBMSs have grown out of persistent programming languages, extensions to the relational model so that they contain objects, and knowledge base system work. He feels that they are eventually going to merge, but not for a while. The DARPA project is an attempt to speed this up by providing a common platform on which pieces can be merged, with the goal of seamless persistence for some language.

Panel: Dynamic Documentation and Explanation of Software

William Swartout, chairman of the "Dynamic Documentation and Explanation of Software" panel, opened the discussions by stating that current practice is for software systems and their documentation to be decoupled.
Documentation is inflexible once written and is outside of the machine, i.e., the machine provides no help in examining the documentation. He thinks that what is desirable is a system where the programming environment helps capture design decisions so that those can be displayed as part of the documentation, thus coupling the documentation to the application. Documentation should also be available on demand and customized to the needs of the user. It should also be interactive, so that if the user does not understand the documentation, alternative descriptions can be provided to clarify the situation. Questions he presented to the panel included: 1) What kinds of documentation need to be provided? 2) How does the kind of information depend on the reader and context? 3) Where does the information come from? 4) What decisions need to be made in presenting the information, and who makes these decisions? 5) What kinds of media should be supported?

Lewis Johnson initiated the panel discussions by stating his opinions of what needs to be done to produce documentation that clients can use. From his work on KBSA and other similar projects, he feels it is important to think about documentation as a question-answering problem, where the goal is not simply to produce documentation on the theory that bigger is better. He stated that frequently documentation is oriented toward developers rather than clients, and what is needed is documentation that is oriented toward the model with which the client is likely to view the system.

David Littman described the problem of documentation as something like tutoring or teaching: you have someone who does not know something about a complicated object, and you need to determine how to develop that understanding. Since there are limited resources to develop good documentation, the issue is how to focus the search for good documentation methods.
He observed that uniformly wide and deep documentation is not feasible; however, if you want to prevent errors, it is his observation that these errors are likely to happen at trouble spots, which are not uniformly distributed. Therefore, he claims, you don't need deep and wide documentation. He claims that research in documentation should first focus on trouble spots to determine empirically what kind of information is useful and when and how to present it, such as might be accomplished using an expert apprentice method.

Ursula Wolz discussed the issue of customizing documentation to the needs and abilities of users, how to be interactive, and what decisions need to be made in presentations. Her research focus has been on interactive systems and on how to take documentation for existing systems and turn it into question-answering systems. What she has discovered is that to help users you must first give goal-centered information, and secondly understand the situation they are in, which assumes the need to know what the user already knows. A final problem is that the user's need to get information is secondary to his performance of the current task, so the information needs to be grounded and concise. She feels that this suggests it is desirable to put emphasis on developing a robust model of the actions and/or acts of the tasks being accomplished.

Eduard Hovy began his discussion by stating that partial documentation is as good as it is going to get. His claim is that documentation is a problem of language generation, not of documentation, and that it is impossible to produce useful documentation automatically because the system has to be given too much knowledge that is not directly reflected in the underlying structures being used. The answer, of course, is to provide some background semantics, such as is possible in KBSA.

Panel: KBSA vs CASE

Gilles Lafue, chairman of the "KBSA vs. CASE" panel, introduced the topic by stating his view that the terminology used by the two groups is often the same, making any answer to the question of how KBSA compares to or differs from CASE technology unclear. He observed that it is difficult to know the differences, and the goal of the panel was to determine the depth to which one must go to see the differences and similarities. Issues which he asked the panel to address included: 1) the types of specifications used; 2) the kind of code generated from specifications; 3) what level of code optimization is possible; 4) how consistency of specifications is maintained (automatically or not at all); and 5) how specifications are validated.

Bob Balzer was the first panelist to speak, and he provided the "old timer's view of KBSA". He expressed the views that CASE is bottom-up while KBSA is top-down, CASE is product oriented while KBSA is process oriented, CASE is well engineered while KBSA is a laboratory prototype at best, and finally CASE is team oriented while KBSA is individual oriented (tools to help users work better). He observed that the present direction of the two communities is to reduce the differences between KBSA and CASE through adoption by each of the good features of the other. He described the KBSA vision as having three parts: the formal derivation of code, an automated iterative life cycle, and an intelligent assistant that proactively participates. Benefits of the KBSA are that the formal specification is a prototype that can be run, one would never patch source code, systems would be developed through evolutionary transformations, and the intelligent assistant is both reactive and proactive. Bob then expressed his view of where the industry is headed. He believes that the strength of the CASE community is that it provides commercial, well-engineered tools.
On the other hand, KBSA demonstrates things that might come into CASE environments in the future, and provides a framework for experimentation and exploration.

Doug Smith's comments focused on recent issues and problems of the KBSA community. The goal of KBSA is to formally represent all software-related artifacts, to develop formal models of application domains and theories, and to create systems from formal specifications via semantics-changing transformations. He believes a key aspect of KBSA is the knowledge base and the reuse of knowledge and derivation histories. In this context, it is the capture of knowledge of the system, of where the knowledge came from, and of what the code actually does that KBSA is really trying to address. What does it take for software evolution? You need to know why code is what it is.

Tony Wasserman stated that it is virtually impossible to draw a line between KBSA and CASE because the goals are similar. What he has put together is a base environment with a family of graphical editors. The long-term goals are much like what Bob described, but the approach is different. The central role of transformations in KBSA, and its emphasis on formalism and process orientation, are important distinguishing differences. The ideas of project management are very important to everybody, as they meet the need to control the process. CASE tools, however, need a process or method from the using organization. CASE does not work in organizations that do not presently do software engineering; his organization spends a lot of time helping people do software engineering. He believes that organizations capable of exploiting KBSA are at a different level of maturity and are more mathematically sophisticated than most.

Glover Ferguson stated that the nice thing about acronyms is that you can make them whatever you want; however, if both CASE and KBSA are taken in their broadest definitions, they seem to be the same, with common objectives.
He feels that a lot of what we talk about boils down to reuse, because given the choice of doing something very fast or not having to do it at all, not doing it at all is the fastest (i.e., the philosophy is to take something you have and modify it). In software development, the element of reuse is growing, and the idea is to do a much better job at it. A knowledge-based approach is tied back to reuse and is also needed to manage and coordinate across a project. Knowledge-based assistance also enables the recovery of the knowledge of a system once people are gone.

Panel: Intelligent Interfaces to Richly Functional Systems: Making KBSA Usable

Panel chair Bill Sasso introduced the session by presenting his motivation for the panel: interface specialists are needed to broaden and enrich the vision of KBSA so that the great functionality being developed can actually be used.

Lewis Johnson stated, in a slightly broader context than interfaces, that communication and learning are becoming increasingly important; these issues are indeed drivers of the functionality of KBSA systems. He felt that in the interaction between analysts and the system, formal languages are not what the analysts want to see. In this regard he felt that the KBRA developers got things right with multiple views and styles of communication. The community can learn from CASE tools by building around conventional notation while avoiding commitment to a particular notation. He concluded by claiming that support for multimedia presentations, which integrate different notations and coordinate them to present a picture of complex systems, is the right support that interfaces can bring to richly functional complex systems.

In much the same spirit, Peter Selfridge presented his experience and thoughts on the interfaces he developed while working on and browsing the Infinity/75 switching system software. The interface was developed as an extension to the history of interfaces that he had used.
He has progressed from PDP-4 toggles, through Unix file system command-line interfaces, to the modern "Macintosh Desktop" metaphor. Peter felt that the ingredients lacking in the desktop metaphor were the annotations, directions and relations that were only briefly hinted at in Unix (cd, ls plus options, etc.). The interface for code browsing then needed to include displays for (complex) structural relations and functional relations, and the display and browsing needed to be controllable. Control is essential to passing the "test of 10,000" proposed by Michael Williams. This interface to CODE-BASE handled a file system with 17,000 nodes in a coherent and controllable manner. Finally, Peter argued that interfaces need to provide the mechanism for doing something (querying, at least!) with these objects, not just passively displaying them.

Loren Terveen discussed interfaces to KBSA-like systems, which contain "large information spaces", from the aspect of supporting the social aspects of people and their work. People doing work often work and then reflect on what they have done. They read the "situational backtalk" and continue on in perhaps unanticipated ways. Design is not a tidy, top-down process; requirements enter at all stages of the process. Cooperative systems, as KBSA is envisaged to be, need to provide resources for the management of trouble and problems. They need to support people's work as practical actions, and finally they need to deliver assistance via collaborative manipulation of the materials of the work context. KBSA requires this because large-scale software development is complex and will produce problems, because studies of software design reveal the need for the support of situational backtalk, and because software design involves many types of software objects that require manipulation.

Elliot Soloway continued by bridging the gap between the issue of interfaces and what they can provide, and how people and organizations interface with new technology.
Elliot started by claiming that you don't need to convince anyone that they need a refrigerator, nor do you need to tell them how to use a refrigerator; they just go out and get one and use it. So, how is KBSA like a refrigerator? Well, it's not, and that's the challenge. We need to get people to use the wonderful technology that is being developed in the (richly functional) manner that we intended people to use these tools. Software development is a people-intensive process, it's pure thought, and we need to support that social situation. Unfortunately, organizations, not people, buy software, and they don't even know that they need KBSA-like systems. In order to support this process we should be developing interfaces that run on 200 MIPS machines. Geoff Fox posited 1000 MIPS during his keynote address. This is where we are headed, and once we are there it's a whole new game. We will have the horsepower to do the things we want and need to support. But society is conservative, and people resist change in favor of the status quo. We need to declare victory early on and let people adopt and integrate the new technology into their workplace. Many new strategies need to be developed, and that's also part of the challenge.

During the discussion, Bruce Johnson raised the point that people think, act and design in many different ways. Elliot responded that by and large computers present "one or two styles and that's it," and continued with an analogy about human teachers and all the different learning styles that a classroom full of students presents. Lewis commented that during the ARIES work, no matter how many different presentations were made available, it was never enough. He suggested that what we need to do is develop mechanisms that can "gin up" new presentations easily. The need for intelligent interface tools was brought up by Yigal Arens, while Penny Muncaster-Jewell mentioned the need for users to find the interfaces intelligible for their applications.
Loren thought that a KBSA knowledge base could mediate between the developers and the users by allowing users to annotate and make comments back to the developer. Lewis claimed that this was not enough, and that knowledge-based scenarios are needed to show the user the interaction of the system, not just an image of what the system will do. The discussion then revolved around the need for better user modeling and for hands-on experience with actual tasks, so that users (application and developer?) can provide early feedback. Yigal argued that much of this was not news and wanted to know what we do next. More ideas, more concentrated efforts and more money are needed. The session ended with the realization that the interfaces that KBSA-developed systems will need are vast, and go beyond the needs of just KBSA developers and KBSA application developers.

Panel: Bridging the Gap

This panel session was chaired by Mary Anne Overman of the NSA. The members were Sam DiNitto of RADC, Dennis Smith of SEI and Penny Muncaster-Jewell of McDonnell Douglas. The focus of the panel was technology insertion, with emphasis on bringing the government closer to the state of the practice in the commercial sector.

Mr. DiNitto started his talk with experiences of doing software engineering technology transfer. He offered these key points: government support is close to no support; technology often supports the system developer best; capitalization costs can make transition unattractive; having a demanded capability is a desirable case; and there are too few smart buyers who understand how scaling up affects maintainability. He described RADC's multispectral approach to transition. This includes communication among users, test sites, contractors and developers, and avoiding pitfalls such as special hardware, non-maintainable environments (use mil specs!), and improper demos (don't oversell). Mr. DiNitto finished his position statement with the observation that a multifaceted approach is necessary, industry needs to be involved, and transition is still a lengthy process.

Dennis Smith discussed an SEI project that is investigating the impact of CASE tool adoption. Two sets of issues have emerged: first, needs fulfillment in terms of integration, scalability and flexibility; and second, what practices affect a tool's impact (readiness, strategy, commitment, culture and standards). In relation to these, Mr. Smith outlined six distinct stages in adopting tools: 1) awareness; 2) management commitment; 3) tool selection; 4) initial project use (i.e., not on a critical project); 5) implementation strategy; and 6) routinization (differentiating between trial usage and routine usage).

The focus of Penny Muncaster-Jewell's presentation was on the insertion of technology into large organizations. Her experience includes TI, DoD and NASA as well as McDonnell Douglas. She argued that quantified results as well as financial motivation are necessary for initial adoption. She mentioned the need to consider overall costs: both life-cycle costs and costs over the lifetime of the tool, not just for the first project. The maintenance issue may actually affect the choice of technology while true life-cycle costs remain unanalyzed. In her summation before open discussion, Penny pointed out that government contracts often do not reward productivity gains, the upfront costs of adopting new technology, or the difficulty of risk assessment.

Initial discussion centered on the hierarchy of DoD, prime contractors and subcontractors. With DoD becoming a shrinking part of the market and smaller subcontractors being unable to afford expensive new technology, primes often turn to off-the-shelf commercial marketplace solutions. Another discussion centered around Lou Hoebel's question of what it takes to sell KBSA. Ms. Muncaster-Jewell's comments focused on openness, both for future KBSA conferences and also in the area of open architectures. Sam and Dennis mentioned that small, low-risk tool insertion (versus totally new environments) was important. Mike DeBellis stated that KBSA is a big leap from current practice, but also concurred that small successes, such as in reengineering and maintenance, might be the way to go. Penny Chase concurred and suggested a case study of successful tool adoption, such as spreadsheets. Some final commentary by Sam included remarks about government resistance to consortia and the lack of interest by universities in teaching software engineering. General agreement was reached that KBSA has the best chance in places that have a process model, and that KBSA needs to provide a default process model that can incorporate tool use. Since software development is a people-intensive activity, KBSA needs to provide guidance and provide flexibility as people adapt.

Paper Presentations

Leading off a series of talks on different approaches to software development, Douglas Smith discussed the use of knowledge-based development software to support the derivation of correct and efficient algorithms from formal specifications. He explained how a user develops a program from a formal specification, using KIDS, by applying meaningful, correctness-preserving transformations.

Mehdi Harandi described an approach based on the use of schemas to derive formal specifications from informal descriptions. Design and implementation details such as representation, organization, and the process of schema retrieval, instantiation, and refinement were also discussed. The approach was illustrated through the use of an example derivation.

Sanjay Bhansali discussed a process by which it is possible to synthesize a program by recognizing the similarity that a problem bears to one previously solved, and modifying or reusing the original solution.
APU (Automated Programmer for Unix), a program synthesis system using this technique to achieve extensive reuse of code at both the subroutine and design levels, was then described.

Sidney Bailin described an environment for capturing the distinctive features of software artifacts for potential reusability. The system, KAPTUR, supports knowledge acquisition and evaluation of artifacts for intelligent developer reuse selection.

James Palmer presented a paper in which he argues for an environment to support information transfer and transformation. He presented some initial results on a workstation utilizing interactive multimedia to support requirements engineering.

In a more theoretical vein, Paul Bailor presented a formal language theory model for use in software development. He argues that the graph, as a fundamental mathematical concept, has application in the areas of specification, design, analysis, and formal models for parallel computation and software development.

Shiu-Kai Chin postulated that formal methods provide a means for documenting design knowledge in an executable and formally verified manner supportive of design reuse. He then proceeded to demonstrate with several examples the process of functional composition and higher-order design. He concluded by describing the need for a hyper-object-based environment in which declarative languages, specification languages, and theorem provers are integrated, and between which the user may move freely.

Chi-Sharn Wu presented a classification of specification systems. Specification systems contain methods, languages and tools, and can be at the informal, semi-formal or formal level. This classification allows for comparative assessment, and Wu and Avizienis identify five formal specification techniques.

Martin Feather described an extension of the program transformation paradigm which deliberately changes the meaning of specifications through the application of "evolution" transformations.
The record of such "evolution" transformations, which captures the relationships between a changed specification and the original, was proposed as useful in alleviating problems that arise during replay of transformational developments.

A session dealing with the structure of code began with Lisa Neal's presentation of work on an example-based programming methodology to support the design, development and reuse of software. She presented results from an empirical study giving positive results, particularly for advanced programmers, and she related these to prior studies of the use of structure editors.

Peter Selfridge followed with a paper on CODE-BASE, a software information system. The system is an interactive code discovery facility that encodes a description of UNIX and C information in the CLASSIC knowledge representation language. A query language is provided that supports automatic extraction of syntactic code knowledge.

Paul Bailes stated that the view of software development as proceeding from informal requirements through formal specifications is mistaken, and that software tools based upon this view are deficient in the process that is imposed upon their users. Design Genesis Support Systems (DGSS), including a prototype Paradigm-Oriented Programming Environment (POPE), which are language- and method-independent and support the most abstract design paradigms employed by designers, were then described, along with their potential benefit of providing a wide spectrum of KBSA-like environments.

Lewis Johnson described ARIES, a system being developed as part of the KBSA program which supports the acquisition of requirements and their incremental evolution into specifications. The emphasis of the discussion was on how ARIES would be used and the capabilities it provides.
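As an aside on the software-information-system idea behind CODE-BASE, the flavor of query such a system supports can be illustrated with a toy fact base. The representation below is hypothetical and far simpler than the CLASSIC language actually used; the function names and the call relation are invented for illustration.

```python
# Toy "software information system": syntactic code knowledge is
# stored as a relation of (caller, callee) facts and queried
# declaratively. Hypothetical data, not extracted from real code.

calls = {
    ("main", "parse_args"),
    ("main", "run"),
    ("run", "parse_args"),
}

def callers_of(fn, call_relation):
    """Return the set of functions that call fn directly."""
    return {caller for caller, callee in call_relation if callee == fn}

def reachable_from(fn, call_relation):
    """Transitively collect every function reachable from fn."""
    seen, frontier = set(), {fn}
    while frontier:
        f = frontier.pop()
        for caller, callee in call_relation:
            if caller == f and callee not in seen:
                seen.add(callee)
                frontier.add(callee)
    return seen
```

Even this trivial version shows why such a facility must support querying, not just display: "who calls this?" and "what does this reach?" are the questions a code browser is asked.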
Warren Moseley, in the paper "Animated Knowledge-Based Requirements Traceability in Large Scale Space Applications", makes the claim that CASE research has placed too much emphasis on supporting the bookkeeping aspects of software life cycle management, only to find that the life cycle has no process to give it foundation. He then described an automated knowledge-based approach to requirements traceability called the Poor Man's CASE Tool (PMCT), which uses active knowledge bases with visual representations to provide a repository for software requirements.

The first of four papers on V&V was given by Alun Preece and dealt with a development strategy geared towards verification as an activity distinct from validation. He argues that verification of an expert system should be separated into two activities: verification of the knowledge base for completeness, conciseness, and consistency with respect to the conceptual model, and verification of the implementation in the same manner. The conceptual-model knowledge base is transformed into the implementation. He defines conceptual models as abstract descriptions of the entities, relations, and tasks that the system is knowledgeable about. Randy Stiles followed with a presentation on validating rule interactions in knowledge-based systems. His motivation is the incremental nature of the rule base, and its integrity, as a system is developed and maintained. He presents search and evaluation methods both for monotonic rule bases and for rule bases that allow deletion. Both papers will appear in a forthcoming issue of Intelligent Systems Review. Kevin Benner then presented a paper on simulation in support of validation. Besides presenting the current state of SIMSYS, a simulation tool, he discussed the scaling problems in current techniques for validating formal specifications. A proposal is offered for combining symbolic evaluation and simulation.
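Preece's two checks on a knowledge base, consistency and completeness with respect to a conceptual model, can be illustrated with a toy rule base. The rule representation and check functions below are invented for illustration; a real verifier would work over a far richer conceptual model.

```python
# Rules as (condition-set, conclusion) pairs; "not-X" denotes the
# negation of conclusion X. This encoding is an assumption for the sketch.
rules = [
    ({"fever", "cough"}, "flu"),
    ({"fever", "cough"}, "not-flu"),   # contradicts the rule above
    ({"rash"}, "measles"),
]

# Conclusions the conceptual model says the system should know about.
known_conclusions = {"flu", "not-flu", "measles", "cold"}

def find_conflicts(rules):
    """Consistency check: pairs of rules with identical conditions
    but contradictory conclusions."""
    conflicts = []
    for i, (cond_a, concl_a) in enumerate(rules):
        for cond_b, concl_b in rules[i + 1:]:
            if cond_a == cond_b and concl_b == "not-" + concl_a:
                conflicts.append((concl_a, concl_b))
    return conflicts

def find_missing(rules, conclusions):
    """Completeness check: conclusions in the conceptual model that
    no rule can ever derive."""
    derivable = {concl for _, concl in rules}
    return conclusions - derivable
```

Here `find_conflicts` flags the flu/not-flu pair and `find_missing` reports that "cold" is underivable; in Preece's scheme the same checks would be applied again to the implementation produced from the conceptual model.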
Deborah Frinke presented an alternative to verification: a planning-based approach called intelligent testing. In contrast to verification, the TPLAN system's approach is to find sample inputs that demonstrate the incorrectness of a system. This method allows for early testing of specifications and improves upon a STRIPS-style approach in its use of heuristics.

Two development environments were presented: one (Multiview) based on multiple concurrent views of a system under development, and a multi-agent rule-based environment presented by Naser Barghouti. Barghouti presented the MARVEL system and an approach to solving the concurrency control problem of large-scale software development projects using a multi-agent model; a rule-based approach is used for detecting conflicts between agents. Chris Marlin presented his work on a distributed implementation environment, motivated by the multiple representations that are often employed during software development. Marlin pointed out that a challenge will be presenting a large variety of views and customizing both the views available and the view instances. Customization of concurrency control based on process information may help in exploiting parallelism in incremental code generation. In his presentation, N. Boudjlida proposed concepts for software process modeling which yield active support resulting from the use of knowledge about the tools that can be used and the policies to be obeyed. UPSSA, a system using a subset of a Model for Assisted Software Processes, was also described. Gerald Williams described the use of a conceptual (or metamodel) level of representation to reconcile the differences between the representations used on two significantly differing platforms, in moving the paraphrasing capabilities written for the KBSA Specification Assistant project to the Concept Demonstration project.
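The flavor of rule-based conflict detection between concurrent agents, as in Barghouti's description of MARVEL, can be sketched with a toy model in which agents declare read/write intents on shared artifacts. The intent format and conflict rule below are assumptions for illustration, not MARVEL's actual mechanism.

```python
def detect_conflicts(intents):
    """Given (agent, artifact, mode) intents, report each artifact where
    two agents' accesses conflict (any pair involving a 'write')."""
    by_artifact = {}
    for agent, artifact, mode in intents:
        by_artifact.setdefault(artifact, []).append((agent, mode))
    conflicts = []
    for artifact, accesses in by_artifact.items():
        for i, (agent_a, mode_a) in enumerate(accesses):
            for agent_b, mode_b in accesses[i + 1:]:
                # Read/read pairs are compatible; anything with a
                # write conflicts under this simple rule.
                if "write" in (mode_a, mode_b):
                    conflicts.append((artifact, agent_a, agent_b))
    return conflicts

intents = [
    ("alice", "parser.c", "write"),
    ("bob",   "parser.c", "read"),
    ("carol", "lexer.c",  "read"),
]
```

Running `detect_conflicts(intents)` flags the parser.c write/read pair while leaving the lone read of lexer.c untouched; a richer rule base could encode project-specific policies about which overlaps are tolerable.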
Speculations about the generality of system specification concepts revealed by this exercise were also discussed. Bob Schrag described the application of interval-based constraint technology to the problem of automated user interface design. Layout is implemented using KUIE (the KBSA User Interface Environment), a Lisp-based object-oriented graphical user interface toolkit built upon the capabilities of X Windows, CLX, and CLUE. The scope of the addressed layout problem includes object size, positioning, and relative placement. Eduard Hovy described the extension of techniques used for planning the generation of natural language text and automated modality selection for multi-modal presentations, and how they handle the requirements of text layout. Aaron Larson described a model consisting of a set of abstractions, techniques, and tools for managing the evolution of systems of design objects, such as might be required for a KBSA-like environment. Initial experiences arising from use of the model in an actual prototype design capture system were also discussed. Deborah Baker described mechanisms for ensuring type and object integrity of distributed persistent objects that enable sharing and communication among components of a software environment such as that proposed by the KBSA.

Noah Prywes gave the first of two papers on reverse software engineering. He presented an approach for the automatic translation of real-time programs into a non-procedural specification language, MODEL, for concurrent programs, which also provides support for testing and maintenance. MODEL uses regular and Boolean algebras for mathematical representation, allowing uninitiated users to take part in the maintenance phase. J. Zhang presented a personal view of reverse engineering for reuse and maintenance that is not just the reverse of forward engineering, i.e., the capture of requirements, specifications, design, etc.
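The interval-based constraint idea Schrag described for layout can be illustrated with a single relative-placement constraint: each widget's left edge is known only to lie within an interval, and constraints narrow those intervals. The function and constraint form below are an invented minimal sketch, not KUIE's actual API.

```python
def narrow_left_of(ia, ib, width_a, gap):
    """Narrow position intervals so that widget a (of given width) ends
    at least `gap` pixels before widget b begins.  Intervals are
    (lo, hi) pairs of candidate left-edge x positions."""
    a_lo, a_hi = ia
    b_lo, b_hi = ib
    # a cannot start so far right that it would overlap b's latest start.
    a_hi = min(a_hi, b_hi - width_a - gap)
    # b cannot start before a's earliest possible right edge plus the gap.
    b_lo = max(b_lo, a_lo + width_a + gap)
    return (a_lo, a_hi), (b_lo, b_hi)

# Both widgets may initially sit anywhere in [0, 100]; the constraint
# "a (width 30) left of b with a 5-pixel gap" prunes both intervals.
ia, ib = narrow_left_of((0, 100), (0, 100), width_a=30, gap=5)
```

A full solver would propagate many such constraints (size, alignment, containment) until the intervals stop shrinking, then pick concrete positions from what remains.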
A definition of reverse engineering is posited that involves enhanced software understanding at multiple levels, and that uses and generates new documentation. The knowledge-based approach includes the advocacy of formal methods and the use of code information to automate the reverse engineering process.

Acknowledgement

The authors gratefully acknowledge the assistance of Melissa P. Chase in preparing this article.

Available KBSA technical reports and prior conference proceedings.