The seminar will cover the principles and procedures of information and software system quality assurance (comprising test, measurement and assessment) for procedural, object-oriented and agent-based dependable software systems. Attendees will exercise proven techniques for goal-directed measurement, scaling and assessment for software certification. Assessment of both the software product and the software process will be discussed with respect to their relevance for such acceptance assessments. A standardized process model for measurement, assessment and certification of dependable software will be used to familiarize attendees with this comprehensive assessment procedure and to show how to embed it into today's standardized or non-standardized software processes. Basic knowledge of mathematics and some knowledge of software methods and tools are required. Emphasis will be given to selected advanced topics depending on the needs of participants.
Hans-Ludwig Hausen is Principal Scientist (Senior Researcher, Project Manager) at FRAUNHOFER German National Engineering Research Society. He has worked for 25 years as project manager, consultant, advisor and lecturer on computer-aided software engineering, software quality assurance, and software process modeling and tailoring, on more than 10 large software engineering projects for governments and industry. He has more than 60 publications and is the author of 3 books on software engineering environments, software quality and productivity, and information storage and retrieval.
The next stage in the evolution of enterprise applications will be based on Web services. Web services are pieces of application functionality that are exported through a set of standard application programming interfaces (APIs), and allow applications to be constructed by locating and binding to the exported functionality. More interestingly, multiple Web services can be coordinated together in unique combinations in an Internet application to implement value-added services for users. In this tutorial, we describe the design, development, deployment, and maintenance of Internet applications based on Web services. We also describe the emerging mobile Internet environment, the unique issues inherent to these environments, and the challenges in developing mobile applications based on loosely coupled Web services. In addition to a broad coverage of the fundamental topics, industry standards, and technologies (e.g., Java, J2EE, application servers, XML, SOAP, WSDL, UDDI) underlying the development of Web services and applications based on Web services, the tutorial will provide practical, step-by-step instruction for the development and deployment of enterprise-class Web services and applications based on standard Java 2 Enterprise Edition (J2EE) application servers and SOAP servers. We also touch on .Net technologies in support of enterprise Web services.
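To make the SOAP messaging layer underlying Web services concrete, the following sketch builds a SOAP 1.1 request envelope using only the Python standard library. The service namespace `urn:example:stockquote` and the `getQuote` operation are purely illustrative, not part of any real service.

```python
# Sketch: constructing a SOAP 1.1 request envelope for a hypothetical
# getQuote(symbol) operation of an illustrative stock-quote Web service.
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"  # SOAP 1.1 namespace
SVC_NS = "urn:example:stockquote"                      # illustrative only

def build_soap_request(symbol):
    """Return a SOAP 1.1 envelope (as bytes) invoking getQuote(symbol)."""
    ET.register_namespace("soap", SOAP_NS)
    envelope = ET.Element("{%s}Envelope" % SOAP_NS)
    body = ET.SubElement(envelope, "{%s}Body" % SOAP_NS)
    op = ET.SubElement(body, "{%s}getQuote" % SVC_NS)
    arg = ET.SubElement(op, "symbol")
    arg.text = symbol
    return ET.tostring(envelope, encoding="utf-8")

request = build_soap_request("HPQ")
# The envelope would then be POSTed to the service endpoint with an HTTP
# Content-Type of text/xml and an appropriate SOAPAction header; in
# practice a SOAP server or J2EE application server handles this plumbing.
```

In a real deployment the operation name, parameter types, and endpoint address would be taken from the service's WSDL description rather than hard-coded as here.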
Dr. Sandeep Chatterjee is a seasoned technology expert and business professional with over a decade of contributions as a thought leader, technologist, consultant, entrepreneur, and author. Dr. Chatterjee is Chief Technology Officer of a Web services delivery and management startup, where he is responsible for the development and strategic positioning of the company's flagship enterprise Web services runtime platform. He also serves as a Chief Technology Consultant for Fortune 100 companies and major not-for-profit organizations, including Hewlett-Packard and ACCION International.
Dr. Chatterjee is co-author of "Developing Enterprise Web Services: An Architect's Guide", a book by Prentice Hall, and has served on the Expert Group that specified the worldwide standard for mobile Web services. He also sits on the Board of companies developing mobile and Web services technologies, including Clickmarks and Foundationalnet.
Previously, Dr. Chatterjee was the lead and chief architect of Hewlett-Packard's Web Services Mediation Platform. He was also Entrepreneur-in-Residence at FidelityCAPITAL, the venture capital arm of Fidelity Investments, and was Founder & Chief Technology Officer of Satora Networks, which developed tools and technologies for developing appliances and services for the mobile and pervasive Internet.
Dr. Chatterjee holds a Ph.D. in Computer Science from the Massachusetts Institute of Technology, where his research in networked client architectures and systems was selected as one of the top thirty-five inventions in the thirty-five-year history of MIT's Laboratory for Computer Science; his invention is showcased in a time capsule at the Museum of Science in Boston, Massachusetts.
Guillaume Brat and Arnaud Venet
The theory of Abstract Interpretation is a static analysis framework pioneered by Patrick and Radhia Cousot in the mid-1970s. It provides generic algorithms for building program analyzers that can detect runtime properties of a program without executing it, simply by exploring the text of the program together with a formal specification of the semantics of the language. Program analyzers built within the Abstract Interpretation framework are guaranteed to cover all possible execution paths. Abstract Interpretation is particularly well suited to the computation of numerical invariants in programs (such as loop counters, variable ranges, etc.).
This tutorial will cover the general theory of Abstract Interpretation (including the notions of abstract domains, collecting semantics, widening and narrowing operators) as well as give practical guidelines to build program analyzers based on this theory. Illustrative examples will be given using the "C Global Surveyor", an analyzer developed at NASA Ames for checking the absence of runtime errors in C programs.
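As a flavor of the technique, the following toy sketch (not the algorithms used in C Global Surveyor) implements an interval abstract domain with widening and narrowing, and uses it to infer a sound range for the loop counter in `i = 0; while (i < 100) i = i + 1;` without running the program.

```python
# Toy interval abstract domain: intervals are (lo, hi) pairs, possibly
# unbounded. We abstractly execute a counting loop to a fixpoint.
INF = float("inf")

def join(a, b):
    """Least upper bound of two intervals."""
    return (min(a[0], b[0]), max(a[1], b[1]))

def widen(a, b):
    """Widening: jump any unstable bound to infinity to force termination."""
    lo = a[0] if a[0] <= b[0] else -INF
    hi = a[1] if a[1] >= b[1] else INF
    return (lo, hi)

def analyze_loop():
    """Infer the interval of i after: i = 0; while (i < 100) i = i + 1;"""
    i = (0, 0)                                   # abstract state after i = 0
    while True:
        guarded = (i[0], min(i[1], 99))          # restrict by guard i < 100
        next_i = (guarded[0] + 1, guarded[1] + 1)  # abstract effect of i = i + 1
        new = widen(i, join(i, next_i))
        if new == i:                             # post-widening fixpoint
            break
        i = new
    # Narrowing: one decreasing iteration recovers precision lost by widening.
    guarded = (i[0], min(i[1], 99))
    i = join((0, 0), (guarded[0] + 1, guarded[1] + 1))
    # On loop exit the guard is false, so i >= 100.
    return (max(i[0], 100), i[1])

print(analyze_loop())  # → (100, 100)
```

The widening step makes the fixpoint iteration terminate even though the concrete loop runs 100 times; the narrowing pass then tightens the over-approximated bound back to the exact invariant.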
Our expectation is that the audience of the tutorial will gain a clear understanding of Abstract Interpretation.
After this tutorial, the audience will be able to answer questions such as:
Dr. Brat received his M.Sc. and Ph.D. in Electrical & Computer Engineering from The University of Texas at Austin in 1998. His thesis defined a (max,+) algebra that can be used to model and evaluate non-stationary, periodic timed discrete event systems. Since then, he has worked on the application of static analysis to software verification. From 1997 to June 1999, he worked at MCC, where he was the technical lead on a project developing static analysis tools for software verification. In June 1999, he joined the Automated Software Engineering group at the NASA Ames Research Center and focused on the application of static analysis to the verification of large software systems. For example, he applied tools based on abstract interpretation to the verification of software for the Mars Pathfinder, Deep Space One, and Mars Exploration Rover missions.
Dr. Venet obtained a PhD in Abstract Interpretation in 1998 from École Polytechnique (France) under the supervision of Radhia Cousot. He has worked as an associate researcher in Patrick Cousot's group at École Normale Supérieure (France). Dr. Venet architected and implemented industrial static analyzers for full-scale critical software at PolySpace Technologies. He also worked on automated test generation and code compression for JavaCard applications at Trusted Logic. Dr. Venet has extensive theoretical and industrial experience in static analysis. He is currently a Research Scientist at NASA Ames, working on specialized static analyzers for verifying large mission-critical software developed at NASA. Dr. Venet has coauthored three patents on industrial applications of static analysis.
It is now widely accepted that functional aspects (Platform Independent Models (PIM) in OMG's Model Driven Architecture terminology) should be dissociated from non-functional ones (such as persistence, fault tolerance and so on) and from their platform-specific implementations. The Unified Modeling Language (UML) gives the designer a rich, but somewhat disorganized, set of views on her model, as well as many features for adding non-functional annotations to a model. In this tutorial, we show how to organize models around the central notions of (1) quality-of-service contracts (for specifying non-functional properties, à la QML, on PIMs) and (2) aspects describing how they can be implemented. Based on our experience in the IST QCCS project, we show how to model contracts in UML with a small set of stereotypes, and how to represent aspects such as design pattern occurrences, that is, as parameterized collaborations equipped with transformation rules expressed in an object-oriented extension of OCL2 working at the meta-model level. The second part of this tutorial presents our transformation-based approach as implemented in the UMLAUT framework to build design-level aspect weavers, based on a meta-level interpreter that takes a PIM as entry point, processes the various aspect applications as specified by the designers, and outputs another (detailed-design, or PSM-level) UML model that can serve as a basis for code generation.
Prof. Jean-Marc Jezequel received an engineering degree in Telecommunications from the ENSTB in 1986, and a Ph.D. degree in Computer Science from the University of Rennes, France, in 1989. He first worked in the telecom industry (at Transpac) on an Intelligent Network project before joining the CNRS (Centre National de la Recherche Scientifique) in 1991 as a researcher (Chargé de recherche). Since October 2000, Jean-Marc Jezequel has been a professor at the University of Rennes. He leads an INRIA research team called Triskell (http://www.irisa.fr/triskell), working in the domain of object-oriented software engineering for distributed computing systems and telecommunications. In the general context of building and assembling reliable and efficient components based on Aspect Oriented Design ideas, he is working on a set of tools allowing the formal manipulation of UML models (UMLAUT). UMLAUT builds on various technologies, including formal specification based on the OCL at both the model and meta-model levels (e.g. for representing Design Pattern applications), as well as the validation (random or exhaustive simulation, test case generation) of distributed software systems based on model-checking-related technologies. He is the author of two books published by Addison-Wesley, and of more than 60 publications in international journals and conferences.
Widespread development and reuse of software components has been regarded by many as one of the next big phenomena in software. Reusing high-quality software components in software development has the potential to drastically improve the quality and development productivity of component-based software. However, widespread reuse of a software component of poor quality may literally lead to disasters, and even improper reuse of good-quality components can be disastrous. Component-based software engineering involves many unique characteristics, some of which have raised new issues, challenges, and needs in the testing and quality assurance of components and component-based software. Testing and quality assurance is therefore critical for both software components and component-based software systems.
A number of recent books address the overall process of component-based software engineering and specific methods for requirements engineering, design, and evaluation of software components and component-based software. However, only a few published papers discuss issues in testing and quality control of reusable software components and component-based software systems. The primary goal of this tutorial is to discuss the issues, challenges and needs in testing components and component-based software. Moreover, the tutorial reports recent advances and research efforts in developing new solutions to these problems and meeting these needs.
The tutorial provides an in-depth look at the technical issues and managerial aspects of testing software components and component-based programs from the perspective of component-based software engineering. You will learn the state-of-the-art practices, issues and challenges, new solutions and research efforts in component-based program testing.
At the end of the tutorial, the participants will know:
The targeted audience includes technical managers, software testing engineers, quality assurance people, and development engineers who are working on component-based software projects. In addition, the tutorial is designed to help software engineering researchers and graduate students to learn the basic issues and challenges as well as new solutions in testing of third-party components and component-based programs. The tutorial will be very useful for professionals who are interested in understanding the general concepts and methods in component testing and component-based software validation.
This tutorial assumes that participants have a general understanding of the basic concepts of software engineering and software testing methods, and some working experience in software development and validation.
Jerry Z. Gao, Ph.D., is an associate professor in the Computer Engineering Department of San Jose State University. His current research interests include component-based software engineering, software testing methodology and test automation, information technology and Internet computing, and mobile computing and wireless commerce. He had many years of engineering and management experience in the software industry before joining San Jose State University. He has over 45 technical papers in IEEE/ACM journals and magazines as well as international conferences, and has co-authored two technical books on software testing. The first, titled Object-Oriented Software Testing, was published by IEEE Computer Society Press in 1998. Recently, he and his co-authors (Dr. Jacob Tsao and Dr. Ye Wu) completed a new book, titled "Testing and Quality Assurance for Component-Based Software", to be published by Artech House Publishers in September 2003. In addition, he has served as a technical committee member for a number of international conferences. Dr. Gao has taught various academic classes in software engineering, including software engineering I & II, object-oriented software design & analysis, object-oriented software testing, and software testing and quality assurance. He has been invited to offer a number of short courses and tutorials to industry professionals, including object-oriented software testing, engineering the Internet for global software production, software performance testing over the Web, and design for component testability.
The high-quality development of critical systems (be they dependable, security-critical, real-time, or performance-critical systems) is difficult. Many critical systems are developed, fielded, and used that do not satisfy their criticality requirements, sometimes with spectacular failures. Systems on whose correct functioning human life and substantial commercial assets depend need to be developed very carefully. Systems that have to operate under the possibility of system failure or external attack need to be scrutinized to exclude possible weaknesses. Part of the difficulty of critical systems development is that correctness is often in conflict with cost: where thorough methods of system design impose high costs through personnel training and use, they are all too often avoided.
UML offers an unprecedented opportunity for high-quality critical systems development that is feasible in an industrial context.
However, there are some challenges one has to overcome to exploit this opportunity, which include the following:
The tutorial aims to give background knowledge on using UML for critical systems development and to contribute to overcoming these challenges. Specifically, it presents a simplified version of UML called UMLlight that has the advantage of offering advanced tool support, and it includes an interactive demo of this tool support.
Jan Jurjens is a researcher at TU Munich (Germany). He is the author of a book on Secure Systems Development with UML (Springer-Verlag, to be published in 2003) and about 30 papers in international refereed journals and conferences, mostly on computer security and software engineering, and has given 4 invited talks at international conferences. He has created and taught a course on secure systems development at the University of Oxford and has delivered 15 tutorials on his research at leading international conferences. He is the initiator and current chair of the working group on Formal Methods and Software Engineering for Safety and Security (FoMSESS) within the German Society for Informatics (GI). He is a member of the executive board of the Division of Safety and Security within the GI, a member of the advisory board of the Bavarian Competence Center for Safety and Security, a member of the working group on e-Security of the Bavarian regional government, and a member of the IFIP Working Group 1.7 "Theoretical Foundations of Security Analysis and Design". He has been leading various security-related projects with industry.
Johannes Grünbauer has worked as a banker at the HypoVereinsbank (the second largest bank in Germany) and studied computer science at the Technical University (TU) of Munich with a minor in economics. The subject of his master's thesis was "Model based security analysis of a banking application". The work was done in cooperation with a major German bank. Currently, he is a research assistant and PhD student at the Department of Informatics at the Software and Systems Engineering chair of Professor Manfred Broy at TU Munich. His research interests include security and model-based software engineering.
Software development can be viewed as a series of successive transformations applied to the user's needs, leading to executable software that meets the functional and non-functional requirements. Some of these transformations are well understood and well formalized. Others continue to elude mathematical rigor. Consider, for example, the still-unbridged gap between user requirements and software specifications, and the integration of architectural considerations into the resulting software.
To characterize complex functions (transformations), researchers have used two complementary strategies:
In both cases, applying a transformation involves two steps: a) first, the selection of the solution or family of solutions, manually, by the developer, and b) the application of the partial or local function to adapt the solution to the problem at hand.
These two paradigms have dominated research on software engineering tools over the past decade. In industry, however, the solutions deployed have been fragmentary and incomplete. The OMG has recognized the value of crystallizing these approaches into a complete development framework that addresses the various stages of development. This framework, (somewhat misleadingly) called Model-Driven Architecture, also offers an effective solution to the problem of middleware proliferation: by effectively separating business concerns from design concerns, the same application can be developed and deployed on various target platforms at lower cost.
In this tutorial, we survey the literature on transformational approaches and study the particular case of the MDA.
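As a minimal illustration of the model-to-model transformations at the heart of MDA (real MDA tools operate on UML/MOF metamodels, not Python dictionaries), the sketch below maps a platform-independent class model (PIM) to a relational platform-specific model (PSM). The `Customer` class, attribute names, and type mapping are all illustrative assumptions.

```python
# Sketch: a toy PIM-to-PSM transformation. The PIM describes business
# classes with platform-neutral types; the transformation maps them to
# a relational schema, injecting platform-specific detail (primary keys).
PIM = {
    "Customer": {"name": "string", "balance": "int"},  # illustrative model
}

TYPE_MAP = {"string": "VARCHAR(255)", "int": "INTEGER"}  # assumed mapping

def pim_to_psm(pim):
    """Transform each PIM class into a table description. The 'id'
    primary-key column is injected by the mapping, not by the PIM:
    it is a design (platform) concern, separated from the business model."""
    psm = {}
    for cls, attrs in pim.items():
        columns = {"id": "INTEGER PRIMARY KEY"}
        for attr, typ in attrs.items():
            columns[attr] = TYPE_MAP[typ]
        psm[cls.lower()] = columns  # table names follow SQL conventions
    return psm

print(pim_to_psm(PIM))
```

Retargeting the application then amounts to swapping the transformation: the same PIM could equally be mapped to, say, a J2EE entity model by a different `pim_to_psm`, which is precisely the cost-saving argument made above.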
Hafedh Mili has been a professor of computer science at UQAM since 1988. He holds an engineering degree from the École Centrale de Paris (1984) and a PhD in computer science from George Washington University (1988). His research focuses mainly on object-oriented software engineering and the application of artificial intelligence techniques to software engineering. He has more than sixty publications in international journals and conferences on artificial intelligence, software reuse, and object-oriented software engineering. He recently published a book entitled Reuse-Based Software Engineering (2002), with A. Mili, S. Yacoub and E. Addy, published by John Wiley & Sons, N.Y. (ISBN 0-471-39819-5). Currently on sabbatical, he is a visiting researcher at CRIM.