Software Engineering

UC Santa Cruz – CMPE 276 – Fall 2000
T Th 12-1:45PM

Baskin Engineering 372



Jim Whitehead


BE 123


Office Hours:

Thurs 10-11, or by appointment


Below are some project ideas for the CMPE 276 term project. It is possible to develop a different project; if you do so, you should email a brief description of the project by October 6.

Software Engineering for Application Layer Protocols

As users of Internet services such as the Web, email, telnet, and FTP, we commonly use and depend on application layer network protocols. An application layer protocol is one that assumes either a reliable transport, such as TCP, or an unreliable packet transport, such as UDP, and bridges the gap between user requirements and the capabilities of the underlying network infrastructure. Currently, development of application layer protocols is ad hoc. While some rules of thumb exist, there is no organized body of knowledge concerning how to develop these protocols, or how to discriminate between “good” and “bad” protocol designs.
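To make the layering concrete, here is a minimal sketch of a toy line-oriented exchange in the SMTP style; a socketpair stands in for a TCP connection, and the HELLO greeting and 250 reply code are invented for illustration:

```python
import socket

# The transport (here a socketpair, standing in for TCP) only moves bytes;
# the "protocol" is the line-oriented convention the two endpoints agree on.
client, server = socket.socketpair()

client.sendall(b"HELLO example.com\r\n")   # client sends a request line
request = server.recv(1024)                # server reads it off the transport
if request.startswith(b"HELLO"):
    server.sendall(b"250 OK\r\n")          # SMTP-style numeric reply code
reply = client.recv(1024)

client.close()
server.close()
```

Everything above the transport, including the request grammar and the meaning of the reply codes, is what an application layer protocol design must pin down.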

Since application network protocols are designed to meet the needs of network users, just as software is developed to meet end-user requirements, the techniques and frameworks developed in Software Engineering seem applicable.

In this project, you will take one aspect of Software Engineering (such as testing, or formal methods) and apply it to application layer protocols. Since this is only a 10-week course, the key to this project is keeping the scope small. Some ideas are:

·         Formal specification for application layer protocols: One criticism of formal methods is that they are very time-consuming, and that their application might cause a product to miss its market window. However, application layer protocols are often developed over a long time period, and the costs of errors in such protocols are very large. As a result, formal specification of application layer protocols seems like a good match: the specification would not take too long relative to the development schedule, and could save significant money.

In this project, you will select one formal specification technique, such as Z, and pick one application layer protocol, such as HTTP/WebDAV.  Select a small subset of the protocol’s capabilities (for example, just the COPY and MOVE methods in WebDAV), and provide a formal specification of just those capabilities. In the writeup, provide an analysis of the strengths and weaknesses of the specification technique as applied to the selected protocol. Include a discussion of which protocol requirements were not specifiable using the formal specification mechanism, and why.

If additional project members are added, increase the scope of the specification, and examine the possibility of automatically generating test cases from the formal specification.
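To give a sense of the scale involved, a state-based specification in the style of Z might model the server as a partial function from URIs to resource bodies. The sketch below is simplified far beyond what the WebDAV specification actually requires (it omits collections, locking, properties, and the Overwrite header), and all names are our own:

```latex
% Simplified model: server state is a partial function from URIs to bodies;
% \oplus denotes functional override.
\[
  store : \mathit{URI} \rightharpoonup \mathit{Body}
\]
\[
\begin{array}{ll}
\mathrm{COPY}(s, d): & \textbf{pre: } s \in \mathrm{dom}\ store \\
                     & \textbf{post: } store' = store \oplus \{\, d \mapsto store(s) \,\} \\[1ex]
\mathrm{MOVE}(s, d): & \textbf{pre: } s \in \mathrm{dom}\ store \\
                     & \textbf{post: } store' = (store \setminus \{\, s \mapsto store(s) \,\}) \oplus \{\, d \mapsto store(s) \,\}
\end{array}
\]
```

Even at this toy scale, the exercise surfaces questions the prose specification must answer, such as what happens when the destination already exists.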

·         Requirements engineering for application layer protocols: Most application layer protocols have an associated requirements or goals document. However, these documents are typically not consulted or maintained once development of the protocol commences. In this project, you will discuss several practices in software requirements engineering that you believe would provide significant benefit in the development of application layer protocols. The discussion should address such factors as the tool support needed, who performs the work and who receives the benefit, and the time needed to apply each technique.

·         Testing of application layer protocols: Application layer protocols are commonly subjected to interoperability testing, where permutations of client/server pairs are tried and the results noted. Less frequently, compliance testing is performed to determine whether a client or server implementation of the protocol meets the specification. Protocol testing is somewhat different from either white box or black box testing, since the purpose is to test the data that flows between two black boxes, the client and the server (or peers, in a peer-to-peer protocol). This project involves selecting two or three testing techniques and discussing their applicability (or lack thereof) to testing application layer protocols. The discussion should include the strengths and weaknesses of each technique, the types of protocols to which it applies, whether it works best for testing clients or servers, and the mechanics (and difficulty) of applying the technique.
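As a small example of what one black-box compliance check might look like (the function name and the choice of rule are ours; a real suite would cover many such grammar rules from the HTTP/1.1 specification), consider validating the Status-Line a server returns:

```python
import re

# One compliance rule, checked from outside the black box: a Status-Line
# must be "HTTP-Version SP Status-Code SP Reason-Phrase CRLF". A real test
# harness would capture the line off the wire; here we check it directly.
STATUS_LINE = re.compile(r"^HTTP/\d\.\d \d{3} [^\r\n]*\r\n$")

def status_line_complies(line: str) -> bool:
    """Return True if `line` matches the HTTP/1.1 Status-Line grammar."""
    return STATUS_LINE.match(line) is not None

print(status_line_complies("HTTP/1.1 200 OK\r\n"))   # compliant
print(status_line_complies("HTTP/1.1 200OK\r\n"))    # missing space: rejected
```

Note that such a check says nothing about interoperability: a server can emit perfectly grammatical responses that no deployed client understands.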

Other possibilities for applying Software Engineering techniques to application layer protocols undoubtedly exist: feel free to propose one.


Rohit Khare’s 7th Heaven column in IEEE Internet Computing provides a good introduction to individual application layer protocols:

The Spec’s in the Mail – Covers the Simple Mail Transfer Protocol (SMTP) and gives a list of applicable RFCs.

The News About Jon – Covers the Network News Transfer Protocol (NNTP)

Who Killed Gopher? – Covers the Gopher protocol, and the Hypertext Transfer Protocol (HTTP).

The Internet Protocol Journal occasionally covers application layer protocols:

“Internet Mail Standards”, Paul Hoffman, Internet Protocol Journal, 3(2), June 2000

The HTTP working group web page is:

The WebDAV working group web page is:

The WebDAV Resources site also provides information on WebDAV.

Open Source

Open source software development, where teams of developers from across the Internet collaborate on a software project whose source code is freely available, is currently receiving significant interest due to the success of projects such as Linux and Apache. Like many trends in the computer industry, open source has been the subject of many claims and much hype.

·         Open source: does it match the hype? In this project, you will perform a brief survey of articles on open source software development to develop a short list of claims concerning open source development. You will then evaluate these claims by examining one current open source project, such as Apache, MySQL, sendmail, KDE, Gnome, or ArgoUML. Examples of things to look for include: are people using consensus voting? Where do people send patches? Are people solving problems they have (“scratching itches”), or are they participating for fame and fortune? Is open source software truly of higher quality, and more performant? If more people are added to the project, the scope can be expanded to include more qualities of open source software and more open source projects.

·         Open source project metrics. Because the Internet is the collaboration medium, a greater percentage of the development process is captured in machine-readable artifacts such as email and project documents. This raises the possibility of creating new metrics tailored to open source software development. In this project, you will develop several new metrics, or identify existing metrics, that can be applied to open source software development. Some questions such metrics might address: What is the level of turnover on the mailing list? How much of the code was written by people who no longer participate in the project? How influential are certain people on the mailing list? Who are the biggest contributors in terms of messages, code, bug reports, etc.? The metrics should be applied to an existing open source project and the results discussed. As more people are added to the project, the number of metrics and the number of open source projects can be expanded.
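As one concrete (and entirely invented) example of such a metric, contributor turnover can be computed directly from archived project data. The sketch below assumes the commit or mailing-list history has already been reduced to (year, author) pairs:

```python
from collections import defaultdict

def yearly_turnover(commits):
    """commits: iterable of (year, author) pairs.
    For each year with a following year of data, return the fraction of
    that year's contributors who did not contribute the next year."""
    by_year = defaultdict(set)
    for year, author in commits:
        by_year[year].add(author)
    turnover = {}
    for year in sorted(by_year):
        if year + 1 in by_year:
            departed = by_year[year] - by_year[year + 1]
            turnover[year] = len(departed) / len(by_year[year])
    return turnover

# Toy history: bob contributes in 1998 only, so 1998 turnover is 0.5.
log = [(1998, "alice"), (1998, "bob"), (1999, "alice"), (1999, "carol")]
print(yearly_turnover(log))  # {1998: 0.5}
```

The hard part of the project is not the computation but deciding which measures are meaningful and validating them against a real project's history.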

I’d like to thank Jason Robbins for the suggestions and thoughts he contributed to these projects.


IEEE Software had a special issue on Open Source and Linux in the Jan/Feb 1999 issue.

“Linux on the Move”, Terry Bollinger, Peter Beckman, IEEE Software, Jan/Feb, 1999, p. 30-35. (Includes “A Brief History of Free Software and Open Source”, Jesus M. Gonzales Barahona, Pedro de las Heras Quirós, Terry Bollinger, p. 32-33).

“The Business Case for Linux”, Evan Leibovich, IEEE Software, Jan/Feb, 1999, p. 40-44.

“Setting Up Shop: the Business of Open-Source Software”, Frank Hecker, IEEE Software, Jan/Feb, 1999, p. 45-51.

“Linux in the Workplace”, Jacob Hallén, Anders Hammarqvist, Fredrik Juhlin, Anders Chrigström, IEEE Software, Jan/Feb, 1999, p. 52-57.

“Linux in Practice: An Overview of Applications”, Terry Bollinger, IEEE Software, Jan/Feb, 1999, p. 72-79. (Article includes 6 sidebars covering domain-specific uses of Linux).

“Culture Clash and the Road to World Domination”, Greg Perkins, IEEE Software, Jan/Feb, 1999, p. 80-84.

“Linux and Open-Source Success”, Interview of Eric S. Raymond by W. Craig Trader, IEEE Software, Jan/Feb, 1999, p. 85-89.

Communications of the ACM had a section of the April, 1999 (Vol. 42, No. 4) issue dedicated to Open Source.

“The Cathedral and the Bazaar” by Eric S. Raymond is an influential recent paper advocating Open Source development.

A critique of this paper appears in “Beyond the Cathedral, Beyond the Bazaar”, by Jonathan Eunice.

Another critique is “A Second Look at the Cathedral and the Bazaar”, Nikolai Bezroukov, First Monday, volume 4, number 12, December 1999.

“A case study of Open Source software development: the Apache server”, Audris Mockus, Roy Fielding, James Herbsleb, Proc. 2000 Int’l Conference on Software Engineering (ICSE2000), Limerick, Ireland, June 4-11, p. 263-272.

Appropriately, the O’Reilly book, “Open Sources: Voices from the Open Source Revolution”, ed. by Chris DiBona, Sam Ockman & Mark Stone, 1999, has its full text available online.

“The Simple Economics of Open Source”, Joshua Lerner, Jean Tirole, National Bureau of Economic Research, 2000, NBER Working Paper 7600.

The online journal First Monday had a number of articles on Open Source in its March 1998 issue:

“Cooking Pot Markets: An Economic Model for the Trade in Free Goods and Services on the Internet”, Rishab A. Ghosh, First Monday, vol. 3, no. 3, March, 1998.

“Linux and Decentralized Development”, Christopher B. Browne, First Monday, vol. 3, no. 3, March, 1998.

“FM Interview with Linus Torvalds: What motivates free software developers?”, First Monday, vol. 3, no. 3, March, 1998.

“Requirements Elicitation in Open-Source Programs”, Lisa G. R. Henderson, Crosstalk, July, 2000.

On October 23, 2000, Slashdot had a discussion on requirements elicitation in Open Source projects.

“A Quantitative Profile of a Community of Open Source Developers”, Bert J Dempsey, Debra Weiss, Paul Jones, and Jane Greenberg, Unpublished manuscript, University of North Carolina.

DMOZ has a listing of Open Source resources at:

Google has a list of resources as well:

Rationale for Creating Revisions

The use of a revision control system is common practice in most software development projects today, and the technology of revision control is well understood. Less well understood, however, are the actual use patterns and motivations behind fine-grain operations such as check-in and check-out. The goal of this project is to answer the question: why do people make new revisions? For example, if a software developer is currently working on a piece of source code, what criteria does she use to decide when it is time to check the code in and create a new revision?

This project involves going out into the field and interviewing software developers about their reasons for deciding to check in. Interviewing academic software developers is not sufficient; to perform this project, you must interact with developers in a corporate or open source software development effort. As a result, before considering this project, you should have one or two sites in mind where you could easily gain access (for example, if you have a part-time job, or a friend who works nearby). Open source projects might be a useful place to find people to interview, although developers’ check-in rationale will likely vary with their revision control system, and hence results from an open source project may carry a dependency on CVS. If more people are added to the project, the number of sites and the number of people interviewed can be increased.


The two classic papers on revision control are the SCCS and RCS papers:

“The Source Code Control System”, Mark J. Rochkind, IEEE Trans. on Software Engineering, 1(4), 1975, p. 364-370.

“RCS – A System for Version Control”, Walter F. Tichy, Software-Practice and Experience, 15(7), 1985, p. 637-654.

The CVS system is frequently used by open source projects, due to its support for distributed software development. For an introduction and overview:

“CVS II: Parallelizing Software Development”, Brian Berliner, Winter 1990 USENIX Conference, Washington, DC, Jan. 22-26, p. 341-351.

Conradi and Westfechtel have a survey of version control data models:

“Version Models for Software Configuration Management”, Reidar Conradi, Bernhard Westfechtel, ACM Computing Surveys, 30(2), 1998, p. 232-282.

The Configuration Management Yellow Pages site has links to a staggering number of version control and configuration management systems:

There is also a regular workshop series, called System Configuration Management (SCM), which has nine proceedings available.  The most recent ones are published in the Springer-Verlag Lecture Notes in Computer Science (LNCS) series: SCM-8 is LNCS 1439, and SCM-9 is LNCS 1675.

There has been relatively little published research examining actual use patterns of version control and configuration management systems. Rebecca Grinter has done some field studies of configuration management use, reported in:

“Supporting Articulation Work Using Configuration Management Systems.” Rebecca Grinter, Computer Supported Cooperative Work: The Journal of Collaborative Computing. 5(4), 1996, p. 447-465.

PCTE Postmortem

One idea that was heavily explored in the software environments research of the late 80’s and early 90’s was the notion of a central repository for all artifacts created while developing a software project. The basic notion was that if all tools stored their data in the same repository, it would be possible to provide additional services such as advanced searching, automated process support, configuration management, management reports, and visualizations. The storage facilities offered by the operating system were considered inadequate, since they did not provide facilities for efficient searching, annotating artifacts with metadata, and creating inter-artifact relationships.

Researchers realized that this repository vision could only be achieved by developing a standard repository with a uniform interface to standard services. If such a standard existed, tools would be developed to use it. As more tools adopted the standard and more data existed in the repository, the value of the repository would increase until all tools would, of course, store their data in the common repository.

There is only one problem with this story: it didn’t work.

The Portable Common Tools Environment (PCTE) was the most influential repository standard developed to meet this vision. Many intelligent and educated people worked hard to develop this standard, and there was some commercial support for it, as well as many research papers.  However, the standard never took off, and never received widespread adoption.

The purpose of this project is to develop an explanation for why PCTE was not adopted by the marketplace. After a flurry of excitement in the early 90’s, PCTE just faded away, and there was never a solid retrospective on PCTE that critically examined why it did not succeed.  The principals involved in the creation of PCTE are still alive, and an effort should be made to contact several of them so they can be interviewed to learn the “story behind the story”.

As more people are added to this project, the number of interviews can be expanded. Examination of other similar standards can also be performed: the Texas Instruments Object Oriented Database (TIOODB) was another repository standard effort of the early 90’s that had limited success.


“PCTE: The Standard for Open Repositories”, Lois Wakeman, Jonathan Jowett, Prentice-Hall, New York, 1993.

International Organization for Standardization (ISO) page on PCTE: ISO/IEC JTC1/SC22/WG22 – PCTE: