
Ad hoc 2.0 – The New Generation

Nuno Brito
CMU/UC Master of Software Engineering 2009/2010
Managing Software Development
[email protected]

Abstract

Ad hoc software development is the construction of software solutions through methods and techniques that are most often developed from scratch during project development. Although a reputation of limited flexibility and scalability haunts this methodology, it is nevertheless a powerful technique for building and deploying applications or prototypes within a short period of time. Ad hoc will not scale in the traditional format known to software engineers, but web 2.0 platforms such as forums, blogs and social networks, combined with accessible development frameworks, are now reviving the ad hoc philosophy as a valid approach for developers, because this scaling/complexity limitation is no longer an obstacle. Even though it is not a magic bullet that can solve the issues that arise from avoiding a more complete and formal methodology, it is nevertheless a growing and attractive trend among many small-sized software projects, producing software capable of rivaling commercial applications built by professional developers, resulting in what we define as Ad Hoc 2.0 – the new generation.

1. Introduction

Where does the term ad hoc come from? Ad hoc is a Latin phrase which means "for this [purpose]". It is common practice to use this term to describe specific solutions applied to a given task or problem which cannot be generalized or adapted to other purposes. In the context of software development, an ad hoc methodology is often associated with a short life cycle span, or even with the lack of an architecture that addresses scalability and completeness from the early project inception. These characteristics have long been recognized as essential to improve the odds of success for a professional project in the software industry [4]. However, with the advent of massive access to the Internet came the emancipation of end-users, providing the conditions that allowed them to evolve from a passive state to active positions inside the project, as beta testers or even as active developers who help carry the progress forward and lead the project to success. The power of this type of cooperative development is well described by Eric S. Raymond in "The Cathedral and the Bazaar" [5]. The Internet thrives in the web 2.0 era as a global platform that reaches a far wider share of the population than web 1.0. People of any gender, age or technical background can effectively communicate, work, produce and share knowledge on a global scale with relative ease. There is an emergent new generation of software projects that follow the old ad hoc principles but are no longer tied to a restricted group of people with a high level of knowledge of professional programming activities [2]. In this paper, we will begin by focusing on how ad hoc is currently perceived from a professional perspective and then perform a more in-depth analysis to understand the advantages and disadvantages that arise from this methodology when contextualized inside a software engineering plan. Later, we approach the concept of Ad hoc 2.0 from a high-level perspective that explores the reasons why its use is possible and attempt to identify common life cycles that projects following this methodology might eventually follow. We will then focus on real examples of projects built and maintained with some success while employing this methodology to some extent, and analyze some of the characteristics that can lead a project to success or failure in terms of longevity and large-scale software development.

Before delving into the differences between ad hoc in its traditional format and the new format that is the topic of this paper, it should be noted that there are some ambiguities regarding the terms "life cycle", "methodology" and "process" as related to the development of projects. The author will adopt a consistent and contextualized definition of each term throughout the paper in order to keep its meaning clear in each paragraph. To help clarify how each term is usually used: "process" refers to the tasks that are executed inside each phase of development; "methodology" refers to the set of techniques and methods that are employed in these processes; "life cycle" refers to all phases of the project, from its early inception until the final release of the product to the customer, including subsequent phases (if any) such as help desk support and maintenance.

2. Traditional Ad Hoc development

2.1. Concept

Software engineering often tends to view ad hoc negatively when it comes to choosing a life cycle solution against a formal approach.

The SEI (Software Engineering Institute) characterizes ad hoc in "CMMI for Development" [10] as belonging to the lowest possible maturity level: "At maturity level 1, processes are usually ad hoc and chaotic. The organization usually does not provide a stable environment to support the processes. Success in these organizations depends on the competence and heroics of the people in the organization and not on the use of proven processes. In spite of this chaos, maturity level 1 organizations often produce products and services that work; however, they frequently exceed their budgets and do not meet their schedules. Maturity level 1 organizations are characterized by a tendency to over commit, abandonment of processes in a time of crisis, and an inability to repeat their successes."

Formal methodologies will often attempt to follow a predefined set of stages intended to provide a certain quality level and management control and, in some cases, even allow certification of their software practices by a third-party entity. These are recognizable characteristics of a well-engineered product that aims to reach a higher level of CMMI maturity [10] since its inception, but a development process is also driven by the developers' talent and effort, which tends to avoid formalized processes that place restrictions on creativity and innovation [13]. In an open and global market environment where innovation and competition are more often than not the aggressive points that help establish success [7], can we continue to regard ad hoc approaches so negatively that software engineers completely discard them as an efficient solution for professional software industry products? The criticism voiced in CMMI about this precarious style of developing software is certainly true to a large extent, but it is also true that software developers still apply ad hoc in their everyday work as an intuitive and efficient way to manage the resources available to them. This approach allows a project to survive in a real world of software competition, where high levels of creativity under a tight budget of time and resources become as critical as they are necessary. The most interesting force that drives ad hoc as a compelling approach is the (in)famous preference for a lack of formal processes from the early inception of the project. There is a common misconception that ad hoc followers do not follow any formalized process; the truth is that an ad hoc project is simply developed without a specific process methodology followed in a strict and formal manner from the beginning. Eventually, some formal processes do tend to surface after they are tested and considered fit for the custom needs of the project in question, but this misconception may lead to assumptions that characterize the overall development as "hackish" or even "chaotic" when viewed from an external perspective that is not aware of how these processes are unique to the project in question, even if they do not follow a standardized approach. Using nothing more than the know-how of the team of developers mixed with experience acquired during the project's life cycle, an ad hoc project can ensure in most cases that whatever processes are adopted along the development cycle, they will eventually be modified enough to suffice the needs of the project to some extent. The level of efficiency resulting from these ongoing ad hoc processes is perhaps questionable, as it depends directly on the proficiency and talent of the team members and on the user feedback that helps shape them. This is an approach that carries advantages and disadvantages, which will be explored in the next section.

2.2. Advantages and Pitfalls

Ad hoc still holds several advantages as a development approach, since it is well adjusted to projects of relatively small to medium dimension, where time and expected results do not justify the cost of instantiating a formal process. For example, ad hoc approaches apply very well to cases where a given project requires a proof of concept before engaging in full-scale development under a formal approach, or to projects meant for internal use, where customer, developers and end-users all belong to the same company and a formal process is considered to add unnecessary overhead to delivering the product. Ad hoc can sustain a high level of motivation and interest from a team since it allows developing a project in a lightweight manner intended for release within a short period of time. This is a good way to let innovation and creativity take place without too many barriers to development. However, allowing a project to be developed without formal methodologies, depending solely on the experience of the developers, is also a risk that raises the odds of several disadvantages that need to be considered before engaging in this sort of practice. What specific reasons keep companies away from adopting ad hoc? Ad hoc has carried a very negative image since the 90's. This is the worst factor that still prevents its official adoption by professional software developers. It will also remain the cause at which people point their finger when looking for something to blame for the never-ending stream of problems that arise when the inherent limitations of this approach are not properly understood. This approach suffers from the same type of issue often associated with companies that avoid XP or generic agile methodologies after passing through negative experiences caused by their initial lack of experience with the agile approach to begin with. This same factor induces a catch-22 type of problem: a company that publicly admits to using ad hoc on its projects will project to the outside world an image where quality and efficiency are not perceived as reliable, causing an impact on its reputation, but under the hood, the same company can still rely on ad hoc as an approach that allows exploring competitive solutions on the market, delivering quick releases to the public and outperforming competitors. We can identify and detail some of the reasons for this negative fame:

• Poor practice of requirements elicitation – there is no predefined set of procedures to properly conduct a requirements elicitation process that allows understanding the scope of a given problem. This raises the risk that a project drives the team to waste effort on solving superficial symptoms instead of addressing the actual cause of the problem, or even to mix requirements with a possible solution.

• Quality assurance is not measured nor conducted efficiently – without an efficient strategy for early defect detection or procedures to continuously assess overall quality through methods such as Fagan inspections [12], a high density of defects will eventually be discovered after releasing the initial product version to the customer.

• Lack of scalability or generality – ad hoc projects focus on a specific audience with specific needs. From the initial days of development, this characteristic constrains the range of evolution that can be expected in terms of scope, size and complexity when planning to grow along these dimensions. We'll explore this particular characteristic in the next section of this paper.

2.3. Scalability of ad hoc projects

Scalability is an often desired characteristic of commercial projects that seek a growing number of end-users, as it might bring a proportional growth in revenue. But if the base of end-users grows continuously over time, the project will also tend to inevitably increase in size and complexity. As a result, features will be added, exposed defects will require correction, and support for end-users and legacy maintenance for older versions will need to be provided on a scale that will surely continue increasing. Scalability will therefore impact the development resources that were originally allocated to a given ongoing project [1]. Ad hoc is rarely prepared for the endurance of time and the increase of the user base over the long term. A project based on an ad hoc architecture will demonstrate its inadequacy to survive as complexity increases, often requiring a complete rebuild from the ground up to match the expected level of usage and efficiency – perhaps even employing a formal process to achieve the intended results for the future and no longer be labeled as ad hoc. These are the most common reasons why companies choose a formal development process from the start. Characteristics such as scalability and maintainability over the long term are planned from the early project inception, and the gain from the quick start and delivery provided by ad hoc would only come with higher costs to pay as the project itself grows bigger. Even though the bad reputation assigned to ad hoc can sometimes be considered exaggerated, since people tend to pick examples where it is certainly not an appropriate approach, the truth is that its limitations in the traditional format are both a blessing and a curse that restrict its usage to developers with significant experience in the software industry, who can take advantage of most benefits it has to offer without suffering most of the recognized pitfalls. However, a new generation of ad hoc is surfacing that does not require extensive experience in software development nor is limited to the same boundaries as the traditional ad hoc format. We will now introduce this new concept in the next section.

3. Ad hoc 2.0 – The New Generation

It is a fact that the Internet plays an important role as a communication platform for people who wouldn't otherwise group together. This access to the Internet world influenced the software development paradigm, as reported by Eric S. Raymond in "The Cathedral and the Bazaar" [5]. However, Eric's study is mainly focused on the contrast between closed source projects and open source movements, and leaves out some of the interesting management techniques that allowed ad hoc processes to efficiently manage resources in large-scale software development projects. Both open and closed source movements share, to some extent, similar concerns about people management, quality control and customer satisfaction, even though they follow different approaches to how software should be developed.

Open source projects are most often characterized as a geographically dispersed group of volunteer developers who enlist themselves to help with tasks suited to their talent, which eventually keeps a project moving forward at a good level of progress if it retains the attention of an audience. The popularity, influence and success of a given open source project can therefore be directly measured in most cases by the number of users and volunteers who work to ensure its long-term longevity. The approach for developing these projects when they are small can be loose and sporadic. But when these projects start growing across several dimensions, such as complexity and user base, a natural tendency to adopt a strict and formal set of methodologies that allow a team to keep the overall development processes under control will eventually begin to surface. So, we can conclude that regardless of whether a given software product is open source or closed source, if the project is of mid or large size and complexity, then it will eventually need to increase control over its software development processes in the long term if the overall goal is to remain active and popular. Traditional ad hoc can be found necessary to kickstart a project that follows either the open or closed source type of methodology, and we also mentioned previously that there is a natural tendency for a progressive formalization of the ad hoc processes to occur if enough time is given. Or, in more extreme situations, the developers of an initially ad hoc project can decide to redesign it from scratch, then using the formal methods that are more appropriate to generate a new architecture based on lessons gathered from past experience in the development effort. Failing to evolve into this formalized phase of development might eventually lead a project to stagnation or increase the risk of it being overrun by other competing projects. We can therefore infer that a project cannot use traditional ad hoc methodologies as a sustainable solution over the long term, and this fact has remained a solid truth for a long time. But when ad hoc meets the Internet of masses, an exciting new twist on the fate of ad hoc projects allows breaking this rule. The audience of users interested in joining efforts to help a given project at absolutely no cost allows people to organize themselves and develop a project by voluntarily assuming the responsibilities of specific roles inside these ad hoc processes.

Groups of people will form effective development teams that can at times outperform professional developers. This can be due to the high level of motivation and the feedback from multiple perspectives provided by direct contact between end-users and product developers. People involved in ad hoc 2.0 projects will usually share the same domain knowledge, in a similar manner to what currently occurs in open source movements. There are situations where this type of ad hoc is ideal for decentralized development without requiring a high level of control or management. When efforts are based on the cooperative work of volunteers who are not confined to a strict and formal organization, as noted in many larger open source movements, development becomes intuitive and dynamic, adapting more easily to newer technology trends without added costs. This type of volunteer movement can be found across many other projects of small dimension whose code is not necessarily released to the public, but where people still offer their time and talent to help with tasks such as language translation, beta testing, help desk support or whatever else is required. There is an ongoing trend where the exclusive use of web 2.0 technologies such as bulletin boards, blogs or other social networks allows people across the globe to communicate and share knowledge. These are the ideal locations where an ad hoc project can progress and reach popularity, growing in users and complexity while withstanding the test of time, without ever requiring the move to more formalized practices as seen in the traditional ad hoc format. There is no extensive roadmap, no detailed architecture plan, no monitoring of milestones nor extensive quality control, and yet these projects still manage to provide a product or service that reaches success and even outperforms professional developments in some situations. This is what we identify as "Ad hoc 2.0 – the new generation". It is a concept for developing software where some of the common disadvantages identified in traditional ad hoc are eventually solved or not even considered relevant. There is an important human value raised by this mass of users that allows these projects to keep moving forward regardless of their complexity. These projects can be based solely on the help and talent of volunteers. It is the type of methodology where small groups of developers can effectively rival products developed by commercial companies or provide solutions that wouldn't otherwise be explored by commercial groups. Ad hoc 2.0 projects have no roadmaps defined for each development phase and can often be described as ongoing development work whose progress is based on volunteers and continuous feedback from users.

3.1. Typical life cycle of an Ad Hoc 2.0 project

What can be expected from Ad Hoc 2.0? This is a flexible approach. There is no structure that can accurately describe a common pattern across projects of this particular kind; nevertheless, it is possible to distinguish a few phases that are likely to occur during the project lifetime. The ideal team conditions for an Ad hoc 2.0 approach are situations where only a single developer or a small group of two to five developers is involved. There may be several team members assigned secondary roles such as beta testing, but the primary team members should always be responsible for the most important decisions regarding the project's progress while sharing the role of end-users of the same project. The life cycle of an Ad hoc 2.0 project will typically follow five distinct phases of development (a small illustrative sketch follows the list below):

1. Idea – The concept for a given project is proposed by a developer or small group of developers to solve a given problem or challenge. A brainstorming session between some of the stakeholders attempts to identify possible competitors, alternatives and the complexity involved in carrying the project forward to a feasible success.

2. Implementation – The project begins implementation shortly after the idea is accepted as achievable. The project will likely be coded using a software framework that is familiar to the developers.

3. Initial version – A first version is made available to the public as a proof of concept that the innovative idea behind the project can be achieved. It is usually released to a small group of people close to the development circle in order to gather feedback.

4. Revision / Beta versions – Incremental beta releases and versions considered stable are published to address bugs or requested features. End-users participate very actively in this stage.

5. Stagnation – The project eventually reaches a state of inactivity and either does not fulfill requests from end-users or there is not enough interest from the community where it is discussed to provide additional feedback. Stagnation occurs because the project is neither announced as terminated nor do the responsible developers provide insightful news regarding its future progress.
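As an illustration only, the following minimal Python sketch models these five phases as a simple linear state progression; the phase names come from the list above, while the class and helper names are hypothetical and not taken from any real project.

    # Minimal sketch: the five Ad hoc 2.0 phases as a linear state progression.
    # The phase names follow the list above; everything else is hypothetical.
    from enum import Enum

    class Phase(Enum):
        IDEA = 1
        IMPLEMENTATION = 2
        INITIAL_VERSION = 3
        REVISION_BETA = 4
        STAGNATION = 5

    class AdHoc2Project:
        def __init__(self, name: str):
            self.name = name
            self.phase = Phase.IDEA

        def advance(self) -> Phase:
            # Stagnation is terminal: the project is never formally closed,
            # it simply stops moving forward.
            if self.phase is not Phase.STAGNATION:
                self.phase = Phase(self.phase.value + 1)
            return self.phase

    # Example: walk a hypothetical project through its whole life cycle.
    project = AdHoc2Project("hypothetical-boot-disk-tool")
    while project.phase is not Phase.STAGNATION:
        print(project.name, "->", project.phase.name)
        project.advance()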

3.2. Advantages and disadvantages

Ad hoc 2.0 in this new context remains true to its origins – avoid formal processes from the beginning, change and adapt as necessary in order to survive. To avoid an extensive list of characteristics, we narrowed the scope down to 10 advantages and 10 disadvantages that seem relevant for this particular development methodology. Advantages that are commonly found:

1. Good for volunteer work, as people can join at any given time as beta testers or, in some situations, even as developers;

2. Well adjusted to small-sized projects, as the level of complexity can still be completely managed by a single developer or a small team of developers;

3. Allows building strong relationships with users, since they also become an active part of the development process as beta testers or in other roles for which they volunteer, helping the project effort move forward;

4. Affordable – requires few resources from inception, and the development team can in most cases work on the project during their free time as a hobby;

5. Localized – solves an immediate need of a specific audience instead of requiring support for a much wider array of possible users;

6. Flexible – adjusts well to new requirements that may appear during development or to new situations that may cause a significant drift in the development of the project;

7. Easy to manage – doesn't require team members to possess a specific formal education in software development and management; a small team is capable of developing a good product using common sense to solve most situations;

8. Lightweight – can be instantiated in a short period of time and keeps the motivation of the involved developers at a very high level of engagement, as the project results are visible very quickly;

9. High number of versions released to the public that address new features or defect fixes;

10. Development driven by user feedback – progress occurs according to the motivation of the authors and requests from the user base. If a project is popular, then that popularity can act as a good motivator for the author to proceed with further development.

And also some disadvantages:

1. It is not prepared to scale or handle complexity by default; the effort required to adapt to a bigger user base or higher code complexity might be overwhelming for a small development team;

2. No formalized quality assessment, as there are no repeatable processes to ensure that new development doesn't conflict with previous development effort [12];

3. No metrics defined to evaluate project status, which makes tracking progress more difficult for everyone involved in the project;

4. Risks are often not clearly explored and analyzed before proceeding with project development; this might force developers to deal with unexpected risks that could have been avoided from the start;

5. No clear milestones with well-defined goals, as the project is governed by its specific needs at any given time. Defining milestones is a very difficult task unless there is a clear goal to achieve from the beginning;

6. Requirements elicitation is often incomplete, as developers tend to opt for delivering a quick fix or patch that solves the symptoms of a problem without focusing on the reason that is causing it in the first place;

7. No schedule is defined in most cases, as the project's progress will depend on the effort and motivation of the developers, which might lead to prolonged states of stagnation;

8. High rate of bugs, since new versions are often released to the public without extensive debugging care, often relying on user feedback to detect defects and prioritize their fixes;

9. Defects are not reported with enough detail because users are not properly trained for this task, often leading to ambiguous or incomplete reports about a defect that needs to be corrected;

10. Too many different versions are difficult to use and might even paralyze any effort to bring simplicity into a versioning scheme that attempts to distinguish between stable and unstable versions of the product (a small illustrative sketch of one such labeling convention follows this list).
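To make the last point concrete, one lightweight convention that small projects sometimes adopt is to encode the release status in the version tag itself. The sketch below is only an assumed example of such a convention, not something prescribed by any of the projects discussed in this paper.

    # Illustrative sketch only: classify release tags so users can tell stable
    # builds from beta builds. The tagging scheme (plain "1.4.2" = stable,
    # suffixed "1.5.0-beta3" = unstable) is an assumed convention.
    import re

    def classify(version: str) -> str:
        # A tag made only of dot-separated digits is treated as stable;
        # anything carrying an extra suffix (beta, rc, ...) is treated as unstable.
        return "stable" if re.fullmatch(r"\d+(\.\d+)*", version) else "unstable"

    for tag in ["1.4.2", "1.5.0-beta3", "1.5.0", "1.6.0-rc1"]:
        print(tag, "->", classify(tag))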

Previously, ad hoc methods were found mostly in software built for a specific target audience, the "users by the dozens" [4]. Access to the Internet amplified this specific target audience from mere dozens to hundreds or even thousands of users on a daily basis. This difference in scale places ad hoc in a privileged position, as most of the costs of development and research can be reduced to a few talented developers. Success is often measured by the number of people who follow their progress, and this is probably where software development achieves its most emphatic form: a state of coding where the gap between the developers and the end-users of the project is reduced to a bare minimum. This flexibility to release a product often allows publishing several minor version updates as prototypes that help developers get a better notion of the ideal product that answers the ever-changing expectations of end-users. This type of approach often requires an uncertain number of releases, and it is very difficult to perform estimations or assess the overall project status, but these are also factors that assume a smaller or less critical relevance in this context. Recent software development frameworks are also partly responsible for the existence of this new generation of developers. Powerful frameworks are in most cases offered at no cost to developers (Eclipse, Visual Studio), as the companies who develop them also have an interest in their mass adoption, and they provide a level of built-in functionality that allows these small-sized teams to quickly produce applications based on ready-made user interfaces and many other stable components, letting them focus their efforts on innovation.
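As a rough illustration of how much a framework hands a lone developer for free, the short Python sketch below uses the standard library's Tkinter toolkit as a stand-in for the richer frameworks named above; the window contents are invented for the example.

    # Illustrative sketch: a complete working desktop window in a few lines,
    # using Python's bundled Tkinter as a stand-in for rich IDE frameworks.
    # The component names shown are invented for this example.
    import tkinter as tk

    root = tk.Tk()
    root.title("Ad hoc 2.0 prototype")
    tk.Label(root, text="Select components to remove:").pack(padx=10, pady=5)
    for name in ("Games", "Sample media", "Extra languages"):
        tk.Checkbutton(root, text=name).pack(anchor="w", padx=10)
    tk.Button(root, text="Build", command=root.destroy).pack(pady=10)
    root.mainloop()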

In order for a given project to achieve a competitive level and maintain that position, the software processes that are employed should also be based on flexibility and extensibility rather than high quality, as mentioned by Nogueira in "Surfing the Edge of Chaos" [11]. The goal is to maintain a balance between innovation and public interest. Assuming that an ad hoc methodology allows a project to reach the top of its initial expectations and is well adjusted to keep a large percentage of users happy with the product for a long period of time, there is an important question that needs to be asked: where can software engineering be found under this new paradigm? We devote the next section to clarifying this question and the implications of processes when viewed from a software engineering perspective.

3.3. Surviving scalability and long-term development

The most desired characteristic in Ad hoc 2.0 projects is a successful transition to a higher scale of work and complexity that allows supporting a wide number of users and developers while increasing the overall quality of the services made available by the project. Failing to evolve into an architecture better adapted to the project will eventually lead the overall development into a situation where the project can no longer keep up with other competing projects. Software engineering should be employed to explore more efficient methodologies capable of producing processes adapted to a certain type of project, and this same engineering practice now becomes necessary to ensure that this trend of software development can learn how to safely evolve to a more balanced level of development progress. A transition from an ad hoc approach to a formalized life cycle where processes, architecture and vision for the project are already defined in advance is not an easy task [14]. In order to understand how this transition should be executed, it is necessary to look at real examples of projects that somehow pioneered the path for this thriving Ad hoc 2.0 philosophy, strongly focused on the power of community work, and attempt to understand some of the possible outcomes for the future, even at the risk of never actually moving to a state where formalized processes can be safely implemented with success.

4. Case Studies

One of the most seductive advantages of ad hoc development is the freedom to create and quickly release to the public a working version of the project, trusting that enough innovation and chaos will encourage volunteers to help and motivate the developers toward further progress. People developing a software product in their free time and publishing it on the Internet will, in most cases, instinctively be adopting an Ad hoc 2.0 lifestyle without necessarily becoming aware of their choice and its inherent consequences. It is important to promote the investigation and elaboration of studies that allow capturing and understanding some of the magic behind this seemingly chaotic group of processes, which nevertheless seems to be working because of the human factor, exactly as described previously by the CMMI. Ad hoc is therefore a human-instinct approach, a specialized management technique that is applied whenever processes seem to add excessive overhead, are unknown to developers or fail to meet a given efficiency criterion. As mentioned before, even newer software products can be released using this particular life cycle as a method to bootstrap them and evaluate their prospective popularity before engaging in a more formal development style. As examples to showcase this style of development, three competing software products designed for the Microsoft Windows platform were chosen. They all started as products created by standalone developers and later evolved into large-scale software systems, in some cases through their expansion with components originating from third-party developers without a direct relation to the development team.

4.1 nLite

nLite is a software product that allows users to customize or remove specific components from a given Microsoft Windows XP/2003 install CD or Microsoft Windows Vista install DVD. It also allows integrating updates, automating the installation process and integrating third-party programs [6]. Its development depends on a single developer, Dino Nuhagic, who released the initial version in April 2004. This product was an immediate success from the initial release. Though buggy, it would still allow users to save a considerable amount of time that was previously required to customize a given Microsoft Windows installation source.

The project's space for development discussion was provided by MSFN (http://msfn.org) – a forum dedicated to the discussion of Windows-based technology that was already popular at the time of nLite's presentation to the public. At that time, this forum averaged 342 visitors every 30 minutes. Using this site as the base for discussing and presenting incremental versions of the product, nLite grew to a user base of thousands over the initial 12 months, reaching a state of recognized quality when it began being referenced by Microsoft in the official knowledge base support channels.

Remaining a one-man project, nLite evolved into vLite as the Microsoft Windows Vista operating system was launched. When the family of software products developed by this author began increasing in overall complexity, aggravated by a significant number of new features added to each recent version, these factors began to take a visible toll on the development effort. The author decided to cease development and declared a product freeze in early 2008, at the peak of its popularity. Support for the products is still provided on a volunteer basis at the forums, but it grows increasingly outdated, as the products do not keep up with the newer Microsoft Windows 7 and other competitors are left to fill the void of a possible 7lite in the future. What happened? Many arguments can be formulated regarding the reasons why the project came to a halt. The author himself claimed the need for free time to engage in a professional software development activity. A project with increasing complexity and user requests was simply too much for a single person to support, or one can also consider that at some point, after so many years, the author lost his initial motivation to pursue development of this project. What we can infer is that in many aspects this is still a successful project today, but the fact that it remained a one-man project was eventually the weak point that allowed the development freeze to occur.

4.2 Bart's PE Builder

BartPE (Bart's Preinstalled Environment) is a lightweight variant of the Microsoft Windows XP or Windows Server operating systems which can be run from a CD or USB drive [8]. It was developed by Bart Lagerweij. The exact release date of the first version is not simple to track, but it was initially published in the early months of 2003 [9]. The product was quickly marked with controversy at the time of launch. Microsoft was holding the monopoly on a solution targeted at Windows system administrators, the new operating system platform designated as Windows PE – which was only available as a commercial product to some organizations. BartPE, on the other hand, was offered completely for free and allowed similar results without requiring a specific license from Microsoft. This free alternative to the commercial Windows PE (WinPE) from Microsoft gathered a significant number of end-users, especially since Microsoft had no legal basis to contest the existence of BartPE, which attracted considerable attention from the media. This alternative to WinPE was only deemed legal by Microsoft as long as it respected the Windows XP/2003 End User License Agreement (EULA). In subsequent years, the product passed through 3 major versions and enjoyed extensive popularity. The third and latest version of this product was released in February 2006 [8]. BartPE enjoyed support from the 911 CD Builder forum (http://911cd.net/forums), a forum that was well known at the time for hosting other freeware projects related to the customization of boot disks based on MS Windows PE and MS-DOS. Over the years, the success of BartPE eventually eclipsed the popularity of all other projects hosted on the same forum, and the forum became the official meeting point for BartPE users around the world. Unlike nLite, there was no successor to BartPE for the Windows Vista platform, and this gap was filled by a competing project called VistaPE, from Sergey Gurinovich, in early 2007. What happened? The author stopped appearing on the public forums where his project was discussed and stopped replying to email messages regarding BartPE, leaving the project in a development freeze state. Support is still provided on the public forum by volunteers, but the exposed defects and user-requested features that remain unanswered are a serious issue that undermines the product's attractiveness for users, who now tend to adopt recent Microsoft operating systems such as Microsoft Windows Vista and Microsoft Windows 7.

4.3 WinBuilder

WinBuilder began as a community spin-off from the 911cd.net forums onto a brand new forum called Boot Land (http://boot-land.net), created in mid 2005 [3] by a small group of active users. This software project proposed to solve some of the issues and feature requests that were not addressed by Bart's PE Builder at the time. The main goal of WinBuilder is similar to BartPE's regarding the flexible customization of Windows PE boot disks, with an emphasis on allowing users to modify and improve the project, building components from top to bottom according to a given need. This flexibility allows building extremely tailored boot disk projects. With the appearance of subsequent Microsoft Windows platforms, WinBuilder became a software framework for popular projects such as LiveXP, VistaPE and Win7PE. Flexibility in excess also originated negative characteristics. Since each WinBuilder project is free to define its own rules and organization, this ad hoc approach causes a considerable lack of stable working structures for developers to share code and knowledge across projects. Yet development remains extremely active, based on volunteers who keep improving and providing updated versions of this software product and the projects associated with it. Why is this working? Decentralized development is clearly present at the heart of this project. Each developer or team of developers doesn't depend exclusively on other developers, and if the development of a given project enters a freeze state, other volunteers can launch updated or even spin-off versions, providing a good degree of flexibility and progress that outlives the initial author through other people who assume these positions in the development process. However, the development processes have not yet been formalized, nor does there seem to be any plan to formalize them. Even though activity remains very visible across the several components of the project, new challenges are typically solved using an ad hoc approach.

5. Conclusion

What can we learn from these examples? All three projects shared the same concept of ad hoc and unplanned development. Supported by Internet websites frequented by masses of users from a specific target audience, these projects enjoyed a suitable platform that helped gather new users and extend the longevity of both the website and the project itself. This model of development is possible even with scarce financial resources. Developers can code these products as a hobby during their free time or as a secondary activity. Even though the inception of a project first occurs with the intent to solve a given challenge, these products can remain actively used for years to come as valuable working tools for thousands of users around the globe. As seen in the Windows PE case, Microsoft provided a commercial product to solve a particular market need but was eventually driven to rethink its business model as the freely available alternatives succeeded in gaining market attention and momentum. Later versions of Windows PE (2.x and above) are now made available for free to Microsoft Windows system administrators. This sort of community-based development is also present in open source movements, but in the particular type of case where the source code is rarely made available to others, the project's progress becomes extremely dependent on the motivation of the original author unless some balance can be reached to overcome this dependency. Software engineering could have helped the first two projects presented here to survive the test of time and continue to extend their popularity. Care about the scalability of the project to newer Windows platforms or quality assessment techniques might have helped to simplify the overall software architecture if they had been considered from the beginning of development, but unfortunately this was not the case. A single person using ad hoc to develop a software product, with a proper basis in software engineering processes, awareness of the community power provided by web 2.0 technologies, and the right amount of motivation to establish empathy with a target audience, has all the right ingredients to achieve a very satisfactory level of success in the future.

6. Acknowledgements

This paper would not have been possible without the support, feedback and help from those across the Internet who embody this ad hoc 2.0 philosophy on a daily basis without expecting more than a genuine "thank you" for their help to others. I'm grateful to have had the chance to meet some of these wise internauts over the years and even to observe from the first row how these developments can truly succeed in surviving the test of time, scale and complexity. I would like to thank in particular Jacopo Lazzari (Italy), Peter Schlang (Germany), TheHive (USA), Cemal Tolga Talas (Turkiye) and David Kummerow (Australia) for their inspiring contributions to this paper.

7. References

1. Dean F. Sutherland, "A Tale of Three Processes: Reflection on Software Development Process Change at Tartan".
2. Paul S. Adler, "Beyond 'Hacker Idiocy': The Socialization of Software Development".
3. Nuno Brito, "WinBuilder: Case Study", 2009.
4. Clay Shirky, "Clay Shirky's Writings About the Internet", March 2004.
5. Eric Steven Raymond, "The Cathedral and the Bazaar", version 3.0, August 2002.
6. Wikipedia, http://en.wikipedia.org/wiki/NLite_and_vLite, as seen on 21 November 2009.
7. Lynn Sharp Paine & Jose Royo, "Cimetrics Technology" (A1) (9-399-108), Harvard Business Online, 2 February 1999.
8. Wikipedia, http://en.wikipedia.org/wiki/BartPE, as seen on 21 November 2009.
9. Bart Lagerweij, http://www.911cd.net/forums//index.php?showtopic=837, as seen on 21 November 2009.
10. CMU/SEI, "CMMI for Development", 2006.
11. Nogueira et al., "Surfing the Edge of Chaos: Applications to Software Engineering", Naval Postgraduate School.
12. Michael Fagan, "A History of Software Inspections: Contributions to Software Engineering", Springer-Verlag New York, Inc., New York, NY, 2002.
13. Tom DeMarco & Timothy Lister, "Peopleware: Productive Projects and Teams", Dorset House Publishing.
14. Frederick P. Brooks, "No Silver Bullet: Essence and Accidents of Software Engineering", Computer, pp. 10-19, April 1987.