Learning from Experience
I was confused and didn't know how to react when Jim Carroll, a highly regarded figure in the construction industry, proudly presented me with his "Nine Elements for Project Success," the product of two years' arduous labor. These nine elements were presented as a well-accepted model, as if they were based on solid findings, and were meant to serve as guidelines for the successful management of capital projects. To my mind, however, the nine elements were based on weak hypotheses and were clearly deficient.
Jim, who is now the President of Fluor Daniel/Morrison Knudsen, LLC, was at the time, February 1988, serving on the Construction Industry Institute (CII) Project Organization task force. The CII, a national research organization established in 1983, is located at the University of Texas at Austin. It brings together 90 owners and construction companies with 30 universities in an effort to improve the management of capital projects.
When I was first invited to join this task force, it was just after it had experienced a breakthrough. In one of its most productive meetings, the task force was able to identify the "Nine Elements for Project Success." The task force then appointed a small team of five members, headed by Jim, to produce a handbook for practitioners that would thoroughly explain each of these nine elements. I met this team at its first meeting, where Jim allowed me to peruse the nine elements in their rough form, and asked me to write a chapter on project strategy. I honestly felt, however, that I could not fulfill his request.
I just could not accept the "Nine Elements for Project Success." First, they did not cover very important areas of capital projects. Second, they were stated as "one-best-way" principles, completely ignoring the rich variety of project contexts stemming from different environments, organizations, technologies, projects, and people.
There was another issue complicating the matter for me. I was very flattered to be invited to join the CII task force, because I knew that at that time I was the only foreign scholar invited to work for the CII. It offered me access to some of the most progressive organizations in the world, and the ability to collect invaluable data from them. I came to the CII with the understanding that I was expected to conduct research for them. I did not feel that my accumulated theoretical research findings allowed me to prepare a chapter useful for practitioners.
I spent a full three hours talking with Jim. To be more accurate, I should say arguing and debating with him, often quite heatedly. At first he stonewalled me. I could not produce the tiniest dent in his rock-hard opinion. Jim stubbornly maintained that applying these nine elements was the way, and the only way, to achieve project success. After all, it took the task force two years to agree on these elements, and Jim was clearly in no mood to retreat and re-examine them. He wanted to make further progress and bring back to the task force tangible products, that is, finalized chapters of the handbook.
It was close to midnight when I realized that I would never be able to convince Jim of the weakness of the nine elements, and was almost about to give up. As a last resort, I asked him whether he would be ready to put the nine elements to a personal test. That is, I asked him to see whether the application of the nine elements could explain the success or failure of the projects with which he was involved during his career. Jim confidently agreed to put the elements to the test.
The next day, the entire five-member team received a fax from Jim in which he shared a simple analysis of the seven projects with which he was involved during his career. He rated the projects according to their success -- four projects were rated successes and three were rated failures. He also examined whether the nine elements applied to those projects. Now it was Jim's turn to be confused.
There was very little, if any, apparent correlation between project success and the nine elements. For example, some elements were applied to all seven projects, and some almost to none. Two elements exhibited especially bizarre behavior: they applied to all the failed projects, but not to all the successful ones.
Jim is an unusually courageous man. He was willing to reflect on his experience and to learn from it, or more accurately in our case, to unlearn from it, and to share his unlearning with us. He was willing to admit that the nine elements could not provide the complete answer. I was very satisfied, especially in light of the fact that now I felt I was able to conduct a study within the CII organizations before attempting to write my chapter.
But my problems with the task force were not over yet. A year later, I submitted my research report to the task force. The results, based on an elaborate study in 11 CII member companies, were quite shocking to most members of the task force. Among other things, my findings showed that in most capital projects, uncertainty is not resolved early in the life of the project, for example, at the end of project design. Even more troubling was my finding that in most capital projects, not only is the "means uncertainty" (how to do it) resolved late in the project's life, but so is the "end uncertainty" (what to do).
Since most members of the task force could not accept that capital projects suffer from uncertainty, and definitely not from "end uncertainty," they adjusted the presentation of my findings. Instead of portraying project planning as a gradual process of lessening uncertainty, they portrayed it as a gradual process of increasing certainty.
The task force's behavior at the conclusion of my study was very much in line with, and may even explain, their behavior from the outset of the study. The task force members felt comfortable in a certain world, and so they denied uncertainty, even in the face of empirical data from within their own organizations. Most of the members were not really interested in learning, even when they led a research activity. They did not formulate research questions, only research answers. It could be that the mix of contractors and clients put everyone in a marketing mode. Marketing always starts with an answer; research with a question. They hired "researchers" not to find out or understand reality, but to confirm their own beliefs.
This story demonstrates our ability to learn by reflecting on our own experiences, and our inability to learn when we promote "answering" over "questioning." Learning starts with a question: a problem to be solved, a dilemma to be resolved, a challenge to be met. Managers who treat questions as annoyances and regard them as signs of ignorance, and who prefer bold assertions and confident statements because they convey a sense of mastery, control, and certainty, are not learning. Questions force deep thinking and reflection. Questions are an invitation to open a conversation; answers are a prelude to shutting it down. Breakthroughs come from a fresh question, not a ready answer.
You can imagine that even at the end of the study I was not yet ready to write the chapter for the practitioners (you can just as well imagine that the task force did not feel an urge to ask me to do it). It took me an additional six years of research and consulting to feel that I was ready to sit down and compose a book of "guidelines" for practitioners. In today's world, with its accelerated pace of change, we can't afford to wait six years for knowledge to be shared. Even a year is often too long. By that time a great deal of knowledge is obsolete and irrelevant.
We need an on-line mechanism to generate and share knowledge. ASK should help us learn (and unlearn) by reflecting on our experiences. It will also help us to ask questions that will open our minds and enable us to share these experiences and questions very quickly with our peers. Together with other initiatives of the Academy of Program and Project Leadership, ASK will help us build an interconnected knowledge community.