For more than fifty years, researchers have investigated the impact of the built environment on the health outcomes of hospital patients and the general wellbeing of the staff who care for them. Despite this significant history, our knowledge base remains incomplete and often contradictory, arguably due to a lack of long-term post-occupancy research. But why is this the case?
In any discussion on evidence-based design, there are two polar concerns to bear in mind. First, as a profession, we rarely seek the kind of detailed knowledge that can only be generated by a rigorous post-occupancy evaluation (POE) process, perhaps because of the limited performative demands or because our training in brief development methods is equally limited. Increasingly lean fee proposals certainly don’t encourage us, but it’s possible we’re also concerned about the potential legal (and emotional?) consequences of finding out that our ideas do not quite work the way we thought, and said, they would. Whatever the reason, we seem reluctant to return to our completed projects to find out if they work the way they were intended to. 1
Second, the making of architecture is, and always will be, a speculative act of discovery. This latter concept may also contribute to the first, but in the context of contemporary demands for a performative, adaptable architecture we must reconcile these intrinsically connected – and potentially conflicting – ideas, because if we are not learning about how our buildings work, on what information are we basing our future design speculation?
In Australia, the completion of a POE is contractually mandated for all government projects, usually undertaken after one year of occupation. However, the purpose of each study and the method with which it is carried out can differ from state to state. For many state projects, the primary purpose is to evaluate the project delivery methods and the degree to which the project outcomes meet the brief requirements. These studies are often undertaken by the project design team rather than by independent researchers, meaning that the data is of limited value.
Senior Director of Clinical Infrastructure at Queensland Health, Kate Copeland, says Queensland Health takes a more balanced approach than other states, using a Canadian methodology titled Building Performance Evaluation (BPE), which looks as much at the experience of staff and patients as it does at the financial bottom line, the effectiveness of the process or the degree to which the design meets the expectations of the brief. In Queensland, project architects are required to complete a review midway through the design stage but, after one year in operation, independent consultants are engaged to undertake an evaluation of the design outcomes. The results are not published in the traditional academic sense, where they would be subjected to peer review, but Copeland emphasizes the importance of providing feedback to the design team and the staff of newly completed facilities. In addition, the results are used to inform future project briefs through the evolution of documents such as the Australasian Health Facility Guidelines (AusHFG).
Sheree Proposch, a principal at Hassell and a specialist in the design of healthcare facilities, argues that hospital users and the design profession can all benefit significantly from the results of a more rigorous BPE process. At the Gold Coast University Hospital, a recent BPE highlighted the improvements single-occupancy rooms had delivered: a higher standard of patient care, because staff felt more able to discuss treatment regimes; shorter patient stays; and lower levels of cross-infection. But the BPE also found that the single rooms placed a greater burden on staff – for example, nurses had to walk further to reach the same number of patients. Proposch says the BPE creates opportunities for a more nuanced approach to design in the future. She also argues that to be really useful, BPE should ideally include not only the design outcome but also the brief intent and subsequent factors – user groups, operational considerations and contractor issues – that modify the brief.
Although these examples from Queensland suggest a broader and deeper scope of analysis, they remain short-term, project-focused studies. Ron Billard, principal of Billard Leece Partnership (BLP), agrees on the need to generate data that is specific to each completed project; however, he also advocates for a POE process that compares outcomes across a range of similar projects. Having recently completed four subacute healthcare facilities across Australia, BLP has made the unusual decision to undertake all four POEs at the same time, making a comparative analysis possible.
Ian Forbes, an experienced health facility planner and adjunct professor at University of Technology, Sydney, argues that although NSW Health also uses its POE results to influence changes in the AusHFG, there is a reluctance to release that knowledge to the design firms that are engaged in public health facility design. He also questions the value of the AusHFG as a design tool, suggesting that he had “serious concerns about their rigid imposition, which would limit the possibility for change and innovation in functional and physical solutions.”
According to Forbes, what is missing in all the rigorous evaluation methodologies developed for POE is the need for continuing and open discussion. He suggests that “dialogue among participants involved in the review is perhaps the most important aspect of the evaluation. Seeking to find simplistic methodologies that will answer all aspects of health facility design is just not possible.”
To create a truly performative architecture, one that is based on a deeper understanding of the complex relationship between people and their environment, we need more reliable information, the kind that can only emerge from long-term, coordinated post-occupancy analysis of completed buildings. However, sociologist John Law reminds us that the world is not to be understood by adopting a methodological version of auditing, because, in doing so, we fail to “make and know realities that are vague and indefinite because much of the world is enacted in that way.” 2
Perhaps part of the problem also lies in the terminology that we choose – the word “evaluation” suggesting a judgement of the past, when our primary interest is actually in shaping the future. In its place, the term “post-occupancy analysis” suggests a holistic, process-oriented approach where not only facilities but also the forces that shape them (political, economic, social, etc.) are taken into account, more accurately reflecting the life cycle of an evolving architecture-in-use.
As a profession we stand to gain from a rigorous, coordinated approach to post-occupancy research, where results are made widely available, not as a set of prescriptive rules but as a suite of evolving principles, worthy of continuing development. When we treat knowledge as an iterative, continuous process – and not a product that we aspire to complete – predictable patterns begin to emerge, but we also generate a whole new suite of questions that provoke a speculative response. Perhaps in this way we may reconcile the contemporary need for predictability in our performative environments, without diminishing what is essentially a non-repeatable, creative act of design.
1. A 2013 survey of 420 design practices around the world by Evidence Based Design Journal found that just 5 percent of firms undertake any form of POE as part of their normal practice. Where practices were primarily involved in the design of healthcare facilities, the percentage that undertook POE rose to 34 percent. ebdjournal.com/blog/general-design/the-knowledge-problem
2. John Law, After Method: Mess in Social Science Research (New York: Routledge, 2004).