The Road Not Taken: Building Physics, and Returning to First Principles in Sustainable Design

The path we are currently following towards 'sustainable design' is a result of the accidents of the past 300 years of history. If we look further back, to before the exploitation of fossil fuels, we find a very different approach to building envelopes, and to building use and comfort. This was necessarily very low carbon, and demonstrably effective, but, unfortunately, we have forgotten many of the fundamental principles on which it rested. This paper argues that our current choice of retrofit pathway is leading us away from, rather than towards, a sustainable built environment. Current efforts to reduce carbon and energy based on modern 'layered' envelopes and misunderstandings of thermal comfort are proving much less effective than predicted. We would further argue that they are too often delivering unintended consequences: contributing to the overuse of carbon and energy, and derailing the development of a sustainable built environment. We draw on research and case studies, as well as on the lessons from history, to show how the problem derives from a neglect of first-principles thinking and fundamental building physics. Equally, though, we show how combining good building physics with a re-evaluation of older approaches to construction and building use delivers some powerful and effective tools for tackling the climate emergency.


Introduction
For many, in his poem 'The Road Not Taken', American poet Robert Frost celebrates 20th-century America as a culture rooted in risk-taking and 'can-do' individualism. However, this may well be a misinterpretation: critics point out that Frost seems rather to be suggesting that the road taken may not be a deliberate choice at all, but random, and only later justified in the traveller's mind as being the 'right and proper' path [1].
Frost's insight can help us to understand why our current efforts to drive down carbon and energy in the built environment have been rewarded with remarkably little success. As this paper tries to show, designers and retrofitters are currently embracing a picture of 'sustainable' architecture that is not so much a result of choosing between options as the end result of a series of accidents of history. The road we are following is not the only pathway towards reducing carbon, nor (more importantly) is it by any means the best. As we continue to follow it, it is forcing us into actions that risk being counterproductive as we battle to reduce carbon emissions. It is already clear that our efforts to create more sustainable buildings have been delivering unpleasant unintended consequences, with little demonstrable benefit in long-term carbon and energy reduction [2][3][4].
Essentially, the accepted road is problematic because it neglects both fundamental building science and some important lessons from the past, and instead rests on a series of unquestioned assumptions and misunderstandings. In this paper, we draw on history, research and field studies to try to explain how and when these unhelpful dogmas arose, and the implications they have had for the built environment. This, in turn, suggests where the opportunities might exist for switching to a much more effective and productive pathway. This is not uncharted territory, for the simple reason that for the many centuries prior to the industrial use of fossil fuels buildings had, from necessity, to be both durable and functional with very little input of energy (and certainly none from fossil fuels). Unfortunately, most of the methods people used to make and operate durable and flexible buildings have been almost entirely forgotten in a world that, for more than two centuries, has relied increasingly on simply adding in more fossil-fuel energy whenever it has met with difficulties in design or management.

Thermal Comfort as Air Temperature: A Flawed Paradigm That Has Led to Sealing Envelopes
The fundamental issue with our current approach is that it is based on a definition of thermal comfort as a function of air temperature. Despite this oversimplification being questioned regularly from the very beginning of the era of heating and cooling [5], it is currently almost universally accepted without question that for any indoor space to be comfortable and useable, the air temperature must be controlled. This idea has been central to the commodification of comfort, where the 'perfect' temperature is meant to be provided by a space-heating or air-conditioning system [6]. In fact, as many years of excellent research across the globe have unequivocally shown, thermal comfort is a very much more nuanced concept (perhaps better framed as thermal discomfort), and one in which air temperature plays, at most, a minor role [7][8][9]. Moreover, the gains in human comfort, health and productivity promised by this approach have not been delivered. At the same time, space heating and cooling are recognised as the principal contributors to the built environment's intensive use of energy and carbon; see, for example, [10][11][12].
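The limited role of air temperature is visible even inside the mainstream comfort standards themselves: the 'operative temperature' used in ISO 7730 and ASHRAE 55 weights the mean radiant temperature of the surrounding surfaces as heavily as the air temperature at low air speeds. A minimal sketch (the room and wall temperatures are illustrative values of our own, not figures from the standards):

```python
def operative_temperature(t_air, t_mrt, air_speed=0.1):
    """Approximate operative temperature (deg C).

    Simplified weighting from the comfort standards (ISO 7730 /
    ASHRAE 55): at low air speeds, the air temperature and the mean
    radiant temperature of the surrounding surfaces count equally.
    """
    if air_speed < 0.2:
        a = 0.5   # still air: radiant and air temperature weigh equally
    elif air_speed < 0.6:
        a = 0.6
    else:
        a = 0.7
    return a * t_air + (1 - a) * t_mrt

# A room heated to 21 deg C whose cold solid walls and glazing sit at
# 12 deg C is experienced as roughly 16.5 deg C:
print(operative_temperature(21.0, 12.0))  # 16.5
```

The equal weighting is precisely why a thermostat reading of 21 °C can coexist with persistent complaints of cold in a building with chilled wall and window surfaces.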
It is also often forgotten that trying to control the air temperature in a space can easily have counter-productive effects on comfort. Put simply, heated air rises and creates draughts (Figure 1). Similarly, air conditioning requires the air to be drawn through ducting, to the great discomfort of any occupant unfortunate enough to be seated beneath an intake or outlet.

Figure 1. Infrared thermography shows heat generated by large gas-fired radiating units in an English cathedral rising to the ceiling, causing cracking of the painted wood, strong draughts through the building, and very high energy consumption. Other common issues associated with heating in churches include underside corrosion of lead roof coverings. ©Tobit Curteis Associates.
The dominance of space conditioning is particularly important for retrofit, because to give it a chance of success without huge wastage of energy, the building envelope must be sealed; and it must be sealed not merely by closing the windows, but by separating the interior from the exterior as completely as possible. Building scientists have been at the forefront of developing ways of achieving this, but it remains extremely challenging, and if poorly handled can lead to serious failures of the envelope such as condensation and the consequent deterioration of materials such as wood, lead and iron; see, amongst many examples, [13]. Sealing envelopes can also lead to problems with indoor air quality, for example high humidity, mould growth, and the trapping of indoor pollutants; see, for example, [14]. High humidity is also closely associated with thermal discomfort [15].
The technical issues resulting from sealing modern multi-layered construction are common currency for building performance assessors. It has also been well established that a particularly worrying consequence of the current approaches to retrofit has been the maladaptation of older buildings by, for example, sealing and insulating solid walls. This can cause the failure of materials and envelopes that had hitherto given excellent service over perhaps hundreds of years [16]. The French word for 'sustainability' is 'durabilité' (durability), and it is easy to see why: the most durable building will be the one with the longest usable lifespan, requiring the least energy input for ongoing maintenance and operation.
There is extensive literature on air quality and building failure, but some other critical aspects of retrofit's effect on carbon expenditure and building usability do not seem to be widely discussed in the published literature [17]. One is the through-life carbon cost of retrofitting. The energy, carbon and other resources needed to install, operate and maintain retrofit measures will have a critical bearing on the building's long-term sustainability. To assess the true impact of retrofit choices on carbon outcomes, therefore, we need to know how, why and when the retrofit materials and systems will fail in the field, and what would be needed to maintain them and to replace them when they reach the end of their life. For traditional building materials and systems, this knowledge is highly developed and freely available, but it is much less accessible for modern construction, and virtually non-existent for many retrofit materials (not least because these tend to be proprietary products). To take the example of air sealing and insulation: what do we know about the in-use durability and failure modes of materials such as housewrap? What would need to be done to repair building wraps should they begin to fail, or to replace them when they reach the end of their life? Would the interior and exterior wall surfaces of the building need to be stripped? If so, what would such a major intervention mean for building use and for overall carbon consumption?
Even more importantly, we need to find a way of balancing the carbon budget of retrofit measures that might reduce the lifespan of the building, or at least introduce an extra demand for maintenance and repair (both of which may use energy or generate carbon).
Perhaps because occupants do have some awareness of these problems (particularly for older buildings), the uptake of retrofitting in many countries remains low, even in the face of the climate emergency. Indeed, it appears that energy and carbon in the built environment may be continuing to rise [18]. It is clearly imperative that we urgently reconsider our direction of travel. However, is there an alternative route that could deliver better results?
It may help us to recall that our ancestors did not all die of cold or heat (any more than do those living in countries with limited access to high-carbon space-conditioning systems, but a strong tradition of vernacular architecture). They learnt from experience to understand the true causes of thermal discomfort, and knew exactly how those should be combatted.

Changes in the Concept of Thermal Comfort
The idea that building usability derives from air temperature is remarkably recent, and would probably have surprised even our grandparents: central heating did not become ubiquitous in Europe until well after the Second World War, and air conditioning is a still newer technology. Even fireplaces did not appear in ordinary houses in the UK until the 17th century, and for the next two hundred years they remained chiefly a means of cooking and providing light and comfort, rather than delivering warm air [6].
Until fossil fuels began to be used extensively, the key to thermal comfort was dealing directly with the causes of discomfort. The primary literature on how the body keeps itself safe as external temperatures change, or as it needs to lose heat in response to exercise, is to be found in the medical journals. There is broad agreement on the causes of heat loss, and on the approximate amounts of body heat given up to conduction, convection, and radiation:
- Some heat is lost by direct conduction into surfaces being touched by some part of the body: how much depends on the nature of the surface, and the area of the body that is touching it;
- Some heat is lost by convection into the air: if the air is still, this is no more than 2% of total heat loss. If the air is moving, and the skin is wet (for example, from perspiration), evaporation can raise this to as much as 22%;
- The primary cause of heat loss, some 60-65%, is the radiation of body heat into the surrounding surfaces [19,20].
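The dominance of radiant loss is easy to check from first principles with a rough Stefan-Boltzmann estimate (the surface temperatures, emissivity and area fractions below are illustrative assumptions of our own, not figures from the medical literature cited above):

```python
SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def radiant_loss_watts(t_skin_c, t_surf_c, area_m2=1.8,
                       emissivity=0.95, view_fraction=0.7):
    """Net radiant heat flow from the body to cooler surrounding
    surfaces, treating both as grey bodies. view_fraction discounts
    skin that radiates onto other skin (e.g. under the arms)."""
    t_skin = t_skin_c + 273.15
    t_surf = t_surf_c + 273.15
    return (emissivity * SIGMA * view_fraction * area_m2
            * (t_skin ** 4 - t_surf ** 4))

# Skin and clothing at ~30 deg C surrounded by 12 deg C masonry lose
# on the order of 120 W by radiation alone -- comparable to the whole
# resting metabolic output of roughly 100 W:
print(round(radiant_loss_watts(30.0, 12.0)))
```

The estimate also shows why a radiant break works: a hanging cloth quickly warms towards room temperature, raising the surface temperature the body 'sees' and sharply cutting the radiant flow without heating any air at all.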
These ways of losing heat were arguably much better understood at a time when people were obliged to listen to their own senses, rather than consult a thermometer. The current ubiquity of the thermometer blinds us to the fact it was not invented until the 18th century and not common until the 19th. Air temperature is now easy to measure, but it remains a poor proxy for comfort. Already in 1916 Sir Leonard Hill and his colleagues of the UK's Medical Research Committee noted [5]: "For purposes of controlling the heating and ventilation of rooms the thermometer has been used and has acquired an authority it does not deserve… it affords no measure of the cooling of the human body and is, therefore, a very indifferent instrument for indicating atmospheric conditions which are comfortable and healthy to man." Interior clothing was more substantial than we are now used to, and buildings were partitioned into smaller spaces. 'Thermal delight' was provided by hearths; and simple tools such as hot bricks, charcoal foot-warmers, and lapdogs helped to heat people directly [21].
Most importantly, the principal comfort issue-heat loss by radiation-was dealt with simply and passively by imposing radiant heat breaks between the occupants and the heat-absorbing surfaces around them. These included some that are still familiar to us, such as mats on the floor, and others, such as wall draperies, that are largely forgotten. We have certainly forgotten their important role in comfort. Cloths could be hung across the entire wall surface or just behind where the occupant was seated, and draped into canopies to cut heat loss upwards. To cut draughts, they could be hung across doors and fireplaces (although in a time of small windows, solid construction, and few fireplaces, draughtiness was not yet the serious concern it would later become; indeed, it was generally considered desirable and healthy [22]).
Wall cloths are almost ubiquitous in contemporary depictions of interiors from the earliest times until the end of the 17th century. Studying these pictures can reveal many interesting details. For example, in chapels (almost the only spaces to have large areas of glazing prior to the industrial production of glass), drapery covered the base of the windows, presumably to capture the air chilled by the glass as it fell. We do not have a name for this type of intervention: it is not 'insulation', because the heat is not passing through the walls and ceiling. However, there is some insulation effect as well, even for light fabric: studies of the impact of net curtains on windows have shown that the temperature of the curtain can be 2.5 to 3.8 °C higher than that of the glass [23].
In England, the preferred means of covering walls was with cloth stretched onto battens and painted to imitate tapestry, and these were found everywhere from the most humble homes and taverns to stately houses. The more expensive option is the best known today: the tapestry, which was the exclusive province of extremely wealthy individuals or institutions (Figure 2). Interestingly, in medieval castles it is not uncommon to find hooks for hanging woollen tapestries directly above decorative wall paintings: we know from the house records that the tapestries were hung up in winter, but taken down again in spring (when they might be at risk from condensation on the wall). Bare walls would also be beneficial in summer, when it became desirable to lose body heat.

Figure 2. Bolsover Little Castle in Derbyshire, England, which dates from the beginning of the 17th century, uses every one of the Stuart-period options for providing comfort if one was extremely wealthy. In the Star Chamber, rush mats cover the floor, the walls are hung with cloths and tapestries that also block draughts through doorways, and the windows have both secondary glazing and shutters. Other rooms are similar, or panelled in timber. ©Historic England.

Timber was used in a similar fashion to cloth. Wooden panelling provided an excellent thermal break, and depictions of scholars and artists at their desks suggest the desks often had backs and hoods of timber. The elaborate timber canopies constructed in medieval choirs were not just highly decorative, but served a very practical purpose. The cloth hangings on beds are another example of the overlap between the treatment of rooms and furniture.
Occupants and builders alike were in a good position to learn from their experiments into improving comfort, and then to pass that learning on through the guild system. In London, one of the biggest and most powerful guilds was the 'Steyners', who made the painted cloths, but others are likely to have been involved as well: the name of the guild of upholsterers, 'The Worshipful Company of Upholders', suggests it was they who undertook the hanging of cloths.

A Paradigm Shift from Radiant Loss to Air Temperature
In England, the use of cloth draperies and matting to make buildings comfortable continued without break until the end of the 17th century. At this point, a number of significant events occurred at once.
The first appears to be the Great Plague, which struck London in 1665, and lasted until the destruction of the medieval city by fire the following year. The Rebuilding Act of 1667 required houses to be built in brick or stone. This is usually seen as a response to prevent fire, but fear of the plague is likely to have played a part: the Lord Mayor's Orders of 1665 "Concerning the Infection of the Plague" specify that 'the goods and stuff of the infection, their bedding and apparel, and hangings of chambers, must be well aired with fire… within the infected house, before they be taken again to use.' [24] Certainly, depictions of 18th-century interiors show panelling, but no rugs or hanging cloths.
Secondly, in 1709, the first practical thermometer was invented in Germany, setting off a vogue for heat and temperature studies among Enlightenment scientists such as the American émigré Benjamin Thompson (Count Rumford). The pivotal point, however, as with so much change in buildings, appears to have been the exploitation of coal as a fuel, and the subsequent dramatic drop in the price and availability of energy. With little by way of radiant breaks, Georgian houses must have been uncomfortable, so people began to turn increasingly to fireplaces for relief from cold. Coal burning was dirty, though, and carried a high risk of carbon monoxide poisoning.
It was into this landscape that, in 1796, Rumford began promoting a new design for fireplaces that restricted the chimney opening, greatly increasing the updraught so that it carried away soot and fumes. The side walls of the fireplace were angled to reflect heat back into the room, and with this concept, the dominant role of fireplaces began to change from cooking to heating. Rumford was an astute businessman, and his fireplaces quickly became very fashionable indeed [25].
Unfortunately, the occupants soon discovered the negative consequences of heating the air, coupled as it was with a strong draw through the chimney: draughts became a very serious problem. It is therefore no great surprise to see that within a few decades the fashions had changed back to rooms festooned with heavy curtains and rugs. These would be swept away once again more than a century later, when the Modern Movement encouraged a renewed fashion for hard surfaces, this time made palatable by newly introduced building services such as central heating and air conditioning.

Changes in Building Envelopes
The emergence of building services is not the only important crossroad along this path: the exploitation of coal and other fossil fuels also led to dramatic changes in the materials and construction of building envelopes.
Traditional building systems operate on principles that are simple, but since they have become unfamiliar to modern architects, engineers and builders, they are perhaps worth explaining in a little detail. They are based on solid walls made of materials that are permeable, such as brick, stone, earth, timber, and lime-based mortars. Water vapour can travel between the voids in modern cavity construction, but it does not travel through permeable walls: if a vapour molecule enters the surface pores, its collisions with the pore walls rapidly cause it to condense. Of course, it may condense onto liquid water in the capillaries and then be drawn elsewhere in the wall by capillary action-perhaps even evaporating out the other side-but actual vapour movement in pores is extremely slow, and independent of the conditions outside the wall [26].
Liquid water could theoretically pass right through the interconnected pores, but in practice it does not, so long as the building is kept in reasonable condition. Sir Frederick Lea (1900-1984, head of the UK's Building Research Station) suggested that traditional construction might be compared to a greatcoat in the way it handles water. A raindrop hitting a 'greatcoat' wall will be held in the pores it hits on the surface, prevented from penetrating further by the pressure of the air it is trapping in the adjacent pores and capillaries. From the surface, it quickly evaporates again, often during the same rainstorm [27].
It is only if a raindrop should happen to hit a surface pore that connects to a capillary that is already filled with water that the rainwater will be drawn into the wall. Thick permeable walls will also resist heat transfer unless they are wet [28]. Traditional architecture is therefore characterised by features such as wide eaves and cornices, or hood mouldings and sills, that are intended to protect the bulk of the wall from rainwater entry at weak points, such as the wall heads (where gutter overflows could inject water into the bulk of the wall) or the window surrounds (where run-off from the glass could be drawn into the fabric). These protective features are often very decorative, but their primary purpose is practical (Figure 3) [29]. Builders learnt quickly from failure: Romanesque buildings, constructed before the invention of window glass, lack the protective window features that are so characteristic of Gothic architecture, with its large areas of stained glass. As windows became larger and larger, run-off from glass was emerging as a new problem [30,31]. Making sheet glass requires huge amounts of energy, so glazing was rare in domestic architecture until coal began to be used for glassmaking at the beginning of the 17th century [32].
With clear glass windows, a new problem appeared in the form of solar gain. Even in winter, this could cause thermal discomfort [33]. Again, builders responded quickly, developing the vertically sliding sash window, which allowed the finest possible control over ventilation, and could be combined with shutters to allow night flushing without compromising security (Figure 4). Another important invention was the awning, which soon developed in sophistication to allow occupants control over not just solar gain, but ventilation too. Glass was just the beginning: when fossil fuels began to be used to make ferrous metals, architecture changed even more dramatically. The technologies that allowed steel to be made using coal unleashed a storm of innovation in architecture. Steel structural elements appeared first, alongside machine-made sheet glass, and later high-energy materials such as aluminium. Building development began to centre on industrial production and research in enterprise and higher education, rather than learning 'on the job'. As a consequence, vernacular architecture-which developed in response to local materials and local climates-began to disappear.
The technology to make glass facades began with glasshouses, and architects spoke of the potential of this new type of construction for providing 'healthy' buildings in cities. Alas, the Crystal Palace (constructed in London in 1851 for the Great Exhibition) revealed inherent problems in overheating and condensation [34]. Although glass and steel continued to be used for train sheds and factories (which could not yet be lit artificially), this technology did not immediately transfer to other types of building. Early experiments by Liverpool engineer Peter Ellis in building steel-framed offices with glass-heavy facades did not prove popular, and glass and metal architecture might have remained a purely industrial phenomenon were it not for American architect John Wellborn Root, who was visiting Liverpool in the late 1860s when Ellis's building at 16 Cook Street was being constructed. In 1882, Root modified some of Ellis's initial innovations for his Montauk building in Chicago.
America provided a fertile ground in which to develop an entirely new type of construction. Two years after the Montauk building, another Chicago-based architect, William LeBaron Jenney, designed a ten-storey building with a complete metal frame. Then, in 1893, the Chicago World's Columbian Exposition brought together Francis John Plym and Edward Drummond Libbey, who would go on to be regarded as the founders of the modern metal and glass curtain wall. Plym founded the Kawneer Company in the aftermath of the great San Francisco earthquake and fire of 1906, whilst Libbey partnered with Michael Owens in 1912 to patent the world's first 'sheet glass drawing machine', making large sheets of glass commercially viable. Glass curtain walls began to appear in cities across the US.
Despite continuing problems with solar gain, these new building systems gained cachet when, in 1933, the Bauhaus school of architecture in Germany was closed by the Nazis. Many of its teachers found their way to America, where their Modern Movement aesthetic of hard surfaces of metal, glass, and concrete (another material made possible by the exploitation of fossil fuels) proved extremely popular. Eventually, it became fashionable across the world.
The First World War mobilised manufacturing industries in pursuit of a common goal, but it was the Second World War and its aftermath that led to some of the most significant and rapid changes in building materials and technologies. The aircraft industry initiated an explosive growth in aluminium manufacturing, and improvements in material quality and uniformity. New materials appeared, such as silicones, acrylics, and epoxies. These materials and related inventions such as the 'sandwich' panel allowed the curtain-wall industry to develop. As the war began to turn in the Allies' favour, the US found itself with a vast supply of surplus aluminium. Architects such as Pietro Belluschi recognised the potential of the embodied energy in this surplus, and embraced the opportunity to design buildings clad almost exclusively in aluminium. His 'Equitable Savings and Loan' building in Portland, Oregon, became "… the first to be sheathed in aluminum, the first to employ double-glazed window panels, and the first to be completely sealed and air conditioned" [35].

The Paradigm Shift from Greatcoats to Raincoats
The new envelopes were thin and light, and intended to be waterproof: Frederick Lea noted that they behaved like raincoats. Their surfaces do not have pores to hold the rain: instead, it beads and collects into flows that run down the facade under gravity. The weak points are the joints, which must be very well sealed to avoid run-off being wicked in through the skin, and it was soon discovered that the inevitable leaks were best dealt with by having more than one raincoat layer. These mass-produced, layered 'raincoat' wall systems marked a paradigm shift in the design of building envelopes.
Another problem proved even more challenging than rainwater penetration: raincoat facades trap water in both directions. Water inside the cladding cannot easily pass out through the walls to evaporate, whether that water is the result of rain penetration, plumbing leaks, or condensation. The bread and butter of modern building performance assessment is identifying these types of failures, and finding ways of remediating them.
Despite increasingly obvious issues, after the Second World War raincoat technology completely eclipsed the traditional 'tried-and-true' greatcoat systems that had been the province of practical builders. So many masters and apprentices died during the wars of the first half of the 20th century that the loss of hands-on knowledge was all but complete. By 1946, Swedish scientist C. H. Johansson would feel able to assert [36]: "It is clearly unwise to allow walls, whether of brick or porous cement, to be exposed to heavy rain. They absorb water like a blotting paper and it would be a great step forward if an outer, water-repelling screen could be fitted to brick walls." Similar sentiments remain common today, and indeed Johansson is still much cited by architects and engineers. However, it is very easy to demonstrate that neither blotting paper nor that other familiar analogy for brick, a sponge, will absorb water if they are dry: they resist water uptake until there are some water-filled capillaries to draw the moisture in.
Unfortunately, ignorance of solid-wall construction has led to what amounts to an industry in maladaptation, not least the addition of coatings that attempt to waterproof greatcoat walls. Coatings cannot keep all the water out of the wall. Indeed, because they often lead to beading and run-off, they can actively encourage rain penetration, but they greatly slow evaporation, so the moisture content of the wall builds up over time. This has two very well-known consequences: firstly, wetting of the wall (with all the resulting problems, including increased rain penetration); and secondly, powdering and spalling of the surface as salts are deposited below the coating or at the interface between treated and untreated material. Air pressure in the pores may be an additional failure mechanism [16,37].
With body heat loss through radiation as a source of discomfort now poorly appreciated, the thermal behaviour of solid-wall construction is also being misinterpreted. Sealing the envelope, however important it may be for lightweight construction relying on space heating and cooling, is highly detrimental to greatcoat buildings, but maladaptation is common, and indeed is being encouraged by the application of building models that are completely unsuited to either this type of architecture, or to its proper modes of operation [38]. To give the most obvious example, adding insulation to thick solid walls is unnecessary at best, and will slow or prevent evaporation. The end result can be counter-productive: if water gets into the wall (whether from condensation or, more likely, from leaks in plumbing or rainwater goods), moisture levels will build, and the wall will start to transfer heat far more readily.
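The thermal penalty of a damp solid wall can be sketched with a simple steady-state U-value calculation. The conductivities below are illustrative assumptions of our own (solid brick is commonly quoted at around 0.6 W/m·K when dry, with values two or more times higher when moisture-laden), and the surface resistances follow the conventional values in BS EN ISO 6946:

```python
def u_value(conductivity_w_mk, thickness_m, r_si=0.13, r_se=0.04):
    """Single-layer steady-state U-value (W/m^2 K), including
    standard internal (r_si) and external (r_se) surface
    resistances as in BS EN ISO 6946."""
    return 1.0 / (r_si + thickness_m / conductivity_w_mk + r_se)

# 450 mm solid brick wall, dry vs. moisture-laden (illustrative values):
dry = u_value(0.6, 0.45)   # roughly 1.1 W/m^2 K
wet = u_value(1.4, 0.45)   # roughly 2.0 W/m^2 K
print(round(dry, 2), round(wet, 2))
```

On these assumed figures, letting the moisture in roughly doubles the heat loss through the wall, whereas allowing the wall to dry again restores its original performance without any added insulation, which is the crux of the greatcoat argument.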

The Commodification of Comfort
As the old ways of dealing with discomfort were forgotten over the course of the 20th century, air heating and cooling began to be a common feature of buildings. Raincoat architecture, particularly the curtain-walled skyscrapers, relied ever more heavily on the fledgling building-services industry, which provided (amongst many other requirements) lifts and electric lighting. In 1921, Willis Haviland Carrier patented his 'centrifugal chiller', the first practical approach to controlling the humidity and temperature of the air. Eventually, Carrier's technology gave birth to the idea of "comfort cooling", and a Heating, Ventilation and Air Conditioning (HVAC) industry that would position itself as a very timely means for making the ever-larger spaces behind flat-glass curtain walls livable [38]. Solar gain was no longer tackled with awnings, but with air conditioning, and then (when this proved insufficient) with air conditioning plus internal blinds. It is ironic that, in many glass-walled buildings, the blinds are almost always drawn to make internal conditions bearable.
With energy cheap, the wastefulness inherent in air conditioning was not yet considered to be a problem, and neither were its impacts on neighbouring areas, including the contribution it was making to urban heat islands. Most worryingly, perhaps, the centralised control of air temperature (whether by heating or cooling) meant that comfort became a commodity to be purchased, rather than something to be achieved by occupants reacting to the quirks of their own building, and to how they were using it [39,40].
A homeowner or a facilities manager can now buy a system that promises to make occupants perfectly comfortable by keeping the air temperature in a narrow band. However, setting aside the technical challenges of this (especially in spaces with partitions and furniture), the ideal temperature will inevitably be different for different occupants, and even for the same occupants at different times. Radiation of body heat and solar gain both play a large part, but so does the level of activity. If the atmosphere is damp, it will feel much colder in cold weather and much hotter in hot weather, regardless of air temperature [5].
With so many factors contributing to thermal comfort, it is not to be expected that any space heating or cooling system could possibly deliver perfect comfort to everyone using the building. However, having been sold a dream of comfort, the common response of occupants to discomfort is to assume they are not running the system hard enough, and so to override the controls and adjust the thermostat. One can speculate whether this is the underlying cause of the well-known 'rebound effect'; see, for example, [41].
Despite the many limitations of space-conditioning, it has been marketed extremely successfully, even to occupants who are apt to complain about the results [33]. This is underlined by the responses made by readers to a recent New York Times article questioning the wisdom of near-universal air conditioning in the US [42]. Many respondents claimed the hottest parts of America would be uninhabitable without it, even though the article had pointed out that the take-up of air conditioning in still hotter climates is currently very low. With the market in the US now nearing saturation (well over 90% of buildings have air conditioning), manufacturers will be looking to expand into new countries [43]. Of the yearly increase in world energy demand, 21% is currently attributed to the increasing use of air conditioning [44,45]. With the climate now warming rapidly, this prospect becomes even more alarming.
In Europe (where air conditioning is becoming more and more 'standard'), a similar narrative surrounds central heating. The narrative of 'fuel poverty' supposes that the health of people unable to heat their houses to certain air temperatures will suffer, although the evidence linking deaths with indoor air-temperature extremes shows more deaths from heat stress than from cold; see, for example, [46]. The narratives of fuel poverty, energy efficiency, and climate change have become unhelpfully entangled, leaving occupants confused about how best to reduce energy consumption and carbon outputs.

Where Has Our Current Road Taken Us?
We have now travelled so far down our post-industrial road that we have arrived in the somewhat bizarre situation of assessing traditional construction not on its own merits, but on how much it differs from contemporary norms. Older buildings are stigmatised as 'hard-to-treat', or energy-hungry, despite the evidence of several thousand years of proven effectiveness in a low-carbon, low-energy environment, and the well-attested problems of modern construction failing to deliver promised energy efficiencies.
Nonetheless, traditional greatcoat construction remains an excellent and robust envelope system. It is difficult to imagine a more sustainable domestic building than a thatched cob cottage in wet, cold Devon in England. The thick solid walls (earth mixed with straw on a stone plinth) are excellent insulators and extremely durable, and were made with local earth and human labour. The thatch (made of locally grown straw) is also superbly insulating, and can be maintained simply by regular re-ridging and "spar coating" (replacing only the deteriorated outermost layer of thatch, which means the building never has to be roofless, even for a short period). Maintenance of the walls consists of little more than regular whitewashing or mud rendering, and glazing is minimal [47]. The building itself is flexible, and can be changed and extended with relative ease to suit changing uses. Inside, thermal comfort can be provided as it was originally by cloth hangings or wooden panelling, and mats on the floor, supplemented as necessary with elements to heat the people rather than the air. The introduction of plumbing has brought new challenges, but these are relatively easily dealt with by approaching installation and maintenance with sufficient care.
Many such buildings are more than 600 years old, and indeed they have no inbuilt obsolescence. Kept maintained, they could survive indefinitely. Even climate change should have little impact: very similar buildings are constructed in Sub-Saharan Africa, where the thick earth walls provide superb insulation against the heat.

The Situation Today: Collecting Points for Sustainability
As building practitioners in a world faced with a climate crisis, we are often drawn to new products and materials not just for the aesthetic options they offer, but also for the performance cited by industry: "thinner, lighter, more cost-effective and energy-efficient". This approach to design and retrofit is strongly reinforced by the promotion of points-based building rating systems such as the Leadership in Energy and Environmental Design (LEED®) of the US Green Building Council (USGBC), or the UK's Building Research Establishment Environmental Assessment Method (BREEAM).
Where is collecting points leading us? Current rating systems threaten to replace first-principles thinking with checklists, and, until recently, most have largely ignored truly quantifiable, outcome-based design.
In the US, early implementation of the LEED® green building rating system created new, and perhaps unanticipated, challenges around the durability and performance of materials. A good example of unintended outcomes is given by a commercial property that was among the first to be awarded a LEED® Platinum rating (the highest possible). The building was designed and constructed almost entirely of rapidly renewable and recycled materials, and remains a fitting reflection of the mission of the owner, a not-for-profit environmental advocacy group. A particular feature was the engineered structural members being expressed on the exterior of the envelope (Figure 5). Unfortunately, the selection of materials, their detailing, and their exposure in a coastal region created unforeseen challenges for long-term maintenance and care (Figure 6) [48]. The intentions had been good, to be sure, but the outcome was a building uniquely vulnerable to water penetration, and one that proved extraordinarily difficult and costly to repair. By the time substantial completion was reached, uncontrolled rainwater penetration was already widespread. After only ten years of service, we found significant decay in the structural members. To allow the compromised members to be removed and replaced with the least impact on day-to-day operation, it was necessary to construct an externally applied temporary structural steel framing. The carbon and energy costs of these problems, and of their solutions, will have had a significant impact on the sustainability of the building, as well as its maintenance demands and lifespan.
Similar issues are evident when existing buildings are retrofitted to meet new targets, in one infamous case with truly tragic results. The 2013 BREEAM Pre-Assessment report for the refurbishment of Grenfell Tower in London suggested that the proposed refurbishment project could potentially achieve a BREEAM rating of 'Good', with the highest score (14.83%) achievable under 'energy' [49]; the project was given an overall score of 69% for the materials chosen. The renovation was completed in 2016, but on 14 June 2017 the tower was completely destroyed by fire, with the loss of 72 lives (Figure 7). At the time of writing, the enquiry into the complex causes of the spread of the fire was ongoing. It is notable, however, that the material selection, placement and detailing, which were intended to optimize climate-specific heat, air, and moisture transport performance across the building envelope, appear not to have taken full account of the system's combustibility and potential reaction to fire.
The clear lesson is that materials and design must always be considered holistically. The results of the investigation will reveal more, but the tragedy has already taught us that, as stewards of our built environment, accountability during sustainable design and construction is both warranted and necessary [50]. Failing to return to the first principles of building science and to take a holistic view of the building had devastating consequences. We sympathise with the determination of the Grenfell survivors to ensure that the lessons of the fire lead to changes both in retrofitting and the design of new buildings.

Where Do We Need to Be? Developing a New Roadmap
While we cannot unring the bell of history, we need to be more acutely aware that large-scale fossil fuel exploitation has shaped our built environment in curious ways. There is no doubt, either, that the architecture of the future will need to be more environmentally conscious, low-carbon, and sustainable in the broadest sense. If we are to design safe buildings for a zero-carbon future, we must seek to question our current ways of thinking about building envelopes, building services and building use.
In the light of the climate emergency, it is clear we are well overdue for another paradigm shift: one that will finally allow us to reconcile our commoditised present with the many lessons afforded by pre-industrial and vernacular construction.
Can we move away from the road we have been following so blindly, and which has proved so singularly wasteful and damaging? Faced with the challenge of rethinking what we do, right down to its first principles, how do we make the best use of the lessons of history, good and bad? For example, returning to some form of solid-wall construction should allow us to build much longer-lived buildings that do not fail when sealants come to the end of their lives, and that need little if any additional insulation. We should not seek to mimic or recreate what has been lost, but rather to find new ways of integrating the tried-and-tested building principles developed before the Industrial Revolution with the best things we have learnt since.
A major stumbling block to achieving a better built environment has been the loss of effective feedback loops; there has been nothing to replace the guilds and apprenticeship systems that served communities so well for so long. Despite being recognised as highly desirable, Post-Occupancy Evaluation (one of the few structured ways of learning from real buildings in real operation) remains very rare. We have shifted towards a reliance on standards, and no longer have a robust way to learn quickly from our mistakes, or to use that learning to refine future practice. It is tempting to simply recommend that better practice be led by professional and vocational organisations, but care would be needed not only to deal with overlapping areas of interest, but perhaps more importantly with the many gaps that are currently not dealt with by any recognised profession. Building performance evaluation, which does cover all the ground from building to occupation, does not yet have a clear structure for training or passing on knowledge.
Currently, we are relying on codification and measurement for passing on practice, and it is here that we would argue our immediate efforts as building scientists must concentrate. One of the reasons for the unnatural dominance of air temperature is that it is so easy to measure, and the same argument could be advanced for the obsession with R- and U-values. Can we find ways of assessment that take us closer to what it is we really want to know: that is, whether the occupants feel comfortable?
Assessment would need to incorporate not only physical factors (such as loss of body heat by radiation), but also the sense of comfort derived from having immediate control. It will need to take account of the interconnectedness of many sources of discomfort, and also of the measures that could be put in place to deal with that discomfort. There will inevitably need to be strong inputs from medicine and from sociology. To take the simple example of an awning: if we restrict our analysis to the impact the awning has on the air temperature of the room by reducing the heating of the glass (and the radiation of that heat), we will fail to appreciate its many other direct and indirect benefits, including preventing the direct solar heating of occupants and the surfaces they are working on, beneficial changes in air circulation, and (not least) the subtle benefits accrued by handing control to the occupant. Perhaps a better marker of success, at least in the early stages of our new journey, would simply be a reduction in the building's demand for space heating and cooling.
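One modest step beyond air temperature already exists in the standard comfort literature: the operative temperature, which combines air temperature with the mean radiant temperature of the surrounding surfaces, and so captures radiant as well as convective heat loss from the body. The sketch below uses the simple low-air-speed approximation (the equal weighting adopted, for example, in ASHRAE Standard 55); it is offered purely as an illustration of the principle, not as a complete comfort model, and the example figures are invented:

```python
def operative_temperature(air_temp_c: float, mean_radiant_temp_c: float) -> float:
    """Approximate operative temperature (degrees C) at low air speeds,
    taken as the simple mean of air and mean radiant temperature."""
    return (air_temp_c + mean_radiant_temp_c) / 2.0

# A room with warm air but cold, radiating solid walls: judging comfort
# by air temperature alone (21 C) overstates how warm occupants feel.
print(operative_temperature(21.0, 13.0))  # → 17.0
```

Even this crude measure makes the awning example legible: shading the glass raises comfort by lowering the radiant term, without the air temperature needing to change at all.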
Lifecycle analysis (LCA) is another critical component of sustainability, and this must include not only the carbon and energy used to install and run (say) an air-cooling system, but also that used to maintain and repair it, and to decommission and replace it when it reaches the end of its life. These days, many building services are so deeply integrated into the building fabric that efficient maintenance is very difficult; this further reduces what is already a short lifespan for the equipment. Ease of maintainability, durability, and maximum lifespan must be key criteria not just for selecting materials, but when designing complete mechanical services, retrofits, or buildings.
When calculating LCA, it is vital to clearly distinguish between those mechanical services that are integral to the basic use of the building and should therefore be considered as part of the building fabric (for example, lifts and pumps for high-rise structures), and those that are ancillary, such as space heating.
LCA presents a serious challenge for anyone attempting to extract very accurate carbon costings, especially in light of the variations in installation and future care that can greatly affect longevity, but also because so many products are proprietary. On the other hand, because consistency is unlikely, trying to be very precise may in fact produce unreliable results. Fortunately, to make choices that do reduce carbon, we need to know only enough about the carbon costs of options to be able to triage them into broad categories: green (little or no carbon involved in creating or using, and little or no risk to the longevity of the building); red (significant carbon inputs or significant impacts on the building); or amber (difficult to label as 'red' or 'green' without further investigation). Interventions using materials obtained locally, with long lifespans, can usually be quickly labelled as green, whereas interventions involving high-carbon materials or manufacturing processes, or significant transport, will probably be red. We are then free to concentrate on the more ambiguous amber options. There, too, the aim should be to undertake just enough research to be able to determine whether the option is green or red.
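The triage described above is deliberately coarse, and can be expressed as a simple decision rule. The sketch below is purely illustrative: the attribute names, categories, and example options are our own assumptions, not published criteria:

```python
from dataclasses import dataclass

@dataclass
class RetrofitOption:
    """A candidate intervention, described in deliberately coarse terms.
    Each attribute is 'low', 'high', or 'unknown' (all hypothetical labels)."""
    name: str
    embodied_carbon: str  # carbon involved in creating and using the intervention
    fabric_risk: str      # risk posed to the longevity of the building

def triage(option: RetrofitOption) -> str:
    """Classify an option as 'green', 'red', or 'amber' per the scheme above."""
    if option.embodied_carbon == "low" and option.fabric_risk == "low":
        return "green"  # little carbon, little risk to the building
    if option.embodied_carbon == "high" or option.fabric_risk == "high":
        return "red"    # significant carbon inputs or impacts on the building
    return "amber"      # needs just enough further research to resolve

# Hypothetical examples: a locally sourced render vs an imported product.
print(triage(RetrofitOption("local lime render", "low", "low")))            # → green
print(triage(RetrofitOption("imported foam insulation", "high", "unknown")))  # → red
print(triage(RetrofitOption("proprietary membrane", "unknown", "unknown")))   # → amber
```

The value of a rule this simple is that it is transparent and quick: further investigation is only commissioned for the amber residue, and only until an option can be pushed into green or red.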
An important aspect of this is that any LCA assessment must include lifespan and whole-life costs over the long term, and all building professionals must get used to thinking of 'the long term' as centuries rather than decades (as building conservators have always been obliged to do when planning conservation and repair).
We will also need to find ways of incorporating the carbon and energy benefits of avoided costs. For example, if using radiant heat loss breaks and local heating in a building allows us to reduce-or, even better, eliminate-space heating, then we will also be able to avoid thermally sealing the envelope to prevent the loss of conditioned air. We will thus be able to avoid not simply the carbon costs of the air heating itself, but also those of the associated retrofitting.
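The avoided-cost argument can be made concrete with back-of-envelope whole-life arithmetic. All the figures below are invented solely to illustrate the bookkeeping: the point is that eliminating space heating removes not only its own running carbon, but also the embodied carbon of the sealing retrofit it would have required:

```python
# Hypothetical whole-life carbon (tonnes CO2e over the assessment period)
# for two strategies for the same building. All numbers are invented.
sealed_envelope_strategy = {
    "insulation_and_airtightness_retrofit": 12.0,  # embodied carbon of works
    "space_heating_over_life": 40.0,
    "plant_maintenance_and_replacement": 8.0,
}
local_heating_strategy = {
    "hangings_and_radiant_heat_loss_breaks": 2.0,  # embodied carbon of works
    "local_heating_of_occupants_over_life": 15.0,
    "maintenance": 1.0,
}

sealed_total = sum(sealed_envelope_strategy.values())
local_total = sum(local_heating_strategy.values())
print(sealed_total, local_total)                  # → 60.0 18.0
print("avoided:", sealed_total - local_total)     # → avoided: 42.0
```

Note that the avoided total includes the retrofit line item itself: a cost that never appears in an appraisal that treats space conditioning as a given.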

Conclusions
Any road directed to a truly sustainable future would lead unambiguously to buildings that are long-lived, good for their occupants, and frugal in their energy and carbon demands. The route we have been travelling for so long, and so hopefully, may have been paved with good intentions, but its direction is debatable, and it raises some serious questions, including:


- How is whole-life costing being calculated in new construction for exterior envelopes, and for retrofitting? What are the modes of failure for these interventions? What impact do they have on material durability and performance? Are they always safe, or even desirable?
- Do some retrofit measures decrease the durability of the original construction? If so, what are the consequences for total energy and carbon use over the longer term?
- Are older, vernacular buildings being painted as the villains of climate change simply because we have been comparing apples with oranges? Many important functional elements that originally allowed them to operate have been stripped away. If these were returned, and occupants taught their purpose and how they should be used, would it be possible to reduce or even eliminate space heating and cooling once again? Without space conditioning, there is no need to try to make these buildings airtight or more thermally massive than they already are, avoiding the consequent problems for the fabric and the indoor air quality. How would their energy use and carbon output then compare with those of contemporary buildings, especially over the longer term?
As this paper has tried to demonstrate, history provides many lessons and examples for achieving a low-carbon built environment. What steps do we need to take to build these into modern low-carbon design?
In terms of research, we should look more attentively through the historic records and at vernacular construction, seeking to understand the functional reasons behind peculiar design features. This should not stop at envelope design: researchers need to be on the lookout for any original fixtures, fittings, furnishings, and patterns of use that enabled these buildings to be effective in a low-carbon environment. From this, we should gain many more useful but forgotten tools that could easily be adopted and adapted, as well as shedding light on our own assumptions.
Assessment is another key issue. We will need to develop reliable methodologies for quantifying thermal comfort (or perhaps discomfort) that take proper account of the occupants and the way they are using the building. Psychology can be a powerful tool. If occupants do feel that radiators are more effective when painted red, or have less need for heating when shown a film of a roaring fireplace, then we would be foolish not to take full advantage of this as part of a low-carbon strategy.
We also need to find robust processes for assessing the through-life carbon costs of both construction and retrofitting, which incorporate lifespans and maintenance needs. Accuracy is probably impossible, and perhaps undesirable: the aim should be a process that allows us to quickly and transparently triage competing options [51]. In the past, knowledge was developed by trial and error, but we no longer have the luxury of the time that requires, nor do we have the building-skills training systems, such as the Guilds, that once supported it. Building science can and must step into this breach. Building scientists need to find ways of understanding and assessing impacts that are much more pertinent, albeit possibly more difficult to measure, than air temperature or U-values. Equally, we need to continue developing LCA, and striving to agree simple assessment methods that can support rapid common-sense decision-making.
Climate change is urgent, and the primary remit of building scientists must be to work together to develop the tools the sector needs as quickly as possible. The related responsibility will be to disseminate these tools. We have struggled to pass on to design and construction professionals even the most basic knowledge of building performance, and this must change.
Communication is no less important than research, and it is perhaps even more challenging. How do we influence not only building professionals, but owners, occupants, and policy makers, in a world that no longer has many effective systems for acquiring and passing on best practice? Part of the trick surely lies in making potentially challenging new messages compelling and coherent, and presenting them in ways that make clear sense to our audiences.
Finally, we must disentangle current building practice, which is fraught with problems that we have only alluded to in this paper, but which have a strong bearing on the current situation. For example, the appetite for proprietary building products (often new and untested) is encouraged by a desire for guarantees and warranties, in the hope of transferring risk and providing assured outcomes. Clearly, this is wishful thinking, but it has been a very real barrier to the uptake of traditional materials and systems in modern construction, despite their in situ behaviour being well understood. Siloing of expertise is another problem with very similar underlying causes.
Breaking down these and similar roadblocks will take time and determination. To facilitate that process, we suggest that a few essential aims be embraced by the sector:
1. The training and education of building professionals at all levels should be cross-disciplinary, and begin with the fundamentals of building science. Teaching must cover greatcoat as well as raincoat architectural systems: not only to ensure better outcomes when we are repairing or refurbishing older buildings, but also to give professionals the confidence to draw on a much wider range of building materials and systems when designing sustainable new buildings;
2. We must seek to identify and question all our received wisdoms, especially those concerning thermal comfort and sustainability, seeking deep answers to the question of what is really needed to make buildings useable, agreeable, and low-carbon;
3. Climate-driven vernacular design must be re-evaluated as a source of lessons to be drawn on not just for the design and construction of new buildings, but also for their operation;
4. We need to be sure the buildings we design or refurbish do not have embedded requirements for carbon-intensive services. Sustainable buildings are those that can be readily maintained and run for very long periods at minimal carbon cost;
5. Rather than expecting building services to overcome fundamental shortcomings in the envelope or in the conception of building use, we must seek to design buildings that are inherently low-carbon in both construction and maintenance, and would be able to function in a low-carbon manner in both current and future climates for a wide range of occupants;
6. We must also reconsider the familiar ways we approach services: not just heating and cooling, but lighting, water supplies and sewage. As well as critiquing current practice, this means being much more clear-sighted about the true needs of occupants. We need to work out the best ways of addressing those needs in a low-carbon manner, and we need to be able to communicate this best-practice advice to policy makers as well as building professionals;
7. The unhelpful 'old and new' dichotomy should be abandoned, along with pejorative but ill-informed labels such as 'hard to treat'. Instead, buildings should be assessed on their success in fulfilling the real demands placed on them by real occupants, and on their actual outputs of carbon. This means turning away from an excessive reliance on theory and modelling, and towards field assessment. It is also likely to mean a more broadly applied and creative consideration of what is meant by such terms as 'comfort' and 'passive control'.
Taking these first steps will not be easy, but they will set us firmly back on the road towards a sustainable future.
Authors: A Principal and Director with Wiss, Janney, Elstner Associates, Inc. and Wiss, Janney, Elstner Limited in London, Dan Lemieux is a registered architect. He is a leader in the development of international standards for Building Enclosure Consulting and Commissioning, and serves as a Chair on ASTM Committee E06, Performance of Buildings. Dan has more than 25 years of professional practice in commercial building design, including hands-on building envelope assessment, repair design, performance testing and global supply-chain technical support for products and materials sourced from the Americas, UK, EU, UAE, Canada and China.
A technical specialist at Historic England, the arm's-length body advising the UK Government on all aspects of the historic environment, Robyn Pender is also a Commissioner on the planning body for English Cathedrals, and serves on the Environmental Working Group of the Church of England and the editorial board of the Journal of Architectural Conservation. A physicist with a postgraduate degree in wallpainting conservation, her doctoral research examined moisture transport in building materials. She specialises in building performance and the impacts of climate change, two strands which are becoming ever more closely entwined.
Author Contributions: This article was written as a collaboration between the two authors, and reflects their experiences working as building performance specialists in the US and UK. All authors have read and agreed to the published version of the manuscript.
Funding: This research received no external funding.