Article

Science, Open Communication and Sustainable Development

1 Science Commons, c/o Creative Commons, 71 Second Street, Suite 300, San Francisco, CA 94105, USA
2 Oak Ridge National Laboratory, P.O. Box 2008, MS-6038, Oak Ridge, TN 37831, USA
* Author to whom correspondence should be addressed.
Sustainability 2010, 2(4), 993-1015; https://doi.org/10.3390/su2040993
Submission received: 1 February 2010 / Revised: 19 March 2010 / Accepted: 22 March 2010 / Published: 13 April 2010
(This article belongs to the Special Issue Advanced Forum for Sustainable Development)

Abstract

One of the prerequisites for sustainable development is knowledge, in order to inform coping with sustainability threats and to support innovative sustainability pathways. Transferring knowledge is therefore a fundamental challenge for sustainability, in a context where external knowledge must be integrated with local knowledge in order to promote user-driven action. But effective local co-production of knowledge requires ongoing local access to existing scientific and technical knowledge so that users start on a level playing field. The information technology revolution can be a powerful enabler of such access if intellectual property obstacles can be overcome, with a potential to transform prospects for sustainability in many parts of the world.

1. Introduction

Sustainable development across the regions, systems, and populations of this world is a path, not a state [1]. The challenges to sustainability evolve constantly, as changes in conditions and driving forces emerge with little notice. Sustainable development is therefore a process of adaptation, usually at a local scale—geographically, sectorally, and/or socially. In fact, the world as a whole depends on ideas generated in these local contexts as a source of options for us all.
The problem is that, more often than not in developing country contexts, individuals with world-class intelligence and innovativeness—the human resources to assure local adaptation—are simply not well-informed about the knowledge bases that would help them to understand and adapt alternatives available to them now. Maybe even more importantly, they are not well-informed about knowledge bases that would help to discover and develop alternatives that are currently not available either to them or to others. Knowledge is the foundation for discovery and innovation as well as for coping.
This paper discusses open communication of scientific information as an essential element in assuring global sustainable development, and it describes several approaches for opening access throughout the developing world.

2. Sustainability and Innovation

Sustainability—at scales from global to local—depends on the diffusion of knowledge for at least two reasons. One reason is that sustainability is a challenge of such enormous magnitude that it needs to attract the best talent across the world, including human resources in areas not now in a position to be well-informed about needs, alternatives, and opportunities to develop better options than those currently available. The second reason is that sustainability rests considerably on bottom-up, grassroots-based actions, reflecting local contexts that differ so greatly that what makes sense in one situation is seldom what makes sense for most situations. For example, the Millennium Ecosystem Assessment found that in some cases regions that appeared ecologically stable contained localities that were highly unstable, and regions that were generally unstable contained localities that were quite stable [2]. As a second example, research on potentials for adapting to impacts of climate change shows that benefits and costs are shaped profoundly by local contexts: geographically, sectorally, and socially [3].
Realizing the full potential of distributed talent in all parts of the world depends fundamentally on assuring that the talent knows the current state of science and technology, again for two reasons. The first reason is that knowledge informs what the talent does and keeps it from wasting time reinventing the wheel. The second reason is that knowledge attracts talent by posing interesting puzzles and offering opportunities for learning.
We will first consider what is known about processes of innovation and their relationships with knowledge generation and transfer. We will then summarize current views about how the movement of knowledge into action is one of the central challenges for sustainability.

2.1. Theories of Innovation

2.1.1. The Foundations of Innovation: Pre-Internet Theories and Technology Transfer

Our examination of contemporary innovation theory starts with the work of Schumpeter, which placed the entrepreneur in the key role of “creative destructor” who sweeps away old, inefficient systems (after gaining access to capital) and actively destroys old firms and organizations. This concept of innovation embraces not only the invention of new products, which is the realm of traditional patented intellectual property, but also business models, organizational structures and processes [4]. Schumpeter later developed a second theory of entrepreneurship in which he extended the concept of “entrepreneur” to large organizations with resources to invest in all phases of the process.
In both of Schumpeter’s visions of innovation, access to monopoly rights is a key lever for success. Patents on products are particularly important in both, although the entrepreneurial vision is much less dependent on monopolies overall than the corporate vision. The first, entrepreneurial Schumpeterian vision is now realized in faster and faster cycles, while the ability to connect networks of actors with the capital and manpower to innovate at scales previously achievable only under a corporate umbrella fundamentally changes the structure of innovation.
After Schumpeter came a long period in which science policy and funding in the US were shaped by Vannevar Bush, who after World War II articulated a vision of government funding for basic science directed by scientists without significant intervention, a vision connected post hoc to the Schumpeterian idea that basic science should be converted into products by companies. The Bush vision neatly foreshadows the key legislation encouraging technology transfer from US universities to private entities for commercialization, with its emphasis on scientific freedom in research and strong patent rights, and in many ways it still drives US innovation theory, or at least the traditional side of US innovation theory [5].
The Bayh-Dole Act of 1980 is the legislative “bridge” between basic science funded by government and the view of the firm as the engine of innovation. Created in response to the perception that government investment had become unbalanced in terms of the returns reaped by the public, Bayh-Dole created a mandate for universities receiving government funds to seek to commercialize inventions resulting from those funds. While the actual language is quite broad (and on one kind of reading might even echo Schumpeter’s broad vision), in practice the law has come to represent a mandate for public universities (and other non-profit institutions) to file patents and attempt to license those patents, often aggressively.
Bayh-Dole is hotly debated [6,7]. It is vigorously defended by powerful lobbies in both academic technology transfer and industry, although the technology industry (Hewlett Packard and others) is beginning to actively push for reform. The legislative concept is “going viral”—versions of the act have passed in all countries of the EU, Brazil, China, and more. Arti Rai has been a leading scholar on the impacts of the legislation, both in terms of its potentially negative effect on the progress of biomedicine and the impact of naively porting Bayh-Dole style legislation to developing countries [8].
A newer, but still complementary and traditional, view of innovation is popularly known as Sabato’s Triangle, developed in South America in the late 1960s [9]. The three points of the triangle are government, the science-technology infrastructure, and the production infrastructure. The inclusion of the government at the peak of the triangle indicates a shift from the disengaged funder role, but there is also a key shift with the universities taking on a more industry-like role, developing technologies on top of basic research. This is also the beginning of treating innovation, and the role of the triangle within it, as a key aspect of international development. Modern versions of the triangle replace the S&T infrastructure and “production” with simply “academic” and “corporate” points, but the original role of the infrastructure resonates more deeply today as we look at the impact of information and communication technologies (ICT) generally as an enabler for innovation.

2.1.2. Contemporary Theories of Innovation: The Rise of Networks and Crowds

Central to the older theories of innovation is the idea that success in breakthrough research and development was based on control—of ideas, of knowledge, of data, and most importantly of the intellectual property rights that governed the copying and use of the ideas, knowledge, and data. For most of the 20th century, this control approach worked. Large, centralized laboratories sprang up out of government research via the National Laboratories system in the United States, and out of massively successful technology companies like Bell, General Electric, and Xerox.
However, many of the most important innovations to emerge from Bell Labs or the Xerox Palo Alto Research Center (PARC) were completely missed as innovations by laboratory management [10]. It was not until those innovations spun out into new places and new companies, funded by private venture capital and staffed by a new class of mobile knowledge workers [11], that the computer mouse and the concept of windows-driven computer “desktops” made the leap from concept to reality. The advent of mobile workers and private capital has been augmented by personal computing and digital networks, creating a remarkable cycle of innovation in technology and culture.
We now sit at a point when the small firm and the individual are capable of playing much larger roles in innovation cycles alongside massive companies than was ever possible inside an industrialized culture. So much know-how has been embedded into widely available tools, from computer-aided design for architectural work (like Google SketchUp) to object-oriented software packages (like the wildly popular Ruby on Rails programming environment), that the average individual can design and manufacture innovations in consumer goods, technologies, and even genetically modified bacteria. The innovation system itself is being subjected to radical innovation, although the funders and institutions involved in the traditional research and development culture are not only slow to catch on to the change but in some cases actively serve as drags on the newer forms of innovation.
Three newer theories of innovation—open innovation, user-driven innovation, and distributed innovation—each deeply rooted in the transformation of invention and innovation wrought by information technologies, merit close attention in this study. The theories emerge from studying the responses of the firm and the individual to the rise of networks, and attempt to understand, explain, and predict the new kinds of innovation possible in the networked world.
Open innovation (OI) is closely associated with Henry Chesbrough [12]. OI explores the potential of an institution (i.e., the firm or the university) in a networked environment. Most of the smart people, as Joy’s Law notes [13], have always worked somewhere else. But in a network culture there is the opportunity to connect more and more of those smart people to an institution’s mission: to contribute to internal projects from the outside, to take a project that fails to gather internal support forward using outside funding, to generate novel projects outside and “spin into” new internal projects. All of this becomes possible as the transaction costs of moving the required knowledge drop.
Open innovation depends on fundamental knowledge “leakage” from the institution. This leakage comes from many sources, depending on the kind of institution: publication in a scholarly journal, a patent, a conference presentation, or, more recently, a blog post or wiki edit. The core insight of open innovation is the ability to use the world outside an institution to generate internally useful knowledge—and the core dependency of open innovation, in turn, is the need to make the flow of knowledge in and out of an institution a purposeful thing, not a random process.
The last key element of open innovation theory is the importance of the business model. A world of purposeful information flow is at odds with many of the business structures of the last 50 years—especially intellectual property rights. Copyrights govern the copying, distribution, and reuse of the documents containing actionable knowledge, from software to scholarship. Trade secrets and knowledge leakage on the public web are completely at odds with one another. And patents prevent institutions from acting on useful knowledge, even if the action would be far afield from the business concerns of the patent owner.
The standardization of the network is, so far, a technical one. It enables the emergence of the open innovation paradigm but does not yet enable the horizontal spread of open innovation, because each company must develop its own bespoke strategy for assembling the set of technologies (and rights to use those technologies) needed for each situation. There is not yet the legal standardization to allow recombination of technologies with freedom to go to market, nor the semantic standardization to allow easy discovery of relevant technologies via a search engine or online marketplace, that would be required to realize the full potential of the open innovation theory.
The second major theory of modern innovation is generally known as “user driven innovation” or UDI, and is associated with Eric von Hippel [14]. UDI focuses on the individual innovator, who tends to be an end user with an idea to improve a product. In UDI, the innovation comes from being “close” to the problem; alternatively, we say the knowledge required to innovate is “sticky” and doesn’t move far from the user.
Von Hippel’s classic text in this space is Democratizing Innovation, and the title is a good indication of the basics of the UDI theory. The core insight is that if the end user of a product has the power to change and improve the product, she is very likely to do so, irrespective of intellectual property rights. The user in the classic UDI system cares more about functionality than about IP protection, and thus the essential components of UDI are democratic access to the systems necessary for innovation in the user’s context: the toolbox, the prototyping facility, the ability to test.
UDI has been observed, again and again, in fields where democratic access to creative systems is present. Typical examples include the farmer who attached watering hoses to a bicycle, creating center-pivot irrigation, or the communities of sports practitioners in snowboarding and kite-surfing who are constantly improving their own gear in home machine shops. This form of innovation relates to larger companies when the users’ innovations are noticed, improved, packaged into visually appealing formats, and marketed. UDI is also increasingly observable in biotechnology in the context of tool construction—the field has been revolutionized by a series of machines for generating data, which all emerged initially as rough prototypes from the laboratory context.
The third modern theory examines the truly disruptive aspect of the network: the innovative power of a collection of individuals whose separate actions “snap together” into a coherent group through standard technical systems and digital networks. The theory is generally called “distributed innovation” (DI), and Karim Lakhani is a leader in its exploration [15].
The emergence of distributed innovation on the network in many ways is not surprising, as Boudreau and Lakhani note: “if the innovation problem involves cumulative knowledge, continually building on past advances, then collaborative communities have inherent advantages. Communities are naturally oriented toward solutions that depend on integrating skills, knowledge and technologies that transcend an individual contributor’s purview. In fact, successful communities necessarily have knowledge-sharing and dissemination mechanisms designed into them. They also tend to converge on common norms with a culture of sharing and cooperation, broad agreement on a technology paradigm and common technical jargon to support productive collaboration” [16].
This is a completely novel ecosystem for innovation, and two good examples are the online encyclopedia Wikipedia and the open source software development culture and methodology. In each case, problems are solved without a central authority assigning tasks and without the maximalist approach to intellectual property associated with traditional forms of innovation diffusion and exploitation. Rather, the communities are formed by many different individuals, participating for very different sets of reasons and incentives, who self-organize around challenges and tasks.
An example from open source software: dozens of programmers might be interested in creating a specific functionality in Linux, and casually note the functionality on various lists. One programmer decides to tackle the creation of the functionality, does so, and releases the software code by checking it into a common code repository and notifying a mailing list. More programmers come along and improve the functionality, line by line, over a series of weeks or months, until the functionality is stable and integrated into the overall code base of Linux, and the group of programmers dissolves, with the individuals involved moving on to other projects or leaving the project entirely. This is mirrored on Wikipedia, though the encyclopedic nature of the site also encourages the creation of pages for which there has been no initial discussion.
DI is at once incredibly powerful and incredibly difficult to create ex ante, as it requires the congruence of significant technical standardization and problems amenable to distributed problem solving. Software code is “modular”—code written by one person in one part of Linux will, if maintained according to standards, integrate smoothly into the whole. Software languages have evolved so that writing code can take place in an entire ecosystem of programmers across tools and coding platforms. The expertise and ability required to contribute are widely distributed and easily gained by those interested, thanks to online programming resources, cheap ubiquitous computing resources, and internet access (at least in the developed world). And there is an enormous ecosystem around open source software, of code repositories, mailing list software, code versioning systems, and more, that makes it easy to start and maintain an open source project.
Designing the DI ecosystem for a new field, from a standing start, is daunting. So far, the “designed DI” system has proven quite difficult to achieve. However, the power of distributed innovation can be seen in many of the “semi-closed” web 2.0 platforms such as the iPhone application store, Facebook, Twitter, and more, which each use an ecosystem approach to engage users in a distributed but closed innovation environment.

2.2. Transferring Knowledge in Order to Assure Sustainability

As an aim of science and technology in the United States, sustainable development was transformed from a minor theme to a major focus by a pathbreaking report by the National Academies in 1999 [17]. This report considered whether it is possible to move the world to sustainability over a period of two generations, roughly half a century. It concluded that such a goal is indeed possible, but only if it is addressed by a truly revolutionary advance in integrating knowledge and action: (1) advancing research agendas in “sustainability science”, and (2) advancing capacities to bring together those who have knowledge with those who have the means to implement it.
Since that report was issued, a number of institutional initiatives have emerged very quickly. The National Academies established a Roundtable on Science and Technology for Sustainability, involving both scholars and practitioners, which has organized annual workshops and workshop reports and established task forces to advance knowledge about critical issues. A continuing Sustainability Science Program at Harvard University has spun off responsibilities for continuing databases to a Center for Science, Technology, and Sustainability at the American Association for the Advancement of Science (AAAS), which organizes sessions at each annual AAAS meeting; and these two activities have established an international network, the International Initiative on Science and Technology for Sustainability, which is supported by Harvard and TWAS, the Academy of Sciences for the Developing World (Trieste, Italy). Other significant institutional initiatives include the Global Institute of Sustainability at Arizona State University.
Meanwhile, the science community has stepped forward with leadership at the highest levels. “Sustainability science” has emerged as an active new trans-disciplinary field of science [18]. In 2002 and 2003, Bruce Alberts, then President of the National Academies, not only spoke of the importance of accelerating the application of science and technology to sustainable development but involved himself substantially in promoting it [19]; note especially his interest in increasing the use of electronic media in disseminating up-to-date science. And, in his presidential address to AAAS in 2008, John Holdren spoke about science needs and potentials [20]. A consistent theme across all of these developments, especially in the NAS Roundtable, has been an urgent need to accelerate the movement of knowledge into action as a prerequisite for progress with sustainability. Early in the process, an international group of experts met in Mexico City, under the auspices of ICSU, to develop a synthesis of knowledge about science and technology for sustainable development [21]. The report included a major section on linking knowledge and action, including recommendations such as:
  • Shaping a “partnership on S&T for sustainable development” by (a) strengthening ability of locally-based initiatives to harness S&T from around the world; (b) facilitating engagement of young scientists and technologists in efforts to achieve sustainability; and (c) building a global community of scientists and engineers for sustainable development, learning from one another;
  • Developing a “new contract between science and society for sustainable development”, changing rules related to both the demand and the supply side of S&T;
  • Greatly enlarging the range of knowledge sources, including place-based knowledge that is often not reported and shared, addressing the issue of how to “certify” knowledge.
In most regards, these efforts are vague about how knowledge gets moved into action. Several recent critiques have challenged an apparent assumption that good scientific and technological (S&T) knowledge will be used if communicated. For example, van Kerkhoff and Lebel [22] reviewed linkages between research-based knowledge and local actions for sustainable development. They concluded that effective applications depend on a shared responsibility, with research-based knowledge embedded in relationships with actual or potential contributions from other forms of knowledge. McNie [23] reviewed challenges in reconciling the supply of scientific information with user demands: i.e., what constitutes useful information? That review stressed the importance of strengthening the roles of endogenous scientists in managing the boundary between science and society: determining priorities, inspiring trust and credibility, and sharing credit and benefits.
A more recent paper, growing out of the CGIAR experience with science applications for development, considered a number of projects in Africa and Asia related to five different problem areas addressed by international agricultural research [24]. This study found that strategies for closing gaps between knowledge and action include: combining different kinds of knowledge, approaches for learning and bridging, strong and diverse partnerships that level the playing field, and building capacity to innovate and communicate.
More specifically, building on previous studies, the paper points to the value of co-produced knowledge (where external scientific information is combined with internal information and experience), to the challenges of transforming receivers of information (knowledge producers who believe intuitively that their situations are in many ways unique) into knowledge learners, and to the roles of “boundary spanners” who assist with communication and translation across boundaries [25].
One clear challenge is that the ability of scientists and leaders in receiving societies, especially in developing countries, to participate in the co-production of knowledge for sustainability depends fundamentally on leveling the playing field regarding access to what is currently known: especially the science that underlies technological innovation. How can local innovation leaders be expected to be equal partners in innovation if they are not equal partners in open access to S&T information?

3. The Power of Open Access to Science and Technology Information

What, in fact, do we know about how open access to information relates to the likelihood of scientific discoveries and innovative applications?

3.1. Information Diffusion and Scientific Discovery [26]

A continuing theme in research on determinants of scientific discovery is that the provision of knowledge is a powerful driver. For instance, many years ago Robert K. Merton analyzed concept formation in the context of the sociology of science. One important finding was that scientific discovery often occurs in “multiples”: it is often the product of an intellectual climate at a specific time and place, where the scientific foundations are in place [27]. Moreover, he suggested that serendipity plays a role: not serendipity in the sense of random accidents, but in the sense of coming across unexpected scientific information or data that stimulate the development of new theory, which is at least in part a function of the diffusion of data (Merton and Barber [28]).
In his book The Act of Creation [29], Arthur Koestler argued that innovation often occurs through the marriage (“bisociation”) of previously unrelated ideas, and Parayil has suggested that technological change is a product of social-historical processes of knowledge change [30]. Other recent research along these lines includes Zucker and Darby [31], who suggest that collaboration is a predictor of success.
A somewhat different literature argues that the critical trigger for discovery is not the methods of science but personal interactions between individuals and the information available to them. Examples include Paul Feyerabend, who argued that the methods of science do not generally contribute to the successes that it claims [32]. Successes are, in fact, more personal. From a different perspective, Polanyi has argued that all knowledge is personal, and that truth is best pursued for its own sake, not with agendas in mind [33]. Even Taleb, whose evocative book The Black Swan: The Impact of the Highly Improbable [34] seems to imply that discoveries are no more than random serendipity, has more recently suggested that invention can be facilitated by information and financial support, as long as the provision of support does not discourage “out of the box” thinking [35]. Taleb’s concept of the “grey swan”—the improbable but statistically inevitable positive event—is intriguing in this context, as one might begin to design systems of knowledge transfer and diffusion in the developing world specifically to maximize the odds of the positive event’s occurrence.
Supporting this perspective are a number of examples of research and practice which indicate that non-market information networks encourage individual creative initiatives. For instance, Peter Hall found that the golden age of cities in Europe was associated with extensive “open source” information transfer [36].
Meanwhile, a variety of experiments are under way, associated with the information technology industry’s vivid recent experience with promoting transformational change. One example is the initiative by Bill Gates to promote “quests” through crowd-sourcing platforms to invite wild ideas and to encourage collaborations in developing them (mainly discussed in the blogosphere, but see “Bill Gates Set to Begin New Chapter” [37]). Another example is Nathan Myhrvold’s “innovation sessions”, which try to catalyze breakthroughs by organizing collaborative brainstorming by groups of creative people with very different backgrounds—sharing information and perspectives interactively—which are reported to have led to significant unexpected successes [38]. A variation of this perspective is the argument that more extensive forms of interaction and collaboration, encouraged and facilitated by information technology, can overcome issues of incentives and ownership to have transformative impacts: “It doesn’t matter why any one person does it as long as enough people do it” [39]. Possibly related concepts include “Metcalfe’s Law”, which suggests that the economic value of a telecommunication network is generally proportional to the square of the number of connected devices in the system, and “Reed’s Law”, which goes further to suggest that the utility of a large network (including a social network) can scale exponentially with the size of the network [40].
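Stated as rough formulas (a textbook-style sketch of the usual formulations, not a result from the sources cited above), the two laws compare as follows for a network of n connected participants or devices:

V_{Metcalfe}(n) \propto n^{2}, \qquad V_{Reed}(n) \propto 2^{n}

Metcalfe’s scaling reflects the number of possible pairwise connections, n(n − 1)/2, while Reed’s reflects the number of possible multi-member subgroups, which grows roughly as 2^n. The practical point for knowledge diffusion is the same under either reading: each additional connected user adds more than one unit of value to the network as a whole.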

3.2. Information Diffusion and Advances in Technology for Sustainable Development

There is abundant evidence from the pre-internet era that advances in technology, combined with information diffusion, were dramatically effective in increasing economic and social sustainability. The most notable cases were agriculture and health.
In agriculture, innovation was spurred by the interest of developing countries in improving food production for growing populations, beginning as early as the 1940s in Mexico, which sought to establish an agricultural research station to develop better varieties of wheat. Supported by such foundations as Rockefeller and Ford, this global effort had enormous impacts in many parts of the world for a number of decades [41], and its work continues today through the Consultative Group on International Agricultural Research (CGIAR), with UN support.
In health, science has joined with international institutional partnerships to eradicate smallpox, to drive polio toward eradication, and to reduce the incidence of tuberculosis, malaria, cholera, and other diseases in many areas [42]. Reaching out into small villages and disadvantaged communities, this effort was often a model of how to connect global science with local action.
Implications of the internet and other information and communication technologies for sustainability are only beginning to unfold, and most of the evidence is still anecdotal. A recent book [43], however, assessed the consequences of 23 research projects funded by Canada’s International Development Research Centre (IDRC). The book begins by suggesting that building local capacity starts with a quest for information, based on a case study of Asian fisheries. Four of the case studies are outgrowths of IDRC’s Acacia Initiative, designed to help African communities south of the Sahara make effective use of modern information and communication technologies. Although evidence of advances in local sustainability is limited, the cases show very rapid changes in the use of such technologies: e.g., Senegalese pastoralists using cell phones and GPS units to track livestock migrations, a town in Mozambique where a wide range of local citizens are now using a local telecentre, and expanded uses of cell phones and other technologies by the poor and disadvantaged in Uganda. Based on these experiences, the book suggests that modern information and communication technologies are likely to change information access in developing countries far more rapidly than has generally been expected.
At the same time, while open innovation in the Chesbrough sense has been embraced, at least in theory, by many large companies, we can also observe it in action at a smaller scale in the developing world in wind technology. Some observations indicate that, although materials are scarce, there are enough resources available locally in some cases to enable the construction of working windmills and turbines, if there is enough active knowledge transfer into the local context to enable local initiatives (see text box below).
Similarly, Green Step, a German organization, designed a microelectric plant specifically for use in a developing world context, and developed a plan to diffuse the knowledge into test communities in Cameroon [44].
This sort of innovation meets the first test of open innovation, by purposefully moving knowledge out of the organization into an external context. The second step of open innovation is embodied in the designed ability to use local materials, such as reclaimed wood and car batteries and scrap iron, to build the windmills. This represents an investment in the external capacity of the market to innovate: the designs can be built without requiring the influx of new and expensive materials, which also means that a cycle of user-driven innovation can take place—improvements on the design which further tune the system to local needs and capacities.
Each micro-plant is designed to support five houses or fewer, and each house requires an electric load far below Western consumption: two light bulbs, one radio, one television, and a mobile phone. So far, at least, such projects have seldom been built, or invested in, in the developed world. But the design allows local decision-makers to move toward a stable and useful energy source in the absence of national grid infrastructure to provide regular energy, and it provides enough energy for local development needs.
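A back-of-the-envelope calculation makes the scale concrete; the wattages below are illustrative assumptions on our part, not figures from the Green Step design:

P_{house} \approx (2 \times 10) + 5 + 50 + 5 \approx 80 \text{ W}, \qquad P_{plant} \approx 5 \times 80 \approx 400 \text{ W}

assuming two 10 W compact fluorescent bulbs, a 5 W radio, a 50 W television, and a 5 W phone charger per house, all running at once. Even this worst case is a small fraction of the several kilowatts of peak demand typical of a single household on a developed-world grid, which is why a micro-plant built largely from reclaimed materials can be a meaningful energy source in this context.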
The third element of open innovation, the business model’s ability to capture value from the open system, is less relevant here due to the non-profit nature of the project. But one could conclude that an organization dedicated to raising philanthropic grants represents a business model that is well suited to an open innovation system in which the entire goal is the creation of initial designs for the developing-world energy context, designs intended to be taken, improved, copied, and distributed. Again, access to the exploding amount of content available, from the amateur experiences detailed on blogs to the technical journals, can only be expected to improve the designs and implementations on the ground, representing an improvement in the capacity of the “market”—in this case the implementers on the ground—to improve and innovate.

4. Approaches to Opening Access to Science and Technology Information

If, at least in the internet era, accelerating the application of science and technology for sustainability depends on equalizing access to S&T information for both producers and users, how can this be done? What are the alternative pathways?

4.1. Emerging “Commons” Processes and Their Relationship to Modalities of Innovation

One alternative is to seek avenues that encourage information providers and users to join in a “commons”, in which information is exchanged in accordance with standardized agreements.
For example, patents are a form of intellectual property that governs inventions. A patent is a monopoly right granted by governments to applicants who have devised something—a technology, a tool, a machine—that is useful, new, and non-obvious to experts in the field. A patent holder receives the right to “exclude” others from using that technology. Only the holder can make the tool, use the machine, or offer the technology for sale (or, more importantly in a world of cumulative innovation, make a commercial use of the technology as part of a larger innovation).
Patents are deeply relevant to the new “networked” theories of innovation, as the network allows companies (OI), individual users (UDI), and networks (DI) to construct new kinds of value by combining a number of technologies which are likely to carry patent rights held by different owners. Indeed, a core shared characteristic of these theories is the concept of “recombinance”: the idea that technologies built in one place for one purpose can have many purposes, and that users of a technology or technologies can create entirely new and often unforeseen value in new business spaces and via new business models.
These forms of innovation interact with the patent culture in the context of energy and development. Unlike in free software, where open copyright licenses are sufficient, any new technologies developing from the various networked innovation systems will need to have “freedom to operate” at the commercial level or risk infringement proceedings. Patent rights are transferred from owners to users through a very traditional process of license negotiation, transacted by expensive attorneys and resulting in artisan-generated legal contracts with extraordinarily high transaction costs. As a result, the recombinant aspects of the new modalities of innovation, which (like all network effects) depend on low transaction costs and high transaction volume, tend at some point to be thwarted by the need to combine a set of patents together and obtain permission to go to market.
Though this pain is perhaps less relevant in the context of user-driven innovation, the risk to a company taking a rough prototype to market remains a barrier to bringing innovations to the broader community from UDI (individual user) processes. Both distributed innovation and open innovation are at significant risk from constraints on the willingness of business to invest in new recombinant technologies because of the potential for injunctions and lawsuits by patent holders whose interests are infringed at some level by an innovative re-use of a technology or tool.
Many attempts have been made and are being made to address this risk by the construction of interoperability systems for patents in various disciplines and spaces. The oldest strategy is to form a “pool” of patents that are cross-licensed among a variety of holders [45]. Patent pools save time and money by creating a structure that allows for single-license access to the patents in the pool, and they can be enormously valuable if multiple companies hold patents that block each other from moving forward (as in many technological innovations such as semiconductors). However, pools require the existence of an administrating entity, can also carry high transaction costs relative to other systems, and create risks of antitrust violations.
A new wave of patent interoperability systems is now emerging, inspired by the success of free/libre open source software and free culture. Here, the governing form of intellectual property is a copyright rather than a patent: a monopoly right granted to the author of a creative work, which prevents others from making and distributing copies of the work.
A copyright is a powerful, automatic right which the author receives without making an application to the government, and it can be used by its owner to create “freedom” through the use of an open license such as the GNU General Public License [46]. Under such a license, the owner can impose a behavioral control on the user: for instance, the owner lets the user make and distribute copies, derivative works, remixes, and new software, but the user must make the new works available under the same free and open terms for the next user. This is known as “copyleft” and is the legal foundation of open systems, especially those at the root of distributed innovation. Copyrights on software govern the copying aspects of code, not the utility functions of the code. But the utility of software can be directly affected by patents on software, and we see early attempts to deploy an open patent approach to software as well.
Several alternatives have appeared in recent years to address obstacles that copyrights and patents can present to a distributed approach to innovation. Because these alternatives are laboratories for learning about institutional approaches to more nearly “open communication” and their pros and cons, we will summarize how four different initiatives are conceived.
The Linux Patent Commons (LPC) was developed in 2005 as a response to the threat of patents to distributed innovation in software [47]. Software patents are widely acquired by companies large and small, although they are deeply controversial in terms of their true value and impact. A study by a non-profit organization in the same year as the LPC’s launch found at least 283 patents that might be infringed by Linux, including 27 held by Linux opponent Microsoft [48]. Clearly, the foundation of distributed innovation was subject to a fault line that could be exposed through patent infringement lawsuits and injunctions.
In addition to the perceived risk of Linux infringing on closed patents, there was an increasing desire among patent holders to provide a formal assurance to developers and users of open source software that a set of patents was not a threat to the community, and to create a vast “safe harbor” for cumulative, collaborative reuse of code. The traditional methodology of artisan-based license drafting fails in this context: paying a lawyer only makes business sense if a revenue stream is expected, and thus a new mode of licensing had to be developed to remove the patent threat from open source software.
Created and administered by the Linux Foundation, the LPC uses a set of “commitments”, rather than a set of licenses, to simplify the process by which access to patented inventions can be granted [49]. In this approach, a commitment is a promise, not a license, made by the owner of a patent to the community. In the LPC, each commitment typically includes five key components: named beneficiaries, grants of permissions, statements of permitted use, defensive termination provisions, and a reservation of rights. The net result of the commitment is that the patent is widely available for use in any development project using an approved open source copyright license, creating a connection point between the two vastly different classes of intellectual property. IBM jumpstarted the LPC with a grant of 500 software patents to the commons, and it remains the largest contributor of patents to the system [49].
The LPC is a success, mainly because of the size of IBM’s original grant, but significant challenges remain. No more than 30 additional patents have been contributed as of this writing beyond the IBM founding grant of 500. The IBM corpus alone provides a significant mass of available patents, but the lack of follow-on patent grants indicates a lack of network effects in the willingness of other companies to participate as patent licensors. And at almost the same time as the LPC launch, Sun Microsystems dedicated more than 1,600 patents to open source, but tied the patent license to the use of a very specific open source copyright license, which happened to be in use almost exclusively on software promoted by Sun [50]. This led to a fear of “forking” of patent rights away from the generic Linux codebase, but these fears have not been realized to date.
An alternative to the LPC is the Eco-Patent Commons (EPC), which follows essentially the same approach as the software system: commitments, not licenses, where the owners promise not to sue the users as long as the uses are directed toward the shared goal of sustainability. The EPC is managed by the World Business Council for Sustainable Development and was launched in 2008 with public commitments of patents by IBM, Nokia, Pitney Bowes, and Sony, with further commitments from Bosch, DuPont, Xerox, Ricoh, and Taisei over time, for a total of 95 patents as of this writing [51]. The EPC is similar to the LPC in its success in attracting business to the concept of low-transaction-cost public patent solutions, but it suffers from the same problem as the LPC: the lack of a critical mass of patents to enable cumulative innovation in a wide variety of fields.
The patent licensing regime most directly inspired by free/libre open source software actually developed not from software itself, but from agricultural biotechnology. The Biological Innovation for Open Source (BIOS) project, based in Australia, began with a single core patented technology for transferring genes in plants—a fundamental technology for creating genetically modified plants for food. The technology was developed specifically to avoid encumbrance from existing patents, so that users could develop plants without any fear of a “reach through” from the technical tools to the end products and distribute the plants and seeds widely to alleviate global hunger [52].
The original BIOS patent license employed a “patent left” approach. As in copyleft licensing, the goal was to stimulate the re-contribution of improvements to the underlying technology: a plant created would be freely available, but if a licensee improved the gene transfer tool itself and patented the improvement, then the improvement would have to be made available back to all other licensees of the tool. These licenses cover a series of technologies developed in tandem with the licenses, which have had patent applications filed and in many cases granted in multiple countries.
Presentations by BIOS leaders note many licensees for the technology but indicate little progress toward the goal of re-contribution of improvements (personal conversations by John Wilbanks with Richard Jefferson of BIOS/IOI). The financial incentive to pursue follow-on patents for improvements is diluted by interests in sharing the improvements back with the group, which is a feature of the differences between copyright and patent. As noted by the founder of the Free Software movement, patents are expensive to acquire, which makes patent-left a more complex economic proposition than copyleft [53].
BIOS now features a modified strategy that encases the original patent-left license in a suite of legal tools, all of which conform to a commitment strategy similar to the LPC and the EPC [54]. However, the tools do not involve a waiver approach but encode the norms of re-contribution into the covenant, allowing for multiple authors of licenses to submit contracts for authorization and branding as meeting the commitments, and allowing also for multiple classes of contracts other than patent licenses, including materials transfer agreements. Much of the effort of the BIOS initiative has shifted to an effort to radically increase transparency via patent informatics, using a “patent lens” technology to search across patent databases and to create open “patent landscapes” for key areas so that users can innovate around the existing patent lockups [54].
The most recent regime to emerge for patent licensing comes from Creative Commons (CC), an organization that has developed a suite of licenses for managing copyrights under the rubric of “some rights reserved”, now covering almost half a billion objects on the Web [55]. The patent licenses are being tested for use in sustainability in a project called the GreenXchange (GX), although the licenses are not specific to any field of use. These legal tools create a patent license regime akin to the CC license regime for copyrights. The CC copyright suite contains six licenses, generated from choices that include whether or not to allow commercial use and whether or not to allow derivative works. The patent regime is simpler than the CC copyright licenses, creating a commons of easily licensable patents via low transaction costs and transparency rather than through the use of “reach through” clauses requesting re-contribution of improvements by users through the patent license.
The core of the CC patent license grants a commercial use right to users to make, use, and sell the technology under the license. This right is conditioned on the users’ compliance with certain requirements: e.g., notification of license execution and right to request attribution (framed this way to avoid brand abuse via attribution, e.g., “Nike inside!” on a product Nike finds shoddy). This is an approach which one might follow for the sustainability technologies developed at a company such as Nike, which does not sell such technologies, but creates them in the process of designing and manufacturing shoes. Nike is willing to license such technologies as green rubber and water-based adhesives under such a regime, because their use is good for the overall sustainability environment while it does not erode the core competitive advantage of the company.
However, these patents represent only a tiny minority of the total patents held at a company such as Nike, many of which could be useful for other purposes in the right hands—if the core business of the company were protected, and if transaction costs were low enough. For example, Nike patents on airbag technology might be extremely useful in the construction of longer-living tires, which is a clear sustainability improvement in terms of landfill reduction. The public patent license approach in general, from waivers to patent-left to transaction cost reduction, has failed to address this use case, which is why in each case only a very small number of potentially eligible patents has entered the public patent commons to date.
The CC approach addresses this reality by allowing the use of the public patent license as a standard “form” to which a patent holder can attach a single legal clause. This clause serves to modify the patent license by narrowing the set of users allowed to sign the license by a “field of use” restriction. Thus, Nike might add a paragraph allowing the use of traditionally off-limits technology like the airbag or shoe chemistry but exclude any shoe companies, athletic apparel manufacturers, or other related industrial competitors from signing the license. The underlying legal code of the rest of the contract remains the same, including the notification requirement, the right to request or deny attribution, and the fee structure (with related fee waivers). A patent owner could also use the standard form CC license but instead of a field of use exception, use a “safe harbor” provision in which the rights to make commercial use only apply in a limited area (such as rare disease or poverty reduction).
Two other elements distinguish the CC project. First, the approach provides that all patents under the licenses will also receive a broad, covenant-based “research exception” for academic and non-profit research uses. In other words, as long as the work done does not involve making and using a technology for sale, but for research, then wide latitude should be granted. To minimize transaction costs, this is achieved by means of the patent owner pledging not to sue, in a method inspired by the LPC and EPC commitments. Second, CC also provides a significant technical infrastructure for using metadata to describe rights and licenses, which integrates into Google, Yahoo!, Microsoft Office, and many other technical systems. The patent licenses reuse this infrastructure, allowing for integration of patent searching and licensing into web and enterprise software.
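As an illustration only, the following minimal sketch shows how such machine-readable rights metadata might be attached to the description of a work. The library used (rdflib), the example work, and the license URL are our assumptions, not a description of the actual GreenXchange implementation; the cc: vocabulary is the one Creative Commons publishes for describing licenses.

# A minimal, hypothetical sketch of machine-readable rights metadata in the
# spirit of the Creative Commons Rights Expression Language (ccREL).
# The work and license URLs below are placeholders, not real records.
from rdflib import Graph, Literal, Namespace, URIRef

CC = Namespace("http://creativecommons.org/ns#")
DCT = Namespace("http://purl.org/dc/terms/")

work = URIRef("http://example.org/patents/water-based-adhesive")        # hypothetical
license_url = URIRef("http://example.org/licenses/cc-patent-research")  # hypothetical

g = Graph()
g.bind("cc", CC)
g.bind("dct", DCT)

# State which license governs the work, its title, and who should be attributed.
g.add((work, CC.license, license_url))
g.add((work, DCT.title, Literal("Water-based adhesive process")))
g.add((work, CC.attributionName, Literal("Example Patent Holder")))

# Serializing the graph (here as Turtle) is what lets search engines and other
# tools discover the license terms alongside the work itself.
print(g.serialize(format="turtle"))

In the copyright context, Creative Commons embeds this kind of markup in web pages, which is what allows license-aware filtering in search engines; a patent commons could, in principle, reuse the same machinery for discovering licensable technologies.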
The CC approach theoretically opens the door to all three classes of innovation, but is probably most immediately relevant to the OI paradigm. The inclusion of a fee waiver for small-revenue companies allows for DI and UDI to emerge as well, and the Creative Commons technical infrastructure supports this possibility; but the capacity to innovate on patents tends to be concentrated in larger institutions, and it is within those larger institutions that the GX experiment is occurring as of this writing.

4.2. Accelerating User-Driven Innovation by Opening up Access to Scientific Information

User driven innovation in sustainability and energy technology has enormous potential to change the way that people interact with their energy sources on the ground. UDI offers a tantalizing opportunity to create local solutions to local problems using local tools—a laudable goal. However, there are very real limitations to the application of UDI in the spaces where knowledge itself is sparse, where technologies and tools are crude and insufficient, and where the ability to create a minimal prototype is mistaken for a revolutionary invention.
Access to knowledge can intersect with this set of limitations in multiple beneficial ways. First, if the user-innovators are able to access the available knowledge, the likelihood of repeating past failures decreases significantly. Second, the democratization, through digital communications, of the ability to describe an experiment means that the user-innovator herself is able to engage in a conversation with others to receive feedback, advice, and criticism (whether constructive or not). And third, access to knowledge makes the overall community far more able to evaluate, test, and communicate the core innovations as either truly innovative or a case of wishful thinking.
The cases of wind technology show that user-driven innovation can be implemented in reality. The physics of windmills is well known, and the innovation comes from the design of locally implementable versions of the wind turbine and the understanding of how to repurpose locally available tools and materials. But these cases also hold an important lesson as to the limitations of all innovation systems when faced with knowledge problems: the absence of standardized foundational knowledge in a subject area creates an upper boundary for UDI (see text box: The Boy Who Harnessed the Wind).
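To make concrete how settled that underlying physics is, the standard first-order estimate of the power a wind turbine rotor can extract (a textbook relation, not anything specific to the projects described here) is

P = \tfrac{1}{2}\,\rho A v^{3} C_{p}

where \rho is air density, A is the rotor’s swept area, v is wind speed, and C_{p} is the power coefficient, which the Betz limit caps at roughly 0.59. None of this is in dispute; what local user-innovators need is the design knowledge to choose a rotor size, blade shape, and generator from available materials that capture a useful fraction of that limit.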
A good example is the limitation of the battery technology available in the world, whether to user-innovators or to companies. Energy examples of UDI often rest on the repurposing of car batteries and other storage systems, and a recent example of UDI in solar panels in which human hair replaces silicon involves a battery circuit of sorts as well. Access to old car batteries, papers about batteries, and papers about the electrochemical properties of hair can be facilitated. But the reality of battery technology is that it is primitive, no matter the sophistication or genius of its user.
Access to knowledge in this case can serve as a guide to avoid past failures. But in many ways it is even more important to help the users understand their own results. Is the weak electrical current observed from the solar panel built with hair truly a breakthrough in designing solar panels [43]? Or is it an artifact of the creation of a cuprous cell, which generates weak currents but doesn’t scale with size, meaning the panel itself won’t be able to liberate its designers and implementers from electric grids [57]?
These are precisely the kinds of questions that can be answered through access, not only to the available academic and technical literature, but to the community of users who can serve as a collective intelligence applied to the innovations. The solar panel example has been posted on thousands of web pages, in tones ranging from breathless acceptance to frank and sometimes harsh debunking. From these pages flow a list of citations to read and debate, ideas for improving the concept, and skepticism that it can be improved at all. Much of the debate is ill-informed, as is much of the debate on any topic on the Internet. But weaving throughout the content is a clear thread of linked, relevant knowledge [58].
This aspect of access, the idea of access not just to texts but to communities, must be part of designing access to information and knowledge for the purposes of sustainability. As mobile telephony becomes mobile network access, the capacity to engage in debate becomes a parallel to the capacity to access information. The user must know how to find information and be legally allowed to use it, but must also know how to sift through the sudden explosion of information available. Otherwise the potential benefits of access to scholarly and technical knowledge will be difficult to realize.

The Boy Who Harnessed the Wind

The story of William Kamkwamba is known primarily in the developed world from his autobiography, The Boy Who Harnessed the Wind [56]. A 14-year-old forced to drop out of school when his parents could not pay the school fees, Kamkwamba immersed himself in textbooks and was able both to devise and to build a windmill to help his family. The story is one of overcoming hardship, and of the success that comes with it, including visits to New York and Las Vegas, and an introduction to the Internet. But we can also analyze the story as one of user-driven innovation, and explore the foundations of the design and implementation of the windmill.
From the perspective of innovation, three keys to Kamkwamba's design breakthroughs stand out: (1) his own desire to tinker (with engines, radios, go-carts), (2) his ability to use the materials at hand to perform the tinkering, and (3) his access to textbooks containing essential principles of physics, magnetism, and electricity. The desire is at the heart of user-driven innovation: the goal is to understand the systems and improve them, not to acquire property rights to inventions. The user feels the problem acutely, understands the problem better than anyone (in the von Hippel sense, knowledge of the problem is "sticky", staying close to the user), and is the most motivated to innovate a solution.
Close behind the desire to solve a problem is democratic access to tools that can create solutions. In the machine-invention context there is a long history of user-driven innovation because the tools required, screwdrivers and wrenches, are relatively well distributed, and for something like a windmill there is an underlying supply of parts created by the widespread distribution of engines and similar machines. This is not to imply that Kamkwamba's innovation was easy; the point is simply that it was at least possible to take parts from some machines and repurpose them for machine-building in a way that would not, for example, be possible for a biotechnology innovation in the same context.
Added to the inventor's desire and access to minimally sufficient tools is access to the theory and scientific canon that informed the design of the windmill itself. Like the scarce and unorthodox tools, the knowledge available was very basic, textbooks about physics, but sufficient in the right hands to allow for true innovation in design. One can only imagine how much easier the design would have been had richer knowledge tools been available to Kamkwamba: access to designs for other windmills, design software, review papers in engineering journals about magnetism and electricity in practice, and the informal communications from conferences, trade shows, blogs, and face-to-face meetings of windmill designers.
There are hundreds of Web sites, and dozens of peer-reviewed journals, that carry information essential to the design and implementation of a windmill. None was accessible to the design process in this case. Access to this range of knowledge resources, from "maker" culture to the elite technical journals informing Western design optimization, would radically increase the capacity of brilliant but geographically or culturally isolated inventors trying to solve a local sustainability problem.
Access to information, and the ability to exploit information "overload", currently separates those who have both ubiquitous connectivity to the global Internet and the economic resources to pay for proprietary scholarly content from those who have neither. The promise of open access to scholarly content is the potential to level the playing field for innovators the world over, not just those with access to Home Depot and the resources of a wealthy university library.
Another example of knowledge and technology flow is synthetic biology. Synthetic biology is a young field at the intersection of molecular biology and engineering, in which the genetic functions elucidated through traditional biological research are converted into functions that can be used in an engineering context. Some genes act as "switches" and others as "inverters", allowing engineers to treat them as abstract, combinable parts that can be programmed and inserted into living bacterial systems [59]. A good example is the engineering of microbes to create the chemical precursors to naturally occurring compounds that have significant human health benefits but whose production carries significant environmental costs. Artemisinin is lethal against all strains of malaria, making it an incredibly powerful compound in post-infection treatment, and it has cured more than a million patients with the disease. It has been part of traditional Chinese medicine for thousands of years. But it comes at a cost: artemisinin must either be extracted from the leaves of the sweet wormwood plant or produced through a complex chemical synthesis, and both routes are expensive [60]. Aggressive breeding and cultivation using petrochemicals would also carry a high environmental cost, and is still unlikely to yield enough of the compound to make the drug available at an affordable price.
Synthetic biology offers a way out. A research group at the Lawrence Berkeley National Laboratory took the genes from the sweet wormwood plant that make the chemical precursor and engineered a common microbe to serve as a biological "amplifier", increasing the yield more than 10,000-fold [59]. Beyond that, the system is something of a standard platform for creating the isoprenoid family of compounds to which artemisinin belongs, a family that includes costly anticancer drugs otherwise available only through the destruction of rare marine corals.
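The power of this "parts" abstraction is easier to see in code. The sketch below is a toy model of the idea only; it is not the data model of the MIT Registry or of any real genetic circuit, and the part names are invented for illustration. It shows how, once a genetic function is described as a simple input/output behavior, "switch" and "inverter" parts can be chained like circuit elements.

```python
# Toy model of composable genetic "parts" (illustrative abstraction only,
# not the Registry of Standard Biological Parts or a real construct).
from typing import Callable

Part = Callable[[bool], bool]  # a part maps an input signal to an output signal

def inverter(signal: bool) -> bool:
    """An 'inverter' part: output is off when the input is present."""
    return not signal

def switch(threshold: float) -> Callable[[float], bool]:
    """A 'switch' part: turns on when an inducer concentration crosses a threshold."""
    return lambda concentration: concentration >= threshold

def compose(*parts: Part) -> Part:
    """Chain parts so the output of one drives the input of the next."""
    def circuit(signal: bool) -> bool:
        for part in parts:
            signal = part(signal)
        return signal
    return circuit

# Two inverters in series behave like a non-inverting buffer; an odd number inverts.
print(compose(inverter, inverter)(True))            # True
print(compose(inverter, inverter, inverter)(True))  # False

# A switch feeding an inverter: the output is off once the inducer is abundant.
sense = switch(threshold=0.5)
print(inverter(sense(0.8)))                          # False
```

Composability of this kind is what lets a relatively small catalogue of well-characterized parts support a very wide range of designs.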
Existing cases related to sustainability are still more traditional examples of knowledge flowing from the developed world to the developing world than they are examples of modern innovation theory in action. In the artemisinin case, the research is funded by large foundations in the developed world, with the intent of shipping the drugs at low cost to areas in need. But the groundwork for downstream user-driven, distributed, and open innovation is being laid by many of the early scholars in the field. A database of standardized "parts" is maintained at MIT (the MIT Registry of Standard Biological Parts [62]), and a large set of labs posts vital know-how and laboratory protocols on the web [63], all under either a public-domain legal approach or a liberal copyright licensing strategy. Open patent strategies are being explored and deployed to ensure wide freedom to operate. And a yearly competition involving college students worldwide is creating a growing base of people skilled in the art of using these open tools to address problems.
An important example of emerging distributed innovation in this context is groundwater analysis for arsenic [64]. Arsenic contamination is a dire problem in parts of the developing world, especially Bangladesh, where the World Health Organization estimates that 35 million people draw water from polluted wells. Existing testing methods involve either shipping water samples to a laboratory for analysis, which is not a scalable way to test water in a local context, or using portable kits whose results can be misinterpreted and which can create new toxic waste as part of the testing process.
A group of Scottish college students took on the challenge of building a better arsenic detection system in 2006. Using the standard parts available in the MIT registry and the natural properties of E. coli (the same bacterium used for artemisinin precursor production), the students "programmed" E. coli so that its natural ability to detect arsenic was coupled to other natural functions that break down lactose to produce acid. The result is that the engineered E. coli produce acid in the presence of arsenic. In combination with a simple acid-detecting indicator fluid, which is not only cheap but easily distributable and non-toxic, a small amount of the bacteria will turn arsenic-polluted water red, while safe water will turn yellow. Thus, a group of students working with open tools and systems created, and in turn distributed the plans for, a workable field detection system for arsenic [61]. This is only the leading edge of applications of synthetic biology, and it points to the capacity-building power of the open approach [65].
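To make the logic of the biosensor concrete, the following sketch models it as a chain of the abstract "parts" discussed above: an arsenic sensor, an acid-producing output, and a colour readout. It is a conceptual illustration of the behavior described in the text, not the team's actual genetic construct; the threshold and pH values are illustrative (the 10 ppb cutoff echoes the WHO guideline for arsenic in drinking water).

```python
# Conceptual model of the arsenic biosensor's logic (illustrative values only).

def arsenic_sensor(arsenic_ppb: float, threshold_ppb: float = 10.0) -> bool:
    """Sensor part: active when arsenic exceeds an illustrative 10 ppb cutoff."""
    return arsenic_ppb > threshold_ppb

def acid_output(sensor_active: bool, baseline_ph: float = 7.0) -> float:
    """Output part: when the sensor is active, lactose breakdown acidifies the sample."""
    return 4.5 if sensor_active else baseline_ph  # illustrative pH values

def indicator_colour(ph: float) -> str:
    """Readout: a generic acid indicator, red in acid, yellow near neutral."""
    return "red (arsenic detected)" if ph < 6.0 else "yellow (safe)"

for sample_ppb in (2.0, 50.0):  # two hypothetical well-water samples
    ph = acid_output(arsenic_sensor(sample_ppb))
    print(f"{sample_ppb} ppb arsenic -> {indicator_colour(ph)}")
```

The point of the model is simply that each stage is a well-characterized behavior that can be swapped or reused, which is what makes a student team, rather than a dedicated laboratory, a plausible author of the design.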

5. Conclusions

Unquestionably, one of the prerequisites for sustainable development is knowledge, in a world where both the sustainability challenges and the knowledge bases are constantly changing. Knowledge not only informs coping, but it also supports innovation—and it helps to attract talent for sustainability problem-solving.
If, as the current research literature indicates, transferring knowledge for sustainability depends on user-driven co-production of knowledge that combines external and local knowledge, then a central issue is how to assure local access to existing knowledge as a key requirement for that creative process.
The rapidly emerging information technology (IT) revolution is becoming a powerful enabler of such access, if intellectual property obstacles can be overcome: not only making information readily available but also promoting a network culture that can combine closeness to a sustainability problem with the ability to share information and ideas with others who have a wider range of perspectives and experiences to draw upon. This potential is especially bright if institutional changes are encouraged and supported within developing countries and regions to address the ways in which dysfunctional governance and other institutional disincentives discourage and impede local innovativeness.
Together, these conclusions paint a picture of potential transformational improvements in prospects for sustainability in many parts of the world in the coming decades, built on IT-based open access to scientific and technological information and knowledge. If we can realize this enormously promising potential, sustainability will be accelerated, and linkages between actors at scales from global to local will be strengthened in ways that will be good for the global community in other respects as well.

References

  1. Wilbanks, T.J. Sustainable development in geographic context. Ann. Assn. Amer. Geogr. 1994, 84, 541–557. [Google Scholar]
  2. Ecosystems and Human Well-Being: Multiscale Assessments; Millennium Ecosystem Assessment Series Volume 4; Island Press: New York, NY, USA, 2005.
  3. Wilbanks, T.; Kates, R. Beyond adaptation to climate change: embedding adaptation in responses to multiple threats and stresses. Ann. Assn. Amer. Geogr. 2010, (in press). [Google Scholar]
  4. Schumpeter, J. The Theory of Economic Development; Harvard University: Cambridge, MA, USA, 1934. [Google Scholar]
  5. Bush, V. Science: The Endless Frontier. Available online: http://www.nsf.gov/about/history/vbush1945.htm (accessed on 25 September 2009).
  6. Kennedy, D. Bayh-Dole: Almost 2005. Science 2005, 307, 1375. [Google Scholar] [CrossRef] [PubMed]
  7. So, A. Is Bayh-Dole good for developing countries? Lessons from the U.S. experience. PLoS Biol. 2008, 6, 262. [Google Scholar]
  8. Rai, A.; Eisenberg, R. Bayh-Dole reform and the progress of biomedicine. Law Contemp. Prob. 2003, 66, 1. [Google Scholar]
  9. Botana, N.; Sabato, J. Science and Technology in the Future Development of Latin America. Available online: http://dialnet.unirioja.es/servlet/revista?codigo=140 (accessed on 9 March 2010).
  10. Xerox has been heavily criticized (particularly by business historians) for failing to properly commercialize and profitably exploit PARC’s innovations. A favorite example is the GUI, initially developed at PARC for the Alto and then commercialized as the Xerox Star by the Xerox Systems Development Department. Although very significant in terms of its influence on future system design, it is deemed a failure because it sold only approximately 25,000 units. A small group from PARC, led by David Liddle and Charles Irby, formed Metaphor Computer Systems; they extended the Star desktop concept into an animated graphic and communicating office automation model and sold the company to IBM. PARC. Available online: http://en.wikipedia.org/wiki/PARC_company (accessed on 9 March 2010).
  11. Chesbrough, H.W. The era of open innovation. MIT Sloan Manag. Rev. 2003, 44, 25–41. [Google Scholar]
  12. Chesbrough, H. Open Innovation: The New Imperative for Creating and Profiting from Technology; Harvard Business School Press: Boston, MA, USA, 2003; p. 227. [Google Scholar]
  13. Lakhani, K.; Panetta, J. The principles of distributed innovation. Innovations 2007, 2, 97–112. [Google Scholar]
  14. Von Hippel, E. Democratizing Innovation; MIT Press: Cambridge, MA, USA, 2005. [Google Scholar]
  15. Jeppesen, L.; Lakhani, K. Marginality and problem-solving effectiveness in broadcast search. Organ. Sci. 2010. [CrossRef]
  16. Boudreau, K.; Lakhani, K. How to manage outside innovation. MIT Sloan Manag. Rev. 2009, 50, 68–76. [Google Scholar]
  17. Our Common Journey: A Transition Toward Sustainability; National Academy Press: Washington, DC, USA, 1999.
  18. Kates, R.; Clark, W.C.; Corell, R.; Hall, J.M.; Jaeger, C.; Lowe, I.; McCarthy, J.; Schellnhuber, H.; Bolin, B.; Dickson, N.; Faucheux, S.; Gallopin, G.; Gruebler, A.; Huntley, B.; Jäger, J.; Jodha, N.; Kasperson, R.; Mabogunje, A.; Matson, P.; Mooney, H.; Moore, B., III; O’Riordan, T.; Svedin, U. Sustainability science. Science 2001, 292, 641–642. [Google Scholar] [PubMed]
  19. Alberts, B. Harnessing Science for a More Rational World. 2003. Available online: http://www.nasonline.org/site/DocServer/speech2003.pdf?docID=110 (accessed on 9 October 2009).
  20. Holdren, J. Science and technology for sustainable well-being. Science 2008, 319, 424–443. [Google Scholar]
  21. ICSU Series on Science for Sustainable Development No. 9: Science and Technology for Sustainable Development; International Council for Science (ICSU): Paris, France, 2002.
  22. Van Kerkhoff, L.; Lebel, L. Linking knowledge and action for sustainable development. Annu. Rev. Env. Resour. 2006, 31, 445–477. [Google Scholar]
  23. McNie, E.C. Reconciling the supply of scientific information with user demands: an analysis of the problem and review of the literature. Environ. Sci. Policy 2007, 10, 17–38. [Google Scholar]
  24. Kristjanson, P.; Dickson, N.; Clark, W.C.; Romney, D.; Puskur, R.; MacMillan, S.; Grace, D. Linking international agricultural research knowledge with action for sustainable development. PNAS 2009, 106, 5047–5052. [Google Scholar] [CrossRef] [PubMed]
  25. Cash, D.; Clark, W.C.; Alcock, F.; Dickson, N.; Eckley, N.; Guston, D.H.; Mitchell, R.B. Knowledge systems for sustainable development. PNAS 2003, 100, 8086–8091. [Google Scholar] [CrossRef] [PubMed]
  26. Wilbanks, T. Inducing transformational energy technological change. Energy Econ. 2010, (in press). [Google Scholar]
  27. Merton, R.K. The Sociology of Science: Theoretical and Empirical Investigations; University of Chicago Press: Chicago, IL, USA, 1973. [Google Scholar]
  28. Merton, R.; Barber, B.; Shulman, J.L. The Travels and Adventures of Serendipity: A Study in Sociological Semantics and the Sociology of Science; Princeton University Press: Princeton, NJ, USA, 2004. [Google Scholar]
  29. Koestler, A. The Act of Creation; Penguin Books: London, UK, 1964; p. 752. [Google Scholar]
  30. Parayil, G. Conceptualizing Technological Change: Theoretical and Empirical Explorations; Rowman & Littlefield: Oxford, UK, 1999. [Google Scholar]
  31. Zucker, L.; Darby, M. Star scientists and institutional transformation: Patterns of invention and innovation in the formation of the biotechnology industry. PNAS 1996, 93, 12709–12716. [Google Scholar] [PubMed]
  32. Feyerabend, P. Against Method: Science in a Free Society; New Left Books: London, UK, 1975; p. 279. [Google Scholar]
  33. Polanyi, M. Personal Knowledge: Towards a Post-Critical Philosophy; Routledge & Kegan Paul: London, UK, 1962. [Google Scholar]
  34. Taleb, N.T. The Black Swan: The Impact of the Highly Improbable; Random House: New York, NY, USA, 2007. [Google Scholar]
  35. Taleb’s Homepage. http://www.fooledbyrandomness.com (accessed on 9 April 2010).
  36. Hall, P. Cities in Civilization; Pantheon: New York, NY, USA, 1998. [Google Scholar]
  37. Gates, B. Innovating to Zero. Proceedings of the Technology, Entertainment, and Design (TED) 2010 Conference on What the World Needs Now, Long Beach, CA, USA, 9−13 February 2010; Available online: http://www.ted.com/talks/bill_gates.html (accessed on 4 March 2010).
  38. Gladwell, M. In the air: who says big ideas are rare? New Yorker, 12 May 2008. [Google Scholar]
  39. Benkler, Y. The Wealth of Networks: How Social Production Transforms Markets and Freedom; Yale University Press: New Haven, CT, USA, 2006. [Google Scholar]
  40. Reed, D. The law of the pack. Harvard Bus. Rev. 2001, 2, 23–24. [Google Scholar]
  41. Ruttan, V. The green revolution: seven generalizations. Int. Dev. Rev. 1977, 19, 16–23. [Google Scholar]
  42. Needham, C.A.; Canning, R. Global Disease Eradication: The Race for The Last Child; ASM Press: Washington, DC, USA, 2003; p. 196. [Google Scholar]
  43. Carden, F. Knowledge to Policy: Making the Most of Development Research; Sage: London, UK, 2009. [Google Scholar]
  44. Burgermeister, J. African Town Gets Wind Power and Knowledge. Available online: http://www.renewableenergyworld.com/rea/news/article/2008/02/african-town-gets-wind-power-and-knowledge-51394 (accessed on 22 March 2010).
  45. Serafino, D. Survey of patent pools. KEI Research Notes, 4 June 2007. [Google Scholar]
  46. GNU General Public License; Free Software Foundation: Boston, MA, USA. Available online: http://www.gnu.org/licenses/gpl.html (accessed on 22 March 2010).
  47. Open Source Development Laboratories Launches Open Source Patent Commons Library. 2005. Available online: http://www.patentcommons.org (accessed on 12 December 2009).
  48. Shankland, S. Linux Potentially Infringes 283 Patents. ZDNet News. 2 August 2004. Available online: http://news.zdnet.com/2100-3513_22-137530.html (accessed on 2 August 2009).
  49. Linux Foundation. Patent Commons Project. 2010. Available online: http://www.patentcommons.org/about/terms_of_use.php (accessed on 22 March 2010).
  50. Perens, B. The Open Source Patent Conundrum. CNET News. 31 January 2005. Available online: http://news.cnet.com/The-open-source-patent-conundrum/2010-1071_3-5557340.html (accessed on 6 October 2009).
  51. World Business Council for Sustainable Development. Eco-Patent Commons. Available online: http://www.wbcsd.org/web/epc/ (accessed on 22 March 2010).
  52. Reutsche, E. Open Innovation: Technology Leadership through Collaboration. In Proceedings of the Business Symposium on Open Innovation in Global Networks Conference, Copenhagen, Denmark, 25−26 February 2008.
  53. Stallman, R. On Free Hardware. Available online: http://www.linuxtoday.com/news_story.pho3?Itsn=1999-06-22-005-05-NW-LF-0049 (accessed on 9 April 2010).
  54. Patent Lens Homepage. http://www.patentlens.net/daisy/patentlens/patentlens.html (accessed on 22 March 2010).
  55. Creative Commons Homepage. http://creativecommons.org/ (accessed on 22 March 2010).
  56. Kamkwamba, W.; Mealer, B. The Boy Who Harnessed the Wind: Creating Currents of Electricity and Hope; William Morrow: New York, NY, USA, 2009. [Google Scholar]
  57. Teenager Invents £23 Solar Panel that Could Be Solution to Developing World’s Energy Needs…Made from Human Hair. Daily Mail. 10 September 2009. Available online: http://www.dailymail.co.uk/sciencetech/article-1212005/Teenager-invents-23-solar-panel-solution-developing-worlds-energy-needs-human-hair.html (accessed on 22 March 2010).
  58. Nepal Human Hair Solar Panel Hoax. Available online: http://sites.google.com/site/edwardcraigyatt/hairsolarpaelnepal (accessed on 9 March 2010).
  59. Martin, V.; Pitera, D.; Withers, S.; Newman, J.; Keasling, J. Engineering a mevalonate pathway in Escherichia coli for production of terpenoids. Nat. Biotechnol. 2003, 21, 796–802. [Google Scholar]
  60. Lite, J. What is Artemisinin? Ask the experts. Scientific American, 23 December 2008. [Google Scholar]
  61. Chu, J. A Safe and Simple Arsenic Detector. 2007. Available online: http://www.technologyreview.com/Biotech/18103 (accessed on 9 February 2009).
  62. Registry of Standard Biological Parts Homepage. http://parts.mit.edu (accessed on 22 March 2010).
  63. Open WetWare Homepage. http://openwetware.org (accessed on 22 March 2010).
  64. Joshi, N.; Wang, X.; Montgomery, L.; Elfick, A.; French, C.E. Novel Approaches to Biosensors for Detection of Arsenic in Drinking Water. Desalination 2009, 248, 517–523. [Google Scholar]
  65. iGEM2006. Arsenic Biosensor—The Worst Mass Poisoning the World Has Ever Known. Available online: http://parts2.mit.edu.wiki/index.php/Arsenic_Biosensor (accessed on 22 March 2010).
