Symposium Panel: The Technology Context
Session Two of the J.B. and Maurice C. Shapiro Environmental Law Conference addressed several leading-edge technologies and the challenges of establishing a regulatory framework for their safe use. The session featured four distinguished experts on the implementation of new and innovative environmental technologies. This summary is composed from the author’s notes and may not accurately reflect the statements of the panelists.
Samuel Thernstrom, a senior climate policy advisor at the Clean Air Task Force, began the session by discussing geoengineering research and the challenges of regulating its development. Geoengineering is a deliberate effort to alter the planet’s climate in order to counteract global warming. The two primary forms of geoengineering under research are Solar Radiation Management (SRM) and Carbon Dioxide Removal (CDR). SRM attempts to reflect or block a portion of incoming sunlight in an effort to cool the planet. CDR, on the other hand, is a general term for processes that remove CO2 from the ambient air, otherwise known as atmospheric scrubbing.
Mr. Thernstrom argued that geoengineering should be taken seriously for two reasons. First, there is growing concern that current efforts at mitigation and adaptation will not be sufficient to solve climate change. Second, interest in geoengineering is growing in other nations, making it important for the United States to become involved to ensure that such technology is properly regulated and competently performed. To establish an effective regulatory framework for geoengineering, Mr. Thernstrom believes nations must foster the necessary research and develop an international consensus on geoengineering that is free from the gridlock of political posturing.
The second speaker was Jonathan Gilligan, an Associate Professor at Vanderbilt University and an Assistant Director for Research at the Vanderbilt Climate Change Research Network. Professor Gilligan raised many concerns about geoengineering and its value going forward. He was especially critical of SRM because lowering the average global temperature may have serious consequences for precipitation, climate cycles, ozone levels, and a host of “unknown unknowns.” On the question of a regulatory framework for SRM, Professor Gilligan asked who should control the “thermostat” in lowering the planet’s temperature, who would bear responsibility for maintaining that temperature, and how those adversely affected by SRM could overcome the uncertainties in proving causation. Ultimately, Professor Gilligan is concerned that SRM does not adequately remedy the problems associated with climate change, does not establish uncontroversial criteria for assessing its benefits that could foster coordination, and does not build on any existing technology — shortcomings that may defeat whatever benefit it might offer.
Michael Rodemeyer, founder of the Pew Initiative on Food and Biotechnology and a faculty member of the Science, Technology and Society Program at the University of Virginia, focused his presentation on synthetic biology and the various regulatory challenges it faces. Synthetic biology is the construction of genetic code in the laboratory to build novel organisms that could be used to address many human health concerns. Similar to the concerns raised about geoengineering, synthetic biology poses problems of novelty, complexity, and uncertainty.
Current risk assessment methods may not properly account for these potentially complex and unpredictable outcomes. The challenge for regulators of synthetic biology is therefore to strike a balance between under- and over-regulating. Mr. Rodemeyer believes regulators will have to live with some risk moving forward: scientists cannot prove that something is safe, but can only rule out specific risks. Performing the necessary field tests will inevitably create risk, so the purpose behind a given application can be crucially important in any risk/reward assessment. Beyond asking whether something is safe, Mr. Rodemeyer sees a system that properly monitors risk as the most valuable regulatory tool for synthetic biology going forward.
The last speaker of Session Two was Jerry Johnston of the U.S. EPA Office of Environmental Information. Mr. Johnston discussed data exchange and its role in rapidly addressing problems while shaping public policy. He offered the Chesapeake Bay Program and the government website Data.gov as two examples of readily accessible data being put to beneficial use. As part of the Chesapeake Bay Program, the website ChesapeakeStat was created to give the public data about efforts to restore the Chesapeake Bay, including data on indicators, strategies, and funding dedicated to improving the Bay. Mr. Johnston noted that the website has fostered unanticipated uses of data and has significantly shaped public policy.
Mr. Johnston then turned to Data.gov, a government website that provides public access to datasets generated and held by the federal government. Data.gov improves access to government data and encourages innovation while fostering government transparency and promoting efficiency. Overall, Mr. Johnston believes Data.gov has been effective in building a community and establishing a platform for engagement with government, and that it will be a powerful tool in shaping policy in the future.
In conclusion, these panelists, like those on many of the other panels, emphasized balancing risk and innovation, the importance of publicly available information, and the need for policymakers to be open-minded and flexible.
-Louis Formisano, Associate