
Thursday, September 3, 2015

AAG 2016 CFP: Political Ecologies of Technology

#AAGpoet
Association of American Geographers (AAG) Annual Meeting, 29 March – 2 April 2016, San Francisco


Anthony Levenda | Urban Studies, Portland State University
Dillon Mahmoudi | Urban Studies, Portland State University
Eric Nost | Geography, University of Wisconsin - Madison
Heather Rosenfeld | Geography, University of Wisconsin - Madison
Ashton Wesner | Environmental Science, Policy and Management, University of California - Berkeley

New forms of data production, management, analysis, and use are increasingly coming to bear on realms familiar to political ecologists: conservation, agriculture, resource extraction, the city, and the body. Political ecology (PE) is marked by its attention to situating socio-environmental change materially and discursively through the political economic contexts of concrete “land managers.” This has meant examining the making and application of knowledge about environments (Lave 2012) as well as the social movements and identities that arise around and through nature (e.g. gender; Carney 1993). Urban political ecologists (UPE) in particular have used infrastructure to show that the circulation of power is bound together with nature’s production and circulation (Heynen et al. 2006). As such, political ecologists have long dealt with various technologies - broadly defined, and digital or otherwise (e.g. GIS - Weiner et al. 1995; remote sensing (RS) - Robbins 2001) - often noting how their introduction changes specific landscapes, livelihoods, and access to resources.

With growing interest in and use of digital tools to know and transform nature, a renewed focus on the political ecology of technology is necessary. In environmental governance, for example, new kinds of RS hardware, ecosystem modelling tools, and data visualization programs may both extend and transform existing gender, colonial, capitalist, and other relations through automation, rapid processing, displacement of existing knowledge regimes, and integration of ecosystem data with financial logics (Johnson 2013). Taking technology as a focal point, political ecologists can learn from feminist and critical race scholars who, through studies of the industrial revolution of the home and of DNA sequencing/mapping, have pointed to the ways in which technologies tend to embed and entrench existing social relations and patterns of accumulation (Cowan 1985; Wajcman 1991, 2008; TallBear 2013). However, the conditions under which such technologies are developed and deployed - and the nature of the technologies themselves - remain understudied in the subject areas familiar to political ecologists, raising questions about how to characterize the contexts and actors driving environmental change (Robbins and Bishop 2008; Rocheleau 2008; Braun and Whatmore 2010; Meehan 2013) and socionatural "metabolism" (Heynen et al. 2006).

We believe political ecologists are well-equipped to approach these questions and that PE’s emphasis on contextualization offers unique perspectives to critical technology studies in general, beyond the realms of conservation and development. We invite topical and conceptual papers from political ecologists and others addressing the relationship between software, data, technology, and environment, with a critical lens on how these technologies produce, transmute, and/or subvert social relations along intersecting axes of race, gender, class, ability, and immigration status. Specifically, we seek papers that address the following questions:

  1. What kinds of landscapes are produced by algorithmic decision-making? (Weiner et al. 1995; Robbins 2001)
  2. If we take up Angelo & Wachsmuth's (2014, 2) call for urban political ecology (UPE) to examine "the dimensions of urbanization processes that exceed the confines of the traditional city," how might we examine the UPE of digitally enhanced infrastructures that facilitate flows of nature into the city (and urbanized nature out of it)?
  3. What does it mean to see like a “cyborg state”? (Scott 1998; Haraway 2013) How do new technologies, such as “dashboards”, facilitate abstraction, quantification, and representation, changing state strategies for conceptualizing and acting on socio-ecological complexity and enabling control and discipline?
  4. In turn, how do land managers use social media, new kinds of RS, and other technologies to resist this control, or to intervene in conservation politics more generally? (Büscher 2013)
  5. In what ways does "nature" - in various guises - appear in software engineering and software architecture to serve as a guiding metaphor (e.g. software "ecosystem" or data “mining”)? In turn, what are the "ideologies of nature" (Smith 2008 [1984]) that arise from and alongside specific technologies?
  6. What monsters, creatures, and other figures can help us critically approach and better understand uneven human-technology-environment relations?
We wish to address these questions while opening up fruitful discussion around historical context and social relations in PE. Ultimately, with PE's focus on social and environmental justice and equity, we ask, alongside Robbins and Moore (2015): under what conditions does technology embody, enact, and enable more abundant socio-ecological futures (Collard et al. 2015)? Under what conditions does it perpetuate violence?

Those who would like to participate in the session should contact nost@wisc.edu by October 22 with a brief statement of interest or an abstract. Session participants will need to submit an abstract and register for the conference by October 29.


References


Braun, Bruce and Sarah Whatmore, eds. 2010. Political Matter: Technoscience, Democracy and Public Life. Minneapolis: University of Minnesota Press, 319pp.


Büscher, B. 2013. Nature 2.0. Geoforum 44: 1-3.


Carney, Judith A. 1993. Converting the Wetlands, Engendering the Environment: The Intersection of Gender with Agrarian Change in Gambia. Economic Geography 69(4): 329-348.


Collard, R-C., J. Dempsey, J. Sundberg. 2015. A manifesto for abundant futures. Annals of the Association of American Geographers 105(2): 322-330.


Haraway, Donna J. 2013. Simians, Cyborgs, and Women: The Reinvention of Nature. Routledge.


Heynen, N., M. Kaika, and E. Swyngedouw. 2006. Urban political ecology: politicizing the production of urban natures. In Heynen, N., M. Kaika, and E. Swyngedouw (eds.), In the Nature of Cities: Urban Political Ecology and the Politics of Urban Metabolism. New York: Routledge, pp. 1-19.


Johnson, L. 2013. Catastrophe bonds and financial risk: Securing capital and rule through contingency. Geoforum 45: 30-40.


Lave, R. 2012. Bridging Political Ecology and STS: A Field Analysis of the Rosgen Wars. Annals of the Association of American Geographers 102 (2):366-382.


Meehan, K. 2013. Tool-power: Water infrastructure as wellsprings of state power. Geoforum 57 (1): 215-224.


Robbins, P. 2001. Fixed Categories in a Portable Landscape: The Causes and Consequences of Land-Cover Categorization. Environment and Planning A 33 (1): 161–80.


Robbins, P. and K. Bishop. 2008. There and back again: Epiphany, disillusionment, and rediscovery in political ecology. Geoforum 39: 747-755.


Robbins, P. and S. Moore. 2015. Love Your Symptoms: A Sympathetic Diagnosis of the Ecomodernist Manifesto. http://entitleblog.org/2015/06/19/love-your-symptoms-a-sympathetic-diagnosis-of-the-ecomodernist-manifesto/. Accessed July 24, 2015.


Rocheleau, D.E. 2008. Political ecology in the key of policy: from chains of explanation to webs of relation. Geoforum 39: 716-727.


Scott, James C. 1998. Seeing Like a State: How Certain Schemes to Improve the Human Condition Have Failed. Yale University Press.


Smith, N. 2008. Uneven Development: Nature, Capital, and the Production of Space, 2nd edition. Athens, GA: University of Georgia Press.


TallBear, Kim. 2013. Genomic Articulations of Indigeneity. Social Studies of Science 43(4): 509-534.

Wajcman, Judy. 1991. Feminism Confronts Technology. Penn State Press.

Weiner, D., T.A. Warner, T.M. Harris, and R.M. Levin. 1995. Apartheid representations in a digital landscape: GIS, remote sensing and local knowledge in Kiepersol, South Africa. Cartography and Geographic Information Systems 22(1): 30-44.

Sunday, August 23, 2015

Katrina 10 roundup

I've compiled links to articles - everything from brief news stories to longform pieces - surrounding the ten-year anniversary of Hurricane Katrina. I've also tried separating them into categories or genres. This is obviously a work in progress and I'll update it as the week leading up to the actual anniversary continues, and afterwards. What've I missed?

last updated: 8/25/15

Memoir/personal reflection
"Essay: Claire Z. Cardona, a ‘Katrina Kid,’ on weathering ‘The Big One’" Dallas News
http://www.dallasnews.com/news/local-news/20150822-essay-claire-z.-cardona-a-katrina-kid-on-weathering-the-big-one.ece

"NASA vs Nature, " ArsTechnica
arstechnica.com/science/2015/08/nasa-versus-nature-august-29-2005/

"I worked for the governor of Louisiana during Katrina. Here are 5 things I learned." Vox
http://www.vox.com/2015/8/20/9176225/hurricane-katrina-government

"Ten years after," The New Yorker
http://www.newyorker.com/magazine/2015/08/24/ten-years-after-comic-strip-ronald-wimberly?mbid=social_twitter

"New Orleans' beautiful complexity was the one thing Katrina didn't wash away," The Guardian
http://www.theguardian.com/commentisfree/2015/aug/23/new-orleans-hurricane-katrina-funeral-best-party?CMP=twt_gu

"How One of Katrina’s Feel-Good Stories Turned Bad," Buzzfeed
http://www.buzzfeed.com/petermoskowitz/how-one-of-katrinas-feel-good-stories-turned-bad#.bfpMJEobb

Coastal preparedness
"Since Katrina: NASA Advances Storm Models, Science," NASA.com
http://www.nasa.gov/feature/goddard/since-katrina-nasa-advances-storm-models-science

"New Orleans area's upgraded levees not enough for next 'Katrina,' engineers say," NOLA.com
http://www.nola.com/futureofneworleans/2015/08/new_levees_inadequate_for_next.html

"Policy: Hurricane Katrina’s lessons for the world," Nature
http://www.nature.com/news/policy-hurricane-katrina-s-lessons-for-the-world-1.18188?WT.mc_id=TWT_NatureNews

"Offshore oil and gas industry adapts, but risks remain 10 years after Katrina," NOLA.com
http://www.nola.com/katrina/index.ssf/2015/08/energy_industry_adapts_but_sto.html?utm_source=dlvr.it&utm_medium=twitter

“How to Save a Sinking Coast? Katrina Created a Laboratory,” New York Times
http://www.nytimes.com/2015/08/08/science/louisiana-10-years-after-hurricane-katrina.html

"Protecting a New Generation of Poisoned Kids After Katrina," National Geographic
http://news.nationalgeographic.com/2015/08/150819-new-orleans-katrina-lead-poisoning-hurricane-children-environment-health-pollution/?utm_source=Twitter&utm_medium=Social&utm_content=link_tw20150819news-katrinalead&utm_campaign=Content&sf12147598=1

"10 Years Later, Did We Learn Anything from Hurricane Katrina?" Forbes

http://www.forbes.com/sites/marshallshepherd/2015/08/14/10-years-later-did-we-learn-anything-from-hurricane-katrina/

"From the archives"
"And Still They Rise: Confronting Katrina," Edge of Sports
http://www.edgeofsports.com/2007-08-29-280/index.html

"The Superdome: Monument to a Rotten System," Edge of Sports
http://www.edgeofsports.com/2005-09-06-152/

Photography
"THEN AND NOW: See how New Orleans has bounced back 10 years after Hurricane Katrina," NY Daily News
http://www.nydailynews.com/news/national/new-orleans-10-years-hurricane-katrina-article-1.2334354

"How Katrina changed us: 9 essays of loss, perseverance and rebirth," NOLA.com
http://www.nola.com/katrina/index.ssf/2015/08/katrina_-_how_it_changed_us_9.html?utm_source=dlvr.it&utm_medium=twitter

Cultural change, policy change, and gentrification
"Remembering Katrina in the #BlackLivesMatter Movement," Medium
https://medium.com/@amprog/remembering-katrina-in-the-blacklivesmatter-movement-dd91367aa9d4


"Hurricane Katrina proved that if black lives matter, so must climate justice," The Guardian
http://www.theguardian.com/us-news/commentisfree/2015/aug/24/hurricane-katrina-black-lives-matter-climate-justice?CMP=share_btn_tw

"Hurricane Katrina, ten years later: When the investor class goes marching in," UMN Press Blog
http://www.uminnpressblog.com/2015/08/katrina-10-years-later-when-investor.html

"New Orleans, the Reluctant 'City Laboratory'" CityLab
http://www.citylab.com/design/2015/08/new-orleans-the-reluctant-city-laboratory/401144/?utm_source=SFTwitter

"“It’s not just a party, it’s our life”: Jazz musicians led the way back to the city after Katrina — but what is this “new” New Orleans?" Salon
http://www.salon.com/2015/08/23/its_not_just_a_party_its_our_life_jazz_musicians_led_the_way_back_to_the_city_after_katrina_but_what_is_this_new_new_orleans/?utm_source=twitter&utm_medium=socialflow

"Who Killed Public Housing in New Orleans?" The Nation
http://www.thenation.com/article/requiem-bricks/

In fact, check out this week's entire issue of The Nation:
http://www.thenation.com/issue/august-31-september-7-2015/

"10 New Orleanians on How Katrina Changed Their City" Next City
https://nextcity.org/features/view/new-orleans-hurricane-katrina-local-stories-housing-gentrification-race#calvinjohnson

"'Death of My Career' What happened to New Orleans' veteran black teachers" Ed Week
http://neworleans.edweek.org/veteran-black-female-teachers-fired/

"“Reform” makes broken New Orleans schools worse: Race, charters, testing and the real story of education after Katrina," Salon
http://www.salon.com/2015/08/03/reform_makes_broken_new_orleans_schools_worse_race_charters_testing_and_the_real_story_of_education_after_katrina/

"10 years after Katrina, a look at Obama’s promises to rebuild and protect New Orleans," The Lens
http://thelensnola.org/2015/08/20/10-years-after-katrina-a-look-at-obamas-promises-to-rebuild-and-protect-new-orleans/

"A Housing Crisis Amid Tens of Thousands of Abandoned Homes," The Atlantic
http://www.theatlantic.com/business/archive/2015/08/new-orleans-blight-hurricane-katrina/401843/

"Starting Over, " The New Yorker
http://www.newyorker.com/magazine/2015/08/24/starting-over-dept-of-social-studies-malcolm-gladwell

Monday, March 9, 2015

The limits and value of big data: towards a political ecology approach

It's been quite a while since I've posted here. I've been working a bunch with UW-Madison's Center for Culture, History, and Environment on our new site, Edge Effects, including writing a post on the use of site selection models in conservation and now serving as an editor. Check it out!

In fact, I've been thinking more and more about the use of tools like site selection models in the practice of conservation: what they do, the political economy behind their production, and how people use them.


And so I've been looking more and more into how "big data" is coming to bear on the environment. I was scheduled to give the paper below at the recent Dimensions of Political Ecology conference at the University of Kentucky, but my session was already jam-packed! So I'm posting it here instead. I make several arguments in the paper: 1) though much of what we hear about big data comes from the realms of healthcare, academic research, and corporate finance, new kinds of data analytics are indeed coming to bear on familiar territories for political ecologists: conservation, agriculture, resource extraction, and the body; 2) political ecologists are well-equipped to tackle big data; 3) political ecology may in fact offer a unique perspective to critical data studies in general, beyond the realm of conservation and development. These arguments are somewhat superficial here, and the paper is not fully referenced. It is a first draft and I appreciate any feedback you may have:


----

My research focuses on the tools of environmental governance. I've looked at ecosystem assessment methodologies that produce viable markets in nature; my dissertation is looking at the political economy of new visualization tools and models, and data collection techniques being brought to bear on coastal land loss and marsh restoration in Louisiana. I'm asking: how are these tools produced and utilized, and with what effects, in the service of adaptive management and the valuation of ecosystem services? So when I saw this John Deere promotional video, I couldn't help but be intrigued. [It's a little slow-moving – you might fast forward to the 3 minute mark].




Promoters of these kinds of "data-driven solutions" to agriculture, business, health, government, and conservation suggest that a frontier of existing and constantly-coming-online information – big data – and the new tools and techniques associated with compiling and making sense of it can fundamentally change how individuals and institutions go about decision-making. Political ecology (PE) has always concerned itself with the question of how land managers make decisions, with a critical eye towards contextualizing these decisions within broader political and economic processes and forces. So what, if anything, does PE have to say about this particular situation?


PE, I argue, has the conceptual tools and interests to engage with these kinds of situations. PEists should be engaging with them – in fact, given the field's long-standing emphasis on questioning the scalar and political premises of knowledge's production, circulation, and application, it is uniquely positioned to do so, in ways that would add to the conversation around new forms and objects of data analysis as a whole. Not only should it, but in many ways it must, if it wants to continue playing its critical role as a "hatchet" for what science, state, and capital expect from those working the land.


In this paper, I spell out what big data is, briefly illustrate three different ways data is being taken up in environmental governance, and ultimately suggest what PEists can do with big data – what we can ask, what we can show, and how we might practice it ourselves.



"The Body as a Source of Big Data." http://ihealthtran.com/wordpress/2013/03/infographic-friday-the-body-as-a-source-of-big-data/
What is big data?
Even just spelling out what big data is marks an improvement upon most accounts, which grant it an unearned mythical status:
Simply put, because of big data, managers can measure, and hence know, radically more about their businesses, and directly translate that knowledge into improved decision making and performance. Harvard Business Review
It's crucial yet difficult to pick apart what we mean by big data. It has a certain parallax – that is, it's an object that looks different when we look at it from different vantage points. At one level we might say big data necessarily comprises an object, a method or practice, and a tool; at another level it's: data, new tools for collecting it, and new tools for analyzing and making sense of it. Let's look at each in turn.

Object
Colloquially, there's a sense that big data is simply datasets that Excel can't deal with. More precisely, this means data where there are more columns than rows – that is, as many or more attributes of events and things than those events/things themselves (National Academy of Sciences, 2013). But big data does not just mean data that is big, much less data that is numerical (spreadsheet-ready or not). A more formal definition might include not just voluminous, but: disparate (that is, coming from various sources – different sensors, let's say, but also different texts), heterogeneous (of different data types – text (e.g. police records) vs. spatial), and uncertain (of varying quality, accuracy, and so on). For some observers and promoters, this translates to: "volume, velocity, and variety."
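To make the "more columns than rows" point concrete, here's a toy sketch (invented sensor readings; pandas and numpy assumed available):

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)

# Five observed events (rows), each described by forty attributes (columns):
# "wide" data in the sense above, with more attributes per event than events.
events = pd.DataFrame(
    rng.normal(size=(5, 40)),
    columns=[f"sensor_{i}" for i in range(40)],
)
print(events.shape)  # (5, 40): more columns than rows
```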
"Big Data? Volume, Velocity, Variety." Some also add a fourth v: value (see below!) http://www.wired.com/2013/06/is-big-data-in-the-trough-of-disillusionment/

Method
Whatever the data itself looks like, it won’t speak for itself. There’s a practice to big data, and there are two important moments to consider here.

Data collection

First, big data is marked by the use of new technologies that can record more and more observations, more and more rapidly, and perhaps above all, more and more remotely – that is, at some (physical, social, computational) distance from where, who, and what is in fact sifting through the data. Some of the foremost big data stories we hear about fall into one – or often more – of three kinds of remote governance. 1) Enhanced remote sensing (RS). Political ecologists have regularly both employed and critiqued RS (Turner 2003), but: a) this is in many ways RS on steroids – one company, Skybox, is interested not simply in capturing land cover, but in extremely fine details of land use (such as the number of cars parked at a mall on a given day); b) one of the interesting new developments we're seeing is the integration of remotely sensed data and user-provided data. For instance, satellites are capturing spectral data in west Africa, which algorithms then parse into general land cover categories. Managers of a program out of Cornell then provide incentives to local pastoralists to actually go out and confirm, or "ground truth," the cover type. As my friend Patrick Bigger put it, "It's like Uber surge pricing for environmental surveillance."
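A rough sketch of that feedback loop – classify remotely, then pay people on the ground to confirm where the classifier is least certain. All names, thresholds, and payments here are invented for illustration; this is not the Cornell program's actual design:

```python
# Hypothetical satellite-classified land cover cells, with the classifier's
# confidence in each label.
pixels = [
    {"id": 1, "label": "grassland", "confidence": 0.95},
    {"id": 2, "label": "shrubland", "confidence": 0.55},
    {"id": 3, "label": "bare soil", "confidence": 0.40},
]

BASE_PAYMENT = 2.00  # assumed payout (dollars) per confirmed cell

tasks = []
for p in pixels:
    if p["confidence"] < 0.7:  # algorithm unsure: ask a human on the ground
        # The "surge pricing" move: offer more where certainty is lower.
        offer = BASE_PAYMENT * (1 + (0.7 - p["confidence"]))
        tasks.append({"cell": p["id"], "offer": round(offer, 2)})

print(tasks)  # cells 2 and 3 get queued for pastoralists to ground-truth
```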




This leads us into the second kind of remote governance: 2) self-reported data. Data collected unconsciously (or sometimes consciously) – with permission or without it – from location-based apps or perhaps from website activity (your "clickstream"). This kind of data collection ultimately raises important fears about privacy. But those in the "quantified self" movement embrace collecting as much data as they can about themselves, in the hopes of optimizing, say, their health or even their investment strategies. 3) Finally, an emerging connectivity of devices – from smart phones to servers to satellites – gathering, transmitting, and analyzing data at a distance has some tech leaders envisioning a so-called "internet of things," an objectification of everything – literally everything, from trees to buildings (see video below) – into governable sources of data. California has set up water level sensors to give managers real-time feedback on lake and reservoir levels, and the USFS is experimenting with real-time monitoring of all sorts of different forest measures.



Data analysis
The promise of new sources of information means little, however, without the conceptual and practical apparatuses that allow the data to be understood (as some might put it, for the data to become information or knowledge). I'll name a couple of the key maneuvers "data scientists" make here. I focus on the practice of analysis, but will note here that analysis is becoming a commodity (what Wilson (2014) calls "Analytics TM"), bringing with it the promise that the world is "infinitely" analyzable.

Making the earth suitable for analysis requires first indexing it. For one company, the goal is: "To index the earth the way Google indexes the Internet" – that is, from each particular observation of the earth's surface, to discretize certain phenomena and values (e.g. extent of deforestation or, as above, number of cars in a parking lot) and then associate these objects with the observation, allowing for easy aggregation, statistical manipulation, and retrieval.

Next, for many, the volume of data suggests that traditional tenets of statistical representation can be set aside. When n=N – when the sample of data is in fact the population, when we are able to collect and index all crime data for a city – we do not need to extrapolate, a process which introduces uncertainty into prediction and into the analysis of new information (Mayer-Schönberger and Cukier 2013). Instead, simple yet surprising and powerful correlations and trends – for instance, that Wal-Mart's sales of strawberry pop-tarts skyrocket before hurricanes – are enough: enough for Wal-Mart to keep the shelves stocked, but probably not to ask why strawberry pop-tarts go so quickly. This has led some – Chris Anderson, famously – to declare "the end of theory." Data need no explanation – the trends speak for themselves, allowing for prediction of future events (such as what you'll order next on Amazon).

Data scientists' final move has been to develop more and better algorithms by which these correlations and predictions can be made. This involves "machine learning" – the recursive tweaking of equations to "fit," or properly explain, the data so that predictions can be made. There are generally two kinds of algorithms at play here, which will be familiar to those working with RS: supervised and unsupervised. (It's also worth noting that even though there are much more advanced algorithms in use (e.g. neural networks), these are the basics, and they have their roots in many of the tools social scientists might use: PCA, linear regression, and so on.) I'm going to skip over an explanation of the differences between these, but you might get the impression that "unsupervised algorithms" are artificial intelligence come to life. While many big data proponents suggest that the "data speak for themselves" – and that's what the phrase "machine learning" suggests, that machines alone can discover and interpret patterns – data managers will be the first to note that even unsupervised algorithms require active intervention and subjective choice: clustering, an important unsupervised algorithm, involves an initial choice on the part of the analyst as to how many clusters to look for; clusters must also be meaningfully interpreted.
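To make that clustering caveat concrete, here is a minimal sketch using scikit-learn's k-means implementation (toy data; the point is only that the analyst, not the machine, chooses how many clusters to look for):

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# Fake two-band "spectral" observations drawn from three latent groups the
# analyst does not know about in advance.
data = np.vstack([
    rng.normal(loc=center, scale=0.5, size=(100, 2))
    for center in [(0, 0), (5, 5), (0, 5)]
])

for k in (2, 3, 6):  # the subjective choice: how many clusters?
    model = KMeans(n_clusters=k, n_init=10, random_state=0).fit(data)
    print(f"k={k}: inertia={model.inertia_:.1f}")
    # Each k partitions the same "landscape" differently; nothing in the
    # algorithm says which partition is "true" -- that's an interpretive call.
```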
Ayasdi's analysis of key basketball "positions" as illuminated by the visual topology of the data. http://www.ayasdi.com/wp-content/uploads/_downloads/Redefining_Basketball_Through_Topological_Data_Analysis.pdf

Tool

Regardless of whatever fanciful things algorithms can do to "reveal" the pattern of a dataset and make predictions, these results need some translation out of R or whatever analysis software package they've been developed in. While proponents suggest data speak for themselves, it might be more accurate to claim that data visualize themselves. That data patterns and results can and must be seen is a tenet of data science, meaning that whatever goes on in the analysis should be translated into some visual medium, all the better to communicate with decision-makers. And so we see, alongside new analytics, new tools: for instance, "dashboards" and "decision support tools" that collate data and results, providing decision-makers "levers" to pull to move forward.


Palantir's tool for "adaptively managing" salinity levels in the San Joaquin Delta. https://www.palantir.com/2012/09/adaptive-management-and-the-analysis-of-californias-water-resources/
What does big data have to do with ecology?
Increasingly, conservationists, other land managers, and powerful actors in environmental governance (financiers, policy-makers, etc.) are adopting these tools. Conservation has a long history of using technology for management, all the way from mandating best available pollution control technologies to RS and GIS to big data today. For many conservationists, the claim is explicitly that new technologies "can help save the planet." This is an important claim to investigate on several counts: 1) the word help is important, but it is always vaguely defined. To what extent is this a proposal that ecological degradation simply needs a technical fix (as opposed to addressing the root social causes)? 2) the claim embeds a particular scalar argument (save the planet, not just certain ecosystems). Political ecology has always been concerned with both of these kinds of questions. I offer three very short vignettes simply to illustrate what happens when big data and conservation meet, and to point to areas of research where political ecologists should be working.
"Finally, Data You Can Actually Use." Climate Corporation. http://www.climatepro2015.com/

Smart farming
Smart farming is everything in the John Deere promo linked above. Smart farming advocates aim to integrate field, equipment, and meteorological data to support precision planting and optimize crop yields. The idea is that this data – made accessible through interfaces and dashboard tools – can help farmers limit unnecessary applications of fertilizer (through near real-time, sub-meter access to satellites able to pick up near-infrared light, allowing insight into nutrient deficiencies and growth rates) and plant the right kinds of crops in the right places given current climatological and soil conditions. A number of different firms have set themselves up in this market, including Skybox, which bills its recently launched private satellites as providing "Big data. From space," and Climate Corporation, which collects public weather data, translating it onto a private platform to sell to farmers (and was recently acquired by Monsanto).
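The satellite piece of this trades on a well-established index: healthy vegetation reflects near-infrared light strongly and absorbs red light, so a simple normalized ratio (NDVI) flags stressed crops. A minimal sketch with made-up reflectance values, not actual John Deere or Skybox data:

```python
import numpy as np

# Toy reflectance rasters for one field; real values would come from the
# red and near-infrared bands of satellite imagery.
red = np.array([[0.10, 0.30], [0.12, 0.28]])
nir = np.array([[0.60, 0.35], [0.55, 0.30]])

# NDVI = (NIR - Red) / (NIR + Red), ranging from -1 to 1.
ndvi = (nir - red) / (nir + red)
print(np.round(ndvi, 2))  # values near 0 suggest stressed or sparse vegetation
```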

Valuing nature

While smart farming aims to increase the value of agricultural production by conserving inputs, conservation organizations themselves are aiming to use big data to value ecosystems in their own right. They have set up a couple of cases using social media data to reveal the otherwise unknown or, by default, zero economic value of nature. In California, The Nature Conservancy uses user-generated data from the eBird app to identify where migratory birds are during particularly important intervals and then pays rice farmers in key gaps along the birds' routes to keep their fields flooded for longer, preserving habitat that can be hard to come by in the context of the state's long drought. These mini-wetlands are called "pop-up habitats," and TNC also talks about the program in terms much like smart farming, calling it "precision conservation." Elsewhere, researchers from the Natural Capital Project – a coalition of academic ecologists and conservation non-profits – have used Flickr data to estimate the recreational values provided by different ecosystems, as a way of making clear to policy-makers the job- and revenue-creating aspects of nature. Instead of surveying tourists to ask how often they visited a national park, the researchers used Flickr data as a proxy for visitation, and to guess at how environmental degradation at these parks might change recreational attendance.
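The Flickr method bottoms out in counting: geotagged photos stand in for visits. A rough sketch of the proxy logic, with hypothetical records (the Natural Capital Project's actual models are more involved):

```python
from collections import defaultdict

# Hypothetical geotagged photo records: (photographer_id, site, date).
photos = [
    ("u1", "Marsh NWR", "2014-06-01"),
    ("u1", "Marsh NWR", "2014-06-01"),  # same person, same day: one "visit"
    ("u2", "Marsh NWR", "2014-06-03"),
    ("u2", "Dune SP",   "2014-07-12"),
]

# "Photo-user-days": unique (person, day) pairs per site, a common proxy
# for visitation in this literature.
visits = defaultdict(set)
for user, site, day in photos:
    visits[site].add((user, day))

for site, days in visits.items():
    print(site, len(days))  # Marsh NWR -> 2, Dune SP -> 1
```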

Hedging bets
Finally, some data analytic firms are pushing the boundaries of new data forms in their use for valuing nature. These firms write algorithms that sift through satellite imagery and other economic data to make sense of and correlate environmental and economic change, then package this information and sell it to hedge funds in need of investment advice. For instance, Skybox says that it can monitor, say, gold production from mines across the globe, giving investors in real time a literal bird's eye view of this facet of the economy. Without having to wait for Newmont Mining's quarterly statement, investors can, with the help of Skybox's algorithms, more or less spy on the mining giant's operations to get a feel for whether output is on the rise. The same goes for logging in tropical rain forests, oil and gas wells, and, again, crop condition. In the case of forest monitoring, Skybox suggests that this data can be useful not only for investors but for activists as well. They'd like to sell their analytics to both sides.

Questions to ask and possible perspectives
1. Material effects
These three vignettes, brief as they are, raise a number of questions and suggest several general lines of inquiry. First and foremost, we need to better understand what sorts of material impacts are detectable and attributable to big data in practice. What difference does big data analysis make in the transformation of certain landscapes, in comparison to earlier modes of governance? Might it increase the speed at which change occurs (a la pop-up habitats) or improve conservation outcomes by illuminating unseen trends? What is the environmental effect of, for instance, Palantir's tool for managing Delta water salinity? Are water levels actually optimized? What and who do the landscapes reflect? What, in ecological terms, is a "pop-up habitat"? We could also better understand and communicate the environmental impact of server farms: the greenhouse gas emissions stemming from work and life now situated in "cloud" space, the SO2 emissions from diesel generators, and the water usage to cool down servers.


"Data is the new oil" http://dismagazine.com/issues/73298/sara-m-watson-metaphors-of-big-data/
2. Data as waste and value
Political ecologists focused on the intersection of cultural and environmental politics have fruitfully been investigating questions about waste and value, noting that capitalism relies on discursively translating bodies and landscapes as alternately waste and value (Moore 2013; Goldstein 2013; Gidwani and Reddy 2011). What is previously deemed waste(land) can be enclosed and incorporated as value, and valuable labor can be laid to waste (only to be speculated upon as someday bearing value again). In many ways this is exactly how data managers talk about big data – as a kind of hoard, a resource to be mined, something that, in a kind of parallax view, splits the difference between waste and value [I'm grateful to Mohammed Rafi Arefin for making this point clear to me]. They believe that there is value in the data waiting to be realized, just as oil is waiting in the ground, ready to be extracted, refined, and transported to realize its value. This may help us understand data privacy issues in a new way – the unsolicited collection and analysis of data may be a sort of enclosure of otherwise waste information. Big data also more directly promises to be (environmental) economists' holy grail: they have long sought ways to understand how people value things whose value is not revealed by markets. With big data capturing all sorts of "traces" we leave behind as we click around the Internet, the idea is that these data can serve as proxies revealing preferences. When there exists a dataset like Flickr, the story goes, we have a much better sense of just how much people value wildlife.

No matter what, we should not accept some managers' techno-optimism without reservation. Like hoarders, analysts fear being overwhelmed by their data. The actual work of having to sort through the data mess just as easily gives the analyst a sense of dread as it inspires hopeful visions of data-driven decision-making. Acknowledging data mining's limitations, managers vividly describe the problem of having to sort through too much stuff to get to the valuable parts as one of powerlessness. "Without a clear framework for big data governance and use," one consultant writes, "businesses run the risk of becoming paralyzed under an unorganized jumble of data." For one manager reflecting on a massive dataset, "the thought of wading through all THAT to find it stops you dead in your tracks." Another similarly evoked the feeling of being stranded at sea: "There's a lot of water in the ocean, too, but you can't drink it." Continuing the drowning theme, others simply acknowledge that in big data analysis there is a risk that the analyst will be overwhelmed by inconsequential results: "This set of patterns [in a big dataset] often contains lots of uninteresting patterns that risk overwhelming the data miner." Indeed, data miners are anxious about not finding the true or most interesting pattern, and instead finding only what they want: "Data is so unstructured and there's so much out there, you are going to find any pattern you want … whether it's true or fake." They fear finding themselves in the data. This is analogous to the "ecological anxiety disorder" (Robbins and Moore 2013) in which ecologists find themselves, worrying either that their concepts are too normative or that humans are inevitably a negative influence in ecosystems. We should leverage this concern. Just as political ecologists might ask what climate change adaptation programs in the developing world expect of their target audiences, we can ask: what do new digital technologies for conservation, farming, and environmental governance expect of their users?



Drowning in big data. Intel. http://www.intel.com/content/www/us/en/big-data/big-data-101-animation.html
I think in many cases we'll find that these tools expect their users to be rational actors who predictably respond when provided a particular set of informational inputs. That is, the conceit of these tools is to ignore the political economic contexts which shape the extent to which users will even be able to acknowledge, decipher, and act upon them (as well as the political economic contexts of their making). There are good reasons, political ecologists know, that a US farmer – especially a small-scale one, but think also of the average age of farmers and the fact that most work off-farm jobs – will be unable to perform the sort of decision-making John Deere presents. Yet it is not just these farmer-centric limits to analysis that will be key to recognize – we can think also about the political economic structure of data collection and analysis today: its centralization (like GMO seeds, farmers may or may not own their data) and monopolization (Monsanto recently bought out Climate Corporation), and the limits to "data-driven" action these introduce. Data and new tech are black boxes for farmers not just because they are technically overwhelming and once removed from farmers, but because they are socially boxed off as well.

3. Visualizing like a state
More specifically, with the increasing emphasis on data-driven solutions to everything, we should expect to see “analysis paralysis”, or situations in which too many options and too much information are presented to decision-makers for them to be able to take any coherent action. This may be particularly true for policy-makers, the kind targeted by tools like Palantir’s salinity monitor:
…our analysis tools “can give policy makers maximum insight into the relationships between the variables that affect the Delta’s health and allow them to make decisions that appropriately weigh the interests of all parties involved.” Palantir
What exactly is "maximum insight"? What does it mean for tools to objectively weigh interests? There are many reasons that the kind of decision-making subject envisioned here doesn't and won't actually exist. Many actually existing policy-makers do not even know what kinds of tools are technologically possible – they don't know what to ask for from programmers. The ones I've been talking to tell me things like "someone told me I wanted a dashboard" – in other words, they do not actually know they need the kind of easy-to-use tools Palantir is offering them, but are instead convinced by the discourse that tells them they do need such tools. And even if decision-makers do seek out dashboards, what they're often asking for is the "easy button," the button that gives them an answer spelling out what to do. Yet many programmers will reply that that's something they can't give them. It's not their place to give the one answer, but to provide only maximum insight and present all interests, letting the decision-maker do the rest and pinning accountability on them. Or merely "suggesting" a change in the type of seed to plant, like the tool in the John Deere video. In other words, tools lure decision-makers with the promise of an easier, more defensible basis for their choices, while at the same time deflecting as much of that responsibility as possible back towards them.

4. Ideologies of Nature
Finally – and I'm grateful to Dan Cockayne for pushing me on this – we should ask how big data and analysis rely upon and change ideologies of nature. As Neil Smith (1984) argued, not only does capital physically produce nature, but it produces ideologies of nature – bourgeois notions of capital, race, gender, and so on as natural. How does TNC's creation of "pop-up habitats" change how we think about nature? In what ways might we start to think about environmental protection in terms of "surge demand" and "precision" and understand ecosystems as "real-time" or "pop-up" events? What about the reverse – when instead of talking about nature through data, we start to conceptualize data using our existing languages for talking about nature? What is the effect of describing computation as happening in an "environment," or, more perniciously, of naming data as a resource – an object like oil, coal, or gas to be mined and extracted?

Conclusions
What can PE in particular say to the larger conversation surrounding big data?  Below, I name three points political ecology can make in this discussion, three points that summarize the questions and perspectives sketched above. I end with an argument for a political ecological practice of big data.

I believe political ecology is equipped not only to adequately analyze the kinds of situations I sketched above, but, through these sorts of "knowledge politics" critiques, to usefully inform critical data studies. Much in the way of reflexivity on big data concerns its implications for privacy, its heightening of the digital divide, and its use as an academic research tool. Much as political ecologists know the problematic epistemologies of remote sensing, these reports usefully point out that social media data in particular cannot provide the kinds of information or insights many analysts want it to – much data isn't geotagged, and even the images or tweets that are geotagged do not necessarily mean the person can be adequately related to that location (boyd and Crawford 2012; Crampton et al. 2013; Wilson 2014). But we should not stop at these critiques of big data as our own method for understanding the world (nor stop at crucial questions of what work big data does as a governance "meme" – Graham and Shelton 2013) – we must extend them to the particular situations in which big data becomes an applied analysis tool regardless of its foundational epistemological flaws. In terms political ecologists are familiar with: we can illustrate not just data's moment of production, but its moments of circulation and application. Indeed, Crampton et al. (2013) call for a kind of contextualization or "situating" of big data, as a research tool if not a means of decision-making. This kind of intimate grounding through case studies is what political ecologists excel at.


We must articulate that big data is produced, and we must show how and why. Not content to merely illustrate that new technologies constrain or afford certain actions by governance actors or land managers, we should provide "chains of explanation" beyond our research contexts back to larger forces at play. We can provide crucial inroads into questions like: what's the political economy that allows a firm like Skybox to index the earth? Who's producing these tools? How do private firms rely on and generate value out of public data? How does big data gain "value" for firms or farms? What is it that allows data to name value in the world, as TNC and the Natural Capital Project believe it does?


Likewise, big data is something practiced by people, in spite of the mystification of algorithms as wholly autonomous entities. Political ecology has a long-standing concern with understanding the concrete, embedded decision-making and social situatedness of land managers – as opposed to discursively idealized subjects (like rational actors) or, to use one of Piers Blaikie's images, bureaucrats in airplanes (designing a population stabilization program based on abstract concepts and figures the like of which, Blaikie wryly notes, are not in the minds of two lovers as they lie down to bed). This concern should lead us to consider data managers as very much the kind of land managers Blaikie started from and focused on in his Political Economy of Soil Erosion: "actual people making decisions on how to use land." I mean this in two ways: 1) those making decisions that have concrete bearing on the treatment of particular ecologies, as when TNC's data analysts, with the help of their algorithms, decide where to extend migratory bird habitat; 2) I also mean it in the sense that those we traditionally consider land managers – farmers, pastoralists, peasants – are fast becoming data managers themselves, be it as "smart" farmers or as pastoralists incentivized to go out and ground truth RS imagery.


Finally, we must understand the actual effects and outcomes of new techniques of data management. While it is crucial to recognize how data potentially connects domains in novel ways – i.e. when Skybox syncs ecological data with financial data – we should try not to reproduce the discourse that "algorithms will rule our world." If we do, we miss key resistances and ambiguities. We should remember that Skybox supports activists who feel that the technology can offer more visibility into illegal extraction activities, be it in Amazonia or Appalachia. We should be reminded to look for these fractures and to try to widen them, to realize that we can engage in a "politics of measure" that either questions the very measurability of things (Mann 2007; Robertson and Wainwright 2014) or asserts the strategic utility of doing so (Wyly 2007; Cooper 2014).


Which leads to the last big question to pose here: do we as political ecologists employ big data ourselves in support of our "hatchet and seed" mission? Or is data just the object of our critique, along the lines I have laid out here? These are questions human geographers are asking themselves right now (Graham and Shelton 2013, special issue), some seeing big data as an opportunity to rethink empirical social science in a more interdisciplinary way (Ruppert 2013), others more cautious, suggesting big data are epiphenomena of the real issue: the need to question the datafied production of knowledge, to question the idea that big data is truth that simply needs mining (Wilson 2014). I think political ecologists have a lot to gain by reflexively engaging with the analysis of massive data sets, extending PE's tradition of sitting critically if sometimes awkwardly (Walker 2005) between political critique and ecological fieldwork, cognizant that its own tools of research are often those it seeks to criticize (a la Lave et al. 2013; Turner 2003).


In short, I'm reminded again of Piers Blaikie, when in the fifth chapter of PESE he sketches out what a grounded study of peasant political economy in the context of soil loss would look like. Literally sketches out: what he calls "a schematic and heuristic device which suggests the way in which these complex relationships (between people, and between people and environment) can be handled." And what does it look like?



Blaikie's heuristic for understanding land use decisions and soil erosion.

What else but a massive, complex, messy matrix of variables and objects of study (in this case, households)? A big dataset. What is his critical relationship to this heuristic? He describes it as a watch – a watch keeps time, but it does not give its user any clue on how to use time. In the same way, the heuristic and the data it organizes are meaningful only in the context of political economic theory – it "maps" political economy onto the case study and "attempts to calibrate part of it precisely." In other words, political ecologists might have our big data cake and eat it too. I'm not entirely sure what this engagement would look like concretely – the GLOBE project at the University of Maryland-Baltimore County is a likely candidate – but it seems to me that's where we ought to be headed.

Thursday, October 16, 2014

Can technology save the planet?

A provocation from WWF's chief scientist John Hoekstra – and that's exactly where I end up in my new post over at Edge Effects. It's a fun intro to quantum computing that backs into a discussion of the assessment and geodesign software tools that conservationists are deploying around the world to better measure restoration interventions, track environmental change, and fight back against environmental crimes like illegal logging. Ultimately, I'm less sure than Hoekstra that the answer to his question is a resounding yes.

The post comes right on the heels of a few interesting stories out in the past few days. First, yesterday the Natural Capital Project launched a MOOC where you can learn about their toolset. I've written speculatively about some of those tools here before, but it's great to have the chance to go behind the scenes. Second, Hoekstra held a Twitter-mediated conversation last week during SXSW Eco, discussing the potential for drones, big data analytics, and other emerging technologies to, well, save the planet. I'm not even convinced that what we're seeing in conservation right now qualifies as big data - the term seems loosely applied - but Hoekstra led an important conversation about how to do big data in conservation while recognizing issues of security, digital divides, and privacy.

So check out the post, and be sure to bookmark or follow Edge Effects while you're at it. It's an amazing new site run by grad students affiliated with the University of Wisconsin-Madison's Center for Culture, History, and Environment.

Thursday, August 21, 2014

Where does the value of nature come from?

$125 trillion is no small chunk of change. You could, without a doubt, buy a lot of stuff with that in your pocket. In fact, it's more than the world's gross domestic product (GDP), or the value of all the goods and services produced in the global economy each year - everything from cars to haircuts to World Cup tickets. But according to a recent study led by Bob Costanza, $125 trillion is also the dollar equivalent of all the work the world's ecosystems do for us - things that aren't normally counted in GDP, like the flood damage a coastal wetland prevents. Figures like this vary widely and provoke much controversy...and yet are increasingly taken by world political and business leaders as self-evident, touted as the next big thing in conservation.

$125 trillion is a best guesstimate. It's a follow-up to Costanza's landmark 1997 paper in which he and colleagues suggested the number might be more like $33 trillion. At the time, other researchers had reviewed existing valuation studies, finding that overall an acre of wetlands might be valued anywhere between six cents and over $22,000! Other commentators noted that, looking at just one particular ecological function (say, flood prevention), the values researchers came up with could differ by two orders of magnitude from site to site. For many critics, these numbers imply the reduction of nature to something to be bought and sold, which is particularly problematic if they're going to fluctuate so wildly. For others, no matter the range of values, they just aren't helpful - they're all a "serious underestimate of infinity" because we simply can't do without many of the things nature does for us, like provide breathable air. Still, for a growing number of conservationists and decision-makers, putting some number - often a dollar value - on ecosystems is exactly what's needed to save them, to show policy makers and businesses the economic importance of nature, in hopes of preventing its destruction and encouraging its conservation.

So where do these numbers come from?

It's appealing to write off statements like, "the value of an acre of wetland is six cents," as simply the work of ivory tower intellectuals busily justifying their own existence. After all, the scholars who published the most recent study are all affiliated with an academic institution. It's also easy to get the sense that these numbers come from no place in particular. That's the feeling you get reading histories of the ecosystem service concept (see here and here). These papers do important work revealing that the valuation of nature didn't just come along in 1997 with Costanza et al.'s first estimate of the value of the world's ecosystems. But beyond having a history, behind Costanza et al.'s new number and the "modelling sausage" that spit it out is a body of literature, a set of theories, and communities of scholars called environmental and ecological economics. And this community and its history are grounded in place.

Valuing nature didn't just appear from out of nowhere. Environmental economics and its younger, upstart cousin, ecological economics - and with them the basis for valuing nature - come out of a long-standing engagement with wetlands, especially coastal marshes, and particularly as they faced development pressure, often from the oil/gas industry. When ecologist Eugene Odum and economist Len Shabman sparred in the 1970s over exactly how, conceptually and empirically, to get at nature's value, their material was Gulf Coast marshes. Today, in a post-Hurricanes Katrina and Sandy world, when we hear prominent arguments for restoring or conserving ecosystem services because of their value, it's coastal places like Mobile Bay, AL that are paraded out as examples of where coastal restoration "show[s] strong returns on dollars invested." It's the US Gulf Coast that has defined nature's valuation as we know it today - its methodology and policy advocacy - and will continue to shape it in the wake of the 2010 Deepwater Horizon oil spill.

The Gulf Coast oil and gas industry's payments to oystermen for access to lay pipelines across harvesting grounds mark economists', conservationists', and others' earliest struggles with valuing nature's goods and services beyond the confines of established markets. The Gulf's hydrocarbon industry grew significantly following World War II in order to meet growing demand from suburban consumers, as Jason Theriot details in his great new book about the twin histories of the industry and wetland loss and protection in the Gulf. But, of course, companies like Tennessee Gas needed to get their products to market - mainly on the rapidly urbanizing east coast - from wells in the middle of Louisiana's marshes. To do so, they laid hundreds of miles of pipelines in canals carved through wetlands. These areas, however, were often the same spots where oystermen had traditionally harvested.

At the time, in the 1950s, the ecological consequences of the kinds of hydrological disruptions caused by canals were already to some extent understood by fishermen and scientists alike: the catch would likely be diminished by any nearby canals. In part because many working in the hydrocarbon industry were local oyster experts themselves, and in part because of the influence the fisheries community had historically exerted on state regulators, oil and gas companies went out of their way to provide compensation for direct damages from pipelines. What the industry paid was simply what the expected catch would have fetched on the open market. In paying oystermen for losses to their harvest, oil and gas companies were acknowledging the broader effects of their activities, but they still had some market signal to guide them; they were not trying to compensate for things without market prices, like the water quality that environmental economics pioneers like Dales were first proposing to price at the time. The industry's payments were ad hoc and not meant to be a systematic assessment of all of what we would now call the ecosystem services a wetland provides. Still, conservationists seemed to be at least considering for the first time what values the market wouldn't capture. For instance, the chief of the Oyster Division of Louisiana's Wildlife and Fisheries Commission was particularly concerned that compensation would not account for long-term, large-scale effects:

"We feel that the long range effects resulting in permanent ecological changes are by far the most serious and the most difficult to assess damages for. Direct effects are largely a matter of obtaining ROW [right of way] and making adjustments for damages at the time of construction. The area involved is comparatively small and involves only the path of the canal and the immediate vicinity on each side. Ecological and hydrographic changes may be permanent and may affect extensive areas ten miles or more on either side of the canal." (58)

Into the 60s and 70s, conservationists at Louisiana's state environmental agencies and at Louisiana State University (LSU) continued calling for a more formal recognition of the importance of wetlands, as part of a growing movement nationwide to daylight corporate and government decision-making that had impacts on the environment. As the story goes, in 1969 the massive oil spill offshore of Santa Barbara, CA inspired Congress to pass the National Environmental Policy Act (NEPA), which required all federal agencies to undertake a formal review of the costs and benefits of any action with a major effect on the environment. NEPA, however, did not require agencies to monetize these costs and benefits in order to evaluate projects. Ultimately, with executive orders from Reagan and successive administrations requiring more cost benefit analysis (CBA), monetization became the default. Already by the 70s, the Army Corps of Engineers had been conducting CBAs of its projects. As Tennessee Gas looked to the dock price of oysters to account for some of the broader effects of its pipeline canals, the Corps might determine whether or not to build a dam based on the cost to fisheries weighed against the benefits, measurable in dollar terms, arising from new recreation opportunities.
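A stylized version of the Corps' arithmetic, with all figures invented for illustration - the logic is just discounted benefits weighed against discounted costs:

```python
DISCOUNT_RATE = 0.05  # an assumed, though historically typical, discount rate
YEARS = 30

fishery_losses = 1_000_000    # $/yr of damages to fisheries (assumed)
recreation_gains = 1_300_000  # $/yr of new recreation benefits (assumed)

def present_value(annual_amount, rate=DISCOUNT_RATE, years=YEARS):
    """Discount a constant annual flow back to today's dollars."""
    return sum(annual_amount / (1 + rate) ** t for t in range(1, years + 1))

net = present_value(recreation_gains) - present_value(fishery_losses)
print(f"net present value: ${net:,.0f}")  # positive -> the project "pencils out"
```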

CBAs accounted for only so much of what conservationists thought was important about coastal habitats. This, if anything, is the constant refrain throughout the history of nature's valuation, from both advocates and critics: what are we counting? CBA might assess what a levee project would cost in damages to fisheries, but at the time there were few techniques for accounting for the loss of storm surge protection that came from impounding wetlands. Easily the landmark piece decrying the limits of CBA was James Gosselink, Eugene Odum, and R.M. Pope's 1974 paper, “The Value of the Tidal Marsh”. It was a short white paper written for the LSU Center for Wetland Resources, but it nonetheless received remarkable national attention from wetland conservationists as they made their case in the 70s for increasing resource protection. Gosselink et al.'s aim was to counterbalance development pressure on coastal ecosystems by expanding what ought to be counted as the monetary cost of development. For instance, they valued the waste assimilation capacity of wetlands by looking at what it might cost regions to fully treat their sewage if all wetlands simply vanished and were replaced with wastewater plants. This “replacement cost” was not a market price, but it was an existing signal (and it's measures like these that led to some of the most famous examples of institutional payments for ecosystem services, namely New York City's payments to farmers in the city's watershed to conserve natural habitat, which have reduced the region's water treatment costs).

The group, in the end, described wetland value in terms of $/acre/yr, but how they got there was through the concept of “emergy.” Emergy is a neologism for the amount of solar energy embodied in an ecological good or service. This could be converted into monetary terms by comparing the caloric requirements for service production in a wetland to the price of calories in things like oil that do have a market price. It may sound a little convoluted today, but the idea still has some traction. What's important about the emergy argument is that it proposes that nature's value is intrinsic; value arises from ecological transformations of energy, rather than from supply and demand. Not surprisingly, this upset many economists, some of whom thought the idea went against fundamental tenets of their discipline, in which value is fundamentally relative, dependent on the vagaries of supply and demand and, ultimately, on how much rational subjects desire certain things. This is precisely what Gosselink et al. were skeptical of: if we believe neoclassical economics, nature has no value because it has no market price. But of course nature has value, and so it must reside somewhere in nature itself. As one research team later put it, “The point that must be stressed is that the economic value of ecosystems is connected to their physical, chemical, and biological role in the overall system, whether the public fully recognizes that role or not.” (emphasis in original) Politically, this perspective translated into an argument not to leave wetland protection to the whims of the market, but for better government planning. Gosselink, at least, would be more involved with that in the next decade, contributing to the consolidation of Louisiana's modern oil/wetland regulatory regime while leading environmental reviews of projects like the massive Louisiana Offshore Oil Port.
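
To make the emergy logic concrete, here is a minimal back-of-the-envelope sketch in Python. Every number in it is an invented placeholder, not a coefficient from Gosselink et al.; the point is only the shape of the conversion, from embodied calories to barrels of oil to dollars.

```python
# Back-of-the-envelope emergy-to-dollars conversion, in the spirit described
# above. All figures are invented placeholders, NOT Gosselink et al.'s numbers.

GPP_KCAL_PER_ACRE_YR = 2.0e7   # hypothetical embodied solar energy captured per marsh acre
KCAL_PER_BARREL_OIL = 1.46e6   # approximate energy content of a barrel of crude
PRICE_PER_BARREL = 10.0        # illustrative 1970s-era oil price, in dollars

def emergy_value_per_acre(embodied_kcal: float, kcal_per_barrel: float,
                          price_per_barrel: float) -> float:
    """Translate embodied energy into dollars via the market price of fossil energy."""
    barrels_equivalent = embodied_kcal / kcal_per_barrel
    return barrels_equivalent * price_per_barrel

value = emergy_value_per_acre(GPP_KCAL_PER_ACRE_YR, KCAL_PER_BARREL_OIL, PRICE_PER_BARREL)
print(f"Emergy-derived value: ${value:,.0f}/acre/yr")
```

Note the design of the move: the only place a market enters is the price of oil, used purely as an exchange rate between calories and dollars. Nothing about human demand for wetlands appears anywhere in the calculation.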

One student of Howard Odum – Eugene's brother and collaborator – was none other than Bob Costanza, lead author of the 1997 paper valuing the world's ecosystem services at $33 trillion. After graduating from the University of Florida in the late 70s, he got a job at LSU. As he recalls it, he was drawn there in part by the presence of Herman Daly, whose work was moving in similar directions and had been an inspiration, and who happened to show up at his job talk. LSU at the time would have been a hub of activity focused on valuing nature through coastal marshes, with Gosselink, Costanza, and Daly all pioneering in their own ways and Eugene Turner, another freshly minted student of Odum, making headway on understanding the ecological effects of oil and gas canals on the coast. Throughout the 80s (and to some extent into today) Costanza would publish on the ecological and economic facets of Louisianan and Gulf wetlands. In one paper in particular, 1989's “Valuation and Management of Wetland Ecosystems,” he and his co-authors produced another estimate of Louisiana's wetlands, a follow-up to Gosselink et al.

Like Gosselink et al., Costanza, Maxwell, and Farber conducted an emergy analysis. But they also did something different: besides counting calories or looking at the market rate of fish raised by coastal estuaries, they actually hit the pavement (a boat ramp parking lot, actually) and asked people what wetlands were worth to them. Part of what they were after was people's “revealed” preferences - the amount each person spent on gas to get themselves to a wetland to fish could be considered part of its value as a provider of a recreational service. They were also interested in “stated” preferences - what people say they would pay to protect a wetland - a technique known as contingent valuation. If you're thinking that preferences sound a lot more in line with the neoclassical economics approach than with the emergy perspective, you'd be right. We might read Costanza et al.'s paper as a sort of continental divide in the valuation of nature: the first half an emergy analysis focused on elucidating the inherent values of nature, a perspective that was prominent up until that point; the second half all about new techniques for getting at how much society desires wetlands in practice, regardless of what nature has to say about it - new and exciting methodologies that were about to get their trial by fire (see below). And this split reflects where Costanza et al. end up in the paper when it comes to policy recommendations: they suggest that oil and gas companies post bonds to cover the mitigation of their impacts. The amount of the bond would be based on the predicted extent and nature of the impacts, and depending on the final ecological outcome, the company would get more or less of its bond back. The researchers' argument here was not for better planning to restrict where the oil and gas industry could work, but for modifying the industry's accounting practices - something that is all the rage now, with TEEB and TNC working hard at incorporating green accounting into business.
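
As a rough illustration of the revealed-preference side, a toy travel-cost calculation might look like the sketch below. The survey rows and the per-mile cost are invented for illustration; Costanza and colleagues' actual methods were considerably more involved.

```python
# Toy travel-cost calculation: infer part of a wetland's recreational value
# from what visitors spend getting there. All survey data below is invented.

surveyed_anglers = [
    {"round_trip_miles": 40, "trips_per_year": 12},
    {"round_trip_miles": 90, "trips_per_year": 4},
    {"round_trip_miles": 15, "trips_per_year": 30},
]
COST_PER_MILE = 0.25  # assumed vehicle cost per mile, in dollars

# Each angler's annual travel spending counts toward the wetland's value
# as a provider of a recreational service.
annual_value = sum(
    a["round_trip_miles"] * COST_PER_MILE * a["trips_per_year"]
    for a in surveyed_anglers
)
print(f"Implied annual recreational value from this sample: ${annual_value:,.2f}")
```

Compare this with the emergy sketch earlier: here value is built entirely out of human behavior, and the wetland's ecology enters only insofar as it draws people to the boat ramp.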

Here's the thing about contingent valuation: it doesn't work. Economists often expect people to behave rationally, but asking people how much they would pay to protect pelicans has presented them with a number of persistently thorny issues. For instance, when researchers ask people how much they would pay to protect a nearby natural area from a hypothetical development scenario, people regularly act strategically and give “protest” answers. They'll say $0, insinuating that the park is priceless, or offer some absurdly high price, all in the belief that there may be an actual development project in the works and that their answers may stop it. It also turns out that people don't value twice as much wetland at twice the price. It was another “largest to date” oil spill – the Exxon Valdez tanker leak in 1989 – that brought to a head many of the methodological concerns surrounding the use of contingent valuation, both in theory and as applied to real cases. Economists and regulators alike asked themselves: how do we figure out how much damage the tanker spill has caused to wildlife? What about the value someone in Iowa places on the mere existence of some species in Alaska? NOAA, in charge of the clean-up, commissioned a study led by some of the top minds not just in environmental economics but in economics writ large - Nobel Prize winners like Kenneth Arrow - to see if contingent valuation was a proper method to use. They found that it was, and the courts have affirmed NOAA's prerogative to use it. But NOAA rarely has.

Instead, NOAA has tended to employ good ole replacement cost. If it costs $100 million to buy all the construction equipment, fill, and plants to replace a wetland degraded by a Chevron oil spill, then Chevron must pay that amount. Like contingent valuation, replacement cost, as practiced in NRDA (natural resource damage assessment) compensation, has a problem: it doesn't work. The cost of replacing ecological structure doesn't necessarily equal the value of the lost ecological functions. It's one thing to put the right kinds and quantities of plants back in; it's another to make sure that the ecosystem provides the same kind of flood mitigation service, for instance.
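
A minimal sketch of the replacement-cost logic, with invented acreage and unit costs, makes the gap easy to see:

```python
# Toy replacement-cost assessment: price out the structure needed to rebuild
# a damaged marsh. Acreage and unit costs are invented placeholders.

damaged_acres = 200
FILL_PER_ACRE = 15_000      # dredged sediment, $/acre (assumed)
PLANTING_PER_ACRE = 4_000   # marsh grass plugs, $/acre (assumed)
MOBILIZATION = 500_000      # fixed equipment cost (assumed)

replacement_cost = damaged_acres * (FILL_PER_ACRE + PLANTING_PER_ACRE) + MOBILIZATION
print(f"Replacement cost: ${replacement_cost:,}")

# Note what this number never touches: whether the rebuilt marsh actually
# delivers the same flood mitigation the lost one did. Structure is priced;
# function is assumed.
```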

Doubts about the object and goal of compensation are precisely what are haunting economists yet again following the latest “largest oil spill to date,” the Deepwater Horizon spill. It has inspired a revisiting of the contingent valuation question, with some authors revising their positions from 20 years ago, post-Exxon Valdez. Peter Diamond, for instance, is now even more critical of contingent valuation and more skeptical that it could ever be of much use: some number is not better than no number. Meanwhile, the spill has become a poster child for ecosystem service valuation advocates. In the monumental TEEB synthesis report, the section on “Applying the Approach” begins with the lamentation that if only the value of wetland services had been properly accounted for in business practice, BP would never have let the spill happen.

What is clear from the Deepwater Horizon fallout is that beyond whatever BP ends up having to pay to compensate for affected wetlands, they're going to have to pay separate fines into another fund to be used for large-scale restoration of the coast, beyond the specific places the spill reached. This is a (bitter) windfall for conservationists; as one put it, "This is a once in a lifetime opportunity" to do something about Louisiana's land loss problem. BP money, at least in Louisiana, will be funneled into the Master Plan developed by the state's Coastal Protection and Restoration Authority (CPRA). The Master Plan lays out restoration principles and priorities (e.g. let nature do the work - harness the Mississippi River to deliver sediments to open water to build new land) and describes a suite of sites selected for restoration. These sites were selected based on their cost effectiveness, which in part has been a question of how many ecosystem services restoration will bring - how much flood prevention or how many alligators, for instance. The question here is not so much the cash value of these services; the authors of the Plan explicitly note: "We didn't have the time this time around to look at that." As the head of the CPRA noted, however, this kind of analysis is in the works for the next version of the Master Plan, coming in 2017.

At the heart of the matter is the development and deployment of economic valuation techniques for evaluating public spending. Economists are hard at work determining what kinds of restoration are most worthwhile: how many acres will $X in sediment diversions bring in over time compared to other methods of marsh creation? So far, acreage - a fairly straightforward metric - has been the target, which makes sense given that land loss is measured in acres, but expect to see ecosystem services migrate into the accounting. The goal in valuing the land-building and protecting services provided by coastal wetlands is to spend public money wisely in an era of austerity. These metrics allow decision-makers to evaluate tradeoffs. Already in the Master Plan, the increase in certain ecosystem services like carbon sequestration was argued to outweigh and justify the decrease in other services, like shrimp habitat. Again, these were not yet valued in dollars. This time around, it certainly won't be emergy used to derive the value of these public goods. Instead, what we are seeing hints at a move toward marketizing services. Rather than developing dollar metrics to inform planning or to build new public institutions (via bonding a la Costanza), some important Louisiana decision-makers are turning to potential carbon and nutrient markets to help value (and pay for) wetland benefits. There's no need to do contingent valuation of a wetland function like carbon sequestration when it is traded in California's cap and trade market at $10 a ton - that's the value right there.
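
The arithmetic behind that last move is almost trivially simple, which is part of its appeal. Here's a sketch with an invented sequestration rate; only the $10/ton price comes from the discussion above.

```python
# Toy market-based valuation: price a restored marsh's carbon sequestration
# at the prevailing cap-and-trade price. The sequestration rate is invented.

restored_acres = 10_000
TONS_CO2_PER_ACRE_YR = 2.5   # assumed annual sequestration rate per acre
CARBON_PRICE = 10.0          # $/ton, the cap-and-trade figure cited above

annual_value = restored_acres * TONS_CO2_PER_ACRE_YR * CARBON_PRICE
print(f"Carbon value of restoration: ${annual_value:,.0f}/yr")
```

No survey, no emergy accounting, no planner's judgment: the market price does all the valuation work, for the one service the market happens to trade.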

Where does nature's value come from? In no small part, from those working to understand and protect Gulf Coast marshes. The practice of assigning the environment a dollar value continues to evolve, and it does so as practitioners – regulators, economists, scientists – engage with these ecosystems, which provide certain opportunities and obstacles. Indeed, an ongoing question within the field is the role of environmental science and the extent to which ecologists can provide the kind of information about nature that economists want and need to do valuation. The complexity of ecological functions - their nonlinearity, dynamism, etc. - has long been acknowledged as a stumbling block, from Westman's prescient 1977 Science article to the Millennium Ecosystem Assessment to even the most ostensibly gung ho supporters of valuation, TEEB. These difficulties do not mean that the champions of the valuation of nature feel defeated. Just consider how Costanza felt in 1997: "…although ecosystem valuation is certainly difficult and fraught with uncertainties, one choice we do not have is whether or not to do it".

Besides environmental economics' struggle with its existential dependence on externally produced knowledge, its practitioners struggle with understanding their own conditions for knowing nature's value. There's still much question about whether contingent valuation will work. We've seen a few reasons here: people don't always act rationally, and they also reject the survey techniques researchers employ to come up with their numbers. This is setting aside what is perhaps the trickiest question, that of benefits transfer - the practice of taking the monetary values estimated for ecosystems in one part of the world and applying them in another. After all, as one scholar put it, "An acre of coastal salt marsh seaward of New Orleans is many times more valuable …than an acre of abandoned farm pasture in Nebraska" - and it's being able to say how much more valuable, and how locally specific to get in saying so, that troubles many researchers. Finally, what kind of policy angle environmental economists ought to take is still open to debate (the rift between so-called environmental and ecological economists is itself a part of that). Is nature open for business and up for sale? Or is the goal simply better planning, for which dollar figures perhaps aren't needed? You can expect those questions to be asked, if not resolved, in any good paper today. Yet, as one group of historians of the ecosystem services paradigm notes, these uncertainties are not the growing pains we might expect from an emerging research perspective. As is clear when we look at the history of coastal marsh protection on the Gulf Coast, nature's valuers have had at least 40 years' experience to figure things out. What the lingering questions and simmering debates reflect instead are genuine obstacles to a rightly controversial practice.
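
To see why benefits transfer troubles researchers, consider this toy sketch. The source value and the adjustment factor are both invented, and the adjustment factor is exactly where all the local ecological specificity gets hidden.

```python
# Toy benefits transfer: borrow a per-acre value estimated at one site and
# apply it, crudely adjusted, to another. All numbers are invented.

SOURCE_VALUE = 1_200   # $/acre/yr from a hypothetical salt-marsh study
adjustment = 0.1       # a guess at the target site's relative productivity
target_acres = 5_000

transferred_estimate = SOURCE_VALUE * adjustment * target_acres
print(f"Transferred estimate: ${transferred_estimate:,.0f}/yr")
# Is 0.1 the right factor, or 0.5, or 2? Nothing in the method itself says.
```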

In attempting to answer these questions and close these debates, how economists, regulators, and scientists go about valuing nature will evolve. This, of course, is happening globally. One only has to follow the Natural Capital Project around the world, from Colombia to British Columbia, to get a sense of the importance of these places to how the vision of nature as capital is being articulated and materialized. But Louisiana and the rest of the Gulf Coast will undoubtedly continue to be a sort of lab for experimenting with the policy, science, and economics behind the valuation of nature - the place where many of the ongoing conversations about the future of conservation in the face of climate change are worked out. As a senior adviser to America's Wetland Foundation recently put it, “We can test it better than anyone.”