Image: Redd Angelo, via Unsplash

Between 2005 and 2013, the size of the digital universe – the volume of digital information created, replicated, and consumed in a single year – grew by a factor of 33. Big data’s footprint has grown accordingly, fed by data streams that convey ever-greater amounts of information, at an accelerating pace, across an increasing variety of topics.

Increasingly, researchers, policymakers, and citizens are enlisting large data sets to support or reshape existing environmental monitoring and management practices. Twitter posts guide responses to natural disasters and help anticipate their costs. Online repositories of city- and state-led climate action gather best practices and lessons learned, building technical capacity.

The use of big data can enable new approaches to well-trodden questions and add unprecedented breadth and depth to environmental datasets. But trying to sip from this fire hose of information can also become a challenging, resource-intensive process in its own right. Beyond the challenge of managing the sheer scale of big data sets, new forms of information prompt questions about how best to collect, harmonize, and link this data to existing conversations and goals.

Two upcoming workshops will focus on strengthening the link between data and policy in efforts to understand and support climate mitigation and adaptation. On April 25 and 26, the Climate Action Analysis Network will bring researchers, analysts, and data providers to University College London to discuss research priorities around climate action commitments made by cities, regions, companies, investors, and civil society. On May 11 and 12, experts in climate adaptation policy and computer science will gather at the Yale-NUS College campus in Singapore for the workshop Tracking Climate Adaptation: Methodological Challenges in the Age of Big Data, to discuss ways to more fully leverage the potential of big data.

The April Climate Action Analysis Network meeting, hosted by Data-Driven Yale and The Stanley Foundation, in collaboration with the Galvanizing the Groundswell of Climate Actions Coalition, will focus on outlining a roadmap for the research and analytical community over the next several years. In particular, 2017 and 2018 will include a number of activities and milestones that take stock of and continue to facilitate climate action in the wake of the historic Paris Climate Agreement. During the meeting, participants will discuss ways to weave information about the role of cities, regions, companies, investors, and civil society into these activities. The workshop will also consider how to fill data gaps and more closely track the progress and impact of these climate commitments.

In May, Yale-NUS College, in collaboration with the Adaptation Tracking Collaborative (ATC), will host a workshop that explores ways to develop the data needed to measure climate adaptation policies. Many metrics simply track the presence or absence of different adaptation procedures (e.g., a country’s creation of a National Adaptation Plan). The ATC has piloted a monitoring framework to better capture the policy goals and instruments governments have adopted to address the impacts of climate change. During the workshop, participants will identify specific opportunities for alternative forms of data collection (e.g., machine learning, web scraping, text-as-data analysis, and social media analysis) to support this framework and broader adaptation policy-tracking efforts; a rough sketch of one such approach appears below.
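To make the "text-as-data" idea concrete, here is a minimal, hypothetical sketch of how adaptation policy documents might be scanned for mentions of different policy instruments. The document URLs, keyword lists, and instrument categories below are illustrative placeholders chosen for this example, not part of the ATC framework or any real data source.

```python
# Hypothetical sketch: download adaptation policy documents and flag which
# policy instruments they mention via simple keyword matching.
# All URLs and keyword lists are illustrative placeholders.

import re
import requests

# Placeholder document sources (not real national adaptation plan URLs)
PLAN_URLS = {
    "Country A": "https://example.org/country-a-national-adaptation-plan.txt",
    "Country B": "https://example.org/country-b-national-adaptation-plan.txt",
}

# Illustrative keyword groups for a few adaptation policy instruments
INSTRUMENT_KEYWORDS = {
    "early warning systems": ["early warning", "flood alert", "hazard monitoring"],
    "coastal protection": ["sea wall", "coastal defence", "mangrove restoration"],
    "water management": ["drought plan", "water storage", "irrigation efficiency"],
}


def flag_instruments(text: str) -> dict:
    """Return the instrument categories a document mentions, with match counts."""
    text = text.lower()
    counts = {}
    for instrument, keywords in INSTRUMENT_KEYWORDS.items():
        hits = sum(len(re.findall(re.escape(kw), text)) for kw in keywords)
        if hits:
            counts[instrument] = hits
    return counts


if __name__ == "__main__":
    for country, url in PLAN_URLS.items():
        response = requests.get(url, timeout=30)
        response.raise_for_status()
        print(country, flag_instruments(response.text))
```

Even a crude keyword scan like this moves beyond presence/absence indicators, and the same pipeline could be swapped out for richer methods (document classification, named-entity extraction, or social media analysis) as the workshop participants see fit.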

Both workshops will allow participants to explore big data’s potential for policymaking. They should offer a vital look into the ways familiar strategies – of prioritizing needs, identifying relevant questions, and sharing experiences – can help harness more innovative tools.

