Leptoukh Award Winners

2017 – Kirk Martinez
2016 – Cynthia Chandler
2015 – Dawn Wright
2014 – Bryan Lawrence
2013 – Simon Cox


2017 Leptoukh Lecture

Kirk Martinez, University of Southampton

Earth sensing: from ice to the Internet of Things

The evolution of technology has led to improvements in our ability to use sensors for Earth science research. Radio communications have improved in range and power use. Miniaturisation means we now use 32-bit processors with embedded memory, storage and interfaces. Sensor technology makes it simpler to integrate devices such as accelerometers, compasses, and gas and biosensors. Programming languages have developed so that it has become easier to create software for these systems. This, combined with the power of these processors, has made research into advanced algorithms and communications feasible. The term environmental sensor networks describes these advanced systems, which are designed specifically to take sensor measurements in the natural environment.

Through a decade of research into sensor networks, deployed mainly in glaciers, many areas of this still-emerging technology have been explored. Deploying the first subglacial sensor probes, with custom electronics and protocols, taught us about tuning systems for harsh environments and about energy management. More recently, installing sensor systems in the mountains of Scotland has shown that standards now allow complete internet and web integration.

This talk will discuss the technologies used in a range of recent deployments in Scotland and Iceland, focussed on creating new data streams for cryospheric and climate change research.


2016 Leptoukh Lecture

Cynthia Chandler, Woods Hole Oceanographic Institution

Data, data everywhere…

The scientific research endeavor requires data, and in some cases massive amounts of complex and highly diverse data. From experimental design, through data acquisition and analysis, hypothesis testing, and finally drawing conclusions, data collection and proper stewardship are critical to science. Even a single experiment conducted by a single researcher will produce data to test the working hypothesis. The types of complex science questions being tackled today often require large, diverse, multi-disciplinary teams of researchers who must be prepared to exchange their data.

This 2016 AGU Leptoukh Lecture comprises a series of vignettes that illustrate a brief history of data stewardship: where we have come from, how and why we have arrived where we are today, and where we are headed with respect to data management. The specific focus will be on management of marine ecosystem research data and will include observations on the drivers, challenges, strategies, and solutions that have evolved over time. The lessons learned should be applicable to other disciplines and the hope is that many will recognize parallels in their chosen domain.

From historical shipboard logbooks to the high-volume, digital, quality-controlled ocean science data sets created by today’s researchers, there have been enormous changes in the way ocean data are collected and reported. Rapid change in data management practices is being driven by new data exchange requirements, by modern expectations for machine-interoperable exchange, and by the desire to achieve research transparency. Advances in technology and cultural shifts contribute to the changing conditions through which data managers and informatics specialists must navigate.

The unique challenges associated with collecting and managing environmental data, complicated by the onset of the big data era, make this a fascinating time to be responsible for data. It seems there are data everywhere, being collected by everyone, for all sorts of reasons, and people have recognized the value of access to data. Properly managed and documented data, freely available to all, hold enormous potential for reuse beyond the original reason for collection.


2015 Leptoukh Lecture

Dawn Wright, Environmental Systems Research Institute

Toward a Digital Resilience (with a Dash of Location Enlightenment)

The AGU Earth and Space Science Informatics Focus Group addresses a compelling array of research questions and projects. This year’s session topics range from large-scale data management within global cyberinfrastructures or virtual observatories, to intelligent systems theory, semantics, and handling of near-real-time data streams, to issues of “dark data,” data transparency, reproducibility, and more. The aim of this lecture is to build in part on these themes, but also to consider more broadly how we might push the boundaries of informatics knowledge along the lines of use-inspired science (responsive to the needs and perspectives of society while still being fundamental and cutting edge).

To wit, as we contend with human impacts on the biosphere, recent innovations in computational and data science are facilitating community resilience to climate change (e.g., helping communities to monitor air quality or drought, find available drinking water, determine habitat vulnerability, etc.). But not often discussed is a path toward digital resilience. If digital tools are to continue helping communities, it stands to reason that they must engender some resilience themselves. The capacity to deal effectively with change and threats, to recover quickly from challenges or difficulties, even to withstand stress and catastrophe, can apply to data too.

As investments in digital data continue to rise, we find ourselves in a new “digital world order” comprising ubiquitous technologies from satellites to wristwatches to human biochip implants. And a significant proportion of these are geospatial, given the incredible power of maps to communicate, persuade, inspire, understand, and elicit action. Therefore, the lecture reviews and recommends seven fundamental digital research and communication practices. The aim is not only to ensure a modicum of resilience for our nascent discipline, but also to prototype and deliver repeatable solutions that all can use to help guide the planet towards a more resilient future.


2014 Leptoukh Lecture

Bryan Lawrence

Director of Models and Data at the UK National Centre for Atmospheric Science, Professor of Weather and Climate Computing at the University of Reading, and the Director of the STFC Centre for Environmental Data Archival (CEDA).

Trends in Computing for Climate Research

The grand challenges of climate science will stress our informatics infrastructure severely in the next decade. Our drive for ever greater simulation resolution/complexity/length/repetition, coupled with new remote and in-situ sensing platforms, presents us with problems in computation, data handling, and information management, to name but three. These problems are compounded by background trends: Moore’s Law is no longer doing us any favours, as computing is getting harder to exploit now that we have to bite the parallelism bullet, and Kryder’s Law (if it ever existed) isn’t going to help us store the data volumes we can see ahead. The variety of data, the rate at which it arrives, and the complexity of the tools we need and use all strain our ability to cope. The solutions, as ever, will revolve around more and better software, but “more” and “better” will require some attention.

In this talk we discuss how these issues have played out in the context of CMIP5, and might be expected to play out in CMIP6 and successors. Although the CMIPs will provide the thread, we will digress into modelling per se, regional climate modelling (CORDEX), observations from space (Obs4MIPs and friends), climate services (as they might play out in Europe), and the dependency of progress on how we manage people in our institutions. It will be seen that most of the issues we discuss apply to the wider environmental sciences, if not science in general. They all have implications for the need for both sustained infrastructure and ongoing research into environmental informatics.

2013 Leptoukh Lecture


Simon Cox is a Senior Principal Research Scientist at CSIRO. He trained as a geophysicist, with a PhD in experimental rock mechanics from Columbia (Lamont-Doherty) following degrees in geological sciences at Cambridge and Imperial College London. He came to Australia for a post-doc with CSIRO, and then spent four years teaching at Monash University in Melbourne, where he first began using GIS. Returning to CSIRO in Perth in 1994 to work on information management for the Australian Geodynamics CRC, he moved its reporting onto the emerging World Wide Web, deploying a web-mapping system for Australian geology and geophysics in 1995. The challenge of maintaining the AGCRC website led to metadata-based systems, and to Simon’s engagement with the standards community when he joined the Dublin Core Advisory Council.

Work on XML-based standards for mineral exploration data led on to the foundation of the GeoSciML project, in collaboration with a number of geological surveys. An interest in tying these into broader interoperability systems led to engagement with the Open Geospatial Consortium (OGC), where he co-edited the Geography Markup Language (GML) v2 and v3. In OGC he developed Observations and Measurements as a common language for in situ, ex situ and remote sensing; it went on to become an ISO standard and forms the basis for operational systems in fields as diverse as air traffic, water data transfer, and environmental monitoring. In 2009-10 he spent a year as a senior fellow at the EC Joint Research Centre in Italy, working on the integration of GEOSS and INSPIRE. He served on the councils of the IUGS Commission for Geoscience Information and the International Association for Mathematical Geosciences. In 2006 he was awarded OGC’s highest honor, the Gardels Medal. He has been a member of AGU since 1982. Simon is currently based in CSIRO Land and Water in Melbourne, working on a variety of projects across environmental informatics and spatial data systems.

2013 presentation at AGU