Great Lakes Mapping Project

What is left to discover within the maps of the past? That is the question posed by the Nexus Lab in its collaboration with Professor Robert Markley (University of Illinois), Kenton McHenry (National Center for Supercomputing Applications), and Chris Morris (University of Texas at Arlington). Using French and British maps from the sixteenth through the eighteenth centuries, the Great Lakes mapping project applies computer vision to create entirely new sources of information about the environmental history of North America.

Without present-day tools such as GPS and satellite imaging, cartographers of the past recorded their impressions of the Great Lakes through the limitations of the human eye and their own personal and political assumptions. To measure the discrepancies between French and British perceptions, researchers use data visualization techniques, including spatial calibration and charting software, to compare historical mapping data against verified geographical coordinates.
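One way to quantify such discrepancies is to compare where a feature sits on a georeferenced historical map against its verified modern coordinates. The sketch below is purely illustrative: the control points and their values are hypothetical, and the project's actual calibration pipeline is not described in detail here. It uses the standard haversine formula to express each displacement as a great-circle distance.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometers between two lat/lon points."""
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Hypothetical control points: a shoreline feature's position as drawn on a
# georeferenced historical map vs. its verified modern coordinates.
control_points = [
    # (name, historical (lat, lon), modern (lat, lon)) -- illustrative values only
    ("Feature A", (44.80, -87.20), (44.76, -87.10)),
    ("Feature B", (41.90, -83.50), (41.70, -83.47)),
]

for name, (hlat, hlon), (mlat, mlon) in control_points:
    err = haversine_km(hlat, hlon, mlat, mlon)
    print(f"{name}: displacement {err:.1f} km")
```

Aggregating these displacements across many control points would let one compare, for example, how French and British maps of the same era systematically distort the same shoreline.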

The Great Lakes mapping project has the potential to become an entirely new method of generating data for historical climate models: data otherwise not recorded or difficult to ascertain. This technique may prove useful in predicting future patterns of global warming and climate change. The researchers' next step is to expand their corpus of historical maps and apply the methodology to new coastlines and other bodies of water around the world. The project is also honing its algorithm to better handle complicated coastline data with minute differences between water and land.

Project Director(s)

Michael Simeone

Director of Data Science and Analytics, Hayden Library

Headliner Info

Professor Robert Markley (University of Illinois)
Kenton McHenry (National Center for Supercomputing Applications)
Chris Morris (University of Texas at Arlington)