This project is a spontaneous offshoot of the research I have been doing for Michael Henry, which focuses on establishing Thomas Jefferson's knowledge of environmental architectural design throughout the conception, execution, and modification of Monticello.
It all began with Prof. Henry's suggestion that it might be beneficial to create something like a "LinkedIn" diagram for Mr. Jefferson, to begin to understand what he knew, where he might have learned it, and from whom. I started this task manually in Illustrator. The initial diagram was a circular chart of sorts: Thomas Jefferson at the nexus, radiating outward by year, and divided into 'hemispheres', the top half being Europe, the bottom the United States. This was utterly time-consuming, and it was frustrating on a mind-numbing level to know that as the research progressed I would have to redraw the chart. Each time. I found it incredibly hard to believe that there was no easier way; more specifically, no parametric way that would streamline research and visualization simultaneously.
A long spring break later, I came across a way to begin organizing the data I had already gathered (citations and the like) and visualizing it. I turned to the same methods and software being used to organize and visualize Big Data, such as social networks and Google searches.
I first found a project being undertaken at Harvard called Visualize Historical Networks, which attempts much the same thing using a piece of software, I believe called Gephi, which uses Java and other things I don't yet understand to visualize adjacency at a specific time and over extended periods.
I came across a plugin for Excel that allowed me to do this quickly and easily (as this in itself is not part of my research) using my EndNote database. It is currently limited in its scope, its ability to modify parameters, and its export options, though not entirely so. I have already drafted a proposal to expand this research using some creative scripting to communicate between programs, MATLAB, and a brute-force method for harvesting historical data from large digital databases (Library of Congress, etc.).
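To give a sense of what "parametric" means here, the circular chart described above can be generated from data rather than redrawn by hand. Below is a minimal, stdlib-only Python sketch of that layout idea: radius grows with the year of a connection, and the angle keeps European contacts in the top hemisphere and American ones in the bottom. All names, dates, and the `layout` function are illustrative placeholders, not findings from the actual research or a feature of any of the tools mentioned.

```python
import math

# Hypothetical records: (contact name, year of connection, region).
# These entries are placeholders for whatever the research database holds.
contacts = [
    ("William Small", 1760, "United States"),
    ("George Wythe", 1762, "United States"),
    ("Marquis de Lafayette", 1781, "Europe"),
    ("Maria Cosway", 1786, "Europe"),
]

def layout(contacts, base_year=1750, ring_step=10.0):
    """Place each contact on a circular chart centered on Jefferson:
    radius grows with year; Europe sits in the top half (y > 0),
    the United States in the bottom half (y < 0)."""
    positions = {}
    top = [c for c in contacts if c[2] == "Europe"]
    bottom = [c for c in contacts if c[2] != "Europe"]
    for group, start in ((top, 0.0), (bottom, math.pi)):
        for i, (name, year, _region) in enumerate(group):
            # Spread the group's contacts evenly across its half-circle.
            angle = start + math.pi * (i + 0.5) / len(group)
            radius = (year - base_year) * ring_step
            positions[name] = (radius * math.cos(angle),
                               radius * math.sin(angle))
    return positions

pos = layout(contacts)
```

The point of the sketch is only that adding a new citation to the data regenerates the whole chart, instead of forcing a manual redraw in Illustrator each time.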
We’ll see where this goes, time permitting!