Finding new information in unlikely places

http://www.hakaimagazine.com/article-short/what-declassified-spy-images-are-teaching-us-about-climate-change

A Proposal to Objectify Subjective Values in Historic Preservation

The long-standing debate in historic preservation between the values-based approach and the fabric-based approach has always irked me. In recent decades, one attempt to address the question has been to introduce an economic perspective: "What is it worth, and to whom?"

However, as far as I can tell, this method falls short of the big picture: the value to be quantified here is not monetary. Monetary value is easy enough to calculate, but it is ultimately a superficial measure of historical value. It seems that today this is the first question asked in any valuation: "How much does it cost, and how does that equate to value?" Yet the idea is itself subjective, since it is predicated on there being an evaluator.

Is it possible to remove the bias? Is it possible to determine what people en masse think about an object and what kind of values they attach to it without influencing this very personal, inner process?

One way is to tap into the vast sea of social media data, which, with a little time and practice, is readily available to virtually anyone. This is the idea behind Big Data: it is how companies, governments, organizations, and the like track interest, epidemics, trends, and just about anything else there is to track. Want to know how popular your website is? Want to know how well your product is doing compared to a competitor's? Want to know how popular North Brother Island in New York City is, and what people are saying about it (it is our studio site and topic this semester)?

Although this method comes with its own drawbacks, its power lies in its unbiased and infinitely queryable nature. Even traditional surveying is flawed in this respect: only a certain population will be open to a survey (think of all those Greenpeace canvassers you've passed on the street), and among those who do agree to be surveyed, the act of surveying itself influences their behavior (think of the observer effect). This relationship is nonlinear and, for all practical purposes, unpredictable.

Twitter, Instagram, Facebook, Flickr, and the like can all be used to gauge interest as a measure of an object's value. Along with sources such as Wikipedia, it is also possible to very quickly establish a vast and accurate stakeholder network that extends beyond the rationale typically applied to such a resource. We usually look at these relationships in a single dimension, with only one degree of separation. Using big data, it is possible to start with an initial topic and expand it to two or more degrees of separation, creating an exponentially larger relationship network of topics, stakeholders, and connections one would never have been able to predict (well beyond what a person could map by hand).
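To make that concrete, here is a minimal Python sketch of how such an expansion could be scripted against the public MediaWiki API. The seed title, the link limits, and the depth cutoff are my own assumptions for illustration; they are not the parameters behind the network image below.

```python
import requests

API = "https://en.wikipedia.org/w/api.php"

def get_links(title, limit=50):
    """Fetch article titles linked from one Wikipedia page (article namespace only)."""
    params = {
        "action": "query",
        "titles": title,
        "prop": "links",
        "pllimit": limit,
        "plnamespace": 0,
        "format": "json",
    }
    pages = requests.get(API, params=params).json()["query"]["pages"]
    return [link["title"] for page in pages.values() for link in page.get("links", [])]

# Hypothetical seed article -- the exact Wikipedia title may differ.
seed = "North Brother Island"
first_degree = get_links(seed)

# Expanding one more step turns a handful of links into a much larger network.
network = {seed: first_degree}
for title in first_degree[:10]:          # kept small here to be polite to the API
    network[title] = get_links(title, limit=20)

print(len(network), "nodes collected around", repr(seed))
```

From there, the node and edge lists can be handed to any graph visualization tool to produce a map like the one below.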

The image below is a single search on Wikipedia for “North and South Brother Islands” with 1.5° of separation. Below it are some of the particular nodes of interest to the project. All obtained within minutes.

[Image: MACRO – North Brother Island search network]

Given the right parameters, this method could also very quickly produce an interactive, dynamic map of an object's history that could be made available to anyone interested. It could likewise be used to track the reception of a recently implemented intervention or plan. These results can be displayed in real time and by geographic location (where geographic data is available; only about 6% of tweets, for instance, are geotagged).
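As a sketch of what that geographic filtering might look like, the snippet below assumes a list of already-harvested posts (the fields and sample records are invented for illustration) and bins the geotagged subset into a crude density layer for mapping.

```python
from collections import Counter

# Hypothetical harvested records, as an API client might return them.
posts = [
    {"text": "Exploring the ruins on North Brother Island", "coords": (40.800, -73.899)},
    {"text": "Heron colony spotted from the ferry", "coords": None},
    # ... thousands more collected from Twitter, Instagram, Flickr, etc.
]

# Keep only the geotagged subset (roughly the ~6% of tweets mentioned above).
geotagged = [p for p in posts if p["coords"] is not None]

# Bin posts by rounded coordinates to build a simple density layer for a map.
density = Counter((round(lat, 3), round(lon, 3))
                  for lat, lon in (p["coords"] for p in geotagged))

for (lat, lon), count in density.most_common(5):
    print(f"{count:4d} posts near ({lat}, {lon})")
```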

The possibilities are endless, and they would be invaluable to a progressive valuation method in contemporary historic preservation.

Here is an interesting article discussing the possibilities and benefits of Big Data methods/tools.

Down the Rabbit Hole I go

So much going on this semester. Too much, perhaps? Never.

As part of an independent study I have created this semester, I've begun research into writing Python components in Grasshopper. The study deals with at-risk adobe structures at Fort Union National Monument in New Mexico, and with developing a methodology for correlating historical climatological data with deterioration reports, in an effort to establish patterns of deterioration that may be helpful today in assessing the risk these structures face.
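A first pass at that correlation step might look something like the sketch below; the file names and column headings are placeholders I've invented, not the actual Fort Union records.

```python
import pandas as pd

# Placeholder files and columns standing in for the actual Fort Union datasets.
weather = pd.read_csv("fort_union_weather.csv", parse_dates=["date"])            # date, precip_mm, wind_kmh, temp_c
reports = pd.read_csv("deterioration_reports.csv", parse_dates=["survey_date"])  # survey_date, wall_id, loss_cm2

# Aggregate both datasets to yearly values so they share a common index.
weather_yearly = weather.set_index("date").resample("YS").agg(
    {"precip_mm": "sum", "wind_kmh": "mean", "temp_c": "mean"}
)
loss_yearly = reports.set_index("survey_date").resample("YS")["loss_cm2"].sum()

# Crude first pass: how strongly does each climatological factor track recorded loss?
combined = weather_yearly.join(loss_yearly, how="inner")
print(combined.corr()["loss_cm2"])
```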

Thanks to the help of Professor Mostapha Sadeghipour Roudsari (who is co-sponsoring the study along with Professor Frank Matero), I've begun toying with the idea of creating a plugin for Grasshopper that incorporates the hygrothermal analysis capabilities of WUFI. What is potentially quite powerful is the ability to recreate past meteorological scenarios (the site has exquisite, meticulous weather data dating back to 1895) and study the pathologies of the structures by cross-referencing historical photos and accounts. Using the same method, it would be possible today to begin a recording campaign that correlates real-time, onsite weather data with such analysis. And for assessing future risk, it then becomes possible to project such pathologies into the future by establishing how past and current deterioration mechanisms relate directly to climatological factors (wind, precipitation, groundwater rise, etc.).
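I don't yet know what the WUFI side of this will look like, but as a rough sketch, a GhPython scripting component that slices the historical station records for a chosen year and hands them downstream might start out something like this (the input names, output names, and CSV layout are all assumptions):

```python
"""GhPython component sketch: extract one year of historical weather records
so a downstream hygrothermal-analysis component could consume them.
'weather_file' (path) and 'year' (int) are assumed component inputs."""

import csv

records = []
with open(weather_file) as f:
    for row in csv.DictReader(f):
        # Placeholder columns: date (ISO string), temp_c, rh_pct
        if row["date"].startswith(str(year)):
            records.append((row["date"], float(row["temp_c"]), float(row["rh_pct"])))

# Assumed Grasshopper outputs: plain lists the next component can read.
dates = [r[0] for r in records]
temperature = [r[1] for r in records]
relative_humidity = [r[2] for r in records]
```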

It's looking like this will be my thesis: one-third Environmental Building Design, two-thirds historic preservation, which was the model agreed upon for my thesis when I was granted the opportunity to pursue both degrees. What's most exciting is seeing people and professors from both departments excited about the same topics and learning things and methods they never thought possible.

Anyway, more to come…