Data mining without a pick-axe

Computers have become the historical researcher's powerful ally in discerning patterns amid textual data. Technology has enabled researchers to complete projects that would have been impossible without computational methods. The typewriter let the researcher produce a legible manuscript with cleaner edits, unlike handwritten documents that often had to be deciphered. Word processors, the internet, and digital databases allowed researchers not only to type up and revise documents, but to conduct research from the comfort of their offices without driving to a library or archive to review written sources. While some of these tools simply helped finished research become a published article or monograph faster, scholarship has since advanced to combine critical thinking with computational data mining that produces new assessments of old research.

Data mining, like any other technological resource, is only as good as it is built to be. Working with computer scientists, geographers, and graphic designers, historians can develop programs that organize data effectively: software that not only finds anomalies and similarities in the data, but helps pinpoint how they affected a city, state, or region on a map, chart, or diagram. Such tools allow data to be extracted from numerous sources in a fraction of the time it would take to review them manually, and, through interpretation, reveal patterns that convert raw data about historical events or periods into interpretive charts, maps, and diagrams.

According to historian Cameron Blevins's recent article "Mining and Mapping the Production of Space: A View of the World from Houston," we are now capable of exploring scanned historical reproductions as three-dimensional data and using technology to further analyze the human connection between time and geographical space. Blevins supports this claim by using data mining to sift through hundreds of printed pages, revealing how minute details like advertisements, train schedules, and news articles all showed how a Texas newspaper presented readers with the world outside the Houston city limits. The frequency of city and state names showed that places like Chicago or New York City were no longer just names to Houstonians, but were becoming places of commercial significance. Extracting this information with computer scripting allows insight into how "space and place" shaped economic, political, and cultural status over time, patterns that would be missed through manual interpretation of the same data.
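To make the general approach concrete, the sketch below shows one very simplified version of this kind of place-name counting in Python: it scans a folder of OCR'd newspaper issues and tallies how often a handful of city names appear. The place list, the folder name houston_daily_post, and the plain-text file layout are all assumptions for illustration, not Blevins's actual code or gazetteer.

```python
import re
from collections import Counter
from pathlib import Path

# Hypothetical list of place names to track; a real project would use a
# much larger gazetteer drawn from historical sources.
PLACES = ["Chicago", "New York", "New Orleans", "Galveston", "St. Louis"]

def count_places(text, places):
    """Tally how often each place name appears in one issue's OCR text."""
    counts = Counter()
    for place in places:
        # \b keeps "Galveston" from matching inside a longer word
        counts[place] += len(re.findall(r"\b" + re.escape(place) + r"\b", text))
    return counts

def mine_corpus(folder):
    """Sum place-name counts across every plain-text issue in a folder."""
    totals = Counter()
    for path in Path(folder).glob("*.txt"):   # assumed one .txt file per issue
        totals += count_places(path.read_text(encoding="utf-8"), PLACES)
    return totals

if __name__ == "__main__":
    # "houston_daily_post/" is a hypothetical directory of digitized issues.
    for place, n in mine_corpus("houston_daily_post").most_common():
        print(f"{place}: {n}")
```

The resulting counts could then feed a chart or map of the kind described above, showing which distant cities loomed largest in the paper's view of the world.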

Technology comes with a cost and requires people with deep knowledge of their fields to run an effective program. It is not just software that allows the visualization of history, but the creation of a community that comes together for the benefit of scholarship and the expansion of the digital humanities. However, through manual engagement with sources we maintain an emotional attachment to the past that no digital research resource can replace. So until data mining can comprehend every source and think critically about data the way the human mind does, we will still need to sort through some material by hand, piece by piece, line by line, word by word, preserving the human bonds to the academic world.