LSC 555

Institutional Repositories, WikiLeaks, and the Network Age


In this paper, I will first define institutional repositories and assess how diplomatic information, correspondence, and memoranda might fit into that definition.  Then I will trace the evolution of the mandate within our government to increase sharing of information across agencies, precipitated by the recommendations of the 9-11 Commission Report.  I will briefly note society’s transformation from the Information Age to the Network Age and its impact on information sharing.  Following a short review of selected elements of forensics in digital librarianship, I will hypothesize how all these forces converged, resulting in the WikiLeaks release of sensitive, classified information to public media outlets.  I will conclude with thoughts about the public interest and information governance within the framework of access to government information.

Institutional repositories defined

“Institutional repository,” defined generically by Crow (2002) as “digital collections that provide access to the intellectual output of an institutional community,” is a term seldom applied to the extensive information holdings of a nation’s foreign affairs and national security superstructure.  But let us face the facts.  First, the Department of State, one foreign affairs agency among many, receives thousands of classified reporting cables per month originating from embassies and consulates throughout the world, distributed to U.S. embassies in nearby countries with a need to know, disseminated throughout the foreign affairs establishment in Washington and, in many cases, shared with close foreign allies.  Second, hundreds of thousands of government historical documents, diplomatic telegrams, memoranda, etc., are maintained in chronological order for ultimate declassification and/or for distribution to the public upon request under the Freedom of Information Act.  Third, the Department of State’s Ralph Bunche Library, the oldest library in the Federal Depository Library system (established by then Secretary of State Thomas Jefferson in 1789), holds 400,000 print volumes and several thousand e-books, along with access to numerous electronic databases.

Yeates (2003) provides a similarly applicable definition of institutional repository: “An institutional repository is the collective intellectual output of an institution recorded in a form that can be preserved and exploited…repositories are key to the ability of institutions to respond to future needs for more dynamic cross-boundary communications services” (Yeates, 2003).  One important drawback of institutional repositories cited by Yeates is reliance on unproven methods for long-term digital preservation.  We will come back to that later.

One may conclude, then, that while not commonly considered as such (in large part, I suspect, because institutional repositories are most often considered the exclusive domain of the academic environment), the extensive information holdings of the Department of State (DOS) may and should qualify as an institutional repository.  Moreover, it is not without precedent for a non-academic organizational body to adopt the institutional repository designation, especially if attached directly to a research-type institution.  As one example, Romary and Armbruster describe a decision by the European Commission, in conjunction with the European Research Council, to create an institutional repository (Romary & Armbruster, 2009).

9-11 Commission mandates information sharing

Chapter 13 of the 9-11 Commission Report calls for a different way of organizing government in the wake of the terrorist attacks of September 11, 2001 (Kean, 2011).  Specifically, it cites information-sharing failures that prevented analysts and intelligence specialists from “connecting the dots” in advance and thereby averting the attacks.  Various pieces of information about the attackers and their presence in the United States existed in various databases (for our purposes, in various institutional repositories) across government, but the information was not shared across agencies.  The report made the following two recommendations:

“Information procedures should provide incentives for sharing, to restore a better balance between security and shared knowledge,” and

“The president should lead the government-wide effort to bring the major national security institutions into the information revolution.  He should coordinate the resolution of the legal, policy, and technical issues across agencies to create a ‘trusted information network.’”

In response to these recommendations, agencies including the DOS and the Department of Defense (DOD) immediately sought ways, technologies, and methodologies to achieve the recommended intra-government information sharing that had previously been non-existent at worst and insufficient at best.  While the Ashcroft memorandum, released immediately after the 9/11 attacks, was a knee-jerk response designed to restrict public access to sensitive information (Uhl, 2003), biannual GAO reports track steps taken within government to improve information sharing across agencies (Walker, 2004).  And as in most matters involving government bureaucracy, efforts and responses occurred, and continue to occur, in sporadic and uneven fashion.

SIPRnet opens the door to information sharing

The Secret Internet Protocol Router Network, also known as SIPRnet, had long been known in military circles as the network over which DOD distributed sensitive information among its various computer systems (Weinberger, 2010).  Largely in response to the recommendations of the 9-11 Commission Report, access to SIPRnet was expanded to include more members of the military, the intelligence community, and overseas-stationed employees and contractors of DOD, DHS, and DOS.  The BBC reported that at the time of the WikiLeaks theft, over 2.5 million civilian and military members had access to SIPRnet (BBC, 2010).  Meanwhile, State developed its own repository of diplomatic communications, called the Netcentric Diplomacy Initiative, which allowed the sharing of its documents with, and the hosting of its documents on, SIPRnet.  Connection to and sharing with SIPRnet in effect expanded the size of State’s institutional repository by orders of magnitude.  At this point, military members with access to SIPRnet also had open access to tens of thousands of diplomatic telegrams from U.S. embassies around the world.  In response to the WikiLeaks release of classified telegrams, State’s Netcentric Diplomacy Initiative has since suspended its access to DOD’s SIPRnet (Hoover, 2012).  State plans to have in place sometime in 2014 a metadata tagging system that will allow cataloging and retrieval of data from its vast repository of diplomatic telegrams, but that capability does not exist at present.
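The planned metadata tagging capability can be pictured in institutional repository terms: each telegram receives descriptive metadata at ingest, and retrieval filters on those fields. The sketch below is purely illustrative Python; the field names (post, classification, subjects) are my own hypothetical choices, not State’s actual schema, which is not public.

```python
from dataclasses import dataclass, field

@dataclass
class Cable:
    """A diplomatic telegram with illustrative (hypothetical) metadata fields."""
    cable_id: str
    post: str                 # originating embassy or consulate
    classification: str       # e.g. "CONFIDENTIAL", "SECRET"
    subjects: list[str] = field(default_factory=list)

def search(cables, post=None, subject=None):
    """Return cables matching every supplied metadata filter."""
    hits = []
    for c in cables:
        if post is not None and c.post != post:
            continue
        if subject is not None and subject not in c.subjects:
            continue
        hits.append(c)
    return hits

# A toy two-record repository to exercise the filters.
repository = [
    Cable("09CAIRO123", "Cairo", "SECRET", ["economy"]),
    Cable("10PARIS456", "Paris", "CONFIDENTIAL", ["trade", "economy"]),
]
economy_cables = search(repository, subject="economy")
```

Without some such tagging layer, the only access path into millions of documents is brute-force reading, which is precisely the file-glut condition discussed in the digital decay section.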

Simultaneously, the big intra-governmental push to share information also heralded the requirement for government to “join the information revolution” (Kean, 2011).  By the time of the 9/11 attacks, however, social theorists had already postulated, a year earlier, the transition from the information age to the network age, “one which goes beyond primarily content in the form of data to also embrace distribution infrastructure, data management and the linkages between content producers and consumers” (Brevini, Hintz, & McCurdy, 2013).   The idea of the network society also speaks to the role of the institutional repository in the evolution of government information systems.  While the Information Age gave rise to new ways of generating, managing, and disseminating information through the use of new technologies (Paul, 2012), the Network Age broke the great-power and sovereign-state monopolies on information of the prior Information Age through the creation of media and social networks with expanded power and reach (Brevini, Hintz, & McCurdy, p. 89).  There also appears to be a correlation between the evolution from Information Age to Network Age and the evolution from Web consumers and downloaders in Web 1.0 to Web producers and uploaders in Web 2.0.

Digital decay opens the door to leakages

This information-rich environment might have remained the exclusive domain of trusted military and diplomatic operators.  The data security measures in place and information security (INFOSEC) incentives might have been sufficient to regulate control of sensitive material.  But there was one unplanned-for element: unanticipated digital decay.

Again, the language and vocabulary of institutional repositories holds explanatory power.  Long-term preservation of digital information seems a foregone conclusion.  We believe that electronic data will remain, will retain its format, and will always be there in storage to serve us, like hieroglyphs carved on the walls of ancient Egyptian temple ruins.  Robert Fox (2011), in “Forensics of digital librarianship,” disabuses us of false notions of digital permanence by explaining causes of digital decay, some of which are quite relevant to the present discussion.  Fox enumerates the following possible causes: neglect, where improper handling or exposure degrades media quality; file glut, where files are securely stored but in so disorganized or poorly thought-out a manner that retrieving relevant digital data becomes nearly impossible; corruption, where flipped data bits, storage-system failures, and human error corrupt files; hardware failure, where disasters, backup failures, disc failures, and human error result in data loss; and obsolete formats, where obsolescence results from proprietary formats, defunct vendors, and formats requiring special, obsolete hardware and/or obsolete storage media (Fox, p. 266).  Most significant among these causes of digital decay, and most relevant to our discussion of the WikiLeaks release of classified data to public media, is the concept of file glut.  In the case of WikiLeaks, file glut did not corrupt the records and the data themselves, but it corrupted the process by which records were retrieved and transported, to the degree that a low-ranking soldier was able to burn copies of sensitive and classified material to DVDs without his supervisor or anyone in his chain of command taking notice.
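Standard digital preservation practice counters the corruption cause Fox lists with fixity checks: record a cryptographic checksum at ingest and re-verify it on a schedule, so flipped bits are detected rather than silently propagated. A minimal sketch in Python, using only the standard library (the function names here are my own, not Fox’s):

```python
import hashlib

def fingerprint(path, algorithm="sha256"):
    """Compute a fixity checksum for a stored file, reading in chunks."""
    h = hashlib.new(algorithm)
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def verify(path, recorded_digest):
    """True if the file still matches the digest recorded at ingest."""
    return fingerprint(path) == recorded_digest
```

Note that fixity checking addresses corruption but not file glut: a perfectly intact file that no one can find or account for is still, for practical purposes, outside the repository’s control.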

Summary and conclusions

The elements in place were the following: a foreign affairs agency with an institutional repository that does not call it one; a mandate, following a vicious terrorist attack, to share information across agencies; the transformation from the Information Age to the Network Age, which expanded the power and reach of social and media networks beyond the power of the sovereign state; and file glut, resulting in decay of digital material through improper organization of digital files and subsequent poor supervision of the process of retrieving and copying highly classified documents.  The question is less why WikiLeaks happened when it did than why something like WikiLeaks did not happen sooner, and why it has not happened again.

We cannot have a discussion about government information without a discussion of transparency, the public interest, and the public’s right to know or to have access to information that affects them; in short, a democracy requires an informed public.  In the United States, the right of the public to information about their government is enshrined in the Constitution and the nation’s laws.  The Freedom of Information Act, enacted in 1966, was the first federal law to establish a legal right of access to government information (Uhl, 2003).  Unfortunately, the Ashcroft memo issued immediately after the 9-11 attacks not only curtailed access to information but strengthened the government’s ability to restrict access and gave a blanket FOIA exemption to the newly formed Department of Homeland Security (Uhl, pp. 267-269).  There has been, arguably, a steady erosion of FOIA enforcement since the 9-11 attacks.  Insofar as the repository in question is a repository for the American people, this restriction on access is a negative.  But perhaps it is a repository not for the American people but exclusively for the practitioners of foreign policy.  In that case, the public access restriction is a different thing altogether.  We have to define the stakeholders.

Who are the stakeholders in this information repository?  And who are the shareholders?  In other words, who has the greater right to the information resources managed by government agencies: the government itself, or the people who elect and are represented by their government leaders?  Embedded in that question is an equally interesting one: who within government has the stronger claim on information resources created and developed by government agencies, the executive or the legislative leadership?  It comes down, basically, to a question of information governance and management of the information transaction space.  Kooper, Maes, and Lindgreen (2011) describe information governance as “focusing on the seeking and finding, creation and use, and the exchange of information, not solely on its production,” and they view it as “a framework to optimize the value of information in some sense to the actors involved” (Kooper, Maes & Lindgreen, 2011).  They describe the actors involved as 1) the creator of the information, 2) the receiver of the information, and 3) the governing actor, i.e., the actor who regulates the interaction between the creator and the receiver.  In our case, that would include the government as creator, the public as receiver (with an implicit requirement for government transparency), and the leakers (and perhaps the press who transmitted the leaked information) as the governing actor.  The pertinent question, however, is how better or stronger information governance might have informed the actors and prevented WikiLeaks from happening in the first place, or mitigated its effects once it occurred.
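The three-actor model can be made concrete with a small sketch: the governing actor holds a policy and authorizes (or denies) each exchange between creator and receiver. This is my own illustrative Python rendering of the framework, not code from Kooper, Maes, and Lindgreen; the policy table and role names are hypothetical.

```python
class GoverningActor:
    """Regulates the interaction between information creator and receiver."""

    def __init__(self, policy):
        # policy maps a receiver role to the set of classifications it may receive
        self.policy = policy

    def authorize(self, receiver, classification):
        """Permit the exchange only if policy allows this receiver this level."""
        return classification in self.policy.get(receiver, set())

# Hypothetical policy: the public receives only unclassified material.
governor = GoverningActor({
    "public": {"UNCLASSIFIED"},
    "cleared_analyst": {"UNCLASSIFIED", "SECRET"},
})
```

In these terms, the WikiLeaks episode is a case in which the de facto governing actor (the leaker and the press) applied a policy radically different from the de jure one.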

A fuller analysis of these three actors awaits another paper, perhaps in a different subject area.  In our role as information professionals, our work laying the problem out in these terms, i.e., defining the institutional repository space (and even naming it as such), explaining the information policy roles (as there are numerous information policy applications at work), and setting forth the information governance aspects, contributes significantly to a deeper understanding of the problems that exist and facilitates their solution in a complete and interdisciplinary way.  Our work done, we pass it on to other professionals, safe in the knowledge that the completion of our tasks as information professionals made a vital contribution to the overall discourse.


Brevini, B., Hintz, A., & McCurdy, P. (2013). Beyond WikiLeaks: Implications for the future of communications, journalism and society. Palgrave Macmillan.

Crow, R. (2002). The case for institutional repositories: A SPARC position paper. ARL Bimonthly Report 223.

Fox, R. (2011). Forensics of digital librarianship. OCLC Systems & Services, 27(4), 264-271.

Hoover, J. N. (2012, April 5). State Department CIO: What’s changed since WikiLeaks. InformationWeek.

Kean, T. (2011). The 9/11 commission report: Final report of the national commission on terrorist attacks upon the United States. Government Printing Office.

Kooper, M., Maes, R., & Lindgreen, E. (2011). On the governance of information: Introducing a new concept of governance to support the management of information. International Journal of Information Management, 31(3), 195-200.

Romary, L., & Armbruster, C. (2009). Beyond institutional repositories. Available at SSRN 1425692.

BBC News. (2010, November 29). Siprnet: Where the leaked cables came from.

Uhl, K. E. (2003). Freedom of Information Act post-9/11: Balancing the public’s right to know, critical infrastructure protection, and homeland security. American University Law Review, 53, 261.

Walker, D. M. (2004). 9/11 Commission report: Reorganization, transformation, and information sharing. Government Accountability Office.

Weinberger, S. (2010, December 1). What is SIPRNet?

Yeates, R. (2003). Institutional repositories. Vine, 33(2), 96-101.


Environmental Scan, Arnold Hirshon

The article “Environmental Scan,” by Arnold Hirshon, provides a comprehensive overview of library trends and developing technologies.  This short summary highlights several ideas that “jumped off the page” for me in various issue areas.

Hirshon’s introduction cites Saffo’s rules for forecasting and Kurzweil’s Law of Accelerating Returns, both worthy of memorializing here.  The economic and social issues section includes rising costs and budgetary concerns (pre-2009 market crash), Generation Y attributes, and privacy and personal data security issues (pre-WikiLeaks and pre-Snowden NSA surveillance leaks).  Updating this section in light of events since its original publication would change considerably the focus of the article, even as it changes our outlook going forward.  The technology issues section considers the shrinking size and falling cost of computer memory and postulates resulting developments.  It foresees the falling cost of personal computers and the shift to mobile technologies.  It accurately predicts the rising popularity of social networks and gaming technologies.  The education section accurately predicts the rise of the “open movement,” more commonly embodied today in MOOC education delivery.  The information content section predicts the rise of e-books and the digitization of information resources.  Finally, the library leadership section predicts the evolution of the library platform from a purely collection function to one of advocacy for information access and space management.

The System Development Life Cycle and Digital Library Development, H. Frank Cervone (OCLC Systems & Services)

H. Frank Cervone’s article on the system development life cycle and digital library development describes a process that follows very closely the standard methodology for project management.  It outlines the eight phases of the system development life cycle, details the go/no-go decision points at critical junctures in the process, and clarifies the end products/outcomes that make up the knowledge management core of the project enterprise.   Knowing this process and its vocabulary, and how it applies in a general and universal way, is key to ensuring project success for any type of undertaking.
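The life cycle’s defining feature, the go/no-go decision point after each phase, can be sketched as a simple loop. The phase names below are generic placeholders for an eight-phase cycle, not necessarily Cervone’s exact terminology, and the review function stands in for whatever evaluation a project board would perform.

```python
# Generic eight-phase life cycle; names are placeholders, not Cervone's exact list.
PHASES = [
    "planning", "analysis", "design", "development",
    "testing", "implementation", "maintenance", "evaluation",
]

def run_life_cycle(review):
    """Advance phase by phase; 'review' is the go/no-go check after each one.

    Returns the list of phases actually entered before a no-go (or completion).
    """
    completed = []
    for phase in PHASES:
        completed.append(phase)
        if not review(phase):  # no-go decision: halt the project here
            break
    return completed
```

The point of the model is that each phase produces the outcomes on which the next go/no-go decision is made, which is why the article frames those end products as the knowledge management core of the project.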


Blog Week Two

    The Integrated Library System: From Daring to Dinosaur

    The Kinner and Rigda article, “The Integrated Library System: From Daring to Dinosaur,” traces the evolution of library management systems from the automated library system (ALS) of the 1980s and 1990s to the integrated library systems (ILS) of more recent times, using West and Lyman’s three-phase procession of the effect of information technology on organizations (Kinner and Rigda, 2009, p. 402). The three-phase procession provides a handy model that robustly captures the stages of modernization, innovation, and transformation and their effects in the changes from ALS to ILS.

    Significantly, the article also traces market forces affecting industry vendors and posits that those forces also influenced (and, not accidentally, were influenced by) the ALS-to-ILS evolution. For example, mergers and acquisitions among vendors resulted in a “reshuffling” and a consolidation of companies, giving rise to a few very large firms who were able, in turn, to reach economies of scope and of scale that may not have been previously achievable. A similar consolidation occurred in the banking and automobile industries several years ago, stimulated by post-war demand and war-developed new technologies (something I studied as an undergraduate over 25 years ago in a managerial economics course; I am not likely to remember the exact source, but I do recall a textbook entitled “The Structure of American Industry”).

    The authors make a not-so-veiled argument supporting open source ILS systems in the section entitled “Growing Library Dissatisfaction.” It closes with commercial examples and practical virtues of open source ILS applications.


    Kinner, L., & Rigda, C. (2009). The Integrated Library System: From Daring to Dinosaur?. Journal of Library Administration, 49(4), 401-417.

    The Next Generation Integrated Library System: A Promise Fulfilled

    The article, “The Next Generation Integrated Library System: A Promise Fulfilled,” by Yongming Wang and Trevor Dawes, provides a definitive and an affirmative answer to the question raised in the title of the article that precedes it in the syllabus.

    Following a brief introduction, the authors set forth the following two pillars of the second generation library automation system: comprehensive and unified management of library resources; and a departure from traditional ILS models in exchange for a service-oriented architecture (SOA) model (Wang and Dawes, 2012, p. 76).

    The literature review section cites the following three trends in libraries (Breeding): greater volume of digital collections; changing expectations vis-à-vis user interfaces; and different attitudes toward data and software (Wang and Dawes, 2012, p. 77).

    The article concludes with four aspects of next generation ILS and two examples of existing commercial applications.


    Wang, Yongming, and Trevor A. Dawes. “The Next Generation Integrated Library System: A Promise Fulfilled?” Information Technology and Libraries 31.3 (2012): 76-84.

