In this section
What's New - Issue 37, September 2011
In this issue:
- What's On - forthcoming events from September 2011 onwards
- What's new - new reports and initiatives since the last issue
- What's what - 'Do Good Work' - William Kilbride and Andrew McHugh contemplate success in digital preservation
- Your View? - comments and views from readers
What's new is a joint publication of the DPC and DCC
The DCC have a number of events coming up that may be of interest to you. For further details on any of these, please see our DCC events listings at http://www.dcc.ac.uk/events/. You can also browse through our DCC events calendar to see a more extensive list of both DCC and external events.
Practical Tools for Preservation: A hackathon (Hosted jointly by the DPC and the OPF)
27 September – 29 September 2011 York
The Digital Preservation Coalition (DPC) and the OPF are co-hosting a hackathon which aims to bring together digital preservation practitioners, collection curators and technical experts to present problematic digital collections, articulate requirements for their assessment, and then apply tools to automate the detection and identification of the content issues. It builds on the success of the AQuA project.
PrestoCentre Training Course 2011
12 - 16 September 2011
The audiovisual (AV) record of the 20th century is at risk. Digitisation offers a solution, but it creates a new problem: the preservation of digital AV content. Managers and technical staff in the AV industry need to know about, and understand how to use, the latest digital preservation technologies in order to safeguard these documents of cultural heritage. Based on the experiences of some of the largest audiovisual and broadcast archives in Europe, this training will give a complete account of the tools and technologies available for the digital preservation of, and access to, audiovisual content, outlining strategies, workflows and architecture planning.
DCC Roadshow: Oxford
14-16 September 2011
The fourth DCC Roadshow is being organised in conjunction with the Oxford eResearch Centre, Oxford University Computing Services and the Bodleian Libraries. This free event will bring together participants to share examples of current practice and emerging strategies and provide an opportunity to discuss ideas on how to improve data management in light of current economic constraints. The event will run over three days and will provide participants with advice and guidance tailored to a range of different roles and responsibilities within their institutions.
DDI Workshop: Managing Metadata for Longitudinal Data - Best Practices
18-23 September 2011
Longitudinal survey data carry special challenges related to documenting and managing data over time, over geography, and across multiple languages. This complexity is often a barrier to building efficient systems for data access and analysis. DDI (Data Documentation Initiative) Lifecycle, a metadata standard that addresses the full life cycle of social science research data (formerly referred to as DDI 3), is designed to provide an efficient structure for the documentation of complex longitudinal data. In this workshop, participants involved in longitudinal data projects around the world will work together on issues involved in documenting longitudinal data.
JISC e-Learning Programme workshops on developing digital literacies
22 September 2011
A series of free workshops from the JISC e-Learning Programme on developing digital literacies is taking place during June–October 2011. These workshops will offer the latest in organisational thinking and educational development around digital literacies. Participants will hear the outcomes of recent JISC-funded activity in this area and be given the chance to share their own experiences through structured activities. Proven resources in support of staff and curriculum development and institutional change will be available to download, adapt and use in participants' own contexts of work.
Looking after your research data: a workshop for trainers
22-23 September 2011
This workshop, hosted by the UK Data Archive, will showcase outputs from their Researcher Development Initiative (RDI) project on Training in Data Management and Sharing for Researchers. The workshop is aimed at people who are tasked with training or teaching researchers - at all levels - in how to look after social research data, specifically UK-based lecturers, tutors, graduate teaching assistants and research support staff in universities, colleges and research organisations. Over one and a half days, participants will work through a series of training modules covering the seven key areas of data management identified by the UK Data Archive. The workshop will conclude with data management clinics giving a chance to discuss participants' individual challenges.
The Future of the Past of the Web
7 October 2011
The workshop has a programme of invited talks and discussion panels by UK and international speakers, featuring use cases of web archives and exciting new developments.
INSPIRE directive and the social sciences workshop
7 October 2011
The UK Data Archive and EDINA are hosting a workshop on the European INSPIRE Directive and how this directive matters for the social sciences. This workshop is co-ordinated by the JISC funded geospatial projects U•Geo and GECO and is held at the UK Data Archive, University of Essex.
DCC Roadshow: Brighton
The fifth DCC Roadshow is being organised in conjunction with the University of Sussex Library. This free event will bring together participants to share examples of current practice and emerging strategies and provide an opportunity to discuss ideas on how to improve data management in light of current economic constraints. The event will run over three days and will provide participants with advice and guidance tailored to a range of different roles and responsibilities within their institutions.
Preservation Of Complex Digital Objects Symposia (POCOS) event
11-12 October 2011, Glasgow
Preservation of software art presents challenges on many fronts, including complex interdependencies between objects; time-based and interactive properties; and diversity in the technologies and practices used for development. This exciting two-day symposium will provide a forum for participants to discuss these challenges, review and debate the latest developments in the field, witness real-life case studies, and engage in networking activities.
Intellectual Property Rights for Preservation: DPC Briefing day
21 November 2011, Bristol (Details to be announced)
The current issue of JISC Headlines includes two podcasts on research data management from leading experts in the field.
Quality data underpins research excellence by Kevin Schurer, Pro-Vice Chancellor (Research and Enterprise) at the University of Leicester
The social life of data by Professor David De Roure, Professor of e-Research at the University of Oxford e-Research Centre
Data centres at heart of UK data sharing culture
Ahead of JISC’s conference on ‘Research Integrity: the importance of good data management’ next week, a new study by JISC and the Research Information Network has found that data centres have been instrumental in developing a culture of data sharing among researchers.
JISC Legal Cloud computing and the law toolkit
The aim of this resource is to guide educational professionals through the legal aspects of implementing cloud computing solutions in their institutions.
International list of research data repositories
DataCite, BioMed Central and the DCC are pleased to announce an international list of repositories for research data. The list is a working document, so please get in touch to suggest changes or additions.
Cloud services for education and research - projects and partners announced
Since announcing a £12.5 million fund in February that aims to help universities and colleges deliver better value for money by working together more effectively, HEFCE and JISC are now able to confirm the projects and partners appointed to deliver the two parts of this work: a national cloud infrastructure and supporting services. JANET (UK) will deliver the national brokerage to aid procurement of cloud services between higher education institutions and commercial suppliers and Eduserv will provide a pilot cloud infrastructure for higher education institutions. Other partners include De Montfort, Exeter, Edinburgh, Kent, Liverpool John Moores, Oxford, Leicester, Southampton and Sunderland universities. The Digital Curation Centre (DCC) will develop data management tools and training capability. This will support the production and implementation of data management plans for universities and their researchers to preserve data for sharing, re-use and citation.
Key Speakers announced for IDCC11
IDCC11 will feature opening and closing keynotes from a panel of distinguished contributors. Ewan McIntosh, Founder and CEO of NoTosh, will speak on the subject "Public data opportunities". Professor Philip E Bourne, Editor-in-Chief of PLoS Computational Biology will give a keynote to open the second day of the conference entitled "Open data driving scholarly communications in 2020". The closing keynote address of the conference, "Models, Science, Openness", will be given by Professor Stephen Emmott, Head of Computational Science at Microsoft Research.
The programme will also include invited speakers in plenary sessions together with an interactive afternoon, "Community Space" for posters, demonstrations and informal meetings and a symposium on Personal Genomics. Research tracks will allow for presentation and discussion of peer-reviewed papers covering all types of digital curation research and scholarship. Practice tracks allow for presentation of experiences of all sorts in institutions, research domains or regions.
The Future of the Past – Shaping new visions for EU-research in digital preservation
This workshop started with a stock-take of achievements and ongoing activities funded under the ICT programme, presenting the portfolio of digital preservation projects and the research roadmaps proposed by the community so far. The main part of the workshop consisted of group discussions providing input to the digital preservation research agenda within the next EU framework programme for research and innovation (Common Strategic Framework, 2013-2020). The executive summary and full report from the workshop along with the report "Research on Digital Preservation within projects co-funded by the European Union in the ICT programme" are now available.
VIDaaS project launched
The VIDaaS (Virtual Infrastructure with Database as a Service) Project, based at Oxford University Computing Services, was formally launched in May 2011. The project has two key interconnected aims. First, it will develop a Web-based database service (the DaaS) for academic researchers, which will allow users to create, work with, and share databases online. This offers advantages such as automatic secure backup, the ability to access data from anywhere, and allowing multiple collaborators to work on the same database. For those who wish to do so, it will also offer a straightforward means of publishing datasets on the Web. Secondly, the project will develop a virtual infrastructure which will enable the DaaS to function within a cloud computing environment. Cloud technology – that is, the use of a network of computing resources to store and process data – has the potential to save organisations a considerable amount of money, through economies of scale and enhanced flexibility of provision.
Geospatial Data Preservation Resource Center
A new Web site, the Geospatial Data Preservation Resource Center, aims to help those responsible for producing and managing geospatial data learn about the latest approaches and tools available to facilitate long-term geospatial data preservation and access. The Web site provides descriptions and links for a variety of relevant resources, including education and training modules, useful tools and software, information on policies and standards for preserving geospatial data, and examples of successful preservation and associated benefits. This first release of the Web site, which CIESIN will be enhancing over the next year, was developed as an element of the National Digital Information Infrastructure and Preservation Program (NDIIPP) of the Library of Congress.
CERIFy Data Surgery report
If you work with research information, CRIS systems and the CERIF standard, then the latest report from the CERIFy project may be of interest to you. The CERIFy project was funded by JISC from February to July 2011 to investigate how the CERIF standard (Common European Research Information Format) for Current Research Information Systems (CRISs) could be used more widely within the sector to manage research data, to increase such engagement to a critical mass, and to support an emerging community of practice in Research Information Management (RIM).
Celebrate Liberation – A worldwide competition for open software developers & open data
Open up educational resources legally with new JISC tools
Making your educational resources openly available is not always straightforward when there are multiple licences involved – but two new JISC online wizards can help navigate the issues. Amber Thomas, programme manager at JISC, said, “These are really useful tools for aiding the remix of creative commons licensed content. The wizards are very simple to use, and we hope they will be useful to many people.” The wizards navigate through the licence compatibility issues which arise when blending Creative Commons (CC) licensed resources into open educational resources. They have been created for use by JISC-funded open educational resources projects, but it is anticipated that they will also be applicable to other projects throughout the creative industries internationally.
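The kind of licence-compatibility logic such wizards must encode can be illustrated with a short sketch. The rules below are a deliberately simplified, hypothetical reduction of Creative Commons remix rules (NoDerivatives forbids remixing; two different ShareAlike licences conflict; NC and SA terms propagate to the combined work) – they are not the actual implementation of the JISC wizards, and the function names are invented for the example.

```python
# Simplified, illustrative Creative Commons remix-compatibility check.
# NOT the JISC wizards' logic -- a sketch of the kind of rules such a
# tool must encode.

ND = "ND"   # NoDerivatives: the work cannot be remixed at all
NC = "NC"   # NonCommercial: restriction propagates to derivatives
SA = "SA"   # ShareAlike: derivative must carry the same SA licence

def terms(licence: str) -> set:
    """Extract restriction terms from a code like 'CC BY-NC-SA'."""
    return set(licence.replace("CC BY", "").strip("-").split("-")) - {""}

def can_remix(a: str, b: str) -> bool:
    """Can works under licences a and b be blended into one derivative?"""
    ta, tb = terms(a), terms(b)
    if ND in ta or ND in tb:
        return False          # ND forbids derivative works entirely
    if SA in ta and SA in tb and ta != tb:
        return False          # two different ShareAlike licences conflict
    return True

def resulting_terms(a: str, b: str) -> set:
    """Restrictions the combined work must carry (NC and SA propagate)."""
    return (terms(a) | terms(b)) & {NC, SA}
```

For instance, a CC BY photograph can be blended with CC BY-NC text, but the result must remain NonCommercial; a CC BY-SA work cannot be blended with a CC BY-NC-SA one at all.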
Terrier, IR Platform v3.5
Terrier 3.5, the latest version of the open source IR platform from the University of Glasgow (Scotland), has been released. Version 3.5 represents a significant update over the previous 3.0 release of Terrier.
RCS Project Outputs
The JISC funded Research Communications Strategy (RCS) project, based at the Centre for Research Communications – University of Nottingham has now come to a close. The final outputs include exploration of Open Access and the role of social networking in scholarly communications.
JISC calls for all metadata to be openly accessible
Unlocking the descriptive information or metadata about digital content, articles, books and research is the key to making it more useful, according to the JISC-funded resource discovery taskforce as it embarks on a new programme of work. If all UK metadata was made openly accessible, the taskforce says, then the resources themselves would be more visible and it would be easier to build innovative new ways for researchers, teachers and students to explore the resources. Twelve national organisations have signed up to a new set of open metadata principles and now JISC is inviting all publicly funded organisations including universities, colleges, libraries, museums and archives to make the same commitment. Signing up means that organisations are committed to supporting the principles and looking for opportunities to carry them out in whatever they do – whether this is building new ways to present unique collections or in contributing to national shared services for managing collections.
Open data challenge: what treasures can you find?
Libraries, museums and archives have recently taken to experimenting with open data with a vengeance and now the JISC Discovery programme with the DevCSI project are running an international competition to see what people can unearth in this data. The challenge is to develop an application which allows people to discover the treasures hidden in one of ten datasets - from the shipwrecks lurking off UK shores, to the metadata behind Jane Austen’s will online.
OpenDOAR reaches its 2000th Repository
OpenDOAR provides a comprehensive, authoritative and quality-checked list of institutional and subject-based repositories. In addition it encompasses archives set up by funding agencies like the National Institutes of Health in the USA and the Wellcome Trust in the UK and Europe. SHERPA Services is delighted to announce that the OpenDOAR directory now boasts over 2000 repository entries from across the globe.
SWORD v2 client funding opportunity
In order to increase the number of SWORD v2 client implementations, JISC has made over £5,000 available to fund new SWORD v2 clients. The majority of this money is being made available through a competitive call for projects. We are seeking developers or development teams to submit ideas for creating new SWORD v2 clients, either by upgrading existing SWORD clients, building SWORD functionality into other scholarly communications tools, or developing entirely new deposit tools. In addition, a small amount of the money will be used by the original SWORD v2 team to provide technical support to the winning developers, ensuring that the projects have access to all the help and support they need. Entrants are encouraged to make use of the existing SWORD v2 client code libraries. Using the existing client code libraries will lower the development effort needed, enabling rapid, efficient, and cost-effective development. Proposals to add SWORD v2 support to existing well-adopted and mature systems are particularly welcome.
Fedora 3.5 released
This release of Fedora, the robust framework for building digital repositories, focuses on several "under the hood" changes that improve Fedora's ability to be integrated and tested as part of larger repository systems.
Automating quality assurance and assessment of digital collections with AQuA
Manual quality assurance of digitised content is typically fallible and can result in collections that are marred by a variety of quality issues. Poor storage conditions can result in further damage due to bit-rot. Technological obsolescence can lead to additional risks. Detecting, identifying and fixing these issues in legacy digital collections are costly and time consuming manual processes. Identifying problems in a timely manner following digitisation or acquisition supports more effective and cost efficient mitigation. The AQuA Project applied a variety of existing software tools in order to automate quality assurance and assessment. Results from the JISC funded Automating Quality Assurance Project Mashups can be found on the project wiki.
KRDS Digital Preservation Benefits Analysis Toolkit
The Toolkit consists of the KRDS Benefits Framework (Tool 1) and the Value-chain and Benefits Impact tool (Tool 2). Each tool consists of a detailed guide and worksheet(s). Both tools have drawn on partner case studies and previous work on benefits and impact for digital curation/preservation.
DPC responds to EC Science Policy Consultation
The DPC has responded to a new policy consultation from the EU regarding preservation of and access to scientific information. Preservation has a particular importance for scientific information because meaningful innovation is necessarily responsive to previous generations of research. In that sense, preservation of appropriate research outputs is essential to all sciences, especially for unrepeatable experiments or unique moments of discovery. Aspirations about access to information are meaningless without commensurate actions to ensure preservation. We welcome all actions that will encourage a dialogue between and within member states to ensure the preservation of scientific information and we call on the EU to engage in that dialogue as a matter of urgency, using existing examples of best practice to help build capacity.
Three new digital continuity posts advertised at Archives New Zealand
Archives New Zealand seeks to appoint three senior advisors in Digital Continuity: one permanent post; one fixed-term post providing parental leave cover until 1 September 2012; and one fixed-term post until 26 October 2012.
Leeds University joins the Digital Preservation Coalition
The University of Leeds has joined the Digital Preservation Coalition. 'Over the last few years our digital collections have grown and diversified', explained Bo Middleton of the University Library. 'They represent a considerable investment and we must move to protect these assets through active preservation.'
Editorial: Do Good Work!
William Kilbride (DPC) and Andrew McHugh (DCC/HATII) muse on what success looks like in digital preservation.
Gus Grissom didn’t seek fame: it was thrust upon him. Stagecraft and oratory were not among the criteria used to select the first seven astronauts of the Mercury programme. The first American astronauts – selected to be the first humans in space – came from an elite cadre of active military test pilots with experience of high-speed, high-altitude, high-risk aviation. Grissom was a natural stick and rudder man – massively qualified to fly complex aircraft but with little preparation for the fame that came with space travel. ‘Asking him to say a few words was like handing him a knife and asking him to open a vein’ (Wolfe 1979). Perhaps his most celebrated ‘few words’ came to the 18,000 workers of the Convair plant which was assembling the Atlas rocket upon which he and his colleagues would be thrust into space – or blasted to bits (Boomhower 2004). His concise pep talk was the mildly ironic but entirely genuine instruction: ‘Well … do good work.’
You can’t test a rocket: it works or it doesn’t. The only coherent test is the extent to which the product matches the design specifications, and that makes brave assumptions about the design of things which have never been shown to work before. Grissom’s ‘Liberty Bell’ capsule worked fine up to the point that it splashed down. A set of explosive bolts blew the hatch off giving him precious little time to struggle free from the rapidly sinking ship. (The bolts functioned entirely as planned: it was the function that was wrong.) Subsequent craft had more subtle and more complicated double hatches which could not blow off by accident. It seemed entirely sensible until the Apollo One fire which killed Grissom and two colleagues. You really can’t have that much electrical charge and that much exposed wiring and that much velcro in a capsule with 100% oxygen pressurised to five times above the norm and a hatch mechanism like that. Design, not implementation, was the problem.
I’m not sure we yet have convincing tests for digital preservation and I’m not sure how to spot ‘good work’. If anything it’s getting harder to test. We can measure the extent to which a product meets certain specifications and we can test individual components against localised criteria but that’s not the same thing. The more we depend on components the more we need to be circumspect about the overall design.
Let’s start by celebrating the bits where we’ve made good progress. We have had some great successes at the component level. The PLANETS test bed, for example, allows us to examine the performance of migration tools against criteria which we can select and define. This is critical because it provides a basis to evaluate different approaches and thus identify ‘good work’. This experimental and evidential approach is thorough and methodologically sound, and it works really well for a small but ubiquitous set of file types for which there are multiple migration tools, especially when you want to plan image migration. It has its origins in the DELOS Digital Preservation Testbed and has been scrutinised and tested over several iterations. But a testing framework is only as good as the materials it experiments with. So it is less useful for more exotic formats where resources are sparse, and not useful at all where the options are highly constrained.
This is just one example of where experimentation helps to validate a component. Other parts of our architecture can be tested in other ways or against industry standards which others have provided – the energy consumption and durability of storage devices can be directly quantified. But these are components – like the bolts on the hatch. What about the overall architecture? Who is checking this?
The idea of a trusted digital repository has been in existence since at least 2000 when RLG and OCLC first established a joint working party to examine the attributes of a repository (OCLC/RLG 2002) that complied with the (then newly emerging) Conceptual Reference Model for an Open Archival Information System ‘OAIS’ (CCSDS 2002). In this they were following a path already suggested by the CPA/RLG Task Force on the Archiving of Digital Information which first met in 1994 and sought to ‘advance the development of trusted systems for digital preservation’ (CPA/RLG 1996 iv) and envisaged a world where a ‘certified archival repository’ would ‘be able to prove that they are who they say they are by meeting or exceeding the standards and criteria of an independently-administered program for archival certification’ (CPA/RLG 1996 9).
There are three things worth noting about these attempts to design preservation systems. Firstly, we’ve been at this a long time now; secondly, our ideas about how to measure success have tended to fragment over that time; and perhaps most importantly, the basis on which institutions provide preservation services has become increasingly distributed, so that the definitional integrity implied in the ‘trusted repository’ has eroded.
The first point is relatively simple to deal with. Success in digital preservation can only be delivered in two slightly puzzling forms. It is either deferred to a point in the future and thus intangible in the present – meaning we can only point to ongoing efforts or improvements in what we used to do. Or success is delivered through a value chain in which case our success is compounded with others and difficult (impossible?) to assess directly. In either case, time is itself a measure of success. And until such times as technology or people stop changing it’s unlikely we will ever settle on definitive and comprehensive success measures.
The second point is more concerning and it points to a fundamental uncertainty on three topics: who should certify; how to certify; and why to certify.
There is a steadily growing body of standards which can be applied to claim success in the world of digital preservation. It is easy, and in some communities still fashionable, to claim compliance with OAIS as a sign of success. This reveals a basic misunderstanding of the purpose and nature of the reference model. More appropriate tools have been derived from OAIS, such as the Trustworthy Repositories Audit and Certification: Criteria and Checklist ‘TRAC’ (CRL/OCLC 2007). These tools have even been tried in a number of test audits, and a number of agencies have benefited from and contributed to them. Simultaneously, a lightweight accreditation has emerged for research data centres, called the Data Seal of Approval (Harmsen 2009, DSA n.d.). The German competence network ‘Nestor’, which is also a founding partner in the Data Seal of Approval, has published its own catalogue of criteria for trusted digital repositories, originally designed to provide a progressive link between coaching, self-assessment and external audit (Nestor 2006, Dobratz et al 2007), while in the UK the Digital Curation Centre has developed and promoted a Digital Repository Audit Methodology Based on Risk Assessment ‘DRAMBORA’, which has provided an additional analysis of the risks that repositories face (McHugh et al 2007). Authors of the latter have argued that the demonstrable management of preservation risks is as good a measure of success in this context as any other.
While risk is a meaningful measure of preservation success, it can also be considered the most compelling factor in the selection of preservation approaches. One plans with risk in mind, and evaluates those plans in terms of the resulting risk exposure. The outcome of preservation planning is a distilled state of the world viewed in both objective and subjective terms. With the increasing availability of top-down, prescriptive best practice criteria, the extent to which our validation can be undertaken objectively must be questioned. Digital preservation describes the pursuit of a relatively common goal, and is a consideration with cross-sector and multi-disciplinary applicability. But a shared fundamental objective, to ensure the availability, usability and understandability of digital content over time, should not imply the existence of globally appropriate strategies. Success is contingent on an innumerable range of factors, diverse and variable individual priorities, emphases and contextual circumstances. Each can be tremendously influential in both presenting challenges to and facilitating preservation.
Orthogonal to these developments, a raft of regulations and standards has emerged on the broad topic of information management and retention. Because these come from clearly established authorities, sometimes with the force of law, they outflank the generic interests of the digital preservation community. In truth we frequently lack the capacity to inform such developments or the wherewithal to engage with their creation and enforcement. For example, BS 10008 defines a new set of expectations about the custody and evidential requirements associated with electronic information which will be considerably more compelling for some preservation services than others (BSI 2008). Anyone operating an evidential repository will need to prioritise their response to these requirements over digital preservation standards. Sometimes these standards are quite specific: the Public Records (Scotland) Act 2011 revises the powers of the Keeper of the Records of Scotland to issue ‘advice’ around records management and disposal for public authorities and contracting authorities. This is big news if you are a contracting authority to the public sector in Scotland, and irrelevant if you are anyone else. Other standards have certification and accreditation processes of their own. It’s hardly surprising if organisations view accreditation against data security standards like ISO 27001 as a higher priority than TRAC or other digital preservation standards.
The information compliance stack continues to grow, deflecting, delaying and complicating certification of trusted repositories. Meanwhile digital preservation standards which lack a regulatory base and have little systemic authority behind them, proliferate. Periodic scandals around data protection, freedom of information, falsification, harm reduction and risk aversion can encourage senior executives to plan information management more carefully. But this is not necessarily a driver towards better preservation.
Step up the APARSEN Network of Excellence. APARSEN exists primarily to work against the trend of fragmentation in digital preservation research. Its extensive programme of work includes two related items around proving the viability of preservation services. It includes proposals on the broad topic of testbeds for preservation, and one of the early actions includes an effort to co-ordinate certification and then implement some examples. Although there may be different systems, and although these can be confusing, they are not mutually exclusive. A memorandum between DANS, DIN and CCSDS introduces and clarifies the relationship between these three methods, so we can now talk about ‘basic certification’, ‘extended certification’ and ‘formal certification’. Passing one of these tests will allow organisations to display an appropriate mark on their website. APARSEN has now completed the first round of trial audits and the results are eagerly anticipated, as they will reflect not only on the participants but on the process too. They will also be very helpful for those developing training materials in preservation.
One can muse at length on the differences between audit and certification, and the respective benefits (and difficulties) associated with each. The former implies a systematic process, no doubt required by the latter, but of value independently. Certification’s success demands an appetite from a range of associated stakeholders (which appears to exist, based on informal evidence). Less formally validated evaluation can be an end in itself, with few of the organisational, legal and financial dependencies one associates with a successful certification infrastructure. A reasonable way to distinguish their respective roles is in terms of their capacity to offer reassurance. Repository certification provides this to a wider range of stakeholders, including external ones such as data creators, providers or consumers. An audit or series of audits, on the other hand, can have the same effect, albeit localised to those performing the preservation activities, who nevertheless have to make the same best guesses, and struggle with the same doubts, as the rest of us.
Preparing for an audit can be quite a lot of work, so for these standards to be widely used and accepted there is a need for a whole community to prepare. The DPC’s contribution to this effort is focussed on its own members. By establishing a new peer review process of tools and services we will provide members with a conspicuous and independent measure of quality; clearly articulated improvement plans for their repositories; an opportunity to learn from other members; and a means to prepare for more formal external certification. Such work will be useful for funders who seek to designate repositories and, by improving practice, it will support the DPC’s core mission of ensuring that our digital data is accessible tomorrow.
Grissom’s comical understatement went down a storm with the workers in the Convair plant. The audience roared its approval so loudly that Grissom and other dignitaries were practically driven from the stage, and a massive banner with the slogan ‘Do good work!’ was soon raised over the plant (Schefter 1999, 84-5). It helped that Grissom was willing to strap himself to one of their rockets while someone lit the touch paper. Perhaps it’s a case of instant empathy. Digital preservation is never as dramatic as that and there’s precious little public profile. It’s hard to empathise with the future and this is not a race to be the first anywhere. This is a race we can all win, and we’re more likely to succeed if we’re able to admit our frailties as well as our successes. Lives and fortunes and ideas and truths will be degraded if we fail to do good work.
Boomhower, RE 2008, Gus Grissom: The Lost Astronaut, Indiana Biography Series, Indiana Historical Society
British Standards Institution 2008, BS 10008: Evidential weight and legal admissibility of electronic information: Specification, British Standards Institution
Center for Research Libraries / OCLC 2007, Trustworthy Repositories Audit and Certification: Criteria and Checklist, Center for Research Libraries and OCLC, online at: http://www.crl.edu/sites/default/files/attachments/pages/trac_0.pdf last checked 06/07/2011
Consultative Committee for Space Data Systems 2002, Reference Model for an Open Archival Information System (OAIS), CCSDS 650.0-B-1 Blue Book, online at: http://public.ccsds.org/publications/archive/650x0b1.PDF last checked 06/07/2011
Commission on Preservation and Access / Research Libraries Group 1996, Preserving Digital Information: Report of the Task Force on Archiving of Digital Information, RLG, online at: http://www.oclc.org/research/activities/past/rlg/digpresstudy/final-report.pdf last checked 06/07/2011
Data Seal of Approval n.d., Data Seal of Approval: Quality Guidelines for Digital Research Data, online at: http://www.datasealofapproval.org/sites/default/files/DSA%20booklet_2-0_engels_mei2010.pdf last checked 06/07/2011
Dobratz, S, Schoger, A and Strathmann, S 2007, ‘The nestor Catalogue of Criteria for Trusted Digital Repository Evaluation and Certification’, Journal of Digital Information 8, online at: http://journals.tdl.org/jodi/article/viewArticle/199/180 last checked 06/07/2011
Harmsen, H 2009, ‘Data seal of approval – assessment and review of the quality of operations for research data repositories’, in Proceedings of the Fifth International Conference on the Preservation of Digital Objects – Joined Up and Working: Tools and Methods for Digital Preservation, 220-222, online at: http://www.bl.uk/ipres2008/ipres2008-proceedings.pdf last checked 06/07/2011
McHugh, A, Ross, S, Ruusalepp, R and Hofman, H, The Digital Repository Audit Method Based on Risk Assessment (DRAMBORA), University of Glasgow
Nestor 2006, Kriterienkatalog vertrauenswürdige digitale Langzeitarchive Version 1, Nestor, online at: http://edoc.hu-berlin.de/series/nestor-materialien/2006-8/PDF/8.pdf last checked 06/07/2011
OCLC / Research Libraries Group 2002, Trusted Digital Repositories: Attributes and Responsibilities, RLG, online at: http://www.oclc.org/research/activities/past/rlg/trustedrep/repositories.pdf last checked 06/07/2011
Schefter, J 1999 The Race: the Complete True Story of How America Beat Russia to the Moon, New York, Anchor Books
Wolfe, T 1979, The Right Stuff, Cape, London