Review by ITaaU

This excellent review was written by Michelle Pauli of IT as a Utility, which helped fund the conference, including this blog. It is also available on the ITaaU website at www.itutility.ac.uk

Further observations from the Threats to Openness in the Digital World conference, Newcastle, 24-25 November 2015

Extreme weather metaphors were flying around at Northumbria University in Newcastle this week, where the Threats to Openness conference was looking at the future of the public record in the digital age. According to the speakers, we’re facing a digital deluge in the coming decade as a result of a perfect storm of technological, political, legal and economic changes.

The fate of government records was at the heart of the discussions, thanks to two major developments hitting archivists simultaneously. First, records now have to be transferred to the National Archives after 20 years rather than the previous 30, over a ten-year transition period that began in 2013 and involves releasing two years of records each year. Second, those sets of records will soon be the first to be predominantly digital. So, in 2016/17, digital records from the early 1990s – not least those dealing with decisions about wars in Iraq and Afghanistan – are due to be processed.


This presents challenges at all levels of the transfer process, as David Willcox, digital sensitivity review lead at the National Archives, outlined. Digital records bring increases in both volume and complexity.

“We do not have dockets and files any more. We have blizzards of emails stored on different computers – a morass of stuff. The best thing you can say about it is that it’s data,” explained Tim Gollins, of the National Records of Scotland.

Two thirds of government data is held on shared drives, confirmed David Willcox. Email accounts for 50-70% of content, with one government department revealing that it had an impressive 190TB of email data.

This makes appraisal – the assessment of the value and historical relevance of a record – more difficult. Even more challenging is sensitivity review – the process by which records are checked for compliance with data protection laws and for any risks to national security, damage to international or business relations, exposure of personal information and so on. This review determines whether a record is retained by the department, sent to the National Archives as a “closed” record or opened to the public. According to Arthur Lucas, a member of the Lord Chancellor’s Advisory Council, 75% of the documents going into the National Archives are closed, although he pointed out that transferring a document closed is not a way to “bury” a record, and it is preferable to the record being retained by the department: at least it is indexed and its status can change in the future. Sensitivity review is currently done by humans, page by page, and can be a lengthy process. And it presents a real conundrum in the digital age: the nature of sensitivity review is inherently tricky for computers, but sheer volume means that bringing in technology may be the only option.

Technology can certainly assist. eDiscovery tools can be used to apply categorisation or clustering to unstructured information, and software can help highlight themes, events and people – which may also help with reducing duplication (around 40% of government records are duplicates). And since 75% of exemptions from leading government departments relate to personal information, this is a good starting point for technological solutions: it should be relatively easy for software to highlight identifiable fields such as names and addresses, as sketched below.
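
It is easy to see why such fields look like low-hanging fruit. As a purely illustrative sketch – not a description of any tool discussed at the conference, and with invented record texts and patterns – the following Python fragment shows the two simplest ideas above: hash-based detection of exact duplicates and pattern-based flagging of easily identifiable fields such as email addresses and postcodes.

```python
import hashlib
import re

# Very simple patterns for "easily identifiable fields"; real sensitivity
# review would need named-entity recognition, contextual analysis and,
# ultimately, human judgement.
PII_PATTERNS = {
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"),
    "uk_postcode": re.compile(r"\b[A-Z]{1,2}\d[A-Z\d]?\s*\d[A-Z]{2}\b"),
}

def fingerprint(text):
    """Hash of whitespace-normalised text, to catch exact duplicates."""
    normalised = " ".join(text.lower().split())
    return hashlib.sha256(normalised.encode("utf-8")).hexdigest()

def flag_pii(text):
    """Return matches for each simple PII pattern found in the text."""
    return {name: pattern.findall(text)
            for name, pattern in PII_PATTERNS.items()
            if pattern.search(text)}

# A toy "shared drive": note the exact duplicate.
records = [
    "Minute, 3 May 1994. Contact J. Smith, j.smith@dept.gov.uk, NE1 8ST.",
    "Minute, 3 May 1994. Contact J. Smith, j.smith@dept.gov.uk, NE1 8ST.",
    "Policy note: nothing personal recorded here.",
]

seen = set()
for record in records:
    fp = fingerprint(record)
    print(f"duplicate={fp in seen} pii={flag_pii(record)}")
    seen.add(fp)
```

Exact-hash matching only catches verbatim copies, of course – approaching that 40% duplication figure in practice would also need near-duplicate techniques such as shingling or MinHash – and the genuinely hard cases, where sensitivity depends on context, remain with human reviewers.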

Research into this is crucial, and it is underway. Michael Moss, professor of mathematics and information science at Northumbria, and Tim Gollins have been working on an IT as a Utility Network-funded project to look at methods and algorithms that will enable the creation of useful tools. The CIA is working with the University of Texas on tools for CIA records. And there is interesting work ongoing at Columbia University, which is “approaching record keeping from a completely different perspective,” commented Moss. “They have reconceptualised the archive from ‘a whole collection of documents’ to ‘data you can analyse’.”

Why does this matter? “Transparency and openness,” said Willcox. “Good governance,” said Sir Alex Allan, former permanent secretary at the Ministry of Justice and Cabinet Office and author of the Records Review report into the readiness of government to move from the 30-year to the 20-year rule. “Good record management is not just about preserving the historical record. It’s also important for the efficient running of the office.” When civil servants are asked for advice on a particular policy issue and they know that they looked at that same issue a few years ago, they need to be able to find the papers that relate to the discussions and decisions. It is also important for audit and accountability – how do you know whether a private sector company contracted to do public sector work is fulfilling its contract if you can’t find the paperwork? – and for the provision of evidence to public inquiries and legal proceedings.

A cautionary tale of what happens when it goes wrong was provided by Mary Daly, president of the Royal Irish Academy. She related the woeful story of how a key government decision was made during the Irish banking crisis in 2008. Or, rather, how we don’t know how that key decision was made because the records relating to it are “inadequate. In fact, non-existent. There was a complete lack of proper procedures.”

But the public record – and the huge changes it is undergoing in the digital age – is not restricted to government. According to Jeremy Frey, professor of physical chemistry at the University of Southampton, the scientific record, which is – or should be – also a public record when it is publicly funded, is also evolving.

“We are in a liminal period. Publication is a ritual we all go through, but what we’re really about now is moving from a paper publication paradigm to a digital one that will allow much more.”

However, there is currently a gap between the opportunities offered by digital and, in many cases, how they are being exploited – or not.

“Scientific papers have become a repository for the argument but most of the data is missing. In the past absolutely everything was in the paper, because it could be. Scale was not a problem. That has changed,” argued Frey. If researchers do not see and value the data then they cannot be sure that they can trust the value chain… and lack of trust in the data can destroy the scientific endeavour. As well as more intelligently accessible data (a scanned copy of data in the form of an image cannot be usefully searched), there needs to be more detail on methods and, for research to be disseminated effectively, it needs a narrative: “the story that you weave around the data is as important as the data itself.”

For the humanities and social sciences, a challenge is rearing from a different direction. David Erdos, lecturer in law and open society at the University of Cambridge, gave a rich rundown of the current state of the EU data protection landscape, how it is set to change and why it should concern humanities and social science scholars.

Currently, derogations within the directive allow some wriggle room for particular special purposes, notably journalism, literature and the arts. The EU is now proposing that these are contained in a “middle area” covering knowledge facilitation more generally. One area of concern for researchers is that there would be no derogation from the proactive duty to provide privacy notices if the purpose of the data use changes. While biomedical research organisations have been busy lobbying about this, Erdos said that such activism needs to extend to the social sciences and humanities research community. “I have been trying to make them aware of it and their obligations around data protection,” he said. “The whole landscape is very confused around this and research ethics and policies. Seemingly, there is very little understanding of the implications of legislation. The community needs to fight for this – that’s what the press and journalists do.” It is likely that research will end up being an area in which a huge amount of discretion is passed to member states, so working at the national level will be as important as working at the European level.

Agnes Jonker, senior lecturer in archives at the Archiefschool (the Netherlands Institute for Archival Education and Research), University of Amsterdam, gave an insight into how the Dutch treat access to the public record, and how this compares with the UK. Most notably, the Netherlands’ first FOI law was drafted in 1980 (the UK’s FOI Act only came fully into force in 2005), so there is, generally, a more relaxed air around the concept. That is not to say it is without its critics: a new FOI law was proposed in 2013 (though it will not be in force for a few years) to update legislation in the light of changes to the state – with the shrinking of government through increased privatisation, third parties are escaping FOI scrutiny. However, there is no reference in the proposed new law to a duty to document.

Coming full circle back to public records and the humanities, Andrew Hoskins is a military historian who is concerned that current developments will render the record of warfare uncertain. He is particularly worried that the reduction from the 30-year rule to 20 years will result in more records being closed. “The buffer protects those potentially subject to embarrassment or danger. But buffering time is under pressure. From 2013 to 2023, two years of records will be processed every year without a doubling of resources. It’s punching holes in the records in demand by historians. It’s not a recipe for careful selection and preservation,” he said.

Hoskins, like Frey, thinks that the “story” is crucial – and risks being lost with the move to digital files. “It shifts away the context that comes from the handling of the physical file. Without the material context, the front and behind of each file, the information might be found but the story might be lost,” he said. “A history of warfare that depends on the official records of the British army has an uncertain future. Over the 2000s we’ve seen a perfect storm of technological, economic and political change at all points from collection to collation and archiving and assessment for declassification through to their being made public by the archives – and some of these pressures result from the culture of openness that has attached itself to current technological changes without adequate resources and understanding of the issues. It’s not a recipe for improved public access. Faster history is not necessarily better history,” he warned.

What’s the answer? The conference concluded by considering potential ways of moving forward and the actions, partnerships and collaborations required for that to happen. In the words of David Thomas of Northumbria University, the ideal is to “take the archival idea and reinvent it in a new context of record making and record keeping in a new social world.” Watch this space.

 
