Tag Archives: conferences

IASA Conference 2011: Turning archives into assets

Semantic enrichment

Guy Maréchal continued the Linked Data theme by talking in more detail about how flat data models can be semantically enriched. He pointed out that if you have good structured catalogue records, it takes very little effort to give concepts URIs and to export this data as sets of relationships. This turns your database into a graph, ready for semantic search and querying.
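To make that concrete, here is a rough sketch of the kind of export he was describing – a single flat catalogue row becoming a handful of RDF triples, so that concepts are URIs rather than strings. The namespace, URIs, and field names are invented for illustration, and the snippet assumes the Python rdflib package.

```python
# Rough sketch (invented URIs and fields) of exporting a flat catalogue
# record as RDF triples, ready for semantic search and querying.
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import DC, RDF

EX = Namespace("http://example.org/archive/")   # hypothetical namespace

record = {"id": "rec001", "title": "Interview, 1973", "subject": "broadcasting"}

g = Graph()
item = EX[record["id"]]                          # the record gets its own URI
g.add((item, RDF.type, EX.AudioItem))
g.add((item, DC.title, Literal(record["title"])))
g.add((item, DC.subject, EX[record["subject"]]))  # concept as a URI, not a string

print(g.serialize(format="turtle"))              # the database row, now a graph
```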

He argued that “going to semantics cannot be avoided” and that “born digital” works will increasingly be created with semantically modelled metadata.

From Mass Digitisation to Mass Content Enrichment

The next talk described the Sonuma digitisation and metadata enhancement project. Sonuma and Memnon Archiving Services have been working on inventories and dictionaries to help them index audiovisual assets. They have been converting speech to text, holding the text as XML files, and then associating sections of the XML with the appropriate point in the AV content, so that it can be searched.
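The talk did not specify a file format, but the basic idea – transcript sections that carry a start time, so a text hit can be mapped back to a point in the recording – can be sketched like this (element names and timings are invented for illustration):

```python
# Sketch (invented element names and timings) of mapping a text search hit
# back to a point in the recording via a time-aligned XML transcript.
import xml.etree.ElementTree as ET

transcript_xml = """
<transcript media="tape_042.wav">
  <segment start="0.0">Good evening and welcome to the programme.</segment>
  <segment start="12.5">Tonight we look back at the station's early years.</segment>
</transcript>
"""

root = ET.fromstring(transcript_xml)

def find_in_audio(term):
    """Return start times (in seconds) of segments whose text contains the term."""
    return [float(seg.get("start"))
            for seg in root.iter("segment")
            if term.lower() in seg.text.lower()]

print(find_in_audio("early years"))   # -> [12.5]
```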

They identify breaks in programmes by reading the time stamps with OCR and then looking for jumps in the numerical sequence, assuming that a jump in the numbers marks a break between programmes. This enables them to split long tapes into sections, which usually correspond to programmes.
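A minimal sketch of that jump-detection step, with an invented threshold and sample values:

```python
# Sketch of detecting programme breaks from OCR'd timecodes: a jump or reset
# in the otherwise steadily increasing sequence is taken as a probable boundary.
def find_breaks(timecodes_seconds, max_step=2.0):
    """Return indices where the timecode jumps (or resets) unexpectedly."""
    breaks = []
    for i in range(1, len(timecodes_seconds)):
        step = timecodes_seconds[i] - timecodes_seconds[i - 1]
        if step < 0 or step > max_step:      # reset or large jump
            breaks.append(i)
    return breaks

# e.g. a tape whose burnt-in clock resets at the start of each programme
print(find_breaks([10, 11, 12, 0, 1, 2, 3, 55, 56]))   # -> [3, 7]
```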

Social networking and Knowledge Management

Tom Adami described Knowledge Management projects at the United Nations Mission in Sudan (Best Practice Lessons Learnt: How the Exit Interview and Oral History Project at UNMIS is building a knowledge database). The UN in Africa faces problems of high staff turnover, remote locations, and difficulties in maintaining infrastructure. However, they have been using social networking to encourage people to share their knowledge and experience in a user-friendly way and so add to the official knowledge base.

Archive as a social media lab: Creative dissemination of digital sound and audiovisual collections

Budhaditya Chattopadhyay talked about a project to bring together archival practice, artistic practice, and social media. He also referred to the problems of preserving social media which is in essence ephemeral but may be an integral part of an artwork.

IASA Conference 2011: Keynote speech on Linked Open Data

Kevin Bradley, IASA president, gave the welcome address to the 42nd IASA annual conference. He characterised the digital revolution as one that will continue reverberating for years. He reminded us that it is not always easy to sort the sense from the nonsense and that we are often surprised by what turns out to be valid and how easy it is not to see the wood for the trees – or perhaps to “lose the word to the bits”, or the “picture to the pixels”.

Keynote address – on Linked Open Data

The keynote speech was given by Ute Schwens, deputy director of the Deutsche Nationalbibliothek / German National Library (DNB). She opened with a lovely visualisation by the Opte Project of various routes through a portion of the Internet. It looked a bit like visualisations of neurons in a brain or stars and galaxies.

Ute’s talk was in support of publishing Linked Open Data. She outlined some of the concerns – lack of money, open access versus intellectual property rights, poor quality of data and assets themselves, and inadequate legal frameworks. She said that we shouldn’t be trying to select for digitisation, because everything in an archive has already been selected or it wouldn’t have been kept in the first place. She also highlighted the benefits of making digital versions of unique or fragile artefacts, in order to allow access without risk to the original. She talked about how there are many ways to digitise and that these produce different versions, so an archival master that is as close to the original as possible should always be preserved.

She used original piano rolls as an illustration. These can only be played on very specialised electrical pianos and users cannot practically be given direct access to them, but they were played by specialists and the music recorded, so users can be given access to that. The recordings are not the same as the piano rolls, but are a new and interesting product. It seems obvious that you would not destroy the original piano rolls simply because the music from them has been recorded and now exists in a digital version, so why should you destroy other forms of media, such as film, simply because you have a digital version? The digital version in such cases is for access, not preservation.

One fear is that free access to information will diminish usage of an archive or library, but by opening up you can gain new users, especially by providing free access to catalogues and metadata (I like to think of these as “advertising” – shops make their catalogues freely available because they see them primarily as marketing tools).

Another fear is loss of control, but new scientific ideas often arise when diverse strands of thought are brought together and unexpected uses are made of existing data. The unusual and the unforeseen are often the source of the greatest innovation.

She pointed out that we have developed searching and indexing rules over centuries to try to make objects as findable as possible, so Linked Open Data is merely the next logical step. We can combine automatically generated information with data we already have to provide multiple access points. We need description to put objects into context, but we don’t have to describe what they look like in the ways that we used to for catalogues. Good metadata is metadata that is useful for users, not metadata merely for maintaining catalogues.

She ended by calling for more open access to data as a way to promote our collections and their value, adding that in uncertain times our only security is our ability to change.

In the discussion afterwards, she said that Google needs our data and the best way to engage with – and even influence – Google is by gaining recognition as a valued supplier and making sure Google understands how much it needs us to provide it with good quality data.

The conference was hosted by Deutsche Nationalbibliothek / German National Library (DNB), Hessischer Rundfunk / Hessian Broadcasting (hr), and the Deutsches Rundfunkarchiv / German Public Broadcasting Archives (DRA). The sponsors were EMC2 (gold); Memnon Archiving Services, NOA audio solutions, and Arvato digital services (Bertelsmann) (silver); and Cedar audio, Cube-tec International, Front Porch Digital, and Syylex Digital Storage (bronze).

Digital Asset Management – DAM EU Conference – Third Session

Sustaining your DAM

Sara Winmill from the V&A talked about the huge shifts in mindset that were needed to accompany their DAM work. They needed to stop thinking about storing pictures of things and start thinking about managing those digital images as the things. Their storage needs were vastly underestimated at first. Contrary to the myth, storage is not cheap – the V&A need some £330K for storage annually. They have been investigating innovative approaches to “backup bartering” – finding a similar organisation and storing a copy of each other’s data, so that the backups exist offsite but without the expense of using commercial storage companies.

Despite having a semantically enabled website, they have not been able to link their Library Catalogue’s MARC records with the images, and have three sets of identifiers that are not mapped.

One of their major DAM problems is trying to stop people storing multiple copies and refusing to delete anything. The core collections images need to be kept, but publicity and marketing material is now being stored in the system without any selection and disposal policies in place. The original system was designed without a delete button at all.

Can we fix it? Yes we Can! Successfully Implementing a Multi-faceted DAM system at HiT entertainment

It was a pleasure to hear of Tabitha Yorke’s successful DAM implementation at HiT as they built their first digital library. This was a relatively constrained collection, and two full-time members of staff were able to catalogue it in a year. This provided the metadata they needed for a straightforward taxonomy-based search system that is simple and easy to use. It meant that users could do their own research, saving the team much time and increasing productivity hugely. They are now working to integrate the library with rights systems. They worked hard at getting users to test the metadata and made sure that they were cataloguing with terms the users wanted to search with, rather than those that occurred first to the cataloguers. They now have two digital librarians managing 150,000 assets.

Tabitha stayed on the stage and was joined in a panel session by David Bercovic, Digital Project Manager at Hachette UK, and Fearghal Kelly of Kit digital. The afternoon ended with David Lipsey’s concluding remarks.

Digital Asset Management – DAM EU Conference – Second Session

Serco Artemis Digital – Realising the Value of Archives and Rehabilitating Prisoners

Bruce Hellman from Serco described the work they have been doing to employ prisoners as cataloguers and transcribers. The work varied from project to project but included typing up handwritten archival documents that were not suitable for OCR capture and adding metadata, and it was very popular with the prisoners.

Bruce argued that it gave them a chance to develop skills that would be useful in the workplace on their release, and allowed organisations to get work done more cheaply than by paying standard market rates.

How Metadata and Semantic Technologies will Revolutionise your Workflow

John O’Donovan of the Press Association gave an entertaining presentation about using semantic technologies to index or re-index content from a range of systems, including legacy systems and external feeds, and publish it to the web. He pointed out – with a series of amusing ambiguities and unintentional innuendos – that simple text search lacks context, and that newspaper headlines often contain jokes, ambiguous terms, and terms that quickly become obsolete. So metadata is vital in assembling assets that are about the same topic.

He stressed the importance of keeping your metadata management separate from your content management, so that metadata can be changed without having to re-index assets. (An exception is rights and other non-subjective metadata that needs to be embedded in the asset for further tracking. This is not a major concern to the Press Association, as they do not track assets once they are published onto the web. I wasn’t sure what would happen if you decided to repurpose your content and so needed a new set of metadata – how you would link the content to the new metadata, and how you would manage the metadata and content within their separate stores.)
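The mechanics of that separation were not spelled out, but the pattern he described amounts to joining the two stores by nothing more than an asset identifier, so that a metadata edit never touches the asset itself. A toy sketch (invented structures, not the PA’s implementation):

```python
# Sketch of keeping metadata separate from content: the only link between the
# two stores is the asset ID, so metadata can change without re-indexing assets.
content_store = {
    "asset-123": b"...binary of the published image...",
}

metadata_store = {
    "asset-123": {"subject": ["election"], "rights": "staff photo"},
}

def retag(asset_id, new_subjects):
    """Update subject metadata without touching the stored asset at all."""
    metadata_store[asset_id]["subject"] = new_subjects

retag("asset-123", ["election", "politics"])
print(metadata_store["asset-123"])          # updated metadata
print("asset-123" in content_store)          # content untouched: True
```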

The PA are using MarkLogic as the content repository and a BigOWLIM triplestore to handle the associated metadata. Content is fed into the content store, then out again to a suite of indexing technologies, including concept extraction and other text-processing systems, as well as facial recognition software, to create semantic metadata. Simple ontologies are used to model the content, mainly indexing people, places, and events – themes chosen because they cover the most popular search terms entered by users of the website.
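As an illustration of what a “simple ontology” over people, places, and events might look like as triples – the classes, properties, and URIs below are invented, not the PA’s actual model – here is a small rdflib/SPARQL sketch that gathers all the assets about one person:

```python
# Sketch (invented classes and URIs) of modelling people and assets as triples
# and pulling together all content about one person with a SPARQL query.
from rdflib import Graph, Namespace
from rdflib.namespace import RDF

EX = Namespace("http://example.org/news/")

g = Graph()
g.add((EX.story42, RDF.type, EX.Story))
g.add((EX.story42, EX.mentionsPerson, EX.Angela_Merkel))
g.add((EX.photo7, RDF.type, EX.Photo))
g.add((EX.photo7, EX.depictsPerson, EX.Angela_Merkel))

results = g.query("""
    PREFIX ex: <http://example.org/news/>
    SELECT ?asset WHERE {
        { ?asset ex:mentionsPerson ex:Angela_Merkel }
        UNION
        { ?asset ex:depictsPerson ex:Angela_Merkel }
    }
""")
for row in results:
    print(row.asset)   # -> the story and the photo
```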

John argued that such gathering and indexing of assets in order to automatically create and publish collections of associated content was simpler and easier than ingesting diverse content and metadata into traditional search, content management, and online publishing systems.

DAM for Content Marketing, Curation, and Knowledge Organisation

Mark Davey of the DAM Foundation took us on an animated and musical tour of different perspectives on metadata, engagement, social media, and how different the “digital natives” – young people who have grown up with digital technologies – will be from previous generations. Kids of the future will be able to have an idea in the morning, go to an online app to create their site, their brand, and their marketing strategy in the afternoon, and be engaging with their potential clients by the evening.

Mark pointed out that people have moved on from the initial narcissism of social media and self-publishing and now want compelling stories they can engage with. He pointed out that as semantic technologies advance, we are caught in a feedback loop with them – we are the ontology that is driving the machines – and so we should be aware and vigilant. As the technologies become more powerful and all pervasive, we may lose sight of how they are working to serve us, rather than how we are serving up information about ourselves to them.

Marketing will have to become more sophisticated. Amongst the many statistics he quoted, I noted that 84% of 25-34 year olds have left a favourite website because of ads. At the same time, our networks become more interconnected. In a “six degrees of separation” game, we discovered that three people in the audience had met the Dalai Lama, and we are linking to more and more people through social media sites every day.

The metaphor of information as water is a familiar one, especially in the knowledge management area, but Mark’s colleague Dave pointed out how appropriate it is when talking about a DAM/dam. The DAM system forms the reservoir of content.

(I couldn’t help comparing and contrasting the ever-changing semantic seas of information at the Press Association with the more manageable streams of content that flow within smaller organisations, and how very different approaches are needed for such different contexts. The other day I saw the metaphor used again, in an interview with – apparently – one of the LulzSec hackers who talked about their pirate boat and “copywrong” as an enemy of the seas.)

Black Holes and Revelations: DAM and a museum collection

As if to continue the water metaphor, the next speaker was Douglas McCarthy from the National Maritime Museum. However, he took the metaphor up a stage, to spaceships and black holes, with the Museum’s content assets hidden in a black hole of 100,000 uncatalogued image files.

Having catalogued the images and improved the DAM system, the Museum’s Picture Library is now showing a healthy profit. Many sales come from the “long tail” of images that no-one anticipated anyone would want. Rather than saturating the market, putting the images online has been stimulating demand, with customers calling for more collections to be made available.

Digital Asset Management – DAM EU Conference

I enjoyed DAM EU – a Henry Stewart Event – last Friday and was particularly struck by the range of projects, the variety of approaches, and the diversity of industries represented. Sponsors included Capture, Kit Digital, and iBrams.

I will publish a series of posts over the next few days to cover the event, session by session. It has also been written up by Michael Wells.

The Art and Practice of Managing Digital Media

The conference was chaired and opened by David Lipsey, who has a distinguished career in Digital Asset Management (DAM). He introduced the day by talking about the need for “fungibility and liquidity” to make digital assets earn their keep. He pointed out that DAM is an emerging and maturing field. None of us thought of a career in DAM when we were at school; it is something that we have arrived at during our working lives. Now educational institutions are considering providing DAM courses to meet a growing need for DAM skills. DAM is also starting to become an enterprise-wide infrastructure layer, so “soft” skills, such as gaining cross-departmental buy-in to projects, are becoming increasingly important.

Getting More out of DAM: The Latest Technologies and What is Possible Now

It is always a pleasure to listen to the eminently sensible Theresa Regli of the Real Story Group. She observed that distribution channels are changing and there is now more focus not on gathering assets together, but on how you send out assets in the right format to customers – to all phone platforms, all PC and Mac platforms, etc. Mobile apps are very much in the limelight at the moment, but it is important to see them as just another channel. It is also worth remembering that you don’t need to build specialised apps for all devices, as cross-browser formats may well be adequate. The purpose of the Digital Asset Manager should be to get the right media to the right people in the right format.

To be successful, DAM projects need to handle change management and get people to work in different ways. Creative people want to work collaboratively and simultaneously, so DAM systems need to support this, rather than imposing rigid linear workflow processes.

When assessing DAM technologies, the differences are usually not in the functionality they offer, but in the way they provide that functionality. Too much emphasis is often put on technology to solve problems. Theresa said that she thought technology could solve about 20 per cent of business problems, and even that was probably a bit generous. The core problems are caused by metadata, collaboration, diligence, and governance.

Different people need different views of a DAM system at different times, so customisation and personalisation are useful – for example, providing limited functionality in an iPad app that enables workers who are away from their desks to access specific aspects of a system. However, a danger to avoid is creating multiple versions of the same asset; it is much better to have a master version and transcode out to each device as necessary.
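A minimal sketch of that single-master, transcode-on-demand pattern – the rendition sizes and the ffmpeg invocation are my own illustrative assumptions, not anything from the talk:

```python
# Sketch of deriving device renditions from a single archival master on demand
# rather than storing multiple versions; assumes ffmpeg is available on PATH.
import subprocess

RENDITIONS = {
    "ipad":  ["-vf", "scale=1024:-2"],   # illustrative sizes only
    "thumb": ["-vf", "scale=320:-2"],
}

def render(master_path, rendition, out_path):
    """Transcode the archival master into the requested delivery rendition."""
    cmd = ["ffmpeg", "-y", "-i", master_path] + RENDITIONS[rendition] + [out_path]
    subprocess.run(cmd, check=True)

# e.g. render("master.mov", "ipad", "delivery_ipad.mp4")
```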

New developments in metadata include colour palette searching, which is popular with creatives who are more interested in moods than in particular things. Another development is Avid’s “layered metadata”, which offers the ability to annotate directly onto video as well as having general metadata for the whole asset. Much work has been done to improve video handling tools, so automated scene detection and facial recognition software are now becoming available in products.
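Colour palette search was not demonstrated in detail, but the underlying idea is simple: reduce each image to a few dominant colours at ingest and match queries against those. A rough sketch using Pillow (an assumption about how it might be done, not how any particular product does it):

```python
# Rough sketch of colour-palette indexing: reduce an image to its dominant
# colours so assets can later be searched by mood/colour rather than subject.
from PIL import Image

def dominant_colours(path, n=5):
    """Return the n most common colours (as RGB tuples) in the image."""
    img = Image.open(path).convert("RGB").resize((64, 64))  # downsample for speed
    counts = img.getcolors(maxcolors=64 * 64)               # [(count, (r, g, b)), ...]
    counts.sort(reverse=True)
    return [colour for _, colour in counts[:n]]

# e.g. print(dominant_colours("asset.jpg"))
```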

When buying a DAM system, it is worth remembering that the now dominant technologies are relatively new, and that what may be ideal at a workgroup level may not scale up to enterprise level.

How to be a Good DAM Customer

Sarah Saunders of Electric Lane spoke on an often overlooked, but vital, aspect of the DAM procurement process – how to be a good customer. She pointed out that you get to choose your DAM vendor, but they don’t get to choose you. Projects can be derailed by customers being unsure what they want, being poor project managers, or having unrealistic expectations of what they can get for their money. It is in no-one’s interests to drive the software vendor out of business.

For a project to be a success, it is important to have a project leader who is not part of any of the stakeholder departments and is empowered to make decisions and overrule any individual department if their focus on their own departmental needs threatens the entire project. Creative people and IT people tend to think very differently, to the extent that they can completely misunderstand each other. So it is not enough just to get people to sign off a complex requirements document; they need to understand it on their own terms, otherwise they are likely to be disappointed. Building a system that the creatives are happy using, even though their requirements may seem bizarre to the IT department, is just as important as keeping documentation in good order, something the creatives might not appreciate.

One of the hardest aspects is working out what you really want from the system, and where you are prepared to compromise in order to get something that can actually be built for the budget. In terms of functionality, it is important to think beyond what people ask for – which is likely to be just what they are using already – to what they would like if they knew it existed.

It is also important to see any system working in practice, as demos may not be enough to know that it will work with your data and business processes. A small delay in keywording speed can have a huge impact on productivity.

An iterative build process is good if needs are complex. It can be expensive, but may still be cheaper than ending up with an underperforming system.

Some customers expect the vendor to do all the thinking for them, but the vendors do not know the business needs and processes of every single business. It is not an easy job to be a software vendor, and a difficult, disorganised customer will always end up having a hard time.

Panel session: how we did our procurement

Laura Caulkin from Net-a-Porter and Lisa Hayward from Shell described their recent DAM procurement projects.

Net-a-Porter took a very user-centric approach, running lots of requirements gathering workshops and writing user stories rather than a requirements spreadsheet. They spent 6-8 months planning what they wanted the new system to do, with the help of external expert consultants.

At Shell the build of their new DAM system took about 18 months and involved the migration of 50,000 assets. They selected Vyre as a good cultural fit. Lisa stressed the importance of getting vendors to demo their products with your data and felt that detailed requirements documents and scoring systems were worthwhile. She argued for the “soft” value of a good cultural fit between the team and the vendor to be included in the scores, which the IT team did not agree with. She pointed out that the IT team have a very different perspective: they want the product they will find easiest to integrate for their short-term project, but you are the one left to work with the system on a daily basis.

In terms of lessons learned, she felt that their initial requirements were not specific enough, and that this led to change requests and delays. This caused problems with senior managers, who had not expected the project to take so long. However, once it was built, it was well received. Lisa pointed out that change management is really just talking to people and explaining what is going on. They had great success with the adoption of the new system as it was such a vast improvement on the system that had been built for them 10 years previously.

NKOS slides

Many thanks to Traugott Koch for these links:

NKOS Workshop at ECDL in Aarhus.

NKOS Special Session at DC 2008 in Berlin, all in a single PDF file.

The Joint NKOS/CENDI Workshop “New Dimensions in Knowledge Organization Systems”, in Washington, DC, USA on September 11, 2008. “Thanks to the contributors, programme committees, chairs and the large and very active audiences. We invite your active participation in 2009 as well. Watch the website.”