Digital Asset Management – DAM EU Conference – Third Session

Sustaining your DAM

Sara Winmill from the V&A talked about the huge shifts in mindset needed to accompany their DAM work. They needed to stop thinking about storing pictures of things and start thinking about managing those digital images as the things. Their storage needs were vastly underestimated at first. Contrary to the myth, storage is not so cheap – the V&A need some £330K annually for storage. They have been investigating innovative approaches to “backup bartering” – finding a similar organisation and storing a copy of each other’s data, so that backups exist offsite without the expense of using commercial storage companies.

Despite having a semantically enabled website, they have not been able to link their Library Catalogue’s MARC records with the images, and have three sets of identifiers that are not mapped.

One of their major DAM problems is trying to stop people storing multiple copies and refusing to delete anything. The core collections images need to be kept, but publicity and marketing material is now being stored in the system without any selection and disposal policies in place. The original system was designed without a delete button at all.

Can We Fix It? Yes We Can! Successfully Implementing a Multi-faceted DAM System at HiT Entertainment

It was a pleasure to hear of Tabitha Yorke’s successful DAM implementation at HiT as they built their first digital library. This was a relatively constrained collection and two full-time members of staff were able to catalogue it in a year. This provided the metadata they needed for a straightforward taxonomy-based search system that is simple and easy to use. This meant that self-research was supported, saving the team much time and increasing productivity hugely. They are now working to integrate the library with rights systems. They worked hard at getting users to test the metadata and made sure that they were cataloguing with terms the users wanted to search with, rather than those that occurred first to the cataloguers. They now have two digital librarians managing 150,000 assets.

Tabitha stayed on the stage and was joined in a panel session by David Bercovic, Digital Project Manager at Hachette UK, and Fearghal Kelly of Kit digital. The afternoon ended with David Lipsey’s concluding remarks.

Digital Asset Management – DAM EU Conference – Second Session

Serco Artemis Digital – Realising the Value of Archives and Rehabilitating Prisoners

Bruce Hellman from Serco described the work they have been doing to employ prisoners as cataloguers and transcribers. The work varied from project to project, but included typing up handwritten archival documents that were not suitable for OCR capture techniques and adding metadata, and it was very popular with prisoners.

Bruce argued that it gave them a chance to develop skills that would be useful in the workplace on their release, and allowed organisations to get work done more cheaply than by paying standard market rates.

How Metadata and Semantic Technologies will Revolutionise your Workflow

John O’Donovan of the Press Association gave an entertaining presentation about using semantic technologies to index or re-index content from a range of systems, including legacy systems and external feeds, and publish it to the web. He pointed out – with a series of amusing ambiguities and unintentional innuendos – that simple text search lacks context, and that newspaper headlines often contain jokes, ambiguous terms, and terms that quickly become obsolete. So, metadata is vital in assembling assets that are about the same topic.

He stressed the importance of keeping your metadata management separate from your content management, so that metadata can be changed without having to re-index assets. (An exception is rights and other non-subjective metadata that needs to be embedded in the asset for further tracking. This is not a major concern to the Press Association as they do not track assets once they are published onto the web. I wasn’t sure what would happen if you decided you wanted to repurpose your content and so needed a new set of metadata: how do you link content and metadata, and how do you manage the metadata and content within their separate stores?)
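As a rough sketch of what such a separation might look like – the store classes and field names here are hypothetical illustrations, not the Press Association’s actual architecture – content and metadata can live in separate stores linked only by a shared asset ID, so that metadata can be rewritten without touching the stored asset:

```python
# Minimal sketch: content and metadata in separate stores,
# linked only by asset ID, so metadata can be updated
# without re-ingesting the asset itself.

class ContentStore:
    """Holds the asset bytes, keyed by asset ID."""
    def __init__(self):
        self._blobs = {}

    def put(self, asset_id, blob):
        self._blobs[asset_id] = blob

    def get(self, asset_id):
        return self._blobs[asset_id]


class MetadataStore:
    """Holds descriptive metadata records, keyed by the same asset IDs."""
    def __init__(self):
        self._records = {}

    def tag(self, asset_id, **fields):
        self._records.setdefault(asset_id, {}).update(fields)

    def find(self, **query):
        return [aid for aid, rec in self._records.items()
                if all(rec.get(k) == v for k, v in query.items())]


content = ContentStore()
metadata = MetadataStore()

content.put("img-001", b"...jpeg bytes...")
metadata.tag("img-001", subject="regatta", place="Henley")

# Repurposing: change the metadata without touching the stored asset.
metadata.tag("img-001", subject="rowing")
print(metadata.find(subject="rowing"))  # ['img-001']
```

The design choice the separation buys you is exactly the one John described: a re-tagging exercise touches only the metadata store, and the (potentially huge) assets never need to be re-ingested.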

The PA are using Mark Logic as the content repository and a BigOWLIM triplestore to handle the associated metadata. Content is fed into the content store, then out again to a suite of indexing technologies, including concept extraction and other text-processing systems, as well as facial recognition software, to create semantic metadata. Simple ontologies are used to model the content, mainly indexing people, places, and events – themes chosen as covering the most popular search terms entered by users of the website.
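To make the pipeline idea concrete, here is a toy version – a stand-in concept extractor emitting simple (asset, predicate, value) triples into a shared set, not the PA’s actual MarkLogic or BigOWLIM setup, with invented vocabulary and example text:

```python
# Illustrative sketch only: content flows through "indexers" that each
# emit (asset, predicate, value) triples into a shared metadata store.

KNOWN_PEOPLE = {"Nelson"}
KNOWN_PLACES = {"Greenwich", "London"}

def extract_concepts(asset_id, text):
    """A toy concept extractor standing in for real text-processing tools."""
    triples = []
    for word in text.replace(",", "").split():
        if word in KNOWN_PEOPLE:
            triples.append((asset_id, "mentionsPerson", word))
        if word in KNOWN_PLACES:
            triples.append((asset_id, "mentionsPlace", word))
    return triples

triplestore = set()
triplestore.update(extract_concepts("story-42", "Nelson exhibition opens in Greenwich"))

# Assemble everything about a topic by querying the triples, not the content.
related = {s for (s, p, o) in triplestore if o == "Greenwich"}
print(related)  # {'story-42'}
```

The point of the shape, as I understood it, is that collections of associated content fall out of a query over the triples rather than out of any re-organisation of the content store.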

John argued that such gathering and indexing of assets in order to automatically create and publish collections of associated content was simpler and easier than ingesting diverse content and metadata into traditional search, content management, and online publishing systems.

DAM for Content Marketing, Curation, and Knowledge Organisation

Mark Davey of the DAM Foundation took us on an animated and musical tour of different perspectives on metadata, engagement, social media, and how different the “digital natives” – young people who have grown up with digital technologies – will be from previous generations. Kids of the future will be able to have an idea in the morning, go to an online website app and create their site, their brand, and their marketing strategy in the afternoon, and be engaging with their potential clients by the evening.

Mark pointed out that people have moved on from the initial narcissism of social media and self-publishing and now want compelling stories they can engage with. He pointed out that as semantic technologies advance, we are caught in a feedback loop with them – we are the ontology that is driving the machines – and so we should be aware and vigilant. As the technologies become more powerful and all pervasive, we may lose sight of how they are working to serve us, rather than how we are serving up information about ourselves to them.

Marketing will have to become more sophisticated. Amongst the many statistics he quoted, I noted that 84% of 25-34 year olds have left a favourite website because of ads. At the same time, our networks become more interconnected. In a “six degrees of separation” game, we discovered that three people in the audience had met the Dalai Lama, and we are linking to more and more people through social media sites every day.

The metaphor of information as water is a familiar one, especially in the knowledge management area, but Mark’s colleague Dave pointed out how appropriate it is when talking about a DAM/dam. The DAM system forms the reservoir of content.

(I couldn’t help comparing and contrasting the ever-changing semantic seas of information at the Press Association with the more manageable streams of content that flow within smaller organisations, and how very different approaches are needed for such different contexts. The other day I saw the metaphor used again, in an interview with – apparently – one of the LulzSec hackers who talked about their pirate boat and “copywrong” as an enemy of the seas.)

Black Holes and Revelations: DAM and a museum collection

As if to continue the water metaphor, the next speaker was Douglas McCarthy from the National Maritime Museum. However, he took the metaphor up a stage, to spaceships and black holes, likening the Museum’s 100,000 uncatalogued image files to content assets hidden in a black hole.

Having catalogued and improved their DAM system, the Museum’s Picture Library is now showing a healthy profit. Many sales come from the “long tail” of images that no-one anticipated anyone would want. Rather than saturating the market, putting the images online has been stimulating demand, with customers calling for more collections to be made available.

Digital Asset Management – DAM EU Conference

I enjoyed DAM EU – a Henry Stewart Event – last Friday and was particularly struck by the range of projects, the variety of approaches, and the diversity of industries represented. Sponsors included Capture, Kit Digital, and iBrams.

I will publish a series of posts over the next few days to cover the event, session by session. It has also been written up by Michael Wells.

The Art and Practice of Managing Digital Media

The conference was chaired and opened by David Lipsey, who has a distinguished career in Digital Asset Management (DAM). He introduced the day by talking about the need for “fungibility and liquidity” to make digital assets earn their keep. He pointed out that DAM is an emerging and maturing field. None of us thought of a career in DAM when we were at school, it is something that we have arrived at during our working lives. Now educational institutions are considering providing DAM courses to meet a growing need for DAM skills. DAM is also starting to become an enterprise-wide infrastructure layer, so “soft” skills, such as gaining cross-departmental buy-in to projects, are becoming increasingly important.

Getting More out of DAM: The Latest Technologies and What is Possible Now

It is always a pleasure to listen to the eminently sensible Theresa Regli of the Real Story Group. She observed that distribution channels are changing and there is now more focus not on gathering assets together, but on how you send out assets in the right format to customers – to all phone platforms, all PC and Mac platforms, etc. Mobile apps are very much in the limelight at the moment, but it is important to see them as just another channel. It is also worth remembering that you don’t need to build specialised apps for all devices, as cross-browser formats may well be adequate. The purpose of the Digital Asset Manager should be to get the right media to the right people in the right format.

To be successful, DAM projects need to handle change management and get people to work in different ways. Creative people want to work collaboratively and simultaneously, so DAM systems need to support this, rather than imposing rigid linear workflow processes.

When assessing DAM technologies, the differences are usually not in the functionality they offer, but in the way they provide that functionality. Too much emphasis is often put on technology to solve problems. Theresa said that she thought technology could solve about 20 per cent of business problems, and even that was probably a bit generous. The core problems are caused by metadata, collaboration, diligence, and governance.

Different people need different views of a DAM system at different times, so customisation and personalisation is useful – for example providing limited functionality on an iPad app that enables workers who are away from their desks to access specific aspects of a system. However, a danger to avoid is creating multiple versions of the same asset. It is much better to have a master version and transcode out to each device as necessary.

New developments in metadata include colour palette searching which is popular with creatives more interested in moods than particular things. Another development is AVID’s “layered metadata”, which offers the ability to annotate directly on to video, as well as have general metadata for the whole asset. Much work has been done to improve video handling tools, so automated scene detection and facial recognition software are now becoming available in products.
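As an illustration of the principle behind colour palette searching – real products use richer palettes and perceptual colour spaces; the assets and colour values below are invented – a query colour can simply be ranked against a dominant colour stored per asset:

```python
# A sketch of colour-palette search: each asset is summarised by one
# dominant RGB colour, and a query colour returns the closest matches.

def distance(c1, c2):
    # Squared Euclidean distance in RGB space -- crude but illustrative.
    return sum((a - b) ** 2 for a, b in zip(c1, c2))

assets = {
    "sunset.jpg": (220, 110, 40),   # warm orange
    "ocean.jpg":  (30, 90, 180),    # deep blue
    "meadow.jpg": (70, 160, 60),    # green
}

def search_by_colour(query_rgb, top=2):
    """Return the `top` assets whose dominant colour is nearest the query."""
    return sorted(assets, key=lambda a: distance(assets[a], query_rgb))[:top]

print(search_by_colour((255, 120, 30)))  # sunset.jpg ranks first
```

This is the mood-first search creatives want: you ask for “something warm and orange” rather than naming a subject.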

When buying a DAM system, it is worth remembering that the now dominant technologies are relatively new, and that what may be ideal at a workgroup level may not scale up to enterprise level.

How to be a Good DAM Customer

Sarah Saunders of Electric Lane spoke on an often overlooked, but vital, aspect of the DAM procurement process – how to be a good customer. She pointed out that you get to choose your DAM vendor, but they don’t get to choose you. Projects can be derailed by customers being unsure what they want, being poor project managers, or having unrealistic expectations of what they can get for their money. It is in no-one’s interests to drive the software vendor out of business.

For a project to be a success, it is important to have a project leader who is not part of any of the stakeholder departments and is empowered to make decisions and overrule any individual department if their focus on their own departmental needs threatens the entire project. Creative people and IT people tend to think very differently, to the extent that they can completely misunderstand each other. So, it is not enough just to get people to sign off a complex requirements document, they need to understand it on their own terms otherwise they are likely to be disappointed. Building a system that the creatives are happy using, even though their requirements may seem bizarre to the IT department, is just as important as keeping documentation in good order, something the creatives might not appreciate.

One of the hardest aspects is working out what you really want from the system, and where you are prepared to compromise in order to get something that can actually be built for the budget. In terms of functionality, it is important to think beyond what people ask for, as that is likely to just be what they are using already, to what they would like if they knew it existed.

It is also important to see any system working in practice, as demos may not be enough to know that it will work with your data and business processes. A small delay in keywording speed can have a huge impact on productivity.

An iterative build process is good if needs are complex. It can be expensive, but may still be cheaper than ending up with an underperforming system.

Some customers expect the vendor to do all the thinking for them, but the vendors do not know the business needs and processes of every single business. It is not an easy job to be a software vendor, and a difficult, disorganised customer will always end up having a hard time.

Panel Session: How We Did Our Procurement

Laura Caulkin from Net-a-Porter and Lisa Hayward from Shell described their recent DAM procurement projects.

Net-a-Porter took a very user-centric approach, running lots of requirements gathering workshops and writing user stories rather than a requirements spreadsheet. They spent 6-8 months planning what they wanted the new system to do, with the help of external expert consultants.

At Shell the build process of their new DAM system took about 18 months and involved the migration of 50,000 assets. They selected Vyre as a good cultural fit. Lisa stressed the importance of getting vendors to demo their products with your data, and felt that detailed requirements documents and scoring systems were worthwhile. She argued for the “soft” value of a good cultural fit between the team and the vendor to be included in the scores, which the IT team did not agree with. She pointed out that the IT team have a very different perspective: they want the product that will be easiest to integrate in the short term, but you are the one left to work with the system on a daily basis.

In terms of lessons learned, she felt that their initial requirements were not specific enough, and that this led to change requests and delays. This caused problems with senior managers, who had not expected the project to take so long. However, once it was built, it was well received. Lisa pointed out that change management is really just talking to people and explaining what is going on. They had great success with the adoption of the new system as it was such a vast improvement on the system that had been built for them 10 years previously.

The Organizational Digital Divide

Catching up on my reading, I found this post by Jonah Bossewitch: Pick a Corpus, Any Corpus and was particularly struck by his clear articulation of the growing information gulf between organizations and individuals.

I have since been thinking about the contrast between our localised knowledge organization systems and the semantic super-trawlers of the information oceans that are only affordable – let alone accessible – to the megawealthy. It is hard not to see this as a huge disempowerment of ordinary people, swamping the democratizing promise of the web as a connector of individuals. The theme has also cropped up in KIDMM discussions about the fragmentation of the information professions. The problem goes far beyond the familiar digital divide, beyond just keeping our personal data safe, to how we can render such meta-industrial scale technologies open for ordinary people to use. Perhaps we need public data mines to replace public libraries? It seems particularly bad timing that our public institutions – our libraries and universities – are under political and financial attack just at the point when we need them to be at the technological (and expensive) cutting edge.

We rely on scientists and experts to advise us on how to use, store and transport potentially hazardous but generally useful chemicals, radioactive substances, even weapons, and information professionals need to step up to the challenges of handling our new potentially hazardous data and data analysis tools and systems. I am reassured that there are smart people like Jonah rising to the call, but we all need to engage with the issues.

There’s no such thing as originality

Back in 1995 my brother wrote his MA dissertation on copyright: Of Cows and Calves: An Analysis of Copyright and Authorship (with Implications for Future Developments in Communications Media) or How I Learned To Stop Worrying and Love Home-Taping. It is interesting how relevant it remains today, especially in the light of the Hargreaves Report delivered to the government in May. Essentially, nothing much has changed in the intervening 16 years. My brother reflected on the predictions of the profound changes that digital technologies would make – and were already making – to the creative industries. Although details such as the excitement over ISDN lines and the absence of any mention of mobile technologies date his work, the core issues he covers – who owns an idea and who should get paid for it – remain remarkably current.

I’ve written a brief overview of the Hargreaves Report for Information Today, Europe. The two aspects of most interest to me are the proposals for a Digital Copyright Exchange and for handling of orphan works.

Ideas as objects

My brother argues that creative works are all part of an ongoing cultural dialogue that no one individual can really “own” and that copyright only made sense for the short period of time where technology reified ideas as artefacts that could be traded as commodities (like potatoes or coal). The business model of “content” as “physical item” started to fail with the invention of the printing press, as the process of copying ceased to be a creative act, so each individual copy was not a “new” work in its own right. Copyright law was developed to commodify the “idea” within a book, not the physical book, and was enforceable for only as long as access to the copying technologies – printing presses – could be limited. The digital age has made control of the copying process impossible, as the computer replaced the printing press, so one could exist on every desk, and now, thanks to mobile technology, in every pocket. He notes that in pre-literate societies, authorship of a myth or folk tale was not important, and I find it interesting that crowdsourcing (e.g. Wikipedia, citizen journalism) has in some ways returned us to the notion of a culturally held store of knowledge contributed and curated by volunteers, rather than by paid professionals.

Music is not the only art

Many of the discussions of free versus paid-for content seem to run to extremes, and seem to be coloured by the popular music industry’s taste for excess. The music industry inflated commodity prices far beyond what consumers were willing to pay just as cheap copying technologies became widely available, making the pirates feel morally justified. It is hard to feel sympathy for people living a millionaire rockstar lifestyle. The inevitable increase in piracy was met not by lower prices, but by the industry issuing alarmist statements about home taping killing music. It didn’t! Music industry profits have risen steadily. The industry has simply turned its attention to charging more for merchandise and live events. The lesson to be learned is that people are willing to pay for experiences, services, and commodities that they perceive as being worth the price and better than the alternatives. Most music fans would rather pay for an easy, virus-free, reliable download service than deal with illegal download sites, just as back in the 1980s sticking a microphone in front of the radio to record the charts wasn’t as good as buying the vinyl. The effects on the reference publishing industry were very different, affecting many small businesses and people on far less than rockstar wages, but most displaced people found ways to transfer their skills to numerous new areas of work – obvious examples are content strategy or user experience design – that simply didn’t exist in the 1990s.

It has become a bit of a cliché that there aren’t any business models that work, or have been shown to work, in the new digital economy, but are things really so different? In order to have a business somebody somewhere has to be persuaded to pay for something. Everything else is just a complication. If your free content is supported by advertising, it just means that someone needs to be persuaded to pay for the advertised product, instead of the free bit that appears on the way. Similarly, “freemium” is really just old-style free samples and loss leaders. You can’t have the “free” bit without paying for the “premium”. The two key questions for producers remain, as they have always been, how do you produce content that is so useful, entertaining, or attractive that people are willing to pay for it, and how do you deliver it in ways that make the buying process as easy as possible?

Hargreaves suggests a light touch towards enforcement of rights and anti-piracy, firmly supporting the view that if content and services are good enough, people will pay, and that education about why artists have a right to be paid for their work is as important as catching the pirates. Attempts to “lock” copies with Digital Rights Management systems certainly don’t seem to have been very successful. They are expensive to implement, unpopular, and pirates always manage to hack them. Watermarking doesn’t attempt to prevent copying but does help prove origin if a breach is discovered. Piracy is less of a worry for business-to-business trade, as most legitimate businesses want to be sure they have the correct rights and licences for content they use, rather than face embarrassing and expensive lawsuits, and a simplified, secure Digital Copyright Exchange would presumably be in their interests.

Digital Copyright Exchange

Hargreaves proposes the Digital Copyright Exchange as a mechanism to make the buying and selling of rights far easier. At the moment, piracy can be a temptation because of the time and effort required to attempt to purchase rights. Collections agencies and the law form layers of bureaucracy that hamper start-ups from developing new products and simply confuse ordinary users. This represents real lost revenue to the content providers.

Metadata analyst and music fan Sam Kuper made an interesting proposal for setting fair prices: artists should put a “reserve price” on their work, with an initial fee for purchasers. Once the “reserve price” has been reached, revenue from any subsequent purchase is shared between the artist and the early purchasers. This would guarantee a level of income for artists, allow keen fans to get hold of new material quickly, and allow those less sure to wait to see if the price drops before purchase. Such a system sounds complex, but could work through some kind of centralised system, so that “returns” to early purchasers would be paid as credits to their accounts.
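The mechanics are easier to see with a small worked sketch of the proposal as I understood it. Note that the 50/50 split of post-reserve sales and the equal division among early buyers are my assumptions for illustration, not part of Sam’s proposal:

```python
# Sketch of a "reserve price" settlement: buyers pay a flat fee;
# until the artist's reserve is met, all revenue goes to the artist;
# afterwards each sale is split between the artist and the early
# purchasers as account credits (split ratios assumed, see above).

def settle(reserve, fee, buyers):
    artist_income = 0.0
    credits = {}          # early buyer -> accumulated credit
    early_buyers = []     # those who bought before the reserve was met
    for buyer in buyers:
        if artist_income < reserve:
            # Pre-reserve sale: the artist keeps the whole fee.
            artist_income += fee
            early_buyers.append(buyer)
            credits[buyer] = 0.0
        else:
            # Post-reserve sale: half to the artist, half shared
            # equally among the early buyers as credits.
            artist_income += fee / 2
            share = (fee / 2) / len(early_buyers)
            for early in early_buyers:
                credits[early] += share
    return artist_income, credits

income, credits = settle(reserve=30, fee=10, buyers=["a", "b", "c", "d", "e"])
print(income)   # 40.0 -- the reserve plus half of the two later sales
print(credits)  # early buyers a, b and c share the other half
```

With these numbers the artist is guaranteed the £30 reserve and keeps earning beyond it, while each early buyer’s effective price falls as later sales come in, which captures the incentive structure Sam described.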

From an archive point of view, Hargreaves’s call to allow the digitisation and release of orphan works without endless detective work in trying to trace origins would be a huge boon.

So, there is much to think about in the Hargreaves report and some very sensible practical suggestions, but much detail to be worked out as well. I wonder if in another 16 years, my brother and I will have seen any real change or will we still be going through the same debate?