Arnoud de Kemp was the marketing and sales director and a deputy member of the board of the scientific publisher Springer-Verlag from 1984 to 2004. Apart from worldwide sales and marketing, he was responsible for the development of new media and electronic publishing ("SpringerLink"). He was a member of the Executive Board of the International Association of Scientific, Technical, and Medical Publishers (STM), has been active in the International DOI Foundation from its very start, was a long-time Director of the International Electronic Publishing Research Centre (IEPRC) and a past President of the Deutsche Gesellschaft für Informationswissenschaft und -praxis (DGD, now DGI), and is now Chairman of the Electronic Publishing Working Group (AKEP) in the Börsenverein des Deutschen Buchhandels (Association of German Publishers and Booksellers), among many other activities.

Since 2004, he and his associate Ingrid Maria Spakler have been building up the digital agency and publisher "digilibri" in Heidelberg, which sees itself as an intermediary between suppliers and purchasers and which uses advanced database and security technology, especially access-rights management and digital watermarking. digilibri has just opened its new website, a media database and an online asset management system, with a special programme "digilibripro" for publishers and other organisations that would like to manage, catalogue and present their digital assets, in particular images with texts. digilibri turns digital objects on the fly into electronic publications by assigning a Digital Object Identifier (DOI) and registering the electronic publication in a central internet register for permanent identification, citation and retrieval.
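That central register is what makes a DOI permanently citable: the identifier is looked up through the public resolver at doi.org, which redirects to the object's current location. A minimal sketch in Python, assuming nothing about digilibri's own systems; the example uses the well-known DOI of the DOI Handbook:

    # Resolve a DOI through the public doi.org resolver.
    import urllib.request

    def resolve_doi(doi: str) -> str:
        """Follow the doi.org redirect and return the final landing URL."""
        req = urllib.request.Request(f"https://doi.org/{doi}", method="HEAD")
        with urllib.request.urlopen(req) as resp:
            return resp.url  # final URL after all redirects

    print(resolve_doi("10.1000/182"))  # DOI of the DOI Handbook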

Arnoud de Kemp is acknowledged as one of the pioneering experts in the international scientific and professional publishing landscape. With his new company, digilibri, he is also in a position to argue from the point of view of an advanced user of DRM. Contact: dekemp@digilibri.de.

INDICARE: Mr. de Kemp, there is a lot of talk about Digital Rights Management (DRM). Our impression is that scientific publishing is largely unaware of this. Is that correct?

A. de Kemp: What’s in a name? Publishers use DRM, but they call it something different, maybe because they organised themselves long before DRM became a well-known expression. Hence, DRM is used far more by science and professional publishers, as well as by learned societies, than is generally perceived.

INDICARE: What is your underlying conception of DRM?

A. de Kemp: My simple overall definition is: DRM is nothing else but electronic or digital registration and control of access to media, both databases and specific content. This might start with the registration of subscriptions to printed journals in the large computer systems of subscription agencies and publishers, go through the exploitation and administration of access to electronic journals through journal agencies or in electronic library collections, and extend to electronic watermarks in all kinds of documents, e.g. in audio books. This is a very broad area. The amazing thing, however, is that, apart from watermarking, publishers, libraries and journal agencies have been using such systems for much longer than the term "DRM" has been in fashion. In the area of scientific and professional publishing, the term DRM has hardly been used at all. People who are involved in system development, database management, telecommunication etc. of course use a different language, but they are not publishers.

In a more narrowly scientific definition, one would stress cryptographic encoding, digital identification and the regulation of use. I regard the registration of access and the metering of use, up to billing, as important. Because there is much more money involved, the music publishers and entertainment companies exploiting their content commercially have been far more active here.

INDICARE: Where do you see the difference between the entertainment industry which has strongly pushed the debate on DRM and the scientific publishing domain?

A. de Kemp: There is a whole series of differences. The most important is that in science there is a long tradition of free exchange of research results. Scientists go to conferences, present papers, and present and defend their findings in poster sessions. This may result in articles that are offered for publication. Most of the material comes in unsolicited; some is written on invitation (invited papers). For journal articles, no royalties are involved. Secondly, you have to realise that there are only a few very large scientific publishers. The majority are small companies, university presses and learned societies. There is little cooperation and limited standardisation between publishers, except for SGML and DOI. Most of the standardisation that publishers use comes from companies like Adobe (with PDF), database developers and network companies. Broadcasting of music and television is a completely different business from that of the scientific publishers. A good scientific journal may have a print run of up to 3,000 to 4,000 copies, and that's it. Thirdly, we have completely different distribution channels. We sell our content, in particular journals, by subscription through bookshops and specialised subscription agencies, primarily to libraries and institutions. Practically all journals now also exist in electronic form. It is still common to sell a combination of a printed title and its electronic version. Libraries can license one title, a series of titles or entire full-text databases, and their users then have unlimited access and can download text documents. In other words, we seem to have a straightforward DRM environment in scientific publishing, distribution and dissemination. There are services run by publishers, learned societies or aggregators that are based on metered downloads; these, however, are mostly for bibliographic and factual databases: abstracts, tables of contents, chemical structures, chemical reactions, patents, news, business information and stock market quotations.

I’d like to mention one more specific feature of the entertainment business: the prices for CDs and videos are kept artificially high by the entertainment industry. The proliferation of self-burned CDs and DVDs may thus be seen as a kind of consumer protest.

The music industry has to pay royalties to composers, songwriters, musicians, conductors, studios etc. It is a far more complex business. In our world, the use of photocopiers is metered and a small fee per copy is then paid to the central reproduction rights organisation (RRO), which pays publishers and registered authors according to certain schemes. Publishers mostly pay royalties only to book authors, and for many works in science, which consist of individual contributions, not even that is the case. The publisher makes the investment, takes the risk, guarantees continuity and promises to make the content publicly known. The authors and contributors get the reward of being published and, hopefully, cited. For a lot of journal publications the author has to pay a page charge to support the publisher; in most cases these publishers are societies and the payment serves to keep the price of the publication low, especially for members of the society. The "Open Access" initiatives advocate that authors and their institutions pay enough money to make the publication free to anyone.

INDICARE: What is actually happening at publishers with respect to the introduction of DRM in the stricter sense?

A. de Kemp: Next to nothing is happening, as in the world of publishing people feel that everything is already taken care of. Through consortium licences and copyright law, there is the possibility of unlimited use of scientific literature. Students can log in from home and access literature from the databases that are licensed by the university libraries. In this way, scientific literature is in most cases in principle freely accessible to the end-user. There is little inclination to copy and disseminate scientific articles as an alternative.

During the past two years, there has been a hefty debate on the reform of German copyright law, under which professors, teachers, students and workgroups are to be allowed to copy parts of works and store them. There was a great fear among publishers that entire journals or works of reference would "leak" this way. In my opinion, this fear is largely unfounded, since digital literature is mostly already "free" in an organised way. It is different, however, with books, especially textbooks, where we still have very little experience, as only few are available in electronic format.

INDICARE: Isn’t there the fear that a scientist could download an article from Elsevier’s Science Direct or SpringerLink and, for instance, make 10 copies which he passes on to his colleagues?

A. de Kemp: He has access to his own article and he is allowed to do that. For research and teaching purposes such practices are permitted. But it does not work that way: he will send a mail with an attachment or with a link. Still, lots of scientists order offprints or original PDFs from their publishers for documentation purposes in the funding and approval process and for exchange with colleagues.

Most publishers nowadays allow authors to store a copy of their article on their own server, provided that at least a link to the original, formal publication by the publisher is included.

INDICARE: And there are no forces in the publishing world that are now saying, "we will no longer allow that, since in DRM we have the technical means to prevent it"?

A. de Kemp: No. As long as it is covered by licences, contracts and laws, it is not seen as a major problem. We had that debate in the course of the reform of the German copyright law, and the implementation of the law in practice certainly still needs close monitoring to prevent misuse. But in general, the tendency in the scientific publishing world is that the current organisation and regulation are adequate. What we do not want is mass copying by libraries, which then provide large-scale document delivery services in unfair competition with publishers of all kinds, for profit or not for profit.

Who should control individual use? That is the crucial question. The publishers are unable to control individual use at universities, since there is only one central point of access. Publishers and their agents provide statistics on general use to the universities and are happy, in most cases, that the literature offered this way is better used than ever before. Reading rooms in university libraries are full nowadays.

We have had extensive discussions on this in STM circles, the International Association of Scientific, Technical, and Medical Publishers. Everything attempted in this direction in the past (watermarks, digital envelopes which have to be opened with codes received beforehand or afterwards) did not meet with acceptance. People don’t want it. Scientists and students want information without technical barriers.

INDICARE: No further restrictions? No stipulations that this document may only be used on a single computer or, for instance, be printed once only?

A. de Kemp: No, no further restrictions. Anything more is not feasible to control. If one has institutional licence agreements or consortium contracts with large data centres and universities, then access control is only possible by means of general IP addresses. We cannot determine who is behind them. That’s a problem. It is not like in P2P or B2B, where there is a direct relationship between supplier and consumer. Our route is from a supplier to a large grey cloud called the university. We are unable to ascertain whether this means 10 or 200 institutes, or 2,000 or 10,000 students, and we are also unable to organise transparency in this respect, apart from a description in the contract.
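In practice, this kind of institutional access control amounts to a membership test of the requesting address against the licensed campus ranges, and nothing more fine-grained. A minimal sketch in Python, with invented address ranges and not modelled on any particular publisher's system:

    # IP-based institutional access control; the licensed ranges are
    # invented for illustration (RFC 5737 documentation addresses).
    from ipaddress import ip_address, ip_network

    LICENSED_NETWORKS = [
        ip_network("192.0.2.0/24"),
        ip_network("198.51.100.0/24"),
    ]

    def has_access(client_ip: str) -> bool:
        # Identifies the institution, never the individual user.
        addr = ip_address(client_ip)
        return any(addr in net for net in LICENSED_NETWORKS)

    print(has_access("192.0.2.17"))   # True: somewhere on campus
    print(has_access("203.0.113.5"))  # False: outside the licence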

INDICARE: Do you mean that the effort to control individual transactions would be too great?

A. de Kemp: No, the systems of publishers and agents would simply not be able to do it. Universities and their libraries don’t work with access control; everything is open. It is different in industry, which also licenses our content. Industrial customers don’t wish anyone from the outside to know who is using which information. They may have very detailed internal costing or profit centres, but that is their issue, not ours. They don’t wish transparency on which articles and documents are being used. That is by no means such a sensitive subject in the distribution of music as it is in science and research.

As for barriers, we have been confronted with this problem directly when building up "digilibri". We supply pictures, high-quality photos, copies of antique documents, and high-resolution images of original paintings with lots of descriptive text. For each image, the rights situation is documented in a very flexible way. From the very beginning we considered working with DRM, as we needed to protect this sensitive material from unwanted commercial exploitation. This starts with registration. We present three look-up formats: thumbnails, a preview and a very large preview, all in a low resolution, but enough for a computer screen. We add intelligent watermarks. Each document to be found in our media database, in each format, is now protected by a visible "digilibri" watermark. Only registered users are allowed to see the large preview. Once a registered purchaser interested in the image has clarified all issues related to use and exploitation, the image is released as a download or shipped on a DVD, which we think is the better way to deliver high-resolution material anyhow.
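A visible watermark of this kind is typically composited onto the low-resolution previews when they are generated. The following sketch uses the third-party Pillow library; digilibri's actual pipeline is not public, and the file names are placeholders:

    # Stamp a semi-transparent "digilibri" watermark onto a preview image.
    # Uses the third-party Pillow library; file names are placeholders.
    from PIL import Image, ImageDraw, ImageFont

    def watermark_preview(src: str, dst: str, text: str = "digilibri") -> None:
        base = Image.open(src).convert("RGBA")
        overlay = Image.new("RGBA", base.size, (0, 0, 0, 0))
        draw = ImageDraw.Draw(overlay)
        font = ImageFont.load_default()
        # Semi-transparent white text roughly in the middle of the image
        draw.text((base.size[0] // 3, base.size[1] // 2), text,
                  font=font, fill=(255, 255, 255, 96))
        Image.alpha_composite(base, overlay).convert("RGB").save(dst, "JPEG")

    watermark_preview("preview_in.jpg", "preview_out.jpg")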

However, during the tests we conducted we soon noticed that the acceptance of visible watermarks among artists, photographers and illustrators who see their own works in the database with a watermark is rather low, or even negative. At the moment we’re therefore thinking of using invisible or more transparent watermarks.

INDICARE: Hence you would be taking a direction that specialists call forensic DRM.

A. de Kemp: Yes, of course we are thinking in this direction, although there are also problems with forensic DRM.
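The principle behind such invisible marks is to hide a per-recipient identifier imperceptibly in the pixel data, so that a leaked copy can be traced back to its purchaser. The textbook illustration of the idea is least-significant-bit embedding, sketched below with Pillow; production forensic watermarks use far more robust, transform-domain schemes that survive recompression:

    # Illustration of the forensic idea only: hide a recipient ID in the
    # least significant bits of the pixels. The output must be saved
    # losslessly (PNG), and real systems embed far more robustly.
    from PIL import Image

    def embed_id(src: str, dst: str, recipient_id: str) -> None:
        img = Image.open(src).convert("RGB")
        pixels = bytearray(img.tobytes())
        bits = "".join(f"{b:08b}" for b in recipient_id.encode("utf-8"))
        for i, bit in enumerate(bits):          # one payload bit per byte
            pixels[i] = (pixels[i] & 0xFE) | int(bit)
        Image.frombytes("RGB", img.size, bytes(pixels)).save(dst, "PNG")

    def extract_id(path: str, length: int) -> str:
        data = Image.open(path).convert("RGB").tobytes()
        bits = "".join(str(data[i] & 1) for i in range(length * 8))
        return bytes(int(bits[i:i + 8], 2)
                     for i in range(0, len(bits), 8)).decode("utf-8")

    embed_id("release.png", "marked.png", "purchaser-0042")
    print(extract_id("marked.png", len("purchaser-0042")))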

INDICARE: What would be the alternative?

A. de Kemp: The open route using contracts. We will conclude framework contracts with editors, image agencies, designers etc. and give them open access to our material in a special catalogue, controlled by their IP address, user name and password. We provide user rights by contract in the conventional manner (printed, stamped, sent by fax with signatures etc.) and the material provided is given no further protection.

INDICARE: Besides those that you have already mentioned, are there any other barriers to the introduction of DRM in publishing houses? For instance, is DRM too expensive, not sufficiently reliable, or inadequately standardised?

A. de Kemp: To me, the last point seems to be the main problem. There are still no standards for reliable encryption in the dissemination of scientific documents. The user does not appreciate being restricted by all kinds of technologies.

At Springer, we had never-ending tests with CD-ROMs, trying to encrypt them. Most technologies were obsolete from the beginning or soon became obsolete. I realise that there are more advanced technologies today.

INDICARE: Isn’t Adobe Acrobat already the standard?

A. de Kemp: Yes, it is currently the best encryption for documents that we can imagine. Fantastic. It comes with all PCs and Macintosh computers as an OEM product, and the Acrobat Reader can be used free of charge. That is why it has become so successfully established.

By the way, there is the DocuRights system by the Aries company, which builds on PDF. It is being tested, and partly already used, by a number of STM publishers. At Springer we were also investigating it, but I don’t know whether Springer made the decision to apply the system. DocuRights wraps the document in a secure container and protects it regardless of its physical location on the Internet. During my time at Springer we actually came to the conclusion that this was an interesting technique, but not a necessary one.
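PDF itself ships with a password-based encryption and permissions model, which is the baseline that wrappers like DocuRights build on. A minimal sketch using the third-party pypdf library (not Adobe's tooling, and not DocuRights, whose container format is proprietary); file names and passwords are placeholders:

    # Apply PDF's built-in password protection to an article.
    # Uses the third-party pypdf library; names are placeholders.
    from pypdf import PdfReader, PdfWriter

    reader = PdfReader("article.pdf")
    writer = PdfWriter()
    for page in reader.pages:
        writer.add_page(page)

    # Readers open the file with the user password;
    # the owner password unlocks all permissions.
    writer.encrypt(user_password="subscriber", owner_password="publisher")

    with open("article_protected.pdf", "wb") as f:
        writer.write(f)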

INDICARE: Let’s have a look at other actors involved in the exploitation of scientific content, e.g. the collecting societies. Some argue that collecting societies might become obsolete due to DRM systems, because collective rights management and compensation schemes could now be replaced with more equitable, individual use-based billing. What do you think about this?

A. de Kemp: The collecting societies were created to collect and administer fees, charges for copiers, fax machines, DVD burners, scanners, blank media etc. Somebody has to collect, administer and distribute these dues. And that can practically only be a centralised organisation.

The alternative model is to concentrate on content and attempt to measure it. That is extremely complex and difficult to achieve, since organisations like the collecting society "VG Wort" have a legal basis and too many parties in the information sector are involved.

In the medium term, I would hope for a shift in the tasks of the collecting societies. For instance, by combining Digital Object Identifiers (DOI) and DRM systems, one could establish a kind of usage counter and use it at least for detailed metering, and in the long term even for a better distribution of the money to publishers and authors. The collecting societies would then not become superfluous, since they could be responsible for the business of accounting and billing. This would not be limited to texts and images. The DOI would also be a perfect facilitator in other sectors, like digital music, audio books and download platforms in general.
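The metering he describes boils down to counting deliveries per DOI and turning the counts into payout shares. A minimal sketch of that bookkeeping, with invented DOIs and an invented sum to distribute:

    # Count downloads per DOI and derive payout shares from the counts.
    # The DOIs and the amount to distribute are invented examples.
    from collections import Counter

    usage = Counter()

    def record_download(doi: str) -> None:
        # Called once per delivered document; the DOI identifies the work.
        usage[doi] += 1

    # Simulated download log
    for doi in ["10.1000/182", "10.1000/182", "10.9999/example.1"]:
        record_download(doi)

    pot = 100.00  # money collected by the society, in EUR
    total = sum(usage.values())
    for doi, n in usage.items():
        print(f"{doi}: {n} downloads -> EUR {pot * n / total:.2f}")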

INDICARE: How do you view the relationship of scientific publishers to the open access movement?

A. de Kemp: As I have said before, "Open Access" wishes to make all published material free of charge. In this view, libraries and scientists stand on one side and the publishers and their helpers on the other. In principle, however, the publishers should not be against "Open Access". If the money that the libraries currently pay to publishers for the use of the publications is re-allocated by the funding agencies and similar organisations to finance the publishing process and the dissemination of electronic publications, we as publishers should be happy, as life will be easier. Springer very quickly presented "Open Choice" with good arguments: we don't care who pays, but whoever pays can determine the rules of use. The "Open Access" movement is a real anti-DRM movement. The danger of "Open Access" is that relevant scientific literature becomes grey literature, and that there are big issues (originality, the exploitation of the described results, all the way up to patent applications) that are not addressed at all.

INDICARE: To close, a question about more structural, mid-term changes. How do you see the functional and structural changes in scientific publishing?

A. de Kemp: Positively! Through consortium contracts with universities and entire countries, scientific content is accessible everywhere. CrossRef will continue to spread its influence and support linking, and hopefully better access to full text, as Google Scholar is currently attempting. In the past, publications were "hidden" in large or small university libraries and not accessible; finding the way was not always easy. Bibliographic databases have been around for a long time, but they are a very narrow access path. Now the material is accessible around the clock on the Internet. That’s a fantastic development.

But I have a different worry. The worry is called Google, and I have a great fear that we are being "googlified". The great simplicity and the enormous quantitative search results that Google produces are accepted uncritically. This might result in a tendency to no longer use documents, articles and books, but to solve all our information problems using Google. There, information is not really indexed deeply enough, and the algorithms behind the ranking are unclear. "Googlification" should be of great concern to everybody in the information as well as the education sector, including parents of children.

The time will come when the majority of library holdings are available digitally. There are initiatives everywhere, triggered or accelerated by announcements from Google and Amazon, to digitise whole libraries or make whole publisher catalogues readable ("Search Inside the Book"). The French National Library; The European Library, led by the Royal Library in The Hague; and large university centres like Göttingen in Germany or Cornell in the US all have retro-digitisation projects. Project Gutenberg is also an effort to digitise out-of-print books. Soon, we will have the whole world in our hands.

INDICARE: Thank you very much for this interview.

Status: first posted 09/08/05; licensed under Creative Commons
URL: http://www.indicare.org/tiki-read_article.php?articleId=129