OPENING SESSION
The host library, TIB, was highlighted as the largest science and technology library in the world, with 9 million items, 500 staff, 1,700 workspaces, and opening hours of 95 hours a week. The T9 universities were also introduced. The T9 libraries are the libraries of the nine leading German Institutes of Technology, many of which have university status. The role and importance of IATUL was outlined: since its formation in 1955, IATUL has provided an international forum for the exchange of information.
Find out more about TIB at http://www.tib.uni-hannover.de/en/
Martin Hofmann-Apitius: “Innovative usage of unstructured information sources: From text- and data-mining to model-driven decision-support”
This session outlined efforts undertaken to develop methods that make unstructured scientific data available in a structured format, enabling computer processing of that data. The illustrative example used was genome sequencing to facilitate personalised medicine. As each person differs, the question of how to assess the individual case becomes imperative to answer. This is especially true when one might be met with the case of a patient with six weeks to live. Existing tools at the Fraunhofer Institute for such analysis include a text miner, ProMiner, SCAIview, dictionaries, Medline abstracts, NLM PDFs, PMIDs and entities. These are analysed using Named Entity Recognition (NER) and normalisation. The ProMiner pre-processing steps include natural language processing (NLP) and NER. SCAIview allows for semantic search and document retrieval. To develop this further and identify causal relationships, machine learning is needed; KNIME is leveraged for this. UIMA (an open-source standard for content analysis) is used to identify relationships and allow for the extraction of BEL-like statements. This simplified syntax allows for the application of automated reasoning. Encoded queries or statements can create graphs, though the process is still quite manual and input-intensive. See examples of this graphing at http://www.sciencedirect.com/science/article/pii/S1552526015000837
This process has a 70-90% recall and precision rate for the biomedical field. As more dictionaries are added, NER matching improves, and the improved data culminates in improved results.
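The dictionary-driven NER step described above can be illustrated with a minimal sketch: surface forms (synonyms) are looked up in a dictionary and normalised to a canonical entity identifier. The dictionary entries, identifiers and example sentence here are invented for illustration, and this naive substring matcher is far simpler than a production tool such as ProMiner.

```python
# Minimal sketch of dictionary-based Named Entity Recognition (NER)
# with normalisation. Dictionary and text are illustrative only.

# Map surface forms (synonyms) to a normalised entity ID.
GENE_DICTIONARY = {
    "app": "HGNC:APP",
    "amyloid precursor protein": "HGNC:APP",
    "mapt": "HGNC:MAPT",
    "tau": "HGNC:MAPT",
}

def recognise_entities(text):
    """Return (surface form, normalised ID) pairs found in the text.

    Longer synonyms are tried first so multi-word names are not
    shadowed by their shorter parts.
    """
    found = []
    lowered = text.lower()
    for synonym in sorted(GENE_DICTIONARY, key=len, reverse=True):
        if synonym in lowered:
            found.append((synonym, GENE_DICTIONARY[synonym]))
    return found

abstract = "Amyloid precursor protein processing influences tau phosphorylation."
print(recognise_entities(abstract))
```

Adding more dictionaries (more synonyms per entity) directly increases the number of matches this kind of lookup can make, which is the mechanism behind the improvement noted above.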
BEL records the full provenance of the process and can encode entire datasets, thus significantly speeding up the analysis process. This is especially important if you are faced with the example above, where the patient has weeks to live. There needs to be a change in the management of copyright and copyright law to allow this type of decision support to begin saving lives in critical, time-sensitive scenarios, as many of the databases where important research in this area is held do not allow full-text mining.
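How BEL-like statements yield graphs suitable for automated reasoning can be sketched as follows: each statement is a subject-relation-object triple, and the triples form an adjacency map that can be traversed. The statements and entity names below are invented for illustration; real BEL is considerably richer (functions, annotations, evidence and provenance).

```python
# Minimal sketch: assemble simplified BEL-like statements
# (subject, relation, object triples) into a causal graph.
from collections import defaultdict

# "p(...)" denotes a protein abundance, "bp(...)" a biological process.
statements = [
    ("p(HGNC:APP)", "increases", "p(HGNC:MAPT)"),
    ("p(HGNC:MAPT)", "increases", "bp(GO:neurodegeneration)"),
]

def build_graph(triples):
    """Build an adjacency map: subject -> list of (relation, object)."""
    graph = defaultdict(list)
    for subject, relation, obj in triples:
        graph[subject].append((relation, obj))
    return graph

def downstream(graph, start):
    """Collect every node reachable from `start` (simple traversal)."""
    seen, stack = set(), [start]
    while stack:
        node = stack.pop()
        for _, obj in graph.get(node, []):
            if obj not in seen:
                seen.add(obj)
                stack.append(obj)
    return seen

graph = build_graph(statements)
print(sorted(downstream(graph, "p(HGNC:APP)")))
```

Once statements are encoded this way, a query such as "what does APP ultimately influence?" becomes a graph traversal rather than a manual literature search, which is where the speed-up comes from.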
KEYNOTE 2: JOSÉ COTTA: “FROM OPEN ACCESS TO OPEN SCIENCE: A VISION”
Mr. Cotta outlined changes that have resulted from the appointment of the new European Commission in 2014, relative to his position in the European Commission Directorate General for Communications Networks, Content & Technology [DG Connect]. The directorate is working towards its aim of creating a single digital market which allows for the free flow of data, including research data. Copyright reform is a very important aspect of this new direction, especially in relation to text and data mining. This move dovetails with the EU Commission’s policy for open science, its plans to create an e-infrastructure, and its emphasis on open access publishing.
As much of the research undertaken within the EU is done under the auspices of publicly funded institutions, the case grows for research, data, processes and even software generated via such funding structures to be made publicly available, so that the citizens of the funding countries can see the end results of the projects undertaken and, when applicable, benefit from them. The open availability of this information is essential with the move towards evidence-based decision making, but may necessitate a change in publishing, intellectual property, data protection and copyright models. Indeed, Mr. Cotta called attention to issues with peer review, and indicated that this too may need an alternative. For example, the Horizon 2020 funding stream emphasises the open access publishing of project results, as evidenced by the OpenAIRE project.
PANEL DISCUSSION: “INFORMATION RESOURCES AND SOCIETAL CHANGE”
Chair: Peter Löwe
PANEL DISCUSSION PARTICIPANTS:
• Martin Hofmann-Apitius, Fraunhofer Institute SCAI
• José Cotta, European Commission
• Frank Scholze, Karlsruhe Institute of Technology, Germany
• Elisha Chiware, Director CPUT Libraries, South Africa
At the end of this session it was interesting to see the illustration of the issues and topics created by the graphic artist who was present in the main hall for the duration of the morning.
LIBRARY STRATEGY AND MANAGEMENT SESSION
ELLEN SAFLEY: “UNCOMFORTABLE – COMMITTING TO CHANGE – FINDING SUCCESS”
Ms. Safley discussed her library’s decision to become an early adopter of the Alma library platform, the issues which arose from this, and the integration of the platform with the resource discovery layer. The project was outlined, and approaches to relieving staff stress were given. In addition, the testing undertaken, and the problems which arose from going live on the new system on the first day of semester, were detailed.
CAROLIN BECKER: “PERFORMANCE INDICATORS AT TUM LIBRARY”
Ms. Becker outlined how, in pursuit of ISO 9001, performance measures, both qualitative and quantitative, were needed which integrated into budgets, staffing structures, policy and strategy. The quality management team were able to identify data already gathered which matched Key Performance Indicators (KPIs) in the German national library statistics scheme (BIX). However, the weakness of the existing data is that it focuses on traditional library tasks, with over 400 KPIs across 6 dimensions [... of the balanced scorecard]. TUM decided to select 10-20 KPIs for each functional area, and consulted staff in each area as to what these should be. While some overlap arose, the library executive committee decided on the final set.
A Quality Management Officer has been put in place, and the focus on measurement moved to improving services, rather than controlling staff.
Examples of KPIs included: physical and electronic collections versus levels of usage, and cost per use; the library as a place, down to branch level; technical services workload versus the currency of services; information support requests via phone, e-mail and WhatsApp; usage of e-learning materials; and Facebook reach. Only some KPIs have a target to be met.
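A cost-per-use KPI of the kind mentioned above is simple to compute once spend and usage figures are paired up. The collections, costs and usage counts below are invented for illustration; in practice the figures would come from the library management system and vendor usage reports.

```python
# Minimal sketch of a cost-per-use KPI comparing collection spend
# against recorded usage. All figures are illustrative only.

collections = [
    # (name, annual cost in EUR, recorded uses)
    ("E-journal package A", 50000, 125000),
    ("E-book platform B", 12000, 3000),
    ("Print monographs", 30000, 6000),
]

def cost_per_use(cost, uses):
    """Cost per use; None when there is no recorded usage."""
    return cost / uses if uses else None

for name, cost, uses in collections:
    cpu = cost_per_use(cost, uses)
    print(f"{name}: {cpu:.2f} EUR per use")
```

Even this simple ratio makes the comparison in the text concrete: a heavily used package can cost far less per use than a cheaper but little-used one.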
PETRA DÜREN: “SHADOW OF THE LEADER: HOW LIBRARY LEADERS UNDERMINE OR BOLSTER CHANGE EFFORTS”
SIMONE FÜHLES-UBACH: “VALIDATING LIBRARY STRATEGIES BY ASSUMING THE USER PERSPECTIVE”
Dr Fühles-Ubach put forward a new model of addressing library strategy, which has been used in New Zealand and the UK in the business administration arena, where the strategy is looked at from the user perspective, leveraging the PRUB model. [See https://openstrategies.com/what-is-prub-introduction for more on this model.] This should help describe what users want to do. The Horizon Report 2014, Library Edition was listed as containing examples of the types of strategies that libraries should consider applying the PRUB model to. [It can be found at http://cdn.nmc.org/media/2014-nmc-horizon-report-library-EN.pdf] The common characteristics of strategies were outlined and a definition put to the attendees. All strategies should be evaluated using at least the following questions: Is it logical? Will it definitely work? Is it worth it? Applying the PRUB model is not as easy as it looks, but it can help identify orphan projects in advance, as the model can be applied to backwards planning. It was reported that libraries are often in a sandwich position, needing to address the strategy of the host organisation while integrating the user perspective to avoid investing in orphan projects.
It was advised that, as it is hard to get user participation in strategy planning sessions, the alternative is to observe user behaviour: what users actually do, not what they say they do.