Friday, 24 July 2015

IATUL Conference 2015 - Day 1.


OPENING SESSION:

The host library, TIB, was highlighted: with 9 million items, 500 staff, 1,700 workspaces, and opening hours of 95 hours a week, it is the largest science and technology library in the world. The TU9 universities were also introduced: the TU9 libraries are the libraries of the nine leading German Institutes of Technology, many of which have university status. The role and importance of IATUL was outlined: since its formation in 1955, IATUL has provided an international forum for the exchange of information.
Find out more about TIB at http://www.tib.uni-hannover.de/en/

KEYNOTE 1:

Martin Hofmann-Apitius:  “Innovative usage of unstructured information sources: From text- and data-mining to model-driven decision-support”

This session outlined efforts to develop methods for making unstructured scientific data available in a structured format, enabling computer processing of that data. The illustrative example used was genome sequencing to facilitate personalised medicine. As each person differs, the question of how to assess the individual case becomes imperative, especially when faced with a patient who has six weeks to live. Existing tools at the Fraunhofer Institute for such analysis include a text miner, ProMiner, SCAIview, dictionaries, Medline abstracts, NLM PDFs, PMIDs and entities. These are analysed using Named Entity Recognition (NER) and normalization. The ProMiner tool's pre-processing includes NLP and NER. SCAIview allows for semantic search and document retrieval. To develop this further and identify causal relationships, machine learning is needed; KNIME is leveraged for this. UIMA (an open standard for content analysis) is used to identify relationships and allow for the extraction of BEL-like statements (Biological Expression Language). This simplified syntax allows for the application of automated reasoning, and encoded queries or statements can be rendered as graphs. The process is still quite manual and input intensive. See examples of this graphing at http://www.sciencedirect.com/science/article/pii/S1552526015000837
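To make the dictionary-plus-NER step concrete, here is a minimal sketch of dictionary-based entity tagging with normalization, in the spirit of tools like ProMiner. The synonym dictionary, gene names and canonical identifiers below are invented for illustration and are not taken from any real Fraunhofer resource.

```python
# Dictionary-based NER with normalization: each surface form maps to a
# canonical identifier, so different synonyms resolve to the same entity.
SYNONYM_DICT = {
    # surface form (lower-cased) -> canonical identifier (hypothetical)
    "tp53": "GENE:TP53",
    "p53": "GENE:TP53",
    "egfr": "GENE:EGFR",
}

def tag_entities(text):
    """Return (surface form, canonical id, offset) for each dictionary hit."""
    hits = []
    lowered = text.lower()
    for surface, canonical in SYNONYM_DICT.items():
        start = lowered.find(surface)
        while start != -1:
            hits.append((text[start:start + len(surface)], canonical, start))
            start = lowered.find(surface, start + 1)
    return sorted(hits, key=lambda h: h[2])

abstract = "Mutations in p53 and EGFR are frequent in lung cancer."
for surface, canonical, offset in tag_entities(abstract):
    print(f"{surface!r} -> {canonical} at offset {offset}")
```

Real systems add tokenisation, disambiguation and handling of overlapping matches on top of this, but the core idea of normalising many surface forms to one identifier is the same.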

A recent project (sponsored by Philip Morris) worked to link UIMA and SCAIview into a more integrated, semi-automated workflow called BELIEF. During the project it became evident that there is a greater information gain if the full text of an item is mined, as opposed to just the abstract.

This process achieves 70-90% recall and precision in the biomedical field. As more dictionaries are added, NER matching improves, and the quality of the data, and hence of the results, improves with it.
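For readers less familiar with the recall and precision figures quoted, a short sketch of how they are computed; the entity-mention counts are invented for the example.

```python
# Precision: of the entity mentions the pipeline found, how many were correct?
# Recall: of the entity mentions actually present, how many were found?
def precision_recall(true_positives, false_positives, false_negatives):
    precision = true_positives / (true_positives + false_positives)
    recall = true_positives / (true_positives + false_negatives)
    return precision, recall

# e.g. 85 correct mentions found, 10 spurious hits, 15 mentions missed:
p, r = precision_recall(85, 10, 15)
print(f"precision = {p:.2f}, recall = {r:.2f}")  # precision = 0.89, recall = 0.85
```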
BEL records the full provenance of the process and can encode entire datasets, thus significantly speeding up analysis. This is especially important in cases like the example above, where the patient has weeks to live. There needs to be a change in the management of copyright, and in copyright law, to allow this type of decision support to begin saving lives in critical, time-sensitive scenarios, as many of the databases holding important research in this area do not allow full-text mining.


 KEYNOTE 2: JOSÉ COTTA: “FROM OPEN ACCESS TO OPEN SCIENCE: A VISION”

Mr. Cotta outlined changes that have resulted from the appointment of the new European Commission in 2014, relative to his position in the European Commission Directorate General for Communications Networks, Content & Technology [DG Connect]. The directorate is working towards its aim of creating a single digital market which allows for the free flow of data, including research data. Copyright reform is a very important aspect of this new direction, especially in relation to text and data mining. This move dovetails with the EU Commission's policy for open science, plans to create an e-infrastructure, and the Commission's emphasis on open access publishing.
As much of the research undertaken within the EU is done under the auspices of publicly funded institutions, the case grows for research, data, processes and even software generated via such funding structures to be made publicly available, so that the citizens of the funding countries can see the end results of the projects undertaken and, where applicable, benefit from them. The open availability of this information is essential with the move towards evidence-based decision making, but may necessitate a change in publishing, intellectual property, data protection and copyright models. Indeed, Mr. Cotta called attention to issues with peer review, and indicated that this, too, may need an alternative. For example, the Horizon 2020 funding stream emphasises the open access publishing of project results, as evidenced by the OpenAIRE project.



PANEL DISCUSSION: “INFORMATION RESOURCES AND SOCIETAL CHANGE”  

Chair: Peter Löwe

PANEL DISCUSSION PARTICIPANTS:
Martin Hofmann-Apitius, Fraunhofer Institute SCAI
José Cotta, European Commission
Frank Scholze, Karlsruhe Institute of Technology, Germany
Elisha Chiware, Director CPUT Libraries, South Africa

This lively discussion gave insight into how similar the issues around sourcing funding are globally, although the scale of the funding may differ. Mr. Chiware argued that the information users consider most critical to their needs is often the most expensive, leaving the librarian to balance the budget against the provision of relevant information. Mr. Scholze spoke about issues surrounding relating data back to the basis of science, and emerging issues around transparency and re-usability vis-à-vis scientific data. Mr. Cotta highlighted the importance of ensuring that rules relating to copyright and data protection do not 'kill' science, especially in relation to text and data mining. Mr. Hofmann-Apitius spoke of a lack of critical thinking skills in the current generation of researchers, who have become Google-dependent. Mr. Chiware confirmed this as a global trend, while Mr. Scholze put forward that libraries are needed as facilitators for teaching the skills to access scientific data that Mr. Hofmann-Apitius commented upon. The importance of making research data available in its raw formats was discussed, especially in light of young students facing societal change, not just that of the 'academic world'. Additionally, the importance of exposing research published in local periodicals to global users was identified as an issue.
At the end of this session it was interesting to see the illustration of the issues and topics created by the graphical artist who was present in the main hall for the duration of the morning.


LIBRARY STRATEGY AND MANAGEMENT SESSION

ELLEN SAFLEY: “UNCOMFORTABLE – COMMITTING TO CHANGE – FINDING SUCCESS”

Ms. Safley discussed her library's decision to become an early adopter of the Alma library platform, the issues which arose from this, and the integration of Alma with the resource discovery layer. The project was outlined, and approaches to relieving staff stress were described. In addition, the testing, and the problems which arose from going live on the new system on the first day of semester, were detailed.


CAROLIN BECKER:  “PERFORMANCE INDICATORS AT TUM LIBRARY”

Ms. Becker outlined how, in pursuit of ISO 9001, both qualitative and quantitative performance measures were needed which integrated with budgets, staffing structures, policy and strategy. The quality management team were able to identify data already gathered which matched Key Performance Indicators (KPIs) in the German national library statistics scheme (BIX). However, the weakness of the existing data is that it focuses on traditional library tasks, with over 400 KPIs across the 6 dimensions [... of the balanced scorecard]. TUM decided to select 10-20 KPIs for each functional area, and consulted staff in each area as to what these should be. While some overlap arose, the library executive committee decided on the final set.
A Quality Management Officer has been put in place, and the focus on measurement moved to improving services, rather than controlling staff.
Examples of KPIs included: physical and electronic collection size versus levels of usage, and cost per use; the library as a place, down to branch level; technical services workload versus the currency of services; information support requests via phone, e-mail and WhatsApp; usage of e-learning materials; and Facebook reach. Only some KPIs have a target to be met.
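Two of the usage-oriented KPIs above reduce to simple ratios. A minimal sketch, with all figures invented for illustration (TUM's actual costs and usage numbers were not given in the talk):

```python
# Cost per use: annual cost of a resource divided by its annual uses.
def cost_per_use(annual_cost, annual_uses):
    return annual_cost / annual_uses

# Uses per item: how heavily a collection is used relative to its size.
def uses_per_item(annual_uses, items_in_collection):
    return annual_uses / items_in_collection

# A hypothetical e-journal package: 12,000 EUR a year, 4,800 downloads
print(f"cost per use: {cost_per_use(12000, 4800):.2f} EUR")  # 2.50 EUR
# A hypothetical print collection: 50,000 items, 20,000 loans a year
print(f"uses per item: {uses_per_item(20000, 50000):.2f}")   # 0.40
```

Ratios like these are easy to compute per branch or per functional area, which fits the talk's point about tracking the library as a place down to branch level.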


PETRA DÜREN: “SHADOW OF THE LEADER: HOW LIBRARY LEADERS UNDERMINE OR BOLSTER CHANGE EFFORTS”


Prof. Düren stated that, with only 19% of major changes evaluated as successful, managers need to use leadership skills to manage change, both perpetual and deliberate: to have a vision of the outcome, a plan for the change process, and a way to address staff anxieties. A study was undertaken in Germany and the USA of what staff expect from managers implementing changes. While there were some similarities, there were also significant differences, indicating different expectations of, and cultures around, change. For example, to German-speaking respondents, clear project management was more important than participating in the process. Communication was identified as a key element in both countries' responses.


SIMONE FÜHLES-UBACH: “VALIDATING LIBRARY STRATEGIES BY ASSUMING THE USER PERSPECTIVE”


Dr Fühles-Ubach put forward a new model for addressing library strategy, which has been used in New Zealand and the UK in the business administration arena, where strategy is looked at from the user perspective, leveraging the PRUB model. [See https://openstrategies.com/what-is-prub-introduction for more on this model.] This should help describe what users want to do. The Horizon Report 2014, Library Edition was listed as containing examples of the types of strategies that libraries should consider applying the PRUB model to. [It can be found at http://cdn.nmc.org/media/2014-nmc-horizon-report-library-EN.pdf.] The common characteristics of strategies were outlined and a definition put to the attendees. All strategies should be evaluated using at least the following questions: Is it logical? Will it definitely work? Is it worth it? Applying the PRUB model is not as easy as it looks, but it can help identify orphan projects in advance, as you can apply the model to backwards planning. It was reported that libraries are often in a sandwich position, needing to address the strategy of the host organisation while integrating the user perspective to avoid investing in orphan projects.
It was advised that, as it is hard to get user participation in strategy planning sessions, the alternative is to observe user behaviour: what users actually do, not what they say they do.


EWALD BRAHMS: “HILDESHEIM UNIVERSITY LIBRARY – USER-ORIENTED CHANGE MANAGEMENT”

Dr Brahms began by discussing change management as a business management concept, then extended this to library management, highlighting that change is an ongoing process and definitely non-linear in nature. His experience indicates that several change processes can be happening at once, all of which need to be managed, though perhaps using different approaches. Hildesheim experienced significant changes as a result of external decisions regarding funding sources for the University, including the introduction of fees, which had a knock-on effect on the library. In parallel, student numbers rose by 78% and staff numbers by 60% between 2002 and 2014. Dr Brahms pointed out that libraries can make universities more attractive, and leveraging this over 8 to 9 years allowed significant investment in library resources and facilities to be achieved. In recognition of the level of change facing the library, the librarians took part in a change management workshop. The closing remarks for day 1 included the advice to always have a plan B.


