Abstract
This paper summarizes the 18th Academic Publishing in Europe (APE) Conference: Berlin Re-Visited: Building Technological Support for Scholarship and Scientific Publishing, held as a hybrid event on 10 and 11 January 2023 and organized by the Berlin Institute of Scholarly Publishing (BISP), a not-for-profit organization dedicated to bringing publishers, researchers, funders, and policymakers together. This year’s conference theme, “Out with the old, in with the new!”, was explored in keynote speeches, the APE lecture, and several panel discussions. Current challenges within scholarly publishing, e.g., with research integrity, trust, and research assessment, have much to do with the old ways of doing things. To move science forward, new technologies and innovations, like the decentralized web, FAIR digital objects, and blockchain technology, are needed to shape new paradigms. Many sessions stressed that moving science forward requires not just new technologies but also human collaboration and partnerships. The changing role of the journal and the importance of recognizing more diverse research outputs, beyond the journal article, was a prominent topic. Addressing this requires not only research assessment reforms and improved collaboration amongst different stakeholder groups, but also new publishing systems, better metadata, and open infrastructures. A session presenting different start-ups showcased how Artificial Intelligence, Natural Language Processing, and software technology can be used to tackle problems such as finding relevant funders and peer reviewers and detecting image plagiarism. It was also discussed what publishers can do to help achieve the Sustainable Development Goals: collaboration, data transparency, and sharing best practices amongst researchers, funders, and policymakers are key.
Another important topic during this year’s APE was how publishers can support Early Career Researchers (ECRs): establishing new workflows and infrastructures that enable publication of a wider range of research outputs, and broader recognition of these outputs, will incentivize ECRs. This year’s APE concluded with the APE Award Ceremony. The winner, Vsevolod Solovyov from Prophy, has used AI to enhance the current peer review system and thereby made an important contribution to improving the scholarly communication system.
The 18th Academic Publishing in Europe (APE) Conference, and first hybrid APE, was opened by
Prof. Dirnagl focused on a few of the challenges, e.g., the current academic reward system. Due to the exponential growth of science inputs and outputs, metrics have become very important, and content has been ‘downgraded’. He stressed that academia is responsible for the creation of a ‘reputation economy’. The ‘wave of academic publishing’ consists of many different outputs, such as preprints, data, blog posts, etcetera. And there is even an additional mass of ‘dark matter’: unpublished outputs such as negative results and unreported clinical trial results. He added that if all these outputs were to be published, the wave would be even bigger.
He continued with another problem: peer review. He mentioned several reasons why peer review is overrated as a quality-control mechanism: methods and approaches are too complex, and the process is time-consuming, not scalable, and opaque, etcetera. Some of these problems can be fixed by technology, but others cannot. For Prof. Dirnagl, the root problem is the academic reward and incentive system. He said the system needs to go back to content, with fewer metrics, more narratives, and rewards for Open Science and Team Science. This would mean fewer papers, more time for research, and more quality, diversity, and innovation. He said that change is happening: there are initiatives like the Coalition for Advancing Research Assessment [1], and many other developments such as registered reports, Review Commons, open peer review, open lab notebooks, etcetera.
Prof. Dirnagl finished with what he saw as the future of scholarly communication: Heavily pre-print based, fewer classical journals, disentanglement of the publishing and review process, modular, automated, less commercial, and integrated perspectives and approaches of non-WEIRD (Western, Educated, Industrialized, Rich and Democratic) countries. And additionally: look at what ECRs expect of the scholarly publishing system.
Dr. Sutton opened the session by asking the panelists what they thought to be opportunities for the publishing industry. A few panelists stressed how the industry should make better use of technology and that collaboration with researchers, funders, and policymakers should be enhanced, e.g., when tackling climate issues. Dr. Pattinson saw the rise of preprints as an opportunity, and Dr. Herrmann added that the key opportunity for publishers was to support researchers in the transition to Open Science. Burley mentioned the changing role of the journal: will the next generation still be very much journal-focused? How can we make it more intuitive and easier for researchers?
The discussion continued on innovation in the publishing industry. Verses said that for the so-called TikTok generation, the industry is never innovative enough. The primary artifact in scholarly communication is still a flat PDF: ‘We should push towards a reproducible knowledge stack’. Jegadeesan said that Open Science will enable a new world of connecting different stakeholders. Dr. Herrmann said to be mindful of the complex ecosystem: the PDF can mean different things to, e.g., an author, a reviewer, a reader. Dr. Pattinson added that innovation on tools for, e.g., code, data, and dynamic figures should continue, even though this can be challenging.
The panelists agreed that a lot is already being done when it comes to driving innovation, e.g., fully digital workflows, but not all innovations are visible to researchers. The panelists saw an additional need to exploit the full potential of new technologies, e.g., AI. There is also innovation potential in making the literature more trustworthy. From a biomedical perspective, preprints are the future. Dr. Pattinson added there are also concerns and barriers; e.g., authors need to get used to public criticism. Verses said that one of the results of the Elsevier Confidence in Research report [2] was that authors are anxious about misinformation getting out there. Dr. Herrmann added that the shift to Open Access and Open Science has an impact not only on authors but on readers as well. There is additional innovation potential in getting information to readers. Jegadeesan said that because of growing content, more needs to be done to help researchers keep track. The systems cause a lot of extra work; we need to make publishing much easier. Publishers should help researchers to communicate their research, and this should be done in an equitable way.
Additional challenges mentioned by the panelists were the article-based economy, duplication of efforts, and a lack of standards, e.g., on metadata. Efficiency with regard to peer review and research integrity should be increased. Tracking and measuring inequities in scholarly publishing should also be tackled. Sutton warned that OA business models can create new inequities, not just North–South but also between well-funded and less-funded research.
Questions from the audience focused on quality control, trust in science, and publishers’ responsibility. Dr. Herrmann thought publishers could be more transparent, e.g., regarding image and data checks. Pattinson said that everyone should get used to the idea that science can be messy and wrong. Publishers can do more by educating the audience, creating lay summaries, and making better use of AI. Additionally, quality control beyond the PDF format should be improved through the development of research infrastructures that allow for combined research outputs. This cannot be done in silos; e.g., the NFDI [3] is being co-created with multiple stakeholders. It is also about incentivizing researchers and involving different communities, such as ECRs and different disciplines. Within the academic system there are fundamental challenges that cut across all disciplines, and innovations to address these challenges will benefit science as a whole.
He explained how at ESMT they looked at how their current expertise, internal processes, and interaction with different stakeholders affected transformation, and how leadership needed to change to lead through the digital transformation. He said that for the first time in history, the majority of MBA students are enrolled in online programs. Dissemination of knowledge and interaction have changed completely, and experimenting, e.g., through social innovation labs and enriched learning experiences, is necessary to optimize interaction. He added that digital fatigue amongst the younger generation should also be taken into account, as well as the fact that digital learning environments will never completely replace in-person classrooms.
He said that for sustainable transformation, an interdisciplinary approach is needed. Collaboration, and wide dissemination of insights are key to sustainability, and it is crucial that scientific insights reach society. He added that the life cycle of invention to innovation needs to be optimized, and research and technology need to be better integrated. This can be done through empowering individuals. Even though there are challenges, Prof. Rocholl was convinced science can overcome these. He concluded his presentation on an optimistic note: the fall of the Berlin wall has shown that disruption offers opportunities for fundamental change.
The Session:
The first speaker,
She continued on the reasons behind the continued misuse of bibliometrics. One example was how the Journal Impact Factor (JIF), a journal-level metric, is used as a proxy for individual researcher and article evaluation. The JIF is only appropriate when used to measure journal performance; it does not include information about why an individual article is cited. Another example: the H-index was originally designed for theoretical physics but is now being applied in other research fields. In addition, it is based on total citations, which build over time, so younger researchers are disadvantaged. She concluded that the future of responsible research assessment should be based on a balance between technology, metrics, and expert quality reviews. With increased use of data profiles and the introduction of measures for real-world impact that influence policy and practice, indicators of trust can be created.
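The age bias in the H-index can be made concrete: it is the largest h such that a researcher has at least h papers with at least h citations each, so it can only grow as citations accumulate. A minimal sketch (the citation counts below are invented for illustration):

```python
def h_index(citations):
    """Return the largest h such that the researcher has at least
    h papers with at least h citations each."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# A senior researcher whose citations have had years to accumulate
senior = [45, 30, 22, 18, 12, 9, 7, 4, 2, 1]
# An early-career researcher whose papers are simply younger
junior = [8, 6, 5, 3, 2]

print(h_index(senior))  # 7
print(h_index(junior))  # 3
```

Even if the junior papers are of equal quality, the index rewards sheer accumulation, which is the disadvantage for younger researchers noted above.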
The next speaker
In order to achieve a change in research culture, Dr. Kramer stressed the importance of open data and open infrastructures, to prevent vendor lock-in and control, and to prevent selective coverage and exclusion. An open infrastructure enables full transparency, allows institutional control over data and the stories the institutes want to tell. This will allow a more equal global research ecosystem. Dr. Kramer concluded that vendor lock-in through commercial providers can be prevented by data that is open at the source; there are several initiatives: Initiative for Open Citations [9], Initiative for Open Abstracts [10], OpenAlex [11].
The third speaker
Prof. Koellinger explained how in the current internet there are problems with, e.g., link rot, content drift, and version control. He said that identifiers that are currently used, like DOIs, do help but are often neither unique nor persistent. He introduced cryptographic hashes: functions that convert strings of arbitrary length into strings of fixed length. If you change anything in the input (even a single word), you get a different hash. Hashes are unique and immune to link rot and content drift. With this, a publication protocol can be constructed in which rich research objects consist of hashes indexed on a blockchain. When you add or update a small component in, for example, a manuscript, you get a new hash, and as a result a new research object is created; the old version of the manuscript is not overwritten, and both versions remain accessible. The result looks like a public registry of research objects on a blockchain, stored on different servers.
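The hashing behavior Prof. Koellinger described can be demonstrated in a few lines (the example strings are invented, and SHA-256 stands in for whichever hash function such a protocol would actually use):

```python
import hashlib

def content_hash(text: str) -> str:
    """Fixed-length SHA-256 digest of arbitrary-length input."""
    return hashlib.sha256(text.encode("utf-8")).hexdigest()

v1 = content_hash("The treatment reduced infarct volume significantly.")
v2 = content_hash("The treatment reduced infarct volume slightly.")

print(len(v1), len(v2))  # both digests are 64 hex characters, regardless of input length
print(v1 == v2)          # False: one changed word yields a completely different hash
```

Because the identifier is derived from the content itself, any edit produces a new identifier; the old version keeps its own hash and stays addressable, which is exactly the versioning property the proposed registry relies on.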
The decentralized web will also enable interoperability and granular, version-controlled citations. Citations as function calls can be implemented, which are the building blocks for FAIR interoperable research objects. Prof. Koellinger added that the decentralized design prevents vendor lock-in and enables gateways for different stakeholder groups, such as publishers, universities and funders. He ended his presentation with a question for the audience: Why is this not going to work?
In the subsequent discussion with the audience, this question was not answered but instead, the feasibility of reproducing code in a decentralized web was discussed: how would it work in a research system that is large and vast? Prof. Koellinger said the blockchain architecture of the decentralized web has the advantage of track records: when a compute job would take months, it would only have to be done once.
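The track-record argument — a months-long compute job need only run once because its result is bound to its exact inputs — can be sketched as a content-addressed cache (a minimal illustration; the function names and record structure are assumptions, not the actual protocol):

```python
import hashlib
import json

# Registry of verified results, keyed by a hash of the job's inputs
# (in the decentralized design this record would live on-chain).
_results: dict[str, float] = {}

def job_key(code: str, data: str) -> str:
    """Identify a compute job by a hash of its code and input data."""
    return hashlib.sha256(json.dumps([code, data]).encode()).hexdigest()

def run_once(code: str, data: str, compute) -> float:
    """Run the expensive computation only if no result is already
    on record for this exact code + data combination."""
    key = job_key(code, data)
    if key not in _results:
        _results[key] = compute()
    return _results[key]

calls = 0
def expensive() -> float:
    global calls
    calls += 1
    return 42.0

run_once("analysis-v1", "dataset-A", expensive)
run_once("analysis-v1", "dataset-A", expensive)  # served from the record
print(calls)  # 1: the computation ran only once
```

Any change to the code or data produces a different key, so a modified analysis is a new job while the old result remains on record.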
Other questions focused on research assessment and peer review, for example how better use of AI could lead to more automated peer review. Right now there is too much reliance on journal articles and citation performance; other research outputs should be rewarded. Data publication, reproducibility, and pre-registered reports should be incentivized and rewarded. It was also discussed how to develop indicators for impact and quality. Dr. Kramer said that impact can mean different things and is very context-dependent; perhaps more could be done with the help of technology, but this is not yet clear.
The
Meadows started off by explaining the question mark in the title:
Meadows continued the session by asking the panelists: do we have a problem with Research Integrity? Dr. Pattinson said the pressure to publish is getting worse. Dr. Alam believed the problems are broad and that there are varying issues in different disciplines. She thought it to be the responsibility of publishers to raise awareness through education and training. Mejias agreed that RI is an issue and the COVID pandemic showed how fragile trust in science is. She said it is important for the general public to know that research is not flawless and that there are mechanisms in place to address these issues. Burley added there is a role for publishers to improve trust in science, e.g., with AI tools. Graf mentioned the STM integrity hub [14] and added that quality control is the publisher’s core business.
The discussion continued on improving trust. Dr. Alam said not enough has been done to crosslink different research outputs: “We live in a digital world but don’t really use it”. Meadows said that PIDs could be used for linking outputs. Mejias agreed that transparency needs to be improved and that this can be done with PIDs and better metadata. For this to work and add real value, global adoption of PIDs is needed and inequities should be addressed. Burley agreed that a multistakeholder solution is needed and that a way to discourage bad actors should be found. Graf added that the transition to OA has taken up a lot of effort from all involved. It also generated goodwill, which should be used to address, e.g., paper mills.
One question from the audience focused on the opportunity publishers have to get a handle on research integrity by inviting reviewers and other groups to reproduce results. Would this be feasible? The panelists agreed that this would be an unbelievable amount of work, and different for the various disciplines. In theory, publishers could enable reproducibility for some disciplines by making sure methods and code are described and by validating completeness. Another question focused on tracking the quality of authors, e.g., sanctions for authors who have been caught submitting fake papers. Mejias said that PIDs and metadata help add rigor to the scholarly record, e.g., with ORCID iDs, for instance by selecting reviewers with at least five records in their ORCID iD. Burley said that awareness amongst publishers has to be raised: perhaps publishers feel a sense of shame when issuing a retraction? The panelists agreed that a more nuanced set of mechanisms is needed to move beyond the binary system of retraction vs. non-retraction and paper mills vs. good papers.
The session concluded with the panelists’ views on how technology could help with research integrity. Graf stressed the importance of identity validation. Meadows added this could be ORCID. Burley said that authors can also make mistakes innocently; authentication is important, and tools to identify issues upfront should be used. Mejias stressed that associated metadata and open infrastructures can provide context on how research was produced and better enable reproducibility. Dr. Alam added that linking outputs through identifiers will help to stop bad actors. She said that we still need humans to make judgments, but tools definitely help with scalability. Graf thought that we don’t have the balance between human efforts and technology right yet: we are still relying too much on peer review. Alam said that doing integrity checks could be a day job for an academic editor; more division of labor is needed, and publishers cannot be held exclusively responsible. Meadows concluded that to address these challenges, increased collaboration between research institutions and publishers is needed.
The
The first example was the Babel Fish from the film The Hitchhiker’s Guide to the Galaxy: automated translation was considered impossible, but now, with automatic DeepL translations, it is possible thanks to the knowledge of mathematicians and computational scientists. The second example was protein folding; this was considered impossible too, but now, using mathematics and AI, gigantic leaps in protein-folding research have been made. Research still continues, and Prof. Ziegler thought it a shame that not all of it is published OA. The third example was heart CT scans. It was considered impossible to visualize the coronary arteries noninvasively. But now, because of mathematics, heart CT scans are possible and work even better than invasive scans.
The three examples showed that with AI a lot of things have become possible that were thought impossible before. He added, however, that we should be mindful of AI. He said that Joseph Weizenbaum already warned about possibly dangerous implications of AI. And recently the Tagesspiegel [15], a German newspaper, reported that robots are being used in the war in Ukraine and can kill on their own, without human intervention. Prof. Ziegler concluded: do we really want that?
The second Day of the 18th APE started with the session:
After the presentations, the audience was asked to vote for the best Dotcom when looking at the solutions from an investor perspective. Prophy was voted as the best Dotcom.
The Panel:
Kraft spoke about the transformation of European sustainability reporting requirements. The main target is to be climate neutral by 2050, with a strong focus on biodiversity and the establishment of a circular economy. With the introduction of the EU’s Corporate Sustainability Reporting Directive (CSRD) [23], the EU wants to encourage businesses to change and transform. Companies subject to the CSRD will have to report according to the European Sustainability Reporting Standards (ESRS). With regard to climate change, each company must define an individual pathway for how it can contribute. Kraft stressed that companies can define relevant topics and that the directive also allows them to focus on opportunities.
Dr. Meskine introduced Wiley’s climate goals: to be carbon net zero by 2040, in line with science-based targets. He explained Wiley’s first step: look at where the emissions are. It turned out that most of Wiley’s emissions were in scope 3: vendors, purchased goods and services, as well as capital goods. Wiley put in place a variety of policies, developed tools to start collecting information from vendors, and established a multi-year pathway with specific targets.
Dr. Sänger explained how associations can support the publishing sector in reducing CO2 emissions. She said that umbrella organizations like, e.g., IPA, STM, and EIBF can support collaboration, raise awareness, and share best practices. There are several initiatives which have the SDGs in focus, e.g., the SDG Book Club [24] and the SDG Publishers Compact [25]. Other initiatives collaborate on other topics, such as ecological, economic, and social sustainability, and share best practices on, e.g., returns policies (the RISE Bookselling project [26]). The German Börsenverein sustainability working group [27] set up task forces on production & logistics, reporting requirements, and sustainable operations, and collaborates to secure funding.
Sherer said that the UK Publishers Association has also set up an SDG pact, and she urged publishers to use the platforms they have to raise awareness, for example the IPA Publisher 2030 Accelerator [28]. This platform goes beyond the academic publishing sector, and topics like print on demand, accounting and financial systems, and book returns are discussed.
The discussion continued with the question of whether the aim should be “carbon zero” rather than “carbon neutral”. Kraft was of the opinion that offsetting is still better than doing nothing; it is a good starting point, and it is easier to start when you can present data. Sherer added that collecting data for big companies like SN can be very complex, as there is a lot of data, e.g., on traveling and events, but they aim to be transparent about data collection. Meskine agreed that transparency is important to show that publishers really care.
Sherer asked the panelists if they thought publishers should collaborate more on tangible examples such as reducing print. Kraft said that when it really comes down to making an impact, it is very complicated; no one can do this alone, and sometimes the most direct impact might not be the best climate impact in the long run. Sänger agreed that the industry needs to be objective: it is not just print; digital emissions count as well. Meskine said that Wiley has been moving away from print for a while now, but print will not go away completely. He added that digital carbon emission data is also part of Wiley’s calculations.
It was concluded that capturing data is very complicated, that sharing experiences is important, and that engaging colleagues and working collaboratively on innovations that account for sustainability are key.
The session:
SeamlessAccess [29] is an initiative that works on top of a federated authentication system and makes it easier for researchers to use; it has three added-value elements. The first is recognition: the button is a standard visual cue that researchers recognize and trust. The second is a service that helps researchers find their institutions. The third is a persistence layer that remembers the user’s chosen institution and works across different websites. Data shows that SeamlessAccess boosts usage of federated authentication, and adoption is growing.
Dr. Koers continued on attributes: they contain information about a user and are passed on to a publisher or a service provider after authentication. He explained how Project GetFTR [30] is an example of how knowledge of attributes can help smooth the user journey. GetFTR offers infrastructure and flexible integration options in different contexts and use cases; it offers, e.g., access to the best version of the article and maximizes the number of accessible articles.
At the end of his presentation Dr. Koers highlighted two important changes: browser changes and decentralized identities. There are ongoing developments at big-tech browser vendors like Google and Apple; e.g., Apple has started to hide IP addresses. Publishers should inform themselves about these changes, e.g., through the Federated Identity Community Group [31], and discuss them with library customers. Decentralized identities have the potential to be the new paradigm; they are the next big thing in how we think about digital identity. With a decentralized identity, the user is really at the heart and in control, whereas in federated authentication, the user is a conduit.
Dr. Schultes explained the FAIR digital object in more detail. There is a digital resource (which could be anything: an abstract, a paper, a dataset, an author, etc.), to which a GUPRI (Globally Unique Persistent Resolvable Identifier), metadata about the resource, the metadata schema, and a link to the location are added. A computer can then find out what it is, what can be done with it, and what one is allowed to do with it. Dr. Schultes continued on nanopublications, which make large datasets more manageable. A nanopublication is an assertion with provenance and publication info, which enables a type of blockchain representation. You can make assertions about scientific content, administrative matters, etc. Dr. Schultes’ research group saw how FAIR digital objects and nanopublications are moving on the same track, which resulted in The Comparative Anatomy of Nanopublications and FAIR Digital Objects [34].
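The nanopublication shape described above — an assertion bundled with its provenance and publication info under its own identifier — can be sketched roughly as follows. This is a simplified illustration only: real nanopublications are expressed as RDF named graphs, and the field names and hash-based identifier here are invented for the sketch.

```python
import hashlib
import json

def make_nanopub(assertion: dict, provenance: dict, pub_info: dict) -> dict:
    """Bundle one assertion with its provenance and publication info,
    identified by a hash of its content (illustrative, not the RDF spec)."""
    body = {"assertion": assertion, "provenance": provenance, "pubinfo": pub_info}
    digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    return {"id": f"np:{digest[:16]}", **body}

np1 = make_nanopub(
    assertion={"subject": "GeneX", "predicate": "is-associated-with", "object": "DiseaseY"},
    provenance={"derived_from": "dataset-123", "method": "GWAS"},
    pub_info={"author_orcid": "0000-0002-1825-0097", "date": "2023-01-10"},
)
print(np1["id"])  # identifier derived from the nanopublication's content
```

Because the identifier is a function of the content, identical assertions resolve to the same record and any change yields a new one, which hints at the blockchain-style representation mentioned above.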
How do nanopublications and FDOs relate to the publishing industry? The GO FAIR Foundation suggested integrating FAIR into academic publishing processes. With IOS Press they started developing FAIR Connect [35]: an OA publishing platform for the development and dissemination of good practices for FAIR Data Stewardship. It also offers search engine services for FDOs and the journal FAIR Connect [36]. With FAIR Connect they try to FAIRify the academic publishing process, in which they see a paradigm shift from narrative articles that serve human users, with units of science and siloed publications, to FDOs that serve human and machine users, enable fine granularity of content and in which the publication is mediated by an open, decentralized infrastructure. Dr. Schultes concluded: The value will be found less in data storage and more in data services that are compliant to emerging standards enabling the principles of FAIR.
She continued on the current status of the DEAL agreements: Wiley is negotiating for DEAL II, Springer Nature as well, and talks with Elsevier have restarted. DEAL was initiated by an alliance of German scientific organizations to support the OA transformation; it is steered by the German scientific community and aims to replace excessive cost increases with a comprehensible pricing system. She added that DEAL has been a Publish-and-Read success story for institutions in Germany. She said the German Wissenschaftsrat stated in January 2022 that Gold OA should be the standard for all publications, and its Recommendations on the Transformation of Academic Publishing: Towards Open Access [37] were published.
With regard to costs, within DEAL there has been attention to a fair balance of interests, e.g., between small and big publishers, and to specific support for research-strong and publication-strong organizations. The DEAL group checks the publication figures, forecasts publication increases, and establishes simple conditions for participation, also for smaller organizations. They also look at what other countries do; e.g., JISC in the UK is working comparably. Recommendations from the German Science and Humanities Council (Wissenschaftsrat) include, among others, supporting the development of transparent information budgets in libraries and including publication costs as part of the research budget.
Dr. Sens said that the lessons learned include, among others: financial solidarity has its limits, libraries need to get the university presidents on board, there is no one-size-fits-all solution, and SLAs with publishers are necessary. She concluded by looking to the future: what comes next? What will the Open Science developments, author-based publishing, the growth rate in articles, and new business models mean? Dr. Sens concluded: we need good deals with publishers, attractive cost-distribution models for participating institutions, and trust in the data; there is no way back. She ended her presentation with a question: what would a DEAL contract look like if there were no past?
In the panel session:
The discussion continued on how to involve ECRs more in the peer review process. Salholz-Hillel said that for ECRs there is a lot of time pressure and no recognition. She added that there are mixed views when it comes to open peer review. Weissgerber said there is a need to increase diversity. For ECRs, peer review is seen as an entry into publishing, but more innovation is needed to include them, e.g., ECR advisory boards, preprint review programs, and ECRs as innovation editors. Pulverer said it should be clear for ECRs what they get out of it. Havemann added that train-the-trainer and capacity-building programs are needed, as well as monetary incentives, as salaries in the Global South are low. Additionally, language editing in multiple languages is needed.
When asked for ideas to improve scholarly publishing, the panelists agreed that the scholarly infrastructure should increase focus on other outputs, beyond journal articles. There should be more incentives for that, and better systems are needed, e.g. better linking opportunities for research data. Hartgerink said to look at the broad scope of other research outputs. This changes quickly and often publishing systems look backwards. Weissgerber said that getting new and better systems to allow ECRs to do the science of the future is really important. Pulverer added that much research is not shared because it does not fit within the system. Wagner felt that publishers are working on this, as they collaborate with societies to hear the needs of scientists. However, he heard editors say that they don’t want to add DOIs to their datasets because they fear citations to the journal article will decrease. So from a publisher’s point of view the uptake of new technology and publishing systems can be very slow amongst the research community. Pulverer agreed but felt that publishers still have a responsibility to provide mechanisms to publish other outputs and to valorize these outputs.
Weissgerber asked Salholz-Hillel if there was one thing that was different for ECRs compared to more senior researchers. Salholz-Hillel felt that the pressure to publish is stronger for ECRs and that publishers should make it safe for ECRs to take risks, as, e.g., eLife has done with preprint publishing. In conclusion, Weissgerber encouraged publishers to engage in continuous discussions with ECRs, and she pointed to the publication in PLOS Biology: Recommendations for empowering early career researchers to improve research culture and practice [38].
Before the 18th APE was closed by Marta Dossi (BISP), the APE Award for Innovation in Scholarly Communication, supported by Digital Science, was handed out by Daniel Hook. There were two honorable mentions: Dr. Joshua Nicholson for his work on extending citations to become more multidimensional and valuable. And Dr. Vivienne Bachelet for her work to create a fully multilingual end-to-end platform for scholarly publishing. Vsevolod Solovyov (Prophy) won the APE Award for his work to make reviewers more discoverable and thus helping to ensure that research is correctly and efficiently reviewed.
Please note: APE 2024 will be held on 9–10 January 2024.
