Digital Transformation (DX) in Education: eBooks

Lori Carlin • November 1, 2023

In a recent Ideas in Action, Delta Think explored the topic of Digital Transformation (DX) in Education. Digital Transformation is a decisive strategy intended to increase the value, reputation, and sustainability of higher education institutions: how can educational institutions be reshaped to foster learner-first approaches while also keeping pace with the growing technological, cultural, and financial demands of the 21st century?


Part of the answer to this question focuses directly on teaching and learning. Digital Transformation promises to reinvent education, using technology to support curriculum delivery, improve pedagogy, expand access, and ultimately improve teaching efficacy and student outcomes. EDUCAUSE reported that the top two drivers of Digital Transformation are 1) improving the student experience and 2) improving faculty teaching and advising. While multiple technologies and systems are being leveraged to address these needs, scholarly content of all shapes and sizes remains foundational in higher education.

The 2022 ACRL Academic Trends and Statistics survey reported that since 2017 the average eBook collection across all library types (community college, college, and university libraries) increased 96.6% from 315,213 to 619,895. Despite cultural change and cost barriers, institutions, via their libraries, have invested to build significant collections of eBooks. Alongside journals, eBooks form the bedrock of the academic research enterprise and are fundamental in support of the education usage occasions that power DX.
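For readers who like to check the math, the growth figure follows directly from the two averages reported in the survey (a quick sketch, not part of the ACRL data itself):

```python
# Quick check of the ACRL-reported growth in the average eBook collection,
# from 315,213 titles in 2017 to 619,895 titles in the 2022 survey.
start, end = 315_213, 619_895
pct_increase = (end - start) / start * 100
print(f"{pct_increase:.2f}%")  # prints 96.66%, consistent with the reported ~96.6%
```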


How Does Book-Based Content Support Digital Transformation?


eBooks and book-based content can play a critical role in support of two key drivers of DX:

  • Student Success – Despite overwhelming evidence (and common sense) that access to content impacts student success, cost remains a formidable barrier. SPARC reported in 2023 that, due to the cost of required course materials, more than two in five students (44%) said they took fewer classes, nearly one in three (32%) reported earning a poor grade, and nearly a quarter (24%) reported dropping a course. Administrators are focusing on improving access to educational resources as part of their DX initiatives, including content that is funded and provided by their library.
  • Sequencing and Scaffolding – Course design has traditionally been the purview of faculty. Core textbooks are often supplanted by granular learning objects, including book chapters, multimedia, and even figures and tables, providing the flexibility to adapt courses and integrate content into the technology tools and platforms essential to efficient, effective teaching and learning. eBooks serve as a primary source of information and knowledge for a course and empower faculty to select and utilize content that aligns with learning objectives and covers essential concepts.


What does DX Look Like in Teaching and Learning?


Dr. Jonathan Wisco is a leader in curriculum design and one of the architects of Boston University School of Medicine’s pre-clinical curriculum, Principles Integrating Science, Clinical Medicine, and Equity (PISCE). During a recent webinar on how technology has impacted education (co-hosted by Delta Think and Silverchair), Dr. Wisco and his medical students shared their current education needs and preferences: self-directed learning is primary … they are motivated information-seekers and will utilize multiple resources, methods, and learning styles to ensure mastery.


“You want the flexibility of a curriculum to be able to learn when you want and how you can, in the most efficient way for yourself.”
– Gabrielle Lakis, M1


The learning paradigm continues to move away from rote memorization to emphasize critical thinking and problem-solving skills, shifting the focus from linear, comprehensive content to granular, specific, and contextual knowledge.


“You cannot memorize everything you need to know, but where can you go to find that information when you need that information … you don’t need to memorize every little detail … and that is an affirmation of where our curriculum is heading.”
– Gabrielle Lakis, M1


Dr. Wisco emphasized the benefits of personalized learning. He and his colleagues work to ensure not only that students have the knowledge they need to pass board exams, but also that they are prepared to make an impact in their field of medicine throughout their professional lives.


“The idea of individualized education is extremely important to us … and one of our highest priorities is to train lifelong learners. We are trying to train them with the skills to be able to decide what content is most important for them at the moment. Instead of telling them what anatomy is, we help them discover and understand why that anatomy is important.”
– Dr. Jonathan Wisco, Associate Professor


Is Your Content DX-Ready?


There are fundamental characteristics that can ensure book-based content remains a vital component of teaching, learning, and student success. How can publishers ensure that their content stays relevant in support of DX? By being fluid, flexible, and findable.


“What is most frustrating to me is when I want to provide content and deploy it to the students in the way they need it, but I go to this resource, and this resource, and this resource. And I can’t use it anymore because it is not helpful for that [learning] pathway.” 
– Dr. Jonathan Wisco, Associate Professor


Let’s Talk


Delta Think helps publishers, professional societies, technology companies, start-ups and others find their place in the rapidly transforming education ecosystem. We collaborate with our clients using proven methodologies to define actionable, customer-driven strategies across content and product development, commercial infrastructure, and operations. Our experience in the education landscape – from undergraduate and graduate-level through to professional development, licensure, and certification – is your opportunity. We’d love to share more about what we are seeing and hearing in the world of education and how your organization can learn to thrive in this fast-moving landscape. Contact us today to discuss an education-based project customized to satisfy your specific organizational objectives and budget.

