Changing circumstances now favour more widespread adoption of Electronic Lab Notebooks. Yet the process requires careful planning, warns Samantha Pearman-Kanza, who shares her experience of the University of Southampton’s implementation trial.
Research data is being generated at an exponential rate, and yet so much of it remains scattered across paper lab notebooks, random spreadsheets with names such as “data_xxx” and assorted digital files with minimal structure.
This fragmentation makes it hard for researchers to find and understand their own work years later, let alone reuse anyone else’s, which creates barriers to collaboration, risks data loss and exacerbates the reproducibility crisis. Funders and journals are increasingly mandating FAIR data (findable, accessible, interoperable and reusable), but we don’t stand a chance of improving our data if things continue in this vein.
However, there is a solution. Electronic Lab Notebooks (ELNs) offer a centralised location for digitally capturing, storing and sharing the scientific record.
Unfortunately, implementing these tools is no mean feat. ELNs aren’t a new concept: the notion of digital tools to capture the scientific record was born in the 1990s. Yet, despite three decades of development and countless attempts across academia and industry to implement them, widespread adoption has not been achieved.
Why? Historically, ELNs faced major barriers: scientists remained attached to their paper lab notebooks and vehemently resisted the change. ELNs were frequently perceived as too costly, with limited recognition of their potential benefits, and slow, clunky hardware combined with the hostile conditions of most laboratories made them all but unusable at the bench.
But in 2024, the stars finally aligned. Advances in Cloud-based ELNs, affordable hardware and growing pressure for FAIR data created the perfect catalyst to embark on an ELN implementation plan. And thus, the School of Chemistry and Chemical Engineering at the University of Southampton launched a formal trial of the Revvity Signals ELN to understand if our school could genuinely benefit from this tool, and whether it would not only meet our needs but actually enhance our workflows and data capture.
Designing the trial
It’s important to note that designing an ELN trial takes time, and it involves far more than just buying licences. Our planning began in July 2024, and the trial launched in January 2025 with 36 participants across 12 research groups who were spread across the five sections of our school. Top-down leadership is crucial, and as such we called for enthusiastic group leaders to volunteer themselves and their researchers, and specifically targeted new starters and first-year PhD students who were not already entrenched in specific ways of working.
Once vendor training had given us a solid understanding of the inner workings of the ELN, we configured it to suit our school and stakeholder needs. Appropriate user roles and group hierarchies were set up, alongside structured templates to improve the quality of data capture and a robust set of metadata fields, required for all experiments and notebooks, to ensure findability and traceability.
Alongside this, we established a data exit strategy and backup system to ensure full record retention should migration ever be required, a vital consideration for any software implementation.
Pre-trial surveys captured current practices, hardware requirements and attitudes towards FAIR data, which helped us establish a baseline for comparison and informed our template design. Hardware was a key factor in this trial as it has frequently been a major barrier to adoption. With modern ELNs now capable of running on low-spec machines, we procured dedicated in-lab hardware for those who needed it, to avoid device contamination and duplication of data entry.
It’s also worth noting that our chosen ELN (Revvity Signals) ticked our boxes, with its cloud-based access, ChemDraw integration and inventory management capabilities. However, with over 85 active ELNs on the market, there is no universal right answer. A successful implementation depends far more on your approach than on the software itself.
Launching and running the trial
After six months of planning, we launched our trial. Let me stress: you can’t just hand over the licences and expect everything to fall into place. Training and support had been formally implemented well before the launch, in order to keep users engaged.
Our first activity was hands-on training with laptops for the users (no death by PowerPoint here!). Leaders learned how to review and sign off experiments, while researchers tried creating experiments using different templates, adding metadata and searching for notebook entries.
To keep momentum, we ran bi-monthly group check-ins and monthly ELN surgeries for troubleshooting and feedback. These sessions helped us resolve issues and refine the system as needed. We also worked closely with Revvity throughout the trial, leveraging their support channels to resolve technical issues and keep the trial on track.
Streamlining health and safety
A major feedback point during the trial concerned health and safety integration. Initially, users were creating their COSHH forms and risk assessments outside of the ELN, storing them in SharePoint and then uploading them to the ELN, a process they found cumbersome. Their wish was our command: by the mid-point of the trial, health and safety documents could be created, approved and linked directly to experiments within the ELN, eliminating duplication and streamlining compliance.
Key findings and feedback
Our feedback was overwhelmingly positive. Before the trial, 75% of users worked solely on paper; by the mid-point, more than 90% wanted to keep using Signals.
Today, around 120 experiments are lodged weekly, and using the ELN has become part of the day-to-day routine for the majority of our users.
Both our researchers and group leaders have taken to it enthusiastically, with researchers citing the key benefits as “seamless integration with ChemDraw, the accessibility of a Cloud-based system and the clear templates and metadata to improve FAIRness”.
Group leaders noted that it provided them with much better oversight of their researchers’ work.
Of course, nothing is perfect or without challenges. We had some bumps along the way, with issues around login retention and mobile support, and the requirement for hardware (60% of our users requested it) will pose a cost burden in the future. Despite this, the trial was declared an overwhelming success and we are now working on a full-scale rollout for the entire department.
Lessons learned: drivers of successful ELN adoption
People often ask me what my “top tip” is for implementing an ELN. I think it’s obvious from this article that there is too much to distil into just one, but if I were to offer a single piece of sage advice, it would be this: however good your software is, adoption doesn’t happen by magic; you need time, money and people to make it happen.
So, to end with some key lessons learned:
1. Engage early: Involve stakeholders from researchers to IT and health and safety teams.
2. Train thoroughly: Run hands-on sessions to engage users and build confidence.
3. Start small: Trial software with enthusiastic groups before scaling up.
4. Review and revise: Use feedback to refine templates and workflows.
5. Plan for hardware: Dedicated in-lab hardware is often key to successful adoption.
6. Think long term: Consider vendor support, data exit strategies and integration with other systems before you implement!
Dr Samantha Pearman-Kanza is a Principal Enterprise Fellow at the University of Southampton, the principal investigator for CaSDaR (the Careers and Skills for Data-driven Research) Network+, and the lead on Process Recording for the Physical Sciences Data Infrastructure.