For the last four issues we’ve been walking our way around the flywheel of research computing teams, looking at the external forces tugging on our teams — today I want to talk about the internal forces acting on us.
But first let’s look at what those external arrows have in common.
I’ve drawn the arrows as pointing in different directions, representing the different stakeholders (our individual researcher clients; research in our community as a whole; and our supporting clients, the institutional and external funders).
The fantastic news, though, is that all of those forces are actually pulling us in the same direction.
All of these constraints are orienting us towards one requirement: that we ensure our teams are fundable. Our teams are fundable when continuously allocating money to our teams is something that named researchers care specifically and vociferously about, and is something our funders feel confident about, because we’re consistently and demonstrably enabling disproportionately high research impact per unit money invested.
So if that’s the big external constraint on our teams, what are the internal constraints?
When I’m talking to leaders of technical research support teams, there’s one topic that always comes up very quickly after funding — and that’s hiring and retaining staff.
The people working on our teams are extremely intelligent, capable, and self-motivated experts, people who could find a job paying twice as much in industry within the space of two or three months if they chose to. But they highly value the world of academic research. In fact, they probably had the opportunity to stay on the research track longer than they did. However, they get bored easily, prefer technical work to applying for funding and writing papers, and very much enjoy being able to go from project to project, learning new domain science as well as technical skills as they do so. That’s how they end up on our teams.
So while the big external forces require us to make sure our team is fundable, our big internal forces require something of us if we’re going to have a staffed team to fund at all. Our team needs work that:

- is technically interesting;
- offers variety, letting people move from project to project;
- gives them the chance to keep learning new domain science and new technical skills; and
- doesn’t demand that they spend their time applying for funding and writing papers.
So here’s where we’re just unbelievably fortunate, much more so than other expertise-based businesses like management consultancies: the overlap between these two parts of the Venn diagram is enormous. Our staff by and large want the same things required of us by the external constraints.
So by and large we want to structure the work our team takes on in a way that allows us to pick an area in which we’re going to become very good, to be fairly nimble about how we offer services and products in that area to researcher clients whose projects will greatly benefit from our contribution, and to have high impact in as efficient a way as possible.
I’ve talked about doing this before, especially in #157. Our work and our team and our fundability benefit from putting together a system where we can leverage existing skills, knowledge gained from projects done for researchers, and documented and communicated success stories into an ongoing practice. Over time, that looks like the slide below, which always generates a lot of discussion when I show it:
That is, we develop the processes by which we can take the existing expertise of our team, systematically grow it and leverage it with standard operating practices and automation, and bundle it into things that we know researchers value, that will lead more often than not to successful research projects and research impact, and that will let us communicate that impact to our sustaining clients in our institutions and funders. Which is to say, now we’re back at the diagram that we started with 10 weeks ago:
Next issue I’ll talk a little bit about how, once we’ve got this down, we can make it easier for ourselves and others to communicate what we do. Two very scary words - positioning and marketing.
And on that dire note, on to the roundup!
Over in The Other Place, Manager, Ph.D., in issue #172 I talked about how individual productivity is, for our kinds of teams especially, not really what we care about.
Also covered in the roundups were articles on:
In review: The successes and shortcomings of Horizon 2020 - Thomas Brent & Goda Naujokaitytė
It’s always worth reading reviews of funding programs, to see what matters to funders and to those who decide on their funding. This article in Science Business summarizes a European Commission evaluation of the Horizon 2020 program. Yes, papers and citations matter, but there were other criteria that Horizon 2020 was measured against:
Funders (institutional or national/supranational) will tell you what they care about (#75), and the more we can help them advance their goals, the more support we can start seeing for ours.
‘Very positive’ national support for research management - Nina Bo Wagner
I find it heartening that after years of seeing nothing, there’s starting to be broad support for a professionalization of management in our professions. Here, Wagner summarizes a panel discussion of some work being done as part of the RM Roadmap effort in Europe. That effort defines Research Managers (RM) as
…including research policy advisers, research managers, financial support staff, data stewards, research infrastructure operators, knowledge transfer officers, business developers, knowledge brokers, innovation managers, legal and research contracts managers/professionals, etc. For simplicity, we use the term research management, but this exercise covers also other terms such as research support, research management and administration, professionals at the interface of science and other terms which are used as the norm in the national landscapes across Europe.
And yes, RM isn’t a great name. In the UK, for instance, they’re looking for a better name and title.
Foundational Competencies and Responsibilities of a Research Software Engineer - Goth et al, arXiv:2311.11457v2
I didn’t report on this when it first came out - there’s a somewhat reorganized v2 of the manuscript now, describing a set of competencies for RSEs at what I think is the right level of abstraction. (That’s no small praise! The hardest part of such an effort, in as diverse a field as any kind of technical research support, is considering the problem at a high enough level to apply widely while remaining grounded enough to still make meaningful distinctions. Some efforts in our line of work have struggled with this.)
This is a really nice framework.
NASA Transform To Open Science (TOPS)
Ah, and I had missed this, too - NASA’s long had a commitment to practicing open science themselves, but with TOPS, they are putting together a curriculum, tools, and resources for the practice of Open Science more broadly. I’ll be keeping an eye on this.
US agency allocates $90m to education research infrastructure - Craig Nicholson, Research Professional News
NSF invests $90M in innovative national scientific cyberinfrastructure for transforming STEM education - NSF News
This is interesting - digital research infrastructure to help researchers (and industry) study and improve things for one of the other missions of our institutions, education.
From the NSF announcement:
SafeInsights aims to serve as a central hub, facilitating research coordination and leveraging data across a range of major digital learning platforms that currently serve tens of millions of U.S. learners across education levels and science, technology, engineering and mathematics. With its controlled and intuitive framework, unique privacy-protecting approach and emphasis on the inclusion of students, educators and researchers from diverse backgrounds, SafeInsights will enable extensive, long-term research on the predictors of effective learning, which are key to academic success and persistence. […] Because progress in science, technology and innovation increasingly relies on advanced research infrastructure — including equipment, cyberinfrastructure, large-scale datasets and skilled personnel — this Mid-scale RI-2 investment [led by OpenStax at Rice University: LJD] will allow researchers to delve into deeper and broader scientific inquiries than ever before
One of the things I like about these mission-driven projects is that they inherently cut across the traditional DRI silos - there are necessarily elements of research computing, research data/research data management, and research software development integrated into this. Here the privacy requirements make the research data management aspects primary, but the product wouldn’t work without research computing and research software expertise.
Engagement Facilitation Guide for Smaller and Emerging RCD Programs - Daphne McCanse
CaRCC Capabilities Model Focused Tools Engagement Guide and Script - John Nicks, Forough Ghahramani, et al
Ah, this is nice - I’m a big fan of the CaRCC Capabilities Model, but it’s an awful lot for a smaller institution to even know where to start. This is a guide to engaging with smaller institutions to help them come up with a plan for mapping out their capabilities. It could be used by someone coming in from the outside, or by the institution itself.
More broadly, it’s a nice guide to mapping out and engaging with key decision makers and stakeholders at an institution for any purpose.
A fascinating look, from quite some time ago, at spreadsheet errors in the context of broader human error research: Thinking Is Bad.
The case for naming your handful of utility scripts starting with a comma.
Another introduction to differentiable programming: Alice’s Adventures in a differentiable wonderland.
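If you haven’t run into differentiable programming before, the core idea is that you write ordinary numerical code and the framework computes exact derivatives of it for you. Here’s a minimal sketch in JAX (my own toy example, not taken from the book) just to make that concrete:

```python
# A toy example of differentiable programming with JAX: write an ordinary
# numerical function, and ask the framework for its derivative.
import jax
import jax.numpy as jnp

def loss(w):
    # Any pipeline of ordinary array operations works here
    x = jnp.array([1.0, 2.0, 3.0])
    return jnp.sum((jnp.sin(w * x) - x) ** 2)

grad_loss = jax.grad(loss)  # exact gradient via automatic differentiation

print(loss(0.5))       # value of the function at w = 0.5
print(grad_loss(0.5))  # its derivative there, no finite differences needed
```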
MS-DOS 4.0 has been open-sourced.
And that’s it for another week. If any of the above was interesting or helpful, feel free to share it wherever you think it’d be useful! And let me know what you thought, or if you have anything you’d like to share about the newsletter or stewarding and leading our teams. Just email me, or reply to this newsletter if you get it in your inbox.
Have a great weekend, and good luck in the coming week with your research computing team,
Jonathan
About This Newsletter
Research computing - the intertwined streams of software development, systems, data management and analysis - is much more than technology. It’s teams, it’s communities, it’s product management - it’s people. It’s also one of the most important ways we can be supporting science, scholarship, and R&D today.
So research computing teams are too important to research to be managed poorly. But no one teaches us how to be effective managers and leaders in academia. We have an advantage, though - working in research collaborations has taught us the advanced management skills, but not the basics.
This newsletter focusses on providing new and experienced research computing and data managers the tools they need to be good managers without the stress, and to help their teams achieve great results and grow their careers.
This week’s new-listing highlights are below in the email edition; the full listing of 135 jobs is, as ever, available on the job board.