Michael Nielsen, the credit economy, and open science

No credit cards, please…

Michael Nielsen is a good friend as well as being an inspiration to many of us in the Open Science community. I’ve been privileged to watch, and in a small way to contribute to, the development of his arguments over the years, and I found the distillation of these years of effort into the talk he recently gave at TEDxWaterloo entirely successful. Here is a widely accessible and entertaining talk that really pins down the arguments, the history, the successes, and the failures of recent efforts to open up science practice.

Professional scientific credit is the central issue

I’ve been involved in many discussions about why the potential of opening up research practice hasn’t led to wider adoption of these approaches. The answer is simple, and as Michael says very clearly in the opening section of the talk, the problem is that innovative approaches to doing science will not be adopted while those who use them don’t get conventional scientific credit. I therefore have to admit to being somewhat nonplussed by GrrlScientist’s assessment of the talk that “Dr Nielsen has missed — he certainly has not emphasised — the most obvious reason why the Open Science movement will not work: credit.”

For me, the entire talk is about credit. He frames the discussion of why the Qwiki wasn’t a huge success, compared to the Polymath Project, in terms of the production of conventional papers, and he discusses the transition from Galileo’s anagrams to the development of the scientific journal in terms of ensuring priority and credit. Finally, he explicitly asks the non-scientist members of the audience to do something that speaks even more closely to the issue of credit: to ask their scientist friends and family what they are doing to make their results more widely available. Remember, this talk is aimed at a wider audience: the TEDxWaterloo attendees and the larger audience for the video online (nearly 6,000 when I wrote this post). What happens when taxpayers start asking their friends, their family, and their legislative representatives how scientific results are being made available? You’d better believe that this has an effect on the credit economy.

Do we just need the celebrities to back us?

Grrl suggests that the answer to pushing the agenda forward is to enlist Nobelists to drive projects in the same way that Tim Gowers pushed the Polymath Project. While I can see the logic, and there is certainly value in moral support from successful scientists, we already have a lot of this. Sulston, Varmus, Michael and Jonathan Eisen, and indeed Nielsen himself, to name just a few, are already pushing this agenda. But moral support and single projects are not enough. What we need to do is hack the underlying credit economy: provide proper citations for data and software, and exploit the obsession with impact factors.

The key to success, in my view, is a pincer movement. First, showing that more (if not always completely) open approaches can outcompete closed approaches on traditional assessment measures, something demonstrated successfully by Galaxy Zoo, the Alzheimer’s Disease Neuroimaging Initiative, and the Polymath Projects. Second, changing assessment policy and culture itself, both explicitly, by changing the measures by which researchers are ranked, and implicitly, by raising the public expectation that research should be open.

The pendulum is swinging and we’re pushing it just about every which-way we can

I guess what really gets my back up is that Grrl sets off with the statement that “Open Science will never work” but then goes on to put her finger on exactly the point where we can push to make it work. Professional and public credit is absolutely at the centre of the challenge. Michael’s talk is part of a concerted, even quite carefully coordinated, campaign to tackle this issue at a wide range of levels. Michael’s tour with his talk, funded by the Open Society Institute, seeks to raise awareness. My recent focus on research assessment (and a project also funded by OSI) is tackling the same problem from another angle. It is not entirely a coincidence that I’m writing this in a hotel room in Washington DC, and it is not at all accidental that I’m very interested in progress towards widely accepted researcher identifiers. The development of Open Research Computation is a deliberate attempt to build a journal that exploits the nature of journal rankings to make software development more highly valued.

All of these are part of a push to hack, reconfigure, and re-assess the outputs and outcomes that researchers get credit for, and the outputs and outcomes that are valued by tenure committees and grant panels. And from where I stand we’re making enough progress that Grrl’s argument seems a bit tired and outdated. I’m seeing enough examples of people getting credit and reward for being open, and simply doing and enabling better science as a result, that I’m confident the pendulum is swinging. Would I advise a young scientist that being open will lead to certain glory? No, it’s far from certain, but you need to distinguish yourself from the crowd one way or another, and this is one way to do it. It’s still high risk, but show me something in a research career that is low risk and I’ll show you something that isn’t worth doing.

What can you do?

If you believe that a move towards more open research practice is a good thing, then what can you do to make this happen? Well, follow what Michael says: give credit to those who share, and explicitly acknowledge the support and ideas you get from others. Ask researchers how they go about ensuring that their research is widely available and, above all, used. The thing is, in the end changing the credit economy itself isn’t enough; we actually have to change the culture that underlies that economy. This is hard, but it is done by embedding the issues and assumptions in the everyday discourse about research. “How usable are your research outputs, really?” is the question that gets to the heart of the problem. “How easily can people access, re-use, and improve on your research? And how open are you to getting the benefit of other people’s contributions?” are the questions that I hope will become embedded in the assumptions around how we do research. You can make that happen by asking them.


Beyond the Impact Factor: Building a community for more diverse measurement of research


I know I’ve been a bit quiet for a few weeks. Mainly I’ve been away for work and taking a brief holiday, so it is good to be plunging back into things with some good news. I am very happy to report that the Open Society Institute has agreed to fund the proposal that was built up in response to my initial suggestion a month or so ago.

OSI, which many will know as one of the major players in bringing the Open Access movement to its current position, will fund a workshop to identify both potential areas where the measurement and aggregation of research outputs can be improved and the barriers to achieving these improvements. This will be immediately followed by a concentrated development workshop (or hackfest) that will aim to deliver prototype examples showing what is possible. The funding also includes further development effort to take one or two of these prototypes and develop them to proof-of-principle stage, ideally with the aim of deploying them into real working environments where they might be useful.

The workshop structure will be developed by the participants over the six weeks leading up to the date itself. I aim to set that date in the next week or so, but the likelihood is early to mid-March. The workshop will be in southern England, with the venue likewise to be worked out over the next week or so.

There is a lot to pull together here, and I will be aiming to contact everyone who has expressed an interest over the next few weeks to start talking about the details. In the meantime I’d like to thank everyone who has contributed to the effort thus far. In particular I’d like to thank Melissa Hagemann and Janet Haven at OSI, and Gunner from Aspiration, who have been a great help in focusing and optimizing the proposal. Too many people contributed to the proposal itself to name them all (and you can check out the GoogleDoc history if you want to pull apart their precise contributions), but I do want to thank Heather Piwowar and David Shotton in particular for their contributions.

Finally, the success of the proposal, and in particular the community response around it, has made me much more confident that some of the dreams we have for using the web to support research are becoming a reality. The details I will leave for another post, but what I found fascinating is how far the network of people spread who could be contacted, essentially through a single blog post. I’ve contacted a few people directly, but most have become involved through the network of contacts that spread from the original post. The network, and the tools, are effective enough that a community can be built up rapidly around an idea from a much larger and more diffuse collection of people. The challenge of this workshop, and of the wider project, is to see how we can turn that aggregated community into a self-sustaining conversation that produces useful outputs over the longer term.

It’s a complete coincidence that Michael Nielsen posted a piece in the past few hours that forms a great document for framing the discussion. I’ll be aiming to write something in response soon, but in the meantime follow the top link below.
