Policy Without Practice

There's no record of the Plan S leader publishing in a peer-reviewed journal

Robert-Jan Smits has been a policy person his entire career, which spans back to 1983, according to his online CV. He's highly educated, has earned a number of honors and awards for his work, and seems generally well-regarded. He is also listed as an author or contributor on more than a half-dozen books, reports, and monographs, mostly in the policy space, many from the 1970s and 1980s.

Recently, I wrote a piece outlining how the days of Plan S differ from those of E-Biomed, despite many surface similarities. After publishing that post, my subconscious must have kept churning. It unexpectedly produced a potential additional difference — while Harold Varmus (the ostensible leader of the E-Biomed initiative) was a serious researcher who had published numerous articles across many journals, could the same be said for Smits?

After some research, the bottom line appears to be that despite advancing a policy that is focused on transforming all journals dramatically and quickly by changing their business models, Smits has no record of ever having published in a peer-reviewed journal.

Smits does not list any publications at all on his CV, even the half-dozen or so book chapters, reports, and monographs found in WorldCat. Publications in general don’t seem to be an important part of his work history or professional standing.

Unless the major journal bibliographic databases are missing something, Smits has not published in a peer-reviewed and edited journal. This would mean he has not experienced a journal peer-review process, dealt firsthand with journal editors and staff, or enjoyed the benefits of being published in a well-respected journal.

Searches of Scopus, Web of Science, Google Scholar, and Google (general) all come up empty when it comes to identifying journal publications with Smits as an author. In the spirit of peer review, I also asked another person with broad skills in bibliographic searches to double-check and expand my work. We both reached the same conclusion — if Smits has published, it must be in journals that are out of the mainstream (i.e., aren't indexed), aren't peer-reviewed, or both.

The one citation identified is an interview with Smits in Science from 2011. It is not a peer-reviewed paper.

Why might this matter?

For a few reasons, I believe.

Outsiders (and some insiders) don't often appreciate what journals do, how publishers work, and why a robust third party is vital to generating information people can nearly always trust. On top of this, and for a variety of reasons, caricatures of publishers, editors, journals, and review processes run rampant, are almost always wrong in some key way, and tend to over-simplify. If the lead policymaker has no direct experience with the way journals operate, and has not interacted on a professional level with the very people his policy could affect, one might reasonably expect a lack of sophistication and empathy in the resulting recommendations. Such a person might also be more susceptible to believing caricatures and stereotypes.

Without an intuitive understanding of how journals work, and without a palpable appreciation of the value publication in a good journal can bring to a work, it's difficult to imagine Smits or anyone else would create a policy thorough and nuanced enough to be effectively implemented.

In such a case, the burden on the process itself increases: an inadequate process in the hands of someone with no documented experience with journals would be doubly worrisome, and unlikely to generate a solid, holistic, and realistic outcome.

A sincere and informed policymaker might conscientiously gather together all the key stakeholders affected by a potential policy, facilitate reasoned discussion, draft plans for review by the stakeholder groups, revise and resubmit as needed, and generate consensus. The result of such a comprehensive and careful process might be that the policy is at worst accepted, and ideally embraced, by those it involved.

The question, then, is whether Smits has run a process careful and comprehensive enough to overcome his inexperience with how journals work.

The short answer appears to be no. Smits' process has come under criticism. He claims he has met with publishers, funders, and others. However, Smits was called out multiple times in Berlin last week for ramming a policy through with little consultation beyond a hand-picked set of large publishers, many of whom said his approach was superficial; for utilizing a token and poorly conceptualized discussion and revision process; for not meeting with many publishers (university presses, society publishers, small independent presses); and for his consistent disdain for reasoned discussion and direct feedback. These criticisms became easier to accept last week, given the amount of shouting, dismissiveness, confusion, and haughtiness on display in Berlin at the APE meeting.

The lack of a journals publication record would also lead me to wonder whether Smits understands scholarly publishing enough to contextualize particular viewpoints (and some questionable practices), such as those favored by Frontiers, an organization he appears to have strong ties to. Their opinions and preferences may carry inordinate weight with someone who doesn’t possess the background to ask sophisticated questions, identify practices outside the mainstream, or challenge generalizations.

With little to ground him, and no apparent first-hand experience in being published in a journal, Smits could also be more susceptible to misinformation, limited perspectives, or strong opinions coming from any number of sources — social media being a particularly strong candidate.

Some of Smits' statements seem to reveal naïveté when it comes to OA, technology, and publishing. For instance, he insists publishers have to experiment with a new business model called Gold OA — without realizing we've been experimenting with it, sometimes quite successfully, for 15-20 years. He insists scientific and scholarly societies have to search for new business models — again without the firsthand experience of speaking with society publishers, which would have told him that nearly every board or management discussion at any society touches on this very topic, and that some major initiatives have come of it, all of which still pale in comparison to publishing when it comes to member appeal, sustainable revenues, and mission match.

So, in addition to a process that is worrisome and inadequate, this very process is being led by someone who appears to have no first-hand experience submitting papers, receiving peer-reviewer comments, making revisions, working with editors, making further revisions, working with copyeditors, seeing language improved and errors removed, proofing final galleys, seeing a publication emerge, and enjoying the benefits of community feedback around a solid and interesting journal article.

It's a shame Smits may never have experienced how good a process most journals run, because familiarity with that process would only have benefited Plan S in its formative days. Drawing on such a process might have improved the initial end product, so that Plan S and its main advocates wouldn't have found themselves aggressively defensive in Berlin last week.