A Scholar’s Sting of Education Conferences Stirs a Hornet’s Nest

RESEARCH

MARCH 14, 2016 

Jim Vander Putten suspected that some education conferences accepted any study pitched by someone willing to pay a registration fee. He worried that the gatherings enabled scholars to pad their publishing records while tainting research in the field.

To test his hypothesis, he sent fake research-paper summaries larded with unforgivable methodological errors to the organizers of 15 conferences he believed to have lax standards. All responded by offering to let him present his findings and to publish his papers as part of their proceedings.

But instead of exposing the dissemination of bad research, Mr. Vander Putten now stands accused of research misconduct himself.

Administrators at the University of Arkansas at Little Rock, where he is an associate professor of higher education, have told him he violated policy by undertaking a study of human subjects without the approval of the campus’s institutional review board. They have rejected his defense that an outside, commercial review board signed off on his plans — after Little Rock’s board failed to do so. A research-integrity officer on his campus has called on him to relinquish the data that he gathered. University officials took such actions after conference organizers he had duped threatened to sue.

Mr. Vander Putten’s unusual case highlights inconsistencies in the judgments that review boards make. It also raises questions of how much commercial boards, which account for a growing share of such reviews, can be trusted to safeguard colleges’ interests.

Mr. Vander Putten served six years as head of the university’s own review board, from 2000 to 2006. He argues that he did nothing wrong in getting his study plans approved last April by Solutions IRB, a private company in Little Rock. He describes the company’s review of his proposed study as "very rigorous and thorough," and says the university did not announce a policy requiring in-house IRB approvals until after he had completed his study.

"Whenever someone threatens legal action, the administrators roll over and acquiesce," he says. "That is part of the culture."

Allen Hicks, a spokesman for the Little Rock campus, last week declined to discuss the case. But Scott L. Thomas, dean of Claremont Graduate University’s School of Educational Studies and president of the Association for the Study of Higher Education, said it is a mistake to focus entirely on the question of whether a given study protected human subjects. University review boards, he said, function not just to protect human subjects, but to protect university interests and to verify that the research is insured. "There are liabilities and risks associated, for both the university and the researchers," he said.

Knocking on New Doors

The role that commercial boards play in overseeing academic research is expected to continue growing as a result of proposed federal regulations that would let several universities involved in a joint study hire a single, outside board to evaluate their effort’s human-subjects protections.

Although there is little hard data comparing the quality of university boards with commercial ones, federal investigations into the rigor of both types have found that either can function well or make serious mistakes, said Michelle N. Meyer, director of bioethics policy at the Clarkson-Icahn School of Medicine at Mount Sinai. With commercial IRBs, she said, the chief question is whether a profit motive biases their judgment, either by tempting them to be lenient to attract customers or by prompting them to be exceptionally tough to safeguard their credibility.

The federal government does not require review boards to consider whether research proposals were considered by boards elsewhere. But Stuart Horowitz, president of institutions and institutional services for the WIRB-Copernicus Group, a company that includes several independent review boards, said last week that commercial boards should ask potential clients about any past reviews to discourage "IRB shopping."

Dana Gonzales, president of Solutions IRB, said her company asks customers if their studies are under review elsewhere but did not ask Mr. Vander Putten if another board had previously reviewed his study plans. She said her company had since changed its review application to ask about not just current reviews but also past ones, "just to make sure this doesn’t come up again."

Ms. Meyer said IRB shopping among academic researchers "is not much of an issue because normally you don’t have a choice," with faculty members generally required to use their university’s own board or a commercial board that has contractually agreed to uphold its standards. If researchers think their university’s board needlessly obstructs their work, she said, their chief recourse is to find employment elsewhere.

Red Flags, Fatal Flaws

Mr. Vander Putten is no stranger to debates over institutional review boards. He has researched the errors such boards make and argued that they apply inconsistent standards to social science and often unnecessarily hinder low-risk studies.

He says he became interested in what he calls "vanity" education-research conferences when references to studies presented at them cropped up in the backgrounds of scholars applying for jobs or promotions at his institution. Although the existence of low-rigor conferences in desirable travel destinations is no secret in academe, Mr. Vander Putten takes an especially dim view of such gatherings. He argues that scholars who attend them waste institutional funds and, often, public dollars. And, he says, the conferences’ willingness to publish any study produced by a registered participant taints the quality of research and hinders colleges’ efforts to evaluate faculty productivity.

To try to catch conference organizers in the act of failing to scrutinize research, Mr. Vander Putten devised fake proposals to present papers. The proposals varied in length to meet each conference’s application requirements, but all contained red flags signaling fatal flaws in the research they described. For example, all discussed a quantitative study based on interviews with just five students, and all made blatantly preposterous claims about what they purportedly had shown.

Jeffrey Beall believes such ruses have been highly effective in exposing lax standards in scholarly publishing. Mr. Beall, a scholarly-communications librarian at the University of Colorado at Denver who writes the blog Scholarly Open Access, says such stings have "saved people from submitting good manuscripts to bad publishers." Citing a widely publicized 2013 exposé in Science magazine based on the acceptance of a fake, obviously flawed scientific paper by dozens of open-access journals, he said often "the only way to expose a counterfeit is by using their own methods."

Federal regulations do not provide much guidance, however, on the question of what separates ethical deception from unethical deception in human-subjects research, says Celia B. Fisher, who monitors and advises on such regulatory activity as director of the Center for Ethics Education at Fordham University. The regulations urge institutional review boards to weigh the risks and benefits of such deception, but make clear that "significant damage to someone’s reputation" is "greater than a minimal risk."

Excessive Scrutiny

Mr. Vander Putten argued in his study proposals that he could not possibly test conference organizers’ willingness to accept bad research by being upfront about his scrutiny of them. He offered assurances that he would protect the individual reviewers of his fake conference-presentation proposals by shielding their identities, but said he planned to name the conference organizers. Over about 18 months, he made multiple failed attempts to get his study approved by his own university’s institutional review board, which, he argues, subjected it to excessive scrutiny.

The university’s board finally closed his case in January 2015, saying he had failed to assuage its concerns over issues such as potential harm to reviewers, whom conference organizers or people willing to investigate the review process might be able to identify. When he learned of the board’s decision to close his case, he says, "I just interpreted it as them absolving themselves from any responsibility for this review, so I went somewhere else."

Solutions IRB initially pressed Mr. Vander Putten on matters such as his plans to destroy emails and other records to protect reviewers’ identities, but then accepted his study plans last April, telling him in a letter, "You have provided adequate safeguards for the rights and welfare of the participants in this research study."

The university administrators who later received complaints about his study ordered him to scrap plans to present it at last year’s annual conference of the Association for the Study of Higher Education, which had assumed his study received campus IRB approval.

As for his findings, Mr. Vander Putten stands by his study, but it’s unlikely to be published anywhere soon.

Peter Schmidt writes about affirmative action, academic labor, and issues related to academic freedom. Contact him at peter.schmidt@chronicle.com.
