
What the proposed human subjects rules would mean for social and behavioral researchers

July 28, 2011

The federal government is proposing a massive overhaul of the rules governing human subjects research and IRBs (previously mentioned here). The proposed rule changes were just announced by the Department of Health and Human Services. They are outlined in a document called an “advance notice of proposed rulemaking,” or ANPRM. (See also this overview in the NEJM by Ezekiel Emanuel and Jerry Menikoff.)

Reading the full ANPRM is a slog, in part because the document keeps cross-referencing stuff it hasn’t talked about yet. But if you do human subjects research in the United States, you owe it to yourself to read it over carefully. And if you are so moved, you can go comment on it at Regulations.gov until the public comment period ends on September 26. (You can comment on any aspect of it. The document contains 74 questions on which they are soliciting input, giving the impression that they will be particularly responsive to comments on those points.)

Proposed changes

Based on a first read-through, here is my understanding of proposed changes that will be most consequential for social and behavioral researchers. (Caveats: This isn’t everything, I’ve simplified a lot, and it’s quite possible that I’ve misunderstood some stuff. But hopefully not too much.)

“Informational risks” would no longer be reviewed by IRBs. IRBs would no longer evaluate or regulate so-called “informational risks” (stuff associated with confidentiality etc.). The arguments are that IRBs rarely have expertise to do this right, informational risks have changed with developments like network technology and genetic testing, and IRBs’ time is better spent focusing on physical and psychological risks. Instead of putting informational risks under IRB oversight, all researchers would be governed by a uniform set of data security regulations modeled on HIPAA (see below).

“Exempt” is changed to “excused” — as in, excused from review. This is a big one. Among other things, all educational tests, interviews, surveys, and similar procedures with competent adults would now be called “Excused.” And because informational risk is being separated out, the new rules would drop the qualifications related to identifiability — meaning that even surveys/interviews where you collect identifiable information would be excused. The excused category would also be enlarged to include other minimal-risk activities (such as watching videos, solving puzzles, etc.). For studies in the new excused category there would be no prior review by an administrator or an IRB member. Instead, you would file a very brief form with your human subjects office saying what you are going to do. Then, as soon as the paperwork is filed, you go ahead and start collecting data. No waiting for anybody’s approval. A random sampling of these forms would occasionally be audited to make sure the excused categories are being applied correctly.

Paperwork for expedited studies will be streamlined. Currently, you have to fill out a full protocol for an expedited study. That would be changed – expedited review would involve shorter forms than full review.

Continuing review is eliminated for almost all minimal risk studies (and for certain activities on more-than-minimal studies, like data analysis). No more annual forms saying “can I please run some more t-tests?”

Updated consent procedures. Consent comes up in a few different places in the ANPRM. For Excused studies, “Oral consent without written documentation would continue to be acceptable for many research studies involving educational tests, surveys, focus groups, interviews, and similar procedures.” (“Continue to be acceptable?” I’ve routinely been asked to get written consent for self-report studies.) The ANPRM also proposes a variety of ways to standardize and improve consent forms, for example by restricting how long the forms could be and what could be in them.

Simplification of multi-site studies. Domestic multi-site studies would have one and only one IRB of record. Review by each institution’s IRB would no longer be necessary (or even permitted).

Existing data could be used for new research only with prior consent. I’m not entirely clear on where they are drawing the line on calling something new research on existing data. (Is it a new investigator? a new research question? does it depend on how the research was described in the original consent form?) And this intersects with the HIPAA stuff (see below). But the general idea, as I understand it, is that at the time data is collected, researchers would have to ask subjects if their data could be used for other studies in the future (beyond the present study). Subjects would have to say “yes” for the data to be re-used in future studies. Data that was not originally collected for research purposes would not have this requirement, but only if it is fully de-identified. (But all existing datasets collected prior to the new rules would be grandfathered in.)

Data security rules will be based on the HIPAA Privacy Rule. This is one that I’m still trying to sort through. I don’t know much about HIPAA except that people in biomedicine seem to roll their eyes and sigh when it comes up. It also vaguely stinks of over-extension of biomedical standards into social and behavioral research — the same rules apply regardless of the content of the data. As I understand it, datasets would fall into 3 categories of identifiability. Identifiable datasets are those that contain direct identifiers like names or images of faces. Limited datasets are those from which direct identifiers have been removed but which still contain data that might make it possible, alone or in combination, to re-identify people (e.g., a ZIP code might be combined with other information to figure out who somebody is). De-identified datasets have neither names nor any one of a list of 18 pieces of information that are semi-identifiable. Regulations governing how the data must be protected, who may have access to it, audit trails, etc. would be similar to HIPAA. All of this would be outside of IRB control — it would be required of all investigators regardless of level of review. I know that sounds vague; like I said, I’m still figuring this one out (and frankly, the ANPRM isn’t very specific).
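To make the three-tier scheme concrete, here is a toy sketch of how a dataset might be sorted into those categories based on which fields it contains. The identifier lists below are an illustrative subset I made up for the example, not the full HIPAA enumeration of 18 identifiers, and real classification would of course involve judgment, not just field names.

```python
# Toy sketch of the three-tier identifiability scheme described above.
# The identifier sets are illustrative, not the actual HIPAA lists.

# Fields that directly identify a person.
DIRECT_IDENTIFIERS = {"name", "face_image", "ssn"}

# Fields that could re-identify someone in combination (quasi-identifiers).
SEMI_IDENTIFIERS = {"zip_code", "birth_date", "phone_number"}

def classify_dataset(fields):
    """Return the identifiability tier for a dataset's field names."""
    fields = set(fields)
    if fields & DIRECT_IDENTIFIERS:
        return "identifiable"
    if fields & SEMI_IDENTIFIERS:
        return "limited"
    return "de-identified"
```

So a survey dataset with names attached would land in the most heavily regulated tier, stripping the names but keeping ZIP codes would make it “limited,” and only removing every quasi-identifier would count as de-identified.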

My first reactions

Overall I think this sounds like mostly good news for social and behavioral researchers, if this actually happens. It’s possible that after the public comment period they’ll drop some of these changes or do something completely different.

I’d ideally like to see them recognize that certain research activities are protected speech and therefore should be outside of all federally mandated regulation. At the very least, universities have had to figure out whether to apply the Common Rule to activities like journalism, folklore and oral history research, etc. It would be nice to clear that up. (I’d advocate for a broader interpretation where interviews and surveys are considered protected speech regardless of who’s doing them. “Do you approve of how the President is doing his job?” is the same question whether it’s being asked by a journalist or a political scientist. But I’m not holding my breath for that.)

The HIPAA stuff makes me a little nervous. It appears that they are going to require the same level of security for a subject’s response to “Are you an outgoing person?” as for the results of an STD test. There also does not seem to be any provision for research where you tell subjects up front that you are not offering or guaranteeing confidentiality. For example, it’s pretty common in social/personality psych to videotape people in order to code and analyze their behavior, and later in another study use the videotapes as stimuli to measure other people’s impressions of their behavior. This is done with advance permission (I use a special consent form that asks if we can use videotapes as stimuli in future studies). Under the new rules, a videotape where you can see somebody’s face is considered fully identifiable and subject to the most stringent level of control. Even just giving your own undergraduate RAs access to code the videotapes might require a mountain of security. Showing it to new subjects in a new study might be impossible.

So I do have some concerns, especially about applying a medical model of data security to research that has low or minimal informational risks. But overall, my first reading of the proposed changes sounds like a lot of steps in the right direction.
