What's all the Fuss About Scientific Integrity?

Transcript:


August 11, 2014

Marka Pattison: Good afternoon. I'm Marka Pattison and I'd like to welcome you to the Office of Policy Analysis seminar.

We usually delve into issues of natural and cultural resources. Today we're going to delve into the underlying science: what gives it integrity and what compromises that integrity.

Our speaker is Alan Thornhill, the director of the Office of Science Quality and Integrity and the Bureau Scientific Integrity Officer at the USGS. Alan...

Alan Thornhill: Good day everybody and thank you for joining me. I'm going to talk a little bit today about the Department's scientific integrity policy and about science integrity in general.

First of all, when I tell people my job title, they wonder exactly what science integrity is, and why we are suddenly so concerned about it. It shows up in the paper an awful lot more than it used to. We'll touch on some of the reasons why we're concerned about it in a moment.

But one of the realities is that there really is intense scrutiny on science and the scientific process right now from a variety of different sides. You've probably seen examples of science being attacked where scientific results are at odds with ideology.

The science really has to be completely above reproach to stand up to that scrutiny. That's one of the reasons that we are really interested in protecting science integrity.

Science is, should be, we hope it will always be, agnostic to the results. Whatever science tells us is one of the inputs that we hope policymakers and decision makers would take in to make decisions.

It has an important role to play in decision making, policy making, management, but when the integrity of that science is compromised, it doesn't necessarily have a trusted role to play. That's why we need to be concerned about the integrity of science.

Federal agencies have had a catalyst to discuss scientific integrity since President Obama took office and issued his Presidential Memorandum in 2009. About a year later, the Office of Science and Technology Policy issued its guidance to help Executive Branch agencies develop the scientific integrity policies that the President, with a big exclamation point behind it, requested all agencies develop. Nearly all agencies responded. Nearly all agencies had something in place by the deadline. A few lagged behind.

I'm very proud that DOI was one of the first agencies to respond to the requirement, and OSTP looked at our policy and considered it to be the gold standard for how the Executive Branch responded to the memorandum from the President.

External entities looked at it, as well, and gave us pretty high marks, but we'll talk about some of the criticism and what we're doing about it in a few minutes. The President challenged us to think about science and its use.

The way DOI approached scientific integrity is that we want to proactively protect the process and the product of science from a loss of integrity, as opposed to catching a loss of integrity after it's happened and then punishing the people involved. That is a reactive way of dealing with it.

We wanted to be proactive and protect science and the integrity of science. Here's some words on a slide to talk about what science integrity is. It's really one of the most important values in science. It's the foundation of scientific work. Without integrity, you'll lose your credibility.

In science, credibility is really about all you've got. If you don't have credibility, you're not trusted. You're not sought after.

If you think of credibility as the currency of science, then with a loss of credibility due to loss of integrity, for example, you are a poor scientist indeed. It is those that are highly sought after with a stellar reputation that really are the rich scientists in terms of integrity and credibility.

That's where we want to protect the integrity, want to protect the value of the people and the products. You can read the slide here, but I came up with a graphic that I think helps to describe scientific integrity a little bit more directly.

This one works for my mom and my grand mom when I want to explain to them what scientific integrity is. Let me explain this graphic to you real quick.

There are some animations in this, so I'm going to delay after each click so that the folks out in the WebEx world can see the change when it comes.

When we talk about the use of science and its trustworthiness, we're really talking about the credibility and the reputation of the bureau and the agency that produces it. The credibility and reputation of a bureau really rests on the shoulders of the credibility and reputation of the individuals who work in those agencies: the scientists, the people in science support positions, and in fact anybody who touches science, communicates about science, or uses science in any way.

To develop the credibility and reputation of individual scientists, there's a whole bunch of necessary, but perhaps not sufficient, elements that have to be in place.

For example, a scientist really has to adhere to professional codes of conduct. They have to avoid real and potential conflicts of interest, or even the appearance of conflicts of interest. They have to behave ethically at all times.

Scientists have to be willing to consider new data and new analyses, not get stuck in a rut. They need to participate in peer review and publish in reputable outlets.

Scientists have to have a deep understanding of the subject matter and stay current in their field, and they have to adhere to the scientific method and the process of science. Again, these are necessary but not necessarily sufficient to develop your credibility and reputation as a scientist.

Scientific integrity is maintained when all of these building blocks are uncompromised. If any one of those building blocks is cracked, worn, weathered, or beaten up, you'll begin to undermine the reputation and credibility of the scientist, and ultimately the agency or the bureau.

But integrity doesn't stop with the scientist. Users and communicators of science, those who manage science, those who make decisions based on science, they need to uphold the highest standards of integrity as well.

Let's think about what exactly we're talking about with regard to integrity here. Imagine a long ruler. On the left end are perfectly acceptable actions and behaviors. On the right end are criminal negligence and criminal misconduct. There's a very long gradient between those two things.

A loss of integrity falls somewhere toward that left side. Beginning to lose integrity is not necessarily going to lead to very bad things happening, but it could. If you take action early, when things first start going wrong, you can correct problems before the train comes off the tracks entirely.

Taking corrective action early, correcting what's going on, is a way to protect the integrity of the science process and products.

That's what we want to do when we're monitoring the health of science in an agency: watch for small, correctable breaches and get those corrected before they move further down the right side of that gradient.

A loss of integrity is a small or significant departure from accepted standards and practices. Misconduct is generally described very clearly in regulations as fabrication, falsification, and plagiarism.

It's important to note that a loss of integrity can happen with or without intent. It could be a mistake that undermines the integrity of the process or products of science. That can be corrected.

If there's intent and a person is actively undermining the integrity of the science process or products, then corrective actions can also be taken. But it might also mean administrative action against individuals who are actively undermining the integrity of science.

You can have both kinds, the intentional and the unintentional. We want to correct both kinds as well. When we talk about the DOI policy on integrity of science and scholarly work, we started off with three basic overriding principles.

We wanted the scientific and scholarly work considered in departmental decision-making to be robust, of the highest quality, and the result of rigorous science and scholarly processes. We wanted it to be trustworthy.

We wanted the policy to protect science by applying to everyone at DOI, not just the scientists, but the support people, the political appointees, the co‑operators, partners, permittees, everybody.

That's what we've got. That's how the policy is framed. Of course, we recognize that it's not just science that drives decision making and policy making.

There are at least three pillars that decisions stand on at the Department of the Interior: the best available science, the long-term best interest of the American people, and fidelity to the law. Those are the three basic pillars that decision making stands on within DOI.

Of course, we recognize that there could be economic, budgetary, political influences to all decision making because after all, we are social organisms. Science can be a part of a decision. Rarely is it ever the only pillar that a decision stands upon.

One of the goals that we had with our policy is to create a culture of integrity: where science is conducted with integrity and excellence, where there is a culture that encourages integrity, where we're widely recognized for excellence and credibility, where the individuals who work with science or use science are proud of the science they work with and lead by example in executing that science, and where leadership encourages an environment and culture that supports scientific integrity.

These were all the goals that we had for the policy. Ultimately, what we are trying to do is create a proactive preventative policy and then a culture that supported it, that encouraged it, and that expected integrity of science in scholarly work.

One of the key points here is that the policy applies to everybody, co‑operators, contractors, permittees, lessees, grantees, volunteers when they engage in, supervise, manage or influence scientific activities.

Some of the principles that we focused on, when we developed the policy, are listed here. I won't run down them all, you can read them while I'm talking to you.

Ultimately, the purpose of basing it on these fundamental principles was to be able to detect and then take corrective action on issues before they become of a size or magnitude that could potentially damage the credibility and reputation of individual scientists and the agency.

Our policy was designed to be proactive, and it really fits in between all of the other policies and regulations that exist out there: the Inspector General, the Ethics Office, the FOIA Office, the IQA Guidelines.

All of these things exist out there to protect the integrity of what goes on at an agency in one way or another. We were trying to thread the needle and fit the integrity policy into the space that really isn't covered very well by those other regulations and policies.

We worked very closely with those other offices. We know that the boundaries between these various policies blend together, and we have to work closely with our colleagues to make sure that the goals of this policy are met and that adherence to those other policies and regulations is always foremost.

Part of our policy includes the Code of Conduct. You'll find 10 "I will" statements, like a pledge, that cover employees, contractors, and such.

Then, there are 6 additional "I will" statements that apply directly to scientists and scholars, above and beyond the basic 10, and then 3 additional statements that apply to decision makers, above and beyond the 10 for everyone.

Version two of the policy, which I'll talk about in a second, is in review right now and will hopefully be released any day now, which we've been saying for a while, for those of you who've been following this story.

We have built a training module that highlights the key elements of version two of the policy, so when the policy gets released, the training module will be released on the DOI Learn site along with it. It highlights the Codes of Conduct because they're really the backbone of the policy.

If you are curious about the policy, I'd suggest you start with the Codes of Conduct, and then maybe read the definitions after that because it helps you to understand the context of what we're doing with the policy.

One of the differences between the DOI policy and the Scientific Integrity policies of other agencies is we designed our policy to have skin on the bones, actual people whose job is to operate and maintain the connection to the policy and the scientists and others that they represent or work with, within their bureaus.

Each bureau has a Scientific Integrity Officer. There's a Scientific Integrity Officer for the department as a whole. At the end of the presentation I'll show you where to find the contact information for all of those folks. One of the key roles for these people is to act as an ombudsman.

Somebody who you can go and talk to about that situation, where the hair on the back of your neck stood up because somebody was asking you to do something with data, or a graph, or a situation that didn't sound right, didn't quite track with acceptable practices within your discipline, or whatever.

Maybe it's a mistake; maybe the person supervising you doesn't understand what those accepted practices are. If you don't feel comfortable having that conversation with your supervisor first, call your Integrity Officer, have that conversation, and see if there's an easy way to understand whether this is a misunderstanding or whether you've been asked to do something that's a breach of integrity.

The Integrity Officer can coach you through how to have those conversations with your supervisor, for example. That's a role that a number of Integrity Officers have played on a regular basis.

The Integrity Officers also lead the initial inquiries when an allegation of loss of integrity arises. As a collective group, we act as assessors, a peer group and peer reviewers of the way we're thinking about the problems we may be looking at.

We also help to develop or convene Scientific Integrity Review Panels in cases that are complicated or difficult and need scrutiny from a set of colleagues or peers who have experience with that topic. The Integrity Officers are involved in that entire process.

To date we've dealt with about 35 formal cases of a loss of integrity at the Department of the Interior. About eight of these cases are still open right now.

I should mention when we get to the Q and A period, after I finish speaking, happy to take questions about any of the stuff that I've been talking about, except for questions about open cases. I can't deal with those in a public forum, of course.

Then there have probably been on the order of 50 or more informal cases that the Integrity Officers have dealt with at one level or another. These are cases that are resolved without going through the formal process that's laid out in the policy for bringing an allegation of loss of integrity.

Then what have we done with Version Two? When we released the policy back in 2010, a number of external watchdog groups were paying attention to the federal agencies releasing their policies and were grading the policies on how well they met the expectations of the presidential memo and the expectations of their professional peer groups for codes of conduct and things like that.

In most cases we got pretty good marks, mostly B's and B-pluses, from those external watchdog groups, but they pointed out a few things that we could do better and pay more attention to, which went into the thinking about Version Two of this policy.

We've moved a lot of material into a handbook, which allows us to more easily update guidance for people that are looking at issues related to scientific integrity. We've codified this idea of the ombudsman role.

We have this integrity council now, which is made up of all of the integrity officers; interacting on it is an official role of theirs. We've updated the policy to catch up with the Whistleblower Protection Enhancement Act of 2012.

We've developed a whole new section on reprisal because it turns out that is a serious concern. While you cannot guarantee it won't happen, you can at least be prepared for it and understand how to deal with it hopefully proactively. We've got an appeals process.

We've made intent clearer. We thought it was perfectly clear when we wrote Version One of the policy, but hearing back from others who read it made it clear that we weren't all that clear and needed to fix that in several different places.

Hopefully we've done a better job of that in round two. We've also provided a way in the science integrity policy to make it possible for scientists and others to be elected to their professional peer group boards of governors.

In Version Two of the policy we've streamlined how that process works, partly because the Office of Government Ethics has slightly changed the rules and made it a little bit easier for us to consider those sorts of requests from federal employees.

If you are a scientist and you think that serving your professional peer group in an official capacity is part of your future, you should check with your scientific integrity officer and start to think about how do you make that happen?

Don't wait until you get nominated or elected. Start that process when you're still thinking about whether or not it's a good move. Of course, you have to have concurrence from your supervisors, so it's a good idea to talk to them early as well.

These are some of the things that we've done in Version Two of the policy. In the time since Version One was released, a number of other DOI policies have been updated and now include references to the scientific integrity policy, in some cases quite a few references.

The communications policy and social media policies are examples of where the scientific integrity policy has made its way into the thinking of those people who communicate science to the internal and external worlds. We're very happy with that.

I'm going to switch gears real quickly and get you guys engaged a little bit and talk about what does scientific integrity look like or what does the loss of scientific integrity look like? I'm going to propose two scenarios here for you.

I'm going to show you the background and give you the dilemma in a couple of different slides and give you two or three minutes to think about it and maybe talk about it with the person sitting next to you.

Ponder whether you think the policy would cover this as a loss of integrity, or whether it's even a loss of integrity at all. Let's see how you do on these. Here's the first one. I'm going to read this for you, and you can read it at your own pace as well.

Here's a scientist about to release data that appears to show a significant decline in a certain species that would change the management status of that species. Two or more of the data points seem to be anomalous. They deviate from what is expected.

After rechecking and checking and rechecking these repeated measurements, the anomaly is still there, but there isn't a clear reason for it. The scientist includes these data points in the final paper, and the senior manager responsible for the decision decides to leave those two data points out because they're not relevant to the decision being made. There's your scenario. What do you think? Management prerogative or breach of integrity? Talk amongst yourselves. I'll give you a second to think about it.

Alan: All right. Let's see how you did. The question at hand is whether or not this manager is acting with scientific integrity or, put another way, if faced with the policy on scientific integrity, would this manager be in breach of the principles in the policy or not? How many, by show of hands, say breach of scientific integrity?

Male Participant: What are the options that you're putting in his place?

Male Participant: ...because it will depend on the options, have....?

Alan: You'll find either that there's a loss of integrity or that there's not. There may be degrees of loss of integrity, but you're either following standard procedures or you're not. Then what you do about that loss of integrity may vary quite a bit.

It may be that there would be administrative action taken against somebody or maybe not. Was it intentional or not? All of these things have to go into the supervisor's and the responsible manager's decision about what to do with a report that comes from a scientific integrity officer.

We merely provide the report to the supervisors or the responsible managers to say, "Here's what we found. Here's what we think happened."

We make no recommendations about corrective action other than possibly making suggestions about how the science could be put back on the rails, not dealing with the people. That's left to HR and administrative action.

Here is what the policy says: "No, the senior manager is not acting with integrity." In fact, this would be considered falsification because you're misrepresenting data. Omission of data is equivalent to adding data that don't exist. You can't do that. If there are anomalies and you can't explain them, you are welcome to explain the fact that you can't explain them and suggest ways that you might go about figuring out why those anomalies exist. You can't ignore them because they don't fit with the facts the way you see them. We have a question at the back?

Female Participant: Out of pure curiosity, this says that it's an announcement of the study. What I'm seeing in my head is a quick one-pager saying, "Hey, we have this great study." You're talking about a two percent anomaly. Are we saying that in every single thing that ever talks about science we have to be perfect?

Alan: The question is about what's the medium that this is being released through? Is it merely a press release that says, "This is the case"?

Is it the fact that we hold everything we say up to this very high threshold of integrity that says, "You must talk about everything with all the caveats that exist there"? In fact you do. The policy on communication says that when you're talking about science in a press release or whatever, you are beholden to the scientific integrity policy. Now it can be dealt with very easily in a press release by saying, "There are some questions still that need to be addressed about some sites."

Fine, then you're exposing the fact that maybe everything doesn't track with what you found, or you put an asterisk to say, "Full report available here. Some anomalous data need to be explained." However you do it, you can't merely ignore the fact that your data don't track 100 percent.

There are ways to deal with it that are relatively simple. All right. Scenario Number Two. I'm going to give you the background on this because not everybody in the room may know anything about Red Knots and horseshoe crabs. Let me real quickly give you the background.

Male Participant: Could you go back to that last case? It might be a breach of integrity, but it doesn't follow intent, because the guy thought, "It doesn't make a difference."

Alan: It may or may not be intentional. We're not questioning intentionality here.

Male Participant: That's important to note, too, because it's sanctioned. As I said...

Alan: It could be an oversight: "Oh, I didn't realize I was supposed to put that into the press release." Or it could have been, "This isn't going to help my case. I'm going to delete these data points and not talk about them." Intentionality is not at issue here. We're talking about strict adherence to the policy.

Male Participant: Or he could be like, "I didn't think it would make a difference either way," which is what's implied by that scenario. It didn't make a difference, so he thought...

Alan: That may be true to the decision, but you can't misrepresent the science to support your decision. That's really the point.

Male Participant: ...it should not...

Alan: Right. Intentionality, as I've said from the beginning, is definitely an element here. You can have a loss of integrity without intentionality, and you can have a loss of integrity with intentionality. You can probably tell which one is worse.

Alan: Checking. All right, so let me give you the quick background. The Red Knot is a migratory shorebird. It makes the longest yearly migration of any bird, more than 9,000 miles from the Arctic to the tip of South America.

Horseshoe crab eggs are an important food source for this migratory bird; the birds fatten up on them to make that trip. Red Knots stop in Delaware Bay to feed and refuel before continuing their journey. In recent years horseshoe crabs have been harvested extensively by the fishing industry, which uses the crabs as bait, and by the pharmaceutical industry, which uses their blood. Although the pharmaceutical industry releases its crabs after collecting blood, mortality is estimated at 3 to 30 percent of the animals subjected to that blood draw.

Fishing pressure as well as effects of climate change are having an effect on the horseshoe crab population and in turn are having an impact on the Red Knot population. That's the background.

Here's the scenario on the screen: preliminary studies indicate a link between the population declines of these two species, but more work is needed to affirm the findings conclusively. Any effort to curb horseshoe crab harvest would impact commercial fishing and pharmaceuticals, which are two major economic drivers in the Mid-Atlantic region. There is significant political pressure to curtail the research efforts and limit the discussion about the distribution and possible impact of this work.

A little caveat down there on the lower right: this scenario is fictitious, but all the facts are based on real work. You can go to that newsroom location there and read about this study.

Here's your dilemma. What should be the proper reaction of scientific program managers and other decision‑makers to this situation? How would you answer the following three questions?

Number one, to maintain scientific integrity, what should be the proper reaction of the scientific program manager and other decision-makers to the situation?

Number two, how should the agency share findings and results of the preliminary study? Number three, how can the agency scientists better inform the debate between resource conservation and the economic value of horseshoe crabs?

There's your three questions based on this dilemma. Talk amongst yourselves for a moment, and see if you have a feel for how this should be handled.

Alan: For those of you in WebEx land, I was asked to flip the slide back. I'm going to flip it back to the questions now. The room is quieting down. Obviously you've come to your conclusions, and you have a firm understanding of what this means and what should happen, yes?

There's no way to do a show of hands in this case, so what I'll do is reveal the answers to move this conversation forward. Then we can have a Q&A period at the end here.

The first thing to remember is that there are at least three pillars behind every decision that gets made or underpinning decisions that get made. Science is one of them. Long‑term best interests of the American people is another. Fidelity to law is third. These things shouldn't necessarily mix.

They are three independent things that support any particular decision. Then there's economics, politics, and all that other stuff, which may also influence decisions, but they shouldn't mix. That's the first thing to remember.

When we come up with answers, science should objectively support study results. Don't censor. Don't manipulate. Ensure those results are publicly shared in accordance with departmental guidelines.

These days that means releasing your data as well so that other people can do an independent assessment to see if the results are robust. That's what the scientific program managers should approach this as. That's the direction they should give to their scientists.

Number two, how should the agency share their findings? Full disclosure, including the data these days, and including the limited scope of the study, recognizing that this is a preliminary study. More work is needed. Fine.

Add the caveat, but you can't deny that these results have been found. Are they robust results? Well, release the data. Let other people analyze them independently, and then do the full study to see if the preliminary study holds up or not. You can't hide it.

Ensure the study results and reports are readily available, and don't hide them or bury them on a website so that people can't find them. Number three, how should the agency scientists better inform the debate? The underlying science needs to inform policy decisions as it relates to the issue. You need to do the science not only on the ecological links, but you should also initiate studies on the economic links of these important factors in the area. This is the context within which the biology is occurring. You can do science on all of that, and that science can help inform the decision-making.

It's appropriate to initiate potentially a multi‑pronged study that looks at economics, ecology, climate change effects, whatever else you think might be relevant to a decision‑maker needing to make decisions in the future.

Male Participant: That doesn't suggest it's necessarily a breach of scientific integrity to stop pursuing a line of research.

Alan: That is correct.

Male Participant: You can see the budget folks and still say, "You know what? We may not like the answer, but we're going to provide the information we have already. We're not going to fund anymore studies."

Alan: Right, so for people out in WebEx land, the question was really this doesn't mean that you have to continue to do the study. It may be that management because of budgetary reasons or whatever else could say, "We can't continue to do this study. We can't afford it" or whatever.

Indeed, that would not be necessarily a breach of scientific integrity unless you found that there was political pressure to change the mission direction of a particular science office, in which case it could be political interference of science. It's not necessarily that.

You guys have probably experienced the fact that a long-term research project has the plug pulled because OMB no longer provides funding for it, or Congress decides to no longer provide funding for it. That is the end of a research project. It's not necessarily an integrity issue, although it could be.

All right. I'll leave you with the final slide here, which provides access to the scientific integrity website at DOI, and a number of other resources that might be useful for you, including the original memo from the President, the OSTP guidance that was provided, and a great handbook from the National Academies called "On Being a Scientist," which gives you a lot of context and background for science in general and integrity in particular.

I should also mention that you'll find at the DOI.gov/scientific integrity a link to the current policy. You'll find a link to templates, the handbook, and also all the contact information for all of your bureau scientific integrity officers, the departmental integrity officer, and a bunch of other resources there as well.

With that I will stop yapping at you and take some questions.

Male Participant: That does not apply to the taste test.

Alan: Yes, the comment for those of you out in the WebEx: "The cartoon doesn't necessarily apply if it was a taste test." A lick, not a bite.

Marka: Anything from the Internet? OK, then we can...

Alan: We're taking questions. If you guys want to chat in to the WebEx and ask questions, we'll try to cycle back and forth between the room and the WebEx. Yes, sir?

Male Participant: Thanks. Great presentation. One thing I'd like to raise is not so much integrity but more the idea of ethics, and I'd like to hear how you guys integrate ethical concepts into the whole idea of integrity. One of the things we work with is invasive species.

We know that invasive species are best dealt with when their populations are small before they really expand and can go out and do harm. Sometimes there's pushback and people say, "Well, you're going to go in, and you're going to kill this population without information that that population is indeed going to cause harm."

The only way to really get that information empirically is to let it go and let it grow. Then once you say, "Oh, my goodness. It's causing harm," you have very high‑confidence information, but you've lost that opportune time to do something. Some of it is an ethical consideration.

You have to balance the idea of having that perfect or very good data set with the ethical consideration of what you have to do to get that data set.

Alan: In response to that I would say that that's not an integrity question at all. In fact, I would even push back and say it's not necessarily an ethical question. In fact, if you think about those three pillars on which a decision is made, science is one of them.

But the long-term best interests of the American public is another. If past management practices have demonstrated that early intervention in an invasion of a potentially damaging species is the best way to manage, then it may be that the decision-maker makes the decision based on that pillar, the long-term best interests of the American people, instead of heavily relying on the science, because the science isn't there yet.

Now, is that a good decision? Well, time will tell whether that was a good decision or not, but one of the things we know about adaptive management is that over time, as science continues to inform the management practices of resource agencies, the reliance on opinion and individuals' gut feelings about the right thing to do goes down. The reliance on the body of science that supports decisions goes up, but you don't always have the science you need to make the decisions that you have to make right now.

A great example from when I used to work at BOEM: the Outer Continental Shelf Lands Act demands that a particular cycle of decision-making occur.

It is irrelevant whether the science is mature enough to support those decisions partly because we know those decisions are based on more than just the science.

The way Congress wrote the law you have to consider the science, but it's one of many inputs. The decision‑making process continues on a cycle that is sometimes quite out of sync with the science process, but that's the way it has to work.

We have to recognize that and as scientists attempt to get out ahead of the decision‑makers if possible and deliver the science at the time they're going to need it. We have an Internet question?

Male Participant: We have a question from the Internet, and it reads as follows. "One thing that wasn't addressed is the process of creating data visualizations. There are lots of gray areas when it comes to this. Are there any general guidelines that can be offered?"

Alan: Right, so visualizing data and representing data in a graphical format can be an art. However, there are professional standards for how to do this, and there are guidelines for how to do this within those standards.

What I would say is look to your professional peer groups for that guidance, and look to places like the National Academies who have done studies that have demonstrated the degree to which you can represent data graphically without crossing the line of integrity.

Yes, it's an art. Yes, there are people that have crossed over the line, which can be seen on the NIH scientific integrity website there.

They have many documented cases of data that have been falsified in a graphical way, on gel plots and stuff. That has definitely crossed the line, but they provide guidance for what isn't crossing the line on the scientific integrity website at NIH.

I would look to your peer groups and look to the professional standards that your peer groups already have established.

Marka: Questions in the room?

Male Participant: The example that was offered here is really a case of risk analysis and risk management, which I think can also help inform the decision, rather than just...

Alan: The example of whether you make a decision about early action on an invasive species or not definitely does have an element of risk analysis in it, and frankly resource analysis as well.

It'll take X amount of resources to deal with it now. We think that it'll take 1,000 times X resources to deal with it five years from now.

A decision‑maker has to weigh these things and say, "Well, in that case it's in the best interests of serving the American public and putting their dollar to good work to take action right now and not wait until it costs so much more in the future."

Male Participant: That's a risk management question.

Alan: That's a risk management question, and those are balancing questions that decision‑makers have to make all the time. The key is they can't hand‑pick the data and the research to back up the decision that they make that isn't based on research and data.

It's fine if you make a decision that's not based on science. Just don't tell people that the science told you to make this decision. That would be a breach of integrity.

Marka: Internet question?

Male Audience: Yes, here's another question. It reads as follows. "Is there a process through the policy for the public to bring a complaint about a breach of scientific integrity within the agency, not just from agency staff?"

Alan: Yes, the current version of the policy and the future version of the policy, in Section 3.8 ‑‑ I have this thing memorized ‑‑ say that any person internal or external to an agency can bring an allegation of loss of integrity. Same thing as with the Inspector General: anybody can bring a challenge. It is then incumbent upon the system to process that challenge to see if, in fact, there's anything there. Yes, look to Section 3.8 of the policy. You'll find all the steps that are necessary and the required materials that we want to see when an allegation comes in.

It includes things like a conflict of interest statement. What interest does this person have in this particular issue?

Is there a reason to believe that they have a bone to pick with a particular decision, person, science, or something? Because that's relevant to deciding whether the case is manufactured or not.

Marka: Questions in the room? You said you couldn't talk about an open case, but I'm curious whether you could share examples of an unintentional and an intentional breach that did take place?

Alan: If you go to the DOI.gov/scientific integrity and click on the link that's called Closed Case Database, you can read summaries of several. We're a little bit behind because we haven't had time to write up the summaries.

You can read about several cases that have been closed that were intentional, that obviously undermined the science integrity of a particular situation.

I can provide a very high-level summary of one of those real quick, which is that a supervisor told a scientist to move a sampling cage away from an outfall from a potential polluter.

That was not part of the scientific protocol of that study as it was designed. The integrity officer looked into that case, found that in fact this was a breach of integrity because it was management interference, and it undermined the design of the study.

The recommendations were made for correcting the actions that undermined the science, and then the managers and the supervisors were left with the responsibility of deciding what to do with the person who made that decision. That's an example of where it was intentional.

We have lots of examples where it's unintentional, and these usually don't make it all the way through the formal allegation process because as we begin to sort an issue or a problem we realize, ah, it's a misunderstanding.

This person says that they were plagiarized in their work. The person that was writing the draft completely overlooked putting that reference back in, but we see it in an earlier draft. If it was an oversight, it can be corrected easily. Everybody goes home happy.

That's an unintentional breach of integrity. Corrective action can be taken. All problems are averted. But if those actions weren't taken and it was published that way, or no errata were published after the fact, then you could bring a formal complaint.

You'd have to unravel that whole thing through the formal complaint process, which is again one reason to go talk to your supervisors or your science integrity officers if you feel like something is coming off the rails to see if it can be fixed relatively easily.

Marka: Questions from the Internet? Any more questions in the room? I'm going to ask one, if you will?

Alan: Sure.

Marka: Given that science rarely speaks with one mouth and these questions can come from inside and outside, how are you funded and administered to do all the looking into that you have to...

Alan: You want to know who does the inquiries. That's a great question, because we wrestled with that when we wrote the first version of the policy. Say I'm a scientist buried in Bureau X, and I've got three supervisors above me before I get to the director. Let's say somebody in the director's office has an allegation brought against them about a breach of integrity.

Exactly how am I, three levels down in the management chain, going to do an inquiry involving my boss's boss's boss? Well, the policy provides for that. When the integrity officer is operating in their official capacity as an integrity officer (I should mention that these are all collateral duty positions; we have one person who is full-time on scientific integrity matters, but the rest of us are collateral duty), we report to the director of our bureau, full stop. We bypass our entire supervisory chain and go straight to the director.

If there is an allegation of loss of integrity brought against a director of a bureau, that allegation goes to the departmental level integrity officer.

That person reports directly to the deputy secretary on those matters. We've come up with a way to bypass the potential conflict or potential interference that might be made possible by the various chains of supervision.

Marka: Any other questions? If not, thank you very much.

Alan: Oh, do we have one from the audience or from the Internet?

Marka: Oh, there is one.

Male Audience: One more question here from the Internet. "You mention that you want to create a culture of integrity. Does this imply that such a culture did not exist whether in the scientific community or at large in DOI?"

Alan: No, it does not imply that that is the case, but what has become apparent is that people on the outside and even people on the inside have the impression that integrity is lacking.

What we want to do is make sure people understand the protections we have in place and appreciate that we are actively and proactively trying to deal with issues as they come up.

We want people to recognize DOI as developing and using quality science, and we don't ever want that to be mistrusted or unappreciated. Whatever we do is to enhance our already good reputation and protect our credibility and protect our science.

Marka: Thank you all very much, and hope you join us next month when we talk about ecosystem services and applications to resource management. Goodbye.

Alan: Thank you.




Transcription by CastingWords


One of the most important values in science is integrity—in fact, integrity could be thought of as the currency of science. Without integrity, you lose credibility with your colleagues and the community; your results become meaningless; your value as a scientist is diminished. Scientific integrity is maintained when all of the building blocks of reputation and credibility are solid and uncompromised. A transgression in any of these building blocks can undermine the credibility of individual scientists and potentially damage the reputation of the entire bureau or agency. The goal of the DOI policy on scientific integrity is to detect, and take corrective action on, issues before they become significant and potentially damaging to the scientists and/or agency. The policy does not just deal with the wreck that results when the scientific train has come off the tracks; it is designed to help prevent the derailment in the first place.

Alan Thornhill, Director of the Office of Science Quality and Integrity, and the Bureau Scientific Integrity Officer, U.S. Geological Survey