Wired and worried: Why we’re anxious about our technology

In my previous post, I wrote about a stream of critiques asking serious questions about the new technological world we’re inventing and investing in: Where is all this leading? Maybe it’s some kind of Rorschach test, but I see a pattern in these articles and books–at least a half-dozen connected anxieties that keep emerging. You might recognize more. If you do, chime in with your comments.

1. We’re connected, but are we actually becoming detached? We have all sorts of digital ways to “network,” but what about the face-to-face connections? Can a Facebook friend ever be like a flesh-and-blood friend? Are cyber-communities real communities?

2. The Internet might want to be free, but are we actually trapping ourselves? All so-called developed nations–and more and more of the developing societies–depend on electronic and digital technologies just to keep their basic services functioning. Have we wired ourselves into a corner? The nightmare scenario, of course, is that, by accident or intent, our energy grids, water supplies, and transportation systems fail, bringing on a new dark age of catastrophe and chaos. Security experts are working overtime on preventing cyber-attacks that could paralyze us.

3. We invent and use technology to gain unprecedented power, but will our technology one day take control?  This corollary to No. 2 is a staple of the sci-fi canon: What happens when our machinery gets smart and powerful enough to take over? Think HAL in 2001: A Space Odyssey, the sinister rabbit-hole world of The Matrix, the rebellion in Isaac Asimov’s I, Robot. But we don’t need some fictitious future. The potential is as close as Google. As Nicholas Carr noted in 2008,

Sergey Brin and Larry Page, the gifted young men who founded Google while pursuing doctoral degrees in computer science at Stanford, speak frequently of their desire to turn their search engine into an artificial intelligence, a HAL-like machine that might be connected directly to our brains. “The ultimate search engine is something as smart as people—or smarter,” Page said in a speech a few years back. “For us, working on search is a way to work on artificial intelligence.”

4. Is our technology changing us from being citizens to being merely consumers? As much as I like George Orwell, he got it wrong in his famous dystopian tale, 1984, in which an all-powerful, all-seeing government runs people’s lives.

No, it’s looking more corporate all the time. As Neil Postman pointed out a generation ago, we’re closer to Aldous Huxley’s Brave New World than to Orwell’s vision. (For another take on the same idea, check out David Mitchell’s excellent mind-bending novel, Cloud Atlas.) To quote Carr again:

The idea that our minds should operate as high-speed data-processing machines is not only built into the workings of the Internet, it is the network’s reigning business model as well. The faster we surf across the Web—the more links we click and pages we view—the more opportunities Google and other companies gain to collect information about us and to feed us advertisements. Most of the proprietors of the commercial Internet have a financial stake in collecting the crumbs of data we leave behind as we flit from link to link—the more crumbs, the better. The last thing these companies want is to encourage leisurely reading or slow, concentrated thought. It’s in their economic interest to drive us to distraction.

5. We can dive into the world from our TVs and desktops, but are we just swimming in the shallows? So much information, so little depth. You name it: relationships, knowledge, attention spans. We have more information at our fingertips than ever before, but it’s not clear that we’re better educated or more perceptive thinkers. (I haven’t read it yet, but I wonder if Thomas Bergler’s new book, The Juvenilization of American Christianity, excerpted in June’s Christianity Today, strikes a similar chord. Anyone?) Wisdom? There’s not an app for that.

6. We have more portals to humanity than ever, but are we becoming less human? The answer to this question depends on how we define “human,” of course. But whether it’s in how we experience the world around us, in how we relate to other people, in how we think, in what we value as a culture and, more and more, in how much technology will literally get under our skin and become part of us, the unsettling question is: What are we doing to ourselves as people? (The Matrix, again, anyone?)

So what to do? Diane Ackerman, writing in her New York Times blog, offers a couple of suggestions, and they’re not a bad place to begin: “I wish schools would teach the value of cultivating presence. … One solution is to spend a few minutes every day just paying close attention to some facet of nature.”

What we can’t do is pretend this stuff doesn’t exist or doesn’t matter, or that we can somehow go back to the days before … when, exactly? Not that I want to. I like most of these tools and toys. I’m writing this on a Mac that’s wirelessly connected to the Internet, for heaven’s sake, and I’ve been able to almost instantly connect to books and articles published all over the globe. My cell phone just buzzed with another text. On Saturday I carried on a real-time conversation with friends in Egypt, and we could see each other. Later I got to watch a soccer match, live from Poland.

But that’s not to say these things are unalloyed goods, all good and no bad. There are always trade-offs, and we have valid reasons to ask questions.

Just getting in the habit of thinking about these questions is a start. That’s got to be better than handing ourselves over to research labs or the marketing departments of Silicon Valley.

Digital civilization and its discontents

A funny thing is happening on the way to digital paradise, and I’m not talking about the Facebook stock dive or the LinkedIn password hacking.

Intelligent, inquiring minds want to know: What is all our technology and connectivity doing to us? We may not be downloading the devil, but some voices are asking what hath Google and Facebook wrought.

Technology angst isn’t new. Some folks in the 1400s predicted cultural and theological catastrophe when Gutenberg re-invented the printing press. We might just be updating old anxieties–Technophobia 2.0–but that’s not to say we shouldn’t be asking these questions, and several writers are. A select list:

Four years ago, Nicholas Carr, writing in The Atlantic, asked, “Is Google Making Us Stupid?” Maybe not making us exactly stupid, he concludes, but something else: turning us into what playwright Richard Foreman calls “pancake people,” people “spread wide and thin as we connect with that vast network of information accessed by the mere touch of a button.”

Last month, Stephen Marche asked a companion question in the same magazine: “Is Facebook Making Us Lonely?” Based on new research, Marche says the answer is, roughly, yes. More narcissistic too.

Journalist Maggie Jackson wrote at book length on similar themes a few years ago in Distracted: The Erosion of Attention and the Coming Dark Age (Prometheus, 2009). She warned:

The way we live is eroding our capacity for deep, sustained, perceptive attention–the building block of intimacy, wisdom, and cultural progress. … The seduction of alternative virtual universes, the addictive allure of multitasking people and things, our near-religious allegiance to a constant state of motion: these are markers of a land of distraction, in which our old conception of space, time, and place have been shattered. This is why we are less and less able to see, hear, and comprehend what’s relevant and permanent, why so many of us feel that we can barely keep our heads above water, and our days are marked by perpetual loose ends. … We are on the verge of losing our capacity as a society for deep, sustained focus. In short, we are slipping toward a new dark age.

And then this week, science writer Diane Ackerman asks in the New York Times, “Are We Living in Sensory Overload or Sensory Poverty?” She’s not a Luddite, but she frets that we are cutting ourselves off from the world, even as we try to “experience” more of it online:

As a species, we’ve somehow survived large and small ice ages, genetic bottlenecks, plagues, world wars and all manner of natural disasters, but I sometimes wonder if we’ll survive our own ingenuity. At first glance, it seems as if we may be living in sensory overload. The new technology, for all its boons, also bedevils us with alluring distractors, cyberbullies, thought-nabbers, calm-frayers, and a spiky wad of miscellaneous news. Some days it feels like we’re drowning in a twittering bog of information.

But, at exactly the same time, we’re living in sensory poverty, learning about the world without experiencing it up close, right here, right now, in all its messy, majestic, riotous detail. The further we distance ourselves from the spell of the present, explored by our senses, the harder it will be to understand and protect nature’s precarious balance, let alone the balance of our own human nature.

I can see at least a half-dozen interconnected anxieties that keep surfacing in these various critiques. I’ll save those for the next post. (This is a blog, after all, and so this post should stay fairly brief. Irony? You betcha.)

In the meantime, I’d love to know what you’re thinking about the impact of our new technology. What do you make of these concerns and questions? Please add a comment to the blog and get in the conversation.


Breaking news: Political conservatives are not stupid!

Count to 10. Easy, right? Almost automatic.

Now, count to 10 again—but in alphabetical order. That’s different.

Now mentally trace a route you often drive or walk—to your job or the grocery store or school. Again, simple.

Now, imagine that your normal path and even the next most obvious route to the same place are blocked. What’s your third- or fourth-choice route?

That little exercise illustrates the difference between what psychologists call “low-effort,” or “automatic,” thinking and “controlled” thinking. Most researchers believe we manage most of our days with automatic thinking, which frees our brains to focus on more complex, unfamiliar or difficult tasks. That’s how I can make a tuna sandwich or pump gas or drive to work while I think about details for my daughter’s wedding or how to revise a class schedule or deal with the insurance company.

That’s the kind of difference Scott Eidelman, an assistant professor of psychology at the University of Arkansas, and his colleagues discuss in a recent research journal article. The title might explain why it’s generated a lot of friction.

“Low-Effort Thought Promotes Political Conservatism,” published online in March in the Personality and Social Psychology Bulletin, states a simple thesis, summarized in a news release: “People endorse conservative ideology more when they have to give a first or fast response. This low-effort thinking seems to favor political conservatism, suggesting that it may be our default ideology.” (The paper identified “political conservatism” with three common traits: “an emphasis on personal responsibility, acceptance of hierarchy, and a preference for the status quo.”)

To be clear, the researchers added, “We are not saying that conservatives think lightly.”

Or that they’re stupid. But you wouldn’t know it from the reaction of several conservative bloggers.

“Study: Conservatism ‘linked to low brainpower,’” according to the aggrieved TeaParty.org. “Study ‘Proves’ Conservatism Linked To Stupidity,” The Ulsterman Report sarcastically proclaimed. The Conservative Review harrumphed: “Conservatism Comes From ‘Low Brainpower’? Not So Fast, Eggheads At University Of Arkansas.” And you have to love the headline from the Washington Examiner: “Study: Dumb drunk people are more conservative.”

During a phone conversation on Friday I asked Eidelman if any of these headlines were accurate interpretations. In a word: “No.”


While he’s happy people are talking about the research, Eidelman confessed he was “a little disappointed” in how the study has splashed onto the blogosphere.

He compared the reaction to a game of telephone: When social scientists use a term like “low-effort thinking,” they’re using specific jargon to describe the normal, automatic thinking we all do—counting to 10, driving to work—in contrast to the “second-phase” thinking we do when we have time to ponder a subject.

But apparently some knee-jerk commentators saw “low effort” and “translated” it to mean “no-effort” or “lazy” or even “stupid.” Those mistakes got picked up and amplified by others. The “quotations” in the headlines are actually from other commentators, not from the scientists. For the record, the following words don’t appear anywhere in the original research: stupid, stupidity, stupidly, brainpower (low or otherwise), dumb. Not even prove or proves. And not, um, egghead.

Eidelman did not point out the irony of how such shoddy treatment only reinforces the kind of “stupid” stereotype that the commentators are complaining about. You can leave that to me.

“It’s not that political conservatism promotes low-effort thought,” he told me. “What we found is that low-effort thought promotes political conservatism. It’s a subtle difference, but it’s not the same.”

Eidelman drew an analogy: He might carry an umbrella because it’s raining, but that’s completely different from saying that it’s raining because he carries an umbrella. In other words, while low-effort (or “first-response”) thinking tends to promote political conservatism, being conservative doesn’t tend to promote low-effort thinking.

This conservative tendency is roughly reflected in clichés about “comfort zones” and “better the devil you know than the devil you don’t.” As Eidelman noted, certain “conservative” characteristics are built into humans for our benefit, such as the tendency to save our energy or avoid unnecessary risks.

“If you want to look at evolutionary history, people were more likely to survive if they assumed a person approaching was a threat,” he explained. “It was smarter to assume that unknown plant was poisonous rather than edible. Or the sooner you know your place in a society, the better your chances to thrive.”

Sometimes those “first responses” were correct: the stranger was indeed hostile or the plant was really poisonous. But sometimes the stranger would turn out to be an ally or the plant a healing herb. In those cases, the “first response” would be … well, wrong. Whether the first (“low-effort”) response was correct could be discovered only with “second-step” thinking.

“When people don’t have the opportunity to engage in political thinking, when you strip away the effortful thinking, they tend to be conservatives,” Eidelman said. “But that’s only concerning the first-step thinking. We don’t have much on what the second step is. It’s an open question if that first response is correct. We haven’t measured outcomes. We think the scales are tipped toward conservatism. But whether it’s good or right to challenge that depends on people’s values and goals.”

Eidelman wondered if this “low-effort” tendency might help explain at least one aspect of current American politics.

“Liberals might understand conservatives more than the other way around, because liberals, in a way, started at the same place,” he suggested. So here’s a thought: Could empathy explain why congressional Democrats are often perceived, rightly or wrongly, to compromise more often on legislation than their Republican colleagues?

As we finished talking, it occurred to me that America’s Founding Fathers were literally invested in the status quo of the British colonies. They valued hierarchy, as their later writing of the Constitution proved. They preached personal responsibility. They sound a lot like political conservatives. But Washington, Adams, Jefferson, Franklin and the others did anything but stop at “low-effort” thinking. If they had, we might never have seen the American Revolution.

Learning to love nuclear power

One reason I appreciate a good newspaper or news site is its potential for stopping people like me in our tracks and making us think fresh thoughts (if not always change our minds) while events are still unfolding. Strike while the iron’s hot, right?

A column in the Guardian, a national British newspaper, provided a valuable example this week. Like any good column, this isn’t objective “reporting,” but it uses “reportage” to make a point. In this case, George Monbiot, a regular writer for the Guardian, makes the case FOR nuclear power, which isn’t what you’d expect from an environmental activist. His column came out just a few days ago — that is, after the tsunami hit the Japanese nuclear plant in Fukushima, sending entire nations, such as Germany, into full retreat from their nuclear programs.

The fearful responses are understandable. I’m not all that comfortable with nuclear power, and the Japanese disaster has resurrected old fears around the world. On the other hand, living in southern Appalachia, I’m not all that thrilled with what the coal industry is doing to the environment either. Mountaintop removal, anyone?

I don’t discount for a moment how many jobs rely on coal mining. But we need a long-term energy plan that will reduce our dependence on fossil fuels, both coal and oil, and we have only so many options.

That’s Monbiot’s point: nuclear isn’t perfect, but after analyzing the data he has concluded it’s not only a viable option but a safer and more desirable one than fossil fuels.

What to do? It’s not a simple issue, and I’m not really sure. But I’m grateful for Monbiot and other writers who don’t impose artificially simple solutions on complicated problems or retreat into predictable positions. Rather than steer away from complexity, he did his homework and drove a surprising route right into the middle of it.

I wish more journalists would do that.

The future is now – really, really now

By a strange conjunction of media, the future came knocking, rapping, banging, pounding on the door of my consciousness on Wednesday:

1. Borders, the nation’s second-largest bookstore chain, filed for Chapter 11 bankruptcy today. The company will close about 200 of its 674 Borders and Waldenbooks stores in the U.S. The reasons for the failure aren’t mysterious: Buyers have migrated to Amazon.com and to Barnes and Noble, which dived into new media more aggressively than Borders, which apparently wasn’t hard to do. Borders may survive, thanks to an injection of almost $500 million from investors, but it won’t be the same. Think digital. Think new media. Think the end of bookstores as we’ve known them, although I could imagine the survival of the very small, very local boutique-like stores.


2. My Feb. 21 issue of Time magazine arrived in the mail today. (Two old-media words in that sentence alone: magazine, mail.) The cover was a stark image of a bald human head with a wire coming out the back of the neck, à la The Matrix. Cover line: “2045: The Year Man Becomes Immortal.*” The asterisk directs us to the small type at the bottom of the cover: “*If you believe humans and machines will become one. Welcome to the Singularity movement.”

“Singularity,” the cover story explains, is “the moment when technological change becomes so rapid and profound, it represents a rupture in the fabric of human history.” I haven’t read the entire story yet, but this “movement” promises — or threatens, depending on your point of view — to change us to the core, right down to what it means to be human. It even raises the prospect of what might be called eternal life, but not the way we find it described in the Bible or Koran.

3. Today’s “Fresh Air” program (NPR) featured Biz Stone, one of the co-founders of Twitter. (I love the name. Well, both of them: the network and the guy.) Stone joined host Terry Gross for “a wide-ranging discussion about the service, including how it was used recently in Egypt to help organize the revolution and how it has been used to spread democracy movements in other countries,” as the “Fresh Air” website says. It was a terrific, informative and, at some moments, inspiring interview.

My estimation of Twitter’s value as a social medium pretty much quadrupled, especially listening to Stone talk about how the network has dealt with playing a central role in several major events in its five-year existence. (Five years!) But at some point during the interview, I realized in a profound way that what Twitter and other social networks are doing is — pardon the cliché — the new normal.

It hit me at a gut level as never before: This is it. Unless we somehow throw ourselves back into a tech-less dark age (cf. A Canticle for Leibowitz), we’re living in a new world and we’re not going back to the old one. It wasn’t that long ago — less than six months — that I was saying something like this: “Twitter is cool, and I see some good uses for it. But what’s the big deal?” I’ll never say that again.

4. Two words: Jeopardy. Watson. (In case you’re wondering: Watson, the IBM computer, easily outscored two human champions in a three-day match on the decades-old game show.)

5. And finally, in local news, WETS-FM, the public radio station for northeast Tennessee, where I live, announced it’s launching digital broadcasts in the autumn. It is probably the first radio station in this area to go digital, adding three HD channels to its existing analog broadcast. The station radically changed its format last year. It used to air a widely varied mix of NPR news and weekend programming, classical music and local “Americana” programming. Then last February it switched to all-news-and-talk during the week, with some NPR programming and Americana music on the weekends. Classical was gone. Were many listeners ticked off? You could say that. But going digital will let WETS-FM add a station just for Americana and another for jazz and classical.

Good timing for the announcement, by the way: East Tennessee State University, the station’s owner, received a $70,000 grant from the Corporation for Public Broadcasting to install the equipment. The CPB is in Congress’ budget-cutting line of fire this year. Maybe this served as a not-so-subtle message from the station that federal funding for NPR and PBS can actually add value to the community.

F2F Finale: That’s all, folks

This is my final “Face to Faith” column. It’s been a good run, since June 2003. If you’re keeping score, that’s 346 columns.

First, the thank-you notes. Thanks to the editors of the Johnson City Press for the opportunity to explore a lot of interesting territory. Thanks also to friends and colleagues who have generously offered their ideas, suggestions and encouragement.

Thanks to the countless people who let me share their expertise, insights, experiences and voices in this space. One of my favorite parts of being a journalist is the privilege of meeting people I would never otherwise get to know.

Finally, thanks to you for reading and for sending your comments, criticisms (honest!) and compliments. Even more, I appreciate your joining me in looking at all sorts of subjects through the lens of religion. One of my favorite parts of covering religion has been the variety, with the chance to write about everything from Trinitarian doctrine to tax law.

The breadth of religion, as well as its depth, is not a small point. More than ever, we need all the tools we can manage to help us understand our world, and it’s no secret that dozens of important news stories every week – whether in our front yard or on the other side of the globe – are ripe with religious meanings, causes and effects.

So before I go, let me suggest seven topics to keep tabs on, listed in no particular order. These aren’t predictions. Let’s just call this a kind of heads-up memo.

The unbuckling of the Bible Belt. I’ve regularly called our region “the area formerly known as the Bible Belt.” No doubt this place still has a different religious climate than, say, New York or Los Angeles. Even so, church attendance is lower than the national average and actual behavior and attitudes about several key social issues mirror the rest of America. With the increasing secularization of society and growing cultural diversity, we’re not as distinct as we used to be (or maybe like to think we are).

The continuing rise of syncretism. “Syncretism” is a fancy word for mixing beliefs and practices into a kind of spiritual stew, an inclination some people have tagged with labels like “me-ism” or “cafeteria religion.” This is a long-time trend, but I was reminded of its power and attraction when I saw “Avatar” last week. (See below, “impact of media, The.”) Regardless of what someone thinks of this development, it’s one that has real implications for how we view the world.

The politics of sex. I can’t think of one sex-related controversy being debated in the public square – birth control, homosexuality, the meaning of marriage (including same-sex marriage and civil unions) – that isn’t shaped by religious belief.

The impact of media. This issue goes beyond debates over the content of TV shows and movies. The media we invent – and how we use them – affect us. For example: In a digital world, how do you define a “community”? Is a church a church if it’s only on the Internet, or is a vague acquaintance on Facebook a “friend”?

The definition of “human.” Far from being a philosophical abstraction for eggheads, the question of what it means to be human is on our doorstep in a dozen ways. The abortion and end-of-life debates are prime examples. For future reference, we’ll also need to consider if there’s a point at which someone treated with cloning, genetic engineering or robotics might not be considered a fully human being anymore.

The spiritual dimensions of money. It’s not just a matter of garden-variety greed or even Bernie Madoff’s unfathomable fraud. Dozens of economic answers can raise scores of religious and spiritual questions. In other words: Are any religious, spiritual or moral issues connected to health care, jobs, welfare, education, foreign aid (think of Haiti this week), war, credit and debt (both personal and national), advertising and marketing, crime, the justice system or the care of elderly people?

The persistence of church-state controversies. Thanks to the massive gray area written into the U.S. Constitution and lived out in American history, the familiar tensions over faith and public life will continue. After 223 years, why stop now? This is part of our national DNA.

That’s all. In the words of an ancient Christian greeting: Grace and peace to you. Amen.

First published in the Johnson City (Tenn.) Press, 16 Jan 2010.

Science, religion and the NIH

Francis Collins: MD, PhD, Christian, guitar player, NIH director designate

It should come as no surprise that applause mostly greeted President Obama’s nomination of Dr. Francis Collins as the new director of the National Institutes of Health last week.

Collins, almost certain to be confirmed in the post, cemented his reputation as a first-rate scientist when he led the NIH-based effort to map the human genetic code, an achievement that’s been compared to the Apollo space program. Collins’ lab also found the genetic keys for several diseases, such as Parkinson’s and Huntington’s, providing essential breakthroughs to develop cures.

He also happens to be a Christian – famously so as the author of The Language of God: A Scientist Presents Evidence for Belief, a 2006 bestseller in which he described his conversion from atheism as a graduate student and his belief in a “wonderful harmony in the complementary truths of science and faith.”

“I am a scientist and a believer, and I find no conflict between those world views,” he summarized for a commentary on CNN.com. “As a believer, I see DNA, the information molecule of all living things, as God’s language, and the elegance and complexity of our own bodies and the rest of nature as a reflection of God’s plan.”

Some scientists have a problem with that kind of thinking. “You clearly can be a scientist and have religious beliefs,” wrote Peter Atkins, a high-profile chemist at Oxford University. “But I don’t think you can be a real scientist in the deepest sense of the word because they (religion and science) are such alien categories of knowledge.”

Dr. Gene Rudd, executive vice president of the Bristol-based Christian Medical and Dental Associations, thinks such views are “biased” and “shameful.”

“A generation or so ago, a scientist’s faith would have been an asset,” he said. “Historically, science has prospered in cultures that understood there was a god who created an order of things, and people tried to understand that order. You will find some anti-science thinking among a minority of people in the Christian faith, but science historically flourished among Christianity and Islam.”


On the other hand, not all Christians are thrilled with Collins. His views on hot-button science issues – evolution, abortion, stem-cell research – run counter to typical conservative Christian positions. For example, he accepts Darwinian evolution as fact, and while he opposes abortion in most cases, he doesn’t explicitly rule it out.

Also, while he opposes producing embryos for research, he believes it is morally defensible to use embryos that were created for in vitro fertilization but would otherwise go unused.

“In the process of in vitro fertilization, you almost invariably end up with more embryos than you can reimplant safely,” he explained in a 2006 interview with Salon. “Is it more ethical to leave them in those freezers forever or throw them away? Or is it more ethical to come up with some sort of use for those embryos that could help people?”

Rudd realizes that Collins’ positions will “irritate” many Christians, and his organization “will have discussions” with Collins about embryonic stem-cell research. Still, he sounded optimistic about Collins.

“He is routinely accepted as an exceptional scientist, and he’s proven to be an exceptional administrator, which can be a rare combination,” Rudd said.

Dr. William Duncan, vice provost of research at East Tennessee State University, agrees with that assessment. Collins, he said, is a “world-class scientist,” and his faith is a non-issue for Duncan.

“Religious beliefs are very private, personal decisions for all individuals,” said Duncan, an immunologist who worked at the NIH from 1987 to 2004. “I’ve known many scientists who were religious, and religion never prevented any of them from pursuing their research. Each scientist needs to balance their religious beliefs and moral values with their career objectives and daily choices.”

The stakes are high: The NIH, the world’s most significant source of research money, will distribute about $37 billion in research grants over the next 14 months. The priority is to gain good data, according to Duncan, and he thinks the institutes’ review and decision-making process is “very transparent.”

“The NIH and the funding agencies in this country are primarily based not on what your belief is but what is your proposal, the data, your plans,” Duncan said. “Scientists pursue knowledge, and the best science is done in an unbiased fashion. It’s really evidence-based data that drives the good science.”

Johnson City (Tenn.) Press, 25 July 2009.