Ideas are Not Soap: The Failure of Corporate Marketing for Communicating Science

26 01 2016

by Dennis Meredith, Science Communication Consultant

(Reprinted from ScienceWriters, the National Association of Science Writers magazine, Winter 2015-16 issue)

Over recent years, more and more research institutions seem to be adopting a corporate marketing approach to their communications. You can recognize these marketers by their use of such buzzwords as branding, messaging, market penetration and cost-benefit analysis. It’s an approach that risks compromising research communications, and more broadly a research institution’s missions to create and disseminate knowledge.

Administrators become enamored of a corporate marketing approach because they’re managers; and managers like to manage stuff. Corporate marketing offers them a chance to manage—a seemingly strategic way to “sell” the institution to key customers, such as prospective students, patients and donors. True, the marketing approach does have some utility, in that it can help academic institutions think more strategically about communicating core messages about the institution.

But corporate marketing is by definition shallow marketing. By aiming to sell the institution as a branded product, it fails to serve the intellectually rich marketplace of ideas in which researchers operate.

For example, corporate marketers too often abandon significant coverage of their institutions’ research—particularly basic research. They don’t see such coverage as serving their narrow marketing strategy. In fact, I’ve heard of communicators at some marketing-oriented research universities explicitly stating that they don’t do news releases on basic research advances.

Rather, marketers prefer the “sales rep PIO” approach to media relations. For example, they will expend considerable effort to get their researchers quoted in reaction to news of the day. But these mentions are basically trivial—the equivalent of corporate product placements. They don’t really advance the researchers’ ideas, but only raise the institution’s visibility—or as marketers put it, “increase media impressions.”

Another hallmark of the marketing mentality is pitching stories to individual reporters to generate media “placements.” While pitching seems like a good tactic—generating documentable media “hits”—it’s a poor long-term communication strategy. For one thing, it relegates the institution’s news to a commodity to be sold like soap, reducing the institution’s credibility. Also problematic, pitching could be considered ethically questionable, since it constitutes a publicly funded institution preferentially offering a story idea to one reporter. Certainly other reporters not privy to that information wouldn’t be happy with such preferential treatment.

Ideas should be broadcast, not pitched. Good ideas, well communicated, will find their audiences, both reporters and the public.

A more credible and productive alternative model to sales rep PIOs is the “PIO journalist.” The PIO journalist doesn’t pitch stories, or produce releases that are essentially advertisements—peppered with such subjective terms as “breakthrough,” “revolutionary” and “major discovery.” (To see sales rep PIOs at work, search for these terms on EurekAlert! or Newswise.)

Rather, the PIO journalist produces a steady flow of newsworthy releases, compelling feature stories and videos. Like any media reporter, the PIO journalist seeks to vividly communicate research by creating stories with clear explanations, pithy quotes and memorable metaphors. The stories explain the implications of discoveries in a way that scientific papers do not. And importantly, the PIO journalist includes the caveats and cautions that any good journalist would feature, which makes the release more credible.

Over the long term, such compelling content obviates the need for pitches. Reporters come to understand that the institution’s track record of solid news and features means that they are obliged to pay close attention to its communications.

Administrators may not resonate as much with PIO journalism as with corporate marketing, because there’s much less for them to “manage.” Their duties consist of hiring talented research communicators and giving them the resources they need to do their jobs.

Administrators need to appreciate that the resulting “products” will be news releases and other content that better serve the institution’s interests by portraying it as a dynamic, creative source of new discoveries. Such releases effectively transmit those discoveries to the idea marketplace, where they will be seen by such important audiences as fellow researchers, prospective graduate students and corporate partners.

PIO journalists tell the institution’s research story the way that the researchers and the institution want it told, and not through the filter of the media. For example, news releases posted on such services as EurekAlert! and Newswise automatically appear on such news syndicators as Google News, right along with media stories. And content posted on the institution’s web sites and social media directly reaches audiences.

PIO journalists also recognize that media may sometimes be secondary targets of news releases—that releases have a multitude of uses beyond media alerts. For example, they serve as internal communications, as statements of record, as alerts to other researchers and as content to inform and engage prospective students and faculty, corporate partners and donors.

So, the next time you find yourself in the soap aisle of the supermarket, ask yourself whether research discoveries should really be considered the equivalent of the gaudy packages of detergents festooned with their punchy slogans.

Webinar: Explaining Research: New Tricks for New Media

11 06 2010

Here is a link to the webinar “Explaining Research: New Tricks for New Media,” which I gave on June 9, 2010, as part of the Physics World webinar series.

The moderator was Margaret Harris, Reviews and Careers Editor for Physics World, who I’m very proud to say is a (highly talented!) former science writing student of mine. Margaret studied physics at Duke and then did a PhD in atomic physics at Durham University in the UK.

The webinar description:

Your career success depends not only on doing good work. You must also explain that work to important audiences: your colleagues, funding officers, donors, your institution’s leaders, students, your own family and friends, journalists, and the public. Dennis Meredith will offer invaluable tips on using new media technologies to engage those audiences in a clear and compelling way.

Please Explain: Training Scientists to Be Better Communicators

17 05 2010

This commentary was published May 16, 2010, in The Chronicle of Higher Education (registration required)

When it comes to persuading the American public about some of the most controversial issues of our time, today’s scientists too often get failing grades. Gallup polls show that only 39 percent of Americans believe in evolution, for example, while 48 percent say global warming is exaggerated and 46 percent say temperature increases are not due to human activity. And despite many recent court rulings asserting that there is no scientific evidence that vaccines cause autism, far too many parents still cling to that dangerous belief and refuse to have their children vaccinated.

Certainly some unscientific views arise from religious and political beliefs, but there’s another reason for such wrongheaded convictions, as well as for the public’s lack of scientific knowledge: Science suffers from its lack of a culture of explanation.

Scientists and engineers tend to communicate poorly in public controversies because—compared with, say, doctors and lawyers—their professions have not valued explanation. Their career advancement doesn’t depend on having lay-level explanatory skills. To progress professionally, scientists really need only to explain their work technically to other scientists—their colleagues, department heads, and granting agencies. But imagine what would happen to a doctor who couldn’t explain diseases to patients, or a lawyer who couldn’t explain the law to clients and juries. Their careers would be over.

A lack of public-communication skills also means that scientists and engineers do not think strategically about how to make their research work to their best professional advantage. For example, in 40 years as a research communicator at universities including the California Institute of Technology, Cornell University, Duke University, and the Massachusetts Institute of Technology, I never heard a researcher ask, “Who needs to know about my discovery?”

A class is not a “lay audience”

Many academic scientists might consider themselves expert explainers because a significant part of their job entails explaining research to undergraduates. But even the most skillful scientist-teachers aren’t necessarily skilled science explainers. Speaking to “captive” student audiences is very different from communicating with other lay audiences, whose members often must be actively persuaded to be interested in a scientific topic.

Unfortunately, most science and engineering educators don’t even realize they need improvement. They don’t appreciate the potential benefits of communication training, so such training remains extremely rare on most college campuses. The result is that their students, too, graduate without knowing how to give a compelling public talk, write an interesting popular article, or create an engaging Web site. That puts them at a disadvantage in the job market because employers rank communication skills high in qualities they look for in an employee, according to Job Outlook 2015, the survey of employer organizations by the National Association of Colleges and Employers.

[I sought to help remedy the lack of communication skills by publishing my book Explaining Research: How to Reach Key Audiences to Advance Your Work (Oxford 2010).]

Science would not have such a cultural deficit if scientists and those who educate them took a broader view of the value of communications than just immediate career advancement. They need to appreciate that their lack of skill and interest in lay-level communications limits their ability to reach audiences crucial to the success of their own research and their field. Such audiences include nonscientist administrators, potential collaborators in other disciplines, legislators, and donors. But even scientists’ communications with their own colleagues are less effective than they should be. By using the same skills that grab the attention of the local civic club or readers of a popular magazine, scientists could easily improve their seminars and papers.

Learning lay language

Yet scientists seldom bother to emerge from their cloistered realm of jargon to learn “lay language.” They often miss even the simplest and most obvious opportunities to advance the scientific point of view in the public mind by merely adjusting scientific vernacular. Clive Thompson, a columnist for Wired magazine, suggests that scientists could short-circuit one of creationists’ major arguments against evolution—that evolution is only a theory—simply by changing “theory of evolution” to “law of evolution.” “It performs a neat bit of linguistic jujitsu,” he explains. “If someone says, ‘I don’t believe in the theory of evolution,’ they may sound fairly reasonable. But if someone announces, ‘I don’t believe in the law of evolution,’ they sound insane. It’s tantamount to saying, ‘I don’t believe in the law of gravity.'”

Similarly, scientists need to rethink their use of the term “believe” in talking to lay audiences, writes the theoretical physicist Helen Quinn in Physics Today: “For most people a belief is an article of faith; a hypothesis or a theory is not much different from a guess. … When a person hears ‘scientists believe,’ he or she may hear it as a statement of faith or a suggestion of uncertainty. Neither is what we intend.” She suggests that scientists would strengthen their authority by replacing “We believe” with “Scientific evidence supports the conclusion that,” or even “We know that.”

Beyond scientists’ being linguistically tone-deaf, their lack of a culture of explanation makes them strategically maladroit when explaining their work to lay audiences. Rather than tailoring their arguments to their audiences, they tend to believe that merely presenting the facts of their work will lead audiences to see the light on such issues as evolution.

Dismal media coverage of science

Scientists’ reluctance to become activist-explainers of their work is one reason for the dismal coverage of research in the news media. Science coverage on the nightly news is so infinitesimally small as to be journalistic “noise”—a couple of percent of total coverage, according to the “State of the News Media” studies by the Project for Excellence in Journalism. Such poor coverage closes an important gateway to science for the public, making people far less likely to understand the importance of scientific findings or consider the possibility of careers in science.

Despite poor news-media coverage, people are interested in science—so scientists don’t have lack of interest as an excuse for their failure to engage the public. According to the National Science Board’s Science and Engineering Indicators 2014, 80 percent of Americans reported they were very or moderately interested in new scientific discoveries.

Scientists may also be reluctant to enter the public arena because of a wrongheaded belief that lay audiences have a low opinion of them. For example, I once heard the director of a national lab declare to reporters at a scientific meeting that the public disparages scientists as socially inept, unattractive, or villainous. Yet in a 2006 Harris Poll, Americans said they trusted doctors (85 percent), teachers (83 percent), and scientists (77 percent) far more than journalists (39 percent), lawyers (27 percent), or pollsters (34 percent). According to the National Science Board’s Science and Engineering Indicators 2008, “more Americans expressed a great deal of confidence in leaders of the scientific community than in the leaders of any other institution except the military.”

Communication courses needed

Establishing a culture of explanation to capitalize on people’s natural interest in science would not be difficult. Better education and support for lay-level communication are essential first steps. “Communication for Scientists” courses should become a standard component of science and engineering curricula. Such courses need not be onerous additions to students’ workloads—a semester-long course would be enough to introduce them to basic techniques of explaining their work to the public. To help faculty scientists and engineers, universities should offer one-day seminars aimed at honing lay-level communication skills.

Also, more scientific associations should follow the lead of the American Association for the Advancement of Science and the American Chemical Society in establishing programs to encourage scientists’ public involvement. The AAAS operates a Center for Public Engagement With Science & Technology, and the ACS has established a Chemistry Ambassadors program. Those efforts support scientists with workshops and information about how to explain their work to students, lawmakers, journalists, and other important groups.

Scientists and engineers may argue that they are too busy to engage the public. Certainly, the demands of running experiments, publishing papers, writing grants, and managing a laboratory are considerable. But researchers will inevitably need to explain their work at some point—on their laboratories’ Web sites, in reports to administrators, in research descriptions for government agencies, and so on. By applying only a bit more effort and attention, they can make those explanations far more effective for lay audiences. They should also use a “strategy of synergy” to make one communication piece—like a news release or feature article—serve many purposes and audiences.

As the former AAAS President John Holdren—now President Obama’s science adviser—asserted in his address at the association’s 2007 meeting: Scientists and technologists need to “improve their communication skills so that they can convey the relevant essence of their understandings to members of the public and to policy makers. … I believe that every scientist and technologist should tithe 10 percent of his or her professional time and effort to working to increase the benefits of science and technology for the human condition and to decrease the liabilities. The challenges demand no less.”

Can Communication Success be Quantified?

31 03 2010

Can communicators quantify their success? The short answer is sort of. Measuring success in public relations is a controversial and messy business, which is why I didn’t even mention it in Explaining Research. I felt that detailed discussion of the issues would detract from the utility of the book for researchers, who are more interested in learning how to explain their research than how public information officers grapple with the “sausage-making” of measurement.

However, I was reminded of how persistent and frustrating the measurement issue remains when a PIO colleague at a major research laboratory asked for advice about a new boss’s request for a quantitative measure of the office’s media relations success. The metric-minded new boss came from marketing—where measuring results is a holy quest—rather than from science communication, a more complex communication environment. In an e-mail message, my colleague asked some experienced communicators, including me, to discuss “what captures how well we’re doing our jobs without bogging us down so much with collecting or analyzing information that we can’t do our jobs.”

So, for the benefit of PIOs—and for those researchers interested in such sausage-making—here are some of the issues and pitfalls we explored:

One major measurement pitfall, in my opinion, is reliance on a long-criticized measure called “advertising value equivalent” (AVE), a dollar amount that quantifies how much media stories would have been worth if they were paid advertising. Developing AVEs for news stories is an incredibly expensive proposition. One news office manager at a university where I worked spent well over $10,000 per year (she wouldn’t reveal the actual cost) with a company that produced an annual AVE for the office’s media clips. Of course, the AVE was huge—many hundreds of thousands of dollars as I recall—and she advertised that amount to her superiors as a meaningful quantitative measure of media relations success.
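To make concrete just how crude the AVE arithmetic is, here is a minimal sketch in Python. The outlet names, story sizes and ad rates are invented for illustration; real AVE vendors use their own rate cards, but the underlying computation is essentially this multiplication.

```python
# Hypothetical sketch of the arithmetic behind an "advertising value
# equivalent" (AVE). All outlets, sizes and rates below are invented.

def ave(clips):
    """Value each clip at what the same space would cost as paid
    advertising: size times the outlet's ad rate for that space."""
    return sum(clip["column_inches"] * clip["ad_rate_per_inch"]
               for clip in clips)

clips = [
    # A short, glowing feature in a major paper...
    {"outlet": "Big Metro Daily", "column_inches": 8.0,
     "ad_rate_per_inch": 250.0},
    # ...and a long, largely negative story in a small weekly both
    # count as positive dollar value -- nothing in the formula sees
    # tone, credibility, placement or audience.
    {"outlet": "Small Town Weekly", "column_inches": 20.0,
     "ad_rate_per_inch": 15.0},
]

print(ave(clips))  # 2300.0
```

The shortcomings cataloged by critics all trace back to this single multiplication: it measures purchased space, not credibility, impact, tone or strategic value.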

But AVEs are very poor measurements for many reasons. The best-articulated case against them that I’ve found is a post on the blog MetricsMan that I recommend reading. Basically, MetricsMan declares AVEs invalid because:

  • They don’t capture the inherent value of news articles as credible, independent validation of a story, as opposed to the paid appearance of an ad.
  • They don’t measure the impact of an article on a reader.
  • They don’t take into account other important media relations services such as strategic counsel, crisis communications and viral campaigns.
  • They don’t measure the value of keeping negative news out of the media or of coping with it in communication terms.
  • They don’t distinguish between articles that appear in publications important to the institution, versus those that are less important. AVEs only count the cost of advertising in the publication.
  • AVEs count as positive in terms of comparison value even those articles that may be predominantly negative.
  • There is no way to calculate the value of a hit on the front page of a newspaper or a cover story in a magazine, because ads aren’t sold in those places.
  • AVE results may be going up when other legitimate communication measures, such as communication of messages, or share of positive coverage, may be going down.
  • AVEs don’t cover such non-traditional media as blogs or positive conversations on social networking sites.

In our e-mail discussion, veteran research communicator Rick Borchelt summarized the problem of quantification by telling our fellow PIO:

I think the take away message is that there is no really good quantitative metric for media relations success, since media relations is/are an assessment of your relationships with media, not with how much ink they spill about you. You can’t really say with a straight face most of the time that the release you put on EurekAlert! generated the story in the New York Times that was read by the junior staffer of the senior Senator who put an earmark in the DOE appropriations bill that got the new molecular biology building. What we struggle with is how to prove the negative:  how much worse would a story about the lab have been if you didn’t know the reporter and could talk her off the ledge of a sensational (but inaccurate) story? Or factor in the opportunity cost of giving one reporter an exclusive that pisses off a dozen others. Or how more likely a reporter with whom you have a relationship is to come to the lab for comment on a breaking story out of all the contacts in his Rolodex. These are intangibles.

Another veteran communicator, Ohio State’s Earle Holland, recommended that our colleague ask some basic questions before even beginning to address the measurement issue:

You said that the new boss asked you to “come up with a way to measure how well we’re doing our jobs.”  First, you need to answer the question of “What is your job?” both in yours and his eyes. Otherwise you won’t know what’s there for comparison—you can’t have a metric without a scale to gauge it against…. Is the goal to get mountains of news media coverage? To what end?  Is it to protect the reputation of [the laboratory]—only good news goes out? Is it to motivate actions or opinions of key constituencies—something that’s probably impossible to gauge causally. Or is it to convey interesting, accurate science information to the public because [the laboratory] is a publicly supported enterprise and the public deserves to know? Who are the constituencies you want to reach and which are more important than others—you can’t just say “they all are. ” My point is that you have to know what would be seen as success before you try to measure how successful you are.

To those cogent comments, I would add that when a boss asks for any kind of measurement, a reasonable response is “What will you use that measurement for?” I have always followed a rule that if some piece of data is not necessary for making a specific managerial decision, then it is not worth gathering.

In the case of the news office manager cited above, she declared that “I will use the AVE for our news clips in advocating for our budget.” But in my experience, such information has never had any significant effect on budget-making. Other factors, such as the economic state of the institution, political advocacy, and persuasion have been far more important.

Even given the caveats and complexities of quantification, though, there are some legitimate numbers that PIOs can offer management, as long as they are put in the context of the overall communications program.

For example, Holland and his colleagues in OSU’s Research Communications office produce an annual report that includes numbers: how many stories produced, how many times they appeared in major media, how big the audiences for those publications were, etc. But these numbers are intended only to give a sense of productivity, not to suggest impact.

The report also explains how the stories were distributed—via blogs, posting on EurekAlert!, Newswise, etc.—and quantifies the audiences for those outlets. And the report quantifies the number of visitors to OSU’s research Web sites. Such data are available directly from the news services, and for the Web sites by using Google Analytics. Also, the appearance of news stories on Google News can be monitored using Google Alerts.

Importantly, however, the annual report also documents the research areas of the university from which news came, to demonstrate the comprehensiveness of coverage. And, it discusses the broad range of other ways the office uses stories, interacts with reporters and serves faculty members. Thus, the annual report goes beyond mere numbers to present a full picture of the office’s activities.

Such documentation of productivity is important. Also critical, however, and often neglected, is proactively making sure that key administrators and other audiences are aware of news stories and other communications achievements.

My favorite example of such proactive demonstration is the process that Borchelt established to remedy the lack of visibility for important media stories, when he was communications director at Oak Ridge National Laboratory. “Here was an institution focused on media stories as their goal. So, they would get this great story in the New York Times, and they would mention the story when visiting their congressman, and he’d ask ‘What story?’ ”

Thus, Borchelt began sending major media stories, along with a letter from the director, to important members of Congress, as well as program officers and directors of the DOE, which funds the laboratory. “The letter would say ‘Thank you so much for giving us the opportunity to work on this exciting research that is reported in today’s New York Times,’ ” said Borchelt. “And we would often append the news release, because it tended to have a better explanation of what we were doing; and also because we could acknowledge the funding agency, so they could see that they got credit. It was hellishly labor-intensive, but incredibly useful.” Members of Congress would use the articles in their communications to colleagues and even read them into the Congressional Record.

So, although media relations productivity can be sort of quantified, numbers are not enough. They must constitute only one part of a comprehensive effort to communicate productivity in all its forms to the people who sign the paychecks.

The Seismic Changes in Science Communication: “Radio In Vivo” Interview

4 03 2010

I was interviewed about the extraordinary changes facing science communication, and about Explaining Research, on the science radio program Radio In Vivo, WCOM-FM, on March 3, 2010.

The discussion with host Ernie Hood explored the new pitfalls and opportunities facing scientists, public information officers and journalists in communicating research to important audiences—colleagues, potential collaborators in other disciplines, officers in funding agencies and foundations, donors, institutional leaders, corporate partners, students, legislators, family and friends, and the public.

AAAS meeting: A Wrongheaded Myth Still Hinders Scientists’ Communication

20 02 2010

To my delight as a research communicator, the American Association for the Advancement of Science chose “Bridging Science and Society” as the theme of its 2010 meeting. And I was heartened by the meeting’s call “on every scientist and engineer to make their work both beneficial and understandable.”

And indeed, as I scanned the program I found that many sessions were devoted to helping scientists communicate their research to a broader audience.

Unfortunately, the meeting also revealed evidence that the scientific community still clings to the myth that it is the public’s lack of respect for scientists that hinders their communication, and not science’s own lack of a culture of explanation. Witness President Peter Agre’s statement in his opening address that

“I think we have a big challenge in science because the public often views us as nerd-like individuals in lab coats, consumed with equations, data-driven, and actually less than the humans and the passionate humans that scientists really are.”

Nothing could be further from the truth, as I demonstrated in my article “Scientists are Heroes.” While the lab coat is, indeed, a badge of scientists, and their data-driven nature is part of their public perception, those are symbols of honor, not derision. As I showed in that article, public polls and Hollywood movies from Avatar to Indiana Jones overwhelmingly depict scientists as dynamic heroes. And popular television shows including the CSI series and Numb3rs also portray scientists as “passionate humans.”

I contend that scientists use this myth of the denigrated scientist as one excuse to avoid confronting and correcting science’s own serious lack of a culture of explanation, as I discuss in the introduction to Explaining Research. They argue erroneously that “Since the public doesn’t respect us, why should we fight an uphill battle to explain our research?”

Besides this myth, there are other fundamental reasons for science’s cultural deficit of explanation, and since writing the book, I have come to understand them better:

For one thing, science and engineering are unlike such professions as law and medicine, in that there is no immediate need to explain their field to lay audiences in order to have a successful career. Imagine what would happen to a lawyer who couldn’t effectively explain principles of law to juries or clients. Imagine what would happen to a doctor who was inept at explaining medical problems to patients.

In contrast, the career success of scientists and engineers depends almost completely on their ability to communicate with technical audiences–colleagues, deans, laboratory heads, etc. Even scientists who teach undergraduate classes are not judged heavily on their success at lay-level communication in those classes. At least, I have never heard of a researcher denied tenure because his or her teaching was not up to snuff.

So, it will take more than the fear of career failure to prompt scientists and engineers to reach out to the public. They must take a broader view that such communication does ultimately help their career, as well as their field and their society.  For example, a greater public appreciation of science and engineering helps persuade donors and legislators to support science. And it helps the voices of scientists and engineers be heard in public debates over such science-related issues as childhood vaccinations.

But an important first step toward creating a healthier culture of explanation is for scientists to abandon the corrosive myth that the public doesn’t respect and admire them.

Toyota Recall: You Can’t Fix Stupid

3 02 2010

Regardless of whether Toyota finds and fixes the flaws in its cars that cause uncontrolled acceleration and faulty braking, its mismanagement of communications has been an unmitigated disaster that alone will cost the company huge financial losses and a severely damaged reputation. Or, as the old saying goes, “You can’t fix stupid.” Reuters has published one good account of Toyota’s PR problems.

Perhaps the only slightly silver lining in this dark cloud is that Toyota’s blunders offer useful communication lessons for anybody facing a crisis. Some examples:

  • Toyota took far too long to respond publicly to the problem. Its CEO did not comment for months after the problems began, and then only informally. A smart communications plan would have had him speaking on YouTube and at news conferences from the time the crisis first began.
  • Toyota should have offered clear steps its customers could take to protect themselves.  In an article in the Fort Myers News Press, public relations expert Tina Matte pointed out that the company didn’t tell consumers what they should do about driving vehicles with the flaws.
  • Toyota was far from transparent. The company only gave the public vague reassurances that it was developing a solution, and it hid the process behind the usual corporate wall of silence. A far more effective, albeit daring, approach would have been to put the company engineers out front, discussing candidly what they knew and didn’t know about the problem as they were working on it. Such candidness would serve not only to reassure the public, but to portray Toyota sympathetically as an open organization, deeply concerned about the quality of its products. Perhaps public discussion might have even brought useful insights from the vast legions of engineers who would have avidly followed the developments.
  • Toyota used 20th-century communication methods. For example, full-page newspaper ads are woefully outdated. And when it did use social media, it did so ineffectively. In the News-Press article, public relations expert Ginny Cooper pointed out that the company’s Facebook page merely redirects readers to the recall Web page on Toyota’s main site. Cooper recommended a separate tab on the Facebook site addressing the recall. And, she said, there should be a separate Web site for the recall.
  • Finally, and perhaps most importantly, Toyota should have apologized early and often. Not only were apologies slow to come, but PR consultant Masato Takahashi pointed out in the Reuters article that the first apology by a headquarters executive did not include a deep bow, a standard gesture in Japan when a firm admits responsibility for a mistake. (Update: CEO Akio Toyoda finally issued a formal apology, complete with bow, on February 5, 2010, in his first formal remarks about the problems.) Corporate lawyers often advise against such public apologies, arguing that they establish fault and weaken a company’s position in lawsuits. However, lawyers don’t take into account the full business impact of apologies. I suspect that early, frequent, heartfelt apologies would have been far more financially beneficial to Toyota–both in reducing lawsuits and in maintaining reputation. Certainly, the medical profession has found that apologies reduce malpractice suits.

It will be fascinating to watch how Toyota continues to handle their crisis communications, and whether their reputation recovers. For more tips on handling a communication crisis, see this tipsheet on the Explaining Research site.

“Dueling” Monkey Robot Videos Offer a Lesson in Communicating Research

8 10 2009

It’s rare that I get a chance to show vividly how an ill-advised media policy can compromise communication of a piece of research. But two videos about development of brain-machine interfaces enable me to do just that. One video, below, covers a story about Duke University neurobiologists using the brain activity of a monkey to control real-time walking patterns in a robot in Japan.

And a second video covers a story about University of Pittsburgh researchers using a monkey’s brain signals to control a robot arm, to enable the monkey to feed itself.

As you watch these videos, judge which one you think most effectively portrays the research and why. And guess which video received the most media play. (Although the research advances were both reported in 2008, I’ve just had a chance to do the analysis of their coverage.)

Both the Duke and the Pittsburgh research efforts are certainly important and newsworthy. But I think it’s fair to say that the Duke achievement–given the complexity of the walking behavior involved–is the greater of the two.

However, because Duke’s media policy severely limited what its video could portray, its research received far less media attention than the Pittsburgh story. Duke strictly prohibits any photos or videos depicting the use of animals in research, even if those images are central to the story. Thus, the Duke video could only feature an animation of the monkey on the treadmill, and not footage of the actual experiment. Not only was the animation primitive, but it was factually misleading. The “monkey” is anatomically inaccurate, looking more chimpanzee-like than rhesus-like. And the animation shows no evidence of electrodes attached to the monkey, leaving viewers to wonder how the signals got from the monkey’s brain to the robot.

In sharp contrast, the Pittsburgh video shows dramatic footage of the real monkey operating the robot arm to feed itself. The researchers cleverly avoided complaints from animal rights activists by obscuring the brain electrodes behind a piece of equipment. Also, it’s clear from the monkey’s behavior that it is perfectly comfortable.

Duke’s use of animation was also counterproductive in that it raised doubts among viewers, with one commenting: “This looks like BS. I’m not saying that it is, but the monkey isn’t real, the robotics footage is looped and shows feet never making solid contact with the ground, this may as well be fake as the stupid monkey.”

So, which video received the most coverage? Pittsburgh’s by far. Besides being featured on network news shows, it was posted on Web sites including BBC, PBS Online NewsHour, NPR’s Science Friday, and Reuters. A subsequent video was also featured on the National Geographic Web site. By contrast, the Duke footage showed up only on Reuters.

The New York Times did use videos in its Web coverage of both the Duke story and the Pittsburgh story, but the latter online story was made much more compelling by inclusion of its accompanying video.

But does it really matter that Pittsburgh’s research received more media attention than Duke’s? Yes, it does, given that media coverage reaches a wide range of key decision-makers, from donors to funding agency administrators to legislators. And it matters because such media coverage also reaches prospective collaborators and other important scientific constituencies. Finally, increased media coverage influences scientific citations, as I discuss in the introduction to Explaining Research.

Duke’s refusal to depict animal research also has a broader moral and ethical dimension. By hiding such work, the university evades its responsibility as a major research institution to emphasize the importance of animal research to medical advances—ironically aiding the cause of animal rights groups that oppose such research.

As an aside, an informal poll I did of major research universities found a broad spectrum of policies on depicting animals in research; but Duke was very much at the extreme end of that spectrum in its outright prohibition.

The broad lesson, I think, to be drawn from this case is that administrators need to get beyond their own expediency—perhaps even timidity—in setting communication policy. They should consider their broader responsibilities to their institution, its researchers, and science as a whole.

Disney Can Teach Lessons in Communicating Science

3 05 2009

You might think of Disney World as merely a vacation destination, with or without the requisite kids. But I’ve found that Disney really has much to teach researchers about communicating science and technology. And I don’t just mean tricks to hold the attention of squirmy school kids during a school science talk.

Disney World offers lessons about communicating research that can make your seminars, Web content, and articles more engaging and thus more effective. I really didn’t come to appreciate what Disney World can teach about explaining research until my latest trip. So, I dedicated the visit to exploring science communication Disney-style and how it offers take-home lessons that researchers might find useful.

Disney communicates science so effectively because its “imagineers” understand that just providing information is not enough. In creating Disney World, they understood that audiences need more than just information; they also need motivation to take in that information. And wherever possible, the imagineers offer audiences an involving experience that makes the information memorable. Researchers attempting to explain their work usually miss out on these benefits because they neglect motivation and experience in their communications.

For example, you likely see your departmental seminar as purely an informational event meant to convey as clearly as possible your latest experimental results. But because your audience comprises real people, reaching them most effectively also means motivating them and giving them an engaging experience.

To show what I’m talking about, here are examples from my Disney World visit, along with ideas on how you might apply them to make your research communications more effective. First, how Disney uses motivation:

As you might expect, Disney effectively motivates by injecting whimsy and humor into its science communication, especially through its cartoon characters. In EPCOT’s The Land pavilion, for example, the “Circle of Life” movie uses characters from The Lion King to engagingly convey the need for environmental preservation. And in “The Seas with Nemo and Friends” pavilion, the entry ride superimposes cartoon characters over the facility’s real-life aquarium. (By the way, Disney offers useful Web pages giving an overview of its “Environmentality” and Education programs.)

In motivating visitors, Disney also knows how to take advantage of teachable moments to explain science. For example, bathrooms in the Animal Kingdom’s Conservation Station have “Whiz Quiz” plaques posted over the urinals and on the stall doors. The Whiz Quiz over the men’s room urinals asks “How much do elephants pee?” (20 gallons) and “How far can rhinos and tapirs pee?” (15 feet).

While waiting for Animal Kingdom’s Dino-Land Dinosaur thrill ride, visitors hear a concise explanation of the meteorite impact believed to have caused the extinction of the dinosaurs. The ride itself is introduced by scientists including a black woman. And during the ride, the narrator calls out the species names of the animatronic dinosaurs as they menace the riders. The Dino-Institute also displays the science-friendly slogan “Exploration, Excavation, Exultation.” (Unfortunately, one sign in Dino-Land’s Chester & Hester’s Dino-Rama carnival area struck a mildly anti-science note: it showed a cartoon dinosaur ejecting a white-coated scientist, with the caption “Scientific? Nope. Terrific!”)

In Downtown Disney, the T-Rex Café offers another good example of Disney’s grasp of the motivating teachable moment. Diners are surrounded by a collection of animatronic dinosaurs that periodically erupt with roars and movement, giving them a feel for what real dinosaurs must have been like. The cafe also offers the educational “Paleo-Zone,” which includes an archeological dig and educational video games.

Certainly, in your communications you can’t summon animatronic dinosaurs to create teachable moments. Nor would you probably want to post your research abstracts on bathroom stalls. But you can create other teachable moments to offer audiences information about your work. You could post articles or displays in the waiting rooms, hallways, and cafeterias of your building–and not just leftover posters from meetings, but displays tailored for important visitors, from students to donors.

You could add to your Web site a category of links to interesting background articles, FAQs, Q&As, videos, and other content. This material could come from your professional association or funding agency. The Explaining Research Web site has a list of such sources.

Also, consider creating a Facebook page or blog on which you record interesting news about your work, as covered in Explaining Research.

In looking for places to create teachable moments, think like Disney. Ask yourself what venues your audiences frequent, and what kind of information might be appropriate for those venues. Now, on to examples of how Disney uses experience to communicate:

For many visitors to EPCOT, their first experience is the Spaceship Earth ride inside the giant geodesic dome. This “dark ride” takes visitors past animated tableaus depicting the history of communication – for example, an animatronic man pounding papyrus into a flat sheet and printers operating the first printing press. However, the most involving experience comes near the end of the ride. Visitors see on the computer screen in their ride pod an image of their face inserted into a scenario of life in the future. When the ride ends, they then emerge into an “interactive playground,” in which, for example, they can assemble a human body in 3D.

The Animal Kingdom also includes a multitude of science-related experiences. Besides such rides as the Kilimanjaro Safaris through the park, visitors walking the pathways might encounter an explainer carrying an animal–we saw a caged spider–who can answer questions about its biology and behavior.

Visitors can also see how the park’s animals are cared for. They can peer through a window into the veterinary center in The Animal Kingdom’s Conservation Station in the Rafiki Planet Watch to watch animals get checkups and medical procedures.

Disney also makes its experiences multisensory, for example effectively using sound. “The Song of the Rainforest” comprises a set of dimly lit booths in which visitors don headphones to hear ultra-realistic rainforest sounds of animals, insects, chainsaws, and falling trees. The Planet Watch also offers information on projects visitors can do to create their own backyard animal habitats.

While you can’t bring such elaborate experiences to your audiences, you can come surprisingly close. When you give a seminar or talk, bring along an organism, mineral sample, instrument, or other object your audience can see, or even handle. If your work involves an interesting sound, include it in your presentation. Perform an engaging demonstration; or if there is some relevant experiment your audience can do on their own, offer a handout or Web URL describing the experiment. Show a video of an experimental procedure. Even if that procedure is relatively mundane, the show-and-tell will make your talk more memorable. Direct your audience to an interesting place in the area where they can encounter an aspect of your work, for example a rock outcropping in a park or a museum exhibit. Ask for a volunteer from the audience, and use them in a demonstration, preferably a non-destructive one.

Also, think about ways to make your laboratory building experiential. You might install display cases with examples of your work. Or, if there is a public window into your laboratory, you might post information on the instruments and procedures that take place there. This assumes that your lab techs don’t mind having people peering at them.

It has always surprised and disappointed me how bereft laboratory buildings are of information and exhibits on the work going on there. This educational sterility makes for an unwelcoming atmosphere for audiences who might be interested in the work. Creating a version of a motivational, experiential “Disney World” in your laboratory has definite value in advancing your work. You never know when it might attract a passing student, colleague, administrator, or donor to become involved in your research.

In stressing Disney World’s use of motivation and experience, I don’t mean to imply that it fails to provide information. For most visitors, that information is packaged as modest nuggets embedded craftily in the fun experience. However, for those who want in-depth information, Disney does offer more extensive encounters. For example, most visitors in the Land pavilion are content with the short boat ride through the greenhouses, sliding quickly past displays of farming techniques such as aquaculture, hydroponics, and aeroponics. But visitors who want more can take the in-depth Behind the Seeds tour to learn about those techniques in more detail.

We took the Behind the Seeds tour, and besides learning more about the farming methods used, our enthusiastic, articulate agronomist guide showed us techniques of integrated pest management and tissue culture. We also tasted hydroponically grown cucumbers, and smelled samples of coffee, vanilla, pepper, and other crops grown in the giant greenhouses.

Disney certainly has far more resources at its disposal than you do to motivate visitors and give them memorable experiences. But with even a modest effort, you can make your talks, Web sites, articles, and videos more than just a Mickey Mouse production.

Welcome to The Research Explainer

29 08 2008

The Research Explainer aims to offer scientists, engineers, students, and public information officers a useful discussion of the latest developments in the technologies and techniques for explaining their work.

Dennis Meredith