Thursday, October 30, 2008

Obama very close to victory

Source: Jornal de Brasília, 30/10/2008

Saturday, October 18, 2008

Back to Miller's experiment - Science 2008

Source: Science, vol. 322, p. 404.

Image manipulation and fraud - Science

Science 17 October 2008
Vol. 322, no. 5900, p. 356
DOI: 10.1126/science.322.5900.356
.
SCIENTIFIC MISCONDUCT:
Falsification Charge Highlights Image-Manipulation Standards
.
Gretchen Vogel (*)
.
Controversy continues to plague work from the lab of prominent stem cell researcher Catherine Verfaillie. The University of Minnesota (UM) announced last week that an academic misconduct committee had concluded that Morayma Reyes, while a graduate student in Verfaillie's lab there, "falsified" four data images in figures in a 2001 stem cell article. The committee found that misconduct allegations against Verfaillie were unsubstantiated, but it did criticize her oversight and mentoring of lab personnel. The new charges come a year after questions were raised about the misuse of images in another key stem cell publication from the group (Science, 02 March 2007, p. 1207).
.
Reyes, now an assistant professor of pathology at the University of Washington (UW), Seattle, and Verfaillie, who now heads the Stem Cell Institute at the Catholic University of Leuven in Belgium, both acknowledge that errors were made in the preparation of the 2001 paper. But Verfaillie defends her supervision, and Reyes says that for several of the disputed images she merely globally adjusted the brightness and contrast in data images without any intent to deceive. "These errors were unintentional and were common and accepted practices at the time," Reyes wrote in an e-mail to Science.
.
The paper, published in Blood, claims that stem cells purified from human blood can form precursors of bone, fat, cartilage and muscle cells, as well as the endothelial cells that line blood vessels. At the time, blood stem cells weren't thought to be that versatile. Verfaillie and Reyes say the figure errors do not alter the Blood paper's conclusions, but Verfaillie has asked the journal to retract the paper, calling it "the proper course in this situation."
.
The Blood paper relates to work that the group later published in Nature, reporting that cells from mouse bone marrow could become a wide variety of cell types. Several groups have reported trouble reproducing that paper's results (Science, 09 February 2007, p. 760). Then last year, Nature conducted a re-review of the paper when a journalist at New Scientist questioned whether some data shown were identical to those in another paper. A UM investigation concluded that any duplication was the result of honest error. Nature published several corrections but said that the paper's conclusions were still valid, and Verfaillie continues to stand by the work.
.
Figure credit: Reyes et al., Blood 98, 9 (1 November 2001) © The American Society of Hematology
.
New Scientist also alerted the university to an apparent duplicated image in the Blood paper (Science, 30 March 2007, p. 1779). The university then convened a new committee, which submitted its final report on 5 September. The school last week stated that the committee found that in four of the seven figures in the Blood paper, "aspects of the figures were altered in such a way that the manipulation misrepresented experimental data and sufficiently altered the original research record to constitute falsification." The committee cited "elimination of bands on blots, altered orientation of bands, introduction of lanes not included in the original figure, and covering objects or image density in certain lanes," the statement says.
.
The university has not released the full report, citing privacy laws, and experts in image analysis say it is hard to determine intentional fraud solely from the original paper. James Hayden, manager of the microscopy core facility at the Wistar Institute in Philadelphia, Pennsylvania, says that to make a clear point, scientists often alter images, sometimes more than they should. Good laboratory practice means all such adjustments should be noted in a paper and copies of the original image files kept, he says. Jerry Sedgewick, head of the Biomedical Image Processing Lab at UM and one of Reyes's mentors, says he is not convinced that she did anything wrong with the image adjustments she made. "This is done routinely and has been done since film and imaging began," he says.
.
During the investigation, Reyes asked George Reis, who heads the consulting firm Imaging Forensics in Fountain Valley, California, to assess whether changes made between the original image scans and the published images could be due to "global" adjustments, which would imply there was no intent to deceive. Reis told Science that he did determine that significant global adjustments could account for "most of the changes in most of the images." But he says he did not examine the images specifically for signs of editing such as adding or deleting individual lanes.
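.
To illustrate the distinction Reis is drawing, the short Python sketch below contrasts a "global" adjustment, which applies one brightness/contrast transform to every pixel of an image, with a local edit that overwrites a single region (for example, covering a band on a blot). The function names and numbers are illustrative assumptions only; this is not the forensic procedure used in the investigation.
.
import numpy as np

def global_adjust(image, contrast=1.2, brightness=10):
    # A "global" adjustment: one linear transform, new = contrast * old + brightness,
    # applied uniformly to every pixel.
    return np.clip(contrast * image.astype(float) + brightness, 0, 255).astype(np.uint8)

def local_edit(image, rows, cols, value=0):
    # A local edit: overwrite only a rectangular region, leaving the rest untouched.
    edited = image.copy()
    edited[rows[0]:rows[1], cols[0]:cols[1]] = value
    return edited

rng = np.random.default_rng(0)
original = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)

adjusted = global_adjust(original)                   # changes pixels everywhere, by one rule
tampered = local_edit(original, (10, 20), (10, 20))  # changes pixels only in one region

print("pixels changed by global adjustment:", np.count_nonzero(adjusted != original))
print("pixels changed by local edit:", np.count_nonzero(tampered != original))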
.
UM says it has forwarded the panel's report and supporting materials to the federal Office of Research Integrity in Rockville, Maryland. UW is waiting for more information from UM before deciding whether to discipline Reyes, according to a spokesperson.
.
Both Verfaillie and Reyes say they have implemented much stricter rules for dealing with data images in their labs as a result of the case. "I have learned a hard lesson," Reyes e-mailed Science. "Now that I am a mentor … I will make sure that my students will get the proper training, supervision and education."
.
*: With reporting by Rachel Zelkowitz.

Friday, October 17, 2008

O País dos Petralhas (book launch in Brasília)

Source: Jornal de Brasília, 17/10/2008

Wednesday, October 15, 2008

The Misused Impact Factor - Science Editorial

Science 10 October 2008:
Vol. 322, no. 5899, p. 165
DOI: 10.1126/science.1165316

Editorial:
The Misused Impact Factor


by Kai Simons
.
Research papers from all over the world are published in thousands of science journals every year. The quality of these papers clearly has to be evaluated, not only to determine their accuracy and contribution to fields of research, but also to help make informed decisions about rewarding scientists with funding and appointments to research positions. One measure often used to determine the quality of a paper is the so-called "impact factor" of the journal in which it was published. This citation-based metric is meant to rank scientific journals, but there have been numerous criticisms over the years of its use as a measure of the quality of individual research papers. Still, this misuse persists. Why?
.
The annual release of newly calculated impact factors has become a big event. Each year, Thomson Reuters extracts the references from more than 9000 journals and calculates the impact factor for each journal by taking the number of citations to articles published by the journal in the previous 2 years and dividing this by the number of articles published by the journal during those same years. The top-ranked journals in biology, for example, have impact factors of 35 to 40 citations per article. Publishers and editors celebrate any increase, whereas a decrease can send them into a huddle to figure out ways to boost their ranking.
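.
As a back-of-the-envelope check on that definition, the Python sketch below computes a two-year impact factor from made-up counts; the numbers are chosen only so the result lands in the 35-40 range mentioned above and do not describe any real journal.
.
def impact_factor(citations_to_prev_two_years, items_published_prev_two_years):
    # Citations received this year to items from the two preceding years,
    # divided by the number of citable items published in those two years.
    return citations_to_prev_two_years / items_published_prev_two_years

# Hypothetical journal: 12,000 citations in 2008 to the 320 articles it published
# in 2006 and 2007.
print(impact_factor(12000, 320))  # 37.5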
.
This algorithm is not a simple measure of quality, and a major criticism is that the calculation can be manipulated by journals. For example, review articles are more frequently cited than primary research papers, so reviews increase a journal's impact factor. In many journals, the number of reviews has therefore increased dramatically, and in new trendy areas, the number of reviews sometimes approaches that of primary research papers in the field. Many journals now publish commentary-type articles, which are also counted in the numerator. Amazingly, the calculation also includes citations to retracted papers, not to mention articles containing falsified data (not yet retracted) that continue to be cited. The denominator, on the other hand, includes only primary research papers and reviews.
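.
A small worked example in Python (with made-up numbers) shows how that asymmetry pays off: citations to commentary-type articles are added to the numerator, while the commentary pieces themselves are not counted as citable items in the denominator.
.
research_papers = 300             # citable items counted in the denominator
citations_to_research = 9000      # citations to those papers
citations_to_commentary = 1500    # citations to commentary-type articles

without_commentary = citations_to_research / research_papers
with_commentary = (citations_to_research + citations_to_commentary) / research_papers

print(round(without_commentary, 1), "->", round(with_commentary, 1))  # 30.0 -> 35.0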
.
Why does the impact factor matter so much to the scientific community, further inflating its importance? Unfortunately, these numbers are increasingly used to assess individual papers, scientists, and institutions. Thus, governments are using bibliometrics based on journal impact factors to rank universities and research institutions. Hiring, faculty-promotion, and grant-awarding committees can use a journal's impact factor as a convenient shortcut to rate a paper without reading it. Such practices compel scientists to submit their papers to journals at the top of the impact factor ladder, circulating progressively through journals further down the rungs when they are rejected. This not only wastes time for editors and those who peer-review the papers, but it is also discouraging for scientists, regardless of the stage of their career.
.
Fortunately, some new practices are being attempted. The Howard Hughes Medical Institute is now changing its evaluation practices: it considers only a subset of publications, chosen by the scientist, for the review board to evaluate carefully. More institutions should determine quality in this manner.
.
At the same time, some publishers are exploring new practices. For instance, PLoS One, one of the journals published by the Public Library of Science, evaluates papers only for technical accuracy and not subjectively for their potential impact on a field. The European Molecular Biology Organization is also rethinking its publication activities, with the goal of providing a means to publish peer-reviewed scientific data without the demotivating practices that scientists often encounter today.
.
There are no numerical shortcuts for evaluating research quality. What counts is the quality of a scientist's work wherever it is published. That quality is ultimately judged by scientists, raising the issue of the process by which scientists review each other's research. However, unless publishers, scientists, and institutions make serious efforts to change how the impact of each individual scientist's work is determined, the scientific community will be doomed to live by the numerically driven motto, "survival by your impact factors."
.
====================================
Kai Simons is president of the European Life Scientist Organization and is at the Max Planck Institute of Molecular Cell Biology and Genetics in Dresden, Germany.

Professor Heloisa Miranda, IB, UnB


UnB owes R$ 200 million in labor liabilities

Source: Jornal de Brasília, 15/10/2008.

Tuesday, October 14, 2008

Impact Factor Fever - Letter to Science

Science 10 October 2008
Vol. 322, no. 5899, p. 191

Letters
Impact Factor Fever


In a recent Editorial ("Reviewing peer review," 4 July, p. 15), B. Alberts et al. addressed the most important problem affecting the scientific community today: the incredible pressure to publish that stems from the "publish or perish" philosophy. Scientific quality is bound to suffer when scientists focus only on their publication records.
.
As an author, reviewer, and editor of a small international scholarly journal, I have noticed a dramatic increase in plagiarism, "salami-slicing" science, and other kinds of research misconduct over the past few years.
.
I fully agree that the peer-review process should be revised in order to reduce its length and make it less agonizing for authors, reviewers, editors, and readers (1). Some of the methods suggested in the Editorial, such as sending reviews on to other journals and enlarging the pool of referees, are certainly needed and will hopefully be successful. However, Alberts et al. failed to mention what is perhaps the most debilitating illness plaguing the scientific community, which I call the "impact factor fever." The exacerbated pressure to publish we all suffer from is induced by an exaggerated reverence for the impact factor.
.
Scientific achievement cannot be soundly evaluated by numbers alone. As Albert Einstein reputedly said, "Not everything that can be counted counts, and not everything that counts can be counted." How long must we wait until an antidote against the impact factor fever is developed?
.
Paolo Cherubini
Dendrosciences
WSL Swiss Federal Research Institute
CH-8903 Birmensdorf, Switzerland
.
Reference
1: M. Raff, A. Johnson, P. Walter, Science 321, 36 (2008).
.
Related Editorial:
Reviewing Peer Review
Bruce Alberts, Brooks Hanson, and Katrina L. Kelner
Science 321 (5885), 15 (4 July 2008). DOI: 10.1126/science.1162115
.
Source: http://www.sciencemag.org/cgi/content/full/322/5899/191b

Saturday, October 11, 2008

The co-authorship club

Valor Econômico - 10/10/2008
by Carla Rodrigues

Excerpt from an article in the newspaper Valor Econômico

Source: Valor Econômico, 10/10/2008

Marcos Valério is arrested

Source: Jornal de Brasília, 11/10/2008.

Wednesday, October 8, 2008

Wednesday, October 1, 2008