The peer-review principle for scientific publications is a well-established and sound approach to ensuring publication quality … in theory. In practice it regularly proves leaky. Here is a current example, reported by Retraction Watch …
Please forgive an old trapper this melancholic flashback. But one of the most inspiring projects of my career was the successful migration of a business publication from print to online around the turn of the millennium. “Inside-Lifescience” was a multi-channel online magazine covering the latest news and trends in life science, pharma and biotechnology.
But let’s start with the roots. The publication was originally developed in early 1999 by a leading German publisher of specialized journals in the fields of life sciences/medicine and the information broker business that I had just founded. The publisher’s intention was to establish a periodical information resource reflecting the emerging European biotechnology industry. We – with our know-how in information research combined with in-depth knowledge of life science technologies and the industry – were considered the right content partner for this project. The combination of both areas of expertise produced the printed monthly newsletter “BIONEWS”, which mainly included news from all over the world arranged in categories, complemented by an event calendar, links to web sources, and an editorial. “BIONEWS” contained no advertising and was financed exclusively by subscriptions.
As the aim of “BIONEWS” had always been to cover current trends and to be as up-to-date as possible, we soon realized that a print publication has natural limits regarding timeliness. With a monthly frequency, the news for a single issue was collected over a couple of weeks. Layout, typesetting, printing, and delivery needed at least an additional week. So by the time readers held their copies in their hands, some news items were already four or more weeks old. Not exactly highly topical! The only way for a print publication to overcome this limitation would have been to shorten its publication intervals. But that would also have multiplied the operating effort and costs.
So, what alternatives did we have? After some discussion we finally decided to move online. This sounds obvious from today’s perspective, but at the time it was absolutely not. Still, honestly speaking, the facts spoke for themselves:
- an online publication could be updated more regularly (up to several times a day)
- the editors could react more flexibly to incidents of immediate interest to readers
- there were no more regular expenses for setting, printing, and delivery
- production could concentrate on content, not on layout
- the production process could be improved through content management technologies
- new database-based products became feasible
- the basic content was free to readers because the financing relied on advertising and enterprise services
As a result, the whole production process – from initial content research to archiving – was improved … resulting in a new product and new services at lower costs.
But lower running costs came at the price of substantial set-up expenses. While the print version could be produced via the publisher’s standard production path, the online version needed a completely new infrastructure. We found this in an information management system that was able to channel incoming as well as outgoing information and allowed us to publish content on the web automatically. The system could also mail electronic newsletters, send SMS messages, and fill WAP pages in parallel to the HTML pages (for generation Y: WAP was an early technology to make web pages visible on mobile devices with – at that time – minimalistic displays). Further technical problems had to be solved: “Inside-Lifescience”, the new name of the publication, needed a web server, and a reader-oriented web layout had to be developed.
Setting up a new information system had not only a technical but also a psychological dimension. Established working habits needed to change. System users (the editorial staff, for example) needed an introduction to the new software. The internal “routing” of information was changed, and more information had to be shared internally. I am sure all of you know the sentence “But we have always done it this way, and it has always worked fine!” I was lucky to have a quite young team showing the flexibility necessary to manage those changes successfully.
The publisher now took on the marketing part of the project. They were an established professional marketing partner within the life science industry. They had the contacts to sell banner placements as well as corresponding “Inside-Lifescience” enterprise products (like content delivery for company web sites). But they also had to learn, because selling an online banner is not the same as selling advertising space in print journals. So the project was a challenging and exciting experience for both partners – at the time, somewhat comparable to the joint exploration of a new continent.
One important aspect should not be forgotten, as it still prevails. Despite all the new-media euphoria, we did not want to close our eyes to reality. In those early days, only a few online journals and information portals were substantially profitable. Online publishing was not really established in terms of return on investment. One reason may have been that internet users were used to getting information and content for free, and many people did not really acknowledge the value of high-quality information (in my opinion this has not really changed since). Back then I was convinced that there was only one promising strategy to earn the money needed to maintain an online information service: accompanying products and co-operations. The few financially sound online projects, like “Focus Online” in Germany, showed that this was the way to success at that time. “Inside-Lifescience” already had an advantage here, because it naturally cooperated with a variety of print journals that were already under the roof of the publishing partner.
Finally, “Inside-Lifescience” launched as a true multi-channel publication with a web magazine, an email newsletter, a mobile edition, an AvantGo channel (at that time for PDAs), and an SMS alert service. And most importantly … with exciting, interesting, relevant and up-to-date quality content. We offered an always current view of the biotechnology industry and had external industry insiders providing editorials. “Inside-Lifescience” lived on as a successful online magazine with thousands of readers for a couple of years. It was discontinued when the collaboration ended due to a takeover of the publishing partner. We kept the online platform for a few more years as our corporate publication for clients and stakeholders, resulting in some major project acquisitions. But that is another story.
Revised version of the article “Moving Online – Developing an online information portal”, originally published in October 2000 by Business Information Searcher, ISSN 1365-5760
Journalists are mediators. And they are translators. Take me as an example. It is my job as a scientific journalist to translate scientific content for the public, so that people can understand what things like “cloning” and “genetic engineering” are. And, well, I am trying my best, and it truly is an advantage for me to be a trained molecular biologist. I understand scientific subjects as well as the technical terminology of the biosciences.
But what about my non-scientific colleagues? If a general magazine journalist is assigned to write about – let’s say – Dolly the sheep, does he really have a chance to produce something meaningful? It is hard enough for him to understand the details … and yet we expect a well-founded judgement. This colleague, however, is a translator to the public – like a Chinese-English translator who never learned any Asian language and is working with a 1970 edition of a common dictionary (and don’t ask him about the Chinese characters). Taking this into account, can we really be surprised that the public opinion of biotechnology and gene technology is so bad in Europe?
This was also a major point at the “Biotech in Europe” session of the recent BIOTECHICA BUSINESS FORUM 2002 in Hanover, Germany. Speakers included Crispin Kirkman (BioIndustry Association, UK), Claude Hennion (France Biotech), Christian Suter (BioValley Basel, Switzerland), Rob Janssen (Netherlands’ Biotech Industry Association) and Hugo Schepens (EuropaBio).
During the discussions, Christian Suter remarked that we are missing true science mediators in Europe. He described fruitful cooperations between journalists and scientists as lucky exceptions. Others added that there is a completely different communication culture in North America, where scientists don’t worry about sitting in a TV show and presenting their views to the public.
I agree. We are really missing true translators and mediators of our content. Where are the colleagues who are able to help journalists understand? Dear scientists, journalists desperately need you! Help them to translate. Go out, be present, and be the bridges across the river between scientific knowledge and society. In my view, many American scientists are highly sensitized to their role and duty in public understanding, which is the basis of public opinion. European scientists are much more afraid of being in the limelight of the media. But – honestly – in my opinion it is part of their (publicly financed) job.
Why do so many European scientists avoid the public? Well, they never learned how. Being a public translator of scientific knowledge is not part of scientific education. Many researchers are simply not able to translate.
It is a matter of terms … and a matter of relevance. Let me explain what I mean by the “matter of relevance”. A true scientist talking about developments in research will never make an absolute statement like “Newton’s apple will definitely never go upwards”. He is always qualifying and seeing things in relative terms, even when there is just a hypothetical 0.0001% chance of an alternative event. Perhaps, one day, Newton’s apple may go upwards. Whether this is relevant or not does not matter; it will always remain a possibility. This basic way of thinking results from the structure of the scientific knowledge-finding process, which is driven by thesis and antithesis.
But for the average man or woman, this “may be” is a sign of uncertainty, in the worst case interpreted as “there is something to it”. The 0.0001% event has become a true and relevant option. Now he expects Newton’s apple to shoot up into the stratosphere, explode there, and finally destroy the earth’s ozone shield.
The conclusion: scientists have to learn to reduce, to focus, and to rate the various options for relevance. People want clear answers, simple explanations and meaningful statements.
Now, let’s talk about the “matter of terms”. Scientists and non-scientists often use the same words but speak different languages. Many scientific terms carry a different meaning, or an additional connotation, for the average person that they do not have for a scientist. The result is that both are speaking to each other, but there is no true communication.
Take the word “sex” as an example. When a scientist uses the word “sex”, he is usually thinking about the gender of the organism he is working with – but most non-scientists at first think about something completely different. Another good example is the word “glauben”, which in German is used both for “to be of the opinion” and for “to believe”. So if biotech managers “glauben” that gene technology is safe, is it their opinion or their belief? But let us focus even more on “genetic engineering” and “gene technology”. For me, the German translation “Gentechnik” carries no emotional weight. In my understanding, the word stands for a scientific method, a lab application. It is not good or bad, it just is. But for the average German citizen, “Gentechnik” has an expanded meaning: it carries negative connotations, it is a bad word, used as if one were talking about the devil’s kiss. Now imagine a molecular biologist and a politician having a discussion about gene technology. They are talking together … but in the end there is no communication. You can observe it in any programme running on a European TV station.
Where are all these communication and public relations agencies serving the life science industries? What have they done during the past years? Well, at least they have lost an important battle: the battle for sovereignty over words. And I suppose they lost because many of them did not really understand the things they were fighting for.
If you want your public relations work to be successful in the fields of life science and biotechnology, it is far more important than in any other branch of business to have in-depth knowledge of the content. Biotechnology and gene technology cannot be treated like other industries. You really have to understand the technologies you are trying to promote. You really have to know the key words and their true meaning, as well as their interpretation by interest groups. And never forget that these words and expressions can have various meanings depending on who is using them!
But where is the way out of this dilemma? Very simple: strike back! Use the words in their true meaning. Use them ‘normalized’. And do not use them only in podium discussions but in your daily life. Speak about biotechnology with your family. Speak about biotechnology with your friends. Speak about biotechnology with your colleagues and business partners. Speak about biotechnology with your children and with their teachers. Speak about biotechnology at your breakfast table and at your barber’s. Speak about biotechnology with your doctor and with his nurse. Speak about biotechnology as if it were the most normal thing in the world. One day it will be. Win back the sovereignty over words! Now!
Revised version of the article “Let’s talk about Sex”, originally published in December 2002 by Inside-Lifescience, ISSN 1610-0255.
Medical microbiologists and hygiene experts have been warning against the excessive use of antibiotics for years now. They observe more and more resistances, driven by the constant misuse of antibiotics.
Now a July 2002 article in the online issue of the journal New Scientist reported that vancomycin – one of the final weapons in the fight against infectious bacteria – has lost its power. A strain of Staphylococcus bacteria that is insensitive to vancomycin treatment was found by medical staff in a hospital in Michigan (USA). Furthermore, VRSA strains (vancomycin-resistant Staphylococcus aureus) had already turned up in Japan.
Bacteria spread (that should not really be a surprise), and during their travels they “talk” with each other (if you allow me to call the exchange of genetic information a talk). Vancomycin-resistant enterococci, bacteria from the gut, have been well known for 15 years. Enterococcus is a low-grade opportunistic pathogen that is completely harmless unless a person has reduced immunity. But scientists already expected to find other species carrying the information for the resistance sooner or later, as a result of “talks with colleagues”.
Staphylococcus aureus is much more of a problem, with greater pathogenic potential. The bacteria are common inhabitants of human skin and the nose. Under normal conditions, the species is a harmless microfloral companion like many others living in and on our bodies. But when Staphylococcus aureus enters open wounds, it can cause infections up to and including – sometimes fatal – blood poisoning.
The discovery of penicillin by Alexander Fleming in 1928 and the subsequent development of a variety of antibiotics helped to fight these infections, as well as a broad range of bacterial diseases that made people around the world suffer until the middle of the last century. The following age of antibiotics was also a period in which the belief that there is a pill for everything dominated common thinking as well as public health programmes, including medical education. But today, common antibiotics no longer work against many ubiquitous germs – many of them already carrying multiple resistances. Take, for example, the methicillin-resistant Staphylococci, an extremely harmful variant resistant to all standard penicillin antibiotics. In these cases, only glycopeptides like vancomycin have remained as the last bastion against often deadly infections. Vancomycin has always been known as the antibiotic of last resort. Another bastion that has now fallen.
If we stay with this military view and language, one has to realize that there have been a lot of furious and quite successful tactical manoeuvres during the past decades, but the overriding strategy is going in the wrong direction. The latest findings of scientific disciplines like population biology and population genetics need to be incorporated much more. And – as bacteria do not care about national borders – there is still a deep need to improve international standards for medical education. Progressive strategies for the use of antibiotics must no longer remain under the control of national authorities alone.
The escalation of the bacteriological arms race has been caused by people who really should know better: doctors. Doctors who do not use the antibiotic tool correctly. In my opinion, the latest findings of medical microbiology and population biology should be far more integrated into university courses and further education.
Let me give you an example. When I was in France last Christmas, I got a painful ear infection. I should explain that an operation some years earlier had left this ear without a tympanic membrane, so every infection is a reason to take care. But I was already used to it and knew the standard procedure from my ‘doctor in charge’ at home in Germany, which always included a detailed diagnosis of the infecting germ. So, coming back to Christmas in France, I went to a local hospital (quite modern and fashionable, by the way) to see an otologist, an ear specialist. I was surprised! Everything went very fast. It seemed that this entire diagnosis business was not really necessary. I finally left the hospital after 15 minutes with a prescription. Later, in the hotel, I took a closer look at the drugs I had been honoured with. I held three heavy-duty antibiotics and cortisone in my hands. Someone seemed to think that I was really close to death. And that without any state-of-the-art diagnosis of the causing agent – whether it was viral, bacterial or fungal.1
Sorry for my frank words, but in my opinion this was irresponsible conduct. Not irresponsible with regard to the single patient (me, in that case), but with regard to a general health situation that sooner or later affects every ‘single patient’ again. My wife – who is French, by the way (“oh là là”) – later told me that this is normal behaviour in France. Many doctors prescribe antibiotics for everything, even a simple viral (sic!) cold. And this is certainly not only a French problem. Meanwhile I have heard similar reports about some German practitioners who take the easy way out by prescribing pills without any clear diagnosis, too. And in the United States, can everybody really buy antibiotics in a pharmacy on their own, without any prescription?
Hey folks, we are not talking about vitamin C or aspirin! These are drugs that need to be used under strict control and with a definite strategy. These are drugs that are pillars of our health systems. And everybody knows that there is no other group of drugs under such pressure from emerging resistances. I would not waste a thought on some foolish doctors who are not aware of what they are doing. But I am deeply concerned about the basic faults in our health systems that lead to misuse on a wide front.
Yes, I am really angry. I have meanwhile become something of an extremist targeting medics as well as public health systems … as far as antibiotics misuse is concerned. I am sad about the current situation, because if some people had just used their brains instead of their prescription pads, we might not need to worry about it today. Just go to any university hospital and ask about ‘hospital infections’. These people are on the front line. They know the problems we already have. And they have to pay for it.
The Michigan patient infected by the vancomycin-resistant Staphylococcus finally survived after treatment with an “antiquated” antibiotic called chloramphenicol, but – according to the doctors – the VRSA strain’s susceptibility to this drug was a stroke of luck.
We need to change the direction. Now!
1 I should note that in past cases it has always been bacterial or fungal, and I am not sure whether there is any causative viral agent of the middle ear.
Read more …
- Antimicrobial Use and Antimicrobial Resistance: A Population Perspective, Marc Lipsitch, Matthew H. Samore, Emerging Infectious Diseases 8(4), 2002
- CDC Antimicrobial Resistance and Antibiotic Resistance
Revised version of the article “Superbugs knockin’ on the door”, originally published in August 2002 by Inside-Lifescience, ISSN 1610-0255.
Let’s start with a joke. “What do three Germans do when you put them into one room? – They found an association!” In Germany we have associations for everything, down to the smallest village: associations of hen breeders, associations of stamp collectors, associations of local singers, associations of hobby gardeners, associations of wine drinkers, associations of The Kelly Family concert visitors, and so on. Since late 2001 we additionally have the German Society for Proteome Research (DGPF), whose very first founding charter was written down on a beer mat (well, we are in Germany, aren’t we?).
The foundation of the DGPF by scientists and industry representatives was a reaction to recent market and application trends towards protein research. Germany already had strong Proteomics (and protein) research when others were still chasing the holy grail of Genomics. But – to my impression – this was never communicated well. So one major aim of the DGPF will be to raise international awareness of the high level of German Proteomics.
But why are researchers and industry increasingly focused on Proteomics? One of the major disadvantages of Genomics approaches is the missing connection between a gene and its cellular function. The fact that a gene has been sequenced does not tell us the cellular function of the gene product. That is what makes genomic results so difficult to interpret. Even sequence analysis with bioinformatics tools does not yield the full picture. Additional problems arise from the organisation of the genetic information, as well as the fact that only a subset of genes is active in a specific cell at a specific stage.
So scientists are moving to the functional level, to the gene products, to the proteins. And they coined the new term “Proteomics” for the study of the complete set of proteins (functions) of a cell in a specific stage, in analogy to “Genomics”, which addresses the complete set of genes (information) of a cell.
As with other approaches that have a large-scale option in industrial applications (e.g. drug discovery), it will depend on the technology-developing and -supplying industry whether Proteomics gets its chance. When I was doing the research for this article, I had the impression that some companies had just stuck the Proteomics label onto their existing products. This is neither a solution nor does it really fit researchers’ needs. But where are the bottlenecks, and what has to be done?
There is a dramatic increase in complexity when switching from the genetic to the functional level. A gene is a gene is a gene. There is slight variation caused by introns and foreign elements, as well as expression control. But our scientific thinking is dominated by the “one gene – one protein” paradigm, even though the knowledge about posttranscriptional modifications has shown that it is not that simple.
With proteins, one has to view every single candidate in the context of multifunctionality and networking. In many cases, one protein is not just one function. It is part of a highly complex cellular network of interacting and cascading activities. The function of most regulatory proteins, for example, depends on the environment (regarding the ‘cellular clock’ and location), posttranslational modifications, and interacting partners. As a result, one protein might have several functions depending on where, when, and with whom it is. This is what gets Proteomics into trouble.
First, powerful technologies are still not available for many aspects of large-scale protein research. Friedrich Lottspeich, head of the protein analysis group at the Max-Planck-Institute for Biochemistry in Munich and DGPF chairman, said that current methods exhibit great potential but are not yet ready for the industrial job, in drug discovery for example. There are only a few suitable solutions for automation and high throughput. Early-stage MALDI-TOF applications work pretty well, e.g. in Structural Proteomics. But problems with high-throughput sample preparation and with low-abundance and hydrophobic proteins remain unsolved. In Functional Proteomics, automated interaction screens based on the 2-hybrid, SPR (surface plasmon resonance) or TAP (tandem affinity purification) technologies – which are essential to uncover the networking aspect of proteins – are in their infancy. Antibody-based biochips already point the way.
Second, proteome research results in huge amounts of data. Corresponding to the higher complexity, Proteomics produces exponentially more data than Genomics does. But drug discovery (and scientific research in general) is not just collecting data, even if one might suspect some scientists of thinking so. No, scientific progress depends on results derived from the analysis and interpretation of collected data. And this is getting more and more difficult with increasing complexity.
Finally, the complexity of protein functionality has to be taken into account while moving forward. One approach to this is the field of Integrated Proteomics, which considers various views by combining data from different approaches and sources. But this again increases not only the total amount of data to be analysed but also the level of complexity. According to Thomas Franz, head of the Proteomics core facility at EMBL Heidelberg, existing bioinformatics solutions are not able to analyse the produced data quantitatively and qualitatively. This opinion is shared by a number of colleagues working in the field. Scientific teams are analysing the data manually again, because this is more effective and still yields the most meaningful results.
The conclusion answers my question of what has to be done. There is a deep need for at least a) large-scale protein research technologies, b) suitable bioinformatics solutions, and c) Proteomics-optimised devices.
I am curious about the future development of Proteomics. It might be overtaken by other “-omics” in public attention. But I am convinced that Proteomics will contribute important findings to our understanding of how a cell works. And it certainly is, and will remain, a major market for technology suppliers and bioinformatics companies.
Originally published in April 2002 by Inside-Lifescience, ISSN 1610-0255.
Well, honestly, things are on the move these days. Scientists and publishers are discussing new ways of publishing scientific results. EMBO has started an initiative to set up a platform – E-BioSci – that will provide services relating to the access and retrieval of digital information in the life sciences, ranging from bibliographic or factual data to published full text. Even database publishers are moving closer to academic institutions to promote their content products.
Last week, scientists and information providers met at the 8th annual meeting of the German Information and Communication Initiative of the Learned Societies, entitled “Open Systems for the Communication in Science and Research”. The conference set out to discuss the latest national developments as well as strategies for improving the scientific information workflow.
The talks and presentations concentrated on three major points: the future of scientific publication, current developments in information infrastructures, and multimedia in academic education and training. No more than that!? In my opinion, every single topic would have filled a conference of its own. But the organizers aimed at giving an overview and … bringing people from different disciplines together.
I am sure you know the problem. For some reason, communication between the academic disciplines often exists only on paper. Looking at efforts to improve the supply of specialist information to the scientific community, we observe a variety of ‘island solutions’. Young scientists are used to free internet information sources but are still completely inexperienced in using ‘valuable’ databases. How could they be otherwise … there is no awareness of information that costs money. The problem is well known. And here we come back to the lack of communication. Many scientific groups are developing strategies in parallel, e.g. to provide scientific labs with database information. Many solutions never really had a chance because they are redundant. Many resources are used in parallel without looking for synergies or asking whether there could be a common way.
Let’s think capitalistically … or in evolutionary terms: the best(?) system will survive! OK, this works on the international information markets, where one can observe movements of concentration towards Thomson, Elsevier and some other players. But do our academic structures really have the resources – in terms of time and money – to waste on trial-and-error development? Would it not be better to coordinate efforts internationally, or at least nationally? Should we not move on with a common focus and thereby free up money for other things?
The first step in developing a common strategy is a vision, something that can be set as a goal. No ‘destination’ – no strategy. When you build a road, you already know where you start from, but you also need to know where to go. Unfortunately, my conclusion after this conference is that there are no true visions. Again we are developing strategies without a direction, wasting scientific resources and money.
What we really need is more communication. Not only communication between information providers and academic users, but also communication between the disciplines, communication between the scientists. This conference was not the solution, but a very first step. The results have to prove their worth in real life.
Revised version of the article “Scientific information- where are the visions?”, originally published in March 2002 by Inside-Lifescience, ISSN 1610-0255.
The fact that the 2001 Nobel Prize in Medicine was awarded to three Yeast researchers should not lead to the wrong conclusion that the Nobel committee appreciated the fight against alcoholism or obesity. In fact, without the tasty products of Brewer’s or Baker’s Yeast (Saccharomyces cerevisiae), our lives would be much healthier but – honestly – less enjoyable. Coming to the point: the award really recognizes the contributions of Leland Hartwell, Paul Nurse and Timothy Hunt to the understanding of the control mechanisms of the cell cycle, the molecular cell division management system.
I myself did research on cell cycle regulation in Yeast in the late ’90s. As a Yeast guy in an innovative scientific environment dealing with frogs, mice and human cell lines, you were always seen as an eccentric – and somehow funny – specialist (and it was always a challenge to explain that my experiments were not related to the Yeast contaminations in the cell culture lab). Later I was glad to have the opportunity to cooperate and discuss my results with Gustav Ammerer and Kim Nasmyth in Vienna, two other great Yeast geneticists.
Brewer’s Yeast, for example, is a budding organism (which is why it is also called Budding Yeast). Daughter cells are formed as small buds growing at the Yeast cell surface. This closely resembles the division of mammalian cells into two daughter cells. The key issue for the cell cycle is to synchronize DNA replication with cell growth and division. And vice versa: DNA replication needs to be reliably inhibited when there is no division. So the cell cycle is a series of cell functions controlling the whole life span of one cell generation. It starts over and over again until cell aging or other mechanisms stop the propagation. If the cell cycle does not work correctly, cells either stop dividing, end up with improperly copied chromosomes, or propagate uncontrollably. In humans, the latter is connected to cancer.
This is where the medical relevance of research with Yeasts like S. cerevisiae and Schizosaccharomyces pombe comes in. Yeasts serve as model organisms for understanding common functions of eucaryotic cells – easy-to-cultivate mini-labs offering research opportunities into fundamental cell activities that are too difficult to study in higher cells with their much more complex regulation networks. Well, if we have learned anything about cell cycle regulation in Yeast during the past years, it is that regulation is pretty complex even in this very simple organism. Today we know of a tight network of internal and external signals, including the cell metabolism as well as the cytoskeleton. It looks as if there is not just a simple ‘clock’ but a whole system of communicating proteins with checkpoints and feedback loops. We can use these findings in Yeast to look for homologies and similarities in higher organisms. By comparing functionally characterized Yeast genes and proteins with the human genome and proteome, we will be able to identify new research objectives as well as putative pharmaceutical targets.
In my view, this “Nobel Prize for Yeast” is an appreciation of the role of model organisms in modern biomedical science. Understanding them leads to a faster understanding of the molecular basics of cellular malfunctions in humans. As a Yeastman still carrying small buds in my heart, I congratulate the Nobel committee on its decision.
Originally published in November 2001 by Inside-Lifescience, ISSN 1610-0255.