Does Science Need Originality?


Conservative and tradition-bound, science risks losing its creativity. So does science need out-of-the-box thinkers, or are they just part of the problem?


In his old age, the English naturalist Charles Darwin became seriously interested in the question of the musicality of worms. In his last book, published in 1881, he described a series of experiments on these creatures. Worms, Darwin found, sense vibrations transmitted through a hard surface. He also established that they are indifferent to airborne sound: they reacted neither to the shrill note of a whistle nor to the low roar of a bassoon.

In the middle of the 18th century, the French naturalist Buffon conducted the following experiment: he heated balls of iron and other materials to white heat and then, touching them from time to time, recorded how long it took them to cool to room temperature.

A hundred years earlier, Isaac Newton described another such case: he inserted a bodkin (in his own description, a kind of thin, blunt-ended needle) between his eye and the bone of the eye socket and gently moved it around, changing the shape of his eyeball.


Yes, these experiments are a little mad, but there is still something "scientific" about them, since the observations and results were carefully recorded. Darwin, for example, ruled out the hypothesis that hearing explains the behavior of earthworms; Buffon, studying how rocks cool, estimated the age of the Earth (his figure was 75,000 years); and Newton's self-torture helped refine his theory of optics by clarifying the relationship between the geometry of the eye and the visual effects his intervention produced. Such methods may be too unusual, but these scientists were simply following their craving for knowledge, becoming a kind of freethinker in the world of science.

It is worth remembering that Newton, Buffon and Darwin lived in a completely different era: the era of "aristocratic scientists", people from the wealthy classes who took up research simply because they had plenty of free time. Modern science, by contrast, encourages conformism. To become a scientist of any kind, you need to earn a degree, publish articles that fellow experts will value, seek funding and find a job. All these factors shape young professionals: they are taught not only how to work properly in the laboratory but also certain scripts of behavior. This process of assimilating a culture is part of a "paradigm", in the sense that the philosopher and historian Thomas Kuhn gave the term: a set of values, practices and basic concepts shared by most scientists.

Standardization, of course, is not the only problem: finding a job in science has also become much harder. There are far fewer positions than people willing to fill them, and the influential scientific journals can be counted on one hand. The decisions scientists make, especially early in their careers, thus become risky bets on what will prove fruitful in the long run and secure a decent career. Many remarkable, passionate researchers have failed to succeed in scientific circles (or rather, failed to become established) simply because they "bet wrong." In such an environment, scientists are all but forced to remain conservative: doing something outlandish is a poor choice, far more likely to harm a career than to help it.

Of course, this kind of "filtering" guarantees that modern science, compared with the science of Darwin's day, is more rigorous and better funded. A shared paradigm has many virtues: it eases communication and creates a common basis for accumulating knowledge. However, training also includes learning to convince colleagues that your work has value, that it matches their ideas of "good questions" and "good answers." This makes science more productive, but also inert and timid. As a result, truly revolutionary research projects become ever harder to pursue (and remain the preserve of rare adventurers).

The biologist Barbara McClintock went to great lengths, and paid a high price, for her pioneering research on so-called jumping genes. She was not, however, an outcast in the scientific community: by the 1940s, her fundamental work on heredity had won her wide professional recognition. McClintock earned her degrees at Cornell University, worked at Stanford, and held a permanent position at the Cold Spring Harbor Laboratory, in addition to membership in the National Academy of Sciences. But everything changed when she became interested in how the work of genes is controlled; in other words, how identical nucleotide sequences can behave differently at different stages of an organism's growth and in different parts of it.

The chromosomes of a cell are believed to consist of an ordered sequence of genes, which serves as a blueprint, or map, for the whole organism. McClintock, however, found that genes in each individual cell can move along the chromosome: they switch on and off in response to external stimuli, other genes, and so on. At first the scientific community reacted with hostility, not to McClintock's discovery itself, but to the complex, "patterned" model of biological systems that she built on her results. For 20 years McClintock was forced to fall in line and quietly carry on her corn research. Only in the 1970s did her colleagues come around, and in 1983 she was deservedly awarded the Nobel Prize.

"One must await the right time for conceptual change," McClintock observed, almost matter-of-factly, in a letter to a colleague.

As you can see, even the approval of the scientific community does not guarantee that "extraordinary thinking" like McClintock's will not leave you languishing on the sidelines. Modern science may be very productive, but it does not seem to prize the unrestrained ingenuity of some of its practitioners. If so, people like McClintock need to be cared for and nurtured: give free rein to the dreamers willing to swim against the current, and help underappreciated geniuses change the order of things. Such people are, by definition, originals, eccentrics. And if science wants to stress-test its own foundations, it needs eccentrics. At least, that is what they say.

In the following story you can find something remotely resembling a solution to the problem: bring back the "aristocratic scientists," who in the modern world would be entrepreneurs (men and women alike), venture capitalists or Silicon Valley billionaires. In July 2012, the Californian businessman Russ George, together with an indigenous community on the west coast of Canada, orchestrated the dumping of 100 tons of iron sulfate into the Pacific Ocean. It was an epic geoengineering experiment called "ocean fertilization," whose purpose was to stimulate the growth of phytoplankton. For photosynthesis, these tiny organisms, which live throughout the world's waters, need iron, which George supplied. An increase in phytoplankton mass would help restore the salmon population on which the region's well-being depends. Moreover, dead phytoplankton gradually sinks to the bottom, carrying captured carbon with it. If the experiment succeeded, the "ocean fertilization" technique could be applied on a larger scale, increasing fish stocks and reducing atmospheric carbon.

The reaction to George's actions was immediate. The International Maritime Organization said the experiment violated more than one official regulation. Canadian authorities raided the businessman's offices, seized his materials, and claimed he had violated several UN moratoriums. George continued to defend himself in print. He admitted that it was very hard to live under "this black cloud of defamation" just because he had "dared to stick his head where no one had gone before." The businessman argued that while research organizations only talked about the catastrophic risks of climate change, he alone had taken action. George proudly assumed the unofficial title of freethinker.

What lesson can we draw from this story? Science needs out-of-the-box thinking, yet it barely tolerates it. So why not simply grant the right to be crazy to people with enough ingenuity, courage and will, as well as pride and money? They can easily find things to do outside conservative scientific organizations. Surely ambitious executives, with wealthy private companies behind them, can lead the movement for scientific adventurism. Who, after all, is better placed to steer technology and science towards a brighter future than the person who both creates and owns the innovations? But can we trust people like Russ George?

Frankly, no. One reason for distrust lies in the very life story of the fighter against the system. This figure is an archetype, part of a picture of human progress in which individual rebels and geniuses set the direction of history. There is something compelling about it: a lone fighter defies an ignorant crowd and ultimately triumphs over conventional wisdom. But then the entire historical process turns out to consist of the biographies of great men (despite McClintock's success, there are very few women in the annals of science). Can such stories be believed? How do we know they are true, and not just good yarns?

It often happens that the emergence of a scientific innovation is best explained by economic, socio-political or other external factors. Take Galileo, who played an important role in displacing the geocentric model of the world. For all his contributions to mathematics and astronomy, it should be remembered that significant improvements in lens-grinding technology made it possible to build better telescopes; the spread of printing allowed Galileo to promote his ideas (he was, without doubt, a brilliant communicator); and the political and religious context of Europe in the era of the Reformation and Counter-Reformation produced a varied and fairly open intellectual debate (even if Galileo's conflict with the Catholic Church was a serious one). Weighing all the facts, it is hard not to admit that the lone genius who changes the course of history matters less than we like to think. The historical background deserves more attention, and we should not rashly hope that a couple of clever rebels will solve our problems.

It is also worth thinking about whom exactly we are listening to. Most science adventurers are wealthy white men. This is not surprising: to become a successful original, you probably need to start from a privileged position. Privilege supplies the confidence needed for an original vision of the world. A person also needs emotional and material support to take risks, and their work must carry a certain weight before anyone pays attention. Only a few people manage to combine all of the above, and they are most likely to come from a rather narrow slice of society. From this point of view, it becomes clear why those honored for their individual contributions to science can be called "aristocratic scientists."

It follows that such a group will not be diverse. And this is not only unfair; it also imposes unpleasant limits when we set out to evaluate science on its own terms. Most obviously, such exclusivity narrows the circle of potential researchers. Any structural barrier that has nothing to do with a person's qualities as a scientist, and that interferes with independent thinking, inhibits development. Monotony breeds suboptimal science. And the issue is not the number of studies, but their scope and quality. It seems clear to me that background, experience and personal history shape the development of ideas. So if we want diversity of thought, we need diversity among people.

A classic example can be found in paleoanthropology. In the 1960s, this predominantly male field was fixated on a single account of what drove human evolution, as the "Man the Hunter" conference held in 1966 at the University of Chicago confirms. Hunting was cast as the engine of evolution: a meat diet gave the brain the energy to grow, while the cooperation and cognitive skills that hunting demanded pushed it to grow further. Naturally, there could be no talk of equality between the male "hunting" half and the female "gathering" half.

Over time, however, women began to work in the field, and their fresh perspective exposed inconsistencies in this seemingly harmonious theory. Female researchers suggested that hunting may not have been the primary occupation of early humans: the irregular "windfall" of meat was only a supplement to the stable diet that gathering provided. Later, child-rearing came into focus: cooperation can be explained by the fact that childhood grew longer (while the brain, as we remember, grew larger), and raising children required teamwork. The anthropologist Kristen Hawkes of the University of Utah put forward the so-called "grandmother hypothesis," which explains women's almost unique (in the animal world) life expectancy after menopause. According to Hawkes, women past reproductive age, grandmothers, played a critical role in keeping social groups functioning. In principle, men could have arrived at these ideas too. Yes, new technologies and discoveries helped diversify our views of human evolution, but the flourishing of new hypotheses certainly owed much to the experience and perspective that women brought to these problems.

So if a small group of people happens to lead the development of science and technology, decisions are more likely to protect the interests of that group. This seemingly inevitable situation brings me to my last point: trusting maverick scientists leaves us vulnerable to their eccentricities. We can hardly control the quirks of our idols. They may put their time and money to the best possible use, or squander it all. After all, it is their time and money; they are not accountable to the people they affect. However benevolent the intentions, if only rich white men can be true science rebels, that is bad news for everyone else.

The international aid and development sector is full of sadly sobering examples of what can happen when the distance between the "helper" and the "helped" is too great. Volunteer organizations that send students and tourists from wealthy countries to construction sites around the world have been criticized for inefficiency and even exploitation. Without the right knowledge and outlook, good intentions often do more harm than good. It seems to me that this applies equally to science and technology. Without a diversity of people, these fields will develop, willingly or not, in ways that entrench existing inequalities.

Paradoxically, I do not believe that the shortage of creative minds can be solved by finding more clever rebels. Of course, we need people willing to argue and resist the old order. But perhaps, instead of relying on proud loners courageously swimming against the current, science itself should become more reckless and risk-taking? After all, hoping for a lucky break is no kind of rebellion.

This is not simple. Remember what we said about the stages of entering the scientific world, the dependence on colleagues' opinions, and the "bets" placed on different areas of research. All of these play an important role in maintaining scientific credibility and can be extremely productive, but only in a certain context. I am not proposing a complete rejection of tradition. I am proposing that we abandon the idea that there is only one surefire way to organize a scientific community.

There is one area of research close to me that is widely criticized as extravagant, and attitudes towards it badly need rethinking. This discipline studies the existential risks that future technologies and scientific breakthroughs may bring: say, artificial intelligence spinning out of control, deadly superbugs, geoengineering errors, or an asteroid striking the Earth. Jonathan B. Wiener of Duke University in North Carolina calls such threats "tragedies of the uncommons": society rarely thinks about large-scale but unlikely events with global consequences, and even neglects them.

But we have to think about them. Even if the chances of a mass extinction are minimal, we can act now to lower that already low probability. This is what science is for: dealing with such threats.

However, because of the conservatism inherent in science, such abstract research programs are dismissed as doomsaying. Respected scientists will never take up such work for fear of falling into the "reputation trap." Perhaps this is what prevents them from considering the most unlikely, yet possible and significant, consequences of their own research. Yes, the probability of an event that would wipe out civilization is hard to calculate, or even to express. But that does not mean we should not try. Even the smallest effort spent searching for safer futures, for the planet and for every species, is truly worth it.

The problem of diversity is especially acute here. Yes, the risk of extinction affects everyone, but that does not mean it affects everyone equally. Imagine that geoengineering could prevent catastrophic climate change. Coastal, mostly poor countries would be the first to feel changes in weather and sea level. But the power to decide whether to act is not evenly distributed, since geoengineering is effective only at a large scale, and the dangers grow with it. Who will decide what risk to take? At what point will the threat of global climate change be serious enough for us to trust blindly in new and untested technologies? It is not within my competence to give clear answers. But if the people making such decisions belong to a small and privileged part of society, good results should not be expected for anyone but themselves.

The areas of science where we believe the most useful research is being done are relatively open. But in assessing the risks of mass extinction, we do not even know which questions are the right ones. When the stakes are high and the subject is so complex and hazy, it is time to get creative. If we do not know what is good and what is bad, betting everything on one approach is unjustified. So instead of concentrating a whole army of scientists on the same question, we should spread them across the entire field of study.

The way science works now does not mean it will always work that way. The rules of the scientific game did not appear out of nowhere; they were created by hard labor and no less hard thought, with occasional interventions by chance or by typical human behavior, such as displays of authority. Scientists are people too; they respond to incentives like everyone else. And if we want to see a less conservative science, we need to reconfigure the system that distributes rewards.

We could, say, organize funding lotteries, thereby increasing the amount of unorthodox research. Scientists in need of money would then feel freer, because they would not have to spend part of their effort impressing reviewers. Creating new research institutes, well funded and high in status, would also help. It is probably also worth admitting that scientific success and the importance of research do not always correlate.

With some confidence, we can say that these principles would affect the work of reviewers, journals, funding bodies and hiring committees. Scientific discoveries could still be judged in terms of significance, but we need to broaden our understanding of it: from "probably wrong, but likely to be productive" to "true under certain conditions" or "creative, opening new spaces for research." Moreover, if science becomes less conservative, we must think about how to protect new, eccentric research from public misunderstanding or misuse of its results. We need to learn to handle scientific data better, to put it into context.

But these are only preliminary proposals. Much remains to be done to put the organizations that control, shape and finance science on a new track. To begin with, we must admit that the romantic halo around science's madmen to some extent keeps the rest of us from putting forward daring ideas. You do not need to be a rebel to think like one: to be adventurous, eccentric and willing to take risks. In some areas of science, that attitude should become the status quo. There is no need to resurrect unique "aristocratic scientists" like Newton, Darwin or Buffon. Why not instead transplant their way of thinking into modern science?

Adrian Currie is a philosopher pursuing research at the Cambridge Center for Arts, Social Sciences and Humanities Research. He is a founder of the Extinct blog, dedicated to philosophy and paleontology, and the author of Rock, Bone and Ruin, published by MIT Press.

