Around the time of the discovery of the Higgs boson, Peter Higgs, its namesake, admitted that the academics of his generation would not have been productive enough to get jobs as junior academics in a modern university.
The modern academic is not the tweed-jacketed, pipe-smoking thinker of the popular imagination. They are overworked and perpetually tired, required to maintain a high rate of publication, teach undergraduates, supervise researchers, provide pastoral support for both and run most of the operational aspects of their school.
As someone mulling over a career in science, I find myself thinking, ‘When would I ever get the opportunity to just sit and think?’
Isaac Newton was not maintaining a heavy schedule of teaching and publishing when he discovered the laws of physics. For much of that time he was lazing around in the grounds of a country house, which is where the apple is said to have fallen on his head. The modern way of doing science is simply too fast-paced and too obsessed with material productivity to have any claim to profundity; it is fixated on underpaying lecturers to teach overpaying students, wringing as much as possible out of as few personnel as possible. Formal academia seems an increasingly Sisyphean profession.
To suppose that an academic doesn’t need time alone with their thoughts, free of expectation, is to take a rather technocratic view of science: to see it simply as a lineage of ideas, each formally derived from the last through professional and bureaucratic procedure.
So, is there such a thing as original thought, or is science just one long Markov chain of events, not predetermined but entirely determinable from the prior state of science? Although the great ideas of ages past were delivered to us as a (mostly) formal canon of experiments building into theories, the genesis of those ideas cannot be determined entirely from their antecedent discoveries. It comes from elsewhere: from the creative adaptability of the human mind.
Against the dogma of rigour that pollutes the modern academy with its technocratic tedium stands the fact that any great idea worth sharing starts off not as a formal experiment but as a cruder form of empiricism: an intuition. We are apes who live in a semi-predictable but ever-changing environment. We evolved to be good at making tools, bypassing the need to rely on mother nature for environmental adaptation; a shortcut that allows a solitary monkey to do a million years’ worth of natural innovation in an afternoon with some sticks and stones. Such a capability requires immense flexibility of thought: the ability to entertain various counterfactuals, things that ‘might’ work, and to select from those prototypes, which, crucially, may be systematically different from one another. We are not naturally adherent to a rigorous mode of thought or to formal, professional processes of knowledge creation and dissemination. These things take training and a lot of intrinsic effort, but we do them for a reason.
We may know that our way of doing things works, but other people, who have their own ways of doing things, and who would stand to gain from listening to us, are going to be resistant to our ideas. People are skeptical of alternative solutions to problems for which there is already a stable consensus. That is why we need Science, with a big S, to be produced according to a formal procedure that achieves at least the following: (1) clear communication of ideas with minimum ambiguity for maximum pedagogical instruction, (2) the explanation of formal experiments so that others may test and retest findings, and (3) theories which are testable and falsifiable (in the Popperian sense) through further experimentation. This professional code provides a common ground in which scientific ideas, regardless of their genesis, can be expressed in such a fashion that they may be criticised as fairly as possible. It says nothing about how we came to consider those hypotheses in the first place.
In the modern world, most papers are written on the basis of somebody else’s idea. They start with extensive literature reviews explaining the state of the field and the paper’s unique contribution to the academy; they point out a gap in the research and work to fill it in. This is important work, and for an academic early in her career it is the foundation that builds up the knowledge base, and the intuition for the prevailing direction of the field, needed to come up with useful innovations of her own. In many fields, however, it seems that this is it: this is all you will do for your whole career. Senior researchers with well-cited papers are everywhere, yet rarely do any of them seem to step away to make works of broader scope, where their deep intuition would translate into deeply imaginative thought. Their jobs are too insecure and their workloads too intense. They live in a culture where professional survival and future financial solvency depend on maintaining a fast-paced publishing schedule, while teaching undergraduates, supervising researchers, providing pastoral support to both and managing most of the functional aspects of the school. What time is there in the schedule of a modern scientist to consider the deeper fundaments of their field when they are overworked and their pension is getting rinsed?
If the leisure that would let creativity return to science were afforded at least to those who have proven themselves, then the toil of the young academic would feel worth it. As it stands, though, I find myself wondering: who would ever want to be a scientist?