Wednesday, February 9, 2011

Is Science Getting Harder (or just more social)?

A recent blog post over at the WSG asks if it's getting harder to discover new things:
"If you look back on history, you get the sense that scientific discovery used to be easy. Galileo rolled objects down slopes. Robert Hooke played with a spring to learn about elasticity; Isaac Newton poked around his own eye with a darning needle to understand color perception. It took creativity and knowledge to ask the right questions, but the experiments themselves could be almost trivial. Today, if you want to make a discovery in physics, it helps to be part of a 10,000 member team that runs a multibillion dollar atom smasher. It takes ever more money, more effort, and more people to find out new things."

I remember sitting in front of the lab computer, hammering out my thesis, and bitterly staring at the APS freebie centennial calendar, which described the careers of the Greats within our field. It amazed me how these kids managed to walk off some Midwest smallholder farm, embark on a series of textbook-destined experiments (of a technical difficulty that wouldn't earn you a Master's today), and become not only professors but department chairs within a few years of graduation! It sure seemed like science was getting harder...

But while it's a compelling story, I'm not sure I believe it.
One thing I've learned is that new bits of knowledge (even discipline-shaking advances) often seem completely obvious in hindsight. Accordingly, I think we may be taking for granted the creativity and observational prowess that are required to discover even "simple" facts.

I remember scoffing at the often sieve-like logic of the "great" Greek philosophers and being shocked to learn that visual artists in different cultures failed to render perspective for hundreds of years at a time. Stepping back, attributing my perception of these things to my innate intelligence is a pretty silly conclusion to reach. Perhaps, instead, my understanding of the human and natural world has been enriched by over a decade of dedicated education, within a culture that has the most sophisticated recording of history and connection to other cultures that has ever existed. Perhaps the second nature with which, as a child, I drew faraway objects smaller than near ones was shaped by the tens of thousands of pictures I had seen portraying just that.
 
I experienced this transition of understanding in microcosm during grad school. I began my Ph.D. completely incapable of generating "new and interesting" ideas within my field, as I had no clue what was new or what was interesting. I eventually accumulated a big enough pile of ink-covered dead trees to begin spitting out derivative proposals, and later a decent thesis proposal (with a lot of mentoring). By the time I finished my thesis, it was almost incomprehensible to me how it had been so difficult to come up with those three projects in the first place. My contemporary self could easily have gone on to design a whole career's worth of forest epidemiology studies. Of course, then I jumped ship to an unrelated field and had to start (almost) all over again...

Now immersed in the world of genomics, I'm starting to see how even intricately complicated things can be simple. The incredibly disruptive new DNA sequencing technologies are quickly (and cheaply) allowing us to ask questions we always wanted to ask but didn't know how. Just today I was listening to a colleague casually describe her massive genomic/metabolomic/phenotypic dataset in her favorite crop. My mind was flooded with all kinds of cool ideas for what could be done with such a dataset - not because I'm brilliant, but because of the power of the technology to easily do what used to be much more difficult (developing some basic programming skills has been a big part of this for me). I wonder if we're now (again) at a point in time where deceptively "simple" new ways of looking at the world will lead (with access to contemporary technology) to a slew of important new discoveries.
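To give a flavor of what I mean, here's a minimal sketch (in Python, with a hypothetical file name and column layout - none of this comes from my colleague's actual dataset) of the kind of quick exploration I'm describing: scanning a combined metabolomic/phenotypic table for metabolites whose abundance tracks a trait of interest across crop lines.

```python
# A minimal sketch of a "simple" question made easy by modern tools.
# The file name and column names below are hypothetical placeholders.
import pandas as pd

# Assumed layout: one row per crop line, a "yield" phenotype column,
# and metabolite abundance columns prefixed with "met_".
data = pd.read_csv("crop_dataset.csv")

# Correlate every metabolite's abundance with yield across all lines.
metabolites = [col for col in data.columns if col.startswith("met_")]
correlations = data[metabolites].corrwith(data["yield"])

# Rank metabolites by the strength of their association with yield.
print(correlations.abs().sort_values(ascending=False).head(10))
```

Nothing here is sophisticated, and that's the point: the hard part was generating the data, not asking the question. A generation ago, a comparison like this would have been a serious data-handling project in itself.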

Roger Beachy, Director of NIFA, recently gave a speech to the American Society of Plant Breeders in which he outlined the USDA's shifting conceptualization of how to fund research: from projects envisioned and completed by individual professors to sprawling interdisciplinary consortia focused on "grand societal challenges." Many professors (understandably) view this trend as a threat to their careers, which are built on specialized and relatively isolated expertise in some (obscure?) corner of the scientific world. In particular, I remember seeing this idea implemented during my postdoc as I read through the oddly specific new calls for grant proposals - e.g., epigenetic studies of plant response to light, or the molecular genetics of oomycetes.* While I'm definitely biased toward applied research anyway, I do think this more focused, interdisciplinary approach really could help us pick out the next big ideas from the noise of every little thing that someone might happen to be studying.

I'd bet that most "simple" ground-breaking realizations can only arise after years of serious scholarship and a radical change of perspective. And while it may be true that, in the early days of science, such revolutionary work could be accomplished by individuals with technology no more complicated than a glass flask and some soup, I think it happens in the same fundamental way today - just with our (much more complicated) modern technology and enough people to understand all the parts of the system we're trying to crack.


h/t: jessemun


* I can't manage to find a link to the grant calls I'm referring to, though I'm pretty sure they were from the USDA...
