
(DISCLAIMER: The following is based on my own experiences and may not agree with your own. These are just things I've personally observed to be true. And of course these are just tendencies rather than absolutes.)

Do you often see biologists without CS backgrounds making those kinds of mistakes about CS topics that people with CS backgrounds make about biology topics?

I also have a background in CS and biology, with some math and physics too. I've also observed this tendency in people trained mainly in the formal sciences (e.g., CS, math, logic), whose schooling involves a large amount of deriving things from first principles (this is also true of the more mathematically inclined physics majors and certain flavors of economics majors). They often think that what they know is enough to derive a novel insight from first principles, but the further you get from physics, the less true that becomes, as the nonlinearity and sheer complexity of the world start to interfere.

Different areas provide different ways of thinking with different strengths and weaknesses. They aren't mutually exclusive in the sense that learning one makes it harder to learn the other, but they require a non-trivial depth of study to pick up so most people tend to get mentally siloed unless they either study one of those other fields or somehow pick it up through a more non-traditional route (which absolutely happens, but is less reliable).

If you want an example of the type of thing biologists tend to be weak at, I'd say quantitative thinking (at least relative to other sciences). Biologists tend to be the most math-phobic of the natural scientists, so most have a mental ugh field around anything mathematical. You'd be shocked how many grad students can't do some fairly basic stuff. Many undergraduate programs barely require calculus (though that's slowly changing), and you'll often get some exposure to elementary probability and combinatorics in your introductory genetics class. Even so, it's shocking how many people with whom I studied evolution and comparative physiology and anatomy didn't come away with some degree of probabilistic intuition.

Other fields I've studied to variously minor degrees that train particular mental habits or develop particular skills and perspectives that I've found valuable are psychology and the study of cognitive biases, cognitive science, computer programming and software engineering, chemistry and biochemistry, literature, economics (both the traditional kind and the more modern behavioral kind), probability, game theory, history, anthropology and a few others.

Fields I suspect train other mental habits/skills/etc. that I lack and haven't yet studied include poetry, martial arts, dance, jazz and/or improv, music theory and music in general, drawing, deeper dives into the topics I've already encountered, and probably a ton more that I can't remember.

Really, the more wide-ranging your curiosity is, the more well-rounded you become. And since most people don't bother leaving their silos (or venture into a handful of others at most), you can, after many years, start to put together all sorts of insights that others find non-obvious (though they will still rarely be novel).

(I think nowadays people might call these "mental models", but learning about mental models directly through a description in a listicle has always seemed less useful than studying the fields themselves and indirectly building the mental model yourself.)



As a bioinformatician, while I largely agree with you, it's worth noting that it's not that biologists are math-phobic or dismissive of the value of numbers - it's that biology has undergone the most radical changes in recent years of any of the primary sciences. A helpful metric (perhaps apocryphal) that I heard at a talk is that the total amount of knowledge in the sciences doubles about every 10 years, in biology every five, and in genetics every two. Most biologists working today were trained before the genomics revolution and have not developed the mental heuristics around it. Compared to today, the genetics of 25 years ago feels practically paleolithic, and a comparison this stark is hard to find in any of the other sciences.

Genomics in its early years was also an unregulated Wild West, with lots of speculative studies and predictions that never panned out. The field matured rapidly, but as a consequence experimentalists are wary of "predictions". For example, despite intense scrutiny, about a third of E. coli genes remain uncharacterised, and we don't even know if they're really genes or an artefact of the gene prediction process. For a bioinformatician, the barrier to entry for making predictions is quite low, so experimentalists are understandably cautious. It's also a case of moving goalposts: by the time experimentalists get used to and begin to accept computational predictions from one domain, whole new fields of computational biology have opened up, and it takes a while for those to be accepted in turn.
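Taken at face value, those (perhaps apocryphal) doubling times compound dramatically. A quick sketch of what they imply over a single 20-year career:

```python
# Fold-increase in total knowledge over a 20-year span, taking the
# doubling times quoted above (10 / 5 / 2 years) at face value.
# The field names and horizon are illustrative assumptions, not data.
doubling_years = {"sciences overall": 10, "biology": 5, "genetics": 2}
horizon = 20  # years

for field, d in doubling_years.items():
    fold = 2 ** (horizon / d)  # doublings compound exponentially
    print(f"{field}: {fold:.0f}x in {horizon} years")
# sciences overall: 4x, biology: 16x, genetics: 1024x
```

So a biologist trained 20 years ago would, by this metric, be working in a field with 16 times as much knowledge as when they trained, and a geneticist over a thousand times as much.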


Sounds like a great time to be alive!


It is! If you look at some of the brand-new labs being established by freshly minted assistant professors, many of their fields didn't even exist 5-10 years ago. Machine learning has just begun to percolate into biology, and we are on the verge of major shifts in the field.

It's also not just computational biology that's booming. Another field I follow, population genetics, has been practically rewritten in the past decade thanks to improved techniques for extracting ancient DNA. When I started my PhD, the idea of extracting and assembling the genome of a Neanderthal was a distant pipe dream. By the time I finished, we had multiple Neanderthal genomes from across Eurasia, and the discovery of a previously unknown human ancestor, sequenced entirely from a single finger bone in a Siberian cave.





