A while ago, I taught CS for a year in a local high school. I can very much relate to the notion of "astonishing facts were presented without astonishment": as a teacher, you don't have the freedom to teach whatever you want (of course), but are very tightly bound to a curriculum developed by the state government. And for CS, this curriculum was so uninteresting and uninspiring (what a surprise: 13 year old kids don't care about the history of computers) that I couldn't blame any of my students for not showing much interest in my classes.
As a matter of fact, I gave up after just one year. It wasn't any fun for anyone, not for the students, not for me.
What a horrendous crime, to turn a fascinating subject into a boring curriculum to be forced on teachers and children.
I've received great intellectual satisfaction from various well-taught subjects. I would rather chop off a finger than lose them. So curriculum committees that make subjects boring are doing something worse than chopping off millions of children's fingers.
With any kind of history especially, it's just rote memorization of facts and not the connections between those facts. I hated history in school because of that, but now I actually find it interesting to learn that x happened because of y, which also led to z, and so on. Or just rote memorization of technical facts, like how many wires a PATA cable has. Or why must kids memorize how an Ethernet frame is built up? Sure, go over it in class and show it as a lesson in how to read how binary protocols are defined. But either you forget it anyway because it's not relevant to your job, or you can look it up and memorize it over time as you use it often enough.
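(To the "show it once as a lesson in reading binary protocols" point: here's roughly what that lesson could look like in code. A minimal sketch that parses the fixed Ethernet II header fields; the frame bytes below are made up for illustration.)

    import struct

    def parse_ethernet_header(frame: bytes):
        """Parse the fixed 14-byte Ethernet II header: dst MAC, src MAC, EtherType."""
        dst, src, ethertype = struct.unpack("!6s6sH", frame[:14])
        mac = lambda b: ":".join(f"{x:02x}" for x in b)
        return mac(dst), mac(src), hex(ethertype)

    # Hypothetical frame: broadcast destination, made-up source MAC, EtherType 0x0800 (IPv4)
    frame = bytes.fromhex("ffffffffffff" "001122334455" "0800") + b"...payload..."
    print(parse_ethernet_header(frame))
    # ('ff:ff:ff:ff:ff:ff', '00:11:22:33:44:55', '0x800')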
I really wish the teaching of history gets better for current and future kids.
I can really relate to your experience, even though mine was from a parent's perspective rather than as a teacher. I found a similar thing when tutoring one of my children in trigonometry. The way the material was being presented in school didn't click with him, but astonishingly, despite having studied it decades ago both at school and university, it finally made sense to me only when I explained it to him. The unit circle definition of a tangent is a thing of beauty. I had the time to get my child to appreciate it as well, because of the extra time I had to spend with him, whereas the teacher had to hit curriculum benchmarks.
I also think this is where things like intergenerational math-phobia come from: parents who don't grasp core concepts and are scared off, and can't help their own children, creating an ongoing cycle.
> I also think this is where things like intergenerational math phobia come from: (elementary) teachers who don't grasp core concepts, are scared off, and can't help their own students, creating an ongoing cycle.
I hope you appreciate my addition of the other common path of math phobia.
Absolutely, I do appreciate that addition — I definitely had teachers like that.
It’s probably why, when I got to university and tackled subjects like probability theory, discrete math, and theoretical CS, I did extremely well — they weren’t reliant on the shaky algebra and trig foundation I had from school. Once the focus shifted to logic and conceptual thinking, without the baggage of poorly taught fundamentals, everything clicked.
This is why most good teachers don’t use the books but find creative ways to still meet the standards. More work though, so fewer do it now with pay being so shit.
I think the whole teaching of the history of computers is a failed attempt to Segway into computer organization and architecture.
Nonetheless, I get what is happening. If it’s a pure computer programming class, then the goal may be to have them understand the “basics”…like what is the hard drive vs RAM (memory allocation), or what is a transistor (Boolean logic), and what is a punch card (mnemonics, and abstractions of those mnemonics into what is now just a computer programming language).
This is very much a tangent, but I think it's nearly certain that "segway" will end up overtaking "segue" as the predominant spelling for the word that is defined as: "to make a transition without interruption from one activity, topic, scene, or part to another"
The "mistake" happens so often, partially because "segway" is a much more straightforward spelling if one has only heard the word said aloud, that I think it will eventually become the actual way it is spelled!
English is spelled phonetically. Just not Modern English phonetically, but Middle English phonetically. And then it froze into ideography because of the printing press.
Actually it's not an absurd take at all. The absurd take is that we "should not bend language around ignorance."
That's precisely how language changes over time. Language is not a strict set of rules. It's based on understanding and consensus, so sometimes things that are "wrong" do end up being accepted.
I am not a native speaker, but the two words do not sound even remotely the same.
How does this mistake happen so often? Can you explain people's thought process a bit? Is it just: "Something something 'seg...' ... ah I know, I will simply use another random word that starts with the same 3 letters and doesn't make sense in this phrase!"?
Most mistakes remain mistakes, and do not become part of the language. The idea that mistakes generally get accepted as correct is simply untrue, which is what you are implying.
I am sure people will make the mistake, as they sometimes do today. But it is a mistake, and will likely be recognised as one.
It is likely that the language gets more cemented by automatic spelling and grammatical correction, including using AI. For example, there are a number of grammatical and spelling changes that have been cemented by American spelling/grammar-checking programmes, e.g. MS Word.
> The idea that mistakes generally get accepted as correct is simply untrue, which is what you are implying.
I did not imply that at all. I said sometimes, so it's not that absurd that it could happen. It does happen though, and a quick google search will give you pages of examples.
I'm sure you'd describe yourself as a Digital Auditor Workstation used by ADM DeeJays. After all, language changes over time, so my errors of usage are all correct.
Was that the point? Don't forget that you're on Hacker News, not Reddit. Straw men are less accepted in this community. Individually, you are not the consensus that was described, nor did anybody in this thread imply that "all errors of usage are correct" and accepted. Your sarcasm is unwarranted and provides little value to this conversation.
Personally, I struggled a lot in my earlier CS/Informatics education, partly because I never felt like I understood what was actually happening or how we got here; everything was just factoids in a void. When I took a gap semester between my A.S. and B.S., I finally studied and explored a bit of the history, and it finally put a lot in perspective.
> have them understand the “basics”…like what is the hard drive vs RAM (memory allocation) or what is a transistor (Boolean logic)
You must understand these things at least conceptually if you want to really understand how to write efficient programs. Maybe not at the level of how memory can electronically "remember" a 1 or a zero, or how a hard drive can magnetically do it, but at least the relative speeds e.g. register vs. cache vs. RAM vs. disk.
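For a ballpark sense of those relative speeds, here is a tiny illustrative sketch using the commonly cited order-of-magnitude "latency numbers". These figures are not measurements, and real values vary widely by hardware:

    # Order-of-magnitude latencies in nanoseconds, taken from the widely circulated
    # "latency numbers every programmer should know" lists. Illustrative only; real
    # values depend heavily on the specific CPU, RAM generation, and storage device.
    latency_ns = {
        "L1 cache reference": 1,
        "L2 cache reference": 4,
        "main memory (RAM) reference": 100,
        "NVMe SSD random read": 100_000,
        "spinning disk seek": 10_000_000,
    }

    base = latency_ns["L1 cache reference"]
    for level, ns in latency_ns.items():
        print(f"{level:<30} ~{ns:>12,} ns   ({ns // base:,}x L1)")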
I basically found this in college too, I quickly gave up on computer science as a major. I'd rather just go out and learn how to build what I want to build versus hearing a 3-hour lecture about how the jvm works.
The answer is it's magic and no one cares, now let's go build some games
Firstly, and this is worth pointing out, "computer science" is not about programming. It's about science, in this case specifically the science that makes computers work.
At school I thought "computer science" meant "programming" - which it doesn't. So well done for recognizing this before wasting too much of your time. (Seriously, not sarcastic.) Programming can easily be learned outside college.
To other general readers here though I'll say that understanding the science can be really helpful over a career. It's not terribly applicable in getting that first job, but as you progress more and more of those theoretical fundamentals come into play.
Ultimately there are a small fraction of people who need to understand how it all works, all the way down, because those people build the things that programmers use to build everything else.
This is a myth. Computer science absolutely is about programming. The science that makes computers work is called physics.
There are theoretical parts of computer science, but it is fundamentally a practical subject. All of it is in service to programming. Type systems are about typing programs. Algorithms are implemented using programs. Data structures are for use in programs.
The very worst computer science lecturers are those that forget it is a practical subject and try to teach it like abstract mathematics, because they believe (whether they realise they believe it or not) that it is more prestigious to teach abstract concepts than practical concrete things.
It is the same in mathematics, where unfortunately there has developed a tradition since Bourbaki of trying to teach abstract notions as fundamental while concrete problem solving is left to the engineers. The result is that many engineers are much stronger mathematicians than many mathematically-trained students, and those students have to relearn the practical foundations of the subject before they can make progress at the graduate level. If they don't, they get stuck doing what looks like maths, but is actually just abstract roleplaying.
The history of English is taught in English classes. Historical context is important and interesting. You don't really understand a subject without knowing a bit of its history.
My favourite classes were those where we didn't just get taught facts and theorems but we also got taught a bit about who proved the theorem for the first time, who discovered this fact, what this algorithm was first used for, etc. So much easier to remember too.
This is one of the best things about studying law: the very nature of it makes it impossible to teach it without the historical context.
they might just remember it all once they're adults!
imagine that!? an historically informed populace???
you'd need more expensive lies and higher quality fakes... the government would be costlier to run.
ideally, in the long term this would make the national currency's value in the international money market rise up. but why wait for that when one can directly manipulate money through trade fraud and covert military ploys?
Those curricula, developed by soul-dead committees in consensus on the minimum knowledge you've got to have, are a blight on Western civilization. Instead of giving students the ability to discover a topic, or build something they are interested in themselves, and then giving them an understanding of and fascination with the discoverers who have gone before them, they kill the subject.
I must confess, it gives my dry old heart some joy to see the anti-education masses that come from this: voting, storming the fortresses that built the paywall around education (a paywall that only money for tutors or accidental intrinsic motivation could overcome), and burning and salting those outposts of classist academia.
Yes, definitely, destroying education as we know it without any plans for what the next thing is will definitely work.
Developed countries really need a come to Jesus moment, because the disdain for everything that made them great places is unbelievable. People will understand, after great suffering, that destroying stuff is much easier than building it.
> People will understand, after great suffering, that destroying stuff is much easier than building it.
"It is easier to destroy than to create" doesn't tell you when something should be torn down.
You can have a house that provided shelter for your family for generations, but if it's water damaged, the floors are rotting and it's full of toxic mold, the person who shows up with a bulldozer isn't necessarily wrong.
We're in the destroying phase right now. Unless you live in China - I hear they're mostly doing well. Or middle of nowhere Africa, where there's nothing to destroy because there's nothing there.
But systems can rot from within too, or just decay naturally, and don't need to be destroyed. What if the core ideas that built our current civilization were ideas of the past, that we don't have any more, and we don't know what to do when The Machine Stops? Doesn't have to be a literal machine - it's a good metaphor for how democracy fell apart.
Forgetting that it was the anti-education forces that created the curriculums. The war on public education goes back a long time; teachers lost the freedom to teach decades ago, and it has been the same forces behind it all along.
Ok... what would you do differently? Keep in mind you have to educate millions of students across an enormous spectrum of abilities, socioeconomic backgrounds, and interests.
My fork in the road with hard tech/hard science versus biology was in high school. It seemed that students who wanted to become doctors took AP Biology, and students who wanted to be engineers took physics and chemistry. I had wanted to be an engineer since I was 12 years old, so I felt the decision was already made. But while studying neural networks in college in the 80s, I realized that there was this tremendously rich domain of real neurons which I knew nothing about. I worked as a software engineer for a couple of years after graduating but then went back to school to study neurophysiology. I did not pursue it as my area of work or research, but I am grateful for having had the opportunity to look at the world from the perspective of a biologist.
If you're an engineer and early in your career and feel there's something missing from your intellectual space, I encourage you to go back and get a graduate degree in something totally different. Humans live a very long time so don't feel like you're wasting time.
I've been programming since I was eight, but truly fell in love with biology in 12th grade chemistry: the first introduction to organic chemistry and biochemistry. It was the first time I truly started grokking the application of systems-level thinking to the biological world; how do trees "know" to turn red in the autumn? How do fetuses assemble themselves from two cells?
I decided to pursue a double major in biochemistry and evolutionary biology and it was one of the best decisions I've made in my life. The perspective you gain from understanding all life in terms of both networks and population dynamics of atoms, molecules, cells, tissue, organisms and populations -- and how every layer reflects the layer both underneath and above it in a fractal pattern -- is mind-expanding in a way I think you just don't and can't get designing software systems alone.
I work as a software engineer / founder now, but always reflect wistfully on my time as a biologist. I hope to get back to it some day in some way, and think what the Arc Institute team is doing is inspirational [0].
Has anyone seen content that used this multiscale networking and population dynamics as an instructional approach?
For a small example, there was a Princeton(?) coffee-table book which used "everyday" examples to illustrate cell/embryonic organizational techniques - like birds equally spacing themselves along a wire. Or compartmentalization as a cross-cutting theme from molecules to ecosystems.
I've an odd hobby interest in exploring what science education content might look like if incentives were vastly different, and massive collaborative domain expertise were allocated to crafting an insightful, powerful, roughly quantitative, richly interwoven tapestry.
I would love to do something like this but simply cannot afford it. I think it is good advice but going back to school for a degree one does not plan on utilizing is not as feasible today as it was in the 80's, largely due to the sizeable increase in tuition without reciprocal increases in wages.
In this day and age, you can do this for FREE and on the side, whenever you have time!
There are tons of very well-done professional level video courses on Youtube.
There are more organized courses that only ask you for money for the "extras", like some tests and a certificate, but the main parts, texts and videos, are free.
You could start with a really good teaching professor (Eric Lander, MIT) and his course: https://www.edx.org/learn/biology/massachusetts-institute-of... (the "Audit" track is free, ignore the prices; also ignore the "expires" - this course restarts every few months and has been available in new versions for many years now)
It's very engaging!
There's similar courses for everything in the life sciences, there on edX, on Youtube, many other places.
I feel the true Internet is soooo underutilized by most people! Forget news sites, opinion blogs, or social media. Knowledge is there for the taking, free. Only the organized stuff, where you end up with a certificate costs money, but they usually still provide the actual content for free.
Time and energy are also at a premium in the current economy. Good luck learning biochemistry by watching YouTube videos after 8+h of coding and meetings plus commute plus making dinner plus cleaning up.
Depending on where you live, and what you want to study, you might be able to take a couple courses at the community college in areas of interest without spending a lot of money.
I was paid to get a PhD in Biology, albeit just enough to live on. Most people in PhD programs are, either through being a TA (teacher's assistant) or RA (research assistant). The real financial cost is the opportunity cost of 5-6 years of your life.
Whether or not broad support for training scientists holds up during and after the current administration remains to be seen.
My current tuition is under 500 CAD per class. The opportunity cost of not working full time is the real bulk of the cost of studying in places that have a functional government.
A lot of companies will pay for at least part of whatever college classes you take, without auditing whether or not it would be good for your specific job.
I encourage people to look into it, it's a benefit a lot of people have but don't use and it's leaving money on the table.
Can you say more? What kind of company would do such a thing? Maybe I live in a bubble, but that's so far outside of what I've seen that it just sounds fantastical.
Ok, both of these comments made me doubt my memory so I just checked: my current employer is a very large consumer company, and the limits of the program are that you get a C or above and that the class is "related" to your job or any job you could get at the company. But I've gotten classes paid for that were only tangentially related to my job with no problem. So I concede that you might not get a biology degree as an engineer, but my particular company does a lot of different things, so my guess is that in practice you'd have no problems. I also worked at a now-defunct mid-size startup and a hospital system with similarly loose requirements, but I don't have access to their docs anymore.
My company uses guildeducation.com and we can use basically $5k a year (I think, it might be per semester). A lot of it is just individual classes, but there are also some degree programs. I don't know if they preselect which courses are available to us or if we have access to the whole catalog. I suspect it's somewhat curated, because we are a medical company and most of it is medical stuff. There is a CS bachelor's program, but last I checked there wasn't an MS CS program.
I would assume most companies with 100+ office workers (essentially big enough for an HR department) usually offer something like this in western countries.
I had a job with an education budget listed as benefits.
However, to use it there are constraints:
1. The topic should be related to technologies used by the company. You cannot get a Google Cloud certification, as they are using AWS.
2. To get it you need approval by line manager, hr, and director of the office.
3. If it is more than €250 you need to sign a loyalty agreement for a year, meaning you will have to return some amount if you quit.
With all those strings attached, it is just marketing bullshit to attract new hires.
Plus, usually the employer wants it to be related to one's job, from their very limited perspective of the world and management decisions. For example, I couldn't even take a language course as educational leave, as the employer did not make any use of my language skills.
I would love to do this, I just cannot afford it as others have already stated. It's depressing to feel like I spend so much of my life at my day job and yet require it to afford the tiny portion I get left. I wish things were different.
Same. Biology was an elective in high school and I never took it. I took Earth Science (basically introductory geology) and then went into the Chemistry/Physics track (two years of each). Never felt I missed it, last time I had any real biology education was a unit in 8th grade science and I didn't care for it then.
The breakpoint was molecular biology around 1986 with the introduction of PCR. Once that happened, biology went from being alchemy to being science.
I loathed biology as taught prior to that. Once I got a molecular biology course, I thought biology was amazing and wondered "Why the hell did we teach all that other crap?"
Well, that was because the tools we had for biology sucked prior to PCR. My problem was that I recognized that even as a child.
I invested a great deal of effort over 30+ years to learn biology, which I started to love in high school when a teacher introduced us to molecular biology. Over time I've come to appreciate that biology is a huge field and people who master one area often know little to nothing about many others.
To be proficient in biology you need to have "extra" skills: an extra ability to work with ambiguity, the ability to memorize enormous amounts of descriptive information, and the ability to work with highly abstract representations. Digital biology often loses many aspects of biological reality, and then fails to make useful predictions.
Over the years, I've come to realize I know less and less about biology- that I greatly underestimated the complexity and subtlety of biological processes, and have come to admit that my own intelligence is too limited to work on some problems that I originally thought would be "easy engineering problems".
A great example of the rabbit hole that is modern biology is summed up here: what is the nature of junk DNA? To what extents are digital readouts like ENCODE representative of true biology, rather than just measuring noise? What is the nature of gene and protein evolution?
https://www.cell.com/current-biology/fulltext/S0960-9822(12)...
(Note that while I disagree strongly with Eddy in many ways, I've come to recognize that I simply don't understand the modern view of evolution outside the perspective of molecular biology, i.e., what geneticists like Eddy think.)
Also, recently, Demis Hassabis postulated that if he is successful, we will come up with silver-bullet cures within 10 years simply using machine learning. It's amazing how many computer scientists (I call him that rather than a biologist, although he has worked in neuro) reach this conclusion.
I've got a background in neuroscience and transitioned to data science a few years ago. Your comment about the rabbit hole of modern biology is spot on. I've been hearing for 10+ years about how ML like computer vision will revolutionize medical diagnosis and treatment. It hasn't happened yet and I think that enthusiasm comes from the fact that we built computer systems from the ground up and therefore know them deeply, whereas biological systems aren't fully understood.
Appreciate the sarcasm, but... it's really 3 billion years of evolution, with astronomical levels of actual entities living and dying in a dynamic world environment. Chemical reactions happening in nanoseconds. Polymers have extraordinarily complex behavior!
The field of biology was created by people who love to classify/name things. This has resulted in what we have now: A subject where the prerequisite to understanding is the ability to read long passages of text littered with jargon and visualize what that might represent. Even if everyone's reading skills were where they should be, the second part is not a super common skillset.
It's one of the reasons why I work in visualization for life sciences education: I think we're missing out on people who might otherwise make massive contributions to the field because they failed to memorize what the "endoplasmic reticulum" does. Much of biology you don't have to actually remember what things are called in order to understand the processes (at least at a basic level like what a middle schooler might be taught). Once you're exposed to the fascinating complexity of life at that level, for many people it can be interesting enough to build the motivation for the memorization/etc.
> The field of biology was created by people who love to classify/name things.
More to the point, the field of biology is so complex that for the longest time we could only name and classify things. Understanding came later, when we'd accumulated enough data and had hints from chemistry and other fields.
The problem is that once we gain that understanding, we add that as one more chapter to our textbooks, one more lesson tacked on, instead of rethinking the curriculum around our understanding.
The use of Latin doesn't help either. "Cytoplasmic net" (or better yet "plasma net") is a lot easier to understand, visualise and remember than "endoplasmic reticulum".
If you are an English speaker. If you are a native speaker of a Latin-based language, "reticulum" is pretty clear (reticolo, retículo, réticule, etc.). So it's just a point of view, dictated by the most-used language within research/education at a particular point in time.
>Much of biology you don't have to actually remember what things are called in order to understand the processes
But even that's beside the point: all these things are nothing more than abstractions created by humans, and ultimately it's all one giant soup of interacting molecules.
This entire article reminded me of reading the introduction to Ender's Game by Orson Scott Card (it's in the 25th anniversary edition).
He mentions reading Kon-Tiki by Thor Heyerdahl and finding archaeology, as described by Thor, to be incredibly fascinating and exciting (which I agree with having read Kon-Tiki as well).
Card goes on to say that when he tried ACTUAL archaeology he found it incredibly boring. e.g. it was mostly sitting out in the hot sun dusting off rocks hoping to find some bones.
It's a reminder of two facts:
1. EVERY activity has exciting and boring parts
2. A good writer can make even dull and boring activities come alive
I had always thought of biology as 'less rigorous' than the other sciences, and consequently less deserving of merit than, say, physics (my major). Less mathematical, not as rigorous, purely memorization devoid of deep understanding.
It took me a while to shed that view.
1. There's an inherent charm and beauty to biology, and the ability to memorise is a skill.
2. The many different sub-disciplines of biology demonstrate the level of complexity that the field demands. And, even if it isn't as 'rigorous' as physics, do we denounce experimentalists because theoretical physicists exist? They simply serve as distinct, but crucial, parts of a chain.
I took some programming courses in college. I loved computers and was very interested. However, the classes were a guy reading from a book about C. That was pretty much it. You did what the book said and hoped something stuck in your head.
This was early days of the internet, the book(s) were largely the only resource. The instructors were folks who just understood coding in C naturally and had no idea how to communicate with those who did not. No joy in anything, just raw code.
I dropped out.
Decades later after age 40 I was at a career crossroads and took a web development class. I loved it, I could make things quickly, the instructor actually understood how to teach / introduce concepts. I've been happily coding professionally and personally since then.
How things are presented sometimes makes all the difference.
I remember my first interaction with computers was on one of those ancient ones way back when. Our teacher showed us how to make a circle appear on the screen. I was preoccupied with how the computer was actually able to render that circle, what exactly was happening under the hood and what kind of physics was happening for all this to come together as a circle on the screen and not that particular function of whatever program they were using at the time. That turned me off to wanting to mess around with computers for awhile.
The post by James Somers that this article references at the top inspired me to buy the David Goodsell book The Machinery of Life. I would seriously recommend that to anyone who doesn't have a background in biology (like me). The phrase is a bit of a cliché, but it genuinely blew my mind, to the extent that I had to read it slowly because there's so much fascinating stuff packed into such a small book. It's obvious to me now, but the fact that so much of this stuff is about physical shapes locking into each other, and doing it at an almost unimaginable speed, was absolutely enthralling.
My interest in biology isn’t driven at all by stories, history, or “adventure”, but rather by the awe-inspiring complexity and majesty of all the microbiological processes and their interplay.
Yes, it’s pop science, but last year I read through Philipp Dettmer’s “Immune”, and the description of how the immune system continuously generates random/arbitrary sequences of nucleotides, builds the proteins that those sequences encode, and then subjects the resulting proteins to a “is this a ‘me’ protein or an ‘other’ protein?” gauntlet, the latter path of which allows the body to create antibodies for completely novel proteins... is just incredible.
I have an idle fantasy that, in the afterlife, I’ll be able to ask God questions like “so what are quarks made of?”, “why is the speed of light what it is and not any faster/slower? What would the universe have been like if the speed of light were several orders of magnitude faster/slower?”, “is there a single force that unifies all the ones that humans know about? What would the universe have been like if the weak nuclear force were just a tiny bit weaker?”, etc etc etc etc etc etc etc.
Same inspiration, but I wouldn't relegate it to 'pop science'; it's simply less axiomatic than physicists and mathematicians would like. The fact that there are 4 billion years of ecological change on top of the biological change just makes things hard to prove empirically.
Especially when physicists use things like the anthropic principle to describe our own universe.
A complex three dimensional organism self-assembling from a single cell is 100% magic, especially given how resilient it is to disruption. You can kill one of the two cells produced by the first division and still get a fully formed organism (that's one of the actual early experiments in morphogenesis theory).
Well, this is incredible:
"The gene sequence had a strange repeating structure, CAGCAGCAG… continuing for 17 repeats on average (ranging between 10 to 35 normally), encoding a huge protein that’s found in neurons and testicular tissue (its exact function is still not well understood). The mutation that causes HD increases the number of repeats to more than forty – a “molecular stutter” – creating a longer huntingtin protein, which is believed to form abnormally sized clumps when enzymes in neural cells cut it. The more repeats there are, the sooner the symptoms occur and the higher the severity"
Not the only sequence model that exhibits stutters on repetitive inputs...
My father, who was a teacher, considered teaching classes to be a kind of performance art. For getting information, you are better off with a book (or other media). His goal was to put on a performance good enough to get students interested and, ideally, to read the books later.
I can really relate to this — in school, biology felt like dry memorization. It never clicked with me, and I wrote it off for years. If I could recommend one subtopic of biology to math and physics people, it would definitely be mycology!
It's like real-life Pokémon GO and field mycology has a "collect 'em all" vibe. You get out into nature, identify and catalog fungi — it scratches the same itch as exploring an open-world game.
Fungi are discrete, classifiable entities with tons of metadata: GPS location, substrate, time of year, morphology, spore prints, photos, microscopic features. Perfect for structured data nerds.
Unlike many branches of biology, you don’t need to go to the Amazon. You can walk into your backyard or a nearby forest and find species newly known for your country and sometimes even new for science.
Microscopes, macro lenses, chemicals, even DNA sequencing. There’s a hacker spirit in mycology.
Projects like iNaturalist, Mushroom Observer, and FungiMap are full of real scientific contributions from everyday people. The barrier to entry is low, the impact can be surprisingly high, and the community is genuinely welcoming. Many leading contributors — even those publishing in cutting-edge scientific journals — are passionate autodidacts rather than formally trained biologists.
High intra-species variance, subtle features — a perfect playground for machine learning, which is nowhere near "solved" here.
Cordyceps that zombify insects. Giant underground networks that share nutrients between trees. Bioluminescent mushrooms. Many weird stories.
Mycology is also becoming a computational frontier - projects like FungiNet use graph networks to map symbiotic relationships, and citizen science platforms generate massive datasets perfect for ML applications beyond just classification. The unsolved phylogenetic relationships and complex biochemical pathways of fungi represent some of the most interesting computational problems in modern biology.
Hey, a lot of fellow biologists here! A few questions:
Is there a 'hacker news' for biology that I'm missing out on?
Where do you get your biology news from?
Where do you think the field/s are going?
Is bio harder than other STEMs?
I'm a neuroscientist/bioengineer by training and profession. I followed the path that a lot of commenters here did too, in that I came back to bio after a harder STEM career (physics). Glad to know I'm not alone in this!
I have a PhD in cell biology, as well as post-doctoral research experience. I'm now a software engineer for a big chip designer.
> Is there a 'hacker news' for biology that I'm missing out on?
Not that I know of. Could be cool. I produced a prototype platform that was a searchable DB of all papers published on PubMed, with a comment and karma system. The idea was to incentivise continued debate on published papers. Currently discredited papers continue to get cited for many years, because almost no-one publishes their critiques once the paper is published.
> Where do you get your biology news from?
Back in the lab, we would mostly just closely follow our small field with automated searches of PubMed. We'd find out about wider stuff by attending talks and speaking to other scientists. Often new and exciting things will take years to publish, so you would often find out before publication via talks.
> Is bio harder than other STEMs?
Only because the traditional high-school and university education in biology does not prepare people for the realities of cutting-edge bio research. I did a load of coding, image analysis, complex microscopy, chemistry etc etc, during my PhD and post-doc, none of which was taught during my degree or high school.
To thrive in biology research, you have to be comfortable not knowing about a thing, but figuring out how to do the thing, which likely no-one has done before.
1) Sadly there isn't really. There are a few good blogs like Derek Lowe's "In the Pipeline" that centralize news, but no anonymous online forum like this.
2) Google scholar alerts, Twitter, Bluesky, and word of mouth.
3) I think our understanding of biological processes at the mesoscale is about to hit an inflection point, largely through advances in electron microscopy (cryo-ET) and the ability to perform simulations at this scale.
4) Not harder but definitely more messy and progress is less linear.
> I liked math and physics and economics and even chemistry,
I think the reason is that these subjects give students a sense of instant gratification and instant empowering. Every concept and every theorem could be a breakthrough to my problem-solving skills, and I can immediately see how I can solve more problems through the learning.
But of course, nothing gives more instant gratification than CS. I can try and build things via my keyboard at any time. The learning and the associated work does not get better than that.
Love biology. I appreciate that purist mathematicians/logicians prefer chemistry and physics, and it seems to be an inside joke in those professions that biology isn't on the same level when it comes to axiomatic things.
I'm a classic INTJ but left school and built biology-online.org 25ish years ago. I think it's had a couple of thousand years of reading hours. I sold it on, thinking I lacked the expertise the topic deserves (it ranked well on Google for lots of biological terms).
I love the lack of agency in biology/evolution: it found a way to create us, as well as the huge tree of life around us, purely through biological/ecological pressures. And here we are. We owe a lot to how biology has expressed things over the past 4 billion years, and we will likely find out a whole lot more.
> Despite its popularity, the MBTI has been widely regarded as pseudoscience by the scientific community.[1][3][2] The validity (statistical validity and test validity) of the MBTI as a psychometric instrument has been the subject of much criticism.
> Many of the studies that endorse MBTI are methodologically weak or unscientific.[13] A 1996 review by Gardner and Martinko concluded: "It is clear that efforts to detect simplistic linkages between type preferences and managerial effectiveness have been disappointing. Indeed, given the mixed quality of research and the inconsistent findings, no definitive conclusion regarding these relationships can be drawn."[13][72]
>The test has been likened to horoscopes, as both rely on the Barnum effect, flattery, and confirmation bias, leading participants to personally identify with descriptions that are somewhat desirable, vague, and widely applicable.[10][73] MBTI is not recommended in counseling.[74]
Any survey (as opposed to horoscopes which aren't up to user choice) can be used to convey information about a person, even if that information is what they think about themselves. "I took a survey and I'm a Slytherin" conveys plenty, and no one feels the need to point out that that's unscientific.
I do quantitative biology now, although my background is in theoretical physics. Biology is fascinating, but ultimately there is a cultural divide between the scientific "language" used in biology and the scientific language of e.g., engineers, physicists (very famously described in "Can a biologist fix a radio?" https://www.cell.com/action/showPdf?pii=S1535-6108%2802%2900...)
I do find the author's point weird. "I thought high school biology was just memorizing facts, but I began to appreciate it when I read some pop science books and went scuba diving." So the only problem for the author was the topic of the classes, not the style. Why shouldn't one have the same problem with high school physics ("it's just about boring ramps and pulleys"), etc.? Personally I find the style to be a more important distinguishing factor, in that biology is much less quantitative than other science disciplines. Instead the author's problem is that biology should be even less quantitative and more literary or poetic...?
Ultimately science journalism/popularization is not the same thing as science. High school science classes (try to) teach the latter not the former.
High school physics and chemistry equips students to make (a very limited set of) predictions. High school biology super doesn't. When you're learning chemistry and physics, it feels like you're learning a systematic set of rules that let you approximate and model the world around you. Biology...doesn't, not really. Life is just more complex and higher order, and it's that much harder to actually use the study of it to understand the world immediately around you in any meaningful way.
It's still super cool, but it makes learning about it as a science less satisfying, since it's less friendly to the standard scientific method.
I've recently been delving into paleobiology, but what inspired it was very different from what's described in the post. I ingest a lot of pop educational stuff, mostly just for entertainment; but after a few years of just hearing the highlights and fun facts it became frustrating not being able to put all of it into context.
So I pushed myself a little out of my comfort zone and ordered a textbook and enrolled in a course. It made me realize how I've forgotten how to learn without it being entertainment. But, after some acclimation, I also realized that I don't really need an engaging presentation, because I really do just enjoy learning. So in a way my journey has been kinda the opposite of the author's - the 'fluff' around the information made it less appealing, not more. Though I suppose I might not have taken the leap to delve deeper into these topics in the first place if it weren't for the accessible versions.
Either way though, I think the real takeaway isn't that there's a right way to be interested in a topic - whether through stories and history or otherwise - but rather that school isn't the best environment for figuring out if something interests you, and it's worth re-visiting topics you might have written off with a fresh approach.
>I think the real takeaway isn't that there's a right way to be interested in a topic
I think a different perspective can sometimes illuminate though, it's not just about the person - it's them having an epiphany that motivates them to do something, like learn more.
>pop educational stuff,
I watch a lot of that as lazy entertainment, so much of it is factually incorrect (on YouTube etc). But I know better I guess.
One thing I was suggested when learning a new subject: pick a book like "history and philosophy of physics/chemistry/biology". This has served me well. Most of the teachers are bad; most of the text book writers are bad. In my considered view, a philosopher of biology can write a better biology textbook for newbies.
If you want to be fascinated with biology, just go to nature, or a park, and stay there for a while. After a while you'll start to wonder about the birds, the plants, the snails, the cats. Biology is a descriptive science, and there's nothing wrong with that.
I don't know if just going to nature is sufficient to get fascinated with biology. In my opinion it takes a fundamental reset in how you think about anything you see. Humans, while smart, have obviously had to learn to "ignore" thinking about how things work. You don't think too hard about how anything works, really. I mean at a cursory level, sure, but by vastly different interpretations of the word "cursory" you can change your thirst to know how the things you see work at more and more fundamental levels.
You don't need to go into nature to get this curiosity except for the possibility that it makes you more meditative. You can look at your arm and think what the hell happens in there at a molecular level to make you move the muscles. Or when someone says nerves conduct electricity what the hell does that mean?
When I think like that, I'm just curious why OP and others blame teachers or whoever else for not instilling that curiosity in them. Like it's someone else's job to make you curious? In my opinion you're either born that way or you're not. Some airport store book isn't gonna make you the next whatever scientist you adulate.
I sometimes skywatch late at night, marveling at the vastness of what's out there, and the glimpse of it we get over here. That gives me a sense of wonder about space, but did not make astronomy any more appealing to me.
Gaining an appreciation for nature is good, getting fascinated with biology is also good, but one is not necessarily related to the other in practice.
In high school I was all math>physics>chemistry>biology. So I didn't take biology. Much to my peril. I didn't learn that I wasn't just a brain on a stick until I was 25! At some point "The Inner Life of the Cell" blew my mind.
I was lucky to have a great AP Biology teacher in high school. I ended up minoring in the field and it has shaped my career. Now my child is a little biologist. It is a fascinating subject and so core to everything we are and everything we do.
I hated biology in school too. I wasn't interested in any of the memory tests. Still got the top grade, but dropped it when I could.
Later I got a job in bioinformatics. I worked on virus genomics. Definitely recommend viruses (specifically RNA viruses) because it's all the cool stuff without memorising bits of a flower etc.
Biology is a huge field, though. I can't criticise the way it was taught because I have no idea how to do it better. Not everyone is into genomics.
This article really strikes a chord: going through high-school biology I was shocked by the desiccation of life in the way everything was presented, as if death itself had written the curriculum. I focussed on maths and suspected this was the hidden agenda: only present man-made constructs, treat the rest as if it were just wrong maths.
I'm just going to recommend the biology books written by Lewis Thomas. The books are collections of essays rather than science or text books. They blew my mind and opened up a deep respect for the field of biology and gave me a deep appreciation of life in all its forms, so many of which I didn't know existed.
Look for:
The Lives of a Cell: Notes of a Biology Watcher
The Medusa and the Snail: More Notes of a Biology Watcher
I think this is true for most any subject taught in school.
I have loved history since I was six and my parents got me my first adult history book. I love how all stories fit together, understanding why things happened the way they did, how and why people in the past thought differently than we do today, all of it. If you read a textbook, though, history is just memorizing one thing happening after another.
Part of that might be my (American) education system's fear of controversy: explaining what motivated abolitionists and slave-holders in the 1850s, the actual stakes over which they were fighting, would not be popular in many states, and some parents would no doubt object. But also, it's complicated because the past is a different country - all of a sudden you are having to explain the way that the economics of the Industrial Revolution changed the demand for complementary goods (1), the Curse of Ham (2), the way that printing presses functioned in antebellum American democracy (3), and the pre-Civil Service patronage system (4). Basically, you have to teach a college level course to understand how things were different then and why they happened. And really good teachers can simplify the details down to an age-appropriate level, but most teachers are, well, average, and so memorization is a lot easier path to follow.
1: The beginning of the industrial revolution mechanized looms and spinning wheels, and mechanized cleaning raw cotton. As basic microeconomics suggests, those improvements suddenly massively increased the demand for cotton. Those demand spikes transformed large slave owners from people who understood that slavery was bad and wanted to see it ended but not quite yet to people who thought that slavery was a positive boon for the enslaved people they owned. You can actually see this in their writing, in 1800 most slave-owners think that slavery is on its way out and will not spread much, and in 1830 slavery is the best thing that God gave people anywhere.
2: The Southern Baptist Convention created itself in 1845 because so many didn't think that National Baptists in the General Missionary Convention were committed to defending slavery and the Curse of Ham, and they wanted to be part of a religion dedicated to the idea that White people should rule over Black people.
3: Before the secret ballot each party would provide its own ballots, pre-marked, and you just turned in the ballot of the party you supported. This naturally meant that each party had its own printing press in each town, which meant that they also had newspapers, pamphlets, and the like, and the press-owner was almost always one of the most committed political partisans in an area. Then when their party won they would get the contract for printing all documents the government needed in that area. This is a major driver for political polarization in the 1840's and 1850s.
4: Before the existence of Civil Service protections, basically all of the staff of the government would change over with a new Administration, every postmaster in every town would be appointed by the President and would change with every election. The fear that a Northern President committed to abolition would use this patronage- and printing contracts to printing press owners- to build a large segment of white southerners committed to abolition- who would in turn spark a slave revolt- that was why so many Southern states tried to leave the Union at the election of the first Republican President, before he was even inaugurated or had a chance to do anything. Because if they waited, he would appoint abolitionists to every town in the country, so they had to get ahead of him.
I am sure the author is a fine person, but this is an incredibly self-entitled piece. A number of biologists managed to make it through these classes just fine, and are paid much less for pursuing their passion (and making the breakthroughs the author enjoys reading about while on vacation).
A title like "I wish I had enough attention to get through the boring parts of high school biology, I now find pop biology interesting" may have had less impact, though.
Computer scientists and programmers are very intelligent people who often have grossly unrealistic projections of their competency in other fields, and this is a fine example of the phenomenon.
The post is not about becoming a professional academic/researcher in biology, so it's not clear why your comments (this and the earlier deleted one) focus on competency, calling the author "not cut out for biology", etc.
The post is simply about what you call enough attention to get through the boring parts of high school biology — should biology in school be only for those who have that ability? Even if being a professional biologist requires those attributes, shouldn't the teaching of the science of life—which is full of wonder—have a bit of something for everyone else too? Even people who don't become biologists ought to love biology, surely?
That's what the post (like the earlier one by Somers) is about; it's not about “I could have become a biologist” (as you seem to be implying). You can call it pop biology, but it's missing from school where “astonishing facts were presented without astonishment”. I see nothing self-entitled about this.
It's the same in mathematics, say: even if being a professional mathematician requires (say) thinking long and hard and being willing to struggle with difficult problems, manipulating things in one's head, etc — surely there is value in exposing more students to pop mathematics / beautiful results (enjoying which is very different from actually doing mathematics, sure), so that more people could love mathematics recreationally, whether or not they become professional ones?
The author did fine in another field, but might have picked biology instead if they had gotten the switch flipped earlier in life. That some people get through bad classes isn't a proof that those classes are good; you get those few who would survive no matter what, and those whose brain-wiring is conducive to the way the bad classes are structured. This has a tendency to reduce diversity of thought over time, and contributes to academic ossification.
Secondly, fields really do need cross-discipline collaboration. Finding passionate CS people is fantastic because they bring a different skill set. I have often found that when we get diverse experts together, we can have everyone do the "easy part" and get results which would be otherwise unobtainable.
Yes, some people have 'engineer's disease' and fail to appreciate the depth of knowledge and skills of folks who have spent their life in another domain... But the author doesn't seem to be one of these. Many of their favorite stories appreciate the combination of insight and hard work in the history of the field.
It does, indeed, suck that people working in biology get paid less than computer engineers. Blame capitalism...
As a biologist with a tech background (but actual biotechnology majors) - please we have enough tech bros who think they're biology's saviors. They'll just come in fascinated by some technological problem, call it the only blocker to solving aids and cancer and take away a billion dollars in funding over decades and show nothing of actual consequence. Like the entire protein folding field. It's a tool. Not the solution. Even today there was this hyperbolic piece on NBC about how this Harvard scientist working on microscopy image processing is being deported and now we are not going to cure cancer.
I feel bad for them, but I can assure you, as someone who did the research in the exact same field, they're curing nothing and are more likely to make cures slower by sucking away funding from more pertinent projects.
I've been working my way towards a biology degree very slowly (I can only really fit one class at a time alongside working full time). I'm maybe 70% of the way to a bachelor's degree in it. Been writing code for ages, but I've saved enough to accept a lower salary if it means I get to work on a real problem for once in my life. So I guess I'm one of those people you're frustrated with.
Do you have any advice for how to not be that kind of problem? For now I'm just focusing on my coursework, but at some point I'll be biologist-enough to help out with research. How do I approach it without being that guy?
In my (possibly not the best) opinion the most important quality will be to not delude oneself with the idea that their method or field is the most important field in all of science. Unfortunately academic structures force you to think and believe that and then proselytize that way. But if you stay above it at least in my books you're above most folks. But then I'm a lowly guy in a corner lol.
Practically what this means is that you should decide what you truly want to change (not necessarily what you can change with your current expertise) and pursue it across whatever fields necessary. If it's curing a disease, you have to decide what is the most important thing that's stopping us from curing that disease and pursue that exact topic. More often than not it's not anything software related. You have to grab a pipette at some point and guillotine a few mice at another lol.
I once met a scientist who spent a week traveling to where there was a powerful x-ray laser. He used it to blast a thin film of something or other that was floating on the surface of some water. He left with a flash drive full of data and some FORTRAN titled LSQREFL, which allegedly could decode the laser results. He then spent the next 6 months trying to make it actually do that. Turns out you had to have a folder with today's date on it on your desktop, otherwise the program would crash. This was documented nowhere, he just eventually puzzled it out from the code.
I offered to put it on github for him, so that at least he didn't have to be the sole caretaker for this endangered bit of software, but he was afraid of running afoul of the original author's rights, so endangered it will stay.
This was maybe an unlikely occurrence, falling neatly into the 'not' part of your:
> More often than not it's not anything software related
But it makes me think that there is still some juice left to squeeze out there. I mean, I'm having a good time with my one-class-per-semester, I'd just prefer to not have to do it for another decade before I'm enough of a biologist to get my hands dirty.
What does the author claim entitlement to? Or what real-world malign effect are you expecting from this piece that warrants the charge? I went in expecting the type of piece you describe, since I know the type, but I've failed to read it as you do except with a disqualifying squint.
I think one of the things I love most about biology is its uncertainty. Things like Math and engineering are all rigid and rules based. Life is wibbily wobbly, lifey-wifey. An enormous soup of changing alleles cast as probabilities over eons all creating endless interactions you can't ever comprehend.
You have to become comfortable with the fact that there is uncertainty and there are parts of it you can't control. So instead you have to be obsessed with introducing order where you can. It is so refreshing to see a beautiful experiment that can wrestle a clear signal from the endless noise.
> Things like Math and engineering are all rigid and rules based
Depends where you look. In things like particle physics, things get all wibbly wobbly, is-my-cat-dead-or-alive. In things like engineering, quite often what you're dealing with is probability based, but you just stack the deck so far in your favor that the probability is effectively 1.
As they say, building a bridge that doesn't fall down is easy. Building a bridge that barely doesn't fall down is much harder.
A lot of experimental and applied physics operates this way. If you are synthesizing material, for example, it takes a lot of time and effort to get high yields of what you want. Before that your processes can be very probabilistic.
In fact, I just finished listening to a talk where an experimentalist discussed how to get the fabrication yields of superconducting qubits from the current low double digits to 99.99+%.
Biology is messy at a macro level is all I'm saying. I don't need a hundred people butting in saying "butt aschully phsyix and code is also messy and harder at a quantum level." I know. We know.
Every scientist does that at some point. I've crossed my fingers plenty of times and hoped that code I'd written would work, especially on the first run. The superstition is even more rewarding when the project is hard and you're a bit daffy by the end.
It's a human thing.
Surely Feynman made jesting comments before running experiments. I'm sure some digging in his wonderful books and letters would turn up many examples.
> Genetic algorithms are commonly used to generate high-quality solutions to optimization and search problems via biologically inspired operators such as selection, crossover, and mutation.
Rosalind.info has free bioinformatics exercises in Python that apply CS algorithms, browsable as a tree or a list, including genetic combinatorics.
https://rosalind.info/problems/list-view/
From what I can see, there isn't a "GA with code" exercise in the AP Bio or Rosalind curricula.
YouTube has videos of simulated humanoids learning to walk with MuJoCo and genetic algorithms, which demonstrate goal-based genetic programming with cost / error / fitness / survival functions.
Mutating a source-code AST is a bit different from mutating candidate solutions to a well-defined optimization problem with specific parameters, though the task is basically the same: minimize the error between input and output, and then explain the result (XAI).
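To make that concrete, here is a minimal toy GA in Python - a sketch added for illustration, not something from Rosalind or the AP Bio curriculum; the target string, alphabet, and parameters are all made up - that evolves a random string toward a target using selection, crossover, and mutation:

    # Toy genetic algorithm: evolve a random string toward a target.
    # Everything here (target, alphabet, parameters) is invented for illustration.
    import random

    TARGET = "CAGCAGCAGCAGCAG"     # arbitrary illustrative target
    ALPHABET = "ACGT"
    POP_SIZE, MUTATION_RATE, GENERATIONS = 100, 0.02, 200

    def fitness(candidate):
        # Error to minimize: number of positions differing from the target.
        return sum(a != b for a, b in zip(candidate, TARGET))

    def random_individual():
        return "".join(random.choice(ALPHABET) for _ in range(len(TARGET)))

    def select(population):
        # Tournament selection: keep the fitter of two random individuals.
        a, b = random.sample(population, 2)
        return a if fitness(a) <= fitness(b) else b

    def crossover(p1, p2):
        cut = random.randrange(1, len(p1))
        return p1[:cut] + p2[cut:]

    def mutate(ind):
        return "".join(random.choice(ALPHABET) if random.random() < MUTATION_RATE else c
                       for c in ind)

    population = [random_individual() for _ in range(POP_SIZE)]
    for gen in range(GENERATIONS):
        best = min(population, key=fitness)
        if fitness(best) == 0:
            break
        population = [mutate(crossover(select(population), select(population)))
                      for _ in range(POP_SIZE)]
    print(gen, best, fitness(best))

Swap the fitness function for a real error measure and the same loop covers the "minimize error between input and output" framing above.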
I think the whole teaching the history of computers is a big failure at an attempt to Segway into computer organization and architecture. Nonetheless, I get what is happening. If it’s a pure Computer programming class then the goal maybe to have them understand the “basics”…like what is the hard drive vs RAM (memory allocation) or what is a transistor (Boolean logic) and what is a punch card (mnemonics and abstractions of those mnemonics to what is now just a computer programming language).
(Unless you're riding a motorized vehicle, the word is segue, not Segway)
This is very much a tangent, but I think it's nearly certain that "segway" will end up overtaking "segue" as the predominant spelling for the word that is defined as: "to make a transition without interruption from one activity, topic, scene, or part to another"
The "mistake" happens so often, partially because "segway" is a much more straightforward spelling if one has only heard the word said aloud, that I think it will eventually become the actual way it is spelled!
It will likely end up like many other more phonetic spellings: an indicator of ignorance, but more acceptable in America than elsewhere.
English is spelled phonetically. Just not Modern English phonetically, but Middle English phonetically. And then it froze into ideography because of the printing press.
This is an absurd take. We should not bend language around ignorance. There is a beauty to effort. Please take a second to explore this for yourself.
Actually it's not an absurd take at all. The absurd take is that we "should not bend language around ignorance."
That's precisely how language changes over time. Language is not a strict set of rules. It's based on understanding and consensus, so sometimes things that are "wrong" do end up being accepted.
I suggest this as a great introduction into what languages are and how they evolve over time https://www.amazon.com/Language-Families-of-World-audiobook/...
I am not a native speaker, but the two words do not sound even remotely the same.
How does this mistake happen so often? Can you explain people's thought process a bit? Is it just: "Something something 'seg...' ... ah I know, I will simply use another random word that starts with the same 3 letters and doesn't make sense in this phrase!"?
Also, this is the first time I've seen it.
> the two words do not sound even remotely the same
Pronounced correctly, “segue” sounds just like “Segway” – not like “seg-oo”, as you might have assumed.
The two words are pronounced identically.
Most mistakes remain mistakes, and do not become part of the language. The idea that mistakes generally get accepted as correct is simply untrue, which is what you are implying.
I am sure people will make the mistake, as they sometimes do today. But it is a mistake, and will likely be recognised as one.
It is likely that the language will get more cemented by automatic spelling and grammatical correction, including by AI. For example, there are a number of grammatical and spelling changes that have been cemented by American spelling/grammar checking programmes, e.g. MS Word.
> The idea that mistakes generally get accepted as correct is simply untrue, which is what you are implying.
I did not imply that at all. I said sometimes, so it's not that absurd that it could happen. It does happen though, and a quick google search will give you pages of examples.
I'm sure you'd describe yourself as a Digital Auditor Workstation used by ADM DeeJays. After all, language changes over time, so my errors of usage are all correct.
Was that the point? Don't forget that you're on Hacker News, not Reddit. Strawmen are less accepted in this community. You are not, individually, the consensus that was described, nor did anybody in this thread imply that "all errors of usage are correct" and accepted. Your sarcasm is unwarranted and provides little value to this conversation.
Write that in Old English orthography and you'll make a more consistent argument.
Þis is án ungeswutellic andswaru. Wé ne sculon bǣgan spræce ymbe ungewitt. Þǣr is fægernes tō earfeþe. Bidd þé niman án ōþer tid tō smeagan þis sylf.
I passed no judgement for or against, merely discussed that it was likely to happen.
I suggest you yourself take a second and explore why you think being smarmy on the internet is a way of getting people to agree with you.
Tbh, I’m more critical of commonly confused words in English like affect and effect, or discrete and discreet.
I'm more forgiving of mixing up homophones, even if one of them is a registered trademark (Segway).
Aren't discrete and discreet homophones?
Personally, I struggled a lot in my earlier CS/informatics education, partly because I never felt like I understood what was actually happening or how we got here; everything was just factoids in a void. When I took a gap semester between my A.S. and B.S., I finally studied and explored a bit of the history, and it put a lot into perspective.
> have them understand the “basics”…like what is the hard drive vs RAM (memory allocation) or what is a transistor (Boolean logic)
You must understand these things at least conceptually if you want to really understand how to write efficient programs. Maybe not at the level of how memory can electronically "remember" a 1 or a zero, or how a hard drive can magnetically do it, but at least the relative speeds e.g. register vs. cache vs. RAM vs. disk.
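For a rough sense of why those relative speeds matter, here's a back-of-envelope sketch in Python; the latency figures are commonly cited order-of-magnitude numbers, not measurements from any particular machine, so treat them as illustrative assumptions:

    # Back-of-envelope model of the memory hierarchy.
    # Latencies are rough, commonly cited orders of magnitude,
    # not measurements from any particular machine.
    ACCESS_TIME_NS = {
        "L1 cache": 1,                      # ~1 ns
        "L2 cache": 4,                      # a few ns
        "RAM": 100,                         # ~100 ns
        "SSD random read": 100_000,         # ~100 microseconds
        "Spinning disk seek": 10_000_000,   # ~10 ms
    }

    def total_time_ms(n_accesses, tier):
        """Time for n dependent accesses if every one of them hits the given tier."""
        return n_accesses * ACCESS_TIME_NS[tier] / 1e6

    N = 1_000_000  # a million accesses
    for tier in ACCESS_TIME_NS:
        print(f"{tier:20s} {total_time_ms(N, tier):>16,.1f} ms")

The same million accesses take about a millisecond from L1 and hours from a seeking disk, which is the whole point of knowing the hierarchy.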
I've loved the history of computers since I was young, although if I was forced to learn about it in school I know it would suck.
I basically found this in college too; I quickly gave up on computer science as a major. I'd rather just go out and learn how to build what I want to build than sit through a 3-hour lecture about how the JVM works.
The answer is it's magic and no one cares, now let's go build some games
Firstly, and this is worth pointing out, "computer science" is not about programming. It's about science, in this case specifically the science that makes computers work.
At school I thought "computer science" meant "programming" - which it doesn't. So well done for recognizing this before wasting too much of your time. (Seriously, not sarcastic.) Programming can easily be learned outside college.
To other general readers here though I'll say that understanding the science can be really helpful over a career. It's not terribly applicable in getting that first job, but as you progress more and more of those theoretical fundamentals come into play.
Ultimately there are a small fraction of people who need to understand how it all works, all the way down, because those people build the things that programmers use to build everything else.
This is a myth. Computer science absolutely is about programming. The science that makes computers work is called physics.
There are theoretical parts of computer science, but it is fundamentally a practical subject. All of it is in service to programming. Type systems are about typing programs. Algorithms are implemented using programs. Data structures are for use in programs.
The very worst computer science lecturers are those that forget it is a practical subject and try to teach it like abstract mathematics, because they believe (whether they realise they believe it or not) that it is more prestigious to teach abstract concepts than practical concrete things.
It is the same in mathematics, where unfortunately there has developed a tradition since Bourbaki of trying to teach abstract notions as fundamental while concrete problem solving is left to the engineers. The result is that many engineers are much stronger mathematicians than many mathematically-trained students, and those students have to relearn the practical foundations of the subject before they can make progress at the graduate level. If they don't, they get stuck doing what looks like maths, but is actually just abstract roleplaying.
Imagine if they taught the history of English to kids before they could read
The history of English is taught in English classes. Historical context is important and interesting. You don't really understand a subject without knowing a bit of its history.
My favourite classes were those where we didn't just get taught facts and theorems but we also got taught a bit about who proved the theorem for the first time, who discovered this fact, what this algorithm was first used for, etc. So much easier to remember too.
This is one of the best things about studying law: the very nature of it makes it impossible to teach it without the historical context.
Since most people throughout history couldn't read, I guess it would be relatable?
How many of those who couldn't read knew the history of their or other languages?
they might just remember it all once they're adults!
imagine that!? an historically informed populace???
you'd need more expensive lies and higher quality fakes... the government would be costlier to run.
ideally, in the long term this would make the national currency's value in the international money market rise up. but why wait for that when one can directly manipulate money through trade fraud and covert military ploys?
That's not the point, the point is the ordering is inverted, not that history shouldn't be learned.
Those curriculums, developed by soul-dead committees agreeing on the minimum knowledge you've got to have, are a blight on Western civilization. Instead of giving students the ability to discover a topic, or build something they are interested in themselves, and then giving them an understanding of and fascination with the discoverers who came before them, they kill the subject.
I must confess, it gives my dry old heart some joy to see the anti-education masses that come from this, voting and storming the fortresses that produced the paywall around education - a paywall that only money for tutors or accidental intrinsic motivation could overcome - and burning and salting those outposts of classist academia.
Yes, definitely, destroying education as we know it without any plans for what the next thing is will definitely work.
Developed countries really need a come to Jesus moment, because the disdain for everything that made them great places is unbelievable. People will understand, after great suffering, that destroying stuff is much easier than building it.
> People will understand, after great suffering, that destroying stuff is much easier than building it.
"It is easier to destroy than to create" doesn't tell you when something should be torn down.
You can have a house that provided shelter for your family for generations, but if it's water damaged, the floors are rotting and it's full of toxic mold, the person who shows up with a bulldozer isn't necessarily wrong.
We're in the destroying phase right now. Unless you live in China - I hear they're mostly doing well. Or middle of nowhere Africa, where there's nothing to destroy because there's nothing there.
But systems can rot from within too, or just decay naturally, and don't need to be destroyed. What if the core ideas that built our current civilization were ideas of the past, that we don't have any more, and we don't know what to do when The Machine Stops? Doesn't have to be a literal machine - it's a good metaphor for how democracy fell apart.
Forgetting that it was the anti-education forces that created the curriculums. The war on public education goes back a long time; teachers lost the freedom to teach decades ago, and it has been the same forces behind it all along.
Ok... what would you do differently? Keep in mind you have to educate millions of students across an enormous spectrum of abilities, socioeconomic backgrounds, and interests.
No you don't. There is a narrow range of abilities at each level if students are properly held back when they haven't mastered the material.
Their interests are built by what they are taught. "Socioeconomic background" is a tautology. Their backgrounds are irrelevant.
My fork in the road between hard tech/hard science and biology was in high school. It seemed that students who wanted to become doctors took AP biology and students who wanted to be engineers took physics and chemistry. I had wanted to be an engineer since I was 12 years old, so I felt the decision was already made. But while studying neural networks in college in the '80s, I realized that there was this tremendously rich domain of real neurons which I knew nothing about. I worked as a software engineer for a couple of years after graduating but then went back to school to study neurophysiology. I did not pursue it as my area of work or research, but I am grateful for having had the opportunity to look at the world from the perspective of a biologist.
If you're an engineer and early in your career and feel there's something missing from your intellectual space, I encourage you to go back and get a graduate degree in something totally different. Humans live a very long time so don't feel like you're wasting time.
I've been programming since I was eight, but truly fell in love with biology in 12th grade chemistry: the first introduction to organic chemistry and biochemistry. It was the first time I truly started grokking the application of systems-level thinking to the biological world; how do trees "know" to turn red in the autumn? How do fetuses assemble themselves from two cells?
I decided to pursue a double major in biochemistry and evolutionary biology and it was one of the best decisions I've made in my life. The perspective you gain from understanding all life in terms of both networks and population dynamics of atoms, molecules, cells, tissue, organisms and populations -- and how every layer reflects the layer both underneath and above it in a fractal pattern -- is mind-expanding in a way I think you just don't and can't get designing software systems alone.
I work as a software engineer / founder now, but always reflect wistfully on my time as a biologist. I hope to get back to it some day in some way, and think what the Arc Institute team is doing is inspirational [0].
[0] https://arcinstitute.org/
Has anyone seen content that used this multiscale networking and population dynamics as an instructional approach?
For small example, there was a Princeton(?) coffee-table book which used "everyday" examples to illustrate cell/embryonic organizational techniques - like birds equally spacing themselves along a wire. Or compartmentalization, as a cross-cutting theme from molecules to ecosystems.
I've an odd hobby interest in exploring what science education content might look like if incentives were vastly different, and massive collaborative domain expertise was allocated to crafting an insightful, powerful, roughly quantitative, richly interwoven tapestry.
I would love to do something like this but simply cannot afford it. I think it is good advice but going back to school for a degree one does not plan on utilizing is not as feasible today as it was in the 80's, largely due to the sizeable increase in tuition without reciprocal increases in wages.
In this day and age, you can do this for FREE and on the side, whenever you have time!
There are tons of very well-done professional level video courses on Youtube.
There are more organized courses that only ask you for money for the "extras", like some tests and a certificate, but the main parts, texts and videos, are free.
You could start with a really good teaching professor (Eric Lander, MIT) and his course: https://www.edx.org/learn/biology/massachusetts-institute-of... (the "Audit" track is free, ignore the prices; also ignore the "expires" - this course restarts every few months and has been available in new versions for many years now)
It's very engaging!
There's similar courses for everything in the life sciences, there on edX, on Youtube, many other places.
I feel the true Internet is soooo underutilized by most people! Forget news sites, opinion blogs, or social media. Knowledge is there for the taking, free. Only the organized stuff, where you end up with a certificate costs money, but they usually still provide the actual content for free.
Time and energy are also at a premium in the current economy. Good luck learning biochemistry by watching YouTube videos after 8+h of coding and meetings plus commute plus making dinner plus cleaning up.
Depending on where you live, and what you want to study, you might be able to take a couple courses at the community college in areas of interest without spending a lot of money.
I was paid to get a PhD in Biology, albeit just enough to live on. Most people in PhD programs are, either through being a TA (teacher's assistant) or RA (research assistant). The real financial cost is the opportunity cost of 5-6 years of your life.
Whether or not broad support for training scientists holds up during and after the current administration remains to be seen.
Please, the cost isn’t your life; that is life, and it is great.
My current tuition is under 500 CAD per class. The opportunity cost of not working full time is the real bulk of the cost of studying in places that have a functional government.
>I would love to do something like this but simply cannot afford it.
Work for a company that will pay for it.
I can't imagine why a company would pay an engineer to get a master's degree in biology.
A lot of companies will pay for at least part of whatever college classes you take, without auditing whether or not it would be good for your specific job.
I encourage people to look into it, it's a benefit a lot of people have but don't use and it's leaving money on the table.
Can you say more? What kind of company would do such a thing? Maybe I live in a bubble, but that's so far outside of what I've seen that it just sounds fantastical.
Ok, both of these comments made me doubt my memory, so I just checked with my current employer, a very large consumer company: the limits of the program are that you get a C or above and that the class is "related" to your job or any job you could get at the company. But I've gotten classes paid for that were only tangentially related to my job with no problem. So I concede that you might not get a biology degree as an engineer, but my particular company does a lot of different things, so my guess is that in practice you'd have no problem. I also worked at a now-defunct mid-size startup and a hospital system with similarly loose requirements, but I don't have access to their docs anymore.
My company uses guildeducation.com and we can use roughly $5k a year (I think; it might be per semester). A lot of it is just individual classes, but there are also some degree programs. I don't know if they preselect which courses are available to us or if we have access to the whole catalog. I suspect it's somewhat curated, because we are a medical company and most of it is medical stuff. There is a CS bachelor's program, but last I checked there wasn't an MS CS program.
I would assume most companies with 100+ office workers (essentially big enough for an HR department) usually offer something like this in western countries.
Every company I ever worked for constrained it in many ways
1. Master's degree only; they won't pay for anyone to get a bachelor's or associate's
2. Must maintain a B average or better
3. Cannot take any time off, it has to be entirely on nights and weekends
4. Reimbursement after the fact, so you're taking on the initial financial risk up front.
I had a job with an education budget listed as benefits.
However, to use it there were constraints:
1. The topic has to be related to technologies used by the company - you cannot get a Google Cloud certification if they are using AWS.
2. You need approval from your line manager, HR, and the director of the office.
3. If it costs more than €250, you have to sign a loyalty agreement for a year, meaning you pay some amount back if you quit.
With all those strings attached, it is just marketing bullshit to attract new hires.
Plus, usually the employer wants it to be related to one's job, from their very limited perspective of the world and of management decisions. For example, I couldn't even take a language course as educational leave, because the employer did not make any use of my language skills.
Try something in the medical field, my company will pay for a bunch of medical related stuff when I just want to further my CS background.
I would love to do this, I just cannot afford it as others have already stated. It's depressing to feel like I spend so much of my life at my day job and yet require it to afford the tiny portion I get left. I wish things were different.
Jobs are a prison; if we had a slice of those efficiency gains, we would have ample time for all the things.
Same. Biology was an elective in high school and I never took it. I took Earth Science (basically introductory geology) and then went into the Chemistry/Physics track (two years of each). Never felt I missed it, last time I had any real biology education was a unit in 8th grade science and I didn't care for it then.
I am not sure biology is not a "hard science"?
The breakpoint was molecular biology around 1986 with the introduction of PCR. Once that happened, biology went from being alchemy to being science.
I loathed biology as taught prior to that. Once I got a molecular biology course, I thought biology was amazing and wondered "Why the hell did we teach all that other crap?"
Well, that was because the tools we had for biology sucked prior to PCR. My problem was that I recognized that even as a child.
I invested a great deal of effort over 30+ years to learn biology, which I started to love in high school when a teacher introduced us to molecular biology. Over time I've come to appreciate that biology is a huge field and people who master one area often know little to nothing about many others.
To be proficient in biology you need "extra" skills: an extra ability to work with ambiguity, the ability to memorize enormous amounts of descriptive information, and comfort with highly abstract representations. Digital biology often loses many aspects of biological reality, and then fails to make useful predictions.
Over the years, I've come to realize I know less and less about biology- that I greatly underestimated the complexity and subtlety of biological processes, and have come to admit that my own intelligence is too limited to work on some problems that I originally thought would be "easy engineering problems".
A great example of the rabbit hole that is modern biology is summed up here: what is the nature of junk DNA? To what extents are digital readouts like ENCODE representative of true biology, rather than just measuring noise? What is the nature of gene and protein evolution?
https://www.cell.com/current-biology/fulltext/S0960-9822(12)... (note that while I disagree strongly with Eddy in many ways, I've come to recognize that I simply don't understand the modern view of evolution outside the perspective of molecular biology, i.e., what geneticists like Eddy think).
Also, recently, Demis Hassabis postulated that if he is successful, we will come up with silver-bullet cures in 10 years' time simply using machine learning. It's amazing how many computer scientists (I call him that rather than a biologist, although he has worked in neuro) reach this conclusion.
I've got a background in neuroscience and transitioned to data science a few years ago. Your comment about the rabbit hole of modern biology is spot on. I've been hearing for 10+ years about how ML like computer vision will revolutionize medical diagnosis and treatment. It hasn't happened yet and I think that enthusiasm comes from the fact that we built computer systems from the ground up and therefore know them deeply, whereas biological systems aren't fully understood.
Why would biology be so hard? It’s only a billion years of evolution, after all. We’re dealing with billions of things all the time. /s
Appreciate the sarcasm, but... it's really 3 billion years of evolution, with astronomical levels of actual entities living and dying in a dynamic world environment. Chemical reactions happening in nanoseconds. Polymers have extraordinarily complex behavior!
The field of biology was created by people who love to classify/name things. This has resulted in what we have now: A subject where the prerequisite to understanding is the ability to read long passages of text littered with jargon and visualize what that might represent. Even if everyone's reading skills were where they should be, the second part is not a super common skillset.
It's one of the reasons why I work in visualization for life sciences education: I think we're missing out on people who might otherwise make massive contributions to the field because they failed to memorize what the "endoplasmic reticulum" does. For much of biology, you don't actually have to remember what things are called in order to understand the processes (at least at a basic level, like what a middle schooler might be taught). Once you're exposed to the fascinating complexity of life at that level, for many people it can be interesting enough to build the motivation for the memorization/etc.
> The field of biology was created by people who love to classify/name things.
More to the point, the field of biology is so complex that for the longest time we could only name and classify things. Understanding came later, when we'd accumulated enough data and had hints from chemistry and other fields.
The problem is that once we gain that understanding, we add that as one more chapter to our textbooks, one more lesson tacked on, instead of rethinking the curriculum around our understanding.
Agreed 100%! Really like this
The use of Latin doesn't help either. "Cytoplasmic net" (or better yet "plasma net") is a lot easier to understand, visualise and remember than "endoplasmic reticulum".
If you are an English speaker. If you are a native speaker of a Latin-based language, "reticulum" is pretty clear (reticolo, retículo, réticule, etc.). So it's just a point of view, dictated by the language most used in research/education at a particular point in time.
It's not that they love to classify things, it's that you have no choice but to do so for people to know what you're talking about.
Not a lot of point in spending time researching something, only for no one to know what you're even referring to.
>Much of biology you don't have to actually remember what things are called in order to understand the processes
But even that's beside the point: all these things are nothing more than abstractions created by humans, and ultimately it's all one giant soup of interacting molecules.
This entire article reminded me of reading the introduction to Ender's Game by Orson Scott Card (it's in the 25th anniversary edition).
He mentions reading Kon-Tiki by Thor Heyerdahl and finding archaeology, as described by Thor, to be incredibly fascinating and exciting (which I agree with having read Kon-Tiki as well).
Card goes on to say that when he tried ACTUAL archaeology he found it incredibly boring. e.g. it was mostly sitting out in the hot sun dusting off rocks hoping to find some bones.
It's a reminder of two facts:
1. EVERY activity has exciting and boring parts
2. A good writer can make even dull and boring activities come alive
I had always thought of biology as 'less rigorous' than the other sciences, and consequently less deserving of merit than, say, physics (my major). Less mathematical, not as rigorous, purely memorization devoid of deep understanding.
It took me a while to shed that view.
1. There's an inherent charm and beauty to biology, and the ability to memorise is a skill.
2. The many different sub-disciplines of biology demonstrate the level of complexity that the field demands. And, even if it isn't as 'rigorous' as physics, do we denounce experimentalists because theoretical physicists exist? They simply serve as distinct, but crucial, parts of a chain.
>Less mathematical, not as rigorous, purely memorization devoid of deep understanding.
This, here, is why nobody likes physicists. It's like they are taking extra courses on Advanced Hubris.
I took some programming courses in college. I loved computers and was very interested. However, the classes were a guy reading from a book about C. That was pretty much it. You did what the book said and hoped something stuck in your head.
This was early days of the internet, the book(s) were largely the only resource. The instructors were folks who just understood coding in C naturally and had no idea how to communicate with those who did not. No joy in anything, just raw code.
I dropped out.
Decades later after age 40 I was at a career crossroads and took a web development class. I loved it, I could make things quickly, the instructor actually understood how to teach / introduce concepts. I've been happily coding professionally and personally since then.
How things are presented sometimes makes all the difference.
I remember my first interaction with computers was on one of those ancient ones way back when. Our teacher showed us how to make a circle appear on the screen. I was preoccupied with how the computer was actually able to render that circle - what exactly was happening under the hood, and what kind of physics was making it all come together as a circle on the screen - rather than with that particular function of whatever program they were using at the time. That turned me off wanting to mess around with computers for a while.
The post by James Somers that this article references at the top inspired me to buy the David Goodsell book The Machinery of Life. I would seriously recommend that to anyone who doesn't have a background in biology (like me). The phrase is a bit of a cliché, but it genuinely blew my mind, to the extent that I had to read it slowly because there's so much fascinating stuff packed into such a small book. It's obvious to me now, but the fact that so much of this stuff is about physical shapes locking into each other, and doing it at an almost unimaginable speed, was absolutely enthralling.
Check out his[1] Molecule of the Month series (https://pdb101.rcsb.org/motm/) if you haven’t already!
[1]: Although he just retired from it. Janet Iwasa will continue the project.
Ha, same here! Bought that book about a year ago after reading that post
My interest in biology isn’t driven at all by stories, history, or “adventure”, but rather by the awe-inspiring complexity and majesty of all the microbiological processes and their interplay.
Yes, it’s pop science, but last year I read through Philipp Dettmer’s “Immune”, and the description of how the immune system continuously generates random/arbitrary sequences of nucleotides, builds the proteins that those sequences encode, and then subjects the resulting proteins to a “is this a ‘me’ protein or an ‘other’ protein?” gauntlet, the latter path of which allows the body to create antibodies for completely novel proteins... is just incredible.
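Purely as a toy analogy of that generate-then-screen idea (this is not how V(D)J recombination actually works; the "self" peptides, lengths, and matching rule below are invented for illustration), the gauntlet can be sketched in a few lines of Python:

    # Toy analogy of the immune system's generate-then-screen strategy:
    # make random "receptors", then discard any that react with "self".
    # Entirely illustrative: the alphabet, lengths, "self" set, and
    # matching rule are all made up.
    import random

    AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"
    SELF_PEPTIDES = {"MKTAYIAKQR", "GAVLIFWPMS", "TSYNQCDEKR"}  # fake "self" set

    def random_receptor(length=10):
        return "".join(random.choice(AMINO_ACIDS) for _ in range(length))

    def reacts_to_self(receptor):
        # Crude stand-in for self-recognition: sharing any 4-residue window
        # with a "self" peptide counts as a match.
        windows = {receptor[i:i + 4] for i in range(len(receptor) - 3)}
        return any(w in peptide for w in windows for peptide in SELF_PEPTIDES)

    # Negative selection: only receptors that ignore "self" survive the gauntlet.
    repertoire = [r for r in (random_receptor() for _ in range(100_000))
                  if not reacts_to_self(r)]
    print(f"{len(repertoire)} of 100,000 random receptors passed the self-check")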
I have an idle fantasy that, in the afterlife, I’ll be able to ask God questions like “so what are quarks made of?”, “why is the speed of light what it is and not any faster/slower? What would the universe have been like if the speed of light were several orders of magnitude faster/slower?”, “is there a single force that unifies all the ones that humans know about? What would the universe have been like if the weak nuclear force were just a tiny bit weaker?”, etc etc etc etc etc etc etc.
same inspiration but I wouldn't devolve it to 'pop science', it's simply less axiomatic than physicists and mathematicians would like. The fact there's 4 billion years of ecological change beyond the biological change just makes stuff hard to prove empirically.
esp. when physicists use things like the anthropic principle to describe our own universe.
A complex three dimensional organism self-assembling from a single cell is 100% magic, especially given how resilient it is to disruption. You can kill one of the two cells produced by the first division and still get a fully formed organism (that's one of the actual early experiments in morphogenesis theory).
Concentration gradients layered on concentration gradients layered on concentration gradients.
https://en.wikipedia.org/wiki/Segmentation_gene
And Brownian motion all but guaranteeing everything bumps into everything else constantly!
Well, this is incredible: "The gene sequence had a strange repeating structure, CAGCAGCAG… continuing for 17 repeats on average (ranging between 10 to 35 normally), encoding a huge protein that’s found in neurons and testicular tissue (its exact function is still not well understood). The mutation that causes HD increases the number of repeats to more than forty – a “molecular stutter” – creating a longer huntingtin protein, which is believed to form abnormally sized clumps when enzymes in neural cells cut it. The more repeats there are, the sooner the symptoms occur and the higher the severity"
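The "molecular stutter" is easy to play with in code. Here's a small sketch (the example sequences are made up, and the thresholds just mirror the ranges quoted above) that counts the longest CAG run and classifies it:

    # Count the longest run of consecutive CAG repeats in a DNA string and
    # compare it to the ranges quoted above (10-35 typical, >40 linked to HD).
    # The example sequences are made up for illustration.
    import re

    def longest_cag_run(seq):
        runs = re.findall(r"(?:CAG)+", seq.upper())
        return max((len(r) // 3 for r in runs), default=0)

    def classify(repeats):
        if repeats > 40:
            return "expanded (HD-associated range)"
        if repeats <= 35:
            return "within the typical range"
        return "intermediate"

    for example in ("TT" + "CAG" * 17 + "GG", "AA" + "CAG" * 44 + "CC"):
        n = longest_cag_run(example)
        print(n, classify(n))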
Not the only sequence model that exhibits stutters on repetitive inputs...
And on the seventh day, God ended His work which He had done and began vibe coding the remainder of the human genome.
this should do the trick...
You also need selection, not just mutation (I know you are being silly, so am I)
Selection is handled by asynchronous events which populate the is_dead() boolean.
Critiquing my own code, though, it should really be a check against 'can_reproduce()' rather than 'is_dead()'.
My father, who was a teacher, considered teaching classes to be a kind of performance art. For getting information, you are better off with a book (or other media). His goal was to put on a performance good enough to get students interested and, ideally, reading the books later.
Makes sense; the professors I remember as the best were those who were enthusiastic about teaching and/or the subject.
Related:
I should have loved biology (2020) - https://news.ycombinator.com/item?id=40103590 - April 2024 (253 comments)
I should have loved biology - https://news.ycombinator.com/item?id=32035054 - July 2022 (271 comments)
I should have loved biology - https://news.ycombinator.com/item?id=25136422 - Nov 2020 (298 comments)
I can really relate to this — in school, biology felt like dry memorization. It never clicked with me, and I wrote it off for years. If I could recommend one subtopic of biology to math and physics people, it would definitely be mycology!
It's like real-life Pokémon GO and field mycology has a "collect 'em all" vibe. You get out into nature, identify and catalog fungi — it scratches the same itch as exploring an open-world game.
Fungi are discrete, classifiable entities with tons of metadata: GPS location, substrate, time of year, morphology, spore prints, photos, microscopic features. Perfect for structured data nerds.
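To illustrate the structured-data angle, an observation maps naturally onto a small record type - the field names below are just one possible choice, not the schema of iNaturalist or any other platform:

    # One possible record type for a field observation. The fields mirror the
    # metadata listed above; they are not the schema of iNaturalist or any
    # other platform.
    from dataclasses import dataclass, field
    from datetime import date
    from typing import Optional

    @dataclass
    class FungusObservation:
        species_guess: str                 # e.g. "Amanita muscaria"
        observed_on: date
        latitude: float
        longitude: float
        substrate: str                     # e.g. "dead beech wood", "soil"
        spore_print_color: Optional[str] = None
        photos: list = field(default_factory=list)   # file paths or URLs
        microscopy_notes: str = ""

    obs = FungusObservation(
        species_guess="Amanita muscaria",
        observed_on=date(2024, 10, 12),
        latitude=52.52, longitude=13.40,
        substrate="soil under birch",
        spore_print_color="white",
    )
    print(obs.species_guess, obs.observed_on)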
Unlike many branches of biology, you don’t need to go to the Amazon. You can walk into your backyard or a nearby forest and find species newly known for your country and sometimes even new for science.
Microscopes, macro lenses, chemicals, even DNA sequencing. There’s a hacker spirit in mycology.
Projects like iNaturalist, Mushroom Observer, and FungiMap are full of real scientific contributions from everyday people. The barrier to entry is low, the impact can be surprisingly high, and the community is genuinely welcoming. Many leading contributors — even those publishing in cutting-edge scientific journals — are passionate autodidacts rather than formally trained biologists.
High intra-species variance, subtle features — a perfect playground for machine learning, which is not nearly "solved" here.
Cordyceps that zombify insects. Giant underground networks that share nutrients between trees. Bioluminescent mushrooms. Many weird stories.
Mycology is also becoming a computational frontier - projects like FungiNet use graph networks to map symbiotic relationships, and citizen science platforms generate massive datasets perfect for ML applications beyond just classification. The unsolved phylogenetic relationships and complex biochemical pathways of fungi represent some of the most interesting computational problems in modern biology.
Aside:
Hey, a lot of fellow biologists here! A few questions:
Is there a 'hacker news' for biology that I'm missing out on?
Where do you get your biology news from?
Where do you think the field/s are going?
Is bio harder than other STEMs?
I'm a neuroscientist/bioengineer by training and profession. I followed the path that a lot of commenters here did too, in that I came back to bio after a harder STEM career (physics). Glad to know I'm not alone in this!
I have a PhD in cell biology, as well as post-doctoral research experience. I'm now a software engineer for a big chip designer.
> Is there a 'hacker news' for biology that I'm missing out on?
Not that I know of. Could be cool. I produced a prototype platform that was a searchable DB of all papers published on PubMed, with a comment and karma system. The idea was to incentivise continued debate on published papers. Currently discredited papers continue to get cited for many years, because almost no-one publishes their critiques once the paper is published.
> Where do you get your biology news from?
Back in the lab, we would mostly just closely follow our small field with automated searches of PubMed. We'd find out about wider stuff by attending talks and speaking to other scientists. Often new and exciting things will take years to publish, so you would often find out before publication via talks.
> Is bio harder than other STEMs?
Only because the traditional high-school and university education in biology does not prepare people for the realities of cutting-edge bio research. I did a load of coding, image analysis, complex microscopy, chemistry etc etc, during my PhD and post-doc, none of which was taught during my degree or high school.
To thrive in biology research, you have to be comfortable not knowing about a thing, but figuring out how to do the thing, which likely no-one has done before.
Fun questions! My takes:
1) Sadly there isn't really. There are a few good blogs like Derek Lowe's "In the Pipeline" that centralize news, but no anonymous online forum like this.
2) Google scholar alerts, Twitter, Bluesky, and word of mouth.
3) I think our understanding of biological processes at the mesoscale is about to hit an inflection point, largely through advances in electron microscopy (cryo-ET) and the ability to perform simulations at this scale.
4) Not harder but definitely more messy and progress is less linear.
> I liked math and physics and economics and even chemistry,
I think the reason is that these subjects give students a sense of instant gratification and instant empowerment. Every concept and every theorem could be a breakthrough for my problem-solving skills, and I can immediately see how I can solve more problems through the learning.
But of course, nothing gives more instant gratification than CS. I can try and build things via my keyboard at any time. The learning and the associated work does not get better than that.
Love biology. I appreciate that purist mathematicians/logicians prefer chemistry and physics, and it seems to be an inside joke in those professions that biology isn't on the same level when it comes to axiomatic things.
I'm a classic INTJ but left school and built biology-online.org 25ish years ago. I think it's had a couple of thousand years of reading hours. I sold it on, thinking I lacked the expertise the topic deserves (it ranked well on Google for lots of biological terms).
I love the lack of agency in biology/evolution: it found a way to create us as well as the huge tree of life around us purely through biological/ecological pressures. And here we are. We owe a lot to how biology has expressed things over the past 4 billion years and will likely find out a whole lot more.
> I'm a classic INTJ
https://en.wikipedia.org/wiki/Myers%E2%80%93Briggs_Type_Indi...
> Despite its popularity, the MBTI has been widely regarded as pseudoscience by the scientific community.[1][3][2] The validity (statistical validity and test validity) of the MBTI as a psychometric instrument has been the subject of much criticism.
> Many of the studies that endorse MBTI are methodologically weak or unscientific.[13] A 1996 review by Gardner and Martinko concluded: "It is clear that efforts to detect simplistic linkages between type preferences and managerial effectiveness have been disappointing. Indeed, given the mixed quality of research and the inconsistent findings, no definitive conclusion regarding these relationships can be drawn."[13][72]
>The test has been likened to horoscopes, as both rely on the Barnum effect, flattery, and confirmation bias, leading participants to personally identify with descriptions that are somewhat desirable, vague, and widely applicable.[10][73] MBTI is not recommended in counseling.[74]
Any survey (as opposed to horoscopes which aren't up to user choice) can be used to convey information about a person, even if that information is what they think about themselves. "I took a survey and I'm a Slytherin" conveys plenty, and no one feels the need to point out that that's unscientific.
I get that, but I wanted to elucidate since many people think Myers Briggs is based on data and is not a fiction.
So is it a coincidence that a majority of programmers tend to fall into the INTJ bracket?
The criticisms shared by GP are that MBTI is unscientific; ie: unclear or hard to disprove.
There’s a funny irony that you’re using a rhetorical device, as opposed to a rigorous scientific methodology, to try and navigate that criticism.
I don't see how, or at least that much. Language is merely a construct to describe things.
I don't mind the pigeon hole classification as it seems to describe me quite well vs the other definitions.
I should write a blog post entitled "I should have loved computer science"
Do you do bioinformatics?
Bioinformatics: https://en.wikipedia.org/wiki/Bioinformatics
Health informatics: https://en.wikipedia.org/wiki/Health_informatics
I'm a computational epidemiologist.
I do quantitative biology now, although my background is in theoretical physics. Biology is fascinating, but ultimately there is a cultural divide between the scientific "language" used in biology and the scientific language of e.g., engineers, physicists (very famously described in "Can a biologist fix a radio?" https://www.cell.com/action/showPdf?pii=S1535-6108%2802%2900...)
I do find the author's point weird. "I thought high school biology was just memorizing facts, but I began to appreciate it when I read some pop science books and went scuba diving." So the only problem for the author was the topic of the classes, not the style. Why shouldn't one have the same problem with high school physics ("it's just about boring ramps and pulleys"), etc.? Personally I find the style to be a more important distinguishing factor, in that biology is much less quantitative than other science disciplines. Instead the author's problem is that biology should be even less quantitative and more literary or poetic...?
Ultimately science journalism/popularization is not the same thing as science. High school science classes (try to) teach the latter not the former.
High school physics and chemistry equips students to make (a very limited set of) predictions. High school biology super doesn't. When you're learning chemistry and physics, it feels like you're learning a systematic set of rules that let you approximate and model the world around you. Biology...doesn't, not really. Life is just more complex and higher order, and it's that much harder to actually use the study of it to understand the world immediately around you in any meaningful way.
It's still super cool, but it makes learning about it as a science less satisfying, since it's less friendly to the standard scientific method.
I've recently been delving into paleobiology, but what inspired it was very different from what's described in the post. I ingest a lot of pop educational stuff, mostly just for entertainment; but after a few years of just hearing the highlights and fun facts it became frustrating not being able to put all of it into context.
So I pushed myself a little out of my comfort zone and ordered a textbook and enrolled in a course. It made me realize how I've forgotten how to learn without it being entertainment. But, after some acclimation, I also realized that I don't really need an engaging presentation, because I really do just enjoy learning. So in a way my journey has been kinda the opposite of the author's - the 'fluff' around the information made it less appealing, not more. Though I suppose I might not have taken the leap to delve deeper into these topics in the first place if it weren't for the accessible versions.
Either way though, I think the real takeaway isn't that there's a right way to be interested in a topic - whether through stories and history or otherwise - but rather that school isn't the best environment for figuring out if something interests you, and it's worth re-visiting topics you might have written off with a fresh approach.
>I think the real takeaway isn't that there's a right way to be interested in a topic
I think a different perspective can sometimes illuminate, though; it's not just about the person - it's about them having an epiphany that motivates them to do something, like learn more.
>pop educational stuff,
I watch a lot of that as lazy entertainment; so much of it is factually incorrect (on YouTube etc.). But I know better, I guess.
One piece of advice I was given when learning a new subject: pick a book like "history and philosophy of physics/chemistry/biology". This has served me well. Most teachers are bad; most textbook writers are bad. In my considered view, a philosopher of biology can write a better biology textbook for newbies.
If you want to be fascinated with biology, just go to nature, or a park, and stay there for a while. After a while you'll start to wonder about the birds, the plants, the snails, the cats. Biology is a descriptive science, nothing wrong with that.
I don't know if just going into nature is sufficient to get fascinated with biology. In my opinion it takes a fundamental reset in how you think about anything you see. Humans, while smart, have obviously had to learn to "ignore" thinking about how things work. You don't think too hard about how anything works, really. I mean, at a cursory level, sure, but depending on how far you push the word "cursory", you can deepen your thirst to know how the things you see work at more and more fundamental levels.
You don't need to go into nature to get this curiosity except for the possibility that it makes you more meditative. You can look at your arm and think what the hell happens in there at a molecular level to make you move the muscles. Or when someone says nerves conduct electricity what the hell does that mean?
I revisit this feynman video of him explaining (or not) magnets every few months and I think it's relevant to this question. https://youtu.be/MO0r930Sn_8?si=CkWYfiGoGCgAANwP
When I think like that, I'm just curious why OP and others blame teachers or whoever else for not instilling that curiosity in them. Like it's someone else's job to make you curious? In my opinion you're either born that way or you're not. Some airport-store book isn't gonna make you the next whatever scientist you adulate.
I sometimes skywatch late at night, marveling at the vastness of what's out there, and the glimpse of it we get over here. That gives me a sense of wonder about space, but did not make astronomy any more appealing to me.
Gaining an appreciation for nature is good, getting fascinated with biology is also good, but one is not necessarily related to the other in practice.
>or a brain remembers a memory
We don't actually know this very well, do we?
In high school I was all math>physics>chemistry>biology. So I didn't take biology. Much to my peril. I didn't learn that I wasn't just a brain on a stick until I was 25! At some point "The Inner Life of the Cell" blew my mind.
I was lucky to have a great AP Biology teacher in high school. I ended up minoring in the field and it has shaped my career. Now my child is a little biologist. It is a fascinating subject and so core to everything we are and everything we do.
I hated biology in school too. I wasn't interested in any of the memory tests. Still got the top grade, but dropped it when I could.
Later I got a job in bioinformatics. I worked on virus genomics. Definitely recommend viruses (specifically RNA viruses) because it's all the cool stuff without memorising bits of a flower etc.
Biology is a huge field, though. I can't criticise the way it was taught because I have no idea how to do it better. Not everyone is into genomics.
This article really strikes a chord: going through high-school biology I was shocked by the desiccation of life in the way everything was presented, as if death itself had written the curriculum. I focussed on maths and suspected this was the hidden agenda: only present man-made constructs, treat the rest as if it were just wrong maths.
I'm just going to recommend the biology books written by Lewis Thomas. The books are collections of essays rather than science or textbooks. They blew my mind, opened up a deep respect for the field of biology, and gave me a deep appreciation of life in all its forms, so many of which I didn't know existed.
Look for:
The Lives of a Cell: Notes of a Biology Watcher
The Medusa and the Snail: More Notes of a Biology Watcher
The Youngest Science
...and a couple of others.
I think this is true for most any subject taught in school.
I have loved history since I was six and my parents got me my first adult history book. I love how all stories fit together, understanding why things happened the way they did, how and why people in the past thought differently than we do today, all of it. If you read a textbook, though, history is just memorizing one thing happening after another.
Part of that might be my (American) education system's fear of controversy: explaining what motivated abolitionists and slave-holders in the 1850s, the actual stakes over which they were fighting, would not be popular in many states, and some parents would no doubt object. But also, it's complicated because the past is a different country: all of a sudden you are having to explain the way that the economics of the Industrial Revolution changed the demand for complementary goods (1), the Curse of Ham (2), the way that printing presses functioned in antebellum American democracy (3), and the pre-Civil Service patronage system (4). Basically, you have to teach a college-level course to understand how things were different then and why they happened. Really good teachers can simplify the details down to an age-appropriate level, but most teachers are, well, average, and so memorization is a much easier path to follow.
1: The beginning of the Industrial Revolution mechanized looms and spinning wheels, and mechanized the cleaning of raw cotton. As basic microeconomics suggests, those improvements suddenly and massively increased the demand for cotton. Those demand spikes transformed large slave owners from people who understood that slavery was bad and wanted to see it ended (just not quite yet) into people who thought that slavery was a positive boon for the enslaved people they owned. You can actually see this in their writing: in 1800 most slave-owners think that slavery is on its way out and will not spread much; by 1830, slavery is the best thing God ever gave anyone.
2: The Southern Baptist Convention was founded in 1845 because so many of its members didn't think the national Baptists in the General Missionary Convention were committed to defending slavery and the Curse of Ham, and they wanted to be part of a denomination dedicated to the idea that White people should rule over Black people.
3: Before the secret ballot, each party would provide its own ballots, pre-marked, and you just turned in the ballot of the party you supported. This naturally meant that each party had its own printing press in each town, which meant that they also had newspapers, pamphlets, and the like, and the press-owner was almost always one of the most committed political partisans in an area. Then, when their party won, they would get the contract for printing all the documents the government needed in that area. This was a major driver of political polarization in the 1840s and 1850s.
4: Before the existence of Civil Service protections, basically all of the staff of the government would change over with a new Administration; every postmaster in every town was appointed by the President and changed with every election. The fear that a Northern President committed to abolition would use this patronage (and printing contracts to printing-press owners) to build a large segment of white Southerners committed to abolition, who would in turn spark a slave revolt, was why so many Southern states tried to leave the Union at the election of the first Republican President, before he was even inaugurated or had a chance to do anything. If they waited, he would appoint abolitionists to every town in the country, so they had to get ahead of him.
ha i studied bio in undergrad > med illustration mfa then dropped out after a semester > ms in cs
I am sure the author is a fine person, but this is an incredibly self-entitled piece. A number of biologists managed to make it through these classes just fine, and are paid much less for pursuing their passion (and making the breakthroughs the author enjoys reading about while on vacation).
A title like "I wish I had enough attention to get through the boring parts of high school biology, I now find pop biology interesting" may have had less impact, though.
Computer scientists and programmers are very intelligent people who often have grossly unrealistic projections of their competency in other fields, and this is a fine example of the phenomenon.
The post is not about becoming a professional academic/researcher in biology, so it's not clear why your comments (this and the earlier deleted one) focus on competency, calling the author "not cut out for biology", etc.
The post is simply about what you call enough attention to get through the boring parts of high school biology — should biology in school be only for those who have that ability? Even if being a professional biologist requires those attributes, shouldn't the teaching of the science of life—which is full of wonder—have a bit of something for everyone else too? Even people who don't become biologists ought to love biology, surely?
That's what the post (like the earlier one by Somers) is about; it's not about “I could have become a biologist” (as you seem to be implying). You can call it pop biology, but it's missing from school where “astonishing facts were presented without astonishment”. I see nothing self-entitled about this.
It's the same in mathematics, say: even if being a professional mathematician requires (say) thinking long and hard and being willing to struggle with difficult problems, manipulating things in one's head, etc — surely there is value in exposing more students to pop mathematics / beautiful results (enjoying which is very different from actually doing mathematics, sure), so that more people could love mathematics recreationally, whether or not they become professional ones?
The other top-level thread that talks about how this happens in CS education too (https://news.ycombinator.com/item?id=43764315) seems to get the point of the post: it's the equivalent of Lockhart's A Mathematician’s Lament (https://worrydream.com/refs/Lockhart_2002_-_A_Mathematician'... ).
The author did fine in another field, but might have picked biology instead if the switch had been flipped earlier in life. That some people get through bad classes isn't proof that those classes are good; you get the few who would survive no matter what, plus those whose brain-wiring happens to suit the way the bad classes are structured. This has a tendency to reduce diversity of thought over time, and contributes to academic ossification.
Secondly, fields really do need cross-discipline collaboration. Finding passionate CS people is fantastic because they bring a different skill set. I have often found that when we get diverse experts together, we can have everyone do the "easy part" and get results which would be otherwise unobtainable.
Yes, some people have "engineer's disease" and fail to appreciate the depth of knowledge and skill of folks who have spent their lives in another domain... But the author doesn't seem to be one of them. Many of their favorite stories appreciate the combination of insight and hard work in the history of the field.
It does, indeed, suck that people working in biology get paid less than computer engineers. Blame capitalism...
As a biologist with a tech background (but actual biotechnology majors) - please, we have enough tech bros who think they're biology's saviors. They'll just come in fascinated by some technological problem, call it the only blocker to solving AIDS and cancer, take away a billion dollars in funding over decades, and show nothing of actual consequence. Like the entire protein folding field. It's a tool. Not the solution. Even today there was this hyperbolic piece on NBC about how this Harvard scientist working on microscopy image processing is being deported and now we are not going to cure cancer.
I feel bad for them, but I can assure you, as someone who did the research in the exact same field, they're curing nothing and are more likely to make cures slower by sucking away funding from more pertinent projects.
Also relevant xkcd https://xkcd.com/1831/
I've been working my way toward a biology degree very slowly (I can only really fit one class at a time alongside working full time). I'm maybe 70% of the way to a bachelor's degree in it. Been writing code for ages, but I've saved enough to accept a lower salary if it means I get to work on a real problem for once in my life. So I guess I'm one of those people you're frustrated with.
Do you have any advice for how to not be that kind of problem? For now I'm just focusing on my coursework, but at some point I'll be biologist-enough to help out with research. How do I approach it without being that guy?
In my (possibly not the best) opinion, the most important quality is not deluding yourself into thinking that your method or field is the most important in all of science. Unfortunately academic structures force you to think and believe that, and then proselytize that way. But if you stay above it, at least in my books you're above most folks. But then I'm a lowly guy in a corner lol.
Practically what this means is that you should decide what you truly want to change (not necessarily what you can change with your current expertise) and pursue it across whatever fields necessary. If it's curing a disease, you have to decide what is the most important thing that's stopping us from curing that disease and pursue that exact topic. More often than not it's not anything software related. You have to grab a pipette at some point and guillotine a few mice at another lol.
I once met a scientist who spent a week traveling to where there was a powerful x-ray laser. He used it to blast a thin film of something or other that was floating on the surface of some water. He left with a flash drive full of data and some FORTRAN titled LSQREFL, which allegedly could decode the laser results. He then spent the next 6 months trying to make it actually do that. Turns out you had to have a folder with today's date on it on your desktop, otherwise the program would crash. This was documented nowhere, he just eventually puzzled it out from the code.
I offered to put it on github for him, so that at least he didn't have to be the sole caretaker for this endangered bit of software, but he was afraid of running afoul of the original author's rights, so endangered it will stay.
This was maybe an unlikely occurrence, falling neatly into the "not" part of your:
> More often than not it's not anything software related
But it makes me think that there is still some juice left to squeeze out there. I mean, I'm having a good time with my one-class-per-semester, I'd just prefer to not have to do it for another decade before I'm enough of a biologist to get my hands dirty.
What does the author claim entitlement to? Or what real-world malign effect are you expecting from this piece that warrants the charge? I went in expecting the type of piece you describe, since I know the type, but I've failed to read it as you do except with a disqualifying squint.
I think one of the things I love most about biology is its uncertainty. Things like Math and engineering are all rigid and rules based. Life is wibbly wobbly, lifey-wifey. An enormous soup of changing alleles cast as probabilities over eons, all creating endless interactions you can't ever comprehend.
You have to become comfortable with the fact that there is uncertainty and there are parts of it you can't control. So instead you have to be obsessed with introducing order where you can. It is so refreshing to see a beautiful experiment that can wrestle a clear signal from the endless noise.
> Things like Math and engineering are all rigid and rules based
Depends where in math; in things like particle physics, things get all wibbly wobbly, is-my-cat-dead-or-alive. In things like engineering, quite often what you're dealing with is probability-based, but you just stack the deck so far in your favor that the probability is effectively 1.
As they say, building a bridge that doesn't fall down is easy. Building a bridge that barely doesn't fall down is much harder.
Not saying those fields don't have uncertainty, but I've never seen a physicist pray to Newton that gravity works this time when the ball drops.
I have seen molecular biologists (jokingly) shake the voodoo "molecular biology maracas" over the PCR machine to try and replicate their result.
A lot of experimental and applied physics operates this way. If you are synthesizing material, for example, it takes a lot of time and effort to get high yields of what you want. Before that your processes can be very probabilistic.
In fact, I just finished listening to a talk where an experimentalist was discussing how to get the fabrication yields of superconducting qubits from the current low double digits up to 99.99+%.
Man, just let me have this.
Biology is messy at a macro level is all I'm saying. I don't need a hundred people butting in saying "butt aschully phsyix and code is also messy and harder at a quantum level." I know. We know.
Every scientist does that at some point. I've easily crossed my fingers and hoped numerous times that code I'd written would work, especially on the first run. The superstition is even more rewarding when the project is hard and you're a bit daffy at the end.
It's a human thing.
Surely Feynman made joking comments before running experiments. I'm sure some digging in his wonderful books and letters will turn up many examples.
Genetic algorithm: https://en.wikipedia.org/wiki/Genetic_algorithm :
> Genetic algorithms are commonly used to generate high-quality solutions to optimization and search problems via biologically inspired operators such as selection, crossover, and mutation.
AP®/College Biology: https://www.khanacademy.org/science/ap-biology
AP®/College Biology > Unit 6: Gene Expression and Regulation > Lesson 6: Mutations: https://www.khanacademy.org/science/ap-biology/gene-expressi...
AP®/College Biology > Unit 7: Natural selection: https://www.khanacademy.org/science/ap-biology/natural-selec...
Rosalind.info has free bioinformatics exercises in Python that apply CS algorithms, browsable as a tree or a list, including genetic combinatorics. https://rosalind.info/problems/list-view/
FWICS there is not a "GA with code exercise" in the AP Bio or Rosalind curricula.
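Since that exercise seems to be missing, here's a minimal sketch (my own, not taken from AP Bio or Rosalind) of what a "GA with code" exercise could look like in Python. The target string, population size, mutation rate, and generation count are illustrative assumptions; the point is just to show the selection, crossover, mutation, and fitness pieces from the definition above in one loop.

    import random

    TARGET = "ACGTACGTACGT"   # hypothetical "genome" to evolve toward
    ALPHABET = "ACGT"
    POP_SIZE = 200
    MUTATION_RATE = 0.02
    GENERATIONS = 500

    def fitness(individual):
        # Number of positions matching the target (higher is better).
        return sum(a == b for a, b in zip(individual, TARGET))

    def random_individual():
        return "".join(random.choice(ALPHABET) for _ in TARGET)

    def select(population):
        # Tournament selection: keep the fitter of two random individuals.
        a, b = random.sample(population, 2)
        return a if fitness(a) >= fitness(b) else b

    def crossover(p1, p2):
        # Single-point crossover between two parents.
        point = random.randrange(1, len(TARGET))
        return p1[:point] + p2[point:]

    def mutate(individual):
        # Flip each base to a random one with small probability.
        return "".join(random.choice(ALPHABET) if random.random() < MUTATION_RATE else b
                       for b in individual)

    population = [random_individual() for _ in range(POP_SIZE)]
    for generation in range(GENERATIONS):
        if max(fitness(ind) for ind in population) == len(TARGET):
            break
        population = [mutate(crossover(select(population), select(population)))
                      for _ in range(POP_SIZE)]
    best = max(population, key=fitness)
    print(best, fitness(best))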
YouTube has videos of simulated humanoids learning to walk with MuJoCo and genetic algorithms, which demonstrate goal-based genetic programming with cost / error / fitness / survival functions.
Mutating a source-code AST is a bit different from mutating the specific parameters of a defined optimization problem, though the task is basically the same: minimize the error between input and output, and then explain the result (XAI).
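For contrast, here is a tiny sketch of that second, parameter-mutation flavor: a mutation-only hill climb over a real-valued vector that keeps a change only if it reduces the error against a target. The target vector and step size are made up for illustration, and a full GA would add a population plus crossover on top of this.

    import random

    TARGET = [1.0, -2.0, 0.5]   # hypothetical parameters to recover

    def error(params):
        # Squared error against the target (lower is better).
        return sum((p - t) ** 2 for p, t in zip(params, TARGET))

    def mutate(params, step=0.1):
        # Perturb every parameter with small Gaussian noise.
        return [p + random.gauss(0, step) for p in params]

    best = [0.0, 0.0, 0.0]
    for _ in range(5000):
        candidate = mutate(best)
        if error(candidate) < error(best):   # keep the mutation only if it helps
            best = candidate
    print(best, error(best))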
All of which justifies genetic algorithms as CS and biology applied together, which satisfies the OT's implicit yearning.