> Cargo-cult thinking means only looking for, and only accepting, confirming evidence.
This article's definition of cargo-cult thinking seems incorrect. The definition I'm familiar with: when you lack a true understanding of some idea and end up just mimicking the superficial qualities. It's a great metaphor that comes up all the time in software engineering.
For example, seeing a successful system that uses microservices and thinking that switching your system to microservices will make it successful. If you don't understand exactly what the tradeoffs are and why those tradeoffs worked well for the successful system, you're not going to get the result you want.
Maybe the author confused "cargo-cult thinking" with just plain "cult-like thinking"?
This mashes a few different ideas together that don't really fit. Cargo-cult thinking means emulating something's appearance without understanding how it works. This is a specific kind of irrationality. Removing gall bladders en masse is also irrational, but it's not a cargo cult. The beliefs of a cargo cult may be unfalsifiable, but that doesn't mean all unfalsifiable beliefs are like a cargo cult.
It may be perfectly reasonable to be unable to imagine a belief being falsified. For instance, what would it take to convince you that you remembered your own name wrong and that you've been called something different for your entire life?
I second this. I think we’re confusing cognitive dissonance with cargo culting in this article.
Cargo culting is blindly copying something without understanding the mechanistic behavior of the thing being copied. Plenty of cargo culters follow the recipe and remain convinced even when the planes don't show up.
Cognitive dissonance on the other hand is about not updating your beliefs when there’s overwhelming verifiable evidence to the contrary.
I think the two can go hand in hand. Hype cycles in tech suffer from both but I think they’re distinctly different things.
The problem of course is that so many things are experiential (rather than data driven) so the question “What would convince you otherwise?” could only be answered by “Having lived a different life.”
Take, for example, a software engineering opinion that I hold dearly: “Good commit titles and descriptions, with sufficient description of why a change was made and notes on any non-obvious implementation decisions, are valuable and necessary.” This arises from my experiences trying to debug code in repos filled with commit titles like ‘wip’ and ‘address feedback’ and ‘changes’ which inexplicably touch 5k LoC. I simply despise trying to divine what some past developer intended without any clue as to why; it is an epistemological impossibility, even when the past developer is sometimes me. Moreover, I am convinced this is necessary because I also have the experience of doing all the work to understand some spaghetti—using old issue tickets and git blame to slowly build a mental model of the codebase—finding the bug, changing it, and having the whole system violently reject my fix, because what appears to the naive observer to be a bug is, in fact, intended behavior the system depends on.
What would convince me that spending time on writing good commits is not worth the effort? Either an impossible-to-execute-without-confounding-variables longitudinal study that measures developer happiness over time in codebases that do or do not emphasize commit message quality, or different experiences that led me not to care. If I never had to maintain legacy code, or if everywhere I ever worked had issue trackers filled with explicit technical details and motivations, then maybe I wouldn’t care; but that’s not my experience, so I do care.
I don’t know how you hack your way past the reality that there is not one perfect way to write maintainable software, and even if there were, nobody actually has time for that. So you have to choose what to prioritize, and some of those priorities may become your personal cargo cult.
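For what it’s worth, the preference above can even be enforced mechanically in a `commit-msg` hook. Here’s a minimal sketch of such a check — the function name, thresholds, and placeholder list are all my own invention, not any standard:

```python
def check_commit_message(message: str, min_title: int = 10, max_title: int = 72) -> list[str]:
    """Return a list of problems found in a commit message; empty means it passes."""
    problems = []
    lines = message.splitlines()
    title = lines[0].strip() if lines else ""
    if len(title) < min_title:
        problems.append("title too short to be descriptive")
    if len(title) > max_title:
        problems.append("title exceeds the conventional 72-character limit")
    if title.lower() in {"wip", "changes", "update", "fix", "address feedback"}:
        problems.append("title is a known low-information placeholder")
    # A body (separated from the title by a blank line) should explain *why*
    # the change was made, not just what changed.
    body = [l for l in lines[2:] if l.strip()] if len(lines) > 2 else []
    if not body:
        problems.append("no body: explain why the change was made")
    return problems
```

Dropped into `.git/hooks/commit-msg`, something like this would reject the ‘wip’-style commits described above while letting a title-plus-rationale message through. It can’t judge whether the rationale is any good, of course — that part stays on the human.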
That's a great example. My experience is precisely the opposite. I gave up on commit messages years ago in my own projects and it has never had the slightest negative effect. I literally have sequences of hundreds of commit messages that are simply "update".
But my experiences are not yours. My projects are likely simpler, and I don't collaborate much. It would be dumb for me to follow your practices, and for you to follow mine.
Exactly. What is important is knowing what to prioritize for the project at hand. If it’s useful it’s not a cargo cult. Of course the less experienced would ask “how do you know if it will be useful?” but truly the answer is you experiment.
Cargo-cult thinking is the mimicry, not whatever this is (being wrong, but still insisting you're right?). The way you detect cargo-cult thinking is you ask "why did you do that?" If the answer is "because they did that" it's cargo-cult thinking.
Cargo-cult thinking can also be more successful than analytical thinking over the short term, and over the long term can be subjected to analysis and improved. Cargo is still a religion in contemporary New Guinea and Vanuatu, but now it's an abstract one like the Church of England. If you accept Cargo as symbolic, but still believe in its effectiveness, you might as well be a Methodist.
This article is missing a lot of the meaning of cargo cult. It’s got religious overtones as well as irrational beliefs.
Good YouTube video on it: https://youtu.be/x5jbHDMeDFE
The claim that the WWCYO question is universal rests on the assumption that all knowledge is derived from experience, and that all other information is not true knowledge at all.
Some people explicitly would not agree with that.
Unrelated to that, what would convince you that you don't even exist?
It feels like WWCYO is still a good question, abstractly, to ask though even in cases where you don't think there's a good contrary argument, or don't have the knowledge, or it feels more value based.
For example, if I apply your example question to my beliefs, I don't necessarily come up with any specific answer, because I'm not super well versed in philosophy. It highlights that my belief that I exist might be on shaky ground, or it just might not be testable at all, but I'm open to being convinced otherwise.
And so that's the more abstract answer, which is: if I saw an argument that seemed rational to me, I might be convinced that I don't actually exist, but I'm sticking with what I believe for now. That's all you need — WWCYO doesn't mean that there is a valid contrary argument, but that you're open to hearing one and changing your views. If the belief is more about experience, of course, you can get more specific about your null hypothesis.
After seeing two cargo-cult articles on the HN front page today, I have to say that "cargo cult" has become a meaningless cliche that people use to describe anything that's wrong.
He brought up cargo cults as a way to illustrate improper scientific thinking — because the core driver of cargo-cult thinking is unfalsifiable beliefs. I appreciate the sharp reading and will be more explicit next time.