Next time you're wondering why more kids don't go into STEM in this age of wondrous scientific and technological progress: virtually any bright 10-year-old knows the story of how and why Archimedes leapt out of his bathtub shouting "Eureka!", but none of them would have the foggiest notion of what a real scientist does at work on a day-to-day basis. Neither would their parents, unless they are scientists themselves. Even students halfway through college, on course to become such scientists, often have only a vague notion of what their careers would actually look like; they just know they like the coursework. It's a leap of faith.
Same goes for most kinds of engineering, and mathematics, and a bunch of other disciplines that are inexplicably underrepresented in children's career aspirations.
(I suppose software engineers are the exception, since the popular notion "they just sit at computers all day typing arcane things" is actually true. : ) )
You can get some idea of what it's like to be in politics by watching C-SPAN, reading news accounts of high-profile political struggles, or watching fictional accounts like The West Wing. The life of actors is often meticulously charted out in the public consciousness just because many people have such tremendous interest in celebrity actors. I'm not saying it's only STEM fields to which this applies, but it certainly doesn't apply to everything.
You really don't get a good idea about any of those professions through the media. Watching C-SPAN or even The West Wing only gives you a minute glimpse. Same with actors (most of an actor's time on set is spent waiting or doing nothing). 90% of a director's job is done before the first day the cameras start rolling (and the majority of a director's career is spent emailing or meeting about prospective jobs, not actually on set directing actors).
How many 10-year-olds do you think watch The West Wing or C-SPAN?
And you do realize that most politics is local. You need to watch local access cable of town meetings, not C-SPAN, to get an idea of what most politics is like. Bartlet never walked in the local city parade, mingled at the 4th of July town pancake breakfast, nor had to listen to the local kook make a complaint.
Watching The West Wing for an idea of how politics works is like watching a documentary on Einstein for an idea of how science works. Even if completely accurate, it's still a very unusual example.
Similarly, most actors are not celebrity actors. Wil Wheaton's blog, for example, doesn't match what little I've read from the tabloids, and Wheaton is one of the more well known actors.
I suppose software engineers are the exception, since the popular notion "they just sit at computers all day typing arcane things" is actually true. : )
I know you are joking, but it has to be said.
In my experience most software engineers spend considerable time in (formal or informal) meetings.
According to The Mythical Man-Month, the average productivity is about 10 lines of code per developer per day.
I agree that LOC isn't an ideal metric. But there isn't really an ideal simplistic metric.
Out of curiosity, I thought I'd look at a recent productive day. In actual numbers, my 42 check-ins totaled 409 new lines of code versus 213 lines deleted, so deletions ran close to 50% of additions (not counting check-ins with simple line changes).
At the end I had several new features implemented and a lot of code refactored to be cleaner.
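For anyone curious about tallying their own added-versus-deleted ratio the same way, here is a minimal sketch. The helper name and the sample data are my own invention; it just sums the numeric columns of `git log --numstat`-style output:

```python
def tally_numstat(numstat_output):
    """Sum the added/deleted columns of `git log --numstat` output.

    Each per-file data line looks like: "<added>\t<deleted>\t<path>".
    Binary files show "-" in the numeric columns and are skipped,
    as are commit headers and blank lines.
    """
    added = deleted = 0
    for line in numstat_output.splitlines():
        parts = line.split("\t")
        if len(parts) != 3 or not parts[0].isdigit() or not parts[1].isdigit():
            continue  # not a numeric per-file stat line
        added += int(parts[0])
        deleted += int(parts[1])
    return added, deleted

# Made-up sample of numstat output: two text files and one binary file.
sample = "12\t3\tsrc/main.py\n-\t-\tlogo.png\n7\t0\tREADME\n"
totals = tally_numstat(sample)  # (19, 3)
```

Piping something like `git log --numstat --pretty=format: --since=yesterday` through this would give the day's totals, under the assumption that your workflow maps one check-in to one commit.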
Well, yes, I'm simplifying things a bit. Although, now that I actually think of it, I would be curious to know how average time spent in meetings stacks up for software engineers versus other professions: scientists, architects, accountants, and so on. Certainly something someone might want to consider when plotting out a career path, but I don't believe I've seen anything looking into it.
I think that goes for almost all jobs. Do most high schoolers have a good idea of what the job of a lawyer, a doctor, a pilot, a small business owner, a cop, or a professor really looks like? I'd say most adults don't have a good idea of what other jobs outside their experience or contact really are, day to day.
I mean, nowadays, if you think you have a clear idea of the job you are going into in a STEM field, you are usually kidding yourself (the exception being sons and daughters whose parents own a business or hold another secure job of that type). Heck, many times the stuff you learned in the 1st year is irrelevant or has changed by the 4th year.
For instance, how many 2-terminal linear circuit components are there? Three, right? Resistor, capacitor, inductor.
Wrong. There is a fourth, the memristor, and those are available today (http://www.mouser.com/ProductDetail/Panasonic/MN101LR05DXW/?...). Now this particular example is not that big a deal, but it speaks to the variety and rate of change in an undergrad course set these days. You have to be able to change, and change a lot. Everything you learn has an asterisk next to it. It always did, really, but now you have to talk about those things in class, not just sweep them under the rug.
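For the curious: the ideal memristor is the element relating charge and flux, so its voltage obeys v(t) = M(q(t)) · i(t) with dq/dt = i(t). A toy simulation makes its signature behavior concrete; note the linear memristance M(q) = r0 + k·q and all parameter values below are illustrative assumptions, not a model of any real device:

```python
import math

def simulate_memristor(steps=10000, dt=1e-4, amp=1e-3, freq=1.0,
                       r0=1000.0, k=1e7):
    """Euler-integrate an ideal charge-controlled memristor driven by a
    sinusoidal current source: v(t) = M(q(t)) * i(t), dq/dt = i(t),
    with the toy memristance M(q) = r0 + k*q (all values illustrative).

    Returns a list of (i, v) samples over one drive period.
    """
    q = 0.0
    trace = []
    for n in range(steps):
        t = n * dt
        i = amp * math.sin(2 * math.pi * freq * t)
        v = (r0 + k * q) * i   # v = M(q) * i: zero current forces zero voltage
        trace.append((i, v))
        q += i * dt            # accumulate charge; memristance depends on history
    return trace

trace = simulate_memristor()
```

Two things fall out of the definition: the i-v curve is "pinched" at the origin (v is always 0 when i is 0, unlike a capacitor or inductor), and the apparent resistance v/i drifts over the drive cycle as charge accumulates, which is exactly what makes it a fourth element rather than a resistor.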
If you think you know what is going on after university, you are being very naive. I'd say most of the kids at least know this now: they are not going to have a career in one company, but in many, and in a lot of different fields too. EE will bleed into neuroscience and geology, and ME into bioengineering and FDA food work.
I have an example of a lone genius, Calvin Mooers. He's best known for his work in information retrieval; he coined the terms 'information retrieval' and 'descriptor', among other things. He's also known for the TRAC programming language.
But he's little known in my field of chemical information. You have to read the original papers carefully to discover that he came up with the first concrete method to represent a molecular structure on a computer, a proposal for how to search molecular structures via arbitrary query topology, and the idea that there is a simple algorithm (though not practical) for producing a canonical description of a molecular graph.
As far as I can tell, he was 5 years ahead of the rest of the field, and the founding papers in my field all cite him as a source of their ideas. (That said, his ideas were not all implementable. His method for substructure isomorphism is hand-waving optimism, for example.)
Unlike the other pioneers of IR, he was self-employed (Luhn was high up at IBM, Taube was originally high up at the Library of Congress, etc.) with no collaborators. But by all accounts, it was not easy to work with him. As one person described it, he didn't like other people playing with his toys. (E.g., see the fallout from his attempt to protect TRAC by trademarking its name. He was 10 years before Microsoft's much more famous 'Open Letter to Hobbyists'.)
Conspicuously absent in Bill Gross's "four types" theory at the first link is the Engineer -- the person (team) who actually creates the thing all the high-paid celebrities will hype/sell. All four types are essentially "business" people.
This is also a warning against genius worship. Sometimes it really does seem like someone has pulled something from thin air (especially in mathematics; you may recall more than one occasion when something seemed unsolvable until it was suddenly obvious), but most often those people have placed themselves, through hard work, in a position where they are most likely to make a given discovery.
"In the fields of observation chance favors only the prepared mind." ―Louis Pasteur
We cannot ignore the influence of prior knowledge on observation and reasoning. Observation is theory- and paradigm-laden. (See also the Duhem-Quine thesis.)
I agree with the story and liked it, but the aha moment still might come in a lonely place, doing something totally different. The human brain needs both intensity and distance to generate new thoughts.
"Two thousand years ago, Aristotle’s “Physics” was a wide-ranging set of theories that were easy to state and understand."
I'm having a hard time imagining that anyone comes away from just having read Aristotle with the impression that his work is easy to state and understand. Science is great, but let's shed ourselves of the need to belittle other disciplines in order to make our own accomplishments seem more grand.
The same reasoning can be applied to the popularization of science, a contradiction in terms. If you want to understand a science, you've got to do the science. Trying to lure people into science by presenting an entertaining counterfeit is not only disingenuous and intellectually vacuous, it is harmful. Open ignorance is a good thing. Having to unlearn bad metaphors and a habit of cheap thrills is not.
He says this statement is not true: "[Darwin] discovered the theory of evolution after studying finches on the Galápagos Islands". Granted, there is certainly more to the story than that, but unless he's suggesting that Darwin devised his theory before he went to the Galápagos, the statement is literally true.
I think this dual-meaning for 'after' is quite common, at least in Ireland and UK (and Australia). Here's an example in lieu of a formal citation:
1. After getting dinner, Alice and Bob met friends at a local bar
- Purely temporal, one thing followed the other
2. After hearing Alice recommend her gym, Bob signed up for a membership.
- Causal (and temporal of course): because of the first thing, the second thing happened
You can disambiguate this in many ways; the most common in my experience is to make the timing explicit:
"Darwin discovered the theory of evolution many years after studying finches on the Galápagos Islands"
Most human communication is oblique and symbolic, not literal.
These are fairy tales, and fairy tales do say something. They just don't say what they literally seem to be saying. They contain layers of symbolism, often saying things about the culture, the inventor, and the invention/discovery at the same time.
For example Newton's apple is interesting for its Genesis reference -- an apple from a tree. What is this myth actually saying about knowledge, science, and discovery? What's it saying about Newton? Newton was known to be a religious heretic and a practicing alchemist/occultist as well as a scientific thinker. You don't hear much about that, but you do get it in an indirect way via the apple symbolism I suppose. So perhaps the myth is a way of condensing down a lot of information about both the discovery (falling, gravity, mechanics) and the discoverer (heretic, religious outcast) into a single compact vignette.
I think it likely that the story evolved memetically, rather than being "for" anything in particular. It might have started with something as simple as Newton mentioning a falling fruit by way of illustration during a lecture, but as people pass the story around embellishments that resonate stack up. Apples, as you note, have rich cultural connotations involving intelligence in general and instantaneous revelation in particular (interestingly, Genesis does not actually mention apples, referring only to "fruit", which would seem to suggest that apples get their symbolism from somewhere else. Possibly their brain-like cross section?). The more recent retellings have added the embellishment that Newton was not just inspired by a falling apple but literally struck by one, the apple serving as a physical manifestation of an idea.
So I think the guiding principle is not so much "how can we poetically represent Newton" as "what do humans think is cool".
Newton also had the famous quote "If I have seen further it is by standing on the shoulders of giants." I was taught that in school as well as the apple story. So I think the article is a tad hyperbolic.
Unless things have changed considerably, when learning about atomic theory you get taught about Thomson, Rutherford, and Bohr before you get to Schrödinger. Science is full of examples of building on the work of others. So I don't think kids are given the unrealistic expectation that you have to be a genius rogue thinker to make a contribution.
Maybe it was that at one time, but these days it is passed off as a factual event along with the false story about Archimedes' Eureka, the false story about Albert Einstein failing math, Thomas Edison inventing the light bulb, Henry Ford inventing the assembly line, etc.
The popular lies in the history of science and tech are endemic.