In condensed matter physics, there is an area called turbulence that has wide practical application: weather, golfing, navigation, bridges, and the building of subs, boats, and planes.
(Most of you know turbulence from those random unexplained dips you get when your plane is in flight.)
But for theoreticians, turbulence is different.
In 1941, some Russian guy wrote a theory for the dissipation of vortices in highly turbulent flows:
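roughly, that in the inertial range the kinetic-energy spectrum depends only on the rate of energy dissipation $\varepsilon$ and the wavenumber $k$,

$$E(k) \approx C\,\varepsilon^{2/3}\,k^{-5/3},$$

with $C$ a dimensionless constant of order one. (That is Kolmogorov’s famous 1941 scaling law, in rough sketch.)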
Since then…nothing. Any significant contribution to turbulence has been beyond the smartest minds in theoretical physics, even though the equations that describe it were written down by 19th-century classical physics.
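(The 19th-century equations in question are the Navier–Stokes equations; for an incompressible fluid they read, in rough form,

$$\frac{\partial \mathbf{u}}{\partial t} + (\mathbf{u}\cdot\nabla)\mathbf{u} = -\frac{1}{\rho}\nabla p + \nu\,\nabla^{2}\mathbf{u}, \qquad \nabla\cdot\mathbf{u} = 0,$$

simple enough to write down, and notoriously hard to say anything rigorous about once the flow goes turbulent.)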
In physics, we like to say:
Turbulence is the graveyard of great physicists.
Artificial Intelligence
The topic of machine learning came up before dinner as it relates to online analytical processing (OLAP). I find it absurd that, in a field where most people can’t even code a proper data warehouse, and in a business world where practical realities are paramount, people talk about doing machine learning. You need a great OLAP system before you can even talk about machine learning.
Machine learning is a branch of Artificial Intelligence (AI). And since I had recently stated my opinions of AI, Andrei steered the topic in that direction and waited for the horse hair to snap.
He didn’t have to wait long.
“Look, you’re talking about a field that has been around for sixty years now and has yet to produce anything of significance—Clippy is like their claim to fame. Search? Nope, map-reduce crushed it. Fingerprint recognition and spam detection? Nope, statistical probability-based recognizers like Bayesian ones are how those are done. Chess? No, a brute-force alpha-beta pruning game theory approach crushes those all the time.”
“…but chess isn’t an interesting problem,” Some Guy interrupts.
(I’ll mention at this point that Some Guy is apparently writing a book on Artificial Intelligence. In my defense, he didn’t have the balls to admit this at the time…or at all to me for that matter.)
“Isn’t it funny how, as soon as AI gets their asses handed to them, the problem suddenly becomes uninteresting? You’re talking about a branch that attracts the brightest minds in computer science. At a certain point, you should just get a little humility, admit failure, and move on.”
“So what you’re saying is you’re afraid of failure?” A triumphant smirk creeps across Some Guy’s face. I can read his thoughts. I think, Your heroes are the ones who washed out of my major.
Instead, I say, “So what you’re saying is you’re afraid of making a contribution to the world?”
And the horse hair goes… Snap!
I interrupted with “…but chess isn’t an interesting problem” because that is what I heard the last time I discussed AI with someone. Go seems to be a much more interesting game to look at.
Just stating the obvious: I am not “Some Guy” and I am not writing an AI/SF novel 🙂
Go is still amenable to AI approaches because pruning (score tracking) is so difficult and because of the combinatorial explosion involved. OTOH, even the best Go programs can’t beat a good amateur. Heck, even a beginner like me (a good amateur spots me 17 stones) can beat GnuGo on a 9×9 board pretty consistently if I play black without komi.
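A rough back-of-envelope illustration of that combinatorial explosion (the branching factors and game lengths below are approximate, commonly cited averages, not exact figures):

    # Back-of-envelope game-tree sizes for chess vs. Go.
    # Branching factors and game lengths are rough, commonly cited averages.
    CHESS_BRANCHING, CHESS_PLIES = 35, 80   # ~35 legal moves, ~80 plies per game
    GO_BRANCHING, GO_PLIES = 250, 150       # ~250 legal moves, ~150 plies per game

    chess_tree = CHESS_BRANCHING ** CHESS_PLIES  # on the order of 10^123
    go_tree = GO_BRANCHING ** GO_PLIES           # on the order of 10^359

    print(f"chess game tree: ~10^{len(str(chess_tree)) - 1} positions")
    print(f"go game tree:    ~10^{len(str(go_tree)) - 1} positions")

Brute force doesn’t stand a chance against numbers like that, which is part of why Go stays interesting.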
Yes, Sebastian was the one who said that, but I compressed it for narrative economy, because Some Guy used it as a hook later. From where I was standing, Andrei also mentioned Go. Neither of them is Some Guy. 😉
I mentioned at the time that chess was an interesting problem right up until brute force algorithms started crushing AI playing programs (then Moore’s law kicked in and did in Kasparov).
Speaking of which, I couldn’t fit in the part where he talked to you guys about his “sorta girlfriend.” When you told me that, I thought, “I wonder if he thinks you can be ‘sorta’ pregnant also?”
Thanks for answering a question for me in 10 words or less.
Man, I can’t even get HALFway through your blog without getting distracted by your links…. your brain is a clusterjam of information 🙂
Thanks for your knowledge.
Einstein also got his Nobel prize for his study of Brownian motion. The Nobel committee pointedly did not award him the prize for Special Relativity – the maths had been worked out by Lorentz and Poincaré, Einstein only held the press conference, so to speak (General Relativity is indeed Einstein’s work, but by then he was too old to qualify for a Nobel).
Of course, before you get too smug about physics, there is that other xkcd cartoon: http://www.xkcd.com/435/ or the Mark Pilgrim variant: http://diveintomark.org/archives/2008/06/11/purity (of course, I disagree with the latter, Philosophy being mostly rhetorical BS posing as logic, as exposed by the Sophists, hence the philosophers’ undying hatred of them).
@Fazal: ORLY?
I don’t doubt that in 1905, Einstein wrote three papers that launched three fields: special relativity (General Relativity and Gravitation), the photoelectric effect (Quantum Mechanics), and Brownian motion (Chaos theory). Any one of which deserved a Nobel. The photoelectric effect—as mentioned in my article—was the only one of the three singled out by the committee.
I will disagree with the statement that Einstein “only held a press conference” for Lorentz and Poincaré. If that were so, then the same could be said of general relativity and the mathematicians who created field theory (it took him a minimum of 11 years to learn the math that was already there), or of the photoelectric effect and the work of Planck and Hertz. You singled out only the former—I have to wonder (rhetorically): why?
BTW, you can never be “too old” to qualify for a Nobel Prize. The only restrictions are that you are alive and that the prize is not more than three-headed (it can be shared by at most three people). In fact, Einstein described General Relativity in 1916 and experimental validation occurred shortly afterward. This was before he was awarded his first and only prize in 1921.
As to why the photoelectric effect? Please read this article to understand the politics of winning a prize. The key is quantum mechanics and anyone interested in the sordid pettiness of science would enjoy the read.
As for being “too smug,” I have only a shrug. I am talking about a fact about my college, not the real world. At the time, two friends, both of whom later won Caltech’s undergraduate math prizes, asked me why I didn’t double major in mathematics. I replied, “Well, I’ve certainly reached the point where I can understand what you write. But it is like reading poetry. I can read it—which in this case is rare…But to be a poet? That is another thing altogether.” They understood. And just to balance out that humility with a bit of ego, I’ll point out that they asked this because I was crushing most of the math majors in the math and applied math classes; it was just that I had learned enough to see how good some truly brilliant mathematicians are.
I feel xkcd’s comic is neither here nor there on the physics/math divide—in my more selfish, cynical moments I feel that when xkcd is funny, he’s a physicist, but when he’s not, “Oh yeah, he’s a mathematician too!” 😀
As for Mark’s comic, like you, sometimes I have little respect for philosophy. However, at least they check their facts before going off on a logical tear at me based on a faulty premise. 😉
Yeah, I got your xkcd comic right here: http://xkcd.com/435/. 😛 Not all computer science majors are washed-out physics majors, or arrogant “Some Guys.” I’ve got a personal theory that the math-derived majors give us a good excuse to focus on the areas of math we like and avoid the ones that drive us nuts. Just my opinion, though; I could be wrong.
Fazal — “the maths had been worked out by Lorentz and Poincaré”
So what if the Lorentz contraction preceded special relativity? Everyone knows this. Einstein explained it, just like he explained the photoelectric effect. His real contribution was to propose the basic assumptions that changed the problems at hand. After he made these assumptions, the physics world forever changed.
It’s too bad for Lorentz that his paper contained the word “ether,” stuff that does not exist. And too bad he never pointed out that the speed of light is a constant for all observers.
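For reference, the contraction in question: a rod of rest length $L_0$ moving at speed $v$ is measured to have length

$$L = L_0\sqrt{1 - v^{2}/c^{2}}.$$

In Lorentz’s hands that factor was a dynamical property of matter moving through the ether; in special relativity the very same factor falls out of the assumptions Einstein proposed, notably that the speed of light is the same for every observer.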
I should mention that Lorentz won a Nobel Prize in 1902 for it, and it is generally accepted that had there been a Nobel Prize in Mathematics, Poincaré would have been among the first awarded.
Thanks, Cuil! You make me look good.