The Muon g-2 experiment is one of the last places where there is a reasonable hope of finding discrepancies with the Standard Model of particle physics. This Science Magazine article describes what it is all about and why it is relevant. Jester of Resonaances tweets that at this year’s Rencontres there is a talk, “Most recent results from g-2 experiment”. This might just be an update and not necessarily the announcement of a discovery, but who knows.
As far as I understand, even in the exciting case that we finally find something that the SM does not predict, this will not really change the current state of particle physics. Since this is a high-precision experiment, we will only know that something is wrong with the SM, but not what or even at what energy scale. But even so, it would be the first crack in the SM and therefore a thing to celebrate.
David Tanzer, on John Baez’s blog Azimuth, advertised a new blog, The Signal Beat, where you can find a series on language complexity.
The posts start by viewing a language as a set of all its sentences and then guide you through ideas such as how to decide whether a sentence belongs to that language and how complex that decision procedure is, ending on the P vs NP conjecture.
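A toy example of my own (not from the blog series) makes the setup concrete: take the “language” to be the set of all strings of balanced brackets. Deciding whether a sentence belongs to this language takes a single left-to-right scan, so the decision procedure runs in linear time and the language sits comfortably in P; for problems like Boolean satisfiability, by contrast, no polynomial-time decision procedure is known.

```python
def in_language(sentence: str) -> bool:
    """Decide membership in the toy language of balanced brackets.

    The language is the set of all strings over '(' and ')' in which
    every bracket is properly matched. One left-to-right scan suffices,
    so membership is decidable in linear time.
    """
    depth = 0
    for symbol in sentence:
        if symbol == "(":
            depth += 1
        elif symbol == ")":
            depth -= 1
            if depth < 0:        # closing bracket with no matching opener
                return False
        else:
            return False         # symbol outside the alphabet
    return depth == 0            # every opener must eventually be closed

print(in_language("(()())"))  # → True
print(in_language("(()"))     # → False
```

The function name and the choice of language are mine, purely for illustration; the point is just that “how hard is the membership decision?” is a precise, answerable question once a language is viewed as a set of sentences.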
There’s an interview with David Adger in the current issue of Nautilus. It contains everything that the laws of physics dictate a linguistics interview should be about: Universal Grammar, merge, Arrival, Pirahã, something something language is such a central part of human life/society blabla.
One (part of an) interview question that I celebrated was the following: “Your ideas evoke probably the only controversy in the linguistics world that has spilled over to popular culture—the debate over ‘universal grammar.’”
In general, David Adger is as informative and gentle as ever, using non-inflammatory rhetoric on the topic of UG (which is as always very welcome).
So, it’s cool to have actual linguists in science magazines, although the day has yet to come when actual in-depth questions about theoretical linguistics are asked.
The Atacama Cosmology Telescope just released its measurements of the cosmic microwave background. The inferred expansion rate of the universe agrees with the values from the earlier Planck measurements.
This is interesting insofar as the rate of expansion is one of cosmology’s biggest mysteries right now. Measurements based on the early universe and measurements of later times, derived from supernova data among other things, do not agree. So this new measurement lends additional credibility to the old Planck data, making experimental error an increasingly unlikely explanation.
The Hubble tension, as it’s called, gets tenser and tenser.
Just discovered another (semi-active) linguistics blog: Buffalo linguist, which is largely about phonetics; see here, e.g., the most recent post on phonetic universals.
I’ll add the blog to my overview post.
News was making the rounds on Friday that the CERN Council unanimously decided to place (most but not all of) its bets, in its research strategy, on the Future Circular Collider (FCC), a proton-proton collider with a circumference of about 100 km that would reach collision energies of about 100 TeV. This is discussed in Nature and on Peter Woit’s blog.
A critical essay by Sabine Hossenfelder appeared in SciAm, in which she reiterated her earlier concerns: in contrast to the LHC, there is no guarantee that physicists will find anything, and it is therefore hard to make a scientific case for spending such a huge amount of money.
Update: For anyone who speaks French or is able to use online translating tools, there is also an article in Le Monde about the topic by Adam Falkowski, author of Résonaances.
News came out just today that the XENON experiment, an underground lab in Italy looking for dark-matter particles, saw a 3.5-sigma excess in its data. Physicists right now are on a spectrum ranging from cautiously hyped to pessimistic, due to trust issues from one too many disappearing anomalies.
If it is not due to impurities in the detector material, which seems to be most people’s best cautious guess, and if the anomaly doesn’t vanish with more data, the best explanation would be axions or so-far-unknown properties of neutrinos. However, both options also seem problematic, since particles with the properties necessary to explain the excess would apparently carry away more energy from stars (where they would probably be produced) than observations allow.
You can read about this, as always, in Quanta Magazine and on the experiment’s own webpage.
To honour the occasion, my favourite physics blog has awoken from its slumber: Résonaances, aka Adam Falkowski, is back with his first post in two years!
So, fingers crossed!
A new issue of Inference just appeared, this time again with articles on linguistics. The first one is about Misused Terms in Linguistics by Eveline Leivada, where she undertakes the laudable task of defining and explaining controversial terms in linguistics; terms (such as, you guessed it, UG) that are controversial because, according to her, they have been used inconsistently and inaccurately. The best quote in the article by far is that ‘linguists would rather share each other’s toothbrush than each other’s terminology’. Something tells me that, still, all her definitions could lead to controversy.
The second is rather loosely related to theoretical linguistics, a text about people’s inner speech by David Lobina.
The news has made the rounds that the T2K experiment in Japan found evidence for an asymmetry in the behaviour of neutrinos and antineutrinos. The violation of this symmetry (CP symmetry) seems to be almost as large as possible, which is why the experiment was able to gather the data faster than initially expected.
A definitive discovery, however, requires more data, which will take some years to gather.
However, if this hint turns out to be true (and it seems most physicists expect that it will), it will provide new directions for beyond-Standard-Model physics.
As usual, you can best read about it in Quanta (with an earlier discussion some years ago when this effect first turned up here). Nature and Interactions also have worthwhile articles, while CernCourier has a bit more technical coverage (with some explanation as to what such a result would hint at theoretically here).
I want to keep the tradition alive that this is an ‘interdisciplinary’ blog, i.e. to post about stuff I don’t understand, i.e. physics and math.
The first item is an interesting ongoing real-life experiment in the sociology of science. Eight years ago, Shinichi Mochizuki claimed to have proven the abc conjecture (I henceforth refuse, for the indefinite future, to be shamed by mathematicians for unimaginative technical terms in linguistics). Apparently, this is a conjecture about a deep and unexpected relationship between addition and multiplication.
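To make that a little more concrete (this is the standard textbook formulation, nothing specific to Mochizuki’s work): for coprime positive integers with a + b = c, the conjecture says that c can only rarely be much larger than the radical rad(abc), the product of the distinct primes dividing abc. A minimal sketch of how one inspects abc triples, with function names of my own choosing:

```python
from math import log

def radical(n: int) -> int:
    """Product of the distinct prime factors of n, e.g. rad(12) = 2*3 = 6."""
    rad, p = 1, 2
    while p * p <= n:
        if n % p == 0:
            rad *= p
            while n % p == 0:   # strip repeated factors: they don't count
                n //= p
        p += 1
    if n > 1:                   # leftover prime factor
        rad *= n
    return rad

def quality(a: int, b: int) -> float:
    """q = log c / log rad(abc) for the triple (a, b, a+b).

    The abc conjecture asserts that for every eps > 0, only finitely
    many coprime triples have quality above 1 + eps.
    """
    c = a + b
    return log(c) / log(radical(a * b * c))

# 1 + 8 = 9: rad(1*8*9) = rad(72) = 2*3 = 6, so the quality exceeds 1
print(round(quality(1, 8), 4))                # → 1.2263
# The highest-quality triple known: 2 + 3^10 * 109 = 23^5
print(round(quality(2, 3**10 * 109), 4))      # → 1.6299
```

Triples with quality above 1 are surprisingly scarce; the point of the conjecture is that the multiplicative structure (the primes in abc) tightly constrains the additive equation a + b = c.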
To achieve that, Mochizuki developed a whole theory of his own, inter-universal Teichmüller theory (again…), dropped some 1500 pages of impenetrable, idiosyncratic notation on the arXiv, and left it at that. He refuses to give talks on it or hold lectures outside of Japan and leaves it to his colleagues to try to explain the material to other mathematicians. In these eight years, nobody has been able to verify the proof. Granted, a lot of people simply didn’t try, because the volume of necessary reading was far too large and because they couldn’t follow the style of presentation. Then Jakob Stix and Peter Scholze (of Fields Medal fame) worked through the material, found an alleged gap in the proof, and a week-long meeting with Mochizuki in Japan couldn’t remove their doubts (portrayed in this Quanta article).
Now the maths journal of Mochizuki’s institute (of which he is an editor) has decided to publish the proof – a weird choice, given that publication usually means a proof has been vetted and verified in the peer-review process, while at the same time most experts in the field can’t follow its logic.
Peter Scholze also commented on the current situation on Peter Woit’s blog, with an ongoing discussion with people who claim to understand the proof.
The last bits are from Scott Aaronson’s answers in his post “AMA: Apocalypse Edition”: here are his thoughts on the It from Qubit idea in fundamental physics (the whole spacetime-emerges-from-quantum-entanglement business) and whether undecidability/uncomputability are relevant for physics.
Finally, the editor’s choice for interesting article today appears in Quanta about progress in the Langlands program (a set of conjectures relating vastly different fields in maths to each other in deep and surprising ways).