News on the linguistics blogs front:
Heidi Harley has a new post (after 3 years 😉 ) about a light-bulb moment concerning Hiaki echo vowels, in which she accessibly explains an idea she had about a puzzle she encountered.
Dan Milway continues his discussion of Jerrold Katz’s 1972 Semantic Theory after a hiatus probably caused by finishing his PhD (congrats!).
Thomas Graf’s series on subregular complexity, which started out in phonology, has finally arrived at syntax, with takes on Merge and Move, islands and more islands. His writing is very clear and understandable, even for a computational amateur like me. Simply read everything on this amazing blog!
Last but not least, Omer shares some anecdotes about the occasional value of imprecise recall: your brain sometimes generously autocorrects your memory of, say, a principle or a generalization into the version that turns out to be more helpful for your research.
On the ‘linguistics in the popular press’ front, we have an article about the Bender Rule (of hashtag fame), i.e. the rule that you should always name the language you are working on, to avoid the subconscious equation “Natural Language = English”.
There’s a TED talk (the natural product of evolution if you introduce presentations into the ecological niche that is American culture) by Ed Gibson on how efficiency shapes human language.
Linguistics also made it into Ars Technica with an MIT study which found that languages generally minimize dependency length, i.e. the linear distance between syntactically related words.
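The dependency-length idea is easy to illustrate with a toy sketch (my own, not from the study): treat a parse as a set of head–dependent arcs over word positions and sum up the linear distances. The example sentence and arc positions below are made up for demonstration.

```python
def total_dependency_length(arcs):
    """Sum of |head - dependent| over all arcs, where arcs are
    (head_position, dependent_position) pairs of 0-indexed word positions."""
    return sum(abs(head - dep) for head, dep in arcs)

# "John threw out the trash":
#   threw(1)->John(0), threw(1)->out(2), trash(4)->the(3), threw(1)->trash(4)
arcs_particle_first = [(1, 0), (1, 2), (4, 3), (1, 4)]

# "John threw the trash out" (particle after the object):
#   threw(1)->John(0), threw(1)->out(4), trash(3)->the(2), threw(1)->trash(3)
arcs_particle_last = [(1, 0), (1, 4), (3, 2), (1, 3)]

print(total_dependency_length(arcs_particle_first))  # 1 + 1 + 1 + 3 = 6
print(total_dependency_length(arcs_particle_last))   # 1 + 3 + 1 + 2 = 7
```

On this toy count, keeping the particle next to the verb yields the shorter total dependency length; the claim in the study is that, across many languages, speakers tend to prefer orders like the first one.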
Some people at MIT also enriched a recurrent neural network (RNN) with a grammar – it is almost as if this RNN had innate knowledge of language – and found that it outperforms comparable models with little or no added grammar.
A crossover between linguistics and mathematics (in this case applied category theory) is, as usual, above my pay grade: dynamic syntax, grammars as parsers, context-freeness and monoidal categories, or whatever all that means, over on the n-Category Café. So if you want to go down the rabbit hole of general abstract nonsense (not my term!), there you go!