Self-taught C++

I have recently fielded a few requests from students about self-directed learning of C++, so I thought I’d combine my notes here. Two things make the language harder to pick up on your own than, say, Python. First, C++ is a very large language, both in its syntactic richness and in the size of its standard library. Secondly, it has been popular for at least two decades longer than Python, so there is a lot of really dated material out there that doesn’t incorporate the huge positive changes made to the language in C++11.

I recommend two books. First and most important is the 4th edition of (C++ creator) Bjarne Stroustrup’s The C++ Programming Language. This is a gigantic hardback textbook that covers basically everything you need to know through C++11. It does not cover C++14, C++17, C++20, or C++23, but those are all pretty minor changes by comparison, and you’ll catch on. Stroustrup is actually a pretty good technical writer, too. (If a 5th edition ever comes out, get that one instead.) The other book I recommend is Scott Meyers’s Effective Modern C++, a smaller volume which focuses on the newer C++11 and C++14 features. Meyers’s book is structured as a series of essays about when and how to incorporate these new features.
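
To give a flavor of the post-C++11 style both books push you toward, here is a small, self-contained snippet using a handful of C++11 and C++14 features of the kind Meyers covers (auto, lambdas, smart pointers, range-based for loops); the example itself is mine, not taken from either book.

    #include <algorithm>
    #include <iostream>
    #include <memory>
    #include <vector>

    int main() {
      // Brace initialization and `auto` type deduction (C++11).
      std::vector<int> values{3, 1, 4, 1, 5};
      // A lambda (C++11) passed to a standard algorithm.
      std::sort(values.begin(), values.end(),
                [](int lhs, int rhs) { return lhs > rhs; });
      // Range-based for loop (C++11).
      for (const auto &value : values) std::cout << value << " ";
      std::cout << std::endl;
      // A smart pointer (std::make_unique is C++14) instead of raw new/delete.
      auto copy = std::make_unique<std::vector<int>>(values);
      std::cout << copy->size() << std::endl;
    }

Each of these replaces a more verbose, more error-prone pre-C++11 idiom, which is exactly the kind of contrast Meyers’s essays walk through.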

There are two other things I recommend to aspiring C++ users. The first is a good style guide. C++ just isn’t very opinionated, but good code is. I definitely recommend the widely used Google C++ style guide, but I’m sure there are other good ones out there. The second is Godbolt, an incredible website that combines the functionality of a pastebin with an in-browser compiler.

Optionality as acquirendum

A lot of work deals with the question of acquiring “optional” or “variable” grammatical rules, and my impression is that different communities are mostly talking at cross-purposes. I discern at least three ways linguists conceive of optionality as something which the child must acquire.

  1. Some linguists assume—I think without much evidence—that optionality is mere “free variation”, so that the learner simply needs to infer which rules bear a binary [optional] feature. This is an old idea, going back to at least Dell (1981); Rasin et al. (2021:35) explicitly state the problem in this form.
  2. Variationist sociolinguists focus on the differential rates at which grammatical rules apply. They generally conceive of the acquirenda as, essentially, conditional probability distributions giving the probability of rule application in a given grammatical context (I sketch this more concretely just below this list). Bill Labov is a clear avatar of this strain of thinking (e.g., Labov 1989). David Adger and colleagues have attempted to situate this within modern syntactic frameworks (e.g., Adger 2006).
  3. Some linguists believe that optionality is not statable within a single grammar, and must instead reflect competing grammars. The major proponent of this approach is Anthony Kroch (e.g., Kroch 1989). While this conception might license some degree of “nihilism” about optionality, it has also led to some interesting work hypothesizing substantive, grammar-internal constraints on variation, as in the work of Laurel MacKenzie and colleagues (e.g., MacKenzie 2019). This work is also very good at ridding (2) of some of its unfortunate “externalist” thinking.
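
To make the variationist conception in (2) a bit more concrete, here is one minimal way of stating the acquirendum; the notation is mine, not Labov’s or Adger’s. For each grammatical context C, the learner must acquire something like

    p(C) = P(rule applies | context C),

that is, one rate of application per context. In variable-rule practice these rates are usually not estimated independently but via a logistic regression over contextual factors, roughly

    p(C) = logistic(β₀ + β₁x₁(C) + β₂x₂(C) + …),

where the xᵢ(C) encode properties of the context (phonological environment, speech style, and so on) and the βᵢ are their estimated effects.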

I have to reject (1) as overly simplistic. I find both (2) and (3) compelling in some ways, but a lot of work remains to synthesize or adjudicate between them.

References

Adger, D. 2006. Combinatorial variability. Journal of Linguistics 42(3): 503-530.
Dell, F. 1981. On the learnability of optional phonological rules. Linguistic Inquiry 12(1): 31-37.
Kroch, A. 1989. Reflexes of grammar in patterns of language change. Language Variation & Change 1(1): 199-244.
Labov, W. 1989. The child as linguistic historian. Language Variation & Change 1(1): 85-97.
MacKenzie, L. 2019. Perturbing the community grammar: Individual differences and community-level constraints on sociolinguistic variation. Glossa 4(1): 28.
Rasin, E., Berger, I., Lan, R., Shefi, I., and Katzir, R. 2021. Approaching explanatory adequacy in phonology using Minimum Description Length. Journal of Language Modelling 9(1): 17-66.