*Last modified 28 March 2017*

I recently attended the Heidelberg Laureate Forum, in which some older researchers--mainly Fields Medalists and Turing Award winners--gave talks to and spoke with young researchers. The young researchers ranged from students just entering university to young faculty members, with most being graduate students. They came from all over the world and were about evenly divided between mathematicians and computer scientists. There was a rigorous selection process, and almost all the young researchers seemed to be very intelligent.

In my talk I presented a formula that I said was the definition of the set of all permutations of the numbers 1 through N. I had already explained that [1..N ⟶ 1..N] is the set of functions that map the set 1..N of integers from 1 through N into itself, and that I was writing f[x] instead of f(x) for function application. Here's the formula, approximately as it appeared on my slide:

{f ∈ [1..N ⟶ 1..N] : ∀ y ∈ 1..N : ∃ x ∈ 1..N : f[x] = y}
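For readers who prefer code to logical notation, here is a small Python sketch (my own, not from the talk) that computes the set the formula describes by brute force: all functions from 1..N to itself, keeping those for which every y in 1..N is hit by some x. Since domain and codomain are the same finite set, such surjective functions are exactly the permutations.

```python
from itertools import product, permutations

def formula_set(n):
    """The set defined by the formula, computed by brute force.

    A function f : 1..n -> 1..n is encoded as the tuple (f[1], ..., f[n]).
    """
    # All functions from 1..n into 1..n.
    all_functions = product(range(1, n + 1), repeat=n)
    # Keep f iff: for every y in 1..n there exists an x with f[x] = y.
    return {f for f in all_functions
            if all(any(f[x] == y for x in range(n))
                   for y in range(1, n + 1))}

# The formula picks out exactly the permutations of 1..n.
assert formula_set(3) == set(permutations(range(1, 4)))
```

This is only an executable illustration of the formula's meaning; the point of the anecdote is that the mathematical notation itself should be just as readable.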

I then asked the audience to raise their hands if they could look at this formula and understand it. About half the audience raised their hands. I then asked all the mathematicians to lower their hands. Almost no hands remained up. Hardly any of the computer scientists could understand it.

Remember, these were specially selected, bright young researchers from around the world.

This may explain why TLA+ is not more widely used, and why it should be.

Click here to see a video of my talk. The relevant part starts at 15:00 and lasts about 2.5 minutes; you can go directly to that part by clicking here.