Conversations with Gil Strang

At JuliaCon 2018 in London, one of the keynote presentations was a conversation with Gil Strang led by Alan Edelman and Pontus Stenetorp. Gil is well known for his many books on linear algebra, applied mathematics and numerical analysis, as well as his research contributions.

Gil talked about his famous 18.06 linear algebra course at MIT, which in its YouTube form has had almost 3 million views. A number of people in the audience commented that they had learned linear algebra from Gil.


Gil also talked about his book Linear Algebra and Learning from Data, due to be published at the end of 2018, which includes chapters on deep neural networks and backpropagation. Many people will want to read Gil’s insightful slant on these topics (see also my SIAM News article The World’s Most Fundamental Matrix Equation).

As well as the video of Gil’s interview embedded below, two written interviews will be of interest to Gil’s fans:

Tricks and Tips in Numerical Computing

In a keynote talk at JuliaCon 2018 I described a variety of tricks, tips and techniques that I’ve found useful in my work in numerical computing.

The first part of the talk was about two aspects of complex arithmetic: the complex step method for approximating derivatives, and understanding multivalued functions with the help of the unwinding number. Then I talked about the role of the associativity of matrix multiplication, which turns out to be the key property that makes the Sherman-Morrison formula work (this formula gives the inverse of a matrix after a rank 1 update). I pointed out the role of associativity in backpropagation in neural networks and deep learning.
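These ideas are easy to experiment with. The sketch below is my own illustration rather than code from the talk: it applies the complex step approximation to an arbitrary smooth test function, shows in one line why log ∘ exp needs the unwinding number, and checks the Sherman-Morrison formula numerically with the products grouped so that only vectors and one outer product appear once the inverse is at hand.

```julia
using LinearAlgebra

# Complex step approximation: f'(x) ≈ Im(f(x + ih))/h for real analytic f
# and real x.  There is no subtractive cancellation, so h can be taken
# extremely small.
complex_step(f, x; h = 1e-100) = imag(f(x + im*h)) / h

f(x) = exp(x) / (cos(x)^3 + sin(x)^3)   # an arbitrary smooth test function
complex_step(f, pi/4)                   # ≈ f'(π/4), accurate to around machine precision

# Multivalued functions: log(exp(z)) differs from z by 2πi times the
# unwinding number.  For z = 3πi the unwinding number is 1:
z = 3π * im
log(exp(z)) ≈ π * im                    # true: the imaginary part is "unwound" into (-π, π]

# Sherman-Morrison: (A + u*v')⁻¹ = A⁻¹ - (A⁻¹u)(v'A⁻¹)/(1 + v'A⁻¹u).
# Associativity of matrix multiplication lets us group the products so that
# only matrix-vector operations and one outer product are needed once A⁻¹
# (or a factorization of A) is available.
n = 5
A = rand(n, n) + n * I                  # comfortably nonsingular
u, v = rand(n), rand(n)
Ainv = inv(A)
B = Ainv - (Ainv * u) * (v' * Ainv) / (1 + v' * Ainv * u)
norm(B - inv(A + u * v'))               # small: the update formula checks out
```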

After giving an example of using randomization to avoid pathological cases, I discussed why low precision (specifically half precision) arithmetic is of growing interest and identified some issues that need to be overcome in order to make the best use of it.
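As a small illustration of my own (not an excerpt from the talk), Julia’s built-in Float16 type makes the main hazards visible immediately: half precision carries only about three decimal digits, overflows just above 6.5e4, and loses accuracy rapidly when a long sum is accumulated entirely in half precision, which is why accumulating in higher precision is so often recommended.

```julia
# Half precision in Julia: Float16 has an 11-bit significand and a tiny range.
eps(Float16)          # 0.000977: the gap between 1 and the next Float16
floatmax(Float16)     # 65504: overflow threshold, compared with ~1.8e308 for Float64

x = Float16(300)
x * x                 # Inf: the true value 90000 overflows in half precision

# Long sums go wrong if every partial sum is rounded to half precision;
# accumulating in Float64 and rounding once at the end is far more accurate.
a = rand(Float16, 10_000)          # true sum is about 5000
s16 = foldl(+, a)                  # running total kept in Float16: stagnates
                                   # near 2048, where the Float16 spacing (2)
                                   # exceeds the summands
s64 = Float16(sum(Float64, a))     # accumulate in Float64, round once at the end
s16, s64
```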

Almost every talk at JuliaCon was livecast on YouTube, and these talks are available to watch on the Julia Language channel. The slides for my talk are available here.

Also at the conference, my PhD student Weijian Zhang spoke about his work on evolving graphs.