What’s New in MATLAB R2017a?

MATLAB R2017a was released last week. Many of the changes reported in the release notes are evolutionary, building on and extending major new features introduced previously. For example, the Live Editor continues to gain expanded capabilities. In this post I pick out a few new features that caught my eye. This is very much a personal selection. For full details of what’s new, see the release notes, and see also my previous post What’s New in MATLAB R2016b if you are not familiar with R2016b.

Parula

The parula color map has been modified slightly. The difference is subtle, but as the following example illustrates, the R2017a parula is a bit more vibrant and has a bit less cyan and yellow in the blues and greens.

[Image: parula-2016b-2017a.jpg]

I note, however, that the difference between the old and the new parula is smaller when the images are converted to the CMYK color space, as they must be for printing.
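
One way to see the change for yourself is to inspect the RGB values of the map directly. The following sketch (my own, not from the release notes) plots the red, green, and blue components; running it in both R2016b and R2017a shows where the maps differ:

map = parula(64);   % 64-by-3 matrix of RGB values in [0,1]
rgbplot(map)        % plot the red, green, and blue components against index
legend('R','G','B')
title('parula RGB components')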

Heatmap

The new heatmap function plots a heatmap of tabular data. Although it is intended mainly for use with the table data type, I think heatmap will be useful for getting insight into the structure of matrices, as illustrated by the following examples.

[Image: heatmaps.jpg]
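
To experiment with heatmap on a plain matrix, something like the following minimal sketch works (the test matrix and title are my own choices, not taken from the figures above):

A = gallery('lehmer',10);   % symmetric positive definite test matrix
h = heatmap(A);             % plot the matrix entries as a heatmap
h.Title = 'Lehmer matrix';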

String Arrays

String arrays, introduced in MATLAB R2016b, can now be formed using double quotes:

>> s = string('This is a string') % R2016b
s = 
    "This is a string"
>> t = "This is a string"         % R2017a
t = 
    "This is a string"
>> isequal(s,t)
ans =
  logical
   1
>> whos
  Name      Size            Bytes  Class     Attributes

  s         1x1               166  string              
  t         1x1               166  string

However, there is one major caveat: many MATLAB functions that take a char as an input argument have not yet been adapted to accept strings. Hence

>> A = gallery('moler',3)
A =
     1    -1    -1
    -1     2     0
    -1     0     3
>> A = gallery("moler",3)
Error using nargin
Argument must be either a character vector or a function handle.
Error in gallery (line 191)
nargs = nargin(matname);
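
In the meantime, a simple workaround is to convert the string to a char vector with the char function:

>> A = gallery(char("moler"),3);   % char("moler") produces the char vector 'moler'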

I expect that such functions will be updated in future releases.

Missing

A new function missing creates missing values appropriate to the data type in question.

>> A = ones(2); A(2,2) = missing
A =
     1     1
     1   NaN

>> d = datetime('2014-05-26')
d = 
  datetime
   26-May-2014
>> d(2) = missing
d = 
  1×2 datetime array
   26-May-2014   NaT

Until converted to the target data type, a missing value has class missing:

>> m = missing
m = 
  missing
    
>> class(m)
ans =
    'missing'
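
The same conversion happens on assignment into other types. For example (a quick sketch with values of my own choosing):

s = ["one" "two"];  s(3) = missing          % string: new element displays as <missing>
c = categorical({'a','b'}); c(3) = missing  % categorical: displays as <undefined>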

Performance Improvements

The release notes report performance improvements under a variety of headings, including execution engine, scripts, and mathematics functions. These are very welcome, as the user automatically benefits from them. One comment that caught my eye is “The backslash command A\B is faster when operating on negative definite matrices”. I think this means that MATLAB checks whether the matrix, A, is symmetric with all negative diagonal elements and, if it is, attempts a Cholesky factorization of -A.
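
If that guess is correct, the logic presumably resembles the following sketch (my own speculation, not actual MathWorks code; the test matrix is an arbitrary choice):

n = 500;
A = -gallery('lehmer',n);    % symmetric negative definite test matrix
b = randn(n,1);
if issymmetric(A) && all(diag(A) < 0)
    [R,p] = chol(-A);        % attempt Cholesky: -A = R'*R when p == 0
    if p == 0
        x = -(R \ (R' \ b)); % solve A*x = b via the factorization of -A
    else
        x = A \ b;           % -A not positive definite: use the general solver
    end
else
    x = A \ b;
end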

Tracing the Early History of MATLAB Through SIAM News

A recent blog post by Ned Gulley points out that the new mathematics gallery (“Mathematics: The Winton Gallery”) at the Science Museum, London, contains a copy of the disk and manual for MATLAB 1.3, from 1985, sitting next to a trial assembly of Charles Babbage’s analytical engine.

This set me thinking about the early history of MATLAB and The MathWorks. Items such as manuals and disks must be quite rare nowadays. What other traces are there of early MATLAB history? I have recently been looking through some back issues of SIAM News and spotted a number of adverts for MATLAB over the period 1985–1991. Let’s see what historical insight these early adverts give.

1985

[Image: sinews-1985-18-2-mathworks_ad.jpg (PDF file)]

The first advert I can find is from the March 1985 SIAM News, and it is for PC-MATLAB, priced at $695, running on an IBM-PC or compatible computer. The advert features the now famous MathWorks logo, which represents an eigenfunction of the L-shaped membrane. It quotes a time of 10.1 seconds for a 50-by-50 real matrix multiplication on a machine with an Intel 8087 coprocessor. This floating-point coprocessor was a useful add-on to the IBM PC, which was built around the Intel 8088 chip.

MATLAB benefited from being launched at a time when the IBM PC with the 8087 coprocessor was just starting to become popular. The 8088-8087 combination made it possible to carry out computations on a desktop PC that had previously required a minicomputer, and they could now be done through MATLAB's interactive interface.

The advert mentions “mainframe MATLAB”, which it says is written in Fortran, runs on “larger computers”, and is in use in several hundred organizations worldwide.

PC-MATLAB had been rewritten in C, and it supported graphics, IEEE arithmetic (as implemented in the 8087), and “user-defined functions” (M-files).

Note that at this time MathWorks was located at 124 Foxwood Road, Portola Valley, California. In his article The Growth of MATLAB and The MathWorks Over Two Decades, Cleve Moler explains that this first mailing address was “a rented A-frame cabin where Jack [Little] lived in the hills above Stanford University in Portola Valley, California”.

The September 1985 issue of SIAM News contains a rather different advert that now refers to “the original mainframe version of MATLAB” and emphasizes the ease of use of MATLAB.

[Image: sinews-1985-18-5-mathworks_ad.jpg (PDF file)]

1987

[Image: sinews-1987-20-1-mathworks_ad.jpg (PDF file)]

The next advert is from the January 1987 SIAM News. MATLAB is now also available as Pro-MATLAB, running on a Sun workstation or a VAX computer. M-files are now mentioned by that name, and LINPACK benchmark figures are stated, the largest being 98 Kflops on the MicroVAX II.

The MathWorks has now moved to Massachusetts.

1990

[Image: sinews-1990-23-3-mathworks_ad.jpg (PDF file)]

The advert in the May 1990 SIAM News gives a version number, MATLAB 3.5, and it announces the Signal Processing Toolbox. It boasts that MATLAB has over 400 built-in functions and supports an enlarged range of computers, which include Cray supercomputers. The benchmark for a 50-by-50 real matrix multiplication is now down to 0.71 seconds on a 20 MHz 386-based PC: a reduction by a factor of 14 from the figure quoted in 1985.

A strange feature of the advert is that the toolbox is explicitly mentioned only in the box at the top left, and the meaning of "toolbox" is not stated.

The MathWorks address has changed again, to 21 Eliot St., South Natick, Massachusetts. In the article mentioned above, Cleve explains, “When the company reached about a dozen employees, we moved several miles east to take over the second floor of a lovely building in South Natick, Massachusetts”.

1991

[Image: sinews-1991-24-5-mathworks_ad.jpg (PDF file)]

The advert in the May 1991 issue of SIAM News focuses on two new toolboxes: the Spline Toolbox and the Optimization Toolbox. The gray box defines toolboxes as “sets of routines written in the MATLAB programming language for specialized applications”.

MathWorks is now located at Cochituate Place on Prime Park Way in Natick. The street was so named because it had been the home of the Prime Computer Corporation, which produced minicomputers from 1972 to 1992.

These adverts give some insight into the development of MATLAB in its early years. They also show how the rapid growth of MathWorks necessitated frequent relocation of the company. Indeed, in the article mentioned above Cleve Moler notes that the number of employees roughly doubled every year for the first seven years.

Parallel Numerical Linear Algebra for Extreme Scale Systems

A minisymposium, Parallel Numerical Linear Algebra for Extreme Scale Systems, was held on February 28, 2017, at the SIAM Conference on Computational Science and Engineering in Atlanta.

[Image: 170228-0921-02-5781.jpg. Caption: Jack Dongarra.]

Today’s most powerful supercomputers are composed of hundreds of thousands of computing cores (CPUs and accelerators) connected by high-speed networks to make up a massively parallel high performance computing (HPC) system. These systems place new demands on scalable numerical algorithms and software libraries, demands that will only grow as we move towards increasingly heterogeneous systems with millions of compute cores. This minisymposium, which I organized jointly with Bo Kågström (Umeå University, Sweden), focused on addressing these challenges for linear algebra problems by developing novel parallel algorithms, exploring advanced scheduling strategies and runtime systems, carrying out offline and online autotuning, and avoiding communication and synchronization bottlenecks.

[Image: 170228-1020-54-5783.jpg. Caption: Iain Duff showing the new second edition of Direct Methods for Sparse Matrices.]

The speakers were all members of the NLAFET (Parallel Numerical Linear Algebra for Future Extreme-Scale Systems) project, which is one of the high-profile extreme-scale computing research projects funded by the European Commission within the Future and Emerging Technologies (FET) program under Horizon 2020. Much of the work described in the minisymposium was carried out within NLAFET.

Around 75 people attended and there was standing room only. Here are the talks, with links to the slides. The names of the speakers are italicized.

Related to this minisymposium was the two-day Workshop on Batched, Reproducible, and Reduced Precision BLAS, held a couple of days beforehand at Georgia Tech. The workshop included presentations from both academia and industry and the program contains links to the speakers’ slides.