What's New
- Create deep learning networks with more complex architectures to improve accuracy, and use many popular pretrained models.
- Create deep learning networks with the LSTM recurrent neural network topology for time-series classification and prediction.
- Train convolutional neural networks (also known as ConvNets or CNNs) for regression tasks.
- Apply transfer learning with the pretrained CNN models AlexNet, VGG-16, and VGG-19, and import models from Caffe (including the Caffe Model Zoo).
- Define new layers with learnable parameters, and specify loss functions for classification and regression output layers.
- Monitor training progress with plots of accuracy, loss, validation metrics, and more.
- Automatically validate networks and stop training when validation metrics stop improving.
- Visualize the features a ConvNet has learned using deep dream and activations.
- Find optimal settings for training deep networks (requires Statistics and Machine Learning Toolbox).
- Train convolutional neural networks on multiple GPUs on PCs (using Parallel Computing Toolbox) and on clusters (using MATLAB Distributed Computing Server).
- Train convolutional neural networks using multiple GPUs in MATLAB and MATLAB Distributed Computing Server for Amazon EC2.
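The visualization features above can be sketched as follows. This assumes the AlexNet support package is installed; the layer names and channel numbers are illustrative:

```matlab
% Sketch: visualize the learned features of a pretrained CNN (assumes
% the alexnet support package is installed; layer names illustrative).
net = alexnet;

% Deep dream: synthesize images that strongly activate channels 1-9
% of an intermediate convolutional layer, then view them as a montage.
I = deepDreamImage(net, 'conv5', 1:9);
montage(I)

% Activations: extract the feature maps of the first convolutional
% layer for a single input image resized to the network's input size.
img = imresize(imread('peppers.png'), net.Layers(1).InputSize(1:2));
act = activations(net, img, 'conv1');
```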
Latest Releases
R2018a (version 11.1) - Mar 2018
Version 11.1, part of Release 2018a, includes the following enhancements:
- Long Short-Term Memory (LSTM) Networks: Solve regression problems with LSTM networks and learn from full sequence context using bidirectional LSTM layers
- Deep Learning Optimization: Improve network training using Adam, RMSProp, and gradient clipping
- Deep Learning Data Preprocessing: Read data and define preprocessing operations efficiently for training and prediction
- Deep Learning Layer Validation: Check layers for validity, GPU compatibility, and correctly defined gradients
- Directed Acyclic Graph (DAG) Networks: Accelerate DAG network training using multiple GPUs and compute intermediate layer activations
See the Release Notes for details.
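The R2018a LSTM and optimization items above can be combined in a single training script. This is a minimal sketch on synthetic data; the layer sizes and hyperparameters are illustrative:

```matlab
% Sketch: sequence-to-one regression with a bidirectional LSTM, trained
% with Adam and gradient clipping (synthetic data; sizes illustrative).
numObs = 100; numFeatures = 3; seqLen = 50;
XTrain = arrayfun(@(~) randn(numFeatures, seqLen), (1:numObs)', ...
    'UniformOutput', false);          % cell array of sequences
YTrain = randn(numObs, 1);            % one numeric response per sequence

layers = [
    sequenceInputLayer(numFeatures)
    bilstmLayer(64, 'OutputMode', 'last')  % full sequence context
    fullyConnectedLayer(1)
    regressionLayer];

options = trainingOptions('adam', ...     % Adam optimizer
    'GradientThreshold', 1, ...           % gradient clipping
    'MaxEpochs', 10, ...
    'Verbose', false);

net = trainNetwork(XTrain, YTrain, layers, options);
```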
R2017b (version 11.0) - Sep 2017
Version 11.0, part of Release 2017b, includes the following enhancements:
- Directed Acyclic Graph (DAG) Networks: Create deep learning networks with more complex architectures to improve accuracy and use many popular pretrained models
- Long Short-Term Memory (LSTM) Networks: Create deep learning networks with the LSTM recurrent neural network topology for time-series classification and prediction
- Deep Learning Validation: Automatically validate networks and stop training when validation metrics stop improving
- Deep Learning Layer Definition: Define new layers with learnable parameters, and specify loss functions for classification and regression output layers
- Deep Learning Training Plots: Monitor training progress with plots of accuracy, loss, validation metrics, and more
- Deep Learning Image Preprocessing: Efficiently resize and augment image data for training
- Bayesian Optimization of Deep Learning: Find optimal settings for training deep networks (Requires Statistics and Machine Learning Toolbox)
See the Release Notes for details.
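The DAG network, training plot, and validation-stopping items above fit together as follows. This sketch assumes `imdsTrain` and `imdsVal` are existing image datastores; the layer names are illustrative:

```matlab
% Sketch: a small DAG network with a shortcut connection, trained with
% a live progress plot and automatic validation stopping (imdsTrain and
% imdsVal are assumed imageDatastore objects; names illustrative).
layers = [
    imageInputLayer([28 28 1], 'Name', 'input')
    convolution2dLayer(3, 16, 'Padding', 'same', 'Name', 'conv1')
    reluLayer('Name', 'relu1')
    convolution2dLayer(3, 16, 'Padding', 'same', 'Name', 'conv2')
    additionLayer(2, 'Name', 'add')       % merges two incoming branches
    reluLayer('Name', 'relu2')
    fullyConnectedLayer(10, 'Name', 'fc')
    softmaxLayer('Name', 'softmax')
    classificationLayer('Name', 'output')];

lgraph = layerGraph(layers);
% Shortcut: route the relu1 output around conv2 into the addition layer.
lgraph = connectLayers(lgraph, 'relu1', 'add/in2');

options = trainingOptions('sgdm', ...
    'Plots', 'training-progress', ...     % live accuracy/loss plot
    'ValidationData', imdsVal, ...
    'ValidationPatience', 5);             % stop when validation stalls

net = trainNetwork(imdsTrain, lgraph, options);
```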
R2017a (version 10.0) - Mar 2017
Version 10.0, part of Release 2017a, includes the following enhancements:
- Deep Learning for Regression: Train convolutional neural networks (also known as ConvNets or CNNs) for regression tasks
- Pretrained Models: Transfer learning with pretrained CNN models AlexNet, VGG-16, and VGG-19, and import models from Caffe (including Caffe Model Zoo)
- Deep Learning with Cloud Instances: Train convolutional neural networks using multiple GPUs in MATLAB and MATLAB Distributed Computing Server for Amazon EC2
- Deep Learning with Multiple GPUs: Train convolutional neural networks on multiple GPUs on PCs (using Parallel Computing Toolbox) and clusters (using MATLAB Distributed Computing Server)
- Deep Learning with CPUs: Train convolutional neural networks on CPUs as well as GPUs
- Deep Learning Visualization: Visualize the features a ConvNet has learned using deep dream and activations
See the Release Notes for details.
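The transfer-learning and multi-GPU items above can be sketched together. This assumes the AlexNet support package is installed and that `imdsTrain` is an existing labeled image datastore; the class count and learning rate are illustrative:

```matlab
% Sketch: transfer learning from pretrained AlexNet (requires the
% alexnet support package; imdsTrain is an assumed imageDatastore).
net = alexnet;
layersTransfer = net.Layers(1:end-3);     % keep all but the last 3 layers

numClasses = 5;                           % illustrative
layers = [
    layersTransfer
    fullyConnectedLayer(numClasses)       % new task-specific layers
    softmaxLayer
    classificationLayer];

options = trainingOptions('sgdm', ...
    'InitialLearnRate', 1e-4, ...         % small rate for fine-tuning
    'ExecutionEnvironment', 'multi-gpu'); % use all local GPUs

netTransfer = trainNetwork(imdsTrain, layers, options);
```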
R2016b (version 9.1) - Sep 2016
Version 9.1, part of Release 2016b, includes the following enhancements:
- Deep Learning with CPUs: Run trained CNNs to extract features, make predictions, and classify data on CPUs as well as GPUs
- Deep Learning with Arbitrary Sized Images: Run trained CNNs on images that differ in size from those used for training
- Performance: Train CNNs faster when using an ImageDatastore object
- Deploy Training of Models: Deploy training of a neural network model via MATLAB Compiler or MATLAB Compiler SDK
See the Release Notes for details.
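Running a trained CNN on the CPU, as described above, can be sketched like this. It assumes `net` is a trained network and `imds` an existing image datastore; the layer name is illustrative:

```matlab
% Sketch: run a trained CNN on the CPU for prediction and feature
% extraction (net and imds assumed to exist; layer name illustrative).
YPred = classify(net, imds, 'ExecutionEnvironment', 'cpu');

% Extract features from an intermediate layer on the CPU, e.g. to feed
% a classical classifier such as an SVM.
features = activations(net, imds, 'fc7', 'ExecutionEnvironment', 'cpu');
```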
R2016a (version 9.0) - Mar 2016
Version 9.0, part of Release 2016a, includes the following enhancements:
- Deep Learning: Train deep convolutional neural networks with built-in GPU acceleration for image classification tasks (using Parallel Computing Toolbox)
See the Release Notes for details.
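The GPU-accelerated image-classification training introduced in this release can be sketched as a minimal CNN. `imdsTrain` is an assumed labeled image datastore; the layer sizes are illustrative:

```matlab
% Sketch: a minimal CNN for image classification, trained on the GPU
% when one is available (digit-sized inputs; sizes illustrative).
layers = [
    imageInputLayer([28 28 1])
    convolution2dLayer(5, 20)
    reluLayer
    maxPooling2dLayer(2, 'Stride', 2)
    fullyConnectedLayer(10)
    softmaxLayer
    classificationLayer];

options = trainingOptions('sgdm', ...
    'ExecutionEnvironment', 'auto');  % GPU if present (Parallel Computing Toolbox)

net = trainNetwork(imdsTrain, layers, options);  % imdsTrain assumed
```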