adapt | Adapt neural network to data as it is simulated |

adaptwb | Adapt network with weight and bias learning rules |

adddelay | Add delay to neural network response |

boxdist | Distance between two position vectors |

cascadeforwardnet | Cascade-forward neural network |

catelements | Concatenate neural network data elements |

catsamples | Concatenate neural network data samples |

catsignals | Concatenate neural network data signals |

cattimesteps | Concatenate neural network data timesteps |

closeloop | Convert neural network open-loop feedback to closed loop |

combvec | Create all combinations of vectors |

compet | Competitive transfer function |

competlayer | Competitive layer |

con2seq | Convert concurrent vectors to sequential vectors |

configure | Configure network inputs and outputs to best match input and target data |

confusion | Classification confusion matrix |

convwf | Convolution weight function |

crossentropy | Cross-entropy performance function

disp | Neural network properties |

display | Name and properties of neural network variables |

dist | Euclidean distance weight function |

distdelaynet | Distributed delay network |

divideblock | Divide targets into three sets using blocks of indices |

divideind | Divide targets into three sets using specified indices |

divideint | Divide targets into three sets using interleaved indices |

dividerand | Divide targets into three sets using random indices |

dividetrain | Assign all targets to training set |

dotprod | Dot product weight function |

elliot2sig | Elliot 2 symmetric sigmoid transfer function |

elliotsig | Elliot symmetric sigmoid transfer function |

elmannet | Elman neural network |

errsurf | Error surface of single-input neuron |

extendts | Extend time series data to given number of timesteps |

feedforwardnet | Feedforward neural network |

fitnet | Function fitting neural network |

fixunknowns | Process data by marking rows with unknown values |

formwb | Form bias and weights into single vector |

fromnndata | Convert data from standard neural network cell array form |

gadd | Generalized addition |

gdivide | Generalized division |

genFunction | Generate MATLAB function for simulating neural network |

gensim | Generate Simulink block for neural network simulation |

getelements | Get neural network data elements |

getsamples | Get neural network data samples |

getsignals | Get neural network data signals |

getsiminit | Get Simulink neural network block initial input and layer delays states |

gettimesteps | Get neural network data timesteps |

getwb | Get network weight and bias values as single vector |

gmultiply | Generalized multiplication |

gnegate | Generalized negation |

gridtop | Grid layer topology function |

gsqrt | Generalized square root |

gsubtract | Generalized subtraction |

hardlim | Hard-limit transfer function |

hardlims | Symmetric hard-limit transfer function |

hextop | Hexagonal layer topology function |

ind2vec | Convert indices to vectors |

init | Initialize neural network |

initcon | Conscience bias initialization function |

initlay | Layer-by-layer network initialization function |

initlvq | LVQ weight initialization function |

initnw | Nguyen-Widrow layer initialization function |

initwb | By weight and bias layer initialization function |

initzero | Zero weight and bias initialization function |

isconfigured | Indicate if network inputs and outputs are configured |

layrecnet | Layer recurrent neural network |

learncon | Conscience bias learning function |

learngd | Gradient descent weight and bias learning function |

learngdm | Gradient descent with momentum weight and bias learning function |

learnh | Hebb weight learning rule |

learnhd | Hebb with decay weight learning rule |

learnis | Instar weight learning function |

learnk | Kohonen weight learning function |

learnlv1 | LVQ1 weight learning function |

learnlv2 | LVQ2.1 weight learning function |

learnos | Outstar weight learning function |

learnp | Perceptron weight and bias learning function |

learnpn | Normalized perceptron weight and bias learning function |

learnsom | Self-organizing map weight learning function |

learnsomb | Batch self-organizing map weight learning function |

learnwh | Widrow-Hoff weight/bias learning function |

linearlayer | Linear layer |

linkdist | Link distance function |

logsig | Log-sigmoid transfer function |

lvqnet | Learning vector quantization neural network |

lvqoutputs | LVQ outputs processing function |

mae | Mean absolute error performance function |

mandist | Manhattan distance weight function |

mapminmax | Process matrices by mapping row minimum and maximum values to [-1 1] |

mapstd | Process matrices by mapping each row's mean to 0 and deviation to 1

maxlinlr | Maximum learning rate for linear layer |

meanabs | Mean of absolute elements of matrix or matrices |

meansqr | Mean of squared elements of matrix or matrices |

midpoint | Midpoint weight initialization function |

minmax | Ranges of matrix rows |

mse | Mean squared normalized error performance function |

narnet | Nonlinear autoregressive neural network |

narxnet | Nonlinear autoregressive neural network with external input |

nctool | Neural network classification or clustering tool |

negdist | Negative distance weight function |

netinv | Inverse transfer function |

netprod | Product net input function |

netsum | Sum net input function |

network | Create custom neural network |

newgrnn | Design generalized regression neural network |

newlind | Design linear layer |

newpnn | Design probabilistic neural network |

newrb | Design radial basis network |

newrbe | Design exact radial basis network |

nftool | Neural network fitting tool |

nncell2mat | Combine neural network cell data into matrix |

nncorr | Cross-correlation between neural network time series

nndata | Create neural network data |

nndata2sim | Convert neural network data to Simulink time series |

nnsize | Number of neural data elements, samples, timesteps, and signals |

nnstart | Neural network getting started GUI |

nntraintool | Neural network training tool |

normc | Normalize columns of matrix |

normprod | Normalized dot product weight function |

normr | Normalize rows of matrix |

nprtool | Neural network pattern recognition tool |

ntstool | Neural network time series tool |

numelements | Number of elements in neural network data |

numfinite | Number of finite values in neural network data |

numnan | Number of NaN values in neural network data |

numsamples | Number of samples in neural network data |

numsignals | Number of signals in neural network data |

numtimesteps | Number of time steps in neural network data |

openloop | Convert neural network closed-loop feedback to open loop |

patternnet | Pattern recognition network |

perceptron | Perceptron |

perform | Calculate network performance |

plotconfusion | Plot classification confusion matrix |

plotep | Plot weight-bias position on error surface |

ploterrcorr | Plot autocorrelation of error time series |

ploterrhist | Plot error histogram |

plotes | Plot error surface of single-input neuron |

plotfit | Plot function fit |

plotinerrcorr | Plot input to error time-series cross-correlation |

plotpc | Plot classification line on perceptron vector plot |

plotperform | Plot network performance |

plotpv | Plot perceptron input/target vectors |

plotregression | Plot linear regression |

plotresponse | Plot dynamic network time series response |

plotroc | Plot receiver operating characteristic |

plotsomhits | Plot self-organizing map sample hits |

plotsomnc | Plot self-organizing map neighbor connections |

plotsomnd | Plot self-organizing map neighbor distances |

plotsomplanes | Plot self-organizing map weight planes |

plotsompos | Plot self-organizing map weight positions |

plotsomtop | Plot self-organizing map topology |

plottrainstate | Plot training state values |

plotv | Plot vectors as lines from origin |

plotvec | Plot vectors with different colors |

plotwb | Plot Hinton diagram of weight and bias values |

pnormc | Pseudonormalize columns of matrix |

poslin | Positive linear transfer function |

preparets | Prepare input and target time series data for network simulation or training |

processpca | Process columns of matrix with principal component analysis |

prune | Delete neural network inputs, layers, and outputs with sizes of zero

prunedata | Prune data for consistency with pruned network |

purelin | Linear transfer function |

quant | Discretize values as multiples of quantity |

radbas | Radial basis transfer function |

radbasn | Normalized radial basis transfer function |

randnc | Normalized column weight initialization function |

randnr | Normalized row weight initialization function |

rands | Symmetric random weight/bias initialization function |

randsmall | Small random weight/bias initialization function |

randtop | Random layer topology function |

regression | Linear regression |

removeconstantrows | Process matrices by removing rows with constant values |

removedelay | Remove delay from neural network's response

removerows | Process matrices by removing rows with specified indices |

roc | Receiver operating characteristic |

sae | Sum absolute error performance function |

satlin | Saturating linear transfer function |

satlins | Symmetric saturating linear transfer function |

scalprod | Scalar product weight function |

selforgmap | Self-organizing map |

separatewb | Separate biases and weight values from weight/bias vector |

seq2con | Convert sequential vectors to concurrent vectors |

setelements | Set neural network data elements |

setsamples | Set neural network data samples |

setsignals | Set neural network data signals |

setsiminit | Set neural network Simulink block initial conditions |

settimesteps | Set neural network data timesteps |

setwb | Set all network weight and bias values with single vector |

sim | Simulate neural network |

sim2nndata | Convert Simulink time series to neural network data |

softmax | Soft max transfer function |

srchbac | 1-D minimization using backtracking |

srchbre | 1-D interval location using Brent's method |

srchcha | 1-D minimization using Charalambous' method |

srchgol | 1-D minimization using golden section search |

srchhyb | 1-D minimization using a hybrid bisection-cubic search |

sse | Sum squared error performance function |

sumabs | Sum of absolute elements of matrix or matrices |

sumsqr | Sum of squared elements of matrix or matrices |

tansig | Hyperbolic tangent sigmoid transfer function |

tapdelay | Shift neural network time series data for tap delay |

timedelaynet | Time delay neural network |

tonndata | Convert data to standard neural network cell array form |

train | Train neural network |

trainb | Batch training with weight and bias learning rules |

trainbfg | BFGS quasi-Newton backpropagation |

trainbr | Bayesian regularization backpropagation |

trainbu | Batch unsupervised weight/bias training |

trainc | Cyclical order weight/bias training |

traincgb | Conjugate gradient backpropagation with Powell-Beale restarts |

traincgf | Conjugate gradient backpropagation with Fletcher-Reeves updates |

traincgp | Conjugate gradient backpropagation with Polak-Ribière updates

traingd | Gradient descent backpropagation |

traingda | Gradient descent with adaptive learning rate backpropagation |

traingdm | Gradient descent with momentum backpropagation |

traingdx | Gradient descent with momentum and adaptive learning rate backpropagation |

trainlm | Levenberg-Marquardt backpropagation |

trainoss | One-step secant backpropagation |

trainr | Random order incremental training with learning functions |

trainrp | Resilient backpropagation |

trainru | Unsupervised random order weight/bias training |

trains | Sequential order incremental training with learning functions |

trainscg | Scaled conjugate gradient backpropagation |

tribas | Triangular basis transfer function |

tritop | Triangle layer topology function |

unconfigure | Unconfigure network inputs and outputs |

vec2ind | Convert vectors to indices |

view | View neural network |
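
Several of the functions above (feedforwardnet, configure, train, sim, perform) compose into the toolbox's typical fit-and-evaluate workflow. A minimal sketch, assuming illustrative data x and t (not from the original documentation):

```matlab
% Illustrative workflow using functions from this reference.
% The data x and t here are made-up examples.
x = rand(2, 100);             % 2-element inputs, 100 samples
t = sum(x, 1);                % targets: sum of each input pair

net = feedforwardnet(10);     % feedforward network, 10 hidden neurons
net = configure(net, x, t);   % size inputs/outputs to match the data
net = train(net, x, t);       % train (default: trainlm backpropagation)

y = sim(net, x);              % simulate the trained network
perf = perform(net, t, y);    % evaluate (default performance: mse)
```

Other entries in the list slot into the same pattern: for example, a division function such as dividerand controls how train splits targets into training, validation, and test sets, and getwb/setwb expose the trained weights and biases as a single vector.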
