Community Profile


Greg Heath


Last seen: Today
3,124 total contributions since 2011

Background in Electromagnetic Theory, Plasma Physics and Radar Target Identification using Neural Networks.
PhD Student, Research Assistant and Lecturer at Stanford;
AB, ScB, ScM Student; Research Assistant, Fellow and Professor at Brown;
27 yrs researching Ballistic and Theatre Missile Defense using Neural Networks at MIT Lincoln Laboratory. Retired 2003.

PLEASE DO NOT SEND QUESTIONS AND DATA TO MY EMAIL. HOWEVER, YOU CAN SEND LINKS TO POSTS.
Professional Interests: Neural Networks, Spectral Analysis


Greg Heath's Badges

  • First Review
  • 36 Month Streak
  • Thankful Level 4
  • Ace
  • Revival Level 4
  • Knowledgeable Level 4
  • First Answer


Contributions

Answered
simulink neural network producing different outputs to workspace
A simpler solution is to ALWAYS begin the program with a resetting of the random number generator. For example, choose your favo...

about 16 hours ago | 0
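
A minimal sketch of that reset, assuming a generic shallow-net script (fitnet and simplefit_dataset are stand-ins for the actual model and data); seeding the generator first makes the random weight initialization and data division repeatable:

  rng(0)                        % or rng('default'); do this before anything random
  [x, t] = simplefit_dataset;   % placeholder data shipped with the toolbox
  net = fitnet(10);
  [net, tr] = train(net, x, t); % repeated runs now reproduce the same result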

Answered
How to avoid getting negative values when training a neural network?
Use a sigmoid for the output layer. Hope this helps. THANK YOU FOR FORMALLY ACCEPTING MY ANSWER GREG

7 days ago | 0
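
A hedged sketch of that change on a generic fitnet (x and t are placeholders); the default mapminmax output processing is removed here, so the targets are assumed to lie in [0,1] already:

  net = fitnet(10);
  net.layers{2}.transferFcn = 'logsig';   % output layer squashed into (0,1)
  net.outputs{2}.processFcns = {};        % assume targets are already in [0,1]
  net = train(net, x, t);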

Answered
weird plotregression plots for a 10% of my fitnet neural networks
Sometimes training gets into a parameter space rut. That is why it is wise to train multiple models. Hope this helps. Greg ...

8 days ago | 0
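
A sketch of that multiple-design idea, assuming x/t data and a fitnet; each trial starts from fresh random weights and the best validation result is kept (Ntrials and H = 10 are arbitrary example values):

  Ntrials  = 10;
  bestperf = Inf;
  for k = 1:Ntrials
      net = fitnet(10);                 % fresh random initialization each trial
      [net, tr] = train(net, x, t);
      if tr.best_vperf < bestperf       % rank designs by validation performance
          bestperf = tr.best_vperf;
          bestnet  = net;
      end
  end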

Answered
What is the purpose of shuffling the validation set?
To impose and verify a consistent GENERALIZED path to convergence by avoiding repetitive anomalies. Hope this helps. Greg

8 days ago | 0

Answered
How to force overfiting of Deep Learning Network for Classification
OVERFITTING = More training unknowns (e.g., weights) than training vectors. OVERTRAINING1 = Training an overfit network to...

8 days ago | 0
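
The counting behind that definition, sketched for a single-hidden-layer fitnet; I, H, O and the default 70/15/15 data split are the assumed quantities:

  [I, N] = size(x);                     % inputs x: I variables, N examples
  [O, ~] = size(t);
  Ntrn   = N - 2*round(0.15*N);         % training examples under the default split
  Ntrneq = Ntrn*O;                      % number of training equations
  H      = 10;                          % hidden nodes
  Nw     = (I+1)*H + (H+1)*O;           % unknown weights and biases
  overfit = Nw > Ntrneq;                % true ==> overfitting in the above sense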

Answered
Neural Network Pattern Recognition
Targets are 1-dimensional unit vectors with 4 zeros and a single 1. Thank you for formally accepting my answer. Greg

8 days ago | 0

| Accepted
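
A sketch of building such target columns from class indices with ind2vec (the index vector ci is made up for illustration):

  ci = [3 1 5 2 4 1];       % class indices 1..5
  t  = full(ind2vec(ci));   % 5-by-6 matrix: each column has four 0s and a single 1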

Answered
semanticseg producing marginally different values when inference is repeated
Use clear all, close all, clc, rng(0) on the 1st line. Thank you for formally accepting my answer. Greg

14 days ago | 0

Answered
combining two neural networks in one bigger network
Just: 1. Save the outputs of net1 in a file. 2. Use the file to train net2. Greg

14 days ago | 0
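
A sketch of that two-stage cascade with placeholder names net1, net2, x, t:

  y1 = net1(x);                 % step 1: outputs of the first trained network
  save net1_outputs.mat y1      % keep them in a file, as suggested
  net2 = fitnet(10);
  net2 = train(net2, y1, t);    % step 2: second network trained on those outputs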

Answered
Problem with the TreeBagger Command
The sizes of the input and output target matrices must be [ I N ] = size(input), [ O N ] = size(target). Hope this helps...

25 days ago | 0

Answered
How do I create a neural network that will give multiple input and outputs?
ALWAYS arrange your data so that [ I N ] = size(input) [ O N ] = size(output) Hope this helps. Greg

25 days ago | 0
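
A quick check of that arrangement (x and t are placeholder matrices); in the shallow-network toolbox every column is one example, so multiple inputs and outputs are simply extra rows:

  [I, Nx] = size(x);      % I = number of input variables,  Nx = examples
  [O, Nt] = size(t);      % O = number of output variables, Nt = examples
  assert(Nx == Nt, 'inputs and targets need the same number of columns')
  net = train(fitnet(10), x, t);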

Answered
How to make a hybrid model (LSTM and Ensemble) in MATLAB
Replace your YES/NO data with either 1/0 or 1/-1. Hope this helps. Greg

25 days ago | 0
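
A sketch of that recoding, assuming the labels arrive as a cell array of 'YES'/'NO' strings:

  labels = {'YES','NO','NO','YES'};         % example labels
  t01  = double(strcmp(labels, 'YES'));     % 1/0 coding
  tpm1 = 2*t01 - 1;                         % 1/-1 coding, if preferred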

Answered
how to augment image data only for a specific class?
Separate class 0 and interpolate. If you have a good feel for the data you could extrapolate. However the latter might be tricky...

25 days ago | 0

Answered
NTSTOOL - How to get predicted values of "the future"?
The known time series is analyzed to yield a time-series model that uses past and present values to predict future values. The ...

4 months ago | 0
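
The usual open-loop design / closed-loop prediction pattern that ntstool generates, sketched on the shipped simplenar_dataset (the 20-step horizon is an arbitrary example):

  T = simplenar_dataset;                      % known series, cell array
  net = narnet(1:2, 10);                      % feedback delays 1:2, 10 hidden nodes
  [Xs, Xi, Ai, Ts] = preparets(net, {}, {}, T);
  net = train(net, Xs, Ts, Xi, Ai);           % open-loop design on known data
  [Y, Xf, Af] = net(Xs, Xi, Ai);              % simulate to get final delay states
  [netc, Xic, Aic] = closeloop(net, Xf, Af);  % close the loop, primed at series end
  yfuture = netc(cell(0,20), Xic, Aic);       % 20 steps beyond the known series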

Answered
Timedelaynet output calculation principle
You did not include the 2 biases. Hope this helps. Greg THANK YOU FOR FORMALLY ACCEPTING MY ANSWER

5 months ago | 0
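
For the static one-hidden-layer analogue, and with the default input/output processing removed so the stored weights apply to the raw data, the hand calculation with both bias vectors looks like this sketch (for a timedelaynet the hidden-layer sum additionally runs over the tapped input delays, but the two biases enter the same way):

  net = fitnet(10);
  net.inputs{1}.processFcns  = {};   % drop default mapminmax for a clean comparison
  net.outputs{2}.processFcns = {};
  net = train(net, x, t);
  yhand  = net.LW{2,1}*tansig(net.IW{1,1}*x + net.b{1}) + net.b{2};  % b{1} AND b{2}
  ycheck = net(x);                   % should agree with yhand to rounding error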

Answered
Understand number of weights of Neural Network
It is possible. In general, however, you don't have the slightest idea what choice would be significantly better than random. ...

5 months ago | 0

Answered
Using pca for features selections
PCA (Principal Component Analysis) is a very useful method for regression (it ranks linear combinations of the original variabl...

5 months ago | 0

Answered
How do we decide the number of hiddenlayers in a PatternNet?
patternnet(10) indicates ONE HIDDEN LAYER WITH TEN NODES. It is important to be mindful of the number of layers and nodes. The ...

6 months ago | 0

| Accepted
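
For reference, the constructor argument is a row vector of hidden layer sizes:

  net1 = patternnet(10);       % ONE hidden layer with 10 nodes
  net2 = patternnet([10 5]);   % TWO hidden layers: 10 nodes, then 5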

Answered
Why sets Matlab automatically the activation functions for a neural network like this?
The simplest useful approximation is a series of blocks with different heights and widths. The simplest useful DIFFERENTIAB...

6 months ago | 0

Answered
Artificial Neural Networks Hidden Layers
Number of input and output nodes is determined by the data. Number of hidden layers and nodes is determined by the program auth...

6 months ago | 0

| Accepted

Answered
Neural Network Classification Results
The original class sizes are unequal. Hope this helps. THANK YOU FOR FORMALLY ACCEPTING MY ANSWER Greg

6 months ago | 0

Answered
Please help with narnext error Subscripted assignment dimension mismatch.????
x5=data_inputs(5,1:17); x6=data_inputs(6,1:17); x7=data_inputs(7,1:17); x8=data_inputs(8,1:17); x9=data_inputs(9,1:17); Hop...

6 months ago | 0

Answered
Hyperparameter tuning of neural network
One hidden layer is always sufficient. However, sometimes 1. Knowledge of the physical or mathematical process may lead to a ...

6 months ago | 1
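
A compact sketch of that brute-force search over the hidden layer size, assuming x/t data; Hcand and Ntrials are made-up example values and designs are ranked by validation performance:

  Hcand   = 2:2:20;                 % candidate hidden layer sizes
  Ntrials = 5;                      % random weight initializations per candidate
  best    = struct('perf', Inf);
  for H = Hcand
      for k = 1:Ntrials
          net = fitnet(H);
          [net, tr] = train(net, x, t);
          if tr.best_vperf < best.perf
              best = struct('perf', tr.best_vperf, 'H', H, 'net', net);
          end
      end
  end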

Answered
Different results in training a CNN with Matlab 2018a and Matlab 2019a
You are making the task difficult by going backwards. Start with a single hidden node and add nodes one at a time. Hope this...

6 months ago | 0

Question


NEURAL NETWORK DATA SET EXAMPLES
For demonstration of old AND new concepts and ideas, PLEASE use the sample NN data sets provided by MATLAB help nndatas...

6 months ago | 0 answers | 0
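
A sketch of pulling one of those shipped sets into a runnable example (simplefit_dataset is just one of the sets that help nndatasets lists):

  help nndatasets               % lists every sample data set shipped with the toolbox
  [x, t] = simplefit_dataset;   % small curve-fitting example
  net = train(fitnet(10), x, t);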


Answered
cross validation in neural network using K-fold
%i am using neural network for classification but i need to use instead of holdout option , K-fold. ==> FALSE!. You mean y...

6 months ago | 0
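
A hedged sketch of one way to do it with cvpartition and patternnet (x is I-by-N, t is one-hot O-by-N; 10 folds is an example choice):

  N  = size(x, 2);
  cv = cvpartition(N, 'KFold', 10);
  foldErr = zeros(1, cv.NumTestSets);
  for k = 1:cv.NumTestSets
      trn = training(cv, k);                % logical index of training columns
      tst = test(cv, k);
      net = patternnet(10);
      net = train(net, x(:,trn), t(:,trn));
      y   = net(x(:,tst));
      foldErr(k) = mean(vec2ind(y) ~= vec2ind(t(:,tst)));   % misclassification rate
  end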

Answered
Can the number of Predictors be different for Train and Test data?
Of course not. The ultimate purpose of training is to create a model that works well on non-training data. Thank you for form...

6 months ago | 0

Answered
How to check the robustness of the Neural network model?
If you are going to test with white noise, include white noise in your design (i.e., training + validation). Then, given a fixed...

7 months ago | 0

| Accepted
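
A sketch of that idea: corrupt copies of the design inputs with white noise and train on the augmented set (sigma is an assumed noise level):

  sigma  = 0.05;                           % assumed noise standard deviation
  xnoisy = x + sigma*randn(size(x));       % white-noise-corrupted copies
  net = fitnet(10);
  net = train(net, [x, xnoisy], [t, t]);   % targets repeat for the noisy copies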

Answered
NARX with Complex Values Input
Decades ago I learned (the hard way) to forget about trying to use complex computations for NNs. However, if you insist, let us...

7 months ago | 1
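
The usual real-valued workaround, sketched for an assumed complex input matrix z: stack the real and imaginary parts as separate rows and design the NARX net on those.

  xreal = [real(z); imag(z)];      % 2*I real rows replace I complex rows
  net   = narxnet(1:2, 1:2, 10);   % then proceed with preparets/train as usual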

Answered
Why sets Matlab automatically the activation functions for a neural network like this?
That is a standard configuration for a neural net. Its operation is explained in every elementary text. Thank you for formally...

7 months ago | 0

Answered
I get a "Performance function replaced with squared error performance" warning when trying to set 'crossentropy' as the performance function.
If you insist on using CROSSENTROPY, try PATTERNNET. Hope this helps. Thank you for formally accepting my answer Greg

7 months ago | 0
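
A sketch of that switch (x and t are placeholders, with t in one-hot form):

  net = patternnet(10);              % defaults to trainscg, which accepts crossentropy
  net.performFcn = 'crossentropy';
  net = train(net, x, t);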
