Tuning hyperparameters using Bayesian optimisation
Views: 19 (last 30 days)
Hi all,
I have configured an artificial neural network using 'fitnet' and need to tune the hyperparameters within the network since it is performing rather poorly. I am still rather new to MATLAB and this is all new to me. Any help with understanding the Bayesian optimisation process in basic terms would be greatly appreciated. I am potentially looking to use the 'bayesopt' function but don't understand how it works.
Answers (1)
Katja Mogalle
6 May 2024
Hi Luke,
In simple terms, Bayesian optimization is an algorithm that helps you choose the best hyperparameters, i.e. the values that define the structure or training circumstances of a neural network. You typically define a set of values for the algorithm to explore (e.g., network size, activation functions, training parameters), and Bayesian optimization takes care of figuring out which combinations of parameters to try in order to reach an optimal outcome (the objective function is typically the cross-validation loss).
Using the bayesopt function is one approach (it is the most flexible, but perhaps also the most difficult to use). Depending on the complexity and size of your task, though, I can propose two directions that might be easier for getting started:
A) If you think a smaller, less complex network is able to solve the task, you can try out fitcnet (for classification) or fitrnet (for regression) from the Statistics and Machine Learning Toolbox. Both functions have built-in support for Bayesian hyperparameter optimization. For example, by using the OptimizeHyperparameters setting, the software will attempt to minimize the cross-validation loss (error) by varying parameters such as the activation functions, layer sizes, preprocessing options, and more. These examples might help you get started:
- Improve Neural Network Classifier Using OptimizeHyperparameters
- Customize Neural Network Classifier Optimization
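To make approach A concrete, here is a minimal sketch. It assumes your data is in a table (here called Tbl, an illustrative name) with the class labels stored in a variable named "Y":

```matlab
% Minimal sketch of approach A. Assumed data: a table Tbl of
% predictors with the categorical response in the variable "Y".
rng("default")  % for reproducibility of the optimization run

Mdl = fitcnet(Tbl, "Y", ...
    "OptimizeHyperparameters", "auto", ...      % tune activations, layer sizes, lambda, standardization
    "HyperparameterOptimizationOptions", ...
    struct("Optimizer", "bayesopt", ...         % Bayesian optimization (the default optimizer)
           "MaxObjectiveEvaluations", 30));     % number of hyperparameter combinations to try
```

The software trains one network per objective evaluation and reports the combination with the lowest estimated cross-validation loss; the returned model is then ready to use for prediction.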
B) Alternatively, if you need a bigger, more flexible network architecture, or you are working with image data, it might be better to start in the Deep Learning area. Here I can recommend the route via the Experiment Manager app.
Or if, in the end, you need more control over the Bayesian optimization settings, here is an example using the bayesopt function: https://www.mathworks.com/help/deeplearning/ug/deep-learning-using-bayesian-optimization.html
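Since you mentioned fitnet specifically, here is a hypothetical sketch of how bayesopt could tune it directly. The variable names (X, T, the search ranges, and the helper fitnetLoss) are illustrative assumptions, not part of any shipped example:

```matlab
% Hypothetical sketch: tuning the hidden layer size and training
% function of a fitnet network with bayesopt. X (inputs) and T
% (targets) are assumed to already exist in the workspace.
vars = [optimizableVariable("hiddenSize", [5 50], "Type", "integer")
        optimizableVariable("trainFcn", {'trainlm','trainscg','trainbr'}, "Type", "categorical")];

objFcn  = @(p) fitnetLoss(p, X, T);              % p is a 1-row table of candidate values
results = bayesopt(objFcn, vars, "MaxObjectiveEvaluations", 20);
best    = bestPoint(results)                      % best hyperparameter combination found

function loss = fitnetLoss(p, X, T)
    % Train a network with the candidate hyperparameters and return
    % its performance on a held-out validation split.
    net = fitnet(p.hiddenSize, char(p.trainFcn));
    net.trainParam.showWindow  = false;           % suppress the training GUI
    net.divideParam.trainRatio = 0.8;             % simple hold-out split
    net.divideParam.valRatio   = 0.2;
    net.divideParam.testRatio  = 0;
    [net, tr] = train(net, X, T);
    loss = tr.best_vperf;                         % best validation performance
end
```

The key idea is that bayesopt only needs a function that maps a table of candidate hyperparameter values to a single loss number; everything about how the network is built and trained stays under your control.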
Hope this helps.