Deep learning memory issues with BiLSTM and MAT files
9 views (last 30 days)
Hi,
I have a huge dataset of over 20 GB of numerical time-series data stored in MAT files as shown below:
x1_1 = 23x921600 double
...
I have about 250 of these MAT files. The label (target) for each is a single-row categorical array of 0s and 1s:
y1_1 = 1x921600 categorical
...
I loaded 5 of each file and was able to train and classify them using a BiLSTM network with 3 hidden layers of over 300 neurons, but now I want to train on all of them. My setup is very similar to this example; I also tried this, which was no help. I know that I should use a datastore of some kind, but I tried most of them (file and tall datastores) and couldn't solve the memory issue. Any ideas how to solve this? Thank you.
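To make the question concrete, below is roughly the kind of out-of-memory pipeline I am aiming for. The folder path and variable handling are placeholders, it assumes each MAT file holds both the x and y variables, and depending on the release trainNetwork may require a custom mini-batch datastore for sequence data instead of a transformed datastore.
% Rough sketch with placeholder folder name: read one MAT file per call and
% hand trainNetwork a {predictors, responses} pair, so the 250 files never
% have to sit in memory at the same time.
fds = fileDatastore('C:\mydata', ...           % placeholder folder
    'ReadFcn', @load, ...                      % each read returns the file's variables as a struct
    'FileExtensions', '.mat');

% Convert each struct into the two-column cell format trainNetwork expects:
% column 1 = 23x921600 double sequence, column 2 = 1x921600 categorical labels.
tds = transform(fds, @toSequencePair);

function out = toSequencePair(s)
    vars  = struct2cell(s);                    % assumes each file holds exactly one
    isCat = cellfun(@iscategorical, vars);     % double matrix and one categorical vector
    out   = {vars{~isCat}, vars{isCat}};
end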
0 comments
Answers (1)
Divya Gaddipati
19 Jul 2019
Hi,
You could use a custom mini-batch datastore, which uses a sequenceDatastore class that reads the data from the specified folder and obtains the labels from the subfolder names.
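For illustration, a minimal sketch of how that might be used, assuming the sequenceDatastore class from that example is saved on the path and the MAT files are organized into one subfolder per class (folder name, mini-batch size, and layer sizes below are placeholders). Note that the example assigns one label per sequence from the subfolder name, so per-time-step labels such as y1_1 would require adapting the datastore's read method.
% Placeholder layout: dataTrain/<className>/*.mat, with sequenceDatastore.m on the path
dsTrain = sequenceDatastore('dataTrain');
dsTrain.MiniBatchSize = 16;

layers = [ ...
    sequenceInputLayer(23)                        % 23 features per time step
    bilstmLayer(300,'OutputMode','sequence')
    bilstmLayer(300,'OutputMode','sequence')
    bilstmLayer(300,'OutputMode','last')          % one label per sequence, as in the example
    fullyConnectedLayer(2)                        % two classes (0 and 1)
    softmaxLayer
    classificationLayer];

options = trainingOptions('adam', ...
    'MiniBatchSize',dsTrain.MiniBatchSize, ...
    'Shuffle','never');                           % the custom datastore reads files in order

net = trainNetwork(dsTrain,layers,options);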