Using datastore and tall arrays for files that are too big for memory

1 view (last 30 days)
Aaron Smith on 3 Jul 2017
Commented: Jeremy Hughes on 7 Jul 2017
I have been looking at using datastore on files that have proven too large for MATLAB's memory. I was using fopen, textscan, fgetl, sscanf, and dlmread, and a recurring problem with each of these commands is memory. My data is saved in two formats: one extremely large file containing 200 individual matrices, and the same data split into 200 separate files. When looking at creating tall arrays to fix the memory issues, I ran into a problem: which datastore command should I use? The individual files could be added to a datastore with fileDatastore, and the bulk file could use tabularTextDatastore, but no matter which method I use, I get the same error:
Undefined function 'datastore' for input arguments of type 'char'.
I have tried multiple ways of specifying the file/folder. I have used uigetfile and uigetdir to select the file, and I have also stated the path explicitly, either by setting the current folder and naming the file or by giving the full 'C:\Users\c13459232\Documents....' path. They all return the same error. Is there something I am missing here?
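For reference, on a release that supports these features (datastore first appeared in R2014b, and tall arrays in R2016b, as noted in the comments below), the bulk delimited file could be wrapped in a tabularTextDatastore and read lazily as a tall table, while the 200 individual files could go through fileDatastore with a custom read function. A minimal sketch, in which 'bigfile.txt', 'C:\data\*.txt', and the Var1 column name are placeholders rather than details taken from the actual data:

% Requires R2016b or later for tall; datastore itself needs R2014b or later.
% 'bigfile.txt' is a placeholder for the large delimited file; with no header
% row, the first column is auto-named Var1.
ds = tabularTextDatastore('bigfile.txt');
tt = tall(ds);              % tall table backed by the datastore, evaluated lazily
m  = gather(mean(tt.Var1)); % example: mean of the first column, computed out of memory

% For the 200 individual files, fileDatastore (R2016a or later) with a custom
% reader is an alternative; 'C:\data\*.txt' is a placeholder location.
fds = fileDatastore('C:\data\*.txt', 'ReadFcn', @(f) dlmread(f));

With tall arrays, nothing is actually read until gather (or a write operation) triggers evaluation, so operations can be chained without exhausting memory.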
3 Comments
Aaron Smith on 4 Jul 2017
Edited: Aaron Smith on 4 Jul 2017
R2012a. This means that tall arrays and datastore are not available to me, yes?
Jeremy Hughes on 7 Jul 2017
Yes, tall arrays first became available in R2016b.
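
As a side note, since R2012a predates both datastore and tall, one workaround that stays within the commands already mentioned in the question (fopen and textscan) is to read the large file in fixed-size blocks and process each block before reading the next. A rough sketch, assuming a numeric comma-delimited file; 'bigfile.txt', the delimiter, and the column count are assumptions, not taken from the actual data:

% Runs in R2012a: stream the file in blocks rather than loading it whole.
nCols     = 10;                          % assumed number of columns
blockSize = 1e5;                         % rows read per block
fid = fopen('bigfile.txt', 'r');
while ~feof(fid)
    block = textscan(fid, repmat('%f', 1, nCols), blockSize, ...
                     'Delimiter', ',', 'CollectOutput', true);
    data = block{1};                     % up to blockSize-by-nCols numeric matrix
    % ... process 'data' here, keeping only reduced results in memory ...
end
fclose(fid);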


Answers (0)
