Problems with MATLAB Database Explorer and PostgreSQL (JDBC)
I have created a table in PostgreSQL with ~1500 columns. However, as soon as I try to insert data (using fastinsert) via JDBC, I get a few Java errors. I guess these indicate some memory overflow in the PostgreSQL JDBC connector?
    at org.postgresql.jdbc2.AbstractJdbc2Statement$BatchResultHandler.handleError(AbstractJdbc2Statement.java:2762)
    at org.postgresql.core.v3.QueryExecutorImpl$ErrorTrackingResultHandler.handleError(QueryExecutorImpl.java:362)
    at org.postgresql.core.v3.QueryExecutorImpl.processResults(QueryExecutorImpl.java:1999)
    at org.postgresql.core.v3.QueryExecutorImpl.flushIfDeadlockRisk(QueryExecutorImpl.java:1180)
    at org.postgresql.core.v3.QueryExecutorImpl.sendQuery(QueryExecutorImpl.java:1201)
    at org.postgresql.core.v3.QueryExecutorImpl.execute(QueryExecutorImpl.java:412)
    at org.postgresql.jdbc2.AbstractJdbc2Statement.executeBatch(AbstractJdbc2Statement.java:2929)
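For reference, here is a minimal sketch of the kind of call that fails for me. The connection parameters, table name (wide_table), and column names are placeholders, not my real setup; the stack trace above points at the JDBC batch execution (AbstractJdbc2Statement.executeBatch) that fastinsert drives under the hood:

    % Placeholder connection; my real table has ~1500 columns.
    conn = database('mydb', 'user', 'password', ...
        'org.postgresql.Driver', 'jdbc:postgresql://localhost:5432/mydb');

    ncols    = 1500;
    colnames = arrayfun(@(k) sprintf('c%d', k), 1:ncols, 'UniformOutput', false);
    row      = num2cell(rand(1, ncols));   % one row of real numbers

    % This is the call that throws the batch error shown above.
    fastinsert(conn, 'wide_table', colnames, row);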
I am using the same connection for other database operations, where it works perfectly fine. While looking for the problem, I discovered that the insert works fine when only NaNs are added to the table, but as soon as there are more than ~800 real numbers instead of NaN, the insert crashes. I also checked the data types in MATLAB and PostgreSQL, so there should be no reason for crashes there.
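A sketch of that experiment, reusing the placeholder names from the snippet above (the ~800 threshold is just what I observed, not a documented driver limit):

    % All-NaN row: this insert goes through without errors.
    allNaN = num2cell(nan(1, ncols));
    fastinsert(conn, 'wide_table', colnames, allNaN);

    % Same row shape, but with more than ~800 real values: this crashes.
    mixed = nan(1, ncols);
    mixed(1:810) = rand(1, 810);
    fastinsert(conn, 'wide_table', colnames, num2cell(mixed));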
The database can handle the data when I import CSV files, so I am a bit lost right now about where to find my problem or what I can do.
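For comparison, the CSV route that works can even be driven over the same connection. A sketch, assuming a placeholder file path that is readable by the PostgreSQL server process and sufficient privileges for a server-side COPY (psql's \copy would be the client-side alternative):

    % Server-side COPY sketch; path and table name are placeholders.
    curs = exec(conn, ['COPY wide_table FROM ''/tmp/wide_table.csv'' ' ...
        'WITH (FORMAT csv, HEADER true)']);
    close(curs);
    close(conn);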
0 Comments
Answers (0)