Mean of selected range of a matrix based on a range of values from another matrix
Hello everyone,
I have a mat file (attached) containing 4 variables: month, sa, ta, and sig. I want to compute the mean and standard deviation of sa and ta for each month, but only over a specific range of sig (say, sig between 27.4 and 27.5).
So, the intended output should be like this:
Thank you!
2 Comments
Shivam Gothi
10 Oct 2024
What I understand is that you want to find the mean and standard deviation of only those values of "ta" and "sa" for which "sig" is within the range 27.4 - 27.5, and that the "sig_range" is different for different months.
Is my understanding of the question correct?
Accepted Answer
Voss
10 Oct 2024
load('data_my.mat')                    % loads month, sa, ta, sig
T = table(month,sa,ta,sig);
% restrict to rows where sig is in the range 27.4 to 27.5
idx = sig >= 27.4 & sig < 27.5;
G = groupsummary(T(idx,:),'month',{'mean','std'},{'sa','ta'})
% for reference: the same statistics over all sig
G = groupsummary(T,'month',{'mean','std'},{'sa','ta'})
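If you prefer not to use tables and groupsummary, the same per-month statistics can be computed with plain logical indexing. This is a sketch under the assumption that month holds integer month numbers (e.g. 1 to 12) and that sa, ta, and sig are vectors of the same length:

```matlab
% Alternative without groupsummary: loop over months, using logical
% indexing to keep only rows where sig falls in the requested range.
load('data_my.mat')                   % assumed to load month, sa, ta, sig
inRange = sig >= 27.4 & sig < 27.5;   % the sig range of interest
months  = unique(month);
stats   = zeros(numel(months),5);     % columns: month, mean_sa, std_sa, mean_ta, std_ta
for k = 1:numel(months)
    m = inRange & month == months(k);
    % use mean(sa(m),'omitnan') etc. if the data may contain NaNs
    stats(k,:) = [months(k), mean(sa(m)), std(sa(m)), mean(ta(m)), std(ta(m))];
end
stats
```

Each row of stats then corresponds to one month, restricted to the chosen sig range, matching the first groupsummary call above.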
4 Comments
More Answers (0)