How can I do optimization in MATLAB?
Hi,
I have an equation given below
y = x - a*t
'x' and 't' are known values. I also have a field value 'z' that is known.
I want to find the value of 'a' in the equation such that the error between the obtained 'y' and the field value 'z' is minimized.
How can I do this in MATLAB? I am not sure which optimization technique in MATLAB can be used.
Thanks, Menaka
Answers (2)
José-Luis
14 Feb 2013
Edited: José-Luis
14 Feb 2013
Are you sure you asked the right question? If so, what you are asking is:
y - x = a*t; % find a and t
Since y and x are known, this can be written as
some_value = a*t;
This has an infinite number of solutions, or none at all if a or t is zero and some_value is not. Just pick any value for a and calculate t, or vice versa.
If what you meant was a linear regression, then:
coeff = polyfit(x,y,1);
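For instance, a minimal sketch of how polyfit could be applied here; the vectors xdata and ydata are hypothetical placeholders for your own data:
% hypothetical example data (replace with your own vectors)
xdata = [1 2 3 4 5]';
ydata = [2.1 3.9 6.2 8.1 9.8]';
% fit a straight line y = p(1)*x + p(2) in the least-squares sense
p = polyfit(xdata, ydata, 1);
% evaluate the fitted line so it can be compared against the data
yfit = polyval(p, xdata);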
Alan Weiss
14 Feb 2013
Let me try to guess what you are doing. You have a bunch of input data you call x, and a bunch of output data you call z. There is a scalar 'a' and another piece of input data t so that, approximately, for each index i,
z(i) = x(i) - a*t(i)
You want to find the value of 'a' that makes this equation as true as possible, for example by minimizing the sum of the squared differences.
If I am correct, rewrite your equation as
a*t = x - z,
where t, z, and x are column vectors. The least-squares solution is
a = t\(x - z)
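A minimal sketch of this backslash solution, using hypothetical column vectors x, t, and z generated purely for illustration (here z is built with a = 2 so the recovered value can be checked):
% hypothetical data standing in for the known inputs and field values
x = (1:10)';            % known input
t = linspace(0, 1, 10)'; % known input
z = x - 2*t;            % known field values (generated with a = 2 for illustration)
% least-squares estimate of the scalar a from a*t = x - z
a = t \ (x - z);
% fitted model and its residual against the field values
y = x - a*t;
err = norm(y - z);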
Alan Weiss
MATLAB mathematical toolbox documentation