Hello, the problem is as follows: Minimize R subject to: (x - a_i)^2 + (y - b_i)^2 ≤ R^2
I am looking to find x, y, and R, knowing that
  • a_i and b_i are known values from two 100×1 matrices.
  • x and y lie within the min and max of the a_i and b_i.
Is it possible to find a solution with the optimization tools of MATLAB? If not, any suggestions for a solution?

Accepted Answer

Alan Weiss, 13 September 2018

0 votes

I haven't tried this, but it sounds straightforward.
Decision variables: x(1) = x, x(2) = y, x(3) = R.
100 nonlinear inequality constraints: (x(1) - a(i))^2 + (x(2) - b(i))^2 - x(3)^2 <= 0
Objective function: x(3) (that is, R)
Bounds: ll = min(min([a,b])), mm = max(max([a,b]))
lb = [ll, ll, 0], ub = [mm, mm, Inf]
Call fmincon from a reasonable start point, such as [(ll+mm)/2,(ll+mm)/2,abs(mm)]
Alan Weiss
MATLAB mathematical toolbox documentation
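For readers without the Optimization Toolbox, the same formulation can be sketched with SciPy's `minimize` (which picks SLSQP when constraints are present). Everything below follows the recipe in this answer — the bounds, the vectorized inequality constraints, the start point — but the `a`, `b` vectors are made-up stand-ins for the real 100×1 data:

```python
import numpy as np
from scipy.optimize import minimize

# Made-up stand-ins for the known 100x1 vectors a_i, b_i.
rng = np.random.default_rng(0)
a = rng.uniform(-5, 5, 100)
b = rng.uniform(-5, 5, 100)

# Decision variables z = [x, y, R]; the objective is R itself.
objective = lambda z: z[2]

# One inequality per point, written as g(z) >= 0 in SciPy's convention:
#   R^2 - (x - a_i)^2 - (y - b_i)^2 >= 0
cons = {"type": "ineq",
        "fun": lambda z: z[2]**2 - (z[0] - a)**2 - (z[1] - b)**2}

ll, mm = min(a.min(), b.min()), max(a.max(), b.max())
bounds = [(ll, mm), (ll, mm), (0, None)]       # lb = [ll,ll,0], ub = [mm,mm,Inf]
z0 = [(ll + mm) / 2, (ll + mm) / 2, abs(mm)]   # a reasonable start point

res = minimize(objective, z0, bounds=bounds, constraints=cons)
x, y, R = res.x
```

At the optimum, every point lies inside the circle of radius R around (x, y), and at least one point sits on its boundary.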

More Answers (2)

Bruno Luong, 13 September 2018
Edited: 13 September 2018

1 vote

Here is a solution:
R = -Inf
x = anything between min(a_i, b_i) and max(a_i, b_i)
y = anything between min(a_i, b_i) and max(a_i, b_i)
(As stated, the constraint only involves R^2, so the problem is unbounded below unless you also require R ≥ 0.)
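This answer is tongue-in-cheek but makes a real point: since only R^2 appears in the constraints, any sufficiently negative R is feasible, which is why the accepted answer's lower bound of 0 on R matters. A quick check with made-up stand-in data:

```python
import numpy as np

# Made-up stand-ins for the 100x1 data vectors.
rng = np.random.default_rng(0)
a = rng.uniform(-5, 5, 100)
b = rng.uniform(-5, 5, 100)

x, y = a.mean(), b.mean()   # any point near the data works
R = -1e6                    # hugely negative, yet "feasible"

# Every constraint (x - a_i)^2 + (y - b_i)^2 - R^2 <= 0 holds, because
# R^2 is enormous; the objective R can be driven to -inf this way.
feasible = bool(np.all((x - a)**2 + (y - b)**2 - R**2 <= 0))
print(feasible)   # True
```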
Matt J, 13 September 2018
Edited: 13 September 2018

0 votes

You could probably use minboundcircle in this FEX distribution.
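If installing a File Exchange package isn't an option, the exact smallest enclosing circle can also be computed directly: it is always determined by two or three of the points, so for modest n one can enumerate pair-diameter circles and triple circumcircles and keep the smallest one that contains all points. A self-contained sketch (O(n^4), fine for a few dozen points; minboundcircle or Welzl's algorithm is the right tool at scale):

```python
import itertools
import math

def min_enclosing_circle(pts, eps=1e-9):
    """Exact smallest circle containing pts, by brute-force enumeration."""
    def covers(cx, cy, r):
        return all(math.hypot(px - cx, py - cy) <= r + eps for px, py in pts)

    best = None  # (r, cx, cy)
    # Candidate circles whose diameter is a pair of points.
    for (ax, ay), (bx, by) in itertools.combinations(pts, 2):
        cx, cy = (ax + bx) / 2, (ay + by) / 2
        r = math.hypot(ax - cx, ay - cy)
        if covers(cx, cy, r) and (best is None or r < best[0]):
            best = (r, cx, cy)
    # Candidate circumcircles of triples of points.
    for (ax, ay), (bx, by), (cx, cy) in itertools.combinations(pts, 3):
        d = 2 * (ax * (by - cy) + bx * (cy - ay) + cx * (ay - by))
        if abs(d) < eps:
            continue  # (nearly) collinear: no circumcircle
        ux = ((ax**2 + ay**2) * (by - cy) + (bx**2 + by**2) * (cy - ay)
              + (cx**2 + cy**2) * (ay - by)) / d
        uy = ((ax**2 + ay**2) * (cx - bx) + (bx**2 + by**2) * (ax - cx)
              + (cx**2 + cy**2) * (bx - ax)) / d
        r = math.hypot(ax - ux, ay - uy)
        if covers(ux, uy, r) and (best is None or r < best[0]):
            best = (r, ux, uy)
    return best  # (R, x, y)

# Three points on the unit circle plus an interior point: R should be 1.
R, x, y = min_enclosing_circle([(1, 0), (-1, 0), (0, 1), (0.2, 0.1)])
```

Unlike the fmincon formulation, this needs no start point and cannot stall at a local optimum, at the cost of much worse scaling in n.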

Release: R2015b

Asked: 13 September 2018
Edited: 13 September 2018
