Putting symbolic array into matrix

Views: 1 (last 30 days)
Ramses Young on 29 Apr 2022
Answered: Walter Roberson on 29 Apr 2022
Hello,
I am trying to take a symbolic variable and subtract it from every value in an array (of some 1xn size), and then solve for this unknown variable later, but I can't seem to get the symbolic array to nest correctly.
Here is an example of what I am trying to do:
syms q0
Vy=[300 500 300]
q(1,:)=[q0-(Vy.*2.*4./8)]
After this, q(i,:) will be assigned inside a for loop that uses this symbolic variable.
I keep getting an error, and I don't know how to modify this in a way where I can still solve for q0 later on.
Thanks for the help!!!

Accepted Answer

Walter Roberson on 29 Apr 2022
That code works in the form you posted.
syms q0
Vy=[300 500 300]
Vy = 1×3
   300   500   300
q(1,:)=[q0-(Vy.*2.*4./8)]
q =
(q0 - 300, q0 - 500, q0 - 300)
The most common mistake for something like that would have been to initialize q using zeros(), such as
q = zeros(5,3);
If you had initialized q to double precision, then assigning into q(1,:) would fail, because the right-hand side contains a symbolic variable and cannot be converted to double.
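For illustration, here is a minimal sketch of that failure mode (an assumed example, not code from the thread), with q preallocated as a double array:
% Assumed example: assigning a symbolic expression into a double array errors,
% because an expression with a free symbolic variable cannot be converted to double.
syms q0
Vy = [300 500 300];
q = zeros(5, 3);              % double-precision preallocation
q(1,:) = q0 - (Vy.*2.*4./8);  % error: unable to convert symbolic expression to double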
The cure for that would be to use something like
q = zeros(5, 3, 'sym');
for new enough versions of MATLAB, or
q = sym(zeros(5,3));
if your MATLAB is older than that.
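Putting the pieces together, here is a minimal end-to-end sketch of the workflow the question describes. The 5-row size of q and the final equation passed to solve are placeholders of my own, not values from the thread:
% Preallocate q as symbolic, fill it row by row in a loop, then solve for q0 later.
syms q0
Vy = [300 500 300];
q = sym(zeros(5, 3));               % symbolic preallocation (works on older releases)
for i = 1:5
    q(i,:) = q0 - (Vy.*2.*4./8);    % each row keeps q0 as a free symbolic variable
end
q0_value = solve(q(1,1) == 0, q0)   % e.g. q0 - 300 == 0 gives q0_value = 300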

More Answers (0)

