MATLAB Answers

MATLAB Android camera and object tracking

By referring to this code:
url = 'http://<ip address>/shot.jpg';
ss = imread(url);
fh = image(ss);
while(1)
ss = imread(url);
set(fh,'CData',ss);
drawnow;
end
I am able to display the video from my Android phone (using the IP Camera app), but I don't know how to combine this code with the code below:
vid = videoinput('winvideo', 1);
% Set the properties of the video object
set(vid, 'FramesPerTrigger', Inf);
set(vid, 'ReturnedColorspace', 'rgb')
vid.FrameGrabInterval = 5;
% Start the video acquisition here
start(vid) % here the video is taken from the webcam; I want it to be taken from my Android phone instead
% Set a loop that stops after 200 frames of acquisition
while(vid.FramesAcquired<=200)
% Get the snapshot of the current frame
data = getsnapshot(vid);
% Now to track red objects in real time
% we subtract the grayscale image from the red channel
% to isolate the red regions in the image.
diff_im = imsubtract(data(:,:,1), rgb2gray(data));
%Use a median filter to filter out noise
diff_im = medfilt2(diff_im, [3 3]);
% Convert the resulting grayscale image into a binary image.
diff_im = im2bw(diff_im,0.18);
% Remove all connected components smaller than 300 pixels
diff_im = bwareaopen(diff_im,300);
% Label all the connected components in the image.
bw = bwlabel(diff_im, 8);
% Here we do the image blob analysis.
% We get a set of properties for each labeled region.
stats = regionprops(bw, 'BoundingBox', 'Centroid');
% Display the image
imshow(data)
hold on
% This loop draws a bounding box around each detected red object.
for object = 1:length(stats)
bb = stats(object).BoundingBox;
bc = stats(object).Centroid;
rectangle('Position',bb,'EdgeColor','r','LineWidth',2)
plot(bc(1),bc(2), '-m+')
a=text(bc(1)+15,bc(2), strcat('X: ', num2str(round(bc(1))), ' Y: ', num2str(round(bc(2)))));
set(a, 'FontName', 'Arial', 'FontWeight', 'bold', 'FontSize', 12, 'Color', 'yellow');
end
hold off
end
% Both loops end here.
% Stop the video acquisition.
stop(vid);
% Flush all the image data stored in the memory buffer.
flushdata(vid);
% Clear all variables
clear all


Accepted Answer

Walter Roberson
Walter Roberson 15 Mar 2014
Replace
data = getsnapshot(vid);
with
data = imread(url);
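For reference, a minimal sketch of the combined loop after that change (my own assembly, not part of the original answer). It assumes the IP Camera app is still serving JPEG snapshots at the URL from the question, and it reuses the thresholds from the original code (0.18 for binarization, 300 px for the minimum blob size), which may need tuning for your scene:
url = 'http://<ip address>/shot.jpg';   % same snapshot URL as in the question
framesAcquired = 0;
while framesAcquired <= 200
    data = imread(url);                                 % snapshot from the phone
    framesAcquired = framesAcquired + 1;
    diff_im = imsubtract(data(:,:,1), rgb2gray(data));  % isolate the red component
    diff_im = medfilt2(diff_im, [3 3]);                 % suppress noise
    diff_im = im2bw(diff_im, 0.18);                     % threshold to a binary image
    diff_im = bwareaopen(diff_im, 300);                 % drop blobs smaller than 300 pixels
    stats = regionprops(diff_im, 'BoundingBox', 'Centroid');  % regionprops accepts a binary image directly
    imshow(data)
    hold on
    for object = 1:length(stats)
        bb = stats(object).BoundingBox;
        bc = stats(object).Centroid;
        rectangle('Position', bb, 'EdgeColor', 'r', 'LineWidth', 2)
        plot(bc(1), bc(2), '-m+')
    end
    hold off
    drawnow
end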

2 Comments

amirul
amirul 16 Mar 2014
vid = videoinput('winvideo', 1);
set(vid, 'FramesPerTrigger', Inf);
set(vid, 'ReturnedColorspace', 'rgb')
vid.FrameGrabInterval = 5;
start(vid)
% Set a loop that stops after 200 frames of acquisition
while(vid.FramesAcquired<=200)
Do I need to change all of this? Thank you for your answer. If possible, could you combine the code?
Walter Roberson
Walter Roberson 16 Mar 2014
You don't need that. Replace it with
FramesAcquired = 0;
before the loop, and
while FramesAcquired <= 200
as the loop condition. Then, after the imread(url) call, add
FramesAcquired = FramesAcquired + 1;
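Put together, the framing around the loop would look roughly like this (a sketch only; the red-object detection and plotting code inside the loop stays exactly as in the question):
url = 'http://<ip address>/shot.jpg';
FramesAcquired = 0;                 % manual counter replaces vid.FramesAcquired
while FramesAcquired <= 200
    data = imread(url);             % snapshot from the phone instead of getsnapshot(vid)
    FramesAcquired = FramesAcquired + 1;
    % ... red-object detection and plotting as in the original loop ...
end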


More Answers (2)

PIYUSH KUMAR
PIYUSH KUMAR 14 Sep 2015
Here's working code for color detection using the Android camera:
url = 'http://192.168.0.100:8080/shot.jpg';
framesAcquired = 0;
while (framesAcquired <= 50)
data = imread(url);
framesAcquired = framesAcquired + 1;
diff_im = imsubtract(data(:,:,1), rgb2gray(data)); % subtract the grayscale image from the red channel to isolate red regions
diff_im = medfilt2(diff_im, [3 3]); % median filter to reduce noise
diff_im = im2bw(diff_im,0.18); % convert to a binary image
stats = regionprops(diff_im, 'BoundingBox', 'Centroid'); % measures a set of properties for each connected component in the binary image
drawnow;
imshow(data);
hold on
for object = 1:length(stats)
bb = stats(object).BoundingBox;
bc = stats(object).Centroid;
rectangle('Position',bb,'EdgeColor','b','LineWidth',2)
plot(bc(1),bc(2), '-m+')
end
hold off
end
%stop(vid); % to stop the video
%flushdata(vid); % erase the data video
clear all
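Two small, optional refinements to the code above (my suggestions, not part of the original answer): on recent MATLAB releases imbinarize is the recommended replacement for im2bw, and restoring the bwareaopen call from the question's code suppresses small spurious detections:
diff_im = imbinarize(diff_im, 0.18);   % same 0.18 threshold; replaces im2bw on newer releases
diff_im = bwareaopen(diff_im, 300);    % optional: ignore blobs smaller than 300 pixels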



Shahroze Hussain
Shahroze Hussain 6 Jan 2017
Edited: Walter Roberson 6 Jan 2017
url = 'http://192.168.10.2:8080/shot.jpg';
ss = imread(url);
fh = image(ss);
while(1)
ss = imread(url);
set(fh,'CData',ss);
drawnow;
end
foregroundDetector = vision.ForegroundDetector('NumGaussians', 3, ...
'NumTrainingFrames', 50);
videoReader = vision.VideoFileReader('visiontraffic.avi');
for i = 1:150
frame = step(videoReader); % read the next video frame
foreground = step(foregroundDetector, frame);
end
%figure;
%imshow(frame);
%title('Video Frame');
%figure;
%imshow(foreground);
%title('Foreground');
se = strel('square', 3);
filteredForeground = imopen(foreground, se);
%figure;
%imshow(filteredForeground);
%title('Clean Foreground');
blobAnalysis = vision.BlobAnalysis('BoundingBoxOutputPort', true, ...
'AreaOutputPort', false, 'CentroidOutputPort', false, ...
'MinimumBlobArea', 150);
bbox = step(blobAnalysis, filteredForeground);
result = insertShape(frame, 'Rectangle', bbox, 'Color', 'red');
numCars = size(bbox, 1);
result = insertText(result, [10 10], numCars, 'BoxOpacity', 1, ...
'FontSize', 14);
%figure;
%imshow(result);
%title('Detected Cars');
videoPlayer = vision.VideoPlayer('Name', 'Detected Cars');
videoPlayer.Position(3:4) = [650,400]; % window size: [width, height]
se = strel('square', 3); % morphological filter for noise removal
while ~isDone(videoReader)
frame = step(videoReader); % read the next video frame
% Detect the foreground in the current video frame
foreground = step(foregroundDetector, frame);
% Use morphological opening to remove noise in the foreground
filteredForeground = imopen(foreground, se);
% Detect the connected components with the specified minimum area, and
% compute their bounding boxes
bbox = step(blobAnalysis, filteredForeground);
% Draw bounding boxes around the detected cars
result = insertShape(frame, 'Rectangle', bbox, 'Color', 'red');
% Display the number of cars found in the video frame
numCars = size(bbox,1);
asciiChars = char(numCars+'A'-1);
result = insertText(result, [10 10], asciiChars, 'BoxOpacity',1,...
'FontSize', 14,'BoxColor','Green');
step(videoPlayer, result); % display the results
end
asciiChars = char(numCars+'A'-1)
release(videoReader); % close the video file
How do I combine these two pieces of code?

3 Comments

Walter Roberson
Walter Roberson 6 Jan 2017
url = 'http://192.168.10.2:8080/shot.jpg';
foregroundDetector = vision.ForegroundDetector('NumGaussians', 3, ...
'NumTrainingFrames', 50);
for i = 1:150
frame = imread(url); % read the next video frame
foreground = step(foregroundDetector, frame);
end
se = strel('square', 3);
filteredForeground = imopen(foreground, se);
videoPlayer = vision.VideoPlayer('Name', 'Detected Cars');
videoPlayer.Position(3:4) = [650,400]; % window size: [width, height]
se = strel('square', 3); % morphological filter for noise removal
while true
frame = imread(url); % read the next video frame
% Detect the foreground in the current video frame
foreground = step(foregroundDetector, frame);
% Use morphological opening to remove noise in the foreground
filteredForeground = imopen(foreground, se);
% Detect the connected components with the specified minimum area, and
% compute their bounding boxes
bbox = step(blobAnalysis, filteredForeground);
% Draw bounding boxes around the detected cars
result = insertShape(frame, 'Rectangle', bbox, 'Color', 'red');
% Display the number of cars found in the video frame
numCars = size(bbox,1);
asciiChars = char(numCars+'A'-1);
result = insertText(result, [10 10], asciiChars, 'BoxOpacity',1,...
'FontSize', 14,'BoxColor','Green');
step(videoPlayer, result); % display the results
end
asciiChars = char(numCars+'A'-1)
Rohil Setia
Rohil Setia 4 May 2018
It is not working; it says that blobAnalysis is undefined.
Walter Roberson
Walter Roberson 7 May 2018
You will need to add in
blobAnalysis = vision.BlobAnalysis('BoundingBoxOutputPort', true, ...
'AreaOutputPort', false, 'CentroidOutputPort', false, ...
'MinimumBlobArea', 150);
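For completeness, a sketch of the whole combined script with that definition included (my own assembly, not a verified implementation). It assumes the phone is serving snapshots at the URL from the question and that the background model is trained on the first 150 snapshots, as in the code above:
url = 'http://192.168.10.2:8080/shot.jpg';
foregroundDetector = vision.ForegroundDetector('NumGaussians', 3, ...
    'NumTrainingFrames', 50);
blobAnalysis = vision.BlobAnalysis('BoundingBoxOutputPort', true, ...
    'AreaOutputPort', false, 'CentroidOutputPort', false, ...
    'MinimumBlobArea', 150);
videoPlayer = vision.VideoPlayer('Name', 'Detected Cars');
videoPlayer.Position(3:4) = [650, 400];      % window size: [width, height]
se = strel('square', 3);                     % morphological filter for noise removal
% Train the background model on an initial batch of snapshots
for i = 1:150
    frame = imread(url);
    foreground = step(foregroundDetector, frame);
end
% Main detection loop: runs until stopped manually (Ctrl+C)
while true
    frame = imread(url);                                   % next snapshot from the phone
    foreground = step(foregroundDetector, frame);          % foreground mask
    filteredForeground = imopen(foreground, se);           % remove noise
    bbox = step(blobAnalysis, filteredForeground);         % bounding boxes of moving blobs
    result = insertShape(frame, 'Rectangle', bbox, 'Color', 'red');
    numCars = size(bbox, 1);                               % number of detected objects
    result = insertText(result, [10 10], numCars, 'BoxOpacity', 1, 'FontSize', 14);
    step(videoPlayer, result);                             % display the annotated frame
end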

