Texture Image Segmentation Code

1. Problem description:

A segmentation and analysis algorithm for wavelet-coded images is presented. The algorithm is part of an image post-processing scheme that recovers texture lost to compression and repairs blurring artifacts in the decoded image. It extracts texture, intensity (or color), and spatial features, and segments large images efficiently using a variant of the k-means algorithm. The analysis stage then applies rule-based heuristics to classify the resulting segments either as potential artifacts or as adjacent textured regions that can be used to restore them. This post-processing approach requires minimal user interaction and successfully exploits the hierarchical texture correlations present in compressed images.
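
To make the sampled-clustering idea concrete, the short MATLAB fragment below (an illustration written for this article, not part of the author's code) clusters a random sample of roughly 10% of the pixels of a standard test image in a crude two-feature space, using intensity plus local standard deviation as a stand-in for the DOOG texture features, and then labels every pixel by its nearest centroid. It assumes the Image Processing and Statistics toolboxes are installed; 'cameraman.tif' ships with MATLAB.

% Illustration only: sampled k-means on a crude intensity + texture feature space
img = im2double(imread('cameraman.tif'));                  % standard grayscale test image
tex = stdfilt(img, true(9));                               % local standard deviation as a crude texture cue
feat = [img(:), tex(:)];                                   % one feature row per pixel
idx = randperm(numel(img), round(0.1*numel(img)));         % random sample of ~10% of the pixels
[~, centroids] = kmeans(feat(idx,:), 4, 'Replicates', 3);  % cluster only the sampled pixels
[~, labels] = pdist2(centroids, feat, 'euclidean', 'Smallest', 1); % nearest centroid for every pixel
figure, imshow(label2rgb(reshape(labels, size(img))));     % display the resulting segmentation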

2. Partial program:

 

% Segmentation of textured images using sampled k-means 
% (c) 2003 by Rajas Sambhare
% ECE 738 - Final Project
% portions based on 
% 1. work by Jitendra Malik and Pietro Perona, "Preattentive texture
%    discrimination with early vision mechanisms," J. Optical Soc. America,
%    Vol 7, No 5, May 1990.
% 2. independent work

% Functions needed
%   chisq.m
%   doog.m
%   gaussian.m
%   histproc.m
%   pca.m
%   postproc.m
%   preprocfast.m
%   rescalegray.m

clear all; clc; clf; warning off; close all hidden;
totalt = 0; % Total time spent on segmentation.

% PRE-PROCESS the image to produce a feature set.
%   1. Texture processing using DOOG filters
%   2. Principal component analysis to reduce dimensionality
%   3. Random sampling of image

img = im2double(imread('4.bmp')); % Read gray image
%img = im2double(imread('girl.bmp')); % Read color image


disp('Preprocessing...');tic;
% Preprocess all
[allfeatures, rDims, cDims, depth] = preprocfast(img);
[samples,olddimensions] = size(allfeatures);
gallfeatures = allfeatures;

% Combine all texture features to use for later thresholding
% Also save all low pass features for later adjacency processing
if depth == 1
    texturefeature = max(allfeatures(:,4:11), [], 2);
    lowpassfeature = allfeatures(:,3);
    lowpassimage = reshape(lowpassfeature, [cDims rDims])';
else
    texturefeature = max(allfeatures(:,6:13), [], 2);
    lowpassfeature = allfeatures(:,3:5);
    lowpassimage(:,:,1) = reshape(lowpassfeature(:,1), [cDims rDims])';
    lowpassimage(:,:,2) = reshape(lowpassfeature(:,2), [cDims rDims])';
    lowpassimage(:,:,3) = reshape(lowpassfeature(:,3), [cDims rDims])';
end
textures = reshape(texturefeature, [cDims rDims])';

% Principal-component-based dimensionality reduction of all features
allfeatures = pca(allfeatures, 0.05); 

% Choose 10% of samples randomly and save in DATASET
[samples, dimensions] = size(allfeatures);
% We work on roughly WORKSAMPLES pixels. If the image has fewer, we use all of
% them; otherwise the appropriate fraction of pixels is selected at random.
worksamples = samples/10;
if worksamples < 10000
    worksamples = 10000;
end
if samples < worksamples
    worksamples = samples;
end
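% Keep each pixel independently with probability worksamples/samples, so the
% sample contains roughly worksamples pixels on average.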
choose = rand([samples 1]); choose = choose < (worksamples/samples); 
dataset = zeros([sum(choose), dimensions]);
dataset(1:sum(choose),:) = allfeatures(find(choose),:); % find(choose) returns array where choose is non zero

disp('Preprocessing done.');t = toc; totalt = totalt + t;
disp(['     Original dimensions: ' int2str(olddimensions)]);
disp(['     Reduced dimensions by PCA: ' int2str(dimensions)]);
disp(['     Image has ' int2str(rDims * cDims) ' pixels.']);
disp(['     Using only ' int2str(size(dataset,1)) ' pixels.']);
disp(['Elapsed time: ' num2str(t)]);
disp(' ');

% SEGMENTATION
%   1. k-means (on sampled image)
%   2. Use centroids to classify remaining points
%   3. Classify spatially disconnected regions as separate regions

% Segmentation Step 1. 
%   k-means (on sampled image)
% Compute k-means on randomly sampled points
disp('Computing k-means...');tic;
% Set number of clusters heuristically.
k = round((rDims*cDims)/(100*100)); k = max(k,8); k = min(k,16);
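% e.g. a 512x512 image gives round(262144/10000) = 26, clamped down to k = 16,
% while a 256x256 image gives round(65536/10000) = 7, raised up to k = 8.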

% Uncomment the next line if the MATLAB kmeans function is unavailable
%[centroids,esq,map] = kmeanlbg(dataset,k);
[map, centroids] = kmeans(dataset, k);  % Calculate k-means (MATLAB's built-in kmeans)
disp('k-means done.');t = toc; totalt = totalt + t;
disp(['     Number of clusters: ' int2str(k)]);
disp(['Elapsed time: ' num2str(t)]);
disp(' ');

% Segmentation Step 2. 
%   Use centroids to classify the remaining points
disp('Using centroids to classify all points...');tic;

globsegimage = postproc(centroids, allfeatures, rDims, cDims);   % Use centroids to classify all points

% Segmentation Step 3.
%   Classify spatially disconnected regions as separate regions
globsegimage = medfilt2(globsegimage, [3 3], 'symmetric');
globsegimage = imfill(globsegimage);
region_count = max(max(globsegimage));
count = 1; newglobsegimage = zeros(size(globsegimage));
for i = 1:region_count
    region = (globsegimage == i);
    [bw, num] = bwlabel(region);
    for j = 1:num
        newglobsegimage = newglobsegimage + count*(bw == j);
        count = count + 1;
    end
end
oldglobsegimage = globsegimage;
globsegimage = newglobsegimage;

disp('Classification done.');t = toc; totalt = totalt + t;
disp(['Elapsed time: ' num2str(t)]);
disp(' ');

% DISPLAY IMAGES

% Display segments
%figure(1), imshow(globsegimage./max(max(globsegimage)));
figure(1), imshow(label2rgb(globsegimage, 'gray'));
title('Segments');

% Calculate boundary of segments
BW = edge(globsegimage,'sobel', 0);

% Superimpose boundary on original image
iout = img;
if (depth == 1) % Gray image, so use color lines
    iout(:,:,1) = iout;
    iout(:,:,2) = iout(:,:,1);
    iout(:,:,3) = iout(:,:,1);
    iout(:,:,2) = min(iout(:,:,2) + BW, 1.0);
    iout(:,:,3) = min(iout(:,:,3) + BW, 1.0);
else            % RGB image, so use white lines
    iout(:,:,1) = min(iout(:,:,1) + BW, 1.0);
    iout(:,:,2) = min(iout(:,:,2) + BW, 1.0);
    iout(:,:,3) = min(iout(:,:,3) + BW, 1.0);
end

% Display image and segments
figure(2), imshow(iout);
title('Segmented image');

% POST PROCESSING AND AUTOMATIC SELECTION OF SOURCE AND TARGET REGIONS
%   1. Find overall textured region using Otsu's method (MATLAB graythresh)
%   2. Save each region and region boundary separately and note index of
%      textured regions
%   3. For each textured region, find all adjacent untextured regions and
%      save in adjacency matrix. Regions having a significant common border
%      are considered adjacent.
%   4. Find similarity between textured and adjacent untextured regions
%      using average gray level matching (average color matching). For each
%      textured region, drop those adjacent regions which don't match in
%      gray level.
disp('Post-processing and automatically selecting source and target regions...');tic;

% POSTPROC Step 1
threshold = graythresh(rescalegray(textures));
bwtexture = textures > threshold;
tex_edges = edge(bwtexture, 'sobel', 0);

figure(3),
if depth == 1
    imshow(min(img + tex_edges, 1));
else
    imshow(min(img + cat(3, tex_edges, tex_edges, tex_edges), 1));
end
title('Textured regions');

% POSTPROC Step 2
% Save each region in a dimension
% Save each region boundary in a dimension
% For each region which can be classified as a textured region store index
region_count = max(max(globsegimage));
number_tex_regions = 1; tex_list = [];
for region_number = 1:region_count 
    bwregion = (globsegimage == region_number);
    
    regions(:,:,region_number) = bwregion; % Save all regions
    region_boundaries(:,:,region_number) = edge(bwregion, 'sobel', 0);
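    % Treat the region as textured if more than 75% of its pixels exceed the
    % Otsu texture threshold and the region covers more than 32x32 pixels.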
    if ( (sum(sum(bwregion.*bwtexture))/sum(sum(bwregion)) > 0.75) && sum(sum(bwregion)) > (32*32) )
        tex_list = [tex_list region_number];
        number_tex_regions = number_tex_regions + 1;
    end
end
number_tex_regions = number_tex_regions - 1;

% POSTPROC Step 3
% Find texture region adjacency and create an adjacency matrix
for i = 1:size(tex_list, 2)
    for j = 1:region_count
        if (tex_list(i) ~= j)
            boundary_overlap = sum(sum( region_boundaries(:,:,tex_list(i)).*region_boundaries(:,:,j) ));
            boundary_total_length = sum(sum( region_boundaries(:,:,tex_list(i)))) + sum(sum(region_boundaries(:,:,j)));
            if (boundary_overlap/boundary_total_length > 0)   % If the boundaries touch at all, record the
                region_adjacency(i,j) = boundary_overlap;     % overlap length (the 20% test comes later)
            end
        end
    end
end

% EXPERIMENTAL
% Find adjacency matrix between all regions and segment the regions using
% N-Cut.
% for i = 1:region_count
%     region_feature(i,:) = get_region_features(regions(:,:,i), allfeatures);
% end
% for i = 1:region_count
%     for j = 1:region_count
%         W(i,j) = 
% END EXPERIMENTAL

% Those regions for which the edge overlap length is less than 20% of the
% mean overlap length are not considered adjacent. Update the adjacency
% matrix to reflect this.
region_adj_hard_coded = (region_adjacency - 0.2*repmat(mean(region_adjacency,2), [1 size(region_adjacency,2)])) > 0;
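% e.g. if a textured region's mean boundary overlap with all other regions is
% 50 pixels, neighbours sharing no more than 0.2*50 = 10 boundary pixels are dropped.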

% Copy adjacency into another variable and remove all references to 
% textured regions from the adjacency matrix.
region_output = region_adj_hard_coded;
for tex_count = 1:size(tex_list, 2)
    region_output(:,tex_list(tex_count)) = 0;
end

% POSTPROC Step 4
% Find similarity between textured and adjacent untextured regions
% (This could be changed to a chi-squared distance between the low-pass
%  histograms of the textured region and its adjacent regions by commenting
%  out the average-brightness code and uncommenting the histogram code, as
%  indicated in the comments below.)

% For all textured regions find and save average brightness
for tex_count = 1:size(tex_list, 2)
    if depth == 1 
        tex_avg_bright(tex_count) = sum(sum(regions(:,:,tex_list(tex_count)).*lowpassimage)) ...
                         / sum(sum(regions(:,:,tex_list(tex_count))));
        % Comment previous and uncomment next line(s) to use histogram
        % processing   
        %tex_hist{tex_count} = histproc(regions(:,:,tex_list(tex_count)), lowpassimage);
    else
        tex_avg_bright(1,tex_count) = sum(sum(regions(:,:,tex_list(tex_count)).*lowpassimage(:,:,1))) ...
                                / sum(sum(regions(:,:,tex_list(tex_count))));
        tex_avg_bright(2,tex_count) = sum(sum(regions(:,:,tex_list(tex_count)).*lowpassimage(:,:,2))) ...
                                / sum(sum(regions(:,:,tex_list(tex_count))));
        tex_avg_bright(3,tex_count) = sum(sum(regions(:,:,tex_list(tex_count)).*lowpassimage(:,:,3))) ...
                                / sum(sum(regions(:,:,tex_list(tex_count))));
        % Comment previous and uncomment next line(s) to use histogram
        % processing        
        % tex_hist{tex_count} = histproc(regions(:,:,tex_list(tex_count)), lowpassimage);
    end
end

% For all textured regions, consider each non-textured region and update
% adjacency matrix. Keep the relationship if gray levels (colors) are similar and 
% drop if the gray levels (colors) don't match.
for tex_count = 1:size(tex_list, 2) % For all textured regions
    for adj_reg_count = 1:size(region_adj_hard_coded, 2)
        if (region_adj_hard_coded(tex_count, adj_reg_count) > 0)
            if depth == 1
                region_avg_bright = sum(sum(regions(:,:,adj_reg_count).*lowpassimage)) ...
                                    / sum(sum(regions(:,:,adj_reg_count)));
                % Comment previous and uncomment next line(s) to use histogram
                % processing   
                % region_hist = histproc(regions(:,:,adj_reg_count), lowpassimage);
            else
                region_avg_bright(1) = sum(sum(regions(:,:,adj_reg_count).*lowpassimage(:,:,1))) ...
                                    / sum(sum(regions(:,:,adj_reg_count)));
                region_avg_bright(2) = sum(sum(regions(:,:,adj_reg_count).*lowpassimage(:,:,2))) ...
                                    / sum(sum(regions(:,:,adj_reg_count)));
                region_avg_bright(3) = sum(sum(regions(:,:,adj_reg_count).*lowpassimage(:,:,3))) ...
                                    / sum(sum(regions(:,:,adj_reg_count)));
                % Comment previous and uncomment next line(s) to use histogram
                % processing                                   
                % region_hist = histproc(regions(:,:,adj_reg_count), lowpassimage);
            end
            
            if depth == 1
                if abs(tex_avg_bright(tex_count) - region_avg_bright) > 0.2 % Brightness differs too much, so drop
                    region_output(tex_count, adj_reg_count) = 0;
                end
                % Comment previous and uncomment next line(s) to use histogram
                % processing                   
%                 if chisq(tex_hist{tex_count}, region_hist) > 0.4
%                     chisq(tex_hist{tex_count}, region_hist)
%                     region_output(tex_count, adj_reg_count) = 0;
%                 end
            else
                if mean(abs(tex_avg_bright(:,tex_count) - region_avg_bright')) > 0.2
                    region_output(tex_count, adj_reg_count) = 0;
                end
                % Comment previous and uncomment next line(s) to use histogram
                % processing                   
%                 thist = tex_hist{tex_count};
%                 chisq(thist(:,1),region_hist(:,1))
%                 chisq(thist(:,2),region_hist(:,2))
%                 chisq(thist(:,3),region_hist(:,3))
%                 t = 0.9;
%                 if (chisq(thist(:,1),region_hist(:,1)) > t) || ...
%                    (chisq(thist(:,2),region_hist(:,2)) > t) || ...
%                    (chisq(thist(:,3),region_hist(:,3)) > t)
%                     region_output(tex_count, adj_reg_count) = 0;
%                 end
            end         
        end
    end
end

disp('Post-processing done.'); t = toc; totalt = totalt + t;
disp(['Elapsed time: ' num2str(t)]);
disp(' ');
disp(['Total time elapsed: ' int2str(floor(totalt/60)) ' minutes ' int2str(mod(totalt,60)) ' seconds.']);

% DISPLAY IMAGES

% Display source and target regions.
if depth == 1
    imgs = zeros([rDims cDims size(tex_list,2)]);
    for tex_count = 1:size(tex_list, 2)
        if (sum(region_output(tex_count,:) > 0)) % If we have target patches
            imgs(:,:,tex_count) = regions(:,:,tex_list(tex_count)).*img; % Save that source patch
            for i = 1:size(region_output, 2) % For each region
                if (region_output(tex_count, i) > 0)    % which is connected to that source patch
                    imgs(:,:,tex_count) = imgs(:,:,tex_count) + 0.5*regions(:,:,i).*img; % Save the target patch
                end
            end
            figure, imshow(min(imgs(:,:,tex_count) + BW, 1));
            ggg{tex_count} = min(imgs(:,:,tex_count) + BW, 1);
            title('Potential source and target regions');
        end
    end
else % depth == 3 
    count = 1;
    for tex_count = 1:size(tex_list, 2)
        if (sum(region_output(tex_count,:) > 0)) % If we have target patches
            tmp(:,:,1) = regions(:,:,tex_list(tex_count)).*img(:,:,1);
            tmp(:,:,2) = regions(:,:,tex_list(tex_count)).*img(:,:,2);
            tmp(:,:,3) = regions(:,:,tex_list(tex_count)).*img(:,:,3);
            imgs{count} = tmp;
            for i = 1:size(region_output, 2) % For each region
                if (region_output(tex_count, i) > 0)    % which is connected to that source patch
                    tmp(:,:,1) = 0.5*regions(:,:,i).*img(:,:,1);
                    tmp(:,:,2) = 0.5*regions(:,:,i).*img(:,:,2);
                    tmp(:,:,3) = 0.5*regions(:,:,i).*img(:,:,3);
                    imgs{count} = imgs{count} + tmp;
                end
            end
            figure, imshow(min(imgs{count} + cat(3,BW,BW,BW), 1));
            ggg{count} = min(imgs{count} + cat(3,BW,BW,BW), 1);
            title('Potential source and target regions');
            count = count+1;
        end
    end
end
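
Note that the helper functions listed at the top of the script (chisq.m, doog.m, gaussian.m, histproc.m, pca.m, postproc.m, preprocfast.m, rescalegray.m) are not included in this excerpt. For experimentation, the two sketches below show plausible minimal versions of rescalegray.m and chisq.m, reconstructed only from how the main script calls them; they are assumptions, not the author's originals, and each belongs in its own .m file.

function out = rescalegray(in)
% RESCALEGRAY  Linearly rescale a matrix to the range [0,1].
% (Hypothetical reimplementation; the original rescalegray.m is not shown above.)
in = double(in);
mn = min(in(:)); mx = max(in(:));
if mx > mn
    out = (in - mn) ./ (mx - mn);
else
    out = zeros(size(in));
end
end

function d = chisq(h1, h2)
% CHISQ  Chi-squared distance between two histograms.
% (Hypothetical reimplementation; the original chisq.m is not shown above.)
h1 = h1(:) ./ sum(h1(:));      % normalise so the two histograms are comparable
h2 = h2(:) ./ sum(h2(:));
d  = 0.5 * sum((h1 - h2).^2 ./ (h1 + h2 + eps));
end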

3. Simulation results:



Reposted from blog.csdn.net/ccsss22/article/details/114645516