Classification of electrical loads based on fuzzy clustering with transitive closure method


Reference: "Application of Fuzzy Clustering in Load Measurement Modeling——Huang Mei"

1 Application of fuzzy clustering

Fuzzy clustering is a widely used unsupervised learning technique that applies fuzzy set theory to the analysis of complex datasets. Its goal is to group similar objects of a dataset into clusters according to a chosen similarity or distance measure. Unlike traditional hard clustering methods, which make a binary assignment, fuzzy clustering assigns each object a degree of membership (or possibility of belonging) to every cluster. Memberships can therefore overlap: a data point may belong to several clusters at the same time.
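To make graded membership concrete, the toy snippet below (illustrative only, not taken from the reference; the matrix U and the four data points are invented) builds a small membership matrix by hand and then collapses it to a hard assignment:

% Toy illustration: degrees of membership of four data points in two clusters.
% Each row sums to 1, and a point may have appreciable membership in more
% than one cluster at the same time.
U=[0.95 0.05;
   0.80 0.20;
   0.55 0.45;    % this point overlaps both clusters
   0.10 0.90];
[~,hardLabels]=max(U,[],2);   % collapsing to a hard (binary) assignment
disp(hardLabels')             % 1 1 1 2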
Fuzzy clustering is applied in many fields, including pattern recognition, image processing, and bioinformatics. In pattern recognition, it classifies objects according to measures such as statistical, geometric, or relational similarity.
In image processing, fuzzy clustering supports segmentation, feature extraction, and classification; for example, it can identify distinct regions of an image by grouping pixels according to their similarity.
In bioinformatics, fuzzy clustering is used to analyze DNA and RNA sequence data, for instance to identify major pathways in biological systems and to find patterns and relationships among genetic markers.

2 Fuzzy clustering theory of transitive closure method
The transitive closure method is a commonly used fuzzy clustering algorithm that identifies the fuzzy cluster structure of a data set by computing the pairwise similarity between samples.

The basic idea of the transitive closure method is to treat each sample in the data set as a candidate cluster center and then determine the relationships among these centers by computing the similarity between samples. Specifically, the method computes a distance or similarity matrix between samples and then relates samples according to a similarity threshold.

The key step of the transitive closure method is computing the transitive closure itself. In the transitive closure, two samples are related whenever there is a path between them along which every consecutive pair of samples has similarity greater than or equal to a given threshold. Once the transitive closure has been computed, the samples can be divided into clusters.
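As a minimal sketch with toy numbers (not from the article), the snippet below computes the transitive closure of a 3x3 fuzzy similarity matrix by composing it with itself (max-min composition) until it stops changing:

R=[1.0 0.8 0.3;
   0.8 1.0 0.6;
   0.3 0.6 1.0];
R1=R;
while true
    n=size(R1,1);
    R2=zeros(n);
    for i=1:n
        for j=1:n
            % max-min composition: R2(i,j) = max over k of min(R1(i,k),R1(k,j))
            R2(i,j)=max(min(R1(i,:),R1(:,j)'));
        end
    end
    if isequal(R2,R1), break; end   % stop once composing R with itself changes nothing
    R1=R2;
end
disp(R1)   % element (1,3) rises from 0.3 to 0.6 via the path 1 -> 2 -> 3

Element (1,3) increases because samples 1 and 3 are linked through sample 2 with similarities 0.8 and 0.6, so their closure similarity is min(0.8, 0.6) = 0.6.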

The steps of the transitive closure method are as follows:

1) Initialization: Treat each sample as a cluster center.

2) Calculate the similarity matrix: Calculate the similarity matrix according to the distance or similarity between samples.

3) Construct transitive closure: Calculate the transitive closure according to the similarity matrix and the given threshold.

4) Divide clusters: according to the transitive closure and the chosen threshold, divide the samples into different clusters (a minimal sketch of this step follows the list).
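Continuing the toy closure matrix from the previous sketch (again illustrative, not the article's code), step 4 reduces to a λ-cut: two samples fall into the same cluster exactly when their closure value is at least λ.

T=[1.0 0.8 0.6;
   0.8 1.0 0.6;
   0.6 0.6 1.0];
lambda=0.7;               % the threshold is the analyst's choice
B=T>=lambda;              % lambda-cut of the closure: a crisp equivalence relation
labels=zeros(1,size(T,1));
c=0;
for i=1:size(T,1)
    if labels(i)==0
        c=c+1;
        labels(B(i,:))=c; % every sample equivalent to sample i joins cluster c
    end
end
disp(labels)              % lambda=0.7 gives clusters {1,2} and {3}; lambda=0.5 merges all three

Sweeping λ from large to small merges clusters step by step, which is what the dynamic clustering diagram produced by the MATLAB program in Section 5 visualizes.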

The advantages of the transitive closure method are that it can handle irregularly shaped cluster structures and does not require the number of clusters to be fixed in advance. It also has drawbacks, however, such as sensitivity to the choice of the initial cluster centers and the difficulty of selecting a suitable threshold.

In summary, the transitive closure method is a similarity-based fuzzy clustering algorithm that identifies the fuzzy cluster structure of a data set by computing the pairwise similarities between samples and their transitive closure.

3 Test load
[Figure: 24-hour test load curve]

4 Load classification results
[Figure: load classification result (dynamic clustering diagram)]

5 MATLAB program
1) Main function
%% Electrical load classification based on fuzzy clustering with the transitive closure method
clc
close all

% Hourly load values for hours 1-24
loadData=[455.390;405.948;333.086;275.836;205.576;145.725;130.112;131.112;137.918;150.929;163.941;182.156;208.178;195.167;156.134;150.929;161.338;169.145;169.145;176.952;195.167;210.781;296.654;497.026];

% Membership degree calculation: column 1 is the normalized load,
% column 2 is its complement
x=zeros(24,2);
for j=1:24
    x(j,1)=(loadData(j)-min(loadData))/(max(loadData)-min(loadData));
    x(j,2)=(max(loadData)-loadData(j))/(max(loadData)-min(loadData));
end

[X]=F_JISjBzh(1,x); % data standardization (cs=1: standard deviation transform)
[R]=F_JlR(8,X);     % build the fuzzy similarity matrix (cs=8: Euclidean distance method)
F_JIDtjl(R);        % classification (transitive closure + dynamic clustering diagram)

figure
plot(loadData)
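As an optional cross-check (not part of the original program), the grouping can be compared with MATLAB's built-in single-linkage hierarchical clustering, which is closely related to clustering by the max-min transitive closure. The sketch below assumes the Statistics and Machine Learning Toolbox is available and reuses the standardized matrix X from the script above; the choice of three clusters is arbitrary.

% Optional cross-check (assumes the Statistics and Machine Learning Toolbox).
Z=linkage(pdist(X),'single');   % single-linkage hierarchy on the standardized data
T3=cluster(Z,'maxclust',3);     % cut the tree into three clusters (arbitrary count)
disp(T3')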

2) Subfunctions


function F_JIDtjl(R)
% Dynamic clustering for fuzzy cluster analysis
% R: fuzzy similarity matrix
[m,n]=size(R); % get the matrix dimensions
if(m~=n|m==0) 
    return ;
end
for(i=1:n) R(i,i)=1; % fix invalid entries (unit diagonal, clip to [0,1], symmetrize)
    for(j=i+1:n) 
        if(R(i,j)<0) R(i,j)=0;
elseif(R(i,j)>1) R(i,j)=1;
        end
        R(i,j)=round(10000*R(i,j))/10000; % keep four decimal places
        R(j,i)=R(i,j);
    end
end
js0=0;
while(1) % compute the transitive closure
    R1=Max_Min(R,R); % max-min composition (see the Max_Min subfunction below)
js0=js0+1;
    if(R1==R) break;
    else
        R=R1;
    end
end
Imd(1)=1;k=1;
for(i=1:n) 
    for(j=i+1:n) pd=1; % collect every distinct value appearing in R
        for(x=1:k) 
if(R(i,j)==Imd(x)) 
    pd=0;
    break;
end;
        end
        if(pd) 
            k=k+1;
            Imd(k)=R(i,j);
        end
    end;end
for(i=1:k-1) for(j=i+1:k) 
if(Imd(i)<Imd(j)) % sort the thresholds in descending order
            x=Imd(j);Imd(j)=Imd(i);Imd(i)=x;
        end;end;end
for(x=1:k) % classify by threshold Imd(x); flsz(x) is the number of classes, Sz temporarily stores element indices
   js=0;flsz(x)=0;
   for(i=1:n) pd=1;
       for(y=1:js) if(Sz(y)==i) pd=0;break;end;end
       if(pd)
           for(j=1:n) 
if(R(i,j)>=Imd(x)) js=js+1;Sz(js)=j;end;end
           flsz(x)=flsz(x)+1;
       end
   end
end
for(i=1:k-1) 
for(j=i+1:k) 
if(flsz(j)==flsz(i)) flsz(j)=0;end;end;end
fl=0; % discard duplicate classifications
for(i=1:k) if(flsz(i)) fl=fl+1;Imd(fl)=Imd(i);end;end
for(i=1:n) xhsz(i)=i;end
for(x=1:fl) % obtain the classification: sort the elements by class
    js=0;flsz(x)=0;
    for(i=1:n) pd=1;
        for(y=1:js) if(Sz(y)==i) pd=0;break;end;end
        if(pd) if(js==0) y=0;end
            for(j=1:n) if(R(i,j)>=Imd(x)) js=js+1;Sz(js)=j;end;end
            flsz(x)=flsz(x)+1;
            Sz0(flsz(x))=js-y;
        end
    end
    js0=0;
    for(i=1:flsz(x))
        for(j=1:Sz0(i)) Sz1(j)=Sz(js0+j);end
        for(j=1:n) for(y=1:Sz0(i)) 
if(xhsz(j)==Sz1(y)) 
js0=js0+1;Sz(js0)=xhsz(j);end;end;end
    end
    for(i=1:n) xhsz(i)=Sz(i);end
end
for(x=1:fl) % obtain the classification: number of elements in each subclass
    js=0;flsz(x)=0;
    for(i=1:n) pd=1;
        for(y=1:js) if(Sz(y)==i) pd=0;break;end;end
        if(pd) if(js==0) y=0;end
            for(j=1:n) if(R(i,j)>=Imd(x)) js=js+1;Sz(js)=j;end;end
            flsz(x)=flsz(x)+1;Sz0(flsz(x))=js-y;
        end
    end
    js0=1;
    for(i=1:flsz(x)) y=1;
        for(j=1:flsz(x))
            if(Sz(y)==xhsz(js0)) flqksz(x,i)=Sz0(j);js0=js0+Sz0(j);break;end
            y=y+Sz0(j);
        end
    end
end
F_dtjltx=figure('name','Dynamic clustering diagram','color','w');
axis('off');
Kd=30;Gd=40;y=fl*Gd+Gd;lx=80;
text(24,y+Gd/2,'λ');
for(i=1:n)
    text(lx-5+i*Kd-0.4*Kd*(xhsz(i)>9),y+Gd/2,int2str(xhsz(i)));
    line([lx+i*Kd,lx+i*Kd],[y,y-Gd]);
    linesz(i)=lx+i*Kd;
end
text(lx*1.25+i*Kd,y+Gd/2,'Number of classes');
y=y-Gd;
for(x=1:fl)
    text(8,y-Gd/2,num2str(Imd(x)));
    js0=1;js1=0;
    if(x==1)
        for(i=1:flsz(x))
            js1=flqksz(x,i)-1;
            if(js1) line([linesz(js0),linesz(js0+js1)],[y,y]);end
            line([(linesz(js0+js1)+linesz(js0))/2,(linesz(js0+js1)+linesz(js0))/2],[y,y-Gd]);
            linesz(i)=(linesz(js0+js1)+linesz(js0))/2;
            js0=js0+js1+1;
        end
            else for(i=1:flsz(x))
                    js1=js1+flqksz(x,i);
                    js2=0;pd=0;
                    for(j=1:flsz(x-1))
                        js2=js2+flqksz(x-1,j);
                        if(js2==js1) pd=1;break;end
                    end
                    if(j~=js0) line([linesz(js0),linesz(j)],[y,y]);end
                    line([(linesz(js0)+linesz(j))/2,(linesz(js0)+linesz(j))/2],[y,y-Gd]);
                    linesz(i)=(linesz(js0)+linesz(j))/2;
                    js0=j+1;
                end;end
            text(1.5*lx+n*Kd,y-Gd/3,int2str(flsz(x)));
            y=y-Gd;
    end
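For reference, a standalone call like the following (hypothetical usage, assuming F_JIDtjl and Max_Min are saved as .m files on the MATLAB path) draws the dynamic clustering diagram for the 3x3 toy similarity matrix from Section 2:

% Hypothetical standalone call, not part of the original script.
Rtoy=[1.0 0.8 0.3; 0.8 1.0 0.6; 0.3 0.6 1.0];
F_JIDtjl(Rtoy)   % opens a figure named 'Dynamic clustering diagram'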
function [X]=F_JISjBzh(cs,X)
% Data standardization for fuzzy cluster analysis
% X: raw data matrix; cs=0: no transform; cs=1: standard deviation transform;
% cs=2: range transform
if(cs==0) return ;end
[n,m]=size(X); % get the matrix dimensions
if(cs==1) % translation-standard deviation transform
    for(k=1:m) xk=0;
        for(i=1:n) xk=xk+X(i,k);end
        xk=xk/n;sk=0;
        for(i=1:n) sk=sk+(X(i,k)-xk)^2;end
        sk=sqrt(sk/n);
        for(i=1:n) X(i,k)=(X(i,k)-xk)/sk;end
    end
else % translation-range transform
    for(k=1:m) xmin=X(1,k);xmax=X(1,k);
        for(i=1:n)
            if(xmin>X(i,k)) xmin=X(i,k);end
            if(xmax<X(i,k)) xmax=X(i,k);end
        end
        for(i=1:n) X(i,k)=(X(i,k)-xmin)/(xmax-xmin);end
        end
end
function F_jlfx(bzh,fa,X)
% Fuzzy cluster analysis driver
% bzh: data standardization type; fa: method used to build the fuzzy similarity matrix; X: raw data matrix
X=F_JISjBzh(bzh,X);
R=F_JlR(fa,X);
[m,n]=size(R);
if(m~=n|m==0)
    return;
end
F_JIDtjl(R)
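With the helper functions on the path, the whole analysis of the main script can also be run through this driver in a single call (hypothetical usage, reusing the membership matrix x built in the main script):

% Hypothetical one-call usage of F_jlfx, equivalent to the three calls in the
% main script: standardize with cs=1, build R with method cs=8, then classify.
F_jlfx(1,8,x)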
function [R]=F_JlR(cs,X)
% Build the fuzzy similarity matrix for fuzzy cluster analysis: [R]=F_JlR(cs,X)
% X: data matrix
% cs=1:  scalar product method
% cs=2:  angle cosine method
% cs=3:  correlation coefficient method
% cs=4:  exponential similarity coefficient method
% cs=5:  max-min method
% cs=6:  arithmetic-mean-min method
% cs=7:  geometric-mean-min method
% cs=8:  general Euclidean distance method
% cs=9:  general Hamming distance method
% cs=10: general Chebyshev distance method
% cs=11: reciprocal Euclidean distance method
% cs=12: reciprocal Hamming distance method
% cs=13: reciprocal Chebyshev distance method
% cs=14: exponential Euclidean distance method
% cs=15: exponential Hamming distance method
% cs=16: exponential Chebyshev distance method
[n,m]=size(X); % get the matrix dimensions
R=[];
if(cs==1)
    maxM=0;
    pd=0; % scalar product method
    for(i=1:n)for(j=1:n)if(j~=i)x=0;
        for(k=1:m)x=x+X(i,k)*X(j,k);end
        if(maxM<x)maxM=x;end
    end;end;end
    if(maxM<0.000001)return;end
    for(i=1:n)for(j=1:n)
        if(i==j)R(i,j)=1;
        else R(i,j)=0;
            for(k=1:m)R(i,j)=R(i,j)+X(i,k)*X(j,k);end
            R(i,j)=R(i,j)/maxM;
            if(R(i,j)<0)pd=1;end
        end
    end;end
    if(pd)for(i=1:n)for(j=1:n)R(i,j)=(R(i,j)+1)/2;end;end;end
elseif(cs==2) % angle cosine method
    for(i=1:n)for(j=1:n)xi=0;xj=0;
        for(k=1:m)xi=xi+X(i,k)^2;xj=xj+X(j,k)^2;end
        s=sqrt(xi*xj);R(i,j)=0;
        for(k=1:m)R(i,j)=R(i,j)+X(i,k)*X(j,k);end
        R(i,j)=R(i,j)/s;
    end;end
elseif(cs==3) % correlation coefficient method
    for(i=1:n)for(j=1:n)xi=0;xj=0;
        for(k=1:m)xi=xi+X(i,k);xj=xj+X(j,k);end
        xi=xi/m;xj=xj/m;xis=0;xjs=0;
        for(k=1:m)xis=xis+(X(i,k)-xi)^2;xjs=xjs+(X(j,k)-xj)^2;end
        s=sqrt(xis*xjs);R(i,j)=0;
        for(k=1:m)R(i,j)=R(i,j)+abs((X(i,k)-xi)*(X(j,k)-xj));end
        R(i,j)=R(i,j)/s;
    end;end
elseif(cs==4) % exponential similarity coefficient method
    for(i=1:n)for(j=1:n)R(i,j)=0;
        for(k=1:m)xk=0;
            for(z=1:n)xk=xk+X(z,k);end
            xk=xk/n;sk=0;
            for(z=1:n)sk=sk+(X(z,k)-xk)^2;end
            sk=sk/n;R(i,j)=R(i,j)+exp(-0.75*((X(i,k)-X(j,k))/sk)^2);
        end
        R(i,j)=R(i,j)/m;
    end;end
elseif(cs<=7) % max-min, arithmetic-mean-min and geometric-mean-min methods
    for(i=1:n)for(j=1:n)fz=0;fm=0;
        for(k=1:m)
            if(X(j,k)<0)R=[];return;end
            if(X(j,k)>X(i,k))x=X(i,k);
            else x=X(j,k);end
            fz=fz+x;
        end
        if(cs==5) % max-min method
            for(k=1:m)if(X(i,k)>X(j,k))x=X(i,k);else x=X(j,k);end
            fm=fm+x;end
        elseif(cs==6)for(k=1:m)fm=fm+(X(i,k)+X(j,k))/2;end % arithmetic-mean-min method
        else for(k=1:m)fm=fm+sqrt(X(i,k)*X(j,k));end;end % geometric-mean-min method
        R(i,j)=fz/fm;
    end;end
elseif(cs<=10)C=0; % general distance methods
    for(i=1:n)for(j=i+1:n)d=0;
        if(cs==8)for(k=1:m)d=d+(X(i,k)-X(j,k))^2;end
            d=sqrt(d); % Euclidean distance
        elseif(cs==9)for(k=1:m)d=d+abs(X(i,k)-X(j,k));end % Hamming distance
        else for(k=1:m)if(d<abs(X(i,k)-X(j,k)))d=abs(X(i,k)-X(j,k));end;end;end % Chebyshev distance
        if(C<d)C=d;end
    end;end
    C=1/(1+C);
    for(i=1:n)for(j=1:n)d=0;
        if(cs==8)for(k=1:m)d=d+(X(i,k)-X(j,k))^2;end
            d=sqrt(d); % Euclidean distance
        elseif(cs==9)for(k=1:m)d=d+abs(X(i,k)-X(j,k));end % Hamming distance
        else for(k=1:m)if(d<abs(X(i,k)-X(j,k)))d=abs(X(i,k)-X(j,k));end;end;end % Chebyshev distance
        R(i,j)=1-C*d;
    end;end
elseif(cs<=13)minM=Inf; % reciprocal distance methods
    for(i=1:n)for(j=i+1:n)d=0;
        if(cs==11)for(k=1:m)d=d+(X(i,k)-X(j,k))^2;end
            d=sqrt(d); % Euclidean distance
        elseif(cs==12)for(k=1:m)d=d+abs(X(i,k)-X(j,k));end % Hamming distance
        else for(k=1:m)if(d<abs(X(i,k)-X(j,k)))d=abs(X(i,k)-X(j,k));end;end;end % Chebyshev distance
        if(minM>d)minM=d;end
    end;end
    minM=0.9999*minM;
    if(minM<0.000001)return;end
    for(i=1:n)for(j=1:n)d=0;
        if(j==i)R(i,j)=1;continue;end
        if(cs==11)for(k=1:m)d=d+(X(i,k)-X(j,k))^2;end
            d=sqrt(d); % Euclidean distance
        elseif(cs==12)for(k=1:m)d=d+abs(X(i,k)-X(j,k));end % Hamming distance
        else for(k=1:m)if(d<abs(X(i,k)-X(j,k)))d=abs(X(i,k)-X(j,k));end;end;end % Chebyshev distance
        R(i,j)=minM/d;
    end;end
else for(i=1:n)for(j=1:n)d=0; % exponential distance methods
    if(cs==14)for(k=1:m)d=d+(X(i,k)-X(j,k))^2;end
        d=sqrt(d); % Euclidean distance
    elseif(cs==15)for(k=1:m)d=d+abs(X(i,k)-X(j,k));end % Hamming distance
    else for(k=1:m)if(d<abs(X(i,k)-X(j,k)))d=abs(X(i,k)-X(j,k));end;end;end % Chebyshev distance
    R(i,j)=exp(-d);
end;end;end
end
function [C]=Max_Min(A,B)
% Max-min composition of fuzzy matrices: C(i,j) = max over k of min(A(i,k),B(k,j))
[m,s]=size(A);[s1,n]=size(B);C=[];
if(s1~=s) return ;end
for(i=1:m) for(j=1:n) C(i,j)=0;
        for(k=1:s) x=0;
            if(A(i,k)<B(k,j)) x=A(i,k);
            else x=B(k,j);end
            if(C(i,j)<x) C(i,j)=x;end
        end
    end;end
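A quick way to sanity-check Max_Min (hypothetical usage, assuming the function is on the MATLAB path) is to compose the toy matrix from Section 2 with itself and confirm that the weak direct similarity between samples 1 and 3 is raised through sample 2:

% Quick check of Max_Min on the toy matrix from Section 2.
R=[1.0 0.8 0.3; 0.8 1.0 0.6; 0.3 0.6 1.0];
R2=Max_Min(R,R);   % expected: element (1,3) rises from 0.3 to 0.6
disp(R2)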
