Chapter 42: Why is quicksort fast?

Good heavens.

My front-end studies are on hold for a while; I need to prepare for a small competition, so I'm reviewing some older material.

"Why is quicksort fast?" That was the soul-searching question a senior asked me last time. I had forgotten, or more accurately, I had never fully understood it.

So today, let's dig into it.

Here is a plain, no-frills quicksort in C (it reads four numbers, sorts them, and prints them in ascending order):

#include<stdio.h>
int a[5];

void quicksort(int left,int right) // quicksort
{
    int i,j,t,temp; // i: left scan index, j: right scan index, temp: pivot
    if(left>right)
        return;
    temp=a[left];
    i=left;
    j=right;
    while(i!=j)
    {
        while(a[j]>=temp && i<j) // scan from the right for a number smaller than the pivot
            j--;
        while(a[i]<=temp && i<j) // scan from the left for a number larger than the pivot
            i++;

        if(i<j)    // i and j have not met: swap the two elements
        {
            t=a[i];
            a[i]=a[j];
            a[j]=t;
        }
    }

    a[left]=a[i];    // put the pivot into its final position
    a[i]=temp;

    quicksort(left,i-1);     // recurse on the part left of the pivot
    quicksort(i+1,right);    // recurse on the part right of the pivot
}

int main()
{
    int i;

    printf("Enter four numbers: "); // input
    for(i=1;i<=4;i++)
        scanf("%d",&a[i]);

    quicksort(1,4);  // call quicksort

    for(i=1;i<=4;i++)    // print the result
        printf("%d ",a[i]);
    getchar();
    return 0;
}

Different sorting algorithms are compared along two dimensions: the time and the space a program consumes while it runs.

Let's start with time complexity.

1. Time complexity

If there exists a function f(n) such that, as n approaches infinity, the limit of T(n)/f(n) is a nonzero constant, then f(n) is said to be of the same order of magnitude as T(n).

This is written T(n) = O(f(n)), and O(f(n)) is called the asymptotic time complexity of the algorithm, or "time complexity" for short.

Put simply, it is the time an algorithm or program consumes while running (or the total number of times its code is executed).

2. Time complexity of the recursion

The time-complexity recurrence for a recursive algorithm is T[n] = aT[n/b] + f(n); for this quicksort, a = 2, b = 2, and f(n) = n. (Where the recurrence comes from will be covered next time.)

Here is the derivation (adapted from https://blog.csdn.net/qfikh/article/details/52870875):

    T[n] = 2T[n/2] + n                                  (first recursion)

    Substituting n = n/2:
    T[n] = 2{ 2T[n/4] + n/2 } + n
         = 2^2 T[n/2^2] + 2n                            (second recursion)

    Substituting n = n/2^2:
    T[n] = 2^2 { 2T[n/2^3] + n/2^2 } + 2n
         = 2^3 T[n/2^3] + 3n                            (third recursion)

    ......

    Substituting n = n/2^(m-1):
    T[n] = 2^m T[1] + mn                                (m-th recursion; the iteration ends after m steps)

    The halving stops when the pieces can no longer be split in half: once the formula bottoms out at T[1], the iteration is finished (T[1] is a constant).

    From T[n/2^m] = T[1] we get n = 2^m, hence m = log n (m can be read as the depth of the recursion tree).

    T[n] = 2^m T[1] + mn, where m = log n
    T[n] = 2^(log n) T[1] + n log n = n T[1] + n log n = n + n log n, where n is the number of elements

    Compared with n log n, the n term grows more slowly, so it is dropped.

In summary:

    <1> In the best case, quicksort's time complexity is O(nlogn). (Impressive!)

    <2> Its average time complexity is also O(nlogn). (I don't know how that is derived yet; I'll have to ask the senior.)

    <3> There is also the worst case (the pivot is always the largest or smallest element, e.g. the input is already sorted), where the time complexity is O(n*n).

3. In short, quicksort is fast because its time complexity is low: it crushes bubble sort's average O(n*n).

The end.
