# Chapter 42: Why Is Quicksort Fast?

Good heavens. My front-end studies are on hold for the moment; I need to prepare for a small game project and review some earlier material.

"Why is quicksort fast?" That was the soul-searching question a senior classmate hit me with last time. I had forgotten; to be precise, I had never fully understood it. So let's work it out today.

Here is the simple quicksort from my earlier C study (it reads four numbers, sorts them, and prints them in ascending order):

```c
#include <stdio.h>

int a[5];   /* global array; slots 1..4 are used */

void quicksort(int left, int right)   /* quicksort */
{
    int i, j, t, temp;   /* i: left pointer, j: right pointer, temp: pivot */
    if (left > right)
        return;
    temp = a[left];      /* the leftmost element is the pivot */
    i = left;
    j = right;
    while (i != j)
    {
        while (a[j] >= temp && i < j)   /* scan right-to-left for a number smaller than the pivot */
            j--;
        while (a[i] <= temp && i < j)   /* scan left-to-right for a number larger than the pivot */
            i++;

        if (i < j)   /* i and j have not met yet: swap the two elements */
        {
            t = a[i];
            a[i] = a[j];
            a[j] = t;
        }
    }

    a[left] = a[i];      /* put the pivot into its final position */
    a[i] = temp;

    quicksort(left, i - 1);    /* recurse on the part left of the pivot */
    quicksort(i + 1, right);   /* recurse on the part right of the pivot */
}

int main(void)
{
    int i;

    printf("Please enter four numbers: ");   /* input */
    for (i = 1; i <= 4; i++)
        scanf("%d", &a[i]);

    quicksort(1, 4);     /* run the quicksort */

    for (i = 1; i <= 4; i++)   /* print the result */
        printf("%d ", a[i]);
    getchar();
    return 0;
}
```

Different sorting algorithms are compared along two dimensions: the **time** and the **space** a program consumes while it runs. Let's take time first.

1. Time complexity

If there is a function f(n) such that, as n approaches infinity, the limit of T(n)/f(n) is a constant other than zero, then f(n) is of the same order of magnitude as T(n). We write T(n) = O(f(n)), and call O(f(n)) the asymptotic time complexity of the algorithm, or simply its time complexity.

Put simply, it is the time an algorithm or program consumes while it runs (or the total number of times its statements execute).
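As a concrete instance of asymptotic complexity (a worked example of my own): if the limit of T(n)/f(n) as n → ∞ is a non-zero constant, then T(n) = O(f(n)). Take T(n) = 3n² + 2n and f(n) = n²:

```latex
\lim_{n \to \infty} \frac{T(n)}{f(n)}
  = \lim_{n \to \infty} \frac{3n^2 + 2n}{n^2}
  = \lim_{n \to \infty} \left( 3 + \frac{2}{n} \right)
  = 3 \neq 0
\quad \Longrightarrow \quad T(n) = O(n^2)
```

The lower-order 2n term has no effect on the limit, which is exactly why only the dominant term survives in the O() notation.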

2. The running time of a recursive algorithm satisfies the recurrence T[n] = a T[n/b] + f(n). (As for where that comes from, I'll write about it next time.) In quicksort's best case each pass splits the array in half, so a = 2, b = 2 and f(n) = n.

The derivation below is adapted from: https://blog.csdn.net/qfikh/article/details/52870875

T[n] = 2 T[n/2] + n                                               ------ first recursion

Substituting n → n/2:

T[n] = 2 { 2 T[n/4] + n/2 } + n                                   ------ second recursion
     = 2^2 T[n/2^2] + 2n

Substituting n → n/2^2:

T[n] = 2^2 { 2 T[n/2^3] + n/2^2 } + 2n                            ------ third recursion
     = 2^3 T[n/2^3] + 3n

......

Substituting n → n/2^(m-1):

T[n] = 2^m T[1] + mn                                              ------ m-th recursion (the last one)

The iteration stops when the halves can no longer be halved, that is, when the formula bottoms out at T[1], which is a constant.

From T[n/2^m] = T[1] we get n = 2^m, hence m = log n. (I think of m as the number of levels of the recursion tree.)

T[n] = 2^m T[1] + mn, with m = log n, so:

T[n] = 2^(log n) T[1] + n log n = n T[1] + n log n = n + n log n, where n is the number of elements.

Comparing n with n log n, the lower-order n term is dropped.

In summary:

<1> In the best case, quicksort's time complexity is O(n log n). (Brilliant!)

<2> Its average time complexity is also O(n log n). (I don't know how that one is derived; I'll have to ask the senior classmate.)

<3> There is also the worst case (the pivot happens to be the largest or smallest value every time), in which the time complexity is O(n*n).

3. All in all, quicksort is fast because its time complexity is low; it crushes bubble sort's average O(n*n).


end.