Python — Iterators and Generators

  • An iterator is an object that remembers its position in a traversal. It starts at the first element of a collection and advances until every element has been visited; it can only move forward, never backward.
  • There are two basic built-ins: iter() and next().
  • iter() creates an iterator object; next() returns the iterator's next element. Once the last element has been consumed, calling next() again raises a StopIteration exception.
  • An iterator is itself iterable. When it is iterated with for ... in, no exception surfaces: the loop stops automatically when the exception occurs.
  • An iterator reflects changes made to its source object.

Iterator is an iterator class; what we usually call an "iterator" is really an iterator object (an instance of it).

Generator is a generator class; what we usually call a "generator" is really a generator object.

Iterable is the iterable class.

Both the iterator class and the generator class are subclasses of the iterable class; in other words, both are iterable, and their instances are also instances of the iterable class.

Iterable class: common objects that can be placed in a for loop for iteration, such as lists and dictionaries, are instances of subclasses of the iterable class.

Besides being usable in a for loop, anything passed to iter() must also be iterable.

The generator class is a subclass of the iterator class. The two differ very little; a generator can be seen as a special kind of iterator.

In fact, the job of the iter() function is to produce an iterator (iterator object) from an iterable object.

To be iterable, the key requirement is to implement the __iter__() special method, which is called automatically when the object is iterated. Its body can contain almost nothing, but the method must exist, and whatever it does it must return an iterator object (the object itself or some other iterator). Lists and dictionaries, for example, implement __iter__(). A custom generator (a function using yield) does not show this method explicitly, but its parent class implements it and it is called automatically.

To be an iterator object directly, both __iter__() and __next__() are required, because an iterator must be drivable with next() directly. Put plainly: a list is an iterable object but not an iterator, so you cannot call next([1, 2, 3]) directly; you have to go through next(iter([1, 2, 3])).
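
A quick sketch of that distinction (any Python 3 interpreter):

nums = [1, 2, 3]
# next(nums)          # TypeError: 'list' object is not an iterator
it = iter(nums)       # ask the list for an iterator (calls nums.__iter__())
print(next(it))       # 1
print(next(it))       # 2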

The built-in iter() function simply invokes the object's __iter__() method at run time, so both forms work: test.__iter__() or iter(test).

The built-in next() function simply invokes the object's __next__() method at run time, so both forms work: test.__next__() or next(test).

Using next() has nothing to do with __iter__: any object that implements a __next__ method can be passed to next().
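
A minimal sketch of that point (the Countdown class is a made-up illustration): an object that defines only __next__ can be driven by next(), even though it is not iterable and cannot be used in a for loop.

class Countdown:              # only __next__, no __iter__
    def __init__(self, start):
        self.n = start
    def __next__(self):
        if self.n <= 0:
            raise StopIteration
        self.n -= 1
        return self.n + 1

c = Countdown(3)
print(next(c), next(c), next(c))   # 3 2 1
# for x in Countdown(3): ...       # TypeError: 'Countdown' object is not iterable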

A for loop also first obtains an iterator for the object being iterated (whether via the object's own __iter__ or one implemented in a parent class), and once it has it, calls __next__() repeatedly, in order, until the iteration ends.

Extension: the iterators produced by lists and dictionaries are not instances of one standard iterator class; each type derives its own, such as list_iterator.
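
For instance (observed in CPython; the exact class names are implementation details):

print(type(iter([1, 2, 3])))   # <class 'list_iterator'>
print(type(iter({'a': 1})))    # <class 'dict_keyiterator'>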

Plain-language explanation 1:

An iterable object is simply something that can be iterated; it has the potential to be iterated (it implements __iter__), and beyond that it cannot do much.

An iterator is like a data stream, similar to a view object: once created it is affected by the source data, it can only be consumed sequentially with next(), and it can also be placed in a for loop. What it really holds is an algorithm (this refers to the standard built-in iterators, not custom class objects); its defining property is lazy evaluation of a sequence, computing each value only when it is requested. Precisely because of this, it can effectively hold a sequence of unlimited length: write a custom iterator or generator around a while True loop and it can keep producing values forever.

Plain-language explanation 2: an iterator is like an algorithm plus a reference derived from some iterable object (so it occupies very little space). As the name suggests, each step the iterator applies its algorithm to compute the next value of the referenced object. For the standard built-in iterators (such as those produced by lists or dictionaries), each next() just fetches the next value; for a custom iterator (one that implements __next__ with its own logic) or a yield generator, each next() executes that logic, i.e. runs the next chunk of code.
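
A small sketch of that lazy, potentially unbounded behaviour (Evens is a made-up class):

class Evens:                 # produces even numbers forever
    def __init__(self):
        self.n = 0
    def __iter__(self):
        return self
    def __next__(self):
        value = self.n       # computed on demand; nothing is stored ahead of time
        self.n += 2
        return value

evens = Evens()
print(next(evens), next(evens), next(evens))   # 0 2 4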

A generator is a subclass of iterator. By how they are produced, generators fall into two kinds: simple generators (generator expressions) and custom generators (functions using yield).

A simple generator is just g = (x for x in range(10)). Note the parentheses rather than []; g is then a generator object.

A simple generator object is not really different from an iterator object: it is driven with next() or a for loop, it is a lazy sequence, it is affected by its source data, and so on; the only addition is some custom computation.
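
A minimal sketch:

squares = (x * x for x in range(5))   # generator expression: nothing is computed yet
print(next(squares))                  # 0, computed on demand
print(next(squares))                  # 1
print(list(squares))                  # [4, 9, 16], the rest, consumed once and then exhausted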

Custom generators are more common and have more varied uses.

Custom generator: any function that uses the yield keyword becomes a generator function, and calling that function produces a generator object.

See testGenerator below for details. For example, if def custom() uses the yield keyword, then after aaa = custom(), aaa is a generator object.

At this point, however, aaa has effectively executed nothing, which is where it differs from an ordinary function call.
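
A minimal sketch of that difference (custom here is a hypothetical function):

def custom():
    print("body started")     # not printed when custom() is called
    yield 1
    yield 2

aaa = custom()                # creates the generator object; runs none of the body
print(aaa)                    # <generator object custom at 0x...>
print(next(aaa))              # only now is "body started" printed, then 1 is yielded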

A few points to note about custom generators:

    The value after the yield keyword can be any value, variable, or expression. Each time execution pauses at a yield, that value is handed back, as the return value, to whatever drove the generator — next(), send(), or any other call that resumed it. On the next call, execution resumes from the position just to the left of that yield.

    Informally, you can think of the function custom as being split into several (possibly countless) executable segments: segment 1 runs from the start to the first yield, segment 2 from that yield to the second yield, and so on. next() executes the segments one after another. send() first assigns a value to the paused yield expression and then behaves like next().

    Because the first segment has not reached a yield yet, calling aaa.send('test') right at the start raises an error; if you want to begin with send(), it has to be aaa.send(None).

    Termination. If the loop inside the generator is not infinite but has an exit condition, then when that condition is met and the loop ends, the driving code gets a StopIteration error. So if there is an exit condition, wrap the next()/send() calls in exception handling.

Put another way: once an iterator has reached its last element, calling next() again raises an error, whereas iterating it with for ... in does not.

If you want the return value of the whole generator function, you have to catch the StopIteration error: the return value is carried in StopIteration's value attribute.
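
A minimal sketch of both points (averager is a made-up generator that also accepts values via send()):

def averager():
    total, count = 0, 0
    while True:
        x = yield
        if x is None:             # a None signals "stop"
            return total / count  # becomes StopIteration.value
        total += x
        count += 1

g = averager()
next(g)                           # prime the generator first (or g.send(None))
g.send(10)
g.send(20)
try:
    g.send(None)
except StopIteration as e:
    print(e.value)                # 15.0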

Length of an iterator: an iterator cannot report its length in the normal way.

~. For simple cases like iter([1,2,3]) or (x+2 for x in range(100)), there are two ideas for getting a length: 1. convert to a list first and measure it, len(list(iterator)) — this does some pointless work; 2. the sum approach, sum(1 for _ in iterator); 3. or wrap it up:

def get_length(generator):
    if hasattr(generator, "__len__"):
        return len(generator)
    else:
        return sum(1 for _ in generator)

~. For iterators with custom logic or an infinite loop, the only option is to keep a counter variable of your own.
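
One way to keep such a counter while consuming (a sketch; count_up is a hypothetical infinite generator):

def count_up():                  # infinite generator
    n = 0
    while True:
        yield n
        n += 1

for seen, value in enumerate(count_up(), start=1):
    if value >= 5:               # some stop condition of your own
        break
print(seen)                      # 6 items were produced (0 through 5)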

Python's for loop is essentially implemented by repeatedly calling next(). For example:

for x in [1, 2, 3, 4, 5]:
    pass

is fully equivalent to:

it = iter([1, 2, 3, 4, 5])
# the loop:
while True:
    try:
        x = next(it)
    except StopIteration:
        # exit the loop when StopIteration is raised
        break

The following pieces of code are worth reading through; they should make things easier to understand.

You can put these functions into a single file for testing; be sure to import the classes listed at the top.

# These classes need to be imported first
from collections.abc import Iterator
from collections.abc import Iterable
from collections.abc import Generator
import time
# A short example borrowed from a tutorial site, for reference
 
import sys         # import the sys module

lst = [1, 2, 3, 4]
it = iter(lst)     # create an iterator object

while True:
    try:
        print(next(it))
    except StopIteration:
        sys.exit()
class MyNumbers:
    # def __init__(self):
    #     self.a = 88

    def __iter__(self):
        self.a = 1
        # It can return anything, as long as what it returns is an iterator object (or an instance of a subclass of Iterator)
        return iter([1,2,3])
        # return self
        # return (i for i in range(100))

    def __next__(self):
        x = self.a
        self.a += 1
        return x

def testIterator():
    myclass = MyNumbers()
    print(isinstance(myclass, Iterable))  # True
    print(issubclass(MyNumbers, Iterable))  # True
    print(isinstance(myclass, Iterator))  # True
    print(issubclass(MyNumbers, Iterator))  # True
    # Create an instance. The type of myclass here is <class '__main__.MyNumbers'>, but it already counts as an iterator object, effectively an instance of Iterator.
    # In fact there is no need to call iter() on it again: the instance is itself an iterator and can be used directly with next(myclass).

    myiter = iter(myclass)  # produce an iterator; the value returned is whatever __iter__ returns.

    print(type(myclass))
    print(type(myiter))
    # for x in myiter:
    #     print(x)
    print(myiter.__next__())
    print(myiter.__next__())
    print(next(myiter))
def testIterator3():
    # Test the relationship between iterator, generator and iterable.
    # Iterable > Iterator > Generator: a generator is always an iterator and therefore always iterable,
    # but the reverse is not guaranteed. A list, for example, is an iterable object (its class
    # implements __iter__), but it is not an iterator.
    # a list
    arr = [1,2,3,4]
    # create an iterator
    ii = iter(arr)
    # create a generator
    gg = (x for x in [5,6,7,8])

    print(type(ii)) #<class 'list_iterator'>
    print(type(gg)) #<class 'generator'>

    print(isinstance(arr,Iterable)) #True, a list is an iterable object
    print(isinstance(ii,Iterable)) #True, an iterator is an iterable object
    print(isinstance(gg,Iterable)) #True, a generator is an iterable object

    print(isinstance(arr,Iterator)) #False, a list is not an iterator object, i.e. a list is not an iterator
    print(isinstance(ii,Iterator)) #True, an iterator is an iterator object
    print(isinstance(gg,Iterator)) #True, a generator is an iterator object
    
    print(issubclass(Iterator,Iterable))    #True, the Iterator class is a subclass of Iterable
    print(issubclass(Generator,Iterator))    #True, the Generator class is a subclass of Iterator
def testIterator2():
    # Test nested iter() calls and whether the iterator reflects changes to the original data
    arr = [6,7,8,9]
    arrIter = iter(arr)
    arrIter2 = iter(arrIter)  # calling iter() on an iterator yields an iterator again
    arrIter3 = iter(arrIter2)

    arr[2] = 888
    print(arr)    #[6, 7, 888, 9]
    print(arrIter)      #<list_iterator object at 0x00000265B935FEB0>
    print(arrIter2)     #<list_iterator object at 0x00000265B935FEB0>
    print(arrIter3)     #<list_iterator object at 0x00000265B935FEB0>
    print(next(arrIter),next(arrIter),next(arrIter))   # 6 7 888  the iterator's values are affected
    print(next(arrIter2))    #  9
    # Iterating an iterator again returns the iterator itself, so all of these refer to the same object, much like variable references.
#!/usr/bin/python3
 
import sys
 
def fibonacci(n): # generator function: Fibonacci
    a, b, counter = 0, 1, 0
    while True:
        if (counter > n): 
            return
        yield a
        a, b = b, a + b
        counter += 1
f = fibonacci(10) # f is an iterator, produced by the generator function
 
while True:
    try:
        print (next(f), end=" ")
    except StopIteration:
        sys.exit()
def testGenerator():
    # Learn the "concurrent" use of generators through a bun producer/consumer example
    def consumer(name):  # consumer
        print("%s is ready to eat buns!" % name)
        while True:
            baozi = yield "I come after the yield; every time execution reaches the yield I get returned"
            print("Bun [%s] arrived and was eaten by [%s]!" % (baozi, name))

    def producer():  # producer
        c = consumer('A')
        c2 = consumer('B')

        # At this point c and c2 are both generator objects; neither has executed any code yet.
        print(type(c))
        print(type(c2))

        # print(c.__next__())
        print(next(c2))
        time.sleep(3)
        print(next(c2))
        time.sleep(3)
        print(next(c2))

        # print(c.send(None))
        # print(c.send(123))
        # time.sleep(3)
        # print(c.send(456))
        # print(c.send(789))

        print("开始做包子了!")
        for i in ["韭菜馅","茴香馅","鸡蛋馅","猪肉馅"]:
            time.sleep(1.5)
            print("做了两个个包子")
            c.send(i)       #------------------------
            c2.send(i)      # .send(i):先给yield发送值再next

    producer()
  • One more point: a small test of memory footprint, adapted from a blogger's article: https://www.cnblogs.com/yinsedeyinse/p/11848287.html
import os
import psutil

def show_memory_info(hint):
    pid = os.getpid()
    p = psutil.Process(pid)

    info = p.memory_full_info()
    memory = info.uss / 1024. /1024
    print('{} memory used:{}MB'.format(hint,memory))

def test_iterator():
    show_memory_info('initing iterator')
    list_1 = [i for i in range(100000000)]
    show_memory_info('after iterator initiated')
    print(sum(list_1))

def test_generator():
    show_memory_info('initing generator')
    list_2 = (i for i in range(100000000))
    show_memory_info('after generator initiated')
    print(sum(list_2))
    show_memory_info('after sum called')

test_iterator()
test_generator()

Output:

initing iterator memory used:7.21875MB
after iterator initiated memory used:1848.28515625MB
4999999950000000
initing generator memory used:1.7109375MB
after generator initiated memory used:1.7421875MB
4999999950000000
after sum called memory used:2.109375MB

