
Multiple get_peer requests causes 100% CPU usage #7

@Ueland

Description


Hi,

I'm playing around with this library and have tried asking it for a small batch of magnet hashes. At first everything works without any issues, but after a random number of get_peers requests for popular torrents, the CPU usage spikes to 100% and stays there until I kill the process.

I have tried running only one request per 10 seconds, running in both blocking and non-blocking mode, increasing the debug level, and moving the query limit between 10 and 1000. No matter what I change, the issue appears.

I tried to reproduce the issue with a script that simply polled the Ubuntu example hash (and some more hashes derived from it), but that did not seem to fail.
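For reference, a minimal self-contained sketch of the polling approach described above. The DHT client here is a stub: FakeDHT, its get_peers method, and the placeholder info hashes are all assumptions for illustration, not this library's actual API.

```python
import time

class FakeDHT:
    """Stub standing in for the real DHT client; the actual library API may differ."""
    def get_peers(self, info_hash):
        # Placeholder: the real call would query the DHT network for peers.
        return []

def poll_hashes(dht, info_hashes, interval=10, rounds=1):
    """Issue one get_peers request per `interval` seconds, as in the report."""
    results = {}
    for _ in range(rounds):
        for h in info_hashes:
            results[h] = dht.get_peers(h)
            time.sleep(interval)
    return results

# Placeholder 20-byte info hashes (a real DHT info hash is 20 bytes).
hashes = [bytes([i]) * 20 for i in range(3)]
peers = poll_hashes(FakeDHT(), hashes, interval=0, rounds=1)
```

With the real client swapped in for FakeDHT, this is essentially the one-request-per-10-seconds loop that still triggers the lockup.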

It appears that the last thing that happens before the lockup is that the library logs a couple of errors:

2984 nodes, 1607 goods, 59 bads | in: 51, out: 60 en 14s

ERROR:202:b'Server Error' pour (None, 1520126208.2239816, b'd1:ad2:id20:\x04Ogj^1+\xd3*Q\x0c\xa8\xdf\xcf\xe2\xea\xc2\x12?\xeae1:q4:ping1:t6:=\x08,\x902M1:y1:qe')
obj of type b'e'
8 nodes added to routing table
5 nodes added to routing table
ERROR:202:b'Server Error' pour (None, 1520126208.227913, b'd1:ad2:id20:\x04Ogj^1+\xd3*Q\x0c\xa8\xdf\xcf\xe2\xea\xc2\x12?\xeae1:q4:ping1:t6:\xd6\x92N\xd8 B1:y1:qe')
obj of type b'e'
8 nodes added to routing table
0 nodes added to routing table
7 nodes added to routing table
4 nodes added to routing table
ERROR:202:b'Server Error' pour (None, 1520126208.266789, b'd1:ad2:id20:\x04Ogj^1+\xd3*Q\x0c\xa8\xdf\xcf\xe2\xea\xc2\x12?\xeae1:q4:ping1:t6:d/ X\xaf\x911:y1:qe')
obj of type b'e'
8 nodes added to routing table
3 nodes added to routing table
8 nodes added to routing table
ERROR:202:b'Server Error' pour (None, 1520126208.3035867, b'd1:ad2:id20:\x04Ogj^1+\xd3*Q\x0c\xa8\xdf\xcf\xe2\xea\xc2\x12?\xeae1:q4:ping1:t6:1yV\x97\xc7F1:y1:qe')
obj of type b'e'
ERROR:202:b'Server Error' pour (None, 1520126208.2280884, b'd1:ad2:id20:\x04Ogj^1+\xd3*Q\x0c\xa8\xdf\xcf\xe2\xea\xc2\x12?\xeae1:q4:ping1:t6:\x8a,\n\x1f\xac\xa21:y1:qe')
obj of type b'e'
ERROR:202:b'Server Error' pour (None, 1520126208.2672093, b'd1:ad2:id20:\x04Ogj^1+\xd3*Q\x0c\xa8\xdf\xcf\xe2\xea\xc2\x12?\xeae1:q4:ping1:t6:x\xb7\xffe\x1c\xc81:y1:qe')
obj of type b'e'
ERROR:202:b'Server Error' pour (None, 1520126208.2678056, b"d1:ad2:id20:\x04Ogj^1+\xd3*Q\x0c\xa8\xdf\xcf\xe2\xea\xc2\x12?\xea6:target20:\x87\xd0\x0f\xdc\xdd'\x01,~(\xe0\x9f\xd4\x1c\\jz\xc6\x12be1:q9:find_node1:t6:\x10\xc5\x9e\xd7\x97'1:y1:qe")
obj of type b'e'
0 nodes added to routing table

I tried some profiling with Yappi; it says these two calls were the most used. Not sure if that is helpful, but:

name ncall tsub ttot tavg
...5/queue.py:90 PollableQueue.empty 187.. 21.10275 42.51144 0.000023
..dist-packages/six.py:580 iteritems 187.. 23.93679 37.79221 0.000020
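I cannot say for certain, but a Queue.empty call dominating a profile like this typically points at a busy-wait loop polling an empty queue. A dependency-free sketch with the stdlib cProfile (used here instead of Yappi, so anyone can run it) reproduces that profile signature; busy_poll is a made-up function, not anything from this library:

```python
import cProfile
import io
import pstats
import queue
import time

def busy_poll(q, deadline):
    # Tight loop repeatedly calling Queue.empty() without blocking or sleeping.
    # This is the pattern that burns 100% CPU and makes empty() dominate a
    # profile, similar to the PollableQueue.empty line in the Yappi output.
    while time.monotonic() < deadline:
        q.empty()

q = queue.Queue()
profiler = cProfile.Profile()
profiler.enable()
busy_poll(q, time.monotonic() + 0.2)  # spin for ~200 ms
profiler.disable()

buf = io.StringIO()
pstats.Stats(profiler, stream=buf).sort_stats("tottime").print_stats(10)
report = buf.getvalue()
```

If that guess is right, a blocking `q.get(timeout=...)` (or a short sleep when the queue is empty) instead of repeated `empty()` checks would avoid the spin.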

Any calls to get_peers afterwards simply yield "no peers or nodes found".

I have not seen any unusual I/O activity while this happens.

This has been tested on the latest release.

What could this be, and how can it be avoided?
