
Increase speed 50x by using set() #28

Open
plaber opened this issue Mar 1, 2022 · 30 comments

Comments

@plaber

plaber commented Mar 1, 2022

If you change the code to cache the list of wallets in RAM as a set:

    # Read the whole wallet list once and keep it in RAM as a set,
    # so each membership check is a hash lookup instead of a scan.
    line = open('bit.txt').readlines()
    line.sort()                 # optional; the set alone is what speeds up lookups
    lset = set(line)
    while True:
        i = i + 1
        # .................. skip some code .....................
        #print('Worker ' + str(r) + ':' + str(i) + '.-  # ' + pub + ' # -------- # ' + priv + ' # ')
        pub = pub + '\n'        # readlines() keeps trailing newlines, so append one to the probe too
        if pub in lset:
            msg = "\nPublic: " + str(pub) + " ---- Private: " + str(priv) + " YEI"
            text = msg

Now it shows 1200 K/s instead of 45 K/s for a list of 1M wallets.
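
For context on where the speedup comes from: the "in" operator on a Python list scans entries one by one, while on a set it is a single hash lookup on average. Below is a minimal, self-contained timing sketch (it builds a fake one-million-entry address list instead of reading bit.txt):

    import time

    # Fake 1M-entry "database" standing in for bit.txt, just to time the lookup.
    addresses = ['1' + str(n).zfill(33) for n in range(1_000_000)]
    as_list = list(addresses)
    as_set = set(addresses)

    probe = 'address-not-in-the-database'

    start = time.perf_counter()
    for _ in range(1_000):
        _ = probe in as_list      # scans up to all 1M entries every time
    print('list:', round(time.perf_counter() - start, 3), 's')

    start = time.perf_counter()
    for _ in range(1_000):
        _ = probe in as_set       # one hash probe every time
    print('set: ', round(time.perf_counter() - start, 6), 's')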

@jomagalo

jomagalo commented Mar 12, 2022

Once modified to load the database into RAM as a set, it gives me 540 K/s per core. Running on a Raspberry Pi with 4 cores, that is 2160 K/s in total for a list of 33M wallets.

@Plaztomad

@plaber @jomagalo Could you please share the whole code once edited? Thank you.

@jomagalo

@Plaztomad I can share it, but it is not enough to actually achieve anything. Besides, there is other software that uses GPUs and is far more powerful, and even that achieves nothing because of the astronomical number of possible combinations. All of this is just for experimentation.

@Plaztomad

@jomagalo I'm aware of that, but thanks for pointing it out.

@jomagalo

jomagalo commented Mar 30, 2022

@Plaztomad Modified file attached. Try it and tell me what you think.
seekanddestroy.zip

Everything has been tested on Ubuntu Server 20.04 on a Raspberry Pi 4 and it works perfectly.

Improvements made:

  • Database with 33M wallets
  • Speed increase by loading the database into RAM as a set
  • Send an email on success
  • 2160 K/s for a list of 33M wallets

Hardware requirements:

  • 8GB RAM minimum
  • 4 cores minimum

The path to the database is set on line 22.

The database I use contains about 33 million wallets. Download it from here: database

Lastly, the database most likely contains both compressed and uncompressed addresses, while this software only generates uncompressed addresses. That function would need to be added.
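
Not the attached script itself, but a minimal sketch of the "load in RAM (set)" idea, assuming each file under the database directory is a pickled collection of address strings (the directory layout and glob pattern are assumptions):

    import glob
    import pickle

    DB_DIR = 'database/MAR_15_2021'           # path mentioned in this thread
    addresses = set()

    # Union every pickled part into one in-memory set, so each address lookup
    # later is a single hash probe instead of a file or list scan.
    for part_path in sorted(glob.glob(DB_DIR + '/*')):
        with open(part_path, 'rb') as f:      # binary mode is required for pickle
            addresses |= set(pickle.load(f))  # each part is assumed to hold address strings
        print('loaded', part_path, '-', len(addresses), 'addresses so far')

    print('total addresses in RAM:', len(addresses))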

@Plaztomad

Working great! Thanks for your support.

@ImpAsh13

@Plaztomad Modified file attached. Try it and tell me what you think. seekanddestroy.zip

The path to the database is set on line 22.

The database I use contains about 33 million wallets. Download it from here: database

Lastly, the database most likely contains both compressed and uncompressed addresses, while this software only generates uncompressed addresses. That function would need to be added.

I get an error:
F:\BitBruteForce-Wallet-master>python seekanddestroy.py
reading database: 35/35 Traceback (most recent call last):
File "F:\BitBruteForce-Wallet-master\seekanddestroy.py", line 133, in
database[3] = database[3] | set(pickle.load(file))
_pickle.UnpicklingError: could not find MARK

@jomagalo

jomagalo commented Mar 30, 2022

@ImpAsh13 I run it on Ubuntu 20.04 and it works perfectly.

Does it work with the original code?
Have you installed the requirements?
Are you using Python 3?
Did you download the database to the correct path?
database/MAR_15_2021

@Plaztomad

Plaztomad commented Mar 30, 2022

Hey @jomagalo, is there any way to create a database from the list of addresses you can find here?

@jomagalo

@Plaztomad Yes, but you have to convert it to pickle format.
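
For what it's worth, a minimal sketch of one way to do that conversion, assuming the source is a plain text/CSV export with one address per line (file names are hypothetical):

    import pickle

    INPUT_TXT = 'addresses.txt'         # hypothetical input: one address per line
    OUTPUT_PICKLE = 'addresses.pickle'  # hypothetical output part

    # Strip newlines and empty lines, then dump the list as a pickle that can
    # later be read back with pickle.load() and turned into a set.
    with open(INPUT_TXT) as f:
        addresses = [line.strip() for line in f if line.strip()]

    with open(OUTPUT_PICKLE, 'wb') as f:    # 'wb': pickle is a binary format
        pickle.dump(addresses, f)

    print('wrote', len(addresses), 'addresses to', OUTPUT_PICKLE)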

@ImpAsh13

I run it on Ubuntu 20.04 and it works perfectly.

Does it work with the original code?
Have you installed the requirements?
Are you using Python 3?
Did you download the database to the correct path? database/MAR_15_2021

I run it on Windows 10 Pro.
With the original code it works.
All requirements are downloaded and installed, except for the 3 billion years.
Python 3.10.
The database loads, all 34 parts. After that, 10 seconds pass and the error message appears.
I will try running it on Ubuntu.

@ImpAsh13

I tracked down the problem on my computer and was able to reproduce it.
It was caused by the paging file being disabled.

@jomagalo

jomagalo commented Mar 31, 2022

@ImpAsh13 What type of processor and how many cores does it have?

Loading the entire database into memory requires at least 8GB of RAM.
I don't think it will work with less than 8GB of RAM.

@ImpAsh13

@jomagalo
I have an Intel(R) Core(TM) i7-7700 CPU @ 3.60GHz with 8 cores and 16 GB of RAM.
A swap file is needed for pickle.
Now the speed of each core is ~1200 K/s.

@jomagalo

@ImpAsh13 It may be the version of Python you have installed; I use Python 3.8.10 and everything works fine.

Check also this part that I have updated: #28 (comment)

@jomagalo

Wow, 1200 K/s per core!

@ImpAsh13

@jomagalo
Here #28 (comment) I wrote that everything worked for me after I enabled the swap file.

@jomagalo

OK, I hadn't seen it.

@jomagalo

Watch out for the email-sending configuration: you must enter your own details there. Don't use Gmail, it seems to give problems; with any other mail server it works fine.
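
Not the attached script's exact code, but a minimal sketch of a success-notification email with Python's standard smtplib, with placeholder server and credentials (Gmail typically requires an app password or OAuth, which is likely why it causes trouble here):

    import smtplib
    from email.message import EmailMessage

    SMTP_HOST = 'smtp.example.com'      # placeholder: your provider's SMTP server
    SMTP_PORT = 587
    SMTP_USER = 'you@example.com'       # placeholder credentials
    SMTP_PASS = 'your-password'

    def send_success_mail(body):
        # Build a plain-text message addressed to yourself.
        msg = EmailMessage()
        msg['Subject'] = 'seekanddestroy: match found'
        msg['From'] = SMTP_USER
        msg['To'] = SMTP_USER
        msg.set_content(body)

        # Most providers expect STARTTLS on port 587 before logging in.
        with smtplib.SMTP(SMTP_HOST, SMTP_PORT) as server:
            server.starttls()
            server.login(SMTP_USER, SMTP_PASS)
            server.send_message(msg)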

@Plaztomad

@Plaztomad Yes, but you have to convert it to pickle format.

Any hint or guideline on how to do that, please?

@jomagalo

@Plaztomad Sorry, I have no info on that. I took the database as-is.

@ImpAsh13

I updated the database, but how do I check it? :)

@jomagalo

@ImpAsh13 Uncomment the 3 lines in the "TEST" part of the code.

@ImpAsh13

ImpAsh13 commented Apr 1, 2022

@jomagalo
This is what it prints for me:
Worker 0:27654.- # 1GZZ95CPME4hpqkxUXvH7ws5sYVcRTWRPm # -------- # 5JTsQHSr2UF9ZYmmoRgNqGeUfTKYXEtYN5keLcEYMMkeZjkWcv7 #
A random address and key.

Is that normal? :)

@jomagalo

jomagalo commented Apr 1, 2022

@ImpAsh13 I don't know what kind of check you want to do. If you want to verify that the database loaded into memory works, just uncomment lines 65, 66 and 67. After 5k random checks, that code looks up the wallet 1L1aAhBakQGrhzdJG4iJMp77c7TqVk5bWj in the in-memory database, and since it obviously exists, it will show you its data.

Line 69 of the code shows the internal operation, that is, it prints each of the random wallets it generates.
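
A hypothetical stand-alone version of that kind of sanity check, loading a single pickle part (the part file name is an assumption) and confirming a known wallet is found in it:

    import pickle

    KNOWN_ADDRESS = '1L1aAhBakQGrhzdJG4iJMp77c7TqVk5bWj'   # example used in this thread
    PART_PATH = 'database/MAR_15_2021/part_01.pickle'      # hypothetical part name

    # Load one part into a set and check membership for the known address.
    with open(PART_PATH, 'rb') as f:
        part = set(pickle.load(f))

    print(KNOWN_ADDRESS, 'found' if KNOWN_ADDRESS in part else 'NOT found')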

@ImpAsh13

ImpAsh13 commented Apr 1, 2022

@jomagalo
I want to check the database of addresses I connected as a pickle file.
Did it really load in the right format? I converted it from a CSV file to pickle.
Is it possible to display a single line or entry from a given pickle file?

@jomagalo

jomagalo commented Apr 1, 2022

@ImpAsh13 You can do what I said in the previous post: take a wallet that you know is included in the pickle file and paste it on line 67 of the code. You must uncomment lines 65, 66 and 67.
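
Alternatively, to answer the question about displaying entries directly: a minimal sketch that loads one pickle part and prints its first few entries, assuming each part holds a list or set of address strings (the file name is hypothetical):

    import itertools
    import pickle

    PART_PATH = 'database/MAR_15_2021/part_01.pickle'   # hypothetical part name

    with open(PART_PATH, 'rb') as f:
        entries = pickle.load(f)

    # Show what the part actually contains: its type, size and a few sample entries.
    print(type(entries), len(entries), 'entries')
    for entry in itertools.islice(entries, 5):
        print(repr(entry))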

@arabijo

arabijo commented Jan 9, 2023

@Plaztomad I can share it, but it is not enough to actually achieve anything. Besides, there is other software that uses GPUs and is far more powerful, and even that achieves nothing because of the astronomical number of possible combinations. All of this is just for experimentation.

Hi, I've got a lot of GPUs. Where can I find this software? Thank you.

@band1k1

band1k1 commented Mar 31, 2023

@jomagalo When I created my own pickle file and replaced the 34 files from the database folder, I get an error:

reading database: 1/2 Traceback (most recent call last):
File "/seekanddestroy.py", line 102, in
database[1] = database[1] | set(pickle.load(file))

_pickle.UnpicklingError: invalid load key, '\x00'.

Can you tell me how to fix it?
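
As a general note (not specific to this script): "invalid load key" usually means the file does not start with a valid pickle opcode, i.e. it was not written with pickle.dump() or was written/copied in text mode. A small sketch for checking this, with a hypothetical file name:

    import pickle

    PART_PATH = 'database/my_part.pickle'   # hypothetical: the part you created

    # A pickle written with protocol 2 or higher normally starts with b'\x80'.
    with open(PART_PATH, 'rb') as f:
        print('first bytes:', f.read(8))

    # Correct round trip for comparison: dump in 'wb', load in 'rb'.
    with open('roundtrip.pickle', 'wb') as f:
        pickle.dump(['example-address-1', 'example-address-2'], f)
    with open('roundtrip.pickle', 'rb') as f:
        print(pickle.load(f))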

@plaber
Author

plaber commented Dec 30, 2023

My code for 1000 K/s:

seekanddestroy.py.txt
