sensenmann
Jun 14, 2023
Since nothing lasts forever and the site could be gone any day (I hope not), I decided to archive some threads. I didn't find any guide here on how to do it, so I am posting my method.

There is probably a faster and more automated method, but this has worked well for me.

Here are the things you need:
  • The browser add-on SingleFile or SingleFileZ (I use Firefox, but they are also available for Chrome).
  • Python to run the script.
You can use SingleFile to archive a single page, but if you want to archive a whole thread at once, you will need a script that generates the URLs of every page so you can paste them into SingleFile.

The Python script generates all the URLs and writes them to a text file. Open Notepad and put in the script, then put the template of the thread URL in base_text = "url-here", and finally set the number of pages in num_strings = 123.
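
The original script was shared as a screenshot, so here is only a rough sketch of what such a URL generator can look like, built around the base_text and num_strings names mentioned above; the "/page-2", "/page-3" suffix format is an assumption about how the thread pages are numbered:

    # Rough reconstruction of the URL-generator script (the original was
    # only shared as a screenshot). The "page-N" suffix is an assumption.

    base_text = "url-here"   # template of the thread URL (its first page)
    num_strings = 123        # number of pages in the thread

    # Page 1 is the bare thread URL; later pages get a /page-N suffix.
    urls = [base_text]
    urls += [f"{base_text.rstrip('/')}/page-{n}" for n in range(2, num_strings + 1)]

    # Write one URL per line into urls.txt, next to the script.
    with open("urls.txt", "w") as f:
        f.write("\n".join(urls))

    print(f"Wrote {len(urls)} URLs to urls.txt")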

Save the script with a .py extension, then double-click it, or open a terminal in the script's directory and run python scriptname.py. You will get a .txt file in the same directory with all the URLs.
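
For example, with a hypothetical thread URL in base_text and num_strings = 3, the generated text file would contain one URL per line, something like:

    https://example.com/threads/some-thread.123/
    https://example.com/threads/some-thread.123/page-2
    https://example.com/threads/some-thread.123/page-3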

The last step is to copy all the URLs from the .txt file and paste them into SingleFile/SingleFileZ using the "Batch save URLs" option. It will download every page as a .html file containing the pictures, text, etc.

EDIT: removed Selenium from the requirements since it is not actually needed; it is only for automation.

Examples (attached screenshots): Url, Pages, Script, Text, Batch
Last edited:
Reactions: donxtwait, Praestat_Mori, ztem and 2 others
アホペンギン
Jul 10, 2023
This is quite interesting. It would be very useful in the case of SS being taken down. Unfortunately, though, you can't do shit on a phone, so I screenshot things that I want to keep, lol.
 
Reactions: Praestat_Mori, Lost in a Dream and sensenmann
Darkover
Jul 29, 2021
Someone already did a backup of sasu on MediaFire about a year ago; they used a web crawler. I can't find it now even though I've searched high and low for it. I have it on an old laptop that broke down.
 
Reactions: アホペンギン and Praestat_Mori
アホペンギン
Jul 10, 2023
Darkover said: Someone already did a backup of sasu on MediaFire about a year ago; they used a web crawler. I can't find it now even though I've searched high and low for it. I have it on an old laptop that broke down.
That's awful… I hope someone else makes a backup of sasu too, because you lost the one you had.
 
