I'm trying to fetch URLs with multiple threads and save the results to a CSV file, but when multi-threading is in use, rows are not saved to the CSV correctly. What can I do? Do I have to add time.sleep so that two threads don't save at the same time? I'd appreciate any help, thanks.
This is the script:
import requests
import csv
import json
from concurrent.futures import ThreadPoolExecutor, as_completed
from time import time

url_list = []
with open('bookslinks.csv', newline='') as f:
    reader = csv.reader(f)
    # urls = list(reader)
    for row in reader:
        url_list.append(row[0])

def download_file(url):
    # Take everything after the first "=" in the URL
    after = url[url.index("=") + 1:]
    bugeycheck = "https://xxxx/" + after + "data=" + after
    j = json.loads(requests.get(bugeycheck, timeout=20, stream=True).content)
    dataname = "bbbb/" + str(after) + "bbbb" + ".txt"  # unused
    print(j["xx"])
    # Every thread appends to the same file here
    with open('beneficiary.csv', 'a') as newFile:
        newFileWriter = csv.writer(newFile)
        newFileWriter.writerow([after, j["xx"]])

start = time()
processes = []
with ThreadPoolExecutor(max_workers=100) as executor:
    for url in url_list:
        processes.append(executor.submit(download_file, url))
    for task in as_completed(processes):
        print("test")
print(f'Time taken: {time() - start}')
This is the Python console output when multi-threading is used:
#####
false
false
falsefalse
false
####
false = one row added to the CSV file
false = one row added to the CSV file
falsefalse = two prints interleaved, but only one row added to the CSV file
false = one row added to the CSV file
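Since all the worker threads append to the same beneficiary.csv file at once, I suspect the writes are racing with each other. For context, here is a minimal sketch of the fix I'm considering: guarding the shared CSV write with a threading.Lock so only one thread appends at a time. The lock name and the worker function are my own illustration, not from the script above.

```python
import csv
import threading
from concurrent.futures import ThreadPoolExecutor

# One lock shared by every thread (illustrative name).
csv_lock = threading.Lock()

def save_row(row):
    # Serialize access to the shared file: without the lock,
    # concurrent appends can interleave or lose rows.
    with csv_lock:
        with open('beneficiary.csv', 'a', newline='') as f:
            csv.writer(f).writerow(row)

def worker(n):
    # Stand-in for the real download; just saves one row.
    save_row([n, "false"])

with ThreadPoolExecutor(max_workers=10) as executor:
    for i in range(20):
        executor.submit(worker, i)

# After the pool shuts down, the file should hold one row per task.
with open('beneficiary.csv', newline='') as f:
    print(sum(1 for _ in csv.reader(f)))
```

Would this kind of locking be the right approach here, or is there a better pattern (like collecting results and writing them from the main thread)?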
question from:
https://stackoverflow.com/questions/65940124/multi-threading-andd-save-csv-file