I'm trying to use Selenium to scrape a set of websites that need to be scrolled down and then have a button clicked. Each URL has the same structure, but requires a different number of clicks.
My code:
    import time
    from selenium import webdriver

    wd = webdriver.Chrome()

    for url in url_list:
        while True:
            wd.get(url)
            last_height = wd.execute_script("return document.body.scrollHeight")
            while True:
                wd.execute_script("window.scrollTo(0, document.body.scrollHeight);")
                # time.sleep = time for waiting
                time.sleep(3)
                new_height = wd.execute_script("return document.body.scrollHeight")
                if new_height == last_height:
                    break
                last_height = new_height
            next_button = wd.find_element_by_link_text('next >>')
            next_button.click()
However, the code finishes only the first URL and then raises a NoSuchElementException; it doesn't continue the loop. Sometimes, if I change the URL list, it stops in the middle of the loop with an ElementClickInterceptedException instead.
My goal is to ignore these errors so the loop continues through the whole URL list and finishes. How can I improve the code? Thanks in advance.
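For context, the pattern I think I need is to wrap the find/click step in try/except and break out to the next URL when the "next >>" link is gone, instead of letting the exception propagate. A minimal sketch of that control flow, with a hypothetical `click_next()` standing in for the real Selenium find/click calls and a stand-in exception class in place of `selenium.common.exceptions.NoSuchElementException` (assumptions, not my actual scraper):

```python
class NoSuchElementException(Exception):
    """Stand-in for selenium.common.exceptions.NoSuchElementException."""

def click_next(clicks_left):
    # Pretend the 'next >>' button exists only while clicks remain;
    # in the real code this would be find_element(...).click().
    if clicks_left <= 0:
        raise NoSuchElementException("no 'next >>' link on this page")

def process(url_list, clicks_per_url):
    finished = []
    for url, clicks in zip(url_list, clicks_per_url):
        remaining = clicks
        while True:
            try:
                click_next(remaining)  # may raise when the button is absent
                remaining -= 1
            except NoSuchElementException:
                break  # no more pages for this URL: move on to the next one
        finished.append(url)
    return finished

print(process(["url1", "url2"], [2, 0]))  # both URLs complete despite the "missing" button
```

The key point is that the `except` clause turns the missing-button error into a `break`, so the outer `for` loop always reaches the next URL; the same `try/except` could also catch an `ElementClickInterceptedException` and retry after scrolling.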