Issue
I'm using Selenium in a project that consists of opening a range of websites that share pretty much the same structure, collecting data from each site, and storing it.
The problem I ran into is that some of the sites I want to access are unavailable, and when the program gets to one of those it just stops.
What I want it to do is skip those and carry on with the next iterations, but so far my attempts have been unsuccessful... In my latest try I used the is_displayed() method, but apparently it only tells me whether an element is visible, not whether it is present at all.
if driver.find_element_by_xpath('//*[@id="main-2"]/div[2]/div[1]/div[1]/div/div[1]/strong').is_displayed():
The example above doesn't work, because the driver needs to find the element before it can tell me whether it is visible, and the element is simply not there.
Have any of you dealt with something similar?
[Screenshot: how one of the sites normally looks]
[Screenshot: how a site looks when it is unavailable]
Solution
You can use Selenium's expected conditions to wait for the presence of the element.
I'm giving an example below. I have set the timeout to 5 seconds here, but you can use any timeout value.
Also, your element locator is brittle: a long absolute-style XPath like that breaks as soon as the page layout changes, so a shorter, more targeted locator would be more reliable.
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

# XPath of the element you are checking for
element_xpath_locator = '//*[@id="main-2"]/div[2]/div[1]/div[1]/div/div[1]/strong'

# browser is your WebDriver instance; wait up to 5 seconds for the element,
# raising TimeoutException if it never becomes present
wait = WebDriverWait(browser, 5)
wait.until(EC.presence_of_element_located((By.XPATH, element_xpath_locator)))
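
To actually skip the unavailable sites and continue with the next iteration, you can catch the TimeoutException that wait.until() raises when the element never appears. Below is a minimal sketch; the urls list, the Chrome driver and the collect-and-store step are placeholders of mine, so adapt them to your own loop.

from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC
from selenium.common.exceptions import TimeoutException

element_xpath_locator = '//*[@id="main-2"]/div[2]/div[1]/div[1]/div/div[1]/strong'

# Hypothetical list of the sites you iterate over
urls = ['https://example.com/site1', 'https://example.com/site2']

browser = webdriver.Chrome()
wait = WebDriverWait(browser, 5)

for url in urls:
    browser.get(url)
    try:
        element = wait.until(
            EC.presence_of_element_located((By.XPATH, element_xpath_locator))
        )
    except TimeoutException:
        # The element never showed up, so the site is probably unavailable; skip it
        continue
    # The element is present, so collect and store its data here
    print(element.text)

browser.quit()

Note that recent Selenium 4 releases remove the find_element_by_* helpers, so the find_element_by_xpath call from your original snippet would need to become browser.find_element(By.XPATH, ...) in any case.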
Answered By - Prophet