Issue
I am scraping details from a website and need to build a JSON object dynamically. How can I achieve this, as in the example I have provided? If JSON is not possible, how can I make a multi-dimensional array from my code instead?
import requests
import bs4 as bs

urls = 'http://dl5.lavinmovie.net/Series/'
url = requests.get(urls).text
soup = bs.BeautifulSoup(url, 'lxml')
title = soup.find_all('a')
for i in title:
    if i.text != '../' and ".mp4" not in i.text:
        urll = urls + i.text
        # arr.append(i.text)
        urll1 = requests.get(urll).text
        soupp1 = bs.BeautifulSoup(urll1, 'lxml')
        season = soupp1.find_all('a')
        print(i.text)
        for j in season:
            if j.text != '../' and ".mp4" not in j.text:
                urlla = urll + j.text
                urll2 = requests.get(urlla).text
                soupp2 = bs.BeautifulSoup(urll2, 'lxml')
                quality = soupp2.find_all('a')
                print(j.text)
                for k in quality:
                    if k.text != '../' and ".mp4" not in k.text:
                        urllb = urlla + k.text
                        urll3 = requests.get(urllb).text
                        soupp3 = bs.BeautifulSoup(urll3, 'lxml')
                        episode = soupp3.find_all('a')
                        print(k.text)
                        for m in episode:
                            if m.text != '../' and ".mp4" not in m.text:
                                print(m.text)
Series -> Seasons -> Quality -> Episodes.
This is the structure of the JSON file that I wish to have. The code I have written goes through the nested links and finds all of the data that I need, but I am not able to make a multi-dimensional array out of it.
My code opens all of the nested links and scrapes the text that I want. All I need to solve is how to store it all in a nested fashion.
Solution
In Python, any sort of data can be stored in lists and dictionaries; there is almost never a need for multi-dimensional arrays or vectors.
Just by looking at your flowchart (as you call it) — Series -> Seasons -> Quality -> Episodes — you can structure it as follows:
episodes will be a list, within quality (a dict), within seasons (a dict), within series (a dict).
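For instance, the target structure might look like this (a minimal sketch; all series, season, quality, and episode names below are made up):

```python
# Hypothetical example of the nested structure described above:
data = {
    "Some Series": {       # series -> dict of seasons
        "Season 1": {      # season -> dict of qualities
            "720p": [      # quality -> list of episodes
                "Episode 1",
                "Episode 2",
            ],
        },
    },
}
```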
You should have a read about making dicts of dicts and dicts of lists, and how to store data in them in Python.
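Here is a sketch of how such a structure can be filled as you go, using `dict.setdefault` to create the inner containers on first use. The sample rows are made up; in your scraper the four values would come from `i.text`, `j.text`, `k.text`, and `m.text`:

```python
import json

# Made-up stand-ins for the link texts scraped at each level.
rows = [
    ("Show A", "Season 1", "720p", "Episode 1"),
    ("Show A", "Season 1", "720p", "Episode 2"),
    ("Show A", "Season 1", "1080p", "Episode 1"),
    ("Show B", "Season 2", "480p", "Episode 1"),
]

data = {}
for series, season, quality, episode in rows:
    # setdefault creates the inner dict/list the first time a key is seen,
    # then returns it, so the chain walks (and builds) the nesting.
    (data.setdefault(series, {})
         .setdefault(season, {})
         .setdefault(quality, [])
         .append(episode))

print(json.dumps(data, indent=2))  # serialize the nested dicts to JSON
```

In the scraper itself, the same `setdefault` chain would sit in the innermost loop in place of the `print` calls, with the scraped texts as the four keys.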
Answered By - kestrel