Conversation

@seancallaway

Purpose

This pull request fixes a warning thrown by BeautifulSoup and an error condition that causes the program to crash due to an unhandled exception.

Approach

When run, the program displayed the following warning:

UserWarning: No parser was explicitly specified, so I'm using the best
available HTML parser for this system ("html.parser"). This usually isn't a
problem, but if you run this code on another system, or in a different
virtual environment, it may use a different parser and behave differently.

To get rid of this warning, change this:

 BeautifulSoup([your markup])

to this:

 BeautifulSoup([your markup], "html.parser")

BeautifulSoup recommends using lxml as a faster parser, so I've added lxml to the requirements file and to the BeautifulSoup call.
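The fix boils down to passing the parser name explicitly. A minimal sketch of the change (the markup string here is just an illustration, and this assumes the `lxml` package is installed as described above):

```python
from bs4 import BeautifulSoup

html = "<html><body><p>Example</p></body></html>"

# Naming the parser explicitly silences the UserWarning and makes parsing
# behave the same on every system. "lxml" is the faster parser that
# BeautifulSoup's docs recommend; it requires the lxml package.
soup = BeautifulSoup(html, "lxml")
print(soup.p.text)
```

If `lxml` weren't added as a dependency, passing `"html.parser"` (the stdlib parser) would silence the warning just as well, only more slowly.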


Regarding the unhandled exception: it appears that some items caught by the crawler identify as podcasts but do not have feedURLs. This caused the program to throw a KeyError, which was previously unhandled and crashed the application.

I've wrapped the JSON loading in a try/except so that the function returns None when the JSON doesn't include a feedURL. This means the caller of podcast_from_json() has to handle a None return, so I modified it to add a Podcast object to the list only when the result is not None.
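The pattern described above looks roughly like this. This is only a sketch: the real function's field names and return type differ, and the `"title"` key here is an assumption for illustration.

```python
import json


def podcast_from_json(json_str):
    """Parse one crawler result; return None if it lacks a feedURL."""
    data = json.loads(json_str)
    try:
        feed_url = data["feedURL"]  # may be missing for some "podcast" items
    except KeyError:
        return None  # signal the caller to skip this item
    return {"title": data.get("title"), "feed_url": feed_url}


# The caller only keeps non-None results, so malformed items are dropped
# instead of crashing the program:
items = [
    '{"title": "A", "feedURL": "http://example.com/a.rss"}',
    '{"title": "B"}',  # no feedURL -> previously raised KeyError
]
podcasts = [p for p in (podcast_from_json(i) for i in items) if p is not None]
```

Returning None (rather than letting the KeyError propagate) keeps the decision about what to do with incomplete items in one place, at the call site.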

@seancallaway
Author

Also did a bit of cleanup for PEP8.
