How can I reference a shared common module from multiple Python scrapers?

I’ve been writing XML scrapers for years now. These scrapers are esoteric and highly particular to my library and naming conventions, so they’ve never been released. Many of them shared common functionality that I kept in a central script, which let me update just that one script and have every scraper that depended on it pick up the change.

For example, ScraperA and ScraperB would both have this line in the <requires> section of their addon.xml files:
<import addon="metadata.common.mordredsrcapers.com" version="2.2.4"/>
and then I could reference the ridiculously complex chain functions in both ScraperA and ScraperB.

I recently needed to write a new scraper and decided to finally learn how to do it in Python. I’ve got this working just fine when all dependencies are in the same script, but I can’t figure out how to import from a common location… or rather, I can’t figure out how to reference the functions from a common location. This scraper uses Google to look up information about the files, so I’ve created a "metadata.common.google.com" addon which contains a "lib/google.py" with my class.
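
For reference, the common addon is laid out roughly like this (trimmed to the relevant files):

Code:
metadata.common.google.com/
├── addon.xml
└── lib/
    └── google.py    <- contains the GoogleSearchScraper class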

I’ve added the import to the <requires> section of the new scraper’s addon.xml:
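<import addon="metadata.common.google.com" version="1.0.0"/>

(The version number here is just a placeholder.) But any time I reference a function or class from the dependency I get: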

Code:
Error Type: <class 'ModuleNotFoundError'>
Error Contents: No module named 'GoogleSearchScraper'

If lib/google.py is contained inside the parent scraper itself, I can import the class just fine using:

Code:
from lib.google import GoogleSearchScraper

But I can’t figure out how to make Python understand that the path is not inside the same scraper, but instead in one of its dependencies. Does anyone know how to get this to work? I’ve looked at a bunch of other scrapers, and the "metadata.common.xxxx" pattern doesn’t seem to be prevalent anymore.
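
My best guess is that I need to locate the dependency at runtime and add its lib folder to sys.path myself, something like the sketch below, but I don’t know if that’s the intended approach. The addon id is my common addon; the rest is guesswork on my part:

Code:
import sys
import os.path

import xbmcaddon

# Find the installed common addon and append its lib/ folder
# to the module search path
common = xbmcaddon.Addon('metadata.common.google.com')
sys.path.append(os.path.join(common.getAddonInfo('path'), 'lib'))

# The class should then be importable without the lib. prefix
from google import GoogleSearchScraper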