Under the hood of City of Norwood, Payneham and St Peters
More of the gory details of how information gets into PlanningAlerts. This page is here so you can see and learn how it's done, and also so you can see in detail what isn't working and help fix it.
- Applications are published on the City of Norwood, Payneham and St Peters website in a human-readable, non-structured form.
- The latest scraper code on GitHub is loaded by our scraping platform morph.io.
- The scraper planningalerts-scrapers/multiple_civica on our scraping platform morph.io collects the information daily and converts it into a machine-readable format.
- If the scraper errors, anyone watching the scraper on morph.io is informed via a daily email alert.
- The machine-readable data is validated, imported and geocoded by PlanningAlerts daily. If an application has any validation errors, it is skipped and the error is logged. See below for the most recent logs.
- The information is published on PlanningAlerts, made available via the API, and people are informed of new applications via email alerts.
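The validate-or-skip step above can be sketched in a few lines. This is a minimal illustration, not the actual PlanningAlerts import code: the field names (`council_reference`, `address`, and so on) are assumptions based on the kind of record planning scrapers typically produce, and the sample data is invented.

```python
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("import")

# Assumed field names for a scraped application record; the real
# PlanningAlerts schema may differ.
REQUIRED_FIELDS = ("council_reference", "address", "description",
                   "info_url", "date_scraped")

def validate(record):
    """Return a list of validation errors for one scraped application."""
    return [f"missing {field}" for field in REQUIRED_FIELDS
            if not record.get(field)]

def import_applications(records):
    """Import valid records; skip and log any that fail validation."""
    imported = []
    for record in records:
        errors = validate(record)
        if errors:
            # Skipped applications are logged rather than imported,
            # matching the behaviour described above.
            log.error("Skipping %s: %s",
                      record.get("council_reference", "?"),
                      "; ".join(errors))
            continue
        imported.append(record)
    return imported

# Invented sample records for illustration only.
records = [
    {"council_reference": "060/123/2020",
     "address": "175 The Parade, Norwood SA 5067",
     "description": "Carport",
     "info_url": "https://example.org/applications/123",
     "date_scraped": "2020-10-15"},
    {"council_reference": "060/124/2020", "address": ""},  # fails validation
]
print(len(import_applications(records)))  # one valid record imported
```

The point of skip-and-log rather than fail-the-whole-import is that one malformed application on the council website doesn't block the other applications scraped that day from being published.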
Most recent import logs
123 applications found for City of Norwood, Payneham and St Peters, SA with date from 2020-10-15
Took 8 s to import applications from City of Norwood, Payneham and St Peters, SA
Authority City of Norwood, Payneham and St Peters is fixed but github issue is still open. So labelling.
What you can do next
If something isn't right with the scraper or the data coming in, you could do one or more of the following:
- Report or view scraper issues
- Fork the scraper on GitHub and try to fix it
- If it's an issue with the council website, then contact City of Norwood, Payneham and St Peters
- Contact us for help
If everything is working fine right now but you want to help if something goes wrong, you can watch the scraper on morph.io to receive an email alert whenever it errors.