Are the web crawlers used here "algorithms"?

Hey all, I am a non-technical person and just have a small query: when OpenSanctions collects its data through what I think is described as web crawlers, does that involve “algorithms” in any sense of the word? Also, where do human researchers appear in the loop/decision-making? So I am wondering how, and in what sense, this work is “automated”.

I think there is a technical meaning to the term algorithm, and a policy/cultural one. Your posting this message to this forum using a web browser invoked something on the order of a hundred thousand different algorithms, many of them incredibly complex. So does our crawling of data from 300 different web sites (in fact, all of that code is public). The vast majority of the work is fully automated, but we do have a team of engineers who constantly adjust, fix, and upgrade the software that collects the data.

I personally think we have some nifty algorithms in there, for example: “if I have 10 different variants of the spelling of a person’s name - which one should I pick as the primary name to be shown to a user?”
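To make that concrete, here is a minimal sketch of one plausible heuristic for that problem, using only the Python standard library. To be clear, this is an illustration and not the logic OpenSanctions actually ships: it picks the spelling variant that is, in total, most similar to all the other variants (roughly the “medoid” of the set), on the assumption that the most central spelling is a reasonable default to display.

```python
# A toy heuristic for picking a primary name from spelling variants.
# NOTE: an illustrative sketch, not OpenSanctions' actual logic.
from difflib import SequenceMatcher

def pick_primary_name(variants: list[str]) -> str:
    """Return the variant most similar, in total, to all the others."""
    def total_similarity(candidate: str) -> float:
        # Comparing a candidate against itself always scores 1.0; that adds
        # the same constant to every candidate, so the ranking is unchanged.
        return sum(
            SequenceMatcher(None, candidate.lower(), other.lower()).ratio()
            for other in variants
        )
    return max(variants, key=total_similarity)

names = [
    "Mohammed al-Husseini",
    "Muhammad al-Husseini",
    "Mohamed Husseini",
    "Mohammad al Husseini",
]
print(pick_primary_name(names))
```

In practice you would also weigh things like which script the name is written in and which source it came from, but the core idea is the same: score the candidates and pick the best one.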

The place where this all becomes extra weird is in automatically comparing names from the watchlists with other names, to decide whether they might be the person (or company, or boat) in question. Here’s a cool blog post about that:
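For a flavor of why that problem is hard, here is a toy normalize-then-score sketch in Python. Again, this is not the matcher OpenSanctions runs in production; the `possible_match` helper and its 0.85 threshold are made up for illustration.

```python
# A toy sketch of fuzzy name comparison -- NOT the production matcher,
# just the general normalize-then-score idea with an arbitrary threshold.
import re
from difflib import SequenceMatcher

def normalize(name: str) -> str:
    """Lowercase, drop punctuation, and sort the tokens so that word-order
    differences ("Husseini, Mohammed" vs. "Mohammed Husseini") vanish."""
    return " ".join(sorted(re.findall(r"\w+", name.lower())))

def possible_match(a: str, b: str, threshold: float = 0.85) -> bool:
    """Flag a pair as a possible match if the similarity score clears the
    (made-up) threshold; a real system tunes this trade-off carefully."""
    return SequenceMatcher(None, normalize(a), normalize(b)).ratio() >= threshold

print(possible_match("Husseini, Mohammed", "Mohammed Husseini"))     # True
print(possible_match("ACME Holdings Ltd.", "Acme Holdings Limited")) # True
print(possible_match("John Smith", "Jane Smythe"))                   # False
```

The hard part is not computing a score but deciding where to set the threshold: too low and you flag unrelated people, too high and you miss real matches spelled differently across alphabets and transliterations.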