Most of the code for uploading crawl data to the Internet Archive, or for importing it into a web-monitoring-db instance, is duplicated across multiple workflows (the main crawl, plus the workflows for re-uploading/re-importing when we need to reprocess old data). These steps should be factored into reusable composite actions so they don't have to be repeated: https://docs.github.com/en/actions/tutorials/create-actions/create-a-composite-action#creating-a-composite-action-within-the-same-repository
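As a rough sketch of what this could look like (the path, action name, and input names here are hypothetical, not taken from the existing workflows), the shared upload logic would live in an `action.yml` inside the repo:

```yaml
# .github/actions/upload-to-ia/action.yml  (hypothetical path/names)
name: Upload crawl data to Internet Archive
description: Shared upload step used by the main crawl and reprocessing workflows
inputs:
  data-path:
    description: Path to the crawl output to upload
    required: true
runs:
  using: composite
  steps:
    - name: Upload crawl data
      shell: bash
      run: |
        # The duplicated upload commands from the existing workflows
        # would move here, parameterized by the input above.
        echo "Uploading ${{ inputs.data-path }}"
```

Each workflow would then replace its copy of the upload steps with a single reference like `- uses: ./.github/actions/upload-to-ia` plus a `with: data-path: …` entry, so fixes only need to be made in one place. A second composite action could do the same for the web-monitoring-db import steps.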