miti99bot/.github/workflows/scrape-loldle-data.yml
tiennm99 615dc8174c refactor(loldle): import champions.json directly, drop ESM wrapper
Node 24 + wrangler 4.x both accept `import ... with { type: "json" }`,
so the generated champions-data.js wrapper is no longer needed.

Drop scripts/build-loldle-data.js and the build:loldle-data npm script.
Scraper writes champions.json only.
2026-04-22 13:24:24 +07:00


name: scrape-loldle-data
# Rebuilds src/modules/loldle/champions.json every Monday 06:00 UTC by
# scraping loldle.net's JS bundle (sole source of truth for classic-mode
# fields). Opens a PR if the output changed. Manually triggerable from the
# Actions tab.
#
# Note: the bundled data ships with the Worker — the change only takes effect
# after `npm run deploy` is run on the updated main branch.
on:
  schedule:
    - cron: "0 6 * * 1"
  workflow_dispatch:
permissions:
  contents: write
  pull-requests: write
jobs:
  scrape:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with:
          node-version: "20"
          cache: npm
      - run: npm ci
      - name: Scrape loldle.net
        run: npm run scrape:loldle-data
      - name: Open PR if data changed
        uses: peter-evans/create-pull-request@v7
        with:
          branch: data/loldle-weekly-refresh
          delete-branch: true
          commit-message: "data: weekly loldle.net champion refresh"
          title: "data: weekly loldle.net champion refresh"
          body: |
            Automated weekly refresh of `src/modules/loldle/champions.json`
            from loldle.net's JS bundle — the canonical source for all
            classic-mode fields (`gender`, `species`, `resource`,
            `attackType`, `region`, `lane`, `releaseDate`).
            Review the diff, merge, then run `npm run deploy` to ship.
          add-paths: src/modules/loldle/champions.json