miti99bot/package.json
tiennm99 64c0248eea feat(semantle): source target pool from google-10000-english dictionary
The ~250-word hand-curated TARGET_POOL was too small for long-term play.
Replaces it with a build-script-generated dictionary:

- scripts/build-semantle-words.js fetches first20hours/google-10000-english
  (no-swears variant), filters to 4–10 ASCII letters, drops the top-200
  most frequent function words, and writes src/modules/semantle/words-data.js
  as a static ES-module export.
- wordlist.js now just re-exports that data via TARGET_POOL + pickFromPool.
- package.json: new build:semantle-words script; chained into `npm run build`
  alongside build:wordle-data so `npm run deploy` regenerates automatically.
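The filter-and-emit steps above can be sketched roughly as follows (a minimal sketch only: the fetch of the raw no-swears list is elided, and the helper names `buildTargetPool` / `toModuleSource` are illustrative, not the actual internals of scripts/build-semantle-words.js):

```javascript
// Given the google-10000-english list (ordered most-frequent first),
// drop the top-200 function words, then keep only 4-10 ASCII-letter words.
function buildTargetPool(rankedWords) {
  return rankedWords
    .slice(200) // the 200 most frequent entries are mostly function words
    .filter((w) => /^[a-z]{4,10}$/.test(w));
}

// Serialize the pool as a static ES-module export, the shape that
// src/modules/semantle/words-data.js is generated in.
function toModuleSource(pool) {
  return `export const TARGET_POOL = ${JSON.stringify(pool)};\n`;
}
```

Generating a static module (rather than fetching at runtime) keeps the Worker free of a network dependency on the upstream repo.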

Pool size: ~250 → 7953 words. Same ConceptNet verify-and-fallback flow, so
low-quality picks still cost at most one extra concept lookup.
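The diff only names the `TARGET_POOL` + `pickFromPool` exports of wordlist.js; one plausible shape (the signature below is an assumption, not taken from this commit) is a uniform pick with an injectable RNG so callers can seed it deterministically:

```javascript
// Stand-in data; the real TARGET_POOL is the ~7953-word generated list.
const TARGET_POOL = ["apple", "stone", "river"];

// Hypothetical pickFromPool: uniform choice, with the RNG injectable
// so tests (or a date-seeded daily pick) can make it deterministic.
function pickFromPool(rng = Math.random) {
  return TARGET_POOL[Math.floor(rng() * TARGET_POOL.length)];
}
```

With the ConceptNet verify-and-fallback flow described above, a bad draw here just triggers one more lookup and another call to `pickFromPool`.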
2026-04-22 23:12:07 +07:00


{
  "name": "miti99bot",
  "version": "0.1.0",
  "description": "Telegram bot with plug-n-play module system, deployed to Cloudflare Workers.",
  "private": true,
  "type": "module",
  "engines": {
    "node": ">=20.6"
  },
  "scripts": {
    "dev": "wrangler dev",
    "build": "npm run build:wordle-data && npm run build:semantle-words",
    "build:wordle-data": "node scripts/build-wordle-data.js",
    "build:semantle-words": "node scripts/build-semantle-words.js",
    "scrape:loldle-data": "node scripts/scrape-loldle-data.js",
    "deploy": "npm run build && wrangler deploy && npm run db:migrate && npm run register",
    "db:migrate": "node scripts/migrate.js",
    "register": "node --env-file-if-exists=.env.deploy scripts/register.js",
    "register:dry": "node --env-file-if-exists=.env.deploy scripts/register.js --dry-run",
    "lint": "biome check . && eslint src",
    "format": "biome format --write .",
    "test": "vitest run"
  },
  "dependencies": {
    "grammy": "^1.30.0"
  },
  "devDependencies": {
    "@biomejs/biome": "^1.9.0",
    "eslint": "^10.2.0",
    "eslint-plugin-jsdoc": "^62.9.0",
    "vitest": "^4.1.4",
    "wrangler": "^4.84.0"
  }
}