## REOPENED TASK — CRITICAL CONTEXT
This task was previously marked 'done', but the audit could not verify that
the work actually landed on main. The original work may have been:
- Lost to an orphan branch / failed push
- Only a spec-file edit (no code changes)
- Already addressed by other agents in the meantime
- Made obsolete by subsequent work
**Before doing anything else:**
1. **Re-evaluate the task in light of CURRENT main state.** Read the
spec and the relevant files on origin/main NOW. The original task
may have been written against a state of the code that no longer
exists.
2. **Verify the task still advances SciDEX's aims.** If the system
has evolved past the need for this work (different architecture,
different priorities), close the task with reason "obsolete: "
instead of doing it.
3. **Check if it's already done.** Run `git log --grep=''`
and read the related commits. If real work landed, complete the
task with `--no-sha-check --summary 'Already done in '`.
4. **Make sure your changes don't regress recent functionality.** Many
agents have been working on this codebase. Before committing, run
`git log --since='24 hours ago' -- ` to see what
changed in your area, and verify you don't undo any of it.
5. **Stay scoped.** Only do what this specific task asks for. Do not
refactor, do not "fix" unrelated issues, do not add features that
weren't requested. Scope creep at this point is regression risk.
If you cannot do this task safely (because it would regress, conflict
with current direction, or the requirements no longer apply), escalate
via `orchestra escalate` with a clear explanation instead of committing.
## Completion Notes
Auto-completed by supervisor after successful deploy to main.

Commit a8b3ccd added page caching to `hypothesis_detail()` but referenced
`_get_cached_page` and `_set_cached_page` without defining them. This caused
a `NameError` on every hypothesis page load, so all `/hypothesis/` pages
returned 500 errors.
## Fix
- Defined `_get_cached_page(key)` and `_set_cached_page(key, content)` as a
  simple in-memory dict cache with a 60-second TTL and a 200-entry limit.
- Restored `timeout=10` on `get_db()` to prevent indefinite SQLite lock waits.
## Verification
- `hypothesis_detail('h-019ad538')` returns 108K of HTML in ~6.7s (first call).
- Cached calls return in <1ms (a ~6688x speedup).
- `py_compile` passes.
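The `py_compile` check above can be scripted as a small helper; the wrapper function and the idea of returning a boolean are illustrative, not part of the original verification commands.

```python
import py_compile

def check_compiles(path):
    """Return True if the file at path byte-compiles, False on a syntax error."""
    try:
        py_compile.compile(path, doraise=True)
        return True
    except py_compile.PyCompileError:
        return False
```

Running this against the edited module (or simply `python -m py_compile <file>`) catches syntax errors before deploy, though not runtime issues like the missing-function `NameError` this task fixed.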
## Work Log
- 2026-04-02 11:25: Identified `NameError` from missing cache functions.