fix: progress logging, Ollama timeout for LLM extraction. Closes #132…#239
Open — utkarshqz wants to merge 2 commits into fireform-core:main
fix: real-time progress logging and Ollama timeout for LLM extraction
Summary
Adds real-time console progress logging during LLM extraction and a request timeout to prevent indefinite hangs on slow or lower-end hardware.
The console was previously silent between `[3] Starting extraction...` and the final result, leaving developers unable to tell whether the process was working or frozen. This PR fixes that with clear per-field `[LOG]` output and elapsed-time reporting.
Closes / Fixes
Closes #132
Addresses #152
Type of change
What changed and why
1. 🪵 Real-time progress logging (`src/llm.py`)
Added `[LOG]` output immediately before the batch Ollama request so developers can see exactly which fields are being processed and confirm the backend is actively working. Console output after this PR:
The fallback per-field extraction path also logs progress:
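The logging change can be sketched roughly as below. This is a minimal illustration, not the actual `src/llm.py` code: the function name `extract_fields`, the `request_fn` callable, and the exact log wording are hypothetical.

```python
import time

def extract_fields(fields, request_fn):
    """Illustrative sketch: emit a [LOG] line before each field is sent
    to the backend, plus an elapsed-time summary at the end."""
    results = {}
    start = time.monotonic()
    for field in fields:
        # flush=True makes the line appear immediately, which matters
        # when inference is slow and output would otherwise be buffered
        print(f"[LOG] Extracting field: {field} ...", flush=True)
        results[field] = request_fn(field)
    elapsed = time.monotonic() - start
    print(f"[LOG] Extraction finished in {elapsed:.1f}s", flush=True)
    return results
```

The `flush=True` is the key detail for "real-time" behavior: without it, stdout buffering can hold log lines back until the process exits.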
2. ⏱️ Ollama request timeout (`src/llm.py`)
Added `timeout=30` to `requests.post()`.
Previously, if Ollama was slow or unresponsive, the process would hang indefinitely with no feedback. The 30-second timeout raises a clear `requests.exceptions.Timeout`, which is caught and surfaced as a `503` error to the client, consistent with existing error handling. This is especially important on lower-end hardware where CPU inference can be slow, as noted in #132.
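The timeout handling described above can be sketched as follows. The endpoint URL, payload shape, and function name are illustrative assumptions, not the project's actual code; only the `timeout=30` argument and the Timeout-to-503 mapping come from the PR description.

```python
import requests

OLLAMA_URL = "http://localhost:11434/api/generate"  # hypothetical endpoint

def call_ollama(payload, post=requests.post):
    """Sketch: a slow or unresponsive Ollama request now fails fast
    instead of hanging indefinitely."""
    try:
        resp = post(OLLAMA_URL, json=payload, timeout=30)
        resp.raise_for_status()
        return resp.json(), 200
    except requests.exceptions.Timeout:
        # Surface the timeout as a 503 to the client, matching the
        # existing error-handling convention noted in the PR.
        return {"error": "Ollama request timed out"}, 503
```

Note that `timeout=30` in `requests` bounds the connect and per-read waits, not the total request duration, which is usually the right behavior for a streaming or long-running inference backend.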
How to test
Run the test suite:
Expected: 52 passed ✅
Screenshots / Evidence
Console output during extraction (verified locally):