
@SylvainChevalier (Contributor) commented Jan 23, 2026

Implements changes requested in #4146

Changes

  • Reduced question open hotness boost from 20 to 10
  • Added hotness boost of 20 for cp_reveal_time (community prediction reveal)

Generated with Claude Code

Summary by CodeRabbit

Release Notes

  • New Features

    • Added CP Reveal Time Score as a new ranking factor for questions in hotness calculations.
  • Changes

    • Removed approval score factor from post ranking calculations.

@coderabbitai bot (Contributor) commented Jan 23, 2026

📝 Walkthrough

The hotness scoring system for posts and questions has been refactored: the Approval score component was removed from POST_HOTNESS_COMPONENTS, and a new CP Reveal Time Score component was added to QUESTION_HOTNESS_COMPONENTS. Tests were updated to reflect the removal of the approval score contribution.

Changes

  • Hotness Service Refactoring (posts/services/hotness.py): Removed _compute_hotness_approval_score() and its entry from POST_HOTNESS_COMPONENTS; added _compute_question_hotness_cp_reveal_time() and registered it in QUESTION_HOTNESS_COMPONENTS.
  • Hotness Tests Update (tests/unit/test_posts/test_services/test_hotness.py): Removed tests for _compute_hotness_approval_score(); updated expected values across the remaining hotness-related tests to reflect the absence of the approval score contribution in post hotness calculations.
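
For orientation, here is a minimal sketch of what the new question component and registry entry might look like. This is not the Metaculus implementation: the decay() form (weight scaled by min(1, (age_days / 3.5) ** -2)) is inferred from the worked examples in the review comments further down, and the function, field, and registry names are taken from the walkthrough above.

```python
# Hypothetical sketch only; the decay() shape is inferred from the review
# arithmetic below (7 days -> x0.25, 14 days -> x0.0625, same day -> no decay).
from datetime import datetime, timezone


def decay(weight: float, happened_at: datetime) -> float:
    """Scale a boost down quadratically with age, measured in 3.5-day units."""
    age_days = (datetime.now(timezone.utc) - happened_at).total_seconds() / 86400
    if age_days <= 3.5:
        return weight  # fresh events keep the full boost
    return weight * (age_days / 3.5) ** -2


def _compute_question_hotness_open_time(question) -> float:
    # Open boost, reduced from 20 to 10 in this PR.
    if question.open_time and question.open_time <= datetime.now(timezone.utc):
        return decay(10, question.open_time)
    return 0.0


def _compute_question_hotness_cp_reveal_time(question) -> float:
    # New in this PR: boost of 20 when the community prediction is revealed.
    now = datetime.now(timezone.utc)
    if question.cp_reveal_time and question.cp_reveal_time <= now:
        return decay(20, question.cp_reveal_time)
    return 0.0


QUESTION_HOTNESS_COMPONENTS = [
    ("Open Time Score", _compute_question_hotness_open_time),
    ("CP Reveal Time Score", _compute_question_hotness_cp_reveal_time),
    # ...movement and resolution components elided...
]
```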

Estimated code review effort

🎯 2 (Simple) | ⏱️ ~12 minutes

Poem

🐰 A hop and a skip, the scores take a dip,
Approval's removed, no more from that ship,
Questions now sparkle with CP's reveal,
The hotness has shifted—a fresh new deal! ✨

🚥 Pre-merge checks | ✅ 1 | ❌ 1

❌ Failed checks (1 warning)

  • Title check ⚠️ Warning: The title mentions 'reduce open boost' and 'add cp reveal boost', but the changeset primarily removes the approval hotness component entirely and adds the CP Reveal Time Score; the open boost reduction is not reflected in the actual code changes shown. Resolution: revise the title to accurately reflect the main changes, e.g. 'Remove approval hotness boost and add CP Reveal Time Score'.

✅ Passed checks (1 passed)

  • Description check ✅ Passed: Check skipped because CodeRabbit's high-level summary is enabled.


@github-actions bot (Contributor) commented Jan 23, 2026

🚀 Preview Environment

Your preview environment is ready!

Resource Details

  • 🌐 Preview URL: https://metaculus-pr-4150-claude-issue-4146-20260122-174-preview.mtcl.cc
  • 📦 Docker Image: ghcr.io/metaculus/metaculus:claude-issue-4146-20260122-1742-e9bc52d
  • 🗄️ PostgreSQL: NeonDB branch preview/pr-4150-claude-issue-4146-20260122-174
  • Redis: Fly Redis mtc-redis-pr-4150-claude-issue-4146-20260122-174

Details

  • Commit: e9bc52d399c293c51374f725eaaa21f041e63e76
  • Branch: claude/issue-4146-20260122-1742
  • Fly App: metaculus-pr-4150-claude-issue-4146-20260122-174

ℹ️ Preview Environment Info

Isolation:

  • PostgreSQL and Redis are fully isolated from production
  • Each PR gets its own database branch and Redis instance
  • Changes pushed to this PR will trigger a new deployment

Limitations:

  • Background workers and cron jobs are not deployed in preview environments
  • If you need to test background jobs, use Heroku staging environments

Cleanup:

  • This preview will be automatically destroyed when the PR is closed

@SylvainChevalier (Contributor, Author) commented

@claude let's remove the approval hotness boost completely.

@claude bot commented Jan 23, 2026

Claude finished @SylvainChevalier's task in 55s —— View job


Task: Remove approval hotness boost ✓

  • Read current hotness.py implementation
  • Remove approval hotness boost code
  • Commit and push changes

Summary

Removed the approval hotness boost completely from posts/services/hotness.py:

  • Deleted the _compute_hotness_approval_score function
  • Removed the "Approval score" entry from POST_HOTNESS_COMPONENTS

The PR now includes three hotness adjustments:

  1. Reduced question open hotness boost from 20 to 10
  2. Added hotness boost of 20 for cp_reveal_time (community prediction reveal)
  3. Removed approval hotness boost entirely

Changes pushed to claude/issue-4146-20260122-1742.

@SylvainChevalier (Contributor, Author) commented

@claude can you see the failed unit tests in the github actions?

@claude bot commented Jan 23, 2026

Claude finished @SylvainChevalier's task —— View job


GitHub Actions Test Failures - RESOLVED ✓

  • Check CI status for this PR
  • Identify failed workflow runs
  • Analyze test failure logs
  • Fix test file to remove approval hotness references
  • Commit and push fix
  • Update test expectations based on CI results
  • Commit and push updated expectations
  • Verify all tests pass

Summary

Yes, I can see the failed unit tests in GitHub Actions! The issue was that the test file was trying to import _compute_hotness_approval_score, which was removed when we deleted the approval hotness boost.

Changes Made

Commit 3c6d4f2 - Fix unit tests after removing approval hotness boost:

  1. Removed import of deleted _compute_hotness_approval_score function
  2. Removed test_compute_hotness_approval_score test (no longer applicable)
  3. Updated test_handle_post_boost expected values:
    • Initial hotness: 20 → 0 (no approval boost)
    • After UP boost: 45 → 20
    • After DOWN boost: 3 → -10

Commit 1d1c4b8 - Update test expectations after hotness formula changes:
Updated expected values to reflect the reduced open boost (20 → 10):

  • test_compute_question_hotness active question: 13.0 → 10.5
  • test_compute_question_hotness resolved question: 6.25 → 5.625
  • test_compute_question_hotness annulled question: 1.25 → 0.625
  • test_compute_post_hotness: 110.9 → 109.025
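
These expectations can be sanity-checked by hand. The arithmetic below assumes the decay form weight * (age_days / 3.5) ** -2 used in the review comments further down; it is a verification sketch, not code from the repository.

```python
import math

# Decay factors implied by the review arithmetic below.
assert 10 * (7 / 3.5) ** -2 == 2.5      # open boost (10), opened 7 days ago
assert 10 * (14 / 3.5) ** -2 == 0.625   # open boost (10), opened 14 days ago
assert 20 * (7 / 3.5) ** -2 == 5.0      # resolution boost (20), resolved 7 days ago

assert math.isclose(2.5 + 20 * 0.4, 10.5)    # active question: open + movement
assert math.isclose(0.625 + 5.0, 5.625)      # resolved question: open + resolution
# annulled question: only the open boost survives -> 0.625
assert math.isclose(5.625 + 100 + 2 + 1 + 0.4, 109.025)  # post total (see review)
```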

CI Status

  • ✅ Run Unit Tests: PASSING
  • ✅ PR #4150: PASSING
  • ⏳ Run Integration Tests (Playwright): in progress
  • ⏳ PR Preview Environment: in progress

All unit tests are now passing!


View latest workflow run

@coderabbitai bot (Contributor) left a comment

Actionable comments posted: 0

Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (1)
tests/unit/test_posts/test_services/test_hotness.py (1)

166-207: ⚠️ Potential issue | 🟡 Minor

Verify the expected total hotness calculation.

The expected value of 109.025 checks out, but the inline comments are stale: the comment on line 185 says the second question "Will be scored as 18", which predates the open boost reduction (20 → 10).

Per-question scores:

  • Q1 (opened 2025-04-04, resolution_set_time 2025-04-11, 7 days ago): decay(10, open_time) + decay(20, resolution_set_time) = 0.625 + 5 = 5.625
  • Q2 (opened 2025-04-11, 7 days ago; no resolution, no movement): decay(10, open_time) = 2.5

Post components breakdown:

  • Max subquestions score: 5.625 (Q1 is the max, not Q2 as the stale comments suggest)
  • Total Boosts Score: 100
  • Posted comments score: 2
  • Net post votes score: 1
  • Relevant ITN news: 0.5 - 0.1 = 0.4 (the article was created at the freeze_time date, 2025-04-18, so no decay applies)

Total: 5.625 + 100 + 2 + 1 + 0.4 = 109.025

The test value is correct, but the inline comments are stale.

📝 Proposed fix to update stale comments

The inline comments on lines 177 and 185 are outdated after the open boost reduction. Consider updating them:

         group_of_questions=factory_group_of_questions(
             questions=[
-                # Will be scored as 15
+                # Will be scored as 5.625 (max)
                 create_question(
                     question_type=Question.QuestionType.BINARY,
                     open_time=make_aware(datetime.datetime(2025, 4, 4)),
                     scheduled_close_time=make_aware(datetime.datetime(2025, 4, 10)),
                     resolution_set_time=make_aware(datetime.datetime(2025, 4, 11)),
                     resolution="yes",
                 ),
-                # Will be scored as 18
+                # Will be scored as 2.5
                 create_question(
                     question_type=Question.QuestionType.BINARY,
                     open_time=make_aware(datetime.datetime(2025, 4, 11)),
                     scheduled_close_time=make_aware(datetime.datetime(2025, 4, 25)),
                 ),
             ]
         ),
🧹 Nitpick comments (1)
tests/unit/test_posts/test_services/test_hotness.py (1)

11-20: Consider adding test coverage for the new CP reveal time score.

The new _compute_question_hotness_cp_reveal_time function was added but there are no dedicated tests for it. Consider adding a test case that exercises the CP reveal time hotness component.

Would you like me to generate a test case for the CP reveal time score component?
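
One possible shape for such a test is sketched below, modeled on the open-time tests this review references. The create_question fixture signature, the import paths, the compute_question_hotness entry point, and the expected value (which assumes the decay form discussed in this review and no other contributing components) are all assumptions rather than repository code.

```python
import datetime

from django.utils.timezone import make_aware
from freezegun import freeze_time

# Hypothetical import paths; adjust to the actual test module's imports.
from posts.services.hotness import compute_question_hotness
from questions.models import Question


@freeze_time("2025-04-18")
def test_compute_question_hotness_cp_reveal_time(create_question):
    question = create_question(
        question_type=Question.QuestionType.BINARY,
        open_time=make_aware(datetime.datetime(2025, 4, 4)),        # 14 days ago
        cp_reveal_time=make_aware(datetime.datetime(2025, 4, 11)),  # 7 days ago
        scheduled_close_time=make_aware(datetime.datetime(2025, 4, 25)),
    )
    # open boost: 10 * (14 / 3.5) ** -2 = 0.625
    # CP reveal boost: 20 * (7 / 3.5) ** -2 = 5.0
    assert compute_question_hotness(question) == 0.625 + 5.0
```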

📜 Review details

Configuration used: Organization UI

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 4bb9e1e and daaabde.

📒 Files selected for processing (2)
  • posts/services/hotness.py
  • tests/unit/test_posts/test_services/test_hotness.py
🧰 Additional context used
🧠 Learnings (1)
📚 Learning: 2026-01-15T19:29:58.940Z
Learnt from: hlbmtc
Repo: Metaculus/metaculus PR: 4075
File: authentication/urls.py:24-26
Timestamp: 2026-01-15T19:29:58.940Z
Learning: In this codebase, DRF is configured to use IsAuthenticated as the default in REST_FRAMEWORK['DEFAULT_PERMISSION_CLASSES'] within metaculus_web/settings.py. Therefore, explicit permission_classes([IsAuthenticated]) decorators are unnecessary on DRF views unless a view needs to override the default. When reviewing Python files, verify that views relying on the default are not redundantly decorated, and flag cases where permissions are being over-specified or when a non-default permission is explicitly required.

Applied to files:

  • posts/services/hotness.py
  • tests/unit/test_posts/test_services/test_hotness.py
🧬 Code graph analysis (1)
tests/unit/test_posts/test_services/test_hotness.py (3)
posts/services/hotness.py (2)
  • compute_post_hotness (160-161)
  • handle_post_boost (210-227)
tests/unit/conftest.py (1)
  • user1 (46-48)
posts/models.py (2)
  • Vote (1063-1077)
  • VoteDirection (1064-1066)
🔇 Additional comments (8)
posts/services/hotness.py (4)

53-58: LGTM! Open time decay value reduced from 20 to 10.

The change aligns with the PR objective to reduce the question open hotness boost.


80-85: LGTM! Component registry updated correctly.

The new "CP Reveal Time Score" component is properly added to QUESTION_HOTNESS_COMPONENTS.


151-157: LGTM! Approval score removed from POST_HOTNESS_COMPONENTS.

The removal aligns with the PR comments requesting complete removal of the approval hotness boost.


61-67: LGTM! New CP reveal time score component follows established patterns.

The implementation correctly mirrors _compute_question_hotness_open_time with appropriate null/time checks before applying decay.

tests/unit/test_posts/test_services/test_hotness.py (4)

59-68: Verify the expected hotness calculation for active question.

The expected value of 10.5 should be verified:

  • Open time (2025-04-11, 7 days ago): decay(10, open_time) = 10 * (7/3.5)^-2 = 10 * 0.25 = 2.5
  • Movement score: 20 * 0.4 = 8 (question is OPEN status)
  • Total: 2.5 + 8 = 10.5

The calculation appears correct.


69-81: Verify the expected hotness calculation for resolved question.

The expected value of 5.625 should be verified:

  • Open time (2025-04-04, 14 days ago): decay(10, open_time) = 10 * (14/3.5)^-2 = 10 * 0.0625 = 0.625
  • Movement: 0 (question is not OPEN, so movement is ignored)
  • Resolution time (2025-04-11, 7 days ago): decay(20, resolution_set_time) = 20 * (7/3.5)^-2 = 20 * 0.25 = 5
  • Total: 0.625 + 0 + 5 = 5.625

The calculation appears correct.


82-92: Verify the expected hotness calculation for unsuccessfully resolved question.

The expected value of 0.625 should be verified:

  • Open time (2025-04-04, 14 days ago): decay(10, open_time) = 10 * (14/3.5)^-2 = 0.625
  • Movement: 0 (no movement specified, and question not OPEN)
  • Resolution time: 0 (resolution is "annulled" which is in UnsuccessfulResolutionType, so ignored)
  • Total: 0.625

The calculation appears correct.


210-228: Verify the boost/bury hotness calculations.

  • Line 220: Initial hotness is 0 (no components contribute) ✓
  • Line 224: After UP boost, score = (0/4) + 20 = 20, so hotness = 20
  • Line 228: After DOWN bury, score = -(20/2) - 20 = -30, total boosts = 20 + (-30) = -10, so hotness = -10

The calculations are correct.
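
Read off those numbers, the boost/bury arithmetic could be expressed as the sketch below. The constants (divide-by-4 plus 20 for boosts, divide-by-2 minus 20 for buries) are inferred from this comment's figures, not taken from handle_post_boost itself.

```python
def boost_score(current_hotness: float, up: bool) -> float:
    """Hypothetical reconstruction of the boost/bury score arithmetic."""
    if up:
        return current_hotness / 4 + 20   # UP boost
    return -(current_hotness / 2) - 20    # DOWN bury


total_boosts = 0.0
total_boosts += boost_score(0, up=True)    # (0 / 4) + 20 = 20
assert total_boosts == 20
total_boosts += boost_score(20, up=False)  # -(20 / 2) - 20 = -30
assert total_boosts == -10                 # matches the expected final hotness
```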

