# triple-barrier-method

Here are 2 public repositories matching this topic...

Daikoku

End-to-end crypto price direction prediction framework built on the Mamba Selective State Space architecture (Python/PyTorch). Covers the full lifecycle, driven by a single config: data preparation, Triple Barrier labeling, training, evaluation, and live inference. Uses linear-time sequence processing as an alternative to Transformers.

  • Updated Apr 2, 2026
  • Python
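The Triple Barrier labeling step mentioned above assigns each bar a class depending on which of three barriers the price path hits first: an upper profit-take barrier, a lower stop-loss barrier, or a vertical time barrier. A minimal sketch of the idea (the function name and fixed-percentage thresholds are illustrative assumptions, not this repository's API):

```python
import pandas as pd

def triple_barrier_labels(prices, horizon=10, upper=0.02, lower=0.02):
    """Label each bar +1 / -1 / 0 by the first barrier the forward
    price path touches: +1 for the upper (profit-take) barrier,
    -1 for the lower (stop-loss) barrier, 0 if the vertical time
    barrier (`horizon` bars) expires first."""
    prices = pd.Series(prices, dtype=float).reset_index(drop=True)
    labels = pd.Series(0, index=prices.index)
    for t in range(len(prices)):
        window = prices.iloc[t + 1 : t + 1 + horizon]
        if window.empty:
            continue  # no forward data: leave unlabeled (0)
        returns = window / prices.iloc[t] - 1.0
        hit_up = returns[returns >= upper]
        hit_dn = returns[returns <= -lower]
        first_up = hit_up.index[0] if not hit_up.empty else None
        first_dn = hit_dn.index[0] if not hit_dn.empty else None
        if first_up is not None and (first_dn is None or first_up < first_dn):
            labels.iloc[t] = 1
        elif first_dn is not None:
            labels.iloc[t] = -1
        # otherwise the vertical barrier was hit first: label stays 0
    return labels
```

In practice (e.g. in De Prado's formulation) the horizontal barriers are usually scaled by a rolling volatility estimate rather than fixed percentages, so that labels adapt to changing market regimes.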

This project explores Attention-Based Transformer Encoders to develop robust buy/sell classification models for financial time series. It addresses market non-stationarity and noise by combining De Prado-inspired preprocessing with a hybrid Transformer-LSTM architecture.

  • Updated Oct 18, 2025
  • Jupyter Notebook
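One De Prado-inspired preprocessing idea the description alludes to is sizing the labeling barriers from recent return volatility instead of fixed thresholds, which helps with non-stationarity. A hypothetical sketch (function name, window span, and width multiplier are all assumptions for illustration):

```python
import statistics

def dynamic_barrier_widths(prices, span=20, width=2.0):
    """For each return, compute a barrier width as `width` times the
    standard deviation of returns over the trailing `span` observations.
    Wider barriers in volatile regimes, tighter ones in calm regimes."""
    returns = [prices[i] / prices[i - 1] - 1.0 for i in range(1, len(prices))]
    widths = []
    for i in range(len(returns)):
        window = returns[max(0, i - span + 1) : i + 1]
        vol = statistics.pstdev(window) if len(window) > 1 else 0.0
        widths.append(width * vol)
    return widths
```

Each width would then replace the fixed `upper`/`lower` thresholds when checking whether a forward price path has hit a horizontal barrier.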
