
🎯 PUCCH Format 0 ML Decoder

Python 3.12+ TensorFlow 2.21+ License: MIT Paper Status DOI

Research-Grade Implementation for 5G NR PUCCH Format 0 Decoding using Machine Learning


📋 Table of Contents

  • Overview
  • Key Contributions
  • Performance Results
  • Installation
  • Quick Start
  • Project Structure
  • Usage
  • Results
  • Citation
  • Contributing
  • Contact
  • Acknowledgments
  • Disclaimer

📖 Overview

This repository implements a Machine Learning-based decoder for 5G New Radio (NR) PUCCH Format 0 that demonstrates robust performance under multi-user interference.

Problem Statement

In dense 5G networks, multiple User Equipments (UEs) may transmit PUCCH Format 0 simultaneously on the same time-frequency resources using different cyclic shifts. This creates inter-user interference that degrades traditional correlation-based decoders.
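To make the interference mechanism concrete, here is a minimal numpy sketch of that superposition. The base sequence below is an illustrative unit-modulus stand-in, not one of the actual low-PAPR base sequences tabulated in 3GPP TS 38.211; the cyclic shift is applied as a phase ramp over the 12 subcarriers, which is the property correlation decoders exploit.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 12  # PUCCH Format 0 occupies one PRB: 12 subcarriers
# Illustrative unit-modulus base sequence (stand-in for the low-PAPR
# base sequences tabulated in 3GPP TS 38.211).
base = np.exp(1j * rng.uniform(0, 2 * np.pi, N))

def shifted(alpha: int) -> np.ndarray:
    """Cyclic shift applied as a phase ramp: r(n) = e^{j*2*pi*alpha*n/N} * base(n)."""
    n = np.arange(N)
    return np.exp(2j * np.pi * alpha * n / N) * base

# Three UEs transmit on the same PRB with different cyclic shifts.
shifts = [0, 4, 8]
rx = sum(shifted(a) for a in shifts)  # superposition seen at the gNB
rx += 0.05 * (rng.standard_normal(N) + 1j * rng.standard_normal(N))

# Correlating against every candidate shift shows energy at 0, 4 and 8.
corr = [abs(np.vdot(shifted(a), rx)) / N for a in range(N)]
```

In this idealized case the matched correlator sees energy only at the shifts actually transmitted; realistic channels and noise smear that energy across shifts, which is precisely the degradation the ML decoder is trained to handle.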

Proposed Solution

We train a Fully Connected Neural Network to decode the target user's UCI (ACK/NACK + SR) in the presence of interference from 1-2 other users. The model is trained on a single SNR (10 dB) and generalizes across a wide SNR range (0-20 dB).
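A sketch of what such a fully connected decoder could look like follows; the layer widths, dropout rate, and the 4-class label space (ACK/NACK with and without SR) are illustrative assumptions, and the paper's exact architecture may differ.

```python
import numpy as np
import tensorflow as tf

# One PUCCH Format 0 resource spans 12 subcarriers; the 12 complex
# samples become 24 real input features (real and imaginary parts).
# Layer widths and dropout below are illustrative, not the paper's values.
def build_decoder(num_classes: int = 4) -> tf.keras.Model:
    model = tf.keras.Sequential([
        tf.keras.Input(shape=(24,)),
        tf.keras.layers.Dense(128, activation="relu"),
        tf.keras.layers.Dense(64, activation="relu"),
        tf.keras.layers.Dropout(0.2),
        tf.keras.layers.Dense(num_classes, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

model = build_decoder()  # 4 UCI classes; pass num_classes=5 for the DTX variant
```

The same skeleton extends to the extended 5-class DTX model by setting num_classes=5.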


πŸ† Key Contributions

| # | Contribution | Impact |
|---|--------------|--------|
| 1 | Multi-User Interference Robustness | Decoder works with 2-3 simultaneous users |
| 2 | 45% Performance Gain | Over the traditional correlation-based decoder |
| 3 | Statistical Validation | 5 independent runs with 95% confidence intervals |
| 4 | DTX Detection | Extended 5-class model with Discontinuous Transmission detection |
| 5 | Complete Reproducibility | MATLAB data generation + Python ML pipeline |
| 6 | Publication-Ready | All results, plots, and analysis included |

📊 Performance Results

Accuracy vs SNR (Multi-User Scenario)

| SNR (dB) | Neural Network | Correlation Decoder | Gain |
|----------|----------------|---------------------|------|
| 0  | 64.7% ± 1.2% | 41.5% | +23.2% |
| 5  | 88.0% ± 0.9% | 48.0% | +40.0% |
| 10 | 97.4% ± 0.3% | 51.2% | +46.2% ⭐ |
| 15 | 98.1% ± 0.2% | 52.6% | +45.5% ⭐ |
| 20 | 98.2% ± 0.2% | 53.1% | +45.1% ⭐ |

Key Metrics

  • Maximum Accuracy: 98.2% (SNR ≥ 15 dB)
  • Average Gain: +40.1% over the correlation decoder
  • 3GPP Gap: < 1% from the 99% requirement (at SNR ≥ 15 dB)
  • Statistical Confidence: 95% CI width < 2%

Training Performance

  • Best Validation Accuracy: 97.36% (epoch 300)
  • Training Time: ~15 minutes (CPU)
  • Model Size: 134 KB
  • Inference Time: 0.15 ms/sample


🚀 Installation

Prerequisites

  • Python 3.12+
  • MATLAB R2023a+ (for data generation)
  • 10 GB free disk space

Clone Repository

git clone https://github.com/G-ALI007/pucch-format0-ml-decoder.git
cd pucch-format0-ml-decoder
Create Virtual Environment
python -m venv venv
source venv/bin/activate  # Linux/Mac
# or
venv\Scripts\activate  # Windows
Install Dependencies
pip install -r requirements.txt
Requirements
tensorflow>=2.21.0
tf_keras>=2.15.0
numpy>=1.26.0
pandas>=2.2.0
scikit-learn>=1.4.0
matplotlib>=3.8.0
seaborn>=0.13.0
⚡ Quick Start
1. Generate Data (MATLAB)
% Single-User
run('matlab/generate_pucch_f0_data.m')

% Multi-User (2 & 3 users)
run('matlab/generate_pucch_f0_multiuser.m')

% DTX Detection
run('matlab/generate_pucch_f0_dtx.m')
2. Run Single-User Pipeline
python main.py
3. Run Multi-User Pipeline
python main_multi_ue.py
4. Run Statistical Validation (5 runs)
python run_multi_experiments_multi_ue.py --runs 5
5. Run DTX Detection Pipeline
python main_dtx.py
πŸ“ Project Structure
pucch-format0-ml-decoder/
β”‚
β”œβ”€β”€ πŸ“„ README.md                    # This file
β”œβ”€β”€ πŸ“„ LICENSE                      # MIT License
β”œβ”€β”€ πŸ“„ requirements.txt             # Python dependencies
β”œ               
β”‚
β”œβ”€β”€ πŸ“ matlab/                      # MATLAB data generation
β”‚   β”œβ”€β”€ generate_pucch_f0_data.m           # Single-user
β”‚   β”œβ”€β”€ generate_pucch_f0_multiuser.m      # Multi-user (2 & 3 users)
β”‚   └── generate_pucch_f0_dtx.m            # DTX detection
β”‚
β”œβ”€β”€ πŸ“ python/                      # Python ML pipeline
β”‚   β”œβ”€β”€ config.py                   # Base configuration
β”‚   β”œβ”€β”€ config_multi_ue.py          # Multi-user configuration
β”‚   β”œβ”€β”€ config_dtx.py               # DTX configuration
β”‚   β”œβ”€β”€ data_loader.py              # Data loading
β”‚   β”œβ”€β”€ data_loader_multi_ue.py     # Multi-user data loading
β”‚   β”œβ”€β”€ data_preprocessing.py       # Preprocessing
β”‚   β”œβ”€β”€ data_preprocessing_multi_ue.py
β”‚   β”œβ”€β”€ model.py                    # Neural network model
β”‚   β”œβ”€β”€ evaluation.py               # Evaluation metrics
β”‚   β”œβ”€β”€ evaluation_multi_ue.py      # Multi-user evaluation
β”‚   β”œβ”€β”€ visualization.py            # Plotting
β”‚   β”œβ”€β”€ main.py                     # Single-user pipeline
β”‚   β”œβ”€β”€ main_multi_ue.py            # Multi-user pipeline
β”‚   β”œβ”€β”€ main_dtx.py                 # DTX pipeline
β”‚   β”œβ”€β”€ main_twostage.py            # Two-stage DTX
β”‚   β”œβ”€β”€ main_architectures.py       # Architecture comparison
β”‚   └── run_multi_experiments_multi_ue.py  # Statistical validation
β”‚
β”œβ”€β”€ πŸ“ results/                     # Single-user results
β”œβ”€β”€ πŸ“ results_multi_ue/            # Multi-user results
β”œβ”€β”€ πŸ“ results_dtx/                 # DTX results
β”œβ”€β”€ πŸ“ models/                      # Trained models
β”œβ”€β”€ πŸ“ plots/                       # Generated plots
└── πŸ“ logs/                        # Training logs
📖 Usage
Single-User Mode
# Full pipeline
python main.py

# Evaluation only (requires trained model)
python main.py --eval-only

# Without plots (batch mode)
python main.py --no-plots
Multi-User Mode
# Full pipeline (2 users)
python main_multi_ue.py

# 3 users (edit config_multi_ue.py first)
# CURRENT_SCENARIO_KEY = "3users"
python main_multi_ue.py

# Statistical validation (5 runs)
python run_multi_experiments_multi_ue.py --runs 5
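The "mean ± 95% CI over 5 runs" summary can be reproduced with a small-sample Student t interval; the per-run accuracy values below are made up purely for illustration.

```python
import numpy as np

# Five per-run accuracies in percent (illustrative, not actual results).
runs = np.array([97.1, 97.5, 97.4, 97.3, 97.7])

mean = runs.mean()
sem = runs.std(ddof=1) / np.sqrt(len(runs))  # standard error of the mean
t_975 = 2.776                                # t-quantile for a 95% CI, df = 4
ci_half_width = t_975 * sem

print(f"accuracy: {mean:.2f}% ± {ci_half_width:.2f}%")
```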
DTX Detection Mode
# Full DTX pipeline
python main_dtx.py

# Two-stage approach
python main_twostage.py
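As a hypothetical sketch of the two-stage idea (the repository's main_twostage.py may be implemented differently): stage 1 declares DTX when the received energy in the PUCCH resource falls below a threshold, and stage 2 runs the UCI classifier only on the samples that passed stage 1.

```python
import numpy as np

def two_stage_decode(rx_symbols, classify, energy_threshold=0.1):
    """Hypothetical two-stage DTX handling.

    rx_symbols: (batch, 12) complex array, one PUCCH Format 0 resource per row.
    classify:   UCI classifier applied only to non-DTX samples (stage 2).
    """
    # Stage 1: energy detection per sample.
    energy = np.mean(np.abs(rx_symbols) ** 2, axis=1)
    labels = np.full(len(rx_symbols), "DTX", dtype=object)
    active = energy >= energy_threshold
    # Stage 2: classify UCI for samples that carried a transmission.
    if active.any():
        labels[active] = classify(rx_symbols[active])
    return labels
```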
Architecture Comparison
# Compare different neural network architectures
python main_architectures.py
📊 Results
All results are saved in:
  • results_multi_ue/statistical/statistical_summary.csv - mean ± standard deviation per SNR
  • results_multi_ue/statistical/results_table.tex - LaTeX table for the paper
  • results_multi_ue/plots/ - publication-ready figures
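The summary CSV can be post-processed with pandas; the column names below are assumptions made for this sketch, so check the actual file header first.

```python
import io
import pandas as pd

# Stand-in for results_multi_ue/statistical/statistical_summary.csv;
# the real file's columns may be named differently.
csv_text = """snr_db,nn_acc_mean,nn_acc_std,corr_acc_mean
0,64.70,1.2,41.53
10,97.40,0.3,51.17
20,98.20,0.2,53.08
"""
df = pd.read_csv(io.StringIO(csv_text))
df["gain"] = df["nn_acc_mean"] - df["corr_acc_mean"]  # NN gain over correlation
print(df.to_string(index=False))
```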
Example Output
--- Multi-User Results Summary ---
SNR (dB)    NN Acc (%)  Corr Acc (%)  Gain (%)
--------------------------------------------------
0           64.70       41.53         +23.17
5           88.00       47.58         +40.42
10          97.40       51.17         +46.23
15          98.10       52.55         +45.55
20          98.20       53.08         +45.12
--------------------------------------------------
Average Gain: +40.10%
📚 Citation
If you use this code in your research, please cite:
@misc{pucch-ml-decoder-2024,
  author = {Ghader Ali},
  title = {PUCCH Format 0 ML Decoder: Multi-User Interference Robust Decoding for 5G NR},
  year = {2024},
  howpublished = {\url{https://github.com/G-ALI007/pucch-format0-ml-decoder}},
  doi = {10.5281/zenodo.XXXXXX}
}
🤝 Contributing
Contributions are welcome! Please follow these steps:
1. Fork the repository
2. Create a feature branch (git checkout -b feature/amazing-feature)
3. Commit your changes (git commit -m 'Add amazing feature')
4. Push to the branch (git push origin feature/amazing-feature)
5. Open a Pull Request
Code Style
  • Follow PEP 8 for Python code
  • Use type hints for all functions
  • Include docstrings for all public functions
  • Add tests for new features
📧 Contact
Author: Ghader Ali
Email: aghader563@gmail.com
🙏 Acknowledgments
  • 3GPP for the PUCCH Format 0 specifications
  • MATLAB Communications Toolbox for data generation
  • TensorFlow/Keras for the neural network implementation
  • [Paper Base Reference] for the baseline architecture
📌 Disclaimer
This code is for research and educational purposes. It implements the methodology described in our paper but may require modifications for production deployment.
<div align="center">

Made with ❤️ for the 5G Research Community
⬆ Back to Top
</div>
