
Conversation

@google-labs-jules
Contributor

Implemented ONNX Runtime support for SenseVoiceSmall, allowing the project to run on non-RKNN platforms. RKNN functionality is now gated behind an `rknpu` feature flag. The initialization logic automatically detects the environment and loads the appropriate model (RKNN or ONNX). Manual configuration is supported via `init_with_config`.


PR created automatically by Jules for task 6135199353898499654 started by @darkautism

- Add `ort` dependency for all platforms to support ONNX Runtime inference.
- Feature-gate `rknn-rs` behind `rknpu` feature.
- Implement `SenseVoiceSmall::init` to automatically select between RKNN and ONNX backends.
- Add `SenseVoiceSmall::init_with_config` for manual configuration.
- Update `infer_vec` to handle inference via either RKNN or ONNX.
- Refactor `examples/basic.rs` and `examples/stream.rs` to reflect API changes.
- Ensure Chinese comments are used.

This enables running SenseVoiceSmall on non-RKNN platforms using ONNX while preserving RKNN support.
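The backend selection described above can be sketched as follows. This is a minimal, compilable illustration, not the crate's actual code: `Backend`, the struct fields, and the `infer_vec` body are hypothetical stand-ins, and the real `infer_vec` would call into `ort` or `rknn-rs` rather than return a string.

```rust
// Minimal sketch of feature-gated backend selection (illustrative only).
#[derive(Clone, Copy, Debug, PartialEq)]
enum Backend {
    Rknn,
    Onnx,
}

struct SenseVoiceSmall {
    backend: Backend,
}

impl SenseVoiceSmall {
    /// Compile-time selection: RKNN when the `rknpu` feature is enabled,
    /// ONNX Runtime otherwise.
    fn init() -> Self {
        let backend = if cfg!(feature = "rknpu") {
            Backend::Rknn
        } else {
            Backend::Onnx
        };
        Self { backend }
    }

    /// Manual configuration: the caller picks the backend explicitly.
    fn init_with_config(backend: Backend) -> Self {
        Self { backend }
    }

    /// Dispatch inference to whichever backend was selected.
    fn infer_vec(&self, samples: &[f32]) -> String {
        match self.backend {
            Backend::Rknn => format!("rknn inference on {} samples", samples.len()),
            Backend::Onnx => format!("onnx inference on {} samples", samples.len()),
        }
    }
}

fn main() {
    let model = SenseVoiceSmall::init();
    // Without the `rknpu` feature this takes the ONNX path.
    println!("{}", model.infer_vec(&[0.0; 16000]));

    let manual = SenseVoiceSmall::init_with_config(Backend::Rknn);
    assert_eq!(manual.backend, Backend::Rknn);
}
```

Using `cfg!(feature = "rknpu")` keeps both arms type-checked; in the real crate the RKNN arm would more likely sit behind `#[cfg(feature = "rknpu")]` so that `rknn-rs` is not compiled at all on non-RKNN targets.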
@google-labs-jules
Contributor Author

👋 Jules, reporting for duty! I'm here to lend a hand with this pull request.

When you start a review, I'll add a 👀 emoji to each comment to let you know I've read it. I'll focus on feedback directed at me and will do my best to stay out of conversations between you and other bots or reviewers to keep the noise down.

I'll push a commit with your requested changes shortly after. Please note there might be a delay between these steps, but rest assured I'm on the job!

For more direct control, you can switch me to Reactive Mode. When this mode is on, I will only act on comments where you specifically mention me with @jules. You can find this option in the Pull Request section of your global Jules UI settings. You can always switch back!


For security, I will only act on instructions from the user who triggered this task.

New to Jules? Learn more at jules.google/docs.

- Add `ort` dependency for all platforms to support ONNX Runtime inference.
- Feature-gate `rknn-rs` behind `rknpu` feature.
- Remove runtime `is_rknpu` check; strictly rely on `rknpu` feature to select backend.
- Implement `SenseVoiceSmall::init` to switch between RKNN (if feature enabled) and ONNX (default).
- Add `SenseVoiceSmall::init_with_config` for manual configuration.
- Update `infer_vec` to handle inference via either RKNN or ONNX.
- Refactor `examples/basic.rs` and `examples/stream.rs` to reflect API changes.
- Ensure Chinese comments are used.

This enables running SenseVoiceSmall on non-RKNN platforms using ONNX while preserving RKNN support when the feature is enabled.
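The `Cargo.toml` shape implied by these changes might look like the sketch below. The version strings and the exact `rknn-rs` entry are assumptions for illustration, not taken from the PR:

```toml
# Hypothetical sketch of the dependency/feature layout described above.
[dependencies]
ort = "2"                                      # ONNX Runtime, all platforms (version assumed)
rknn-rs = { version = "0.1", optional = true } # only built with the rknpu feature (version assumed)

[features]
rknpu = ["dep:rknn-rs"]
```

With this layout, `cargo build` produces an ONNX-only binary, while `cargo build --features rknpu` pulls in `rknn-rs` for RKNN targets.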
@darkautism darkautism marked this pull request as ready for review December 1, 2025 18:31
@darkautism darkautism closed this Dec 2, 2025
@darkautism darkautism deleted the onnx-support-rknpu-feature branch December 2, 2025 06:25
