Pull requests: openai/gpt-oss
#229: Remove sorting from topk in forward method (opened Nov 9, 2025 by Guido1Alessandro1Trevisan)
#227: refactor: remove unused base_url and session params from replace_images (opened Nov 7, 2025 by Ronitsabhaya75)
#226: docs: Update vLLM latest installation instructions with CUDA and non-CUDA paths (opened Nov 7, 2025 by sneha-rudra)
#225: Add llama.cpp server inference backend for responses_api (opened Nov 6, 2025 by tarruda)
#223: Fix: Correct type hint for _compute_concentration_and_inv_freq (opened Nov 5, 2025 by AkshithAI)
#219: Add Parallel as an additional backend for browser (opened Oct 22, 2025 by anshultomar746)
#217: feat: add JAX/Flax reference implementation for inference (opened Oct 22, 2025 by atveit)
#209: Fix: Resolve critical bugs in vLLM Online and Offline inference (opened Oct 7, 2025 by hrithiksagar)
#207: Add RAG Example using FAISS and Harmony Prompts (opened Oct 5, 2025 by Ujjwal-Bajpayee, 12 tasks done)
#203: Add stateful Jupyter Notebook option and make browser more reliable (opened Oct 2, 2025 by romainhuet)
#193: chore: minimal PEP 8 hygiene (unused imports/vars, trivial fixes) (opened Sep 18, 2025 by harjothkhara)