# CODEOWNERS file for tpu-inference
# This file defines code ownership for different parts of the repository.
# Each line is a file pattern followed by one or more owners.
# Owners are notified when PRs modify code in their areas.
#
# Order matters - the last matching pattern takes precedence.
# Analysis includes full history from tpu_commons and tpu_inference paths.
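#
# Example of precedence (using two entries from this file): a PR touching
# /tpu_inference/runner/lora_utils.py matches both /tpu_inference/runner/
# and the later, more specific /tpu_inference/runner/lora_utils.py pattern,
# so only @vanbasten23 is requested for review.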
# Default owners for everything in the repo (fallback)
* @vipannalla

# CI/CD and Build Configuration
/.buildkite/ @jcyang43 @QiliangCui
/.github/ @jcyang43 @QiliangCui

# Documentation
/docs/ @bvrockwell
/README.md @bvrockwell
/CONTRIBUTING.md @jrplatin @bvrockwell

# Distributed Computing
/tpu_inference/distributed/ @mrjunwan-lang @sixiang-google

# Kernel Implementations (Performance-critical)
/tpu_inference/kernels/ @kyuyeunk @bythew3i

# Model Layers (JAX and vLLM)
/tpu_inference/layers/jax/ @bzgoogle @jrplatin @gpolovets1
/tpu_inference/layers/vllm/ @kyuyeunk @vanbasten23

# Model Implementations
/tpu_inference/models/jax/qwen2_5_vl.py @kwang3939
/tpu_inference/models/jax/gpt_oss.py @bzgoogle
/tpu_inference/models/jax/deepseek_v3.py @bzgoogle @gpolovets1 @jrplatin
/tpu_inference/models/vllm/ @kyuyeunk @vanbasten23

# Runner and Execution
/tpu_inference/runner/ @kyuyeunk @jrplatin @wenxindongwork @sixiang-google @mrjunwan-lang
/tpu_inference/runner/tpu_runner.py @jrplatin @kyuyeunk @wenxindongwork @sixiang-google
/tpu_inference/runner/persistent_batch_manager.py @jrplatin @wenxindongwork
/tpu_inference/runner/speculative_decoding_manager.py @Lumosis
/tpu_inference/executors/ @sixiang-google @mrjunwan-lang
/tpu_inference/core/ @sixiang-google @mrjunwan-lang @wenxindongwork

# Worker Management
/tpu_inference/worker/ @sixiang-google @mrjunwan-lang @jrplatin @vanbasten23 @wenxindongwork

# Speculative Decoding
/tpu_inference/spec_decode/ @Lumosis

# Platform Support
/tpu_inference/platforms/ @sixiang-google @mrjunwan-lang

# LoRA and Adapters
/tpu_inference/lora/ @vanbasten23
/tpu_inference/runner/lora_utils.py @vanbasten23

# Docker Configuration
/docker/ @jrplatin @QiliangCui