Commit

chore: linter fixes by black
peri044 committed Jan 29, 2025
1 parent 8b3e207 commit f08656b
Showing 25 changed files with 47 additions and 30 deletions.
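
Two mechanical patterns account for nearly all of the hunks below: in the example scripts, the closing `"""` of the module docstring moves onto its own line, and in the library code a trailing comma is appended when a parameter list is already split across lines. A minimal before/after sketch of the trailing-comma change, using a hypothetical helper rather than code from this commit:

from typing import Optional, Tuple


# Before formatting: the lone parameter sits on its own line without a trailing comma.
def parse_domain_unformatted(
    domain: Optional[Tuple[float, float]]
) -> Tuple[float, float]:
    # Fall back to the unit interval when no domain is given.
    return domain if domain is not None else (0.0, 1.0)


# After formatting: a trailing comma follows the last parameter, which is the
# change repeated throughout the library files in this commit.
def parse_domain_formatted(
    domain: Optional[Tuple[float, float]],
) -> Tuple[float, float]:
    return domain if domain is not None else (0.0, 1.0)

The remaining hunks are comparable cleanups: hexadecimal escape codes lower-cased, an import moved, a redundant pair of parentheses dropped, and an extra blank line removed.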
@@ -4,7 +4,8 @@
Torch Compile Advanced Usage
======================================================
-This interactive script is intended as an overview of the process by which `torch_tensorrt.compile(..., ir="torch_compile", ...)` works, and how it integrates with the `torch.compile` API."""
+This interactive script is intended as an overview of the process by which `torch_tensorrt.compile(..., ir="torch_compile", ...)` works, and how it integrates with the `torch.compile` API.
+"""

# %%
# Imports and Model Definition
@@ -4,7 +4,8 @@
Compiling GPT2 using the dynamo backend
==========================================================
-This script illustrates Torch-TensorRT workflow with dynamo backend on popular GPT2 model."""
+This script illustrates Torch-TensorRT workflow with dynamo backend on popular GPT2 model.
+"""

# %%
# Imports and Model Definition
@@ -4,7 +4,8 @@
Torch Export with Cudagraphs
======================================================
-This interactive script is intended as an overview of the process by which the Torch-TensorRT Cudagraphs integration can be used in the `ir="dynamo"` path. The functionality works similarly in the `torch.compile` path as well."""
+This interactive script is intended as an overview of the process by which the Torch-TensorRT Cudagraphs integration can be used in the `ir="dynamo"` path. The functionality works similarly in the `torch.compile` path as well.
+"""

# %%
# Imports and Model Definition
@@ -4,7 +4,8 @@
Compiling ResNet using the Torch-TensorRT Dyanmo Frontend
==========================================================
-This interactive script is intended as a sample of the `torch_tensorrt.dynamo.compile` workflow on a ResNet model."""
+This interactive script is intended as a sample of the `torch_tensorrt.dynamo.compile` workflow on a ResNet model.
+"""

# %%
# Imports and Model Definition
@@ -4,7 +4,8 @@
Compiling Llama2 using the dynamo backend
==========================================================
-This script illustrates Torch-TensorRT workflow with dynamo backend on popular Llama2 model."""
+This script illustrates Torch-TensorRT workflow with dynamo backend on popular Llama2 model.
+"""

# %%
# Imports and Model Definition
@@ -4,7 +4,8 @@
Compiling ResNet with dynamic shapes using the `torch.compile` backend
==========================================================
-This interactive script is intended as a sample of the Torch-TensorRT workflow with `torch.compile` on a ResNet model."""
+This interactive script is intended as a sample of the Torch-TensorRT workflow with `torch.compile` on a ResNet model.
+"""

# %%
# Imports and Model Definition
@@ -4,7 +4,8 @@
Compiling BERT using the `torch.compile` backend
==============================================================
-This interactive script is intended as a sample of the Torch-TensorRT workflow with `torch.compile` on a BERT model."""
+This interactive script is intended as a sample of the Torch-TensorRT workflow with `torch.compile` on a BERT model.
+"""

# %%
# Imports and Model Definition
@@ -4,7 +4,8 @@
Dynamo Compile Advanced Usage
======================================================
-This interactive script is intended as an overview of the process by which `torch_tensorrt.dynamo.compile` works, and how it integrates with the new `torch.compile` API."""
+This interactive script is intended as an overview of the process by which `torch_tensorrt.dynamo.compile` works, and how it integrates with the new `torch.compile` API.
+"""

# %%
# Imports and Model Definition
@@ -4,7 +4,8 @@
Compiling a Transformer using torch.compile and TensorRT
==============================================================
-This interactive script is intended as a sample of the `torch_tensorrt.dynamo.compile` workflow on a transformer-based model."""
+This interactive script is intended as a sample of the `torch_tensorrt.dynamo.compile` workflow on a transformer-based model.
+"""

# %%
# Imports and Model Definition
@@ -4,7 +4,8 @@
Compiling ResNet using the Torch-TensorRT Dyanmo Frontend
==========================================================
-This interactive script is intended as a sample of the `torch_tensorrt.dynamo.compile` workflow on a ResNet model."""
+This interactive script is intended as a sample of the `torch_tensorrt.dynamo.compile` workflow on a ResNet model.
+"""

# %%
# Imports and Model Definition
@@ -4,7 +4,8 @@
Dynamo Compile Advanced Usage
======================================================
-This interactive script is intended as an overview of the process by which `torch_tensorrt.dynamo.compile` works, and how it integrates with the new `torch.compile` API."""
+This interactive script is intended as an overview of the process by which `torch_tensorrt.dynamo.compile` works, and how it integrates with the new `torch.compile` API.
+"""

# %%
# Imports and Model Definition
@@ -4,7 +4,8 @@
Compiling a Transformer using torch.compile and TensorRT
==============================================================
-This interactive script is intended as a sample of the `torch_tensorrt.dynamo.compile` workflow on a transformer-based model."""
+This interactive script is intended as a sample of the `torch_tensorrt.dynamo.compile` workflow on a transformer-based model.
+"""

# %%
# Imports and Model Definition
3 changes: 2 additions & 1 deletion examples/dynamo/torch_compile_advanced_usage.py
@@ -4,7 +4,8 @@
Torch Compile Advanced Usage
======================================================
-This interactive script is intended as an overview of the process by which `torch_tensorrt.compile(..., ir="torch_compile", ...)` works, and how it integrates with the `torch.compile` API."""
+This interactive script is intended as an overview of the process by which `torch_tensorrt.compile(..., ir="torch_compile", ...)` works, and how it integrates with the `torch.compile` API.
+"""

# %%
# Imports and Model Definition
3 changes: 2 additions & 1 deletion examples/dynamo/torch_compile_resnet_example.py
@@ -4,7 +4,8 @@
Compiling ResNet with dynamic shapes using the `torch.compile` backend
==========================================================
-This interactive script is intended as a sample of the Torch-TensorRT workflow with `torch.compile` on a ResNet model."""
+This interactive script is intended as a sample of the Torch-TensorRT workflow with `torch.compile` on a ResNet model.
+"""

# %%
# Imports and Model Definition
3 changes: 2 additions & 1 deletion examples/dynamo/torch_compile_transformers_example.py
@@ -4,7 +4,8 @@
Compiling BERT using the `torch.compile` backend
==============================================================
-This interactive script is intended as a sample of the Torch-TensorRT workflow with `torch.compile` on a BERT model."""
+This interactive script is intended as a sample of the Torch-TensorRT workflow with `torch.compile` on a BERT model.
+"""

# %%
# Imports and Model Definition
3 changes: 2 additions & 1 deletion examples/dynamo/torch_export_cudagraphs.py
@@ -4,7 +4,8 @@
Torch Export with Cudagraphs
======================================================
-This interactive script is intended as an overview of the process by which the Torch-TensorRT Cudagraphs integration can be used in the `ir="dynamo"` path. The functionality works similarly in the `torch.compile` path as well."""
+This interactive script is intended as an overview of the process by which the Torch-TensorRT Cudagraphs integration can be used in the `ir="dynamo"` path. The functionality works similarly in the `torch.compile` path as well.
+"""

# %%
# Imports and Model Definition
3 changes: 2 additions & 1 deletion examples/dynamo/torch_export_gpt2.py
@@ -4,7 +4,8 @@
Compiling GPT2 using the dynamo backend
==========================================================
-This script illustrates Torch-TensorRT workflow with dynamo backend on popular GPT2 model."""
+This script illustrates Torch-TensorRT workflow with dynamo backend on popular GPT2 model.
+"""

# %%
# Imports and Model Definition
3 changes: 2 additions & 1 deletion examples/dynamo/torch_export_llama2.py
@@ -4,7 +4,8 @@
Compiling Llama2 using the dynamo backend
==========================================================
-This script illustrates Torch-TensorRT workflow with dynamo backend on popular Llama2 model."""
+This script illustrates Torch-TensorRT workflow with dynamo backend on popular Llama2 model.
+"""

# %%
# Imports and Model Definition
2 changes: 1 addition & 1 deletion py/torch_tensorrt/_Input.py
@@ -261,7 +261,7 @@ def _supported_input_size_type(input_size: Any) -> bool:

@staticmethod
def _parse_tensor_domain(
-domain: Optional[Tuple[float, float]]
+domain: Optional[Tuple[float, float]],
) -> Tuple[float, float]:
"""
Produce a tuple of integers which specifies a tensor domain in the interval format: [lo, hi)
2 changes: 1 addition & 1 deletion py/torch_tensorrt/_enums.py
@@ -1200,7 +1200,7 @@ def _from(

@classmethod
def try_from(
-c: Union[trt.EngineCapability, EngineCapability]
+c: Union[trt.EngineCapability, EngineCapability],
) -> Optional[EngineCapability]:
"""Create a Torch-TensorRT engine capability enum from a TensorRT engine capability enum.
6 changes: 3 additions & 3 deletions py/torch_tensorrt/dynamo/conversion/_TRTBuilderMonitor.py
@@ -53,13 +53,13 @@ def _redraw(self, *, blank_lines: int = 0) -> None:
if self._render:

def clear_line() -> None:
-print("\x1B[2K", end="")
+print("\x1b[2K", end="")

def move_to_start_of_line() -> None:
-print("\x1B[0G", end="")
+print("\x1b[0G", end="")

def move_cursor_up(lines: int) -> None:
-print("\x1B[{}A".format(lines), end="")
+print("\x1b[{}A".format(lines), end="")

def progress_bar(steps: int, num_steps: int) -> str:
INNER_WIDTH = 10
4 changes: 2 additions & 2 deletions py/torch_tensorrt/dynamo/conversion/impl/activation/ops.py
@@ -247,7 +247,7 @@ def hard_sigmoid(
operation_type = trt.ActivationType.HARD_SIGMOID

def hard_sigmoid_dyn_range_fn(
-dyn_range: Tuple[float, float]
+dyn_range: Tuple[float, float],
) -> Tuple[float, float]:
def hard_sigmoid_fn(x: float) -> float:
return max(0, min(1, alpha * x + beta))
@@ -310,7 +310,7 @@ def thresholded_relu(
operation_type = trt.ActivationType.THRESHOLDED_RELU

def thresholded_relu_dyn_range_fn(
-dyn_range: Tuple[float, float]
+dyn_range: Tuple[float, float],
) -> Tuple[float, float]:
def thresholded_relu_fn(x: float) -> float:
return x if x > alpha else 0
2 changes: 1 addition & 1 deletion py/torch_tensorrt/dynamo/utils.py
@@ -465,7 +465,7 @@ def to_torch_device(device: Optional[Union[Device, torch.device, str]]) -> torch


def to_torch_tensorrt_device(
-device: Optional[Union[Device, torch.device, str]]
+device: Optional[Union[Device, torch.device, str]],
) -> Device:
"""Cast a device-type to torch_tensorrt.Device
2 changes: 1 addition & 1 deletion py/torch_tensorrt/fx/test/converters/acc_op/test_where.py
@@ -101,7 +101,7 @@ def __init__(self, x_shape, y_shape):
def forward(self, condition):
return torch.where(condition, self.x, self.y)

-inputs = [(torch.randn(condition_shape) > 0)]
+inputs = [torch.randn(condition_shape) > 0]
self.run_test(
Where(x_shape, y_shape),
inputs,
5 changes: 2 additions & 3 deletions py/torch_tensorrt/fx/tracer/acc_tracer/acc_tracer.py
@@ -10,7 +10,6 @@
from typing import (
Any,
Callable,
-cast,
Dict,
Iterable,
Optional,
@@ -19,6 +18,7 @@
Tuple,
Type,
Union,
+cast,
)

import torch
@@ -32,7 +32,6 @@

from . import acc_normalizer, acc_ops, acc_shape_prop, acc_utils # noqa: F401

-
_LOGGER = logging.getLogger(__name__)


@@ -517,7 +516,7 @@ def _replace_transpose_last_dims_impl(
changed = False

def _calculate_dim(
-transpose_dim: Union[torch.fx.Node, int]
+transpose_dim: Union[torch.fx.Node, int],
) -> Union[torch.fx.Node, int]:
nonlocal transpose_input_node
nonlocal changed
