Commit

Merge pull request #37 from roboflow/fix/update-jetson-dockers
Update Jetson Dockers
paulguerrie committed Sep 7, 2023
2 parents 6a32e9d + f9e0c80 commit 7cc992e
Showing 14 changed files with 52 additions and 36 deletions.
@@ -1,4 +1,4 @@
-name: Build and Push Jetson 4.5.1 Container
+name: Build and Push Jetson 4.5.0 Container

 on:
   release:
@@ -38,8 +38,8 @@ jobs:
 uses: docker/build-push-action@v4
 with:
 push: true
-tags: roboflow/roboflow-inference-server-trt-jetson:latest,roboflow/roboflow-inference-server-trt-jetson:${{ env.VERSION}}
-cache-from: type=registry,ref=roboflow/roboflow-inference-server-trt-jetson:cache
-cache-to: type=registry,ref=roboflow/roboflow-inference-server-trt-jetson:cache,mode=max
+tags: roboflow/roboflow-inference-server-jetson-4.5.0:latest,roboflow/roboflow-inference-server-jetson-4.5.0:${{ env.VERSION}}
+cache-from: type=registry,ref=roboflow/roboflow-inference-server-jetson-4.5.0:cache
+cache-to: type=registry,ref=roboflow/roboflow-inference-server-jetson-4.5.0:cache,mode=max
 platforms: linux/arm64
-file: ./docker/dockerfiles/Dockerfile.onnx.jetson
+file: ./docker/dockerfiles/Dockerfile.onnx.jetson.4.5.0
6 changes: 3 additions & 3 deletions .github/workflows/docker.jetson.4.6.1.yml
@@ -38,8 +38,8 @@ jobs:
 uses: docker/build-push-action@v4
 with:
 push: true
-tags: roboflow/roboflow-inference-server-trt-jetson-4.6.1:latest,roboflow/roboflow-inference-server-trt-jetson-4.6.1:${{ env.VERSION}}
-cache-from: type=registry,ref=roboflow/roboflow-inference-server-trt-jetson-4.6.1:cache
-cache-to: type=registry,ref=roboflow/roboflow-inference-server-trt-jetson-4.6.1:cache,mode=max
+tags: roboflow/roboflow-inference-server-jetson-4.6.1:latest,roboflow/roboflow-inference-server-jetson-4.6.1:${{ env.VERSION}}
+cache-from: type=registry,ref=roboflow/roboflow-inference-server-jetson-4.6.1:cache
+cache-to: type=registry,ref=roboflow/roboflow-inference-server-jetson-4.6.1:cache,mode=max
 platforms: linux/arm64
 file: ./docker/dockerfiles/Dockerfile.onnx.jetson.4.6.1
6 changes: 3 additions & 3 deletions .github/workflows/docker.jetson.5.1.1.yml
@@ -38,8 +38,8 @@ jobs:
 uses: docker/build-push-action@v4
 with:
 push: true
-tags: roboflow/roboflow-inference-server-trt-jetson-5.1.1:latest,roboflow/roboflow-inference-server-trt-jetson-5.1.1:${{ env.VERSION}}
-cache-from: type=registry,ref=roboflow/roboflow-inference-server-trt-jetson-5.1.1:cache
-cache-to: type=registry,ref=roboflow/roboflow-inference-server-trt-jetson-5.1.1:cache,mode=max
+tags: roboflow/roboflow-inference-server-jetson-5.1.1:latest,roboflow/roboflow-inference-server-jetson-5.1.1:${{ env.VERSION}}
+cache-from: type=registry,ref=roboflow/roboflow-inference-server-jetson-5.1.1:cache
+cache-to: type=registry,ref=roboflow/roboflow-inference-server-jetson-5.1.1:cache,mode=max
 platforms: linux/arm64
 file: ./docker/dockerfiles/Dockerfile.onnx.jetson.5.1.1
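The three workflows above all use the same registry-backed build cache pattern. The same cache can be exercised locally with `docker buildx`; the command below is a sketch, assuming you have pushed credentials for Docker Hub and a `docker-container` buildx builder configured, and is not taken from this commit:

```bash
# Hypothetical local reproduction of the workflow's registry-cached build.
# Assumes: `docker login` completed and a buildx builder exists
# (e.g. `docker buildx create --use`).
docker buildx build \
  --platform linux/arm64 \
  --cache-from type=registry,ref=roboflow/roboflow-inference-server-jetson-5.1.1:cache \
  --cache-to type=registry,ref=roboflow/roboflow-inference-server-jetson-5.1.1:cache,mode=max \
  -f docker/dockerfiles/Dockerfile.onnx.jetson.5.1.1 \
  -t roboflow/roboflow-inference-server-jetson-5.1.1:latest \
  --push .
```

With `mode=max`, intermediate layers are also exported to the cache ref, which is why repeat CI builds of these large Jetson images stay fast.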
4 changes: 2 additions & 2 deletions README.md
@@ -124,13 +124,13 @@ docker run --network=host --gpus=all roboflow/roboflow-inference-server-trt:late
 - Run on NVIDIA Jetson with JetPack `4.x`:

 ```bash
-docker run --privileged --net=host --runtime=nvidia roboflow/roboflow-inference-server-trt-jetson:latest
+docker run --privileged --net=host --runtime=nvidia roboflow/roboflow-inference-server-jetson:latest
 ```

 - Run on NVIDIA Jetson with JetPack `5.x`:

 ```bash
-docker run --privileged --net=host --runtime=nvidia roboflow/roboflow-inference-server-trt-jetson-5.1.1:latest
+docker run --privileged --net=host --runtime=nvidia roboflow/roboflow-inference-server-jetson-5.1.1:latest
 ```

 </details>
@@ -46,8 +46,6 @@ WORKDIR /app/
 COPY inference inference
 COPY docker/config/trt_http.py trt_http.py

-ENV ONNXRUNTIME_EXECUTION_PROVIDERS=TensorrtExecutionProvider
-ENV REQUIRED_ONNX_PROVIDERS=TensorrtExecutionProvider
 ENV PROJECT=roboflow-platform
 ENV ORT_TENSORRT_FP16_ENABLE=1
 ENV ORT_TENSORRT_ENGINE_CACHE_ENABLE=1
2 changes: 0 additions & 2 deletions docker/dockerfiles/Dockerfile.onnx.jetson.4.6.1
@@ -49,8 +49,6 @@ WORKDIR /app/
 COPY inference inference
 COPY docker/config/trt_http.py trt_http.py

-ENV ONNXRUNTIME_EXECUTION_PROVIDERS=TensorrtExecutionProvider
-ENV REQUIRED_ONNX_PROVIDERS=TensorrtExecutionProvider
 ENV PROJECT=roboflow-platform
 ENV ORT_TENSORRT_FP16_ENABLE=1
 ENV ORT_TENSORRT_ENGINE_CACHE_ENABLE=1
2 changes: 0 additions & 2 deletions docker/dockerfiles/Dockerfile.onnx.jetson.5.1.1
@@ -46,8 +46,6 @@ WORKDIR /app/
 COPY inference inference
 COPY docker/config/trt_http.py trt_http.py

-ENV ONNXRUNTIME_EXECUTION_PROVIDERS=TensorrtExecutionProvider
-ENV REQUIRED_ONNX_PROVIDERS=TensorrtExecutionProvider
 ENV PROJECT=roboflow-platform
 ENV ORT_TENSORRT_FP16_ENABLE=1
 ENV ORT_TENSORRT_ENGINE_CACHE_ENABLE=1
2 changes: 1 addition & 1 deletion docker/publish/jetson_trt_4.4.1_http.sh
@@ -1 +1 @@
-docker/publish/deploy_docker_image.sh roboflow/roboflow-inference-server-trt-jetson-4.4.1 docker/dockerfiles/Dockerfile.onnx.jetson.4.4.1
+docker/publish/deploy_docker_image.sh roboflow/roboflow-inference-server-jetson-4.4.1 docker/dockerfiles/Dockerfile.onnx.jetson.4.4.1
2 changes: 1 addition & 1 deletion docker/publish/jetson_trt_http.sh
@@ -1 +1 @@
-docker/publish/deploy_docker_image.sh roboflow/roboflow-inference-server-trt-jetson docker/dockerfiles/Dockerfile.onnx.jetson
+docker/publish/deploy_docker_image.sh roboflow/roboflow-inference-server-jetson docker/dockerfiles/Dockerfile.onnx.jetson
2 changes: 1 addition & 1 deletion docker/publish/jetson_trt_http_5.1.1.sh
@@ -1 +1 @@
-docker/publish/deploy_docker_image.sh roboflow/roboflow-inference-server-trt-jetson-5.1.1 docker/dockerfiles/Dockerfile.onnx.jetson.5.1.1
+docker/publish/deploy_docker_image.sh roboflow/roboflow-inference-server-jetson-5.1.1 docker/dockerfiles/Dockerfile.onnx.jetson.5.1.1
42 changes: 32 additions & 10 deletions docs/quickstart/docker.md
@@ -40,18 +40,25 @@ hardware configurations.
 docker pull roboflow/roboflow-inference-server-trt
 ```

-=== "Jetson 4.x"
-    Official Roboflow Inference Server Docker Image for Nvidia Jetson JetPack 4.x Targets.
+=== "Jetson 4.5.x"
+    Official Roboflow Inference Server Docker Image for Nvidia Jetson JetPack 4.5.x Targets.

 ```
-docker pull roboflow/roboflow-inference-server-trt-jetson
+docker pull roboflow/roboflow-inference-server-jetson-4.5.0
 ```

+=== "Jetson 4.6.x"
+    Official Roboflow Inference Server Docker Image for Nvidia Jetson JetPack 4.6.x Targets.
+
+```
+docker pull roboflow/roboflow-inference-server-jetson-4.6.1
+```
+
 === "Jetson 5.x"
     Official Roboflow Inference Server Docker Image for Nvidia Jetson JetPack 5.x Targets.

 ```
-docker pull roboflow/roboflow-inference-server-trt-jetson-5.1.1
+docker pull roboflow/roboflow-inference-server-jetson-5.1.1
 ```

 ## Run
@@ -85,18 +92,26 @@ Server in a container.
 roboflow/roboflow-inference-server-trt:latest
 ```

-=== "Jetson 4.x"
+=== "Jetson 4.5.x"
+```
+docker run --privileged --net=host --runtime=nvidia \
+roboflow/roboflow-inference-server-jetson-4.5.0:latest
+```
+
+=== "Jetson 4.6.x"
 ```
 docker run --privileged --net=host --runtime=nvidia \
-roboflow/roboflow-inference-server-trt-jetson:latest
+roboflow/roboflow-inference-server-jetson-4.6.1:latest
 ```

 === "Jetson 5.x"
 ```
 docker run --privileged --net=host --runtime=nvidia \
-roboflow/roboflow-inference-server-trt-jetson-5.1.1:latest
+roboflow/roboflow-inference-server-jetson-5.1.1:latest
 ```

+**_Note:_** The Jetson images come with TensorRT dependencies. To use TensorRT acceleration with your model, pass an additional environment variable at runtime: `-e ONNXRUNTIME_EXECUTION_PROVIDERS=TensorrtExecutionProvider`. This can improve inference speed; however, it also incurs a costly startup expense when the model is loaded.
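Putting the note above together with the run commands, opting in to TensorRT acceleration looks like the following. This is a sketch assembled from the flags documented in this commit (shown here against the JetPack 5.x image):

```bash
# Opt in to TensorRT acceleration at container startup.
# Expect a slow first model load while the TensorRT engine is built.
docker run --privileged --net=host --runtime=nvidia \
  -e ONNXRUNTIME_EXECUTION_PROVIDERS=TensorrtExecutionProvider \
  roboflow/roboflow-inference-server-jetson-5.1.1:latest
```

Because the environment variable is no longer baked into the Dockerfiles by this commit, omitting the `-e` flag runs the server without TensorRT, trading peak inference speed for a faster startup.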

## Build

To build a Docker image locally, first clone the Inference Server repository.
@@ -137,16 +152,23 @@ Choose a Dockerfile from the following options, depending on the hardware you wa
 roboflow/roboflow-inference-server-trt .
 ```

-=== "Jetson 4.x"
+=== "Jetson 4.5.x"
+```
+docker build \
+-f dockerfiles/Dockerfile.onnx.jetson \
+-t roboflow/roboflow-inference-server-jetson-4.5.0 .
+```
+
+=== "Jetson 4.6.x"
 ```
 docker build \
 -f dockerfiles/Dockerfile.onnx.jetson \
--t roboflow/roboflow-inference-server-trt-jetson .
+-t roboflow/roboflow-inference-server-jetson-4.6.1 .
 ```

 === "Jetson 5.x"
 ```
 docker build \
 -f dockerfiles/Dockerfile.onnx.jetson.5.1.1 \
--t roboflow/roboflow-inference-server-trt-jetson-5.1.1 .
+-t roboflow/roboflow-inference-server-jetson-5.1.1 .
 ```
2 changes: 1 addition & 1 deletion docs/quickstart/http_inference.md
@@ -27,7 +27,7 @@ sudo docker run -it --rm -p 9001:9001 roboflow/roboflow-inference-server-arm-cpu
 ### TRT

 ```bash
-sudo docker run --privileged --net=host --runtime=nvidia --mount source=roboflow,target=/cache -e NUM_WORKERS=1 roboflow/roboflow-inference-server-trt-jetson:latest
+sudo docker run --privileged --net=host --runtime=nvidia --mount source=roboflow,target=/cache -e NUM_WORKERS=1 roboflow/roboflow-inference-server-jetson:latest
 ```

 ### GPU
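Once a server container like the one above is listening (the quickstart maps port 9001), inference requests go over plain HTTP. The route and query parameters below are illustrative assumptions for sketching a request, not confirmed by this commit; `<dataset>`, `<version>`, and `<YOUR_API_KEY>` are placeholders:

```bash
# Hypothetical request sketch; the path segments and query parameters
# are assumptions, not taken from this commit.
curl -s -X POST \
  "http://localhost:9001/<dataset>/<version>?api_key=<YOUR_API_KEY>&image=https://example.com/image.jpg"
```

The response, if the endpoint matches this shape, would be a JSON payload of predictions from the model identified by the dataset/version pair.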
4 changes: 2 additions & 2 deletions examples/inference-client/README.md
@@ -74,13 +74,13 @@ docker run --gpus=all --net=host -e STREAM_ID=0 -e MODEL_ID=<> -e API_KEY=<> rob
 - Run on Nvidia Jetson with JetPack `4.x`:

 ```bash
-docker run --privileged --net=host --runtime=nvidia roboflow/roboflow-inference-server-trt-jetson:latest
+docker run --privileged --net=host --runtime=nvidia roboflow/roboflow-inference-server-jetson:latest
 ```

 - Run on Nvidia Jetson with JetPack `5.x`:

 ```bash
-docker run --privileged --net=host --runtime=nvidia roboflow/roboflow-inference-server-trt-jetson-5.1.1:latest
+docker run --privileged --net=host --runtime=nvidia roboflow/roboflow-inference-server-jetson-5.1.1:latest
 ```

 ### UDP
2 changes: 1 addition & 1 deletion inference/core/version.py
@@ -1,4 +1,4 @@
-__version__ = "0.8.1"
+__version__ = "0.8.2"

if __name__ == "__main__":
print(__version__)
