Run custom model on Ubuntu Docker Deepstack for GPU
Posted: Mon Jan 17, 2022 7:59 am
I want to run a custom model with the DeepStack GPU Docker image on Ubuntu. What command should I enter in the terminal?
For the standard launch of the built-in DeepStack models, I use: docker run --gpus all -e VISION-DETECTION=True -v localstorage:/datastore -p 80:5000 deepquestai/deepstack:gpu
This works great in Blue Iris.
I want to run a custom model and the built-in models (person, truck, car, ...) at the same time. What commands do I need to enter in the terminal?
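My guess, based on what I've read in the DeepStack custom-model docs (so please correct me if the mount path is wrong), is that both can run in one container by mounting the custom model folder at /modelstore/detection while keeping VISION-DETECTION enabled, something like:

docker run --gpus all -e VISION-DETECTION=True -v localstorage:/datastore -v /path/to/custom-models:/modelstore/detection -p 80:5000 deepquestai/deepstack:gpu

(where /path/to/custom-models is a placeholder for the folder holding the custom .pt model). The custom model would then presumably answer on /v1/vision/custom/<model-name> while the built-in objects stay on /v1/vision/detection, but I'd appreciate confirmation that this is the right approach.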