
Configuring the NVIDIA Jetson Thor
My experience with the Jetson AGX Thor (T5000) and how I configured it.
This is a running list of open-source tools I use for my local AI projects.
The NVIDIA Jetson™ platform accelerates computing for next-gen robotics and edge AI.
I use the Jetson Orin kits for my edge AI projects and the AGX Thor for training or running local LLMs. The 128GB of unified memory on the Thor makes it possible to run large models as well as train them.
All things AI on the Jetson platforms (Orin, Thor, etc.).
Open WebUI is an extensible, feature-rich, and user-friendly self-hosted AI platform designed to operate entirely offline. It supports various LLM runners like Ollama and OpenAI-compatible APIs, with a built-in inference engine for RAG, making it a powerful AI deployment solution.
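Besides the browser UI, Open WebUI exposes an OpenAI-compatible chat endpoint, which is handy for scripting against whatever models it fronts. The sketch below is a minimal Python call against a default local install; the port, endpoint path, API key, and model name are assumptions for illustration, not taken from my setup.

```python
import requests

# Assumed defaults: Open WebUI on localhost:3000 and an API key created in
# Settings -> Account. Both are placeholders; adjust for your own install.
OPENWEBUI_URL = "http://localhost:3000/api/chat/completions"
API_KEY = "sk-..."  # placeholder

payload = {
    "model": "llama3.2",  # any model already available to Open WebUI / Ollama
    "messages": [
        {"role": "user", "content": "Summarize what the Jetson AGX Thor is."}
    ],
}

resp = requests.post(
    OPENWEBUI_URL,
    headers={"Authorization": f"Bearer {API_KEY}"},
    json=payload,
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```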
Label Studio is a multi-purpose data labeling and annotation tool with flexible data interfaces for building powerful machine learning pipelines. It is open-source and free for everyone.
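Label Studio can also be driven through its REST API, which is useful for wiring it into a pipeline. A minimal sketch for creating a project from Python follows; the URL, access token, endpoint, and the simple image-classification labeling config are all assumptions about a default local install.

```python
import requests

# Assumed defaults: Label Studio on localhost:8080 and an access token copied
# from Account & Settings. Both are placeholders for your install.
LS_URL = "http://localhost:8080"
TOKEN = "your-access-token"

# A minimal image-classification labeling config (Label Studio XML).
label_config = """
<View>
  <Image name="image" value="$image"/>
  <Choices name="label" toName="image">
    <Choice value="robot"/>
    <Choice value="not_robot"/>
  </Choices>
</View>
"""

resp = requests.post(
    f"{LS_URL}/api/projects",
    headers={"Authorization": f"Token {TOKEN}"},
    json={"title": "Jetson image labeling", "label_config": label_config},
    timeout=30,
)
resp.raise_for_status()
print("Created project:", resp.json()["id"])
```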
Ollama is a free, open-source, and easy-to-use tool for running LLMs locally. It exposes a simple API, has client libraries in multiple languages, and supports a wide range of models.
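Ollama listens on localhost:11434 by default, so a script can hit its REST API directly. A minimal non-streaming generation request in Python, with the model name just an example of something already pulled with `ollama pull`:

```python
import requests

# Default Ollama endpoint; the model must already be pulled
# (e.g. `ollama pull llama3.2`).
resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "llama3.2",
        "prompt": "Explain unified memory on the Jetson AGX Thor in two sentences.",
        "stream": False,  # return one JSON object instead of a token stream
    },
    timeout=300,
)
resp.raise_for_status()
print(resp.json()["response"])
```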
High-performance inference of OpenAI’s Whisper automatic speech recognition (ASR) model.
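For a quick sanity check of Whisper transcription before worrying about optimized inference, the reference openai-whisper Python package is enough; the sketch below uses it rather than a C/C++ port, and the model size and audio filename are placeholders.

```python
import whisper  # pip install openai-whisper; requires ffmpeg on the PATH

# Start with a small model to confirm the pipeline works, then scale up.
model = whisper.load_model("base")

# Transcribe a local audio file (path is a placeholder).
result = model.transcribe("meeting.wav")
print(result["text"])
```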
Real-time object detection and custom model creation tools, as sketched below.
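The sketch below assumes the Ultralytics YOLO package, which is one option for this kind of detect-and-retrain workflow rather than necessarily the exact tool referenced above; the image path and dataset YAML are placeholders.

```python
from ultralytics import YOLO  # pip install ultralytics

# Start from a pretrained checkpoint (downloaded automatically on first use).
model = YOLO("yolov8n.pt")

# Run inference on an image; the source could also be a video file or camera index.
results = model("test.jpg")
for r in results:
    for box in r.boxes:
        print(int(box.cls), float(box.conf))

# Fine-tune on a custom dataset described by a YOLO-format data.yaml (placeholder path):
# model.train(data="data.yaml", epochs=50, imgsz=640)
```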
OpenCV is an open-source image processing and computer vision library with support for GPU inference and running YOLO models.
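A minimal sketch of GPU-backed inference with OpenCV's dnn module, assuming an OpenCV build compiled with CUDA support and a YOLO model exported to ONNX; the model file, image path, and input size are placeholders.

```python
import cv2

# Load a YOLO model exported to ONNX (filename is a placeholder).
net = cv2.dnn.readNetFromONNX("yolov8n.onnx")

# Request CUDA execution; this only takes effect if OpenCV was built with CUDA.
net.setPreferableBackend(cv2.dnn.DNN_BACKEND_CUDA)
net.setPreferableTarget(cv2.dnn.DNN_TARGET_CUDA)

img = cv2.imread("frame.jpg")
blob = cv2.dnn.blobFromImage(img, scalefactor=1 / 255.0, size=(640, 640), swapRB=True)
net.setInput(blob)
outputs = net.forward()
print(outputs.shape)  # raw detection tensor; post-processing (NMS, box scaling) not shown
```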