TensorRT is a high-performance deep learning inference runtime for image classification, segmentation, and object detection neural networks. Built on CUDA, NVIDIA's parallel programming model, it enables you to optimize inference for all deep learning frameworks.

JetPack 5.1.2 includes NVIDIA Jetson Linux 35.4.1, which provides the Linux kernel 5.10, a UEFI-based bootloader, an Ubuntu 20.04-based root file system, NVIDIA drivers, necessary firmware, a toolchain, and more. This release adds the following highlights (refer to the release notes for additional details):

- Support for the Jetson AGX Orin Industrial module.
- Support for Jetson Orin NX and Jetson Orin Nano in the image-based OTA tools.
- Disk-encryption support for encrypting only the User Data (UDA) partition, and runtime enabling of UDA partition encryption.
- Support for delegated authentication, with the ability to sign UEFI with platform-vendor-owned keys.
- Enhanced secure boot, encrypting the kernel, kernel-dtb, and initrd.
- Ability to add and revoke UEFI signing keys.
- Support for up to three signing keys for signing the bootloader in secure boot, with the ability to revoke keys.
- Support for alternating exposures in Argus (sample argus_userAlternatingAutoexposure added).
- Deskew calibration support for high-data-rate sensors (> 1.5 Gbps).
- Support for multiple-camera synchronization (sample argus_syncstereo added).
- Enhanced error resiliency for improved stability in Argus.

Refer to the migration guide to migrate from Nvbuf_utils to NvUtils.