TensorFlow is an open source platform for machine learning. It provides comprehensive tools and libraries in a flexible architecture, allowing easy deployment across a variety of platforms and devices.

NGC Containers are the easiest way to get started with TensorFlow. The TensorFlow NGC Container comes with all dependencies included, providing an easy place to start developing common applications such as conversational AI, natural language processing (NLP), recommenders, and computer vision.

The TensorFlow NGC Container is optimized for GPU acceleration and contains a validated set of libraries that enable and optimize GPU performance. The container may also include modifications to the TensorFlow source code to maximize performance and compatibility, along with software for accelerating ETL (DALI, RAPIDS), training (cuDNN, NCCL), and inference (TensorRT) workloads.

Using the TensorFlow NGC Container requires certain software to be installed on the host system; for supported versions, see the Framework Containers Support Matrix and the NVIDIA Container Toolkit Documentation. It is not necessary to install the NVIDIA CUDA Toolkit, and no other installation, compilation, or dependency management is required.

To run a container, issue the appropriate command as explained in the Running A Container chapter of the NVIDIA Containers For Deep Learning Frameworks User's Guide, specifying the registry, repository, and tag. For more information about using NGC, refer to the NGC Container User Guide.

If you have Docker 19.02 or earlier, a typical command to launch the container is:

nvidia-docker run -it --rm nvcr.io/nvidia/tensorflow:xx.xx-tfx-py3

If you have Docker 19.03 or later, a typical command to launch the container is:

docker run --gpus all -it --rm nvcr.io/nvidia/tensorflow:xx.xx-tfx-py3

TensorFlow is run by importing it as a Python module:

$ python
>>> import tensorflow as tf
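The choice between the two launch commands above depends only on the Docker version: 19.03 added the built-in `--gpus` flag, while older releases rely on the `nvidia-docker` wrapper. A minimal sketch of that decision, where `pick_launch_cmd` is a hypothetical helper and `xx.xx-tfx-py3` remains the placeholder tag from the text (substitute a real release tag before running):

```shell
# Placeholder image tag from the documentation above; replace xx.xx and tfx
# with a concrete release before launching for real.
IMAGE="nvcr.io/nvidia/tensorflow:xx.xx-tfx-py3"

# pick_launch_cmd: hypothetical helper that prints the appropriate launch
# command for a given Docker version ("major.minor", e.g. "19.03").
pick_launch_cmd() {
  major=${1%%.*}
  minor=${1#*.}
  if [ "$major" -gt 19 ] || { [ "$major" -eq 19 ] && [ "$minor" -ge 3 ]; }; then
    # Docker 19.03 or later: native GPU support via --gpus
    echo "docker run --gpus all -it --rm ${IMAGE}"
  else
    # Docker 19.02 or earlier: use the nvidia-docker wrapper
    echo "nvidia-docker run -it --rm ${IMAGE}"
  fi
}

pick_launch_cmd "19.03"
pick_launch_cmd "18.09"
```

On a real host the version string would come from `docker version --format '{{.Server.Version}}'`; the helper only illustrates which command applies where.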