Get started with a local deep learning container
Today Hugging Face and Google Cloud announced that they have joined forces to bring a collection of Deep Learning Containers (DLCs) that transform the way you build AI with open models on Google Cloud. Since I love to try things out, I ran some of those Deep Learning Containers locally, and I loved how quickly and easily you can get things up and running.
In this post we'll go step by step through running one of the Deep Learning Containers (DLCs) on your local machine.
Prerequisites
- You need to create a new project in Google Cloud (or use an existing one if you prefer).
- You need to install the gcloud CLI, so if you don’t have it yet, please follow the official installation docs for your OS.
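The official docs cover every platform, but purely as an illustration (this command is my own addition, not from the official guide), on macOS with Homebrew you could install it like this:
# macOS example only; use the official install guide for other systems
brew install --cask google-cloud-sdk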
Once you’ve installed the gcloud CLI, you need to log in and connect to the project you created earlier:
gcloud auth login
gcloud config set project YOUR_PROJECT_ID
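As an optional sanity check (my own addition, not part of the original steps), the gcloud CLI can show you which account and project it is using, and it can even create the project for you if you skipped that step earlier; YOUR_PROJECT_ID is the same placeholder as above:
# show the account you are logged in with
gcloud auth list
# print the project the CLI is currently pointing at
gcloud config get-value project
# optional: create the project from the CLI if you did not create it in the console
gcloud projects create YOUR_PROJECT_ID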
- As we are talking about containers, you’re going to need Docker, so go ahead and install it if you don’t have it yet, and then start it.
To ensure that Docker is running, run the following Docker command, which returns the current time and date:
docker run busybox date
- Then you need to use gcloud as credential…
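Assuming this last step is about registering gcloud as a Docker credential helper, so that Docker can authenticate against Google’s container registries when pulling the DLC images, a minimal sketch would look like the following; the us-docker.pkg.dev host is my assumption, so adjust it to whichever registry hosts the DLC you want to pull:
# register gcloud as a Docker credential helper (registry host is an assumption)
gcloud auth configure-docker us-docker.pkg.dev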