Pushing Python Packages to Artifact Registry Using Cloud Build
Jayanth MKV
Posted on November 22, 2024
Google Artifact Registry is a powerful solution for managing and hosting Python package artifacts in a private, secure, and scalable way. This guide provides a step-by-step walkthrough for pushing Python package .whl files to Artifact Registry using Google Cloud Build, with a secret (creds) from Google Secret Manager for authentication.
Prerequisites
- Artifact Registry Setup: Create a Python repository in Artifact Registry:

  gcloud artifacts repositories create python-packages \
    --repository-format=python \
    --location=us-central1 \
    --description="Python packages repository"

- Secret Setup: Store your service account key as a secret in Google Secret Manager:

  gcloud secrets create creds --data-file=path/to/key.json

- Grant Cloud Build access to the secret (optional here; the same binding can be added through IAM):

  gcloud secrets add-iam-policy-binding creds \
    --member="serviceAccount:$(gcloud projects describe $PROJECT_ID --format='value(projectNumber)')@cloudbuild.gserviceaccount.com" \
    --role="roles/secretmanager.secretAccessor"

- Cloud Build Permissions: Ensure your Cloud Build service account has the necessary permissions to access Artifact Registry and Secret Manager.
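For example, the Artifact Registry side of that permission can be granted with a role binding along these lines (a sketch; adjust the project and service account to your setup):

gcloud projects add-iam-policy-binding $PROJECT_ID \
  --member="serviceAccount:$(gcloud projects describe $PROJECT_ID --format='value(projectNumber)')@cloudbuild.gserviceaccount.com" \
  --role="roles/artifactregistry.writer"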
Cloud Build YAML Configuration
Here's the full working cloudbuild.yaml file:
options:
  machineType: E2_HIGHCPU_8
  substitutionOption: ALLOW_LOOSE
  logging: CLOUD_LOGGING_ONLY

steps:
  # Step 1: Access the secret `creds` and save it as `key.json`
  - name: 'gcr.io/google.com/cloudsdktool/cloud-sdk'
    entrypoint: bash
    args:
      - '-c'
      - |
        gcloud secrets versions access latest --secret=creds > /workspace/key.json

  # Step 2: Configure `.pypirc` with the Artifact Registry credentials
  - name: 'python'
    entrypoint: bash
    args:
      - '-c'
      - |
        cat > ~/.pypirc << EOL
        [distutils]
        index-servers = tower-common-repo
        [tower-common-repo]
        repository: https://us-central1-python.pkg.dev/$PROJECT_ID/python-packages/
        username: _json_key_base64
        password: $(base64 -w0 /workspace/key.json)
        EOL

        # Step 3: Build and upload the Python package
        pip install twine build && \
        python -m build && \
        twine upload --repository tower-common-repo dist/* --verbose
Step-by-Step Explanation
- Define Build Options:
  - Set the machine type, substitution behavior, and logging options.
  - These configurations ensure efficient builds and manageable logs.

- Retrieve the key.json Secret:
  - Use gcloud secrets versions access to fetch the key.json file securely from Secret Manager.
  - Save the file to a known location (/workspace/key.json).

- Configure .pypirc:
  - Generate a .pypirc file dynamically. This file is required for twine to authenticate with the Artifact Registry.
  - The password is the base64-encoded content of key.json.

- Build and Push Package:
  - Install the necessary tools (twine, build).
  - Build the Python package (python -m build). This assumes the repository contains packaging metadata; see the sketch after this list.
  - Use twine upload to push the .whl file to the Artifact Registry.
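One thing the YAML does not show: python -m build expects packaging metadata at the root of the repository. A minimal pyproject.toml sketch, assuming a setuptools backend and a placeholder package name, could look like this:

# Minimal packaging metadata so that `python -m build` can produce a .whl
# (the name, version, and description below are placeholders)
[build-system]
requires = ["setuptools>=61.0"]
build-backend = "setuptools.build_meta"

[project]
name = "my-example-package"
version = "0.1.0"
description = "Example package uploaded to Artifact Registry"
requires-python = ">=3.8"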
Triggering the Build
Save the cloudbuild.yaml file and trigger the build manually, or connect the build to a GitHub repository with a Cloud Build trigger:
gcloud builds submit --config=cloudbuild.yaml .
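If you prefer builds to run automatically from GitHub, a trigger along these lines should work, assuming the repository is already connected to Cloud Build (the owner, repository name, and branch pattern below are placeholders):

gcloud builds triggers create github \
  --repo-owner=your-github-user \
  --repo-name=your-repo \
  --branch-pattern="^main$" \
  --build-config=cloudbuild.yaml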
Key Points
- Secure Secrets Management: The secret (key.json) is accessed securely using Google Secret Manager.
- Dynamic Configuration: .pypirc is generated during the build, ensuring no sensitive data is stored in the repository.
- Automated Upload: The process automates package building and pushing, reducing manual intervention.
Validation
After the build completes:
- Verify the uploaded package in the Artifact Registry:
gcloud artifacts packages list --repository=python-packages --location=us-central1
- Check for errors or warnings in the build logs.
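To confirm the package can actually be consumed, you can also try installing it from the repository. A sketch, assuming the Google Artifact Registry keyring backend for pip authentication, with a placeholder package name (replace $PROJECT_ID with your project ID):

pip install keyring keyrings.google-artifact-registry-auth
pip install my-example-package \
  --index-url https://us-central1-python.pkg.dev/$PROJECT_ID/python-packages/simple/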