Uploading From Cloud Storage

There are three primary ways to provide video files to NomadicML:

  1. Public URLs or Local Files: The simplest method for getting started. See the Quickstart guide for examples.
  2. Direct Cloud Integration: Grant our backend read-only access to your cloud storage bucket. This is the recommended approach for automated, continuous data ingestion.
  3. Signed URLs: Generate temporary, secure URLs for individual private files. This is ideal for one-off uploads from a secure environment.

This page covers the two cloud-native methods in detail.

Direct Cloud Integration

To avoid exchanging keys, we offer direct integration with your cloud storage. This approach is ideal for continuous data ingestion pipelines. We support secure, read-only access for the following providers:

| Cloud Provider | Integration Method | Details |
| --- | --- | --- |
| AWS S3 | Bucket Policy | Grant our IAM role (arn:aws:iam::xxxx:role/xxxx-readonly-role) read-only access to your S3 bucket. |
| Google Cloud Storage | IAM Policy | Grant the Storage Object Viewer role to our service account (xxxx@our.iam.gserviceaccount.com) on your GCS bucket. |
| Azure Blob Storage | Role Assignment | Please contact us to get the correct Service Principal for the role assignment. |

Because access is granted through your provider's own IAM controls, you retain full control and can revoke it from your cloud console at any time. Please contact our support team to configure a direct integration.
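For AWS S3, the grant is an ordinary bucket policy. The sketch below builds a minimal read-only policy document; the role ARN is the placeholder from the table above (substitute the one we provide), and the commented-out call shows how it would be applied with boto3:

```python
import json

# Role ARN is a placeholder -- use the one provided by our support team.
NOMADIC_ROLE_ARN = "arn:aws:iam::xxxx:role/xxxx-readonly-role"
BUCKET = "my-bucket"

# Minimal read-only bucket policy: list the bucket, get its objects.
policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "NomadicMLReadOnly",
            "Effect": "Allow",
            "Principal": {"AWS": NOMADIC_ROLE_ARN},
            "Action": ["s3:GetObject", "s3:ListBucket"],
            "Resource": [
                f"arn:aws:s3:::{BUCKET}",
                f"arn:aws:s3:::{BUCKET}/*",
            ],
        }
    ],
}

# To apply it (requires s3:PutBucketPolicy on your side):
# import boto3
# boto3.client("s3").put_bucket_policy(Bucket=BUCKET, Policy=json.dumps(policy))
print(json.dumps(policy, indent=2))
```

You can equally attach the same policy from the S3 console; the JSON document is identical either way.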

Signed URLs

If your videos are stored privately in Amazon S3, Google Cloud Storage, or Azure Blob Storage, you can provide NomadicML with a pre-signed URL. The SDK downloads the file directly from that URL.

AWS S3 Example

import boto3

s3_client = boto3.client('s3')
url = s3_client.generate_presigned_url(
    'get_object',
    Params={'Bucket': 'my-bucket', 'Key': 'videos/my_video.mp4'},
    ExpiresIn=3600  # URL valid for 1 hour
)

# client is an initialized NomadicML client (see the Quickstart guide)
client.video.upload_and_analyze(file_path=url)

Google Cloud Storage Example

from datetime import timedelta
from google.cloud import storage

client_gcs = storage.Client()
blob = client_gcs.bucket('my-bucket').blob('videos/my_video.mp4')
url = blob.generate_signed_url(
    expiration=timedelta(hours=1),  # URL valid for 1 hour
    version='v4'  # use the v4 signing scheme; v2 is deprecated
)

client.video.upload_and_analyze(file_path=url)

Azure Blob Storage Example

In Azure, you can generate a Shared Access Signature (SAS) token to create a temporary, secure URL.

from datetime import datetime, timedelta, timezone
from azure.storage.blob import BlobServiceClient, generate_blob_sas, BlobSasPermissions

# Get this from the Azure Portal
connection_string = "<YOUR_CONNECTION_STRING>"
blob_service_client = BlobServiceClient.from_connection_string(connection_string)

container_name = "my-container"
blob_name = "videos/my_video.mp4"

sas_token = generate_blob_sas(
    account_name=blob_service_client.account_name,
    container_name=container_name,
    blob_name=blob_name,
    account_key=blob_service_client.credential.account_key,
    permission=BlobSasPermissions(read=True),
    expiry=datetime.now(timezone.utc) + timedelta(hours=1)  # SAS valid for 1 hour
)

url = f"https://{blob_service_client.account_name}.blob.core.windows.net/{container_name}/{blob_name}?{sas_token}"

client.video.upload_and_analyze(file_path=url)

Use upload_and_analyze_videos if you have many URLs. See SDK Usage Examples for full workflows.
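As a sketch of that batch path: the signer below is a placeholder standing in for any of the signed-URL snippets above, and the exact argument shape of upload_and_analyze_videos is an assumption here (see the SDK Usage Examples for the real workflow):

```python
# Placeholder signer: in practice this would be the boto3
# generate_presigned_url call (or GCS/Azure equivalent) shown above.
def sign(key: str) -> str:
    return f"https://my-bucket.s3.amazonaws.com/{key}?X-Amz-Signature=placeholder"

keys = ["videos/drive_001.mp4", "videos/drive_002.mp4", "videos/drive_003.mp4"]

# One signed URL per video, submitted as a single batch.
urls = [sign(k) for k in keys]

# client.video.upload_and_analyze_videos(urls)  # hypothetical argument shape
```

Batching avoids one upload round-trip per video; each URL still needs to remain valid until its download starts, so size the expiry to the batch.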

Next Steps

After configuring your cloud storage, you can explore these topics: