Uploading From Cloud Storage
There are three primary ways to provide video files to NomadicML:
- Public URLs or Local Files: The simplest method for getting started. See the Quickstart guide for examples.
- Direct Cloud Integration: Grant our backend read-only access to your cloud storage bucket. This is the recommended approach for automated, continuous data ingestion.
- Signed URLs: Generate temporary, secure URLs for individual private files. This is ideal for one-off uploads from a secure environment.
This page covers the two cloud-native methods in detail below.
Direct Cloud Integration
Direct integration with your cloud storage is ideal for continuous data ingestion pipelines. We support secure, read-only access for the following providers:
| Cloud Provider | Integration Method | Details |
|---|---|---|
| AWS S3 | Bucket Policy | Grant our IAM role (`arn:aws:iam::xxxx:role/xxxx-readonly-role`) read-only access to your S3 bucket. |
| Google Cloud Storage | IAM Policy | Grant the Storage Object Viewer role to our service account (`xxxx@our.iam.gserviceaccount.com`) on your GCS bucket. |
| Azure Blob Storage | Role Assignment | Please contact us to get the correct Service Principal for role assignment. |
This method avoids exchanging secret keys and gives you full control to manage access from your cloud console. Please contact our support team to configure a direct integration.
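If you prefer to apply these grants programmatically rather than from your cloud console, the sketch below shows one way to do it for S3 and GCS using the official client libraries. It is illustrative only: the role ARN and service account are placeholders (our support team provides the real values), and `my-bucket` stands in for your bucket name.
import json
import boto3
from google.cloud import storage

# Placeholder principals -- replace with the values from our support team.
NOMADICML_ROLE_ARN = "arn:aws:iam::xxxx:role/xxxx-readonly-role"
NOMADICML_SERVICE_ACCOUNT = "xxxx@our.iam.gserviceaccount.com"

# AWS S3: attach a bucket policy granting the role read-only access.
s3_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Sid": "NomadicMLReadOnly",
        "Effect": "Allow",
        "Principal": {"AWS": NOMADICML_ROLE_ARN},
        "Action": ["s3:GetObject", "s3:ListBucket"],
        "Resource": ["arn:aws:s3:::my-bucket", "arn:aws:s3:::my-bucket/*"],
    }],
}
boto3.client("s3").put_bucket_policy(Bucket="my-bucket", Policy=json.dumps(s3_policy))

# GCS: bind the Storage Object Viewer role to the service account.
bucket = storage.Client().bucket("my-bucket")
iam_policy = bucket.get_iam_policy(requested_policy_version=3)
iam_policy.bindings.append({
    "role": "roles/storage.objectViewer",
    "members": {f"serviceAccount:{NOMADICML_SERVICE_ACCOUNT}"},
})
bucket.set_iam_policy(iam_policy)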
Signed URLs
If your videos are stored privately in Amazon S3, Google Cloud Storage, or Azure Blob Storage, you can provide NomadicML with a pre-signed URL and the SDK will download the file directly. Set an expiry long enough for the transfer to complete.
AWS S3 Example
import boto3

# `client` is your initialized NomadicML SDK client (see the Quickstart).
s3_client = boto3.client('s3')

# Generate a temporary, read-only URL for the private object.
url = s3_client.generate_presigned_url(
    'get_object',
    Params={'Bucket': 'my-bucket', 'Key': 'videos/my_video.mp4'},
    ExpiresIn=3600  # URL valid for 1 hour
)

response = client.upload(url)
response = client.analyze(
    response['video_id'],
    analysis_type="rapid_review",
    custom_event="Find all instances of ego vehicle straddling two lanes",
    custom_category="driving"
)
Google Cloud Storage Example
from datetime import timedelta
from google.cloud import storage

client_gcs = storage.Client()
blob = client_gcs.bucket('my-bucket').blob('videos/my_video.mp4')

# Pass a timedelta: a bare int here is interpreted as an absolute Unix
# timestamp, not seconds from now.
url = blob.generate_signed_url(version="v4", expiration=timedelta(hours=1))  # URL valid for 1 hour

response = client.upload(url)
response = client.analyze(
    response['video_id'],
    analysis_type="rapid_review",
    custom_event="Find all instances of ego vehicle straddling two lanes",
    custom_category="driving"
)
Azure Blob Storage Example
In Azure, you can generate a Shared Access Signature (SAS) token to create a temporary, secure URL.
from datetime import datetime, timedelta, timezone
from azure.storage.blob import BlobServiceClient, generate_blob_sas, BlobSasPermissions

# Get this from the Azure Portal.
connection_string = "<YOUR_CONNECTION_STRING>"
blob_service_client = BlobServiceClient.from_connection_string(connection_string)

container_name = "my-container"
blob_name = "videos/my_video.mp4"

# Create a read-only SAS token that expires in 1 hour.
sas_token = generate_blob_sas(
    account_name=blob_service_client.account_name,
    container_name=container_name,
    blob_name=blob_name,
    account_key=blob_service_client.credential.account_key,
    permission=BlobSasPermissions(read=True),
    expiry=datetime.now(timezone.utc) + timedelta(hours=1)
)

url = f"https://{blob_service_client.account_name}.blob.core.windows.net/{container_name}/{blob_name}?{sas_token}"

response = client.upload(url)
response = client.analyze(
    response['video_id'],
    analysis_type="rapid_review",
    custom_event="Find all instances of ego vehicle straddling two lanes",
    custom_category="driving"
)
If you have many URLs, you can pass them as a list (as sketched below) or organize them in folders. See SDK Usage Examples for full workflows.
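For illustration, here is a minimal sketch of the list-based form, reusing the S3 client from the example above. It assumes `client.upload()` accepts a list of signed URLs, as described here; the object keys are hypothetical.
# Generate one signed URL per object, then pass the whole list to upload().
keys = ['videos/clip_01.mp4', 'videos/clip_02.mp4']
urls = [
    s3_client.generate_presigned_url(
        'get_object',
        Params={'Bucket': 'my-bucket', 'Key': key},
        ExpiresIn=3600
    )
    for key in keys
]
responses = client.upload(urls)  # assumes list input, per the note above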
Next Steps
After configuring your cloud storage, see the SDK Usage Examples for full analysis workflows.