Custom Integrations

NomadicML is designed to integrate seamlessly with your existing systems and workflows. This guide explores various integration options, from simple API usage to complex enterprise integrations.

Integration Approaches

There are several ways to integrate NomadicML into your applications:

  • API Integration: Direct integration using the NomadicML API
  • SDK Integration: Embed capabilities using the Python SDK
  • Webhook Notifications: Receive notifications when events occur
  • Data Export: Export analysis data for external processing

API Integration

The most flexible integration method is direct API usage. This approach works with any programming language that can make HTTP requests.

Authentication

All API integrations start with proper authentication:

# Python example using requests
import requests

api_key = "your_api_key"
headers = {
    "X-API-Key": api_key,
    "Content-Type": "application/json"
}

response = requests.get(
    "https://api.nomadicml.com/api/videos",
    headers=headers
)

For more details, see the Authentication documentation.

Common Integration Points

The most frequently used integration points include:

  1. Video Upload: Send videos for analysis
  2. Analysis Results: Retrieve and process analysis data
  3. Event Notifications: React to detected events
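
For the second integration point, a common pattern is to upload a video and then poll until the analysis reaches a terminal state. The sketch below keeps the status check injectable as a callable, so the loop itself stays transport-agnostic; the specific status endpoint, field name, and terminal status values ("completed"/"failed") are assumptions to verify against the API reference:

```python
import time

def wait_for_analysis(get_status, timeout=600, interval=5):
    """Poll until analysis reaches a terminal status or the timeout expires.

    get_status is a zero-argument callable returning a status string, e.g.
    one that GETs a (hypothetical) video status endpoint and reads its
    "status" field.
    """
    deadline = time.time() + timeout
    while time.time() < deadline:
        status = get_status()
        if status in ("completed", "failed"):
            return status
        time.sleep(interval)  # wait before polling again
    raise TimeoutError("analysis did not finish within the timeout")
```

In practice `get_status` would wrap a `requests.get` call using the authentication headers shown above.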

Example: Custom Reporting System

Here’s how you might integrate NomadicML with a custom reporting system:

import requests
import pandas as pd
from datetime import datetime, timedelta

API_BASE = "https://api.nomadicml.com/api"
API_KEY = "your_api_key"

headers = {
    "X-API-Key": API_KEY,
    "Content-Type": "application/json"
}

# Get events from the past week (snapshot "now" once so both bounds agree)
now = datetime.now()
start_date = (now - timedelta(days=7)).isoformat()
end_date = now.isoformat()

params = {
    "date_from": start_date,
    "date_to": end_date,
    "firebase_collection_name": "videos"
}

# Get all events
response = requests.get(
    f"{API_BASE}/events",
    headers=headers,
    params=params
)
response.raise_for_status()  # fail fast on HTTP errors

events_data = response.json()["events"]

# Convert to pandas DataFrame for analysis
df = pd.DataFrame(events_data)

# Generate statistics
event_counts = df.groupby(["type", "severity"]).size().reset_index(name="count")
most_common = df["description"].value_counts().head(5)
print("Most common events:\n", most_common)

# Write to CSV for reporting system
event_counts.to_csv("weekly_event_summary.csv", index=False)
df.to_csv("weekly_events_detail.csv", index=False)

print("Weekly report data exported successfully")

SDK Integration

The NomadicML Python SDK provides a more streamlined integration experience for Python applications.

Installation

pip install nomadicml

Basic Usage

from nomadicml import NomadicML

# Initialize client
client = NomadicML(api_key="your_api_key")

# Upload and analyze a video
result = client.video.upload_and_analyze("path/to/video.mp4")

# Process the results
for event in result["events"]:
    print(f"Event: {event['type']} at {event['time']}s - {event['description']}")
    print(f"Analysis: {event['ai_analysis']}")

Example: Educational Application

Here’s how you might integrate NomadicML into a driver education application:

import os
import tempfile

from nomadicml import NomadicML
from flask import Flask, request, jsonify
import pandas as pd

app = Flask(__name__)
client = NomadicML(api_key="your_api_key")

@app.route("/analyze_student_drive", methods=["POST"])
def analyze_student_drive():
    # Get the video file from the request
    if "video" not in request.files:
        return jsonify({"error": "No video file provided"}), 400
    
    video_file = request.files["video"]
    student_id = request.form.get("student_id", "unknown")
    
    # Save the upload to a temporary file the SDK can read
    with tempfile.NamedTemporaryFile(suffix=".mp4", delete=False) as tmp:
        temp_path = tmp.name
    video_file.save(temp_path)
    
    # Upload and analyze
    try:
        result = client.video.upload_and_analyze(temp_path)
        
        # Create student report
        report = {
            "student_id": student_id,
            "timestamp": pd.Timestamp.now().isoformat(),
            "video_id": result["video_id"],
            "event_count": len(result["events"]),
            "violation_count": len([e for e in result["events"] if e["type"] == "Traffic Violation"]),
            "safety_score": result["analysis"]["safety_score"],
            "dmv_compliance_score": result["analysis"]["dmv_compliance_score"],
            "key_feedback": [e["ai_analysis"] for e in result["events"] if e["severity"] == "high"]
        }
        
        # Store report in database
        # db.save_student_report(report)
        
        return jsonify({
            "success": True,
            "report": report,
            "video_id": result["video_id"]
        })
        
    except Exception as e:
        return jsonify({"error": str(e)}), 500
    finally:
        os.remove(temp_path)  # always clean up the temporary file

if __name__ == "__main__":
    app.run(debug=True)

Webhook Notifications

For real-time integrations, you can configure webhook notifications to be sent when specific events occur in NomadicML.

Setting Up Webhooks

Currently, webhook configuration is available through direct contact with NomadicML support. Contact support@nomadicml.com to set up webhooks for your account.

Webhook Events

Webhooks can be triggered for the following events:

  • video.uploaded: Video upload completed
  • video.analysis_started: Analysis process began
  • video.analysis_completed: Analysis process finished
  • video.analysis_failed: Analysis process failed
  • event.detected: New driving event detected
  • event.high_severity: High severity event detected

Webhook Payload

Webhook notifications are sent as HTTP POST requests with a JSON payload:

{
  "event_type": "video.analysis_completed",
  "timestamp": "2025-03-15T14:30:00Z",
  "account_id": "account-uuid",
  "data": {
    "video_id": "video-uuid",
    "filename": "example.mp4",
    "status": "completed",
    "event_count": 5,
    "analysis_url": "https://api.nomadicml.com/api/video/video-uuid/analysis"
  }
}

Webhook Security

Webhooks include a signature header (X-NomadicML-Signature) that you can use to verify the authenticity of the request:

import hmac
import hashlib

def verify_webhook_signature(payload, signature, secret):
    """Verify that the webhook signature is valid."""
    expected = hmac.new(
        secret.encode(),
        payload.encode(),
        hashlib.sha256
    ).hexdigest()
    
    return hmac.compare_digest(expected, signature)

# Flask example
@app.route("/webhook", methods=["POST"])
def webhook_handler():
    signature = request.headers.get("X-NomadicML-Signature", "")
    payload = request.get_data(as_text=True)
    
    # Reject requests with a missing or invalid signature
    if not signature or not verify_webhook_signature(payload, signature, "your_webhook_secret"):
        return "Invalid signature", 401
    
    # Process the webhook
    data = request.json
    event_type = data["event_type"]
    
    if event_type == "event.high_severity":
        # Alert system about high severity event
        send_alert(data["data"])
    
    return "OK", 200

Data Export

For batch processing or integration with data warehouses, you can export data from NomadicML.

Export Formats

NomadicML supports the following export formats:

  • JSON: Full data with all fields
  • CSV: Tabular format for spreadsheet applications
  • PDF: Formatted reports for sharing
  • MP4: Video clips of specific events

Programmatic Export

You can use the API to programmatically export data:

import requests

API_BASE = "https://api.nomadicml.com/api"
API_KEY = "your_api_key"

headers = {
    "X-API-Key": API_KEY
}

# Export all events from a video as CSV
response = requests.get(
    f"{API_BASE}/video/video-uuid/events",
    headers=headers,
    params={
        "firebase_collection_name": "videos",
        "format": "csv"
    }
)
response.raise_for_status()  # avoid writing an error body to the file

with open("events.csv", "wb") as f:
    f.write(response.content)

# Export a formatted report
response = requests.get(
    f"{API_BASE}/video/video-uuid/report",
    headers=headers,
    params={
        "firebase_collection_name": "videos",
        "format": "pdf",
        "include_thumbnails": True
    }
)
response.raise_for_status()

with open("report.pdf", "wb") as f:
    f.write(response.content)

Enterprise Integration Patterns

For enterprise environments, consider these integration patterns:

ETL Processes

Extract, Transform, Load (ETL) processes can be used to move NomadicML data into data warehouses:

  1. Extract: Pull event data from the API
  2. Transform: Convert to the target schema, enrich with business data
  3. Load: Insert into your data warehouse
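
The Transform step above can be sketched as a pure function that flattens a raw API event dict into a warehouse row. The output schema here is hypothetical, and the input field names (`type`, `severity`, `time`, `description`) follow the event examples used elsewhere in this guide:

```python
def transform_event(event, account_id):
    """Transform step: flatten a raw API event dict into a warehouse row.

    The row schema is a hypothetical example; adapt it to your warehouse.
    """
    return {
        "event_id": event.get("id"),
        "account_id": account_id,          # enrichment from business data
        "event_type": event.get("type"),
        "severity": event.get("severity", "unknown"),
        "occurred_at_s": float(event.get("time", 0)),
        "description": event.get("description", ""),
    }
```

Keeping the transform free of I/O makes it easy to unit-test independently of the Extract and Load stages.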

Message Queues

For high-volume systems, use message queues to process NomadicML events:

  1. Set up webhooks to send events to a message queue (e.g., RabbitMQ, Kafka)
  2. Deploy consumers to process events asynchronously
  3. Implement retry logic for failed processing
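
The consumer-with-retry part of this pattern can be sketched broker-agnostically. In the example below, Python's in-process `queue.Queue` stands in for a real broker client (e.g. pika for RabbitMQ or a Kafka consumer); the retry-count-in-message approach is one simple design, not the only one:

```python
import queue

def consume_with_retry(q, handler, max_retries=3):
    """Drain a queue of (message, attempts) pairs, retrying failures.

    Messages that still fail after max_retries attempts are returned as
    a dead-letter list instead of being retried forever.
    """
    dead_letters = []
    while True:
        try:
            message, attempts = q.get_nowait()
        except queue.Empty:
            return dead_letters
        try:
            handler(message)
        except Exception:
            if attempts + 1 < max_retries:
                q.put((message, attempts + 1))  # requeue for another attempt
            else:
                dead_letters.append(message)    # give up after max_retries
```

Real brokers provide requeue and dead-letter mechanisms natively; the point of the sketch is the bounded-retry logic itself.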

Microservices Architecture

In a microservices architecture, create a dedicated service for NomadicML integration:

  1. Create a NomadicML service that handles API communication
  2. Expose internal endpoints for other services to request analysis
  3. Implement caching and rate limiting to optimize API usage
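
The caching in step 3 can be as simple as a TTL cache in front of the analysis fetch. A minimal sketch, assuming the fetch function is injected so the wrapper stays independent of the HTTP layer:

```python
import time

class CachedAnalysisClient:
    """Wrapper that caches analysis results for a short TTL to cut API calls.

    fetch is a callable taking a video_id and returning the analysis dict
    (e.g. a function wrapping the NomadicML API call).
    """
    def __init__(self, fetch, ttl=300):
        self._fetch = fetch
        self._ttl = ttl
        self._cache = {}  # video_id -> (expires_at, result)

    def get_analysis(self, video_id):
        entry = self._cache.get(video_id)
        if entry and entry[0] > time.monotonic():
            return entry[1]  # cache hit: no API call
        result = self._fetch(video_id)
        self._cache[video_id] = (time.monotonic() + self._ttl, result)
        return result
```

Completed analyses rarely change, so even a short TTL can eliminate most duplicate requests across services.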

Authentication and Security

When integrating with the NomadicML platform, follow these security practices:

  1. API Key Management: Rotate keys regularly and use environment variables
  2. Input Validation: Validate all input before sending to the API
  3. Error Handling: Implement proper error handling and logging
  4. Rate Limiting: Respect API rate limits and implement backoff strategies
  5. Data Protection: Properly secure any stored video or analysis data
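
Points 1 and 4 can be combined in a small helper: read the key from an environment variable and retry rate-limited requests with exponential backoff. The `NOMADICML_API_KEY` variable name and the use of HTTP 429 for rate limiting are assumptions for the sketch:

```python
import os
import random
import time

# Read the key from the environment instead of hard-coding it
API_KEY = os.environ.get("NOMADICML_API_KEY", "")

def request_with_backoff(send, max_attempts=5, base_delay=1.0):
    """Retry a request callable with exponential backoff while rate-limited.

    send is a zero-argument callable returning a response object with a
    status_code attribute (e.g. a lambda wrapping requests.get).
    """
    for attempt in range(max_attempts):
        response = send()
        if response.status_code != 429:  # not rate-limited: return as-is
            return response
        # Exponential backoff with a little jitter to avoid thundering herds
        time.sleep(base_delay * (2 ** attempt) + random.uniform(0, 0.1))
    return response  # still rate-limited after max_attempts
```

A typical call site would be `request_with_backoff(lambda: requests.get(url, headers=headers))`.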

Next Steps