Bulk Operations Guide

Bulk operations let you apply a single action to many projects at once, making it efficient to manage large numbers of projects programmatically.

Overview

Bulk operations include:
  • Bulk Archive - Archive multiple projects
  • Bulk Copy - Copy projects to another workspace
  • Bulk Move - Move projects between workspaces
  • Bulk Delete - Delete multiple projects
  • Bulk Tagging - Add or remove tags from projects
  • Bulk Recipe Execution - Run recipes on multiple projects
  • Bulk Export - Request exports for multiple projects

Bulk Request Workflow

Bulk operations follow an asynchronous pattern:
  1. Submit a bulk request - Send the list of project IDs and the operation to perform
  2. Monitor the request status - Check the progress of the bulk operation
  3. Handle the results - Process the completed (and any failed) operations
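
For example, a minimal end-to-end sketch of this pattern, using the bulk archive and bulk request status endpoints documented below, looks like this:

  • Python
import time
import requests

api_url = "https://voyager.lumafield.com"
headers = {"Authorization": f"Token {token}"}  # token: your API token

# 1. Submit the bulk request
response = requests.post(
    f"{api_url}/api/v2/bulkProjects/archive",
    headers=headers,
    json={"projectIds": ["uuid1", "uuid2"]}
)
response.raise_for_status()
bulk_request_id = response.json()["id"]

# 2. Monitor the request status until it finishes
while True:
    status = requests.get(
        f"{api_url}/api/v2/bulkRequests/{bulk_request_id}",
        headers=headers
    ).json()
    if status["status"] in ("completed", "failed"):
        break
    time.sleep(30)

# 3. Handle the results
print(f"Finished with status: {status['status']}")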

Bulk Archive

  • Python
import requests

api_url = "https://voyager.lumafield.com"
headers = {"Authorization": f"Token {token}"}

project_ids = ["uuid1", "uuid2", "uuid3"]

# Archive multiple projects
response = requests.post(
    f"{api_url}/api/v2/bulkProjects/archive",
    headers=headers,
    json={"projectIds": project_ids}
)
response.raise_for_status()

bulk_request = response.json()
print(f"Bulk archive request ID: {bulk_request['id']}")

Bulk Copy

  • Python
destination_workspace = "destination-workspace-uuid"

response = requests.post(
    f"{api_url}/api/v2/bulkProjects/copy",
    headers=headers,
    json={
        "projectIds": project_ids,
        "destinationWorkspaceId": destination_workspace
    }
)
bulk_request = response.json()

Bulk Move

  • Python
response = requests.post(
    f"{api_url}/api/v2/bulkProjects/move",
    headers=headers,
    json={
        "projectIds": project_ids,
        "destinationWorkspaceId": destination_workspace
    }
)
bulk_request = response.json()

Bulk Delete

  • Python
response = requests.post(
    f"{api_url}/api/v2/bulkProjects/delete",
    headers=headers,
    json={"projectIds": project_ids}
)
bulk_request = response.json()

Bulk Tagging

Add Tags

  • Python
response = requests.post(
    f"{api_url}/api/v2/bulkProjects/addTags",
    headers=headers,
    json={
        "projectIds": project_ids,
        "tags": ["production", "q1-2024"]
    }
)
bulk_request = response.json()

Remove Tags

  • Python
response = requests.post(
    f"{api_url}/api/v2/bulkProjects/removeTags",
    headers=headers,
    json={
        "projectIds": project_ids,
        "tags": ["archived"]
    }
)
bulk_request = response.json()

Bulk Recipe Execution

  • Python
recipe_pk = "recipe-uuid"

response = requests.post(
    f"{api_url}/api/v2/bulkProjects/runRecipe",
    headers=headers,
    json={
        "projectIds": project_ids,
        "recipePk": recipe_pk
    }
)
bulk_request = response.json()
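
Bulk Export

The overview lists bulk export alongside the other operations. A minimal sketch is shown below; note that the /api/v2/bulkProjects/export path and its request body are assumptions based on the pattern of the other bulk endpoints, so confirm them against the API reference before use.

  • Python
# NOTE: endpoint and payload are assumptions inferred from the other bulk operations
response = requests.post(
    f"{api_url}/api/v2/bulkProjects/export",
    headers=headers,
    json={"projectIds": project_ids}
)
bulk_request = response.json()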

Monitoring Bulk Requests

List Bulk Requests

  • Python
# List all bulk requests
response = requests.get(
    f"{api_url}/api/v2/bulkRequests",
    headers=headers
)
requests_list = response.json()

for req in requests_list['results']:
    print(f"Request: {req['id']}")
    print(f"Status: {req['status']}")
    print(f"Type: {req['operationType']}")

Get Bulk Request Status

  • Python
bulk_request_id = "bulk-request-uuid"

response = requests.get(
    f"{api_url}/api/v2/bulkRequests/{bulk_request_id}",
    headers=headers
)
request_status = response.json()

print(f"Status: {request_status['status']}")
print(f"Progress: {request_status.get('progress', 'N/A')}")
print(f"Completed: {request_status.get('completedCount', 0)}")
print(f"Failed: {request_status.get('failedCount', 0)}")

Cancel Bulk Request

  • Python
# Cancel a bulk request
response = requests.post(
    f"{api_url}/api/v2/bulkRequests/{bulk_request_id}/cancel",
    headers=headers
)
response.raise_for_status()

Best Practices

1. Batch Sizes

Keep batch sizes reasonable:
# ✅ Good - reasonable batch size
project_ids = project_ids[:100]  # Process 100 at a time

# ❌ Avoid - too many at once
# project_ids = all_project_ids  # Could be thousands
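
If you need to process a large set, one approach is to split it into chunks and submit them sequentially. The sketch below assumes the bulk archive endpoint shown earlier and the wait_for_bulk_request helper defined in the next section; the chunk size of 100 is an illustrative choice, not a documented limit.

CHUNK_SIZE = 100

for i in range(0, len(all_project_ids), CHUNK_SIZE):
    chunk = all_project_ids[i:i + CHUNK_SIZE]
    response = requests.post(
        f"{api_url}/api/v2/bulkProjects/archive",
        headers=headers,
        json={"projectIds": chunk}
    )
    response.raise_for_status()
    # Wait for each chunk to finish before submitting the next one
    wait_for_bulk_request(response.json()["id"])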

2. Monitor Progress

Implement proper monitoring:
import time

def get_bulk_request_status(bulk_request_id):
    # Fetch the current state of a bulk request (see "Get Bulk Request Status" above)
    response = requests.get(
        f"{api_url}/api/v2/bulkRequests/{bulk_request_id}",
        headers=headers
    )
    response.raise_for_status()
    return response.json()

def wait_for_bulk_request(bulk_request_id, max_wait=3600):
    # Poll until the request completes, fails, or max_wait seconds elapse
    start = time.time()
    while time.time() - start < max_wait:
        status = get_bulk_request_status(bulk_request_id)

        if status['status'] == 'completed':
            return status
        elif status['status'] == 'failed':
            raise Exception("Bulk operation failed")

        print(f"Progress: {status.get('completedCount', 0)}/{status.get('totalCount', 0)}")
        time.sleep(30)

    raise TimeoutError("Bulk operation timeout")

3. Handle Failures

Check for partial failures:
status = get_bulk_request_status(bulk_request_id)

if status.get('failedCount', 0) > 0:
    print(f"Warning: {status['failedCount']} operations failed")
    # Inspect the failed items and retry them if needed
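
One possible retry pattern is to re-submit only the projects that failed. The failedProjectIds field below is a hypothetical name used for illustration; check the bulk request response in your environment for the actual field that reports failed items.

# NOTE: 'failedProjectIds' is a hypothetical field name; inspect the real
# bulk request payload to see how failed items are reported.
failed_ids = status.get('failedProjectIds', [])
if failed_ids:
    retry = requests.post(
        f"{api_url}/api/v2/bulkProjects/archive",
        headers=headers,
        json={"projectIds": failed_ids}
    )
    retry.raise_for_status()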

Next Steps