FileCatalyst | Workload Automation 2021

```bash
# Retry up to 3 times
RETRIES=3
for i in $(seq 1 $RETRIES); do
  fta-cli --put critical_file.dat --target /incoming/ && break || sleep 10
done
```

| Tool | Integration Method |
|------|--------------------|
| Apache Airflow | Use `BashOperator` with fta-cli, or `SimpleHttpOperator` for the REST API |
| Jenkins | Execute a shell-script step calling fta-cli |
| Rundeck | Create a job step: "Command" → `fta-cli ...` |
| Control-M | FileCatalyst provides a Control-M plugin (File Transfer Hub) |
| Apache NiFi | Use the `ExecuteProcess` processor to call fta-cli |
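The same retry pattern can be mirrored in Python when the transfer is driven from a script rather than a shell job. This is a minimal sketch: `retry` is a generic helper, and `run_transfer` simply wraps the same fta-cli call as the shell loop above (neither name is part of fta-cli itself).

```python
import subprocess
import time

def retry(fn, attempts=3, delay=10):
    """Call fn() until it returns True, retrying up to `attempts` times."""
    for i in range(attempts):
        if fn():
            return True
        if i < attempts - 1:
            time.sleep(delay)
    return False

def run_transfer():
    # Illustrative: the same fta-cli invocation as the shell loop above.
    result = subprocess.run(
        ["fta-cli", "--put", "critical_file.dat", "--target", "/incoming/"],
        capture_output=True,
    )
    return result.returncode == 0
```

A scheduler step would then call `retry(run_transfer)` and fail the job if it returns `False`.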

For native workload automation features (dependency management, SLA tracking, visual pipelines), you would typically wrap FileCatalyst commands into a dedicated workload automation platform such as those listed above, using FileCatalyst as the file-movement plugin.

```python
import subprocess

def run_fta(local, remote, server, user, pw):
    cmd = [
        "fta-cli", "--server", server,
        "--username", user, "--password", pw,
        "--put", local, "--target", remote,
    ]
    result = subprocess.run(cmd, capture_output=True)
    return result.returncode == 0
```

Since FileCatalyst itself is primarily a high-speed file transfer solution (using UDP acceleration), it does not have a native workload automation engine built into its core. Instead, automation is achieved through its command-line interface (fta-cli), its REST API, and Hotfolders.

```bash
#!/bin/bash
# workload_processor.sh

# Step 1: Compress files
tar -czf /data/prepared/batch1.tar.gz /data/raw/*.csv

# Step 2: Transfer with FileCatalyst
fta-cli --server fc.example.com --port 11001 --username auto_user \
  --put /data/prepared/batch1.tar.gz --target /incoming/

# Step 3: Verify success (check exit code)
if [ $? -eq 0 ]; then
  echo "Transfer successful, triggering downstream API"
  curl -X POST https://processing.api/start -d '{"file":"batch1.tar.gz"}'
else
  echo "Transfer failed" >> /var/log/fc_errors.log
fi
```

**Method B: Hotfolders – Best for Simple, Event-Driven Workloads**

Configure `hotfolder.properties` to watch a directory. Any file dropped into it is automatically transferred.
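FileCatalyst's hotfolder engine does the watching for you, but the event-driven behavior is easy to illustrate with a plain polling loop. This is a generic Python sketch, not FileCatalyst code: `handle` stands in for whatever transfer action you want to run on each newly dropped file.

```python
import time
from pathlib import Path

def watch_once(folder, seen, handle):
    """Scan `folder` once; call handle(path) for each file not yet seen."""
    new = []
    for path in sorted(Path(folder).iterdir()):
        if path.is_file() and path.name not in seen:
            seen.add(path.name)
            handle(path)
            new.append(path.name)
    return new

def watch(folder, handle, interval=5):
    """Poll `folder` forever, handing each newly dropped file to `handle`."""
    seen = set()
    while True:
        watch_once(folder, seen, handle)
        time.sleep(interval)
```

A real hotfolder adds details this sketch skips, such as waiting for files to stop growing before picking them up.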

```python
import requests

# Assumes API_BASE, API_KEY, and payload were defined earlier in the workflow.
headers = {"X-API-Key": API_KEY}
resp = requests.post(f"{API_BASE}/transfer", json=payload, headers=headers)
transfer_id = resp.json()["id"]
```
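After submitting a transfer over the REST API, an automation step usually polls for completion before triggering downstream work. The polling pattern below is generic; the status endpoint and state names it would be paired with are assumptions, not documented FileCatalyst routes.

```python
import time

def wait_for(check, done_states=("completed", "failed"), timeout=300, interval=2):
    """Poll check() until it returns a terminal state or the timeout expires."""
    deadline = time.time() + timeout
    while time.time() < deadline:
        state = check()
        if state in done_states:
            return state
        time.sleep(interval)
    raise TimeoutError("transfer did not reach a terminal state")
```

With the snippet above, `check` could be a small function that GETs a (hypothetical) status URL for `transfer_id` and returns the `status` field from the JSON response.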

```bash
# Send 10 files in parallel
ls /data/to_send/*.dat | xargs -P 10 -I {} fta-cli --put {} --target /remote/
```

Check file hash before transfer.
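The hash check mentioned above can be done with Python's standard `hashlib` before handing a file to fta-cli; a minimal sketch:

```python
import hashlib

def sha256_of(path, chunk_size=1 << 20):
    """Compute the SHA-256 hex digest of a file, reading in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()
```

Record the digest before sending, recompute it on the receiving side, and compare; a mismatch means the file should be re-sent.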