- Profile-scoped API credentials after onboarding
- Integration guidance for secure key handling
- Production activation only after scope alignment
AlgoNav
Positioning API
Connect high-precision GNSS/IMU post-processing to your own products and pipelines.
Upload. Process. Retrieve.
Upload Files
Send raw GNSS/IMU logs (and optional supporting sensor data) with metadata to the file endpoint.
Create Job
Start an asynchronous processing job using your profile (parameters, QC rules, output schema, automation hooks).
Monitor Progress
Poll status or use webhooks for lifecycle updates across single tasks and larger batch campaigns.
Retrieve Results
Download trajectory, orientation, and QC indicators — structured for your downstream pipeline.
Under the Hood: The AlgoNav Positioning Engine
Every deployment option is powered by the same science-driven core: tightly coupled sensor fusion, iterative multi-pass estimation, and blazing-fast CPU-parallel processing, built on 20+ years of research.
Enterprise API Deployment
For organizations that need dedicated infrastructure, data sovereignty, or custom SLAs — the AlgoNav API can be deployed beyond our shared cloud.
Dedicated Instance
Customer-specific API endpoints with isolated compute and storage resources.
Regional Deployment
Host the API in a specific geographic region to meet data residency requirements.
On-Premise / Air-Gapped
Deploy the API on your own servers or in fully air-gapped environments via Docker containers.
Scaling & Load Balancing
Horizontal scaling and load balancing for high-throughput production pipelines. Unified API endpoint with internal job scheduling across worker clusters — your integration stays simple regardless of scale.
White-Label
Brand the API output and reports under your own identity for end-customer delivery.
Customizable Workflows
Define end-to-end processing pipelines tailored to your data flow. Automate ingestion, processing, QC validation, and result delivery with webhook notifications and custom output schemas.
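The webhook notifications mentioned above can be sketched as a minimal event handler. The payload fields ("event", "job_id") and event names used here are illustrative assumptions; the actual webhook schema is defined during onboarding.

```python
import json

# Hypothetical job lifecycle events — the real event names and payload
# fields are part of the onboarding documentation, not confirmed here.
def handle_event(payload: dict) -> str:
    """Route a job lifecycle event to the next pipeline step."""
    event = payload.get("event")
    if event == "job.completed":
        # e.g. trigger result download for payload["job_id"]
        return f"download:{payload['job_id']}"
    if event == "job.failed":
        # e.g. alert operators and flag the job for review
        return f"alert:{payload['job_id']}"
    # Ignore unknown events so future schema additions stay backward-compatible
    return "ignored"

if __name__ == "__main__":
    body = json.dumps({"event": "job.completed", "job_id": "job_abc123"})
    print(handle_event(json.loads(body)))  # → download:job_abc123
```

Keeping the handler a pure function of the payload makes it easy to plug into any web framework and to unit-test without a running server.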
All processing is CPU-based — no GPU required. Runs on standard server hardware from 4-core VPS to 64+ core machines.
API Usage Example
Example workflow: upload files, reference the returned file IDs in tasks, then run asynchronous jobs. The final endpoint set and schemas are confirmed during onboarding.
Technical preview — example workflow
cURL

# 1) Upload a file
curl -X POST https://api.algonav.de/v1/files \
  -H "Authorization: Bearer $API_KEY" \
  -F "type=gnss" \
  -F "file=@mission.obs"
# → {"file_id": "file_gnss_001"}

# 2) Create a job
curl -X POST https://api.algonav.de/v1/jobs \
  -H "Authorization: Bearer $API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"job_profile": "survey_v1", "tasks": [{"gnss_file_id": "file_gnss_001", "imu_file_id": "file_imu_001"}]}'
# → {"job_id": "job_abc123", "status": "queued"}

# 3) Poll status
curl https://api.algonav.de/v1/jobs/job_abc123 \
  -H "Authorization: Bearer $API_KEY"
# → {"status": "completed", "tasks_completed": 1}
Python

import requests

API = "https://api.algonav.de/v1"
HDRS = {"Authorization": f"Bearer {API_KEY}"}

# 1) Upload files
gnss = requests.post(f"{API}/files", headers=HDRS,
                     files={"file": open("mission.obs", "rb")},
                     data={"type": "gnss"}).json()
imu = requests.post(f"{API}/files", headers=HDRS,
                    files={"file": open("mission_imu.csv", "rb")},
                    data={"type": "imu"}).json()

# 2) Create a job
job = requests.post(f"{API}/jobs", headers=HDRS, json={
    "job_profile": "survey_v1",
    "tasks": [{
        "gnss_file_id": gnss["file_id"],
        "imu_file_id": imu["file_id"],
    }]
}).json()

# 3) Poll status
status = requests.get(f"{API}/jobs/{job['job_id']}", headers=HDRS).json()
print(status["status"])  # → "completed"
JavaScript (Node.js)

import fetch from 'node-fetch';
import FormData from 'form-data';
import fs from 'fs';

const API = 'https://api.algonav.de/v1';
const hdrs = { Authorization: `Bearer ${API_KEY}` };

// 1) Upload a file
const form = new FormData();
form.append('type', 'gnss');
form.append('file', fs.createReadStream('mission.obs'));
const upload = await fetch(`${API}/files`, {
  method: 'POST',
  headers: { ...hdrs, ...form.getHeaders() },
  body: form
}).then(r => r.json());

// 2) Create a job
const job = await fetch(`${API}/jobs`, {
  method: 'POST',
  headers: { ...hdrs, 'Content-Type': 'application/json' },
  body: JSON.stringify({
    job_profile: 'survey_v1',
    tasks: [{ gnss_file_id: upload.file_id, imu_file_id: 'file_imu_001' }]
  })
}).then(r => r.json());

// 3) Poll status
const status = await fetch(`${API}/jobs/${job.job_id}`, { headers: hdrs })
  .then(r => r.json());
console.log(status.status); // → "completed"
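In production, the polling step is usually wrapped in a loop with a timeout and backoff rather than a single GET. A minimal sketch in Python, with the status fetcher injected so it works with any HTTP client; the terminal status values ("completed", "failed") follow the example above but are assumptions until the final schema is shared during onboarding.

```python
import time

def wait_for_job(job_id, fetch_status, timeout=3600.0, interval=5.0,
                 sleep=time.sleep):
    """Poll until the job reaches a terminal state or the timeout elapses.

    fetch_status(job_id) -> dict, e.g. a wrapper around GET /v1/jobs/{id}.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        status = fetch_status(job_id)
        # "completed"/"failed" as terminal states is an assumption here
        if status.get("status") in ("completed", "failed"):
            return status
        sleep(interval)
        interval = min(interval * 2, 60.0)  # capped exponential backoff
    raise TimeoutError(f"job {job_id} still running after {timeout}s")
```

With the requests-based example above, the fetcher would be something like `lambda jid: requests.get(f"{API}/jobs/{jid}", headers=HDRS, timeout=30).json()`. Injecting it also keeps the retry logic unit-testable without network access.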
Detailed endpoint documentation and profile schemas are shared as part of API onboarding.
Access, Throughput & Security
Configured to your workflow. Compliant by design.
- Rate limits and concurrency defined per workflow profile
- Batch behavior tuned to your operational load pattern
- Retention and lifecycle controls aligned to project requirements
- EU hosting and encrypted transport (TLS)
- GDPR-aligned processing framework
- Privacy details documented in our policy
Public endpoint details are intentionally limited. Full technical documentation is shared during onboarding. See our Privacy Policy for data handling details.
API Onboarding
Guided setup first. A production API profile configured for your exact workflow afterwards.
Upload & Free Evaluation
Share a representative test dataset via a secure link. We process it and review the expected positioning quality with you — free of charge.
Define Your API Profile
We align inputs, outputs, QC metrics, coordinate handling, and automation hooks to your target pipeline — and configure your production workflow profile.
Go-Live & Operate
Your API credentials are issued and your production profile is activated. Run single tasks or large batch jobs with profile-based throughput and lifecycle controls from day one.
Ready to Build Your API Workflow?
Start with a free test-data evaluation. If quality meets your needs, we configure and activate your production API profile.
Prefer a web interface? Try Positioning Cloud.