API reference
One route, one request shape, one confidence-aware response contract.
This page documents the live photo geolocation endpoint used across the product, demo, and developer surface. It is intentionally narrow: one endpoint, clear fields, and practical output expectations.
Request summary
Endpoint
https://pic2nav.com/api/location-recognition-v2
Send a multipart form request with the image file in the `image` field.
Authenticate with `Authorization: Bearer YOUR_API_KEY`.
Expect confidence-aware output rather than guaranteed guesses.
Request fields
`image` (file, required): Image to analyze for visible location evidence.
`latitude` (number, optional): Explicit latitude hint when the client already has coordinates.
`longitude` (number, optional): Explicit longitude hint paired with latitude.
`analyzeLandmarks` (boolean, optional): Request deeper landmark enrichment when relevant.
`regionHint` (string, optional): Region hint used to bias route-side search and validation.
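For illustration, the optional fields above travel as ordinary form values next to the image part. The helper below is a hypothetical sketch, not part of any client library; only the field names come from the table.

```python
def build_hint_fields(latitude=None, longitude=None, region_hint=None, analyze_landmarks=False):
    """Assemble the optional multipart form fields for the request.

    Illustrative helper; the field names match the request table above.
    """
    fields = {}
    if latitude is not None and longitude is not None:
        # Hints are paired: a latitude without a longitude is dropped.
        fields["latitude"] = str(latitude)
        fields["longitude"] = str(longitude)
    if region_hint:
        fields["regionHint"] = region_hint
    if analyze_landmarks:
        fields["analyzeLandmarks"] = "true"
    return fields
```

With `requests`, the result would be passed as `data=` alongside the `files={"image": ...}` part shown in the examples below.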
Response shape
A top-level success state and confidence score.
Resolved coordinates and address context when a location is found.
Method provenance such as EXIF, NaviSense, Claude-assisted, or landmark search.
Optional enrichment such as weather, elevation, nearby places, and device analysis.
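Because the contract is confidence-aware, a client typically gates downstream use on the score rather than trusting every response. A minimal sketch of that gating, assuming the field names from the example response below; the 0.7 cutoff is an illustrative choice, not a documented threshold:

```python
def accept_result(payload, min_confidence=0.7):
    """Return resolved coordinates only when the response is confident.

    The 0.7 threshold is an assumption for illustration, not part of the API contract.
    """
    if not payload.get("success"):
        return None
    if payload.get("confidence", 0.0) < min_confidence:
        return None
    loc = payload.get("location") or {}
    if "latitude" not in loc or "longitude" not in loc:
        return None
    return (loc["latitude"], loc["longitude"])
```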
Example requests
Copy a request shape that matches how the live route is actually called.
curl example
curl -X POST https://pic2nav.com/api/location-recognition-v2 \
-H "Authorization: Bearer YOUR_API_KEY" \
-F "image=@photo.jpg"
javascript example
const formData = new FormData();
formData.append("image", file);
const response = await fetch("https://pic2nav.com/api/location-recognition-v2", {
method: "POST",
headers: {
Authorization: "Bearer YOUR_API_KEY",
},
body: formData,
});
const data = await response.json();
python example
import requests
headers = {"Authorization": "Bearer YOUR_API_KEY"}
files = {"image": open("photo.jpg", "rb")}
response = requests.post(
"https://pic2nav.com/api/location-recognition-v2",
headers=headers,
files=files,
)
data = response.json()
Example response
{
"success": true,
"name": "Detected Location",
"address": "Example address",
"location": {
"latitude": 51.5007,
"longitude": -0.1246
},
"confidence": 0.92,
"method": "navisense-ml",
"recognitionId": "rec_123",
"weather": {
"temperature": 18.4
}
}
Route behavior
The API is a live system surface, not a thin wrapper around one model call.
Direct evidence first
The route prioritizes visible-address extraction, provided coordinates, and EXIF GPS before escalating to weaker inference modes.
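The same evidence-first ordering applies client-side: a photo that already carries EXIF GPS may not need the route at all. The degrees/minutes/seconds conversion below is standard EXIF handling; the helper name is illustrative and not part of this API.

```python
def dms_to_decimal(dms, ref):
    """Convert an EXIF (degrees, minutes, seconds) tuple to signed decimal degrees.

    `ref` is the EXIF hemisphere reference: "N", "S", "E", or "W".
    """
    degrees, minutes, seconds = (float(v) for v in dms)
    value = degrees + minutes / 60.0 + seconds / 3600.0
    # South and west references flip the sign.
    return -value if ref in ("S", "W") else value
```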
Hybrid recognition pipeline
If direct evidence is missing, the stack can route through NaviSense ML, Claude reasoning, and Google Vision before final validation.
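The fail-over idea can be sketched generically: try each recognizer in priority order and stop at the first sufficiently confident result, failing closed otherwise. This is a sketch of the routing concept only, not the actual server implementation; the stage names and threshold are placeholders.

```python
def route_through(recognizers, image, min_confidence=0.6):
    """Try (name, recognize) pairs in order; return the first confident hit.

    Illustrative sketch of the fail-over concept; not the production pipeline.
    """
    for name, recognize in recognizers:
        result = recognize(image)
        if result and result.get("confidence", 0.0) >= min_confidence:
            return {"method": name, **result}
    # Fail closed: no stage produced confident evidence.
    return {"success": False}
```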
Fail-closed behavior
When evidence stays weak or inconsistent, the API returns a controlled failure instead of locking in a low-quality guess.