Integration Surface

Ludopoly Analytics exposes its capabilities through four complementary protocols, each optimised for a different interaction pattern. Together, they cover the full spectrum of integration needs — from batch queries that pull historical data to persistent streams that push real-time events to downstream systems.

All protocol endpoints share a common authentication layer. API keys, issued through the project dashboard, carry the permissions and rate limits associated with the project's subscription tier. Keys are scoped to individual projects and can be rotated independently. Every request is authenticated, rate-limited, and logged, regardless of the protocol used.
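As a concrete sketch, an authenticated REST call might look like the following. The header name `X-API-Key` and the host are illustrative assumptions, not documented values; use the key issued through your project dashboard.

```python
import json
import urllib.request

BASE_URL = "https://api.ludopoly.example/v1"  # placeholder host

def build_headers(api_key: str) -> dict:
    """Attach the project-scoped API key to every request."""
    return {"X-API-Key": api_key, "Accept": "application/json"}

def get_address_risk(api_key: str, address: str) -> dict:
    """Fetch an address risk profile over plain HTTPS."""
    req = urllib.request.Request(
        f"{BASE_URL}/addresses/{address}/risk",
        headers=build_headers(api_key),
    )
    with urllib.request.urlopen(req, timeout=10) as resp:
        return json.loads(resp.read())
```

Because every request passes through the same gateway, the same headers work unchanged against the GraphQL and webhook-management endpoints.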

[Diagram: API Gateway — Auth, Rate Limit, Log — fronting REST, GraphQL, WebSocket, and Webhook protocols, between Client Applications (SDK / Direct HTTP) and the Analytics Platform (Modules · Pipeline · Graph).] Four protocols share a single gateway — authentication, rate limits, and audit logs applied uniformly.

REST API

The REST API follows resource-oriented conventions and is the most appropriate protocol for standard CRUD operations, paginated queries against historical data, and integration with systems that expect conventional HTTP request–response semantics. Endpoints are versioned — current consumers use the /v1 prefix — and backward compatibility is maintained across minor releases.

Typical REST interactions include retrieving the risk profile of an address, querying transaction history with multi-dimensional filters, fetching aggregated metrics for a registered project, and managing project configuration such as alert thresholds and webhook endpoints. Responses are JSON-encoded. Pagination follows cursor-based conventions, which provide stable results even when new data arrives between page fetches.
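The cursor-based pattern can be sketched as a small generator. The transport is abstracted behind `fetch_page`, a caller-supplied function, since the exact query parameters are not specified here.

```python
# Sketch of cursor-based pagination. fetch_page takes the previous
# cursor (None on the first call) and returns (items, next_cursor);
# a next_cursor of None signals the final page.
def paginate(fetch_page):
    """Yield every item across all pages of a cursor-paginated endpoint."""
    cursor = None
    while True:
        items, cursor = fetch_page(cursor)
        yield from items
        if cursor is None:
            break
```

Because the cursor pins the client's position in the result set, items that arrive mid-traversal do not shift subsequent pages, unlike offset-based pagination.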

Rate limits are determined by the project's subscription tier. Starter plans receive ten thousand requests per day. Professional, Business, and Enterprise plans receive progressively higher quotas, with Enterprise customers able to negotiate custom limits as part of their SLA.
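A quick back-of-envelope helper shows what the documented Starter quota implies for client-side pacing; figures for the higher tiers are not published here, so only the Starter number is used.

```python
STARTER_DAILY_QUOTA = 10_000  # documented Starter tier limit

def min_interval(daily_quota: int) -> float:
    """Seconds to wait between requests to stay within a daily quota."""
    return 86_400 / daily_quota
```

At the Starter tier this works out to roughly one request every 8.6 seconds if load is spread evenly across the day; burstier clients should also handle HTTP 429 responses gracefully.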

GraphQL API

The GraphQL endpoint serves use cases where the client needs precise control over the shape of the returned data. Rather than receiving a fixed response schema and discarding unused fields, consumers construct queries that request exactly the fields they need, across related entities, in a single round trip.

This protocol is particularly useful for dashboard and visualisation integrations, where a single screen may need to combine address metadata, transaction history, risk scores, cohort membership, and graph-neighbourhood information. A well-crafted GraphQL query can retrieve all of this in one request, reducing both network overhead and client-side join logic. Subscriptions — the GraphQL equivalent of server-push — are also supported, enabling real-time dashboard updates without polling.
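A dashboard query of that shape might look like the sketch below. The field and type names are invented for illustration, since the real schema is not reproduced in this document; the point is the shape, with one request fetching metadata, history, risk, cohorts, and graph neighbourhood together.

```python
import json

# Hypothetical query; all field names below are assumptions.
DASHBOARD_QUERY = """
query AddressDashboard($addr: String!) {
  address(id: $addr) {
    metadata { label firstSeen }
    riskScore { value updatedAt }
    transactions(first: 10) { hash value timestamp }
    cohorts { name }
    neighbours(depth: 1) { id riskScore { value } }
  }
}
"""

def build_graphql_body(addr: str) -> str:
    """Serialise the standard GraphQL POST body: query plus variables."""
    return json.dumps({"query": DASHBOARD_QUERY, "variables": {"addr": addr}})
```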

WebSocket Streams

The WebSocket interface provides persistent, bidirectional connections for consumers that need sub-second event delivery. Once a connection is established, the client subscribes to one or more event streams — for example, all transactions associated with a specific project, all alerts generated by the AML engine, or all risk-score updates for a watched list of addresses.

Events are delivered as structured JSON messages. Each message includes an event type, a timestamp, and a payload whose schema corresponds to the event type. The connection supports heartbeat frames to detect stale sessions, and the server implements automatic reconnection guidance through a retry-after header when connections are dropped.
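Client-side handling of those messages can be sketched as a parse-and-dispatch step. The envelope field names (`"type"`, `"timestamp"`, `"payload"`) follow the description above but are assumptions, as the wire schema is not reproduced here.

```python
import json

def parse_event(raw: str):
    """Decode one stream message into (event_type, timestamp, payload)."""
    msg = json.loads(raw)
    return msg["type"], msg["timestamp"], msg["payload"]

def dispatch(raw: str, handlers: dict) -> bool:
    """Route a message to the handler registered for its event type.
    Returns True if a handler ran, False for unrecognised types."""
    event_type, _, payload = parse_event(raw)
    handler = handlers.get(event_type)
    if handler is None:
        return False
    handler(payload)
    return True
```

Ignoring unrecognised event types rather than raising lets the platform add new types without breaking existing consumers.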

WebSocket streams are the preferred protocol for compliance dashboards that need real-time alerting, for trading systems that react to on-chain risk signals, and for monitoring infrastructure that watches contract-level activity across multiple chains simultaneously.

Webhook Delivery

Webhooks reverse the direction of communication. Instead of the client polling or maintaining a persistent connection, the platform pushes events to a client-specified HTTPS endpoint when predefined conditions are met. Webhook subscriptions are configured through the REST API or the project dashboard and can be scoped to specific event types, severity levels, and chain identifiers.

Each webhook delivery includes a cryptographic signature in the request headers, computed using the project's webhook secret. Recipients should verify this signature before processing the payload to ensure that the event originates from the analytics platform and has not been tampered with in transit.
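A verification step might look like the following. The document confirms that the signature is computed from the project's webhook secret; HMAC-SHA256 over the raw request body, hex-encoded in a header, is an assumed scheme for illustration.

```python
import hashlib
import hmac

def verify_signature(secret: str, body: bytes, signature: str) -> bool:
    """Return True only when the signature matches, compared in
    constant time to avoid leaking information through timing."""
    expected = hmac.new(secret.encode(), body, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signature)
```

Rejecting unverifiable deliveries before parsing the payload keeps forged events out of downstream case-management systems.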

Delivery follows an at-least-once guarantee with exponential backoff. If the recipient responds with a non-success status code, the platform retries the delivery with increasing intervals up to a configurable maximum. A dead-letter queue captures events that exhaust their retry budget, and these can be replayed through the dashboard once the recipient endpoint is restored.
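The retry behaviour can be illustrated with a simple schedule. The base interval, growth factor, cap, and retry budget below are assumptions; the document states only that the maximum interval is configurable.

```python
def retry_schedule(base=30.0, factor=2.0, cap=3600.0, attempts=6):
    """Return the list of delays (seconds) before each retry:
    exponential backoff from a base interval up to a cap."""
    delays = []
    delay = base
    for _ in range(attempts):
        delays.append(min(delay, cap))
        delay *= factor
    return delays
```

Once `attempts` deliveries have failed, the event moves to the dead-letter queue for manual replay.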

Webhooks are the recommended integration protocol for automated compliance workflows. When the AML engine flags a transaction, the webhook delivers the alert, the pre-generated SAR draft, and the risk-score breakdown directly to the compliance team's case management system — no polling required.

SDK Libraries

While all four protocols can be consumed through direct HTTP or WebSocket clients, the platform provides official SDK libraries in five languages — TypeScript, Python, Go, Rust, and Java — that abstract protocol details and provide idiomatic interfaces for each ecosystem.

The SDKs handle authentication, request signing, retry logic, pagination traversal, and WebSocket connection management. They expose typed models for every API entity, so that developers working in statically typed languages benefit from compile-time validation of their integration code. Each SDK is distributed through the standard package registry for its language — npm, PyPI, Go modules, crates.io, and Maven Central — and follows semantic versioning.

Project Registration

Every integration begins with project registration. A developer registers their dApp through the platform dashboard or by submitting a YAML configuration file through the REST API. The registration captures the project's name, the contract addresses it encompasses, the chains those contracts are deployed on, and the analytics features the project wishes to activate.
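A registration file covering those fields might look like the sketch below. All field names and values are illustrative assumptions, since this document does not specify the YAML schema.

```yaml
# Hypothetical registration file; field names are assumptions.
name: my-dapp
contracts:
  - address: "0x0000000000000000000000000000000000000000"
    chain: ethereum
chains:
  - ethereum
features:
  - risk-scoring
  - aml-alerts
  - webhooks
```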

Once registered, a project enters an unverified state. Verification — which confirms that the registrant controls the declared contract addresses — upgrades the project to verified status and unlocks advanced features such as custom event taxonomies and priority ingestion. The verification process uses a lightweight on-chain challenge that requires the contract owner to call a designated function, proving control without transferring funds or modifying contract state.