Understanding 3rd-party integration connectors
Introduction
Cargo can connect to various third-party tools through their API endpoints. This capability lets workflows draw on the datasets and logic those endpoints expose for the benefit of the user.
Setting up integrations
Integrating third-party tools with Cargo requires initial setup, including the configuration of connectors with the necessary credentials for authentication.
Cargo ensures the security of these credentials through encryption and secure storage.
Additionally, a credit-based system lets some connectors be used with Cargo credits instead of a direct API key, which encourages experimentation with new tools.
Data module
The data module offers data loaders, which pull data from these third-party tools into models inside Cargo.
These loaders extract data using the connector's API credentials and make it available for use in segments and workflows.
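To make this concrete, the sketch below shows the kind of work a data loader performs: calling a third-party API with stored credentials and shaping the response into rows. The endpoint, header name, and field names are assumptions for illustration, not a specific Cargo or vendor API.

```python
import requests

API_KEY = "your-connector-api-key"  # in Cargo, credentials are stored encrypted
BASE_URL = "https://api.example-crm.com/v1"  # hypothetical third-party API

def load_contacts() -> list[dict]:
    """Fetch contacts from the third-party API and shape them into rows."""
    response = requests.get(
        f"{BASE_URL}/contacts",
        headers={"Authorization": f"Bearer {API_KEY}"},
        timeout=30,
    )
    response.raise_for_status()
    # Keep only the fields the downstream model cares about.
    return [
        {"email": c.get("email"), "company": c.get("company"), "plan": c.get("plan")}
        for c in response.json().get("results", [])
    ]

rows = load_contacts()
print(f"Loaded {len(rows)} rows")
```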
Workflow module
The workflow module offers connectors that package external API calls and abstract away the middleware, making these calls simple for the user to execute.
These connectors offer three ways to query these endpoints, as sketched below:
- Raw HTTP calls: Directly interact with APIs using HTTP requests.
- Raw HTTP browser: Retrieve the HTML content of a public webpage.
- Native Cargo action connectors: Ease interactions with these APIs by managing the complications of rate limits, pagination, and error handling.
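For intuition, here is a minimal sketch of the first two options and of what a native action takes off your hands. The endpoints and payloads are hypothetical, and the comments describe the general division of labour rather than Cargo's exact internals.

```python
import requests

# Raw HTTP call: you supply the full request yourself (URL, method,
# headers, body) and you own error handling. Hypothetical endpoint.
response = requests.post(
    "https://api.example-enrichment.com/v1/enrich",
    headers={"Authorization": "Bearer your-api-key"},
    json={"email": "jane@acme.com"},
    timeout=30,
)
response.raise_for_status()
print(response.json())

# Raw HTTP browser: fetch the HTML content of a public webpage.
html = requests.get("https://example.com/pricing", timeout=30).text
print(html[:200])

# A native Cargo action wraps an equivalent call but also manages rate
# limits, pagination, and error handling on your behalf.
```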
Rate limits and pagination
Rate limits are constraints set by APIs on the number of requests that can be made within a certain timeframe. These limits are crucial for API providers to ensure their services remain stable and accessible to all users without being overwhelmed by excessive requests. Exceeding these limits typically results in errors, causing disruptions in service and data flow.
For users on an enterprise plan, Cargo implements a sophisticated rate limit management solution to optimize API usage and prevent errors. It dynamically queues and schedules API calls to adhere to the rate limits of each integrated third-party service. This approach ensures that workflows run smoothly without hitting rate limit barriers, which could cause errors and execution failure.
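As an illustration of the kind of client-side throttling and retry logic this management replaces, the sketch below spaces out requests and backs off when an API answers 429 Too Many Requests. The one-request-per-second budget and the endpoint are assumptions for illustration, and real rate limit policies vary by provider.

```python
import time
import requests

MIN_INTERVAL = 1.0  # assumed budget: at most one request per second
_last_call = 0.0

def throttled_get(url: str, **kwargs) -> requests.Response:
    """Space out requests and back off on 429 Too Many Requests."""
    global _last_call
    wait = MIN_INTERVAL - (time.monotonic() - _last_call)
    if wait > 0:
        time.sleep(wait)
    for attempt in range(5):
        _last_call = time.monotonic()
        response = requests.get(url, timeout=30, **kwargs)
        if response.status_code != 429:
            response.raise_for_status()
            return response
        # Respect the server's hint (assumed to be seconds) if present,
        # otherwise back off exponentially.
        retry_after = float(response.headers.get("Retry-After", 2 ** attempt))
        time.sleep(retry_after)
    raise RuntimeError(f"Still rate limited after retries: {url}")

# Usage (hypothetical endpoint):
# data = throttled_get("https://api.example-crm.com/v1/contacts").json()
```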
When an API request returns a large data set, it's often split into pages or chunks to prevent overwhelming the client and to reduce the load on the server. This process is known as pagination. Handling pagination manually can be cumbersome, especially when working with extensive data sets across multiple API endpoints.
Cargo simplifies the handling of paginated responses by automatically managing the iteration over pages of data. This means that when Cargo makes an API call that results in a paginated response, it can automatically continue to request subsequent pages until all the data has been retrieved.
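To make the pagination concrete, here is a minimal sketch of the page-by-page loop Cargo performs for you. The `page`/`per_page` parameters and the `results` field are assumptions about a hypothetical API; real services differ (cursor tokens, `next` links, and so on).

```python
import requests

def fetch_all(url: str, api_key: str, per_page: int = 100) -> list[dict]:
    """Follow page-numbered responses until the API runs out of results."""
    headers = {"Authorization": f"Bearer {api_key}"}
    records: list[dict] = []
    page = 1
    while True:
        response = requests.get(
            url,
            headers=headers,
            params={"page": page, "per_page": per_page},
            timeout=30,
        )
        response.raise_for_status()
        batch = response.json().get("results", [])
        records.extend(batch)
        if len(batch) < per_page:  # last (partial or empty) page reached
            return records
        page += 1

# Usage (hypothetical endpoint):
# contacts = fetch_all("https://api.example-crm.com/v1/contacts", "your-api-key")
```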