Data Source Connections API Tutorial

From BigID Developer Portal

In this article, you'll learn:

  • How to export and save all data source definitions from an existing BigID system.
  • How to import those data sources into the new BigID system using the API.
  • How to verify that all data sources were transferred successfully.


Scenario: Your organization is migrating to a new BigID environment. Instead of manually recreating every single data source, you want to streamline your migration by exporting the current environment’s data source definitions and importing them into the new system using the BigID Data Source Connections API.

In this tutorial, we'll use SAMPLE as our session token. This is unique to the training sandbox and will not work in other environments. See BigID API/Tutorial for information on authenticating with BigID.

To view the complete code for all steps, see the section labelled Code Samples.

For more information on the API capabilities used in this tutorial, check out the Data Source Connections API Docs.

1. Authenticate Using Your API Key

All API requests require authentication using a valid API key. Refer to BigID Documentation to obtain your token. Then, define the Authorization header using the format `Authorization: Bearer YOUR_API_KEY`. This header must be included in every request to ensure proper authentication and access to BigID’s API endpoints. Throughout the tutorial, we will be using SAMPLE as our token.
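In Python, for example, the header can be defined once and reused for every request. The base URL below is a placeholder for your own BigID host:

```python
# SAMPLE is the training-sandbox token used throughout this tutorial;
# substitute your own API key. The base URL is a placeholder.
BASE_URL = "https://your-bigid-host/api/v1"
TOKEN = "SAMPLE"

# Every request must include this Authorization header.
HEADERS = {
    "Authorization": f"Bearer {TOKEN}",
    "Content-Type": "application/json",
}
```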

2. Export All Existing Data Sources

Use the GET /ds-connections/file-download/export endpoint to export your configured data sources into a JSON file. There is an optional ids parameter that can be used to fetch specific data sources only, but because we are transferring all the data, we do not need to include it.

The file returned will contain all the necessary configuration metadata for each data source, including connection type, credentials (if stored), scan options, and more.
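A minimal sketch of the export step, using Python's standard library. The host is a placeholder, and the output filename (`ds-export.json`) is our own choice:

```python
import urllib.request

BASE_URL = "https://your-bigid-host/api/v1"  # placeholder host
EXPORT_URL = f"{BASE_URL}/ds-connections/file-download/export"
HEADERS = {"Authorization": "Bearer SAMPLE"}

def export_data_sources(out_path="ds-export.json"):
    """Download all data source definitions and save them to a local JSON file."""
    req = urllib.request.Request(EXPORT_URL, headers=HEADERS)
    with urllib.request.urlopen(req) as resp:
        with open(out_path, "wb") as f:
            f.write(resp.read())
    return out_path
```

We omit the optional `ids` parameter here, since the goal is to transfer every data source.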

3. Import Data Sources into the New BigID System

Now that you've exported your data sources as a JSON file in Step 2, it's time to import them into the new BigID environment. BigID does not support bulk importing via file upload, so each data source must be created individually using the POST /ds_connections endpoint.

This endpoint accepts a ds_connection object with the configuration values for the new data source. These should match the fields exported from your old system, and they must include any required values as defined in the data source’s template.
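A sketch of the import loop, assuming the export file contains a list of definitions (the exact file layout may vary by BigID version, so adjust the parsing to match your export). The new-environment host is a placeholder:

```python
import json
import urllib.request

NEW_BASE = "https://new-bigid-host/api/v1"  # placeholder new-environment host
HEADERS = {
    "Authorization": "Bearer SAMPLE",
    "Content-Type": "application/json",
}

def build_payload(ds):
    """Wrap one exported definition in the ds_connection object the endpoint expects."""
    return {"ds_connection": ds}

def import_data_sources(export_path="ds-export.json"):
    """POST each exported definition individually; there is no bulk file upload."""
    with open(export_path) as f:
        exported = json.load(f)  # assumed: a list of data source definitions
    for ds in exported:
        req = urllib.request.Request(
            f"{NEW_BASE}/ds_connections",
            data=json.dumps(build_payload(ds)).encode(),
            headers=HEADERS,
            method="POST",
        )
        with urllib.request.urlopen(req) as resp:
            print(f"{ds.get('name')}: HTTP {resp.status}")
```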

4. Verify Data Sources Are Properly Transferred

Once all data sources have been created in the new BigID environment, it’s important to verify that the transfer was successful.

You can confirm this by using the GET /ds-connections endpoint on the new system to retrieve the list of all configured data sources. Compare this list to your original export to ensure that each data source has been recreated accurately.
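A sketch of that comparison. The response shape (`ds_connections` key, `name` field) is an assumption to adapt to your version:

```python
import json
import urllib.request

NEW_BASE = "https://new-bigid-host/api/v1"  # placeholder host
HEADERS = {"Authorization": "Bearer SAMPLE"}

def list_data_source_names():
    """Fetch the names of all data sources configured in the new system."""
    req = urllib.request.Request(f"{NEW_BASE}/ds-connections", headers=HEADERS)
    with urllib.request.urlopen(req) as resp:
        body = json.loads(resp.read())
    # Response shape is an assumption; adjust the key to match your version.
    return [ds.get("name") for ds in body.get("ds_connections", [])]

def missing_sources(exported_names, current_names):
    """Names present in the export but not yet in the new system."""
    return sorted(set(exported_names) - set(current_names))
```

An empty result from `missing_sources` means every exported data source now exists in the new environment.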

5. Troubleshooting

| Status Code | What It Means | How to Fix It |
| --- | --- | --- |
| 200 | Successful response | Everything’s looking good! Keep cruising. |
| 400 | Bad or malformed request body | Double-check that your ds_connection object includes all required fields from the data source’s template. |
| 401 | API key missing or invalid | Verify your API key and Authorization header. |
| 404 | The requested resource doesn’t exist | Make sure the endpoint path and any IDs are correct. |
| 500 | BigID server hit a snag (internal error) | Wait a moment and retry. If it persists, reach out to support. |
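These statuses can also be handled in code. A small sketch (the helper names are ours, not part of the BigID API):

```python
import urllib.request
import urllib.error

# Hints for the common error statuses described above.
STATUS_HINTS = {
    400: "Bad or malformed request body",
    401: "API key missing or invalid",
    404: "Requested resource was not found",
    500: "BigID server internal error; retry, then contact support",
}

def describe_status(code):
    """Translate an HTTP status code into a troubleshooting hint."""
    return STATUS_HINTS.get(code, f"Unexpected status {code}")

def safe_request(req):
    """Issue a request and raise a readable error on failure."""
    try:
        with urllib.request.urlopen(req) as resp:
            return resp.read()
    except urllib.error.HTTPError as e:
        raise RuntimeError(f"HTTP {e.code}: {describe_status(e.code)}") from e
```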

Code Samples

# Data Source Connections API Tutorial
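The steps in this tutorial can be combined into one script. The hosts, the token, and the assumption that the export is a JSON list of definitions are all placeholders to adapt to your environment:

```python
import json
import urllib.request

# Placeholder hosts for the old and new environments; adapt to yours.
OLD_BASE = "https://old-bigid-host/api/v1"
NEW_BASE = "https://new-bigid-host/api/v1"
HEADERS = {
    "Authorization": "Bearer SAMPLE",  # training-sandbox token
    "Content-Type": "application/json",
}

def get_json(url):
    """GET a URL with the auth header and parse the JSON body."""
    req = urllib.request.Request(url, headers=HEADERS)
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

def export_data_sources(path="ds-export.json"):
    """Step 2: download all data source definitions from the old system."""
    data = get_json(f"{OLD_BASE}/ds-connections/file-download/export")
    with open(path, "w") as f:
        json.dump(data, f, indent=2)
    return data

def import_data_sources(definitions):
    """Step 3: create each data source individually in the new system."""
    for ds in definitions:  # export assumed to be a list of definitions
        body = json.dumps({"ds_connection": ds}).encode()
        req = urllib.request.Request(
            f"{NEW_BASE}/ds_connections", data=body, headers=HEADERS, method="POST"
        )
        with urllib.request.urlopen(req) as resp:
            print(f"Created {ds.get('name')}: HTTP {resp.status}")

def missing_names(exported, current):
    """Step 4: names present in the export but absent from the new system."""
    return sorted({ds.get("name") for ds in exported}
                  - {ds.get("name") for ds in current})

# Usage (requires live environments):
#   exported = export_data_sources()
#   import_data_sources(exported)
#   current = get_json(f"{NEW_BASE}/ds-connections").get("ds_connections", [])
#   print("Missing after import:", missing_names(exported, current) or "none")
```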

Summary

Congratulations! In this tutorial, you have learned how to efficiently export existing data sources from one BigID environment and import them into another using the BigID API.