Connectors Overview
Connectors are the bridge between Slim.io and your data infrastructure. Each connector establishes a secure, least-privilege connection to a specific service, enabling Slim.io to discover and scan data for sensitive information across cloud storage, databases, and SaaS platforms.
In the Customer Dashboard, this area is labeled Data Sources. “Data Source” and “Connector” refer to the same configured integration — a single object that owns credentials, scope, and connection method.
Connection Methods
Cloud storage providers (AWS S3, Google Cloud Storage, Azure Blob Storage) can be connected via two methods. Choose the method that fits your security and deployment posture.
| Method | Where It Runs | When To Choose |
|---|---|---|
| Agentless | Slim.io’s managed control plane calls provider APIs directly using the credentials you grant | Fastest path to value. No infrastructure to deploy. Recommended for most cloud accounts. |
| Scanner | A deployable scanner image runs inside your cloud environment (Docker, Kubernetes, Cloud Run) and reports findings back to Slim.io | Required for BYOC (“bring your own cloud”) deployments where data must never leave your network boundary, or for high-volume environments that benefit from local compute. |
Both methods use the same credentials and the same classifier library, and they produce identical findings. The wizard prompts you to choose a method when you create a cloud connector. Database and SaaS connectors always run in agentless mode; the method choice applies only to cloud storage.
The Customer Dashboard surfaces both options under the same Data Sources workspace — there is no separate “Agentless” menu.
Supported Providers
Cloud Storage
| Provider | Service | Auth Method | Status |
|---|---|---|---|
| Amazon Web Services | S3 | IAM Role (cross-account assume role) | GA |
| Google Cloud Platform | Cloud Storage | Workload Identity Federation (WIF) | GA |
| Microsoft Azure | Blob Storage | Service Principal (OAuth2 client credentials) | GA |
Databases
| Provider | Auth Method | Status |
|---|---|---|
| PostgreSQL | Username / password or IAM authentication | GA |
| MySQL | Username / password or IAM authentication | GA |
| Snowflake | Username / password or key-pair authentication | GA |
| SQL Server (MSSQL) | SQL authentication or Active Directory | GA |
| Oracle | Username / password | GA |
| IBM DB2 | Username / password | GA |
| Databricks | Personal access token or OAuth | GA |
SaaS & Collaboration
| Provider | Auth Method | Status |
|---|---|---|
| Slack | Bot token (OAuth2) | GA |
| Microsoft Teams | Service Principal (OAuth2 client credentials) | GA |
| OneDrive | Service Principal (OAuth2 client credentials) | GA |
| SharePoint | Service Principal (OAuth2 client credentials) | GA |
| Google Drive | Service account (domain-wide delegation) | GA |
| Salesforce | OAuth2 (connected app) | GA |
Authentication Methods
Each provider uses a different credential model, but all follow the principle of least privilege:
AWS — IAM Role
Slim.io assumes a cross-account IAM Role in your AWS account. You create the role with a trust policy that allows Slim.io’s AWS account to assume it. The role needs only s3:GetObject, s3:ListBucket, and s3:GetBucketLocation permissions.
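The two policies involved can be sketched as plain JSON documents. The account ID, external ID, and bucket name below are placeholders (your wizard shows the real values), and the external-ID condition is a common hardening step for cross-account roles rather than a documented Slim.io requirement:

```python
import json

# Hypothetical values -- substitute the account ID and external ID
# shown in the Slim.io connector wizard.
SLIMIO_AWS_ACCOUNT = "111111111111"
EXTERNAL_ID = "example-external-id"

# Trust policy: allows Slim.io's AWS account to assume the role,
# pinned to an external ID to prevent confused-deputy access.
trust_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": {"AWS": f"arn:aws:iam::{SLIMIO_AWS_ACCOUNT}:root"},
        "Action": "sts:AssumeRole",
        "Condition": {"StringEquals": {"sts:ExternalId": EXTERNAL_ID}},
    }],
}

# Permission policy: only the three read-only S3 actions the connector needs.
# ListBucket applies to the bucket ARN; GetObject to the objects within it.
scan_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Action": ["s3:GetObject", "s3:ListBucket", "s3:GetBucketLocation"],
        "Resource": ["arn:aws:s3:::my-bucket", "arn:aws:s3:::my-bucket/*"],
    }],
}

print(json.dumps(trust_policy, indent=2))
```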
GCP — Workload Identity Federation
Slim.io uses Workload Identity Federation (WIF) to authenticate without long-lived service account keys. You configure a WIF pool and provider in your GCP project, then grant the Slim.io service account access to your Cloud Storage buckets.
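A WIF setup hinges on the fully qualified provider resource, which doubles as the audience for token exchange. The project number, pool ID, and provider ID below are illustrative, and the bucket role shown is the standard read-only Cloud Storage role rather than a Slim.io-specific requirement:

```python
# Hypothetical identifiers -- use the pool/provider you created and the
# project number (not project ID) of the project that owns them.
project_number = "123456789012"
pool_id = "slimio-pool"
provider_id = "slimio-provider"

# Fully qualified WIF provider resource; external workloads exchange
# their identity tokens against this audience for short-lived credentials.
provider = (
    f"//iam.googleapis.com/projects/{project_number}"
    f"/locations/global/workloadIdentityPools/{pool_id}"
    f"/providers/{provider_id}"
)

# Bucket-level grant for the federated Slim.io service account:
# object read access is all a scanner needs.
required_role = "roles/storage.objectViewer"
print(provider)
```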
Azure — Service Principal
You create a Service Principal (app registration) in Microsoft Entra ID (formerly Azure Active Directory) and grant it the Storage Blob Data Reader role on the target storage account. Slim.io authenticates using the OAuth2 client credentials flow (client ID + client secret).
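The client credentials flow itself is a single token request against the Microsoft identity platform. This sketch only assembles the request (the tenant, client ID, and secret are placeholders); the .default scope is how a client-credentials app requests its granted roles on the Storage resource:

```python
# Hypothetical app registration values from your tenant.
tenant_id = "00000000-0000-0000-0000-000000000000"
client_id = "11111111-1111-1111-1111-111111111111"
client_secret = "<secret from the app registration>"

# Microsoft identity platform v2.0 token endpoint for this tenant.
token_url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token"

payload = {
    "grant_type": "client_credentials",
    "client_id": client_id,
    "client_secret": client_secret,
    # ".default" requests the app's assigned roles on the Storage resource,
    # i.e. the Storage Blob Data Reader assignment made above.
    "scope": "https://storage.azure.com/.default",
}
# POSTing `payload` (form-encoded) to `token_url` returns a short-lived
# bearer token usable against Blob Storage read APIs.
```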
Databases — Username / Password or IAM
Database connectors support standard username and password authentication. Where available, IAM-based authentication (AWS RDS IAM, GCP Cloud SQL IAM) is also supported. All connections are established with read-only permissions.
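One way to satisfy the read-only requirement is a dedicated scanner role. The statements below are a PostgreSQL sketch with hypothetical role, database, and schema names (not a Slim.io-mandated setup); the last grant covers tables created after the role exists:

```python
# Hypothetical PostgreSQL role for the connector; all names are illustrative.
scanner_role = "slimio_scanner"
schema = "public"

# Read-only grants: connect, see the schema, SELECT on current and future tables.
grants = [
    f"CREATE ROLE {scanner_role} LOGIN PASSWORD 'change-me';",
    f"GRANT CONNECT ON DATABASE mydb TO {scanner_role};",
    f"GRANT USAGE ON SCHEMA {schema} TO {scanner_role};",
    f"GRANT SELECT ON ALL TABLES IN SCHEMA {schema} TO {scanner_role};",
    f"ALTER DEFAULT PRIVILEGES IN SCHEMA {schema} "
    f"GRANT SELECT ON TABLES TO {scanner_role};",
]
for stmt in grants:
    print(stmt)
```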
SaaS — OAuth2 or Bot Tokens
SaaS connectors authenticate via OAuth2 flows or platform-specific bot tokens. Slim.io requests only the minimum scopes needed to read content — no write access is ever requested.
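The "read-only scopes" principle can be illustrated with a couple of real scope sets and a naive write-capability check. The exact scopes each connector requests are shown in the wizard; the lists and the heuristic below are illustrative, not the Slim.io implementation:

```python
# Illustrative read-only scope sets for two providers.
requested_scopes = {
    "slack": ["channels:read", "channels:history", "files:read", "users:read"],
    "google_drive": ["https://www.googleapis.com/auth/drive.readonly"],
}

# Naive markers of write capability in common OAuth scope naming schemes.
WRITE_MARKERS = ("write", "manage", "admin")

def is_read_only(scopes):
    """True when no scope name carries an obvious write/manage capability."""
    return all(not any(m in s for m in WRITE_MARKERS) for s in scopes)
```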
All authentication methods are designed to avoid storing long-lived credentials where possible. AWS uses role assumption with temporary tokens, GCP uses WIF with short-lived tokens, database passwords are encrypted at rest, and OAuth tokens are automatically refreshed.
Connector Lifecycle
- Create — Configure the connector with provider credentials and target scope
- Test — Validate that credentials work and Slim.io can access the target resources
- Active — Connector is ready for scanning
- Scan — Trigger manual or scheduled scans against the connector
- Disconnect — Temporarily disable without deleting configuration
- Delete — Permanently remove the connector and associated scan history
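The lifecycle above can be read as a small state machine. The state names mirror the stages in this list, but the specific transition rules are an assumption for illustration, not the documented Slim.io behavior:

```python
# Illustrative transition table; keys are current states, values are the
# states reachable from them. Rules are an assumption, not product spec.
TRANSITIONS = {
    "created": {"testing"},
    "testing": {"active", "created"},       # test passes -> active; fails -> reconfigure
    "active": {"scanning", "disconnected", "deleted"},
    "scanning": {"active"},                 # scan completes, connector stays ready
    "disconnected": {"active", "deleted"},  # reconnect keeps the configuration
}

def can_transition(state, target):
    """Check whether `target` is reachable from `state` in one step."""
    return target in TRANSITIONS.get(state, set())
```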
Connector Scoping
Each connector can be scoped to control which data is scanned:
- Bucket / Container Selection — Specify which buckets or containers to include (cloud storage)
- Prefix Filters — Limit scanning to specific key prefixes or folder paths (cloud storage)
- File Type Filters — Include or exclude file extensions (e.g., scan only .csv, .json, .parquet)
- Schema / Table Selection — Choose which schemas and tables to include (databases)
- Channel / Site Selection — Choose which channels, sites, or drives to include (SaaS)
- Size Limits — Skip files above a configurable size threshold
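Combined, the cloud-storage scoping options amount to a per-object predicate. The field names and values below are hypothetical (not the Slim.io API); the sketch just shows how the filters compose:

```python
# Illustrative scope configuration; field names are hypothetical.
scope = {
    "buckets": {"prod-data"},                    # bucket/container selection
    "prefixes": ("exports/", "logs/"),           # prefix filters
    "extensions": {".csv", ".json", ".parquet"}, # file-type filters
    "max_bytes": 512 * 1024 * 1024,              # size limit: skip files > 512 MiB
}

def in_scope(bucket, key, size):
    """An object is scanned only if it passes every configured filter."""
    return (
        bucket in scope["buckets"]
        and key.startswith(scope["prefixes"])
        and any(key.endswith(ext) for ext in scope["extensions"])
        and size <= scope["max_bytes"]
    )
```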
Provider-Specific Guides
- AWS S3 — IAM role setup, permission policies, and region configuration
- Google Cloud Storage — WIF configuration and service account setup
- Azure Blob Storage — Service Principal creation and storage account access
- Databases — PostgreSQL, MySQL, Snowflake, SQL Server, Oracle, DB2, and Databricks setup
- SaaS & Collaboration — Slack, Teams, OneDrive, SharePoint, Google Drive, and Salesforce setup
- Cloud DLP Integration — Layer provider-native DLP services alongside Slim.io detection