☁️ S3 Gateway
Secure and autonomous file upload system with dedicated S3 buckets and automatic credential management
The OverSOC S3 Gateway enables users to securely and autonomously upload files to dedicated S3 buckets. This system uses Auth0 authentication and automatically generates temporary AWS credentials.
Overview
The S3 Gateway provides:
- Secure authentication via Auth0 with multi-organization support
- Automatic generation of secure, temporary AWS credentials
- Intuitive web interface for access management
- Complete isolation between organizations
- AWS CLI compatibility for easy integration
Accessing the Management Interface
Access URL
The S3 credentials management interface is available at https://s3auth.oversoc.com.
Login
- Go to https://s3auth.oversoc.com
- Click on "Sign in with Auth0"
- Use your regular OverSOC credentials
- Once logged in, you'll see the credentials management dashboard
Generating Credentials
Steps to Generate Your Credentials
- Log in to the web interface
- In the "Your AWS Credentials" section, click "Generate / Retrieve Credentials"
- The system automatically generates:
- Access Key ID: Unique identifier for your organization
- Secret Access Key: Secret key (keep it confidential)
Credential Security
- The Secret Access Key is only visible at generation time
- Copy and save your credentials immediately
- Never share your credentials with third parties
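Rather than pasting the Secret Access Key into scripts, you can read it from the environment. A minimal sketch; the `OVS_*` variable names below are illustrative, not part of the gateway:

```python
import os

def load_credentials(env=None):
    """Read S3 Gateway credentials from environment variables.

    The variable names are illustrative; use any names that suit your
    deployment, as long as the secret never lands in source control.
    """
    env = os.environ if env is None else env
    key = env.get("OVS_S3_ACCESS_KEY_ID")
    secret = env.get("OVS_S3_SECRET_ACCESS_KEY")
    if not key or not secret:
        raise RuntimeError("S3 Gateway credentials are not set in the environment")
    return key, secret
```

The same pair can then be passed to `aws configure set` or to a boto3 session, as shown in the sections below.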
AWS CLI Configuration
Automatic Installation via Interface
The interface automatically generates the AWS CLI configuration for your organization:
# Automatically generated configuration
aws configure set aws_access_key_id YOUR_ACCESS_KEY --profile ovs_your_org
aws configure set aws_secret_access_key YOUR_SECRET_KEY --profile ovs_your_org
aws configure set region fr-par --profile ovs_your_org
aws configure set endpoint_url https://s3.oversoc.com --profile ovs_your_org
Using the "Copy" Button
- Generate your credentials
- Scroll to the "AWS CLI Configuration" section
- Click "Copy" to copy the commands
- Paste and execute the commands in your terminal
Using S3 Buckets
Bucket Structure
Each organization has dedicated S3 buckets for each connector. Buckets are named directly after the connector type:
- azure: for Azure data
- sentinelone: for SentinelOne data
- vmware: for VMware data
- And other connectors, depending on your needs
List Your Available Buckets
# List all available buckets for your organization
aws s3 ls --profile ovs_your_org
Uploading Files
Via AWS CLI
# List contents of a specific bucket
aws s3 ls s3://azure/ --profile ovs_your_org
# Upload a file to the Azure bucket
aws s3 cp file.json s3://azure/file.json --profile ovs_your_org
# Upload a file to the SentinelOne bucket
aws s3 cp data.csv s3://sentinelone/data.csv --profile ovs_your_org
# Download a file from a bucket
aws s3 cp s3://azure/file.json ./file.json --profile ovs_your_org
Via Python SDK (boto3)
import boto3
# Configure S3 client
session = boto3.Session(
    aws_access_key_id='YOUR_ACCESS_KEY',
    aws_secret_access_key='YOUR_SECRET_KEY',
    region_name='fr-par'
)
s3_client = session.client(
    's3',
    endpoint_url='https://s3.oversoc.com'
)
# List all available buckets
response = s3_client.list_buckets()
for bucket in response['Buckets']:
    print(f"Bucket: {bucket['Name']}")
# Upload a file to the Azure bucket
s3_client.upload_file(
    'local_file.json',
    'azure',
    'remote_file.json'
)
# List objects in a specific bucket
response = s3_client.list_objects_v2(Bucket='sentinelone')
for obj in response.get('Contents', []):
    print(f"File: {obj['Key']}, Size: {obj['Size']} bytes")
File Organization
Buckets are organized by connector. Each bucket contains the connector's files directly:
azure/ # Bucket for Azure
├── config.json
├── users.csv
└── logs.txt
sentinelone/ # Bucket for SentinelOne
├── agents.json
├── threats.csv
└── policies.xml
vmware/ # Bucket for VMware
├── vms.json
└── hosts.csv
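Since each bucket is named directly after its connector, routing a file to the right bucket reduces to validating the connector name. A hypothetical helper; the connector set shown is just the three from the example above:

```python
# Connector types shown in this guide; your organization may have others.
KNOWN_CONNECTORS = {"azure", "sentinelone", "vmware"}

def bucket_for(connector: str) -> str:
    """Return the bucket name for a connector (buckets share the connector's name)."""
    name = connector.strip().lower()
    if name not in KNOWN_CONNECTORS:
        raise ValueError(f"No bucket configured for connector {connector!r}")
    return name
```

The returned name can be used directly in `aws s3 cp file.json s3://<bucket>/file.json` or as the `Bucket` argument in boto3 calls.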
Multi-Organization Management
Switching Organizations
If you have access to multiple organizations:
- Click the organization selector in the top right
- Select the desired organization
- The interface reloads with the new organization's credentials
Credentials per Organization
- Each organization has its own credentials
- Buckets are completely isolated between organizations
- You must generate credentials for each organization
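Because every organization has its own credentials and profile, a small lookup keeps multi-organization scripts tidy. A sketch assuming the `ovs_<org>` profile naming convention used in the configuration commands above:

```python
def profile_for(org: str) -> str:
    """Map an organization name to its AWS CLI profile (ovs_<org> convention).

    The slug transformation here (lowercase, spaces to underscores) is an
    assumption; match whatever profile names you created with `aws configure`.
    """
    slug = org.strip().lower().replace(" ", "_")
    return f"ovs_{slug}"
```

The result can be passed to `--profile` on the CLI or to `boto3.Session(profile_name=...)`.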
Credentials Management
Credentials History
The interface displays the complete history of your credentials:
- Status: Active or Revoked
- Creation date
- Last used
- Available actions: View config, Revoke, Delete
Credential Revocation
For security reasons, you can revoke credentials:
- In the "Credentials History" section
- Click "Revoke" next to the desired credential
- Confirm the action
Revoked credentials can no longer be used but remain visible in the history.
Typical Use Cases
1. Upload Azure Data
# Upload an Azure configuration file
aws s3 cp azure-config.json s3://azure/config.json --profile ovs_your_org
# Upload Azure user data
aws s3 cp users.csv s3://azure/users.csv --profile ovs_your_org
2. Upload SentinelOne Data
# Upload SentinelOne agent logs
aws s3 cp agents.json s3://sentinelone/agents.json --profile ovs_your_org
# Sync threat files
aws s3 sync ./threats/ s3://sentinelone/ --profile ovs_your_org
3. Upload VMware Data
# Upload VMware inventory
aws s3 cp inventory.xml s3://vmware/inventory.xml --profile ovs_your_org
# Upload performance metrics
aws s3 cp performance.csv s3://vmware/performance.csv --profile ovs_your_org
Troubleshooting
Common Errors
"Access Denied"
- Verify that your credentials are active (not revoked)
- Confirm you're using the correct bucket name (azure, sentinelone, vmware, etc.)
- Ensure the endpoint URL is correct: https://s3.oversoc.com
"Invalid credentials"
- Regenerate your credentials from the web interface
- Verify you correctly copied the Access Key ID and Secret Access Key
- Check your AWS CLI profile configuration
Expired Credentials
- Credentials have a limited lifespan
- Generate new credentials from the interface
- Update your AWS CLI configuration
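To avoid a long upload failing partway through, a script can check credential age against the documented 24-hour lifespan before starting. A sketch; `created_at` is the creation date shown in the credentials history:

```python
from datetime import datetime, timedelta, timezone

CREDENTIAL_LIFESPAN = timedelta(hours=24)  # documented credential lifespan

def likely_expired(created_at, now=None):
    """Return True if credentials created at `created_at` have passed their lifespan."""
    now = now or datetime.now(timezone.utc)
    return now - created_at >= CREDENTIAL_LIFESPAN
```

This is a client-side heuristic only; the gateway remains the authority on whether a credential is still valid.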
Support
For assistance:
- Email: support@oversoc.com
- Debug interface: Enable debug logs in your browser console
- Service status: Check https://s3auth.oversoc.com/health
Limits and Quotas
- Maximum file size: 5 GB
- Number of active credentials: 10 per organization
- Credential lifespan: 24 hours (renewable)
- Maximum throughput: According to standard AWS S3 limits
Best Practices
- Revoke unused credentials
- Organize your files with a clear structure
- Use explicit file names
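One way to get explicit names and a clear structure is to build object keys from a source and a date. The `<source>/<YYYY-MM-DD>/<file>` layout below is one possible convention, not a gateway requirement:

```python
from datetime import datetime, timezone

def object_key(source: str, filename: str, when=None) -> str:
    """Build a dated object key such as "exports/2024-05-01/users.csv"."""
    when = when or datetime.now(timezone.utc)
    return f"{source}/{when:%Y-%m-%d}/{filename}"
```

For example, `aws s3 cp users.csv s3://azure/$(date +%F)/users.csv` achieves the same layout from the shell.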