Object Storage Security

Comprehensive security features to protect your data in DanubeData Object Storage.

Overview

DanubeData Object Storage provides multiple layers of security to ensure your data is protected at rest and in transit. This guide covers encryption, access control, and security best practices.

Encryption

Encryption at Rest

All data stored in DanubeData Object Storage is automatically encrypted using AES-256 encryption.

  • Server-Side Encryption (SSE-S3): Enabled by default
  • No configuration required: Automatic for all objects
  • Minimal performance impact: Hardware-accelerated encryption

Encryption in Transit

All connections use TLS 1.3 for maximum security:

  • HTTPS only: HTTP connections are not accepted
  • Strong cipher suites: Modern TLS configuration
  • Certificate validation: Automatic certificate management

Access Control

Access Keys

Access keys are the primary method for authenticating to your buckets via the S3 API.

Creating Access Keys

Via Dashboard:

  1. Navigate to your bucket
  2. Click Access Keys tab
  3. Click Create Access Key
  4. Configure name and permissions
  5. Copy the secret key immediately (shown only once)

Via API:

Bash
curl -X POST https://api.danubedata.ro/v1/storage/buckets/{bucket_id}/access-keys \
  -H "Authorization: Bearer YOUR_API_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
    "name": "my-app-key",
    "permissions": ["read", "write"]
  }'

Permission Levels

| Permission | Description | Operations |
|------------|-------------|------------|
| read | Read-only access | GetObject, ListObjects, HeadObject |
| write | Create and update objects | PutObject, CopyObject |
| delete | Remove objects | DeleteObject, DeleteObjects |
| admin | Full bucket control | All operations, including policy changes |
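The mapping above can be sketched as a small permission check. This is an illustrative sketch, not part of the DanubeData API; the operation sets mirror the table:

```python
# Illustrative mapping of permission levels to the S3 operations
# they allow, following the table above.
PERMISSION_OPERATIONS = {
    "read": {"GetObject", "ListObjects", "HeadObject"},
    "write": {"PutObject", "CopyObject"},
    "delete": {"DeleteObject", "DeleteObjects"},
}

def is_allowed(key_permissions, operation):
    """Return True if any granted permission level covers the operation."""
    if "admin" in key_permissions:
        return True  # admin covers all operations
    return any(operation in PERMISSION_OPERATIONS.get(p, set())
               for p in key_permissions)

print(is_allowed(["read", "write"], "PutObject"))   # True
print(is_allowed(["read"], "DeleteObject"))         # False
```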

Best Practices for Access Keys

  1. Use least privilege: Only grant permissions that are needed
  2. Rotate regularly: Create new keys and retire old ones periodically
  3. Set expiration dates: Use expiring keys for temporary access
  4. Monitor usage: Check last-used timestamps to identify unused keys
  5. Never commit to code: Use environment variables or secrets management
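For point 5, a minimal sketch of loading credentials from environment variables instead of hard-coding them. The variable names `DANUBEDATA_ACCESS_KEY` and `DANUBEDATA_SECRET_KEY` are an illustrative convention, not a requirement:

```python
import os

def s3_client_kwargs():
    """Build client settings from the environment instead of
    hard-coding credentials in source. Env var names are illustrative."""
    return {
        "endpoint_url": "https://s3.danubedata.ro",
        "aws_access_key_id": os.environ["DANUBEDATA_ACCESS_KEY"],
        "aws_secret_access_key": os.environ["DANUBEDATA_SECRET_KEY"],
    }

# Usage with boto3:
# import boto3
# s3 = boto3.client("s3", **s3_client_kwargs())
```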

Public Access Control

By default, all buckets are private. You can enable public read access for specific use cases.

Enabling Public Access

Via Dashboard:

  1. Navigate to your bucket
  2. Click Settings
  3. Toggle Public Access to enabled
  4. Confirm the security warning

Via API:

Bash
curl -X PATCH https://api.danubedata.ro/v1/storage/buckets/{bucket_id} \
  -H "Authorization: Bearer YOUR_API_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
    "public_access": true
  }'

Public Access Warning

When public access is enabled:

  • Anyone can read objects in your bucket
  • Objects are accessible via direct URL
  • Egress traffic will count against your quota
  • Use only for static websites and public assets
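Once public access is enabled, objects are reachable by direct URL. A small sketch of building that URL, assuming path-style addressing on the `s3.danubedata.ro` endpoint used elsewhere in this guide (check your bucket's dashboard for the exact format):

```python
from urllib.parse import quote

S3_ENDPOINT = "https://s3.danubedata.ro"

def public_object_url(bucket, key):
    """Build the direct URL for an object in a public bucket.
    Path-style addressing is assumed here."""
    return f"{S3_ENDPOINT}/{bucket}/{quote(key)}"

print(public_object_url("my-bucket", "assets/logo.png"))
```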

Bucket Policies

For fine-grained access control, you can apply JSON bucket policies.

Example: Allow Read from Specific IP

JSON
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": "*",
      "Action": ["s3:GetObject"],
      "Resource": ["arn:aws:s3:::my-bucket/*"],
      "Condition": {
        "IpAddress": {
          "aws:SourceIp": "192.168.1.0/24"
        }
      }
    }
  ]
}

Example: Deny Delete Operations

JSON
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Deny",
      "Principal": "*",
      "Action": ["s3:DeleteObject", "s3:DeleteBucket"],
      "Resource": [
        "arn:aws:s3:::my-bucket",
        "arn:aws:s3:::my-bucket/*"
      ]
    }
  ]
}

CORS Configuration

Configure Cross-Origin Resource Sharing (CORS) to allow web applications to access your bucket.

Why CORS?

Browsers block cross-origin requests by default. If your web application needs to upload or download files directly from Object Storage, you must configure CORS.

Configuring CORS

Via Dashboard:

  1. Navigate to your bucket
  2. Click Settings → CORS
  3. Add CORS rules
  4. Save changes

Via API:

Bash
curl -X PUT https://api.danubedata.ro/v1/storage/buckets/{bucket_id}/cors \
  -H "Authorization: Bearer YOUR_API_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
    "cors_rules": [
      {
        "allowed_origins": ["https://myapp.com", "https://*.myapp.com"],
        "allowed_methods": ["GET", "PUT", "POST", "DELETE"],
        "allowed_headers": ["*"],
        "expose_headers": ["ETag", "x-amz-meta-*"],
        "max_age_seconds": 3600
      }
    ]
  }'

CORS Rule Options

| Field | Description | Example |
|-------|-------------|---------|
| allowed_origins | Domains allowed to make requests | ["https://myapp.com"] |
| allowed_methods | HTTP methods allowed | ["GET", "PUT"] |
| allowed_headers | Request headers allowed | ["Content-Type", "Authorization"] |
| expose_headers | Response headers exposed to the browser | ["ETag"] |
| max_age_seconds | How long the browser caches the preflight response | 3600 |
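A sketch of building the `cors_rules` payload from the curl example above, with some illustrative client-side validation (the field names follow that example; the allowed-method set shown here is an assumption):

```python
# Methods accepted here are an illustrative assumption.
ALLOWED_METHODS = {"GET", "PUT", "POST", "DELETE", "HEAD"}

def make_cors_rule(origins, methods, headers=("*",),
                   expose=("ETag",), max_age=3600):
    """Build one CORS rule dict matching the API payload above."""
    bad = set(methods) - ALLOWED_METHODS
    if bad:
        raise ValueError(f"unsupported methods: {sorted(bad)}")
    return {
        "allowed_origins": list(origins),
        "allowed_methods": list(methods),
        "allowed_headers": list(headers),
        "expose_headers": list(expose),
        "max_age_seconds": max_age,
    }

payload = {"cors_rules": [make_cors_rule(["https://myapp.com"], ["GET", "PUT"])]}
```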

CORS Best Practices

  1. Be specific with origins: Avoid using * in production
  2. Limit methods: Only allow methods your app needs
  3. Set appropriate max_age: Balance security and performance
  4. Test thoroughly: Use browser dev tools to verify CORS

Presigned URLs

Generate temporary, secure URLs for sharing objects without exposing credentials.

Download URL

Python
import boto3

s3 = boto3.client(
    's3',
    endpoint_url='https://s3.danubedata.ro',
    aws_access_key_id='YOUR_ACCESS_KEY',
    aws_secret_access_key='YOUR_SECRET_KEY'
)

# Valid for 1 hour
url = s3.generate_presigned_url(
    'get_object',
    Params={'Bucket': 'my-bucket', 'Key': 'secret-file.pdf'},
    ExpiresIn=3600
)

Upload URL

Python
# Generate upload URL
upload_url = s3.generate_presigned_url(
    'put_object',
    Params={
        'Bucket': 'my-bucket',
        'Key': 'uploads/user-file.txt',
        'ContentType': 'text/plain'
    },
    ExpiresIn=3600
)

# Client can upload using:
# curl -X PUT -H "Content-Type: text/plain" --data-binary @file.txt "$upload_url"

Presigned URL Security

  • Time-limited: URLs expire after the specified duration
  • Operation-specific: Each URL is valid for one operation
  • Not individually revocable: A URL stays valid until expiry unless the access key that signed it is deleted
  • Audit trail: Track which key generated the URL
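Because presigned URLs are time-limited, expiry can be read off the URL itself: SigV4 query-string authentication carries `X-Amz-Date` and `X-Amz-Expires` parameters. A sketch (the example URL is illustrative):

```python
from datetime import datetime, timedelta, timezone
from urllib.parse import urlparse, parse_qs

def presigned_url_expiry(url):
    """Return the expiry time of a SigV4 presigned URL, derived from
    its X-Amz-Date and X-Amz-Expires query parameters."""
    qs = parse_qs(urlparse(url).query)
    signed_at = datetime.strptime(qs["X-Amz-Date"][0], "%Y%m%dT%H%M%SZ")
    signed_at = signed_at.replace(tzinfo=timezone.utc)
    return signed_at + timedelta(seconds=int(qs["X-Amz-Expires"][0]))

url = ("https://s3.danubedata.ro/my-bucket/file.pdf"
       "?X-Amz-Date=20250101T120000Z&X-Amz-Expires=3600&X-Amz-Signature=abc")
print(presigned_url_expiry(url))  # 2025-01-01 13:00:00+00:00
```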

Security Best Practices

1. Principle of Least Privilege

Create separate access keys for different applications with only the permissions they need:

Bash
# Read-only key for analytics
curl -X POST .../access-keys -d '{"name": "analytics", "permissions": ["read"]}'

# Write-only key for uploads
curl -X POST .../access-keys -d '{"name": "uploader", "permissions": ["write"]}'

# Full access for backups
curl -X POST .../access-keys -d '{"name": "backup", "permissions": ["read", "write", "delete"]}'

2. Enable Versioning for Critical Data

Protect against accidental deletion or overwrites:

Bash
curl -X PATCH https://api.danubedata.ro/v1/storage/buckets/{bucket_id} \
  -H "Authorization: Bearer YOUR_API_TOKEN" \
  -d '{"versioning_enabled": true}'

3. Implement Lifecycle Rules

Automatically delete temporary files and old versions:

JSON
{
  "rules": [
    {
      "id": "delete-temp-files",
      "prefix": "temp/",
      "expiration_days": 7
    },
    {
      "id": "delete-old-versions",
      "noncurrent_version_expiration_days": 30
    }
  ]
}
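Before applying a rule, it can help to preview what it would delete. A local sketch of the expiration logic, using the `prefix` and `expiration_days` fields from the JSON above (how the service evaluates rules may differ in detail):

```python
from datetime import datetime, timedelta, timezone

def expiring_keys(objects, rule, now=None):
    """Return keys that the given lifecycle rule would delete.
    `objects` maps key -> last-modified datetime; rule fields follow
    the JSON example above ("prefix", "expiration_days")."""
    now = now or datetime.now(timezone.utc)
    cutoff = now - timedelta(days=rule["expiration_days"])
    prefix = rule.get("prefix", "")
    return [k for k, modified in objects.items()
            if k.startswith(prefix) and modified < cutoff]

now = datetime(2025, 6, 1, tzinfo=timezone.utc)
objects = {
    "temp/a.log": datetime(2025, 5, 1, tzinfo=timezone.utc),   # 31 days old
    "temp/b.log": datetime(2025, 5, 30, tzinfo=timezone.utc),  # 2 days old
    "data/c.csv": datetime(2025, 4, 1, tzinfo=timezone.utc),   # wrong prefix
}
rule = {"id": "delete-temp-files", "prefix": "temp/", "expiration_days": 7}
print(expiring_keys(objects, rule, now))  # ['temp/a.log']
```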

4. Monitor Access Key Usage

Regularly review access keys:

  1. Check last-used timestamps
  2. Delete unused keys
  3. Rotate keys periodically
  4. Set expiration dates for temporary access
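The review steps above can be automated. A sketch that flags stale keys from a key listing; the `last_used_at` field shape is a hypothetical stand-in for whatever the access-key listing actually returns:

```python
from datetime import datetime, timedelta, timezone

def stale_keys(keys, max_idle_days=90, now=None):
    """Flag keys unused for longer than max_idle_days.
    `last_used_at` is a hypothetical field name; check the real
    access-key listing response for the actual shape."""
    now = now or datetime.now(timezone.utc)
    cutoff = now - timedelta(days=max_idle_days)
    return [k["name"] for k in keys
            if k["last_used_at"] is None or k["last_used_at"] < cutoff]

now = datetime(2025, 6, 1, tzinfo=timezone.utc)
keys = [
    {"name": "analytics", "last_used_at": datetime(2025, 5, 20, tzinfo=timezone.utc)},
    {"name": "old-ci", "last_used_at": datetime(2024, 1, 1, tzinfo=timezone.utc)},
    {"name": "never-used", "last_used_at": None},
]
print(stale_keys(keys, now=now))  # ['old-ci', 'never-used']
```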

5. Use Separate Buckets per Sensitivity Level

Organize data by sensitivity level:

  • production-public - Public assets, CDN content
  • production-private - Application data, user uploads
  • production-sensitive - PII, financial data (most restricted)

6. Audit and Logging

Enable access logging to track all bucket operations:

  • Who accessed what objects
  • When operations occurred
  • Success/failure status
  • Source IP addresses

Compliance

GDPR Compliance

DanubeData Object Storage is fully GDPR compliant:

  • Data residency: All data stored in Germany (EU)
  • Encryption: AES-256 at rest, TLS 1.3 in transit
  • Access control: Fine-grained permission management
  • Data deletion: Objects can be permanently deleted
  • Audit trail: Complete access logging available

Data Retention

Use lifecycle rules to implement data retention policies:

JSON
{
  "rules": [
    {
      "id": "gdpr-retention",
      "prefix": "user-data/",
      "expiration_days": 365,
      "noncurrent_version_expiration_days": 90
    }
  ]
}

Troubleshooting

"Access Denied" Errors

  1. Verify credentials: Check access key and secret
  2. Check permissions: Ensure key has required permissions
  3. Bucket ownership: Verify key belongs to bucket's team
  4. Bucket policy: Check for deny rules blocking access

CORS Errors in Browser

  1. Check origin: Ensure your domain is in allowed_origins
  2. Check method: Verify HTTP method is allowed
  3. Check headers: Ensure required headers are allowed
  4. Browser cache: Clear preflight cache and retry

Presigned URL Not Working

  1. Check expiry: URL may have expired
  2. Clock sync: Ensure server clock is accurate
  3. URL encoding: Don't manually modify the URL
  4. Key revocation: Original key may have been deleted


Questions? Contact support at support@danubedata.ro