Neptune provides S3-compatible storage buckets for storing files, images, and other assets. Under the hood, these are AWS S3 buckets that your application can access using standard AWS SDKs.

Quick Start

Ask your AI assistant:
“Add an S3 bucket called ‘uploads’ to my project”
Or add it directly to your neptune.json:
neptune.json
{
  "kind": "Service",
  "name": "my-app",
  "resources": [
    {
      "kind": "StorageBucket",
      "name": "uploads"
    }
  ]
}
Then ask your AI assistant to provision and deploy:
“Provision my resources and deploy”

How It Works

When you provision a storage bucket, Neptune:
  1. Creates an S3 bucket with a physical name (e.g., neptune-abc123-uploads) for global uniqueness
  2. Configures all necessary IAM permissions so your running service can access it
  3. Returns the physical bucket name to your AI assistant
The physical bucket name is different from the logical name you define in neptune.json. For example, if you name your bucket uploads, the actual AWS bucket might be called neptune-abc123-uploads.
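As an illustrative sketch only (the real naming scheme is internal to Neptune, and `neptune-abc123` is a hypothetical project prefix), the logical-to-physical mapping looks like:

```python
# Illustrative only: Neptune's actual naming scheme is an internal detail.
# Assumes a per-project prefix (here "neptune-abc123") that makes the
# physical name globally unique across all AWS accounts.
def physical_bucket_name(project_prefix: str, logical_name: str) -> str:
    """Combine the project prefix with the logical name from neptune.json."""
    return f"{project_prefix}-{logical_name}"

print(physical_bucket_name("neptune-abc123", "uploads"))  # neptune-abc123-uploads
```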

Getting the Bucket Name

After provisioning, ask your AI assistant for the bucket name:
“What’s the bucket name for my uploads bucket?”
Or:
“Show me the deployment status”
Your AI assistant will call get_deployment_status, which returns the resource details, including the physical aws_id:
{
  "kind": "StorageBucket",
  "name": "uploads",
  "status": "Available",
  "aws_id": "neptune-abc123-uploads"  // This is your physical bucket name
}
Always use the physical bucket name (aws_id) in your code, not the logical name from your neptune.json.
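If you handle the get_deployment_status response yourself, extracting the physical name is a small lookup. The payload shape below is an assumption that mirrors the resource object shown above, wrapped in a `resources` list; a real response may carry additional fields.

```python
import json

# Hypothetical payload in the shape of the example above; a real
# get_deployment_status response may include more fields and resources.
status_json = '''
{
  "resources": [
    {"kind": "StorageBucket", "name": "uploads",
     "status": "Available", "aws_id": "neptune-abc123-uploads"}
  ]
}
'''

def bucket_aws_id(payload: dict, logical_name: str) -> str:
    """Find the physical bucket name (aws_id) for a logical bucket name."""
    for res in payload.get("resources", []):
        if res.get("kind") == "StorageBucket" and res.get("name") == logical_name:
            return res["aws_id"]
    raise KeyError(f"No StorageBucket named {logical_name!r}")

print(bucket_aws_id(json.loads(status_json), "uploads"))  # neptune-abc123-uploads
```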

Using in Your Application

All permissions are pre-configured, so you can use boto3 (or other AWS SDKs) out of the box without any credential configuration.
import boto3

# Your AI assistant will provide this physical bucket name
BUCKET_NAME = "neptune-abc123-uploads"

s3 = boto3.client("s3")

# Upload a file
s3.put_object(
    Bucket=BUCKET_NAME,
    Key="images/photo.jpg",
    Body=file_data
)

# Download a file
response = s3.get_object(Bucket=BUCKET_NAME, Key="images/photo.jpg")
data = response['Body'].read()

# List files
response = s3.list_objects_v2(Bucket=BUCKET_NAME)
for obj in response.get('Contents', []):
    print(obj['Key'])
Your AI assistant knows the physical bucket name after provisioning and can write the code for you with the correct bucket name already filled in.
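Rather than hard-coding the physical name, one common pattern (not Neptune-specific; the BUCKET_NAME variable name here is a convention you would choose yourself) is to read it from the environment:

```python
import os

def resolve_bucket_name(env=os.environ) -> str:
    """Read the physical bucket name (aws_id) from the environment,
    failing loudly instead of silently targeting the wrong bucket."""
    name = env.get("BUCKET_NAME", "")
    if not name:
        raise RuntimeError("BUCKET_NAME is not set; use the physical name (aws_id)")
    return name
```

Set BUCKET_NAME in your deployment configuration to the aws_id returned after provisioning, then call `resolve_bucket_name()` at startup.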

Configuration Options

{
  "kind": "StorageBucket",
  "name": "uploads"
}
| Field | Type | Required | Description |
| --- | --- | --- | --- |
| kind | string | Yes | Must be "StorageBucket" |
| name | string | Yes | Logical name for this bucket (Neptune generates a unique physical name) |

Managing Files

You can ask your AI assistant to help manage bucket contents:
| Task | Example Request |
| --- | --- |
| List files | “List the files in my uploads bucket” |
| Get a file | “Show me the contents of config.json in my bucket” |
| Check status | “Is my storage bucket ready?” |

Presigned URLs

For user uploads or temporary access, generate presigned URLs:
import boto3
from botocore.config import Config

BUCKET_NAME = "neptune-abc123-uploads"

s3 = boto3.client('s3', config=Config(signature_version='s3v4'))

# Presigned download URL (1 hour)
download_url = s3.generate_presigned_url(
    'get_object',
    Params={'Bucket': BUCKET_NAME, 'Key': 'images/photo.jpg'},
    ExpiresIn=3600
)

# Presigned upload URL (5 minutes)
upload_url = s3.generate_presigned_url(
    'put_object',
    Params={'Bucket': BUCKET_NAME, 'Key': f'uploads/{user_id}/{filename}'},
    ExpiresIn=300
)
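Presigned URLs carry their signature and lifetime in the query string; for SigV4 signing, the standard X-Amz-Expires parameter holds the lifetime in seconds. You can inspect a URL without calling AWS (the URL below is a hypothetical example in the shape boto3 produces):

```python
from urllib.parse import urlparse, parse_qs

# Hypothetical SigV4 presigned URL; real ones include more parameters.
url = ("https://neptune-abc123-uploads.s3.amazonaws.com/images/photo.jpg"
       "?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Expires=3600"
       "&X-Amz-Signature=deadbeef")

# Parse the query string and read the lifetime in seconds.
params = parse_qs(urlparse(url).query)
lifetime = int(params["X-Amz-Expires"][0])
print(lifetime)  # 3600
```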

Best Practices

Use prefixes (folders) to organize your files:
your-bucket/
├── images/
│   ├── avatars/
│   └── posts/
├── documents/
└── temp/
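S3 has no real folders; a prefix is simply part of the flat object key. A small helper (the names here are illustrative) keeps keys consistent:

```python
import posixpath

def make_key(*parts: str) -> str:
    """Join key segments with '/', stripping stray slashes, since S3
    treats the whole key as one flat string."""
    return posixpath.join(*(p.strip("/") for p in parts))

print(make_key("images", "avatars", "alice.png"))  # images/avatars/alice.png
```

You can then list just one "folder" by passing the prefix to list_objects_v2, e.g. `s3.list_objects_v2(Bucket=BUCKET_NAME, Prefix="images/avatars/")`.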
Always set the ContentType when uploading to ensure proper handling:
s3.put_object(
    Bucket=BUCKET_NAME,
    Key='document.pdf',
    Body=buffer,
    ContentType='application/pdf'
)
For user uploads, generate presigned URLs to upload directly to S3 from the browser. This avoids passing large files through your server.