Python Quickstart
Learn how to manage your Rabata.io object storage from Python using the boto3 library.
Installation
To use Rabata.io with Python, you’ll need to install the boto3 library, which is the Amazon Web Services (AWS) SDK for Python.
Install boto3
$ pip install boto3
It’s recommended to use a virtual environment:
$ python -m venv venv
$ source venv/bin/activate # On Windows: venv\Scripts\activate
$ pip install boto3
Configuration
There are several ways to configure boto3 to work with Rabata.io.
Method 1: Using AWS Credentials File
If you’ve already configured the AWS CLI as shown in the AWS CLI Quickstart, boto3 will automatically use those credentials.
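With this method, boto3 reads credentials from the shared credentials file (typically ~/.aws/credentials). A minimal file looks like the following, with placeholder keys; note that the Rabata.io endpoint URL is still passed in code when creating the client, as shown in the other methods:

```ini
[default]
aws_access_key_id = YOUR_ACCESS_KEY
aws_secret_access_key = YOUR_SECRET_KEY
```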
Method 2: Explicit Configuration in Code
You can explicitly configure the S3 client in your code:
import boto3
# Create an S3 client with Rabata.io endpoint
s3_client = boto3.client(
    's3',
    endpoint_url='https://s3.eu-west-1.rabata.io',
    aws_access_key_id='YOUR_ACCESS_KEY',
    aws_secret_access_key='YOUR_SECRET_KEY',
    region_name='eu-west-1'
)

# Create an S3 resource with Rabata.io endpoint
s3_resource = boto3.resource(
    's3',
    endpoint_url='https://s3.eu-west-1.rabata.io',
    aws_access_key_id='YOUR_ACCESS_KEY',
    aws_secret_access_key='YOUR_SECRET_KEY',
    region_name='eu-west-1'
)
Method 3: Using Environment Variables
You can set environment variables to configure boto3:
# Set these environment variables before running your Python script
export AWS_ACCESS_KEY_ID=YOUR_ACCESS_KEY
export AWS_SECRET_ACCESS_KEY=YOUR_SECRET_KEY
export AWS_DEFAULT_REGION=eu-west-1
Then in your code:
import boto3
import os
# Create an S3 client with Rabata.io endpoint
s3_client = boto3.client(
    's3',
    endpoint_url='https://s3.eu-west-1.rabata.io'
)
Security Note: Never hardcode your credentials in your source code, especially if it’s stored in a version control system. Use environment variables, AWS credentials file, or a secure secrets management system.
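Following that advice, one pattern is a small helper that pulls credentials from the environment and fails fast with a clear message when they are missing. This is just a sketch; the variable names match the ones boto3 reads by default:

```python
import os

def credentials_from_env():
    """Read S3 credentials from environment variables, failing fast if absent."""
    creds = {}
    for env_var, kwarg in [
        ('AWS_ACCESS_KEY_ID', 'aws_access_key_id'),
        ('AWS_SECRET_ACCESS_KEY', 'aws_secret_access_key'),
    ]:
        value = os.environ.get(env_var)
        if not value:
            raise RuntimeError(f"Missing required environment variable: {env_var}")
        creds[kwarg] = value
    return creds

# The resulting dict can be unpacked into the client constructor:
# boto3.client('s3', endpoint_url='https://s3.eu-west-1.rabata.io', **credentials_from_env())
```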
Basic Operations
Here are some common operations you can perform with boto3 and Rabata.io.
Bucket Operations
List All Buckets
import boto3
s3_client = boto3.client(
    's3',
    endpoint_url='https://s3.eu-west-1.rabata.io',
    aws_access_key_id='YOUR_ACCESS_KEY',
    aws_secret_access_key='YOUR_SECRET_KEY'
)

response = s3_client.list_buckets()
for bucket in response['Buckets']:
    print(f"Bucket Name: {bucket['Name']}")
Create a Bucket
s3_client.create_bucket(Bucket='my-bucket-name')
Delete a Bucket
s3_client.delete_bucket(Bucket='my-bucket-name')
Note: The bucket must be empty before it can be deleted.
Object Operations
List Objects in a Bucket
response = s3_client.list_objects_v2(Bucket='my-bucket-name')
if 'Contents' in response:
    for obj in response['Contents']:
        print(f"Object Key: {obj['Key']}, Size: {obj['Size']} bytes")
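Note that a single `list_objects_v2` call returns at most 1,000 keys. For larger buckets, boto3's built-in paginator follows the continuation tokens for you; a small wrapper (a sketch, with `iter_objects` being a name chosen here) might look like:

```python
def iter_objects(s3_client, bucket):
    """Yield every object in a bucket, transparently following pagination."""
    # The paginator issues repeated list_objects_v2 calls under the hood,
    # passing the continuation token between pages automatically
    paginator = s3_client.get_paginator('list_objects_v2')
    for page in paginator.paginate(Bucket=bucket):
        for obj in page.get('Contents', []):
            yield obj
```

Usage: `for obj in iter_objects(s3_client, 'my-bucket-name'): print(obj['Key'])`.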
Upload a File
# Method 1: Using upload_file
s3_client.upload_file(
    'local-file.txt',
    'my-bucket-name',
    'remote-file.txt'
)

# Method 2: Using put_object
with open('local-file.txt', 'rb') as file:
    s3_client.put_object(
        Bucket='my-bucket-name',
        Key='remote-file.txt',
        Body=file.read()
    )
Download a File
# Method 1: Using download_file
s3_client.download_file(
    'my-bucket-name',
    'remote-file.txt',
    'local-file.txt'
)

# Method 2: Using get_object
response = s3_client.get_object(
    Bucket='my-bucket-name',
    Key='remote-file.txt'
)
content = response['Body'].read()
with open('local-file.txt', 'wb') as file:
    file.write(content)
Delete a File
s3_client.delete_object(
    Bucket='my-bucket-name',
    Key='file-to-delete.txt'
)
Delete Multiple Files
s3_client.delete_objects(
    Bucket='my-bucket-name',
    Delete={
        'Objects': [
            {'Key': 'file1.txt'},
            {'Key': 'file2.txt'},
            {'Key': 'file3.txt'}
        ]
    }
)
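`delete_objects` accepts at most 1,000 keys per request, so deleting more than that requires batching. A sketch of batching in plain Python (`delete_keys_in_batches` is a name chosen here; the 1,000-key limit comes from the S3 API):

```python
def delete_keys_in_batches(s3_client, bucket, keys, batch_size=1000):
    """Delete an arbitrary number of keys, 1,000 at a time (the S3 API limit)."""
    for start in range(0, len(keys), batch_size):
        batch = keys[start:start + batch_size]
        s3_client.delete_objects(
            Bucket=bucket,
            Delete={'Objects': [{'Key': key} for key in batch]}
        )
```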
Advanced Operations
Here are some more advanced operations you can perform with boto3 and Rabata.io.
Working with Object Metadata
s3_client.put_object(
    Bucket='my-bucket-name',
    Key='file-with-metadata.txt',
    Body=b'Hello, World!',
    ContentType='text/plain',
    Metadata={
        'custom-key': 'custom-value'
    }
)
Note: Standard headers such as Content-Type are set through their own parameters (here, ContentType), not through Metadata. Entries in Metadata are stored as user-defined metadata under the x-amz-meta- prefix.
Multipart Uploads
For large files, you can use multipart uploads:
import os
import math

def upload_large_file(file_path, bucket, key, part_size=5*1024*1024):
    """Upload a large file using multipart upload."""
    # Initiate the multipart upload
    mpu = s3_client.create_multipart_upload(Bucket=bucket, Key=key)
    upload_id = mpu['UploadId']
    try:
        # Get the file size and the number of parts it implies
        file_size = os.path.getsize(file_path)
        part_count = math.ceil(file_size / part_size)
        # Collect part numbers and ETags for the completion call
        parts = []
        # Upload each part
        with open(file_path, 'rb') as file:
            for i in range(part_count):
                # Read this part's data
                file.seek(i * part_size)
                data = file.read(min(part_size, file_size - i * part_size))
                # Upload the part
                part = s3_client.upload_part(
                    Bucket=bucket,
                    Key=key,
                    UploadId=upload_id,
                    PartNumber=i + 1,
                    Body=data
                )
                # Record the part for the completion call
                parts.append({
                    'PartNumber': i + 1,
                    'ETag': part['ETag']
                })
        # Complete the multipart upload
        s3_client.complete_multipart_upload(
            Bucket=bucket,
            Key=key,
            UploadId=upload_id,
            MultipartUpload={'Parts': parts}
        )
        print(f"Successfully uploaded {file_path} to {bucket}/{key}")
    except Exception as e:
        # Abort the multipart upload so incomplete parts don't accrue storage
        s3_client.abort_multipart_upload(
            Bucket=bucket,
            Key=key,
            UploadId=upload_id
        )
        print(f"Error uploading {file_path}: {e}")
        raise

# Example usage
upload_large_file('large-file.iso', 'my-bucket-name', 'large-file.iso')
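The part-size arithmetic above is worth spelling out: the number of parts is a ceiling division of the file size by the part size, and the final part carries whatever remains (S3 requires every part except the last to be at least 5 MiB). A quick check in plain Python (`plan_parts` is a name chosen here):

```python
import math

def plan_parts(file_size, part_size=5 * 1024 * 1024):
    """Return (part_count, last_part_size) for a multipart upload."""
    part_count = math.ceil(file_size / part_size)
    # All parts are full-size except the last, which takes the remainder
    last_part_size = file_size - (part_count - 1) * part_size
    return part_count, last_part_size

# A 12 MiB file with 5 MiB parts needs three parts: 5 MiB, 5 MiB, 2 MiB
print(plan_parts(12 * 1024 * 1024))  # (3, 2097152)
```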
Using Presigned URLs
Generate a presigned URL to allow temporary access to an object:
presigned_url = s3_client.generate_presigned_url(
    'get_object',
    Params={
        'Bucket': 'my-bucket-name',
        'Key': 'private-file.txt'
    },
    ExpiresIn=3600  # URL expires in 1 hour
)
print(f"Presigned URL: {presigned_url}")
Using S3 Resource Instead of Client
The boto3 S3 resource provides a higher-level, object-oriented API:
import boto3

s3 = boto3.resource(
    's3',
    endpoint_url='https://s3.eu-west-1.rabata.io',
    aws_access_key_id='YOUR_ACCESS_KEY',
    aws_secret_access_key='YOUR_SECRET_KEY'
)

# List all buckets
for bucket in s3.buckets.all():
    print(bucket.name)

# Get a bucket
bucket = s3.Bucket('my-bucket-name')

# List all objects in a bucket
for obj in bucket.objects.all():
    print(obj.key)

# Upload a file
bucket.upload_file('local-file.txt', 'remote-file.txt')

# Download a file
bucket.download_file('remote-file.txt', 'local-file.txt')

# Delete an object
obj = s3.Object('my-bucket-name', 'file-to-delete.txt')
obj.delete()
Error Handling
It’s important to handle errors properly when working with S3:
import boto3
from botocore.exceptions import ClientError

s3_client = boto3.client(
    's3',
    endpoint_url='https://s3.eu-west-1.rabata.io',
    aws_access_key_id='YOUR_ACCESS_KEY',
    aws_secret_access_key='YOUR_SECRET_KEY'
)

try:
    response = s3_client.get_object(
        Bucket='my-bucket-name',
        Key='non-existent-file.txt'
    )
except ClientError as e:
    error_code = e.response['Error']['Code']
    if error_code == 'NoSuchKey':
        print("The object does not exist.")
    elif error_code == 'NoSuchBucket':
        print("The bucket does not exist.")
    else:
        print(f"An error occurred: {e}")
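For transient failures such as timeouts or throttling, per-error handling can be complemented by a retry wrapper with exponential backoff. The sketch below is a generic pattern, not a boto3 feature (boto3 also has built-in retry behavior configurable through botocore's Config):

```python
import time

def with_retries(operation, max_attempts=3, base_delay=0.5):
    """Call operation(), retrying with exponential backoff on any exception."""
    for attempt in range(1, max_attempts + 1):
        try:
            return operation()
        except Exception:
            if attempt == max_attempts:
                # Out of attempts: let the last error propagate
                raise
            # Back off: 0.5s, 1s, 2s, ... before the next attempt
            time.sleep(base_delay * 2 ** (attempt - 1))

# Example: retry a flaky download
# with_retries(lambda: s3_client.download_file('my-bucket-name', 'remote-file.txt', 'local-file.txt'))
```

In production code you would typically narrow the `except` clause to the specific transient errors you expect rather than retrying every exception.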