AWS S3 Advanced Operations
Master advanced S3 features for production workloads.
A Simple Analogy
S3's advanced features are like rules for a filing cabinet: automatically archive old files, keep multiple versions of each document, and organize everything with labels.
Why Advanced Features?
- Cost optimization: Automatically tier aging data into cheaper storage classes
- Compliance: Retain versions and audit trails
- Performance: Multipart uploads speed up large-file transfers
- Reliability: Versioning protects against accidental deletes and overwrites
- Organization: Tags and lifecycle rules keep buckets manageable
Versioning
# Enable versioning
aws s3api put-bucket-versioning \
  --bucket my-bucket \
  --versioning-configuration Status=Enabled

# List the versions of an object
aws s3api list-object-versions \
  --bucket my-bucket \
  --prefix myfile.txt

# Download a specific old version (re-upload it to make it the current version)
aws s3api get-object \
  --bucket my-bucket \
  --key myfile.txt \
  --version-id ABC123 \
  restored-file.txt
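For .NET services, the same operations are available through the AWS SDK for .NET (AWSSDK.S3). A minimal sketch, reusing the bucket, key, and placeholder version ID from the CLI examples above:

using System.Threading;
using Amazon.S3;
using Amazon.S3.Model;

var client = new AmazonS3Client();

// Turn versioning on for the bucket
await client.PutBucketVersioningAsync(new PutBucketVersioningRequest
{
    BucketName = "my-bucket",
    VersioningConfig = new S3BucketVersioningConfig { Status = VersionStatus.Enabled }
});

// Download a specific version of an object
var versionedObject = await client.GetObjectAsync(new GetObjectRequest
{
    BucketName = "my-bucket",
    Key = "myfile.txt",
    VersionId = "ABC123" // placeholder version ID, as in the CLI example
});
await versionedObject.WriteResponseStreamToFileAsync("restored-file.txt", false, CancellationToken.None);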
Lifecycle Policies
{
  "Rules": [
    {
      "ID": "archive-old-logs",
      "Filter": { "Prefix": "logs/" },
      "Status": "Enabled",
      "Transitions": [
        { "Days": 30, "StorageClass": "STANDARD_IA" },
        { "Days": 90, "StorageClass": "GLACIER" }
      ],
      "Expiration": { "Days": 365 }
    }
  ]
}
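With this rule, objects under logs/ move to Standard-IA 30 days after creation, to Glacier after 90, and are deleted after a year. To attach the policy, save the JSON to a file (the name lifecycle.json below is just an example) and apply it with the CLI:

aws s3api put-bucket-lifecycle-configuration \
  --bucket my-bucket \
  --lifecycle-configuration file://lifecycle.json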
Multipart Upload
using System.Collections.Generic;
using System.IO;
using Amazon.S3;
using Amazon.S3.Model;

var client = new AmazonS3Client();

// Start the multipart upload and keep the upload ID for every later request
var uploadRequest = new InitiateMultipartUploadRequest
{
    BucketName = "my-bucket",
    Key = "large-file.zip"
};
var uploadResponse = await client.InitiateMultipartUploadAsync(uploadRequest);
var uploadId = uploadResponse.UploadId;

var partETags = new List<PartETag>();
var partSize = 5 * 1024 * 1024; // 5 MB, the minimum size for every part except the last

// In production, wrap this in try/catch and call AbortMultipartUploadAsync on failure
// so abandoned parts don't keep accruing storage charges.
using (var fileStream = File.OpenRead("large-file.zip"))
{
    for (int partNumber = 1; fileStream.Position < fileStream.Length; partNumber++)
    {
        var uploadPartRequest = new UploadPartRequest
        {
            BucketName = "my-bucket",
            Key = "large-file.zip",
            UploadId = uploadId,
            PartNumber = partNumber,
            PartSize = partSize, // without this, the SDK sends the rest of the stream as a single part
            InputStream = fileStream
        };
        var uploadPartResponse = await client.UploadPartAsync(uploadPartRequest);

        // S3 needs each part's ETag and number to assemble the final object
        partETags.Add(new PartETag
        {
            ETag = uploadPartResponse.ETag,
            PartNumber = partNumber
        });
    }
}

// Complete the upload; S3 stitches the parts into a single object
var completeRequest = new CompleteMultipartUploadRequest
{
    BucketName = "my-bucket",
    Key = "large-file.zip",
    UploadId = uploadId,
    PartETags = partETags
};
await client.CompleteMultipartUploadAsync(completeRequest);
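For most workloads you don't need the low-level API at all: the SDK's high-level TransferUtility wraps the same initiate/upload/complete flow, switches to multipart automatically above a size threshold, and uploads parts in parallel. A minimal sketch:

using Amazon.S3;
using Amazon.S3.Transfer;

var transferUtility = new TransferUtility(new AmazonS3Client());

// Uses multipart upload under the hood for large files
await transferUtility.UploadAsync("large-file.zip", "my-bucket");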
Tagging and Querying
{
  "TagSet": [
    { "Key": "Environment", "Value": "Production" },
    { "Key": "Department", "Value": "Finance" },
    { "Key": "Retention", "Value": "7-years" }
  ]
}
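Tags are applied per object. One way to attach the tag set above is the CLI (tags.json is just an example filename); once in place, tags can drive lifecycle filters, cost allocation reports, and IAM policy conditions:

aws s3api put-object-tagging \
  --bucket my-bucket \
  --key myfile.txt \
  --tagging file://tags.json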
Best Practices
- Enable versioning: Protect against accidental deletes and overwrites
- Use lifecycle rules: Automatically archive or expire aging data
- Multipart uploads: Use them for files larger than about 100 MB
- Tagging: Organize objects and allocate costs
- Access logs: Audit who accesses what (see the sketch after this list)
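For the access-logs item, server access logging is configured per bucket. A minimal sketch with the CLI, assuming a separate my-log-bucket that has been granted permission to receive S3 log delivery:

aws s3api put-bucket-logging \
  --bucket my-bucket \
  --bucket-logging-status '{
    "LoggingEnabled": {
      "TargetBucket": "my-log-bucket",
      "TargetPrefix": "access-logs/"
    }
  }'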
Related Concepts
- S3 Transfer Acceleration
- CloudFront distribution
- Object Lock (compliance)
- S3 Select (query in place)
Summary
Use S3 versioning, lifecycle policies, multipart uploads, and tagging for robust, cost-effective storage at scale.