AWS S3 Advanced Operations

Master advanced S3 features for production workloads.

By Emem Isaac · July 26, 2021 · 2 min read
#aws-s3 #storage #versioning #lifecycle #performance

A Simple Analogy

S3's advanced features are like rules for a filing cabinet: auto-archive old files, keep multiple versions of each document, and organize everything with tags.


Why Advanced Features?

  • Cost optimization: Auto-tier old data
  • Compliance: Keep audit trails
  • Performance: Multipart uploads for large files
  • Reliability: Versioning prevents accidents
  • Organization: Tagging and lifecycles

Versioning

# Enable versioning
aws s3api put-bucket-versioning \
  --bucket my-bucket \
  --versioning-configuration Status=Enabled

# List versions of an object (list-object-versions filters by prefix, not key)
aws s3api list-object-versions \
  --bucket my-bucket \
  --prefix myfile.txt

# Download a specific version
aws s3api get-object \
  --bucket my-bucket \
  --key myfile.txt \
  --version-id ABC123 \
  restored-file.txt
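
Note that get-object only downloads the old bytes. To actually promote an old version back to "current", copy it over itself. A quick sketch, reusing the placeholder version ID from above:

# Promote an old version back to the current version
aws s3api copy-object \
  --bucket my-bucket \
  --key myfile.txt \
  --copy-source "my-bucket/myfile.txt?versionId=ABC123"

Also keep in mind that deleting an object in a versioned bucket only adds a delete marker; deleting that marker with delete-object --version-id brings the object back.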

Lifecycle Policies

{
  "Rules": [
    {
      "Id": "archive-old-logs",
      "Filter": { "Prefix": "logs/" },
      "Status": "Enabled",
      "Transitions": [
        {
          "Days": 30,
          "StorageClass": "STANDARD_IA"
        },
        {
          "Days": 90,
          "StorageClass": "GLACIER"
        }
      ],
      "Expiration": { "Days": 365 }
    }
  ]
}
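
This rule moves objects under logs/ to Standard-IA after 30 days, to Glacier after 90, and deletes them after a year. It does nothing until it is attached to the bucket; a minimal sketch, assuming the JSON above is saved as lifecycle.json:

# Attach the lifecycle rules to the bucket
aws s3api put-bucket-lifecycle-configuration \
  --bucket my-bucket \
  --lifecycle-configuration file://lifecycle.json

# Confirm the rules took effect
aws s3api get-bucket-lifecycle-configuration \
  --bucket my-bucket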

Multipart Upload

using System.Collections.Generic;
using System.IO;
using Amazon.S3;
using Amazon.S3.Model;

var client = new AmazonS3Client();

// Step 1: initiate the upload and capture the upload ID
var initiateRequest = new InitiateMultipartUploadRequest
{
    BucketName = "my-bucket",
    Key = "large-file.zip"
};

var initiateResponse = await client.InitiateMultipartUploadAsync(initiateRequest);
var uploadId = initiateResponse.UploadId;

var partETags = new List<PartETag>();
var partSize = 5 * 1024 * 1024; // 5 MB, the minimum size for every part except the last

// Step 2: upload the file in sequential parts
using (var fileStream = File.OpenRead("large-file.zip"))
{
    for (int partNumber = 1; fileStream.Position < fileStream.Length; partNumber++)
    {
        var uploadPartRequest = new UploadPartRequest
        {
            BucketName = "my-bucket",
            Key = "large-file.zip",
            UploadId = uploadId,
            PartNumber = partNumber,
            PartSize = partSize, // without PartSize the SDK would read the whole stream as one part
            InputStream = fileStream,
            IsLastPart = fileStream.Length - fileStream.Position <= partSize
        };

        var uploadPartResponse = await client.UploadPartAsync(uploadPartRequest);
        partETags.Add(new PartETag
        {
            ETag = uploadPartResponse.ETag,
            PartNumber = partNumber
        });
    }
}

// Step 3: ask S3 to stitch the parts into the final object
var completeRequest = new CompleteMultipartUploadRequest
{
    BucketName = "my-bucket",
    Key = "large-file.zip",
    UploadId = uploadId,
    PartETags = partETags
};

await client.CompleteMultipartUploadAsync(completeRequest);
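
If you don't need per-part control, the CLI switches to multipart automatically, and incomplete uploads (whose parts still incur storage charges) can be listed and aborted. A sketch, assuming the same bucket and file; the upload ID is a placeholder you'd take from the list output:

# aws s3 cp uses multipart automatically for large files
aws s3 cp large-file.zip s3://my-bucket/large-file.zip

# Find uploads that were started but never completed
aws s3api list-multipart-uploads --bucket my-bucket

# Abort one to stop paying for its orphaned parts
aws s3api abort-multipart-upload \
  --bucket my-bucket \
  --key large-file.zip \
  --upload-id <upload-id-from-list-output>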

Tagging and Querying

{
  "TagSet": [
    { "Key": "Environment", "Value": "Production" },
    { "Key": "Department", "Value": "Finance" },
    { "Key": "Retention", "Value": "7-years" }
  ]
}
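
Tags like these feed cost allocation reports and can act as lifecycle rule filters. To apply and read them with the CLI, a sketch assuming the tag set above is saved as tags.json:

# Apply the tag set to an object
aws s3api put-object-tagging \
  --bucket my-bucket \
  --key myfile.txt \
  --tagging file://tags.json

# Read the tags back, e.g. when auditing retention
aws s3api get-object-tagging \
  --bucket my-bucket \
  --key myfile.txt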

Best Practices

  1. Enable versioning: Prevent accidental deletes
  2. Use lifecycle: Auto-archive old data
  3. Multipart: Use for files > 100MB
  4. Tagging: Organize and manage costs
  5. Access logs: Audit who accesses what (see the sketch after this list)
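
For the last point, S3 server access logging writes request records to a second bucket. A minimal sketch, assuming a separate my-log-bucket already exists and is writable by S3's log delivery service; save the snippet below as logging.json, then attach it:

{
  "LoggingEnabled": {
    "TargetBucket": "my-log-bucket",
    "TargetPrefix": "access-logs/"
  }
}

# Point my-bucket's access logs at the target bucket
aws s3api put-bucket-logging \
  --bucket my-bucket \
  --bucket-logging-status file://logging.json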

Related Concepts

  • S3 Transfer Acceleration
  • CloudFront distribution
  • Object Lock (compliance)
  • S3 Select (query in place; see the sketch below)
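
As a taste of S3 Select, you can run SQL against a single object without downloading it. A hedged sketch, assuming a hypothetical data.csv in the bucket whose header row includes a department column:

# Query a CSV object in place; only matching rows leave S3
aws s3api select-object-content \
  --bucket my-bucket \
  --key data.csv \
  --expression "SELECT * FROM S3Object s WHERE s.department = 'Finance'" \
  --expression-type SQL \
  --input-serialization '{"CSV": {"FileHeaderInfo": "USE"}}' \
  --output-serialization '{"CSV": {}}' \
  results.csv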

Summary

Use S3 versioning, lifecycle policies, multipart uploads, and tagging for robust, cost-effective storage at scale.


Written by Emem Isaac

Expert Software Engineer with 15+ years of experience building scalable enterprise applications. Specialized in ASP.NET Core, Azure, Docker, and modern web development. Passionate about sharing knowledge and helping developers grow.
