Fullstack Python: Setting Up Amazon S3 Cloud Storage for Flask Applications

Cloud storage is a crucial component for modern web applications, especially when dealing with file uploads, media content, and backups. For fullstack Python developers building Flask applications, integrating with Amazon S3 (Simple Storage Service) offers a scalable and reliable solution for storing files. In this blog, we’ll walk through how to set up S3 with a Flask app to upload and manage files seamlessly in the cloud.


Why Use S3 with Flask?

Amazon S3 is a widely used object storage service that offers:

Durability and scalability

Built-in security and access control

Easy integration with Python via Boto3

Support for static websites, media delivery, and backups

For Flask developers, S3 provides an efficient way to offload storage tasks and keep your application stateless and cloud-native.


Prerequisites

An AWS account

A Flask app

AWS Access Key ID and Secret Access Key

Python packages: boto3, flask, and werkzeug for file handling


Step 1: Install Required Packages

Install the required libraries using pip:


bash


pip install flask boto3 werkzeug


Step 2: Configure AWS Credentials

Use the AWS CLI (aws configure) or create a ~/.aws/credentials file:


ini

[default]
aws_access_key_id = YOUR_ACCESS_KEY
aws_secret_access_key = YOUR_SECRET_KEY

Or configure credentials directly in your Flask app using environment variables for better security.
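For example, here is a minimal sketch of building the Boto3 client from environment variables (AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, and AWS_DEFAULT_REGION are the standard names Boto3 also reads on its own; the us-east-1 fallback is just an example default):

python

import os
import boto3

# Boto3 would pick these up from the environment automatically,
# but passing them explicitly makes the dependency obvious.
s3 = boto3.client(
    's3',
    aws_access_key_id=os.environ['AWS_ACCESS_KEY_ID'],
    aws_secret_access_key=os.environ['AWS_SECRET_ACCESS_KEY'],
    region_name=os.environ.get('AWS_DEFAULT_REGION', 'us-east-1'),
)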


Step 3: Create an S3 Bucket

Go to the AWS S3 console and create a bucket:

Choose a globally unique name.

Set region and permissions.

Allow public access only if your use case requires it (e.g., public media).
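If you prefer to script this step instead of clicking through the console, the bucket can also be created with Boto3. A minimal sketch, assuming a region other than us-east-1 (the one region where CreateBucketConfiguration must be left out):

python

import boto3

s3 = boto3.client('s3', region_name='eu-west-1')

# Bucket names are global across all AWS accounts, so pick something unique.
s3.create_bucket(
    Bucket='your-bucket-name',
    CreateBucketConfiguration={'LocationConstraint': 'eu-west-1'},
)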


Step 4: Flask Code to Upload Files to S3

Here’s a minimal example:

python


from flask import Flask, request
import boto3
from werkzeug.utils import secure_filename

app = Flask(__name__)

# Initialize the S3 client (credentials are read from the environment or ~/.aws/credentials)
s3 = boto3.client('s3')
BUCKET_NAME = 'your-bucket-name'

@app.route('/upload', methods=['POST'])
def upload_file():
    if 'file' not in request.files:
        return 'No file part', 400

    file = request.files['file']
    if file.filename == '':
        return 'No selected file', 400

    # Sanitize the filename before using it as the S3 object key
    filename = secure_filename(file.filename)
    s3.upload_fileobj(file, BUCKET_NAME, filename)
    return f'File {filename} uploaded to S3!', 200

if __name__ == '__main__':
    app.run(debug=True)
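By default S3 records a generic binary content type for objects uploaded this way. If browsers should render the file correctly, you can pass ExtraArgs to the same upload_fileobj call used in the route above, taking the MIME type from the upload; a small sketch:

python

# Same call as in the route above, but preserving the uploaded file's MIME type
s3.upload_fileobj(
    file,
    BUCKET_NAME,
    filename,
    ExtraArgs={'ContentType': file.content_type},
)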

To test it, build a simple HTML form to upload a file or use a tool like Postman.
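If you don't want to build a form right away, a quick way to exercise the endpoint from Python is the requests library (this assumes the app is running locally on port 5000 and that photo.jpg exists in the current directory):

python

import requests

# POST the file as multipart/form-data under the 'file' field
with open('photo.jpg', 'rb') as f:
    resp = requests.post('http://localhost:5000/upload', files={'file': f})

print(resp.status_code, resp.text)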


Step 5: Accessing Files on S3

You can access public files using:


https://<bucket-name>.s3.<region>.amazonaws.com/<filename>

For private buckets, use pre-signed URLs generated via Boto3 for temporary access:


python


url = s3.generate_presigned_url('get_object',
    Params={'Bucket': BUCKET_NAME, 'Key': filename},
    ExpiresIn=3600)
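To wire this into the Flask app from Step 4, one option is a small download route that redirects the client to a short-lived pre-signed URL. A sketch (the /files/<filename> route name and the one-hour expiry are arbitrary choices):

python

from flask import redirect

@app.route('/files/<filename>')
def download_file(filename):
    # Grant read access to this object for one hour, then send the client there
    url = s3.generate_presigned_url(
        'get_object',
        Params={'Bucket': BUCKET_NAME, 'Key': filename},
        ExpiresIn=3600,
    )
    return redirect(url)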


Final Thoughts

Integrating S3 with Flask gives your Python application a robust and scalable storage layer. It removes the burden of managing local file systems and allows you to easily handle large media uploads, backups, and downloads. As a fullstack developer, mastering cloud storage integration makes your applications more production-ready and future-proof.

Learn FullStack Python Training Course

Read More : Flask App Deployment with Continuous Integration on Azure DevOps

Read More : Fullstack Python: Setting Up Cloud Storage for Flask Applications on S3

Read More : Fullstack Flask: Building and Deploying APIs on Cloud with Docker

Visit Quality Thought Training Institute
