
Secure File Manager: A Serverless Solution

1. Project Overview

This project presents a secure, serverless application for personal file management. Users can sign up, log in, and manage their documents in the cloud. The system is designed with a tiered access model where authenticated users gain private storage while guests can browse a limited interface. The entire architecture is built on a suite of integrated AWS services, ensuring high availability and scalability without the overhead of traditional servers.

2. System Architecture

The application's core functionality relies on several decoupled AWS services that communicate through events and permissions. The flow is initiated from a web-based frontend and extends through several backend pipelines for data persistence and real-time notifications.

A conceptual overview of the system's components is illustrated below.

       +-------------------+               +---------------------------+
       |   User's Browser  |<------------->|       Amazon Cognito      |
       +---------+---------+ Identity/Auth |  (User & Identity Pools)  |
                 |                         +-------------+-------------+
                 |                                       |
                 V                                       V
       +-------------------+               +---------------------------+
       | S3 Static Website |               |         IAM Roles         |
       |  (Frontend Code)  |               |   (Auth & Unauth Users)   |
       |    + CloudFront   |               +-------------+-------------+
       +---------+---------+                             |
                 |                                       | S3 permissions
                 | (1) File Upload/Delete                | (scoped to user ID)
                 |                                       V
                 |                         +---------------------------+
                 +------------------------>|         S3 Bucket         |
                                           |        (User Files)       |
                                           +-------------+-------------+
                                                         |
                                                         | (2) S3 Event Notification
                                                         V
                                           +---------------------------+
                                           |         SQS Queue         |
                                           |     (S3 Events) + DLQ     |
                                           +-------------+-------------+
                                                         |
                                                         | (3) SQS Trigger
                                                         V
                                           +---------------------------+
                                           |           Lambda          |
                                           |     (Process S3 Event)    |
                                           +-------------+-------------+
                                                         |
                                                         | (4) DynamoDB Write
                                                         V
                                           +---------------------------+
                                           |          DynamoDB         |
                                           |      (File Metadata)      |
                                           +-------------+-------------+
                                                         |
                                                         | (5) DynamoDB Stream
                                                         V
                                           +---------------------------+
                                           |           Lambda          |
                                           |     (Stream Processor)    |
                                           +-------------+-------------+
                                                         |
                                                         | (6) SNS Publish (Filtered)
                                                         V
                                           +---------------------------+
                                           |            SNS            |
                                           |    (Notification Topic)   |
                                           +-------------+-------------+
                                                         |
                                                         | (7) Filtered Notification
                                                         V
                                           +---------------------------+
                                           |        User's Email       |
                                           +---------------------------+

Workflow Breakdown:

  1. A user uploads a file, which triggers an S3 ObjectCreated event.
  2. The S3 bucket's event notification configuration sends a message to an SQS queue (a sample message body is shown after this list).
  3. An SQS-triggered Lambda function processes the message.
  4. This Lambda extracts metadata from the S3 event and creates a new record in a DynamoDB table.
  5. A DynamoDB Stream captures the change and triggers a second Lambda function.
  6. This Lambda constructs a user-friendly message and publishes it to an SNS topic, applying a filter policy to ensure only the relevant user receives the notification.
  7. The user, who has subscribed to the topic with the correct filter, receives an email notification.
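
For reference, the message that lands in the queue at step 2 wraps a standard S3 event notification. Trimmed to the fields this project uses (bucket name and key are placeholders), the body looks roughly like:

      {
        "Records": [
          {
            "eventTime": "2024-01-01T12:00:00.000Z",
            "eventName": "ObjectCreated:Put",
            "s3": {
              "bucket": { "name": "my-content-bucket-456" },
              "object": { "key": "IDENTITY_ID/report.pdf", "size": 1024 }
            }
          }
        ]
      }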

3. Core Features

  • Private User Storage: Files are stored in a private S3 bucket, with access restricted to the user's specific cognito-identity-id prefix via fine-grained IAM policies.
  • Guest Access: The frontend allows unauthenticated users to view the application UI but restricts them from performing file operations.
  • Comprehensive File Management: The application supports file upload, downloads via secure pre-signed URLs (a sketch follows this list), and "soft deletion", where files are marked as inactive in the database while remaining in S3.
  • Robust Event-Driven Backend: S3 events are routed through a dedicated SQS queue to a Lambda processor. This design pattern ensures durability and handles high-volume uploads without data loss. A Dead-Letter Queue (DLQ) is also configured to capture failed messages.
  • Real-time User Notifications: Changes to a user's file status (uploads, deletions) trigger a notification pipeline. The system uses a DynamoDB stream to capture database changes and sends targeted email notifications via an SNS topic with a filter policy.
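
As a sketch of the download path, the snippet below produces a pre-signed GET URL with boto3. The bucket and key are hypothetical placeholders; the frontend performs the equivalent call with the AWS SDK for JavaScript using the user's temporary Cognito credentials.

      import boto3

      s3 = boto3.client("s3")

      # Pre-signed URL for a single object download; expires after 5 minutes.
      url = s3.generate_presigned_url(
          "get_object",
          Params={"Bucket": "my-content-bucket-456", "Key": "IDENTITY_ID/report.pdf"},
          ExpiresIn=300,
      )
      print(url)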

4. Prerequisites

Before setting up the project, ensure you have the following:

  • An active AWS account.
  • The AWS CLI installed and configured with administrative permissions.
  • Basic familiarity with AWS services (S3, Lambda, IAM, Cognito, DynamoDB, SQS, SNS).

5. Deployment Guide

Follow these steps to deploy and configure the entire system. Note: S3 bucket names (unlike most other resource names) must be globally unique.

Phase 1: Identity and Storage Setup

  1. Configure Amazon Cognito:
    • Create a User Pool to manage users. Note the User Pool ID and App client ID.
    • Create an Identity Pool to grant AWS credentials. Link it to the User Pool. Note the Identity Pool ID.
    • Modify the IAM role created for authenticated identities. The policy for this role must grant s3:GetObject, s3:PutObject, and s3:DeleteObject on arn:aws:s3:::YOUR_CONTENT_BUCKET_NAME/${cognito-identity.amazonaws.com:sub}/*. The policy variable ${cognito-identity.amazonaws.com:sub} resolves to the caller's Cognito identity ID, restricting each user to their own prefix (see the example policy after this list).
  2. Create S3 Buckets:
    • Create a bucket for the frontend website (e.g., my-frontend-bucket-123). Enable static website hosting on it and add a public read policy.
    • Create a separate, private bucket for user file content (e.g., my-content-bucket-456). Ensure "Block all public access" is enabled.
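
A minimal policy document for the authenticated role might look like the following (the bucket name is a placeholder; you will likely also want a separate, prefix-scoped s3:ListBucket statement so the app can list a user's files):

      {
        "Version": "2012-10-17",
        "Statement": [
          {
            "Effect": "Allow",
            "Action": ["s3:GetObject", "s3:PutObject", "s3:DeleteObject"],
            "Resource": "arn:aws:s3:::YOUR_CONTENT_BUCKET_NAME/${cognito-identity.amazonaws.com:sub}/*"
          }
        ]
      }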

Phase 2: Frontend Deployment

  1. Update script.js:
    • Modify the constant variables at the top of the file with the IDs and names from your AWS resources (USER_POOL_ID, CLIENT_ID, IDENTITY_POOL_ID, S3_BUCKET_NAME, etc.).
  2. Upload Files:
    • Upload your index.html, script.js, and style.css to the public frontend S3 bucket (a boto3 sketch follows this phase's steps).
  3. Enable HTTPS with CloudFront:
    • S3 static website endpoints serve traffic over HTTP only. To serve the site over HTTPS and avoid "insecure download" browser warnings, create an Amazon CloudFront distribution.
    • Set the origin to your S3 static website hosting endpoint.
    • Configure the viewer protocol policy to "Redirect HTTP to HTTPS".
    • If you serve the site from a custom domain, attach a public SSL/TLS certificate from AWS Certificate Manager (ACM) to the distribution; the default *.cloudfront.net domain is already covered by CloudFront's own certificate.
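
The upload in step 2 can be done from the console or scripted. A minimal boto3 sketch, assuming the example bucket name from Phase 1 (setting Content-Type so browsers render the files rather than download them):

      import mimetypes

      import boto3

      s3 = boto3.client("s3")

      for name in ("index.html", "script.js", "style.css"):
          content_type = mimetypes.guess_type(name)[0] or "application/octet-stream"
          s3.upload_file(name, "my-frontend-bucket-123", name,
                         ExtraArgs={"ContentType": content_type})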

Phase 3: Core File Operations & Data Persistence

  1. Set up DynamoDB:
    • Create a table named filesystem-DB.
    • Set the Partition Key to user-id (String) and the Sort Key to filename (String).
  2. Create SQS Queue:
    • Create a Standard SQS Queue named S3FileEventQueue.
    • Enable a Dead-Letter Queue (DLQ) for it, named S3FileEventQueueDLQ, to handle failed messages.
  3. Create S3 Event Processing Lambda:
    • Create a new Lambda function (e.g., ProcessS3EventsToDynamoDB) using Python.
    • Grant its IAM role permissions to read from SQS (sqs:ReceiveMessage, sqs:DeleteMessage, sqs:GetQueueAttributes) and write to DynamoDB (dynamodb:PutItem).
    • Configure the Lambda to be triggered by the S3FileEventQueue SQS queue.
    • The Lambda's code should parse the S3 event notification from the SQS message and write the user's file metadata (ID, filename, size, creation date) to the DynamoDB table (a minimal handler sketch follows this phase's steps).
  4. Configure S3 Event Notifications:
    • In the S3 console, go to your content bucket.
    • Add an event notification rule.
    • Set the event types to "All object create events".
    • Set the destination to your S3FileEventQueue SQS queue.
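
A minimal sketch of the ProcessS3EventsToDynamoDB handler, assuming object keys follow the <cognito-identity-id>/<filename> layout and that the table name arrives via a TABLE_NAME environment variable (both assumptions, not part of the original spec):

      import json
      import os
      import urllib.parse

      import boto3

      table = boto3.resource("dynamodb").Table(os.environ.get("TABLE_NAME", "filesystem-DB"))

      def lambda_handler(event, context):
          # Each SQS record wraps one S3 event notification in its body.
          for record in event["Records"]:
              s3_event = json.loads(record["body"])
              # S3 test events carry no "Records" key; skip them gracefully.
              for s3_record in s3_event.get("Records", []):
                  obj = s3_record["s3"]["object"]
                  # Object keys arrive URL-encoded in event notifications.
                  key = urllib.parse.unquote_plus(obj["key"])
                  user_id, _, filename = key.partition("/")
                  table.put_item(Item={
                      "user-id": user_id,
                      "filename": filename,
                      "size": obj.get("size", 0),
                      "created-at": s3_record["eventTime"],
                      "deleted": False,
                  })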

Phase 4: Event Notification System

  1. Enable DynamoDB Stream:
    • In the DynamoDB console, select your filesystem-DB table.
    • Go to the "Exports and streams" tab and enable the stream with the "New and old images" view.
  2. Create SNS Topic:
    • Create a Standard SNS topic named FileChangeNotificationTopic. Note its ARN.
  3. Create DynamoDB Stream Processor Lambda:
    • Create a new Lambda function (e.g., ProcessDynamoDBStreamToSNS) using Python.
    • Grant its IAM role permissions to read from the DynamoDB stream (dynamodb:DescribeStream, dynamodb:GetRecords, dynamodb:GetShardIterator, dynamodb:ListStreams) and to publish to SNS (sns:Publish).
    • Configure the Lambda to be triggered by the filesystem-DB DynamoDB stream.
    • The Lambda code should check the event type (INSERT, MODIFY, REMOVE).
    • For INSERT events, it should construct a message about a new file upload. For MODIFY events, it should check if the deleted attribute has changed to generate a "file deleted" message.
    • The publish call to SNS must include message attributes that subscription filter policies can match on. The user-id from the DynamoDB image is attached like so (imports and client setup added for completeness):
      import boto3

      sns = boto3.client("sns")

      sns.publish(
          TopicArn=topic_arn,
          Message=notification_message_text,
          Subject=notification_subject,
          MessageAttributes={
              # Subscription filter policies match on this attribute.
              "user_id": {
                  "DataType": "String",
                  "StringValue": user_id
              }
          }
      )
  4. Frontend SNS Subscription:
    • The frontend script.js needs to be updated to subscribe the user to the SNS topic.
    • When a user subscribes, the subscription must set a FilterPolicy attribute built from the user's cognito-identity-id so that the user only receives messages intended for them (see the boto3 sketch after this list).
    • The subscription's filter policy should look like this:
    {
      "user_id": [ "YOUR_COGNITO_IDENTITY_ID_FOR_THIS_USER" ]
    }
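
For reference, the equivalent subscription call in boto3 looks like the sketch below; the frontend passes the same parameters through the AWS SDK for JavaScript. The topic ARN and email address are placeholders:

      import json

      import boto3

      sns = boto3.client("sns")

      sns.subscribe(
          TopicArn="arn:aws:sns:REGION:ACCOUNT_ID:FileChangeNotificationTopic",
          Protocol="email",
          Endpoint="user@example.com",
          Attributes={
              # Only messages whose user_id attribute matches pass the filter.
              "FilterPolicy": json.dumps(
                  {"user_id": ["YOUR_COGNITO_IDENTITY_ID_FOR_THIS_USER"]}
              )
          },
      )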

6. Using the Application

  1. Access the application via your CloudFront HTTPS URL.
  2. Register a new user account.
  3. Log in and start uploading files.
  4. Observe the file list and try downloading or deleting files.
  5. Click the "Subscribe to Events" button and check your email for a confirmation request. After confirming, you will receive notifications for all your future file changes.

7. Troubleshooting

  • CORS Errors: Ensure the content bucket's CORS configuration allows requests from your CloudFront domain (an example configuration appears at the end of this section).
  • 403 Forbidden on Upload/Delete: Double-check the IAM role for your authenticated users. The S3 policy's Resource ARN must be correctly formatted to include the ${cognito-identity.amazonaws.com:sub} variable.
  • No SNS Notifications:
    • Verify your Lambda's CloudWatch logs to see if the function executed successfully and made the sns.publish call.
    • Check if the SNS subscription for your email is confirmed.
    • Confirm that the SNS publish call includes MessageAttributes with user_id as a string.
    • Check the subscription filter policy to ensure it matches the user_id from the published message attributes.
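
An example CORS configuration for the content bucket (the CloudFront domain is a placeholder; widen AllowedOrigins only while debugging):

      [
        {
          "AllowedHeaders": ["*"],
          "AllowedMethods": ["GET", "PUT", "POST", "DELETE", "HEAD"],
          "AllowedOrigins": ["https://YOUR_CLOUDFRONT_DOMAIN"],
          "ExposeHeaders": ["ETag"]
        }
      ]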