File Upload Notification System

How to trigger notifications once a particular job is done.


In this project, we will build a system that watches an S3 bucket for newly uploaded files. When a file arrives, an S3 event triggers a Lambda function, which publishes a notification through SNS and stores the file's metadata in an SQS queue for later processing.

Creating a notification pipeline

  • Create an S3 bucket: give it a name and choose your region.

  • Browse to Amazon SNS and create a topic.

We will use a standard topic. A FIFO topic delivers messages to subscribers in exactly the order they are received, operating on a first-come, first-served basis, which matters when ordering is important. A standard topic makes no ordering guarantee and may deliver messages to subscribers in any order, which is fine for simple notifications. (FIFO topics can also only deliver to SQS queues, so an email subscription requires a standard topic.)

  • Click the Create topic button.

  • Once the topic is created, proceed to create a subscription.

  • Choose the topic ARN.

  • In the protocol dropdown, choose Email and enter your email address.

Request confirmation, then confirm from your email inbox to activate the subscription.
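If you prefer to script this step, the topic and email subscription can also be created with boto3. This is a sketch assuming the topic name `notificationstandardsystemtopic` and region `eu-west-2` used later in the Lambda code; the email address is a placeholder.

```python
def create_topic_and_subscribe(topic_name="notificationstandardsystemtopic",
                               email="you@example.com",  # placeholder address
                               region="eu-west-2"):
    """Create a standard SNS topic and add an email subscription to it."""
    import boto3  # imported inside so the sketch reads without boto3 installed

    sns = boto3.client("sns", region_name=region)
    # create_topic is idempotent: it returns the existing ARN if the name exists
    topic_arn = sns.create_topic(Name=topic_name)["TopicArn"]
    # The subscription stays 'PendingConfirmation' until you click the email link
    sns.subscribe(TopicArn=topic_arn, Protocol="email", Endpoint=email)
    return topic_arn
```

Run it once with real values; the returned ARN is the one the Lambda code below refers to.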

In the next step we create an SQS Queue.

Amazon SQS helps manage messages between different parts of a software system. Imagine you have different components or services in your application that need to communicate with each other. SQS acts as a messenger service, storing messages in a queue until they're processed by the receiving component.
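The decoupling idea can be sketched in plain Python with the standard library's queue module: the producer and consumer never call each other directly; they only share the queue (the bucket and key values here are made up).

```python
import json
import queue

q = queue.Queue()  # stands in for the SQS queue

# Producer side: drop a message in the queue and move on
q.put(json.dumps({"bucket": "my-bucket", "key": "report.csv"}))

# Consumer side: pick messages up whenever it is ready
message = json.loads(q.get())
print(message["key"])  # → report.csv
```

SQS plays the same role between distributed components, with the queue durably stored on AWS instead of in process memory.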

As above, create a standard queue, give it a name, and leave the remaining options at their defaults.
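This step too can be scripted; the sketch below assumes the queue name `notificationsqueue` used in the Lambda code later.

```python
def create_queue(queue_name="notificationsqueue", region="eu-west-2"):
    """Create a standard SQS queue and return its URL."""
    import boto3  # imported inside so the sketch reads without boto3 installed

    sqs = boto3.client("sqs", region_name=region)
    # Default attributes are fine for this project, matching the console steps
    return sqs.create_queue(QueueName=queue_name)["QueueUrl"]
```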

Now that our resources have been created, we need to set up our Lambda function.

  • Navigate to the Lambda console.

  • Create a function; make sure 'Author from scratch' is selected.

  • Choose a Python runtime.

  • Under 'Change default execution role', select 'Create a new role with basic Lambda permissions'.

  • Create the function.

Once created, we grant the Lambda function the permissions it needs.

  • Click the Configuration tab.

  • On the left-hand navigation panel, click Permissions.

  • Click the link below the role name.

This takes you to the IAM console, where we will attach a policy granting the function access to SNS and SQS.

Click Add permissions and attach a policy that allows the function to publish to your SNS topic (sns:Publish) and send messages to your SQS queue (sqs:SendMessage).
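For example, an inline policy like the one below grants just the two actions the function's code needs. The ARNs match the ones used in the code that follows; substitute your own.

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "sns:Publish",
      "Resource": "arn:aws:sns:eu-west-2:662406500657:notificationstandardsystemtopic"
    },
    {
      "Effect": "Allow",
      "Action": "sqs:SendMessage",
      "Resource": "arn:aws:sqs:eu-west-2:662406500657:notificationsqueue"
    }
  ]
}
```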

In the next step, we'll add the code below to the Lambda function. Make sure to replace the topic ARN and queue URL with your own; both can be obtained from the SNS and SQS consoles.

Deploy the code.

import json
import boto3

# Clients are created once, outside the handler, so they are reused across invocations
sns_client = boto3.client('sns')
sqs_client = boto3.client('sqs')


def lambda_handler(event, context):
    # Define the SNS topic ARN (replace with your own)
    sns_topic_arn = 'arn:aws:sns:eu-west-2:662406500657:notificationstandardsystemtopic'

    # Define the SQS queue URL (replace with your own)
    sqs_queue_url = 'https://sqs.eu-west-2.amazonaws.com/662406500657/notificationsqueue'

    # Log the incoming event for debugging in CloudWatch
    print(event)

    # Process each S3 event record
    for record in event['Records']:
        # Extract S3 bucket and object information
        s3_bucket = record['s3']['bucket']['name']
        s3_key = record['s3']['object']['key']

        # Example: Sending metadata to SQS
        metadata = {
            'bucket': s3_bucket,
            'key': s3_key,
            'timestamp': record['eventTime']
        }

        # Send metadata to SQS
        sqs_response = sqs_client.send_message(
            QueueUrl=sqs_queue_url,
            MessageBody=json.dumps(metadata)
        )

        # Example: Sending a notification to SNS
        notification_message = f"New file uploaded to S3 bucket '{s3_bucket}' with key '{s3_key}'"

        sns_response = sns_client.publish(
            TopicArn=sns_topic_arn,
            Message=notification_message,
            Subject="File Upload Notification"
        )

    return {
        'statusCode': 200,
        'body': json.dumps('Processing complete')
    }

Overall, the code responds to S3 object-created events, extracts the bucket name, object key, and timestamp from each record, sends that metadata to an SQS queue, and publishes a notification message to an SNS topic.
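To see what the handler is pulling apart, here is the relevant shape of an S3 object-created event record, with the same extraction written as a standalone function (the bucket and key values are made up):

```python
import json

sample_record = {
    "eventTime": "2024-01-01T12:00:00.000Z",
    "s3": {
        "bucket": {"name": "my-upload-bucket"},
        "object": {"key": "reports/january.csv"},
    },
}

def extract_metadata(record):
    """Pull out the fields the Lambda forwards to SQS."""
    return {
        "bucket": record["s3"]["bucket"]["name"],
        "key": record["s3"]["object"]["key"],
        "timestamp": record["eventTime"],
    }

print(json.dumps(extract_metadata(sample_record)))
```

Note that in real events the object key arrives URL-encoded (a space becomes '+', for instance), so for keys with special characters you may want to decode it with urllib.parse.unquote_plus.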

Now we will go back to our S3 bucket.

  • Under the Properties tab of your S3 bucket, create a new event notification.

  • Give it a name.

  • Select 'All object create events', since our aim is to track every PUT, POST, COPY, and completed multipart upload.

  • Finally, choose the Lambda function as the destination and save the changes.
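The same console step can also be done with boto3's put_bucket_notification_configuration, sketched below; the bucket name and function ARN are placeholders. One caveat when scripting it: the bucket also needs permission to invoke the function, which the console normally adds for you.

```python
def configure_bucket_notification(bucket="my-upload-bucket",  # placeholder
                                  function_arn="arn:aws:lambda:eu-west-2:111111111111:function:my-fn"):  # placeholder
    """Send every object-created event in the bucket to the Lambda function."""
    import boto3  # imported inside so the sketch reads without boto3 installed

    s3 = boto3.client("s3")
    s3.put_bucket_notification_configuration(
        Bucket=bucket,
        NotificationConfiguration={
            "LambdaFunctionConfigurations": [
                {
                    "LambdaFunctionArn": function_arn,
                    # Covers PUT, POST, COPY and completed multipart uploads
                    "Events": ["s3:ObjectCreated:*"],
                }
            ]
        },
    )
```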

Now, anytime you upload a file to the bucket, you should get a notification via email.

If you upload a file and don't receive an email, check the CloudWatch logs to debug.

Below is an example of a successful event notification logged by CloudWatch. You should also receive the notification in your inbox.

Navigate to your queue.

  • Click on send and receive messages.

  • Poll for messages.

You should see the metadata message from the Lambda function arrive in your queue.
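The same poll can be scripted. This sketch long-polls the queue once and returns any metadata found; the default queue URL is the one from the Lambda code, so substitute your own.

```python
def poll_queue(queue_url="https://sqs.eu-west-2.amazonaws.com/662406500657/notificationsqueue"):
    """Long-poll the queue once and return the decoded message bodies."""
    import json
    import boto3  # imported inside so the sketch reads without boto3 installed

    sqs = boto3.client("sqs")
    response = sqs.receive_message(
        QueueUrl=queue_url,
        MaxNumberOfMessages=10,
        WaitTimeSeconds=10,  # long polling: wait up to 10 s for messages
    )
    bodies = []
    for message in response.get("Messages", []):
        bodies.append(json.loads(message["Body"]))
        # Delete each message after reading so it is not redelivered
        sqs.delete_message(QueueUrl=queue_url,
                           ReceiptHandle=message["ReceiptHandle"])
    return bodies
```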

Done.