AWS Lambda with API Gateway
References: AWS API Gateway documentation; AWS Lambda documentation.
Example: API Gateway with Lambda
Step 1: Open the Lambda service and click the Create function button.
Step 2: Select the following properties:
Author from scratch
Function name: func1
Runtime: Python 3.6
Architecture: x86_64
Click the Create function button.
Step 3: The Code Source section already contains some starter code. You can replace it with the code you want to execute through the API Gateway methods; this example keeps the default code, which looks like the sketch below.
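For reference, the console-generated default handler looks roughly like this (the exact template can vary by runtime version). With Lambda proxy integration, the handler must return a statusCode and body in this shape:

import json

def lambda_handler(event, context):
    # TODO implement
    return {
        'statusCode': 200,
        'body': json.dumps('Hello from Lambda!')
    }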
Step 4: If you make changes to the code, save them with File -> Save and then click the Deploy button.
Step 5: Add a trigger by clicking the Add trigger button.
Step 6: Add API Gateway in the Trigger configuration:
Create an API
REST API
Security: IAM
Click the Add button.
Step 7: Click the func1-API link.
Step 8: Select the ANY method and go to Actions -> Delete Method.
Step 9: Go to Actions -> Create Method.
Step 10: Under func1, select GET as the method.
Integration type: Lambda Function
Use Lambda Proxy integration: select the check box
Lambda Function: func1
Click the Save button.
Step 11: You will see the GET Method Execution diagram.
Step 12: Select the GET method and click Actions -> Deploy API.
Select the stage as default
Click the Deploy button.
Step 13: Select the GET method and click the invoke URL in the right-hand panel; this invokes the GET method of the API. It can also be called from code, as sketched below.
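A minimal sketch of calling the method from Python is shown below. The URL is a placeholder; copy the real invoke URL from the console. Note that if the IAM security chosen in Step 6 is enforced, unsigned requests return 403 and must be signed with SigV4.

from urllib.request import urlopen

# Placeholder invoke URL; copy the actual one from the API Gateway trigger details.
api_url = "https://abc123.execute-api.us-east-1.amazonaws.com/default/func1"

with urlopen(api_url) as response:         # plain GET; works only if the method is open
    print(response.status)                 # 200 on success
    print(response.read().decode())        # body returned by the Lambda handler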
Lambda with S3 Example
Step 1: Create a DynamoDB full-access IAM role and assign it to Lambda.
Policy: AmazonDynamoDBFullAccess
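If you prefer scripting the role instead of using the console, a minimal Boto3 sketch is shown below. The role name is hypothetical, and the console normally also attaches a basic execution policy for CloudWatch logging.

import json
import boto3

iam = boto3.client('iam')

# Trust policy that allows the Lambda service to assume the role.
trust_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": {"Service": "lambda.amazonaws.com"},
        "Action": "sts:AssumeRole"
    }]
}

iam.create_role(
    RoleName='lambda-dynamodb-role',       # hypothetical role name
    AssumeRolePolicyDocument=json.dumps(trust_policy))

# Attach the AmazonDynamoDBFullAccess managed policy mentioned in Step 1.
iam.attach_role_policy(
    RoleName='lambda-dynamodb-role',
    PolicyArn='arn:aws:iam::aws:policy/AmazonDynamoDBFullAccess')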
Step 2: Create a Lambda function Func2 with Python 3.6 as the runtime.
Write the code below in the code window. It is written in Python using the Boto3 library and inserts S3 object information into a DynamoDB table.
import boto3
from uuid import uuid4

def lambda_handler(event, context):
    dynamodb = boto3.resource('dynamodb')
    # Each record describes one S3 object event delivered by the trigger.
    for record in event['Records']:
        bucket_name = record['s3']['bucket']['name']
        object_key = record['s3']['object']['key']
        size = record['s3']['object'].get('size', -1)
        event_name = record['eventName']
        event_time = record['eventTime']
        # Write the object's metadata to DynamoDB under a random partition key.
        dynamo_table = dynamodb.Table('newtable')
        dynamo_table.put_item(
            Item={
                'unique': str(uuid4()),
                'Bucket': bucket_name,
                'Object': object_key,
                'Size': size,
                'Event': event_name,
                'EventTime': event_time,
            })
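To test the handler without uploading anything, you can use a trimmed S3 test event such as the one below. Only the fields the code reads are shown, and the bucket and key are hypothetical.

sample_event = {
    "Records": [
        {
            "eventName": "ObjectCreated:Put",
            "eventTime": "2024-01-01T00:00:00.000Z",
            "s3": {
                "bucket": {"name": "my-demo-bucket"},
                "object": {"key": "photos/cat.jpg", "size": 1024}
            }
        }
    ]
}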
Step 3: Save with File -> Save and click the Deploy button.
Step 4: Click the Configuration tab, select Permissions, click Edit on the execution role, and attach the role created in Step 1.
Step 5: Add a trigger by clicking the Add trigger button.
Select the S3 trigger
Select the bucket name
Event type: All object create events
Acknowledge the recursive invocation warning
Click the Add button
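Behind the scenes the console configures an S3 event notification on the bucket; a roughly equivalent Boto3 sketch is shown below. The bucket name and function ARN are placeholders, and the console additionally grants S3 permission to invoke the function.

import boto3

s3 = boto3.client('s3')

s3.put_bucket_notification_configuration(
    Bucket='my-demo-bucket',                  # placeholder bucket name
    NotificationConfiguration={
        'LambdaFunctionConfigurations': [{
            'LambdaFunctionArn': 'arn:aws:lambda:us-east-1:123456789012:function:Func2',
            'Events': ['s3:ObjectCreated:*']  # "All object create events"
        }]
    })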
Step 6: Create a DynamoDB table newtable with partition key unique.
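The same table can also be created with Boto3; a minimal sketch, assuming on-demand capacity, is:

import boto3

dynamodb = boto3.resource('dynamodb')

table = dynamodb.create_table(
    TableName='newtable',
    KeySchema=[{'AttributeName': 'unique', 'KeyType': 'HASH'}],           # partition key
    AttributeDefinitions=[{'AttributeName': 'unique', 'AttributeType': 'S'}],
    BillingMode='PAY_PER_REQUEST')                                        # assumption: on-demand mode
table.wait_until_exists()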
Step 7: Explore items in newtable; you will find no records yet.
Step 8: Upload a file to the S3 bucket.
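Any object upload fires the trigger; for example, from Boto3 (the file and bucket names are placeholders):

import boto3

s3 = boto3.client('s3')
s3.upload_file('local-file.txt', 'my-demo-bucket', 'local-file.txt')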
Step 9: Explore items in the DynamoDB table; you will find the metadata of the file uploaded to the S3 bucket.
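You can also read the items back with a scan, which is fine for a small demo table; each item holds one uploaded object's metadata:

import boto3

dynamodb = boto3.resource('dynamodb')
table = dynamodb.Table('newtable')

for item in table.scan()['Items']:
    print(item['Bucket'], item['Object'], item['Size'], item['Event'], item['EventTime'])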