Pull and ingest data from a third-party API

This tutorial builds a data pipeline that pulls data from a third-party finance API and loads it into TimescaleDB.

This tutorial requires multiple libraries, which can make your deployment package larger than Lambda's 250 MB limit. You can use a Docker container instead to extend the package size up to 10 GB, giving you much more flexibility in libraries and dependencies. For more about AWS Lambda container support, see the AWS documentation.

The libraries used in this tutorial are pandas, requests, psycopg2, and pgcopy.

Create an ETL function

Extract, transform, and load (ETL) functions are used to pull data from one system and ingest it into another. In this tutorial, the ETL function pulls data from a finance API called Alpha Vantage, and inserts it into TimescaleDB. The database connection is made using the values from environment variables.

This is the ETL function used in this tutorial:

    # function.py:
    import csv
    import os

    import pandas as pd
    import psycopg2
    from pgcopy import CopyManager

    # Read the database connection settings and the API key from environment variables
    config = {'DB_USER': os.environ['DB_USER'],
              'DB_PASS': os.environ['DB_PASS'],
              'DB_HOST': os.environ['DB_HOST'],
              'DB_PORT': os.environ['DB_PORT'],
              'DB_NAME': os.environ['DB_NAME'],
              'APIKEY': os.environ['APIKEY']}

    # Connect to TimescaleDB
    conn = psycopg2.connect(database=config['DB_NAME'],
                            host=config['DB_HOST'],
                            user=config['DB_USER'],
                            password=config['DB_PASS'],
                            port=config['DB_PORT'])

    # Columns of the stocks_intraday table, in insert order
    columns = ('time', 'price_open', 'price_close',
               'price_low', 'price_high', 'trading_volume', 'symbol')


    def get_symbols():
        """Read symbols from a CSV file.

        Returns:
            [list of strings]: symbols
        """
        with open('symbols.csv') as f:
            reader = csv.reader(f)
            return [row[0] for row in reader]


    def fetch_stock_data(symbol, month):
        """Fetch historical intraday data for one ticker symbol (1-minute interval).

        Args:
            symbol (string): ticker symbol
            month (int): month slice as an integer 1-24
                (for example, month=4 fetches data from four months ago)

        Returns:
            list of tuples: intraday (candlestick) stock data
        """
        interval = '1min'
        # The API slices historical data into one-month chunks:
        # 'year1month1' is the most recent month, 'year2month12' the oldest
        slice = 'year1month' + str(month) if month <= 12 else 'year2month' + str(month - 12)
        apikey = config['APIKEY']
        CSV_URL = 'https://www.alphavantage.co/query?function=TIME_SERIES_INTRADAY_EXTENDED&' \
                  'symbol={symbol}&interval={interval}&slice={slice}&apikey={apikey}' \
                  .format(symbol=symbol, slice=slice, interval=interval, apikey=apikey)
        df = pd.read_csv(CSV_URL)
        df['symbol'] = symbol
        df['time'] = pd.to_datetime(df['time'], format='%Y-%m-%d %H:%M:%S')
        # Rename the API's column names to match the database schema
        df = df.rename(columns={'open': 'price_open',
                                'close': 'price_close',
                                'high': 'price_high',
                                'low': 'price_low',
                                'volume': 'trading_volume'})
        # Align the DataFrame column order with the insert order in `columns`
        df = df[list(columns)]
        return [row for row in df.itertuples(index=False, name=None)]


    def handler(event, context):
        symbols = get_symbols()
        for symbol in symbols:
            print("Fetching data for: ", symbol)
            # Fetch only the most recent month; widen the range to backfill more history
            for month in range(1, 2):
                stock_data = fetch_stock_data(symbol, month)
                print('Inserting data...')
                # pgcopy uses PostgreSQL COPY for fast bulk inserts
                mgr = CopyManager(conn, 'stocks_intraday', columns)
                mgr.copy(stock_data)
        conn.commit()
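
The function assumes that a stocks_intraday table already exists in your database as a hypertable. If you haven't created it yet, the following is a minimal sketch: the column names come from the columns tuple in function.py, but the column types shown here are assumptions, so adjust them to fit your data.

    # create_table.py: a one-off setup helper (hypothetical, not part of the Lambda package)
    import os
    import psycopg2

    conn = psycopg2.connect(database=os.environ['DB_NAME'],
                            host=os.environ['DB_HOST'],
                            user=os.environ['DB_USER'],
                            password=os.environ['DB_PASS'],
                            port=os.environ['DB_PORT'])

    with conn.cursor() as cur:
        # Column names mirror function.py; the types are assumptions
        cur.execute("""
            CREATE TABLE IF NOT EXISTS stocks_intraday (
                time TIMESTAMPTZ NOT NULL,
                price_open NUMERIC,
                price_close NUMERIC,
                price_low NUMERIC,
                price_high NUMERIC,
                trading_volume BIGINT,
                symbol TEXT
            );
        """)
        # Turn the table into a TimescaleDB hypertable, partitioned on the time column
        cur.execute("SELECT create_hypertable('stocks_intraday', 'time', if_not_exists => TRUE);")
    conn.commit()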

Add a requirements file

When you have created the ETL function, you need to list the libraries it depends on so they can be installed. You can do this by creating a text file in your project called requirements.txt that lists the libraries. This is the requirements.txt file used in this tutorial:

    pandas
    requests
    psycopg2-binary
    pgcopy
note

This example uses psycopg2-binary instead of psycopg2 in the requirements.txt file. The binary version of the library contains all its dependencies, so that you don’t need to install them separately.
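
Before you build the container, you can smoke-test the handler locally. This is a sketch, assuming you have installed the requirements, exported the same environment variables the function reads, and placed a symbols.csv file next to function.py:

    # local_test.py: a hypothetical helper for testing outside Lambda
    from function import handler

    # Call the handler once with an empty event, as the Lambda runtime would
    handler({}, None)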

Create the Dockerfile

When you have the requirements set up, you can create the Dockerfile for the project.

Creating the Dockerfile

  1. Use an AWS Lambda base image:

      FROM public.ecr.aws/lambda/python:3.8
  2. Copy all the project files, including the symbols.csv file that the function reads, to the root directory:

      COPY function.py .
      COPY requirements.txt .
      COPY symbols.csv .
  3. Install the libraries using the requirements file, and set the handler that Lambda invokes:

      RUN pip install -r requirements.txt
      CMD ["function.handler"]
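
The AWS Lambda base images include the Runtime Interface Emulator, so you can invoke the handler in a locally running container before deploying. This is a sketch, assuming you started the container with docker run -p 9000:8080 lambda-image:

    # invoke_local.py: a hypothetical helper that calls the Runtime Interface Emulator
    import requests

    # The emulator exposes the function at this fixed path on the mapped port
    url = 'http://localhost:9000/2015-03-31/functions/function/invocations'
    response = requests.post(url, json={})
    print(response.status_code, response.text)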

Upload the image to ECR

To connect the container image to a Lambda function, you need to upload it to the Amazon Elastic Container Registry (ECR).

Uploading the image to ECR

  1. Log in to the Docker command line interface:

      aws ecr get-login-password --region us-east-1 \
      | docker login --username AWS \
      --password-stdin <AWS_ACCOUNT_ID>.dkr.ecr.us-east-1.amazonaws.com
  2. Build the image:

      docker build -t lambda-image .
  3. Create a repository in ECR. In this example, the repository is called lambda-image:

      aws ecr create-repository --repository-name lambda-image
  4. Tag your image using the same name as the repository:

      docker tag lambda-image:latest <AWS_ACCOUNT_ID>.dkr.ecr.us-east-1.amazonaws.com/lambda-image:latest
  5. Deploy the image to Amazon ECR with Docker:

      docker push <AWS_ACCOUNT_ID>.dkr.ecr.us-east-1.amazonaws.com/lambda-image:latest
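
The next step needs the full image URI you just pushed. If you don't have it handy, a sketch like this looks it up with boto3; the repository name lambda-image and the latest tag match the earlier steps:

    # get_image_uri.py: a hypothetical helper to look up the pushed image URI
    import boto3

    ecr = boto3.client('ecr', region_name='us-east-1')
    repo = ecr.describe_repositories(repositoryNames=['lambda-image'])['repositories'][0]
    # Append the tag used when pushing the image
    print(repo['repositoryUri'] + ':latest')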

Create a Lambda function from the container

To create a Lambda function from your container, you can use the Lambda create-function command. You need to set the --package-type parameter to Image, and add the ECR image URI using the --code flag:

    aws lambda create-function --region us-east-1 \
    --function-name docker_function --package-type Image \
    --code ImageUri=<ECR Image URI> --role <ARN_LAMBDA_ROLE>
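
Once the function is created, you can invoke it once to verify that the container starts and the handler runs. This is a sketch using boto3; the function name docker_function matches the create-function command above:

    # invoke_once.py: a hypothetical verification script
    import boto3

    client = boto3.client('lambda', region_name='us-east-1')
    # Synchronously invoke the function with an empty event
    response = client.invoke(FunctionName='docker_function',
                             InvocationType='RequestResponse',
                             Payload=b'{}')
    print(response['StatusCode'])
    print(response['Payload'].read().decode())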

Schedule the Lambda function

If you want to run your Lambda function according to a schedule, you can set up an EventBridge trigger. This creates a rule using a cron expression.

Scheduling the Lambda function

  1. Create the schedule. In this example, the function runs every day at 9 AM UTC:

      aws events put-rule --name schedule-lambda --schedule-expression 'cron(0 9 * * ? *)'
  2. Grant the necessary permissions for the Lambda function:

      aws lambda add-permission --function-name <FUNCTION_NAME> \
      --statement-id my-scheduled-event --action 'lambda:InvokeFunction' \
      --principal events.amazonaws.com
  3. Connect the function to the EventBridge rule by creating a targets.json file containing a memorable, unique ID string and the ARN of the Lambda function:

      [
        {
          "Id": "docker_lambda_trigger",
          "Arn": "<ARN_LAMBDA_FUNCTION>"
        }
      ]
  4. Add the Lambda function, referred to in this command as the target, to the rule:

      aws events put-targets --rule schedule-lambda --targets file://targets.json
important

If you get an error saying Parameter ScheduleExpression is not valid, you might have made a mistake in the cron expression. Check the cron expression examples documentation.
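
If you prefer to script the scheduling steps instead of running the CLI commands one by one, this sketch does the same with boto3. The rule name, cron expression, statement ID, and target ID match the steps above; <ARN_LAMBDA_FUNCTION> is a placeholder you need to fill in:

    # schedule_lambda.py: a hypothetical script mirroring the CLI steps above
    import boto3

    events = boto3.client('events', region_name='us-east-1')
    lambda_client = boto3.client('lambda', region_name='us-east-1')

    function_arn = '<ARN_LAMBDA_FUNCTION>'  # placeholder: your Lambda function's ARN

    # Step 1: create the schedule rule (9 AM UTC every day)
    events.put_rule(Name='schedule-lambda',
                    ScheduleExpression='cron(0 9 * * ? *)')

    # Step 2: allow EventBridge to invoke the function
    lambda_client.add_permission(FunctionName='docker_function',
                                 StatementId='my-scheduled-event',
                                 Action='lambda:InvokeFunction',
                                 Principal='events.amazonaws.com')

    # Steps 3 and 4: attach the function as the rule's target
    events.put_targets(Rule='schedule-lambda',
                       Targets=[{'Id': 'docker_lambda_trigger', 'Arn': function_arn}])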

You can check that the rule is connected correctly to the Lambda function in the AWS console. Navigate to Amazon EventBridge → Events → Rules, and click the rule you created. The Lambda function's name is listed under Target(s):

Lambda function target in AWS Console