ETL Log Activity

This project extracts, transforms, and loads log data into BigQuery using dbt and Terraform.

Requirements

  • Google Cloud Platform (GCP) account
  • BigQuery
  • dbt
  • Terraform

Setup

  1. Create a GCP project and enable the BigQuery API
  2. Create a BigQuery dataset to store the log data
  3. Use Terraform to provision the required resources, such as a GCS bucket for storing log files
  4. Use dbt to create the necessary tables and transform the log data
  5. Run the ETL pipeline to load the log data into BigQuery
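The setup steps above can be sketched as a shell script. This is a minimal illustration, not part of the repository: the project ID, dataset name, and Terraform layout are placeholders, and the `run` wrapper only prints each command unless you set `DRY_RUN=0`.

```shell
#!/usr/bin/env bash
set -euo pipefail

# Placeholders -- substitute your own GCP project and dataset names.
PROJECT_ID="${PROJECT_ID:-my-gcp-project}"
DATASET="${DATASET:-log_activity}"

# Dry-run wrapper: prints the command by default; set DRY_RUN=0 to execute.
run() { if [ "${DRY_RUN:-1}" = "1" ]; then echo "+ $*"; else "$@"; fi; }

# 1. Enable the BigQuery API on the project.
run gcloud services enable bigquery.googleapis.com --project "$PROJECT_ID"

# 2. Create the dataset that will hold the log data.
run bq mk --dataset "${PROJECT_ID}:${DATASET}"

# 3. Provision the remaining resources (e.g. the GCS bucket) with Terraform.
run terraform init
run terraform apply
```

With `DRY_RUN` left at its default, the script only echoes the commands, which is a safe way to review the sequence before running it for real.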

Usage

  1. Upload log files to the designated GCS bucket
  2. Run the dbt commands to transform the data
  3. Run the ETL pipeline to load the data into BigQuery: `bash-etl.sh`
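The usage steps above could be wired together as follows. Again this is only a sketch: the bucket name and log file layout are placeholders, and the `run` wrapper prints each command unless `DRY_RUN=0`.

```shell
#!/usr/bin/env bash
set -euo pipefail

# Placeholder -- substitute the GCS bucket provisioned by Terraform.
BUCKET="${LOG_BUCKET:-my-log-bucket}"

# Dry-run wrapper: prints the command by default; set DRY_RUN=0 to execute.
run() { if [ "${DRY_RUN:-1}" = "1" ]; then echo "+ $*"; else "$@"; fi; }

# 1. Upload log files to the designated GCS bucket (path is illustrative).
run gsutil cp "logs/*.log" "gs://${BUCKET}/"

# 2. Run the dbt transformations.
run dbt run

# 3. Run the ETL pipeline to load the data into BigQuery.
run bash bash-etl.sh
```

Leaving `DRY_RUN` at its default lets you inspect the exact commands before executing them against your project.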

Note

This project is designed to be flexible and can be easily customized to fit your specific needs. If you have any questions or issues, please reach out to the project maintainers for assistance.
