This article describes how to implement a data warehouse by subscribing to business events from Quavo and using a combination of AWS tools and Snowflake to consume those events. Note that this is only one approach; a different approach may be more appropriate for your organization.

...

It is recommended that an Authentication Profile be used; however, basic API Key Authentication is also supported.

Events

Quavo will configure QFD so it knows which events should be tracked and published.

Event Stream

QFD will automatically process event queues and send events to the defined endpoints based on the configured settings.

...

This creates Business Event instances.

Requeuing Business Events

This utility provides a method to requeue events by TransmissionStatus, by a Start/End DateTime window, or by a specific EventId:

...

Snowflake ingests each S3 file as it is notified via the SQS queue. For each event type, a stage, Snowpipe, staging table, structured final table, merge task, and staging table cleanup task should be set up. A storage integration (quavo_snowflake_qfd_{dev|stg|prod}_s3) and a file format (QFD.{DEV|STG|PROD}.JSON_FILE_FORMAT) are also used, but these already exist, are shared across all event types, and do not need to be recreated.
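For reference, the shared objects look roughly like the sketch below. This is illustrative only: the IAM role ARN, allowed location, and file format options are assumptions, and since the actual objects already exist they should not be recreated.

Code Block
language: sql
-- Illustrative sketch only: these shared objects already exist and are reused for every event type.
-- The role ARN, allowed location, and STRIP_OUTER_ARRAY setting are placeholder assumptions.
create storage integration "quavo_snowflake_qfd_dev_s3"
    type = external_stage
    storage_provider = 'S3'
    enabled = true
    storage_aws_role_arn = 'arn:aws:iam::<account-id>:role/<snowflake-access-role>'
    storage_allowed_locations = ('s3://quavo-snowflake-qfd-dev/');

create file format QFD.DEV.JSON_FILE_FORMAT
    type = json
    strip_outer_array = true;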

To start, you need to be in the "DATAARCHITECT" role for the schema you are working in. For this example, we are using QFDDEVDATAARCHITECT. You should develop and test initially in the DEV schema before making the same changes in the STG and PROD schemas.
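In a Snowflake worksheet, that typically means switching to the role and setting the working schema first (the role and schema names below are the DEV example from above):

Code Block
language: sql
-- Switch to the DEV data architect role and set the working schema for the statements that follow.
use role QFDDEVDATAARCHITECT;
use schema QFD.DEV;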

Snowflake Stage

Code Block
language: sql
create or replace stage QFD.DEV.ACCOUNTING
    storage_integration = "quavo_snowflake_qfd_dev_s3"
    url = 's3://quavo-snowflake-qfd-dev/load/Accounting/'
    file_format = QFD.DEV.JSON_FILE_FORMAT;
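
To confirm the stage was created and resolves its S3 location, you can list the contents of the prefix it points to (a standard Snowflake command, shown here for the DEV Accounting stage):

Code Block
language: sql
-- Lists the files currently visible under the stage's S3 prefix.
list @QFD.DEV.ACCOUNTING;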

...