This article describes how to implement a data warehouse by subscribing to business events from Quavo and using a combination of AWS tools and Snowflake to consume those events. Please note that this is only one approach; a different approach may be more appropriate for your organization.
...
It is recommended that an Authentication Profile be used; however, basic API Key Authentication is also supported.
Events
Quavo will configure QFD so it knows which events should be tracked and published.
Event Stream
QFD automatically processes event queues and sends events to the configured endpoints based on the defined settings.
...
This creates Business Event instances.
Requeuing Business Events
This utility provides methods to requeue events by TransmissionStatus, Start/End DateTime window, or specific EventId:
...
Snowflake Stage
```sql
create or replace stage QFD.DEV.ACCOUNTING
  storage_integration = "quavo_snowflake_qfd_dev_s3"
  url = 's3://quavo-snowflake-qfd-dev/load/Accounting/'
  file_format = QFD.DEV.JSON_FILE_FORMAT;
```
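If event files have already landed under the `load/Accounting/` prefix, you can confirm that the stage resolves the storage integration and bucket correctly by listing its contents:

```sql
-- List the files visible through the stage; an empty result simply means
-- no event files have been delivered to the prefix yet.
list @QFD.DEV.ACCOUNTING;
```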
...
```sql
create or replace TABLE QFD.DEV.ACCOUNTING_STG (
  EVENTPAYLOAD VARIANT
);
```
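Each row of the staging table holds one raw event payload in the `EVENTPAYLOAD` VARIANT column. As a quick sanity check you can query individual attributes with Snowflake's path notation; the field names below (`EventId`, `EventDateTime`) are illustrative and should be replaced with the attribute names used in your published events:

```sql
-- Inspect recently loaded events; the JSON attribute names here are assumptions
-- and should match the structure of your published event payloads.
select
    EVENTPAYLOAD:EventId::string              as EVENT_ID,
    EVENTPAYLOAD:EventDateTime::timestamp_ntz as EVENT_DATETIME,
    EVENTPAYLOAD                              as RAW_PAYLOAD
from QFD.DEV.ACCOUNTING_STG
limit 10;
```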
Stream on Staging Table
The stream monitors for appended rows in the staging table. This is used in the merge task.
```sql
create or replace stream QFD.DEV.ACCOUNTING_STREAM_APPEND_ONLY
  on table ACCOUNTING_STG
  append_only = true;
```
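A downstream task typically checks whether the stream has unconsumed rows before running the merge; `SYSTEM$STREAM_HAS_DATA` can be used for that check (the merge task itself is defined separately):

```sql
-- Returns TRUE when the stream contains rows that have not yet been consumed,
-- commonly used in a task's WHEN clause to skip runs with nothing to merge.
select system$stream_has_data('QFD.DEV.ACCOUNTING_STREAM_APPEND_ONLY');
```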
...