Start to send data to Snowflake too (#20698)
This PR adds support for sending telemetry events to AWS Kinesis. In our AWS account we now have three new things:

* The [Kinesis data stream](https://us-east-1.console.aws.amazon.com/kinesis/home?region=us-east-1#/streams/details/zed-telemetry/monitoring) that we will actually write to.
* A [Firehose for Axiom](https://us-east-1.console.aws.amazon.com/firehose/home?region=us-east-1#/details/telemetry-to-axiom/monitoring) that sends events from that stream to Axiom for ad-hoc queries over recent data.
* A [Firehose for Snowflake](https://us-east-1.console.aws.amazon.com/firehose/home?region=us-east-1#/details/telemetry-to-snowflake/monitoring) that sends events from that stream to Snowflake for long-term retention. This Firehose also backs up data into an S3 bucket in case we want to change how the system works in the future.

In a follow-up PR, we'll add support for ad-hoc telemetry events and slowly move away from the current Clickhouse-defined schemas, though we won't move off Clickhouse until we have what we need in Snowflake.

Co-Authored-By: Nathan <nathan@zed.dev>

Release Notes:

- N/A
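For context, here is a minimal sketch of what writing one event to the new stream could look like, using the AWS SDK for Rust (`aws-sdk-kinesis`). The stream name `zed-telemetry` comes from the console link above; the helper name, partition key, and payload handling are illustrative assumptions, not code from this PR.

```rust
use aws_sdk_kinesis::{primitives::Blob, Client};

// Hypothetical helper (not part of this PR): push one already-serialized
// telemetry event onto the `zed-telemetry` Kinesis stream.
async fn send_telemetry_event(client: &Client, event_json: String) -> anyhow::Result<()> {
    client
        .put_record()
        .stream_name("zed-telemetry")
        // The partition key only controls shard assignment; any stable string works.
        .partition_key("telemetry")
        .data(Blob::new(event_json))
        .send()
        .await?;
    Ok(())
}
```

Both Firehoses fan out from that single stream, so the application itself only ever writes to Kinesis.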
This commit is contained in: parent f449e8d3d3 · commit 6ff69faf37
7 changed files with 305 additions and 28 deletions
@@ -512,6 +512,7 @@ impl TestServer {
             rate_limiter: Arc::new(RateLimiter::new(test_db.db().clone())),
             executor,
             clickhouse_client: None,
+            kinesis_client: None,
             config: Config {
                 http_port: 0,
                 database_url: "".into(),
@@ -550,6 +551,10 @@ impl TestServer {
                 stripe_api_key: None,
                 supermaven_admin_api_key: None,
                 user_backfiller_github_access_token: None,
+                kinesis_region: None,
+                kinesis_stream: None,
+                kinesis_access_key: None,
+                kinesis_secret_key: None,
             },
         })
     }
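The test server passes `None` for every Kinesis setting, so no client is constructed in tests. As a hedged sketch of the production side (assuming a recent `aws-sdk-kinesis`; only the four config field names are taken from the diff, everything else here is assumed), client construction might look roughly like this:

```rust
use aws_sdk_kinesis::{
    config::{BehaviorVersion, Credentials, Region},
    Client,
};

// Assumed shape of the relevant config fields from the diff above.
struct Config {
    kinesis_region: Option<String>,
    // `kinesis_stream` is used when sending records, not when building the client.
    kinesis_stream: Option<String>,
    kinesis_access_key: Option<String>,
    kinesis_secret_key: Option<String>,
}

// Hypothetical constructor: only build a client when the Kinesis settings are present.
fn build_kinesis_client(config: &Config) -> Option<Client> {
    let region = config.kinesis_region.clone()?;
    let access_key = config.kinesis_access_key.clone()?;
    let secret_key = config.kinesis_secret_key.clone()?;

    let sdk_config = aws_sdk_kinesis::Config::builder()
        .behavior_version(BehaviorVersion::latest())
        .region(Region::new(region))
        .credentials_provider(Credentials::new(access_key, secret_key, None, None, "collab-config"))
        .build();
    Some(Client::from_conf(sdk_config))
}
```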