Translating Snowflake warehouse usage to BigQuery


More of a general question here, but is there currently a way to estimate how workloads running in my Snowflake warehouses would translate to BigQuery if I switched platforms? I know that Snowflake uses credit consumption to monitor activity and utilization and that BigQuery operates on slots, but there's no direct 1-to-1 translation between the two. For example: if I use an average of 150 credits daily in Snowflake, what would this look like in BigQuery?


There is 1 best solution below

Answer by Mel

Indeed, there is no direct translation in terms of workload since, as mentioned by @NickW, the two platforms have totally different architectures. However, if you are comparing Snowflake's credits to BigQuery's slots to get cost estimates, it is doable as long as you have the data for each of their pricing components, like what is shown in this article.

Example:

Let’s say you have a data warehouse that stores 1 TB of data and you run 100 queries per day that process 10 GB of data each.

With Snowflake, your monthly costs would be:

  • Storage: $40 per TB per month

  • Compute: $2.50 per credit × 1 credit per hour (assuming an X-Small warehouse running continuously) × 24 hours per day × 30 days per month = $1,800 per month

Total cost: $1840 per month

With BigQuery, your monthly costs would be:

  • Storage: $20 per TB per month

  • Query data processed: $5 per TB scanned × 0.01 TB (10 GB) per query × 100 queries per day × 30 days per month = $150 per month

  • Streaming inserts: $0

Total cost: $170 per month
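The arithmetic above can be sketched as a small script so you can plug in your own workload numbers. This is only a rough model using the list prices from the example; the function names, the assumed 1-credit-per-hour warehouse rate, and the decimal GB-to-TB conversion are all assumptions, not an official calculator from either vendor.

```python
# Rough monthly cost model using the example's list prices.
# All rates are assumptions; real pricing varies by region, edition, and plan.

GB_PER_TB = 1000  # both platforms bill storage/scans in decimal units

def snowflake_monthly_cost(storage_tb, credits_per_hour,
                           price_per_credit=2.50, storage_price_per_tb=40.0,
                           hours_per_day=24, days=30):
    """Storage plus warehouse compute (credits consumed while running)."""
    storage = storage_tb * storage_price_per_tb
    compute = credits_per_hour * price_per_credit * hours_per_day * days
    return storage + compute

def bigquery_monthly_cost(storage_tb, gb_per_query, queries_per_day,
                          price_per_tb_scanned=5.0, storage_price_per_tb=20.0,
                          days=30):
    """Storage plus on-demand query cost (billed per TB scanned)."""
    storage = storage_tb * storage_price_per_tb
    scanned_tb = gb_per_query / GB_PER_TB * queries_per_day * days
    return storage + scanned_tb * price_per_tb_scanned

# The example workload: 1 TB stored, 100 queries/day scanning 10 GB each.
print(snowflake_monthly_cost(1, credits_per_hour=1))  # 40 + 1800 = 1840.0
print(bigquery_monthly_cost(1, 10, 100))              # 20 + 150  = 170.0
```

Adjusting `credits_per_hour` (larger Snowflake warehouses consume more credits per hour) or `gb_per_query` lets you see how quickly either component dominates the bill.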

As this example shows, BigQuery tends to be more cost-effective for workloads that scan modest amounts of data per query, since you pay only for bytes processed. Snowflake can be more cost-effective for workloads that keep a warehouse busy enough to justify its hourly credit rate, or that need to scale compute up and down quickly.

It is important to note that these are just examples and your actual costs may vary depending on your specific workload. I recommend that you use the Snowflake and BigQuery pricing calculators to estimate your costs for your specific workload.