Universal Analytics backup & store to BigQuery


I want to export data from GA [Universal Analytics] and store it in BigQuery. I'm new to GA, so I just want to understand how to start this process; please refer me to any resources.

My plan is to create a script that fetches data from GA and stores it in BigQuery.

So far I have just started learning GCP and created a new BigQuery dataset in GCP.



ali izadi On

This is possible using the Analytics Reporting API v4. You can find a complete explanation in this Medium article, or check the corresponding GitHub repository:

https://medium.com/@aliiz/export-from-universal-analytics-to-bigquery-zero-cost-full-control-6470092713b1

https://github.com/aliasoblomov/Universal-Analytics-to-BigQuery/blob/main/backfill-UA.py
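The approach in the linked script can be sketched as follows: query the Reporting API v4 with a service account, flatten the response rows, and stream them into a BigQuery table. This is a minimal sketch, not the linked repository's actual code; `VIEW_ID`, the key-file path, the chosen dimensions/metrics, and the `my_dataset.ua_backfill` table name are all placeholders you would replace with your own.

```python
VIEW_ID = "123456789"              # placeholder: your UA view ID
KEY_FILE = "service-account.json"  # placeholder: service-account key file


def report_to_rows(report):
    """Flatten one entry of a v4 batchGet response into a list of dicts,
    one per row, keyed by dimension/metric name."""
    header = report["columnHeader"]
    dims = header.get("dimensions", [])
    mets = [m["name"] for m in header["metricHeader"]["metricHeaderEntries"]]
    rows = []
    for r in report["data"].get("rows", []):
        row = dict(zip(dims, r.get("dimensions", [])))
        row.update(zip(mets, r["metrics"][0]["values"]))
        rows.append(row)
    return rows


def backfill(start_date, end_date):
    """Fetch one UA report and load it into BigQuery (requires the
    google-api-python-client and google-cloud-bigquery packages)."""
    from google.oauth2 import service_account
    from googleapiclient.discovery import build
    from google.cloud import bigquery

    creds = service_account.Credentials.from_service_account_file(
        KEY_FILE,
        scopes=["https://www.googleapis.com/auth/analytics.readonly"],
    )
    analytics = build("analyticsreporting", "v4", credentials=creds)
    resp = analytics.reports().batchGet(body={
        "reportRequests": [{
            "viewId": VIEW_ID,
            "dateRanges": [{"startDate": start_date, "endDate": end_date}],
            "dimensions": [{"name": "ga:date"}, {"name": "ga:sourceMedium"}],
            "metrics": [{"expression": "ga:sessions"},
                        {"expression": "ga:users"}],
            "pageSize": 10000,
        }]
    }).execute()

    rows = report_to_rows(resp["reports"][0])
    # Placeholder destination table; assumes it already exists with a
    # matching schema.
    bigquery.Client().insert_rows_json("my_dataset.ua_backfill", rows)
```

Note that `pageSize` is capped, so a real backfill loop also has to follow the `nextPageToken` in each response until it is absent.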

Fahed Sabellioglu On

This can already be done using the native export provided by GA. You can enable the connection by following the steps in this documentation. Export to the US region is straightforward, but export to the EU region requires more steps, as mentioned in the document. If it's the first time you are exporting data from GA, GA also provides you with a historical export.

As mentioned by ali izadi above, you can indeed fetch data from GA using the provided API. But keep in mind the issues below:

  • Depending on your GA subscription, the API quota limits differ, and they can throttle your data fetch process.
  • The Universal Analytics API uses sampling, and to get the complete data GA holds you need to request higher sampling levels. This leads to API issues from time to time.
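The sampling issue above is at least detectable: a v4 response includes sample counts whenever a report was sampled, and a common workaround is to shrink the requested date range (e.g. to single days) so each request stays under the sampling threshold. A small sketch of both ideas, assuming the standard v4 response shape; the day-by-day split is just one strategy, not the only one:

```python
from datetime import date, timedelta


def is_sampled(report):
    """A v4 report is sampled when its data section carries sample
    counts (samplesReadCounts / samplingSpaceSizes)."""
    data = report.get("data", {})
    return "samplesReadCounts" in data or "samplingSpaceSizes" in data


def daily_ranges(start_date, end_date):
    """Yield one {startDate, endDate} pair per day in the range;
    single-day requests are far less likely to hit sampling."""
    d = date.fromisoformat(start_date)
    last = date.fromisoformat(end_date)
    while d <= last:
        s = d.isoformat()
        yield {"startDate": s, "endDate": s}
        d += timedelta(days=1)
```

You would check `is_sampled` on every response and, where it returns true, re-issue the request over the smaller ranges from `daily_ranges` (or set `samplingLevel` to `LARGE` in the report request, at the cost of slower responses).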

Keeping in mind that UA will be deprecated soon, it may not be worth dealing with the API issues mentioned above, especially if the native export's schema answers your questions.