Airflow BigQueryInsertJobOperator: how to create partitioned table?


I have a simple DAG in which I call BigQueryInsertJobOperator. Is there any way to make it create a partitioned table? This is my attempt:

article = BigQueryInsertJobOperator(
    task_id="article",
    configuration={
        "query": {
            "query": "article.sql",
            "useLegacySql": False,
            "createDisposition": "CREATE_IF_NEEDED",
            "writeDisposition": "WRITE_APPEND",
            "priority": "BATCH",
            "destinationTable": {
                "projectId": "{{ var.value.project_id }}",
                "datasetId": "{{ var.value.datasetId }}",
                "tableId": "partitioning_table",
                "partitioningType": "RANGE_BUCKET",
                "rangePartitioning": {
                    "field": "partition_id",
                    "generate_array": {
                        "start": "1",
                        "end": "10000",
                        "interval": "1",
                    },
                },
            },
        }
    },
    job_id="article_{{ dag_run.conf['id'] }}",
    cancel_on_kill=True,
    result_timeout=None,
    deferrable=True,
    params={"id": "{{ dag_run.conf['id'] }}"},
    trigger_rule="none_failed_or_skipped",
)

I want to partition the table by the integer column partition_id.
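One likely issue (a sketch, not a confirmed answer): in the BigQuery query-job schema, `rangePartitioning` is a top-level field of the `query` configuration, a sibling of `destinationTable`, not a field inside it; `destinationTable` accepts only `projectId`/`datasetId`/`tableId`, so `partitioningType` does not belong there. The range object is keyed `range` (not `generate_array`) and holds `start`/`end`/`interval`. A corrected `configuration` dict for the operator might look like this:

```python
# Sketch of a corrected configuration, assuming the standard BigQuery
# jobs.insert query-job schema. Jinja templates are kept as in the question.
configuration = {
    "query": {
        "query": "article.sql",
        "useLegacySql": False,
        "createDisposition": "CREATE_IF_NEEDED",
        "writeDisposition": "WRITE_APPEND",
        "priority": "BATCH",
        # destinationTable holds only the table reference.
        "destinationTable": {
            "projectId": "{{ var.value.project_id }}",
            "datasetId": "{{ var.value.datasetId }}",
            "tableId": "partitioning_table",
        },
        # Integer-range partitioning on partition_id: note the key is
        # "range", with start / end / interval, at the query level.
        "rangePartitioning": {
            "field": "partition_id",
            "range": {"start": "1", "end": "10000", "interval": "1"},
        },
    }
}
```

Note that range partitioning only takes effect when the job creates the table; with `WRITE_APPEND` against an existing unpartitioned table, the partition spec is ignored, so the table may need to be dropped or recreated first.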
