transfer_configs
Creates, updates, deletes, gets, or lists a transfer_configs resource.
Overview
Property | Value |
---|---|
Name | transfer_configs |
Type | Resource |
Id | google.bigquerydatatransfer.transfer_configs |
Fields
The following fields are returned by SELECT queries:
- projects_locations_transfer_configs_get
- projects_transfer_configs_get
- projects_locations_transfer_configs_list
- projects_transfer_configs_list
Successful response
Name | Datatype | Description |
---|---|---|
name | string | Identifier. The resource name of the transfer config. Transfer config names have the form either `projects/{project_id}/locations/{region}/transferConfigs/{config_id}` or `projects/{project_id}/transferConfigs/{config_id}`, where `config_id` is usually a UUID, even though it is not guaranteed or required. The name is ignored when creating a transfer config. |
dataRefreshWindowDays | integer (int32) | The number of days to look back to automatically refresh the data. For example, if `data_refresh_window_days = 10`, then every day BigQuery reingests data for [today-10, today-1], rather than ingesting data for just [today-1]. Only valid if the data source supports the feature. Set the value to 0 to use the default value. |
dataSourceId | string | Data source ID. This cannot be changed once the data transfer is created. The full list of available data source IDs can be returned through an API call: https://cloud.google.com/bigquery-transfer/docs/reference/datatransfer/rest/v1/projects.locations.dataSources/list |
datasetRegion | string | Output only. Region in which the BigQuery dataset is located. |
destinationDatasetId | string | The BigQuery target dataset ID. |
disabled | boolean | Whether this config is disabled. When set to true, no runs will be scheduled for this transfer config. |
displayName | string | User-specified display name for the data transfer. |
emailPreferences | object | Email notifications will be sent according to these preferences to the email address of the user who owns this transfer config. (id: EmailPreferences) |
encryptionConfiguration | object | The encryption configuration part. Currently, it is only used for the optional KMS key name. The BigQuery service account of your project must be granted permissions to use the key. Read methods will return the key name applied in effect. Write methods will apply the key if it is present, or otherwise try to apply project default keys if it is absent. (id: EncryptionConfiguration) |
error | object | Output only. Error code with detailed information about the reason for the latest config failure. (id: Status) |
nextRunTime | string (google-datetime) | Output only. Next time when the data transfer will run. |
notificationPubsubTopic | string | Pub/Sub topic where notifications will be sent after transfer runs associated with this transfer config finish. The format for specifying a Pub/Sub topic is `projects/{project_id}/topics/{topic_id}`. |
ownerInfo | object | Output only. Information about the user whose credentials are used to transfer data. Populated only for transferConfigs.get requests. If the user information is not available, this field will not be populated. (id: UserInfo) |
params | object | Parameters specific to each data source. For more information, see the `bq` tab in the 'Setting up a data transfer' section for each data source. For example, the parameters for Cloud Storage transfers are listed here: https://cloud.google.com/bigquery-transfer/docs/cloud-storage-transfer#bq |
schedule | string | Data transfer schedule. If the data source does not support a custom schedule, this should be empty. If it is empty, the default value for the data source will be used. The specified times are in UTC. Examples of valid format: `1st,3rd monday of month 15:30`, `every wed,fri of jan,jun 13:15`, and `first sunday of quarter 00:00`. See more explanation about the format here: https://cloud.google.com/appengine/docs/flexible/python/scheduling-jobs-with-cron-yaml#the_schedule_format NOTE: The minimum interval time between recurring transfers depends on the data source; refer to the documentation for your data source. |
scheduleOptions | object | Options customizing the data transfer schedule. (id: ScheduleOptions) |
scheduleOptionsV2 | object | Options customizing different types of data transfer schedule. This field replaces the "schedule" and "schedule_options" fields. ScheduleOptionsV2 cannot be used together with ScheduleOptions/Schedule. (id: ScheduleOptionsV2) |
state | string | Output only. State of the most recently updated transfer run. |
updateTime | string (google-datetime) | Output only. Data transfer modification time. Ignored by the server on input. |
userId | string (int64) | Deprecated. Unique ID of the user on whose behalf the transfer is done. |
Methods
The following methods are available for this resource:
Name | Accessible by | Required Params | Optional Params | Description |
---|---|---|---|---|
projects_locations_transfer_configs_get | select | projectsId, locationsId, transferConfigsId | | Returns information about a data transfer config. |
projects_transfer_configs_get | select | projectsId, transferConfigsId | | Returns information about a data transfer config. |
projects_locations_transfer_configs_list | select | projectsId, locationsId | dataSourceIds, pageToken, pageSize | Returns information about all transfer configs owned by a project in the specified location. |
projects_transfer_configs_list | select | projectsId | dataSourceIds, pageToken, pageSize | Returns information about all transfer configs owned by a project in the specified location. |
projects_locations_transfer_configs_create | insert | projectsId, locationsId | authorizationCode, versionInfo, serviceAccountName | Creates a new data transfer configuration. |
projects_transfer_configs_create | insert | projectsId | authorizationCode, versionInfo, serviceAccountName | Creates a new data transfer configuration. |
projects_locations_transfer_configs_patch | update | projectsId, locationsId, transferConfigsId | authorizationCode, updateMask, versionInfo, serviceAccountName | Updates a data transfer configuration. All fields must be set, even if they are not updated. |
projects_transfer_configs_patch | update | projectsId, transferConfigsId | authorizationCode, updateMask, versionInfo, serviceAccountName | Updates a data transfer configuration. All fields must be set, even if they are not updated. |
projects_locations_transfer_configs_delete | delete | projectsId, locationsId, transferConfigsId | | Deletes a data transfer configuration, including any associated transfer runs and logs. |
projects_transfer_configs_delete | delete | projectsId, transferConfigsId | | Deletes a data transfer configuration, including any associated transfer runs and logs. |
projects_transfer_configs_schedule_runs | exec | projectsId, transferConfigsId | | Creates transfer runs for a time range [start_time, end_time]. For each date, or whatever granularity the data source supports, in the range, one transfer run is created. Note that runs are created per UTC time in the time range. DEPRECATED: use StartManualTransferRuns instead. |
projects_transfer_configs_start_manual_runs | exec | projectsId, transferConfigsId | | Manually initiates transfer runs. You can schedule these runs in two ways: 1. For a specific point in time using the 'requested_run_time' parameter. 2. For a period between 'start_time' (inclusive) and 'end_time' (exclusive). If scheduling a single run, it is set to execute immediately (schedule_time equals the current time). When scheduling multiple runs within a time range, the first run starts now, and subsequent runs are delayed by 15 seconds each. |
projects_locations_transfer_configs_schedule_runs | exec | projectsId, locationsId, transferConfigsId | | Creates transfer runs for a time range [start_time, end_time]. For each date, or whatever granularity the data source supports, in the range, one transfer run is created. Note that runs are created per UTC time in the time range. DEPRECATED: use StartManualTransferRuns instead. |
projects_locations_transfer_configs_start_manual_runs | exec | projectsId, locationsId, transferConfigsId | | Manually initiates transfer runs. You can schedule these runs in two ways: 1. For a specific point in time using the 'requested_run_time' parameter. 2. For a period between 'start_time' (inclusive) and 'end_time' (exclusive). If scheduling a single run, it is set to execute immediately (schedule_time equals the current time). When scheduling multiple runs within a time range, the first run starts now, and subsequent runs are delayed by 15 seconds each. |
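The exec methods above are invoked with an EXEC statement rather than SELECT. The following is a minimal sketch for projects_transfer_configs_start_manual_runs, assuming the standard @param and @@json request-body conventions; requestedRunTime here is an assumed request-body field (one of the two scheduling options described in the method's description):
EXEC google.bigquerydatatransfer.transfer_configs.projects_transfer_configs_start_manual_runs
@projectsId = '{{ projectsId }}',
@transferConfigsId = '{{ transferConfigsId }}'
@@json = '{ "requestedRunTime": "{{ requestedRunTime }}" }';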
Parameters
Parameters can be passed in the WHERE clause of a query. Check the Methods section to see which parameters are required or optional for each operation.
Name | Datatype | Description |
---|---|---|
locationsId | string | |
projectsId | string | |
transferConfigsId | string | |
authorizationCode | string | |
dataSourceIds | string | |
pageSize | integer (int32) | |
pageToken | string | |
serviceAccountName | string | |
updateMask | string (google-fieldmask) | |
versionInfo | string |
SELECT examples
- projects_locations_transfer_configs_get
- projects_transfer_configs_get
- projects_locations_transfer_configs_list
- projects_transfer_configs_list
Returns information about a data transfer config.
SELECT
name,
dataRefreshWindowDays,
dataSourceId,
datasetRegion,
destinationDatasetId,
disabled,
displayName,
emailPreferences,
encryptionConfiguration,
error,
nextRunTime,
notificationPubsubTopic,
ownerInfo,
params,
schedule,
scheduleOptions,
scheduleOptionsV2,
state,
updateTime,
userId
FROM google.bigquerydatatransfer.transfer_configs
WHERE projectsId = '{{ projectsId }}' -- required
AND locationsId = '{{ locationsId }}' -- required
AND transferConfigsId = '{{ transferConfigsId }}'; -- required
Returns information about a data transfer config.
SELECT
name,
dataRefreshWindowDays,
dataSourceId,
datasetRegion,
destinationDatasetId,
disabled,
displayName,
emailPreferences,
encryptionConfiguration,
error,
nextRunTime,
notificationPubsubTopic,
ownerInfo,
params,
schedule,
scheduleOptions,
scheduleOptionsV2,
state,
updateTime,
userId
FROM google.bigquerydatatransfer.transfer_configs
WHERE projectsId = '{{ projectsId }}' -- required
AND transferConfigsId = '{{ transferConfigsId }}'; -- required
Returns information about all transfer configs owned by a project in the specified location.
SELECT
name,
dataRefreshWindowDays,
dataSourceId,
datasetRegion,
destinationDatasetId,
disabled,
displayName,
emailPreferences,
encryptionConfiguration,
error,
nextRunTime,
notificationPubsubTopic,
ownerInfo,
params,
schedule,
scheduleOptions,
scheduleOptionsV2,
state,
updateTime,
userId
FROM google.bigquerydatatransfer.transfer_configs
WHERE projectsId = '{{ projectsId }}' -- required
AND locationsId = '{{ locationsId }}' -- required
AND dataSourceIds = '{{ dataSourceIds }}'
AND pageToken = '{{ pageToken }}'
AND pageSize = '{{ pageSize }}';
Returns information about all transfer configs owned by a project in the specified location.
SELECT
name,
dataRefreshWindowDays,
dataSourceId,
datasetRegion,
destinationDatasetId,
disabled,
displayName,
emailPreferences,
encryptionConfiguration,
error,
nextRunTime,
notificationPubsubTopic,
ownerInfo,
params,
schedule,
scheduleOptions,
scheduleOptionsV2,
state,
updateTime,
userId
FROM google.bigquerydatatransfer.transfer_configs
WHERE projectsId = '{{ projectsId }}' -- required
AND dataSourceIds = '{{ dataSourceIds }}'
AND pageToken = '{{ pageToken }}'
AND pageSize = '{{ pageSize }}';
INSERT examples
- projects_locations_transfer_configs_create
- projects_transfer_configs_create
- Manifest
Creates a new data transfer configuration.
INSERT INTO google.bigquerydatatransfer.transfer_configs (
data__name,
data__destinationDatasetId,
data__displayName,
data__dataSourceId,
data__params,
data__schedule,
data__scheduleOptions,
data__scheduleOptionsV2,
data__dataRefreshWindowDays,
data__disabled,
data__userId,
data__notificationPubsubTopic,
data__emailPreferences,
data__encryptionConfiguration,
projectsId,
locationsId,
authorizationCode,
versionInfo,
serviceAccountName
)
SELECT
'{{ name }}',
'{{ destinationDatasetId }}',
'{{ displayName }}',
'{{ dataSourceId }}',
'{{ params }}',
'{{ schedule }}',
'{{ scheduleOptions }}',
'{{ scheduleOptionsV2 }}',
{{ dataRefreshWindowDays }},
{{ disabled }},
'{{ userId }}',
'{{ notificationPubsubTopic }}',
'{{ emailPreferences }}',
'{{ encryptionConfiguration }}',
'{{ projectsId }}',
'{{ locationsId }}',
'{{ authorizationCode }}',
'{{ versionInfo }}',
'{{ serviceAccountName }}'
RETURNING
name,
dataRefreshWindowDays,
dataSourceId,
datasetRegion,
destinationDatasetId,
disabled,
displayName,
emailPreferences,
encryptionConfiguration,
error,
nextRunTime,
notificationPubsubTopic,
ownerInfo,
params,
schedule,
scheduleOptions,
scheduleOptionsV2,
state,
updateTime,
userId
;
Creates a new data transfer configuration.
INSERT INTO google.bigquerydatatransfer.transfer_configs (
data__name,
data__destinationDatasetId,
data__displayName,
data__dataSourceId,
data__params,
data__schedule,
data__scheduleOptions,
data__scheduleOptionsV2,
data__dataRefreshWindowDays,
data__disabled,
data__userId,
data__notificationPubsubTopic,
data__emailPreferences,
data__encryptionConfiguration,
projectsId,
authorizationCode,
versionInfo,
serviceAccountName
)
SELECT
'{{ name }}',
'{{ destinationDatasetId }}',
'{{ displayName }}',
'{{ dataSourceId }}',
'{{ params }}',
'{{ schedule }}',
'{{ scheduleOptions }}',
'{{ scheduleOptionsV2 }}',
{{ dataRefreshWindowDays }},
{{ disabled }},
'{{ userId }}',
'{{ notificationPubsubTopic }}',
'{{ emailPreferences }}',
'{{ encryptionConfiguration }}',
'{{ projectsId }}',
'{{ authorizationCode }}',
'{{ versionInfo }}',
'{{ serviceAccountName }}'
RETURNING
name,
dataRefreshWindowDays,
dataSourceId,
datasetRegion,
destinationDatasetId,
disabled,
displayName,
emailPreferences,
encryptionConfiguration,
error,
nextRunTime,
notificationPubsubTopic,
ownerInfo,
params,
schedule,
scheduleOptions,
scheduleOptionsV2,
state,
updateTime,
userId
;
# Description fields are for documentation purposes
- name: transfer_configs
props:
- name: projectsId
value: string
description: Required parameter for the transfer_configs resource.
- name: locationsId
value: string
description: Required parameter for the transfer_configs resource.
- name: name
value: string
description: >
Identifier. The resource name of the transfer config. Transfer config names have the form either `projects/{project_id}/locations/{region}/transferConfigs/{config_id}` or `projects/{project_id}/transferConfigs/{config_id}`, where `config_id` is usually a UUID, even though it is not guaranteed or required. The name is ignored when creating a transfer config.
- name: destinationDatasetId
value: string
description: >
The BigQuery target dataset id.
- name: displayName
value: string
description: >
User specified display name for the data transfer.
- name: dataSourceId
value: string
description: >
Data source ID. This cannot be changed once data transfer is created. The full list of available data source IDs can be returned through an API call: https://cloud.google.com/bigquery-transfer/docs/reference/datatransfer/rest/v1/projects.locations.dataSources/list
- name: params
value: object
description: >
Parameters specific to each data source. For more information see the bq tab in the 'Setting up a data transfer' section for each data source. For example the parameters for Cloud Storage transfers are listed here: https://cloud.google.com/bigquery-transfer/docs/cloud-storage-transfer#bq
- name: schedule
value: string
description: >
Data transfer schedule. If the data source does not support a custom schedule, this should be empty. If it is empty, the default value for the data source will be used. The specified times are in UTC. Examples of valid format: `1st,3rd monday of month 15:30`, `every wed,fri of jan,jun 13:15`, and `first sunday of quarter 00:00`. See more explanation about the format here: https://cloud.google.com/appengine/docs/flexible/python/scheduling-jobs-with-cron-yaml#the_schedule_format NOTE: The minimum interval time between recurring transfers depends on the data source; refer to the documentation for your data source.
- name: scheduleOptions
value: object
description: >
Options customizing the data transfer schedule.
- name: scheduleOptionsV2
value: object
description: >
Options customizing different types of data transfer schedule. This field replaces "schedule" and "schedule_options" fields. ScheduleOptionsV2 cannot be used together with ScheduleOptions/Schedule.
- name: dataRefreshWindowDays
value: integer
description: >
The number of days to look back to automatically refresh the data. For example, if `data_refresh_window_days = 10`, then every day BigQuery reingests data for [today-10, today-1], rather than ingesting data for just [today-1]. Only valid if the data source supports the feature. Set the value to 0 to use the default value.
- name: disabled
value: boolean
description: >
Is this config disabled. When set to true, no runs will be scheduled for this transfer config.
- name: userId
value: string
description: >
Deprecated. Unique ID of the user on whose behalf transfer is done.
- name: notificationPubsubTopic
value: string
description: >
Pub/Sub topic where notifications will be sent after transfer runs associated with this transfer config finish. The format for specifying a pubsub topic is: `projects/{project_id}/topics/{topic_id}`
- name: emailPreferences
value: object
description: >
Email notifications will be sent according to these preferences to the email address of the user who owns this transfer config.
- name: encryptionConfiguration
value: object
description: >
The encryption configuration part. Currently, it is only used for the optional KMS key name. The BigQuery service account of your project must be granted permissions to use the key. Read methods will return the key name applied in effect. Write methods will apply the key if it is present, or otherwise try to apply project default keys if it is absent.
- name: authorizationCode
value: string
- name: versionInfo
value: string
- name: serviceAccountName
value: string
UPDATE examples
- projects_locations_transfer_configs_patch
- projects_transfer_configs_patch
Updates a data transfer configuration. All fields must be set, even if they are not updated.
UPDATE google.bigquerydatatransfer.transfer_configs
SET
data__name = '{{ name }}',
data__destinationDatasetId = '{{ destinationDatasetId }}',
data__displayName = '{{ displayName }}',
data__dataSourceId = '{{ dataSourceId }}',
data__params = '{{ params }}',
data__schedule = '{{ schedule }}',
data__scheduleOptions = '{{ scheduleOptions }}',
data__scheduleOptionsV2 = '{{ scheduleOptionsV2 }}',
data__dataRefreshWindowDays = {{ dataRefreshWindowDays }},
data__disabled = {{ disabled }},
data__userId = '{{ userId }}',
data__notificationPubsubTopic = '{{ notificationPubsubTopic }}',
data__emailPreferences = '{{ emailPreferences }}',
data__encryptionConfiguration = '{{ encryptionConfiguration }}'
WHERE
projectsId = '{{ projectsId }}' --required
AND locationsId = '{{ locationsId }}' --required
AND transferConfigsId = '{{ transferConfigsId }}' --required
AND authorizationCode = '{{ authorizationCode }}'
AND updateMask = '{{ updateMask }}'
AND versionInfo = '{{ versionInfo }}'
AND serviceAccountName = '{{ serviceAccountName }}'
RETURNING
name,
dataRefreshWindowDays,
dataSourceId,
datasetRegion,
destinationDatasetId,
disabled,
displayName,
emailPreferences,
encryptionConfiguration,
error,
nextRunTime,
notificationPubsubTopic,
ownerInfo,
params,
schedule,
scheduleOptions,
scheduleOptionsV2,
state,
updateTime,
userId;
Updates a data transfer configuration. All fields must be set, even if they are not updated.
UPDATE google.bigquerydatatransfer.transfer_configs
SET
data__name = '{{ name }}',
data__destinationDatasetId = '{{ destinationDatasetId }}',
data__displayName = '{{ displayName }}',
data__dataSourceId = '{{ dataSourceId }}',
data__params = '{{ params }}',
data__schedule = '{{ schedule }}',
data__scheduleOptions = '{{ scheduleOptions }}',
data__scheduleOptionsV2 = '{{ scheduleOptionsV2 }}',
data__dataRefreshWindowDays = {{ dataRefreshWindowDays }},
data__disabled = {{ disabled }},
data__userId = '{{ userId }}',
data__notificationPubsubTopic = '{{ notificationPubsubTopic }}',
data__emailPreferences = '{{ emailPreferences }}',
data__encryptionConfiguration = '{{ encryptionConfiguration }}'
WHERE
projectsId = '{{ projectsId }}' --required
AND transferConfigsId = '{{ transferConfigsId }}' --required
AND authorizationCode = '{{ authorizationCode }}'
AND updateMask = '{{ updateMask }}'
AND versionInfo = '{{ versionInfo }}'
AND serviceAccountName = '{{ serviceAccountName }}'
RETURNING
name,
dataRefreshWindowDays,
dataSourceId,
datasetRegion,
destinationDatasetId,
disabled,
displayName,
emailPreferences,
encryptionConfiguration,
error,
nextRunTime,
notificationPubsubTopic,
ownerInfo,
params,
schedule,
scheduleOptions,
scheduleOptionsV2,
state,
updateTime,
userId;
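The `updateMask` parameter above takes a standard Google API FieldMask: a comma-separated list of field paths identifying which fields the patch should touch. A minimal sketch, assuming you track the changed fields yourself (field names here are illustrative):

```python
def build_update_mask(changed_fields):
    """Join changed field paths into a FieldMask string suitable for
    the updateMask query parameter, e.g. 'displayName,schedule'."""
    return ",".join(changed_fields)

mask = build_update_mask(["displayName", "schedule"])
```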
DELETE
examples
- projects_locations_transfer_configs_delete
- projects_transfer_configs_delete
Deletes a data transfer configuration, including any associated transfer runs and logs.
DELETE FROM google.bigquerydatatransfer.transfer_configs
WHERE projectsId = '{{ projectsId }}' --required
AND locationsId = '{{ locationsId }}' --required
AND transferConfigsId = '{{ transferConfigsId }}' --required;
Deletes a data transfer configuration, including any associated transfer runs and logs.
DELETE FROM google.bigquerydatatransfer.transfer_configs
WHERE projectsId = '{{ projectsId }}' --required
AND transferConfigsId = '{{ transferConfigsId }}' --required;
Lifecycle Methods
- projects_transfer_configs_schedule_runs
- projects_transfer_configs_start_manual_runs
- projects_locations_transfer_configs_schedule_runs
- projects_locations_transfer_configs_start_manual_runs
Creates transfer runs for a time range [start_time, end_time]. For each date - or whatever granularity the data source supports - in the range, one transfer run is created. Note that runs are created per UTC time in the time range. DEPRECATED: use StartManualTransferRuns instead.
EXEC google.bigquerydatatransfer.transfer_configs.projects_transfer_configs_schedule_runs
@projectsId='{{ projectsId }}', --required
@transferConfigsId='{{ transferConfigsId }}' --required
@@json=
'{
"startTime": "{{ startTime }}",
"endTime": "{{ endTime }}"
}';
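The deprecated ScheduleTransferRuns behaviour described above (one run per UTC date in the range) can be sketched roughly in Python for a daily-granularity source. This is illustrative only; the actual granularity depends on the data source:

```python
from datetime import datetime, timedelta, timezone

def runs_for_range(start_time, end_time):
    """Enumerate one run timestamp per UTC date in [start_time, end_time]
    inclusive, mirroring the per-UTC-date run creation described above
    for a daily data source."""
    runs = []
    day = start_time.astimezone(timezone.utc).date()
    last = end_time.astimezone(timezone.utc).date()
    while day <= last:
        runs.append(datetime.combine(day, datetime.min.time(), tzinfo=timezone.utc))
        day += timedelta(days=1)
    return runs

# Three UTC dates in [2024-06-01, 2024-06-03] yield three runs.
runs = runs_for_range(
    datetime(2024, 6, 1, tzinfo=timezone.utc),
    datetime(2024, 6, 3, tzinfo=timezone.utc),
)
```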
Manually initiates transfer runs. You can schedule these runs in two ways: 1. For a specific point in time using the 'requested_run_time' parameter. 2. For a period between 'start_time' (inclusive) and 'end_time' (exclusive). If scheduling a single run, it is set to execute immediately (schedule_time equals the current time). When scheduling multiple runs within a time range, the first run starts now, and subsequent runs are delayed by 15 seconds each.
EXEC google.bigquerydatatransfer.transfer_configs.projects_transfer_configs_start_manual_runs
@projectsId='{{ projectsId }}', --required
@transferConfigsId='{{ transferConfigsId }}' --required
@@json=
'{
"requestedTimeRange": "{{ requestedTimeRange }}",
"requestedRunTime": "{{ requestedRunTime }}"
}';
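The scheduling behaviour described above, where the first run starts immediately and each subsequent run is delayed by a further 15 seconds, can be sketched as:

```python
from datetime import datetime, timedelta

def schedule_times(now, run_count):
    """First run at `now`, each subsequent run 15 seconds after the
    previous one, per the multi-run behaviour described above."""
    return [now + timedelta(seconds=15 * i) for i in range(run_count)]

# Three runs starting now: offsets of 0s, 15s, and 30s.
times = schedule_times(datetime(2024, 1, 1), 3)
```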
Creates transfer runs for a time range [start_time, end_time]. For each date - or whatever granularity the data source supports - in the range, one transfer run is created. Note that runs are created per UTC time in the time range. DEPRECATED: use StartManualTransferRuns instead.
EXEC google.bigquerydatatransfer.transfer_configs.projects_locations_transfer_configs_schedule_runs
@projectsId='{{ projectsId }}', --required
@locationsId='{{ locationsId }}', --required
@transferConfigsId='{{ transferConfigsId }}' --required
@@json=
'{
"startTime": "{{ startTime }}",
"endTime": "{{ endTime }}"
}';
Manually initiates transfer runs. You can schedule these runs in two ways: 1. For a specific point in time using the 'requested_run_time' parameter. 2. For a period between 'start_time' (inclusive) and 'end_time' (exclusive). If scheduling a single run, it is set to execute immediately (schedule_time equals the current time). When scheduling multiple runs within a time range, the first run starts now, and subsequent runs are delayed by 15 seconds each.
EXEC google.bigquerydatatransfer.transfer_configs.projects_locations_transfer_configs_start_manual_runs
@projectsId='{{ projectsId }}', --required
@locationsId='{{ locationsId }}', --required
@transferConfigsId='{{ transferConfigsId }}' --required
@@json=
'{
"requestedTimeRange": "{{ requestedTimeRange }}",
"requestedRunTime": "{{ requestedRunTime }}"
}';