airbyte.DestinationBigquery
DestinationBigquery Resource
Example Usage
Example programs for TypeScript, Python, Go, and C# are coming soon; Java and YAML examples follow below.
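Until those land, here is an untested TypeScript sketch that mirrors the Java and YAML examples below; the placeholder values are the same ones used there, and the npm import path is an assumption (check Package Details).
import * as airbyte from "@pulumi/airbyte"; // assumed import path for this community provider

// Mirrors the YAML example below, using GCS staging as the loading method.
const myDestinationBigquery = new airbyte.DestinationBigquery("myDestinationBigquery", {
    configuration: {
        bigQueryClientBufferSizeMb: 15,
        credentialsJson: "...my_credentials_json...",
        datasetId: "...my_dataset_id...",
        datasetLocation: "US",
        disableTypeDedupe: true,
        loadingMethod: {
            gcsStaging: {
                credential: {
                    hmacKey: {
                        hmacKeyAccessId: "1234567890abcdefghij1234",
                        hmacKeySecret: "1234567890abcdefghij1234567890ABCDEFGHIJ",
                    },
                },
                gcsBucketName: "airbyte_sync",
                gcsBucketPath: "data_sync/test",
                keepFilesInGcsBucket: "Delete all tmp files from GCS",
            },
        },
        projectId: "...my_project_id...",
        rawDataDataset: "...my_raw_data_dataset...",
        transformationPriority: "interactive",
    },
    definitionId: "92c3eb2b-6d61-4610-adf2-eee065419ed9",
    workspaceId: "acee73dd-54d3-476f-a8ea-d39d218f52cd",
});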
package generated_program;
import com.pulumi.Context;
import com.pulumi.Pulumi;
import com.pulumi.core.Output;
import com.pulumi.airbyte.DestinationBigquery;
import com.pulumi.airbyte.DestinationBigqueryArgs;
import com.pulumi.airbyte.inputs.DestinationBigqueryConfigurationArgs;
import com.pulumi.airbyte.inputs.DestinationBigqueryConfigurationLoadingMethodArgs;
import com.pulumi.airbyte.inputs.DestinationBigqueryConfigurationLoadingMethodGcsStagingArgs;
import com.pulumi.airbyte.inputs.DestinationBigqueryConfigurationLoadingMethodGcsStagingCredentialArgs;
import com.pulumi.airbyte.inputs.DestinationBigqueryConfigurationLoadingMethodGcsStagingCredentialHmacKeyArgs;
import java.util.List;
import java.util.ArrayList;
import java.util.Map;
import java.io.File;
import java.nio.file.Files;
import java.nio.file.Paths;
public class App {
    public static void main(String[] args) {
        Pulumi.run(App::stack);
    }
    public static void stack(Context ctx) {
        var myDestinationBigquery = new DestinationBigquery("myDestinationBigquery", DestinationBigqueryArgs.builder()
            .configuration(DestinationBigqueryConfigurationArgs.builder()
                .bigQueryClientBufferSizeMb(15)
                .credentialsJson("...my_credentials_json...")
                .datasetId("...my_dataset_id...")
                .datasetLocation("US")
                .disableTypeDedupe(true)
                // GCS staging loading method, matching the YAML example below
                .loadingMethod(DestinationBigqueryConfigurationLoadingMethodArgs.builder()
                    .gcsStaging(DestinationBigqueryConfigurationLoadingMethodGcsStagingArgs.builder()
                        .credential(DestinationBigqueryConfigurationLoadingMethodGcsStagingCredentialArgs.builder()
                            .hmacKey(DestinationBigqueryConfigurationLoadingMethodGcsStagingCredentialHmacKeyArgs.builder()
                                .hmacKeyAccessId("1234567890abcdefghij1234")
                                .hmacKeySecret("1234567890abcdefghij1234567890ABCDEFGHIJ")
                                .build())
                            .build())
                        .gcsBucketName("airbyte_sync")
                        .gcsBucketPath("data_sync/test")
                        .keepFilesInGcsBucket("Delete all tmp files from GCS")
                        .build())
                    .build())
                .projectId("...my_project_id...")
                .rawDataDataset("...my_raw_data_dataset...")
                .transformationPriority("interactive")
                .build())
            .definitionId("92c3eb2b-6d61-4610-adf2-eee065419ed9")
            .workspaceId("acee73dd-54d3-476f-a8ea-d39d218f52cd")
            .build());
    }
}
resources:
  myDestinationBigquery:
    type: airbyte:DestinationBigquery
    properties:
      configuration:
        big_query_client_buffer_size_mb: 15
        credentials_json: '...my_credentials_json...'
        dataset_id: '...my_dataset_id...'
        dataset_location: US
        disable_type_dedupe: true
        loading_method:
          batchedStandardInserts: {}
          gcsStaging:
            credential:
              hmacKey:
                hmacKeyAccessId: 1234567890abcdefghij1234
                hmacKeySecret: 1234567890abcdefghij1234567890ABCDEFGHIJ
            gcsBucketName: airbyte_sync
            gcsBucketPath: data_sync/test
            keepFilesInGcsBucket: Delete all tmp files from GCS
        project_id: '...my_project_id...'
        raw_data_dataset: '...my_raw_data_dataset...'
        transformation_priority: interactive
      definitionId: 92c3eb2b-6d61-4610-adf2-eee065419ed9
      workspaceId: acee73dd-54d3-476f-a8ea-d39d218f52cd
Create DestinationBigquery Resource
Resources are created with functions called constructors. To learn more about declaring and configuring resources, see Resources.
Constructor syntax
new DestinationBigquery(name: string, args: DestinationBigqueryArgs, opts?: CustomResourceOptions);
@overload
def DestinationBigquery(resource_name: str,
                        args: DestinationBigqueryArgs,
                        opts: Optional[ResourceOptions] = None)
@overload
def DestinationBigquery(resource_name: str,
                        opts: Optional[ResourceOptions] = None,
                        configuration: Optional[DestinationBigqueryConfigurationArgs] = None,
                        workspace_id: Optional[str] = None,
                        definition_id: Optional[str] = None,
                        name: Optional[str] = None)
func NewDestinationBigquery(ctx *Context, name string, args DestinationBigqueryArgs, opts ...ResourceOption) (*DestinationBigquery, error)
public DestinationBigquery(string name, DestinationBigqueryArgs args, CustomResourceOptions? opts = null)
public DestinationBigquery(String name, DestinationBigqueryArgs args)
public DestinationBigquery(String name, DestinationBigqueryArgs args, CustomResourceOptions options)
type: airbyte:DestinationBigquery
properties: # The arguments to resource properties.
options: # Bag of options to control resource's behavior.
Parameters
- name string
- The unique name of the resource.
- args DestinationBigqueryArgs
- The arguments to resource properties.
- opts CustomResourceOptions
- Bag of options to control resource's behavior.
- resource_name str
- The unique name of the resource.
- args DestinationBigqueryArgs
- The arguments to resource properties.
- opts ResourceOptions
- Bag of options to control resource's behavior.
- ctx Context
- Context object for the current deployment.
- name string
- The unique name of the resource.
- args DestinationBigqueryArgs
- The arguments to resource properties.
- opts ResourceOption
- Bag of options to control resource's behavior.
- name string
- The unique name of the resource.
- args DestinationBigqueryArgs
- The arguments to resource properties.
- opts CustomResourceOptions
- Bag of options to control resource's behavior.
- name String
- The unique name of the resource.
- args DestinationBigqueryArgs
- The arguments to resource properties.
- options CustomResourceOptions
- Bag of options to control resource's behavior.
Constructor example
The following reference example uses placeholder values for all input properties.
var destinationBigqueryResource = new Airbyte.DestinationBigquery("destinationBigqueryResource", new()
{
    Configuration = new Airbyte.Inputs.DestinationBigqueryConfigurationArgs
    {
        DatasetId = "string",
        DatasetLocation = "string",
        ProjectId = "string",
        BigQueryClientBufferSizeMb = 0,
        CredentialsJson = "string",
        DisableTypeDedupe = false,
        LoadingMethod = new Airbyte.Inputs.DestinationBigqueryConfigurationLoadingMethodArgs
        {
            BatchedStandardInserts = null,
            GcsStaging = new Airbyte.Inputs.DestinationBigqueryConfigurationLoadingMethodGcsStagingArgs
            {
                Credential = new Airbyte.Inputs.DestinationBigqueryConfigurationLoadingMethodGcsStagingCredentialArgs
                {
                    HmacKey = new Airbyte.Inputs.DestinationBigqueryConfigurationLoadingMethodGcsStagingCredentialHmacKeyArgs
                    {
                        HmacKeyAccessId = "string",
                        HmacKeySecret = "string",
                    },
                },
                GcsBucketName = "string",
                GcsBucketPath = "string",
                KeepFilesInGcsBucket = "string",
            },
        },
        RawDataDataset = "string",
        TransformationPriority = "string",
    },
    WorkspaceId = "string",
    DefinitionId = "string",
    Name = "string",
});
example, err := airbyte.NewDestinationBigquery(ctx, "destinationBigqueryResource", &airbyte.DestinationBigqueryArgs{
    Configuration: &airbyte.DestinationBigqueryConfigurationArgs{
        DatasetId:                  pulumi.String("string"),
        DatasetLocation:            pulumi.String("string"),
        ProjectId:                  pulumi.String("string"),
        BigQueryClientBufferSizeMb: pulumi.Float64(0),
        CredentialsJson:            pulumi.String("string"),
        DisableTypeDedupe:          pulumi.Bool(false),
        LoadingMethod: &airbyte.DestinationBigqueryConfigurationLoadingMethodArgs{
            BatchedStandardInserts: &airbyte.DestinationBigqueryConfigurationLoadingMethodBatchedStandardInsertsArgs{},
            GcsStaging: &airbyte.DestinationBigqueryConfigurationLoadingMethodGcsStagingArgs{
                Credential: &airbyte.DestinationBigqueryConfigurationLoadingMethodGcsStagingCredentialArgs{
                    HmacKey: &airbyte.DestinationBigqueryConfigurationLoadingMethodGcsStagingCredentialHmacKeyArgs{
                        HmacKeyAccessId: pulumi.String("string"),
                        HmacKeySecret:   pulumi.String("string"),
                    },
                },
                GcsBucketName:        pulumi.String("string"),
                GcsBucketPath:        pulumi.String("string"),
                KeepFilesInGcsBucket: pulumi.String("string"),
            },
        },
        RawDataDataset:         pulumi.String("string"),
        TransformationPriority: pulumi.String("string"),
    },
    WorkspaceId:  pulumi.String("string"),
    DefinitionId: pulumi.String("string"),
    Name:         pulumi.String("string"),
})
var destinationBigqueryResource = new DestinationBigquery("destinationBigqueryResource", DestinationBigqueryArgs.builder()
    .configuration(DestinationBigqueryConfigurationArgs.builder()
        .datasetId("string")
        .datasetLocation("string")
        .projectId("string")
        .bigQueryClientBufferSizeMb(0)
        .credentialsJson("string")
        .disableTypeDedupe(false)
        .loadingMethod(DestinationBigqueryConfigurationLoadingMethodArgs.builder()
            .batchedStandardInserts(DestinationBigqueryConfigurationLoadingMethodBatchedStandardInsertsArgs.builder().build())
            .gcsStaging(DestinationBigqueryConfigurationLoadingMethodGcsStagingArgs.builder()
                .credential(DestinationBigqueryConfigurationLoadingMethodGcsStagingCredentialArgs.builder()
                    .hmacKey(DestinationBigqueryConfigurationLoadingMethodGcsStagingCredentialHmacKeyArgs.builder()
                        .hmacKeyAccessId("string")
                        .hmacKeySecret("string")
                        .build())
                    .build())
                .gcsBucketName("string")
                .gcsBucketPath("string")
                .keepFilesInGcsBucket("string")
                .build())
            .build())
        .rawDataDataset("string")
        .transformationPriority("string")
        .build())
    .workspaceId("string")
    .definitionId("string")
    .name("string")
    .build());
destination_bigquery_resource = airbyte.DestinationBigquery("destinationBigqueryResource",
    configuration={
        "dataset_id": "string",
        "dataset_location": "string",
        "project_id": "string",
        "big_query_client_buffer_size_mb": 0,
        "credentials_json": "string",
        "disable_type_dedupe": False,
        "loading_method": {
            "batched_standard_inserts": {},
            "gcs_staging": {
                "credential": {
                    "hmac_key": {
                        "hmac_key_access_id": "string",
                        "hmac_key_secret": "string",
                    },
                },
                "gcs_bucket_name": "string",
                "gcs_bucket_path": "string",
                "keep_files_in_gcs_bucket": "string",
            },
        },
        "raw_data_dataset": "string",
        "transformation_priority": "string",
    },
    workspace_id="string",
    definition_id="string",
    name="string")
const destinationBigqueryResource = new airbyte.DestinationBigquery("destinationBigqueryResource", {
    configuration: {
        datasetId: "string",
        datasetLocation: "string",
        projectId: "string",
        bigQueryClientBufferSizeMb: 0,
        credentialsJson: "string",
        disableTypeDedupe: false,
        loadingMethod: {
            batchedStandardInserts: {},
            gcsStaging: {
                credential: {
                    hmacKey: {
                        hmacKeyAccessId: "string",
                        hmacKeySecret: "string",
                    },
                },
                gcsBucketName: "string",
                gcsBucketPath: "string",
                keepFilesInGcsBucket: "string",
            },
        },
        rawDataDataset: "string",
        transformationPriority: "string",
    },
    workspaceId: "string",
    definitionId: "string",
    name: "string",
});
type: airbyte:DestinationBigquery
properties:
    configuration:
        bigQueryClientBufferSizeMb: 0
        credentialsJson: string
        datasetId: string
        datasetLocation: string
        disableTypeDedupe: false
        loadingMethod:
            batchedStandardInserts: {}
            gcsStaging:
                credential:
                    hmacKey:
                        hmacKeyAccessId: string
                        hmacKeySecret: string
                gcsBucketName: string
                gcsBucketPath: string
                keepFilesInGcsBucket: string
        projectId: string
        rawDataDataset: string
        transformationPriority: string
    definitionId: string
    name: string
    workspaceId: string
DestinationBigquery Resource Properties
To learn more about resource properties and how to use them, see Inputs and Outputs in the Architecture and Concepts docs.
Inputs
In Python, inputs that are objects can be passed either as argument classes or as dictionary literals.
The DestinationBigquery resource accepts the following input properties:
- Configuration DestinationBigqueryConfiguration
- WorkspaceId string
- DefinitionId string
- The UUID of the connector definition. One of configuration.destinationType or definitionId must be provided. Requires replacement if changed.
- Name string
- Name of the destination e.g. dev-mysql-instance.
- Configuration DestinationBigqueryConfigurationArgs
- WorkspaceId string
- DefinitionId string
- The UUID of the connector definition. One of configuration.destinationType or definitionId must be provided. Requires replacement if changed.
- Name string
- Name of the destination e.g. dev-mysql-instance.
- configuration DestinationBigqueryConfiguration
- workspaceId String
- definitionId String
- The UUID of the connector definition. One of configuration.destinationType or definitionId must be provided. Requires replacement if changed.
- name String
- Name of the destination e.g. dev-mysql-instance.
- configuration DestinationBigqueryConfiguration
- workspaceId string
- definitionId string
- The UUID of the connector definition. One of configuration.destinationType or definitionId must be provided. Requires replacement if changed.
- name string
- Name of the destination e.g. dev-mysql-instance.
- configuration DestinationBigqueryConfigurationArgs
- workspace_id str
- definition_id str
- The UUID of the connector definition. One of configuration.destinationType or definitionId must be provided. Requires replacement if changed.
- name str
- Name of the destination e.g. dev-mysql-instance.
- configuration Property Map
- workspaceId String
- definitionId String
- The UUID of the connector definition. One of configuration.destinationType or definitionId must be provided. Requires replacement if changed.
- name String
- Name of the destination e.g. dev-mysql-instance.
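As a minimal illustration, the following untested TypeScript sketch supplies only configuration and workspaceId and leaves every optional input at its provider default; the values and the import path are placeholders/assumptions.
import * as airbyte from "@pulumi/airbyte"; // assumed import path; see Package Details

// Only the core inputs; optional fields (name, definitionId, tuning knobs) are omitted.
const dest = new airbyte.DestinationBigquery("dest", {
    configuration: {
        datasetId: "my_dataset",     // placeholder
        datasetLocation: "US",
        projectId: "my-gcp-project", // placeholder
    },
    workspaceId: "acee73dd-54d3-476f-a8ea-d39d218f52cd",
});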
Outputs
All input properties are implicitly available as output properties. Additionally, the DestinationBigquery resource produces the following output properties:
- CreatedAt double
- DestinationId string
- DestinationType string
- Id string
- The provider-assigned unique ID for this managed resource.
- CreatedAt float64
- DestinationId string
- DestinationType string
- Id string
- The provider-assigned unique ID for this managed resource.
- createdAt Double
- destinationId String
- destinationType String
- id String
- The provider-assigned unique ID for this managed resource.
- createdAt number
- destinationId string
- destinationType string
- id string
- The provider-assigned unique ID for this managed resource.
- created_at float
- destination_id str
- destination_type str
- id str
- The provider-assigned unique ID for this managed resource.
- createdAt Number
- destinationId String
- destinationType String
- id String
- The provider-assigned unique ID for this managed resource.
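For example (TypeScript, reusing the hypothetical dest resource from the sketch above), the computed properties can be surfaced as stack outputs:
// Each of these resolves only after the destination has been created.
export const destinationId = dest.destinationId;
export const destinationType = dest.destinationType;
export const createdAt = dest.createdAt;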
Look up Existing DestinationBigquery Resource
Get an existing DestinationBigquery resource’s state with the given name, ID, and optional extra properties used to qualify the lookup.
public static get(name: string, id: Input<ID>, state?: DestinationBigqueryState, opts?: CustomResourceOptions): DestinationBigquery
@staticmethod
def get(resource_name: str,
        id: str,
        opts: Optional[ResourceOptions] = None,
        configuration: Optional[DestinationBigqueryConfigurationArgs] = None,
        created_at: Optional[float] = None,
        definition_id: Optional[str] = None,
        destination_id: Optional[str] = None,
        destination_type: Optional[str] = None,
        name: Optional[str] = None,
        workspace_id: Optional[str] = None) -> DestinationBigquery
func GetDestinationBigquery(ctx *Context, name string, id IDInput, state *DestinationBigqueryState, opts ...ResourceOption) (*DestinationBigquery, error)
public static DestinationBigquery Get(string name, Input<string> id, DestinationBigqueryState? state, CustomResourceOptions? opts = null)
public static DestinationBigquery get(String name, Output<String> id, DestinationBigqueryState state, CustomResourceOptions options)
resources:
  _:
    type: airbyte:DestinationBigquery
    get:
      id: ${id}
- name
- The unique name of the resulting resource.
- id
- The unique provider ID of the resource to lookup.
- state
- Any extra arguments used during the lookup.
- opts
- A bag of options that control this resource's behavior.
- resource_name
- The unique name of the resulting resource.
- id
- The unique provider ID of the resource to lookup.
- name
- The unique name of the resulting resource.
- id
- The unique provider ID of the resource to lookup.
- state
- Any extra arguments used during the lookup.
- opts
- A bag of options that control this resource's behavior.
- name
- The unique name of the resulting resource.
- id
- The unique provider ID of the resource to lookup.
- state
- Any extra arguments used during the lookup.
- opts
- A bag of options that control this resource's behavior.
- name
- The unique name of the resulting resource.
- id
- The unique provider ID of the resource to lookup.
- state
- Any extra arguments used during the lookup.
- opts
- A bag of options that control this resource's behavior.
- Configuration DestinationBigqueryConfiguration
- CreatedAt double
- DefinitionId string
- The UUID of the connector definition. One of configuration.destinationType or definitionId must be provided. Requires replacement if changed.
- DestinationId string
- DestinationType string
- Name string
- Name of the destination e.g. dev-mysql-instance.
- WorkspaceId string
- Configuration DestinationBigqueryConfigurationArgs
- CreatedAt float64
- DefinitionId string
- The UUID of the connector definition. One of configuration.destinationType or definitionId must be provided. Requires replacement if changed.
- DestinationId string
- DestinationType string
- Name string
- Name of the destination e.g. dev-mysql-instance.
- WorkspaceId string
- configuration DestinationBigqueryConfiguration
- createdAt Double
- definitionId String
- The UUID of the connector definition. One of configuration.destinationType or definitionId must be provided. Requires replacement if changed.
- destinationId String
- destinationType String
- name String
- Name of the destination e.g. dev-mysql-instance.
- workspaceId String
- configuration DestinationBigqueryConfiguration
- createdAt number
- definitionId string
- The UUID of the connector definition. One of configuration.destinationType or definitionId must be provided. Requires replacement if changed.
- destinationId string
- destinationType string
- name string
- Name of the destination e.g. dev-mysql-instance.
- workspaceId string
- configuration DestinationBigqueryConfigurationArgs
- created_at float
- definition_id str
- The UUID of the connector definition. One of configuration.destinationType or definitionId must be provided. Requires replacement if changed.
- destination_id str
- destination_type str
- name str
- Name of the destination e.g. dev-mysql-instance.
- workspace_id str
- configuration Property Map
- createdAt Number
- definitionId String
- The UUID of the connector definition. One of configuration.destinationType or definitionId must be provided. Requires replacement if changed.
- destinationId String
- destinationType String
- name String
- Name of the destination e.g. dev-mysql-instance.
- workspaceId String
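An untested TypeScript sketch of the lookup; the ID is a placeholder and the import path is an assumption:
import * as airbyte from "@pulumi/airbyte"; // assumed import path; see Package Details

// Adopt an existing destination into the program's view of state by its provider ID.
const existing = airbyte.DestinationBigquery.get(
    "existingDestinationBigquery",
    "<destination-id>", // placeholder: the provider-assigned resource ID
);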
Supporting Types
DestinationBigqueryConfiguration, DestinationBigqueryConfigurationArgs      
- DatasetId string
- The default BigQuery Dataset ID that tables are replicated to if the source does not specify a namespace. Read more here.
- DatasetLocation string
- The location of the dataset. Warning: Changes made after creation will not be applied. Read more here. Must be one of ["US", "EU", "asia-east1", "asia-east2", "asia-northeast1", "asia-northeast2", "asia-northeast3", "asia-south1", "asia-south2", "asia-southeast1", "asia-southeast2", "australia-southeast1", "australia-southeast2", "europe-central1", "europe-central2", "europe-north1", "europe-southwest1", "europe-west1", "europe-west2", "europe-west3", "europe-west4", "europe-west6", "europe-west7", "europe-west8", "europe-west9", "europe-west12", "me-central1", "me-central2", "me-west1", "northamerica-northeast1", "northamerica-northeast2", "southamerica-east1", "southamerica-west1", "us-central1", "us-east1", "us-east2", "us-east3", "us-east4", "us-east5", "us-south1", "us-west1", "us-west2", "us-west3", "us-west4"]
- ProjectId string
- The GCP project ID for the project containing the target BigQuery dataset. Read more here.
- BigQueryClientBufferSizeMb double
- Google BigQuery client's chunk (buffer) size (MIN=1, MAX=15) for each table. The size that will be written by a single RPC. Written data will be buffered and only flushed upon reaching this size or closing the channel. The default 15MB value is used if not set explicitly. Read more here. Default: 15
- CredentialsJson string
- The contents of the JSON service account key. Check out the docs if you need help generating this key. Default credentials will be used if this field is left empty.
- DisableTypeDedupe bool
- Disable writing final tables. WARNING! The data format in _airbyte_data is likely stable, but there are no guarantees that other metadata columns will remain the same in future versions. Default: false
- LoadingMethod DestinationBigqueryConfigurationLoadingMethod
- The way data will be uploaded to BigQuery.
- RawDataDataset string
- The dataset to write raw tables into (default: airbyte_internal).
- TransformationPriority string
- Interactive run type means that the query is executed as soon as possible, and these queries count towards concurrent rate limit and daily limit. Read more about interactive run type here. Batch queries are queued and started as soon as idle resources are available in the BigQuery shared resource pool, which usually occurs within a few minutes. Batch queries don’t count towards your concurrent rate limit. Read more about batch queries here. The default "interactive" value is used if not set explicitly. Default: "interactive"; must be one of ["interactive", "batch"]
- DatasetId string
- The default BigQuery Dataset ID that tables are replicated to if the source does not specify a namespace. Read more here.
- DatasetLocation string
- The location of the dataset. Warning: Changes made after creation will not be applied. Read more here. Must be one of ["US", "EU", "asia-east1", "asia-east2", "asia-northeast1", "asia-northeast2", "asia-northeast3", "asia-south1", "asia-south2", "asia-southeast1", "asia-southeast2", "australia-southeast1", "australia-southeast2", "europe-central1", "europe-central2", "europe-north1", "europe-southwest1", "europe-west1", "europe-west2", "europe-west3", "europe-west4", "europe-west6", "europe-west7", "europe-west8", "europe-west9", "europe-west12", "me-central1", "me-central2", "me-west1", "northamerica-northeast1", "northamerica-northeast2", "southamerica-east1", "southamerica-west1", "us-central1", "us-east1", "us-east2", "us-east3", "us-east4", "us-east5", "us-south1", "us-west1", "us-west2", "us-west3", "us-west4"]
- ProjectId string
- The GCP project ID for the project containing the target BigQuery dataset. Read more here.
- BigQueryClientBufferSizeMb float64
- Google BigQuery client's chunk (buffer) size (MIN=1, MAX=15) for each table. The size that will be written by a single RPC. Written data will be buffered and only flushed upon reaching this size or closing the channel. The default 15MB value is used if not set explicitly. Read more here. Default: 15
- CredentialsJson string
- The contents of the JSON service account key. Check out the docs if you need help generating this key. Default credentials will be used if this field is left empty.
- DisableTypeDedupe bool
- Disable writing final tables. WARNING! The data format in _airbyte_data is likely stable, but there are no guarantees that other metadata columns will remain the same in future versions. Default: false
- LoadingMethod DestinationBigqueryConfigurationLoadingMethodArgs
- The way data will be uploaded to BigQuery.
- RawDataDataset string
- The dataset to write raw tables into (default: airbyte_internal).
- TransformationPriority string
- Interactive run type means that the query is executed as soon as possible, and these queries count towards concurrent rate limit and daily limit. Read more about interactive run type here. Batch queries are queued and started as soon as idle resources are available in the BigQuery shared resource pool, which usually occurs within a few minutes. Batch queries don’t count towards your concurrent rate limit. Read more about batch queries here. The default "interactive" value is used if not set explicitly. Default: "interactive"; must be one of ["interactive", "batch"]
- datasetId String
- The default BigQuery Dataset ID that tables are replicated to if the source does not specify a namespace. Read more here.
- datasetLocation String
- The location of the dataset. Warning: Changes made after creation will not be applied. Read more here. Must be one of ["US", "EU", "asia-east1", "asia-east2", "asia-northeast1", "asia-northeast2", "asia-northeast3", "asia-south1", "asia-south2", "asia-southeast1", "asia-southeast2", "australia-southeast1", "australia-southeast2", "europe-central1", "europe-central2", "europe-north1", "europe-southwest1", "europe-west1", "europe-west2", "europe-west3", "europe-west4", "europe-west6", "europe-west7", "europe-west8", "europe-west9", "europe-west12", "me-central1", "me-central2", "me-west1", "northamerica-northeast1", "northamerica-northeast2", "southamerica-east1", "southamerica-west1", "us-central1", "us-east1", "us-east2", "us-east3", "us-east4", "us-east5", "us-south1", "us-west1", "us-west2", "us-west3", "us-west4"]
- projectId String
- The GCP project ID for the project containing the target BigQuery dataset. Read more here.
- bigQueryClientBufferSizeMb Double
- Google BigQuery client's chunk (buffer) size (MIN=1, MAX=15) for each table. The size that will be written by a single RPC. Written data will be buffered and only flushed upon reaching this size or closing the channel. The default 15MB value is used if not set explicitly. Read more here. Default: 15
- credentialsJson String
- The contents of the JSON service account key. Check out the docs if you need help generating this key. Default credentials will be used if this field is left empty.
- disableTypeDedupe Boolean
- Disable writing final tables. WARNING! The data format in _airbyte_data is likely stable, but there are no guarantees that other metadata columns will remain the same in future versions. Default: false
- loadingMethod DestinationBigqueryConfigurationLoadingMethod
- The way data will be uploaded to BigQuery.
- rawDataDataset String
- The dataset to write raw tables into (default: airbyte_internal).
- transformationPriority String
- Interactive run type means that the query is executed as soon as possible, and these queries count towards concurrent rate limit and daily limit. Read more about interactive run type here. Batch queries are queued and started as soon as idle resources are available in the BigQuery shared resource pool, which usually occurs within a few minutes. Batch queries don’t count towards your concurrent rate limit. Read more about batch queries here. The default "interactive" value is used if not set explicitly. Default: "interactive"; must be one of ["interactive", "batch"]
- datasetId string
- The default BigQuery Dataset ID that tables are replicated to if the source does not specify a namespace. Read more here.
- datasetLocation string
- The location of the dataset. Warning: Changes made after creation will not be applied. Read more here. Must be one of ["US", "EU", "asia-east1", "asia-east2", "asia-northeast1", "asia-northeast2", "asia-northeast3", "asia-south1", "asia-south2", "asia-southeast1", "asia-southeast2", "australia-southeast1", "australia-southeast2", "europe-central1", "europe-central2", "europe-north1", "europe-southwest1", "europe-west1", "europe-west2", "europe-west3", "europe-west4", "europe-west6", "europe-west7", "europe-west8", "europe-west9", "europe-west12", "me-central1", "me-central2", "me-west1", "northamerica-northeast1", "northamerica-northeast2", "southamerica-east1", "southamerica-west1", "us-central1", "us-east1", "us-east2", "us-east3", "us-east4", "us-east5", "us-south1", "us-west1", "us-west2", "us-west3", "us-west4"]
- projectId string
- The GCP project ID for the project containing the target BigQuery dataset. Read more here.
- bigQueryClientBufferSizeMb number
- Google BigQuery client's chunk (buffer) size (MIN=1, MAX=15) for each table. The size that will be written by a single RPC. Written data will be buffered and only flushed upon reaching this size or closing the channel. The default 15MB value is used if not set explicitly. Read more here. Default: 15
- credentialsJson string
- The contents of the JSON service account key. Check out the docs if you need help generating this key. Default credentials will be used if this field is left empty.
- disableTypeDedupe boolean
- Disable writing final tables. WARNING! The data format in _airbyte_data is likely stable, but there are no guarantees that other metadata columns will remain the same in future versions. Default: false
- loadingMethod DestinationBigqueryConfigurationLoadingMethod
- The way data will be uploaded to BigQuery.
- rawDataDataset string
- The dataset to write raw tables into (default: airbyte_internal).
- transformationPriority string
- Interactive run type means that the query is executed as soon as possible, and these queries count towards concurrent rate limit and daily limit. Read more about interactive run type here. Batch queries are queued and started as soon as idle resources are available in the BigQuery shared resource pool, which usually occurs within a few minutes. Batch queries don’t count towards your concurrent rate limit. Read more about batch queries here. The default "interactive" value is used if not set explicitly. Default: "interactive"; must be one of ["interactive", "batch"]
- dataset_id str
- The default BigQuery Dataset ID that tables are replicated to if the source does not specify a namespace. Read more here.
- dataset_location str
- The location of the dataset. Warning: Changes made after creation will not be applied. Read more here. Must be one of ["US", "EU", "asia-east1", "asia-east2", "asia-northeast1", "asia-northeast2", "asia-northeast3", "asia-south1", "asia-south2", "asia-southeast1", "asia-southeast2", "australia-southeast1", "australia-southeast2", "europe-central1", "europe-central2", "europe-north1", "europe-southwest1", "europe-west1", "europe-west2", "europe-west3", "europe-west4", "europe-west6", "europe-west7", "europe-west8", "europe-west9", "europe-west12", "me-central1", "me-central2", "me-west1", "northamerica-northeast1", "northamerica-northeast2", "southamerica-east1", "southamerica-west1", "us-central1", "us-east1", "us-east2", "us-east3", "us-east4", "us-east5", "us-south1", "us-west1", "us-west2", "us-west3", "us-west4"]
- project_id str
- The GCP project ID for the project containing the target BigQuery dataset. Read more here.
- big_query_client_buffer_size_mb float
- Google BigQuery client's chunk (buffer) size (MIN=1, MAX=15) for each table. The size that will be written by a single RPC. Written data will be buffered and only flushed upon reaching this size or closing the channel. The default 15MB value is used if not set explicitly. Read more here. Default: 15
- credentials_json str
- The contents of the JSON service account key. Check out the docs if you need help generating this key. Default credentials will be used if this field is left empty.
- disable_type_dedupe bool
- Disable writing final tables. WARNING! The data format in _airbyte_data is likely stable, but there are no guarantees that other metadata columns will remain the same in future versions. Default: false
- loading_method DestinationBigqueryConfigurationLoadingMethodArgs
- The way data will be uploaded to BigQuery.
- raw_data_dataset str
- The dataset to write raw tables into (default: airbyte_internal).
- transformation_priority str
- Interactive run type means that the query is executed as soon as possible, and these queries count towards concurrent rate limit and daily limit. Read more about interactive run type here. Batch queries are queued and started as soon as idle resources are available in the BigQuery shared resource pool, which usually occurs within a few minutes. Batch queries don’t count towards your concurrent rate limit. Read more about batch queries here. The default "interactive" value is used if not set explicitly. Default: "interactive"; must be one of ["interactive", "batch"]
- datasetId String
- The default BigQuery Dataset ID that tables are replicated to if the source does not specify a namespace. Read more here.
- datasetLocation String
- The location of the dataset. Warning: Changes made after creation will not be applied. Read more here. Must be one of ["US", "EU", "asia-east1", "asia-east2", "asia-northeast1", "asia-northeast2", "asia-northeast3", "asia-south1", "asia-south2", "asia-southeast1", "asia-southeast2", "australia-southeast1", "australia-southeast2", "europe-central1", "europe-central2", "europe-north1", "europe-southwest1", "europe-west1", "europe-west2", "europe-west3", "europe-west4", "europe-west6", "europe-west7", "europe-west8", "europe-west9", "europe-west12", "me-central1", "me-central2", "me-west1", "northamerica-northeast1", "northamerica-northeast2", "southamerica-east1", "southamerica-west1", "us-central1", "us-east1", "us-east2", "us-east3", "us-east4", "us-east5", "us-south1", "us-west1", "us-west2", "us-west3", "us-west4"]
- projectId String
- The GCP project ID for the project containing the target BigQuery dataset. Read more here.
- bigQueryClientBufferSizeMb Number
- Google BigQuery client's chunk (buffer) size (MIN=1, MAX=15) for each table. The size that will be written by a single RPC. Written data will be buffered and only flushed upon reaching this size or closing the channel. The default 15MB value is used if not set explicitly. Read more here. Default: 15
- credentialsJson String
- The contents of the JSON service account key. Check out the docs if you need help generating this key. Default credentials will be used if this field is left empty.
- disableTypeDedupe Boolean
- Disable writing final tables. WARNING! The data format in _airbyte_data is likely stable, but there are no guarantees that other metadata columns will remain the same in future versions. Default: false
- loadingMethod Property Map
- The way data will be uploaded to BigQuery.
- rawDataDataset String
- The dataset to write raw tables into (default: airbyte_internal).
- transformationPriority String
- Interactive run type means that the query is executed as soon as possible, and these queries count towards concurrent rate limit and daily limit. Read more about interactive run type here. Batch queries are queued and started as soon as idle resources are available in the BigQuery shared resource pool, which usually occurs within a few minutes. Batch queries don’t count towards your concurrent rate limit. Read more about batch queries here. The default "interactive" value is used if not set explicitly. Default: "interactive"; must be one of ["interactive", "batch"]
DestinationBigqueryConfigurationLoadingMethod, DestinationBigqueryConfigurationLoadingMethodArgs          
- BatchedStandardInserts DestinationBigqueryConfigurationLoadingMethodBatchedStandardInserts
- Direct loading using batched SQL INSERT statements. This method uses the BigQuery driver to convert large INSERT statements into file uploads automatically.
- GcsStaging DestinationBigqueryConfigurationLoadingMethodGcsStaging
- Writes large batches of records to a file, uploads the file to GCS, then uses COPY INTO to load your data into BigQuery.
- BatchedStandardInserts DestinationBigqueryConfigurationLoadingMethodBatchedStandardInserts
- Direct loading using batched SQL INSERT statements. This method uses the BigQuery driver to convert large INSERT statements into file uploads automatically.
- GcsStaging DestinationBigqueryConfigurationLoadingMethodGcsStaging
- Writes large batches of records to a file, uploads the file to GCS, then uses COPY INTO to load your data into BigQuery.
- batchedStandardInserts DestinationBigqueryConfigurationLoadingMethodBatchedStandardInserts
- Direct loading using batched SQL INSERT statements. This method uses the BigQuery driver to convert large INSERT statements into file uploads automatically.
- gcsStaging DestinationBigqueryConfigurationLoadingMethodGcsStaging
- Writes large batches of records to a file, uploads the file to GCS, then uses COPY INTO to load your data into BigQuery.
- batchedStandardInserts DestinationBigqueryConfigurationLoadingMethodBatchedStandardInserts
- Direct loading using batched SQL INSERT statements. This method uses the BigQuery driver to convert large INSERT statements into file uploads automatically.
- gcsStaging DestinationBigqueryConfigurationLoadingMethodGcsStaging
- Writes large batches of records to a file, uploads the file to GCS, then uses COPY INTO to load your data into BigQuery.
- batched_standard_inserts DestinationBigqueryConfigurationLoadingMethodBatchedStandardInserts
- Direct loading using batched SQL INSERT statements. This method uses the BigQuery driver to convert large INSERT statements into file uploads automatically.
- gcs_staging DestinationBigqueryConfigurationLoadingMethodGcsStaging
- Writes large batches of records to a file, uploads the file to GCS, then uses COPY INTO to load your data into BigQuery.
- batchedStandardInserts Property Map
- Direct loading using batched SQL INSERT statements. This method uses the BigQuery driver to convert large INSERT statements into file uploads automatically.
- gcsStaging Property Map
- Writes large batches of records to a file, uploads the file to GCS, then uses COPY INTO to load your data into BigQuery.
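The two loading methods describe alternative upload paths, so a configuration would normally set only one of them. A hedged TypeScript sketch selecting direct batched inserts (placeholder values; dest naming and import path as assumed in the sketches above):
// batchedStandardInserts takes an empty object; no GCS bucket or HMAC key is needed.
const destDirect = new airbyte.DestinationBigquery("destDirect", {
    configuration: {
        datasetId: "my_dataset",     // placeholder
        datasetLocation: "US",
        projectId: "my-gcp-project", // placeholder
        loadingMethod: {
            batchedStandardInserts: {},
        },
    },
    workspaceId: "acee73dd-54d3-476f-a8ea-d39d218f52cd",
});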
DestinationBigqueryConfigurationLoadingMethodGcsStaging, DestinationBigqueryConfigurationLoadingMethodGcsStagingArgs              
- Credential DestinationBigqueryConfigurationLoadingMethodGcsStagingCredential
- An HMAC key is a type of credential and can be associated with a service account or a user account in Cloud Storage. Read more here.
- GcsBucketName string
- The name of the GCS bucket. Read more here.
- GcsBucketPath string
- Directory under the GCS bucket where data will be written.
- KeepFilesInGcsBucket string
- This upload method temporarily stores records in a GCS bucket. With this setting you can choose whether those records are removed from GCS once the migration has finished. The default "Delete all tmp files from GCS" value is used if not set explicitly. Default: "Delete all tmp files from GCS"; must be one of ["Delete all tmp files from GCS", "Keep all tmp files in GCS"]
- Credential DestinationBigqueryConfigurationLoadingMethodGcsStagingCredential
- An HMAC key is a type of credential and can be associated with a service account or a user account in Cloud Storage. Read more here.
- GcsBucketName string
- The name of the GCS bucket. Read more here.
- GcsBucketPath string
- Directory under the GCS bucket where data will be written.
- KeepFilesInGcsBucket string
- This upload method temporarily stores records in a GCS bucket. With this setting you can choose whether those records are removed from GCS once the migration has finished. The default "Delete all tmp files from GCS" value is used if not set explicitly. Default: "Delete all tmp files from GCS"; must be one of ["Delete all tmp files from GCS", "Keep all tmp files in GCS"]
- credential DestinationBigqueryConfigurationLoadingMethodGcsStagingCredential
- An HMAC key is a type of credential and can be associated with a service account or a user account in Cloud Storage. Read more here.
- gcsBucketName String
- The name of the GCS bucket. Read more here.
- gcsBucketPath String
- Directory under the GCS bucket where data will be written.
- keepFilesInGcsBucket String
- This upload method temporarily stores records in a GCS bucket. With this setting you can choose whether those records are removed from GCS once the migration has finished. The default "Delete all tmp files from GCS" value is used if not set explicitly. Default: "Delete all tmp files from GCS"; must be one of ["Delete all tmp files from GCS", "Keep all tmp files in GCS"]
- credential DestinationBigqueryConfigurationLoadingMethodGcsStagingCredential
- An HMAC key is a type of credential and can be associated with a service account or a user account in Cloud Storage. Read more here.
- gcsBucketName string
- The name of the GCS bucket. Read more here.
- gcsBucketPath string
- Directory under the GCS bucket where data will be written.
- keepFilesInGcsBucket string
- This upload method temporarily stores records in a GCS bucket. With this setting you can choose whether those records are removed from GCS once the migration has finished. The default "Delete all tmp files from GCS" value is used if not set explicitly. Default: "Delete all tmp files from GCS"; must be one of ["Delete all tmp files from GCS", "Keep all tmp files in GCS"]
- credential DestinationBigqueryConfigurationLoadingMethodGcsStagingCredential
- An HMAC key is a type of credential and can be associated with a service account or a user account in Cloud Storage. Read more here.
- gcs_bucket_name str
- The name of the GCS bucket. Read more here.
- gcs_bucket_path str
- Directory under the GCS bucket where data will be written.
- keep_files_in_gcs_bucket str
- This upload method temporarily stores records in a GCS bucket. With this setting you can choose whether those records are removed from GCS once the migration has finished. The default "Delete all tmp files from GCS" value is used if not set explicitly. Default: "Delete all tmp files from GCS"; must be one of ["Delete all tmp files from GCS", "Keep all tmp files in GCS"]
- credential Property Map
- An HMAC key is a type of credential and can be associated with a service account or a user account in Cloud Storage. Read more here.
- gcsBucketName String
- The name of the GCS bucket. Read more here.
- gcsBucketPath String
- Directory under the GCS bucket where data will be written.
- keepFilesInGcsBucket String
- This upload method temporarily stores records in a GCS bucket. With this setting you can choose whether those records are removed from GCS once the migration has finished. The default "Delete all tmp files from GCS" value is used if not set explicitly. Default: "Delete all tmp files from GCS"; must be one of ["Delete all tmp files from GCS", "Keep all tmp files in GCS"]
DestinationBigqueryConfigurationLoadingMethodGcsStagingCredential, DestinationBigqueryConfigurationLoadingMethodGcsStagingCredentialArgs                
DestinationBigqueryConfigurationLoadingMethodGcsStagingCredentialHmacKey, DestinationBigqueryConfigurationLoadingMethodGcsStagingCredentialHmacKeyArgs                    
- HmacKeyAccessId string
- HMAC key access ID. When linked to a service account, this ID is 61 characters long; when linked to a user account, it is 24 characters long.
- HmacKeySecret string
- The corresponding secret for the access ID. It is a 40-character base-64 encoded string.
- HmacKeyAccessId string
- HMAC key access ID. When linked to a service account, this ID is 61 characters long; when linked to a user account, it is 24 characters long.
- HmacKeySecret string
- The corresponding secret for the access ID. It is a 40-character base-64 encoded string.
- hmacKeyAccessId String
- HMAC key access ID. When linked to a service account, this ID is 61 characters long; when linked to a user account, it is 24 characters long.
- hmacKeySecret String
- The corresponding secret for the access ID. It is a 40-character base-64 encoded string.
- hmacKeyAccessId string
- HMAC key access ID. When linked to a service account, this ID is 61 characters long; when linked to a user account, it is 24 characters long.
- hmacKeySecret string
- The corresponding secret for the access ID. It is a 40-character base-64 encoded string.
- hmac_key_access_id str
- HMAC key access ID. When linked to a service account, this ID is 61 characters long; when linked to a user account, it is 24 characters long.
- hmac_key_secret str
- The corresponding secret for the access ID. It is a 40-character base-64 encoded string.
- hmacKeyAccessId String
- HMAC key access ID. When linked to a service account, this ID is 61 characters long; when linked to a user account, it is 24 characters long.
- hmacKeySecret String
- The corresponding secret for the access ID. It is a 40-character base-64 encoded string.
Import
$ pulumi import airbyte:index/destinationBigquery:DestinationBigquery my_airbyte_destination_bigquery ""
To learn more about importing existing cloud resources, see Importing resources.
Package Details
- Repository
- airbyte airbytehq/terraform-provider-airbyte
- License
- Notes
- This Pulumi package is based on the airbyte Terraform Provider.