5 changes: 4 additions & 1 deletion src/e2e-test/features/bigquery/sink/GCSToBigQuery.feature
@@ -26,7 +26,10 @@ Feature: BigQuery sink - Verification of GCS to BigQuery successful data transfe
 Then Connect source as "GCS" and sink as "BigQuery" to establish connection
 Then Save the pipeline
 Then Preview and run the pipeline
-Then Verify the preview of pipeline is "success"
+Then Wait till pipeline preview is in running state
+Then Open and capture pipeline preview logs
+Then Verify the preview run status of pipeline in the logs is "succeeded"
+Then Close the pipeline logs
 Then Close the preview
 Then Deploy the pipeline
 Then Run the Pipeline in Runtime
@@ -50,7 +50,10 @@ Feature: BigQuery sink - Verification of GCS to BigQuery successful data transfe
 Then Enter runtime argument value "bqTruncateTableTrue" for key "bqTruncateTable"
 Then Enter runtime argument value "bqUpdateTableSchemaTrue" for key "bqUpdateTableSchema"
 Then Run the preview of pipeline with runtime arguments
-Then Verify the preview of pipeline is "success"
+Then Wait till pipeline preview is in running state
+Then Open and capture pipeline preview logs
+Then Verify the preview run status of pipeline in the logs is "succeeded"
+Then Close the pipeline logs
 Then Close the preview
 Then Deploy the pipeline
 Then Run the Pipeline in Runtime
@@ -74,3 +77,82 @@ Feature: BigQuery sink - Verification of GCS to BigQuery successful data transfe
 Then Verify the pipeline status is "Succeeded"
 Then Get count of no of records transferred to target BigQuery Table
 Then Validate the cmek key "cmekBQ" of target BigQuery table if cmek is enabled
+
+@GCS_CSV_TEST @BQ_SINK_TEST @SERVICE_ACCOUNT_JSON_TEST
+Scenario:Validate successful records transfer from GCS to BigQuery with macro arguments - Service account type as Json
+Given Open Datafusion Project to configure pipeline
+When Source is GCS
+When Sink is BigQuery
+Then Open GCS source properties
+Then Enter GCS property reference name
+Then Enter GCS property "projectId" as macro argument "gcsProjectId"
+Then Enter GCS property "serviceAccountType" as macro argument "gcsServiceAccountType"
+Then Enter GCS property "serviceAccountJSON" as macro argument "gcsServiceAccountJSON"
+Then Enter GCS property "path" as macro argument "gcsSourcePath"
+Then Enter GCS source property "skipHeader" as macro argument "gcsSkipHeader"
+Then Enter GCS property "format" as macro argument "gcsFormat"
+Then Enter GCS source property output schema "outputSchema" as macro argument "gcsOutputSchema"
+Then Validate "GCS" plugin properties
+Then Close the GCS properties
+Then Open BigQuery sink properties
+Then Enter BigQuery property reference name
+Then Enter BigQuery property "projectId" as macro argument "bqProjectId"
+Then Enter BigQuery property "datasetProjectId" as macro argument "bqDatasetProjectId"
+Then Enter BigQuery property "serviceAccountType" as macro argument "bqServiceAccountType"
+Then Enter BigQuery property "serviceAccountJSON" as macro argument "bqServiceAccountJSON"
+Then Enter BigQuery property "dataset" as macro argument "bqDataset"
+Then Enter BigQuery property "table" as macro argument "bqTargetTable"
+Then Enter BigQuery cmek property "encryptionKeyName" as macro argument "cmekBQ" if cmek is enabled
+Then Enter BigQuery sink property "truncateTable" as macro argument "bqTruncateTable"
+Then Enter BigQuery sink property "updateTableSchema" as macro argument "bqUpdateTableSchema"
+Then Validate "BigQuery" plugin properties
+Then Close the BigQuery properties
+Then Connect source as "GCS" and sink as "BigQuery" to establish connection
+Then Save the pipeline
+Then Preview and run the pipeline
+Then Enter runtime argument value "projectId" for key "gcsProjectId"
+Then Enter runtime argument value "serviceAccountTypeJSON" for key "gcsServiceAccountType"
+Then Enter runtime argument value "serviceAccountJSON" for key "gcsServiceAccountJSON"
+Then Enter runtime argument value "gcsCsvFile" for GCS source property path key "gcsSourcePath"
+Then Enter runtime argument value "gcsSkipHeaderTrue" for key "gcsSkipHeader"
+Then Enter runtime argument value "csvFormat" for key "gcsFormat"
+Then Enter runtime argument value "gcsCSVFileOutputSchema" for key "gcsOutputSchema"
+Then Enter runtime argument value "projectId" for key "bqProjectId"
+Then Enter runtime argument value "projectId" for key "bqDatasetProjectId"
+Then Enter runtime argument value "serviceAccountTypeJSON" for key "bqServiceAccountType"
+Then Enter runtime argument value "serviceAccountJSON" for key "bqServiceAccountJSON"
+Then Enter runtime argument value "dataset" for key "bqDataset"
+Then Enter runtime argument value for BigQuery sink table name key "bqTargetTable"
+Then Enter runtime argument value "cmekBQ" for BigQuery cmek property key "cmekBQ" if BQ cmek is enabled
+Then Enter runtime argument value "bqTruncateTableTrue" for key "bqTruncateTable"
+Then Enter runtime argument value "bqUpdateTableSchemaTrue" for key "bqUpdateTableSchema"
+Then Run the preview of pipeline with runtime arguments
+Then Wait till pipeline preview is in running state
+Then Open and capture pipeline preview logs
+Then Verify the preview run status of pipeline in the logs is "succeeded"
+Then Close the pipeline logs
+Then Close the preview
+Then Deploy the pipeline
+Then Run the Pipeline in Runtime
+Then Enter runtime argument value "projectId" for key "gcsProjectId"
+Then Enter runtime argument value "serviceAccountTypeJSON" for key "gcsServiceAccountType"
+Then Enter runtime argument value "serviceAccountJSON" for key "gcsServiceAccountJSON"
+Then Enter runtime argument value "gcsCsvFile" for GCS source property path key "gcsSourcePath"
+Then Enter runtime argument value "gcsSkipHeaderTrue" for key "gcsSkipHeader"
+Then Enter runtime argument value "csvFormat" for key "gcsFormat"
+Then Enter runtime argument value "gcsCSVFileOutputSchema" for key "gcsOutputSchema"
+Then Enter runtime argument value "projectId" for key "bqProjectId"
+Then Enter runtime argument value "projectId" for key "bqDatasetProjectId"
+Then Enter runtime argument value "serviceAccountTypeJSON" for key "bqServiceAccountType"
+Then Enter runtime argument value "serviceAccountJSON" for key "bqServiceAccountJSON"
+Then Enter runtime argument value "dataset" for key "bqDataset"
+Then Enter runtime argument value for BigQuery sink table name key "bqTargetTable"
+Then Enter runtime argument value "cmekBQ" for BigQuery cmek property key "cmekBQ" if BQ cmek is enabled
+Then Enter runtime argument value "bqTruncateTableTrue" for key "bqTruncateTable"
+Then Enter runtime argument value "bqUpdateTableSchemaTrue" for key "bqUpdateTableSchema"
+Then Run the Pipeline in Runtime with runtime arguments
+Then Wait till pipeline is in running state
+Then Open and capture logs
+Then Verify the pipeline status is "Succeeded"
+Then Get count of no of records transferred to target BigQuery Table
+Then Validate the cmek key "cmekBQ" of target BigQuery table if cmek is enabled
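Across these scenarios the single-step UI check (`Verify the preview of pipeline is "success"`) is replaced by log-based verification of the preview run. A minimal Python sketch of that idea follows; the log format and helper names here are hypothetical (the real step definitions are implemented in the project's Java e2e framework), so treat this as an illustration of the pattern, not the actual implementation.

```python
import re


def preview_status_from_logs(log_text: str) -> str:
    """Extract a run status from captured preview logs.

    Looks for a token such as 'status: SUCCEEDED' or 'status=FAILED'.
    The pattern is illustrative; the framework's real log lines may differ.
    """
    match = re.search(r"status[:=]\s*(\w+)", log_text, re.IGNORECASE)
    return match.group(1).lower() if match else "unknown"


def verify_preview_status(log_text: str, expected: str) -> None:
    """Assert that the captured logs report the expected terminal status."""
    actual = preview_status_from_logs(log_text)
    assert actual == expected, f"expected preview status {expected!r}, got {actual!r}"
```

Checking logs rather than a transient UI badge makes the assertion less flaky: the status survives in the captured log text even after the preview pane is closed.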
14 changes: 11 additions & 3 deletions src/e2e-test/features/bigquery/source/BigQueryToBigQuery.feature
@@ -25,7 +25,10 @@ Feature: BigQuery source - Verification of BigQuery to BigQuery successful data
 Then Connect source as "BigQuery" and sink as "BigQuery" to establish connection
 Then Save the pipeline
 Then Preview and run the pipeline
-Then Verify the preview of pipeline is "success"
+Then Wait till pipeline preview is in running state
+Then Open and capture pipeline preview logs
+Then Verify the preview run status of pipeline in the logs is "succeeded"
+Then Close the pipeline logs
 Then Click on preview data for BigQuery sink
 Then Verify preview output schema matches the outputSchema captured in properties
 Then Close the preview data
@@ -61,7 +64,9 @@ Feature: BigQuery source - Verification of BigQuery to BigQuery successful data
 Then Connect source as "BigQuery" and sink as "BigQuery" to establish connection
 Then Save the pipeline
 Then Preview and run the pipeline
-Then Verify the preview of pipeline is "failed"
+Then Wait till pipeline preview is in running state
+Then Open and capture pipeline preview logs
+Then Verify the preview run status of pipeline in the logs is "failed"
 
 @BQ_PARTITIONED_SOURCE_TEST @BQ_SINK_TEST
 Scenario: Verify records are getting transferred with respect to partitioned date
@@ -87,7 +92,10 @@ Feature: BigQuery source - Verification of BigQuery to BigQuery successful data
 Then Connect source as "BigQuery" and sink as "BigQuery" to establish connection
 Then Save the pipeline
 Then Preview and run the pipeline
-Then Verify the preview of pipeline is "success"
+Then Wait till pipeline preview is in running state
+Then Open and capture pipeline preview logs
+Then Verify the preview run status of pipeline in the logs is "succeeded"
+Then Close the pipeline logs
 Then Click on preview data for BigQuery sink
 Then Verify preview output schema matches the outputSchema captured in properties
 Then Close the preview data
5 changes: 4 additions & 1 deletion src/e2e-test/features/bigquery/source/BigQueryToGCS.feature
@@ -96,7 +96,10 @@ Feature: BigQuery source - Verification of BigQuery to GCS successful data trans
 Then Connect source as "BigQuery" and sink as "GCS" to establish connection
 Then Save the pipeline
 Then Preview and run the pipeline
-Then Verify the preview of pipeline is "success"
+Then Wait till pipeline preview is in running state
+Then Open and capture pipeline preview logs
+Then Verify the preview run status of pipeline in the logs is "succeeded"
+Then Close the pipeline logs
 Then Click on preview data for GCS sink
 Then Verify preview output schema matches the outputSchema captured in properties
 Then Close the preview data
@@ -44,7 +44,10 @@ Feature: BigQuery source - Verification of BigQuery to GCS successful data trans
 Then Enter runtime argument value "csvFormat" for key "gcsFormat"
 Then Enter runtime argument value "cmekGCS" for GCS cmek property key "cmekGCS" if GCS cmek is enabled
 Then Run the preview of pipeline with runtime arguments
-Then Verify the preview of pipeline is "success"
+Then Wait till pipeline preview is in running state
+Then Open and capture pipeline preview logs
+Then Verify the preview run status of pipeline in the logs is "succeeded"
+Then Close the pipeline logs
 Then Click on preview data for GCS sink
 Then Close the preview data
 Then Deploy the pipeline
@@ -66,3 +69,74 @@ Feature: BigQuery source - Verification of BigQuery to GCS successful data trans
 Then Verify the pipeline status is "Succeeded"
 Then Verify data is transferred to target GCS bucket
 Then Validate the cmek key "cmekGCS" of target GCS bucket if cmek is enabled
+
+@BQ_SOURCE_TEST @GCS_SINK_TEST @SERVICE_ACCOUNT_JSON_TEST
+Scenario:Validate successful records transfer from BigQuery to GCS with macro arguments - Service account type as Json
+Given Open Datafusion Project to configure pipeline
+When Source is BigQuery
+When Sink is GCS
+Then Open BigQuery source properties
+Then Enter BigQuery property reference name
+Then Enter BigQuery property "projectId" as macro argument "bqProjectId"
+Then Enter BigQuery property "datasetProjectId" as macro argument "bqDatasetProjectId"
+Then Enter BigQuery property "serviceAccountType" as macro argument "bqServiceAccountType"
+Then Enter BigQuery property "serviceAccountJSON" as macro argument "bqServiceAccountJSON"
+Then Enter BigQuery property "dataset" as macro argument "bqDataset"
+Then Enter BigQuery property "table" as macro argument "bqSourceTable"
+Then Validate "BigQuery" plugin properties
+Then Close the BigQuery properties
+Then Open GCS sink properties
+Then Enter GCS property reference name
+Then Enter GCS property "projectId" as macro argument "gcsProjectId"
+Then Enter GCS property "serviceAccountType" as macro argument "gcsServiceAccountType"
+Then Enter GCS property "serviceAccountJSON" as macro argument "gcsServiceAccountJSON"
+Then Enter GCS property "path" as macro argument "gcsSinkPath"
+Then Enter GCS sink property "pathSuffix" as macro argument "gcsPathSuffix"
+Then Enter GCS property "format" as macro argument "gcsFormat"
+Then Enter GCS sink cmek property "encryptionKeyName" as macro argument "cmekGCS" if cmek is enabled
+Then Validate "GCS" plugin properties
+Then Close the GCS properties
+Then Connect source as "BigQuery" and sink as "GCS" to establish connection
+Then Save the pipeline
+Then Preview and run the pipeline
+Then Enter runtime argument value "projectId" for key "bqProjectId"
+Then Enter runtime argument value "projectId" for key "bqDatasetProjectId"
+Then Enter runtime argument value "serviceAccountTypeJSON" for key "bqServiceAccountType"
+Then Enter runtime argument value "serviceAccountJSON" for key "bqServiceAccountJSON"
+Then Enter runtime argument value "dataset" for key "bqDataset"
+Then Enter runtime argument value for BigQuery source table name key "bqSourceTable"
+Then Enter runtime argument value "projectId" for key "gcsProjectId"
+Then Enter runtime argument value "serviceAccountTypeJSON" for key "gcsServiceAccountType"
+Then Enter runtime argument value "serviceAccountJSON" for key "gcsServiceAccountJSON"
+Then Enter runtime argument value for GCS sink property path key "gcsSinkPath"
+Then Enter runtime argument value "gcsPathDateSuffix" for key "gcsPathSuffix"
+Then Enter runtime argument value "csvFormat" for key "gcsFormat"
+Then Enter runtime argument value "cmekGCS" for GCS cmek property key "cmekGCS" if GCS cmek is enabled
+Then Run the preview of pipeline with runtime arguments
+Then Wait till pipeline preview is in running state
+Then Open and capture pipeline preview logs
+Then Verify the preview run status of pipeline in the logs is "succeeded"
+Then Close the pipeline logs
+Then Click on preview data for GCS sink
+Then Close the preview data
+Then Deploy the pipeline
+Then Run the Pipeline in Runtime
+Then Enter runtime argument value "projectId" for key "bqProjectId"
+Then Enter runtime argument value "projectId" for key "bqDatasetProjectId"
+Then Enter runtime argument value "serviceAccountTypeJSON" for key "bqServiceAccountType"
+Then Enter runtime argument value "serviceAccountJSON" for key "bqServiceAccountJSON"
+Then Enter runtime argument value "dataset" for key "bqDataset"
+Then Enter runtime argument value for BigQuery source table name key "bqSourceTable"
+Then Enter runtime argument value "projectId" for key "gcsProjectId"
+Then Enter runtime argument value "serviceAccountTypeJSON" for key "gcsServiceAccountType"
+Then Enter runtime argument value "serviceAccountJSON" for key "gcsServiceAccountJSON"
+Then Enter runtime argument value for GCS sink property path key "gcsSinkPath"
+Then Enter runtime argument value "gcsPathDateSuffix" for key "gcsPathSuffix"
+Then Enter runtime argument value "csvFormat" for key "gcsFormat"
+Then Enter runtime argument value "cmekGCS" for GCS cmek property key "cmekGCS" if GCS cmek is enabled
+Then Run the Pipeline in Runtime with runtime arguments
+Then Wait till pipeline is in running state
+Then Open and capture logs
+Then Verify the pipeline status is "Succeeded"
+Then Verify data is transferred to target GCS bucket
+Then Validate the cmek key "cmekGCS" of target GCS bucket if cmek is enabled
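The new `Wait till pipeline preview is in running state` step implies polling until the preview reaches a target state. A sketch of such a wait loop, assuming a caller-supplied `get_state` callable (hypothetical; the framework's actual implementation is a Java step definition and may use explicit UI waits instead):

```python
import time


def wait_for_state(get_state, target="RUNNING", timeout_s=300, poll_s=5):
    """Poll get_state() until it returns target; raise TimeoutError otherwise.

    get_state: zero-argument callable returning the current pipeline state.
    """
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        if get_state() == target:
            return True
        time.sleep(poll_s)
    raise TimeoutError(f"state {target!r} not reached within {timeout_s}s")
```

Polling with a bounded timeout is what lets the subsequent log-capture step run only after the preview has actually started, instead of racing the UI.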
@@ -36,7 +36,10 @@ Feature: BigQuery source - Verification of BigQuery to Multiple sinks successful
 Then Close the PubSub properties
 Then Save the pipeline
 Then Preview and run the pipeline
-Then Verify the preview of pipeline is "success"
+Then Wait till pipeline preview is in running state
+Then Open and capture pipeline preview logs
+Then Verify the preview run status of pipeline in the logs is "succeeded"
+Then Close the pipeline logs
 Then Close the preview
 Then Deploy the pipeline
 Then Run the Pipeline in Runtime
5 changes: 4 additions & 1 deletion src/e2e-test/features/gcs/sink/GCSSink.feature
@@ -22,7 +22,10 @@ Feature: GCS sink - Verification of GCS Sink plugin
 Then Close the GCS properties
 Then Save the pipeline
 Then Preview and run the pipeline
-Then Verify the preview of pipeline is "success"
+Then Wait till pipeline preview is in running state
+Then Open and capture pipeline preview logs
+Then Verify the preview run status of pipeline in the logs is "succeeded"
+Then Close the pipeline logs
 Then Close the preview
 Then Deploy the pipeline
 Then Run the Pipeline in Runtime
5 changes: 4 additions & 1 deletion src/e2e-test/features/gcs/source/GCSSourceToBigQuery.feature
@@ -23,7 +23,10 @@ Feature: GCS source - Verification of GCS to BQ successful data transfer
 Then Close the BigQuery properties
 Then Save the pipeline
 Then Preview and run the pipeline
-Then Verify the preview of pipeline is "success"
+Then Wait till pipeline preview is in running state
+Then Open and capture pipeline preview logs
+Then Verify the preview run status of pipeline in the logs is "succeeded"
+Then Close the pipeline logs
 Then Click on preview data for BigQuery sink
 Then Verify preview output schema matches the outputSchema captured in properties
 Then Close the preview data