
ARA-R01 SnowPro Advanced: Architect Recertification Exam Questions and Answers

Questions 4

What is a characteristic of event notifications in Snowpipe?

Options:

A.

The load history is stored in the metadata of the target table.

B.

Notifications identify the cloud storage event and the actual data in the files.

C.

Snowflake can process all older notifications when a paused pipe is resumed.

D.

When a pipe is paused, event messages received for the pipe enter a limited retention period.
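
For reference, a minimal auto-ingest pipe can be sketched as follows (illustrative object names; the cloud event notification setup on the stage's bucket is assumed to exist already):

-- Hypothetical names: RAW_EVENTS table and EVENTS_STAGE external stage.
CREATE PIPE EVENTS_PIPE
  AUTO_INGEST = TRUE
  AS
  COPY INTO RAW_EVENTS
  FROM @EVENTS_STAGE;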

Questions 5

Which command will create a schema without Fail-safe and will restrict object owners from passing on access to other users?

Options:

A.

create schema EDW.ACCOUNTING WITH MANAGED ACCESS;

B.

create schema EDW.ACCOUNTING WITH MANAGED ACCESS DATA_RETENTION_TIME_IN_DAYS = 7;

C.

create TRANSIENT schema EDW.ACCOUNTING WITH MANAGED ACCESS DATA_RETENTION_TIME_IN_DAYS = 1;

D.

create TRANSIENT schema EDW.ACCOUNTING WITH MANAGED ACCESS DATA_RETENTION_TIME_IN_DAYS = 7;
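
As a quick check of what a given statement produced, the schema's properties can be inspected with SHOW SCHEMAS; the OPTIONS column reports MANAGED ACCESS and TRANSIENT, and RETENTION_TIME reports the Time Travel setting (a sketch, assuming the schema has been created):

SHOW SCHEMAS LIKE 'ACCOUNTING' IN DATABASE EDW;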

Questions 6

How is the change of local time due to daylight savings time handled in Snowflake tasks? (Choose two.)

Options:

A.

A task scheduled in a UTC-based schedule will have no issues with the time changes.

B.

Task schedules can be designed to follow specified or local time zones to accommodate the time changes.

C.

A task will move to a suspended state during the daylight savings time change.

D.

A frequent task execution schedule like minutes may not cause a problem, but will affect the task history.

E.

A task schedule will follow only the specified time and will fail to handle lost or duplicated hours.
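
For context, a task schedule can be pinned to an explicit time zone with a CRON expression; a rough sketch with hypothetical object names:

-- Runs at 02:00 UTC every day; UTC is not affected by daylight saving changes.
CREATE TASK NIGHTLY_LOAD
  WAREHOUSE = MY_WH
  SCHEDULE = 'USING CRON 0 2 * * * UTC'
AS
  INSERT INTO LOAD_AUDIT (LOADED_AT) SELECT CURRENT_TIMESTAMP();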

Questions 7

A data share exists between a data provider account and a data consumer account. Five tables from the provider account are being shared with the consumer account. The consumer role has been granted the IMPORTED PRIVILEGES privilege.

What will happen to the consumer account if a new table (table_6) is added to the provider schema?

Options:

A.

The consumer role will automatically see the new table and no additional grants are needed.

B.

The consumer role will see the table only after this grant is given on the consumer side:

grant imported privileges on database PSHARE_EDW_4TEST_DB to DEV_ROLE;

C.

The consumer role will see the table only after this grant is given on the provider side:

use role accountadmin;

Grant select on table EDW.ACCOUNTING.Table_6 to share PSHARE_EDW_4TEST;

D.

The consumer role will see the table only after this grant is given on the provider side:

use role accountadmin;

grant usage on database EDW to share PSHARE_EDW_4TEST ;

grant usage on schema EDW.ACCOUNTING to share PSHARE_EDW_4TEST ;

Grant select on table EDW.ACCOUNTING.Table_6 to database PSHARE_EDW_4TEST_DB ;

Questions 8

A company is designing high availability and disaster recovery plans and needs to maximize redundancy and minimize recovery time objectives for their critical application processes. Cost is not a concern as long as the solution is the best available. The plan so far consists of the following steps:

1. Deployment of Snowflake accounts on two different cloud providers.

2. Selection of cloud provider regions that are geographically far apart.

3. The Snowflake deployment will replicate the databases and account data between both cloud provider accounts.

4. Implementation of Snowflake client redirect.

What is the MOST cost-effective way to provide the HIGHEST uptime and LEAST application disruption if there is a service event?

Options:

A.

Connect the applications using the - URL. Use the Business Critical Snowflake edition.

B.

Connect the applications using the - URL. Use the Virtual Private Snowflake (VPS) edition.

C.

Connect the applications using the - URL. Use the Enterprise Snowflake edition.

D.

Connect the applications using the - URL. Use the Business Critical Snowflake edition.

Questions 9

An Architect is designing a solution that will be used to process changed records in an orders table. Newly-inserted orders must be loaded into the f_orders fact table, which will aggregate all the orders by multiple dimensions (time, region, channel, etc.). Existing orders can be updated by the sales department within 30 days after the order creation. In case of an order update, the solution must perform two actions:

1. Update the order in the F_ORDERS fact table.

2. Load the changed order data into the special table ORDER_REPAIRS.

This table is used by the Accounting department once a month. If the order has been changed, the Accounting team needs to know the latest details and perform the necessary actions based on the data in the order_repairs table.

What data processing logic design will be the MOST performant?

Options:

A.

Use one stream and one task.

B.

Use one stream and two tasks.

C.

Use two streams and one task.

D.

Use two streams and two tasks.
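
For background, the stream-and-task building blocks referenced in these options look roughly like this (hypothetical object names, not the full solution):

-- Capture inserts and updates on the source table.
CREATE STREAM ORDERS_STREAM ON TABLE ORDERS;

-- A task that runs only when the stream has captured changes.
CREATE TASK LOAD_F_ORDERS
  WAREHOUSE = MY_WH
  SCHEDULE = '5 MINUTE'
  WHEN SYSTEM$STREAM_HAS_DATA('ORDERS_STREAM')
AS
  MERGE INTO F_ORDERS f
  USING ORDERS_STREAM s ON f.ORDER_ID = s.ORDER_ID
  WHEN MATCHED THEN UPDATE SET f.ORDER_AMOUNT = s.ORDER_AMOUNT
  WHEN NOT MATCHED THEN INSERT (ORDER_ID, ORDER_AMOUNT) VALUES (s.ORDER_ID, s.ORDER_AMOUNT);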

Questions 10

A media company needs a data pipeline that will ingest customer review data into a Snowflake table, and apply some transformations. The company also needs to use Amazon Comprehend to do sentiment analysis and make the de-identified final data set available publicly for advertising companies who use different cloud providers in different regions.

The data pipeline needs to run continuously and efficiently as new records arrive in the object storage leveraging event notifications. Also, the operational complexity, maintenance of the infrastructure, including platform upgrades and security, and the development effort should be minimal.

Which design will meet these requirements?

Options:

A.

Ingest the data using copy into and use streams and tasks to orchestrate transformations. Export the data into Amazon S3 to do model inference with Amazon Comprehend and ingest the data back into a Snowflake table. Then create a listing in the Snowflake Marketplace to make the data available to other companies.

B.

Ingest the data using Snowpipe and use streams and tasks to orchestrate transformations. Create an external function to do model inference with Amazon Comprehend and write the final records to a Snowflake table. Then create a listing in the Snowflake Marketplace to make the data available to other companies.

C.

Ingest the data into Snowflake using Amazon EMR and PySpark using the Snowflake Spark connector. Apply transformations using another Spark job. Develop a python program to do model inference by leveraging the Amazon Comprehend text analysis API. Then write the results to a Snowflake table and create a listing in the Snowflake Marketplace to make the data available to other companies.

D.

Ingest the data using Snowpipe and use streams and tasks to orchestrate transformations. Export the data into Amazon S3 to do model inference with Amazon Comprehend and ingest the data back into a Snowflake table. Then create a listing in the Snowflake Marketplace to make the data available to other companies.
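
For reference, the external function mentioned in option B is declared roughly as follows; the API integration and endpoint URL here are placeholders that would point at an AWS API Gateway front end for Amazon Comprehend:

-- MY_COMPREHEND_INT and the endpoint URL are hypothetical and must be configured separately.
CREATE EXTERNAL FUNCTION GET_SENTIMENT(review_text VARCHAR)
  RETURNS VARIANT
  API_INTEGRATION = MY_COMPREHEND_INT
  AS 'https://example.execute-api.us-east-1.amazonaws.com/prod/sentiment';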

Questions 11

A table, EMP_TBL, has three records as shown:

[Exhibit image: ARA-R01 Question 11]

The following variables are set for the session:

[Exhibit image: ARA-R01 Question 11]

Which SELECT statements will retrieve all three records? (Select TWO).

Options:

A.

Select * FROM $tbl_ref WHERE $col_ref IN ('Name1','Name2','Name3');

B.

SELECT * FROM EMP_TBL WHERE identifier($col_ref) IN ('Name1','Name2','Name3');

C.

SELECT * FROM identifier WHERE NAME IN ($var1, $var2, $var3);

D.

SELECT * FROM identifier($tbl_ref) WHERE ID IN ('var1','var2','var3');

E.

SELECT * FROM $tbl_ref WHERE $col_ref IN ($var1, $var2, $var3);
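
For context, session variables and the IDENTIFIER() keyword are used along these lines (a sketch that reuses the scenario's names):

SET tbl_ref = 'EMP_TBL';
SET var1 = 'Name1';
SET var2 = 'Name2';
SET var3 = 'Name3';

-- IDENTIFIER() resolves a variable to an object name; $var references resolve to literal values.
SELECT * FROM IDENTIFIER($tbl_ref) WHERE NAME IN ($var1, $var2, $var3);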

Questions 12

An Architect has a design where files arrive every 10 minutes and are loaded into a primary database table using Snowpipe. A secondary database is refreshed every hour with the latest data from the primary database.

Based on this scenario, what Time Travel query options are available on the secondary database?

Options:

A.

A query using Time Travel in the secondary database is available for every hourly table version within the retention window.

B.

A query using Time Travel in the secondary database is available for every hourly table version within and outside the retention window.

C.

Using Time Travel, secondary database users can query every iterative version within each hour (the individual Snowpipe loads) in the retention window.

D.

Using Time Travel, secondary database users can query every iterative version within each hour (the individual Snowpipe loads) and outside the retention window.

Questions 13

A company has an external vendor who puts data into Google Cloud Storage. The company's Snowflake account is set up in Azure.

What would be the MOST efficient way to load data from the vendor into Snowflake?

Options:

A.

Ask the vendor to create a Snowflake account, load the data into Snowflake and create a data share.

B.

Create an external stage on Google Cloud Storage and use the external table to load the data into Snowflake.

C.

Copy the data from Google Cloud Storage to Azure Blob storage using external tools and load data from Blob storage to Snowflake.

D.

Create a Snowflake Account in the Google Cloud Platform (GCP), ingest data into this account and use data replication to move the data from GCP to Azure.
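
As background, an external stage over Google Cloud Storage can be declared from an Azure-hosted account roughly as follows (hypothetical integration, bucket, and stage names):

CREATE STORAGE INTEGRATION GCS_INT
  TYPE = EXTERNAL_STAGE
  STORAGE_PROVIDER = 'GCS'
  ENABLED = TRUE
  STORAGE_ALLOWED_LOCATIONS = ('gcs://vendor-bucket/exports/');

CREATE STAGE VENDOR_STAGE
  URL = 'gcs://vendor-bucket/exports/'
  STORAGE_INTEGRATION = GCS_INT;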

Questions 14

A Snowflake Architect is designing an application and tenancy strategy for an organization where strong legal isolation rules as well as multi-tenancy are requirements.

Which approach will meet these requirements if Role-Based Access Policies (RBAC) is a viable option for isolating tenants?

Options:

A.

Create accounts for each tenant in the Snowflake organization.

B.

Create an object for each tenant strategy if row level security is viable for isolating tenants.

C.

Create an object for each tenant strategy if row level security is not viable for isolating tenants.

D.

Create a multi-tenant table strategy if row level security is not viable for isolating tenants.

Questions 15

The following DDL command was used to create a task based on a stream:

[Exhibit image: ARA-R01 Question 15]

Assuming MY_WH is set to AUTO_SUSPEND = 60 and used exclusively for this task, which statement is true?

Options:

A.

The warehouse MY_WH will be made active every five minutes to check the stream.

B.

The warehouse MY_WH will only be active when there are results in the stream.

C.

The warehouse MY_WH will never suspend.

D.

The warehouse MY_WH will automatically resize to accommodate the size of the stream.

Questions 16

Database DB1 has schema S1 which has one table, T1.

DB1 --> S1 --> T1

The retention period of DB1 is set to 10 days.

The retention period of S1 is set to 20 days.

The retention period of T1 is set to 30 days.

The user runs the following command:

Drop Database DB1;

What will the Time Travel retention period be for T1?

Options:

A.

10 days

B.

20 days

C.

30 days

D.

37 days
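
For reference, retention is set independently at each level with DATA_RETENTION_TIME_IN_DAYS; a sketch of the scenario's settings:

ALTER DATABASE DB1 SET DATA_RETENTION_TIME_IN_DAYS = 10;
ALTER SCHEMA DB1.S1 SET DATA_RETENTION_TIME_IN_DAYS = 20;
ALTER TABLE DB1.S1.T1 SET DATA_RETENTION_TIME_IN_DAYS = 30;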

Questions 17

A company has a source system that provides JSON records for various IoT operations. The JSON is loaded directly into a persistent table with a VARIANT field. The data is quickly growing to 100s of millions of records and performance is becoming an issue. There is a generic access pattern that is used to filter on the create_date key within the VARIANT field.

What can be done to improve performance?

Options:

A.

Alter the target table to include additional fields pulled from the JSON records. This would include a create_date field with a datatype of timestamp. When this field is used in the filter, partition pruning will occur.

B.

Alter the target table to include additional fields pulled from the JSON records. This would include a create_date field with a datatype of varchar. When this field is used in the filter, partition pruning will occur.

C.

Validate the size of the warehouse being used. If the record count is approaching 100s of millions, size XL will be the minimum size required to process this amount of data.

D.

Incorporate the use of multiple tables partitioned by date ranges. When a user or process needs to query a particular date range, ensure the appropriate base table is used.
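
As an illustration, extracting the key into a typed column might look like this (IOT_EVENTS and PAYLOAD are hypothetical table and VARIANT column names):

ALTER TABLE IOT_EVENTS ADD COLUMN CREATE_DATE TIMESTAMP_NTZ;

UPDATE IOT_EVENTS SET CREATE_DATE = PAYLOAD:create_date::TIMESTAMP_NTZ;

-- Queries then filter on the typed column rather than the VARIANT path.
SELECT COUNT(*) FROM IOT_EVENTS WHERE CREATE_DATE >= '2024-01-01';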

Questions 18

A group of Data Analysts has been granted the ANALYST_ROLE role. They need a Snowflake database where they can create and modify tables, views, and other objects to load with their own data. The Analysts should not have the ability to give other Snowflake users outside of their role access to this data.

How should these requirements be met?

Options:

A.

Grant ANALYST_ROLE OWNERSHIP on the database, but make sure that ANALYST_ROLE does not have the MANAGE GRANTS privilege on the account.

B.

Grant SYSADMIN ownership of the database, but grant the create schema privilege on the database to the ANALYST_ROLE.

C.

Make every schema in the database a managed access schema, owned by SYSADMIN, and grant create privileges on each schema to the ANALYST_ROLE for each type of object that needs to be created.

D.

Grant ANALYST_ROLE ownership on the database, but grant the ownership on future [object type]s in database privilege to SYSADMIN.
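
For context, a managed access schema combined with object-creation grants can be sketched as follows (hypothetical database and schema names):

USE ROLE SYSADMIN;
CREATE SCHEMA ANALYTICS.WORKSPACE WITH MANAGED ACCESS;
GRANT USAGE ON DATABASE ANALYTICS TO ROLE ANALYST_ROLE;
GRANT USAGE, CREATE TABLE, CREATE VIEW ON SCHEMA ANALYTICS.WORKSPACE TO ROLE ANALYST_ROLE;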

Questions 19

An Architect needs to automate the daily import of two files from an external stage into Snowflake. One file has Parquet-formatted data, the other has CSV-formatted data.

How should the data be joined and aggregated to produce a final result set?

Options:

A.

Use Snowpipe to ingest the two files, then create a materialized view to produce the final result set.

B.

Create a task using Snowflake scripting that will import the files, and then call a User-Defined Function (UDF) to produce the final result set.

C.

Create a JavaScript stored procedure to read, join, and aggregate the data directly from the external stage, and then store the results in a table.

D.

Create a materialized view to read, join, and aggregate the data directly from the external stage, and use the view to produce the final result set.

Questions 20

A company has built a data pipeline using Snowpipe to ingest files from an Amazon S3 bucket. Snowpipe is configured to load data into staging database tables. Then a task runs to load the data from the staging database tables into the reporting database tables.

The company is satisfied with the availability of the data in the reporting database tables, but the reporting tables are not pruning effectively. Currently, a size 4X-Large virtual warehouse is being used to query all of the tables in the reporting database.

What step can be taken to improve the pruning of the reporting tables?

Options:

A.

Eliminate the use of Snowpipe and load the files into internal stages using PUT commands.

B.

Increase the size of the virtual warehouse to a size 5X-Large.

C.

Use an ORDER BY command to load the reporting tables.

D.

Create larger files for Snowpipe to ingest and ensure the staging frequency does not exceed 1 minute.
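
For reference, loading a reporting table in sorted order (so micro-partitions line up with the common filter column) can be sketched as follows (hypothetical table and column names):

-- Sorting on the filter column during the load improves natural clustering and pruning.
INSERT INTO REPORTING.SALES.ORDERS
SELECT * FROM STAGING.SALES.ORDERS
ORDER BY ORDER_DATE;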

Questions 21

The following table exists in the production database:

A regulatory requirement states that the company must mask the username for events that are older than six months based on the current date when the data is queried.

How can the requirement be met without duplicating the event data and making sure it is applied when creating views using the table or cloning the table?

Options:

A.

Use a masking policy on the username column using an entitlement table with valid dates.

B.

Use a row level policy on the user_events table using an entitlement table with valid dates.

C.

Use a masking policy on the username column with event_timestamp as a conditional column.

D.

Use a secure view on the user_events table using a case statement on the username column.
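
For reference, a masking policy that uses a second (conditional) column to decide whether to mask can be sketched as follows, reusing the scenario's column names:

CREATE MASKING POLICY MASK_OLD_USERNAMES AS
  (username STRING, event_timestamp TIMESTAMP_NTZ) RETURNS STRING ->
  CASE
    WHEN event_timestamp < DATEADD(month, -6, CURRENT_DATE()) THEN '***MASKED***'
    ELSE username
  END;

ALTER TABLE USER_EVENTS MODIFY COLUMN USERNAME
  SET MASKING POLICY MASK_OLD_USERNAMES USING (USERNAME, EVENT_TIMESTAMP);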

Questions 22

A Snowflake Architect is setting up database replication to support a disaster recovery plan. The primary database has external tables.

How should the database be replicated?

Options:

A.

Create a clone of the primary database then replicate the database.

B.

Move the external tables to a database that is not replicated, then replicate the primary database.

C.

Replicate the database ensuring the replicated database is in the same region as the external tables.

D.

Share the primary database with an account in the same region that the database will be replicated to.

Questions 23

A healthcare company is deploying a Snowflake account that may include Personal Health Information (PHI). The company must ensure compliance with all relevant privacy standards.

Which best practice recommendations will meet data protection and compliance requirements? (Choose three.)

Options:

A.

Use, at minimum, the Business Critical edition of Snowflake.

B.

Create Dynamic Data Masking policies and apply them to columns that contain PHI.

C.

Use the Internal Tokenization feature to obfuscate sensitive data.

D.

Use the External Tokenization feature to obfuscate sensitive data.

E.

Rewrite SQL queries to eliminate projections of PHI data based on current_role().

F.

Avoid sharing data with partner organizations.

Questions 24

A user is executing the following command sequentially within a timeframe of 10 minutes from start to finish:

[Exhibit image: ARA-R01 Question 24]

What would be the output of this query?

Options:

A.

Table T_SALES_CLONE successfully created.

B.

Time Travel data is not available for table T_SALES.

C.

The offset -> is not a valid clause in the clone operation.

D.

Syntax error line 1 at position 58 unexpected 'at'.

Questions 25

Company A would like to share data in Snowflake with Company B. Company B is not on the same cloud platform as Company A.

What is required to allow data sharing between these two companies?

Options:

A.

Create a pipeline to write shared data to a cloud storage location in the target cloud provider.

B.

Ensure that all views are persisted, as views cannot be shared across cloud platforms.

C.

Set up data replication to the region and cloud platform where the consumer resides.

D.

Company A and Company B must agree to use a single cloud platform: Data sharing is only possible if the companies share the same cloud provider.

Questions 26

An Architect is designing a data lake with Snowflake. The company has structured, semi-structured, and unstructured data. The company wants to save the data inside the data lake within the Snowflake system. The company is planning on sharing data among its corporate branches using Snowflake data sharing.

What should be considered when sharing the unstructured data within Snowflake?

Options:

A.

A pre-signed URL should be used to save the unstructured data into Snowflake in order to share data over secure views, with no time limit for the URL.

B.

A scoped URL should be used to save the unstructured data into Snowflake in order to share data over secure views, with a 24-hour time limit for the URL.

C.

A file URL should be used to save the unstructured data into Snowflake in order to share data over secure views, with a 7-day time limit for the URL.

D.

A file URL should be used to save the unstructured data into Snowflake in order to share data over secure views, with the "expiration_time" argument defined for the URL time limit.
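
For reference, the URL types in these options are produced by built-in functions; a scoped URL, for example, is generated like this (hypothetical stage and file path):

-- Scoped URLs are typically exposed through a secure view and expire after 24 hours.
SELECT BUILD_SCOPED_FILE_URL(@DOC_STAGE, 'contracts/q1_summary.pdf') AS scoped_url;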

Questions 27

Which organization-related tasks can be performed by the ORGADMIN role? (Choose three.)

Options:

A.

Changing the name of the organization

B.

Creating an account

C.

Viewing a list of organization accounts

D.

Changing the name of an account

E.

Deleting an account

F.

Enabling the replication of a database

Questions 28

Files arrive in an external stage every 10 seconds from a proprietary system. The files range in size from 500 KB to 3 MB. The data must be accessible by dashboards as soon as it arrives.

How can a Snowflake Architect meet this requirement with the LEAST amount of coding? (Choose two.)

Options:

A.

Use Snowpipe with auto-ingest.

B.

Use a COPY command with a task.

C.

Use a materialized view on an external table.

D.

Use the COPY INTO command.

E.

Use a combination of a task and a stream.

Questions 29

The diagram shows the process flow for Snowpipe auto-ingest with Amazon Simple Notification Service (SNS) with the following steps:

Step 1: Data files are loaded in a stage.

Step 2: An Amazon S3 event notification, published by SNS, informs Snowpipe, by way of Amazon Simple Queue Service (SQS), that files are ready to load. Snowpipe copies the files into a queue.

Step 3: A Snowflake-provided virtual warehouse loads data from the queued files into the target table based on parameters defined in the specified pipe.

[Exhibit image: ARA-R01 Question 29]

If an AWS Administrator accidentally deletes the SQS subscription to the SNS topic in Step 2, what will happen to the pipe that references the topic to receive event messages from Amazon S3?

Options:

A.

The pipe will continue to receive the messages as Snowflake will automatically restore the subscription to the same SNS topic and will recreate the pipe by specifying the same SNS topic name in the pipe definition.

B.

The pipe will no longer be able to receive the messages and the user must wait for 24 hours from the time when the SNS topic subscription was deleted. Pipe recreation is not required as the pipe will reuse the same subscription to the existing SNS topic after 24 hours.

C.

The pipe will continue to receive the messages as Snowflake will automatically restore the subscription by creating a new SNS topic. Snowflake will then recreate the pipe by specifying the new SNS topic name in the pipe definition.

D.

The pipe will no longer be able to receive the messages. To restore the system immediately, the user needs to manually create a new SNS topic with a different name and then recreate the pipe by specifying the new SNS topic name in the pipe definition.

Questions 30

What are characteristics of Dynamic Data Masking? (Select TWO).

Options:

A.

A masking policy that is currently set on a table can be dropped.

B.

A single masking policy can be applied to columns in different tables.

C.

A masking policy can be applied to the value column of an external table.

D.

The role that creates the masking policy will always see unmasked data in query results.

E.

A masking policy can be applied to a column with the GEOGRAPHY data type.

Questions 31

A company is using a Snowflake account in Azure. The account has SAML SSO set up using ADFS as a SCIM identity provider. To validate Private Link connectivity, an Architect performed the following steps:

* Confirmed Private Link URLs are working by logging in with a username/password account

* Verified DNS resolution by running nslookups against Private Link URLs

* Validated connectivity using SnowCD

* Disabled public access using a network policy set to use the company’s IP address range

However, the following error message is received when using SSO to log into the company account:

IP XX.XXX.XX.XX is not allowed to access snowflake. Contact your local security administrator.

What steps should the Architect take to resolve this error and ensure that the account is accessed using only Private Link? (Choose two.)

Options:

A.

Alter the Azure security integration to use the Private Link URLs.

B.

Add the IP address in the error message to the allowed list in the network policy.

C.

Generate a new SCIM access token using system$generate_scim_access_token and save it to Azure AD.

D.

Update the configuration of the Azure AD SSO to use the Private Link URLs.

E.

Open a case with Snowflake Support to authorize the Private Link URLs’ access to the account.

Questions 32

What is a key consideration when setting up search optimization service for a table?

Options:

A.

Search optimization service works best with a column that has a minimum of 100K distinct values.

B.

Search optimization service can significantly improve query performance on partitioned external tables.

C.

Search optimization service can help to optimize storage usage by compressing the data into a GZIP format.

D.

The table must be clustered with a key having multiple columns for effective search optimization.

Questions 33

An Architect would like to save quarter-end financial results for the previous six years.

Which Snowflake feature can the Architect use to accomplish this?

Options:

A.

Search optimization service

B.

Materialized view

C.

Time Travel

D.

Zero-copy cloning

E.

Secure views

Questions 34

An Architect has been asked to clone schema STAGING as it looked one week ago, Tuesday June 1st at 8:00 AM, to recover some objects.

The STAGING schema has 50 days of retention.

The Architect runs the following statement:

CREATE SCHEMA STAGING_CLONE CLONE STAGING at (timestamp => '2021-06-01 08:00:00');

The Architect receives the following error: Time travel data is not available for schema STAGING. The requested time is either beyond the allowed time travel period or before the object creation time.

The Architect then checks the schema history and sees the following:

CREATED_ON | NAME | DROPPED_ON

2021-06-02 23:00:00 | STAGING | NULL

2021-05-01 10:00:00 | STAGING | 2021-06-02 23:00:00

How can cloning the STAGING schema be achieved?

Options:

A.

Undrop the STAGING schema and then rerun the CLONE statement.

B.

Modify the statement: CREATE SCHEMA STAGING_CLONE CLONE STAGING at (timestamp => '2021-05-01 10:00:00');

C.

Rename the STAGING schema and perform an UNDROP to retrieve the previous STAGING schema version, then run the CLONE statement.

D.

Cloning cannot be accomplished because the STAGING schema version was not active during the proposed Time Travel time period.

Questions 35

What are some of the characteristics of result set caches? (Choose three.)

Options:

A.

Time Travel queries can be executed against the result set cache.

B.

Snowflake persists the data results for 24 hours.

C.

Each time persisted results for a query are used, a 24-hour retention period is reset.

D.

The data stored in the result cache will contribute to storage costs.

E.

The retention period can be reset for a maximum of 31 days.

F.

The result set cache is not shared between warehouses.

Questions 36

An Architect is using SnowCD to investigate a connectivity issue.

Which system function will provide a list of endpoints that the network must be able to access to use a specific Snowflake account, leveraging private connectivity?

Options:

A.

SYSTEM$ALLOWLIST()

B.

SYSTEM$GET_PRIVATELINK

C.

SYSTEM$AUTHORIZE_PRIVATELINK

D.

SYSTEM$ALLOWLIST_PRIVATELINK()
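
For reference, system functions in this family are invoked as plain SELECT statements, for example:

SELECT SYSTEM$ALLOWLIST_PRIVATELINK();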

Questions 37

A user can change object parameters using which of the following roles?

Options:

A.

ACCOUNTADMIN, SECURITYADMIN

B.

SYSADMIN, SECURITYADMIN

C.

ACCOUNTADMIN, USER with PRIVILEGE

D.

SECURITYADMIN, USER with PRIVILEGE

Questions 38

Which feature provides the capability to define an alternate cluster key for a table with an existing cluster key?

Options:

A.

External table

B.

Materialized view

C.

Search optimization

D.

Result cache

Questions 39

An Architect on a new project has been asked to design an architecture that meets Snowflake security, compliance, and governance requirements as follows:

1) Use Tri-Secret Secure in Snowflake

2) Share some information stored in a view with another Snowflake customer

3) Hide portions of sensitive information from some columns

4) Use zero-copy cloning to refresh the non-production environment from the production environment

To meet these requirements, which design elements must be implemented? (Choose three.)

Options:

A.

Define row access policies.

B.

Use the Business-Critical edition of Snowflake.

C.

Create a secure view.

D.

Use the Enterprise edition of Snowflake.

E.

Use Dynamic Data Masking.

F.

Create a materialized view.

Questions 40

An Architect is designing a file ingestion recovery solution. The project will use an internal named stage for file storage. Currently, in the case of an ingestion failure, the Operations team must manually download the failed file and check for errors.

Which downloading method should the Architect recommend that requires the LEAST amount of operational overhead?

Options:

A.

Use the Snowflake Connector for Python, connect to remote storage and download the file.

B.

Use the get command in SnowSQL to retrieve the file.

C.

Use the get command in Snowsight to retrieve the file.

D.

Use the Snowflake API endpoint and download the file.
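
For reference, a staged file is downloaded with the GET command, which runs in SnowSQL (hypothetical stage path and local directory):

GET @MY_INT_STAGE/failed/orders_2024_01_15.csv file:///tmp/recovered/;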

Questions 41

Which of the following ingestion methods can be used to load near real-time data by using the messaging services provided by a cloud provider?

Options:

A.

Snowflake Connector for Kafka

B.

Snowflake streams

C.

Snowpipe

D.

Spark

Questions 42

A company is using Snowflake in Azure in the Netherlands. The company's analyst team also has data in JSON format, stored in an Amazon S3 bucket in the AWS Singapore region, that the team wants to analyze.

The Architect has been given the following requirements:

1. Provide access to frequently changing data

2. Keep egress costs to a minimum

3. Maintain low latency

How can these requirements be met with the LEAST amount of operational overhead?

Options:

A.

Use a materialized view on top of an external table against the S3 bucket in AWS Singapore.

B.

Use an external table against the S3 bucket in AWS Singapore and copy the data into transient tables.

C.

Copy the data between providers from S3 to Azure Blob storage to collocate, then use Snowpipe for data ingestion.

D.

Use AWS Transfer Family to replicate data between the S3 bucket in AWS Singapore and an Azure Netherlands Blob storage, then use an external table against the Blob storage.

Questions 43

A Snowflake Architect is working with Data Modelers and Table Designers to draft an ELT framework specifically for data loading using Snowpipe. The Table Designers will add a timestamp column that inserts the current timestamp as the default value as records are loaded into a table. The intent is to capture the time when each record gets loaded into the table; however, when tested, the timestamps are earlier than the load_time column values returned by the COPY_HISTORY function or the COPY_HISTORY view (Account Usage).

Why is this occurring?

Options:

A.

The timestamps are different because there are parameter setup mismatches. The parameters need to be realigned.

B.

The Snowflake timezone parameter is different from the cloud provider's parameters, causing the mismatch.

C.

The Table Designer team has not used the localtimestamp or systimestamp functions in the Snowflake copy statement.

D.

The CURRENT_TIME is evaluated when the load operation is compiled in cloud services rather than when the record is inserted into the table.
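
For reference, the load history being compared against can be queried with the COPY_HISTORY table function (hypothetical table name and time window):

SELECT FILE_NAME, LAST_LOAD_TIME, ROW_COUNT
FROM TABLE(INFORMATION_SCHEMA.COPY_HISTORY(
  TABLE_NAME => 'RAW_EVENTS',
  START_TIME => DATEADD(hour, -24, CURRENT_TIMESTAMP())));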

Questions 44

How do Snowflake databases that are created from shares differ from standard databases that are not created from shares? (Choose three.)

Options:

A.

Shared databases are read-only.

B.

Shared databases must be refreshed in order for new data to be visible.

C.

Shared databases cannot be cloned.

D.

Shared databases are not supported by Time Travel.

E.

Shared databases will have the PUBLIC or INFORMATION_SCHEMA schemas without explicitly granting these schemas to the share.

F.

Shared databases can also be created as transient databases.

Questions 45

What transformations are supported in the below SQL statement? (Select THREE).

CREATE PIPE ... AS COPY ... FROM (...)

Options:

A.

Data can be filtered by an optional where clause.

B.

Columns can be reordered.

C.

Columns can be omitted.

D.

Type casts are supported.

E.

Incoming data can be joined with other tables.

F.

The ON_ERROR = ABORT_STATEMENT option can be used.
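
For context, a pipe whose COPY statement applies transformations over staged data can be sketched as follows (hypothetical names; a JSON file format is assumed to be attached to the stage):

CREATE PIPE REVIEWS_PIPE AS
  COPY INTO REVIEWS (REVIEW_ID, REVIEW_TEXT, CREATED_AT)
  FROM (
    SELECT $1:id::NUMBER,                -- cast
           $1:text::VARCHAR,             -- reorder or omit columns as needed
           $1:created_at::TIMESTAMP_NTZ
    FROM @REVIEWS_STAGE
  );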
