
ARA-C01 SnowPro Advanced: Architect Certification Exam Questions and Answers

Questions 4

An Architect is designing a file ingestion recovery solution. The project will use an internal named stage for file storage. Currently, in the case of an ingestion failure, the Operations team must manually download the failed file and check for errors.

Which downloading method should the Architect recommend that requires the LEAST amount of operational overhead?

Options:

A.

Use the Snowflake Connector for Python, connect to remote storage and download the file.

B.

Use the get command in SnowSQL to retrieve the file.

C.

Use the get command in Snowsight to retrieve the file.

D.

Use the Snowflake API endpoint and download the file.
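
For context, GET is a SnowSQL client command that downloads staged files to the local file system. A minimal sketch, with a hypothetical internal stage path and local directory:

GET @my_int_stage/failed/file5.csv file:///tmp/recovered/;  -- downloads the staged file to the local directory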

Questions 5

An Architect needs to improve the performance of reports that pull data from multiple Snowflake tables, join, and then aggregate the data. Users access the reports using several dashboards. There are performance issues on Monday mornings between 9:00am-11:00am when many users check the sales reports.

The size of the group has increased from 4 to 8 users. Waiting times to refresh the dashboards have increased significantly. Currently this workload is being served by a virtual warehouse with the following parameters:

AUTO_RESUME = TRUE, AUTO_SUSPEND = 60, SIZE = Medium

What is the MOST cost-effective way to increase the availability of the reports?

Options:

A.

Use materialized views and pre-calculate the data.

B.

Increase the warehouse to size Large and set auto_suspend = 600.

C.

Use a multi-cluster warehouse in maximized mode with 2 size Medium clusters.

D.

Use a multi-cluster warehouse in auto-scale mode with 1 size Medium cluster, and set min_cluster_count = 1 and max_cluster_count = 4.
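
For reference, auto-scale mode is configured by giving a warehouse different minimum and maximum cluster counts. A minimal sketch against a hypothetical warehouse name:

ALTER WAREHOUSE reporting_wh SET
  MIN_CLUSTER_COUNT = 1
  MAX_CLUSTER_COUNT = 4          -- extra Medium clusters start only while queries queue
  SCALING_POLICY = 'STANDARD';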

Questions 6

A company’s daily Snowflake workload consists of a huge number of concurrent queries triggered between 9pm and 11pm. Individually, these queries are small statements that complete within a short time period.

Which configurations can the company’s Architect implement to enhance the performance of this workload? (Choose two.)

Options:

A.

Enable a multi-clustered virtual warehouse in maximized mode during the workload duration.

B.

Set the MAX_CONCURRENCY_LEVEL to a higher value than its default value of 8 at the virtual warehouse level.

C.

Increase the size of the virtual warehouse to size X-Large.

D.

Reduce the amount of data that is being processed through this workload.

E.

Set the connection timeout to a higher value than its default.
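
For reference, a multi-cluster warehouse runs in maximized mode when its minimum and maximum cluster counts are set to the same value greater than 1. A minimal sketch with hypothetical names and illustrative sizes:

CREATE WAREHOUSE IF NOT EXISTS concurrency_wh WITH
  WAREHOUSE_SIZE = 'SMALL'
  MIN_CLUSTER_COUNT = 3          -- MIN = MAX > 1 keeps all clusters running for the workload window
  MAX_CLUSTER_COUNT = 3
  AUTO_SUSPEND = 60
  AUTO_RESUME = TRUE;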

Questions 7

Which of the following are characteristics of Snowflake’s parameter hierarchy?

Options:

A.

Session parameters override virtual warehouse parameters.

B.

Virtual warehouse parameters override user parameters.

C.

Table parameters override virtual warehouse parameters.

D.

Schema parameters override account parameters.
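
For reference, parameter precedence can be inspected directly: SHOW PARAMETERS reports the level (account, user, or session) at which the effective value was set. A minimal sketch using the TIMEZONE parameter:

ALTER ACCOUNT SET TIMEZONE = 'UTC';
ALTER SESSION SET TIMEZONE = 'Europe/London';   -- the session value overrides the account and user values
SHOW PARAMETERS LIKE 'TIMEZONE' IN SESSION;     -- the LEVEL column shows where the value came from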

Questions 8

An Architect is integrating an application that needs to read and write data to Snowflake without installing any additional software on the application server.

How can this requirement be met?

Options:

A.

Use SnowSQL.

B.

Use the Snowpipe REST API.

C.

Use the Snowflake SQL REST API.

D.

Use the Snowflake ODBC driver.

Questions 9

Which of the following objects can be cloned in Snowflake?

Options:

A.

Permanent table

B.

Transient table

C.

Temporary table

D.

External tables

E.

Internal stages
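
For reference, cloning uses the CREATE <object> ... CLONE syntax. A minimal sketch with hypothetical object names:

CREATE TABLE sales_dev CLONE sales;                    -- clone a permanent table
CREATE TRANSIENT TABLE stg_sales_dev CLONE stg_sales;  -- clone a transient table
CREATE DATABASE dev_db CLONE prod_db;                  -- clone an entire database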

Questions 10

What built-in Snowflake features make use of the change tracking metadata for a table? (Choose two.)

Options:

A.

The MERGE command

B.

The UPSERT command

C.

The CHANGES clause

D.

A STREAM object

E.

The CHANGE_DATA_CAPTURE command
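
For reference, both streams and the CHANGES clause read from a table's change tracking metadata. A minimal sketch with a hypothetical table:

ALTER TABLE orders SET CHANGE_TRACKING = TRUE;
CREATE STREAM orders_stream ON TABLE orders;  -- consumes changes, typically via MERGE
SELECT * FROM orders CHANGES (INFORMATION => DEFAULT) AT (OFFSET => -60*10);  -- ad hoc view of the last 10 minutes of changes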

Questions 11

Which of the following commands will use warehouse credits?

Options:

A.

SHOW TABLES LIKE 'SNOWFL%';

B.

SELECT MAX(FLAKE_ID) FROM SNOWFLAKE;

C.

SELECT COUNT(*) FROM SNOWFLAKE;

D.

SELECT COUNT(FLAKE_ID) FROM SNOWFLAKE GROUP BY FLAKE_ID;
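
For reference, metadata-only operations return without a running warehouse, while grouped aggregations require compute. A minimal sketch, assuming a hypothetical table named SNOWFLAKE with a FLAKE_ID column:

SHOW TABLES LIKE 'SNOWFL%';                                  -- served by the metadata service only
SELECT COUNT(*) FROM snowflake;                              -- can be answered from micro-partition metadata
SELECT flake_id, COUNT(*) FROM snowflake GROUP BY flake_id;  -- scans data, so the warehouse consumes credits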

Questions 12

Assuming all Snowflake accounts are using an Enterprise edition or higher, in which development and testing scenarios would copying of data be required, with zero-copy cloning not being suitable? (Select TWO).

Options:

A.

Developers create their own datasets to work against transformed versions of the live data.

B.

Production and development run in different databases in the same account, and Developers need to see production-like data but with specific columns masked.

C.

Data is in a production Snowflake account that needs to be provided to Developers in a separate development/testing Snowflake account in the same cloud region.

D.

Developers create their own copies of a standard test database previously created for them in the development account, for their initial development and unit testing.

E.

The release process requires pre-production testing of changes with data of production scale and complexity. For security reasons, pre-production also runs in the production account.

Questions 13

A global company needs to securely share its sales and inventory data with a vendor using a Snowflake account.

The company has its Snowflake account in the AWS eu-west-2 Europe (London) region. The vendor's Snowflake account is on the Azure platform in the West Europe region. How should the company's Architect configure the data share?

Options:

A.

1. Create a share.
2. Add objects to the share.
3. Add a consumer account to the share for the vendor to access.

B.

1. Create a share.
2. Create a reader account for the vendor to use.
3. Add the reader account to the share.

C.

1. Create a new role called db_share.
2. Grant the db_share role privileges to read data from the company database and schema.
3. Create a user for the vendor.
4. Grant the db_share role to the vendor's users.

D.

1. Promote an existing database in the company's local account to primary.
2. Replicate the database to Snowflake on Azure in the West Europe region.
3. Create a share and add objects to the share.
4. Add a consumer account to the share for the vendor to access.
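
For reference, the replicate-then-share pattern described in the last option looks roughly like the following sketch; all organization, account, and object names are hypothetical:

-- In the AWS account:
ALTER DATABASE market_db ENABLE REPLICATION TO ACCOUNTS myorg.azure_acct;
-- In the company's Azure account:
CREATE DATABASE market_db AS REPLICA OF myorg.aws_acct.market_db;
ALTER DATABASE market_db REFRESH;
CREATE SHARE market_share;
GRANT USAGE ON DATABASE market_db TO SHARE market_share;
GRANT SELECT ON ALL TABLES IN SCHEMA market_db.public TO SHARE market_share;
ALTER SHARE market_share ADD ACCOUNTS = vendor_acct;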

Questions 14

What is a characteristic of loading data into Snowflake using the Snowflake Connector for Kafka?

Options:

A.

The Connector only works in Snowflake regions that use AWS infrastructure.

B.

The Connector works with all file formats, including text, JSON, Avro, ORC, Parquet, and XML.

C.

The Connector creates and manages its own stage, file format, and pipe objects.

D.

Loads using the Connector will have lower latency than Snowpipe and will ingest data in real time.

Questions 15

An Architect uses COPY INTO with the ON_ERROR=SKIP_FILE option to bulk load CSV files into a table called TABLEA, using its table stage. One file named file5.csv fails to load. The Architect fixes the file and re-loads it to the stage with the exact same file name it had previously.

Which commands should the Architect use to load only file5.csv file from the stage? (Choose two.)

Options:

A.

COPY INTO tablea FROM @%tablea RETURN_FAILED_ONLY = TRUE;

B.

COPY INTO tablea FROM @%tablea;

C.

COPY INTO tablea FROM @%tablea FILES = ('file5.csv');

D.

COPY INTO tablea FROM @%tablea FORCE = TRUE;

E.

COPY INTO tablea FROM @%tablea NEW_FILES_ONLY = TRUE;

F.

COPY INTO tablea FROM @%tablea MERGE = TRUE;
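
For reference, COPY INTO can be pointed at a single staged file, and FORCE overrides the load history that would otherwise skip a previously loaded file name. A minimal sketch combining the two:

COPY INTO tablea FROM @%tablea FILES = ('file5.csv') FORCE = TRUE;  -- load only the named file, even though its name already appears in the load history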

Questions 16

Which of the following are characteristics of how row access policies can be applied to external tables? (Choose three.)

Options:

A.

An external table can be created with a row access policy, and the policy can be applied to the VALUE column.

B.

A row access policy can be applied to the VALUE column of an existing external table.

C.

A row access policy cannot be directly added to a virtual column of an external table.

D.

External tables are supported as mapping tables in a row access policy.

E.

While cloning a database, both the row access policy and the external table will be cloned.

F.

A row access policy cannot be applied to a view created on top of an external table.
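
For reference, a row access policy for an external table takes the VARIANT VALUE column as its signature argument. A minimal sketch; the stage, file format, role, and key names are all assumptions:

CREATE OR REPLACE ROW ACCESS POLICY rap_value AS (v VARIANT) RETURNS BOOLEAN ->
  IS_ROLE_IN_SESSION('SALES_ANALYST') AND v:region::STRING = 'EMEA';
CREATE EXTERNAL TABLE sales_ext
  WITH LOCATION = @ext_stage/sales/
  FILE_FORMAT = (TYPE = JSON)
  ROW ACCESS POLICY rap_value ON (VALUE);   -- the policy is bound to the VALUE column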

Questions 17

A DevOps team has a requirement for recovery of staging tables used in a complex set of data pipelines. The staging tables are all located in the same staging schema. One of the requirements is to have online recovery of data on a rolling 7-day basis.

After setting up the DATA_RETENTION_TIME_IN_DAYS at the database level, certain tables remain unrecoverable past 1 day.

What would cause this to occur? (Choose two.)

Options:

A.

The staging schema has not been set up for MANAGED ACCESS.

B.

The DATA_RETENTION_TIME_IN_DAYS for the staging schema has been set to 1 day.

C.

The tables exceed the 1 TB limit for data recovery.

D.

The staging tables are of the TRANSIENT type.

E.

The DevOps role should be granted ALLOW_RECOVERY privilege on the staging schema.
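
For reference, retention can be checked and set at each level, and transient tables are capped at 1 day regardless of the database setting. A minimal sketch with hypothetical names:

SHOW PARAMETERS LIKE 'DATA_RETENTION_TIME_IN_DAYS' IN SCHEMA staging;  -- reveals any lower-level override
ALTER SCHEMA staging SET DATA_RETENTION_TIME_IN_DAYS = 7;
-- A transient table cannot exceed 1 day of Time Travel; recreate it as permanent if 7-day recovery is required:
CREATE TABLE staging.stg_orders_perm AS SELECT * FROM staging.stg_orders;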

Questions 18

An Architect has designed a data pipeline that is receiving small CSV files from multiple sources. All of the files are landing in one location. Specific files are filtered for loading into Snowflake tables using the COPY command. The loading performance is poor.

What changes can be made to improve the data loading performance?

Options:

A.

Increase the size of the virtual warehouse.

B.

Create a multi-cluster warehouse and merge smaller files to create bigger files.

C.

Create a specific storage landing bucket to avoid file scanning.

D.

Change the file format from CSV to JSON.

Questions 19

The diagram shows the process flow for Snowpipe auto-ingest with Amazon Simple Notification Service (SNS) with the following steps:

Step 1: Data files are loaded in a stage.

Step 2: An Amazon S3 event notification, published by SNS, informs Snowpipe by way of Amazon Simple Queue Service (SQS) that files are ready to load. Snowpipe copies the files into a queue.

Step 3: A Snowflake-provided virtual warehouse loads data from the queued files into the target table based on parameters defined in the specified pipe.

[Diagram: Snowpipe auto-ingest process flow with SNS and SQS]

If an AWS Administrator accidentally deletes the SQS subscription to the SNS topic in Step 2, what will happen to the pipe that references the topic to receive event messages from Amazon S3?

Options:

A.

The pipe will continue to receive the messages as Snowflake will automatically restore the subscription to the same SNS topic and will recreate the pipe by specifying the same SNS topic name in the pipe definition.

B.

The pipe will no longer be able to receive the messages and the user must wait for 24 hours from the time when the SNS topic subscription was deleted. Pipe recreation is not required as the pipe will reuse the same subscription to the existing SNS topic after 24 hours.

C.

The pipe will continue to receive the messages as Snowflake will automatically restore the subscription by creating a new SNS topic. Snowflake will then recreate the pipe by specifying the new SNS topic name in the pipe definition.

D.

The pipe will no longer be able to receive the messages. To restore the system immediately, the user needs to manually create a new SNS topic with a different name and then recreate the pipe by specifying the new SNS topic name in the pipe definition.
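
For reference, a pipe's notification setup points at a specific SNS topic ARN at creation time, so changing the topic means recreating the pipe. A minimal sketch with hypothetical names and a placeholder ARN:

CREATE OR REPLACE PIPE iot_pipe
  AUTO_INGEST = TRUE
  AWS_SNS_TOPIC = 'arn:aws:sns:us-east-1:000000000000:new_topic'  -- the replacement topic
AS
  COPY INTO raw_events FROM @ext_stage;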

Questions 20

When using the Snowflake Connector for Kafka, what data formats are supported for the messages? (Choose two.)

Options:

A.

CSV

B.

XML

C.

Avro

D.

JSON

E.

Parquet

Questions 21

A Snowflake Architect is designing an application and tenancy strategy for an organization where strong legal isolation rules as well as multi-tenancy are requirements.

Which approach will meet these requirements if Role-Based Access Control (RBAC) is a viable option for isolating tenants?

Options:

A.

Create accounts for each tenant in the Snowflake organization.

B.

Create an object for each tenant strategy if row level security is viable for isolating tenants.

C.

Create an object for each tenant strategy if row level security is not viable for isolating tenants.

D.

Create a multi-tenant table strategy if row level security is not viable for isolating tenants.

Questions 22

Which SQL ALTER command will MAXIMIZE memory and compute resources for a Snowpark stored procedure when executed on the snowpark_opt_wh warehouse?

[Images: answer options A through D for Question 22, each showing an ALTER statement]

Options:

A.

Option A

B.

Option B

C.

Option C

D.

Option D
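
For reference, Snowflake's guidance for memory-heavy Snowpark workloads is to pair a Snowpark-optimized warehouse with a low concurrency level, since limiting concurrent queries gives each one more of the cluster's memory. A hedged sketch (the exact statement in the pictured options may differ, and the size is illustrative):

ALTER WAREHOUSE snowpark_opt_wh SET
  WAREHOUSE_SIZE = 'LARGE'
  MAX_CONCURRENCY_LEVEL = 1;   -- one query per cluster receives the full memory and compute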

Questions 23

A Snowflake Architect created a new data share and would like to verify that only specific records in secure views are visible to the consumers of the data share.

What is the recommended way to validate data accessibility by the consumers?

Options:

A.

Create reader accounts as shown below and impersonate the consumers by logging in with their credentials.
CREATE MANAGED ACCOUNT reader_acct1 ADMIN_NAME = user1, ADMIN_PASSWORD = 'Sdfed43da!44T', TYPE = READER;

B.

Create a row access policy as shown below and assign it to the data share.
CREATE OR REPLACE ROW ACCESS POLICY rap_acct AS (acct_id VARCHAR) RETURNS BOOLEAN ->
CASE WHEN 'acct1_role' = CURRENT_ROLE() THEN TRUE ELSE FALSE END;

C.

Set the session parameter called SIMULATED_DATA_SHARING_CONSUMER as shown below in order to impersonate the consumer accounts.
ALTER SESSION SET SIMULATED_DATA_SHARING_CONSUMER = 'Consumer_Acct1';

D.

Alter the share settings as shown below, in order to impersonate a specific consumer account.
ALTER SHARE sales_share SET ACCOUNTS = 'Consumer1' SHARE_RESTRICTIONS = TRUE;

Questions 24

A user is executing the following commands sequentially within a timeframe of 10 minutes from start to finish:

[Image: SQL command sequence for Question 24]

What would be the output of this query?

Options:

A.

Table T_SALES_CLONE successfully created.

B.

Time Travel data is not available for table T_SALES.

C.

The offset -> is not a valid clause in the clone operation.

D.

Syntax error line 1 at position 58 unexpected 'at'.

Questions 25

What actions are permitted when using the Snowflake SQL REST API? (Select TWO).

Options:

A.

The use of a GET command

B.

The use of a PUT command

C.

The use of a ROLLBACK command

D.

The use of a CALL command to a stored procedure which returns a table

E.

Submitting multiple SQL statements in a single call

Questions 26

An Architect is designing a data lake with Snowflake. The company has structured, semi-structured, and unstructured data. The company wants to save the data inside the data lake within the Snowflake system. The company is planning on sharing data among its corporate branches using Snowflake data sharing.

What should be considered when sharing the unstructured data within Snowflake?

Options:

A.

A pre-signed URL should be used to save the unstructured data into Snowflake in order to share data over secure views, with no time limit for the URL.

B.

A scoped URL should be used to save the unstructured data into Snowflake in order to share data over secure views, with a 24-hour time limit for the URL.

C.

A file URL should be used to save the unstructured data into Snowflake in order to share data over secure views, with a 7-day time limit for the URL.

D.

A file URL should be used to save the unstructured data into Snowflake in order to share data over secure views, with the "expiration_time" argument defined for the URL time limit.
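
For reference, scoped URLs are generated with BUILD_SCOPED_FILE_URL, typically inside a secure view, and they expire after 24 hours. A minimal sketch, assuming a hypothetical stage with a directory table enabled:

CREATE OR REPLACE SECURE VIEW doc_links AS
SELECT relative_path,
       BUILD_SCOPED_FILE_URL(@docs_stage, relative_path) AS scoped_url  -- valid for 24 hours
FROM DIRECTORY(@docs_stage);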

Questions 27

Which statements describe characteristics of the use of materialized views in Snowflake? (Choose two.)

Options:

A.

They can include ORDER BY clauses.

B.

They cannot include nested subqueries.

C.

They can include context functions, such as CURRENT_TIME().

D.

They can support MIN and MAX aggregates.

E.

They can support inner joins, but not outer joins.
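
For reference, materialized views are restricted to a single table (no joins or nested subqueries) but do support MIN and MAX aggregates. A minimal sketch with hypothetical names:

CREATE MATERIALIZED VIEW store_sales_mv AS
SELECT store_id,
       MIN(amount) AS min_amount,
       MAX(amount) AS max_amount
FROM sales
GROUP BY store_id;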

Questions 28

A company has a table named Data that contains corrupted data. The company wants to recover the data as it was 5 minutes ago using cloning and Time Travel.

What command will accomplish this?

Options:

A.

CREATE CLONE TABLE Recover_Data FROM Data AT(OFFSET => -60*5);

B.

CREATE CLONE Recover_Data FROM Data AT(OFFSET => -60*5);

C.

CREATE TABLE Recover_Data CLONE Data AT(OFFSET => -60*5);

D.

CREATE TABLE Recover Data CLONE Data AT(TIME => -60*5);

Questions 29

What is a characteristic of event notifications in Snowpipe?

Options:

A.

The load history is stored in the metadata of the target table.

B.

Notifications identify the cloud storage event and the actual data in the files.

C.

Snowflake can process all older notifications when a paused pipe is resumed.

D.

When a pipe is paused, event messages received for the pipe enter a limited retention period.

Questions 30

How is the change of local time due to daylight saving time handled in Snowflake tasks? (Choose two.)

Options:

A.

A task scheduled in a UTC-based schedule will have no issues with the time changes.

B.

Task schedules can be designed to follow specified or local time zones to accommodate the time changes.

C.

A task will move to a suspended state during the daylight saving time change.

D.

A frequent task execution schedule like minutes may not cause a problem, but will affect the task history.

E.

A task schedule will follow only the specified time and will fail to handle lost or duplicated hours.
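
For reference, a task schedule can be written as a CRON expression with an explicit time zone, which is how local-time behavior (including daylight saving shifts) is controlled. A minimal sketch; the warehouse, task, and procedure names are hypothetical:

CREATE TASK sales_refresh
  WAREHOUSE = etl_wh
  SCHEDULE = 'USING CRON 0 6 * * * America/New_York'  -- follows the named zone; use UTC to avoid shifts entirely
AS
  CALL refresh_sales();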

Questions 31

A company is designing a process for importing a large amount of IoT JSON data from cloud storage into Snowflake. New sets of IoT data get generated and uploaded approximately every 5 minutes.

Once the IoT data is in Snowflake, the company needs up-to-date information from an external vendor to join to the data. This data is then presented to users through a dashboard that shows different levels of aggregation. The external vendor is a Snowflake customer.

What solution will MINIMIZE complexity and MAXIMIZE performance?

Options:

A.

1. Create an external table over the JSON data in cloud storage.
2. Create a task that runs every 5 minutes to run a transformation procedure on new data, based on a saved timestamp.
3. Ask the vendor to expose an API so an external function can be used to generate a call to join the data back to the IoT data in the transformation procedure.
4. Give the dashboard tool access to the transformed table.
5. Perform the aggregations on the dashboard.

B.

1. Create an external table over the JSON data in cloud storage.
2. Create a task that runs every 5 minutes to run a transformation procedure on new data based on a saved timestamp.
3. Ask the vendor to create a data share with the required data that can be imported into the company's Snowflake account.
4. Join the vendor's data back to the IoT data using a transformation procedure.
5. Create views over the larger dataset to perform the aggregations required by the dashboard.

C.

1. Create a Snowpipe to bring the JSON data into Snowflake.
2. Use streams and tasks to trigger a transformation procedure when new JSON data arrives.
3. Ask the vendor to expose an API so an external function call can be made to join the vendor's data back to the IoT data in a transformation procedure.
4. Create materialized views over the larger dataset to perform the aggregations required by the dashboard.
5. Give the dashboard tool access to the materialized views.

D.

1. Create a Snowpipe to bring the JSON data into Snowflake.
2. Use streams and tasks to trigger a transformation procedure when new JSON data arrives.
3. Ask the vendor to create a data share with the required data that is then imported into the Snowflake account.
4. Join the vendor's data back to the IoT data in a transformation procedure.
5. Create materialized views over the larger dataset to perform the aggregations required by the dashboard.
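
For reference, the Snowpipe-plus-streams-and-tasks pattern in the last two options looks roughly like this sketch; all object names, and the single VARIANT column v on the raw table, are assumptions:

CREATE PIPE iot_pipe AUTO_INGEST = TRUE AS
  COPY INTO iot_raw FROM @iot_stage FILE_FORMAT = (TYPE = JSON);
CREATE STREAM iot_raw_stream ON TABLE iot_raw;
CREATE TASK iot_transform
  WAREHOUSE = etl_wh
  SCHEDULE = '5 MINUTE'
  WHEN SYSTEM$STREAM_HAS_DATA('IOT_RAW_STREAM')  -- runs only when new rows have arrived
AS
  INSERT INTO iot_curated
  SELECT v:device_id::STRING, v:reading::FLOAT, v:created_at::TIMESTAMP_NTZ
  FROM iot_raw_stream;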

Questions 32

A healthcare company is deploying a Snowflake account that may include Personal Health Information (PHI). The company must ensure compliance with all relevant privacy standards.

Which best practice recommendations will meet data protection and compliance requirements? (Choose three.)

Options:

A.

Use, at minimum, the Business Critical edition of Snowflake.

B.

Create Dynamic Data Masking policies and apply them to columns that contain PHI.

C.

Use the Internal Tokenization feature to obfuscate sensitive data.

D.

Use the External Tokenization feature to obfuscate sensitive data.

E.

Rewrite SQL queries to eliminate projections of PHI data based on current_role().

F.

Avoid sharing data with partner organizations.
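
For reference, Dynamic Data Masking is implemented as a masking policy attached to a column. A minimal sketch with hypothetical table, column, and role names:

CREATE OR REPLACE MASKING POLICY phi_mask AS (val STRING) RETURNS STRING ->
  CASE WHEN IS_ROLE_IN_SESSION('PHI_READER') THEN val ELSE '***MASKED***' END;
ALTER TABLE patients MODIFY COLUMN diagnosis SET MASKING POLICY phi_mask;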

Questions 33

What does a Snowflake Architect need to consider when implementing a Snowflake Connector for Kafka?

Options:

A.

Every Kafka message is in JSON or Avro format.

B.

The default retention time for Kafka topics is 14 days.

C.

The Kafka connector supports key pair authentication, OAuth, and basic authentication (for example, username and password).

D.

The Kafka connector will create one table and one pipe to ingest data for each topic. If the connector cannot create the table or the pipe it will result in an exception.

Questions 34

Which Snowflake architecture recommendation needs multiple Snowflake accounts for implementation?

Options:

A.

Enable a disaster recovery strategy across multiple cloud providers.

B.

Create external stages pointing to cloud providers and regions other than the region hosting the Snowflake account.

C.

Enable zero-copy cloning among the development, test, and production environments.

D.

Enable separation of the development, test, and production environments.

Questions 35

How can the Snowflake context functions be used to help determine whether a user is authorized to see data that has column-level security enforced? (Select TWO).

Options:

A.

Set masking policy conditions using current_role targeting the role in use for the current session.

B.

Set masking policy conditions using is_role_in_session targeting the role in use for the current account.

C.

Set masking policy conditions using invoker_role targeting the executing role in a SQL statement.

D.

Determine if there are ownership privileges on the masking policy that would allow the use of any function.

E.

Assign the accountadmin role to the user who is executing the object.
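
For reference, CURRENT_ROLE and INVOKER_ROLE behave differently inside a masking policy: the former returns the role in use for the session, while the latter returns the role executing the statement (for example, a view owner's role). A minimal sketch with hypothetical role names:

CREATE OR REPLACE MASKING POLICY email_mask AS (email STRING) RETURNS STRING ->
  CASE
    WHEN CURRENT_ROLE() = 'SUPPORT_ROLE' THEN email   -- session role sees clear text
    WHEN INVOKER_ROLE() = 'REPORTING_ROLE' THEN REGEXP_REPLACE(email, '.+@', '*****@')  -- executing role sees a partial value
    ELSE '***MASKED***'
  END;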

Questions 36

A company has an inbound share set up with eight tables and five secure views. The company plans to make the share part of its production data pipelines.

Which actions can the company take with the inbound share? (Choose two.)

Options:

A.

Clone a table from a share.

B.

Grant modify permissions on the share.

C.

Create a table from the shared database.

D.

Create additional views inside the shared database.

E.

Create a table stream on the shared table.
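
For reference, an inbound share is consumed by creating a database from it; the shared objects themselves are read-only, but they can be queried and copied into local objects. A minimal sketch with hypothetical names:

CREATE DATABASE vendor_db FROM SHARE vendor_acct.inbound_share;
SELECT * FROM vendor_db.public.orders;       -- read-only access to shared data
CREATE TABLE analytics.public.orders_copy AS
  SELECT * FROM vendor_db.public.orders;     -- a writable local copy lives in the company's own database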

Questions 37

An Architect is troubleshooting a query with poor performance using the QUERY_HISTORY function. The Architect observes that the COMPILATION_TIME is greater than the EXECUTION_TIME.

What is the reason for this?

Options:

A.

The query is processing a very large dataset.

B.

The query has overly complex logic.

C.

The query is queued for execution.

D.

The query is reading from remote storage.

Questions 38

A company is storing large numbers of small JSON files (ranging from 1-4 bytes) that are received from IoT devices and sent to a cloud provider. In any given hour, 100,000 files are added to the cloud provider.

What is the MOST cost-effective way to bring this data into a Snowflake table?

Options:

A.

An external table

B.

A pipe

C.

A stream

D.

A copy command at regular intervals

Questions 39

Consider the following scenario where a masking policy is applied on the CREDITCARDNO column of the CREDITCARDINFO table. The masking policy definition is as follows:

[Image: masking policy definition for Question 39]

Sample data for the CREDITCARDINFO table is as follows:

NAME EXPIRYDATE CREDITCARDNO

JOHN DOE 2022-07-23 4321 5678 9012 1234

If the Snowflake system roles have not been granted any additional roles, what will be the result?

Options:

A.

The sysadmin can see the CREDITCARDNO column data in clear text.

B.

The owner of the table will see the CREDITCARDNO column data in clear text.

C.

Anyone with the PI_ANALYTICS role will see the last 4 characters of the CREDITCARDNO column data in clear text.

D.

Anyone with the PI_ANALYTICS role will see the CREDITCARDNO column as '***MASKED***'.

Questions 40

What are characteristics of the use of transactions in Snowflake? (Select TWO).

Options:

A.

Explicit transactions can contain DDL, DML, and query statements.

B.

The autocommit setting can be changed inside a stored procedure.

C.

A transaction can be started explicitly by executing a BEGIN WORK statement and ended explicitly by executing a COMMIT WORK statement.

D.

A transaction can be started explicitly by executing a BEGIN TRANSACTION statement and ended explicitly by executing an END TRANSACTION statement.

E.

Explicit transactions should contain only DML statements and query statements. All DDL statements implicitly commit active transactions.
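
For reference, a minimal explicit transaction looks like this (table and values are hypothetical); note that executing any DDL statement would implicitly commit the open transaction:

BEGIN TRANSACTION;   -- BEGIN WORK is an accepted synonym
INSERT INTO accounts (id, balance) VALUES (1, 100);
UPDATE accounts SET balance = balance - 10 WHERE id = 1;
COMMIT;              -- COMMIT WORK is also accepted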

Questions 41

The IT Security team has identified that there is an ongoing credential stuffing attack on many of their organization’s systems.

What is the BEST way to find recent and ongoing login attempts to Snowflake?

Options:

A.

Call the LOGIN_HISTORY Information Schema table function.

B.

Query the LOGIN_HISTORY view in the ACCOUNT_USAGE schema in the SNOWFLAKE database.

C.

View the History tab in the Snowflake UI and set up a filter for SQL text that contains the text "LOGIN".

D.

View the Users section in the Account tab in the Snowflake UI and review the last login column.
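
For reference, the Information Schema table function returns login events with minimal latency, which matters during an active attack. A minimal sketch listing failed logins from the last 24 hours:

SELECT event_timestamp, user_name, client_ip, error_message
FROM TABLE(INFORMATION_SCHEMA.LOGIN_HISTORY(
  TIME_RANGE_START => DATEADD('hour', -24, CURRENT_TIMESTAMP())))
WHERE is_success = 'NO'
ORDER BY event_timestamp DESC;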

Questions 42

A company has a Snowflake account named ACCOUNTA in the AWS us-east-1 region. The company stores its marketing data in a Snowflake database named MARKET_DB. One of the company’s business partners has an account named PARTNERB in the Azure East US 2 region. For marketing purposes the company has agreed to share the database MARKET_DB with the partner account.

Which of the following steps MUST be performed for the account PARTNERB to consume data from the MARKET_DB database?

Options:

A.

Create a new account (called AZABC123) in Azure East US 2 region. From account ACCOUNTA create a share of database MARKET_DB, create a new database out of this share locally in AWS us-east-1 region, and replicate this new database to AZABC123 account. Then set up data sharing to the PARTNERB account.

B.

From account ACCOUNTA create a share of database MARKET_DB, and create a new database out of this share locally in AWS us-east-1 region. Then make this database the provider and share it with the PARTNERB account.

C.

Create a new account (called AZABC123) in Azure East US 2 region. From account ACCOUNTA replicate the database MARKET_DB to AZABC123 and from this account set up the data sharing to the PARTNERB account.

D.

Create a share of database MARKET_DB, and create a new database out of this share locally in AWS us-east-1 region. Then replicate this database to the partner’s account PARTNERB.

Questions 43

A company has a source system that provides JSON records for various IoT operations. The JSON is loaded directly into a persistent table with a VARIANT field. The data is quickly growing to hundreds of millions of records and performance is becoming an issue. There is a generic access pattern that is used to filter on the create_date key within the VARIANT field.

What can be done to improve performance?

Options:

A.

Alter the target table to include additional fields pulled from the JSON records. This would include a create_date field with a datatype of TIMESTAMP. When this field is used in the filter, partition pruning will occur.

B.

Alter the target table to include additional fields pulled from the JSON records. This would include a create_date field with a datatype of varchar. When this field is used in the filter, partition pruning will occur.

C.

Validate the size of the warehouse being used. If the record count is approaching hundreds of millions, size XL will be the minimum size required to process this amount of data.

D.

Incorporate the use of multiple tables partitioned by date ranges. When a user or process needs to query a particular date range, ensure the appropriate base table is used.
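
For reference, promoting a frequently filtered key out of the VARIANT into its own typed column lets Snowflake prune micro-partitions. A minimal sketch, assuming the table's VARIANT column is named v:

ALTER TABLE iot_events ADD COLUMN create_date DATE;
UPDATE iot_events SET create_date = TO_DATE(v:create_date::TIMESTAMP_NTZ);
ALTER TABLE iot_events CLUSTER BY (create_date);   -- optional: keeps partitions organized by date
SELECT COUNT(*) FROM iot_events WHERE create_date >= '2024-01-01';  -- the filter can now prune partitions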

Questions 44

An Architect clones a database and all of its objects, including tasks. After the cloning, the tasks stop running.

Why is this occurring?

Options:

A.

Tasks cannot be cloned.

B.

The objects that the tasks reference are not fully qualified.

C.

Cloned tasks are suspended by default and must be manually resumed.

D.

The Architect has insufficient privileges to alter tasks on the cloned database.
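
For reference, resuming cloned tasks is a one-statement operation per task, or one call for a whole task tree. A minimal sketch with hypothetical names:

ALTER TASK clone_db.etl.load_task RESUME;
SELECT SYSTEM$TASK_DEPENDENTS_ENABLE('clone_db.etl.root_task');  -- resumes a root task and all of its dependents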

Questions 45

Data is being imported and stored as JSON in a VARIANT column. Query performance was fine, but most recently, poor query performance has been reported.

What could be causing this?

Options:

A.

There were JSON nulls in the recent data imports.

B.

The order of the keys in the JSON was changed.

C.

The recent data imports contained fewer fields than usual.

D.

There were variations in string lengths for the JSON values in the recent data imports.

Questions 46

An Architect needs to allow a user to create a database from an inbound share.

To meet this requirement, the user’s role must have which privileges? (Choose two.)

Options:

A.

IMPORT SHARE;

B.

IMPORT PRIVILEGES;

C.

CREATE DATABASE;

D.

CREATE SHARE;

E.

IMPORT DATABASE;
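
For reference, both of the privileges involved here are account-level grants. A minimal sketch with a hypothetical role, followed by the statement the role can then run:

GRANT IMPORT SHARE ON ACCOUNT TO ROLE data_consumer;
GRANT CREATE DATABASE ON ACCOUNT TO ROLE data_consumer;
-- The role can now do:
CREATE DATABASE shared_db FROM SHARE provider_acct.their_share;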

Questions 47

A table for IoT devices that measures water usage is created. The table quickly becomes large and contains more than 2 billion rows.

[Image: table definition for Question 47]

The general query patterns for the table are:

1. DeviceId, IOT_timestamp, and CustomerId are frequently used in the filter predicate for the SELECT statement

2. The columns City and DeviceManufacturer are often retrieved

3. There is often a count on UniqueId

Which field(s) should be used for the clustering key?

Options:

A.

IOT_timestamp

B.

City and DeviceManufacturer

C.

DeviceId and CustomerId

D.

UniqueId
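
For reference, whichever columns are ultimately chosen, a clustering key is applied and then verified like this (the table name is hypothetical, and the column pair is illustrative of the filter-predicate candidates):

ALTER TABLE water_usage CLUSTER BY (DeviceId, CustomerId);
SELECT SYSTEM$CLUSTERING_INFORMATION('water_usage', '(DeviceId, CustomerId)');  -- reports clustering depth and overlap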

Questions 48

A company needs to have the following features available in its Snowflake account:

1. Support for Multi-Factor Authentication (MFA)

2. A minimum of 2 months of Time Travel availability

3. Database replication in between different regions

4. Native support for JDBC and ODBC

5. Customer-managed encryption keys using Tri-Secret Secure

6. Support for Payment Card Industry Data Security Standards (PCI DSS)

In order to provide all the listed services, what is the MINIMUM Snowflake edition that should be selected during account creation?

Options:

A.

Standard

B.

Enterprise

C.

Business Critical

D.

Virtual Private Snowflake (VPS)

Exam Code: ARA-C01
Exam Name: SnowPro Advanced: Architect Certification Exam
Last Update: Jun 25, 2025
Questions: 162
