
ARA-C01 SnowPro Advanced: Architect Certification Exam Questions and Answers

Questions 4

A company uses the COPY INTO <table> command, and the following sequence of events occurs:

• A file is staged on March 1

• The table is loaded on March 2

• On June 30, the company attempts to reload the same file into the same table, but the file is skipped

Which options can load the file? (Select TWO).

Options:

A.

Set the PURGE option to TRUE.

B.

Set the FORCE option to TRUE.

C.

Set the VALIDATION_MODE option to FALSE.

D.

Set the LOAD_UNCERTAIN_FILES option to TRUE.

E.

Set the ALLOW_DUPLICATE option to TRUE.
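
For reference, a minimal sketch of the two load-metadata options on COPY INTO, assuming a hypothetical table T1 and stage MY_STAGE (these names are illustrative, not from the question):

-- Reload staged files regardless of load metadata (may create duplicate rows):
COPY INTO T1 FROM @MY_STAGE FORCE = TRUE;

-- Load files whose load metadata has expired (staged more than 64 days ago):
COPY INTO T1 FROM @MY_STAGE LOAD_UNCERTAIN_FILES = TRUE;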

Questions 5

An Architect needs to automate the daily import of two files from an external stage into Snowflake. One file contains Parquet-formatted data, and the other contains CSV-formatted data.

How should the data be joined and aggregated to produce a final result set?

Options:

A.

Use Snowpipe to ingest the two files, then create a materialized view to produce the final result set.

B.

Create a task using Snowflake scripting that will import the files, and then call a User-Defined Function (UDF) to produce the final result set.

C.

Create a JavaScript stored procedure to read, join, and aggregate the data directly from the external stage, and then store the results in a table.

D.

Create a materialized view to read, join, and aggregate the data directly from the external stage, and use the view to produce the final result set.

Questions 6

When using the Snowflake Connector for Kafka, what data formats are supported for the messages? (Choose two.)

Options:

A.

CSV

B.

XML

C.

Avro

D.

JSON

E.

Parquet

Questions 7

When loading data into a table that captures the load time in a column with a default value of either CURRENT_TIME() or CURRENT_TIMESTAMP(), what will occur?

Options:

A.

All rows loaded using a specific COPY statement will have varying timestamps based on when the rows were inserted.

B.

Any rows loaded using a specific COPY statement will have varying timestamps based on when the rows were read from the source.

C.

Any rows loaded using a specific COPY statement will have varying timestamps based on when the rows were created in the source.

D.

All rows loaded using a specific COPY statement will have the same timestamp value.
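
As a point of reference, a minimal sketch of a table with a load-time default, using hypothetical names:

CREATE TABLE T_LOADS (
    ID INTEGER,
    LOADED_AT TIMESTAMP_LTZ DEFAULT CURRENT_TIMESTAMP()
);

-- Columns omitted from the COPY column list are populated from their defaults.
COPY INTO T_LOADS (ID) FROM @MY_STAGE;

Since CURRENT_TIMESTAMP() is evaluated once per statement rather than once per row, this sketch illustrates why a single COPY produces one timestamp value for the whole batch.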

Questions 8

Data is being imported and stored as JSON in a VARIANT column. Query performance was fine, but most recently, poor query performance has been reported.

What could be causing this?

Options:

A.

There were JSON nulls in the recent data imports.

B.

The order of the keys in the JSON was changed.

C.

The recent data imports contained fewer fields than usual.

D.

There were variations in string lengths for the JSON values in the recent data imports.

Questions 9

A company has a table named Data that contains corrupted data. The company wants to recover the data as it was 5 minutes ago, using cloning and Time Travel.

What command will accomplish this?

Options:

A.

CREATE CLONE TABLE Recover_Data FROM Data AT(OFFSET => -60*5);

B.

CREATE CLONE Recover_Data FROM Data AT(OFFSET => -60*5);

C.

CREATE TABLE Recover_Data CLONE Data AT(OFFSET => -60*5);

D.

CREATE TABLE Recover_Data CLONE Data AT(TIME => -60*5);
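
For context, Snowflake's documented clone-with-Time-Travel syntax attaches an AT or BEFORE clause to a CREATE TABLE ... CLONE statement; a sketch with hypothetical names:

-- Clone T_SOURCE as it existed 5 minutes (300 seconds) ago:
CREATE TABLE T_RECOVERED CLONE T_SOURCE AT(OFFSET => -60*5);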

Questions 10

An Architect runs the following SQL query:

[Image: ARA-C01 Question 10]

How can this query be interpreted?

Options:

A.

FILEROWS is a stage. FILE_ROW_NUMBER is the line number in the file.

B.

FILEROWS is the table. FILE_ROW_NUMBER is the line number in the table.

C.

FILEROWS is a file. FILE_ROW_NUMBER is the file format location.

D.

FILEROWS is the file format location. FILE_ROW_NUMBER is a stage.

Questions 11

A retail company has over 3000 stores all using the same Point of Sale (POS) system. The company wants to deliver near real-time sales results to category managers. The stores operate in a variety of time zones and exhibit a dynamic range of transactions each minute, with some stores having higher sales volumes than others.

Sales results are provided in a uniform fashion using data engineered fields that will be calculated in a complex data pipeline. Calculations include exceptions, aggregations, and scoring using external functions interfaced to scoring algorithms. The source data for aggregations has over 100M rows.

Every minute, the POS sends all sales transactions files to a cloud storage location with a naming convention that includes store numbers and timestamps to identify the set of transactions contained in the files. The files are typically less than 10MB in size.

How can the near real-time results be provided to the category managers? (Select TWO).

Options:

A.

All files should be concatenated before ingestion into Snowflake to avoid micro-ingestion.

B.

A Snowpipe should be created and configured with AUTO_INGEST = true. A stream should be created to process INSERTS into a single target table using the stream metadata to inform the store number and timestamps.

C.

A stream should be created to accumulate the near real-time data and a task should be created that runs at a frequency that matches the real-time analytics needs.

D.

An external scheduler should examine the contents of the cloud storage location and issue SnowSQL commands to process the data at a frequency that matches the real-time analytics needs.

E.

The COPY INTO command, with a task scheduled to run every second, should be used to achieve the near real-time requirement.
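
For illustration, a minimal Snowpipe definition with auto-ingest and file metadata capture, using hypothetical table, stage, and column names:

CREATE PIPE SALES_PIPE AUTO_INGEST = TRUE AS
COPY INTO SALES_RAW (SRC_FILE, STORE_TXN_LINE)
FROM (
    -- METADATA$FILENAME exposes the staged file name, which here encodes
    -- the store number and timestamp per the naming convention above.
    SELECT METADATA$FILENAME, $1
    FROM @SALES_EXT_STAGE
)
FILE_FORMAT = (TYPE = 'CSV');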

Questions 12

Which of the following commands will use warehouse credits?

Options:

A.

SHOW TABLES LIKE 'SNOWFL%';

B.

SELECT MAX(FLAKE_ID) FROM SNOWFLAKE;

C.

SELECT COUNT(*) FROM SNOWFLAKE;

D.

SELECT COUNT(FLAKE_ID) FROM SNOWFLAKE GROUP BY FLAKE_ID;

Questions 13

Which SQL ALTER command will MAXIMIZE memory and compute resources for a Snowpark stored procedure when executed on the snowpark_opt_wh warehouse?

Options:

A.

ALTER WAREHOUSE snowpark_opt_wh SET MAX_CONCURRENCY_LEVEL = 1;

B.

ALTER WAREHOUSE snowpark_opt_wh SET MAX_CONCURRENCY_LEVEL = 2;

C.

ALTER WAREHOUSE snowpark_opt_wh SET MAX_CONCURRENCY_LEVEL = 8;

D.

ALTER WAREHOUSE snowpark_opt_wh SET MAX_CONCURRENCY_LEVEL = 16;

Questions 14

An Architect has a table called leader_follower that contains a single column named JSON. The table has one row with the following structure:

{
  "activities": [
    { "activityNumber": 1, "winner": 5 },
    { "activityNumber": 2, "winner": 4 }
  ],
  "follower": {
    "name": { "default": "Matt" },
    "number": 4
  },
  "leader": {
    "name": { "default": "Adam" },
    "number": 5
  }
}

Which query will produce the following results?

ACTIVITY_NUMBER  WINNER_NAME
1                Adam
2                Matt

Options:

A.

SELECT lf.json:activities.activityNumber AS activity_number,
    IFF(
        lf.json:activities.activityNumber = lf.json:leader.number,
        lf.json:leader.name.default,
        lf.json:follower.name.default
    )::VARCHAR
FROM leader_follower lf;

B.

SELECT
    value:activityNumber AS activity_number,
    IFF(
        value:winner = lf.json:leader.number,
        lf.json:leader.name.default,
        lf.json:follower.name.default
    )::VARCHAR AS winner_name
FROM leader_follower lf,
    LATERAL FLATTEN(input => json:activities) p;

C.

SELECT
    value:activityNumber AS activity_number,
    IFF(
        value:winner = lf.json:leader.number,
        lf.json:leader,
        lf.json:follower
    )::VARCHAR AS winner_name
FROM leader_follower lf,
    LATERAL FLATTEN(input => json:activities) p;
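
To experiment with these candidates, the source row can be recreated with PARSE_JSON; a sketch (the table name comes from the question, the rest is standard Snowflake):

CREATE OR REPLACE TABLE leader_follower (json VARIANT);

INSERT INTO leader_follower
SELECT PARSE_JSON('{"activities":[{"activityNumber":1,"winner":5},{"activityNumber":2,"winner":4}],"follower":{"name":{"default":"Matt"},"number":4},"leader":{"name":{"default":"Adam"},"number":5}}');

-- LATERAL FLATTEN(input => json:activities) then yields one row per activity,
-- exposing each array element through the VALUE column.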

Questions 15

A table, EMP_TBL, has three records as shown:

[Image: ARA-C01 Question 15 - table records]

The following variables are set for the session:

[Image: ARA-C01 Question 15 - session variables]

Which SELECT statements will retrieve all three records? (Select TWO).

Options:

A.

SELECT * FROM $tbl_ref WHERE $col_ref IN ('Name1','Name2','Name3');

B.

SELECT * FROM EMP_TBL WHERE identifier($col_ref) IN ('Name1','Name2','Name3');

C.

SELECT * FROM identifier WHERE NAME IN ($var1, $var2, $var3);

D.

SELECT * FROM identifier($tbl_ref) WHERE ID IN ('var1','var2','var3');

E.

SELECT * FROM $tbl_ref WHERE $col_ref IN ($var1, $var2, $var3);
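
For background, session variables are referenced with a $ prefix, and IDENTIFIER() resolves a string value as an object name; a sketch using the table name from the question and assuming it has a NAME column, as the options suggest:

SET tbl_ref = 'EMP_TBL';
SET var1 = 'Name1';

-- IDENTIFIER() resolves the variable's value as a table name:
SELECT * FROM IDENTIFIER($tbl_ref) WHERE NAME = $var1;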

Questions 16

A company's Architect needs to find an efficient way to get data from an external partner, who is also a Snowflake user. The current solution is based on daily JSON extracts that are placed on an FTP server and uploaded to Snowflake manually. The files are changed several times each month, and the ingestion process needs to be adapted to accommodate these changes.

What would be the MOST efficient solution?

Options:

A.

Ask the partner to create a share and add the company's account.

B.

Ask the partner to use the data lake export feature and place the data into cloud storage where Snowflake can natively ingest it (schema-on-read).

C.

Keep the current structure but request that the partner stop changing files, instead only appending new files.

D.

Ask the partner to set up a Snowflake reader account and use that account to get the data for ingestion.

Questions 17

A company is designing high availability and disaster recovery plans and needs to maximize redundancy and minimize recovery time objectives for their critical application processes. Cost is not a concern as long as the solution is the best available. The plan so far consists of the following steps:

1. Deployment of Snowflake accounts on two different cloud providers.

2. Selection of cloud provider regions that are geographically far apart.

3. The Snowflake deployment will replicate the databases and account data between both cloud provider accounts.

4. Implementation of Snowflake client redirect.

What is the MOST cost-effective way to provide the HIGHEST uptime and LEAST application disruption if there is a service event?

Options:

A.

Connect the applications using the - URL. Use the Business Critical Snowflake edition.

B.

Connect the applications using the - URL. Use the Virtual Private Snowflake (VPS) edition.

C.

Connect the applications using the -<accountLocator> URL. Use the Enterprise Snowflake edition.

D.

Connect the applications using the -<accountLocator> URL. Use the Business Critical Snowflake edition.

Questions 18

An Architect is designing a solution that will be used to process changed records in an ORDERS table. Newly-inserted orders must be loaded into the F_ORDERS fact table, which will aggregate all the orders by multiple dimensions (time, region, channel, etc.). Existing orders can be updated by the sales department within 30 days after the order creation. In case of an order update, the solution must perform two actions:

1. Update the order in the F_ORDERS fact table.

2. Load the changed order data into the special table ORDER_REPAIRS.

This table is used by the Accounting department once a month. If the order has been changed, the Accounting team needs to know the latest details and perform the necessary actions based on the data in the ORDER_REPAIRS table.

What data processing logic design will be the MOST performant?

Options:

A.

Use one stream and one task.

B.

Use one stream and two tasks.

C.

Use two streams and one task.

D.

Use two streams and two tasks.
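
For orientation, a stream-plus-task pattern over the orders table might look like the following sketch (warehouse, task, and column names are illustrative, not from the question):

CREATE OR REPLACE STREAM ORDERS_STREAM ON TABLE ORDERS;

CREATE OR REPLACE TASK ORDER_REPAIRS_TASK
WAREHOUSE = ETL_WH
SCHEDULE = '5 MINUTE'
WHEN SYSTEM$STREAM_HAS_DATA('ORDERS_STREAM')
AS
INSERT INTO ORDER_REPAIRS (ORDER_ID, AMOUNT, CHANGED_AT)
SELECT ORDER_ID, AMOUNT, CURRENT_TIMESTAMP()
FROM ORDERS_STREAM
WHERE METADATA$ISUPDATE;

A stream's offset advances once a DML statement consumes it, which is the crux of deciding how many streams and tasks the design needs.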

Questions 19

A large manufacturing company runs a dozen individual Snowflake accounts across its business divisions. The company wants to increase the level of data sharing to support supply chain optimizations and increase its purchasing leverage with multiple vendors.

The company’s Snowflake Architects need to design a solution that would allow the business divisions to decide what to share, while minimizing the level of effort spent on configuration and management. Most of the company divisions use Snowflake accounts in the same cloud deployments with a few exceptions for European-based divisions.

According to Snowflake recommended best practice, how should these requirements be met?

Options:

A.

Migrate the European accounts in the global region and manage shares in a connected graph architecture. Deploy a Data Exchange.

B.

Deploy a Private Data Exchange in combination with data shares for the European accounts.

C.

Deploy to the Snowflake Marketplace making sure that invoker_share() is used in all secure views.

D.

Deploy a Private Data Exchange and use replication to allow European data shares in the Exchange.

Questions 20

An Architect entered the following commands in sequence:

[Image: ARA-C01 Question 20]

USER1 cannot find the table.

Which of the following commands does the Architect need to run for USER1 to find the tables using the Principle of Least Privilege? (Choose two.)

Options:

A.

GRANT ROLE PUBLIC TO ROLE INTERN;

B.

GRANT USAGE ON DATABASE SANDBOX TO ROLE INTERN;

C.

GRANT USAGE ON SCHEMA SANDBOX.PUBLIC TO ROLE INTERN;

D.

GRANT OWNERSHIP ON DATABASE SANDBOX TO USER INTERN;

E.

GRANT ALL PRIVILEGES ON DATABASE SANDBOX TO ROLE INTERN;
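
For context, the least-privilege chain for object discovery requires USAGE at each container level; a sketch:

GRANT USAGE ON DATABASE SANDBOX TO ROLE INTERN;
GRANT USAGE ON SCHEMA SANDBOX.PUBLIC TO ROLE INTERN;
-- SELECT on the tables themselves would additionally be required to query them.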

Questions 21

An Architect on a new project has been asked to design an architecture that meets Snowflake security, compliance, and governance requirements as follows:

1) Use Tri-Secret Secure in Snowflake

2) Share some information stored in a view with another Snowflake customer

3) Hide portions of sensitive information from some columns

4) Use zero-copy cloning to refresh the non-production environment from the production environment

To meet these requirements, which design elements must be implemented? (Choose three.)

Options:

A.

Define row access policies.

B.

Use the Business-Critical edition of Snowflake.

C.

Create a secure view.

D.

Use the Enterprise edition of Snowflake.

E.

Use Dynamic Data Masking.

F.

Create a materialized view.

Questions 22

User1 and User2 are new users that were granted different functional roles.

User1 was granted the IT_ANALYST_ROLE

User2 was granted the FIN_ANALYST_ROLE

Review the following security design (as shown in the diagram):

[Image: ARA-C01 Question 22 - security design diagram]

A database (DB) grants USAGE and SELECT on all tables to DB_IT_RO_ROLE

DB_IT_RO_ROLE is granted to IT_ANALYST_ROLE

IT_SCHEMA contains TABLE1

FINANCE_SCHEMA grants USAGE and SELECT to DB_FIN_ROLE

DB_FIN_ROLE is granted to FIN_ANALYST_ROLE

FINANCE_SCHEMA contains FIN_TABLE

Which tables can each user read?

Options:

A.

User1 will be the only user able to read tables from both schemas, since the DB_IT_RO_ROLE has SELECT privileges on all database tables.

B.

User1 will be able to read tables from both schemas, while User2 will be able to read only the FINANCE_SCHEMA tables.

C.

User2 will be able to read tables from the FINANCE_SCHEMA, while User1 will be unable to read any table.

D.

User2 will be able to read tables from both schemas, while User1 will be able to read tables only in IT_SCHEMA.

Questions 23

A media company needs a data pipeline that will ingest customer review data into a Snowflake table, and apply some transformations. The company also needs to use Amazon Comprehend to do sentiment analysis and make the de-identified final data set available publicly for advertising companies who use different cloud providers in different regions.

The data pipeline needs to run continuously and efficiently as new records arrive in the object storage, leveraging event notifications. Also, the operational complexity, the maintenance of the infrastructure (including platform upgrades and security), and the development effort should be minimal.

Which design will meet these requirements?

Options:

A.

Ingest the data using COPY INTO and use streams and tasks to orchestrate transformations. Export the data into Amazon S3 to do model inference with Amazon Comprehend and ingest the data back into a Snowflake table. Then create a listing in the Snowflake Marketplace to make the data available to other companies.

B.

Ingest the data using Snowpipe and use streams and tasks to orchestrate transformations. Create an external function to do model inference with Amazon Comprehend and write the final records to a Snowflake table. Then create a listing in the Snowflake Marketplace to make the data available to other companies.

C.

Ingest the data into Snowflake using Amazon EMR and PySpark using the Snowflake Spark connector. Apply transformations using another Spark job. Develop a python program to do model inference by leveraging the Amazon Comprehend text analysis API. Then write the results to a Snowflake table and create a listing in the Snowflake Marketplace to make the data available to other companies.

D.

Ingest the data using Snowpipe and use streams and tasks to orchestrate transformations. Export the data into Amazon S3 to do model inference with Amazon Comprehend and ingest the data back into a Snowflake table. Then create a listing in the Snowflake Marketplace to make the data available to other companies.

Questions 24

How can the Snowpipe REST API be used to keep a log of data load history?

Options:

A.

Call insertReport every 20 minutes, fetching the last 10,000 entries.

B.

Call loadHistoryScan every minute for the maximum time range.

C.

Call insertReport every 8 minutes for a 10-minute time range.

D.

Call loadHistoryScan every 10 minutes for a 15-minute time range.

Questions 25

The IT Security team has identified that there is an ongoing credential stuffing attack on many of their organization's systems.

What is the BEST way to find recent and ongoing login attempts to Snowflake?

Options:

A.

Call the LOGIN_HISTORY Information Schema table function.

B.

Query the LOGIN_HISTORY view in the ACCOUNT_USAGE schema in the SNOWFLAKE database.

C.

View the History tab in the Snowflake UI and set up a filter for SQL text that contains the text "LOGIN".

D.

View the Users section in the Account tab in the Snowflake UI and review the last login column.

Questions 26

What are characteristics of the use of transactions in Snowflake? (Select TWO).

Options:

A.

Explicit transactions can contain DDL, DML, and query statements.

B.

The autocommit setting can be changed inside a stored procedure.

C.

A transaction can be started explicitly by executing a BEGIN WORK statement and ended explicitly by executing a COMMIT WORK statement.

D.

A transaction can be started explicitly by executing a BEGIN TRANSACTION statement and ended explicitly by executing an END TRANSACTION statement.

E.

Explicit transactions should contain only DML statements and query statements. All DDL statements implicitly commit active transactions.
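
For reference, a minimal explicit transaction, assuming a hypothetical table T1:

BEGIN TRANSACTION;
INSERT INTO T1 (ID) VALUES (1);
UPDATE T1 SET ID = 2 WHERE ID = 1;
COMMIT;

BEGIN WORK and COMMIT WORK are accepted synonyms in Snowflake, whereas DDL statements implicitly commit any open transaction.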

Questions 27

Which of the following are characteristics of how row access policies can be applied to external tables? (Choose three.)

Options:

A.

An external table can be created with a row access policy, and the policy can be applied to the VALUE column.

B.

A row access policy can be applied to the VALUE column of an existing external table.

C.

A row access policy cannot be directly added to a virtual column of an external table.

D.

External tables are supported as mapping tables in a row access policy.

E.

While cloning a database, both the row access policy and the external table will be cloned.

F.

A row access policy cannot be applied to a view created on top of an external table.

Questions 28

What integration object should be used to place restrictions on where data may be exported?

Options:

A.

Stage integration

B.

Security integration

C.

Storage integration

D.

API integration

Questions 29

Assuming all Snowflake accounts are using an Enterprise edition or higher, in which development and testing scenarios would copying of data be required, and zero-copy cloning not be suitable? (Select TWO).

Options:

A.

Developers create their own datasets to work against transformed versions of the live data.

B.

Production and development run in different databases in the same account, and Developers need to see production-like data but with specific columns masked.

C.

Data is in a production Snowflake account that needs to be provided to Developers in a separate development/testing Snowflake account in the same cloud region.

D.

Developers create their own copies of a standard test database previously created for them in the development account, for their initial development and unit testing.

E.

The release process requires pre-production testing of changes with data of production scale and complexity. For security reasons, pre-production also runs in the production account.

Questions 30

A company wants to configure the Client Redirect feature on their Snowflake account to ensure business continuity during failover events.

How should an Architect accomplish this?

Options:

A.

Enable users to log in using another Snowflake URL where replicated objects are available.

B.

Redirect users’ workloads and queries between available accounts.

C.

Redirect connections to an available account in the closest region.

D.

Redirect client connections to another Snowflake account in a different region where the secondary account acts as the primary.

Questions 31

Role A has the following permissions:

• USAGE on db1

• USAGE and CREATE VIEW on schema1 in db1

• SELECT on table1 in schema1

Role B has the following permissions:

• USAGE on db2

• USAGE and CREATE VIEW on schema2 in db2

• SELECT on table2 in schema2

A user has Role A set as the primary role and Role B as a secondary role.

What command will fail for this user?

Options:

A.

use database db1; use schema schema1; create view v1 as select * from db2.schema2.table2;

B.

use database db2; use schema schema2; create view v2 as select * from db1.schema1.table1;

C.

use database db2; use schema schema2; select * from db1.schema1.table1 union select * from table2;

D.

use database db1; use schema schema1; select * from db2.schema2.table2;

Questions 32

The Business Intelligence team reports that when some team members run queries for their dashboards in parallel with others, the query response time is getting significantly slower. What can a Snowflake Architect do to identify what is occurring and troubleshoot this issue?

A) [Image: ARA-C01 Question 32 - Option A]

B) [Image: ARA-C01 Question 32 - Option B]

C) [Image: ARA-C01 Question 32 - Option C]

D) [Image: ARA-C01 Question 32 - Option D]

Options:

A.

Option A

B.

Option B

C.

Option C

D.

Option D

Questions 33

An Architect is troubleshooting a query with poor performance using the QUERY_HISTORY function. The Architect observes that the COMPILATION_TIME is greater than the EXECUTION_TIME.

What is the reason for this?

Options:

A.

The query is processing a very large dataset.

B.

The query has overly complex logic.

C.

The query is queued for execution.

D.

The query is reading from remote storage.

Questions 34

You are a Snowflake Architect in an organization. The business team has asked you to deploy a use case that requires loading data which the team can visualize through Tableau. New data arrives every day, and the old data is no longer required.

What type of table should be used in this case to optimize cost?

Options:

A.

TRANSIENT

B.

TEMPORARY

C.

PERMANENT
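
For illustration, a transient table with no Fail-safe and a minimal Time Travel window keeps storage costs down for replaceable data; a sketch with hypothetical names:

CREATE TRANSIENT TABLE DAILY_SALES (
    SALE_DATE DATE,
    AMOUNT NUMBER(12,2)
)
DATA_RETENTION_TIME_IN_DAYS = 0;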

Questions 35

A user is executing the following commands sequentially, within a timeframe of 10 minutes from start to finish:

[Image: ARA-C01 Question 35]

What would be the output of this query?

Options:

A.

Table T_SALES_CLONE successfully created.

B.

Time Travel data is not available for table T_SALES.

C.

The offset -> is not a valid clause in the clone operation.

D.

Syntax error line 1 at position 58 unexpected 'at’.

Questions 36

What built-in Snowflake features make use of the change tracking metadata for a table? (Choose two.)

Options:

A.

The MERGE command

B.

The UPSERT command

C.

The CHANGES clause

D.

A STREAM object

E.

The CHANGE_DATA_CAPTURE command
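
For context, both change-tracking consumers depend on change tracking being enabled on the table; a sketch with a hypothetical table SRC:

ALTER TABLE SRC SET CHANGE_TRACKING = TRUE;

-- CHANGES clause: query the change rows between a past point and now.
SELECT * FROM SRC CHANGES(INFORMATION => DEFAULT) AT(OFFSET => -60*10);

-- Stream: a named object that tracks changes for downstream consumption.
CREATE OR REPLACE STREAM SRC_STREAM ON TABLE SRC;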

Questions 37

An Architect needs to design a Snowflake account and database strategy to store and analyze large amounts of structured and semi-structured data. There are many business units and departments within the company. The requirements are scalability, security, and cost efficiency.

What design should be used?

Options:

A.

Create a single Snowflake account and database for all data storage and analysis needs, regardless of data volume or complexity.

B.

Set up separate Snowflake accounts and databases for each department or business unit, to ensure data isolation and security.

C.

Use Snowflake's data lake functionality to store and analyze all data in a central location, without the need for structured schemas or indexes.

D.

Use a centralized Snowflake database for core business data, and use separate databases for departmental or project-specific data.

Questions 38

A company has a Snowflake environment running in AWS us-west-2 (Oregon). The company needs to share data privately with a customer who is running their Snowflake environment in Azure East US 2 (Virginia).

What is the recommended sequence of operations that must be followed to meet this requirement?

Options:

A.

1. Create a share and add the database privileges to the share.
2. Create a new listing on the Snowflake Marketplace.
3. Alter the listing and add the share.
4. Instruct the customer to subscribe to the listing on the Snowflake Marketplace.

B.

1. Ask the customer to create a new Snowflake account in Azure East US 2 (Virginia).
2. Create a share and add the database privileges to the share.
3. Alter the share and add the customer's Snowflake account to the share.

C.

1. Create a new Snowflake account in Azure East US 2 (Virginia).
2. Set up replication between AWS us-west-2 (Oregon) and Azure East US 2 (Virginia) for the database objects to be shared.
3. Create a share and add the database privileges to the share.
4. Alter the share and add the customer's Snowflake account to the share.

D.

1. Create a reader account in Azure East US 2 (Virginia).
2. Create a share and add the database privileges to the share.
3. Add the reader account to the share.
4. Share the reader account's URL and credentials with the customer.

Questions 39

A Snowflake Architect is designing a multi-tenant application strategy for an organization in the Snowflake Data Cloud and is considering using an Account Per Tenant strategy.

Which requirements will be addressed with this approach? (Choose two.)

Options:

A.

There needs to be fewer objects per tenant.

B.

Security and Role-Based Access Control (RBAC) policies must be simple to configure.

C.

Compute costs must be optimized.

D.

Tenant data shape may be unique per tenant.

E.

Storage costs must be optimized.

Questions 40

Which data models can be used when modeling tables in a Snowflake environment? (Select THREE).

Options:

A.

Graph model

B.

Dimensional/Kimball

C.

Data lake

D.

Inmon/3NF

E.

Bayesian hierarchical model

F.

Data vault

Questions 41

An Architect has a design where files arrive every 10 minutes and are loaded into a primary database table using Snowpipe. A secondary database is refreshed every hour with the latest data from the primary database.

Based on this scenario, what Time Travel query options are available on the secondary database?

Options:

A.

A query using Time Travel in the secondary database is available for every hourly table version within the retention window.

B.

A query using Time Travel in the secondary database is available for every hourly table version within and outside the retention window.

C.

Using Time Travel, secondary database users can query every iterative version within each hour (the individual Snowpipe loads) in the retention window.

D.

Using Time Travel, secondary database users can query every iterative version within each hour (the individual Snowpipe loads) and outside the retention window.

Questions 42

A user, analyst_user, has been granted the analyst_role and is deploying a SnowSQL script to run as a background service to extract data from Snowflake.

What steps should be taken to allow access only from a specific IP address? (Select TWO).

Options:

A.

ALTER ROLE ANALYST_ROLE SET NETWORK_POLICY = 'ANALYST_POLICY';

B.

ALTER USER ANALYST_USER SET NETWORK_POLICY = 'ANALYST_POLICY';

C.

ALTER USER ANALYST_USER SET NETWORK_POLICY = '10.1.1.20';

D.

USE ROLE SECURITYADMIN;
CREATE OR REPLACE NETWORK POLICY ANALYST_POLICY ALLOWED_IP_LIST = ('10.1.1.20');

E.

USE ROLE USERADMIN;
CREATE OR REPLACE NETWORK POLICY ANALYST_POLICY ALLOWED_IP_LIST = ('10.1.1.20');
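
Taken together, the documented pattern is to create a network policy with an allowed IP list and then attach it to the user; a sketch:

USE ROLE SECURITYADMIN;
CREATE OR REPLACE NETWORK POLICY ANALYST_POLICY ALLOWED_IP_LIST = ('10.1.1.20');
ALTER USER ANALYST_USER SET NETWORK_POLICY = 'ANALYST_POLICY';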

Questions 43

A company needs to share its product catalog data with one of its partners. The product catalog data is stored in two database tables: product_category and product_details. Both tables can be joined by the product_id column. Data access should be governed, and only the partner should have access to the records.

The partner is not a Snowflake customer. The partner uses Amazon S3 for cloud storage.

Which design will be the MOST cost-effective and secure, while using the required Snowflake features?

Options:

A.

Use Secure Data Sharing with an S3 bucket as a destination.

B.

Publish product_category and product_details data sets on the Snowflake Marketplace.

C.

Create a database user for the partner and give them access to the required data sets.

D.

Create a reader account for the partner and share the data sets as secure views.

Questions 44

What considerations need to be taken when using database cloning as a tool for data lifecycle management in a development environment? (Select TWO).

Options:

A.

Any pipes in the source are not cloned.

B.

Any pipes in the source referring to internal stages are not cloned.

C.

Any pipes in the source referring to external stages are not cloned.

D.

The clone inherits all granted privileges of all child objects in the source object, including the database.

E.

The clone inherits all granted privileges of all child objects in the source object, excluding the database.

Questions 45

A Snowflake Architect is setting up database replication to support a disaster recovery plan. The primary database has external tables.

How should the database be replicated?

Options:

A.

Create a clone of the primary database then replicate the database.

B.

Move the external tables to a database that is not replicated, then replicate the primary database.

C.

Replicate the database ensuring the replicated database is in the same region as the external tables.

D.

Share the primary database with an account in the same region that the database will be replicated to.

Questions 46

Consider the following COPY command, which loads CSV-formatted data into a Snowflake table from an internal stage through a data transformation query.

[Image: ARA-C01 Question 46 - COPY command]

This command results in the following error:

SQL compilation error: invalid parameter 'validation_mode'

Assuming the syntax is correct, what is the cause of this error?

Options:

A.

The VALIDATION_MODE parameter supports COPY statements that load data from external stages only.

B.

The VALIDATION_MODE parameter does not support COPY statements with CSV file formats.

C.

The VALIDATION_MODE parameter does not support COPY statements that transform data during a load.

D.

The value return_all_errors of the option VALIDATION_MODE is causing a compilation error.
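
For comparison, a sketch of a COPY where VALIDATION_MODE is valid, i.e., a plain load without a transformation SELECT (stage and table names are hypothetical):

COPY INTO T1 FROM @MY_INT_STAGE
FILE_FORMAT = (TYPE = 'CSV')
VALIDATION_MODE = RETURN_ALL_ERRORS;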

Questions 47

A company has a source system that provides JSON records for various IoT operations. The JSON is loaded directly into a persistent table with a VARIANT field. The data is quickly growing to hundreds of millions of records, and performance is becoming an issue. There is a generic access pattern that is used to filter on the create_date key within the VARIANT field.

What can be done to improve performance?

Options:

A.

Alter the target table to include additional fields pulled from the JSON records. This would include a create_date field with a datatype of TIMESTAMP. When this field is used in the filter, partition pruning will occur.

B.

Alter the target table to include additional fields pulled from the JSON records. This would include a create_date field with a datatype of VARCHAR. When this field is used in the filter, partition pruning will occur.

C.

Validate the size of the warehouse being used. If the record count is approaching hundreds of millions, size XL will be the minimum size required to process this amount of data.

D.

Incorporate the use of multiple tables partitioned by date ranges. When a user or process needs to query a particular date range, ensure the appropriate base table is used.
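
For illustration, one way to materialize the filter key into its own column, which Snowflake can then prune on; the table and VARIANT column names are hypothetical, and a one-off backfill like this rewrites existing partitions:

-- Assumes IOT_EVENTS has a VARIANT column named V holding the JSON records.
ALTER TABLE IOT_EVENTS ADD COLUMN CREATE_DATE TIMESTAMP_NTZ;
UPDATE IOT_EVENTS SET CREATE_DATE = TO_TIMESTAMP_NTZ(V:create_date::STRING);
-- New loads should populate CREATE_DATE at ingest time going forward.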

Questions 48

An Architect is integrating an application that needs to read and write data to Snowflake without installing any additional software on the application server.

How can this requirement be met?

Options:

A.

Use SnowSQL.

B.

Use the Snowpipe REST API.

C.

Use the Snowflake SQL REST API.

D.

Use the Snowflake ODBC driver.

Questions 49

An Architect needs to meet a company requirement to ingest files from the company's AWS storage accounts into the company's Snowflake Google Cloud Platform (GCP) account. How can the ingestion of these files into the company's Snowflake account be initiated? (Select TWO).

Options:

A.

Configure the client application to call the Snowpipe REST endpoint when new files have arrived in Amazon S3 storage.

B.

Configure the client application to call the Snowpipe REST endpoint when new files have arrived in Amazon S3 Glacier storage.

C.

Create an AWS Lambda function to call the Snowpipe REST endpoint when new files have arrived in Amazon S3 storage.

D.

Configure AWS Simple Notification Service (SNS) to notify Snowpipe when new files have arrived in Amazon S3 storage.

E.

Configure the client application to issue a COPY INTO <table> command to Snowflake when new files have arrived in Amazon S3 Glacier storage.

Questions 50

A global company needs to securely share its sales and inventory data with a vendor using a Snowflake account.

The company has its Snowflake account in the AWS eu-west-2 Europe (London) region. The vendor's Snowflake account is on the Azure platform in the West Europe region. How should the company's Architect configure the data share?

Options:

A.

1. Create a share.
2. Add objects to the share.
3. Add a consumer account to the share for the vendor to access.

B.

1. Create a share.
2. Create a reader account for the vendor to use.
3. Add the reader account to the share.

C.

1. Create a new role called db_share.
2. Grant the db_share role privileges to read data from the company database and schema.
3. Create a user for the vendor.
4. Grant the db_share role to the vendor's users.

D.

1. Promote an existing database in the company's local account to primary.
2. Replicate the database to Snowflake on Azure in the West Europe region.
3. Create a share and add objects to the share.
4. Add a consumer account to the share for the vendor to access.

Questions 51

An Architect has a VPN_ACCESS_LOGS table in the SECURITY_LOGS schema containing timestamps of the connection and disconnection, username of the user, and summary statistics.

What should the Architect do to enable the Snowflake search optimization service on this table?

Options:

A.

Assume role with OWNERSHIP on future tables and ADD SEARCH OPTIMIZATION on the SECURITY_LOGS schema.

B.

Assume role with ALL PRIVILEGES including ADD SEARCH OPTIMIZATION in the SECURITY_LOGS schema.

C.

Assume role with OWNERSHIP on VPN_ACCESS_LOGS and ADD SEARCH OPTIMIZATION in the SECURITY_LOGS schema.

D.

Assume role with ALL PRIVILEGES on VPN_ACCESS_LOGS and ADD SEARCH OPTIMIZATION in the SECURITY_LOGS schema.
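
For reference, once the required privileges are in place, enabling the service is a single statement:

ALTER TABLE SECURITY_LOGS.VPN_ACCESS_LOGS ADD SEARCH OPTIMIZATION;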

Questions 52

A table for IoT devices that measures water usage is created. The table quickly becomes large and contains more than 2 billion rows.

[Image: ARA-C01 Question 52 - table definition]

The general query patterns for the table are:

1. DeviceId, IOT_timestamp, and CustomerId are frequently used in the filter predicate for the SELECT statement

2. The columns City and DeviceManufacturer are often retrieved

3. There is often a count on UniqueId

Which field(s) should be used for the clustering key?

Options:

A.

IOT_timestamp

B.

City and DeviceManufacturer

C.

DeviceId and CustomerId

D.

UniqueId
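
For illustration, a clustering key is declared on the columns used in filter predicates; a sketch with a hypothetical table name:

ALTER TABLE IOT_WATER_USAGE CLUSTER BY (DeviceId, CustomerId);

As general Snowflake guidance, columns with moderate cardinality that appear in selective filters cluster well, whereas a fully unique column such as UniqueId makes a poor clustering key.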

Questions 53

Why might a Snowflake Architect use a star schema model rather than a 3NF model when designing a data architecture to run in Snowflake? (Select TWO).

Options:

A.

Snowflake cannot handle the joins implied in a 3NF data model.

B.

The Architect wants to remove data duplication from the data stored in Snowflake.

C.

The Architect is designing a landing zone to receive raw data into Snowflake.

D.

The BI tool needs a data model that allows users to summarize facts across different dimensions, or to drill down from the summaries.

E.

The Architect wants to present a simple flattened single view of the data to a particular group of end users.

Questions 54

What does a Snowflake Architect need to consider when implementing a Snowflake Connector for Kafka?

Options:

A.

Every Kafka message is in JSON or Avro format.

B.

The default retention time for Kafka topics is 14 days.

C.

The Kafka connector supports key pair authentication, OAuth, and basic authentication (for example, username and password).

D.

The Kafka connector will create one table and one pipe to ingest data for each topic. If the connector cannot create the table or the pipe it will result in an exception.
