

Microsoft AZ-204 Exam

Questions 331 - 340 of 355

Question 331
You need to ensure that the solution can meet the scaling requirements for Policy Service.
Which Azure Application Insights data model should you use?
A. an Application Insights dependency
B. an Application Insights event
C. an Application Insights trace
D. an Application Insights metric
Application Insights provides three additional data types for custom telemetry:
Trace - used either directly, or through an adapter to implement diagnostics logging using an instrumentation framework that is familiar to you, such as Log4Net or System.Diagnostics.
Event - typically used to capture user interaction with your service, to analyze usage patterns.
Metric - used to report periodic scalar measurements.
Scenario:
Policy service must use Application Insights to automatically scale with the number of policy actions that it is performing.
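As a rough illustration (not part of the original question), a .NET service could report such a count as a custom metric through the TelemetryClient, and an autoscale rule could then act on that metric. The metric name PolicyActionsInProgress and the class name are invented placeholders.

```csharp
using Microsoft.ApplicationInsights;

public class PolicyTelemetry
{
    private readonly TelemetryClient _telemetryClient;

    public PolicyTelemetry(TelemetryClient telemetryClient)
    {
        _telemetryClient = telemetryClient;
    }

    public void ReportPolicyActions(int actionsInProgress)
    {
        // Report a periodic scalar measurement ("PolicyActionsInProgress" is a
        // hypothetical metric name); an autoscale rule can then scale the
        // Policy Service based on this custom metric.
        _telemetryClient.GetMetric("PolicyActionsInProgress").TrackValue(actionsInProgress);
    }
}
```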
Reference:
https://docs.microsoft.com/en-us/azure/azure-monitor/app/data-model

Question 332
DRAG DROP -
You need to implement telemetry for non-user actions.
How should you complete the Filter class? To answer, drag the appropriate code segments to the correct locations. Each code segment may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content.
NOTE: Each correct selection is worth one point.
Select and Place:
[Exhibits: AZ-204_332Q.png (question), AZ-204_332R.png (answer)]
Scenario: Exclude non-user actions from Application Insights telemetry.
Box 1: ITelemetryProcessor -
To create a filter, implement ITelemetryProcessor. This technique gives you more direct control over what is included or excluded from the telemetry stream.
Box 2: ITelemetryProcessor -
Box 3: ITelemetryProcessor -
Box 4: RequestTelemetry -
Box 5: /health -
To filter out an item, just terminate the chain there; that is, do not pass the item on to the next processor.
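Pieced together from the boxes above, the Filter class looks roughly like the following reconstruction (a minimal sketch, not the exam's exact code): a custom ITelemetryProcessor that drops RequestTelemetry for the /health endpoint by not calling the next processor.

```csharp
using Microsoft.ApplicationInsights.Channel;
using Microsoft.ApplicationInsights.DataContracts;
using Microsoft.ApplicationInsights.Extensibility;

public class Filter : ITelemetryProcessor
{
    private readonly ITelemetryProcessor _next;

    public Filter(ITelemetryProcessor next)
    {
        _next = next;
    }

    public void Process(ITelemetry item)
    {
        // Drop health-probe requests (non-user actions) by terminating the
        // chain here instead of passing the item on.
        if (item is RequestTelemetry request &&
            request.Url?.AbsolutePath.Contains("/health") == true)
        {
            return;
        }

        _next.Process(item);
    }
}
```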
Reference:
https://docs.microsoft.com/en-us/azure/azure-monitor/app/api-filtering-sampling

Question 333
DRAG DROP -
You need to ensure that PolicyLib requirements are met.
How should you complete the code segment? To answer, drag the appropriate code segments to the correct locations. Each code segment may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content.
NOTE: Each correct selection is worth one point.
Select and Place:
[Exhibits: AZ-204_333Q.png (question), AZ-204_333R.png (answer)]
Scenario: You have a shared library named PolicyLib that contains functionality common to all ASP.NET Core web services and applications. The PolicyLib library must:
- Exclude non-user actions from Application Insights telemetry.
- Provide methods that allow a web service to scale itself.
- Ensure that scaling actions do not disrupt application usage.
Box 1: ITelemetryInitializer -
Use telemetry initializers to define global properties that are sent with all telemetry, and to override selected behavior of the standard telemetry modules.
Box 2: Initialize -
Box 3: Telemetry.Context -
Box 4: ((EventTelemetry)telemetry).Properties["EventID"]
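A minimal sketch of such an initializer, reconstructed around the boxes above; the class name, the "Service" context property, and the use of a new GUID for EventID are illustrative assumptions, not values from the exam.

```csharp
using System;
using Microsoft.ApplicationInsights.Channel;
using Microsoft.ApplicationInsights.DataContracts;
using Microsoft.ApplicationInsights.Extensibility;

public class PolicyLibTelemetryInitializer : ITelemetryInitializer
{
    // Called for every telemetry item before it is sent.
    public void Initialize(ITelemetry telemetry)
    {
        // Global property attached to all telemetry via the telemetry context
        // (illustrative value).
        telemetry.Context.GlobalProperties["Service"] = "PolicyService";

        // Box 4: custom property on event telemetry (illustrative value).
        if (telemetry is EventTelemetry eventTelemetry)
        {
            eventTelemetry.Properties["EventID"] = Guid.NewGuid().ToString();
        }
    }
}
```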
Reference:
https://docs.microsoft.com/en-us/azure/azure-monitor/app/api-filtering-sampling

Question 334
You need to ensure receipt processing occurs correctly.
What should you do?
A. Use blob properties to prevent concurrency problems
B. Use blob SnapshotTime to prevent concurrency problems
C. Use blob metadata to prevent concurrency problems
D. Use blob leases to prevent concurrency problems
You can create a snapshot of a blob. A snapshot is a read-only version of a blob that's taken at a point in time. Once a snapshot has been created, it can be read, copied, or deleted, but not modified. Snapshots provide a way to back up a blob as it appears at a moment in time.
Scenario: Processing is performed by an Azure Function that uses version 2 of the Azure Function runtime. Once processing is completed, results are stored in
Azure Blob Storage and an Azure SQL database. Then, an email summary is sent to the user with a link to the processing report. The link to the report must remain valid if the email is forwarded to another user.
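For reference, a snapshot can be created and addressed with the Azure.Storage.Blobs client library roughly as sketched below; the container and blob names are invented placeholders, not values from the case study.

```csharp
using System.Threading.Tasks;
using Azure.Storage.Blobs;

public static class ReceiptSnapshots
{
    // The container and blob names are invented placeholders for this sketch.
    public static async Task<string> SnapshotReceiptAsync(string connectionString)
    {
        var blobClient = new BlobClient(connectionString, "receipts", "receipt-331.json");

        // Create a read-only, point-in-time snapshot of the blob.
        var snapshotInfo = await blobClient.CreateSnapshotAsync();
        string snapshotTime = snapshotInfo.Value.Snapshot;

        // A client scoped to the snapshot can read or copy it, but the
        // snapshot itself cannot be modified.
        var snapshotClient = blobClient.WithSnapshot(snapshotTime);
        await snapshotClient.GetPropertiesAsync();

        return snapshotTime;
    }
}
```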
Reference:
https://docs.microsoft.com/en-us/rest/api/storageservices/creating-a-snapshot-of-a-blob

Question 335
You need to resolve the capacity issue.
What should you do?
A. Convert the trigger on the Azure Function to an Azure Blob storage trigger
B. Ensure that the consumption plan is configured correctly to allow scaling
C. Move the Azure Function to a dedicated App Service Plan
D. Update the loop starting on line PC09 to process items in parallel
If you want to read the files in parallel, you cannot use forEach, because forEach does not wait for the async callbacks to finish. Each async callback returns a promise, so collect those promises in an array and await them all with Promise.all.
Scenario: Capacity issue: During busy periods, employees report long delays between the time they upload the receipt and when it appears in the web application.
[Exhibit: AZ-204_335E.jpg]
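The referenced answer describes the JavaScript pattern: collect the promises and await them with Promise.all rather than using forEach. Purely as an illustration of the same idea in C# (the case-study code itself is not reproduced in this dump, and all names below are invented), the per-item work could be started for every item and awaited together:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;

public static class ReceiptProcessing
{
    // Illustration only: processReceiptAsync stands in for whatever per-item
    // work the original loop performs one item at a time.
    public static Task ProcessAllAsync(IEnumerable<string> receiptBlobNames,
                                       Func<string, Task> processReceiptAsync)
    {
        // Start the work for every item, then await all of it together
        // instead of awaiting each item sequentially.
        IEnumerable<Task> tasks = receiptBlobNames.Select(processReceiptAsync);
        return Task.WhenAll(tasks);
    }
}
```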
Reference:
https://stackoverflow.com/questions/37576685/using-async-await-with-a-foreach-loop


Question 336
You need to resolve the log capacity issue.
What should you do?
A. Create an Application Insights Telemetry Filter
B. Change the minimum log level in the host.json file for the function
C. Implement Application Insights Sampling
D. Set a LogCategoryFilter during startup
Scenario, the log capacity issue: Developers report that the number of log messages in the trace output for the processor is too high, resulting in lost log messages.
Sampling is a feature in Azure Application Insights. It is the recommended way to reduce telemetry traffic and storage, while preserving a statistically correct analysis of application data. The filter selects items that are related, so that you can navigate between items when you are doing diagnostic investigations. When metric counts are presented to you in the portal, they are renormalized to take account of the sampling, to minimize any effect on the statistics.
Sampling reduces traffic and data costs, and helps you avoid throttling.
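As one possible illustration, fixed-rate sampling can be enabled in a .NET application through the telemetry processor chain as sketched below; the 10 percent rate is an arbitrary example value. For an Azure Functions app, the equivalent setting lives under samplingSettings in host.json.

```csharp
using Microsoft.ApplicationInsights.Extensibility;
using Microsoft.ApplicationInsights.WindowsServer.TelemetryChannel;

public static class SamplingSetup
{
    public static void EnableFixedRateSampling(TelemetryConfiguration configuration)
    {
        var builder = configuration.DefaultTelemetrySink.TelemetryProcessorChainBuilder;

        // Keep roughly 10% of telemetry items; the portal renormalizes
        // metric counts to account for the sampling rate.
        builder.UseSampling(10.0);

        builder.Build();
    }
}
```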
Reference:
https://docs.microsoft.com/en-us/azure/azure-monitor/app/sampling

Question 337
HOTSPOT -
You need to implement event routing for retail store location data.
Which configurations should you use? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Hot Area:
[Exhibits: AZ-204_337Q.jpg (question), AZ-204_337R.jpg (answer)]
Box 1: Azure Blob Storage -
Azure event publishers and event handlers are at the core of the Event Grid routing service. Event Grid listens to Azure event publishers, such as Blob Storage, then reacts by routing specific events to Azure event handlers, such as WebHooks. You can easily control this entire process at a granular level through event subscriptions and event filters.
Box 2: Azure Event Grid -
Azure Event Grid is a highly scalable event-routing service that listens for specific system events, then reacts to them according to your precise specifications. In the past, event handling has relied largely on polling - a high latency, low-efficiency approach that can prove prohibitively expensive at scale.
Box 3: Azure Logic App -
Event Grid's supported event handlers currently include Event Hubs, WebHooks, Logic Apps, Azure Functions, Azure Automation and Microsoft Flow.
Reference:
https://www.appliedi.net/blog/using-azure-event-grid-for-highly-scalable-event-routing

Question 338
HOTSPOT -
You need to update the order workflow to address the issue when calling the Printer API App.
How should you complete the code? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Hot Area:
[Exhibits: AZ-204_338Q.jpg (question), AZ-204_338R.jpg (answer)]
Box 1: fixed -
The default policy performs 4 exponential retries, and the intervals are often too short for a situation like this one.
Box 2: PT60S -
Set a fixed interval instead, e.g. 5 retries every 60 seconds. PT60S is the ISO 8601 notation for a 60-second interval.
Scenario: Calls to the Printer API App fail periodically due to printer communication timeouts.
Printer communication timeouts occur after 10 seconds. The label printer must only receive up to 5 attempts within one minute.
Box 3: 5 -
Reference:
https://michalsacewicz.com/error-handling-in-power-automate/

Question 339
DRAG DROP -
You need to support the message processing for the ocean transport workflow.
Which four actions should you perform in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.
Select and Place:
[Exhibits: AZ-204_339Q.png (question), AZ-204_339R.png (answer)]
Step 1: Create an integration account in the Azure portal
You can define custom metadata for artifacts in integration accounts and get that metadata during runtime for your logic app to use. For example, you can provide metadata for artifacts such as partners, agreements, schemas, and maps - all of these store metadata as key-value pairs.
Step 2: Link the Logic App to the integration account
The logic app must be linked to the integration account that contains the artifact metadata you want to use.
Step 3: Add partners, schemas, certificates, maps, and agreements
Step 4: Create a custom connector for the Logic App.
Reference:
https://docs.microsoft.com/bs-latn-ba/azure/logic-apps/logic-apps-enterprise-integration-metadata

Question 340
You need to support the requirements for the Shipping Logic App.
What should you use?
A. Azure Active Directory Application Proxy
B. Site-to-Site (S2S) VPN connection
C. On-premises Data Gateway
D. Point-to-Site (P2S) VPN connection
Before you can connect to on-premises data sources from Azure Logic Apps, download and install the on-premises data gateway on a local computer. The gateway works as a bridge that provides quick data transfer and encryption between data sources on premises (not in the cloud) and your logic apps.
The gateway supports BizTalk Server 2016.
Note: Microsoft has now fully incorporated the Azure BizTalk Services capabilities into Logic Apps and Azure App Service Hybrid Connections.
The Logic Apps Enterprise Integration Pack brings enterprise B2B capabilities, such as support for the AS2 and X12 EDI standards.
Scenario: The Shipping Logic app must meet the following requirements:
- Support the ocean transport and inland transport workflows by using a Logic App.
- Support industry-standard protocol X12 message format for various messages including vessel content details and arrival notices.
- Secure resources to the corporate VNet and use dedicated storage resources with a fixed costing model.
- Maintain on-premises connectivity to support legacy applications and final BizTalk migrations.
Reference:
https://docs.microsoft.com/en-us/azure/logic-apps/logic-apps-gateway-install