

Microsoft AZ-400 Exam

Page 31/54 - Questions 301-310 of 535

Question 301
You store source code in a Git repository in Azure Repos. You use a third-party continuous integration (CI) tool to control builds.
What will Azure DevOps use to authenticate with the tool?



A third-party CI tool can authenticate to the Git repository in Azure Repos by using a personal access token (PAT). PATs give you access to Azure DevOps and Team Foundation Server (TFS) without using your username and password directly.
Reference:
https://docs.microsoft.com/en-us/azure/devops/repos/git/auth-overview

Question 302
DRAG DROP -
You are configuring Azure Pipelines for three projects in Azure DevOps as shown in the following table.
[Exhibit: AZ-400_302Q_1.png]
Which version control system should you recommend for each project? To answer, drag the appropriate version control systems to the correct projects. Each version control system may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content.
NOTE: Each correct selection is worth one point.
Select and Place:
[Exhibit: AZ-400_302Q_2.png]
[Answer: AZ-400_302R.png]



Project1: Git in Azure Repos
Project2: GitHub Enterprise
GitHub Enterprise is the on-premises version of GitHub.com. GitHub Enterprise includes the same set of features as GitHub.com but packaged for running on your organization's local network. All repository data is stored on machines that you control, and access is integrated with your organization's authentication system (LDAP, SAML, or CAS).
Project3: Bitbucket Cloud
One downside, however, is that Bitbucket does not include support for SVN, but this can easily be remedied by migrating the SVN repositories to Git with tools such as SVN Mirror for Bitbucket.
Note: SVN is a centralized version control system.
Incorrect Answers:
Bitbucket:
Bitbucket comes as a distributed version control system based on Git.
Note: A source control system, also called a version control system, allows developers to collaborate on code and track changes. Source control is an essential tool for multi-developer projects.
Azure DevOps supports two types of source control: Git (distributed) and Team Foundation Version Control (TFVC). TFVC is a centralized, client-server system. In both Git and TFVC, you can check in files and organize files in folders, branches, and repositories.
Reference:
https://www.azuredevopslabs.com/labs/azuredevops/yaml/
https://enterprise.github.com/faq

Question 303
Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You need to recommend an integration strategy for the build process of a Java application. The solution must meet the following requirements:
- The builds must access an on-premises dependency management system.
- The build outputs must be stored as Server artifacts in Azure DevOps.
- The source code must be stored in a Git repository in Azure DevOps.
Solution: Configure an Octopus Tentacle on an on-premises machine. Use the Package Application task in the build pipeline.
Does this meet the goal?



Octopus Deploy is an automated deployment server that makes it easy to automate the deployment of ASP.NET web applications, Java applications, Node.js applications, and custom scripts to multiple environments.
Octopus can be installed on various platforms, including Windows, macOS, and Linux, and it integrates with most version control tools, including VSTS and Git.
When you deploy software to Windows servers, you install Tentacle, a lightweight agent service, on those servers so that they can communicate with the Octopus server.
When defining your deployment process, the most common step type is a package step. This step deploys your packaged application onto one or more deployment targets; when deploying a package, you select the machine role that the package will be deployed to.
Octopus and its Tentacle agent automate deployments, however, not builds. A Tentacle is not an Azure Pipelines build agent, so this configuration does not give the build access to the on-premises dependency management system or store the build outputs as server artifacts in Azure DevOps. This solution does not meet the goal.
Reference:
https://octopus.com/docs/deployment-examples/package-deployments
https://explore.emtecinc.com/blog/octopus-for-automated-deployment-in-devops-models

Question 304
Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You need to recommend an integration strategy for the build process of a Java application. The solution must meet the following requirements:
- The builds must access an on-premises dependency management system.
- The build outputs must be stored as Server artifacts in Azure DevOps.
- The source code must be stored in a Git repository in Azure DevOps.
Solution: Install and configure a self-hosted build agent on an on-premises machine. Configure the build pipeline to use the Default agent pool. Include the Java Tool Installer task in the build pipeline.
Does this meet the goal?



A self-hosted build agent installed on an on-premises machine can reach the on-premises dependency management system, the Java Tool Installer task makes the required JDK available to the build, and the pipeline can publish its outputs as server artifacts in Azure DevOps while the source code stays in Azure Repos Git. This solution meets the goal.
Reference:
https://docs.microsoft.com/en-us/azure/devops/pipelines/agents/agents
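As a rough sketch only (not part of the original question), a build definition along these lines would run on the Default pool; the JDK version, archive path, Maven goals, and artifact name below are illustrative placeholders:

# Minimal azure-pipelines.yml sketch for a self-hosted agent in the Default pool
# (placeholder values; adjust paths, versions, and goals to your environment)
trigger:
- master

pool:
  name: Default            # self-hosted agents registered from the on-premises machine

steps:
- task: JavaToolInstaller@0
  inputs:
    versionSpec: '11'
    jdkArchitectureOption: 'x64'
    jdkSourceOption: 'LocalDirectory'                        # JDK archive already on the agent
    jdkFile: '/opt/java/jdk-11.tar.gz'                       # hypothetical path
    jdkDestinationDirectory: '$(Agent.ToolsDirectory)/jdk11'
    cleanDestinationDirectory: true

- task: Maven@3
  inputs:
    mavenPomFile: 'pom.xml'
    goals: 'package'        # resolves dependencies from the on-premises repository manager

- task: CopyFiles@2
  inputs:
    Contents: '**/target/*.jar'
    TargetFolder: '$(Build.ArtifactStagingDirectory)'

- task: PublishBuildArtifacts@1
  inputs:
    PathtoPublish: '$(Build.ArtifactStagingDirectory)'
    ArtifactName: 'drop'
    publishLocation: 'Container'   # stores the output as a server artifact in Azure DevOps

Because the agent itself sits inside the corporate network, the Maven step can resolve packages from the on-premises dependency management system, while PublishBuildArtifacts with publishLocation set to Container uploads the output to Azure DevOps as a server artifact.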

Question 305
Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You need to recommend an integration strategy for the build process of a Java application. The solution must meet the following requirements:
- The builds must access an on-premises dependency management system.
- The build outputs must be stored as Server artifacts in Azure DevOps.
- The source code must be stored in a Git repository in Azure DevOps.
Solution: Configure the build pipeline to use a Hosted VS 2019 agent pool. Include the Java Tool Installer task in the build pipeline.
Does this meet the goal?



Agents in the Hosted VS 2019 pool are Microsoft-hosted: they run on Microsoft-managed virtual machines in Azure and have no network line of sight to the on-premises dependency management system. Use a self-hosted build agent installed on an on-premises machine instead. This solution does not meet the goal.
Reference:
https://docs.microsoft.com/en-us/azure/devops/pipelines/agents/agents
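For contrast, a minimal sketch of how the two kinds of pools are selected in pipeline YAML; 'windows-2019' is assumed here as the image corresponding to the Hosted VS 2019 pool:

# Microsoft-hosted pool: the agent runs on a Microsoft-managed VM in Azure
# and cannot reach servers on the corporate network.
pool:
  vmImage: 'windows-2019'

# Self-hosted pool registered from an on-premises machine (compare question 304):
# pool:
#   name: Default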


Question 306
HOTSPOT -
You need to create deployment files for an Azure Kubernetes Service (AKS) cluster. The deployments must meet the provisioning storage requirements shown in the following table.
[Exhibit: AZ-400_306Q_1.png]
Which resource type should you use for each deployment? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Hot Area:
[Exhibit: AZ-400_306Q_2.png]
[Answer: AZ-400_306R.png]



Deployment 1: kubernetes.io/azure-file
Azure Files shares are accessed over the Server Message Block (SMB) protocol, so they can be mounted by multiple pods at the same time.
Deployment 2: kubernetes.io/azure-disk
An Azure managed disk attaches to a single node, giving one pod dedicated block storage.
Deployment 3: azurekeyvault-flexvolume
azurekeyvault-flexvolume (Key Vault FlexVolume) integrates your key management system with Kubernetes: secrets, keys, and certificates in Azure Key Vault become a volume accessible to pods. Once the volume is mounted, its data is available directly in the container file system for your application.
Incorrect Answers:
blobfuse-flexvolume: This driver allows Kubernetes to access virtual filesystem backed by the Azure Blob storage.
Reference:
https://docs.microsoft.com/en-us/azure/aks/azure-files-dynamic-pv
https://docs.microsoft.com/en-us/azure/aks/azure-disks-dynamic-pv
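As an illustration only (the claim names and sizes are invented), persistent volume claims that use the built-in AKS storage classes look roughly like this; the azurefile class provisions kubernetes.io/azure-file storage and managed-premium provisions kubernetes.io/azure-disk storage:

# Deployment 1: shared storage over SMB via the azurefile storage class
# (provisioner kubernetes.io/azure-file)
apiVersion: v1
kind: PersistentVolumeClaim
metadata:
  name: smb-share-claim        # illustrative name
spec:
  accessModes:
    - ReadWriteMany            # Azure Files supports concurrent access from multiple pods
  storageClassName: azurefile
  resources:
    requests:
      storage: 5Gi
---
# Deployment 2: dedicated block storage on an Azure managed disk
# (provisioner kubernetes.io/azure-disk)
apiVersion: v1
kind: PersistentVolumeClaim
metadata:
  name: disk-claim             # illustrative name
spec:
  accessModes:
    - ReadWriteOnce            # a managed disk attaches to a single node at a time
  storageClassName: managed-premium
  resources:
    requests:
      storage: 32Gi

For Deployment 3, the azurekeyvault-flexvolume driver is not consumed through a PVC; it is declared as a flexVolume in the pod specification and surfaces Key Vault secrets, keys, and certificates as files inside the container.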

Question 307
Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You need to recommend an integration strategy for the build process of a Java application. The solution must meet the following requirements:
- The builds must access an on-premises dependency management system.
- The build outputs must be stored as Server artifacts in Azure DevOps.
- The source code must be stored in a Git repository in Azure DevOps.
Solution: Configure the build pipeline to use a Hosted Ubuntu agent pool. Include the Java Tool Installer task in the build pipeline.
Does this meet the goal?



As with the Hosted VS 2019 pool, a Hosted Ubuntu agent is Microsoft-hosted and runs in Azure, so it cannot reach the on-premises dependency management system. Use a self-hosted build agent installed on an on-premises machine instead. This solution does not meet the goal.
Reference:
https://docs.microsoft.com/en-us/azure/devops/pipelines/agents/agents

Question 308
SIMULATION -
You need to create a notification if the peak average response time of an Azure web app named az400-9940427-main is more than five seconds when evaluated during a five-minute period. The notification must trigger the "https://contoso.com/notify" webhook.
To complete this task, sign in to the Microsoft Azure portal.



1. Open the Microsoft Azure portal.
2. Sign in to your Azure account, open the az400-9940427-main App Service, and under Monitoring select Alerts.
3. Select Add an alert rule.
4. Configure the alert rule as shown below and click OK.
Source: Alert on Metrics
Resource group: az400-9940427-main
Resource: az400-9940427-main
Metric: Average Response Time
Threshold: 5
Period: Over the last 5 minutes
Webhook: https://contoso.com/notify
[Image: AZ-400_308E.jpg]
Reference:
https://azure.microsoft.com/es-es/blog/webhooks-for-azure-alerts/

Question 309
Your company uses a Git repository in Azure Repos to manage the source code of a web application. The master branch is protected from direct updates.
Developers work on new features in the topic branches.
Because of the high volume of requested features, it is difficult to follow the history of the changes to the master branch.
You need to enforce a pull request merge strategy. The strategy must meet the following requirements:
- Consolidate commit histories.
- Merge the changes into a single commit.
Which merge strategy should you use in the branch policy?



Squash merging is a merge option that allows you to condense the Git history of topic branches when you complete a pull request. Instead of each commit on the topic branch being added to the history of the default branch, a squash merge takes all the file changes and adds them to a single new commit on the default branch.
A simple way to think about this is that squash merge gives you just the file changes, and a regular merge gives you the file changes and the commit history.
Note: Squash merging keeps your default branch histories clean and easy to follow without demanding any workflow changes on your team. Contributors to the topic branch work how they want in the topic branch, and the default branches keep a linear history through the use of squash merges. The commit history of a master branch updated with squash merges will have one commit for each merged branch. You can step through this history commit by commit to find out exactly when work was done.
Reference:
https://docs.microsoft.com/en-us/azure/devops/repos/git/merging-with-squash

Question 310
Your company uses cloud-hosted Jenkins for builds.
You need to ensure that Jenkins can retrieve source code from Azure Repos.
Which three actions should you perform? Each correct answer presents part of the solution.
NOTE: Each correct selection is worth one point.



B: Jenkins requires a plug-in to connect to TFS and check for updates to a project.
Jenkins' built-in Git Plugin or Team Foundation Server Plugin can poll a Team Services repository every few minutes and queue a job when changes are detected.
C: Use Azure DevOps (Visual Studio Team Services) to create a personal access token.
D: After you have generated credentials using Visual Studio Team Services, you need to use those credentials in Jenkins.
Reference:
http://www.aisoftwarellc.com/blog/post/how-to-setup-automated-builds-using-jenkins-and-visual-studio-team-foundation-server/2044