
[2017 New] Lead2pass Offering Free 70-534 Dump Files For Free Download By 70-534 Exam Candidates (41-60)

2017 June Microsoft Official New Released 70-534 Dumps in Lead2pass.com!

100% Free Download! 100% Pass Guaranteed!

70-534 easy pass guide: Preparing for the Microsoft 70-534 exam is really a tough task to accomplish. However, Lead2pass delivers the most comprehensive braindumps, covering every aspect of the 70-534 exam curriculum.

The following questions and answers are all newly published by the Microsoft Official Exam Center: http://www.lead2pass.com/70-534.html

QUESTION 41
Hotspot Question
Resources must authenticate to an identity provider.
You need to configure the Azure Access Control service.
What should you recommend? To answer, select the appropriate responses for each requirement in the answer area.


Answer:

 

Explanation:
Box 1:
* Token – A user gains access to an RP application by presenting a valid token that was issued by an authority that the RP application trusts.
* Identity Provider (IP) – An authority that authenticates user identities and issues security tokens, such as Microsoft account (Windows Live ID), Facebook, Google, Twitter, and Active Directory. When Azure Access Control (ACS) is configured to trust an IP, it accepts and validates the tokens that the IP issues. Because ACS can trust multiple IPs at the same time, when your application trusts ACS, it can offer users the option to be authenticated by any of the IPs that ACS trusts on your behalf.
How to Authenticate Web Users with Azure Active Directory Access Control
http://azure.microsoft.com/en-gb/documentation/articles/active-directory-dotnet-how-to-use-access-control/
Box 2: WS-Trust is a web service (WS-*) specification and Organization for the Advancement of Structured Information Standards (OASIS) standard that deals with the issuing, renewing, and validating of security tokens, as well as with providing ways to establish, assess the presence of, and broker trust relationships between participants in a secure message exchange. Azure Access Control (ACS) supports WS-Trust 1.3.
Incorrect:
ACS does not support Kerberos.
Protocols Supported in ACS
https://msdn.microsoft.com/en-us/library/azure/gg185948.aspx

QUESTION 42
Drag and Drop Question
Contoso, Ltd., uses Azure websites for their company portal sites.
Admin users need enough access to effectively perform site monitoring or management tasks. You need to grant admin access to a group of 10 users.
How should you configure the connection? To answer, drag the role or object to the correct connection setting. Each item may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content.

 

Answer:

 

Explanation:
RBAC and Azure Websites Publishing
http://azure.microsoft.com/blog/2015/01/05/rbac-and-azure-websites-publishing/
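For readers who want to try the RBAC approach from the article above, here is a minimal Azure PowerShell (AzureRM module) sketch of assigning a built-in role to an Azure AD group; the subscription name, group name, and resource group below are illustrative assumptions, not values from the question:

# Sign in and select the subscription
Login-AzureRmAccount
Select-AzureRmSubscription -SubscriptionName "Contoso Production"   # assumed subscription name

# Look up the Azure AD group that holds the 10 admin users (assumed group name)
$group = Get-AzureRmADGroup -SearchString "Portal Site Admins" | Select-Object -First 1

# Grant the built-in Website Contributor role at the resource group scope
New-AzureRmRoleAssignment -ObjectId $group.Id `
    -RoleDefinitionName "Website Contributor" `
    -ResourceGroupName "ContosoPortalRG"

Assigning the role to the group once is easier to maintain than creating ten individual role assignments.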

QUESTION 43
Drag and Drop Question
You are migrating Active Directory Domain Services (AD DS) domains to Azure.
You need to recommend the least complex directory synchronization solution.
What should you recommend? To answer, drag the appropriate solution to the correct client requirement. Each solution may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content.

 

Answer:

 

Explanation:
https://msdn.microsoft.com/en-us/library/azure/dn246918.aspx?f=255&MSPPError=-2147217396
http://blogs.office.com/2014/04/15/synchronizing-your-directory-with-office-365-is-easy/
http://blogs.office.com/2014/05/13/choosing-a-sign-in-model-for-office-365/

QUESTION 44
Drag and Drop Question
You have a web application on Azure.
The web application does not employ Secure Sockets Layer (SSL).
You need to enable SSL for your production deployment web application on Azure.
Which four actions should you perform in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.

 

Answer:

 

Explanation:
http://azure.microsoft.com/en-gb/documentation/articles/cloud-services-configure-ssl-certificate/
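The linked article walks through the portal and Visual Studio steps; as an illustration only, the certificate-upload part can also be scripted with the classic Azure PowerShell cmdlets (the service name, certificate path, and password are placeholder assumptions):

# Upload the SSL certificate (.pfx) to the cloud service's certificate store
Add-AzureCertificate -ServiceName "contosoportal" `
    -CertToDeploy "C:\certs\contoso-com.pfx" `
    -Password "P@ssw0rd!"

# The certificate thumbprint is then referenced in the <Certificates> section of
# ServiceDefinition.csdef / ServiceConfiguration.cscfg, bound to an HTTPS endpoint,
# and the updated package is redeployed.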

QUESTION 45
You are designing a plan for testing a Windows Azure service.
The service runs in the development fabric but fails on Windows Azure.
You need to recommend an approach for identifying errors that occur when the service runs on Windows Azure.
What should you recommend?

A.    Attach a debugger to the Windows Azure role instance.
B.    Analyze debugging information captured by Windows Azure Diagnostics.
C.    Modify the service configuration for the Windows Azure role to access development storage.
D.    Analyze debugging information written to the output window of the Windows Azure role instance.

Answer: B
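Answer B relies on Windows Azure Diagnostics. As a rough sketch of how that data gets captured (the service, role, storage account, and configuration-file names below are assumptions), the classic diagnostics extension can be applied to a deployed cloud service and pointed at a storage account, and the captured logs are then analyzed from that account:

# Storage account that receives the diagnostics data (assumed account name)
$storageKey = (Get-AzureStorageKey -StorageAccountName "contosodiagstore").Primary
$storageCtx = New-AzureStorageContext -StorageAccountName "contosodiagstore" -StorageAccountKey $storageKey

# Apply a diagnostics configuration file to the web role of the cloud service
Set-AzureServiceDiagnosticsExtension -ServiceName "contosoportal" `
    -Role "WebRole1" `
    -StorageContext $storageCtx `
    -DiagnosticsConfigurationPath "C:\diag\WebRole1.PubConfig.xml"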

QUESTION 46
You are designing a Windows Azure web application.
The application will be accessible at a standard cloudapp.net URL.
You need to recommend a DNS resource record type that will allow you to configure access to the application through a custom domain name.
Which type should you recommend?

A.    A
B.    CNAME
C.    MX
D.    SRV

Answer: B
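Once the CNAME record is created at the registrar (contoso.com and the cloudapp.net host below are made-up examples), the mapping can be checked from Windows PowerShell:

# Confirm that the custom host name is an alias (CNAME) for the cloudapp.net address
Resolve-DnsName -Name "www.contoso.com" -Type CNAME
# Expected: a CNAME answer whose NameHost is the application's address, e.g. contosoportal.cloudapp.net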

QUESTION 47
You are designing a plan to migrate an existing application to Windows Azure. The application currently resides on a server that has 20 GB of hard disk space. You need to recommend the smallest compute instance size that provides local storage equivalent to that of the existing server.
Which size should you recommend?

A.    ExtraSmall
B.    ExtraLarge
C.    Small
D.    Large

Answer: A
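If you want to compare the local (resource) disk sizes yourself, the classic Azure PowerShell module can list them; this is only a quick sketch, using the property names as returned by the classic Get-AzureRoleSize cmdlet (sizes are in MB):

# List role sizes with their local resource-disk sizes, smallest disk first
Get-AzureRoleSize |
    Select-Object InstanceSize, Cores, MemoryInMb, WebWorkerResourceDiskSizeInMb |
    Sort-Object WebWorkerResourceDiskSizeInMb |
    Format-Table -AutoSize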

QUESTION 48
An application currently resides on an on-premises virtual machine that has 2 CPU cores, 4 GB of RAM, 20 GB of hard disk space, and a 10 megabit/second network connection.
You plan to migrate the application to Azure.
You have the following requirements:

– You must not make changes to the application.
– You must minimize the costs for hosting the application.

You need to recommend the appropriate virtual machine instance type.
Which virtual machine tier should you recommend?

A.    Network Optimized (A Series)
B.    General Purpose Compute, Basic Tier (A Series)
C.    General Purpose Compute, Standard Tier (A Series)
D.    Optimized Compute (D Series)

Answer: B
Explanation:
General purpose compute: Basic tier
An economical option for development workloads, test servers, and other applications that don’t require load balancing, auto-scaling, or memory-intensive virtual machines.
CPU core range: 1-8
RAM range: 0.75 – 14 GB
Disk size: 20-240 GB
Incorrect answers:
Not A: Network optimized: fast networking with InfiniBand support Available in select data centers. A8 and A9 virtual machines feature Intel® Xeon® E5 processors. Adds a 40Gbit/s InfiniBand network with remote direct memory access (RDMA) technology. Ideal for Message Passing Interface (MPI) applications, high-performance clusters, modeling and simulations, video encoding, and other compute or network intensive scenarios.
Not C: CPU core range: 1-8
RAM range: 0.75 – 56 GB
Disk size: 20-605 GB
Not D: D-series virtual machines feature solid state drives (SSDs) and 60% faster processors than the A-series and are also available for web or worker roles in Azure Cloud Services. This series is ideal for applications that demand faster CPUs, better local disk performance, or more memory.
Virtual Machines Pricing. Launch Windows Server and Linux in minutes
http://azure.microsoft.com/en-us/pricing/details/virtual-machines/
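For completeness, a minimal sketch of provisioning a Basic tier A-series VM with the classic Azure PowerShell cmdlets; the service name, VM name, image family, credentials, and location are placeholder assumptions:

# Pick a Windows Server gallery image (latest matching image)
$image = Get-AzureVMImage |
    Where-Object { $_.ImageFamily -eq "Windows Server 2012 R2 Datacenter" } |
    Sort-Object PublishedDate -Descending |
    Select-Object -First 1

# Create a Basic tier A2 VM; the "Basic_" prefix in the instance size selects the Basic tier
New-AzureQuickVM -Windows `
    -ServiceName "contosoapp-svc" `
    -Name "contosoapp-vm1" `
    -ImageName $image.ImageName `
    -InstanceSize "Basic_A2" `
    -AdminUsername "azureadmin" `
    -Password "P@ssw0rd!" `
    -Location "West US"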

QUESTION 49
You are designing an Azure web application that includes many static content files.
The application is accessed from locations all over the world by using a custom domain name.
You need to recommend an approach for providing access to the static content with the least amount of latency.
Which two actions should you recommend? Each correct answer presents part of the solution.

A.    Place the static content in Azure Table storage.
B.    Configure a CNAME DNS record for the Azure Content Delivery Network (CDN) domain.
C.    Place the static content in Azure Blob storage.
D.    Configure a custom domain name that is an alias for the Azure Storage domain.

Answer: BC
Explanation:
B: There are two ways to map your custom domain to a CDN endpoint.
1. Create a CNAME record with your domain registrar and map your custom domain and subdomain to the CDN endpoint
2. Add an intermediate registration step with Azure cdnverify
C: The Azure Content Delivery Network (CDN) offers developers a global solution for delivering high-bandwidth content by caching blobs and static content of compute instances at physical nodes in the United States, Europe, Asia, Australia and South America.
The benefits of using CDN to cache Azure data include:
/ Better performance and user experience for end users who are far from a content source, and are using applications where many ‘internet trips’ are required to load content
/ Large distributed scale to better handle instantaneous high load, say, at the start of an event such as a product launch
Using CDN for Azure
https://azure.microsoft.com/en-gb/documentation/articles/cdn-how-to-use/
How to map Custom Domain to Content Delivery Network (CDN) endpoint
https://github.com/Azure/azure-content/blob/master/articles/cdn-map-content-to-custom-domain.md
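As a small sketch of the answer C half (the storage account, container, and file names are assumptions): the static files go into a public Blob container, and the CDN endpoint plus the CNAME record from answer B then sit in front of that origin.

# Storage context for the account that serves as the CDN origin (assumed account name)
$key = (Get-AzureStorageKey -StorageAccountName "contosostatic").Primary
$ctx = New-AzureStorageContext -StorageAccountName "contosostatic" -StorageAccountKey $key

# Public container for the static content
New-AzureStorageContainer -Name "static" -Permission Blob -Context $ctx

# Upload a static file as a block blob
Set-AzureStorageBlobContent -File "C:\site\css\site.css" `
    -Container "static" -Blob "css/site.css" -Context $ctx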

QUESTION 50
You are designing an Azure development environment.
Team members learn Azure development techniques by training in the development environment.
The development environment must auto scale and load balance additional virtual machine (VM) instances.
You need to recommend the most cost-effective compute-instance size that allows team members to work with Azure in the development environment.
What should you recommend?

A.    Azure A1 standard VM Instance
B.    Azure A2 basic VM Instance
C.    Azure A3 basic VM Instance
D.    Azure A9 standard VM Instance

Answer: A
Explanation:
Azure A1 standard VM Instance would be cheapest with 1 CPU core, 0.75 GB RAM, and 40 GB HD. It would be good enough for training purposes.
Virtual Machines Pricing, Launch Windows Server and Linux in minutes
http://azure.microsoft.com/en-us/pricing/details/virtual-machines/

QUESTION 51
You are designing an Azure application that provides online backup storage for hundreds of media files. Each file is larger than 1GB.
The data storage solution has the following requirements:

– It must be capable of storing an average of 1TB of data for each user.
– It must support sharing of data between all Windows Azure instances.
– It must provide random read/write access.

You need to recommend a durable data storage solution.
What should you recommend?

A.    Azure Drive
B.    Azure Page Blob service
C.    Azure Block Blob service
D.    Local storage on an Azure instance

Answer: B
Explanation:
Block blobs can store up to 200 GB of data and are optimized for streaming.
This is the type by which most blobs are stored.
Page blobs can store up to 1 TB and are optimized for random read/write operations.
They provide the ability to write to a range of bytes in a Blob.
Virtual Drives in Azure Virtual Machines use page blobs because they are accessed randomly.
https://msdn.microsoft.com/en-us/library/azure/ee691964.aspx
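For context, the Storage cmdlets let you pick the blob type explicitly; the sketch below uploads a fixed-size .vhd as a page blob, the type that supports the random read/write access the scenario asks for (the account, container, and file names are assumptions):

# Storage context for the backup account (assumed account name)
$key = (Get-AzureStorageKey -StorageAccountName "contosobackup").Primary
$ctx = New-AzureStorageContext -StorageAccountName "contosobackup" -StorageAccountKey $key
New-AzureStorageContainer -Name "backups" -Permission Off -Context $ctx

# Upload the file as a page blob (page blob sizes must be a multiple of 512 bytes)
Set-AzureStorageBlobContent -File "C:\backup\user01.vhd" `
    -Container "backups" -Blob "user01.vhd" `
    -BlobType Page -Context $ctx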

Case Study 2 – Trey Research (Question 52 – Question 57)
Background
Overview
Trey Research conducts agricultural research and sells the results to the agriculture and food industries. The company uses a combination of on-premises and third-party server clusters to meet its storage needs. Trey Research has seasonal demands on its services, with up to 50 percent drops in data capacity and bandwidth demand during low-demand periods. They plan to host their websites in an agile, cloud environment where the company can deploy and remove its websites based on its business requirements rather than the requirements of the hosting company.
A recent fire near the datacenter that Trey Research uses raises the management team’s awareness of the vulnerability of hosting all of the company’s websites and data at any single location. The management team is concerned about protecting its data from loss as a result of a disaster.

Websites
Trey Research has a portfolio of 300 websites and associated background processes that are currently hosted in a third-party datacenter. All of the websites are written in ASP.NET, and the background processes use Windows Services. The hosting environment costs Trey Research approximately $25 million in hosting and maintenance fees.

Infrastructure
Trey Research also has on-premises servers that run VMs to support line-of-business applications. The company wants to migrate the line-of-business applications to the cloud, one application at a time. The company is migrating most of its production VMs from an aging VMware ESXi farm to a Hyper-V cluster that runs on Windows Server 2012.

Applications
DistributionTracking
Trey Research has a web application named DistributionTracking. This application constantly collects real-time data that tracks worldwide distribution points to customer retail sites. This data is available to customers at all times. The company wants to ensure that the distribution tracking data is stored at a location that is geographically close to the customers who will be using the information. The system must continue running in the event of VM failures without corrupting data. The system is processor intensive and should be run in a multithreading environment.

HRApp
The company has a human resources (HR) application named HRApp that stores data in an on-premises SQL Server database. The database must have at least two copies, but data to support backups and business continuity must stay in Trey Research locations only.
The data must remain on-premises and cannot be stored in the cloud.
HRApp was written by a third party, and the code cannot be modified. The human resources data is used by all business offices, and each office requires access to the entire database. Users report that HRApp takes all night to generate the required payroll reports, and they would like to reduce this time.

MetricsTracking
Trey Research has an application named MetricsTracking that is used to track analytics for the DistributionTracking web application. The data MetricsTracking collects is not customer-facing. Data is stored on an on-premises SQL Server database, but this data should be moved to the cloud. Employees at other locations access this data by using a remote desktop connection to connect to the application, but latency issues degrade the functionality.
Trey Research wants a solution that allows remote employees to access metrics data without using a remote desktop connection. MetricsTracking was written in-house, and the development team is available to make modifications to the application if necessary. However, the company wants to continue to use SQL Server for MetricsTracking.

Business Requirements
Business Continuity
You have the following requirements:
– Move all customer-facing data to the cloud.
– Web servers should be backed up to geographically separate locations.
– If one website becomes unavailable, customers should automatically be routed to websites that are still operational.
– Data must be available regardless of the operational status of any particular website.
– The HRApp system must remain on-premises and must be backed up.
– The MetricsTracking data must be replicated so that it is locally available to all Trey Research offices.

Auditing and Security
You have the following requirements:
– Both internal and external consumers should be able to access research results.
– Internal users should be able to access data by using their existing company credentials without requiring multiple logins.
– Consumers should be able to access the service by using their Microsoft credentials.
– Applications written to access the data must be authenticated. Access and activity must be monitored and audited.
– Ensure the security and integrity of the data collected from the worldwide distribution points for the distribution tracking application.

Storage and Processing
You have the following requirements:
– Provide real-time analysis of distribution tracking data by geographic location.
– Collect and store large datasets in real time for customer use.
– Locate the distribution tracking data as close to the central office as possible to improve bandwidth.
– Co-locate the distribution tracking data as close to the customer as possible based on the customer’s location.
– Distribution tracking data must be stored in the JSON format and indexed by metadata that is stored in a SQL Server database.
– Data in the cloud must be stored in geographically separate locations, but kept within the same political boundaries.

Technical Requirements
Migration
You have the following requirements:
– Deploy all websites to Azure.
– Replace on-premises and third-party physical server clusters with cloud-based solutions.
– Optimize the speed for retrieving existing JSON objects that contain the distribution tracking data.
– Recommend strategies for partitioning data for load balancing.

Auditing and Security
You have the following requirements:
– Use Active Directory for internal and external authentication.
– Use OAuth for application authentication.

Business Continuity
You have the following requirements:
– Data must be backed up to separate geographic locations.
– Web servers must run concurrent versions of all websites in distinct geographic locations.
– Use Azure to back up the on-premises MetricsTracking data.
– Use Azure virtual machines as a recovery platform for MetricsTracking and HRApp.
– Ensure that there is at least one additional on-premises recovery environment for the HRApp.

QUESTION 52
You need to recommend an authentication solution for the DistributionTracking application.
What should you include in the recommendation?

A.    a certificate
B.    a Graph API endpoint
C.    a security principal in Azure Active Directory
D.    a managed service account in Azure Active Directory.

Answer: A

QUESTION 53
Drag and Drop Question
You need to recommend a test strategy for the disaster recovery system.
What should you do? To answer, drag the appropriate test strategy to the correct business application. Each test strategy may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content.

 

Answer:

 

Explanation:
* DistributionTracking
The company wants to ensure that the distribution tracking data is stored at a location that is geographically close to the customers who will be using the information.
* HRApp
The data must remain on-premises and cannot be stored in the cloud.
* MetricsTracking
Data is stored on an on-premises SQL Server database, but this data should be moved to the cloud.

QUESTION 54
Hotspot Question
You need to plan the business continuity strategy.
For each requirement, what should you recommend? To answer, select the appropriate option from each list in the answer area.

 

Answer:

 
Explanation:
https://docs.microsoft.com/en-us/azure/best-practices-availability-paired-regions
https://azure.microsoft.com/en-us/documentation/articles/traffic-manager-routing-methods/
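The exact routing method depends on which requirement each box matches, but as one hedged illustration of the Traffic Manager piece (the profile name, Traffic Manager domain, and endpoint host names are assumptions), a Failover profile routes users to a secondary site only when the primary is unavailable:

# Create a Traffic Manager profile that fails over between two endpoints (classic cmdlets)
$tmProfile = New-AzureTrafficManagerProfile -Name "treyresearch-dr" `
    -DomainName "treyresearch-dr.trafficmanager.net" `
    -LoadBalancingMethod "Failover" `
    -Ttl 30 -MonitorProtocol "Http" -MonitorPort 80 -MonitorRelativePath "/"

# Primary and secondary endpoints in different regions; failover order follows the order added
$tmProfile = Add-AzureTrafficManagerEndpoint -TrafficManagerProfile $tmProfile `
    -DomainName "treyresearch-east.cloudapp.net" -Type "CloudService" -Status "Enabled"
$tmProfile = Add-AzureTrafficManagerEndpoint -TrafficManagerProfile $tmProfile `
    -DomainName "treyresearch-west.cloudapp.net" -Type "CloudService" -Status "Enabled"

# Commit the endpoint changes
Set-AzureTrafficManagerProfile -TrafficManagerProfile $tmProfile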

QUESTION 55
Drag and Drop Question
You need to ensure that customer data is secured both in transit and at rest.
Which technologies should you recommend? To answer, drag the appropriate technology to the correct security requirement. Each technology may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content.

 

Answer:

 

QUESTION 56
You need to configure the distribution tracking application.
What should you do?

A.    Map each role to a single upgrade domain to optimize resource utilization.
B.    Design all services as stateless services.
C.    Configure operations to queue when a role reaches its capacity.
D.    Configure multiple worker roles to run on each virtual machine.

Answer: D
Explanation:
DistributionTracking
Trey Research has a web application named DistributionTracking. This application constantly collects real-time data that tracks worldwide distribution points to customer retail sites. This data is available to customers at all times. The company wants to ensure that the distribution tracking data is stored at a location that is geographically close to the customers who will be using the information. The system must continue running in the event of VM failures without corrupting data. The system is processor intensive and should be run in a multithreading environment.

QUESTION 57
Hotspot Question
You need to design a data storage strategy for each application.
In the table below, identify the strategy that you should use for each application. Make only one selection in each column.

 

Answer:

 

Explanation:
* Scenario:
/ HRApp
The company has a human resources (HR) application named HRApp that stores data in an on-premises SQL Server database.
The data must remain on-premises and cannot be stored in the cloud.
The human resources data is used by all business offices, and each office requires access to the entire database.
/ Metrics application
Data is stored on an on-premises SQL Server database, but this data should be moved to the cloud.

Case Study 3 – Contoso, Ltd (Question 58 – Question 62)
Background
Overview
Contoso, Ltd., manufactures and sells golf clubs and golf balls. Contoso also sells golf accessories under the Contoso Golf and Odyssey brands worldwide.
Most of the company’s IT infrastructure is located in the company’s Carlsbad, California, headquarters. Contoso also has a sizable third-party colocation datacenter that costs the company USD $30,000 to $40,000 a month. Contoso has other servers scattered around the United States.

Contoso, Ltd., has the following goals:
– Move many consumer-facing websites, enterprise databases, and enterprise web services to Azure.
– Improve the performance for customers and resellers who access company websites from around the world.
– Provide support for provisioning resources to meet bursts of demand.
– Consolidate and improve the utilization of website- and database-hosting resources.
– Avoid downtime, particularly that caused by web and database server updates.
– Leverage familiarity with Microsoft server management tools.

Infrastructure
Contoso’s datacenters are filled with dozens of smaller web servers and databases that run on under-utilized hardware. This creates issues for data backup. Contoso currently backs up data to tape by using System Center Data Protection Manager. System Center Operations Manager is not deployed in the enterprise.

All of the servers are expensive to acquire and maintain, and scaling the infrastructure takes significant time. Contoso conducts weekly server maintenance, which causes downtime for some of its global offices. Special events, such as high-profile golf tournaments, create a large increase in site traffic. Contoso has difficulty scaling the web-hosting environment fast enough to meet these surges in site traffic.

Contoso has resellers and consumers in Japan and China. These resellers must use applications that run in a datacenter that is located in the state of Texas, in the United States. Because of the physical distance, the resellers experience slow response times and downtime.

Business Requirements
Management and Performance
Management
– Web servers and databases must automatically apply updates to the operating system and products.
– Automatically monitor the health of worldwide sites, databases, and virtual machines.
– Automatically back up the website and databases.
– Manage hosted resources by using on-premises tools.

Performance
– The management team would like to centralize data backups and eliminate the use of tapes.
– The website must automatically scale without code changes or redeployment.
– Support changes in service tier without reconfiguration or redeployment.
– Site-hosting must automatically scale to accommodate data bandwidth and number of connections.
– Scale databases without requiring migration to a larger server.
– Migrate business critical applications to Azure.
– Migrate databases to the cloud and centralize databases where possible.

Business Continuity and Support
Business Continuity
– Minimize downtime in the event of regional disasters.
– Recover data if unintentional modifications or deletions are discovered.
– Run the website on multiple web server instances to minimize downtime and support a high service level agreement (SLA).

Connectivity
– Allow enterprise web services to access data and other services located on-premises.
– Provide and monitor the lowest latency possible to website visitors.
– Automatically balance traffic among all web servers.
– Provide secure transactions for users of both legacy and modern browsers.
– Provide automated auditing and reporting of web servers and databases.
– Support single sign-on from multiple domains.

Development Environment
You identify the following requirements for the development environment:
– Support the current development team’s knowledge of Microsoft web development and SQL Server tools.
– Support building experimental applications by using data from the Azure deployment and on-premises data sources.
– Mitigate the need to purchase additional tools for monitoring and debugging.
– System designers and architects must be able to create custom Web APIs without requiring any coding.
– Support automatic website deployment from source control.
– Support automated build verification and testing to mitigate bugs introduced during builds.
– Manage website versions across all deployments.
– Ensure that website versions are consistent across all deployments.

Technical Requirements
Management and Performance
Management
– Use build automation to deploy directly from Visual Studio.
– Use build-time versioning of assets and builds/releases.
– Automate common IT tasks such as VM creation by using Windows PowerShell workflows.
– Use advanced monitoring features and reports of workloads in Azure by using existing Microsoft tools.

Performance
– Websites must automatically load balance across multiple servers to adapt to varying traffic.
– In production, websites must run on multiple instances.
– First-time published websites must be published by using Visual Studio and scaled to a single instance to test publishing.
– Data storage must support automatic load balancing across multiple servers.
– Websites must adapt to wide increases in traffic during special events.
– Azure virtual machines (VMs) must be created in the same datacenter when applicable.

Business Continuity and Support
Business Continuity
– Automatically co-locate data and applications in different geographic locations.
– Provide real-time reporting of changes to critical data and binaries.
– Provide real-time alerts of security exceptions.
– Unwanted deletions or modifications of data must be reversible for up to one month, especially in business critical applications and databases.
– Any cloud-hosted servers must be highly available.

Enterprise Support
– The solution must use stored procedures to access on-premises SQL Server data from Azure.
– A debugger must automatically attach to websites on a weekly basis.
– The scripts that handle the configuration and setup of debugging cannot work if there is a delay in attaching the debugger.

QUESTION 58
Drag and Drop Question
You need to deploy the virtual machines to Azure.
Which four Azure PowerShell scripts should you run in sequence? To answer, move the appropriate scripts from the list of scripts to the answer area and arrange them in the correct order.

 

Answer:
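The answer area is image-based, but for orientation a typical classic deployment sequence looks roughly like the sketch below; the subscription, storage account, image family, service name, credentials, and location are placeholder assumptions, not the graded answer:

# 1. Point the subscription at a storage account for the VM disks
Set-AzureSubscription -SubscriptionName "Contoso Production" -CurrentStorageAccountName "contosovhds"

# 2. Pick a gallery image and build the VM configuration ("Medium" is the classic name for Standard A2)
$image = Get-AzureVMImage |
    Where-Object { $_.ImageFamily -eq "Windows Server 2012 R2 Datacenter" } |
    Sort-Object PublishedDate -Descending | Select-Object -First 1
$vm = New-AzureVMConfig -Name "golfweb01" -InstanceSize "Medium" -ImageName $image.ImageName

# 3. Add the Windows provisioning settings (local administrator credentials)
$vm = Add-AzureProvisioningConfig -VM $vm -Windows -AdminUsername "azureadmin" -Password "P@ssw0rd!"

# 4. Create the cloud service and the VM in the chosen region
New-AzureVM -ServiceName "contosogolf-svc" -Location "West US" -VMs $vm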

 

QUESTION 59
You need to recommend a solution for publishing one of the company websites to Azure and configuring it for remote debugging.
Which two actions should you perform? Each correct answer presents part of the solution.

A.    From Visual Studio, attach the debugger to the solution.
B.    Set the application logging level to Verbose and enable logging.
C.    Set the Web Server logging level to Information and enable logging.
D.    Set the Web Server logging level to Verbose and enable logging.
E.    From Visual Studio, configure the site to enable Debugger Attaching and then publish the site.

Answer: AE
Explanation:
https://azure.microsoft.com/en-us/documentation/articles/web-sites-dotnet-troubleshoot-visual-studio/

QUESTION 60
Drag and Drop Question
You need to recommend network connectivity solutions for the experimental applications.
What should you recommend? To answer, drag the appropriate solution to the correct network connection requirements. Each solution may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content.

 

Answer:

Lead2pass provides a guarantee for the Microsoft 70-534 exam because Lead2pass is an authenticated IT certification site. The 70-534 dump is updated on a regular basis, and the answers are rechecked for every exam. Good luck in your exam.

70-534 new questions on Google Drive: https://drive.google.com/open?id=0B3Syig5i8gpDWU9xQUQzY1NIN1E

2017 Microsoft 70-534 exam dumps (All 230 Q&As) from Lead2pass:

http://www.lead2pass.com/70-534.html [100% Exam Pass Guaranteed]