1. Goal / Introduction
The main goal of this lab exercise was to add a Windows Server 2012 machine to the virtual network used in the previous labs and to configure several services on it, including DHCP (Dynamic Host Configuration Protocol), DNS (Domain Name System), AD (Active Directory), and the Microsoft Baseline Security Analyzer. DNS, the Domain Name System, is a network service that maps domain names to IP addresses so that hosts on the network can be located by name. DHCP, the Dynamic Host Configuration Protocol, is used to automatically assign IP addresses to devices as they join the network. Active Directory is a directory service used to authenticate and authorize the users and machines connected to the network.
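To make the name-to-address mapping concrete, here is a minimal Python sketch of a DNS lookup; the hostname is a hypothetical placeholder for a record the lab's DNS server would hold.

    import socket

    # Ask the configured DNS server to map a hostname to an IPv4 address.
    # "client1.lab.local" is a hypothetical name; in the lab it would be a
    # record served by the new Windows DNS server.
    address = socket.gethostbyname("client1.lab.local")
    print(address)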
2. Procedure
The main steps I took when setting up the Windows 2012 server were first reading the documentation posted on MyCourses, then referencing the Windows Server documentation website for any information not covered there. The instructions were spelled out clearly in the posted Lab Guide, and I followed the configuration steps for all of the services while referencing a Windows Server manual that I found with a quick search online. I did not run into any major problems while setting up and configuring the services, and I tested the configuration by joining a Windows client to the domain I created and logging in as a user I had set up. These sources are listed at the end of the report under the references section. The wiki, which can be accessed by logging onto the VM labeled "Red_Hat_Server" (not the "Main_Server" VM) and typing the local address 127.0.0.1 into the address bar of the default web browser, documents the basic steps I took when setting up the services. I will be changing the naming conventions of the VMs during lab 4 and beyond, so all of this is subject to change. All passwords for any user or service are "P4ssw0rd12"; if you have any questions, don't hesitate to email me at pxg6044@rit.edu.
3. Security Considerations
Installing and configuring key services on Windows Server 2012 was a breeze thanks to its intuitive GUI. My favorite part of this lab was the feeling of control over the network from a single machine: instead of switching to a different VM, I could simply switch to a different window. Windows Server 2012 made it easy to push changes across the network, letting me set up the same network configuration in far less time. Even working with only 7 VMs, I can clearly see why tools like Active Directory are all but necessary in a large enterprise network. Using a centralized controller like Windows Server 2012 would save hundreds of man-hours over time, increasing efficiency and profitability for the company. I look forward to working with Windows Server 2012 further and learning more of its capabilities in the labs to come.
Significance: This topic is important to my audience because of the increasing number of people accessing the internet. Implementing virtualization
The objectives of this lab were to install essential services such as Active Directory, Dynamic Host Configuration Protocol, Domain Name System, and Network Time Protocol on a Windows Server platform. I used Windows Server 2016 Technical Preview 3 and successfully installed all of the mentioned services on it. In this lab the Windows services acted as the primary instances, and I configured the corresponding services on Linux as secondaries.
The Dynamic Host Configuration Protocol automatically assigns IP addresses to hosts on a network as they request them. DHCP packets also carry options such as DNS server addresses, domain names, and default gateways.
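As a rough illustration of what those options look like on the wire, the Python sketch below encodes the router (option 3), DNS server (option 6), and domain name (option 15) options in the type-length-value layout defined by RFC 2132; the addresses and domain name shown are hypothetical.

    import socket
    import struct

    def dhcp_option(code: int, payload: bytes) -> bytes:
        # Each DHCP option is: 1-byte code, 1-byte length, then the payload.
        return struct.pack("!BB", code, len(payload)) + payload

    def ip(addr: str) -> bytes:
        # Pack a dotted-quad string into 4 raw bytes.
        return socket.inet_aton(addr)

    options = (
        dhcp_option(3, ip("192.168.1.1"))    # default gateway (hypothetical)
        + dhcp_option(6, ip("192.168.1.2"))  # DNS server (hypothetical)
        + dhcp_option(15, b"example.lab")    # domain name (hypothetical)
        + bytes([255])                       # end-of-options marker
    )
    print(options.hex())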
In this lab, we had to add the Windows 2012 server to our monitoring server and monitor its DNS and DHCP services. In my case, I have Nagios set up as the monitoring service. In Nagios, I added the Windows 2012 server and configured it to monitor DHCP, DNS, and the server's CPU. This is excellent practice for sysadmins in deciding which services are critical to watch or troubleshoot in the infrastructure.
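Nagios runs small check plugins and reads their exit codes (0 = OK, 1 = WARNING, 2 = CRITICAL). Below is a minimal sketch of such a plugin in Python that verifies the server's DNS is answering; the hostname is a hypothetical stand-in, not taken from the lab.

    #!/usr/bin/env python3
    # Minimal Nagios-style check: exit 0 (OK) if the name resolves,
    # exit 2 (CRITICAL) if it does not.
    import socket
    import sys

    HOST = "win2012.lab.local"  # hypothetical name for the Windows 2012 server

    try:
        addr = socket.gethostbyname(HOST)
        print(f"DNS OK - {HOST} resolves to {addr}")
        sys.exit(0)
    except socket.gaierror as err:
        print(f"DNS CRITICAL - {HOST} did not resolve: {err}")
        sys.exit(2)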
Write a paragraph (minimum five college-level sentences) below that summarizes what was accomplished in this iLab, what you learned by performing it, how it relates to this week's TCOs and other course material, and, just as important, how you feel it will benefit you in your academic and professional career. (9 points)
A benefit of Windows Server 2012 is the option of Server Core. It provides a low-maintenance, limited-function server environment that is well suited to roles such as DHCP server, Active Directory domain controller, and DNS server, among others.
While at Verizon, I applied knowledge from my Windows Server training, product documentation, and peer support forums to manage Active Directory services and other roles and features while maintaining the distributed Windows Server lab environment. I also continued to enhance my knowledge by installing, configuring, and managing new functionality in the lab as new Windows Server versions were released. As an independent consultant serving small businesses, my customers required that I have expert knowledge of Windows Server network services. To install, configure, and provide ongoing maintenance of Windows Server for these customers, I needed to keep my knowledge of Windows Server releases up to date. To do so, I maintain a Windows Server Hyper-V installation at my home office for learning about new Windows features before implementing them for customers. While managing infrastructure in the Verizon Windows Server lab environment and for customers, I have installed, configured, and managed Active Directory domain controllers and roles and their related services, including DNS, DHCP, File and Print Services, and Remote Desktop.
Identification of controls already in place – including policies, firewalls, applications, intrusion detection and prevention systems, virtual private networks, data loss prevention, and encryption.
Dynamic Host Configuration Protocol (DHCP) allows a computer to join a network without a statically assigned Internet Protocol (IP) address (Mitchell, n.d., para 1). With DHCP, a device can be assigned a unique IP address as it joins the network (Mitchell, n.d., para 1). DHCP runs on a server, and parameters are set on the pool of IP addresses that can be assigned (Mitchell, n.d., para 4). Server 2012 offers options for DHCP fault tolerance. The three options available are a failover cluster, a split-scope DHCP setup, and DHCP failover (De Clercq, 2014). It is recommended that Kris Corporation use fault tolerance for DHCP. The recommended method is DHCP failover, which is new in Server 2012. This option can replicate a complete DHCP scope to another DHCP server (De Clercq, 2014). When configuring it, there is an option to use a hot standby or to do load sharing (De Clercq, 2014). Hot standby uses one server as the active DHCP server; the second server is used only if the first is not available (De Clercq, 2014). Load sharing uses both DHCP servers at the same time, and requests are shared between the two servers (De Clercq, 2014). Load sharing is the default option and the recommended option for Kris Corporation.
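The toy Python sketch below (not Microsoft's actual implementation) illustrates the difference between the two modes: hot standby only falls back to the second server, while load sharing hashes each client onto one of the two active servers according to the configured split. The server names and MAC address are hypothetical.

    import hashlib

    SERVERS = ["dhcp-a", "dhcp-b"]  # hypothetical failover partners

    def hot_standby(primary_alive: bool) -> str:
        # Hot standby: the second server answers only when the first is down.
        return SERVERS[0] if primary_alive else SERVERS[1]

    def load_sharing(client_mac: str, split_percent: int = 50) -> str:
        # Load sharing: both servers are active; each client is hashed into
        # a bucket, and the configured split decides which server answers.
        bucket = int(hashlib.md5(client_mac.encode()).hexdigest(), 16) % 100
        return SERVERS[0] if bucket < split_percent else SERVERS[1]

    print(hot_standby(primary_alive=True))    # dhcp-a
    print(load_sharing("00:0c:29:3e:4f:50"))  # dhcp-a or dhcp-b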
In today's world, every sector of industry needs Internet services. The rich services we enjoy on the Internet are possible only because of data centers, which play a critical role for enterprises by helping them expand their capabilities. Incorporating software abstractions into the DCN has helped drive its evolution. As the need for cloud-based applications increases, so does the need for DCNs to work more efficiently. Because of cloud computing, DCNs are growing ever larger and will grow further still in the future. A DCN contains thousands of servers, and interconnecting all of those servers is the challenge researchers face. They are generally connected via network interface cards, cables, routers, and switches, and the placement of all these devices is itself a design challenge.
Going forward with lab 2, the focus was placed on core services seen on enterprise networks, what they do to keep the network running efficiently, and how to centrally monitor them. Some of these services (DNS and DHCP) were review from the "Network Services" class, but by taking the basic concepts and applying them to an already running network, instead of creating a new environment specifically to test them, students learned valuable lessons about integrating important services into an already thriving network. At the end of lab 1, every virtual machine was given a static IP address, which could be recorded in the wiki and referenced by other machines if needed, such as by the Opsview software, which needed to know the location of the other machines for monitoring purposes. With DHCP and DNS now in place, the focus of locating resources on the network moved from logical IP addressing to domain names. DHCP now dynamically hands out any address within its given range (a pool from our subnet), and it might not always be the same every time, meaning two things had to change. The first was adding reservations on the ISC DHCP server so that systems already in place keep receiving what were previously their static addresses, as sketched below; the second was letting resolver software on the virtual machines query the new DNS server for the IP address of a VM, rather than someone having to look up the address of the machine by hand.
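For reference, a host reservation in ISC dhcpd.conf takes roughly the following form; the host name, MAC address, and fixed address here are hypothetical stand-ins for the real entries recorded in the lab wiki.

    host opsview {
        # Match the client by its MAC address...
        hardware ethernet 00:0c:29:aa:bb:cc;  # hypothetical MAC
        # ...and always hand it the address it held statically in lab 1.
        fixed-address 192.168.1.10;           # hypothetical IP
    }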
The concept initially focused on decoupling the network control plane from the network forwarding plane. Eventually the focus shifted to providing programmatic interfaces into network equipment, which broadens its value. Looked at this way, SDN allows IT organizations to replace a manual interface into networking equipment with a programmatic one that can automate tasks such as configuration and policy management, and that can let the network respond dynamically to application requirements.
Another topic to be discussed in our project is Software Defined Networking. In the early days, when the Internet was new to the world, educational organizations and corporations were the only ones with access to it, and they were mainly concerned with transferring data and files. After a decade, ordinary people also started using the Internet for their day-to-day activities, which led to its expansion. During this period, people were mainly concerned with the speed at which files were transferred. This revolution in the networking environment brought many high-speed networking technologies into play, such as routers, switches, and data centers, used to transfer files and data at very high speed. The speed was achieved by including a control plane and a data plane in the networking devices. The control plane took care of routing and selecting the path for the data, whereas the data plane took care of forwarding packets based on the commands given by the control plane, such as the port on which the data should be sent. The control plane thus acted as a brain. Both the control plane and the data plane used to live in the same piece of networking equipment, so when a packet arrived at a router or switch, that device decided by itself the path onto which the data should be forwarded. This technology brought a tremendous improvement in speed.
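A toy Python sketch of that split is shown below: the control-plane function computes and installs routes into a match-action table, and the data-plane function only performs lookups. In SDN, the first function is what moves off the box and onto a central controller. The prefixes and ports are hypothetical, and exact-prefix lookup stands in for real longest-prefix matching to keep the sketch short.

    forwarding_table = {}  # prefix -> output port, installed by the control plane

    def control_plane_install(prefix: str, port: int) -> None:
        # Traditionally this decision logic (e.g. OSPF/BGP) runs on the same
        # device; SDN moves it to a separate, programmable controller.
        forwarding_table[prefix] = port

    def data_plane_forward(dst_ip: str) -> int:
        # The data plane just matches the packet against installed entries.
        prefix = ".".join(dst_ip.split(".")[:3]) + ".0/24"
        return forwarding_table.get(prefix, -1)  # -1 means drop or punt

    control_plane_install("10.0.1.0/24", port=3)
    print(data_plane_forward("10.0.1.42"))  # -> 3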