
Data Center Design Presentation Templates



Transcript: Templates Creation Process – Save Templates with Content

Current Process (advantages and disadvantages): 1- The user creates a new Space. 2- Over time it is modified according to the project's needs. 3- Once the Space proves useful for a whole company or area, the user asks for it to be saved as a Template. 4- The Collaborate Team takes care of this process, but during it we usually face some issues. Some issues are: URLs that are inherited; duplicated Columns that couldn't be removed; duplicated Content Types that couldn't be removed; features that couldn't be enabled; missing features; hidden features; .dotx files required by the old Library Template.

New Process: 1- The user creates a new Space. 2- Over time it is modified according to the project's needs. 3- Once the Space proves useful for a whole company or area, the user asks for it to be saved as a Template. 4- The Collaborate Team takes care of this process. 5- The new process covers the analysis of the Space that should be saved as a Template and the estimation to finish it. 6- We should also take the current release dates into account in order to give the user a go-live date.

Issues Estimation, and items to keep in mind to know in which release the Template will go live: 1- The complexity of the data inside the Space and its estimation. 2- The issues found while testing the Space and the estimation related to them. 3- Simple changes are accepted until 7 days before the first INT deploy. 4- Once the limit to request changes has passed, the Template will go live in the next release. 5- If the user requests new changes after the first INT deploy, they will be performed in the next release.

This process will be easier if... - The Site Collections are aligned. - The user doesn't modify the Template while the support team is working on it. - The changes are planned ahead of time.

Data Templates, Templates and Styling

Transcript: WPF Templating – Styles, Templates and Triggers

Styles: Styles apply a set of property values to an element. They change the appearance of any style-related (read: aesthetic) property and are used when you need a common set of property values on multiple elements. The default styled element is FrameworkElement; specify TargetType to style other elements.

Style Inheritance: Styles can be inherited from other styles. Simply use the "BasedOn" property in the Style element.

Triggers: Triggers define a list of setters that are activated when a specific condition is met. Property Triggers operate on WPF dependency properties, e.g. IsMouseOver = true / IsMouseOver = false. Data Triggers work much like property triggers but operate on any object property instead of just WPF dependency properties. Event Triggers are hooked up to routed events and generally spark an animation-related action on the element to which the style applies.

Templates vs. Styles: Styles are used to adjust properties of an element; templates can replace the entire visual tree of an element. For example, use a template to replace the background of a button with an ellipse or path object.

Logical Tree vs. Visual Tree: The logical tree is a representative tree that describes the hierarchical composition of a WPF application. The visual tree takes that hierarchical tree and expands its elements into all of their visual components.

Control Templates: A control template is a XAML "recipe" for a control. We can copy the contents of a control template and modify it to "reskin" a core WPF control. Although you can find the templates for the WPF core controls in the MSDN documentation, the easiest way is to use Expression Blend: add a control, then right-click on the control -> Edit Template.

Template Bindings: Used to bind to properties of the control to which you are applying the template. Useful when the property is not specific to the aspects you are customizing.

Data Templates: A chunk of XAML markup that defines how a bound data object should be displayed.

Hierarchical Data Templates: Templates used for hierarchical data such as TreeViewItem and MenuItem.

Item Panel Templates: Templates used to override how objects are laid out in a Panel (or any class derived from Panel).

Templates Refactoring: It is better practice to store templates and other large resources as separate resource dictionaries. As an example, we would store our custom ListBox in a resource dictionary called CustomListBox.xaml and import it into our App when we want to instantiate that custom control.
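To tie these ideas together, here is a minimal, illustrative XAML sketch (not taken from the presentation): the resource keys BaseTextStyle, HighlightTextStyle, EllipseButtonStyle and TaskItemTemplate, and the bound properties Name and IsOverdue, are assumed names invented for the example. It shows a style, style inheritance via BasedOn, a property trigger, a control template with a TemplateBinding, and a data template with a data trigger.

```xml
<!-- Illustrative resource dictionary; per the refactoring advice above, this could
     live in its own file (e.g. a hypothetical CustomControls.xaml). -->
<ResourceDictionary xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
                    xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml">

  <!-- Style: a common set of property values, applied per TargetType. -->
  <Style x:Key="BaseTextStyle" TargetType="TextBlock">
    <Setter Property="FontSize" Value="14" />
    <Setter Property="Foreground" Value="DarkSlateGray" />
  </Style>

  <!-- Style inheritance: BasedOn reuses the base setters, then adds a property
       trigger whose setter is active while IsMouseOver is true. -->
  <Style x:Key="HighlightTextStyle" TargetType="TextBlock"
         BasedOn="{StaticResource BaseTextStyle}">
    <Setter Property="FontWeight" Value="Bold" />
    <Style.Triggers>
      <Trigger Property="IsMouseOver" Value="True">
        <Setter Property="Foreground" Value="OrangeRed" />
      </Trigger>
    </Style.Triggers>
  </Style>

  <!-- Control template: replaces the button's visual tree with an ellipse;
       TemplateBinding pulls Background from the control being templated. -->
  <Style x:Key="EllipseButtonStyle" TargetType="Button">
    <Setter Property="Background" Value="SteelBlue" />
    <Setter Property="Template">
      <Setter.Value>
        <ControlTemplate TargetType="Button">
          <Grid>
            <Ellipse Fill="{TemplateBinding Background}" />
            <ContentPresenter HorizontalAlignment="Center" VerticalAlignment="Center" />
          </Grid>
        </ControlTemplate>
      </Setter.Value>
    </Setter>
  </Style>

  <!-- Data template: how a bound data object should be displayed; the data trigger
       reacts to an ordinary CLR property (IsOverdue), not a dependency property. -->
  <DataTemplate x:Key="TaskItemTemplate">
    <TextBlock Text="{Binding Name}">
      <TextBlock.Style>
        <Style TargetType="TextBlock" BasedOn="{StaticResource BaseTextStyle}">
          <Style.Triggers>
            <DataTrigger Binding="{Binding IsOverdue}" Value="True">
              <Setter Property="Foreground" Value="Red" />
            </DataTrigger>
          </Style.Triggers>
        </Style>
      </TextBlock.Style>
    </TextBlock>
  </DataTemplate>

</ResourceDictionary>
```

A window could then merge this dictionary into its resources and use, for example, Style="{StaticResource EllipseButtonStyle}" on a Button, or TaskItemTemplate as the ItemTemplate of a ListBox bound to task objects.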

Data Center

Transcript: Data Center vs. The Cloud

What is a Data Center? A data center is a physical location that stores a computer network's most important systems and operations, including back-up power supplies, air conditioning, and security applications. Its function is "to compile and protect the data of a person or company."

Data Center: A Brief History. The idea of data centers arose in the 1980s, as computers became more popular; back then the data center was referred to as a computer room and was used to house the main servers. Today, setting up a data center comes with a specific set of needs for design, construction and operation.

What is Cloud Computing? Cloud computing is the practice of storing regularly used computer data on multiple servers that can be accessed through the internet: "to use off-site, internet-based programs and services instead of those based on the local computer" (basically, a fancy term for internet storage).

Cloud Computing: A Brief History. The idea of cloud computing arose in the 1980s with the demand for less expensive yet powerful processors. It grew in the 1990s as technology and computing began to boom and grid and utility computing became popular. Today the term cloud computing means "a collection of services delivered via the internet," customizable to fit the needs of a business or industry.

Consider: What are the potential benefits of a data center? Is it more costly to construct and maintain a data center than a cloud computing storage method?

Data Center vs. Cloud: Operation Costs. On average a company would spend $10-25 million per year to operate and maintain a large data center. 58% of that goes towards hardware, air conditioning, property and sales taxes, and labor costs; 42% goes towards hardware, software, disaster recovery arrangements, uninterrupted power supplies, and networking. Costs can quickly climb higher if maintenance is necessary. Many people think that because cloud computing is an off-site, internet-based method of storage it must be cheaper; this is false. To convert to cloud computing a company must consider the cost of moving its data, upload and download fees, labor costs, and so on. The cloud can cost 3 to 4 times more than on-site data storage, and most of the cloud's costs go towards labor and the initial moving of data.

So, which is better? A data center is attractive for any large company because it is a safe and secure location that can house as much data as desired; although it can prove costly, most of the cost lies in the construction of the facility, and the only major continuing costs are maintenance, labor and air conditioning. Cloud computing is a smart way to go for a business because storage is unlimited and can be accessed anywhere, since the data is stored in an off-site, internet location. Both data centers and cloud computing have pros and cons, and both can be quite expensive. When deciding between a data center and the cloud, it all depends on the type of business that is using the system, how much it is willing to spend, and which feels like a better fit.

Sources:
http://www.computerworld.com/s/article/359383/The_Real_Costs_of_Cloud_Computing?taxonomyId=154&pageNumber=2
http://www.compu-dynamics.com/%E2%80%98cloud-first%E2%80%99-does-not-mean-colo-first/
http://www.dummies.com/how-to/content/comparing-traditional-data-center-and-cloud-data-c.html
http://www.datacenterknowledge.com/archives/2010/08/23/comparing-the-cost-of-cloud-vs-colocation/
http://envoc.com/web-expertise/data-center-or-cloud/
http://www.nytimes.com/2012/09/23/technology/data-centers-waste-vast-amounts-of-energy-belying-industry-image.html?pagewanted=all&_r=0
http://www.merriam-webster.com/dictionary/cloud%20computing
http://www.datacentermap.com/usa/north-carolina/charlotte/
http://m.wisegeek.com/what-is-a-data-center.htm
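As a rough, purely illustrative check of the split quoted above (the $10 million budget is simply the low end of the quoted range, chosen here for arithmetic convenience):

$$0.58 \times \$10\,\text{million} = \$5.8\,\text{million}, \qquad 0.42 \times \$10\,\text{million} = \$4.2\,\text{million},$$

and, taking the presentation's 3-to-4-times claim at face value, an on-site storage cost of $C$ per year would correspond to roughly $3C$ to $4C$ in the cloud.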

Data Center Design Group

Transcript: IT Capstone: Data Center Design Group

Description of the Project: The purpose of this cloud-hosted OpenStack data center is to give students in the web page classes remote access to the data center, which will make things easier for the end users. The data center will also be used by professors for research purposes, to better educate future Georgia Southern University students.

Our client's goals: Our client asked us to create a database using OpenStack so that many virtualized operating systems and image files could be accessed through the OpenStack dashboard. The server must be able to handle different types of web interfaces so that students can access their specific IP addresses and domain names.

Where we are now: The IpCop firewall was installed to serve as our firewall, NAT router, and DHCP server; with the NAT router and DHCP server we are able to hand out private IPs throughout our network. We installed Ubuntu Server as our main open-source operating system and OpenStack "Grizzly" as our IaaS (Infrastructure as a Service) for the cloud infrastructure. By using Ubuntu as the main operating system we have avoided licensing issues. (Screenshots: OpenStack dashboard, image files, and OpenStack network topology.)

Our goals and future plans: Move the server rack into the data center in room 2204, remove old cables from the room and make cable management more effective for our rack, and launch the virtual machines from the OpenStack dashboard.

DATA CENTER

Transcript: DATA CENTER. By: Shiwei Miao

What is a data center? Topics covered: virtualization, disaster recovery/contingency, data storage, security, and speed (data access speed and internet access speed), with pros and cons of each.

History: 1960s: mainframe computers containing a CPU, a memory cache and storage in a single container. 1980s: the boom of the microcomputer (birth of the IBM Personal Computer), installed everywhere. 1990s: companies began to put server rooms inside their own walls. 2000s: 5.75 million new servers deployed every year, and government data centers rose from 432 in 1999 to more than 1,100.

VIRTUALIZATION, HISTORY AND EVOLUTION: Virtualization surfaced from a need in the 1960s to partition large mainframe hardware. It was first implemented by IBM more than 30 years ago and was improved in the 1990s to allow mainframes to multitask.

HOW VIRTUALIZATION WORKS, PROS OF VIRTUALIZATION: One great quality of virtualization is that it lets users operate in both a cost-effective and a time-efficient manner. The system is intuitive, which makes it very simple to manage. Compared to other structures, it offers more options in its system while uniting both the server and the infrastructure.

OUTLOOK / FUTURE: Administrators must be individuals who fully understand the systems, and business people must be trained and able to apply their new-found techniques. Although training can help people understand how virtualization works, one important obstacle to its expansion is that administrators have to know how it is organized, implemented, sustained and controlled. The more they master such skills, the easier it will be for companies to start using the process; and even though those challenges exist, the creators and vendors of virtualization are finding ways to overcome the issues, and we are seeing virtualization expand nevertheless.

WHAT IS DISASTER RECOVERY? Disaster recovery covers the efforts companies make to prevent incidents from occurring, as well as the aftermath procedures in case a disaster does occur. It is used after an information system crashes, or after natural disasters or human/criminal attacks hit a data center. Data theft, cyber criminals, and natural causes are some of the main reasons for a data center disaster. The history of disaster recovery began in the 1970s, when the technological advances of computers made companies more dependent on their information systems. Disaster recovery plans really took off in the 80s and 90s, as awareness of these potential human-induced or natural disasters grew and information became more available. Today nearly every organization is online, making any internet-connected network a potential entry point for the growing worldwide community of computer criminals.

SECURITY: Initial security development originated with the military. As computer use became more prevalent, the need to protect intellectual property, bank accounts, and stored data increased as well. In the 1970s the US National Bureau of Standards issued a data encryption standard; IBM created the algorithm, adopted in 1977, known today as the Data Encryption Standard. Hackers create tools that make it easy for the criminally inclined to automate attacks; these tools probe systems for vulnerabilities and then launch attacks. Stuxnet, a computer worm discovered in 2010, was used for secret hacking and destruction of information against Iranian uranium-enrichment centrifuges: "The attack was so sophisticated that it even altered equipment readings to report normal activity, so that operators didn't even know something was wrong until it was too late." It proves how advanced technology has become in the 21st century. Alexsey Belan is wanted for allegedly compromising the cyber-security systems of three major US-based e-commerce companies in Nevada and California between January 2012 and April 2013; he is accused of stealing and exporting user databases with passwords.

HOW TO PREVENT SECURITY BREACHES / PREVENTION: Always question links, download requests, and any unknown websites. Turn on software updates for any operating system or application you use. Secure all home networks with password and firewall protection. Regularly change passwords. Security programs must keep improving, and if the current technological trend continues there is very little doubt that new programs will be developed that are able to prevent these attacks.

DATA STORAGE: Computer components that have the ability to retain data. In mainframe computers and some minicomputers, a direct access storage device, or DASD, is any secondary storage device that has a relatively low access time relative to its capacity. Data storage predictions: from 2006 to 2011, stored data grew from 200 exabytes to nearly 2 zettabytes, a ten-fold growth; by 2015 it was expected to increase to over 8 zettabytes.

DATA ACCESS SPEED, HOW REGISTERS AFFECT SPEED, MOORE'S LAW: Internet access connects individual computer terminals, computers, mobile devices, and computer networks to the internet, enabling users to access internet services. Internet access speed is a measurement of how fast data can be transferred from the internet to a connected computer.
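As a small worked example of the transfer-speed definition above (the 1 GB file size and the 100 Mbit/s connection are assumed values chosen for illustration, not figures from the presentation):

$$ t = \frac{\text{data size}}{\text{transfer rate}} = \frac{1\ \text{GB} \times 8\ \text{bits per byte}}{100\ \text{Mbit/s}} = \frac{8000\ \text{Mbit}}{100\ \text{Mbit/s}} = 80\ \text{seconds}, $$

so about a minute and twenty seconds to pull one gigabyte from the internet at that speed.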
