DATA CENTER

by Daniella Z

on 20 October 2015


Transcript of DATA CENTER

Though training can help administrators understand how virtualization works, one important obstacle to its expansion is that they must know how it is organized, implemented, sustained, and controlled.
The more they master these skills, the easier it will be for companies to adopt the technology. Even though such challenges exist, the creators and vendors of virtualization are finding ways to overcome them, and we are nevertheless seeing virtualization expand.

DATA CENTERS
VIRTUALIZATION

VIRTUALIZATION HISTORY AND EVOLUTION
OUTLOOK
One great quality of virtualization is that it allows users to operate in both a cost-effective and time-efficient manner.

The system is intuitive, which makes it very simple to manage.

Compared to other structures, it offers more options while uniting both the server and the infrastructure.



- Virtualization surfaced from a need in the 1960s to partition large mainframe hardware.

- First implemented by IBM more than 30 years ago.

- Improved in the 1990s to allow mainframes to multitask.



PROS OF VIRTUALIZATION
To start using virtualization, two important elements must be considered.

1. Training is obligatory; potential users must learn how to navigate the system beforehand.

2. It is essential to know that sufficiently powerful processors are required to create a virtual environment.

- Virtualization essentially creates a virtual computer that can run its own operating system.

Virtualization reproduces essential elements of the computer, such as the hard disk and the RAM

Virtualization effectively turns hardware into software




How Virtualization Works

Virtualization permits one processor
to do the job of several processors.

Virtual environments let one computer host multiple operating systems at the same time.
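
As an illustration not taken from the slides, here is a minimal sketch of one host starting two guest operating systems with QEMU; it assumes QEMU is installed and that the two disk images named below (hypothetical names) already exist.

```python
# Minimal sketch: one physical host running two guest operating systems at once.
# Assumes QEMU is installed and the two disk images below (hypothetical names) exist.
import subprocess

guests = [
    {"image": "guest_linux.img", "memory_mb": 1024},
    {"image": "guest_bsd.img", "memory_mb": 2048},
]

processes = []
for guest in guests:
    cmd = [
        "qemu-system-x86_64",
        "-m", str(guest["memory_mb"]),                  # RAM given to this guest
        "-smp", "2",                                    # virtual CPUs presented to it
        "-drive", f"file={guest['image']},format=raw",  # its virtual hard disk
    ]
    processes.append(subprocess.Popen(cmd))             # both guests share one physical machine

for proc in processes:
    proc.wait()
```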


What is Virtualization?

CONS OF VIRTUALIZATION
What is a data center?
- A physical or virtual facility that stores, manages, and disseminates data & information

- Data centers may exist privately within an organization, or publicly within a national organization such as the National Climatic Data Center
SECURITY
- VIRTUALIZATION

- DISASTER RECOVERY/ CONTINGENCY

- DATA STORAGE

- SECURITY

- SPEED
Aspects of Data Centers
History of Computer Security
- Initial security development originated from the military

- As computer use became more prevalent, the need to protect intellectual property, bank accounts, and stored data increased as well

1970s:

- The U.S. National Bureau of Standards issued a federal data encryption standard
- An algorithm created by IBM was adopted in 1977 as what is known today as the Data Encryption Standard (DES)


HOW TO PREVENT SECURITY BREACHES
- ALWAYS QUESTION LINKS, DOWNLOAD REQUESTS, AND ANY UNKNOWN WEBSITES

- TURN ON SOFTWARE UPDATES FOR ANY OPERATING SYSTEM/APPLICATIONS YOU MAY USE

- SECURE ALL HOME NETWORKS WITH PASSWORD AND FIREWALL PROTECTION

- REGULARLY CHANGE PASSWORDS (see the sketch below)
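
On the last point, a minimal sketch of generating a fresh, strong password with Python's standard secrets module; the 16-character length is an arbitrary choice.

```python
# Minimal sketch: generating a strong random password when rotating credentials.
# The 16-character length is an arbitrary choice for the example.
import secrets
import string

alphabet = string.ascii_letters + string.digits + string.punctuation
password = "".join(secrets.choice(alphabet) for _ in range(16))
print(password)
```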

ISO 27000
- A standard that has been put into place to help organizations protect themselves against security threats

- Establishes measures and codes for organizations to ensure security for their customers, employees, shareholders, and others

Computer processor
Cables and Modem
Wireless Routers
USB Modems
Internet Variables
Amount of Memory Being Used


Factors Affecting
Internet Speed

Dial-up connection
Broadband connections
Satellite Internet service
Wireless connection

Ways to connect to the Internet

RAM and ROM (fastest)
Hard Disk
Flash Memory Stick
DVD/CD
Zip Disk
Floppy Disk
Magnetic Tape (slowest)

Storage Device
Order of speed of access to data

Swapping

The amount of RAM memory
The speed and generation of the CPU (the system clock)
The size of the Register on the CPU
The Bus type and speed
The amount of Cache memory

Factors Affecting Processor Speed
on a computer

- Data access speed
- Internet access speed

THE SPEED OF DATA CENTERS
Cache Memory

A register is a high-speed memory area on
the CPU, which holds data and instructions
currently being processed.

Word size

HOW REGISTERS AFFECT SPEED
More RAM can also make the computer run faster. The computer does not necessarily have to load an entire
program into memory to run it. However, the greater the amount of the program that fits into memory, the faster the program runs.

MEMORY AND COMPUTING
POWER
Sequential access, as used in tape drives, requires a proportionally long time to reach a distant point in the medium, in contrast to direct access, also known as random access.

Data access typically refers to software and activities related to storing, retrieving, or acting on data housed in a database or other repository. There are two types of data access, sequential access and random access; a rough timing sketch follows below.
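
The sketch below times sequential reads against random seeks on an ordinary scratch file; the file name, size, and block size are arbitrary choices, and on an SSD the gap is far smaller than on tape or spinning disk.

```python
# Minimal sketch: timing sequential reads versus random seeks on one scratch file.
# File name, size, and block size are arbitrary; absolute numbers depend on the device.
import os
import random
import time

PATH = "access_test.bin"
SIZE = 64 * 1024 * 1024      # 64 MB scratch file
BLOCK = 4096                 # read in 4 KB blocks

with open(PATH, "wb") as f:  # create the scratch file
    f.write(os.urandom(SIZE))

with open(PATH, "rb") as f:  # sequential access: read from front to back
    start = time.perf_counter()
    while f.read(BLOCK):
        pass
    sequential = time.perf_counter() - start

with open(PATH, "rb") as f:  # random access: jump to arbitrary offsets
    offsets = [random.randrange(0, SIZE - BLOCK) for _ in range(SIZE // BLOCK)]
    start = time.perf_counter()
    for off in offsets:
        f.seek(off)
        f.read(BLOCK)
    random_access = time.perf_counter() - start

os.remove(PATH)
print(f"sequential: {sequential:.2f} s, random: {random_access:.2f} s")
```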


DATA ACCESS SPEED
Internet access connects individual computer terminals, computers, mobile devices, and computer networks to the Internet, enabling users to access Internet services.

Internet access speed

A measurement of how fast data can
be transferred from the Internet to a
connected computer.
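
For example, the time to transfer a file is simply its size divided by that speed; the 1 GB file and 100 Mbit/s link below are arbitrary illustrative figures.

```python
# Worked example: transfer time = file size / connection speed.
# The 1 GB file and 100 Mbit/s link are arbitrary illustrative figures.
file_size_bits = 1 * 10**9 * 8     # 1 GB expressed in bits
link_speed_bps = 100 * 10**6       # 100 megabits per second
print(file_size_bits / link_speed_bps, "seconds")   # 80.0 seconds
```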




Some storage devices can access data very quickly, whilst others are extremely slow...


Access speeds are measured in bytes per second (Bps).

Slow devices have speeds measured in thousands of Bps (kBps).

E.g. a floppy disc can save/read data at a speed of 60kBps

Fast devices have speeds measured in millions of Bps (MBps).

E.g. a hard-drive can save/read data at a speed of 300MBps
(5000 times quicker than the floppy!)
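
A quick check of that ratio, keeping both speeds in bytes per second:

```python
# Verifying the speed ratio quoted above, with both values in bytes per second.
floppy_bps = 60 * 1_000             # 60 kBps
hard_drive_bps = 300 * 1_000_000    # 300 MBps
print(hard_drive_bps / floppy_bps)  # 5000.0
```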

Speed

- The internal (or system) bus
- The external (or expansion) bus

The Bus

 

The Computer's
Internal Clock

Throughput bottlenecks by component:

Sender App
- Bottlenecked by CPU, disk, etc.
- Slow due to app design (small writes)

Send Buffer
- Not large enough

Network
- Fast retransmission
- Timeout

Receiver
- Not reading fast enough (CPU, disk, etc.)
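
As a small illustration of the send-buffer point above, the sketch below reads a TCP socket's send buffer size and enlarges it; the 1 MB target is an arbitrary example, and the operating system may round or cap the value it applies.

```python
# Minimal sketch: inspecting and enlarging a TCP socket's send buffer.
# The 1 MB target is arbitrary; the OS may round or cap the value it applies.
import socket

sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
default_size = sock.getsockopt(socket.SOL_SOCKET, socket.SO_SNDBUF)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_SNDBUF, 1 * 1024 * 1024)
new_size = sock.getsockopt(socket.SOL_SOCKET, socket.SO_SNDBUF)
print(f"send buffer: {default_size} -> {new_size} bytes")
sock.close()
```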

By: Shiwei Miao
In mainframe computers and
some minicomputers, a direct access
storage device, or DASD, is any
secondary storage device which
has relatively low access time
relative to its capacity.

- Clock speed
“Criminals stole more than $560 million from U.S. firms in 2009, and they did it ‘without drawing a gun or passing a note to a teller.’”

Without maintaining a data center through regulatory checks and governance over information, a company faces many potential risks

Companies tend to spend 70-80% of their technology budgets on information systems

TCO, the total cost of ownership of a data center, is growing immensely year by year

21st Century

Many threats such as failing software, terrorism, vandalism, dumpster divers, shoulder surfers, hackers, etc.

Database management systems (DBMS) help keep provisions for backing up systems

Disaster recovery plans should always be set in place and maintained in order to sustain data and protect against any potential threats or impacts on information

Outsourcing to overseas and faraway locations



PREVENTION

One of the leading technology companies in the world did a study on its own disaster recovery plans

A comparison of whether in-house or outsourced disaster recovery plans are more beneficial to the company

Costs and impacts associated with the decision of choosing a disaster recovery plan

The management team looks at many factors, including availability of expertise, ensuring ongoing funding, proper resources, routine checks and evaluations, and whether there is a focus on continuous improvement within their data centers




IBM


The history of disaster recovery began in the 1970s, when the technological advancement of computers made companies more dependent on their information systems
Disaster recovery plans began to really take off in the 80s and 90s, when awareness of potential human-induced or natural disasters grew as information became more available
Today, nearly every organization is online, making any Internet-connected network a potential entry point for the growing worldwide community of computer criminals



History

Strict regulations and continuous improvement in securing data should be a priority for all businesses
Disaster recovery plans should always be set in place and maintained in order to sustain data and protect against any potential threats or impacts on information
Through continued privacy, changing of passwords, and limiting the release of information, companies can surely reduce the threat of cyber-warfare attacks
Outsourcing to third parties and having secure in-house replication of data and information


FUTURE

Computer worm discovered in 2010

Covert hacking and sabotage directed against Iranian uranium enrichment centrifuges

“The attack was so sophisticated that it even altered equipment readings to report normal activity so that operators didn’t even know something was wrong until it was too late”

Shows how advanced technology has become in the 21st century


STUXNET


Disaster recovery is used after information system crashes, natural disasters, or human/criminal attacks happen to a data center.
Disaster recovery comprises the efforts companies take to prevent incidents from occurring, as well as the aftermath procedures in case a disaster does occur.
Data theft, cyber criminals, and natural causes are some of the main causes of data center disasters.




What is Disaster Recovery?

Disaster Recovery

1960s: Mainframe computer containing a CPU, a memory cache and storage in one container

1980s: Boom of microcomputer (birth of IBM Personal Computer) – installed everywhere

1990s: Companies began to put server rooms within their company walls

2000s: 5.75 million new servers deployed every year, and the number of government data centers rose from 432 in 1999 to 1,100+



Data Centers Timeline
Computer components that have the ability to retain data

Data storage predictions:
2006-2011: 200 exabytes to nearly 2 zettabytes, a tenfold growth
2015: Increase to over 8 zettabytes
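
A quick unit check on those figures (1 zettabyte = 1,000 exabytes):

```python
# Unit check for the growth figures above.
eb_2006 = 200             # roughly 200 exabytes in 2006
eb_2011 = 2 * 1000        # roughly 2 zettabytes in 2011, i.e. 2,000 exabytes
print(eb_2011 / eb_2006)  # 10.0 -- the tenfold growth quoted above
```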

WHAT IS DATA STORAGE?
Data storage

With the help of fast interconnect technologies, disaggregation became possible once again

Data can be sent and received with practically no latency

Separate entities that work together for better allocation of storage space and better utilization of data

BACK TO DISAGGREGATION

Reducing the distance between compute and storage allowed for better data transfer and the possibility of real-time data usage

Disadvantage: new servers have less flexibility, higher cost, and more wasted storage space



TO AN AGGREGATE STORAGE

Eliminates physical data storage systems, replacing them with hardware clouds

As-needed basis: easily modifiable storage space and simultaneous real-time sharing between many different machines (see the sketch below)

Disadvantages: technical issues (power outages, unsaved data), vulnerability of data in the cloud
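
A minimal sketch of that sharing idea using Amazon S3 through the boto3 library; the bucket and file names are hypothetical, and AWS credentials are assumed to be configured in the environment.

```python
# Minimal sketch: two machines sharing one object through cloud storage.
# Bucket and file names are hypothetical; AWS credentials are assumed to be
# configured in the environment.
import boto3

s3 = boto3.client("s3")

# Machine A uploads a file to the shared bucket.
s3.upload_file("report.csv", "example-shared-bucket", "reports/report.csv")

# Machine B, anywhere with network access, downloads the same object.
s3.download_file("example-shared-bucket", "reports/report.csv", "report_copy.csv")
```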

CLOUD COMPUTING

Chip performance doubles roughly every eighteen months

For RAM chips and flash memory, in eighteen months you will pay the same price as today for twice as much storage
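
A quick way to see what that doubling implies over time; the five-year horizon is an arbitrary example.

```python
# Worked example of the 18-month doubling rule over a 5-year horizon (arbitrary choice).
years = 5
doublings = years * 12 / 18          # number of 18-month periods in 5 years
growth = 2 ** doublings
print(f"about {growth:.1f}x as much capacity for the same price")  # about 10.1x
```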

MOORE’S LAW

Push-Button Hacking
Hackers create tools that make it easy for the criminally inclined to automate attacks

These tools probe systems for vulnerabilities
and then launch attacks
Alexsey Belan
FBI's Most Wanted List
Wanted for allegedly compromising the cybersecurity systems of three major US-based e-commerce companies in Nevada and California between January 2012 and April 2013.

He is accused of stealing user databases with passwords, exporting them to his server, and allegedly attempting to sell these databases
Pros


Cons

There must be individuals who fully understand the systems

Businesspeople must be trained and able to apply their newfound techniques
Detects Cyber Crimes

Protects Business and Individual Data

Prevents Internal and External Threats
Future
Intelligence officials predict that within five years hackers will acquire the cyberattack capabilities that we now associate with criminal gangs or nation states

Example: online sabotage of industrial control systems that run power plants, factories and utilities
Security programs must be improved, and if the current technological trend continues, there is little doubt that new programs will be developed that can prevent these attacks