Friday, July 26, 2013

Facebook data


With more than 900 million active users, Facebook is the busiest site on the Internet and has built an extensive infrastructure to support this rapid growth. The social networking site was launched in February 2004, initially out of Facebook founder Mark Zuckerberg’s dorm room at Harvard University and running on a single server. The company’s web servers and storage units are now housed in data centers around the country.

Each data center houses thousands of computer servers, which are networked together and linked to the outside world through fiber optic cables. Every time you share information on Facebook, the servers in these data centers receive the information and distribute it to your network of friends.

We’ve written a lot about Facebook’s infrastructure, and have compiled this information into a series of Frequently Asked Questions. Here’s the Facebook Data Center FAQ (or “Everything You Ever Wanted to Know About Facebook’s Data Centers”).

How Big is Facebook’s Internet Infrastructure?

Facebook is currently the world’s most popular web site, with more than 1 trillion page views each month, according to metrics from Google’s DoubleClick service. Facebook currently accounts for about 9 percent of all Internet traffic, slightly more than Google, according to HitWise.

Facebook requires massive storage infrastructure to house its enormous stockpile of photos, which grows steadily as users add 300 million new photos every day. In addition, the company’s infrastructure must support platform services for more than 1 million web sites and 550,000 applications using the Facebook Connect platform.
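As a rough, back-of-the-envelope illustration of what that photo volume means for storage, the short Python sketch below multiplies the 300 million photos per day figure from above by an assumed average stored size; the 2 MB per photo value is a hypothetical assumption for illustration only, not a figure Facebook has published.

# Back-of-the-envelope estimate of daily photo storage growth.
# The 300 million photos/day figure comes from the text above; the
# average stored size per photo (2 MB) is a hypothetical assumption.
PHOTOS_PER_DAY = 300_000_000
AVG_STORED_BYTES = 2 * 1024 ** 2   # assumed ~2 MB per photo (hypothetical)

daily_bytes = PHOTOS_PER_DAY * AVG_STORED_BYTES
daily_terabytes = daily_bytes / 1024 ** 4

print(f"New photo storage per day: about {daily_terabytes:,.0f} TB")
# Under these assumptions, roughly 570 TB of new photo data arrives every day.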

To support that huge activity, Facebook has built two huge data centers, has two more under construction, and leases additional server space in at least nine data centers on both coasts of the United States. More than 70 percent of Facebook’s audience is in other countries, prompting Facebook to announce its first non-U.S. data center in Lulea, Sweden.

The company’s massive armada of servers and storage must work together seamlessly to deliver each Facebook page. “Loading a user’s home page typically requires accessing hundreds of servers, processing tens of thousands of individual pieces of data, and delivering the information selected in less than one second,” the company said.
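A minimal sketch of the fan-out pattern that statement implies is shown below: a front-end process queries many backend servers concurrently and must assemble the results within a fixed time budget. The backend names, the simulated lookup, and the one-second deadline are illustrative assumptions, not Facebook's actual code or API.

# Minimal sketch of a fan-out request pattern: query many backend
# "servers" in parallel and assemble the results under a time budget.
# The backends and fetch logic here are stand-ins, not Facebook's systems.
import asyncio
import random

async def fetch_fragment(backend: str) -> str:
    # Simulate a backend lookup with a small, variable latency.
    await asyncio.sleep(random.uniform(0.01, 0.1))
    return f"data from {backend}"

async def build_page(backends: list[str], deadline_s: float = 1.0) -> list[str]:
    # Gather every fragment concurrently, bounded by the overall deadline.
    tasks = [fetch_fragment(b) for b in backends]
    return await asyncio.wait_for(asyncio.gather(*tasks), timeout=deadline_s)

if __name__ == "__main__":
    servers = [f"backend-{i}" for i in range(200)]  # "hundreds of servers"
    fragments = asyncio.run(build_page(servers))
    print(f"assembled {len(fragments)} fragments within the deadline")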

For most of its history, Facebook has managed its infrastructure by leasing “wholesale” data center space from third-party landlords. Wholesale providers build the data center, including the raised-floor technical space and the power and cooling infrastructure, and then lease the completed facility. In the wholesale model, users can occupy their data center space in about five months, rather than the 12 months needed to build a major data center. This has allowed Facebook to scale rapidly to keep pace with the growth of its audience.

Where are Facebook’s Data Centers Located?

The first building at Facebook’s data center campus in Oregon, built on a high plain above the small town of Prineville (Population: 10,000).

In January 2010 Facebook announced plans to build its own data centers, beginning with a facility in Prineville, Oregon. Building its own facilities typically requires a larger up-front investment in construction and equipment, but allows greater customization of power and cooling infrastructure. The social network has since announced plans for data centers in Forest City, North Carolina (November 2010) and Lulea, Sweden (October 2011). The company has brought the facilities in Prineville and North Carolina online and has begun work on second data centers on both campuses.

Facebook currently leases space in about six different data centers in Silicon Valley, located in Santa Clara and San Jose, and at least one in San Francisco. The company has also leased space in three wholesale data center facilities in Ashburn, Virginia. Both Santa Clara and Ashburn are key data center hubs, where hundreds of fiber networks meet and connect, making them ideal for companies whose content is widely distributed.

If Facebook’s growth continues at the current rate, it will likely require a larger network of company-built data centers, as seen with Google, Microsoft, Yahoo and eBay.

How Big Are Facebook’s Server Farms?

An aerial view of the 300,000 square foot Facebook data center in Forest City, North Carolina.

As Facebook grows, its data center requirements are growing along with it. The new data center in Oregon was announced as being 147,000 square feet. But as construction got rolling, the company announced plans to add a second phase to the project, which will add another 160,000 square feet. That brings the total size of the Prineville facility to 307,000 square feet of space – larger than two Wal-Mart stores. The North Carolina facility is about the same size.

In the leased data centers where it operates, Facebook typically leases between 2.25 megawatts and 6 megawatts of power capacity, or between 10,000 and 35,000 square feet of space. Due to the importance of power for data centers, most landlords now price deals using power as a yardstick, with megawatts replacing square feet as the primary benchmark for real estate deals.
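The numbers above imply a rough power density, which is why landlords quote deals in megawatts rather than floor area. The short calculation below derives that density from the figures in the paragraph; pairing the low capacity with the smaller footprint and the high capacity with the larger one is an assumption made only for illustration.

# Implied power density from the leased-capacity figures above.
# Pairing 2.25 MW with 10,000 sq ft and 6 MW with 35,000 sq ft is an
# illustrative assumption; the text gives only the two ranges.
leases = [
    (2_250_000, 10_000),   # watts, square feet
    (6_000_000, 35_000),
]

for watts, sq_ft in leases:
    print(f"{watts / 1e6:.2f} MW over {sq_ft:,} sq ft "
          f"= about {watts / sq_ft:.0f} W per square foot")
# Both work out to roughly 170-225 W per square foot, which is why megawatts,
# not floor area, drive the price of a wholesale data center lease.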


Sunday, July 21, 2013

Google data

Google's Data Liberation Front is an engineering team at Google whose "goal is to make it easier for users to move their data in and out of Google products." The team, which consults with other engineering teams within Google on how to "liberate" Google products, currently supports 27 products. The purpose of the Data Liberation Front is to ensure that data can be migrated away from Google once an individual or company stops using its services.



Friday, July 19, 2013

Home network

A home network or home area network (HAN) is a residential local area network (LAN) for communication between digital devices typically deployed in the home, usually a small number of personal computers and accessories such as printers and mobile computing devices. An important function is the sharing of Internet access, often a broadband service provisioned by fiber-to-the-home, cable Internet access, Digital Subscriber Line (DSL) or mobile broadband from an Internet service provider (ISP). If an ISP provides only one IP address, a router that includes network address translation (NAT), proxy server software and typically a network firewall allows several computers to share the external IP address. The router function may be assumed by a PC with several network interfaces, but a dedicated router device is more common, often including a wireless access point that provides Wi-Fi access.
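One practical consequence of NAT is that devices inside the home network receive addresses from the private RFC 1918 ranges, while only the router holds the single public address assigned by the ISP. The sketch below uses Python's standard ipaddress module to tell the two apart; the sample addresses are arbitrary examples.

# Classify addresses as private (typical inside a NAT'd home network)
# or public (routable on the Internet) using only the standard library.
import ipaddress

samples = ["192.168.1.10", "10.0.0.5", "172.16.4.2", "8.8.8.8"]

for addr in samples:
    ip = ipaddress.ip_address(addr)
    kind = "private (behind NAT)" if ip.is_private else "public"
    print(f"{addr:>15}  ->  {kind}")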

Monday, July 15, 2013

Computer network

A computer network is a telecommunications network that allows computers to exchange data. The physical connection between networked computing devices is established using either cable media or wireless media. The best-known computer network is the Internet.
Network devices that originate, route and terminate the data are called network nodes.[1] Nodes can include hosts such as servers and personal computers, as well as networking hardware. Two devices are said to be networked when a process in one device is able to exchange information with a process in another device.
Computer networks support applications such as access to the World Wide Web, shared use of application and storage servers, printers and fax machines, and the use of email and instant messaging applications. The remainder of this article discusses local area network technologies and classifies them according to the following characteristics: the physical media used to transmit signals, the communications protocols used to organize network traffic, the network's size, its topology and its organizational intent.
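The definition above, that two devices are networked when a process on one can exchange information with a process on the other, can be illustrated with a minimal TCP socket sketch. To keep the example self-contained it runs both processes on the loopback interface; on a real network the client would connect to another host's address instead of 127.0.0.1.

# Minimal sketch of two processes exchanging data over TCP sockets.
# Both ends run on the loopback interface so the example is self-contained;
# on a real network the client would use the server host's address.
import socket
import threading

HOST, PORT = "127.0.0.1", 50007  # arbitrary example port
ready = threading.Event()

def server() -> None:
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.bind((HOST, PORT))
        srv.listen(1)
        ready.set()                          # tell the client it can connect
        conn, _ = srv.accept()
        with conn:
            data = conn.recv(1024)           # receive the other process's message
            conn.sendall(b"ack: " + data)    # send a reply back

threading.Thread(target=server, daemon=True).start()
ready.wait()                                 # avoid connecting before bind/listen

with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
    cli.connect((HOST, PORT))
    cli.sendall(b"hello over the network")
    print(cli.recv(1024).decode())           # prints: ack: hello over the network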

Properties

Computer networking may be considered a branch of electrical engineering, telecommunications, computer science, information technology or computer engineering, since it relies upon the theoretical and practical application of the related disciplines.
A computer network has the following properties:
Facilitates interpersonal communications
People can communicate efficiently and easily via email, instant messaging, chat rooms, telephone, video telephone calls, and video conferencing.
Allows sharing of files, data, and other types of information
Authorized users may access information stored on other computers on the network. Providing access to information on shared storage devices is an important feature of many networks.
Allows sharing of network and computing resources
Users may access and use resources provided by devices on the network, such as printing a document on a shared network printer. Distributed computing uses computing resources across a network to accomplish tasks.
May be insecure
A computer network may be used by computer hackers to deploy computer viruses or computer worms on devices connected to the network, or to prevent these devices from accessing the network (denial of service).
May interfere with other technologies
Power line communication strongly disturbs certain[5] forms of radio communication, e.g., amateur radio. It may also interfere with last mile access technologies such as ADSL and VDSL.
May be difficult to set up
A complex computer network may be difficult to set up. It may be costly to set up an effective computer network in a large organization.

Data center

An operation engineer overseeing a network operations control room of a data center

A data center is a facility used to house computer systems and associated components, such as telecommunications and storage systems. It generally includes redundant or backup power supplies, redundant data communications connections, environmental controls (e.g., air conditioning, fire suppression) and security devices. Large data centers are industrial-scale operations using as much electricity as a small town[1] and are sometimes a significant source of air pollution in the form of diesel exhaust.

History

Data centers have their roots in the huge computer rooms of the early ages of the computing industry. Early computer systems were complex to operate and maintain, and required a special environment in which to operate. Many cables were necessary to connect all the components, and methods to accommodate and organize these were devised, such as standard racks to mount equipment, elevated floors, and cable trays (installed overhead or under the elevated floor). Also, a single mainframe required a great deal of power and had to be cooled to avoid overheating. Security was important – computers were expensive, and were often used for military purposes. Basic design guidelines for controlling access to the computer room were therefore devised.

During the boom of the microcomputer industry, and especially during the 1980s, computers started to be deployed everywhere, in many cases with little or no care about operating requirements. However, as information technology (IT) operations started to grow in complexity, companies grew aware of the need to control IT resources. With the advent of client-server computing during the 1990s, microcomputers (now called "servers") started to find their places in the old computer rooms. The availability of inexpensive networking equipment, coupled with new standards for network structured cabling, made it possible to use a hierarchical design that put the servers in a specific room inside the company. The use of the term "data center," as applied to specially designed computer rooms, started to gain popular recognition about this time.

The boom of data centers came during the dot-com bubble. Companies needed fast Internet connectivity and nonstop operation to deploy systems and establish a presence on the Internet. Installing such equipment was not viable for many smaller companies. Many companies started building very large facilities, called Internet data centers (IDCs), which provided businesses with a range of solutions for systems deployment and operation. New technologies and practices were designed to handle the scale and the operational requirements of such large-scale operations. These practices eventually migrated toward the private data centers, and were adopted largely because of their practical results.

With an increase in the uptake of cloud computing, business and government organizations are scrutinizing data centers to a higher degree in areas such as security, availability, environmental impact and adherence to standards. Standards documents from accredited professional groups, such as the Telecommunications Industry Association, specify the requirements for data center design. Well-known operational metrics for data center availability can be used to evaluate the business impact of a disruption. There is still a lot of development being done in operational practice, and also in environmentally friendly data center design. Data centers are typically very expensive to build and maintain.