Equinix Data Center Outage in London Blamed on Faulty UPS

Brought to you by Data Center Knowledge

Wednesday’s data center outage at one of the Telecity facilities in London that Equinix took over in its recent acquisition of the European service provider was caused by a problem with a UPS system, Equinix told its customers via email, according to news reports. Studies show that UPS failure is the most common cause of data center outages.

The company did not say exactly what went wrong with the UPS, but the outage caused connectivity problems for many subscribers to BT’s internet services; a BT spokeswoman told The Register that about one in every 10 attempts by its users to reach a website failed during the outage.

The data center outage affected a portion of BT subscribers in England, Wales, Scotland, and Northern Ireland, according to the review of affected areas posted on BT’s status page by the BBC.

Equinix issued a press statement by Russell Poole, its managing director for the UK, confirming the outage at the former Telecity LD8 data center. “This impacted a limited number of customers, however service was restored within minutes,” he said.

A spokesman for the London Internet Exchange (LINX) told the BBC that the outage lasted from 7:55 am to 8:17 am BST.

The Telecity LD8 data center, now called 8/9 Harbour Exchange, is one of five data centers that make up the Telecity campus in the London Docklands that was the crown jewel in the data center provider’s portfolio acquired by Equinix for $3.6 billion in a deal that closed earlier this year. The campus hosts a substantial portion of the LINX infrastructure, as well as many financial services firms, cloud providers, and companies in other business verticals.

A data center outage impacting a user like LINX can have effects that reach even wider than an outage that impacts a major internet service provider like BT. Internet exchanges are where many network operators and internet content providers interconnect their networks to more effectively deliver traffic to their end users.

BT is one of 700 LINX members. The LINX spokesman, however, pointed out that there are usually redundant network routes that ensure traffic continues to flow when there is an outage on one of them.

“Over 80% of our traffic continued to flow and it immediately started to recover even before the power was restored,” he said.

UPS failure has for years been the most frequently cited cause of data center outages, according to studies by Emerson Network Power and the Ponemon Institute. Last year, UPS and UPS battery failures caused 25 percent of outages – up from 24 percent in 2013 but down from 29 percent in 2010, according to their most recent study, released earlier this year.

Source: TheWHIR

New Google AI Services Bring Automation to Customer Service

By Jack Clark

(Bloomberg) — Google is trying to use its artificial intelligence know-how to tempt businesses onto its cloud and away from dominant services run by Amazon.com Inc. and Microsoft Corp. The latest lure: Use Google computers to automatically handle irate customer calls.

The Alphabet Inc. unit announced two new artificial intelligence software tools Wednesday for its Google Cloud Platform service and made another of its many data centers available to rent by outside companies.

SEE ALSO: Google Launches Its First Cloud Data Center on West Coast

The moves are part of a broader push by Google to use its lead in AI technology to improve existing services and products, develop new ones and ultimately build new businesses. It recently used cutting-edge AI developed by its DeepMind subsidiary to improve the efficiency of its data centers.

The products introduced Wednesday also increase competition with Microsoft, which is making AI tools available via its Azure cloud, and set Google apart from Amazon Web Services, which has focused on letting customers program their own AI tools.

Google’s two new AI tools let companies analyze language and convert speech into text. U.K.-based grocery delivery service Ocado Group Plc has used them to help it rank and respond to customer queries, the internet company said.

Businesses can use the technologies to automatically “prioritize the most irate customers first” by spotting language from e-mails and phone calls associated with feelings like anger, frustration and irritation, said Rob Kraft, a product manager for Google Cloud Platform.

The products will cost a few cents per use, he said. The company expects people to mix-and-match its various AI offerings. For example, a business could transcribe a phone call using the speech service, interpret the tone of it, and figure out which product the call relates to, no human required.
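The "prioritize the most irate customers first" idea can be sketched in a few lines. This is an illustrative toy using keyword matching, not Google's actual services, which rely on trained language models; all names and terms below are made up for the example.

```python
# Toy sketch: rank incoming customer messages most-irate-first using a
# naive keyword score. Illustrative only; a real deployment would call a
# sentiment-analysis service rather than match a hand-written word list.

ANGER_TERMS = {"angry", "furious", "unacceptable", "refund", "cancel", "worst"}

def irritation_score(message: str) -> int:
    """Count anger-associated terms in a message (toy sentiment proxy)."""
    words = {w.strip(".,!?").lower() for w in message.split()}
    return len(words & ANGER_TERMS)

def prioritize(messages: list[str]) -> list[str]:
    """Return messages sorted most-irate-first."""
    return sorted(messages, key=irritation_score, reverse=True)

inbox = [
    "When will the new feature ship?",
    "This is unacceptable, I want a refund!",
    "Thanks for the quick fix yesterday.",
]
print(prioritize(inbox)[0])  # the refund complaint surfaces first
```

In the mix-and-match flow described above, the input to such a ranking step would be the transcript produced by a speech-to-text service rather than raw text.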

The current emphasis of Google’s AI services is to help automate aspects of conversations, said Kraft. In the future, he thinks interesting work could be done in fraud prevention and cybersecurity for other companies.

The company also announced that customers can now rent storage, computing power and other cloud services from its data center in Oregon, giving people on the West Coast of the U.S. faster access. That’s part of a plan by cloud chief and board member Diane Greene to add 12 new data centers over the next 12 to 18 months.

Source: TheWHIR

HostingCon Global 2016 Countdown: How to Find Your "Voice" (And Why it Matters)

The countdown to HostingCon Global 2016 in New Orleans is on with four days to go before the hosting and cloud industry touches down at the Ernest N. Morial Convention Center. Education is one of the defining aspects of the HostingCon conferences, and with so many excellent sessions and opportunities for learning we wanted to spend the next week offering a preview for our readers who are attending HostingCon.

DreamHost Vice President of Brand & Community Brett Dunst wants you to know that if you come to his session at HostingCon, all of your wildest dreams will come true. Now how’s that for conference return on investment?

If you follow DreamHost online or read the company’s blog posts, you know that the company has a very distinct voice – or maybe you don’t know what “voice” is, exactly. According to Contently, “the fundamentals of voice comes down to a personality—prioritizing a set of traits that comprise an identity, and then communicating in a way that expresses and prioritizes those traits.”

Google Launches Its First Cloud Data Center on West Coast

Brought to you by Data Center Knowledge

Google has brought online its first West Coast cloud data center, promising US and Canadian cloud users on or close to the coast a 30 to 80 percent reduction in latency if they use the new region instead of the one in central US, which was closest to them before the new region launched.

This data center in Oregon isn’t the first Google data center on the West Coast; the company has had a data center campus in The Dalles, Oregon, for a decade. The launch means Google’s cloud services are now served out of Oregon for the first time, alongside other Google services such as search and maps.

With the new cloud data center online, the company said its cloud users in cities like Vancouver, Seattle, Portland, San Francisco, and Los Angeles should expect to see big performance improvements if they choose to host their virtual infrastructure in the new region, called us-west1.

See also: What Cloud and AI Do and Don’t Mean for Google’s Data Center Strategy

The launch is part of an effort Google kicked off recently to expand its global cloud data center infrastructure as it competes with cloud giants like Amazon Web Services and Microsoft Azure, both of which are far ahead in the number of cloud availability regions. The company said in March it would add 10 data center locations to support its cloud services by both leasing data center space and building its own facilities.

One of the planned new cloud data center locations on the list will be in Japan, the company has disclosed.

See also: Nadella: We’ll Build Cloud Data Centers Wherever Demand Takes Us

The Oregon cloud region has been launched with three initial services: Compute Engine, Cloud Storage, and Container Engine, the company said in a blog post announcing the launch Wednesday. The region includes two Compute Engine zones for high-availability applications, which usually means there are two separate data halls, each with its own independent infrastructure.

“A zone usually has power, cooling, networking, and control planes that are isolated from other zones, and most single failure events will affect only a single zone,” Google says on its cloud platform website.
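The availability math behind multi-zone deployment can be made concrete with a small sketch. This is not Google's API; the zone names and the round-robin placement below are assumptions for illustration of why isolated zones limit the blast radius of a single failure.

```python
# Minimal sketch (not Google's API): spread instances round-robin across
# two zones, so a whole-zone failure still leaves half the capacity
# running, matching the "most single failure events will affect only a
# single zone" property quoted above. Zone names are hypothetical.

ZONES = ["us-west1-a", "us-west1-b"]

def place(instance_count: int, zones: list[str]) -> dict[str, int]:
    """Round-robin instances across zones; returns instances per zone."""
    placement = {z: 0 for z in zones}
    for i in range(instance_count):
        placement[zones[i % len(zones)]] += 1
    return placement

def surviving_capacity(placement: dict[str, int], failed_zone: str) -> int:
    """Instances still running if one zone fails entirely."""
    return sum(n for z, n in placement.items() if z != failed_zone)

p = place(10, ZONES)
print(surviving_capacity(p, "us-west1-a"))  # 5 of 10 instances survive
```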

Oregon is Google cloud’s fifth availability region. The other ones are Central US, Eastern US, Western Europe, and Eastern Asia.

For a lot more detail about Google’s cloud data center strategy, see a presentation by Joe Kava, the man in charge of the company’s data center operations.

Source: TheWHIR

WordPress, Drupal Hosting Platform Pantheon Raises $29 Million Series C Funding

Website management platform Pantheon announced a $29 million Series C funding round on Wednesday, saying it will use the funds to fuel expansion towards a goal of powering 30 percent of the web. The funding round was led by Foundry Group, Industry Ventures, OpenView Investment Partners, and Scale Venture Partners.

The third round brings Pantheon’s total funding to $57 million, and continues a busy streak for Pantheon. This year the company grew its customer base by over 100 percent year-over-year to surpass 150,000 websites launched, up from 65,000 in 2014, and teamed up with 2,500 agencies and 50 reseller partners.

SEE ALSO: WordPress Hosting Provider Flywheel Raises $4 Million in Series A Funding

The company claims that its container-based infrastructure allows it to offer the “world’s fastest hosting” for both WordPress and Drupal. Its platform includes workflow and collaboration tools, continuous integration, performance monitoring, and scaling tools to reduce the excess resources web development teams often put into systems administration and infrastructure management.

“Pantheon is leading the revolution in web hosting,” Pantheon CEO and co-founder Zack Rosen said in a statement. “This new funding and the explosion of our partner ecosystem demonstrates confidence in our approach and the market’s adoption of our platform. Pantheon takes the heavy lifting out of building, launching, and managing websites for digital agencies and corporate web developers. Our website management platform provides the development tools and scalable infrastructure teams need to build amazing web experiences, launch faster, and maximize their efficiency.”

Pantheon added Niall Hayes as vice president of engineering, appointed Twitter’s vice president of data strategy to its board of directors this year, and released enterprise services last September.

This news comes on the heels of several WordPress-related announcements and launches from competitors in the open-source CMS hosting market.

Source: TheWHIR

Microsoft Rises Most in Nine Months After Profit Beat Estimates

By Ian King and Dina Bass

(Bloomberg) — Microsoft Corp. rose the most in nine months Wednesday after reporting quarterly sales and profit that topped analysts’ estimates, rekindling optimism about Chief Executive Officer Satya Nadella’s cloud strategy as more customers shifted to the company’s internet-based software and services. Shares jumped as much as 5.9 percent.

Key Points

  • Including some adjustments, fiscal fourth-quarter revenue was $22.6 billion, compared with the average analyst estimate of $22.1 billion, according to data compiled by Bloomberg.
  • Revenue from Azure, the company’s corporate cloud platform, doubled in the quarter that ended June 30.
  • Profit, excluding certain items, was 69 cents a share, Microsoft said Tuesday in a statement. Analysts on average had forecast profit of 58 cents.
  • During the quarter, Microsoft recorded total charges of $1.11 billion related to the restructuring of the phone business it acquired from Nokia and to job cuts.
  • Shares reached $56.20, the highest since October, and were trading up 5.3 percent at $55.90 at 9:49 a.m. in New York. The stock gained 14 percent in the past year through Tuesday.

READ MORE: Microsoft Wins Big in Fight for User Privacy as Irish Search Warrant Found Invalid

The Big Picture

Nadella, well into his third year at the helm, has been reorienting Microsoft’s business around cloud and productivity services to fuel growth as traditional software sales shrink. Annualized revenue from commercial cloud products was more than $12.1 billion in the recent quarter, a number that Microsoft has pledged will reach $20 billion by fiscal 2018. The company is relying on the switch to recurring cloud contracts to help make up for weaker one-time corporate software purchases, which are still on course to decline but came in stronger than the company projected in the recent quarter.

CFO Interview

Microsoft continues to see businesses moving to the cloud and subscription-based software and services, Chief Financial Officer Amy Hood said via telephone. Transactional purchases of legacy products were “a little better this quarter,” she said. “There’s a structural trend and shift to the cloud,” she said. In traditional products, “quarter to quarter, you see some volatility in the results.” “The PC market was a little better than we had expected three months ago,” she said. “We saw it more specifically in more developed markets.”

SEE ALSO: Not So Fast: Microsoft Azure Could Surpass AWS as Most Used Public Cloud by 2019

The Detail

  • Corporate versions of the Office 365 cloud-based productivity software saw revenue increase by 54 percent in the fiscal fourth quarter, the Redmond, Washington-based company said.
  • Net income was $3.12 billion, or 39 cents a share, including the Nokia-related charges, compared with a loss of $3.2 billion a year earlier.
  • Revenue in the Intelligent Cloud division rose 6.6 percent to $6.71 billion, compared with the $6.58 billion average estimate of analysts polled by Bloomberg. In the current period, Microsoft forecast unit sales of $6.1 billion to $6.3 billion.
  • Productivity group sales gained 4.6 percent to $6.97 billion; analysts had projected $6.64 billion. In the fiscal first quarter, the company expects to report $6.4 billion to $6.6 billion.
  • More Personal Computing division sales, which include Windows and Xbox, fell 3.7 percent to $8.9 billion in the recent period, slightly better than the $8.87 billion average analyst estimate. Revenue will be $8.7 billion to $9 billion in the current quarter, Microsoft said.
  • Fourth-quarter unearned revenue, a measure of future sales, was $33.9 billion; five analysts polled by Bloomberg expected an average of $30.88 billion.
  • Microsoft’s profit was boosted in the recent period by a more favorable tax rate. Minus the effects of that gain, profit would have been 63 cents a share, according to a research note from UBS Group AG analyst Brent Thill.
  • Microsoft in June agreed to buy professional networking service LinkedIn Corp. for $26.2 billion.

Street Takeaways

“What’s comforting is the key underlying trends are in place,” said Sid Parakh, a fund manager at Becker Capital Management, which owns Microsoft stock. “At least the long-term trajectory is intact here. There was concern last quarter.” “Deferred revenue growth was pretty decent,” Parakh said.

Source: TheWHIR

LinkedIn Pushes Own Data Center Hardware Standard

Brought to you by Data Center Knowledge

LinkedIn, the social network for the professional world that was acquired by Microsoft in June, has announced a new open design standard for data center servers and racks that it hopes will gain wide industry adoption.

It’s unclear, however, how the initiative fits with the infrastructure strategy of its new parent company, which has gone all-in with Facebook’s Open Compute Project, an open source data center and hardware design initiative with its own open design standards for the same components. When it joined OCP two years ago Microsoft also adopted a data center strategy that would standardize hardware on its own OCP-inspired designs across its global operations.

Yuval Bachar, who leads LinkedIn’s infrastructure architecture and who unveiled the Open19 initiative in a blog post Tuesday, told us earlier this year that the company had decided against using OCP hardware when it was switching to a hyperscale approach to data center deployment because OCP hardware wasn’t designed for standard data centers and data center racks. That, however, was in March, before LinkedIn was gobbled up by the Redmond, Washington-based tech giant.

“Our plan is to build a standard that works in any EIA 19-inch rack in order to allow many more suppliers to produce servers that will interoperate and be interchangeable in any rack environment,” Bachar wrote in the blog post.

See also: LinkedIn Data Centers Adopting the Hyperscale Way

The standard OCP servers are 21 inches wide, and so are the standard OCP racks. Facebook switched to 21 inches in its data centers several years ago, and announced its 21-inch rack design, called Open Rack, in 2012. Multiple vendors, however, have designed OCP servers in the traditional 19-inch form factor and racks that accommodate them.

There is more to LinkedIn’s proposed Open19 standard than rack width, however. Here is the full list of Open19 specifications:

  • Standard 19-inch 4 post rack
  • Brick cage
  • Brick (B), Double Brick (DB), Double High Brick (DHB)
  • Power shelf—12 volt distribution, OTS power modules
  • Optional Battery Backup Unit (BBU)
  • Optional Networking switch (ToR)
  • Snap-on power cables/PCB—200-250 watts per brick
  • Snap-on data cables—up to 100G per brick
  • Provides linear growth on power and bandwidth based on brick size
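The "linear growth on power and bandwidth based on brick size" bullet can be sketched with a small model. The per-brick figures (200-250 watts, up to 100G) come from the spec list above; treating a Double Brick and Double High Brick as occupying two brick slots each is an assumption made for this illustration, not something the published spec states.

```python
# Sketch of Open19's linear power/bandwidth scaling per brick size.
# Assumption (not in the spec list above): DB and DHB each count as
# two brick slots for the purposes of snap-on power and data cables.

BRICK_SLOTS = {"B": 1, "DB": 2, "DHB": 2}  # assumed slots per form factor

WATTS_PER_SLOT = (200, 250)  # snap-on power cables: 200-250 watts per brick
GBPS_PER_SLOT = 100          # snap-on data cables: up to 100G per brick

def power_budget(form_factor: str) -> tuple[int, int]:
    """(min, max) watts available to a server of the given form factor."""
    slots = BRICK_SLOTS[form_factor]
    return (WATTS_PER_SLOT[0] * slots, WATTS_PER_SLOT[1] * slots)

def bandwidth(form_factor: str) -> int:
    """Maximum Gbps available to a server of the given form factor."""
    return GBPS_PER_SLOT * BRICK_SLOTS[form_factor]

print(power_budget("DB"), bandwidth("DB"))  # (400, 500) 200
```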

linkedin open19 rack

Illustration of LinkedIn’s proposed Open19 rack and server design (Image: LinkedIn)

Bachar and his colleagues believe designs that follow these specs “will be more modular, efficient to install, and contain components that are easier to source than other custom open server solutions.”

Making open hardware easier to source is an important issue and probably the strongest argument for an alternative standard to OCP. We have heard from multiple people close to OCP that sourcing components for OCP gear is difficult, especially if you’re not a high-volume buyer like Facebook or Microsoft. OCP vendors today are focused predominantly on serving those hyperscale data center operators, which substantially limits access to that kind of hardware for smaller IT shops.

Read more: Why OCP Servers are Hard to Get for Enterprise IT Shops

Still, the amount of industry support OCP has gained over the last several years will make it difficult for a competing standard to take hold, especially given that one of OCP’s biggest supporters is now LinkedIn’s parent company. Other OCP members include Apple, Google, AT&T, Deutsche Telekom, and Equinix, as well as numerous large financial institutions and the biggest hardware and data center infrastructure vendors.

Source: TheWHIR

HostingCon Global 2016 Countdown: New Trends in Web Application Security

The countdown to HostingCon Global 2016 in New Orleans is on with five days to go before the hosting and cloud industry touches down at the Ernest N. Morial Convention Center. Education is one of the defining aspects of the HostingCon conferences, and with so many excellent sessions and opportunities for learning we wanted to spend the next week offering a preview for our readers who are attending HostingCon.

Without a doubt it’s been an interesting year for cybersecurity, and while in many ways security threats have become more complicated, service providers are having an easier time talking to customers about the importance of security solutions because of growing awareness.

Over the last year, several trends have emerged, but one that sticks out to SiteLock president and HostingCon speaker Neill Feather is that attackers are starting to more aggressively target small and medium-sized businesses – who are in turn looking to their hosting providers for help mitigating these threats.

“We’re beginning to see more awareness of the issue from these small and medium sized businesses, and I think one of the good things from a hosting community perspective is there’s been a lot more pull from the end customers to make sure their website is secure, and they’re asking more and more about products to help them do that,” Feather said.

SiteLock provides website security solutions and partners with web hosting providers to deliver these solutions to end customers. Feather said that this past year has been a year of growth for the company, which had to move offices in Scottsdale to accommodate its 150 employees. SiteLock currently protects around 8 million websites.

In his HostingCon session on Mon., July 25 at 9 a.m., Feather will share some research on “effectiveness around using traditional tools versus purpose-built tools to help protect web applications, and give people a bit of a view into why it’s important to use the right tool for the right job.”

“Beyond that we want to show some of the advances that we’ve been making around being more proactive in identifying risk to a website prior to a compromise,” he said.

“Once there’s a vulnerability or a compromise you’re having a more negative conversation with customers. We want to enable our hosting partners to have more of a proactive conversation and we’ve got some products in the works to help them with that,” he said.

Interested in other security topics? Check out the full HostingCon schedule to learn how to better mitigate DDoS attacks, what the encryption landscape looks like today, and how government regulation impacts your business.

Source: TheWHIR

Big Data, Managed Security Services to Drive Latin American IT Growth

IT revenues in Latin America are projected to grow by over 20 percent in 2016, led by big data and cloud computing, according to a report released Tuesday by Frost & Sullivan.

The 2016 Latin America Outlook for the Information Technology Services Industry shows major growth in the opportunities available in the market for security providers as various stakeholders come together to tackle lingering reliability and infrastructure impediments.

SEE ALSO: Trends in Big Data for SMEs: What Providers Should Know

IT services in the region are expected to bring in $7.78 billion in 2016, up from $6.46 billion in 2015. Data center services remain the source of almost half of the sector’s regional revenue, but its compound annual growth rate is lower than any of the other services examined, including cloud computing, big data and analytics, mobility, and managed security services. Managed security revenues are predicted to grow by 18.4 percent from $580.2 million in 2015 to $687.5 million this year.

“One of the biggest hurdles to the mass adoption of potentially disruptive technologies is security,” Frost & Sullivan Digital Transformation Consultant Leandro Scalize said in a statement. “No matter which technology is in focus, without a well-drawn security strategy, there is little chance of long-term success.”

READ MORE: Drowning in Data: How Channel Providers Help Customers Make Sense of the World of Information Around Them

Rising awareness and deployment of SaaS and IaaS in Latin America is contributing to a rapidly maturing cloud computing segment, the report says, predicting it will surpass $2 billion in 2016.

A report produced earlier this year by 451 Research showed public cloud is more expensive in Latin America than any other global region.

A previous Frost & Sullivan report, released in 2015, showed significant adoption of cloud services among Brazilian companies. Since then, however, Latin America’s largest country has been continuously wracked by recession, corruption, and political division.

Source: TheWHIR

IDG Contributor Network: Introduction to streaming data platforms

Data-driven strategies are becoming a greater part of an organization’s DNA. Executive management is embracing how data can be used to create, sustain, and strengthen competitive advantage. Disruptive companies are building business models based on data that other organizations leave behind. Employees are growing into the role of data analysts as part of their day-to-day responsibilities, and companies are introducing new data sources to continue this trend.

Specifically, data-driven strategies can be seen in the way organizations are taking advantage of modern technical architectures in mobility, cloud, and device sensors, and integrating that information into new ways of doing business. The use of location-based mobile apps, optimized supply chains for online retail applications, and the introduction of the internet of things have increased the focus on low-latency data collection, transformation, and analytics.

With the rising importance of data-driven organizations and the focus on low-latency decision making, the speed of analytics has increased almost as rapidly as the ability to collect information. This is where streaming data platforms come into play. These modern data management platforms bring together not just the low-latency analysis of information, but also the ability to integrate information from operational systems, mobile applications, databases, and the internet of things in real time or near real time.

The true key to streaming data platforms, and the applications they support, is the integration of technical and business data sources in real time. Without this level of streaming data acquisition and integration, the analytics behind data-driven strategies, and the business models built on those strategies, cannot deliver on the promise of the business stakeholders who are looking to create new business value and increase competitive advantage.
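The low-latency pattern described above can be sketched in a few lines: events update a rolling view as they arrive, rather than waiting for a batch. This is a generic illustration using a simple sliding-window average, not tied to any particular streaming product.

```python
# Toy sketch of per-event stream processing: each incoming reading
# immediately updates a rolling aggregate, illustrating the low-latency
# collect-transform-analyze loop described above. No specific streaming
# platform is implied.

from collections import deque

def rolling_average(events, window=3):
    """Yield the average of the last `window` values after each event."""
    recent = deque(maxlen=window)  # old values fall off automatically
    for value in events:
        recent.append(value)       # per-event update, no batch wait
        yield sum(recent) / len(recent)

sensor_stream = [10, 20, 30, 40]
print(list(rolling_average(sensor_stream)))  # [10.0, 15.0, 20.0, 30.0]
```

In a real deployment the input would be an unbounded feed (sensor readings, clickstream events, database change logs) rather than a list, but the per-event update loop is the same.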