WordPress Unveils Plans for .Blog, the New gTLD that Cost it $19M

WordPress parent company Automattic bought the rights to the new gTLD .blog last year for $19 million, beating Google and a handful of other companies in the auction.

Companies with a trademark can register a .blog domain starting in August, followed by a land-rush period in October.

On Thursday Automattic CEO Matt Mullenweg said in a blog post: “It’s now public that Automattic is the company behind Knock Knock Whois There LLC, the registry for the new .blog TLD. (And a great pun.) We wanted to stay stealth while in the bidding process and afterward in order not to draw too much attention, but nonetheless the cost of the .blog auction got up there (people are estimating around $20M).”

SEE ALSO: WordPress.com Secures Millions of Domains with Free, Automatic HTTPS

.Blog domains will be available from WordPress.com or through one of its partner domain name registrars.

Mullenweg told VentureBeat in an interview that Automattic worked with Primer Nivel during the auction process to conceal its identity: “But there could be an incentive for other bidders if they knew it was us, and we’d just raised funding, that they would try to drive the price up. It was right after we did our $160 million round.”

Mullenweg said that “because companies use blog as a subdirectory and a subdomain, I think that the domain is interesting.”

READ MORE: Google Domains Service Moves to .Google

New gTLDs have not seen much mainstream adoption, but WordPress’ widespread reach could help educate users about them.

“It is too early to tell whether the new [gTLD] space will be accepted as a viable alternative to the established space,” a recent academic report said.

Source: TheWHIR

Japan's SoftBank and Alibaba Launch Joint Cloud Computing Venture

Japanese telecom SoftBank and Alibaba announced a partnership on Friday to launch cloud computing services in Japan based on Alibaba Cloud.

According to the announcement, the cloud services will be offered through SB Cloud Corporation, or SB Cloud, which will open a new data center in Japan. SoftBank’s parent company is a major shareholder of Alibaba Group.

The joint venture will provide public cloud services from Alibaba Cloud to a range of companies and allow Alibaba to extend its reach through access to SoftBank’s business customer base in Japan, which is ranked second among the top markets for global cloud services, according to a recent report by the International Trade Administration. Gartner predicts that by 2018 the Asia Pacific and Japan region will account for $11.5 billion in total cloud services spending.

SEE ALSO: Yahoo Japan to Run on OpenStack, Cloud Foundry

SB Cloud CEO and executive vice president of SoftBank Eric Gan said that the companies have been working on the joint venture over the “past few months.”

SB Cloud’s offerings will include data storage and processing services, enterprise-level middleware, and cloud security services, according to the announcement.

“We are proud that Alibaba Cloud can leverage its cloud computing expertise in the joint venture with SoftBank,” Sicheng Yu, vice president of Alibaba Cloud said in a statement. “We look forward to helping more Japanese companies grow their business with our secure, scalable and innovative cloud computing services.”

Just last month, Digital Realty announced that it has pre-leased the entirety of its first Japan data center to a “major hyperscale cloud provider.”

Other cloud providers with a presence in Japan include Microsoft and Amazon.

Source: TheWHIR

WebRTC: Disruptive Technology with Revenue Potential or It Just Sucks?

Where are we with WebRTC anyway? Do we stay with SIP or switch to the up-and-coming WebRTC technology? I’ve noticed that since the beginning of the year, some of the most prolific communications-industry bloggers have been very critical of the new technology that has held such promise over the last five years or so.

No Jitter’s Zeus Kerravala claims that “WebRTC is losing steam.” At TalkingPointz, analyst Dave Michels complains that “WebRTC is a distraction” and exhorts his readers to go out and buy a tried-and-tested solution. Jumping on this, Todd Carothers, EVP of marketing and products at CounterPath argues in his blog that “WebRTC applications are restrictive in terms of the capabilities that can be offered as a true unified communications solution” and holds up his company’s SIP-based VoIP products for consideration instead.

Hang on, guys: just because a new technology doesn’t work seamlessly yet doesn’t mean it’ll never get there. Disruptive technologies take a long time to perfect, and users can be impatient.

WebRTC’s current status reminds me of my experience after founding a VoIP service provider 12 years ago. In 2004, a channel partner approached me at a trade show and said, “So you’re the CEO of SimpleSignal? I love VoIP technology! I make a lot of money on it.” “Great!” I replied. Then, through a big laugh, he said, “I make a lot of money taking it OUT of the offices of customers who were stupid enough to give it a try.”

Downtrodden, I had to agree. It had its issues. But we were close, and it really wasn’t “stupid to try.” It just wasn’t perfect yet, and people expected “five nines” from their land lines. But the new “cloud” feature set rocked, and that’s what gave us the faith, as a service provider, to stick with a less-than-perfect technology.

Simultaneous ring was just one mobility feature that changed how I could run my business from anywhere. Add to that VoIP’s ability to integrate with the software I used all day, which produced far better business outcomes. So I put up with occasional jitter, choppy audio and dropped calls. Yeah, the internet was expensive then and not ready for VoIP. Standards and policies weren’t complete. Fraud and security concerns kept us up at night. God help you if you dialed 911.

Bottom line: VoIP sucked in the early days. But it sucked less than settling for a POTS line.

Innovative service providers might choose to invest some time investigating whether early-stage disruptive technologies are worth some extra patience. WebRTC could follow the same trajectory as VoIP: initial issues, but features that are better than the current options.

Let’s look at what’s good about WebRTC right now.

  • It’s free, and who doesn’t love free technology? I’m thankful that Google makes so much money on AdWords that it can give away most of what it does for free. Google bought most of the WebRTC IP and let it become open source. Because it’s “open,” it’s a potential source of new revenue for service providers who can figure out how to productize it.
  • It’s browser-based, which makes it so much easier to use than downloading an app.
  • Because it’s browser-based and needs no federation between users, you can make a video call to someone using just their email address rather than a phone number. That’s way easier for me to remember.
  • It combines sound, HD video and PDFs, and it’s Real-Time Communication on the web, man! Click and you’re connected. No app to download, no proprietary plug-in or client to struggle with. No muss, no fuss. (See the sketch just after this list.)
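
To make the browser-only point concrete, here is a minimal TypeScript sketch of setting up a call entirely in the browser. It is illustrative only: WebRTC deliberately leaves signaling to the application, so the signaling object below (which could be, say, a WebSocket that routes messages by the callee's email address) is an assumption, not part of the standard.

```typescript
// Minimal browser-side WebRTC call sketch. The `signaling` channel is a
// hypothetical application-provided transport; WebRTC itself does not
// define how offers, answers, and ICE candidates reach the other party.
async function startCall(signaling: {
  send: (msg: unknown) => void;
  onAnswer: (cb: (answer: RTCSessionDescriptionInit) => void) => void;
}): Promise<void> {
  // Grab the local camera and microphone straight from the browser:
  // no app to install, just a permissions prompt.
  const localStream = await navigator.mediaDevices.getUserMedia({
    audio: true,
    video: true,
  });

  // Peer connection with a public STUN server so the two browsers can
  // discover a network path to each other.
  const pc = new RTCPeerConnection({
    iceServers: [{ urls: "stun:stun.l.google.com:19302" }],
  });
  localStream.getTracks().forEach((track) => pc.addTrack(track, localStream));

  // Hand ICE candidates and the SDP offer to the signaling layer.
  pc.onicecandidate = (e) => {
    if (e.candidate) signaling.send({ candidate: e.candidate });
  };
  const offer = await pc.createOffer();
  await pc.setLocalDescription(offer);
  signaling.send({ offer });

  // Apply the callee's answer when it arrives, and render remote media.
  signaling.onAnswer((answer) => pc.setRemoteDescription(answer));
  pc.ontrack = (e) => {
    const video = document.getElementById("remote") as HTMLVideoElement;
    video.srcObject = e.streams[0];
  };
}
```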

So how’s it being used now? In an excellent defense of WebRTC, Phil Edholm listed the following use cases:

  • Internet of Things: Connected doorbells, baby monitors and security systems, using a companion app running on a mobile device
  • Closed user communities: Facebook Messenger is an excellent example of this
  • Web-based merchants operating contact centers. As a bonus, you can run the agent-side app in a browser without needing a separate soft client or desk phone, and you can take advantage of omnichannel features such as co-browsing
  • Large-scale videoconferencing: Traditional multipoint control units are expensive and have a limited repertoire of display formats. A type of video switching technology known as a Selective Forwarding Unit allows the easy customization of display formats — the layout is determined by a Web page without the need for the central server to mix the video output. (A conceptual sketch follows this list.)
  • Contact centers: These represent a strong potential market for WebRTC-based services because people are increasingly using the Web to access customer support information, rather than turning to the Yellow Pages (phone directory) or their existing bills. Customers would find an easier time of accessing real-time interactive customer support services directly through their Web browsers with voice, messaging, and video chat applications. A WebRTC-based service would be an instant direct connection between the consumer and the business, and could decrease time to resolution, lowering customer support costs and increasing customer satisfaction and retention. Many vendors and operators have developed contact center and customer service applications based on WebRTC. For example, LiveOps offers a cloud-based WebRTC-based solution with zero on-premises equipment and Amazon’s Mayday feature on its Kindle tablets allows users to launch video chats with support representatives. Slack, Zendesk and Freshdesk also introduced WebRTC functionality in 2015.
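
To illustrate the Selective Forwarding Unit idea mentioned above, here is a conceptual TypeScript sketch, a toy under stated assumptions rather than a real media server: the SFU relays each participant's already-encoded stream to every other participant instead of decoding and mixing them, so each receiving web page can lay the videos out however it likes.

```typescript
// Toy Selective Forwarding Unit (SFU) sketch: forwards encoded frames without
// decoding or mixing them, unlike a traditional multipoint control unit (MCU).
type ParticipantId = string;

interface EncodedFrame {
  from: ParticipantId;
  payload: Uint8Array; // already-encoded video; the SFU never touches the pixels
}

class SelectiveForwardingUnit {
  private participants = new Map<ParticipantId, (frame: EncodedFrame) => void>();

  join(id: ParticipantId, deliver: (frame: EncodedFrame) => void): void {
    this.participants.set(id, deliver);
  }

  leave(id: ParticipantId): void {
    this.participants.delete(id);
  }

  // Forward a frame to everyone except its sender. Because each browser
  // receives the streams separately, the web page decides the layout.
  forward(frame: EncodedFrame): void {
    for (const [id, deliver] of this.participants) {
      if (id !== frame.from) deliver(frame);
    }
  }
}
```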

It is clear that WebRTC is becoming a major part of the fabric of many communications solutions. WebRTC plays a major, if not dominant, role at three major communications companies: Spark for Cisco, everything for Google, and the Zang solution from Avaya. Spark and HipChat are totally based on WebRTC, and Zang has a WebRTC component.

The momentum for WebRTC is rapidly accelerating. It is clear that WebRTC is through the hype cycle and well into the adoption cycle. It took longer than some of us thought, but I still believe now is not the time to underestimate the impact of WebRTC and the web model of communications.

Concluding Thoughts

At the end of the day, regardless of whether they are WebRTC- or SIP-based, the systems that will prevail in the market are those that find a way to provide customers with a true unified communications experience. The solution that offers high-quality, reliable communications through voice, messaging, presence, and video across platforms, networks, and devices will become the true champion in the UC race.

I’m proud to say I was part of the brave group of unified communications pioneers who helped evolve the first wave of VoIP technology, which can now address the needs of any customer looking to communicate and collaborate anywhere, anytime, on any device.

It would be foolish to pretend that WebRTC is a plug-and-play technology, as it is evidently quite raw and still evolving. But who knows, you might hear me saying it sucks less than settling for a VoIP connection.

This article is brought to you by HostingCon, the Cloud and Service Provider Ecosystem event. Join us in New Orleans, Louisiana July 24-27, 2016 to hear Dave and other thought leaders talk about issues and trends in the cloud, hosting and service provider ecosystem.

Save $100 off your HostingCon All Access Pass with coupon code: H1279

Source: TheWHIR

AWS Makes Its First Big Submarine Cable Investment

Brought to you by Data Center Knowledge

Amazon Web Services has made its first investment in a submarine cable project, looking to improve capacity on the global network connecting the data centers that host its cloud services in the US, Australia, and New Zealand.

When the Hawaiki Submarine Cable comes online – target live date is in June 2018 – it will provide considerably more bandwidth between the US, Australia, and New Zealand than available today. The cable is expected to reduce latency for AWS users operating between these three countries.

Amazon has agreed to become the cable’s fourth anchor customer, and its financial commitment provided the last bit of funding necessary to kick off the submarine construction project, a person familiar with the deal who wished to remain anonymous told Data Center Knowledge.

The 14,000-kilometer cable will provide a welcome third competitor to the two submarine cable systems on the US-Australia route today: Telstra Endeavour and the Southern Cross Cable Network.

Telstra Endeavour (above) lands in Hawaii on the US side and in Paddington in Australia. (Image source: TeleGeography’s Submarine Cable Map)

Southern Cross (above) lands in Australia, New Zealand, Hawaii, Oregon, and California. (Image source: TeleGeography’s Submarine Cable Map)

The Hawaiki cable (above) will land in Oregon, Australia, New Zealand, as well as American Samoa. (Image source: TeleGeography’s Submarine Cable Map)

Oregon’s Pacific shore is an important place on the global connectivity map. A lot of transpacific network traffic enters the US through the high concentration of submarine cable landing stations in the Oregon towns of Hillsboro, Nedonna Beach, Pacific City, and Warrenton, from where it is carried south to data centers and carrier hotels in Silicon Valley and Los Angeles, or inland.

Because of this, there is a fairly large data center cluster in Hillsboro and the surrounding area. Amazon’s US West cloud availability region is hosted in Oregon data centers, as is GovCloud, its dedicated cloud region for government agencies.

Intercontinental connectivity is crucial to cloud service providers of Amazon’s caliber, who try to offer customers as many global location options for hosting their virtual infrastructure as possible. As more and more companies start using cloud services and the amount of data created and exchanged in general keeps growing rapidly, demand for this kind of connectivity is on the rise, and so is construction of submarine cables to address the demand.

“We are seeing a resurgence of subsea cable projects to support global cloud deployments and growth of international data traffic,” Equinix CEO Stephen Smith said on the company’s first-quarter earnings call this month. “There are more than 50 global submarine cable projects under consideration over the coming two years, which places Equinix in a great position to win a portion of this next generation of submarine cable investment.”

Amazon’s biggest rivals in the cloud services market, Google and Microsoft, have both made big investments in submarine cable construction projects.

The Faster cable system, backed by Google and several Asian telecommunications and IT services companies, is expected to come online this year. Another big project is the New Cross Pacific Cable System, which is backed by Microsoft and a group of Asian telcos. NCP is expected to come online in 2017. Both will land in Oregon on the US side.

The three anchor customers of the Hawaiki cable besides Amazon are British telco Vodafone; REANNZ, a government-backed New Zealand research and education network; and the American Samoa Telecommunications Authority, the US territory’s government-owned incumbent carrier.

Original article appeared here: Amazon’s Cloud Arm Makes Its First Big Submarine Cable Investment

Source: TheWHIR

Digiweb Hardware Failure Brings Down Shared Hosting

Ireland-based web host Digiweb has suffered a “major hardware failure” which knocked its servers offline at around 3 pm local time on Wednesday. As of Thursday evening, the company had partially restored customer access.

Earlier this month Digiweb experienced disruptions to its storage network and IMAP service before its shared hosting services went down Wednesday mid-morning.

The company provided updates to the latest event through its website and social media, and attempts to contact the company met with the automated message: “Due to a major hardware failure, customers will be unable to access any of the shared hosting services. We are working to get this resolved as quickly as possible.”

As is often the case with prolonged outages, Digiweb’s customer communications gradually became the focus of frustrated Tweets.

The Digiweb hosting support panel was updated at 2 pm on Thursday to report: “all services are now running as expected. Our engineers are monitoring these and will take appropriate action to ensure normal service.” The disruption did not, however, appear to be over.

Digiweb did not respond to a request for comment in time for publication.

Source: TheWHIR

Google Declares War on Boring Data Center Walls

Brought to you by Data Center Knowledge

Usually, if you drive by a data center, there is little indication that the huge gray building you are passing houses one of the engines of the digital economy. Sometimes, if you happen to be a data center geek, you may deduce the facility’s purpose from observing a fleet of massive cooling units along one of its walls, but even those often hide from plain sight.

Data centers by and large are nondescript, and many, if not most, in the industry like to keep it that way. After all, the fewer people who know where a facility critical to a nation’s economy (a stock exchange data center) or to its security (a mission-critical US Navy data center) is located, the better.

But Google has decided to flaunt the huge server farms it has built around the world. Judging from images and videos the company has released in the past, the insides of these facilities are works of art. Here’s a 360-degree tour inside one of them:

[embedded content]

Now, the company wants their external walls to both reflect their function in society and be a pleasure to look at.

“Because these buildings typically aren’t much to look at, people usually don’t—and rarely learn about the incredible structures and people inside who make so much of modern life possible,” Joe Kava, VP of Google Data Centers, wrote in a blog post.

In what it dubbed the “Data Center Mural Project,” Google has hired four artists to paint murals on the walls of four of its data centers: in Mayes County, Oklahoma; St. Ghislain, Belgium; Dublin, Ireland; and Council Bluffs, Iowa.

The artists were tasked with portraying each building’s function and reflecting the community it’s in.

[embedded content]

The murals in Oklahoma and Belgium have been completed, and the remaining two are in progress.

Jenny Odell, the artist who worked on the Mayes County project, used Google Maps imagery to create large collages, each reflecting a different type of infrastructure in use today (Photo: Google):

[Image: Jenny Odell’s mural at Google’s Mayes County, Oklahoma data center]

Oli-B, who painted the mural on a wall of Google’s St. Ghislain data center, created an abstract interpretation of “the cloud.” He used elements specific to the surrounding community, as well as the data center site and the people who work there (Photo: Google):

[Image: close-up of Oli-B’s mural at Google’s St. Ghislain data center]

The four sites are just the start. The company hopes to expand the Data Center Mural Project to more locations.

More images and video on the Data Center Murals Project website.

Original article appeared here: Google Declares War on Boring Data Center Walls

Source: TheWHIR

Security Fears Prompt House to Block Google, Yahoo Cloud Services

The House IT team has blocked US congressional representatives and their staffers from using Google and Yahoo cloud services on the House of Representatives network, following warnings from the FBI about potential security vulnerabilities, Reuters reports. Separate and seemingly unrelated incidents involving Yahoo mail and Google cloud apps led to the blocks, which were implemented within the past two weeks and have affected internal House communications.

Reuters reports that an email sent to lawmakers and staffers by the House Information Security Office on April 30 warns against increased phishing attacks on the House network attempting to install ransomware. The email said that the ransomware attacks came from third-party web-based mail applications, and that Yahoo mail, which appeared to be the focus of the attack, would be indefinitely blocked on the House network.

READ MORE: What Obama Thinks of Privacy vs. Security in the Age of Apple vs. FBI

The attacks had succeeded in installing ransomware on two individuals’ devices after they clicked on Word attachments, though the infected files were retrieved without paying the ransom, a source told Reuters. The FBI issued a warning in June about remote access tools capable of stealing data, including a “BLT” Trojan found on appspot.com.

Appspot.com, where custom Google apps are hosted, has also been blocked on devices connected to the House’s Internet through WiFi or Ethernet.

“We began blocking appspot.com on May 3 in response to indicators that appspot.com was potentially still hosting a remote access trojan named BLT that has been there since June 2015,” one of the sources, a House staffer with direct knowledge of the situation, told Reuters.

A former employee of the House of Representatives told Reuters that he had created two apps hosted on appspot.com for use by congressional staffers, which they now cannot use.

Spokespeople for both Yahoo and Google said they will work with the House on a resolution of the vulnerability.

Ransomware became a significantly more common attack type in 2015, according to research by IBM X-Force, and Trend Micro predicted ransomware attacks would increase in 2016.

The US government is attempting to update Federal IT systems to make use of cloud services through the FedRAMP program, but some within the industry say the process needs to be reformed.

Source: TheWHIR

When it Comes to Cloud, Customer Service Still Counts for a Lot

Despite the flexibility that the cloud offers customers, a new survey by Microsoft and 451 Research suggests that customers are fiercely loyal to their primary service provider.

According to the survey, The Digital Revolution, Powered by Cloud, which was released Wednesday at the Microsoft Cloud & Hosting Summit in Washington, more than one-third of customers (38 percent) surveyed said they plan to increase spending with their primary cloud and hosting service provider upon contract renewal.

In an interview with The WHIR, Aziz Benmalek, Microsoft’s vice president of Hosting and Cloud Service Provider Business, said that this indicates the critical role service providers play in continuing to “drive organic growth in existing customers and help them in their cloud journey.”

“Loyalty is high for the primary services providers,” he said. “In fact, 95 percent of the customers surveyed are expecting to stay with their current primary provider in the next year. Almost 70 percent have an annualized agreement with their service provider.”

This customer loyalty is critical for service providers as more options hit the market, pulling clients in every direction as vendors fight for a piece of their IT spend.

SEE ALSO: Security, Cloud Computing Remain CIO Budget Priorities: Report

The study also indicates how customers are spending the majority of their IT budgets. According to the report, 71 percent of customers’ cloud and hosting budgets are now allocated to managed services, application hosting, and security services.

Benmalek said that managed services in particular is “one of the fastest-growing segments” of IT spend in the cloud.

Survey respondents said that it is important that cloud and hosting providers have experience helping customers transform existing IT environments to cloud-based services, offer services beyond infrastructure (including managed services), can make recommendations for cloud platforms or apps to purchase, and can migrate workloads to different cloud environments. The survey also suggests that customers want service providers who can be a single point of contact for a variety of cloud services, and can broker contracts with other service providers.

Microsoft has more than 30,000 hosting partners, and while Benmalek wouldn’t say specifically how much revenue these partners drive for the vendor, he did say that it continues to grow “double digits from year to year.”

“It’s one of the fastest growing businesses for us,” he said.

Service providers are one component of Microsoft’s hybrid cloud strategy, Benmalek said. Microsoft’s “three-legged stool” consists of on-premises infrastructure; hosted private cloud and service providers; and the Azure public cloud.

“It’s a very exciting time for us and I think the vibrant ecosystem we see continues to be a key bet for us,” he said.

The full 78-page report is available for download on Microsoft’s website.

Source: TheWHIR

Ease Technologies Releases Cloud Workspace® Suite Software

IndependenceIT, a workspace automation software platform provider that allows IT departments, service providers and ISVs to easily deliver workspaces, applications and data from any cloud infrastructure, has announced a successful partnership with Ease Technologies, Inc., a managed service provider. Since implementing IndependenceIT’s software platform in 2014 and introducing its Ease Cloud Workspace® service, the company has built a robust Workspace-as-a-Service solution that leverages the software for access to applications, desktops, data, and complete cloud workspaces.

Headquartered in Columbia, Maryland, Ease Technologies serves leading corporations, educational institutions, law firms, accounting practices, membership-based organizations, healthcare organizations, and many small businesses. In addition to cloud workspace services, the company’s experience includes developing solutions in application and content management, Web strategy and design, data management, managed services, network services, and staffing. Clients include firms such as Apple Computer, Comcast, C-SPAN, Fairfax County Government, Merck, and Georgetown University, as well as other well-known organizations. Ease Technologies has received many awards of distinction over the last two decades and aligns with some of the fastest-growing tech firms in the country.

Prior to entering the Workspace-as-a-Service (WaaS) market, Ease Technologies reviewed the merits of several WaaS software platforms, ultimately choosing IndependenceIT because of the high degree of automation built into its Cloud Workspace Suite software. The easy-to-deploy and easy-to-manage software has been rolled out across several of the company’s data center locations, hosted by U.S. Signal, an industry-leading network and cloud-hosting provider. After receiving expert-led training from IndependenceIT, Ease Technologies now manages its entire Workspace-as-a-Service environment from a central control plane, which has resulted in significant management time savings. Using Cloud Workspace Suite, the company serves a growing number of end customers, especially in the education, legal, and accounting verticals, who enjoy the additional productivity and efficiency the technology provides.

Cloud Workspace Suite is the only Workspace-as-a-Service (WaaS) software to offer true workspace automation, greatly reducing deployment time and management complexity regardless of the WaaS platform preferred by the IT service provider. The software can be used as a standalone solution, or to provide a layer of automation and control that simplifies third-party WaaS software. The technology is used by IT service providers such as Ease Technologies, who deliver workspace services to their end customers. The solution delivers an exceptional return on investment, with savings from automation greatly reducing IT overhead.

“Because outsourcing of the IT workspace allows users to focus on what they do best, it is viewed as an imperative for those challenged with performing a technology refresh of on-premises computing infrastructure,” said Jason Shirdon, VP, Operations, Ease Technologies. “Technology solutions that enable greater agility and access are critical for businesses in distributed environments or where IT mobility is a must. As the software engine behind our services, Cloud Workspace Suite has made the entire process of workspace deployment and management easy, allowing us to focus on the customer experience. As a result, Ease Cloud Workspace is by far our most popular service option.”

“Cloud Workspace Suite is a powerful WaaS automation platform for partners such as Ease Technologies as it provides a high level of systems management automation and policy based management which greatly simplifies solution delivery,” said Seth Bostock, CEO, IndependenceIT. “The ability to provision application, workspace, and SaaS delivery from a centralized control plane eliminates hundreds of hours of management time per year, substantially reducing complexity and the total cost of ownership. IT service providers taking advantage of this are able to consolidate operations for easy procurement, management, and integration to optimize the administrator experience.”

Source: CloudStrategyMag

Dear Silicon Valley: Stop saying stupid stuff

“Disruption” isn’t the same as “stupid,” but they sometimes sound similar. At least, they do when uttered by a certain strain of Silicon Valley entrepreneur.

This thought struck me while listening to a Valley exec at an enterprise software conference. He stumbled through PowerPoint (“How do you people use this app? I’m a Keynote guy”), agonized over how he could “possibly get used to Exchange after running his startup on Gmail” (his company had recently been acquired by a large software vendor), and generally made it clear that he had no idea how real companies work.

He lives in a bubble that has drones delivering tacos to those not already subsisting on Soylent. He wants to change enterprise computing, but he clearly has no appreciation for the challenges facing enterprises mired in decades of technical debt.

He is, in other words, either the worst or best person to change the world. (My vote: worst.)