Online Tech Acquires Echo Cloud

Online Tech has announced it has acquired Echo Cloud, a Kansas City, MO-based enterprise cloud company. The acquisition is set to boost Online Tech’s reach in the Midwest market and add further geographical diversity to its growing cloud infrastructure.

Yan Ness, CEO of Online Tech, said the acquisition provides several benefits for his company, including an expanded product line and geographic reach to the Kansas City and Missouri markets. “I’m very pleased with this deal,” Ness said. “We really like the Kansas City market. There’s lots of demand that we think is underserved, and this is a great opportunity to provide companies in the area with our secure, compliant hybrid IT services.”

Additional benefits include adding two Kansas City data centers and extra cloud infrastructure to Online Tech’s existing data centers across Michigan and Indiana.

Bill Severn, CEO of Echo Cloud, is equally enthusiastic about the two companies coming together. “Echo Cloud is extremely excited to be joining Online Tech,” he said. “I believe the values we hold match Online Tech’s very well, and I think this will be a great partnership moving forward.”

The companies will combine their existing services into Online Tech’s client portal for easy account viewing and service management. Echo Cloud will also offer its existing clients new Online Tech services that are compliant with standards such as PCI and HIPAA. “Online Tech has been a leader in compliance for many years,” Severn said. “I’m pleased we can now offer HIPAA- or PCI-compliant data hosting to our existing customers here in Kansas City.”

Source: CloudStrategyMag

NTT Communications Extends Multi-Cloud Connect To Oracle Cloud

NTT Communications Corporation (NTT Com) has announced the extension of Multi-Cloud Connect to Oracle Cloud, helping multinational customers take advantage of the performance, cost, and innovation benefits of the cloud.

While enterprises understand the promise and many benefits of the cloud, most experience issues such as latency, packet loss, and security threats, given that connectivity to cloud services is still heavily dependent on the public Internet. With Multi-Cloud Connect, Oracle Cloud users will be able to leverage NTT Com’s secure, reliable, high-performing MPLS network to access their business-critical applications.

Multi-Cloud Connect will connect directly to the Oracle Cloud platform through Oracle Network Cloud Services – FastConnect, enabling private connections to its broad portfolio of platform as a service (PaaS) and infrastructure as a service (IaaS) offerings. This includes middleware such as “Oracle Database Cloud Service” and “Oracle Java Cloud Service”, as well as integration and business analytics features. Furthermore, NTT Com and Oracle will enable hybrid deployments of Oracle Cloud and Oracle software hosted on-premises or via “Oracle Cloud at Customer”, under one global network.

Source: CloudStrategyMag

IDG Contributor Network: 3 reasons why data scientist remains the top job in America

Glassdoor recently revealed its report highlighting the 50 best jobs in America, and unsurprisingly, data scientist claimed the top spot for the second year in a row. Every year, the jobs site releases this report based on each job’s overall “Glassdoor Job Score.” The score is determined by three key factors: the number of job openings, the job satisfaction rating, and the median annual base salary.

With a job score of 4.8 out of 5, a job satisfaction score of 4.4 out of 5, and a median base salary of $110,000, data scientist jobs came in first, followed by other technology jobs, such as data engineers and DevOps engineers.

In fact, data-related roles are dominating similar jobs reports released over the past year as well. A new study by CareerCast.com revealed data scientist jobs have the best growth potential over the next seven years, as they are one of the toughest jobs to fill. Statistics from rjmetrics.com show that there were anywhere from 11,400 to 19,400 data scientists in 2015, and over 50% of those roles were filled in the last four years.

A quick search for data scientist jobs in the United States on LinkedIn reveals over 13,700 open positions. Additionally, this job trends tool by Indeed, which showcases the demand for data scientists, reveals that both data scientist job listings and job seeker interest are showing no signs of slowing down.

It’s estimated there will be one million more computing jobs than qualified workers to fill them in the next ten years, according to Computer Science Zone. So how did the role of the data scientist rise to the top of the rankings? Let’s examine a few of the reasons and trends that led the data scientist position to claim the number one spot for the best job in America again this year.

Reason #1: There’s a shortage of talent

Individuals with skills in statistics and analytics are highly sought after, and those with the soft skills to match are in even greater demand. Business leaders are after professionals who can not only understand the numbers but also communicate their findings effectively. Because there is still such a shortage of talent combining these two skill sets, salaries for data scientists are projected to grow over 6% this year alone.

So where are all the data scientists to fill these jobs? The short answer is that they haven’t been trained yet. While computer science programs are on the rise, it’s still going to take some time for supply to catch up with demand. Big data and analytics courses have started making their way into the classroom only in the past couple of years, so addressing the data science talent shortage won’t happen overnight. For the next couple of years, the number of job openings will certainly continue to outnumber the professionals with a sophisticated enough understanding of data and analysis to fill them.

Reason #2: Organizations continue to face enormous challenges in organizing data

The role of the data scientist is evolving, and organizations desperately need professionals who can organize data as well as prepare it for analysis. Data wrangling, or cleaning data and connecting tools to get it into a usable format, is still in high demand.

Data preparation may require many steps, from translating specific system codes into usable data to handling incomplete or erroneous data, but the costs of bad data are high. Some research shows that analyzing bad data can cost a typical organization more than $13 million every year.

Therefore, there will always be a demand for individuals who can weed out bad data that can alter results or lead to inaccurate insights for an organization. There’s no doubt it’s time-consuming work. In fact, data preparation accounts for about 80% of the work of data scientists. But even with the increased availability of highly sophisticated analytics dashboards and data collection tools, there will always be a demand for professionals who possess the advanced skill sets needed to clean and organize data before being able to extract valuable insights from it.
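To make the wrangling steps above concrete, here is a minimal sketch in Python; the status codes, field names, and validation rules are all hypothetical, but the pattern of translating system codes and weeding out incomplete or erroneous records is the one described above:

```python
# Hypothetical mapping from a source system's status codes to labels.
STATUS_CODES = {1: "active", 2: "churned", 3: "trial"}

def clean(records):
    """Split raw records into cleaned rows and rejected (bad-data) rows."""
    cleaned, rejected = [], []
    for r in records:
        status = STATUS_CODES.get(r.get("status_code"))
        revenue = r.get("revenue")
        # Reject unknown codes, missing values, and impossible figures.
        if status is None or revenue is None or revenue < 0:
            rejected.append(r)
            continue
        cleaned.append({"status": status, "revenue": float(revenue)})
    return cleaned, rejected

rows = [
    {"status_code": 1, "revenue": 120},   # valid
    {"status_code": 9, "revenue": 50},    # unknown system code
    {"status_code": 2, "revenue": None},  # incomplete record
]
good, bad = clean(rows)
print(len(good), len(bad))  # 1 2
```

In practice the mapping tables and rules are far larger, which is exactly why this work consumes so much of a data scientist's time.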

Reason #3: The need for data scientists is no longer restricted to tech giants

The demand for data scientists has finally pushed beyond large technology firms, such as Google or Facebook, as smaller organizations realize that they too can use data to make better, more informed decisions. This HBR feature on big data reported that “companies in the top third of their industry in the use of data-driven decision making were, on average, 5% more productive and 6% more profitable than their competitors.”

While small- to medium-sized organizations are not churning out nearly as much data as larger enterprises, sifting through that data to extract meaningful insights into their businesses can be a powerful competitive advantage nonetheless.

We’re also seeing entry-level data scientists flock towards startups and smaller firms because of the perception that they will be able to tackle higher-level work earlier in their careers. Data scientists possess a broad range of skills, and they want to be able to put all of those skills to use right away.

Smaller firms are also hiring fast. Large organizations looking to recruit entry-level data scientists are taking note that their multistep, legacy hiring and recruiting processes may need some updating if they are going to attract the top talent that they desire. So for now, as the demand for data professionals continues to surge, agile organizations continue to be the more favorable choice for data scientists, regardless of their size.

How to get into the field

The demand for data scientists is high, and professionals can enter the world of data science in a number of ways. University programs are a good start, but a data science position often requires a mixture of skills that many schools are unable to package together.

One way to develop all of the necessary skills is by attending a data science boot camp. Not only will you learn the analytical skills required for a data science position but you’ll also receive training for the softer skills that are becoming more and more common in data science roles – skills such as managing projects and teams across multiple departments, consulting with clients, assisting with business development, and taking abstract business issues and turning them into analytical solutions.

So if you’re still deciding on the right career path, or thinking about making a career change in 2017, consider exploring what it takes to be a data scientist – one of the fastest-growing and highest-paid jobs in America right now.

This article is published as part of the IDG Contributor Network. Want to Join?

Source: InfoWorld Big Data

Faction® Receives New Patent

Faction® has announced that the U.S. Patent and Trademark Office (USPTO) has granted Faction a new US Patent #9,621,374, which extends the functionality of hybrid and multi-cloud to new protocols including Virtual Extensible LAN (VXLAN) and Software Defined Networking (SDN). The new patent further reinforces Faction’s place at the forefront of hybrid and multi-cloud solutions. 

Faction’s new patent allows customers to leverage two key innovations in the networking and cloud arenas: VXLAN and SDN. VXLAN technology is attractive to enterprises because it addresses the scalability challenges associated with large cloud deployments. SDN technology adds agility and flexibility to networks, while allowing administrators to respond quickly from a centralized control point.

The new patent comes on the heels of Faction’s recently announced USPTO Patent #9,571,301 for the company’s pioneering work on hybrid and multi-cloud networking, which allows users to tap into the best features of private and public clouds to create one unified, optimized cloud. Enterprises and service providers using hybrid and multi-cloud topologies benefit from establishing a seamless extension of infrastructure to and between public clouds. These networks are especially useful in data center configurations to connect physical resources to one or more cloud providers.

“Our newest patent further validates Faction’s intellectual property leadership in hybrid and multi-cloud solutions,” states Luke Norris, CEO and founder, Faction. “By choosing Faction cloud, enterprises and service providers can take advantage of the many benefits inherent in combining the best features of private and public clouds – truly creating a cutting-edge approach to IT cloud transformation. Our patent leadership also paves the way for clients to leverage future cloud capabilities and allows Faction to confidently meet the sharp increase in demand we are seeing from enterprises seeking to establish hybrid and multi-cloud strategies.”

Source: CloudStrategyMag

3W Infra Migrates Its IaaS Infrastructure To New Data Center

3W Infra has announced the migration of its complete IaaS infrastructure to a newly commissioned data center configured for high redundancy in Amsterdam, the Netherlands. The migration to this new data hall is part of an infrastructure upgrade to support 3W Infra’s rapid customer growth.

3W Infra has migrated its IaaS infrastructure, including dedicated servers and network infrastructure, to a new data hall within Switch Datacenters’ Amsterdam-based data center complex, Switch AMS1. The new data hall features a highly energy-efficient design based on indirect adiabatic cooling technology and hot-aisle containment (calculated pPUE = 1.04). Its efficiency and highly redundant 2N power configuration cater to the uptime needs of customers with demanding applications, including enterprises, cloud providers, gaming companies, and financials.
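For context, pPUE (partial power usage effectiveness) is the ratio of total power delivered to a part of the facility to the IT power it houses. A quick arithmetic sketch shows what the quoted 1.04 means in practice; the 100 kW IT load is an assumed figure for illustration, only the ratio comes from the article:

```python
# pPUE = facility power / IT power for one part of the infrastructure.
ppue = 1.04          # figure quoted for the new data hall
it_power_kw = 100.0  # assumed IT load, for illustration only
facility_power_kw = it_power_kw * ppue
overhead_kw = facility_power_kw - it_power_kw
# About 4 kW of cooling/distribution overhead per 100 kW of IT load.
print(round(overhead_kw, 1))  # 4.0
```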

Designed for flexibility and scalability, the new data hall in Switch AMS1 offers an extended capacity of about 400 data center racks and enables 3W Infra to offer companies considerable room for growth. Its scalable power modules, starting at 5 Ampere per cabinet and growing in 5-Ampere steps up to 25 Ampere for high-density requirements, are aimed at a wide range of customer types — from startups to enterprises and companies with demanding applications.

3W Infra expects to complete its phased data center infrastructure migration at the end of April 2017.

ISO 27001, 100% Uptime Guarantee

“The 2N power configuration gives 3W Infra customers a robust 100% uptime guarantee instead of the 99.999% we had before,” said Roy Premchand, managing director, 3W Infra. “The easily scalable power modules available onsite allow our clients to grow their power infrastructure in a cost-efficient way. They can start with 5 Ampere and grow their infrastructure by 5 Ampere each time they need to add more power for their equipment.”

“Power, cabling, security…really all data center infrastructure included in this newly built data hall is very robust,” added Premchand. “The robustness and high security features will help us meet ISO 27001 requirements, as we’re currently aiming to achieve ISO/IEC 27001 certification for Information Security Management.”

OCP Specifications

The newly commissioned data hall in Switch AMS1 is one of the first colocation facilities in Europe suitable for Open Rack Systems based on OCP principles. The Open Compute Project (OCP) originated with Facebook, and companies like Intel and Microsoft joined the OCP at an early stage; a variety of industry giants followed, including Google, Cisco, IBM, NetApp, Lenovo, Ericsson, and AT&T. This makes Switch AMS1 a modern facility well suited to housing OCP-specified equipment based on open standards. Its infrastructural efficiency is also a good fit for 3W Infra’s ‘pure-play’ IaaS business of delivering traditional, highly customized dedicated server technology, colocation services, and IP Transit.

Fast Growing Company 

The announcement follows the publication of 3W Infra’s latest server count: the company now has 4,000 dedicated servers under management, 1,000 more than half a year ago. Although still quite a young company (founded in 2014), 3W Infra has shown significant growth in the second half of 2016.

3W Infra attributes its jump-start growth to the company’s ‘pure-play’ IaaS hosting approach, in which cloud delivery is left to customers. Its high-volume, transit-only network with global reach and its ability to deliver highly customized IT infrastructures have also helped the company grow at such a fast pace, according to Premchand.

“We expect to continue our exponential growth rate, as we have quite some large sales opportunities in our sales pipeline,” added Premchand. “A variety of potential customers has shown interest in the new data hall already, companies with extensive and complex infrastructures I must say, but they were waiting until we have finished our migration processes.”

Source: CloudStrategyMag

Telehouse Launches Advanced Cloud Interconnection Solution In The U.S.

TELEHOUSE has announced the launch of Telehouse Cloud Link in the United States. Currently available in the EMEA region, Telehouse Cloud Link is a multi-cloud connectivity exchange that allows enterprises to manage access to multiple cloud services through a single, secure and dedicated private connection.

Telehouse Cloud Link helps customers simplify their hybrid cloud infrastructure and accelerate data transfer between their network and cloud services by establishing direct, private connections to multiple Cloud Service Providers (CSPs), including Amazon Web Services (AWS), Microsoft Azure and Google Cloud Platform, as well as network on-demand including TELEHOUSE’s own NYIIX and LAIIX.

According to the Cisco Global Cloud Index, cloud data center traffic associated with cloud consumer and business applications is growing at 30% CAGR, and global cloud IP traffic is expected to account for more than 92% of total data center traffic by 2020. As an increasing number of enterprises utilize cloud services to perform business-critical operations, data centers must enable fast and cost-effective access to key CSPs.
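To put a 30% CAGR in perspective, a quick back-of-envelope calculation (the five-year horizon is an arbitrary illustration, not a figure from the report):

```python
# Compound annual growth: traffic multiplies by (1 + rate) each year.
cagr = 0.30
years = 5
growth = (1 + cagr) ** years
# At 30% CAGR, traffic roughly triples-and-then-some in five years.
print(f"{growth:.2f}x")  # 3.71x
```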

“Cloud computing has become a necessary requirement for many of today’s enterprises,” explains Aki Yamaguchi, COO, Telehouse America.  “After its success in Europe, we’re very excited about the launch of Telehouse Cloud Link in the U.S., as it enables fast, easy and secure cloud connectivity for all of our customers.”

Source: CloudStrategyMag

WPS Expands WPS Office

WPS Office Software has announced that it has expanded WPS Office with WPS Cloud, enabling users to work faster, more flexibly, and more productively with added storage, file roaming, and sharing capabilities.

In a report by Allied Research, cloud adoption continues to soar, with the personal cloud market in particular expected to reach $89.9 billion, globally, by 2020. In line with this trend, WPS Cloud is designed to work with WPS Office as a companion, offering users the benefits of cloud computing such as flexibility and collaboration.

WPS Cloud is a complete document storage and management service, enabling users to view and edit files anywhere, at any time. Automatic backup and link-sharing tools promote real-time collaboration across multiple devices and platforms. Files are protected from loss with enterprise-level data security and multiple backups.

Users can access files either directly from within WPS Office, or through the WPS Cloud web portal. Specific features include:

  • Free storage space: Users receive 1GB of free storage space on WPS Cloud, where they can store files and rich media.
  • File roaming and cross device support: File roaming allows users to open or create documents in WPS Office, which are then automatically saved to WPS Cloud. Recent files are shown and may be accessed across all connected devices including Windows PCs, Android or iOS mobile devices, or through a web browser.
  • Easy one-click file sharing: Once a document is uploaded to WPS Cloud, the user can create a unique URL that can be shared with anyone to view and download the file.

“Users are increasingly adopting and using apps and documents in the cloud, and growing accustomed to accessing content across devices regardless of their environment,” said Cole Armstrong, senior director of marketing, WPS Office Software. “More and more, they want equal access to office software whether on a mobile device or traditional PC. We’re pleased to offer WPS Cloud to address this trend as we continue to look for ways to improve our users’ computing experience and productivity.”

WPS Office is a full office software suite for desktop and mobile devices that provides a high-performing yet considerably more affordable solution for con/prosumers and a preferred alternative to Microsoft Office. Users prefer WPS Office over software like Microsoft Office, citing that WPS is more reliable, more efficient, and faster. Unique features enable users to create, view, edit, store, and share documents for greater productivity from their office software. WPS Writer, Presentation, Spreadsheets, and PDF Reader/Converter are fully compatible with other office software solutions, and an ideal free office software alternative for Microsoft Office users whose trials have ended.

Source: CloudStrategyMag

Green House Data Acquires Cirracore

Green House Data has announced the acquisition of Cirracore, an Atlanta-based infrastructure provider of enterprise-ready Infrastructure-as-a-Service (IaaS) and hybrid cloud products. The Cirracore customer list includes a strong presence in the Southeast as well as large national and international brands.

“As a high-growth market and innovation hub, Atlanta has been a target expansion market for us,” said Shawn Mills, CEO, Green House Data. “Integrating Cirracore and its management team into Green House Data will allow us to deliver a larger set of products with greater geographic diversity, ultimately to provide higher value for all of our customers, both existing and future.”

Cirracore was founded in 2008 by Fred Tanzella, a veteran technology executive with deep experience in information security and infrastructure technology start-ups. Under his leadership, Cirracore has beaten revenue projections and boasts an exceptionally low customer churn rate. Mr. Tanzella serves on the Board of Directors for the Technology Association of Georgia and the Executive Advisory Board for the National Association of Telecom Professionals. He will join the Green House Data executive team.

Green House Data will celebrate its 10-year anniversary in July of 2017. The company has expanded from a single facility in Cheyenne, Wyoming to over 100,000 sq ft of white space and cloud connectivity. With access to over 250 carriers, service providers, and content providers spread across west, central, east, and now Southeast locations, Green House Data customers are uniquely positioned for rapid scale and location-specific workloads. All Green House Data facilities are compliant to standards including HIPAA, SSAE 16 Type II, and PCI-DSS. Committed to sustainability, the company is the nation’s 25th largest green power buyer within the technology and telecom space.¹

“As we entered this next phase, we looked for a strategic acquisition partner,” said Tanzella. “We’ve more than doubled our footprint in the last two years, and it was critical to my team that Cirracore’s model of enterprise-focused, VMware-based, and hyper-growth IaaS be pulled forward in any merger or acquisition scenario.”

Cirracore’s compute and storage infrastructure includes high-speed carrier-redundant network connectivity together with managed carrier-grade security. From Equinix’s AT1 and AT3 facilities in Atlanta, Cirracore can cross-connect customers directly into cloud resources from over 180 network and service providers.

“We’re thrilled to bring yet another location into our portfolio, and have Fred’s leadership and vision added to our team,” said Mills. “It’s an exciting time for both of our companies, and the industry as a whole.”

¹ https://www.epa.gov/greenpower/green-power-partnership-top-30-tech-telecom

Source: CloudStrategyMag

LLVM-powered Pocl puts parallel processing on multiple hardware platforms

LLVM, the open source compiler framework that powers everything from Mozilla’s Rust language to Apple’s Swift, is emerging in yet another powerful role: an enabler of code deployment systems that target multiple classes of hardware for speeding up jobs like machine learning.

To write code that can run on CPUs, GPUs, ASICs, and FPGAs alike—something hugely useful with machine learning apps—it’s best to use something like OpenCL, which allows a program to be written once and then automatically deployed across all those different types of hardware.

Pocl, an implementation of OpenCL that was recently revamped to version 0.14, is using the LLVM compiler framework to do that kind of targeting. With Pocl, OpenCL code can be automatically deployed to any hardware platform with LLVM back-end support.

Pocl uses LLVM’s own Clang front end to take in C code that uses the OpenCL standard. Version 0.14 works with both LLVM 3.9 and the recently released LLVM 4.0. It also offers a new binary format for OpenCL executables, so they can be run on hosts that don’t have a compiler available.

Aside from being able to target multiple processor architectures and hardware types automatically, another reason Pocl uses LLVM is that it aims to “[improve] performance portability of OpenCL programs with the kernel compiler and the task runtime, reducing the need for target-dependent manual optimizations,” according to the release notes for version 0.14.

There are other projects that automatically generate OpenCL code tailored to multiple hardware targets. The Lift project, written in Java, is one such code generation system. Lift generates a specially tailored IL (intermediate language) that allows OpenCL abstractions to be readily mapped to the behavior of the target hardware. LLVM itself works this way: it generates an IL from source code, which is then compiled for a given hardware platform. Another such project, Futhark, generates GPU-specific code.
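The IL-based approach can be sketched in miniature. The toy example below lowers a two-instruction "IL" for y = a*x + b to two made-up target styles; it is an illustration of the concept only, not Pocl's, Lift's, or LLVM's actual representation:

```python
# A tiny "IL": (op, dest, src1, src2) tuples encoding y = a*x + b.
IL = [("mul", "y", "a", "x"),
      ("add", "y", "y", "b")]

def lower(il, target):
    """Translate the IL into code for a given (toy) back end."""
    if target == "c_like":       # register-style three-address code
        sym = {"mul": "*", "add": "+"}
        return [f"{d} = {s1} {sym[op]} {s2};" for op, d, s1, s2 in il]
    if target == "stack_vm":     # stack-machine style
        out = []
        for op, d, s1, s2 in il:
            out += [f"push {s1}", f"push {s2}", op, f"pop {d}"]
        return out
    raise ValueError(f"no back end for {target!r}")

print(lower(IL, "c_like"))    # ['y = a * x;', 'y = y + b;']
print(lower(IL, "stack_vm"))  # ['push a', 'push x', 'mul', 'pop y', ...]
```

The point is that the program is written once against the IL, and each back end supplies only the lowering rules for its own hardware.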

LLVM is also being used as a code-generating system for other aspects of machine learning. The Weld project generates LLVM-deployed code that is designed to speed up the various phases of a data analysis framework. Code spends less time shuttling data back and forth between components in the framework and more time doing actual data processing.

The development of new kinds of hardware targets is likely to keep driving the need for code generation systems that can target multiple hardware types. Google’s Tensor Processing Unit, for instance, is a custom ASIC devoted to speeding up one particular phase of a machine learning job. If hardware types continue to proliferate and become more specialized, having code for them generated automatically will save time and labor.

Source: InfoWorld Big Data

Hyperscale Operators Continue Ramping Up Share Of Cloud Markets

New data from Synergy Research Group shows that hyperscale operators are aggressively growing their share of key cloud service markets, which are themselves growing at impressive rates. Synergy’s new research has identified 24 companies that meet its definition of hyperscale, and in 2016 those companies in aggregate accounted for 68% of the cloud infrastructure services market (IaaS, PaaS, private hosted cloud services) and 59% of the SaaS market. In 2012 those hyperscale operators accounted for just 47% of each of those markets. Hyperscale operators typically have hundreds of thousands of servers in their data center networks, while the largest, such as Amazon and Google, have millions of servers.
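Working from the share figures above, a back-of-envelope calculation shows how much faster hyperscale revenue grew than the overall cloud infrastructure market; the smooth annual growth over 2012-2016 is an assumption for illustration:

```python
# Hyperscale share of cloud infrastructure services: 47% (2012) -> 68% (2016).
share_2012, share_2016, years = 0.47, 0.68, 4
# Annualized rate by which hyperscale revenue outgrew the market as a whole:
excess = (share_2016 / share_2012) ** (1 / years) - 1
print(f"{excess:.1%}")  # ~9.7% per year faster than the overall market
```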

In aggregate those 24 hyperscale operators now have almost 320 large data centers in their networks, with many of them having substantial infrastructure in multiple countries. The companies with the broadest data center footprint are the leading cloud providers — Amazon, Microsoft, and IBM. Each has 45 or more data center locations with at least two in each of the four regions (North America, APAC, EMEA, and Latin America). The scale of infrastructure investment required to be a leading player in cloud services or cloud-enabled services means that few companies are able to keep pace with the hyperscale operators, and they continue to both increase their share of service markets and account for an ever-larger portion of spend on data center infrastructure equipment — servers, storage, networking, network security, and associated software.

“Hyperscale operators are now dominating the IT landscape in so many different ways,” said John Dinsdale, a chief analyst and research director at Synergy Research Group. “They are reshaping the services market, radically changing IT spending patterns within enterprises, and causing major disruptions among infrastructure technology vendors. Our latest forecasts show these factors being accentuated over the next five years.”

Source: CloudStrategyMag