QTS Enhances Carrier-Neutral Cloud Connectivity Ecosystem

Responding to increasing customer demand for diverse cloud connectivity options, QTS Realty Trust announced that customers will be able to access elastic interconnection services powered by Megaport, a leader in global Software Defined Networking (SDN).

Megaport is part of QTS’ multi-phased approach to expand its carrier-neutral cloud ecosystem and simplify customer network strategies by providing diverse connectivity for cloud and hybrid IT environments.

Megaport enables QTS’ customers and partners to connect over a Software Defined Network (SDN), reducing cost and providing rapid connectivity and provisioning across a single platform. Virtual Cross Connects (VXCs) to services are provisioned via QTS’ portal and can be managed from any mobile device. Megaport complements QTS’ existing portfolio of network services, extending it to include direct connectivity to some of the world’s largest hyperscale cloud providers.

“Expansion of QTS’ connectivity ecosystem supports today’s enterprises seeking speed, agility and efficiency to solve their hybrid IT strategies,” said Dan Bennewitz, chief operations officer, sales, product & marketing, QTS. “By partnering with Megaport, QTS customers can connect with who they want when they want, more efficiently and effectively.”

“QTS has evolved into a highly innovative provider of cloud and hybrid IT solutions backed by an increased emphasis on strategic carrier-neutral cloud interconnection,” said Vincent English, CEO of Megaport. “Megaport provides the ideal foundation for QTS customers to connect easily to each other as well as over 200 service providers across our ecosystem, including the top five global cloud service providers.”

Enterprises will be able to access Megaport from QTS’ mega data centers in Atlanta, Georgia; Chicago, Illinois; Dallas-Irving, Texas; Richmond, Virginia; and Santa Clara, California.

Source: CloudStrategyMag

Report: Most Businesses Using the Cloud Spend $100,000+ on Additional Security Features

Businesses prefer storing data in the cloud, but plan to invest in extra security precautions, according to the second report in Clutch’s Annual Cloud Computing Survey. Clutch is a B2B ratings and reviews firm.

Nearly 70% of businesses on the cloud prefer storing data in the cloud instead of on a legacy system, and these businesses are willing to invest heavily in keeping their cloud’s data secure.

In fact, over half of companies surveyed spend more than $100,000 annually on additional cloud security features.

One reason for the high investment in cloud security is companies’ greater awareness of the security risks that are out of their cloud provider’s control, according to industry experts. When it comes to application-level security, including user access, password sharing, and other individual interactions, the company and its employees shoulder the responsibility for security.

“There is suddenly a number of people recognizing that application-level security needs to be done by the user, not the vendor,” said Haresh Kumbhani, founder and CEO of Zymr, Inc., a San Francisco-based cloud consulting and agile software development services company. “If this is the case, then they need to invest top dollar in securing the data.”

In another finding, nearly 1 in 4 businesses on the cloud indicate that they use IoT services. However, the quality of security for IoT varies.

“Nascent is the first word that comes to mind [regarding IoT security]. For every company that properly locks down IoT-enabled machines on a factory floor, you have thousands of unsecured ‘smart’ lightbulbs,” said Jamie MacQuarrie, co-founder of Appivo, a platform for developing cloud-based web and mobile applications.

Companies can protect themselves from security threats and prevent issues by following both mandatory and voluntary security guidelines and implementing additional security features, such as extra encryption.

Currently, 65% of businesses follow the security guidelines released by the Cloud Security Alliance, and 64% of businesses use encryption as an additional security tool, according to the survey.

The survey includes responses from 283 IT professionals at businesses across the United States that use cloud computing.

Read the full article here.

Source: CloudStrategyMag

DataSite Launches Its Cloud And IT Enablement Division

DataSite has announced the launch of its cloud and IT enablement division, DataSite Atmosphere. DataSite Atmosphere uses a consultative approach to craft cloud and managed IT solutions tailored to specific customer needs and requirements.

Harnessing its vast experience in IT, DataSite Atmosphere’s service solutions are carefully architected to meet customers’ needs and utilize best-in-class cloud service platforms to fulfill requirements for compute, memory, storage, and network. DataSite Atmosphere provides a single contract, embedding services with colocation, to arrive at the smartest and most complete hybrid colocation and cloud service solution for tomorrow’s enterprise. When customer requirements demand, DataSite Atmosphere will build the infrastructure necessary to serve those specific needs. DataSite provides creative, custom solutions that include:

  • Cloud ramps to AWS Direct Connect, Google Cloud Interconnect, and Azure ExpressRoute
  • Private, public, and hybrid cloud computing
  • Network services including internet bandwidth, SD WAN, and edge network connectivity
  • Unified communications
  • Virtual desktop and hosted exchange
  • Backup and Disaster Recovery-as-a-Service
  • Cloud storage

“The Atmosphere team is keenly aware that the technology ecosystem is ever changing,” says Jeff Burges, president and founder of DataSite. “We believe in creating long-term partnerships that enable customers to continue to evolve utilizing opportune solutions. Atmosphere is uniquely positioned, with its customer centric model, to craft best-in-class colocation and cloud solutions for today and tomorrow.”

Source: CloudStrategyMag

Report: Cloud & Hyperscale Growth Forecast

New cloud forecast data from Synergy Research Group shows that worldwide revenues from cloud and SaaS services will grow at an average annual rate of 23% to 29% over the next five years, passing the $200 billion milestone in 2020. That growth will in turn help pull through 11% annual growth in sales of infrastructure to hyperscale cloud providers. Meanwhile, revenues from the sale of hardware and software to enterprise data centers will continue to slowly decline, reflecting the ongoing shift in workloads from privately owned infrastructure to the public cloud. Across the major cloud service categories, public IaaS/PaaS will see the strongest growth at an average of 29% per year, while managed or hosted private cloud service revenues will grow by 26% per year and enterprise SaaS by 23%.

Strong growth will continue to be seen across all of the major cloud service segments, with some of the highest growth being seen in database and IoT-oriented IaaS/PaaS services, and ERP within enterprise SaaS. APAC will be the highest growth region, followed by EMEA and North America.

Synergy’s hyperscale operator tracking research shows that the 24 hyperscale companies now have a global footprint comprising over 360 large data centers. As this figure continues to grow by almost 20% per year, the hyperscale companies are increasingly dominating the cloud market. They will soon account for over 80% of all cloud and SaaS service revenues and almost 40% of spend on all public and private data center equipment.

“As cloud markets reach a massive scale, growth rates will inevitably tail off, but our latest forecasts are still showing that over the next five years we will still see some really robust growth,” said John Dinsdale, a chief analyst and research director at Synergy Research Group. “As cloud continues to dramatically reshape the IT world it is also clear that the hyperscale phenomenon continues to dramatically reshape cloud. Hyperscale operators are diminishing the growth opportunity for traditional non-hyperscale service providers and are also seriously challenging the technology vendor community to rethink its position in the new world.”

Source: CloudStrategyMag

Which Spark machine learning API should you use?

You’re not a data scientist. Supposedly, according to the tech and business press, machine learning will stop global warming, except that’s apparently fake news created by China. Maybe machine learning can find fake news (a classification problem)? In fact, maybe it can.

But what can machine learning do for you? And how will you find out? If you’re already using Apache Spark for batch and stream processing, there’s a good place to start close to home. Along with Spark SQL and Spark Streaming, which you’re probably already using, Spark provides MLlib, which is, among other things, a library of machine learning and statistical algorithms in API form.

Here is a brief guide to four of the most essential MLlib APIs, what they do, and how you might use them.  

Basic statistics

Mainly you’ll use these APIs for A-B testing or A-B-C testing. Frequently in business we assume that if two averages are the same, then the two things are roughly equivalent. That isn’t necessarily true. Consider a car manufacturer that replaces the seat in a car and surveys customers on how comfortable it is. At one end, shorter customers may say the seat is much more comfortable. At the other end, taller customers will say it is so uncomfortable that they wouldn’t buy the car, and the people in the middle balance out the difference. On average the new seat might be slightly more comfortable, but if no one over 6 feet tall buys the car anymore, we’ve failed somehow. Spark’s hypothesis testing lets you run a Pearson chi-squared or a Kolmogorov–Smirnov test to see how well something “fits” or whether the distribution of values is “normal.” This can be used almost anywhere you have two series of data. That “fit” might be whether users “liked it,” or whether the new algorithm returned “better” results than the old one. You’re just in time to enroll in a Basic Statistics course on Coursera.
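To make that concrete, here is a minimal PySpark sketch of a chi-squared test on the seat-comfort scenario above. The survey data, the application name, and the use of a 1-to-5 rating as a categorical feature are all invented for illustration; it simply shows the shape of the MLlib call.

```python
# Minimal sketch: Pearson chi-squared test with Spark's DataFrame-based ML API.
# The "survey" data below is made up purely for illustration.
from pyspark.sql import SparkSession
from pyspark.ml.linalg import Vectors
from pyspark.ml.stat import ChiSquareTest

spark = SparkSession.builder.appName("ab-test-sketch").getOrCreate()

# label = which seat the respondent tried (0 = old, 1 = new),
# features = a single comfort rating from the survey (1-5).
rows = [
    (0.0, Vectors.dense(4.0)),
    (0.0, Vectors.dense(3.0)),
    (1.0, Vectors.dense(5.0)),
    (1.0, Vectors.dense(1.0)),
    (1.0, Vectors.dense(2.0)),
]
df = spark.createDataFrame(rows, ["label", "features"])

# Test whether comfort ratings are independent of which seat was used.
result = ChiSquareTest.test(df, "features", "label").head()
print("p-value:", result.pValues[0])
```

A low p-value would suggest the two seats really do produce different rating distributions, which is exactly the question the averages alone can hide.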

Classification

What are you? If you take a set of attributes you can get the computer to sort “things” into their right category. The trick here is coming up with the attribute that matches the “class,” and there is no right answer there. There are a lot of wrong answers. If you think of someone looking through a set of forms and sorting them into categories, this is classification. You’ve run into this with spam filters, which use a list of words spam usually has. You may also be able to diagnose patients or determine which customers are likely to cancel their broadcast cable subscription (people who don’t watch live sports). Essentially classification “learns” to label things based on labels applied to past data and can apply those labels in the future. In Coursera’s Machine Learning Specialization there is a course specifically on this that started on July 10, but I’m sure you can still get in.
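As a rough illustration, here is a sketch of a spam-style classifier built as a Spark ML pipeline. The tiny training set, the column names, and the feature size are all invented; a real filter would need far more data and feature engineering, but the pipeline shape is the point.

```python
# Minimal sketch: text classification ("spam or not") with a Spark ML pipeline.
# The training examples and column names are invented for illustration.
from pyspark.sql import SparkSession
from pyspark.ml import Pipeline
from pyspark.ml.feature import Tokenizer, HashingTF
from pyspark.ml.classification import LogisticRegression

spark = SparkSession.builder.appName("classification-sketch").getOrCreate()

train = spark.createDataFrame([
    ("free money click now", 1.0),
    ("lowest prices guaranteed", 1.0),
    ("meeting moved to 3pm", 0.0),
    ("quarterly report attached", 0.0),
], ["text", "label"])

# Turn raw text into term-frequency features, then fit a classifier.
tokenizer = Tokenizer(inputCol="text", outputCol="words")
tf = HashingTF(inputCol="words", outputCol="features", numFeatures=1000)
lr = LogisticRegression(maxIter=10)
model = Pipeline(stages=[tokenizer, tf, lr]).fit(train)

# Label new messages using what was "learned" from past labels.
test = spark.createDataFrame([("click now for free prices",)], ["text"])
model.transform(test).select("text", "prediction").show()
```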

Clustering

If k-means clustering is the only thing out of someone’s mouth after you ask them about machine learning, you know that they just read the crib sheet and don’t know anything about it. If you take a set of attributes you may find “groups” of points that seem to be pulled together by gravity. Those are clusters. You can “see” these clusters but there may be clusters that are close together. There may be one big one and one small one on the side. There may be smaller clusters in the big cluster. Because of these and other complexities there are a lot of different “clustering” algorithms. Though different from classification, clustering is often used to sort people into groups. The big difference between “clustering” and “classification” is that we don’t know the labels (or groups) up front for clustering. We do for classification. Customer segmentation is a very common use. There are different flavors of that, such as sorting customers into credit or retention risk groups, or into buying groups (fresh produce or prepared foods), but it is also used for things like fraud detection. Here’s a course on Coursera with a lecture series specifically on clustering and yes, they cover k-means for that next interview, but I find it slightly creepy when half the professor floats over the board (you’ll see what I mean).
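Here is a minimal k-means sketch in the same spirit. The two “customer” features (annual spend and visits per month) and the choice of k=2 are invented for illustration; in practice you would experiment with k and with the feature set, since the clusters are only as good as the attributes you feed in.

```python
# Minimal sketch: k-means customer segmentation with Spark ML.
# The two-feature "customers" below are invented for illustration.
from pyspark.sql import SparkSession
from pyspark.ml.feature import VectorAssembler
from pyspark.ml.clustering import KMeans

spark = SparkSession.builder.appName("clustering-sketch").getOrCreate()

customers = spark.createDataFrame([
    (120.0, 2.0), (150.0, 3.0), (130.0, 2.5),     # low spend, infrequent
    (900.0, 12.0), (950.0, 14.0), (870.0, 11.0),  # high spend, frequent
], ["annual_spend", "visits_per_month"])

features = VectorAssembler(
    inputCols=["annual_spend", "visits_per_month"],
    outputCol="features").transform(customers)

# No labels up front: k-means just looks for k groups pulled together.
model = KMeans(k=2, seed=42).fit(features)
model.transform(features).select(
    "annual_spend", "visits_per_month", "prediction").show()
print("cluster centers:", model.clusterCenters())
```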

Collaborative filtering

Man, collaborative filtering is a popularity contest. The company I work for uses this to improve search results. I even gave a talk on this. If enough people click on the second cat picture it must be better than the first cat picture. In a social or e-commerce setting, if you use the likes and dislikes of various users, you can figure out which is the “best” result for most users or even specific sets of people. This can be done on multiple properties for recommender systems. You see this on Google Maps or Yelp when you search for restaurants (you can then filter by service, food, decor, good for kids, romantic, nice view, cost). There is a lecture on collaborative filtering from the Stanford Machine Learning course, which started on July 10 (but you can still get in).
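Below is a minimal sketch of collaborative filtering with Spark’s ALS recommender. The user/item/rating triples and the ALS parameters are placeholders rather than tuned values; think of the items as those cat pictures and the ratings as clicks or likes.

```python
# Minimal sketch: collaborative filtering with ALS in Spark ML.
# The (user, item, rating) triples and parameters are placeholders.
from pyspark.sql import SparkSession
from pyspark.ml.recommendation import ALS

spark = SparkSession.builder.appName("recommender-sketch").getOrCreate()

ratings = spark.createDataFrame([
    (0, 0, 5.0), (0, 1, 1.0),
    (1, 0, 4.0), (1, 2, 5.0),
    (2, 1, 2.0), (2, 2, 4.0),
], ["userId", "itemId", "rating"])

als = ALS(userCol="userId", itemCol="itemId", ratingCol="rating",
          rank=5, maxIter=10, regParam=0.1, coldStartStrategy="drop")
model = als.fit(ratings)

# Recommend the top 2 items for every user based on everyone's preferences.
model.recommendForAllUsers(2).show(truncate=False)
```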

This is not all you can do (by far), but these are some of the common uses, along with the algorithms to accomplish them. Within each of these broad categories there are often several alternative algorithms or derivatives of algorithms. Which to pick? Well, that’s a combination of mathematical background, experimentation, and knowing the data. Remember, just because you get the algorithm to run doesn’t mean the result isn’t nonsense.

If you’re new to all of this, then the Machine Learning Foundations course on Coursera is a good place to start — despite the creepy floating half-professor.

Source: InfoWorld Big Data

Online Tech Extends Hybrid Cloud Service

Online Tech has extended its hybrid cloud offering to support managed Microsoft Azure services. This extension is the first step for Online Tech towards integrating the public cloud into its suite of hybrid IT products, which includes colocation, virtual private cloud and data protection services.

This new service includes SprawlGuard™, the first-of-its-kind technology that will help clients manage their monthly cloud spend. It allows them to set a monthly spending budget, or target, and then alerts them when it looks like they will exceed that amount. In the future, it will help to identify unused or under-utilized resources so clients can be more efficient with their public cloud spend.

“Clients will now not only effortlessly manage their entire environment, but also achieve transparency and control over their expenses with SprawlGuard™,” said Nick Lumsden, vice president of technology and product strategy, Online Tech. “This will provide our clients with a single window into the cost drivers of their critical business systems, so they can harness the power of their multi-cloud strategy.”

Organizations thinking of adopting a multi-cloud strategy often face many challenges. Public cloud bills are frequently unpredictable and complex, and businesses find that they need to hire additional personnel to architect and support their solution.

Online Tech’s managed Azure services solve both challenges: Clients get a complete, fully managed public cloud that integrates with their existing virtual private cloud infrastructure within a single portal and with a simple bill. Their hybrid environment is fully managed by Online Tech’s Azure experts and support team, giving businesses additional depth to their cloud teams.

“One of the biggest challenges we’ve seen with our clients is the struggle to understand their public cloud bill,” said Kurt Schaldenbrand, product design manager, Online Tech. “We’ve effectively solved that challenge by offering our clients who have Azure a complete view into their total environment, where they receive a single bill for all services rendered, allowing them to easily track their spending and better manage costs.”

“We’re really excited that Online Tech is leading the pack when it comes to hybrid cloud,” said Scott Larsen, chief technology officer, Mingle Analytics. “This new service will allow us to easily manage both our public and private cloud and give us the support we need to meet the growing demands of our health care quality reporting business.”

Source: CloudStrategyMag

Engility Joins Cloud Security Alliance

Engility Holdings, Inc. has joined the Cloud Security Alliance, the world’s leading organization dedicated to defining and raising awareness of best practices to help secure cloud computing environments.

 “Joining CSA means our customers gain the best possible path forward when addressing cloud computing challenges,” said Gay Porter, vice president of Engility’s Technical Solutions Group. “We look forward to collaborating with others in the industry to maintain best practices and drive innovation and advancements around cloud security. We bring extensive systems engineering and enterprise modernization expertise and solutions, such as the recently introduced Cloud ASCEND™.”

Engility’s Cloud ASCEND™ combines state-of-the-art commercial cloud tools with a proprietary methodology and team of certified cloud experts to transform organizations’ digital enterprise. Scalable and tailorable, the tool designs secure and resilient cloud and hybrid enterprise architectures. These new architectures can be modeled using computer-aided design tools before any applications are actually migrated to the cloud. Cloud ASCEND also develops and tests systems bound for the Intelligence Community’s (IC) C2S cloud in an unclassified emulator, saving time and money while helping to design the optimal solution based on specific client needs.

The company helps government organizations with digital transformation. Engility combines state-of-the-art commercial tools with deep operational and mission understanding to deliver secure, modern IT infrastructure to its customers. For example, in 2016, Engility helped the Defense Information Systems Agency establish a Virtual Operating Environment that supports all unclassified laboratory needs within the Joint Interoperability Test Command.

“We’re pleased to welcome Engility to CSA. As a leading aerospace and defense company, Engility brings a fresh perspective and new, innovative tools and approaches to our organization and the clients we serve,” said Jim Reavis, CEO, Cloud Security Alliance.

Source: CloudStrategyMag

Apache Spark 2.2 gets streaming, R language boosts

With version 2.2 of Apache Spark, a long-awaited feature for the multipurpose in-memory data processing framework is now available for production use.

Structured Streaming, as that feature is called, allows Spark to process streams of data in ways that are native to Spark’s batch-based data-handling metaphors. It’s part of Spark’s long-term push to become, if not all things to all people in data science, then at least the best thing for most of them.

Structured Streaming in 2.2 benefits from a number of other changes aside from losing its experimental designation. It can now use Apache Kafka as either a source or a sink, reading data from and writing data to Kafka with lower latency for Kafka connections than before.

Kafka, itself an Apache Software Foundation project, is a distributed messaging bus widely used in streaming applications. Kafka has typically been paired with another stream-processing framework, Apache Storm, but Storm is limited to stream processing only, and Spark presents less complex APIs to the developer.

Structured Streaming jobs can now use Spark’s triggering mechanism to run a streaming job once and quit. Databricks, the chief commercial outfit supporting Spark development, claims this is a more efficient execution model than running Spark batch jobs intermittently.
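Putting those two pieces together, here is a minimal sketch of a Structured Streaming job that reads from Kafka and uses the run-once trigger. The broker address, topic name, and output paths are placeholders, and the job assumes the spark-sql-kafka-0-10 connector package is available on the classpath.

```python
# Minimal sketch: a Structured Streaming job that reads from Kafka and runs once.
# Broker address, topic name, and output paths are placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("kafka-once-sketch").getOrCreate()

# Kafka as a streaming source (requires the spark-sql-kafka-0-10 package).
events = (spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "broker:9092")
          .option("subscribe", "events")
          .load()
          .selectExpr("CAST(key AS STRING)", "CAST(value AS STRING)"))

# trigger(once=True) processes whatever is available, then stops:
# the run-it-once-and-quit execution model described above.
query = (events.writeStream
         .format("parquet")
         .option("path", "/tmp/events")
         .option("checkpointLocation", "/tmp/events-checkpoint")
         .trigger(once=True)
         .start())
query.awaitTermination()
```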

The native collection of machine learning libraries in Spark, MLlib, has been outfitted with new algorithms for tasks like performing PageRank on datasets, or running multiclass logistic regression analysis (e.g., which current hit movie will a person in various demographic categories probably like best?). Machine learning is a common use case for Spark. 

Machine learning in Spark also gets a major boost from improved support for the R language. Earlier versions of Spark had wider support for Java and Python than for R, but Spark 2.2 adds R support for 10 distributed algorithms. Structured Streaming and the Catalog API (used for accessing query metadata in Spark SQL) can now also be used from R.

Source: InfoWorld Big Data

IBM To Launch the Next Generation Processors In The Cloud

IBM has announced that it will be the first major cloud provider to launch bare metal servers powered by the next generation Intel Xeon Scalable processors globally on the IBM Cloud. New IBM Cloud bare metal servers powered by the Intel Xeon Silver 4110 processor and Intel Xeon Gold 5120 and 6140 processors will be designed to help deliver greater performance and generate faster insights from big data workloads, and will offer the global reach and security-rich environment of the IBM Cloud.

Data is quickly becoming one of the greatest competitive differentiators for enterprises across industries including financial services, manufacturing, and healthcare. IDC estimates that by 2020, organizations that are able to analyze all relevant data and deliver actionable information will achieve an extra $430 billion in productivity benefits over their peers [1]. As the volume of data grows rapidly, enterprises require higher levels of performance and efficiency to quickly and easily generate these valuable insights.

New IBM Cloud bare metal servers will be powered by the latest Intel Xeon Scalable processors and are designed to help enterprises run high performance computing workloads such as complex financial simulations, manufacturing design simulations, or genomic analysis faster than previous generation processors, and to decrease the time it takes to deliver insights from mission-critical data. According to Intel, the new processor technology can deliver insights up to 2.3 times faster for financial services workloads, up to 1.5 times faster for manufacturing workloads, and up to 1.7 times faster for life sciences workloads [2].

In addition to increased performance, the new bare metal deployment options will also provide a dedicated, security-rich environment that is highly customizable for a client’s most sensitive big data workloads. To help clients gain new insights into their data, the IBM Cloud provides seamless access to more than 160 APIs and services ranging from cognitive and analytics to blockchain and IoT. IBM is committed to delivering the fastest and most comprehensive technology to the cloud including POWER-based offerings for data-intensive workloads, and GPUs for cognitive and high performance computing. For enterprises leveraging private or hybrid cloud environments, IBM utilizes multiple platforms including IBM Power Systems and Z Systems.

“IBM Cloud provides the foundation that enterprises need so that they can fully harness data for better decision-making and transformative growth,” said John Considine, general manager for cloud infrastructure services, IBM. “The launch of Intel Xeon Scalable processors on the IBM Cloud is another milestone in IBM’s commitment to providing access to the latest infrastructure technology so clients can continue to generate greater value from their data.”

IBM Cloud bare metal servers powered by Intel Xeon Scalable processors will be available in IBM Cloud data centers in the United States, United Kingdom, Germany and Australia in Q3 2017.

 

1. IDC FutureScape: Worldwide Big Data and Analytics 2016 Predictions http://www.idc.com/getdoc.jsp?containerId=259835

2. Up to 1.6x gains based on the geomean of Weather Research Forecasting – Conus 12Km, HOMME, LSTC LS-DYNA Explicit, INTES PERMAS V16, MILC, GROMACS water 1.5M_pme, VASP Si256, NAMD stmv, LAMMPS, Amber GB Nucleosome, Binomial option pricing, Black-Scholes, Monte Carlo European options.

 

Source: CloudStrategyMag

Red Hat Introduces Next Generation Of OpenShift Online Public Cloud Offering

Red Hat, Inc. has introduced the next generation of Red Hat OpenShift Online, the industry’s first open source, container-native, multi-tenant cloud platform. Built on the same Linux container and Kubernetes foundation as the award-winning Red Hat OpenShift Container Platform, Red Hat OpenShift Online gives developers the ability to more quickly and easily build, deploy, and scale cloud-native applications in a public cloud environment.

Since OpenShift Online’s launch in 2011, it has hosted more than three million applications built by hundreds of thousands of individual developers, startups, educational institutions, ISVs, and enterprise organizations around the world, making it one of the industry’s most popular developer platforms for building any app, anywhere, at any scale on the public cloud.

OpenShift Online enables developers to build cloud-native apps on a cloud-based container platform without having to worry about the inherent complexity of provisioning, managing, and scaling applications as demands change. With operations and management provided by Red Hat in the public cloud, developers can focus on writing the code for their business, prototyping new features, or working on their next big idea, all in a self-service environment.

OpenShift is a polyglot platform that supports multiple languages, including Java, Node.js, .NET, Ruby, Python, PHP, and more. It offers optimized workflows to help configure and deploy applications on any given framework, such as Spring Boot, Eclipse Vert.x, Node.js, and Red Hat JBoss Middleware, helping developers start their projects more easily and begin coding faster.

Red Hat OpenShift Online delivers a world-class developer experience with new capabilities, including:

  • Simplified deployment – New one-click and “Git push” command deployment capabilities have been added to help streamline application provisioning and deployment for developers and sysadmins who do not need full control over the deployment lifecycle.
  • Automatic scaling – Cloud elasticity is enabled through automatic application scaling, which helps eliminate the need for manual intervention from Operations when an increase in application load requires more application instances.
  • S2I builds – The source-to-image (S2I) framework is used to build reproducible container images. This helps eliminate the need for developers to understand Docker or create and manage Docker images, reducing errors and enabling them to focus on writing their applications in the language of their choice.
  • IDE integration – The platform features built-in integration with popular integrated development environments (IDEs), including Eclipse, Red Hat JBoss Developer Studio, and Titanium Studio, enabling developers to stay within the IDE they are most comfortable with when working with OpenShift.
  • Middleware services – Red Hat OpenShift Application Services provide the powerful capabilities of products in the Red Hat JBoss Middleware portfolio as cloud-based services on OpenShift. These services can be used by developers to build applications, integrate with other systems, orchestrate using rules and processes, and then deploy across hybrid environments.

Source: CloudStrategyMag