Faction Partners With CoreSite

CoreSite Realty Corporation has announced that Faction has partnered with CoreSite in both the Northern Virginia and Silicon Valley markets to deliver expanded multi-cloud service offerings to its customers.

Faction chose to partner with CoreSite based on its scalable data center platform, strategic locations, and high-performance interconnection solutions, which provide low-latency connectivity to leading cloud and network service providers and let Faction optimize delivery of its multi-cloud services. Faction will leverage connectivity to the CoreSite Open Cloud Exchange to deliver high-performance, cost-effective VMware-based clouds with administrator-level access to VMware vCenter, offering unprecedented control, flexibility, and integration capabilities for hybrid and multi-cloud deployments.
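
To illustrate the kind of programmatic control that administrator-level vCenter access enables, here is a minimal sketch using pyVmomi, the open-source Python bindings for the vSphere API. The hostname and credentials are placeholders, and this is a generic example rather than Faction's actual integration.

```python
# Minimal sketch: enumerating VMs through vCenter with pyVmomi, the open-source
# Python SDK for the VMware vSphere API. Host and credentials are placeholders.
import ssl

from pyVim.connect import SmartConnect, Disconnect
from pyVmomi import vim

context = ssl._create_unverified_context()  # lab use only; verify certs in production
si = SmartConnect(host="vcenter.example.com",
                  user="administrator@vsphere.local",
                  pwd="secret",
                  sslContext=context)
try:
    content = si.RetrieveContent()
    # Walk the inventory and list every VM with its power state.
    view = content.viewManager.CreateContainerView(
        content.rootFolder, [vim.VirtualMachine], recursive=True)
    for vm in view.view:
        print(vm.name, vm.runtime.powerState)
    view.Destroy()
finally:
    Disconnect(si)
```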

Along with Faction’s leading enterprise-class private cloud and Veeam backup solutions, Faction is launching a multi-cloud NetApp storage solution and a Managed VMware on AWS offering, both enabled by the low-latency cloud on-ramps within the CoreSite portfolio.

“CoreSite provides the facilities, high-performance interconnection solutions and close proximity to the major public cloud providers, which are key to enabling our multi-cloud solutions and go-to-market strategy,” said Luke Norris, Chief Executive Officer at Faction. “We are committed to providing our customers with maximum control in an unmatched multi-cloud environment, and CoreSite enables us to deliver a differentiated customer experience.”

The CoreSite Silicon Valley market comprises seven operational data centers, providing colocation solutions to one of the largest concentrations of Internet and technology companies in the world. More than 185 international and national carriers, social media companies, cloud computing providers, media and entertainment firms, and enterprise customers connect to do business in CoreSite’s Silicon Valley data center market.

The CoreSite Northern Virginia data center market currently includes three highly scalable facilities — one in Washington, D.C. and two on its Reston, VA campus (VA1 and VA2). CoreSite recently announced the expansion of both its Reston and Washington, D.C. campuses, all of which will now total over 1,097,000 sq ft of colocation data center space upon full build out. CoreSite’s customer community includes a diverse mix of government, financial services and cloud service providers, as well as domestic and international networks providing a direct connection to U.S. and European markets. With the growing importance of Northern Virginia as a communications and enterprise hub, CoreSite’s Reston campus provides flexible colocation and hybrid-cloud deployment solutions for customers located in Washington, D.C. and the greater Northern Virginia area.

“We are pleased that Faction has partnered with CoreSite to expand the reach of their VMware-based private cloud and Veeam backup solutions. Additionally, their multi-cloud service offering is a valuable addition to the ecosystem, and through the CoreSite Open Cloud Exchange, Faction is able to deliver a true hybrid-cloud experience without compromising security or performance,” said Steve Smith, senior vice president, sales and marketing at CoreSite. “Faction provides enterprises with unique and customized cloud solutions built on a powerful and well-adopted VMware environment, and we look forward to supporting their future growth.”

Source: CloudStrategyMag

Stream Data Centers And Megaport Partner

Stream Data Centers has announced that it has signed a partnership agreement with Megaport (USA), Inc., the U.S. subsidiary of Megaport Limited to deliver secure, direct consumption-based connectivity services to enterprises.

The Megaport Software Defined Network gives Stream customers the ability to right-size their capacity by scaling bandwidth up and down under a consumption-based payment model. Customers can access multiple service providers across the ecosystem and manage direct, multi-cloud connectivity through the Megaport API or via the Megaport portal.
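
As a rough sketch of what consumption-based, API-driven connectivity management looks like, the snippet below scales a connection's bandwidth up for a transfer window and back down afterward. The base URL, endpoint path, payload fields, and auth scheme are hypothetical stand-ins, not Megaport's documented API.

```python
# Illustrative only: adjusting a connection's rate limit through a REST API.
# The endpoint path, payload fields, and auth header are hypothetical
# placeholders, not Megaport's actual API.
import requests

API_BASE = "https://api.example-sdn.com/v2"   # placeholder base URL
TOKEN = "..."                                  # placeholder API token

def set_bandwidth(connection_id: str, mbps: int) -> dict:
    """Request a new rate limit for an existing connection (pay-for-what-you-use model)."""
    resp = requests.put(
        f"{API_BASE}/connections/{connection_id}",
        headers={"Authorization": f"Bearer {TOKEN}"},
        json={"rateLimitMbps": mbps},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    # Scale up for a bulk transfer window, then back down to control cost.
    set_bandwidth("conn-123", 5000)
    set_bandwidth("conn-123", 500)
```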

“A primary mission for Megaport is to reduce the barrier of entry for enterprises adopting cloud services,” said Nicole Cooper, executive vice president, Americas, Megaport. “Our industry-leading SDN brings a powerful set of capabilities in which to execute a network and connectivity strategy that aligns to cost and performance goals. Stream’s U.S. data center portfolio complements our neutral networking model extremely well. It enables further expansion of our footprint in the United States as we help accelerate cloud service adoption in emerging cloud markets.”

“We are excited to partner with Megaport to enable flexible cloud connectivity for our enterprise customers,” said Eric Ballard, vice president of network and cloud at Stream Data Centers. “Megaport is a market leader in the elastic cloud connectivity space and is redefining how our customers can grow and scale cloud, multicloud, and hybrid cloud connectivity. The Megaport partnership enriches the value to our customers by bringing a substantial ecosystem of service providers to our industry-leading data centers. It expands the horizons of network connectivity and what we can offer our customers.”

Megaport brings direct, private connectivity to the top five cloud service providers. In particular, this partnership will take advantage of a new Microsoft Azure ExpressRoute gateway, which will offer a direct connection to Microsoft Azure services as well as a low-latency private link to customer resources located in the nearby Azure South Central region. All of these services will now be available at Stream’s San Antonio data center.

Ross Ortega, partner program manager, Microsoft Azure Networking, Microsoft Corp., said, “The cloud is driving transformation through new business models, global expansion and accelerated innovation, along with an enhanced customer experience. Collaborating with Megaport to bring ExpressRoute to customers is a core part of these attributes and values.”

The Megaport and Stream partnership will launch in San Antonio for customer availability in 4Q17. Following the launch, Stream and Megaport plan to continue expansion of their partnership to other Stream data centers in Houston and Minneapolis.

Source: CloudStrategyMag

IDG Contributor Network: Cloud data warehouse: The technology no one knows about

We’ve all heard of exciting new technologies in the data warehouse world—tools like Amazon Redshift, Google BigQuery, and more recently Azure SQL Data Warehouse. What would you call this category of tools?

Well, of course, “cloud data warehouse.” Check out the Google Trends graph for this search term. Explosive growth.

[Image: Google Trends graph for “cloud data warehouse” (credit: Gilad David Maayan)]

But look at this:

[Image: Google Trends comparison of “Amazon Redshift” vs. “cloud data warehouse” (credit: Gilad David Maayan)]

The red graph represents searches for “Amazon Redshift,” compared to “cloud data warehouse.” It is growing much more rapidly, and appears many times larger, than the category it represents.

In fact, according to Google, there are only approximately 300 people per month over the past year in the entire world who searched for the term “cloud data warehouse.” (By comparison, “Amazon Redshift” is searched 14,800 times per month worldwide.)
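
If you want to reproduce this kind of comparison yourself, the open-source pytrends package (an unofficial Google Trends client) makes it a few lines of Python; note that Trends reports relative interest on a 0-100 scale rather than the absolute monthly search counts quoted here.

```python
# Reproducing the search-interest comparison with pytrends, an unofficial
# Google Trends client (pip install pytrends). Values are relative (0-100),
# not absolute monthly search counts.
from pytrends.request import TrendReq

pytrends = TrendReq(hl="en-US", tz=0)
pytrends.build_payload(
    kw_list=["cloud data warehouse", "Amazon Redshift"],
    timeframe="today 12-m",
)
interest = pytrends.interest_over_time()
# Average relative interest over the past year for each term.
print(interest[["cloud data warehouse", "Amazon Redshift"]].mean())
```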

Ha, you might be thinking, “they must be searching for other things, maybe ‘data warehouse on the cloud.’” As someone who has done several rounds of market research in this field, I can tell you assertively there are no larger search terms that describe the category.

In fact, the category does not exist.

In search of a label

I know it sounds funny to say the cloud data warehouse category doesn’t exist. After all, the market is hot, and the tools are popular and growing rapidly. But the fact is that what comes to people’s minds is the tools, the brands—not the category.

But wait a minute. Data warehouses have been around for ages. As far back as the 19th century, Thomas Edison stored the results of his electricity experiments in a (legacy) data warehouse, installed on the highly scalable Kinetoscope Platform in his Menlo Park laboratory.

So surely there must be searches for just “data warehouse?”

[Image: Google Trends comparison of “data warehouse” vs. “amazon redshift” (credit: Gilad David Maayan)]

Yes, there are, though they’re declining. But look at the comparison between “data warehouse” in blue and “amazon redshift” in red. Redshift’s 14,800 searches are a drop in the ocean compared to “data warehouse.” That term alone is searched 90,500 times per month globally, and there are many other related search terms. “Data warehouse” is huge.

‘Plain’ data warehouse is much bigger than cloud

Let me show you another data point. I went through the laborious exercise of gathering all the possible search terms people have used recently around “plain” data warehouse vs. the three leading cloud data warehouses (hey, I used that phrase! That’s 301 mentions worldwide).

We’re not just talking about the brand name “Amazon Redshift” or “BigQuery” but any possible combination – what is Redshift, Redshift architecture, Redshift clusters, etc.

Excluding Redshift the cosmological phenomenon of course.

Here are the results:

[Image: search volume comparison of “data warehouse” vs. the leading cloud data warehouses (credit: Gilad David Maayan)]

In words: every month, around half a million people, probably wannabe data engineers, express interest in “data warehouse” or some variation of it. Compare that to 38,000 searching for Redshift, 26,700 for BigQuery, and a measly 13,000 for Microsoft’s SQL Data Warehouse.

Opium for the masses

Hundreds of thousands of people search for “data warehouse.” They are only now making their first steps in the data warehouse world. They want to learn basic things like what data warehouses are for, how they work, how much they cost. But their eyes are closed to the truth of cool new products and architectures.

This is what Google gives them (see diagram #1). Be warned: it’s not a pretty sight. A circa-2008 diagram (yes, I checked) of the “Enterprise Data Warehouse.”

Wow. That’s miles away from the shiny technology from Amazon and friends (see diagram #2).

Those half a million people will never (well, not really, but please allow me some dramatic effect) see diagram #2. They’ll see diagram #1 and move on to read about solutions like Oracle and Teradata. Not to knock those products, they’re great. But how many of this audience could be interested in the new generation of tools on the cloud with unlimited scalability and blazing fast query speeds?

There’s money lying on the table

Let’s sum it up:

  • Apparently no one is aware of a category called “cloud data warehouse”
  • Lots of people know about specific brands like Redshift
  • But there are about 10X more people who don’t; they simply search for “data warehouse”
  • What they get is old architectures and old-guard solutions; Redshift, Google, Microsoft, and friends aren’t there
  • If they were there, maybe their market share would be 5X by now

Does that make sense to you? Let me know in the comments. IMHO there is a huge missed opportunity here, a market education gap that none of the big players has noticed or seems to care about: hundreds of thousands of people are lost in Thomas Edison Land with little chance of graduating to the new and cool.

This creates a lot of room for smaller players, such as Snowflake, which is going head to head with the big players, and Panoply’s self-optimizing data warehouse, which differentiates itself by making data ingestion, preparation, and query optimization much easier (disclaimer: I am an advisor for Panoply). They can, and will, use this opportunity to grab market share from the unsexy “old guard” instead of fighting very hard to grab it from Amazon, Google, and Microsoft.

This article is published as part of the IDG Contributor Network. Want to Join?

Source: InfoWorld Big Data

MapR Orbit Cloud Suite Extends Analytics And Applications Across Clouds

MapR Technologies, Inc. has announced the MapR Orbit Cloud Suite. Offering a comprehensive set of cloud computing capabilities for the MapR Converged Data Platform, MapR Orbit enables organizations, for the first time, to build data fabrics that manage data across one or more clouds, hybrid clouds, or out to the edge. It also includes advanced features specifically useful to cloud builders and cloud service providers.

Organizations today want the economics and business agility the cloud promises, but they encounter several challenges: the “component” approach to building applications, which draws on multiple services and has led to a crisis of complexity; incompatibility between cloud providers, which creates lock-in; the inability of cloud services to fully reach the edge, resulting in data silos; and the ongoing balance between in-country cloud requirements and geographically distributed data and applications.

MapR is addressing these cloud challenges with the new MapR Orbit Cloud Suite. Optimizing the MapR Converged Data Platform, MapR Orbit simplifies analytics and application deployment across clouds and takes advantage of business agility and cloud-scale economics such as object tiering.

“In addition to being a global cloud IaaS provider, Outscale offers the entire MapR Converged Data Platform to provide fast access to data stored in files, databases and event streams for performing real-time analysis on business critical, operational applications,” said Rob Rosborough, CEO, Outscale Inc. “Whether a business seeks to deploy MapR as a Service or stand up a business continuity solution, Outscale can deliver a public, hybrid or private Cloud solution through our network of global data centers. Many of our customers already benefit from existing multi-tenancy capabilities provided by MapR. We look forward to leveraging future innovations from MapR in this area and others, as they become available.”

The new MapR Orbit Cloud Suite is designed to address complexities for a variety of cloud use cases, including the following four major use cases:

Cloud-Native Data Management and Object Tiering to Public Cloud: MapR Orbit includes new product capabilities and offerings for companies looking to deploy MapR in a public cloud like AWS or Azure.

  • Object tiering for the automatic, secure archiving of data to cloud object stores based on “data temperature,” while keeping the metadata in the MapR global namespace so that it is available for any application, anywhere (a minimal tiering sketch follows this list).
  • Cloud-native cluster management for seamless operations of MapR in the cloud, including installation and cluster scaling, uniquely customizable to support any cloud architecture, capability, or provisioning framework.
  • Integrations with major cloud object storage systems, including AWS S3, Azure Blob Store, OpenStack Swift, and Google Cloud Storage, to simplify the movement of data between the MapR platform and native cloud stores.
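
As a rough illustration of the object-tiering idea, here is a minimal sketch that treats "data temperature" as simple file age and moves cold files to an S3 bucket with boto3. The paths, bucket name, and policy threshold are placeholders; MapR Orbit's actual tiering is built into the platform rather than scripted like this.

```python
# Illustrative age-based tiering job, assuming "data temperature" is simply
# last access time: files untouched for N days move to a cloud object store.
# Generic sketch using boto3/S3, not MapR Orbit's actual mechanism.
import os
import time

import boto3

COLD_AFTER_DAYS = 90
HOT_DIR = "/mapr/cluster/projects"   # placeholder source path
BUCKET = "example-cold-tier"         # placeholder S3 bucket

s3 = boto3.client("s3")
cutoff = time.time() - COLD_AFTER_DAYS * 86400

for root, _dirs, files in os.walk(HOT_DIR):
    for name in files:
        path = os.path.join(root, name)
        if os.path.getatime(path) < cutoff:       # "cold" by last access time
            key = os.path.relpath(path, HOT_DIR)  # preserve namespace layout
            s3.upload_file(path, BUCKET, key)
            os.remove(path)                       # local copy replaced by the object
            print(f"tiered {path} -> s3://{BUCKET}/{key}")
```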

Build Multi-Cloud/Hybrid Data Fabrics: MapR Orbit also includes rich capabilities for synchronizing data between different cloud providers or between on-premises installations and the cloud. These features let customers build distributed, global applications and data fabrics for analytics, operations, and streaming.

Seamless Bridge to the Edge: Edge-to-Cloud file migration to automatically move files from MapR Edge to the cloud in real-time to power hybrid edge/cloud applications. MapR Orbit Cloud Suite includes capabilities for gathering and processing data close to the source, while moving and replicating this data to cloud or on-premises environments.

A Single Platform to Build Converged Clouds: MapR Orbit empowers companies building private clouds or regional public clouds to provide cross-cloud operational, analytic, and streaming services to their own customers. Cloud-scale multi-tenancy allows cloud builders to provide converged data services to all end customers on a single shared platform, increasing resource utilization and simplifying operations. In addition, a plugin for OpenStack Manila allows the OpenStack cloud operating system to provision MapR file resources in a multi-tenant way, enabling MapR to be the file storage medium for private or public clouds.

“After speaking to numerous customers over the years, it’s clear to me that the growth of data in the cloud has its challenges. MapR lays the foundation to remove the barriers to scale and is designed to enable companies to bring their cloud strategies to reality,” said Tom Fisher, CTO, MapR Technologies. “With MapR Orbit Cloud Suite, data access across multiple clouds and cloud types, as well as edge computing, is simply treated as another seamless tier for analytics. The future is leveraging data, no matter where it is persisted, and using it in real time for delivery through the next generations of applications.”

Source: CloudStrategyMag

iland Expands DRaaS Services To Support VMware vCloud Availability

iland has introduced iland Secure DRaaS powered by VMware. With this new service, iland adds support of VMware vCloud® Availability for vCloud Director® to their mature portfolio of DRaaS services.

Faced with ongoing and increasing risks to IT systems from cyber-attacks such as ransomware and other threats, having a secure and robust disaster recovery solution in place is a priority for many companies. The benefits of using familiar VMware technology and management frameworks in the cloud enhance the experience of iland’s IaaS customers, and will now extend to disaster recovery solutions with this new offering from iland and VMware.

“iland has always been an early adopter of VMware’s cloud-enabled technologies. Once again, they have demonstrated the value of the VMware Cloud Provider Program by becoming one of the first cloud providers to offer support of VMware vCloud Availability,” said Geoff Waters, vice president, Global Cloud Sales, VMware. “VMware vCloud Availability enables VMware Cloud Providers to offer simple, cost-effective cloud-based disaster recovery services that seamlessly support their customers’ VMware vSphere® environments by leveraging native vSphere replication capabilities.”

iland Secure DRaaS powered by VMware is designed for organizations running on-premises vSphere environments or outsourced private clouds with other cloud providers that want to natively protect their virtual workloads in the cloud by leveraging VMware vSphere replication. In the event of a live or test failover, customers can view and manage their workloads through a single pane of glass in iland’s Secure Cloud Console.

“With over ten years of experience providing DRaaS solutions, we’re always looking to expand our solutions portfolio to reach new customers and help them to protect their IT environments,” said Justin Giardina, chief technology officer, iland. “As one of the first VMware Cloud Providers, we are expanding our disaster recovery footprint to incorporate a new VMware offering designed specifically for VMware Cloud Providers. iland Secure DRaaS powered by VMware leverages vCloud Availability, enabling VMware vCenter® server based organizations to protect VMs from their local datacenter to a cloud, natively integrating the replication with the underlying virtualization technology.”

vCloud Availability leverages native VMware vSphere Replication™ to track block changes to the virtual disks (VMDKs) of running virtual machines and replicate them at regular intervals to the target environment, depending on the required RPO. Using vCloud Availability, customers can achieve RPOs as low as 15 minutes to recover applications and maintain business continuity.
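
To make the RPO figure concrete, here is a minimal sketch of an RPO compliance check: each VM's data loss window is the time since its last successful replication, and any window beyond the target RPO is flagged. The timestamps are illustrative; a real deployment would pull them from the replication service's reporting.

```python
# Minimal RPO compliance check: flag any VM whose potential data loss window
# (time since last successful replication) exceeds the required RPO.
from datetime import datetime, timedelta, timezone

RPO = timedelta(minutes=15)  # target recovery point objective

# Illustrative last-sync timestamps; real values would come from the
# replication service's API or logs.
last_sync = {
    "web-01": datetime.now(timezone.utc) - timedelta(minutes=9),
    "db-01":  datetime.now(timezone.utc) - timedelta(minutes=22),
}

now = datetime.now(timezone.utc)
for vm, ts in last_sync.items():
    lag = now - ts
    status = "OK" if lag <= RPO else "RPO VIOLATION"
    print(f"{vm}: last sync {lag.total_seconds() / 60:.0f} min ago -> {status}")
```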

Customers that operate on-premises or in a co-located VMware ecosystem may not have a disaster recovery plan in place, or may lack a secondary location for that purpose. The challenge around disasters, whether a natural catastrophe, system failure, or even malicious software, isn’t keeping secure copies of information, but being able to react quickly to lessen the impact of an outage. DR testing is a critical component of a well-designed business continuity plan, and iland Secure DRaaS powered by VMware will allow for non-intrusive, self-service testing. The benefits are twofold: organizations can test more frequently without disrupting business operations, and they gain more confidence in a well-tested disaster recovery solution.

iland Secure DRaaS powered by VMware Features:

  • Support for full failover, partial failover, and failback
  • Encryption in-motion and at-rest
  • Straightforward implementation using VMware vSphere
  • Simple pricing model based on the amount of replicated storage and number of VMs, with hourly compute rates upon failover
  • Replication to and from the cloud, including iland and VMware Cloud™ on AWS
  • Simplified on-premises installation using a single virtual appliance (vSphere Replication)
  • Simplified built-in connectivity between the tenant and iland (no need for VPN)

Analyst recognition of iland as a Leader in DRaaS:

  • iland was positioned by Gartner Inc. in the “Leaders” quadrant of the June 2017 “Magic Quadrant for Disaster Recovery as a Service.” In the Magic Quadrant for this market sector, Gartner analysts evaluated 23 service providers offering DRaaS based on the criteria of ‘ability to execute’ and ‘completeness of vision.’
  • iland is also ranked as a leader in Forrester Research, Inc.’s report entitled, “The Forrester Wave™: Disaster-Recovery-As-A-Service Providers, Q2 2017.” In addition to being recognized for “an impressive roadmap”, iland received the highest scores possible in the categories of data security, pricing, service levels and contract terms.

Source: CloudStrategyMag

Rackspace To Deliver Fanatical Support® For VMware Cloud On AWS

Rackspace® has announced from VMworld® in Las Vegas that it intends to offer Fanatical Support® for VMware Cloud on AWS through VMware’s Managed Service Provider (MSP) program early next year.

VMware Cloud on AWS brings VMware’s enterprise class Software-Defined Data Center software to the AWS Cloud, and enables customers to run applications across VMware vSphere®-based private, public and hybrid cloud environments, with optimized access to AWS services. Delivered, sold and supported by VMware as an on-demand and subscription service, IT teams manage their cloud-based resources with familiar VMware tools, and work with their partners of choice such as Rackspace.

With this announcement, Rackspace will build on its relationship with VMware to offer mutual customers:

  • Multi-Cloud Choice: Mutual customers will have more choice across multiple cloud environments. Rackspace will help enable mutual customers to run their VMware workloads out of the datacenter and in the best-fit location, whether in Rackspace datacenters or VMware Cloud on AWS.
  • VMware and AWS Expertise: Rackspace is recognized as a global, at-scale provider of managed and professional services for VMware and AWS. The company is a leading VMware Cloud Provider Partner and runs one of the largest vSphere footprints in the world. In addition, Rackspace has achieved Premier Consulting Partner status, the highest tier within the AWS Partner Network (APN), and is an audited AWS Managed Service Partner for its Fanatical Support for AWS offering.
  • Managed Services: Through Fanatical Support for VMware Cloud on AWS, Rackspace will provide architecture, provisioning and management guidance, as well as assist customers with capacity management and workload mobility between Rackspace data centers and AWS data centers.

“With the same architecture and operational experience on-premises and in the cloud, customers can quickly derive business value from a hybrid cloud experience based on AWS, Rackspace and VMware,” said Ajay Patel, senior vice president and general manager of Product Development for Cloud Services at VMware. “Rackspace has extended its Fanatical Support to VMware technologies for more than 10 years, and VMware Cloud on AWS presents another opportunity to help mutual customers architect, manage and optimize their existing and future VMware-based applications. We’re excited to take this next step and collaborate with Rackspace as they grow and evolve with VMware to meet our mutual customers’ needs.”

“VMware Cloud on AWS is a strong collaboration between two of the leading technology providers in the industry, and will deliver high value to customers who want to run their VMware workloads in the best-fit environment,” said Peter FitzGibbon, vice president and general manager of VMware at Rackspace. “Operating across multiple cloud deployments is relatively new to many organizations, however, and some mutual customers will want support to operate VMware Cloud on AWS effectively. As a leading VMware Cloud Provider Partner and a Premier Consulting Partner in the AWS Partner Network, Rackspace is uniquely positioned to provide multi-cloud expertise and identify customer needs. We’re excited to deliver Fanatical Support for VMware Cloud on AWS to customers and look forward to continuing our work with VMware to support the managed service provider model for VMware Cloud on AWS.”

Source: CloudStrategyMag

Trend Micro Announces Support for VMware Cloud On AWS

Trend Micro Incorporated has announced its market-leading Trend Micro Deep Security™ server security product is available to customers of VMware Cloud™ on AWS. VMware Cloud on AWS brings together VMware’s enterprise-class Software-Defined Data Center (SDDC) software and the dedicated, elastic, bare-metal infrastructure of Amazon Web Services (AWS) to give organizations a consistent operating model and application mobility for private and public cloud. The addition of Deep Security provides seamless visibility and security for virtualized workloads across the SDDC, whether on-premises or in the new VMware Cloud on AWS environment.

Organizations undergoing cloud transformation projects can face a range of operational challenges and regulatory considerations that must be addressed. Trend Micro enables security tool consolidation and streamlined regulatory compliance with a single security product that can consistently secure workloads across the hybrid cloud, including physical, virtual, cloud, and container environments. Leveraging multiple security techniques and deep integration with VMware and AWS, workloads can be automatically secured as they are launched, including vulnerability scanning and policy application to ensure that attacks like WannaCry and Erebus won’t be successful. With a consistent approach to securing all environments, operational costs can be reduced while taking advantage of the cost benefits of the cloud.

“Trend Micro’s long history of innovation with both VMware and AWS makes us well-suited to provide security for our mutual customers’ data center and cloud workloads,” said Steve Quane, executive vice president of network defense and hybrid cloud security for Trend Micro. “With millions of secured VMs and nearly 2.5 billion protection hours in the cloud, we have unparalleled experience and expertise to help our customers as they take advantage of the new VMware Cloud on AWS offering.”

VMware Cloud on AWS technology partners enable customers to deploy the same proven solutions seamlessly in both the public and private cloud. VMware simplifies deployment and eliminates the need for partners to refactor solutions for VMware Cloud on AWS. If a partner solution works on-premises in a VMware vSphere® environment, it will easily support VMware Cloud on AWS. VMware technology partners complement and enhance native VMware Cloud on AWS service and enable customers to realize new capabilities.

“VMware Cloud on AWS provides customers a seamlessly integrated hybrid cloud offering that gives them the SDDC experience from the leader in private cloud, running on the leading public cloud provider, AWS,” said Mark Lohmeyer, vice president, products, Cloud Platforms Business Unit, VMware. “Solutions such as Trend Micro’s Deep Security enable IT teams to reduce cost, increase efficiency, and create operational consistency across cloud environments. We’re excited to work with partners such as Trend Micro to enhance native VMware Cloud on AWS capabilities and empower customers with flexibility and choice in solutions that can drive business value.”

Source: CloudStrategyMag

IDG Contributor Network: Letting go: trusting AI to do its thing while humans do theirs

Not even a year ago, businesses across industries were still fairly united in their skepticism of artificial intelligence. It took Salesforce announcing its AI platform Einstein for them to really take note of and accept that AI was not only officially here, but here to stay.

Businesses that have since adopted AI quickly realized that part of the unwritten contract is surrendering control of data-related tasks and decision-making to a machine—either at the tactical or entire process level. They’re also learning to let go of their need to interpret and act on data insights. As a result, the decisions they’re making post-AI adoption are drastically different in nature than the ones they were making pre-AI. But somewhere in between AI adoption and full AI integration, these companies are having to cope with their fear of relinquishing control to a machine.

This fear has been here since day one, but as businesses become more educated about how AI works, the “letting go” narrative is now more prolific than ever—even among those who are already using AI to unprecedented success.

The fact of the matter is that humans, by our very nature, want to be in control. I’m sure there’s some Darwinian self-preservationist reasoning behind our need to carefully choreograph the elements around us; something to do with eliminating threat and carrying on as a species. But when it comes to surrendering control to AI—specifically as it relates to machines that automate overwhelmingly complex, data-oriented business processes—it’s likely that our ability to surrender control will influence our ongoing success, rather than prevent it.

In business, AI’s role is to maximize our productivity by taking away the minutiae created by data, freeing humans to work on higher-level strategic tasks, thereby making businesses and the individuals behind them “fitter”—in the Darwinian sense—over time.

It’s important to recognize that many people’s resistance to giving up control isn’t just a garden-variety power struggle or even a need to micromanage a given situation. It’s a matter of being cautious and establishing a foundation of trust, rather than going in blindly. Having worked with dozens of brands at varying phases of the AI adoption process, I’ve seen a few common themes invariably unfold among them, offering insight into how man and machine can effectively work together, and how to make the ‘letting go’ process easier.

Humans don’t want to do robots’ jobs—or for robots to do theirs

To opponents of AI, the machine is the competition. From this perspective, AI is either going to take man’s jobs completely, or AI is a literal competitor that must be competed with (and won against). The current trajectory of AI adoption in business reveals neither to be true.

Businesses that take a hybrid approach to the division of responsibilities are seeing better results from AI and humans working together than from either working independently. When it comes to data gathering, analysis, and insights, man trying to keep up with robots is a losing proposition. Man can attempt to become robot, or robot can be forced to adopt the unique characteristics of man: creativity, reasoning, emotion, and intuition. But the better approach is for them to work together and produce the best possible outcome.

Humans need proof, and quickly

In my experience working specifically with marketers, they are very interested in AI when they’re introduced to it through the lens of the day-to-day tasks it will alleviate for them. Their enthusiasm dissipates, however, when they realize that the AI tool taking over these tasks is going to do it in a way they don’t understand. They don’t want to actively manage and implement the nitty-gritty tasks, but in order to give them up they need to know that the technology will execute those tasks better than they or their teams can.

One way of combating these fears is to introduce highly targeted, quick-turnaround programs as trials that allow the AI to show what it’s made of. (Depending on the solution, “quick” could be a weekend or six months.) The sooner the machine can demonstrate its ability to not only understand what seems to be a complex problem but also out-produce man’s ability to solve it manually, the sooner humans will be able to relinquish control.

Transparency into what the machine is doing is imperative

Giving up day-to-day execution is a lot easier for humans than giving up their decision-making privileges. For marketers, relying on a wholly autonomous AI to process data and then act on its insights without bothering to ask for its human colleague’s thoughts on approach is the biggest obstacle to letting go of control.

This is important for AI providers to consider as they design the outputs shared with businesses. As it turns out, humans don’t necessarily need or want to be involved in every decision their AI is making, but they do want to understand how and why the AI made them. It would, of course, be impossible for a human to keep up with the pace of a machine’s decision-making, but making them privy to key insights goes a long way in creating trust between robot and man. It also creates a foundation for collaboration and idea sharing, as the human learns from the AI’s insights and is able to complement the AI’s work as a result.

Ultimately, it is this type of collaboration that makes man and machine allies rather than competitors. It’s also what establishes trust, so that humans can rest assured they’re evolving, rather than endangered.

This article is published as part of the IDG Contributor Network. Want to Join?

Source: InfoWorld Big Data

Navisite Launches VMware vCloud Availability Program At VMworld

Navisite is launching managed services for VMware-native, cloud-based replication and disaster recovery capabilities, built on VMware vSphere and VMware vCloud Availability for vCloud Director, at VMworld, Aug. 28-31. This complements Navisite’s recent launch of the VMware NSX network and security virtualization platform.

With Navisite’s VMware cloud services running VMware vCloud Availability for vCloud Director, Navisite clients will be able to test replicated environments and move applications seamlessly, all while production is actively running.

“Our clients have been looking for a native solution to replicate and migrate protected virtual machines to the cloud — without the need for costly dedicated hardware or complicated third-party solutions,” said Sumeet Sabharwal, group vice president and general manager, Navisite. “The spike in recent cyberattacks and ransomware has heightened interest in cloud replication disaster recovery implementations. Navisite’s VMware vCloud Availability for vCloud Director-based replication service provides clients with a simple and cost-effective solution to enable availability of their VMware environments.”

Navisite’s VMware vCloud Availability for vCloud Director solutions offer a variety of benefits for clients, including:

  • On-premises installation and complete VMware compatibility: Using a Navisite VMware Cloud as a vCloud Availability for vCloud Director replication target, clients can replicate virtual machines (VMs) from primary VMware vSphere environments to a highly scalable, geographically-diverse site, based in one of Navisite’s enterprise-grade data centers.
  • Simplified failover switch capabilities: Clients can gain access to a remote site for disaster recovery without the painful process of configuring a VPN. After an interruption at a client’s primary site, users can simply initiate a failover switch to keep business applications running in a remote Navisite environment. Once the primary site is restored, clients can then initiate a failback to return to running applications as usual. Leveraging a distributed architecture provides Navisite clients with unique failover testing capabilities.
  • VMware-native solution with enhanced functionality: Unlike alternative third-party replication tools that can be costly and difficult to integrate, vCloud Availability for vCloud Director is a native solution developed with the full support of the vSphere stack.

“Navisite’s innovative program gives their clients access to VMware’s latest replication and Network Virtualization technologies,” said Ajay Patel, senior vice president, general manager, Cloud Provider Software Business Unit, VMware. “Navisite’s vCloud Availability for vCloud Director solution offers clients an end-to-end, VMware-native solution that can surpass existing third-party data protection solutions.” 

Clients are already taking advantage of this cloud replication and Disaster Recovery as a Service (DRaaS) solution. One example is Ceridian, a global human capital management company that has turned to Navisite to help meet the needs of its changing business model.

“As the only provider in the U.S. with a VMware vCloud Availability replication offering, Navisite is exceeding our expectations yet again,” said Warren Perlman, CIO, Ceridian. “Their new multi-cloud solution combines the power of VMware NSX and vCloud Availability for vCloud Director to deliver seamless and rapid recovery time in the event of a disaster, allowing us to keep the focus on our core business.”

Navisite is a Gold Sponsor at VMworld 2017, the industry’s largest virtualization and cloud computing event, taking place August 28-31 in Las Vegas. Navisite will be discussing and demonstrating its VMware expertise along with new VMware vCloud Availability for vCloud Director at booth #212 during the conference.

Source: CloudStrategyMag

What is data mining? How analytics uncovers insights

Organizations today are gathering ever-growing volumes of information from all kinds of sources, including websites, enterprise applications, social media, mobile devices, and increasingly the internet of things (IoT).

The big question is: How can you derive real business value from this information? That’s where data mining can contribute in a big way. Data mining is the automated process of sorting through huge data sets to identify trends and patterns and establish relationships, to solve business problems or generate new opportunities through the analysis of the data.

It’s not just a matter of looking at data to see what has happened in the past to be able to act intelligently in the present. Data mining tools and techniques let you predict what’s going to happen in the future and act accordingly to take advantage of coming trends.

The term “data mining” is used quite broadly in the IT industry. It is often applied to a variety of large-scale data-processing activities such as collecting, extracting, warehousing, and analyzing data. It can also encompass decision-support applications and technologies such as artificial intelligence, machine learning, and business intelligence.

Data mining is used in many areas of business and research, including product development, sales and marketing, genetics, and cybernetics—to name a few. If it’s used in the right ways, data mining combined with predictive analytics can give you a big advantage over competitors that are not using these tools.

Deriving business value from data mining

The real value of data mining comes from being able to unearth hidden gems in the form of patterns and relationships in data, which can be used to make predictions that can have a significant impact on businesses.

For example, if a company determines that a particular marketing campaign resulted in extremely high sales of a particular model of a product in certain parts of the country but not in others, it can refocus the campaign in the future to get the maximum returns.

The benefits of the technology can vary depending on the type of business and its goals. For example, sales and marketing managers in retail might mine customer information to improve conversion rates in different ways than those in the airline or financial services industries.

Regardless of the industry, data mining that’s applied to sales patterns and client behavior in the past can be used to create models that predict future sales and behavior.
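
As a toy illustration of that idea, the sketch below fits a regression model on two years of synthetic monthly sales and predicts the next six months. In practice, the inputs would come from the warehouse and the model would be far richer.

```python
# Toy sales-prediction example: fit a model on past monthly sales and predict
# the next months. The data is synthetic; real inputs would come from the
# data warehouse.
import numpy as np
from sklearn.linear_model import LinearRegression

months = np.arange(1, 25).reshape(-1, 1)  # two years of monthly history
# Synthetic upward trend with noise standing in for real sales figures.
sales = 100 + 5 * months.ravel() + np.random.default_rng(0).normal(0, 8, 24)

model = LinearRegression().fit(months, sales)
future = np.arange(25, 31).reshape(-1, 1)  # the next six months
print(model.predict(future).round(1))
```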

There’s also the potential for data mining to help eliminate activities that can harm businesses. For example, you can use data mining to enhance product safety, or detect fraudulent activity in insurance and financial services transactions.

The applications of data mining

Data mining can be applied to a variety of applications in virtually every industry.

  • Retailers can deploy data mining to better identify which products people are likely to purchase based on their past buying habits, or which goods are likely to sell at certain times of the year. This can help merchandisers plan inventories and store layouts.
  • Banks and other financial services providers can mine data related to their clients’ accounts, transactions, and channel preferences to better meet their needs. They can also gather and then analyze data from their websites and social media interactions to help increase the loyalty of existing customers and attract new ones.
  • Manufacturing companies can use data mining to look for patterns in the production process, so they can precisely identify bottlenecks and flawed methods and find ways to increase efficiencies. They can also apply knowledge from data mining to the design of products, and make tweaks based on feedback from customer experiences.
  • Educational institutions can benefit from data mining such as analyzing data sets to predict the future learning behaviors and performance of students, and then using this knowledge to make improvements in teaching methods or curricula.
  • Health care providers can mine and analyze data to determine better ways of delivering care to patients and cutting costs. With the help of data mining, they can predict how many patients they will need to care for and what type of services those patients will need. In the life sciences, mining can be used to glean insights from massive biological data, to help develop new medicines and other treatments.
  • In multiple industries, including health care and retail, you can use data mining to detect fraud and other abuses—much more quickly than with traditional methods for identifying such activities.

The key components of data mining

The process of data mining includes several distinct components that address different needs:

  • Preprocessing. Before you can apply data mining algorithms, you need to build a target data set. One common source for data is a data mart or warehouse. You need to perform preprocessing to be able to analyze the data sets.
  • Data cleansing and preparation. The target data set must be cleaned and otherwise prepared, to remove “noise,” address missing values, filter outlying data points (for anomaly detection) to remove errors or do further exploration, create segmentation rules, and perform other functions related to data preparation.
  • Association rule learning (also known as market basket analysis). These tools search for relationships among variables in a data set, such as determining which products in a store are often purchased together.
  • Clustering. This feature of data mining is used to discover groups and structures in data sets that are in some way similar to each other, without using known structures in the data.
  • Classification. Tools that perform classification generalize known structures to apply to new data points, such as when an email application tries to classify a message as legitimate mail or spam (a minimal example of classification and clustering appears after this list).
  • Regression. This data mining technique is used to predict a range of numeric values, such as sales, housing values, temperatures, or prices, when given a particular data set.
  • Summarization. This technique provides a compact representation of a data set, including visualization and report generation.
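
To make the clustering and classification components concrete, here is a minimal, self-contained scikit-learn sketch on synthetic data; a real pipeline would add the preprocessing and cleansing steps described above.

```python
# Minimal illustrations of two data mining components using scikit-learn on
# synthetic data: clustering (no labels) and classification (generalizing
# known labels to new points).
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs, make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Clustering: discover groups in the data without using known structure.
X_blobs, _ = make_blobs(n_samples=300, centers=3, random_state=0)
clusters = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X_blobs)
print("cluster sizes:", np.bincount(clusters))

# Classification: learn from labeled examples (e.g., spam vs. legitimate mail)
# and generalize to unseen points.
X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
print("held-out accuracy:", round(clf.score(X_te, y_te), 3))
```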

Dozens of vendors provide data mining software tools, some offering proprietary software and others delivering products via open source efforts.

Among the key vendors that offer proprietary data-mining software applications are Angoss, Clarabridge, IBM, Microsoft, Open Text, Oracle, RapidMiner, SAS Institute, and SAP.

Organizations that provide open source data mining software and applications include Carrot2, Knime, Massive Online Analysis, ML-Flex, Orange, UIMA, and Weka.

The risks and challenges of data mining

Data mining comes with its share of risks and challenges. As with any technology that involves the use of potentially sensitive or personally identifiable information, security and privacy are among the biggest concerns.

At a fundamental level, the data being mined needs to be complete, accurate, and reliable; after all, you’re using it to make significant business decisions and often to interact with the public, regulators, investors, and business partners. Modern forms of data also require new kinds of technologies, such as for bringing together data sets from a variety of distributed computing environments (aka big data integration) and for more complex data, such as images and video, temporal data, and spatial data.

Getting the right data and then pulling it together so it can be mined isn’t the end of the challenge for IT. The cloud, storage, and network systems need to enable high performance of the data mining tools. And the resulting information from the data mining needs to be presented clearly to the wide range of users expected to act on and interpret it. You’ll need people with skills in data science and related areas.

From a privacy standpoint, the idea of mining information that relates to how people behave, what they buy, what websites they visit, and so on can set off concerns about companies gathering too much information. That affects not just your technological implementation but your business strategy and risk profile.

Beyond the ethics of tracking individuals so thoroughly, there are also legal requirements about how data can be gathered, identified to a person, and shared. The United States’ Health Insurance Portability and Accountability Act (HIPAA) and the European Union’s General Data Protection Regulation (GDPR) are among the best known.

In data mining, the initial act of preparation itself, such as aggregating and then rationalizing data, can disclose information or patterns that might compromise the confidentiality of the data. Thus, it’s possible to inadvertently run afoul of ethical concerns or legal requirements.

Data mining also requires data protection every step of the way, to make sure data is not stolen, altered, or accessed secretly. Security tools include encryption, access controls and network security mechanisms.
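
As a small sketch of encryption at rest, the snippet below protects a data extract with the `cryptography` package's Fernet recipe. In practice, key management (a KMS or vault, plus rotation) matters more than the cipher call itself.

```python
# Protecting a data extract at rest with symmetric encryption, using the
# `cryptography` package's Fernet recipe (pip install cryptography).
from cryptography.fernet import Fernet

key = Fernet.generate_key()  # store in a secrets manager, never in code
fernet = Fernet(key)

extract = b"customer_id,total_spend\n1001,532.10\n"
token = fernet.encrypt(extract)      # ciphertext is safe to write to shared storage
assert fernet.decrypt(token) == extract  # round-trips back to the original bytes
```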

Data mining is a key differentiator

Despite these challenges, data mining has become a vital component of the IT strategies at many organizations that seek to gain value from all the information they’re gathering or can access. This drive will no doubt accelerate with ongoing advancements in predictive analytics, artificial intelligence, machine learning, and other related technologies.

Source: InfoWorld Big Data