Report: Dell Technologies Challenging Leaders For Cloud Infrastructure Leadership

New Q2 data from Synergy Research Group shows that in the burgeoning market for technology to build clouds, the newly formed Dell Technologies is now chasing HPE, Cisco, and Microsoft for leadership in the three main segments — private cloud hardware, public cloud hardware, and cloud infrastructure software, respectively. While the Dell EMC merger didn’t officially close until September, Dell and EMC in aggregate would have been ranked second in private cloud hardware and third in public cloud hardware based on worldwide Q2 revenues. Across all cloud hardware, HPE led with 15% of the market, closely followed by Cisco at 14% and Dell EMC at 13%. Meanwhile, Dell Technologies’ majority-owned VMware subsidiary was ranked second in cloud infrastructure software. Other major cloud infrastructure vendors included IBM, Lenovo, Huawei, Oracle, and NetApp. The growth rate for the total cloud infrastructure market dropped off a little in the quarter, but on a rolling annualized basis the market still grew by over 16%.

For the last nine quarters, total spend on data center infrastructure — which includes servers, server OS, storage, networking, network security, and virtualization software — has averaged $29 billion per quarter, with the market increasingly driven by the cloud. Cloud deployments, or shipments of systems that are cloud-enabled, now account for well over half of the total data center infrastructure market.

“While total spending on data center infrastructure remains relatively flat, cloud share of that spending continues to rise as an ever-increasing portion of computer workloads migrate to either public or private clouds,” said Jeremy Duke, founder and chief analyst, Synergy Research Group. “We are also seeing that within the cloud infrastructure market, hyperscale cloud operators are accounting for an ever-larger share of overall capex. This is a trend which is not going to change any time soon.”

Source: CloudStrategyMag

Teradata expands analytics for hybrid cloud

Analytics solutions provider Teradata has released new hybrid-cloud management capabilities to better compete with rising pressure from both open source and commercial solutions.

Teradata Everywhere expands support for existing cloud-hosted Teradata solutions and adds new hybrid and cross-cloud orchestration components that make it possible to manage Teradata instances across “on-premises appliances, on-premises virtualization environments, managed cloud, and public cloud,” according to the company’s announcement.

Teradata was previously available on Amazon Web Services, but the latest iteration provides up to 32 nodes per instance and conveniences like automated backup functionality. Later this year, this iteration of Teradata is set to arrive on Microsoft Azure, VMware environments, Teradata’s own Managed Cloud in Europe (Germany), and Teradata’s on-premises IntelliFlex platform. (Google Compute Engine support was not among the environments mentioned in the announcement.)

Other improvements in the works, but not slated to debut until next year, are features to allow expansion and rebalancing of data loads between Teradata instances without major downtime and a new “in-stream query re-planning” system designed to optimize queries as they are being executed.

Teradata’s plans involve more than providing a way to run cloud-hosted instances of its database on the infrastructure of one’s choice. Rather, the company says it hopes to make Teradata as “borderless,” or hybrid, as possible. Teradata QueryGrid and Teradata Unity are being revised to better support this goal.

One key change — managing Teradata instances across environments — is available now. But many of the others — for example, automatic capture and replay of selected data changes between Teradata systems or one-click database initialization across systems — are projected to be ready in 2017.

Though powerful, Teradata is facing stiffer competition. After Hadoop came to prominence as a commodity open source data-analysis solution, Teradata made use of it as a data source by way of the commercial MapR distribution.

Cloud services such as Amazon Redshift and Microsoft’s Azure SQL also offer data warehousing solutions. Azure SQL has been enhanced by changes to SQL Server that encourage the bursting-to-the-cloud expansion that Teradata is now promising. There’s also pressure from new kinds of dedicated cloud services, such as Snowflake, which promises maximum performance with minimal oversight.

Source: InfoWorld Big Data

Global Clients Adopt IBM Aspera Hybrid Cloud Service For Large Media Transfer

IBM has announced continued adoption of the company’s Aspera Files hybrid cloud-based content sharing service and enhancements to the platform that are enabling more media and entertainment companies around the world to speed content collaboration and distribution.

Based on Aspera FASP® technology, IBM Aspera Files™ is a Software-as-a-Service (SaaS) offering on IBM Cloud that accelerates the sharing and transfer of large files and directories – even the largest content files and associated metadata – directly from their native storage environment, whether on premises or in the cloud. A multi-tenant solution, Aspera Files is designed to be easy to use and quickly available, providing transfers at maximum speed in predictable times and offering a rich set of capabilities for sharing, distributing, and managing large files.

The cloud service is offered as an all-inclusive platform hosted on the IBM Cloud, with a built-in Aspera transfer service and IBM Cloud object storage. Files can also seamlessly connect Aspera transfer nodes and storage running on all major third-party cloud infrastructure providers, as well as storage deployed on customer premises, to provide advanced security and user access controls and fast, direct transfer of content of any size, at any distance, independent of storage location.

A new pay-as-you-go option enables customers of all sizes to take advantage of the power of the Aspera platform with a cost-effective offering that scales with their business, and the brandable workspace model accommodates the project-based nature of most digital media businesses. New customers such as Beelink Productions in Dubai, action concept in Germany, and Outpost VFX in the UK are using Aspera Files to replace existing and often cumbersome content sharing techniques, and to exchange content with an ecosystem of other Aspera users.

“Media companies are moving fast, constantly innovating and looking for technologies that help them accelerate business,” said Michelle Munson, CEO of Aspera. “Aspera Files gives companies high speed sharing quickly, without having to provision infrastructure. In addition, they can quickly and easily leverage the rich features of the solution via an online trial sign up, and an online pay-as-you-go experience.”

More than 20 new capabilities have been introduced for Aspera Files since its launch, including:

  • One-click sharing of folders with authenticated third parties for upload and download, and one-click invitations to submit content and metadata as packages to branded dropboxes;
  • Media carousel previews to view and manage large numbers of video and photo files;
  • New Drive and mobile applications to browse, share, and sync, to send and receive packages from the desktop, and to contribute content from iOS and Android devices;
  • Advanced security for fine-grained control over folder sharing and package delivery with external recipients, plus enterprise single sign-on;
  • Ultra-fast auto-scaling transfers to Aspera clusters running in the cloud (10 Gbps).

Clients Move to Aspera Files

Beelink Productions
Headquartered in Dubai, Beelink produces and distributes exclusive content, mainly Arabic drama series, from its studios in Egypt. With a growing audience base, Beelink committed to offering three exclusive Arabic drama series of 30 episodes each during Ramadan: Grand Hotel, Wanoos, and Heba Regel Elghourab. To meet an expanded production schedule, Beelink replaced its legacy hard-drive distribution, which often resulted in unacceptable delays, with Aspera Files. The company was in full production within hours, with nothing to provision or deploy except an install-on-demand browser plug-in.

Beelink organized episodes in folders on a per-series basis and used the “share” facility with view-only permissions to distribute only authorized content to its customers, who were required to log in so that content access could be audited. Assets were uploaded from Egypt over a 50 Mbps line and then downloaded by Beelink customers at the highest speed from Aspera Files, which is built on top of Aspera FASP high-speed transfer technology. Episodes as large as 17 GB were uploaded to Files and downloaded within 25 minutes by customers on 100 Mbps lines.
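As a back-of-the-envelope check (the file size and line speed are the figures quoted above; the rest is elementary arithmetic), the reported download time is consistent with a transfer running near line rate:

```python
# Sanity check of the quoted transfer time, using the figures from the article.
episode_gb = 17    # episode size in gigabytes
line_mbps = 100    # customer line speed in megabits per second

bits = episode_gb * 8 * 1e9            # gigabytes -> bits (decimal units)
seconds = bits / (line_mbps * 1e6)     # ideal time at full line rate, no overhead
print(f"{seconds / 60:.1f} minutes")   # ~22.7 minutes, in line with the ~25 reported
```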

“We are pleased with the Files service and the support we received from the local team to get us started within hours,” said Hala Obied, business coordinator at Beelink. “The breadth of features and functions coupled with the simplicity of the user interface allowed our customers to pull down their assets in minutes.”

Action Concept 
Action Concept, a leading action film producer in Germany, reaches top markets in over 100 countries with prime-time productions. It has also established itself as a sought-after producer for commissioned productions and in-house formats. The company turned to Aspera Files for a recent project with a Chinese client that required the production company to move large volumes of video material between its German facility and a business partner in South Korea. Previously, the company used FTP or physical FedEx shipments to share video, but due to security concerns and an explicit request from the customer to avoid these methods of transport, action concept sought a new solution that could provide greater security and faster transfer times. Aspera Files made it easy for action concept to send and receive very large files, which can reach two terabytes per captured scene in 4K or 6K resolutions.

“We decided to explore the possibilities with Aspera in large part because of its solid reputation in the industry, as well as the high level of security, ease of use and speed offered by the solution,” said Tom Dülks, Head of Technology, action concept. “We were not disappointed.”

Outpost VFX
Outpost VFX is a high-end visual effects company with a diverse portfolio across feature film, broadcast and advertising, music promos, and virtual reality. The nature of the business requires Outpost to send and receive large media files to and from clients on a regular basis. The company uses the dropbox function in Aspera Files to securely receive work from clients and subcontractors around the world, and once a shot is officially approved, it uses Files’ digital package-sending tool to deliver completed projects directly to the client.

“Aspera Files provides an affordable solution with the flexibility to scale up as we grow. The tool covers all the bases from a security standpoint, and Aspera’s prominence with large studios grants us a reputation of gravitas when we use it – it shows that Outpost is a serious VFX supplier,” said Danny Duke, managing director, Outpost VFX. “We couldn’t be happier with our decision to select Aspera.”

Aspera is demonstrating Aspera Files and its entire suite of high-speed file transfer and streaming solutions at IBC2016, September 9-13 in Amsterdam, Hall 7, Stand G20.

Source: CloudStrategyMag

Insight Releases Hybrid Cloud Assessment

In a recent IDC (International Data Corporation) Multi-Client Study, CloudView 2016, respondents to the survey said they expect to increase their cloud spending by approximately 44% over the next two years, and 70% of heavy cloud users are thinking in terms of a “hybrid” cloud strategy. 

To keep pace with the rapidly changing technological landscape, Insight released a new Hybrid Cloud Assessment service, which helps businesses navigate and take advantage of the complex hybrid cloud environment.

A combination of both public and private platforms, hybrid cloud provides organizations with greater IT and infrastructure flexibility, as well as visibility and control over their cloud usage. As a result, hybrid cloud enables business agility, including streamlined operations and improved cost management. Companies can now enter new markets or launch new products and services more quickly and efficiently in a highly competitive business environment.

“We built a methodology and tool set that allows us to assess a company’s full portfolio of applications and to provide the optimal deployment and consumption model for each client,” said Stan Lequin, VP, services, Insight. “This approach enables us to deliver a non-disruptive and customized Hybrid Cloud roadmap.”

The Hybrid Cloud Assessment provides a clear and unbiased guide for businesses to transition to the cloud, including design, deployment, and management.

“We developed this tool to allow us to efficiently evaluate workloads and determine where they are best deployed based on application dependency mapping, cloud consumption models, and a variety of additional factors,” said Lequin.

Insight takes into account distinct market drivers and challenges and tests every potential IT scenario to develop the right solutions to help clients accomplish their specific goals.

Source: CloudStrategyMag

Fuze Is Named to First-Ever Forbes 2016 World’s Best 100 Cloud Companies List

Fuze (formerly ThinkingPhones) has announced it has been named to the first-ever Forbes 2016 Cloud 100, the definitive list of the top 100 private cloud companies in the world, developed in partnership with Bessemer Venture Partners.

 “We built our multi-tenant cloud platform from the ground up to be exceptionally agile, redundant, and secure without the restrictions of other UC services and the expense of on-premise designs. This enables our customers to deploy cloud communications quickly, easily, and cost effectively,” said Steve Kokinos, CEO, Fuze. “The scalable, flexible solution also allows customers to seamlessly add users from all corners of the globe, making Fuze the perfect fit for multi-location enterprises. Such adaptability, including a breadth of third-party application support, has been key to the platform’s adoption by global organizations and a large part of our continued industry recognition.”

“Cloud companies are revolutionizing how businesses reach their customers today from digitizing painful old processes to allowing them more time to focus on what they really care about — what makes their products unique,” said Alex Konrad, Forbes editor of the Cloud 100 list. “Inclusion in the Forbes 2016 Cloud 100 list recognizes a company for its financial growth and excellence as recognized by customers and peers.”

“These are the companies to watch!” said Byron Deeter, a leading cloud investor and partner at Bessemer Venture Partners. “The Forbes Cloud 100 companies represent the very best private companies in cloud computing. We will see big IPOs and category killers emerge from this list as cloud computing continues to propel the trillion-dollar software industry.”

“On behalf of Fuzers worldwide, we are thrilled to be named to the inaugural Forbes 2016 Cloud 100,” continued Kokinos. “Our mobile-first user experience is designed to delight today’s digitally empowered workforce, while our powerful suite of business analytics integrates with other cloud services to make our solution an indispensable tool for business and technology leaders.”

The list will appear in the Oct. 4, 2016 issue of Forbes magazine.

Methodology
The first-ever Forbes 2016 Cloud 100 list profiles the world’s top-tier private companies leading the cloud technology revolution, plus twenty rising stars within the field. With advancements in software, cloud security, or platform development, these companies are redefining the future for all industries and sectors.

Forbes, in partnership with Bessemer Venture Partners, received hundreds of submissions to identify the most promising private companies in cloud. The Forbes 2016 Cloud 100 was selected by a panel of judges representing leading public cloud companies, using qualitative and quantitative data submitted by nominees, along with publicly available third-party data sources.

Recognition

Every company named to the Forbes 2016 Cloud 100 is recognized in print and online by Forbes, and Forbes’ partners Bessemer Venture Partners and Salesforce Ventures. The companies also receive physical awards and digital badges signifying their inclusion on this exclusive list, as well as an invitation to the celebratory Cloud 100 Awards Dinner, hosted in San Francisco by Forbes, Bessemer Venture Partners and Salesforce Ventures.

Source: CloudStrategyMag

Equinix Introduces The Media Cloud Ecosystem For The Entertainment Industry

Equinix, Inc. has announced the Equinix Media Cloud Ecosystem for Entertainment (EMCEE™), an ecosystem of interconnected media and content providers, along with content delivery networks (CDNs) and cloud service providers that optimizes content creation, global distribution, and services across the entire media and entertainment (M&E) industry. Today, more than 500 content and media companies such as Content Bridge, Movile, and Selevision use EMCEE to peer with the industry’s largest concentration of CDNs, multiple system operators (MSOs) and social media platforms, enabling faster content development and distribution, as well as significant cost savings.

Digital disruption is affecting the M&E industry at an ever-accelerating pace – changing the way that content is created, enhanced, transported, stored and distributed. To embrace this disruption, M&E companies need to transform their infrastructure, from fixed and siloed to integrated and dynamic with interconnection at the forefront of their IT decision making. Global businesses, including media and entertainment companies, are increasingly leveraging colocation data centers to distribute their digital infrastructure across multiple geographies, and closer to the edge, to solve these challenges while also optimizing their IT for cloud-based offerings.

Components of the EMCEE ecosystem that enable this transformation include Equinix interconnection offerings across Platform Equinix™, network density, access to multiple clouds utilizing the Equinix Cloud Exchange™ and access to billions of consumers leveraging CDNs in Equinix International Business Exchange (IBX) data centers. In tests conducted in Equinix’s global Solution Validation Centers™, video streaming applications that flowed through Equinix experienced 47% lower network latency. The test results also show that Equinix customers save, on average, more than 25% on network bandwidth costs by aggregating Internet traffic delivery to improve performance and scalability.

Today’s consumers expect reliable, on-demand access to bandwidth-heavy digital content such as video, apps and online games. To meet consumer expectation, digital media and entertainment companies need an interconnected neutral ecosystem of content companies, advertising networks and content delivery services, accessible via secure, direct connections.

As media becomes commoditized in the digital era, new business models are increasingly focused on innovation and efficiency across the production cycle – and value creation at the point of engagement, where end-users expect high quality service on every device, all the time, everywhere. To capture the opportunity, businesses are streamlining production workflows, reducing time and cost, and expanding distribution capabilities to tap into billions of smart TVs and devices around the globe.

Built on Interconnection Oriented Architecture™ (IOA™), EMCEE efficiently improves network and application performance, security and end-user satisfaction. IOA directly and securely interconnects clouds, networks, business ecosystems and data at the edge, providing virtual control and transparency across the world’s most globally interconnected data centers, within the largest cloud and network provider-neutral marketplaces.

Equinix’s global interconnection platform also provides media and entertainment companies with industry-leading solutions, including Equinix Cloud Exchange, which provides direct access to major cloud service providers (AWS, Google Cloud Platform, Microsoft Azure and Office 365, and IBM SoftLayer) in 21 markets globally, as well as Equinix Performance Hub™ and Equinix Data Hub™, which help develop faster, more efficient content creation workflows.

Equinix will be presenting EMCEE at the IBC 2016 Conference and Exhibition in Amsterdam from September 9-13 at booth B25, Hall 3.

Source: CloudStrategyMag

CloudPhysics Unveils Cost Calculator

CloudPhysics has introduced the Cost Calculator for Private Cloud with the Public Cloud Comparison tool. This data-driven solution automates the process of determining the actual cost of a customer’s currently provisioned on-premises virtual machines (VMs). Customers can compare those amounts to the costs for the same VMs if they were migrated to a public cloud.

The Cost Calculator for Private Cloud allows the customer to rightsize VMs by comparing a VM’s current resources, such as CPU and storage, with the amount the VM actually requires to perform its functions. Because many VMs are overprovisioned with resources, rightsizing helps a customer save costs per workload, whether on-premises or in the cloud. By rightsizing workloads, customers are assured that VM provisioning fits actual usage. 

The calculator also offers customers the unique ability to conduct an “apples-to-apples” comparison of virtual workloads in a private cloud model, where resources are shared, vs. the public cloud model, where resources are subscribed from a cloud service provider. Users can create scenarios that compare their private cloud costs vs. public cloud estimates with utilization levels at peak, 95th, and 99th percentiles. They can then accurately determine what these workloads cost to operate in the public cloud at those respective levels. 
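To make the percentile comparison concrete, here is a minimal sketch of the kind of calculation such a tool automates. Every number and rate below is hypothetical, invented for illustration; CloudPhysics has not published its pricing model, so this shows only the general shape of a rightsizing comparison:

```python
import numpy as np

# Hypothetical per-minute CPU samples (cores in use) for one VM over a week.
rng = np.random.default_rng(0)
cores_used = rng.gamma(shape=2.0, scale=0.6, size=7 * 24 * 60)

HOURS_PER_MONTH = 730
PRIVATE_RATE = 0.020   # assumed private-cloud cost per core-hour (illustrative)
PUBLIC_RATE = 0.034    # assumed public-cloud cost per core-hour (illustrative)

# Size the VM at peak, 99th, and 95th percentile utilization, as in the article.
for label, pct in [("peak", 100), ("p99", 99), ("p95", 95)]:
    cores = float(np.percentile(cores_used, pct))
    private = cores * PRIVATE_RATE * HOURS_PER_MONTH
    public = cores * PUBLIC_RATE * HOURS_PER_MONTH
    print(f"{label}: {cores:5.2f} cores -> private ${private:7.2f}/mo, public ${public:7.2f}/mo")
```

Rightsizing to the 95th or 99th percentile rather than the observed peak is what shrinks the per-workload bill, at the cost of occasionally throttling rare utilization spikes.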

“Organizations typically do not have a current cost-per-workload model in their private cloud, and have very poor tools to allow for the pricing and comparison of private vs. public cloud costs,” said Chris Schin, VP, Products at CloudPhysics. “Our Cost Calculator for Private Cloud ensures that IT decision makers have real, actionable data regarding the savings a public cloud can potentially provide vs. their current operational costs.”

The Cost Calculator for Private Cloud determines cost based on selected workloads, hosts, clusters, or data centers and calculates the cost per workload/virtual machine (VM). This allows IT administrators to understand the cost of a workload based on size and resource utilization in a private cloud environment. Once private cloud costs are known, workloads can accurately be compared against public cloud hosting costs to determine if savings can be achieved on a workload-by-workload basis.

Source: CloudStrategyMag

Review: Google Bigtable scales with ease

When Google announced a beta test of Cloud Bigtable in May 2015, the new database as a service drew lots of interest from people who had been using HBase or Cassandra. This was not surprising. Now that Cloud Bigtable has become generally available, it should gain even more attention from people who would like to collect and analyze extremely large data sets without having to build, run, and sweat the details of scaling out their own enormous database clusters.

Cloud Bigtable is a public, highly scalable, column-oriented NoSQL database as a service that uses the very same code as Google’s internal version, which Google invented in the early 2000s and published a paper about in 2006. Bigtable was and is the underlying database for many Google services, including Search, Analytics, Maps, and Gmail.
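To give a sense of the programming model, here is a minimal write-and-read round trip using the official google-cloud-bigtable Python client. The project, instance, and table names are placeholders, and the instance, table, and "stats" column family are assumed to already exist:

```python
from google.cloud import bigtable

# Placeholders: substitute your own project, instance, and table IDs.
client = bigtable.Client(project="my-project")
table = client.instance("my-instance").table("my-table")

# Bigtable rows are keyed byte strings; cells live in column families
# and are versioned by timestamp when the mutation is committed.
row = table.direct_row(b"user#1001")
row.set_cell("stats", b"visits", b"42")
row.commit()

# Read the row back by key and pull the latest cell value.
result = table.read_row(b"user#1001")
print(result.cells["stats"][b"visits"][0].value)  # b'42'
```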

Source: InfoWorld Big Data

Big data hits $46 billion in revenue — and counting

Big data has been a big buzzword for more than a few years already, and it has solid numbers to back that up, including $46 billion in 2016 revenues for vendors of related products and services. But the big data era is just beginning to dawn, with the real growth yet to come.

So suggests a new report from SNS Research, which predicts that by the end of 2020, companies will spend more than $72 billion on big data hardware, software, and professional services. While revenue is currently dominated by hardware sales and professional services, that promises to change: By the end of 2020, software revenue will exceed hardware investments by more than $7 billion, the researcher predicts.

“Despite challenges relating to privacy concerns and organizational resistance, big data investments continue to gain momentum throughout the globe,” the company said in a summary of the report, which was announced Monday.

Others echo the same sentiment.

“Sooner rather than later, big data will become table stakes for enterprises,” said Tony Baer, a principal analyst at Ovum. “It will not provide unique competitive edge to innovators, but will add a new baseline to the analytics and decision support that enterprises must incorporate into their decision-making processes.”

It is indeed still early days for such initiatives, said Frank Scavo, president of Computer Economics.

“Business intelligence and data warehousing are top areas for technology spending this year, but only about one-quarter of organizations are including big data in their investment plans,” said Scavo, citing his own company’s research. “So, what we are seeing today is just the tip of the iceberg.”

Cloud storage and services are making big data affordable for most organizations, but realizing the benefits can be a challenge. That’s in large part due to the current shortage of business analysts and IT professionals with the right skills, particularly data scientists, he said.

“If you’re planning to invest in big data, you’d better be ready to invest in your people to develop the needed skills,” Scavo said. “At the same time, if you’re an IT professional just starting out in your career, big data would be a great area to focus on.”

Source: InfoWorld Big Data

IDG Contributor Network: How to answer the top three objections to a data lake

We’ve all seen the marketing hype surrounding the data lake. Data lakes are much like Michael Corleone at the end of The Godfather. Data lakes will answer all your questions and solve all your problems. However, as with Michael’s pronouncement(s) at the end of The Godfather, there is a downside to this “offer” that marketers may think we cannot refuse. There is usually a set of stakeholders out there who are unfamiliar with Hadoop or the concept of a data lake or perhaps just not interested in changing the status quo of their organizations.

As a data architect, you pitch a data lake like one of those mountain lakes on travel websites or in George Clooney movies … lakes are cool, clear, and usually have the reflection of a snow-tipped mountain peak on their surface to show the purity of the contents within. Everyone wants to drink water from this source. However, when some people hear the concept — data from many sources being stored without a schema for some possible future benefit — they will think of a data swamp rather than a pristine data lake.

Data swamps are places where unknown data sits in a Hadoop cluster. You don’t know where the data came from. You don’t know how old the data is. You have no idea what you might use the data for. Heck, the first use of this type of data in front of a skeptical executive more concerned with the status quo than organizational change will evoke the classic, “What’s your data source? How can you verify this information? I have different experiences….”

But before you can even get to that meeting where people start to question the data from your data lake, you need to propose, build, and populate one. Here are the top three objections that I often hear to “discourage” any budding data architect from attempting to start a data lake initiative, and how you might answer them:

Aren’t data lakes just another silo to get in the way? Just as the name implies, data lakes provide the opportunity to put all that pure data into a single location. This allows information from new, and often voluminous, data sources to share an environment with traditional data sets and with each other. That lets data-driven organizations discover links between data sets such as mobile and social, draw new insights from the data, and potentially create new business models, much as Uber changed the personal transportation business. I would answer this objection by pointing to advances in data integration technologies such as data virtualization and ETL/ELT/ETLT, as well as the ability to share data between data management architectures. The day of “data silos” is more about “want to” than “can’t do.”

Data lakes aren’t robust enough for our needs…Hadoop isn’t even 10 years old! I would say that this objection is usually voiced by someone who is invested in the care, feeding, and maintenance of a data warehouse. The types of “needs” it is attempting to address are data governance, quality, stewardship, and lineage. True, the data governance practices of data lakes lag behind those of other data architectures based on the concept of “schema on write,” where you predetermine the questions before you create and populate the structure. I would answer that a data lake attempts to solve a different set of requirements. Instead of assuring the quality of the data for “regulatory quality reporting” (i.e., someone goes to jail if the numbers are wrong), data lakes are designed to allow for discovery and then the potential use for new business models. A data lake’s data quality practices are less about the syntactic quality of the data (are all the fields perfect?) and more about the semantic quality of the data (can we use this well?); see the sketch after this list.

Data lakes threaten established data management structures such as the data warehouse. More often than not, I hear this one coming out of the mouth of someone who sells proprietary data warehouse storage components…yes. Some in the EDW world find the presence of the data lake to be a threat to the “single version of the truth” component of the enterprise data warehouse. However, more often than not, the data that exists within a data lake isn’t the type of curated, structured data that data warehouses are known for. I would answer that the data in the data lake is more often atomic-level event data with lots of extra fields that haven’t yet proven themselves “worthy” of placement in the data warehouse. Part of this is the concept of separating the signal from the noise. Another is the concept that pouring potential petabytes of data into the EDW will cause two things to happen. One, the data quality people will become “concerned” (okay, have a heart attack) over the data coming into the platform. Two, the storage vendor will retire early to some golf course with the purchase agreement to handle all that information.
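The schema-on-write vs. schema-on-read distinction in the second answer is easy to show in a few lines. A minimal sketch with hypothetical event data (standard library only): the lake accepts records exactly as they arrive, and each query imposes only the structure it needs at read time:

```python
import json

# Raw events land in the lake as-is; no schema is enforced on write.
raw_events = [
    '{"user": "a", "action": "click", "ts": 1}',
    '{"user": "b", "action": "view", "ts": 2, "device": "mobile"}',  # extra field: fine
    '{"user": "c", "ts": 3}',                                        # missing field: also fine
]

# Schema on read: this query cares only about "action", and decides for
# itself how to treat records that lack the field.
clicks = [e for e in map(json.loads, raw_events) if e.get("action") == "click"]
print(clicks)  # [{'user': 'a', 'action': 'click', 'ts': 1}]
```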

After you hear the objections a couple hundred times, the question becomes: is a data lake worth the time, trouble, and effort if this is the type of resistance you encounter, and if the lake might devolve from those pure mountain sources into a swamp? The answer is “most certainly!” The advantages of the data lake outweigh the risks. The data lake is how data-driven organizations will validate and power their new businesses.

Does your organization want to be part of the future or part of the past?

This article is published as part of the IDG Contributor Network.

Source: InfoWorld Big Data