Nimbus Data Unveils ExaFlash™

Nimbus Data has unveiled ExaFlash™, a revolutionary new all-flash platform for cloud, big data, virtualization, and massive digital content.

Nimbus Data’s ExaFlash™ Platform is an historic achievement that will reshape the storage and data center industries. It offers unprecedented scale (from terabytes to exabytes), record-smashing efficiency (95% lower power and 50x greater density than existing all-flash arrays), and a breakthrough price point (a fraction of the cost of existing all-flash arrays). ExaFlash brings the all-flash data center dream to reality and will help empower humankind’s innovation for decades to come.

“Worldwide storage demand continues to escalate, and with it, the need to increase data center efficiency, achieve scale with consistent performance, maintain simplicity, and reduce cost,” stated Thomas Isakovich, CEO and founder of Nimbus Data. “Our vision is unencumbered by conventional thinking, and ExaFlash’s ground-breaking achievements reflect that vision.”

The ExaFlash Platform incorporates a novel architecture that enables virtually infinite scalability, from terabytes to exabytes, as one centrally managed all-flash system. With ExaFlash, data flow is decoupled from metadata, and management is centralized and operates completely out-of-band. This provides one interface and one pane of glass for all administration. There is no data network between the storage arrays themselves, so performance scales in lockstep with capacity, with consistent latency. Multiple storage protocols, including block, file, and object, can operate simultaneously over Ethernet, Fibre Channel, and InfiniBand networks, eliminating storage “islands”.

The building blocks of the ExaFlash Platform are ExaFlash Arrays, a new family of all-flash systems that sets records for energy efficiency, capacity, connectivity options, and rack-space density. Unlike off-the-shelf servers, ExaFlash Arrays are precision-engineered for maximum efficiency and performance. They draw 95% less power per terabyte than competing designs by eliminating extraneous components and minimizing CPU utilization through intelligent hardware-offload engines. They also incorporate ultra-dense flash drives to deliver up to 50x the capacity of competing alternatives in the same rack space. ExaFlash Arrays are available in sizes ranging from 50 TB to 4.5 PB and feature the latest 32 Gb Fibre Channel, 100 Gigabit Ethernet, and EDR InfiniBand connectivity, along with dual active-active controllers, redundant power and cooling, and hot-swappable components for always-on availability and easy serviceability.

The all-new ExaFlash Operating System is powerful, API-driven storage and data management software for the ExaFlash Platform. The ExaFlash OS provides a single pane-of-glass administration interface for total control and supports all major storage protocols (block, file, and object) simultaneously. Using the API, storage administrators can automate every conceivable storage task centrally. Inline variable deduplication and compression maximize storage utilization, and real-time checksums ensure data integrity.
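
To illustrate what API-driven administration can look like in practice, here is a minimal, purely hypothetical sketch of provisioning a volume through a REST-style management API. The endpoint, payload fields, and authentication scheme are all assumptions for illustration; they are not the documented ExaFlash OS API.

```python
import requests  # pip install requests

# Hypothetical endpoint, payload, and auth scheme, for illustration only;
# the documented ExaFlash OS API may look entirely different.
API_BASE = "https://exaflash-mgmt.example.com/api/v1"  # assumed URL
TOKEN = "REPLACE_WITH_API_TOKEN"                       # assumed bearer auth

def create_volume(name: str, size_tb: int, protocol: str = "block") -> dict:
    """Provision a volume through the (assumed) central management API."""
    resp = requests.post(
        f"{API_BASE}/volumes",
        headers={"Authorization": f"Bearer {TOKEN}"},
        json={"name": name, "size_tb": size_tb, "protocol": protocol},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    print(create_volume("analytics-scratch", size_tb=10))
```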

Source: CloudStrategyMag

Denodo Recognized As A Visionary In The 2016 Gartner Magic Quadrant For Data Integration Tools

Gartner has positioned Denodo as a Visionary in its 2016 Magic Quadrant for Data Integration Tools. According to Gartner, “The biggest changes in the market from 2015 are the increased demand for data virtualization, the growing use of data integration tools to combine ‘data lakes’ with existing integration solutions, and the overall expectation that data integration will become cloud- and on-premises-agnostic” (August 2016 Gartner Magic Quadrant for Data Integration Tools).

Data virtualization has achieved broad recognition as an agile method of data integration for organizations frustrated with long, repetitive, and expensive methods of physically moving data such as ETL (Extract, Transform and Load).

Data virtualization is a virtual data layer that combines disparate data from a variety of source systems into complete, connected information and then presents that information to business users through applications and reporting solutions. It has grown into a critical part of modern data architectures thanks to its flexible approach, its ability to quickly integrate new data sources, and its delivery of information to business users in real time.
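
To make the concept concrete, the following minimal sketch (generic Python, not Denodo's actual platform or VQL) shows the essence of a virtual data layer: it fetches records from two disparate sources at query time and presents one combined view, without physically copying either source into a warehouse.

```python
import sqlite3
import pandas as pd  # pip install pandas

# Source 1: a relational database (an in-memory SQLite stands in for it here).
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE customers (id INTEGER, name TEXT)")
db.executemany("INSERT INTO customers VALUES (?, ?)", [(1, "Acme"), (2, "Globex")])

# Source 2: another system, e.g. a REST service (a list of dicts stands in).
orders = [{"customer_id": 1, "total": 120.0},
          {"customer_id": 1, "total": 75.5},
          {"customer_id": 2, "total": 300.0}]

# The "virtual layer": pull from each source on demand and join in memory,
# presenting one connected view with no physical data movement into a warehouse.
customers_df = pd.read_sql("SELECT id, name FROM customers", db)
orders_df = pd.DataFrame(orders)
view = (customers_df
        .merge(orders_df, left_on="id", right_on="customer_id")
        .groupby("name", as_index=False)["total"].sum())
print(view)  # revenue per customer, combined from two disparate sources
```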

“We believe that Denodo’s furthest placement in the Visionaries quadrant for completeness of vision demonstrates that our approach to data virtualization is the most complete in this quadrant,” said Ravi Shankar, chief marketing officer at Denodo. “Version 6.0 of the award-winning Denodo Platform accelerates fast data strategy with breakthrough performance in big data, logical data warehouses, and operational scenarios; facilitates solution adoption with data virtualization in the cloud; and expedites the use of data by business users with self-service data discovery and search. Denodo’s customers are benefiting from shorter time, fewer resources, and lower cost to solution.”

The Denodo Platform serves critical enterprise roles including:

  • CIOs and CDOs, by supporting new business models and digitalization initiatives and delivering strong ROI through reduced resource and integration costs. Denodo also accelerates analytics initiatives, resulting in more timely insights, increased revenue, and operational excellence.
  • Enterprise, data, and solution architects, by enabling modern data architectures such as the logical data warehouse, the logical data lake, and data as a service in support of analytical and operational initiatives.
  • Business users, by accelerating decision making through dynamic adaptation to business requirements, agile data delivery, trusted insights, and self-service data discovery capabilities.
  • Developers, by providing the Denodo Express edition to expedite POCs, along with ease of use and configuration and advanced optimization techniques such as dynamic query optimization.

Source: CloudStrategyMag

Informatica Positioned As A Leader In Gartner’s Magic Quadrant For Data Integration Tools

Informatica has announced that Gartner, Inc., a leading IT research and advisory firm, has positioned Informatica as a Leader in its 2016 Magic Quadrant for Data Integration Tools. For the third consecutive year, Informatica is positioned the furthest on the completeness-of-vision axis and the highest on the ability-to-execute axis. Informatica has also demonstrated its leadership in the data integration market by being named a Leader in the Gartner Magic Quadrant for Data Integration Tools for 11 years in a row. In this year’s Magic Quadrant, published August 8, 2016, Informatica achieved its furthest and highest positioning to date.

According to the Gartner report, “Data integration remains central to enterprises’ information infrastructure.” The report continues to say, “Enterprises pursuing frictionless sharing of data are increasingly favoring tools that are flexible in that they can be designed once for delivery across multiple platforms, mixed architectural and broad deployment without significant rework.”

The authors of the Gartner report note that, “the biggest changes in the market from 2015 are the increased demand for data virtualization, the growing use of data integration tools to combine ‘data lakes’ with existing integration solutions, and the overall expectation that data integration will become cloud- and on-premises-agnostic.”

The report projects that “the growth rate is above the average for the enterprise software market as a whole, as data integration capability continues to be considered of critical importance for addressing the diversity of problems and emerging requirements. The total market revenue is expected to be about $4 billion in 2020 (see ‘Forecast: Enterprise Software Markets, Worldwide, 2013-2020, 2Q16 Update’).”

The report’s authors say “Leaders in the data integration tool market are front-runners in the convergence of single-purpose tools into an offering that supports a full range of data delivery styles. Additionally, they have recognized the growing affinity between data and application integration, and are haltingly approaching location-agnostic deployments (that are not limited only to cloud or on-premises, but can be deployed beyond specific location). They are strong in establishing data integration infrastructure as an enterprise standard and as a critical component of modern information infrastructure. They support both traditional and new data integration patterns to capitalize on market demand. Leaders have significant mind share in the market, and resources skilled in their tools are readily available. These vendors recognize and design to deploy for emerging and new market demands, (to a large degree) often providing new functional capabilities in their products ahead of demand, and by identifying new types of business problem to which data integration tools can bring significant value. Examples of deployments that span multiple projects and types of use case are common among Leaders’ customers. Leaders have an established market presence, significant size and a multinational presence (directly or through a parent company).”

“Informatica is focused on providing customers with solutions that enable them to leverage the new era of data and put great data at the center of everything they do in order to make their businesses more successful,” said Amit Walia, executive vice president and chief product officer, Informatica. “We believe that the positioning of Informatica as a Leader in the Gartner Magic Quadrant for Data Integration Tools further validates our dominance and leadership in data integration for a hybrid world, across on-premises, big data and cloud. We feel that our recognition in this report reflects the unmatched breadth and depth of Informatica’s metadata-driven data management offerings, for any user, any style and any speed of data integration, and for all data, big or small, structured or unstructured. The 2016 Magic Quadrant for Data Integration Tools represents Informatica’s best positioning yet in this arena and we believe we continue to set the pace for data management innovation and customer success.”

Informatica offers the most comprehensive set of solutions focused on enterprise data integration across on-premises, big data, cloud, and hybrid integration environments, spanning departmental to big data solutions. The Informatica Intelligent Data Platform takes away the complexity of accessing, transforming, and unifying any type of data from disparate underlying sources and delivering it to any business system or user at any latency. With Informatica, business users can quickly discover and prepare data for their needs using self-service tools, with the assurance that the data is complete, clean, relevant, timely, and trusted. At the same time, IT can deliver even complex data integration projects up to five times faster than traditional hand-coded methods.

Source: CloudStrategyMag

Red Hat Positioned in the Visionaries Quadrant of Gartner's 2016 Magic Quadrant

Red Hat, Inc. has announced that Red Hat Enterprise Virtualization has been positioned by Gartner, Inc. in the “Visionaries” quadrant of the August 2016 x86 Server Virtualization Infrastructure Magic Quadrant. With Red Hat Enterprise Virtualization, companies around the world are building a foundation for future technologies while integrating existing ones through an open, scalable, and high-performance virtualization infrastructure.

Gartner’s Magic Quadrants are based on rigorous analysis of a vendor’s completeness of vision and ability to execute. According to Gartner, “About 80% of x86 server workloads are virtualized, but virtualization technologies are becoming more lightweight, supporting more workloads and agile development. Price, modernization and specific use cases are driving enterprises to deploy different, and often multiple, virtualization technologies.”

“Many enterprises are looking for an open alternative to proprietary virtualization solutions to obtain better efficiencies and interoperability, and to bridge their traditional infrastructure to cloud-native workloads using OpenStack or other platforms. As the only company included in the Visionaries quadrant, we believe Red Hat’s Magic Quadrant position reinforces our continued innovation, momentum and strong vision for an open, high-performance virtualization alternative,” said Gunnar Hellekson, director of product management, Linux and Virtualization, Red Hat.

Red Hat Enterprise Virtualization is an open infrastructure and management platform for servers and workstations with robust security capabilities. Built on Red Hat Enterprise Linux and Kernel-based Virtual Machine (KVM) technologies, it enables customers to virtualize both traditional and cloud-native applications. As the open alternative, Red Hat Enterprise Virtualization offers a high-performing, fault-tolerant, and more secure platform for mission-critical, virtualized Linux and Windows environments, reducing the cost and complexity of proprietary virtual machines (VMs) through improved economics, interoperability, and agility. Backed by Red Hat’s certified ecosystem of software and hardware partners, Red Hat Enterprise Virtualization offers unparalleled performance, scale, and flexibility to support a broad range of critical workloads.
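
Because Red Hat Enterprise Virtualization builds on KVM, the underlying hypervisor can be scripted with standard open tooling. Below is a minimal sketch using the libvirt Python bindings, assuming libvirt-python is installed and a local KVM host is reachable at qemu:///system:

```python
import libvirt  # pip install libvirt-python; requires a libvirt/KVM host

# Connect to the local KVM hypervisor (use openReadOnly() to inspect only).
conn = libvirt.open("qemu:///system")

# Report the state of every defined VM, booting any that are shut off.
for dom in conn.listAllDomains():
    state = "running" if dom.isActive() else "shut off"
    print(f"{dom.name()}: {state}")
    if not dom.isActive():
        dom.create()  # start the domain (equivalent to `virsh start`)

conn.close()
```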

Source: CloudStrategyMag

Nimbix Expands Market Presence In Cloud-based Machine Learning

Nimbix has announced a significant increase in its presence in the machine learning market as more customers use its JARVICE platform to address the need for an easier, more cost-efficient way of working with machine learning.

Using JARVICE to manage their machine learning process, customers can leverage JARVICE’s turnkey workflows, reducing time to deployment from weeks to hours. Because the platform is built on NVIDIA GPUs for optimal neural network training, Nimbix’s per-second billing also enables customers to capture the best economics for the neural network evaluation phase of machine learning.
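
As a generic illustration of the GPU-backed training such a platform hosts (ordinary PyTorch, not the JARVICE API), the sketch below runs a single training step on an NVIDIA GPU when one is available; under per-second billing, the wall-clock time of steps like this maps directly to cost.

```python
import torch
import torch.nn as nn

# Use the GPU when one is available (e.g., on a cloud instance), else the CPU.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

model = nn.Sequential(nn.Linear(64, 128), nn.ReLU(), nn.Linear(128, 10)).to(device)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.CrossEntropyLoss()

# One training step on a synthetic batch; real workloads loop over a dataset.
x = torch.randn(32, 64, device=device)
y = torch.randint(0, 10, (32,), device=device)

optimizer.zero_grad()
loss = loss_fn(model(x), y)
loss.backward()
optimizer.step()
print(f"device={device}, loss={loss.item():.4f}")
```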

Experienced machine learning developer Hugh Perkins, author of the popular open source OpenCL libraries DeepCL and cltorch, is an avid user of the Nimbix cloud. Perkins chose Nimbix for machine learning because of its powerful platform API, industry-leading selection of GPUs, and superior performance and economics.

“Nimbix is a breath of fresh air,” said Perkins. “The per-second billing, spin up times of seconds, and the availability of high end GPUs, make Nimbix an awesome choice for machine learning developers.”

The Nimbix cloud platform is democratized and developer-friendly, allowing users to monetize their trained neural networks in the application marketplace. Democratizing machine learning APIs will distribute the power of neural networks to smaller organizations, allowing for more breakthroughs in life sciences, IoT, automotive, and more.

“The Nimbix Cloud was a great choice for our research tasks in conversational AI. They are one of the first cloud services to provide NVIDIA Tesla K80 GPUs that were essential for computing neural networks that are implemented as part of Luka’s AI,” said Phil Dudchuck, co-founder at Luka.ai.

Nimbix provides on-demand, scalable compute resources that enable organizations to run large-scale HPC workloads in the cloud. High performance computing (HPC) allows scientists, developers, and engineers to solve complex science, engineering, development, big data, and business problems using applications that require high compute capabilities. Nimbix’s growing number of integrated cloud services allows organizations to increase the speed and effectiveness of research by running high performance computing in the Nimbix Cloud.

“The Nimbix Cloud, powered by JARVICE, is an ideal platform for machine learning applications, as it provides access to true supercomputing: GPU- and FPGA-based accelerated computing with the large amounts of memory that are paramount for training deep neural networks,” said Leo Reiter, chief technology officer at Nimbix. “The high performance execution of the Nimbix Cloud is key to delivering timely results to our end users. We are continuing to improve JARVICE by allowing our customers to utilize the platform to democratize these resources and provide a processing API to help add cognitive features to any application seamlessly and cost-effectively.”

Source: CloudStrategyMag

Microsoft Named A Leader By Gartner

Gartner has positioned Microsoft in the Leaders Quadrant in the 2016 Magic Quadrant for Cloud Infrastructure as a Service (IaaS) based on its completeness of vision and ability to execute in the IaaS market. Microsoft is the only vendor recognized as a Leader across Gartner’s Magic Quadrants for IaaS, PaaS, and SaaS solutions for enterprise cloud workloads.

Source: CloudStrategyMag

Rootstock Appoints BT Partners To Represent Its Cloud ERP Solutions

Rootstock has announced that BT Partners has been appointed to help manufacturers and distributors learn about, buy, and implement Rootstock’s Cloud Manufacturing ERP software. For 30 years, the Chicago-based provider of technology-driven business consulting services has been helping manufacturing and distribution organizations improve the way their businesses operate with ERP and financial software solutions.

“With Rootstock’s Manufacturing and Distribution Cloud ERP, a manufacturer or distributor can reduce its IT operational costs by outsourcing hardware and software maintenance and support to the cloud provider, and get its ERP solution up and running faster,” emphasizes Todd Perlman, president of BT Partners. “Because Rootstock’s ERP is built on the Salesforce platform, Rootstock customers leverage their investment in Salesforce by integrating it seamlessly with other Salesforce cloud applications.”

“Today, Cloud ERP buyers need to be cautious,” adds Pat Garrehy, CEO of Rootstock Software. “Not all ERP systems are developed from a manufacturing core, especially those residing in the cloud. Some vendors begin with accounting software, while others start out in maintenance or human resources. Rootstock’s core functions have been architected around the needs and environment of supply chain-based organizations. Our solution is extended and enhanced by BT Partners’ technical architects, as they are leaders in their field and familiar with the global governance, data management, and complex integrations necessary for manufacturers to be successful.”

BT Partners is experienced in working with manufacturing and distribution companies with complex inventory and shipping needs. The firm will assess ERP utilization, review processes, and identify needs that are unmet today or emerging soon, focusing on recommendations that optimize clients’ investments and on managing the implementation process.

Source: CloudStrategyMag

Cambridge Semantics Names Steve Hamby Managing Director, Government

Cambridge Semantics has announced the appointment of Steve Hamby as managing director, government.

In this newly created position, Hamby will serve Cambridge Semantics’ federal government customers, who seek big data discovery, analysis, and data management solutions, such as the Anzo Smart Data Lake™, that provide timely, accurate, and customizable information to staff, citizens, media, and businesses.

“We are delighted to have Steve join us as managing director, government,” said Alok Prasad, president of Cambridge Semantics. “With our rapidly expanding client roster in the public space, Steve’s addition to the team will permit us to further develop our market presence as big data analysis becomes indispensable to delivering effective and efficient government services.”

Hamby brings over 30 years of experience in the information technology industry to the company, most recently serving public sector customers as the CEO of G Software, Inc. and as chief technology officer for Orbis Technologies, Inc. In 2013, he was recognized by the American Business Awards™ as Technology Executive of the Year, Silver Award for his pioneering efforts on cloud-based HUMINT- and OSINT-centric fusion products at Orbis Technologies. Hamby is also a published author who often speaks at major industry conferences. He holds a bachelor’s degree in management from the University of North Alabama and a master’s degree from Jacksonville State University.

“It’s an exciting time for Cambridge Semantics to step up its presence in the public sector,” said Hamby. “Government agencies have a tremendous interest in semantic-based smart data discovery and analytic solutions, and I look forward to working with these organizations to help them simplify data access and discovery for the citizenry.”

Source: CloudStrategyMag

SAIC Introduces Cloud Migration Edge

Science Applications International Corp. (SAIC) has launched Cloud Migration Edge™, a multi-tiered methodology that migrates and transforms customers’ current IT applications and systems to a cloud environment securely and effectively. As a cloud services integrator, SAIC teams with the best cloud technology providers to engineer solutions that meet customers’ individual needs.

Cloud Migration Edge is a holistic, five-phase approach that encompasses specialized tools, processes, and best practices to guide the cloud migration life cycle and drive ongoing improvement. This formalized framework supports the step-by-step implementation of a mission-centric cloud computing environment by breaking the cloud migration process into standardized components at each layer of the IT service life cycle.

“Our advanced cloud expertise and proven methodology allow our federal government customers to rapidly and securely integrate and adapt cloud technologies to improve delivery of their IT services,” said Charles Onstott, SAIC senior vice president and general manager of the Cyber, Cloud, and Data Science Service Line. “To accomplish this, we have taken our IT business transformation, cybersecurity, and cloud computing expertise to deliver a systematic approach to cloud migration, while applying IT Infrastructure Library best practices.”

Additionally, SAIC’s customized approach includes several aspects of business transformation such as policies, processes, security, governance, architecture, applications, and change/risk management.

“Our cloud services integration solution creates a comprehensive and secure IT environment, crafted to meet our customers’ unique requirements, using both existing customer investments and modern cloud technologies,” said Coby Holloway, SAIC vice president of Cloud Computing and Business Transformation Services.

SAIC works with customers to analyze their requirements and business needs, develop the appropriate architecture, design the migration approach, and implement the transition plans to include change and risk management. SAIC also establishes a new operations and maintenance model based on the target architecture that includes cloud management and continuous service improvement.

“Migration is not just about the applications, it is about transforming the way business and missions are performed while providing new capabilities that cloud-based systems enable,” Onstott continued. “We evaluate the current system and requirements, future needs, what makes sense to migrate and how, the risks involved, the transition process needed, policies, people, processes and how those are affected, and develop the best implementation plan to transition the business with the lowest impacts on productivity and current operations.”

As part of SAIC’s total solution, Cloud Migration Edge uses industry-leading capabilities from Amazon Web Services, EMC, NetApp, Red Hat, VMware, and others. As a cloud services integrator, SAIC is able to bring the best solutions from its partners across the cloud computing industry, avoiding vendor bias and lock-in.

SAIC Cloud Migration Edge five-phase methodology:

  • Assess and Strategize: SAIC defines objectives and builds a cloud strategy that meets technical, regulatory compliance, and security requirements. This involves creating assessments, building requirements, developing a business case, and outlining a return on investment.
  • Design: SAIC tailors a solution that includes the cloud platform, security, management, monitoring, and final design to achieve each customer’s goals. SAIC uses a comprehensive systems engineering approach to create both a final cloud-enabled infrastructure as well as a detailed migration strategy that includes transformation of the customer’s IT processes and organization to a cloud service delivery model.
  • Transition: During this phase, SAIC migrates IT services to the cloud with minimal disruption using a unique managed business transformation approach, including an implementation plan, operational testing, and final execution.
  • Operate: SAIC orchestrates cloud services to meet performance levels using proven processes to mitigate risk with constant monitoring. SAIC will organize, monitor, verify, report, and manage various operational and governance activities that ensure the production environment meets or exceeds performance metrics. SAIC also introduces heavy automation to increase the efficiency and consistency of the new services, and to facilitate onboarding and cloud service adoption.
  • Improve: SAIC capitalizes on the flexibility of cloud-enabled architectures to optimize service value. During this phase, SAIC provides customers with services including project management, staff augmentation, data migration, workload migration, independent verification and validation testing, and concept-of-operations updates. Customers benefit from the lessons learned and best practices developed across all of SAIC’s cloud work, which are used to continually update the Cloud Migration Edge approach and its implementations. This phase involves evaluating service delivery and identifying and implementing opportunities for improvement.

Source: CloudStrategyMag

Qligent Integrates Big Data Capabilities Into Vision Cloud Monitoring Platform

Qligent is building big data conditioning into its Vision cloud monitoring platform in time for IBC2016 (September 9-13, RAI Exhibition Center, Stand 8.E47). The integration of this new software will help broadcasters and media businesses leverage big data insights much more quickly and easily for multiplatform content delivery.

As has always been the case in television, viewers quickly lose patience and tune out if broadcast quality suffers. The challenge for broadcasters and new media businesses, including OTT service providers, is the sheer cost and complexity of monitoring a quickly escalating density of streams and channels. The Vision cloud monitoring platform gives users a wider palette to monitor these many streams, from the studio headend to the last mile, more effectively and more cost-efficiently.

At IBC2016, visitors to the Qligent stand can learn how the new big data and other advanced capabilities built into Vision enhance analysis across both linear and non-linear TV and video streams. This includes rich, detailed, and customized presentations that combine and structure specific QoE parameters so the data can be viewed in a meaningful and actionable manner (a simplified computation sketch follows this list), including:

  • Percentage of macroblocking, freeze, black, and other artifacts in a program stream
  • Quality of advertising playout over a specific period of time
  • Presentation of off-air time over a broadcast day or week
  • Capture, verification and correlation of embedded metadata
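
As a simplified illustration of how metrics like the first of these can be computed (generic NumPy, not Qligent's implementation; the thresholds are assumptions), the sketch below flags black and frozen frames in a decoded clip and reports them as percentages:

```python
import numpy as np  # pip install numpy

# Assumed thresholds, for illustration only; production QoE tools tune these.
BLACK_LUMA_MAX = 16.0   # mean luma at or below this counts as a black frame
FREEZE_DIFF_MAX = 1.0   # mean abs frame-to-frame change at or below this = frozen

def qoe_percentages(frames: np.ndarray) -> dict:
    """frames: (num_frames, height, width) array of luma values in 0-255."""
    means = frames.reshape(len(frames), -1).mean(axis=1)
    black = means <= BLACK_LUMA_MAX
    diffs = np.abs(np.diff(frames.astype(np.float32), axis=0))
    frozen = diffs.reshape(len(diffs), -1).mean(axis=1) <= FREEZE_DIFF_MAX
    n = len(frames)
    return {"black_pct": 100.0 * black.sum() / n,
            "frozen_pct": 100.0 * frozen.sum() / n}

# Synthetic clip: 100 noise frames with frames 10-14 black and 50-54 frozen.
clip = np.random.randint(20, 235, size=(100, 72, 128)).astype(np.float32)
clip[10:15] = 0.0
clip[51:55] = clip[50]
print(qoe_percentages(clip))  # black_pct 5.0; frozen_pct 8.0 (black run is static too)
```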

“Many of our current and prospective customers in the broadcast space share that big data is the only way to reconnect and stay connected with what used to be their captive audiences,” said Ted Korte, COO, Qligent. “There has been an explosion of non-linear avenues for content delivery across gaming consoles, mobile devices and hundreds of social media sites, all stealing eyeballs, time and attention. Everything the linear TV service provider once understood is now completely fragmented, and these customers need a new set of data-centric tools to understand the quality of the viewing experience, and how to monetize that data moving forward.”

Vision users can opt to create and manage big data widgets for on-site analysis, or farm out the application to Qligent’s managed service layer via the company’s Oversight MaaS (“Monitoring as a Service”) platform. This further drives down the costs and labor associated with monitoring multiple streams and sources across many delivery platforms.

“The fact is that the stress around multiplatform monitoring can cause many headaches in understaffed and under-skilled facilities, to the point where they may not know they are off the air on a specific platform until receiving a complaint,” said Korte. “While the tried-and-true linear model still catches more eyeballs and viewers on initial impressions, to remain competitive, broadcasters and MVPDs need to be on as many of these emerging platforms as possible with engaging, high-quality content. Our big data capabilities in Vision will help our customers understand the quality of experience across these many platforms. That data really represents the viewer feedback that isn’t typically received, and will help our customers understand when and why viewers tuned out, and how to rectify any viewing quality problems.”

Source: CloudStrategyMag