Two Acquisitions Consolidate Local Hosting and Data Center Markets
In two separate announcements this week, Hostopia acquires an Australian web hosting business to grow its Asia-Pacific reach, while 365 Data Centers grows its data center footprint.
Source: TheWHIR
QTS Releases Cloud-Based FedRAMP-Compliant System
Responding to increasing Government demand for outsourced cloud and hybrid data center solutions, QTS Realty Trust has announced deployment of the nation’s first multi-tenant unemployment insurance and single-state tax system in the cloud.
The Mississippi Department of Employment Security (MDES) leads a consortium of Mississippi, Rhode Island, Maine, and Connecticut, the first states to go live on the system, which supports the U.S. Department of Labor’s mandate to identify and develop promising, repeatable practices that reduce costs throughout the workforce system.
QTS was chosen to develop and host a compliant, cloud-based solution that could accommodate multiple states with the expectation of achieving high levels of commonality. QTS provided a VMware-centric, hybrid IT solution that integrates QTS’ FedRAMP-compliant Government cloud and colocation services from QTS’ Dulles, VA, and Phoenix data centers.
“We are one step closer to creating the future for delivery of unemployment solutions in a cost-effective manner, while being conscientious stewards of our state workforce funds,” said Dale Smith, chief operating officer, MDES. “QTS has been a trusted partner in developing the innovative multi-tenant hybrid IT solution that will be utilized by many states for years to come.”
The new system is expected to attract the attention of states that may be struggling to meet their obligations to pay unemployment benefits. In Mississippi, operating the consortium system is expected to cost approximately 40% less than the on-premise single-state system, leaving much-needed funding to meet the needs of Mississippi’s unemployed.
“We are pleased to support MDES and the U.S. Department of Labor in their mission to transform State and Local government processes,” said David McOmber, executive vice president, public sector & federal, QTS. “QTS’ Government Solutions is committed to providing the Federal marketplace with innovative and highly secure hybrid data center solutions.”
Source: CloudStrategyMag
NTT Com Launches Enterprise Cloud For ERP
NTT Communications Corporation (NTT Com) has announced the global launch of its Enterprise Cloud for ERP, a multi-tenant cloud platform service for SAP installations, in collaboration with Dell Technologies Group’s Virtustream, Inc. and EMC Japan K.K., effective immediately.
Incorporating managed services, the new solution delivers a cloud-based, packaged SAP platform to customers in Japan, the Americas, Europe, and Australia, giving them the ability to access and run mission-critical core SAP systems in the cloud on a global basis.
The Enterprise Cloud for ERP service offers options for virtual HANA (up to 2.9TB of memory) and physical HANA (up to 8.0TB of memory), available as a multi-tenant cloud platform supporting SAP HANA’s in-memory database architecture. In addition to the base service, a “high availability” option offers enhanced resilience and disaster recovery through diverse routing and duplicate data centers in Tokyo and Osaka. The service also includes pay-as-you-go options that leverage μVM resource-management technology to regulate CPU and memory usage, giving customers the flexibility and reliability of a cloud environment that combines virtual and physical HANA. The service aims to reduce total cost of ownership by up to 65% compared to on-premise systems.
In addition to the SAP cloud platform, the new solution delivers a managed service for the OS, SAP Basis, and SAP HANA, as well as a system-monitoring service and a support offering that includes access to a technical account manager. The latter provides ITIL-based operational support, operational status analysis, and best-practice consultancy services. Customers can operate the platform easily and efficiently by selecting from a wide choice of menus, allowing them to focus on designing their own work processes and on application development.
Outside Japan, NTT Com will offer a total SAP solution in the Americas, Europe, and Australia incorporating the same onboarding and managed services, enabling customers to seamlessly use globally standardized SAP infrastructure and managed services in a one-stop cloud environment.
NTT Com plans to expand the functionality of Enterprise Cloud for ERP, and the number of regions where it is available, in collaboration with Virtustream and EMC Japan. The company also plans to offer a cloud management platform providing integrated management of multiple cloud environments, which will facilitate connectivity with Virtustream’s overseas cloud, as well as cloud platform connecting services by means of NTT Com’s Software-Defined Exchange Service.
Source: CloudStrategyMag
Bossie Awards 2017: The best databases and analytics tools
CockroachDB is a cloud-native SQL database for building global, scalable cloud services that survive disasters. Built on a transactional and strongly consistent key-value store, CockroachDB scales horizontally, survives disk, machine, rack, and even datacenter failures with minimal latency disruption and no manual intervention, supports strongly consistent ACID transactions, and provides a familiar SQL API for structuring, manipulating, and querying data. CockroachDB was inspired by Google’s Spanner and F1 technologies.
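Because CockroachDB is wire-compatible with PostgreSQL, a standard Postgres driver such as psycopg2 is enough to try it out. Below is a minimal sketch assuming a local single-node cluster on the default SQL port; the database and table names are illustrative.

```python
# A minimal sketch, assuming a local single-node CockroachDB cluster
# on the default SQL port (26257) and a database created beforehand
# with CREATE DATABASE bank. Names here are illustrative.
import psycopg2

conn = psycopg2.connect(
    host="localhost", port=26257, user="root", dbname="bank"
)

with conn.cursor() as cur:
    cur.execute(
        "CREATE TABLE IF NOT EXISTS accounts (id INT PRIMARY KEY, balance INT)"
    )
    # UPSERT is CockroachDB's insert-or-update statement.
    cur.execute("UPSERT INTO accounts (id, balance) VALUES (1, 100), (2, 250)")
conn.commit()  # the writes commit as one ACID transaction

with conn.cursor() as cur:
    cur.execute("SELECT id, balance FROM accounts ORDER BY id")
    for account_id, balance in cur.fetchall():
        print(account_id, balance)

conn.close()
```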
— Martin Heller
Source: InfoWorld Big Data
Bossie Awards 2017: The best machine learning tools
Core ML is Apple’s framework for integrating trained machine learning models into an iOS or macOS app. Core ML supports Apple’s Vision framework for image analysis, Foundation framework for natural language processing, and GameplayKit framework for evaluating learned decision trees. Currently, Core ML cannot train models itself, and the only trained models available from Apple in Core ML format are for image classification. However, Core ML Tools, a Python package, can convert models from Caffe, Keras, scikit-learn, XGBoost, and LIBSVM.
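As a rough illustration, converting a trained Keras classifier with coremltools might look like the sketch below; the model file, input name, and class labels are placeholders, not anything shipped by Apple.

```python
# A hedged sketch of a Keras-to-Core-ML conversion with the
# coremltools package. The model file, input name, and class labels
# are placeholders, not anything shipped by Apple.
import coremltools

coreml_model = coremltools.converters.keras.convert(
    "my_classifier.h5",            # hypothetical trained Keras model
    input_names="image",
    image_input_names="image",     # treat the input tensor as an image
    class_labels=["cat", "dog"],   # placeholder output labels
)
coreml_model.author = "Example"
coreml_model.short_description = "Toy image classifier"
coreml_model.save("MyClassifier.mlmodel")  # ready to add to an Xcode project
```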
— Martin Heller
Source: InfoWorld Big Data
IDG Contributor Network: The 80/20 data science dilemma
The emergence of cloud has led to an explosion of data that has left data scientists in high demand. A job that didn’t exist a decade ago has topped Glassdoor’s ranking of best roles in America for two years in a row, based on salary, job satisfaction, and number of job openings. It was even dubbed the “sexiest job of the 21st century” by the Harvard Business Review.
Though their ranks are growing, data scientists remain scarce and busy. A recent study projects that demand for data scientists and analysts will grow by 28 percent by 2020, on top of the current market need. According to LinkedIn, there are more than 11,000 data scientist job openings in the US as of late August. Unless something changes, this skills gap will continue to widen.
Against this backdrop, helping data scientists work more efficiently should be a key priority, which is why it matters that most data scientists currently spend only 20 percent of their time on actual data analysis.
The reason data scientists are hired in the first place is to develop algorithms and build machine learning models—and these are typically the parts of the job that they enjoy most. Yet in most companies today, 80 percent of a data scientist’s valuable time is spent simply finding, cleaning and reorganizing huge amounts of data. Without the right cloud tools, this task is insurmountable.
Hard work behind the scenes
When they begin to grapple with and make sense of the many different data streams coming in via cloud-connected devices and systems, data scientists must first identify relevant data sets within their data storage repositories, otherwise known as data lakes. This is no small task.
Unfortunately, many organizations’ data lakes have turned into dumping grounds, with no easy way to search for data and unclear strategies and policies around what data is safe to share more broadly. Data scientists often find themselves contacting different departments for the data they need and waiting weeks for it to be delivered, only to find that it doesn’t provide the information they need or, worse, has serious quality issues. At the same time, responsibility for data governance (or data-sharing policies) often falls on data scientists, since corporate-level governance policies can be confusing, inconsistent, or difficult to enforce.
Even when they can get their hands on the right data, data scientists need time to explore and understand it. The data may be in a format that can’t be easily analyzed, and with little to no metadata to help, the data scientist may need to seek advice from the data owner. After all this, the data still needs to be prepared for analysis, which involves formatting, cleaning, and sampling it. In some cases, scaling, decomposition, and aggregation transformations are required before data scientists are ready to start training models.
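To make that 80 percent concrete, the sketch below shows what these formatting, cleaning, sampling, and aggregation steps often look like in pandas; the file and column names are hypothetical.

```python
# A minimal sketch of those formatting, cleaning, and sampling steps
# in pandas; the file and column names are hypothetical.
import pandas as pd

df = pd.read_csv("sensor_readings.csv")

# Formatting: normalize column names and parse timestamps.
df.columns = [c.strip().lower() for c in df.columns]
df["timestamp"] = pd.to_datetime(df["timestamp"], errors="coerce")

# Cleaning: drop unparseable rows, remove duplicates, fill gaps.
df = df.dropna(subset=["timestamp"]).drop_duplicates()
df["reading"] = df["reading"].fillna(df["reading"].median())

# Sampling: a 10% random sample for faster exploration.
sample = df.sample(frac=0.10, random_state=42)

# Aggregation: resample to hourly means before model training.
hourly = df.set_index("timestamp")["reading"].resample("1H").mean()
```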
Organizational structure can also cause inefficiencies in the analysis process. Data scientists and developers traditionally work in silos, with each group performing a related but isolated task. This creates bottlenecks, increases the potential for error, and dries up resources. A unified approach, which leverages cloud platforms and includes proper data governance, boosts efficiency and helps data scientists collaborate both internally and with developers.
Why it’s such a conundrum
These processes can be time-consuming and tedious, but they are crucial. Since models generally improve as they are exposed to increasing amounts of data, it’s in data scientists’ best interests to include as much data as they can in their analysis.
However, due to deadlines and time crunches, data scientists can often be tempted to compromise on the data they use, aiming for “good enough” rather than optimal results.
Making hasty decisions during model development can lead to widely different outputs and potentially render a model unusable once it’s put into production. Data scientists are constantly making judgment calls, and starting out with incomplete data can easily lead them down the wrong path.
To balance quality against time constraints, data scientists are generally forced to focus on one model at a time. If something goes wrong, they are forced to start all over again. In effect, they’re obliged to double down on every hand, turning data science into a high-stakes game of chance.
Escaping these pitfalls
Using cloud data services to automate many of the tedious processes associated with finding and cleansing data gives data scientists back more time for analysis without compromising the quality of the data they use, and enables them to build the best foundation for AI and cognitive apps.
A solid cloud data platform features intelligent search capabilities to help data scientists find the data they need, while metadata such as tags, comments and quality metrics help them decide whether a data set will be useful, and how best to extract value from it. Integrated data governance tools also give data scientists confidence that they are permitted to use a given data set, and that the models and results they produce will be used responsibly by others.
As a result, data scientists gain the time they need to build and train multiple models simultaneously. This spreads out the risk of analytics projects, encouraging experimentation that yields breakthroughs without focusing resources on a single approach that may turn out to be a dead end.
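As a simple illustration of spreading that risk, the sketch below cross-validates several candidate models side by side rather than committing to one up front; the dataset and model choices are stand-ins, not anything the article prescribes.

```python
# An illustrative sketch of keeping several candidate models alive at
# once instead of betting on one; the dataset and models are stand-ins.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

X, y = load_breast_cancer(return_X_y=True)

candidates = {
    "logistic_regression": LogisticRegression(max_iter=1000),
    "random_forest": RandomForestClassifier(n_estimators=200, random_state=0),
    "gradient_boosting": GradientBoostingClassifier(random_state=0),
}

# Cross-validate every candidate; if one turns out to be a dead end,
# the others are already trained and scored.
for name, model in candidates.items():
    scores = cross_val_score(model, X, y, cv=5)
    print(f"{name}: mean accuracy {scores.mean():.3f}")
```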
Cloud platforms can also equip data scientists with services to save, access, and extend models, enabling them to use existing assets as templates for new projects instead of starting from scratch every time. The concept of transfer learning, which focuses on preserving the knowledge gained while solving one problem and applying it to a different but related problem, is a hot topic in the machine learning world. Developing visualizations with data science tools helps communicate how models work while saving time and reducing risk.
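Here is a minimal sketch of the transfer-learning idea in Keras, assuming an image task: reuse a network pre-trained on ImageNet, freeze its layers, and train only a small new head for the related problem. The input shape and class count are placeholders.

```python
# A minimal transfer-learning sketch in Keras, assuming an image task:
# reuse ImageNet features and train only a small new head. The input
# shape and the five output classes are placeholders.
from keras.applications import MobileNet
from keras.layers import Dense, GlobalAveragePooling2D
from keras.models import Model

base = MobileNet(weights="imagenet", include_top=False,
                 input_shape=(224, 224, 3))
for layer in base.layers:
    layer.trainable = False  # preserve the knowledge already learned

x = GlobalAveragePooling2D()(base.output)
x = Dense(128, activation="relu")(x)
outputs = Dense(5, activation="softmax")(x)  # new, related problem

model = Model(inputs=base.input, outputs=outputs)
model.compile(optimizer="adam", loss="categorical_crossentropy",
              metrics=["accuracy"])
# model.fit(...) on the new dataset would follow here.
```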
Data scientists play an essential role in pushing forward innovation and garnering competitive advantage for companies. By giving data science teams the cloud data tools needed to flourish today, the 80/20 dilemma becomes a thing of the past.
This article is published as part of the IDG Contributor Network.
Source: InfoWorld Big Data
Cisco Introduces Management Platform and Strategy For UCS And HyperFlex
Cisco has unveiled Cisco Intersight™, a management and automation platform for Cisco Unified Computing System™ (Cisco UCS®) and Cisco HyperFlex™ Systems, marking the start of a multi-year strategic program to provide customers with clear IT advantages that support their competitive business goals. Cisco Intersight simplifies data center operations by delivering systems management as a service, alleviating the need to maintain islands of on-premise management infrastructure. Complete system lifecycle management is delivered by Cisco Intersight through machine learning, analytics, and automation.
Today, most IT organizations are adopting a multicloud strategy, and as a result customers need a scalable and consistent management environment across their data center, private cloud, and public cloud deployments. Customers also seek consistent management and policy enforcement across bare-metal server environments, converged infrastructure, and hyperconverged infrastructure. Cisco Intersight will deliver the unique ability to connect and manage all of these comprehensively.
Application architectures are also transforming, with scale-out and multi-site deployment models delivered by containers and microservices. In addition, DevOps accelerates the rate of application development and continuous feature delivery. Cisco Intersight addresses these challenges, helping IT staff optimize operations while enjoying a more intuitive user experience.
According to Gartner, “The growing large quantity of configuration data points and connections aggregated from data center deployments, edge, cloud and IoT will increase IT ops complexity, countering trends in software-defined simplicity.”[1]
Cisco has invested years of research and development into software innovations to bridge these gaps. With the cost of an unplanned data center outage estimated at thousands of dollars per minute, and the extraordinary costs of corporate security breaches, often due to human error, IT leaders welcome new advances in intelligent automation. Customers are already participating in an engineering preview of the Intersight platform, connecting thousands of UCS and HyperFlex systems for testing and feedback.
It’s about the Management, not the Machines
“Organizations that move to cloud-based systems management platforms will find that service delivery quality is significantly improved, the overall risk to the business goes down, and IT staff productivity is increased,” said Matt Eastwood, Senior Vice President, IDC. “Artificial Intelligence (AI)-infused, cloud-based management tools can offer deep insights into the state of the infrastructure, identify troubles before they become major issues, and enable quicker ‘root cause’ identification and analysis of issues.”
Intersight is designed to deliver a new, higher level of simplicity and intelligence that is intuitive from the start and continues to learn and evolve over time:
- Pervasive Simplicity: Cisco Intersight features a dynamic user interface that can be customized by user role. As a cloud-based service, new functionality is delivered via portal updates without burdening customers with upgrades, and the experience scales seamlessly as customers directly connect new systems for management. The platform is designed to constantly learn to help make daily IT operations easier. Analytics combined with tight integration with the Cisco Technical Assistance Center (TAC) constantly improve the assistance provided through the recommendation engine.
- Expertise: Tight Cisco Technical Assistance Center (TAC) integration and insights from Cisco and the UCS community provide recommendations and best practices.
- Continuous Optimization: The ability for the Intersight platform to provide actionable intelligence will grow stronger over time from the power of cloud-based machine learning. It can learn from the collective experience of the UCS user community, as well as the best practices of Cisco experts and peers. This will enable better predictive analytics and resource utilization to be provided through the recommendation engine.
- Agile Delivery: IT can rapidly respond to business demands and frequent changes while maintaining the policy enforcement designed to implement safe and reliable business services. Cisco Intersight is API-driven, and Cisco UCS and HyperFlex are fully programmable systems, supporting development and operations tool chains for continuous integration and delivery.
- Constant Protection: The Cisco Intersight service adheres to the stringent security standards of Cisco InfoSec, and helps enable secure communication between the Cisco Intersight SaaS platform and managed endpoints, allowing applications to be deployed and updated securely.
Cisco Intersight will be available in Q4 2017 and is designed to integrate and coexist with existing UCS and HyperFlex management tools, so customers can adopt it as they wish without added complexity. An on-premise deployment model will be available in the future. Cisco Intersight is built on an extensible architecture with OData standards-based RESTful APIs and a connector framework that simplifies third-party software and hardware integrations.
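The announcement does not document specific endpoints, but an OData-style query against a RESTful API typically looks like the hypothetical sketch below; the host, resource path, field names, and auth header are all illustrative assumptions rather than published Intersight API details.

```python
# A hypothetical sketch of an OData-style query against a RESTful
# management API. The host, resource path, field names, and auth
# header are illustrative assumptions, not documented Intersight calls.
import requests

BASE = "https://intersight.example.com/api/v1"  # placeholder host

resp = requests.get(
    f"{BASE}/compute/PhysicalSummaries",          # assumed resource name
    params={
        "$filter": "OperState eq 'ok'",           # OData filter expression
        "$select": "Name,Model,Serial",           # project a few fields
        "$top": 10,                               # limit the page size
    },
    headers={"Authorization": "Bearer <token>"},  # placeholder credential
)
resp.raise_for_status()
for item in resp.json().get("Results", []):
    print(item.get("Name"), item.get("Model"))
```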
The Cisco Intersight Base Edition will be available at no charge. It includes global health monitoring and inventory, a customizable dashboard, the HyperFlex Installer to quickly deploy clusters, and the ability to context-launch the UCS Manager, IMC and HyperFlex Connect element managers.
The Cisco Intersight Essentials Edition includes all the functionality of the Base edition as well as policy-based configuration with service profiles, firmware management with scheduled updates, Hardware Compatibility Listing (HCL) compliance checks and upgrade recommendations, and other features.
Bringing Next Generation Systems Management to Over 60,000 UCS, HyperFlex and Converged Infrastructure Customers
Cisco recently announced its fifth-generation UCS servers with the high-performance UCS M5 series and an accompanying release of UCS Manager that provides connector support for Cisco Intersight. UCS M5 systems are also now available in Cisco’s hyperconverged infrastructure solution with the new Cisco HyperFlex M5 nodes, including HyperFlex Edge, with Cisco Intersight providing cloud-based cluster deployment capability in the Base Edition. Cisco leads the industry with the most comprehensive portfolio of converged infrastructure solutions; Cisco Intersight creates opportunities for deeper partner integrations and even greater operational simplicity.
As customers include Cisco UCS, HyperFlex and Intersight in their IT modernization initiatives, Cisco offers a comprehensive lifecycle of data center services from advisory through optimization, managed technical and learning services to improve cost-efficiency and reduce risk. Cisco’s portfolio of services leverages global expertise, proven processes, and innovative methodologies to help customers accelerate and simplify operations.
[1] Gartner, New Generations of Integrated Systems Will Apply AI and Emerge as Self-Organizing Systems of Intelligence, August 2017
Source: CloudStrategyMag
DigitalOcean Launches Drag-and-Drop Object Storage
In response to developer demand, DigitalOcean has released its latest product, called Spaces, for user-friendly object storage.
Source: TheWHIR
Why Managed Hosting is the Best Option for Most WordPress Users
WordPress is user-friendly and intuitive, but for users without server and web application management experience, managed WordPress hosting is the smart choice.
Source: TheWHIR
Report: Public Cloud Market Is Growing
New Q2 data from Synergy Research Group shows that over the last 24 months, quarterly spend on all data center hardware and software has grown by just 5%, while spending on the public cloud portion of that has grown by 35%. The private cloud infrastructure market has also grown, though not as strongly as public cloud, while spending on traditional, non-cloud data center hardware and software has dropped by 18%. ODMs in aggregate account for the largest portion of the public cloud market, with Cisco being the leading individual vendor, followed by Dell EMC and HPE. The Q2 market leader in private cloud was Dell EMC, followed by HPE and Microsoft. The same three vendors led in the non-cloud data center market, though with a different ranking.
Total data center infrastructure equipment revenues, including both cloud and non-cloud, hardware and software, were over $30 billion in the second quarter, with public cloud infrastructure accounting for over 30% of the total. Private cloud or cloud-enabled infrastructure accounted for over a third of the total. Servers, OS, storage, networking and virtualization software combined accounted for 96% of the Q2 data center infrastructure market, with the balance comprising network security and management software. By segment, HPE is the leader in server revenues, while Dell EMC has a strong lead in storage and Cisco is dominant in the networking segment. Microsoft features heavily in the rankings due to its position in server OS and virtualization applications. Outside of these four, the other leading vendors in the market are IBM, VMware, Huawei, Lenovo, Oracle, and NetApp.
“With cloud service revenues continuing to grow by over 40% per year, enterprise SaaS revenue growing by over 30%, and search/social networking revenues growing by over 20%, it is little wonder that this is all pulling through continued strong growth in spending on public cloud infrastructure,” said John Dinsdale, a Chief Analyst and Research Director at Synergy Research Group. “While some of this is essentially spend resulting from new services and applications, a lot of the increase also comes at the expense of enterprises investing in their own data centers. One outcome is that public cloud build is enabling strong growth in ODMs and white box solutions, so the data center infrastructure market is becoming ever more competitive.”
Source: CloudStrategyMag