IDG Contributor Network: Five core attributes of a streaming data platform

As your data-driven organization considers incorporating new data sources like mobile apps, websites that serve a global audience, or sensor information from the internet of things, technologists will have questions about the required attributes of a streaming data platform.

Five core attributes are necessary to implement an integrated streaming platform, enabling both the acquisition of streaming data and the analytics that make streaming applications possible:

Low latency: Streaming data platforms need to match the pace of the data sources they acquire data from as part of a stream. One key to streaming data platforms is the ability to match the speed of data acquisition with the requirements of the near-real-time analytics needed to disrupt particular business models or markets. The value of real-time streaming analytics diminishes when you have to wait for the data to land in a data warehouse or a Hadoop-based data lake architecture. In particular, for location-based services and predictive maintenance applications, the delay between when data is created and when it lands in a data management environment represents, at best, a missed customer opportunity and, at worst, a stranded multi-million-dollar asset critical to your business operations.

Scalable: Streaming data platforms are not just connecting a couple of data sources behind the corporate firewall. They need to match the projected growth of connected devices and the internet of things, which means streaming data from a very large number of sources — potentially millions or even billions, both internal and external. The sketch below illustrates both of these attributes in miniature.
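As a concrete (if simplified) illustration, here is a minimal sketch of a low-latency, horizontally scalable stream consumer. Everything specific in it is an assumption for illustration only: a Kafka-compatible broker at localhost:9092, a hypothetical topic named "sensor-events", a hypothetical consumer group "ingest-workers", and the kafka-python client library; no particular platform mandates these choices.

```python
# A minimal sketch, assuming a Kafka-compatible broker on localhost:9092,
# a hypothetical topic "sensor-events", and the kafka-python client
# (pip install kafka-python). Illustrative only.
import time

from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "sensor-events",                     # hypothetical topic name
    bootstrap_servers="localhost:9092",  # hypothetical broker address
    group_id="ingest-workers",           # workers sharing this group id split
                                         # the topic's partitions between them
    auto_offset_reset="latest",          # start from fresh events only
)

for record in consumer:
    # record.timestamp is the event's broker timestamp in epoch milliseconds;
    # comparing it against the local clock approximates end-to-end latency.
    latency_ms = time.time() * 1000 - record.timestamp
    if latency_ms > 500:
        print(f"warning: {latency_ms:.0f} ms behind the source")
    # near-real-time analytics on record.value would go here
```

Low latency here is simply the absence of a batch landing step: each event is inspected the moment it arrives. Scalability comes from the consumer group; launching more identical workers causes the broker to rebalance partitions across them, so capacity grows with the number of processes or machines.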

4 Winners and 3 Losers in Gartner's Magic Quadrant for IaaS

Gartner has released the results of its Magic Quadrant for Infrastructure as a Service for 2016. The winners in the public cloud space are innovating and adding new features rapidly, while the losers are falling further and further behind. Here’s a look at some of the highlights of the report.

SEE ALSO: For Gartner, Two Cloud Providers Stand Out from the Pack

Winner: Amazon Web Services

AWS is the clear leader in the IaaS space with “a diverse customer base and the broadest range of use cases.” Its partner ecosystem combined with its training and certification programs “makes it easier to adopt and operate AWS in a best-practice fashion,” Gartner says.

Gartner notes that optimal use of AWS may require professional services, and recommends the use of third-party cost management tools to keep track of cloud expenses. Of course, all of this is good news for the latest crop of cloud service providers who are positioning their services around providing better support for AWS.

Winner: Microsoft

Microsoft Azure is considered one of the big three IaaS providers right now. Gartner says Microsoft’s strengths include integrated IaaS and PaaS components that “operate and feel like a unified whole”, rapid addition of new features and services, and becoming more open – including its support of Red Hat earlier this year.

And Gartner isn’t the only one that recognizes Azure’s mass appeal: other recent research has predicted that adoption of Azure by CIOs could surpass AWS by 2019.

Like AWS, successful implementation of Azure relies on customers forming relationships with partners. But Gartner says that while Microsoft “has been aggressively recruiting managed service and professional services partners… many of these partners lack extensive experience with the Azure platform, which can compromise the quality of the solutions they deliver to customers.”

But it’s not necessarily the fault of the partners; Gartner says that “CMP vendors and MSPs report challenges in working with Azure, particularly in the areas of API reliability and secure authentication, which are slowing their ability to deliver solutions.”

Winner: Google

Google’s capabilities in the IaaS space rely heavily on its own experience running the back-end of its behemoth search engine. In other words, Google allows other companies to “run like Google” which makes it the top contender for cloud-native use cases and applications.

But Google is lacking in key areas, which could limit further adoption by established organizations and startups; namely, “user management suitable for large organizations, granular and customizable role-based access control (RBAC), complex network topologies equivalent to those in enterprise data centers, and software licensing via a marketplace and license-portability agreement.”

Unlike AWS and Microsoft, who have been fairly supportive of partners, Google has focused more on delivering its cloud services direct, even pushing some MSPs to vow to never work with the company.

Winner: Rackspace

Rackspace has its roots in the OpenStack cloud, but it has since shifted to a more technology-neutral stance, embracing “its roots as ‘a company of experts’” by offering managed AWS support and other managed services for third-party clouds. Rackspace is also strong when it comes to private cloud offerings.

What has held Rackspace back? According to Gartner, it has not been able to keep up with the pace of innovation of the market leaders.

Gartner also hinted that Rackspace could become an acquisition target – which was at least partially confirmed this week as reports surfaced that it is close to a deal with private equity firm Apollo.

Loser: VMware

While Gartner acknowledges that VMware is the market share leader in virtualization, vCloud Air has “limited appeal to the business managers and application development leaders who are typically the key decision makers for cloud IaaS sourcing.”

“VMware is no longer significantly expanding the geographic footprint of vCloud Air, nor investing in the engineering necessary to expand its feature set beyond basic cloud IaaS,” Gartner says.

Loser: NTT Communications

Though NTT Communications (NTT Com) has a strong presence in Asia-Pacific – a challenging market for many IaaS providers – its basic cloud IaaS offering is not enough to set it apart from its competitors.

Gartner says it is “missing capabilities that would make it attractive to enterprise IT operations organizations” – a gap that could be somewhat addressed by its cloud services brokerage (CSB) portal, expected to launch this year, which brings together its own offerings and third-party clouds.

Loser: Fujitsu

Gartner says that Fujitsu’s cloud IaaS capabilities “lag significantly behind those of the market leaders” and “it will continue to need to aggressively invest in acquiring and building technology in order to be competitive in this market.”

Source: TheWHIR

Cambridge Semantics Names Steve Hamby Managing Director Government

Cambridge Semantics has announced the appointment of Steve Hamby as managing director, government.

In this newly created position, Hamby will serve Cambridge Semantics’ federal government customers seeking insights from big data discovery, analysis, and data management solutions, such as the Anzo Smart Data Lake™, to provide timely, accurate and customizable information to staff, citizens, media and businesses.

“We are delighted to have Steve join us as managing director government,” said Alok Prasad, president of Cambridge Semantics. “With our rapidly expanding client roster in the public space, Steve’s addition to the team will permit us to further develop our market presence as big data analysis becomes indispensable to delivering effective and efficient government services.”

Hamby brings over 30 years of experience in the information technology industry to the company, most recently serving public sector customers as the CEO of G Software, Inc. and as chief technology officer for Orbis Technologies, Inc. In 2013, he was recognized by the American Business Awards™ as Technology Executive of the Year, Silver Award for his pioneering efforts on cloud-based HUMINT- and OSINT-centric fusion products at Orbis Technologies. Hamby is also a published author who often speaks at major industry conferences. He holds a bachelor’s degree in management from the University of North Alabama and a master’s degree from Jacksonville State University.

“It’s an exciting time for Cambridge Semantics to step up its presence in the public sector,” said Hamby. “Government agencies have a tremendous interest in semantic-based smart data discovery and analytic solutions, and I look forward to working with these organizations to help them simplify data access and discovery for the citizenry.”

Source: CloudStrategyMag

SAIC Introduces Cloud Migration Edge

Science Applications International Corp. (SAIC) has launched Cloud Migration Edge™, a multi-tiered methodology that migrates and transforms customers’ current IT applications and systems to a cloud environment securely and effectively. As a cloud services integrator, SAIC teams with the best cloud technology providers to engineer solutions that meet customers’ individual needs.

Cloud Migration Edge is a holistic, five-step approach that encompasses specialized tools, processes, and best practices to guide the cloud migration life cycle and provide ongoing improvement. This formalized framework supports the step-by-step implementation of a mission-centric cloud computing environment by breaking down the cloud migration process into standardized components at each layer of the IT service life cycle.

“Our advanced cloud expertise and proven methodology allow our federal government customers to rapidly and securely integrate and adapt cloud technologies to improve delivery of their IT services,” said Charles Onstott, SAIC senior vice president and general manager of the Cyber, Cloud, and Data Science Service Line. “To accomplish this, we have taken our IT business transformation, cybersecurity, and cloud computing expertise to deliver a systematic approach to cloud migration, while applying IT Infrastructure Library best practices.”

Additionally, SAIC’s customized approach includes several aspects of business transformation such as policies, processes, security, governance, architecture, applications, and change/risk management.

“Our cloud services integration solution creates a comprehensive and secure IT environment, crafted to meet our customers’ unique requirements, using both existing customer investments and modern cloud technologies,” said Coby Holloway, SAIC vice president of Cloud Computing and Business Transformation Services.

SAIC works with customers to analyze their requirements and business needs, develop the appropriate architecture, design the migration approach, and implement the transition plans to include change and risk management. SAIC also establishes a new operations and maintenance model based on the target architecture that includes cloud management and continuous service improvement.

“Migration is not just about the applications, it is about transforming the way business and missions are performed while providing new capabilities that cloud-based systems enable,” Onstott continued. “We evaluate the current system and requirements, future needs, what makes sense to migrate and how, the risks involved, the transition process needed, policies, people, processes and how those are affected, and develop the best implementation plan to transition the business with the lowest impacts on productivity and current operations.”

As part of SAIC’s total solution, Cloud Migration Edge uses industry-leading capabilities from Amazon Web Services, EMC, NetApp, RedHat, VMware, and others. As a cloud services integrator, SAIC is able to bring the best solutions from our partners across the cloud computing industry, avoiding vendor bias and lock-in.

SAIC’s Cloud Migration Edge five-phase methodology:

  • Assess and Strategize: SAIC defines objectives and builds a cloud strategy that meets technical, regulatory compliance, and security requirements. This involves creating assessments, building requirements, developing a business case, and outlining a return on investment.
  • Design: SAIC tailors a solution that includes the cloud platform, security, management, monitoring, and final design to achieve each customer’s goals. SAIC uses a comprehensive systems engineering approach to create both a final cloud-enabled infrastructure as well as a detailed migration strategy that includes transformation of the customer’s IT processes and organization to a cloud service delivery model.
  • Transition: During this step, SAIC migrates IT services to the cloud with minimal disruption using its unique managed business transformation approach, including an implementation plan, operational testing, and final execution.
  • Operate: SAIC orchestrates cloud services to meet performance levels using proven processes to mitigate risk with constant monitoring. SAIC will organize, monitor, verify, report, and manage various operational and governance activities that ensure the production environment meets or exceeds performance metrics. SAIC also introduces heavy automation to increase the efficiency and consistency of the new services, and to facilitate onboarding and cloud service adoption.
  • Improve: SAIC capitalizes on the flexibility of cloud-enabled architectures to optimize service value. During this phase, SAIC provides customers with services including project management, staff augmentation, data migration, workload migration, independent verification and validation testing, and concept of operations updates. Customers benefit from the lessons learned and best practices developed across all of SAIC’s cloud work, which are used to continually update the Cloud Migration Edge approach and its implementations. This phase involves evaluating service delivery and identifying and implementing opportunities for improvement.

Source: CloudStrategyMag

Qligent Integrates Big Data Capabilities Into Vision Cloud Monitoring Platform

Qligent is building big data conditioning into its Vision cloud monitoring platform in time for IBC2016 (September 9-13, RAI Exhibition Center, Stand 8.E47). The integration of this new software will help broadcasters and media businesses leverage big data insights much more quickly and easily for multiplatform content delivery.

As has always been the case in television, viewers quickly lose patience and tune out if broadcast quality suffers. The challenge for broadcasters and new media businesses, including OTT service-providers, is the sheer cost and complexity of monitoring a quickly escalating density of streams and channels. The Vision cloud monitoring platform gives users a wider palette to monitor these many streams from the studio headend to the last mile more effectively — and cost-efficiently.

At IBC2016, visitors to the Qligent stand can learn how the new big data and other advanced capabilities built into Vision enhance analysis across both linear and non-linear TV and video streams. This includes rich, detailed, and customized presentations that combine and structure specific QoE parameters so the data can be viewed in a meaningful, actionable manner, including the measures below (a simple roll-up sketch follows the list):

  • Percentage of macroblocking, freeze, black, and other artifacts in a program stream
  • Quality of advertising playout over a specific period of time
  • Presentation of off-air time over a broadcast day or week
  • Capture, verification and correlation of embedded metadata
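As a rough illustration of how percentages like these might be rolled up: Qligent has not published Vision's internal data model, so the per-frame flags, artifact names, and sample data below are purely hypothetical.

```python
# Hypothetical roll-up of per-frame QoE flags into artifact percentages;
# the flag names and sample data are illustrative, not Vision's actual model.
from collections import Counter

# One entry per decoded frame: "ok" means clean, otherwise the artifact seen.
frame_flags = ["ok", "ok", "macroblocking", "ok", "freeze", "black", "ok"]

counts = Counter(frame_flags)
total = len(frame_flags)
for artifact in ("macroblocking", "freeze", "black"):
    pct = 100.0 * counts[artifact] / total
    print(f"{artifact}: {pct:.1f}% of frames")
```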

“Many of our current and prospective customers in the broadcast space share that big data is the only way to reconnect and stay connected with what used to be their captive audiences,” said Ted Korte, COO, Qligent. “There has been an explosion of non-linear avenues for content delivery across gaming consoles, mobile devices and hundreds of social media sites, all stealing eyeballs, time and attention. Everything the linear TV service provider once understood is now completely fragmented, and these customers need a new set of data-centric tools to understand the quality of the viewing experience—and how to monetize that data moving forward.”

Vision users can opt to create and manage big data widgets for on-site analysis, or farm out the application to Qligent’s managed service layer via the company’s Oversight MaaS (“Monitoring as a Service”) platform. This further drives down the costs and labor associated with monitoring multiple streams and sources across many delivery platforms.

“The fact is that the stress around multiplatform monitoring can cause many headaches in understaffed and under-skilled facilities, to the point where they may not know they are off the air on a specific platform until receiving a complaint,” said Korte. “While the tried-and-true linear model still catches more eyeballs and viewers on initial impressions, to remain competitive, broadcasters and MVPDs need to be on as many of these emerging platforms as possible with engaging, high-quality content. Our big data capabilities in Vision will help our customers understand what the quality of experience is across these many platforms. That data really represents the viewer feedback that isn’t typically received, and will help our customers understand when and why viewers tuned out—and how to rectify any viewing quality problems.”

Source: CloudStrategyMag

As Rackspace Mulls Private Equity, We Ask: Why Do Public Companies Go Private?

On Thursday news broke that managed cloud services provider Rackspace is in advanced talks with a private equity firm. If the deal goes through, Rackspace will be the latest tech company to go private after being a publicly traded company, following SolarWinds, Dell, and others before it.

IPOs have not been kind to tech companies this year, and the slowdown is hitting investment banks hard. According to a report by the San Francisco Chronicle this week, revenue for U.S. investment banks dropped 20 percent year-over-year to $16.1 billion in the first half of 2016. Investment banks’ IPO revenue fell 58 percent from $1.1 billion in the first half of 2015 to $450 million in the first half of 2016.

This trend, according to Bulger Partners managing director Doug Melsheimer, in an interview with Fortune, is not too surprising “given how much everyone complains about the burden of being a public company and how much money is swirling around the private equity landscape.”

We asked Structure Research founder and managing director Philbert Shih to provide some insight into why a company that is already public would want to go private.

“There are obvious financial benefits for management and shareholders given that a buyout typically involves a very healthy premium on the current stock price,” Shih says. “One of the primary benefits of going private is to focus on a long-term strategy and spend less time meeting quarterly expectations and complex regulatory and compliance requirements. This is a unique point in Rackspace’s history and going private will allow it to execute on some of the big decisions it has made – i.e. the shift to a managed third party cloud model – without the pressure from shareholders to hit numbers and continually drive immediate value.”

For firms like SolarWinds, going private can be the best choice for the company’s future growth. SolarWinds CEO Kevin Thompson told NetworkWorld earlier this year: “It is never an easy decision to go private because it’s a change in the strategy and course you were on, and ultimately you need to get 100 percent alignment with your board and your management team.”

Source: TheWHIR

Data Centers' Water Use Has Investors on High Alert

By Justin Morton

(Bloomberg) — Data centers, used by governments and large corporations to house their computer systems, have one big environmental problem: They get hot.

To keep them from overheating, large data centers can pump hundreds of millions of gallons of water a year through the facilities, according to company reports. That high demand for water has some investors concerned, especially in places where natural water resources are becoming ever more precious, like tech-heavy California.

READ MORE: Here’s How Much Water All US Data Centers Consume

“We definitely want our portfolio companies to be cognizant of their water use and take the appropriate steps to minimize their water use and recycle water,” said Brian Rice, portfolio manager at the California State Teachers’ Retirement System, which manages about $189 billion in assets as of June 30. He cited water usage as a concern at data centers as well as at other portfolio companies, such as those in agriculture.

Golden State

California—home to companies running some of the world’s biggest data centers—houses more than 800 of the facilities, the most of any U.S. state, according to Dan Harrington, research director of 451 Research LLC, a technology consulting firm.

Water usage there is especially a concern as the state’s drought pushes into its fifth year. California Governor Jerry Brown issued an executive order in May to extend statewide emergency water restrictions, establishing long-term measures to conserve water.

RELATED: Why Salesforce Bought Coolan, a Data Center Optimization Startup

The water risk to investors of California-based companies operating data centers will not affect them gradually, said Julie Gorte, senior vice president of sustainable investing at Pax World Management LLC. “It will probably come in one big splashy moment,” she said.

As a result, some sustainable-minded investors are trying to enhance their understanding of water risk before it becomes a liability, said Cate Lamb, head of water at investor environmental advocacy group CDP. The group held a series of workshops this year for investors to discuss their most crucial water reporting needs, such as isolating water risk of individual assets. The number of institutional investors committed to its water engagement program with companies has grown to 617 from 150 in 2010.

Operational efficiencies at data centers have a direct link to companies’ profitability and pose an increasing risk for investors in a “tense” climate change environment, said Himani Phadke, research director at the Sustainability Accounting Standards Board, a non-profit that writes corporate sustainability reporting guidelines for investors.

Companies, like investors, are trying to get ahead of the risk.

Corporate Response

Bill Weihl, director of sustainability at Facebook Inc., said the company uses a combination of fresh air and water to cool its data centers. In 2015, Facebook said it used a total of 221 million gallons of water, with 70 percent of that consumption at its data facilities. “We designed our data centers to use about half the water a typical data center uses,” he said in e-mailed answers to questions.

Around Facebook’s Prineville, Oregon, data center in particular, water efficiency has become “a big issue,” Weihl said. The center is east of the Cascade Mountains, in a region that tends to be drier than the western side of the state, and businesses must compete with farmers and a growing local population for water.

Weihl said rainwater capture and reuse, which is used for irrigation and toilet-flushing at the center, saves 272,000 gallons of municipally treated water per year. Facebook is also working with the City of Prineville and its engineers on the town’s water plan, which includes water mitigation and recycling “grey water” from buildings, he said.

Water consumption at eBay Inc.’s Salt Lake City-based data center rose 14 percent in 2014 to 31,354 gallons, according to the online retailer’s sustainability report, while its Phoenix facility saw usage drop 3 percent to 57,421 gallons. A company spokeswoman declined to comment.

Google declined to say how much water the company’s data centers use, but said that the company redesigns its cooling technology on average about every 12 to 18 months. The company has also designed data centers that use air instead of water for cooling, it said.

“There is no ‘one size fits all’ model — each data center is designed for the highest performance and highest efficiency for that specific location and we’re always testing new technologies to further our commitment to efficiency and environmental responsibility,” vice president of data center operations Joe Kava said in an e-mail adapted from an earlier blog post.

Growing Issue

The environmental impact of data centers is poised to grow as the world produces more data each day. Carbon emissions from data centers already represent 0.2 percent of the world’s total carbon dioxide emissions, compared to 0.6 percent for airlines, according to a 2010 McKinsey & Co. report. And more companies are developing larger data centers as they transition to cloud computing, increasing the demand for water needed for cooling their data servers, said Pax World’s Gorte.

The need to boost water-use efficiency at data centers is driving some companies to open locations near water sources and in cooler climates. Menlo Park, California-based Facebook, for example, began operations at its overseas data center in Lulea, Sweden, near the Arctic Circle, in 2013. Mountain View, California-based Google operates a total of 15 data centers, with four located in northern Europe.

Investor concern about corporate water use will only continue to grow, said William Sarni, director and practice leader of Water Strategy at Deloitte Consulting LLP.

“Over the past few years, we have seen a dramatic increase of interest in water as a business risk and also as a business opportunity issue,” said Sarni. “I see it accelerating.”

Source: TheWHIR

Nearly Half of Developers Worldwide Are Android-First: Report

Almost half of professional developers now consider Android to be their primary platform, according to research from VisionMobile. The latest edition of its semi-annual State of the Developer Nation Q3 2016 report also shows a strong correlation between developers’ cloud and desktop platforms of choice.

Based on responses from over 16,000 developers globally, the VisionMobile Developer Economics survey shows that 47 percent of developers are Android-first, a seven-percentage-point increase that gives the platform a 79 percent mindshare among mobile developers. The increased attention came almost directly at the expense of iOS, which fell from being the primary platform of 39 percent of developers to only 31 percent in just six months.

READ MORE: AWS Sweetens Developer Pitch with Cloud9 Acquisition

The increasing influence of markets and developers in the Eastern hemisphere, where Android leads iOS significantly, could be part of the reason for the shift. The end of the conflict between Google and Oracle over Android’s Java development kit at the very end of 2015 may also have had an effect.

In addition to mobile platforms, the report focuses on desktop and cloud developer “tribes,” the IoT market, and the new technologies attracting developer attention.

Among Windows classic developers, 36 percent primarily use C# for cloud development, as opposed to only 2 percent of Linux-first developers and 3 percent of macOS developers, according to the report.

SEE ALSO: Cloud: Understanding Sizing and Capacity Requirements Driven by IoT

The share of new IoT developers fell drastically from half a year ago to 22 percent, after a more gradual slide from 57 percent in Q2 2015 to 47 percent in Q4 2015. The main target of IoT developers is the Smart Home, which was also the fastest-growing IoT application, up six percentage points to 48 percent. Ericsson has estimated that there will be 3 billion IoT devices in North America alone by 2021, which represents a lot of work for developers.

RELATED: Bsquare’s IoT Software Stack Helps Developers Link Devices to the AWS Cloud

The next big thing, judging by developer interest, is data science and machine learning, which 41 percent are involved with in some way, one-third of those professionally. Just under one-quarter of developers are working with augmented and virtual reality, mostly as a hobby or side-project.

Source: TheWHIR

Rackspace in Acquisition Talks with Private Equity Firm: Report

UPDATE, 08/05/16: Sources tell Fortune that Apollo is the interested private equity buyer.

News this week from the Wall Street Journal that Rackspace could be sold to a private equity firm pushed its shares up 16.91 percent to $31.03 in after-hours trading on Thursday. The San Antonio-based cloud company is in advanced talks with one or more private equity firms in a deal that would value the company at around $4 billion, according to a report by Barron’s blog.

If the headline sounds familiar, it’s because almost exactly two years ago, reports indicated that Rackspace was exploring the option of taking the company private. It hired Morgan Stanley to help it explore its M&A options at the time, but nothing ever came of it.

Over the past two years under a new CEO, Rackspace has expanded its portfolio to include support and managed services for some of the most popular public clouds, including AWS. This strategy has helped it appeal to new customers and extend its reach beyond web hosting.

According to a report by Market Realist earlier this week, “Rackspace’s increased customer signing from AWS is an encouraging sign, considering Amazon rules the cloud space with a 31 percent market share. Moreover, its deal with Microsoft’s Azure is also beneficial for the company, as Microsoft is rapidly making its presence felt in the cloud space.”

Source: TheWHIR

Want Your ISP to Respect Your Privacy? It May Come at a Cost

Comcast has filed an argument (PDF) this week with the FCC to allow it to charge broadband users more to offset the burdens of maintaining their privacy. The FCC is considering new rules for Protecting the Privacy of Customers of Broadband and Other Telecommunications Services, which would require ISPs to disclose what information is tracked and sold, as well as to provide a way for users to opt out of such tracking.

Advertisers have complained that consumers could end up with fewer privacy protections while large volumes of content move behind paywalls; consumer advocates have argued that the proposed rules simply move the FCC closer to the stronger privacy protection consumers were entitled to under FTC regulation, before broadband providers were reclassified as common carriers for regulation purposes last year.

READ MORE: FCC Open Internet Rules Upheld in Federal Court

“A bargained-for exchange of information for service is a perfectly acceptable and widely used model throughout the U.S. economy, including the Internet ecosystem, and is consistent with decades of legal precedent and policy goals related to consumer protection and privacy,” Comcast wrote to the FCC. The company also claims that blocking its plan “would harm consumers by, among other things, depriving them of lower-priced offerings.”

AT&T is already using this model to charge users of its gigabit broadband service a $30 (or more) add-on charge to opt out of a tracking program called, without any obvious irony in the promotional material, “Internet Preferences.”

In the most recent Who Has Your Back report from the Electronic Frontier Foundation (EFF), which measures the privacy practices of major internet companies and service providers, Comcast earned three out of a possible five stars. The report recommends Comcast adopt a stronger policy around providing users with notice about government data requests.

Source: TheWHIR