The Post Safe Harbor Era: New Opportunities for Service Providers

At the beginning of June, Reuters reported that for the first time, fines were imposed on companies in Germany that continue to act as if the Safe Harbor privacy principles are still valid. The US-EU agreement was invalidated by the EU's highest court in October 2015.

Immediately after the ruling, new negotiations began, for obvious reasons. The stakes are very high: transferring EU personal data to the US without the explicit consent of each individual user is prohibited. Every citizen, consumer organization, or regulator can, in principle, start legal proceedings against companies that ignore the ruling.

This truly is a sword of Damocles hanging over the market and nearly everyone in the IT industry wants it removed as quickly as possible.

The complexity is apparent. The public and policy makers focus their attention on companies that hold data. The relevance of the current situation to the hosting and data center industry is, however, still underexposed.

Let’s have a closer look at three common issues:

  1. The Infrastructure

Companies that use software or hardware to process personal data should be aware of the "call home" function built into many applications and devices, used mainly for maintenance and monitoring. Do they know that during that process, parts of the personal data can be transferred as well?

Even if all the data is stored and processed in the EU, some of it may be transferred to non-EU countries. That possibility concerns just about anything in a server rack: servers, switches, routers, storage.

Who is responsible for that hardware? Some providers and data centers have already received inquiries about this matter; still, not all have answers that will reassure users and end users. Switching off these features is an option. We know that, for example, the public sector in several EU member states requires that option, with "off" as the default mode, for all new appliances.

  2. Cloud Marketplaces

Data centers and large providers increasingly offer connectivity via their platforms to the infrastructure of third-party clouds. Providers should analyze all components of those marketplaces to find out whether there is a possibility of personal data being transferred to non-EU countries.

Sound complicated? You’re right.

  3. Data of Employees

Let's turn to one detail of those first fines in Germany. The imposed amount, equivalent to about US$32,000, is not particularly high. What is striking is that the local supervisory authority (because of Germany's federal structure, each state has its own regulator) found violations at Adobe, Punica, and Unilever in two areas: customer data and employee data were being exposed.

Most press attention to date has gone to the data of customers and/or website visitors. For all multinationals, the second category is extremely relevant. Some multinationals, even before the October court ruling, decided to abandon the idea of centrally controlled payroll and similar functions. They opted instead for decentralized solutions to avoid possible violation of EU legislation. Imagine the costs of that decision and the work IT has to do to make it happen.

So what does this mean for the daily business of service providers, hosters and data centers in this complex situation?

If you are based in the US, the bad news is that chances are you have end-user data on your systems that the customer is not allowed to store or process. That is the customer's responsibility, but their decision to terminate could hurt you. The second point is about the marketplaces that might include services that move or copy data to other regions. You have to be transparent about that, because a misinformed customer could cost you sales.

There is some good news as well: transparency and clear communication are more rewarding than ever.

EU companies are confused by the post-Safe Harbor implications and the upcoming GDPR, and are looking for clear answers. US companies are also looking for future-proof solutions for dealing with customers and situations in the EU.

There are providers, both in the US and the EU, that consider their proven knowledge of this lesser-known data traffic and their ability to advise on application and data migration to specific geographical areas a unique selling proposition.

Each year, HostingCon Europe focuses on the issues, trends, and legislation that affect your business. Attend to get cutting-edge information about changing market conditions and how to navigate challenges in the EU marketplace. Learn about post-Safe Harbor security issues with our panel of experts, including US attorney David Snead and Alban Schmutz, SVP at OVH, the number one hosting provider in Europe.

Source: TheWHIR

Google-Backed FASTER Submarine Cable to Go Live This Week

Brought to you by Data Center Knowledge

FASTER, the Google-backed submarine cable that adds much needed network bandwidth between data centers in the US and data centers in Japan, Taiwan, and the broader Asia-Pacific market, has been completed, about two years after the project was first announced. The cable will start carrying traffic on Thursday, a Google spokesperson said via email.

As more and more data is generated and transferred around the world, demand for connectivity is skyrocketing. There has been an increase in submarine cable construction activity in response, with major internet and cloud services companies like Google, which are among the biggest consumers of bandwidth, playing a bigger and bigger role in this industry.

The FASTER system lands in Bandon, Oregon; two cities in Japan, Chikura and Shima; and in Tanshui, Taiwan, according to TeleGeography's submarine cable map. The cable landing stations are connected to nearby data centers, from where the traffic is carried to other locations in their respective regions.

On the US side, data center providers Telx (owned by Digital Realty Trust), CoreSite, and Equinix have made deals to support the new system. A Telx data center in Hillsboro, Oregon, is connected to the landing station in Bandon. FASTER traffic will be backhauled to Equinix data centers in Silicon Valley, Los Angeles, and Seattle. CoreSite's big connectivity hub in Los Angeles will also have direct access to the system.

The FASTER submarine cable system lands in the US, Japan, and Taiwan (Source: TeleGeography, Submarine Cable Map)

Google invested $300 million as a member of the consortium of companies that financed the submarine cable's construction. Other members are China Mobile International, China Telecom Global, Malaysian telco Global Transit Communications, Japanese telco KDDI, and Singaporean telco Singtel.

Both KDDI and Singtel are also major data center services players. Singtel is the biggest data center provider in Singapore, one of Asia's most important network interconnection hubs, and has a partnership with Aliyun, the cloud services arm of China's internet giant Alibaba. KDDI subsidiary Telehouse operates data centers throughout Asia, as well as in Europe, the US, and Africa.

The rate of growth in global internet traffic has been breathtaking. Cisco’s latest Global Cloud Index projects the amount of traffic flowing between the world’s data centers and their end users to grow from 3.4 zettabytes in 2014 to 10.4 zettabytes in 2019. It would take the world’s entire 2019 population streaming music for 26 months straight to generate 10.4 zettabytes of traffic, according to Cisco’s analysts.
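
To put those Cisco figures in perspective, here is a rough back-of-the-envelope check; the world population figure and streaming bitrate below are assumptions for illustration, not numbers from the Cisco report:

```python
# Rough sanity check of Cisco's 10.4-zettabyte comparison.
# Assumptions (not from the article): ~7.6 billion people in 2019,
# music streamed at ~160 kbit/s.

ZETTABYTE = 10**21                      # bytes
traffic_2014 = 3.4 * ZETTABYTE
traffic_2019 = 10.4 * ZETTABYTE

# Implied compound annual growth rate over the five-year span
cagr = (traffic_2019 / traffic_2014) ** (1 / 5) - 1
print(f"Implied CAGR 2014-2019: {cagr:.1%}")        # roughly 25% per year

population_2019 = 7.6e9                 # assumed world population
stream_bytes_per_sec = 160_000 / 8      # 160 kbit/s music stream

seconds_per_person = traffic_2019 / (population_2019 * stream_bytes_per_sec)
months = seconds_per_person / (60 * 60 * 24 * 30)
print(f"Months of continuous streaming per person: {months:.0f}")  # ~26
```

Under those assumptions the arithmetic lands close to Cisco's "26 months of music streaming" figure, which suggests the comparison assumes a fairly typical streaming bitrate.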

Learn more: Data Center Network Traffic Four Years From Now: 10 Key Figures

Cloud will be responsible for the majority of all that traffic four years from now, according to Cisco, so it comes as no surprise that the cloud giants have ramped up investment in global network infrastructure.

Amazon Web Services, the biggest cloud provider, made its first major investment in a submarine cable project earlier this year. The Hawaiki Submarine Cable, expected to go live in June 2018, will increase bandwidth on the network route between the US, Australia, and New Zealand. Amazon agreed to become the cable’s fourth anchor customer, which finalized the financing necessary to build the system.

Microsoft and Facebook announced in May a partnership to finance construction of a transatlantic cable called MAREA, which will land in Virginia and Bilbao, Spain.

Microsoft is also an investor in the New Cross Pacific Cable System, a transpacific cable that will land in Oregon, China, South Korea, Taiwan, and Japan, and the transatlantic system called Express, which will land in Canada, the UK, and Ireland.

Source: TheWHIR

How to get your mainframe's data for Hadoop analytics

Many so-called big data — really, Hadoop — projects have patterns. Many are merely enterprise integration patterns that have been refactored and rebranded. Of those, the most common is the mainframe pattern.

Because most organizations run the mainframe and its software as a giant single point of failure, the mainframe team hates everyone. Its members hate change, and they don't want to give you access to anything. However, there is a lot of data on that mainframe, and if it can be done gently, the mainframe team is interested in people learning to use the system rather than starting from scratch. After all, the company has only begun to scratch the surface of what the mainframe and the existing system have available.

Many great data integration techniques can't be used in an environment where new software installs are highly discouraged, as is the case with the mainframe pattern. However, rest assured there are plenty of techniques to get around these limitations.

Sometimes the goal of mainframe-Hadoop or mainframe-Spark projects is just to look at the current state of the world. More frequently, however, the goal is trend analysis and tracking changes in a way the existing system doesn't support. This requires the set of techniques known as change data capture (CDC).
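
As a rough illustration of what CDC boils down to when the source system can only hand you periodic extracts, the sketch below is a simplified snapshot-diff approach, not a description of any particular tool or of the mainframe's own formats; it compares two keyed snapshots and emits insert, update, and delete events:

```python
# Minimal snapshot-diff change data capture (CDC) sketch.
# Assumes each extract is a dict keyed by the record's primary key;
# real mainframe extracts would first need to be parsed from fixed-width
# or EBCDIC-encoded files before reaching this step.

def capture_changes(previous: dict, current: dict):
    """Yield (change_type, key, record) tuples between two snapshots."""
    for key, record in current.items():
        if key not in previous:
            yield ("insert", key, record)
        elif previous[key] != record:
            yield ("update", key, record)
    for key, record in previous.items():
        if key not in current:
            yield ("delete", key, record)

# Example: two nightly extracts of an account table
yesterday = {1001: {"balance": 250}, 1002: {"balance": 80}}
today     = {1001: {"balance": 300}, 1003: {"balance": 10}}

for change in capture_changes(yesterday, today):
    print(change)
# ('update', 1001, ...), ('insert', 1003, ...), ('delete', 1002, ...)
```

The resulting change events are what you would append to Hadoop or Spark tables to support the trend analysis the existing system can't do on its own.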

Amazon's Elastic File System is now open for business

Following an extended preview period, Amazon’s Elastic File System is now generally available in three geographical regions, with more on the way.

Originally announced last year, EFS is a fully managed elastic file storage service for deploying and scaling durable file systems in the Amazon Web Services cloud. It’s currently available in the U.S. East (northern Virginia), U.S. West (Oregon), and EU (Ireland) regions, the company announced Wednesday.

Customers can use EFS to create file systems that are accessible to multiple Amazon Elastic Compute Cloud (Amazon EC2) instances via the Network File System (NFS) protocol. They can also scale those systems up or down without needing to provision storage or throughput.
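
For readers who want a feel for what that looks like in practice, here is a minimal sketch using the AWS SDK for Python (boto3); the region, subnet ID, and security group below are placeholder assumptions, and the EC2 instance still mounts the resulting file system over standard NFS:

```python
# Hedged sketch: create an EFS file system and a mount target with boto3.
# Subnet and security group IDs are placeholders, not real resources.
import boto3

efs = boto3.client("efs", region_name="us-east-1")

# Create the elastic file system; no capacity or throughput provisioning needed.
fs = efs.create_file_system(CreationToken="demo-efs-2016")
fs_id = fs["FileSystemId"]

# Expose it inside a VPC subnet so EC2 instances can reach it over NFS.
efs.create_mount_target(
    FileSystemId=fs_id,
    SubnetId="subnet-0123456789abcdef0",          # placeholder
    SecurityGroups=["sg-0123456789abcdef0"],      # placeholder; must allow TCP 2049
)

# On the EC2 instance the file system is then mounted with a standard
# NFSv4 client, for example:
#   sudo mount -t nfs4 <fs_id>.efs.us-east-1.amazonaws.com:/ /mnt/efs
print(f"File system {fs_id} created; mount it from EC2 instances over NFS.")
```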

EFS is designed for a wide range of file workloads, including big data analytics, media processing, and genomics analysis, AWS said.

BigCommerce Helps Online Merchants Boost Pinterest Profits

BigCommerce is helping its merchants convert Pinterest fans into paying customers with the launch of Pinterest Buyable Pins on desktop. BigCommerce launched Pinterest Buyable Pins in May, but it was initially only available for mobile devices.

According to an announcement by BigCommerce this week, while the majority of Pinterest’s traffic comes from its mobile app (80 percent), the majority of checkout experiences still take place via desktop.

SEE ALSO: Salesforce Acquires Demandware for Ecommerce Expertise

With Pinterest Buyable Pins, BigCommerce merchants can give shoppers the ability to browse and purchase products directly on Pinterest.

Buyable Pins was Pinterest's first major move toward monetizing its platform and converting its roughly 100 million monthly active users into shoppers. With more than 1 in 5 consumers engaging with brands on Pinterest, this functionality could not come soon enough for retailers.

Pinterest has released a new multi-device shopping cart, available on Android now, and in the coming months on iOS. With this release shoppers can now add items to a persistent shopping cart which they can access by logging into their Pinterest account on multiple devices. With this shopping cart buyers can also buy from multiple merchants simultaneously.

“This type of holistic multi-device shopping experience, where a user’s login saves product information and activity, positions Pinterest more actively as an ecommerce marketplace, presenting a wide variety of brands and products on a single platform,” BigCommerce managing editor Tracey Wallace said in a blog post on Tuesday.

For ecommerce hosting providers to stay relevant in the new digital marketplace, where more people are making purchase decisions based on social media, being able to help your merchant customers sell where their customers are buying will be key.

Source: TheWHIR

Younger generation more optimistic about big data, study says

IT professionals aged 18 to 34 are much more optimistic than their elders that “big data” analysis will fundamentally change how business is conducted in the next few years, according to an IDG Enterprise survey.

Older IT professionals were more skeptical about the transformative power of big data, the survey found. The reason may be that “older respondents have seen many supposedly transformational technologies come and go throughout their careers. It’s possible that they’re simply less willing to predict that any particular trend … will be a source of fundamental change,” the IDG study said.

In addition, the 18 to 34 age group was more likely than other age groups to say that big-data projects ought to analyze social network comments for consumer sentiment.

The study is based on a survey of 724 IT decision-makers who reported that their organizations are currently implementing, planning or considering big-data projects. IDG Enterprise is a division of IDG Communications, Computerworld‘s parent company.

The IDG report also covers:

  • The top business objectives for data-driven initiatives
  • The top pain points in data projects
  • How IT is handling security for big data

Register to download a free copy of the executive summary: IDG Enterprise 2016 Data & Analytics Survey.

Source: InfoWorld Big Data

Oracle Brings SPARC To The Cloud

Oracle has announced major new additions to the SPARC platform that for the first time bring the advanced security, efficiency, and simplicity of SPARC to the cloud. Built on the new SPARC S7 microprocessor, the latest additions to the SPARC platform include new cloud services, engineered systems and servers.

While the business benefits of the public cloud are increasingly clear, many organizations have yet to move enterprise workloads to the cloud due to performance, security, and management concerns. To help eliminate those concerns and enable organizations to confidently move enterprise workloads to the cloud, the new SPARC platform is designed from the ground up to economically improve cloud delivery of the most critical business applications and scale-out application environments.

The latest additions to the SPARC platform are built on the revolutionary new 4.27 GHz, 8-core/64-thread SPARC S7 microprocessor with Software in Silicon features such as Silicon Secured Memory and Data Analytics Accelerators, which delivers the industry’s highest per-core efficiency and enables organizations to run applications of all sizes on the SPARC platform at commodity price points. All existing commercial and custom applications will run on the new SPARC enterprise cloud services and solutions unchanged with significant improvements in security, efficiency and simplicity.

“We are still in the early phases of cloud computing adoption and as the market matures, organizations will increasingly move critical enterprise workloads to the cloud,” said John Fowler, executive vice president, Systems, Oracle. “To enable our customers to take advantage of this next stage of cloud computing to speed innovation, reduce costs and drive business growth, we are focused on delivering proven enterprise-grade services such as the Oracle SPARC platform in the Oracle Cloud.”

Some of the key features of the new SPARC platform include:

Effortless Security: The latest additions to the SPARC platform are designed for security and compliance and utilize Silicon Secured Memory capabilities to address malware attacks and programming errors. Wide-key encryption ciphers and hashes enable a fully encrypted cloud with less than two percent performance overhead. In addition, security is further enhanced through verified boot, immutable content that prevents unauthorized changes, enforced secured updates and a trusted and secure hardware and software supply chain that does not rely on intermediaries.

Breakthrough Efficiency: By taking advantage of the open APIs in the processor and the integrated Data Analytics Accelerators, which deliver up to 10x greater analytics performance spanning enterprise, big data, and cloud applications, the latest additions to the SPARC platform reduce latency and cost. Compared to x86 servers, the fully integrated S7-2 and S7-2L servers deliver up to 100 percent better per-core efficiency, 1.7x better per-core Java performance efficiency, 1.6x better per-core database OLTP performance efficiency, and 2-3x more bandwidth for high-traffic analysis and cloud apps.

Straightforward Simplicity: Taking integration a step beyond the server, the Oracle MiniCluster S7-2 Engineered System dramatically simplifies the top four most challenging aspects of enterprise computing: security and compliance; high availability; patching and administration; and performance tuning. By eliminating the need for a standard platform or OS and reducing security and database administration time and effort, the new Engineered System enables organizations to:

  • Secure systems by default by eliminating enterprise security expertise requirements
  • Automate compliance monitoring and auditing in order to maintain the secure state of the system over time
  • Make service resiliency effortless by taking advantage of a high availability operation that is engineered in to the hardware and software
  • Ensure the platform will always be up to date with the latest software and security enhancements through simple full stack patching
  • Enhance database and application performance through automatic performance tuning.

The new SPARC S7 processor-based cloud services and systems deliver commodity x86 economics and significant enterprise-class functionalities for security and analytics with Software in Silicon. They include new Oracle Cloud Compute platform services, the Oracle MiniCluster S7-2 Engineered System and Oracle SPARC S7 servers. These new products are designed to seamlessly integrate with existing infrastructure and include fully integrated virtualization and management for cloud.

The new Oracle SPARC Cloud service that is now part of the SPARC platform is a dedicated compute service to provide organizations with a simple, secure and efficient compute platform in the Cloud. The new service extends the complete suite of cloud services that Oracle provides to help organizations rapidly build and deploy rich applications — or extend Oracle Cloud Applications — on an enterprise-grade cloud platform.

To extend the security and performance benefits of the Oracle SuperCluster engineered systems to mid-size computing, Oracle has also introduced Oracle MiniCluster S7-2. Through full application and database compatibility with SuperCluster M7, Oracle MiniCluster enables organizations to cut hardware and software spending to a fraction of the cost of commodity solutions. The new Oracle Engineered System is designed to support multi-tenant application and/or database consolidation, remote office/branch office computing demands, and test/development environments.

Oracle has also introduced new additions to the SPARC server product line that extend the M7/T7 portfolio to address scale-out and cloud workloads at attractive new low price points. The new two-socket SPARC S7 servers are available in different configuration options that are optimized for either compute or storage and IO density and include Software in Silicon offload features for malware attack prevention, no compromise encryption and data analytics acceleration.

Source: CloudStrategyMag

Firms in Regulated Industries Smarten Up on Cybersecurity, Encrypt More than Ever

The number of businesses making extensive use of encryption spiked seven percent over the past year, the largest increase in over a decade, according to research released Wednesday by Thales. More than two in five companies (41 percent) now use extensive encryption, the 2016 Encryption Applications Trend Study shows.

Ponemon surveyed over 5,000 professionals from 14 industries in 11 countries on behalf of Thales for the 11th annual study. It found that because of regulations, privacy concerns, and the need to protect against breaches, companies in financial services, healthcare and pharmaceutical, and technology are leading encryption adoption.

RELATED: Despite Increased Awareness of Encryption, Many Internet Users Think it’s Too Complicated

“The increased usage of encryption can be traced to many factors, chief among them being cyber-attacks, privacy compliance regulations and consumer concerns,” John Grimm, senior director security strategy at Thales e-Security said. “Additionally, the continuing rise of cloud computing as well as prominent news stories related to encryption and access to associated keys have caused organizations to evolve their strategy and thinking with respect to encryption key control and data residency. Our global research shows that significantly more companies are embracing an enterprise-wide encryption strategy, and demanding higher levels of performance, cloud-friendliness, and key management capabilities from their encryption applications.”

The study also found that the way companies think about encryption applications changes as their encryption practices mature.

Companies with mature encryption strategies are more likely to deploy Hardware Security Modules (HSMs) broadly across encryption applications. SSL/TLS, database encryption, and application level encryption are the most common uses for HSMs, the study said.
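
As a point of reference for the "application level encryption" category, the sketch below shows what encrypting a single field in application code can look like with an authenticated cipher. It is an illustrative example only: the key is generated in software here, whereas the mature organizations in the study would typically generate and protect such keys inside an HSM or key management service rather than in application memory.

```python
# Illustrative application-level encryption with AES-GCM (cryptography library).
# The key is generated in software for demonstration only; an HSM-backed
# deployment would keep the key inside the hardware module.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)
aesgcm = AESGCM(key)

def encrypt_field(plaintext: bytes, associated_data: bytes) -> bytes:
    nonce = os.urandom(12)                       # unique nonce per message
    return nonce + aesgcm.encrypt(nonce, plaintext, associated_data)

def decrypt_field(blob: bytes, associated_data: bytes) -> bytes:
    nonce, ciphertext = blob[:12], blob[12:]
    return aesgcm.decrypt(nonce, ciphertext, associated_data)

token = encrypt_field(b"4111-1111-1111-1111", b"customer:42")
print(decrypt_field(token, b"customer:42"))      # b'4111-1111-1111-1111'
```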

Companies with mature strategies are much more likely to apply encryption to big data repositories, public cloud services, business applications, and private cloud infrastructure. They also place a higher value on regional segregation, tamper-resistant dedicated hardware, and support for both cloud and on-premise deployment.

Support for encryption both in the cloud and on-premise has risen to become the second most important feature of encryption applications, while companies now consider performance and latency the most important.

Earlier this year the 2016 Global Encryption Trends Study, another in the series of Thales-Ponemon reports, showed a gradual increase in whole-enterprise encryption strategies.

The spike in businesses' use of encryption roughly coincides with efforts by numerous governments, including those of the US, UK, and Russia, to limit encryption or its effectiveness.

Source: TheWHIR

Fusion Partners With Telarus to Provide Cloud Services

Fusion has announced its partnership with Telarus, a leading value added distributor of network, UCaaS, and cloud services. Telarus has been rated a top resource in helping partners, IT VARs, MSPs, integrators, and communications professionals easily source data, voice, and cloud services through its patented pricing tools. Telarus will distribute Fusion’s fully integrated suite of cloud communications, cloud connectivity and managed network services, and cloud computing solutions through its extensive national distribution network of more than 2,000 sub-partners.

“Telarus and Fusion share a passion for service excellence and a desire to help our mutual partners and customers grow faster, stronger and smarter,” said Matthew Rosen, Fusion’s chief executive officer. “Telarus is well known and respected for its innovation in seeking new ways to help its agents succeed and grow. With Fusion’s single source solution for the cloud, Telarus partners will be able to offer everything an enterprise needs to successfully migrate to the cloud, and profit from its many benefits,” Rosen continued.

“Telarus places the highest value on team success,” said Patrick Oborn, Telarus co-founder. “We’re firm in our conviction that individual success is only meaningful when it’s shared, and we’re committed to helping our partners win every step of the way. Telarus is focused on providing hands-on support from pre-sale to post-sale, helping develop marketing and business development strategies and working in concert with our providers to ensure an exceptional customer experience. That’s why we’re delighted to partner with Fusion, whose unwavering commitment to delivering the highest level of service is every bit as passionate as our own,” Oborn stated.

“Fusion appreciates the value of strong, enduring partner relationships and works closely with the channel to help distribute the company’s advanced, fully integrated cloud solutions to businesses nationwide,” said Russell P. Markman, Fusion’s president of Business Services. “Fusion’s expanding partner network receives expert, professional support from the company’s experienced sales, technical and customer service teams, and in-depth, ongoing training on technology, product and process. Fusion is committed to delivering the resources needed to attract, secure and maintain opportunities and drive our mutual success,” continued Markman.

Source: CloudStrategyMag

Dimension Data Unveils Next Generation Digital Technologies And Applications For Tour De France

Dimension Data and Amaury Sport Organization (A.S.O.) have announced significant enhancements to the big data cycling analytics platform that will deliver real-time information to viewers, commentators and teams at the 2016 Tour de France.

Headlining the innovations is Race Center, a web-based application hosted on Dimension Data's cloud platform and developed in partnership with A.S.O. that brings together live race data, video, photographs, social media feeds, and race commentary. Combined with a new live tracking website, it gives viewers an immersive digital experience that goes beyond the television coverage of the race.

Race Center will become A.S.O.’s digital hub of the Tour de France going forward. And the live tracking website is a testament to the evolution of Dimension Data’s real-time big data collection, analytics, and digital platforms over the past 12 months.

Viewers will be able to get far richer and more accurate information from each of the 198 riders in 22 teams, including speed, distance between riders, composition of the race pelotons, wind speed and direction, as well as prevailing weather conditions.
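
To give a feel for the kind of computation behind figures such as rider speed and the gaps between riders, here is a simplified sketch; the coordinates and sampling interval are invented for illustration, and the actual Dimension Data pipeline is not public:

```python
# Simplified derivation of rider speed and inter-rider gap from GPS samples.
# Coordinates and timing below are invented for illustration.
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in metres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6_371_000 * asin(sqrt(a))

# Two samples from the same rider, one second apart
metres_per_second = haversine_m(43.6045, 1.4440, 43.6046, 1.4442)
print(f"Rider speed: {metres_per_second * 3.6:.1f} km/h")

# Positions of two riders at the same instant
gap_m = haversine_m(43.6046, 1.4442, 43.6040, 1.4430)
print(f"Gap to leader: {gap_m:.0f} m (~{gap_m / metres_per_second:.0f} s at current speed)")
```

In practice the live feed also has to smooth noisy positions and handle dropouts, which is why the improved sensor transmission range described below matters so much for data continuity.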

Many of the new technologies that were trialed at last year’s race are now production-ready and feature significant improvements across the board. This year, the telemetry sensors installed under each rider’s seat that are responsible for transmitting live data boast a tenfold increase in transmission range. This means far fewer dropouts or ‘gaps’ in the data, resulting in more seamless communication and continuity throughout the race.

Dimension Data’s big data truck has also been upgraded and enlarged to accommodate the various television graphics, race coordination, data capture and analytics teams responsible for delivering the complete end-to-end data solution at the 2016 Tour de France. This is one of the biggest changes from last year, where each team worked separately, and continues the theme of collaboration and data integration made possible by the advancements in the technologies on display.

Adam Foster, Dimension Data’s group executive, Sports Practice, said, “The enhancements to this year’s solutions means we can tell richer and more enhanced stories as they happen, giving viewers, the media, cycling fans and race commentators deeper insights into some aspects of the sport that weren’t available until now. This year, we’re working with a much broader palette, which means access to more meaningful race data, race routes, riders and current weather conditions. What’s exciting this year is the ability to deliver all of this information to A.S.O. through a unified digital platform. This makes the quality of the data even more valuable for viewer engagement, and speaks directly to a generation of younger viewers who rely on new technologies such as social media and live video to engage with their world.”

Christian Prudhomme, director of the Tour de France, A.S.O, said, “The unprecedented growth in different social channels such as Instagram, Twitter, Facebook, and live video at last year’s race demanded these technologies be embraced and enhanced for modern viewers.”

“The Tour de France is a flagship event in a modern world, and it’s only natural that we give our viewers access to as much quality content, entertainment and analysis as possible through the media they use every day,” said Prudhomme. “Together with Dimension Data, we’ve been working on new ways to appeal to our billions of viewers, and we’re excited to showcase the result of our efforts through Race Center. I believe the appeal of having access to multiple real-time video, social media and live race information from one responsive and intuitive interface will greatly enhance the quality of coverage of the Tour de France, and become an essential companion to the largest live televised event in the world.”

Source: CloudStrategyMag