Google Cloud Dataflow Shows Competitive Advantage for Large-Scale Data Processing

Mammoth Data, a leader in Big Data consulting, today announced the findings of its comprehensive cloud solution benchmark study, which compares Google Cloud Dataflow and Apache Spark. The company, which specializes in Hadoop®, Apache Spark and other enterprise-ready architectural solutions for data-driven companies, saw a lack of understanding of current cloud technologies and no available comparison of the performance and implementation characteristics of each offering in a common scenario. As a result, Mammoth Data worked with Google to compare Google Cloud Dataflow with well-known alternatives and provide easily digestible metrics.

Google Cloud Dataflow is a fully managed service for large-scale data processing that provides a unified model for batch and streaming analysis, along with on-demand resource allocation, full life-cycle resource management and auto-scaling of resources.
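To make the unified model concrete, below is a minimal word-count sketch written with the Apache Beam Python SDK (the open-source descendant of the Dataflow API discussed later in this article); the bucket paths and pipeline options are illustrative assumptions, not details from the benchmark study.

    # Minimal sketch of the unified batch/streaming programming model
    # (Apache Beam Python SDK). Paths and options below are hypothetical.
    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions()  # add runner/project/region arguments to run on Cloud Dataflow

    with beam.Pipeline(options=options) as p:
        (
            p
            | "Read" >> beam.io.ReadFromText("gs://example-bucket/input.txt")
            | "Split" >> beam.FlatMap(lambda line: line.split())
            | "PairWithOne" >> beam.Map(lambda word: (word, 1))
            | "Count" >> beam.CombinePerKey(sum)
            | "Format" >> beam.Map(lambda kv: "{}: {}".format(kv[0], kv[1]))
            | "Write" >> beam.io.WriteToText("gs://example-bucket/output")
        )

In principle, the same pipeline code can be pointed at a bounded file or an unbounded source; only the runner and I/O change, which is the essence of the unified batch and streaming approach described above.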

“Google Cloud Platform data processing and analytics services are aimed at removing the implementation complexity and operational burden found in traditional big data technologies. Mammoth Data found that Cloud Dataflow outperformed Apache Spark, underscoring our commitment to balance performance, simplicity and scalability for our customers,” said Eric Schmidt, product manager for Google Cloud Dataflow.

In its benchmark, Mammoth Data identified five key advantages of using Google Cloud Dataflow:

  • Greater performance: Google Cloud Dataflow provides dynamic work rebalancing and intelligent auto-scaling, which enables increased performance with zero increased operational complexity.
  • Developer friendly: Google Cloud Dataflow features a developer-friendly API with a unified approach to batch and streaming analysis.
  • Operational simplicity: Google Cloud Dataflow holds distinct advantages with a job-centric and fully managed resource model.
  • Easy integration: Google Cloud Dataflow can easily be integrated with Google Platform and its different services.
  • Open-source: Google Cloud Dataflow’s API was recently promoted to an Apache Software Foundation incubation project called Apache Beam.

“When Google asked us to compare Dataflow to other Big Data offerings, we knew this would be an exciting project,” said Andrew C. Oliver, president and founder of Mammoth Data. “We were impressed by Dataflow’s performance, and think it is a great fit for large-scale ETL or data analysis workloads. With the Dataflow API now part of the Apache Software Foundation as Apache Beam, we expect the technology to become a key component of the Big Data ecosystem.”

Source: insideBigData

Google Buys Developer of Training Platform for Google Apps for Work

Synergyse’s voice and interactive text-based help modules will now be available for free to customers of Google Apps for Work and Education.

Google has acquired Synergyse, the developer of an interactive training app for customers of Google Apps for Work. In a statement May 2, Google did not disclose details of the acquisition but said that Synergyse’s virtual coach for Google Apps will now be available for free to all customers of Google Apps for Work and Google Apps for Education.

Up to now, Synergyse charged $10 per user per year for business organizations and government customers with up to 5,000 users. The company also charged $10 per user per year for employees at schools and other educational institutions, but offered the software free for students.

Synergyse Training is a Chrome extension that installs a virtual guide inside Google Apps. The app offers voice-based modules as well as searchable, interactive text modules to train and help users use Gmail, Calendar, Docs, Drive and other Google productivity applications.

Synergyse has positioned the tool as something that organizations can use to get workers quickly up to speed with existing product functionality in Google Apps for Work as well as with new features as they are rolled out.

According to Google, organizations that use Synergyse have a 35 percent higher adoption rate of Google Apps for Work than organizations that don’t. As a result, such organizations also tend to get more business value out of the productivity suite than others. Synergyse claims that more than 4 million people across 3,000 organizations around the world use the training software.

Peter Scocimara, senior director of Google Apps operations, said the company’s decision to purchase Synergyse stemmed from its popularity among Google Apps users. “Given the enthusiasm that exists for Synergyse already, we want to extend this service to all of our customers,” Scocimara said in the blog post announcing the acquisition. “That is why we’re happy to announce Synergyse will be joining Google, and we intend to make the product available as an integral part of the Google Apps offering later this year.”

Google’s Apps Learning Center currently offers numerous tools and tips to help consumers who are new to the company’s portfolio of productivity applications to quickly learn how to use the products and take advantage of the functions embedded in them. The learning center offers everything from quick-start guides for Google email, calendars, video meetings and other apps to product FAQs, cheat sheets and tips on how to migrate from another vendor’s platform to Google Apps.

Synergyse isn’t the only organization to offer training for Google Apps; others include BetterCloud and Google UK partner Refractiv, with its Google Apps Tips.
Source: eWeek

Oculus Retail Sales Plans Anger Rift Preorder Buyers

Oculus will start selling Rift VR headsets in Best Buy stores May 7, sparking outraged comments from preorder buyers who have yet to receive their devices.

Oculus began shipping its long-awaited $599 Oculus Rift virtual reality headsets to early buyers on March 28, but a decision by the company to sell some Rift devices in Best Buy stores starting May 7 is making preorder buyers who are still waiting for their devices very angry.

“Today we’re excited to share more details about our retail plans for Rift (pictured), which launches at 48 Best Buy stores on May 7 as part of The Intel Experience,” the company wrote in a May 2 post on the Oculus Blog. “Later this summer, we’ll start offering even more in-store Rift demos at additional Best Buy locations.”

By releasing some devices through Best Buy stores, the company said it will be giving potential users “a first chance to jump into truly immersive VR,” the post states. “We’ll have a variety of experiences that everyone can enjoy, including VR vignettes with Oculus Dreamdeck. Ultimate thrill seekers will be able to experience what it’s like rock climbing on the side of a cliff with The Climb, and in the coming weeks, you’ll be able to explore the beautiful alien world of Farlands.”

Well, that certainly sounds dreamy, at least until you read the annoyed, angry and bitter comments that have been left so far by some 18 preorder buyers starting just after the post was made.

“This just reaffirms that maybe the only thing I truly preordered was hope,” wrote buyer Greg Dietz in the comments section. “Turns out that’s on backorder now too, considering some [person] might randomly walk into Best Buy and pick up a Rift months after I preordered it AND months before mine even ships. This is insanity. I feel more disappointed as time goes by.”

Another preorder buyer, Luke Goddard, wrote: “I must admit this is a bit of a let down from Oculus, it just seems [to be] knock-back after knock-back. First preorders were pushed back a bit due to too many orders and lack of parts, which is fine, but then to find out they will be selling it in shops before preorders are fulfilled—great. Not only that—people like me are still awaiting a confirmation of when our Oculus Rifts will even be dispatched. It’s getting a bit absurd now.”

Another commenter, Peter Peterko, wrote: “What a joke … it looks like the preorders which should be the first people to get Rifts will actually be the very last ones. I canceled mine anyway … got [an HTC] Vive 3 weeks ago.”

Oculus did not immediately respond to an eWEEK inquiry seeking further comment about its Rift retail sales plans. In its blog post, however, Oculus gave more details about the retail store sales it is planning to start, even while preorders continue to be filled.

“A small number of Rifts will be available for purchase at select Best Buy stores starting May 7 and online from Microsoft and Amazon, starting May 6 at [12 noon EDT],” the company wrote in its blog post, explaining its move. “Quantities will be extremely limited while we catch up on Rift preorders.”

Oculus knows “that many pre-order customers are still waiting for their Rifts, so we’re offering those customers a chance to purchase Rift from retail instead—while keeping their preorder benefits, like the EVE: Valkyrie Founder’s Pack and priority status for Touch preorders,” the post continued. “Starting May 6th, if you’re interested, simply go to your order status and let us know you’ve purchased a Rift at retail, and we’ll cancel your preorder. Your EVE: Valkyrie entitlement will appear in your order history.”

The Rift virtual reality headsets began shipping on March 28 as the company began filling Kickstarter orders for its latest flagship VR devices, according to an earlier eWEEK story. Customers were told at that time that they would receive an email when their orders were being prepped, one to three weeks prior to shipping, and then another email when their payment method was being charged and the device was being shipped.

Rift buyers who didn’t participate in the company’s Kickstarter campaign last year or didn’t preorder their devices through the preorder process that began in January can now order them through Oculus.com, but the site is listing expected shipping dates in July. Buyers can also buy their Rift VR viewer in a bundle with an Oculus-Ready PC through Amazon, Best Buy and the Microsoft Store.

Earlier in March, Oculus also announced that some 30 new VR gaming titles will be available to play on the new devices as the first Rift VR headsets begin shipping.

The Rift is equipped with dual active-matrix organic LED (AMOLED) displays that are designed to provide users with incredible visual clarity as they explore virtual worlds with the device. The Rift also uses an infrared LED constellation tracking system that provides precise, low-latency 360-degree orientation and position tracking for accurate and fluid control when playing games and simulations.

Facebook acquired Oculus for $1.9 billion in March 2014 to expand its social media footprint in a new direction.
Source: eWeek

Cavium Brings 64-Bit ARM Architecture to Network, Storage SoCs

The company in the past has used the MIPS architecture in its Octeon chips, but now is expanding its strategy to include the ARM-based Octeon TX SoCs.

Cavium officials more than a year ago embraced ARM’s 64-bit architecture for the company’s ThunderX line of server processors. Now the vendor is doing the same with its new line of Octeon TX chips for embedded products in such areas as networking and storage.

The company on May 2 introduced the new portfolio of systems-on-a-chip (SoCs), which comprises four product families, all of them based on 64-bit ARMv8.1 cores. Cavium’s previous Octeon chips were based on the MIPS64 architecture. The MIPS products reportedly will continue being produced, with the new ARM-based SoCs expanding what Cavium can offer.

ARM designs low-power SoCs and then licenses those designs to a broad array of partners, from Qualcomm and Samsung to Cavium, Applied Micro and Advanced Micro Devices. The company’s chip designs are found in most smartphones and tablets, and for the past several years ARM officials have been pushing to move up the ladder and get their architecture into the data center.

Most of the attention to the effort has centered on servers, and vendors like Cavium, Applied Micro, AMD, Qualcomm and others have either put ARM-based server SoCs on the market or are developing them. However, ARM officials have said they see a role for their low-power designs in a wide range of data center systems, including networking and storage appliances.

According to Steve Klinger, general manager of Cavium’s Infrastructure Processor Group, his company’s new product line is an example of what can be done with the ARM architecture.

“The wide range of products in the Octeon TX ARM 64-bit product line builds upon this success [of the company’s MIPS Octeon SoCs] and expands the use of these products into control and embedded processing applications that leverage the fast-growing ARM ecosystem and breadth of open-source initiatives,” Klinger said in a statement.

The Octeon TX portfolio includes four product families—the CN80XX and CN81XX (one to four ARMv8.1 ThunderX cores and up to 2MB of last-level cache) and the CN82XX and CN83XX (eight to 24 cores and up to 8MB of last-level cache). The chips will enable Cavium to get into control-plane application areas in networking and storage for enterprises, service providers and data centers, officials said. The chips’ ability to run multiple data and control planes concurrently will have application in a wide range of areas, from security and router appliances to software-defined networking (SDN) and network-functions virtualization (NFV), service provider customer-premises equipment (CPE), storage controllers and gateways for the Internet of things (IoT).

The network is under pressure from such trends as big data, mobility, the IoT, the cloud and the massive amounts of data mirrored across the cloud and enterprise, and there is growing demand for networks to become more scalable, agile, open and application-centric. The control plane needs to be able to run commercial software distributions (such as Red Hat Enterprise Linux, Canonical and Java SE) and open-source applications, such as OpenStack, OpenFlow and Quagga, Cavium officials said. Businesses also want the data plane to simultaneously support multiple high-performance applications for firewalls, content delivery, routing and traffic management, they said. There is also increasing demand for bandwidth, along with growing security threats.

While the current MIPS-based SoCs are being used in data-plane applications and in the control plane with embedded software, control-plane applications that need a broader software ecosystem have found it in the x86 architecture used by Intel and AMD. Businesses embracing open, service-centric networks are looking for options that offer lower cost and a wide ecosystem, which Cavium officials said the ARM architecture offers.

The new Octeon TX SoCs combine what the MIPS-based offerings do with the ecosystem, virtualization, open-source support and optimized ARMv8.1 CPU cores from the ThunderX portfolio. They also integrate Cavium’s NitroX V security processors.

The SoCs with one to four cores will begin sampling this quarter, while those with eight to 24 cores will sample starting in the third quarter.
Source: eWeek

Microsoft: SQL Server 2016 Gets a June 1 Release Date

The wait’s nearly over. Microsoft’s cloud- and analytics-friendly database software will be generally available next month.

Microsoft has finally settled on a release date for SQL Server 2016, the Redmond, Wash., software giant announced May 2. A year after Microsoft released the first public preview of SQL Server 2016, the database software will be generally available Wednesday, June 1—before summer arrives. Customers can select from four editions: Express, Standard, Enterprise and the free Developer Edition. The latter includes all the features found in SQL Server Enterprise but is meant for development and test deployments, not production workloads.

The official release, along with full enterprise support services from Microsoft, will allow customers “to build mission-critical, and business-critical intelligent applications with the most secure database, the highest performance data warehouse, end-to-end mobile BI [business intelligence] on any device, in-database advanced analytics, in-memory capabilities optimized for all workloads, and a consistent experience from on-premises to cloud,” wrote Tiffany Wissner, senior director of Microsoft Data Platform Marketing, in a May 2 blog post.

While those are lofty claims, new benchmark data from one of Microsoft’s major hardware partners suggests that Wissner’s remarks are no idle boast.

Separately, Lenovo announced May 2 that it had set a new data warehouse performance record using an x3950 X6 server running Microsoft SQL Server 2016 and Windows Server 2016 Standard. The server was outfitted with eight Intel Xeon E7-8890 v3 processors clocked at 2.5GHz and 12TB of system memory. The non-clustered system scored 1,056,164.7 queries per hour (QphH) @30,000GB at $2.04 per QphH @30,000GB on the TPC-H decision support benchmark, which simulates a large number of ad hoc queries and concurrent data changes in business settings.

“This is the first-ever non-clustered result @30,000GB TPC-H benchmark scale. Previously, non-clustered results maxed out @10,000GB TPC-H benchmark scale. At 30TB, it is three times larger than the previously tested database, reflecting the growth of in-memory databases both in size and popularity,” said Lenovo in a statement. In addition to SQL Server’s in-memory data processing capabilities, the Chinese server maker also credited Windows Server 2016’s ability to support 12TB of memory, up from the previous 4TB limit.

SQL Server 2016 also opens the door to advanced R-based business analytics. R is the statistical computing language that is popular among data scientists for predictive analytics. Last month’s SQL Server 2016 release candidate introduced a new installer that gives administrators the option of installing a stand-alone Microsoft R Server, formerly Revolution R from Revolution Analytics, or SQL Server R Services as an in-database capability. Microsoft acquired Revolution Analytics in early 2015.

On the security front, SQL Server 2016 includes a new data privacy enhancing feature called Dynamic Data Masking (DDM), which Microsoft incorporated into its cloud-based Azure SQL Database last year. DDM can be used to obfuscate or limit access to sensitive data without making changes to the data stored in the database or to applications.
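To illustrate how Dynamic Data Masking is applied in practice, here is a hedged sketch in Python using the pyodbc driver; the connection string, table and column names are hypothetical, while the ALTER TABLE ... ADD MASKED statement follows the documented SQL Server 2016 syntax.

    # Hedged sketch: enabling SQL Server 2016 Dynamic Data Masking from Python.
    # The connection details, table and column names are hypothetical examples.
    import pyodbc

    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 13 for SQL Server};SERVER=localhost;"
        "DATABASE=DemoDB;UID=demo_user;PWD=demo_password"
    )
    cursor = conn.cursor()

    # Mask the email column so non-privileged users see obfuscated values,
    # without changing the data stored in the table or the application code.
    cursor.execute(
        "ALTER TABLE dbo.Customers "
        "ALTER COLUMN Email ADD MASKED WITH (FUNCTION = 'email()')"
    )

    # Privileged roles can still be granted unmasked access.
    cursor.execute("GRANT UNMASK TO reporting_admin")

    conn.commit()
    conn.close()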
Source: eWeek

8 Things to Consider When Moving an Enterprise System to the Cloud

Pick the Right Cloud Provider

As enterprises’ needs evolve, or in the event of a merger, an acquisition, a company divestiture or a split, it is essential that data is not held hostage by a single provider; it must be able to move seamlessly from one cloud provider to another. In addition, make sure the provider is the right fit in terms of longevity, scalability, cost, open standards, other integrated services/apps, location and security.

Source: eWeek

Microsoft Restricts Cortana to Edge and Bing to Protect Windows 10

DAILY VIDEO: Microsoft limits Cortana to Edge and Bing on Windows 10; U.S. risks losing edge in HPC, supercomputing, report says; Pentagon bug bounty program attracts strong hacker interest; and there’s more.

Read more about the stories in today’s news:

Today’s topics include Microsoft’s limitation of its Cortana virtual assistant technology to Edge and Bing on Windows 10, the United States’ plan to accelerate its high-performance computing efforts, the success of the Pentagon’s bug bounty program and PhishLabs’ discovery of malware posing as legitimate apps on Google Play.

Cortana, Microsoft’s virtual assistant technology included with the Windows 10 operating system, is being reined in, the company announced April 28. As the Windows 10 user base has grown—270 million devices are running the OS at last count—Microsoft has discovered that Cortana has been taken in unintended directions, resulting in what the company claims is an unreliable user experience.

“Some software programs circumvent the design of Windows 10 and redirect you to search providers that were not designed to work with Cortana,” said Ryan Gavin, general manager of Microsoft Search and Cortana. In particular, they can interrupt some of Cortana’s task completion and personalized search capabilities, he said. In response, Microsoft is locking down the Cortana search experience. Now, Cortana will only display Bing search results in the Microsoft Edge browser.

Last year, President Obama issued an executive order aimed at accelerating the development of high-performance computing systems in the United States. The executive order created the National Strategic Computing Initiative to coordinate federal government efforts and those of public research institutions and the private sector to create a comprehensive, long-term strategy for ensuring that the United States retains its six-decade lead in research and development of HPC systems.

However, according to a recent report, the United States’ lead in the space is not assured, and other regions and countries—in particular, China—are making concerted efforts to expand their capabilities in the design, development and manufacturing of supercomputers and the components that make up the systems.

“The United States currently leads in HPC adoption, deployment, and development, but its future leadership position is not guaranteed unless it makes sustained efforts and commitments to maintain a robust HPC ecosystem,” the Information Technology and Innovation Foundation reported.

The Pentagon’s bug bounty program hit its midway point this past week, and already the initiative is, in some ways, a success. More than 500 security researchers and hackers have undergone background checks and begun to take part in the search for security flaws, according to HackerOne, the company managing the $150,000 program.

The “Hack the Pentagon” pilot, announced in March, is the first federal government program to use a private-sector crowdsourcing service to facilitate the search for security flaws in government systems. While neither the Pentagon nor HackerOne has disclosed any of the results so far, Alex Rice, chief technology officer and co-founder of vulnerability-program management service HackerOne, stressed that it would be “an extreme statistical outlier” if none of the researchers found a significant vulnerability.

PhishLabs, a company that provides anti-phishing services, said it has discovered 11 malicious applications disguised as mobile apps for popular online payment services on Google’s official Google Play store since the beginning of this year. The applications purport to give users access to their online payment accounts from their mobile devices, PhishLabs security analyst Joshua Shilko said in a blog post.

However, in reality, the only functionality the apps have is to collect the user’s log-on credentials and personal data and to send that to a remote command and control server belonging to the malware authors, Shilko said. PhishLabs did not identify the 11 payment brands whose apps were spoofed and uploaded to Google Play.

Source: eWeek

Thinking of Delving into the Internet of Things? Here are Four Key Considerations

The Internet of Things (IoT), which promises to connect more devices, presents new challenges ranging from technology standards to ethics. Like any emerging technology, there’s a lot of excitement around the possibilities it presents, but this should also be tempered with some caution.

This post covers a few of the major considerations that cloud professionals should keep in mind when connecting smart IoT devices to their applications: privacy, security, lifecycle, and legal and regulatory requirements.

1. Privacy

By connecting more devices, the Internet of Things essentially expands the possibilities of surveillance and tracking – and this could amplify privacy concerns.

Internet lawyer Girard Kelly said at February’s INET-Internet of Things Conference that when it comes to devices collecting personal information, it’s imperative not only to provide notice of, and obtain consent for, the data being collected, but also to ensure that services provide a baseline level of privacy – which can benefit both the user and the service provider.

Kelly said developers should aim for a “privacy by design” model where privacy is the standard and default setting, and any use of personal information is clear to the user. He said that focusing on what data is needed, and discarding unneeded data, also helps protect the service provider in case of a security breach by limiting its impact. Not holding data could also limit the service provider’s obligation to hand over data in response to a subpoena or warrant, which can be onerous.

“When thinking about privacy from start to finish and approaching a system’s engineering where we want to think about the potential privacy and security breaches that could happen,” he said. “Looking at consumer expectations upfront – we want to prevent incidents where consumers could be potentially harmed through identity theft or exposure of personal information and building that into the product itself.”

To guard against a user feeling that their data is being unexpectedly collected, a service provider can provide a clear summary of what data will be collected and for what use. For instance, it could come as a surprise to someone that their IoT lightbulb was sharing geolocation data without a clear justification.
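As a simple illustration of that data-minimization idea (a generic sketch, not any vendor’s implementation), the snippet below filters an incoming device report against an explicit allowlist before anything is stored; all field names are hypothetical.

    # Hedged sketch of data minimization for an IoT telemetry payload.
    # Only allowlisted fields are kept; unneeded data such as geolocation
    # from a lightbulb is discarded before storage.
    ALLOWED_FIELDS = {"device_id", "timestamp", "brightness", "power_watts"}

    def minimize(report):
        """Return only the fields the service actually needs."""
        return {k: v for k, v in report.items() if k in ALLOWED_FIELDS}

    incoming = {
        "device_id": "bulb-42",
        "timestamp": "2016-05-02T10:15:00Z",
        "brightness": 80,
        "power_watts": 9.5,
        "latitude": 35.99,    # unneeded: dropped
        "longitude": -78.90,  # unneeded: dropped
    }

    print(minimize(incoming))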

He also noted that data aggregation and consumer profiling can lead to new forms of discrimination. “Whether I have a mobile device [or not], what applications I use, what games I play – all that in the aggregate can paint a picture of the user that might be unwanted by the user,” he said. A certain combination of factors might lead someone to be turned down for a loan or a job.

This might not be clear to the users of devices.

“When consumers are buying a lot of these IoT devices and the mobile application as a bundle, they’re not going to be scrolling through a very long privacy policy to get an understanding of what this device may be collecting,” he said. “When we have dozens if not more IoT devices at home or at work, just the amount of time to analyze these policies and arrive at an informed consent is just not going to happen.”

Although the “reasonable expectation of privacy” is continuing to change to reflect our era, it’s important to ensure a certain level of privacy that doesn’t harm the user – even if they’re unwilling to read the terms and conditions.

2. Security

In an era where major data breaches make headlines weekly, users need to believe their connected devices and their information are reasonably secure from misuse or harm to truly trust the Internet of Things.

Security can’t be entirely guaranteed, but exists on a spectrum ranging from totally unprotected devices with no security features to highly secure systems with multiple layers of security features.

The networked connectivity of IoT devices means that security decisions made at the device level can have global impacts on other devices, and changes as high as the cloud level can also affect the security of the entire system.

Some of the specific IoT security challenges outlined by the Internet Society (ISOC) include:

  • The potentially enormous quantity of interconnected links between IoT devices, which is on a much larger scale than existing security tools, methods and strategies were designed to handle.
  • In deployments where IoT devices are identical or nearly identical, the homogeneity magnifies the potential impact of any single security vulnerability.
  • IoT devices may be in service years longer than typical high-tech equipment, which can make it difficult to reconfigure or upgrade them, presenting potential security flaws in these older devices.
  • Many IoT devices are designed without any ability to be upgraded or the update process is difficult. (Fiat Chrysler issued a recall of 1.4 million vehicles in 2015 due to a vulnerability that would let an attacker wirelessly hack into the vehicle, requiring cars to be taken to a dealer for a manual upgrade.)
  • IoT devices could be deployed in places that aren’t physically secure, such as public places where attackers have direct physical access to them, so anti-tamper features might be required.

3. Lifecycle (Interoperability, Standards & Obsolescence)

On the Internet, interoperability means that connected systems are able to speak the same common language of standard protocols and encodings. This is extremely important for the rollout and long-term success of IoT.

Some propose that IoT devices should have a built-in end-of-life expiration so that older, non-interoperable devices would be put out of service and replaced with more secure and interoperable devices. But this model is much less efficient than building IoT devices around the current open standards that are emerging and providing the ability to push software updates to smart objects to keep them up-to-date.

In an interview with the WHIR, Olivier Pauzet, Sierra Wireless’ VP of Marketing & Market Strategy, explained that standards are emerging at each layer of IoT.

“Standards-based IoT is a must to enable interoperability, and to enable as well the evolution of a system over time, because systems are going to evolve and new services will emerge,” Pauzet said. “For that, you’re going to have to add applications into your device and get cloud-interoperable APIs; standards allow it to work anywhere in the world.”

At the cloud level, OneM2M is a standard that helps provide a common API for IoT applications. At the data acquisition layer, there’s a push to get a specific 3GPP technology standard that uses low-energy LTE devices for IoT. On the device itself, Linux is often used to power an edge device’s intelligence and on-board analytics capabilities – along with providing the ability to easily port applications to the device itself. For a device to stay functional for a long time, device management protocols for lightweight M2M updates are also becoming standardized.

The Internet Protocol is also a common technology for IoT, and CoAP is a specialized web transfer protocol built on top of IP for constrained nodes and constrained networks in the Internet of Things.
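To make the constrained-device protocols concrete, here is a minimal CoAP read using the third-party aiocoap Python library; the device address and resource path are hypothetical, and error handling is omitted for brevity.

    # Hedged sketch: reading a CoAP resource on a constrained IoT device.
    # Uses the aiocoap library; the address and path are hypothetical.
    import asyncio
    from aiocoap import Context, Message, GET

    async def read_temperature():
        protocol = await Context.create_client_context()
        request = Message(code=GET, uri="coap://198.51.100.23/sensors/temperature")
        response = await protocol.request(request).response
        print(response.code, response.payload.decode())

    asyncio.run(read_temperature())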

4. Legal and Regulatory Issues

The application of IoT devices poses a wide range of challenges and questions from a regulatory and legal perspective, and the pace of technological advances is often much faster than that of the associated policy and regulatory environments. In some cases, IoT devices amplify legal and civil rights issues that already existed, and in others they create new legal and regulatory situations and concerns.

Some particular areas of IoT legal and regulatory importance include:

  • Data protection and cross-border data flows – IoT devices may collect data about people in one jurisdiction and transmit it to another jurisdiction for processing that has different data protection laws. Typically, cross-border data flows are addressed in a patchwork of regional and international privacy frameworks (such as the OECD Privacy Guidelines, Council of Europe Convention 108 and the APEC Privacy Framework) or special arrangements like the APEC Cross Border Privacy Rules system and EU Binding Corporate Rules.
  • Co-operation with law enforcement and public safety – IoT devices and the data they generate can be used to fight crime, but the deployment and use of these kinds of IoT technologies can cause concern among some civil rights advocates and others concerned about the potentially adverse impact of surveillance.
  • Device liability – One of the fundamental IoT questions that isn’t always clear is: who is responsible if someone is harmed as a result of an IoT device’s action or inaction? It’s best to ensure that applications limit the potential damage they can do to individuals.

Proceed with Caution

As with any bleeding-edge technology, it’s important to be aware of the potential risks and to learn from the progression of other technologies. Despite the risks, IoT has the potential to expand the capabilities of the cloud beyond the traditional data center and traditional devices. As the industry moves forward, it’s best to keep in mind that small IoT devices can have big real-world consequences.

Source: TheWHIR

Dell, EMC's New Corporate Brand: Dell Technologies

EMC unveils new mid-range all-flash storage system, Unity, that holds 80TB of content and will cost less than $20,000.

LAS VEGAS — Day 1 of EMC World is always one of the newsiest days of the year in the data storage business, and May 2 at the Sands Expo Center here was no different from years gone by. Next year’s event will be a bit different, however, because it will be called Dell World, combining the parent company’s smaller Austin-based conference with EMC’s larger event. About 10,000 attendees are here this week; the conference closes May 5.

EMC, the world’s largest storage provider and in the midst of an acquisition by Dell that probably won’t close until October due to some red tape (pun intended) involving the Chinese government, made a series of new-product announcements, including an important one about an all-flash array for mid-range companies.

New products aside for the moment, at least the corporate branding for combining two of the world’s largest and most successful IT companies has been worked out ahead of time.

Dell Wanted a ‘Family’ Name

“We wanted to convey a family of businesses and aligned capabilities, and as family names go, I’m kind of attached to Dell,” Dell CEO and founder Michael Dell (pictured with EMC CEO Joe Tucci) said during his Day 1 keynote. “So after the close of the transaction, our family of businesses will be officially known as Dell Technologies. It’s got a nice ring to it.”

Dell Technologies will comprise Dell, EMC Information Infrastructure, VMware, Pivotal, SecureWorks, RSA and Virtustream. The client solutions business will be branded Dell. “The brand equity of the Dell PC is irreplaceable, and we’ve gained (market) share for 13 straight quarters,” Dell said, seizing an opportunity to disparage a key competitor. “In fact, in the United States, our client business grew 4 percent, and in the same period, HP’s (Hewlett-Packard Inc.) client business declined minus 14 percent. Do you see a correlation there?”

The combined enterprise IT business will be named Dell EMC, Dell said.

‘Standing at the Center of World Technology’

“When Dell and EMC combine, our company and all of you, our customers and partners, will stand at the center of the world’s technology infrastructure, and that means that we stand at the center of human progress. And there’s no place I’d rather be,” Dell said.

The biggest new-product news of Day 1 of EMC World 2016 was the new Unity mid-range all-flash array. EMC claimed the new systems hold up to 80TB of capacity and provide full enterprise capabilities for about $18,000, EMC President of Products and Marketing Jeremy Burton told reporters and analysts. “That price is about half the cost of the nearest competitor, as far as we can tell,” Burton said. If true, it could truly bring the all-flash storage model into many mid-size businesses that couldn’t afford it previously.

The Unity is designed specifically for small, mid-sized and departmental enterprise IT deployments and is available in all-flash array, hybrid array, software-defined and converged configurations, Burton said. The package features unified file and block storage services in a dense 2U footprint.

Key Features for Unity

Unity is fairly fast, delivering up to 300K IOPS, Burton said. Technical features include:

  • True dual-active controller architecture;
  • Support for file, block and VVols;
  • Snapshots and remote sync/async replication;
  • Native, controller-based encryption;
  • A new scalable file system for transactional and traditional file use cases;
  • VMware VASA 2.0, VAAI and VVols integration (VMware-aware integration);
  • A complete REST API for automation and DevOps use cases;
  • Integrated Copy Data Management with EMC iCDM; and
  • Zero-impact garbage collection.

Unity also can be deployed in a hybrid configuration to meet individual business requirements. Unity is the latest member of EMC’s all-flash portfolio of file and block storage for small and medium-sized IT departments. It joins EMC’s existing all-flash storage arrays—XtremIO, VMAX All Flash and DSSD D5—to ensure that, no matter what a customer needs, EMC has a purpose-built solution to fit virtually any data center use case.
Source: eWeek

Hulu Plans to Offer Streamed Cable TV Programming

The service reportedly will begin in the first quarter of 2017 as the company seeks new customers who want traditional TV in new ways.

Hulu is preparing to compete directly with cable television companies by launching its own streaming TV service that will provide customers with daily network and cable TV programming by subscription. The company hopes to launch the new cable TV-style online service in the first quarter of 2017, according to a May 1 story by The Wall Street Journal, which was based on reports from anonymous sources familiar with the matter.

The planned service would stream popular broadcast and cable TV channels and expand the company’s reach from simply offering streamed on-demand TV programming, including current broadcast TV hits, movies, Hulu original shows and more, the article reported. “Walt Disney Co. and 21st Century Fox, co-owners of Hulu, are near agreements to license many of their channels for the platform,” the story continued.

Among the networks that are expected to be part of the Hulu service offering are ABC, ESPN, Disney Channel, the Fox broadcast network, Fox News, FX and Fox’s national and regional sports channels, The Journal reported. “Preliminary conversations with other programmers have begun, but the service isn’t looking to offer all the hundreds of channels found in the traditional cable bundle, according to the people familiar with the plans.”

One notable omission at this point is Comcast, which is also an owner of Hulu, but “so far hasn’t agreed to license its networks for the planned digital pay-TV service,” the report continued.

Hulu did not immediately respond to an eWEEK inquiry regarding the report about the potential new television streaming service. Hulu’s existing streaming subscriptions are priced from $7.99 to $11.99 per month and provide a wide range of programming. Subscribers of the streaming TV service would not be required to also subscribe to the company’s existing offerings, The Journal article said. The price of the upcoming service could be about $40 a month, according to the story.

The streaming video and streaming TV markets are continuing to get more competitive as service offerings to customers expand. In March, mobile phone carrier AT&T announced that AT&T and DirecTV customers will be able to dump their satellite dishes and receive a wide range of video content via wired or wireless Internet streaming on any device under new services that are expected to launch by the end of 2016. Under three options, customers will be able to get a multitude of DirecTV Now packages that contain various assortments of content similar to DirecTV content today, DirecTV Mobile packages that they can view anywhere or DirecTV Preview packages with ad-supported free content, according to the companies.

The packages will work over a wired or wireless Internet connection from any provider on a smartphone, tablet, smart TV, streaming media hardware or PC. The services will allow several users to view content over simultaneous sessions, and they will not require annual contracts, satellite dishes or set-top boxes, according to AT&T. The DirecTV Now packages will include on-demand and live programming from many networks, plus premium add-on options, and will be available for use after downloading an app and signing up for an account.

AT&T acquired DirecTV for $48.5 billion in July 2015, after having pursued the merger since May 2014, according to an earlier eWEEK story. AT&T’s move to offer enhanced deals to bring over DirecTV customers to grow its own subscriber base was part of the company’s vision for making the acquisition in the first place. The merger turned AT&T into a bigger player with its hands in more markets and a ready pool of new prospects to bring into its business coffers.

Last July, Comcast began offering its Comcast Stream online video streaming service for $15 per month, which provides customers with a package of live television stations, all over their cable Internet connections. The Stream service works without a television or cable box, instead bringing a live video stream directly to a customer’s in-home devices over the Internet via a cable modem. Stream is an Internet-only service and is not connected to Comcast’s cable television services, which remain separate. The channels included in Stream are network programming from ABC, CBS, The CW, Fox, NBC, PBS, Telemundo, Univision, HBO and local channels where a subscriber lives.

Netflix, a major competitor to Hulu, claims it has about 75 million members in more than 190 countries, and offers streaming video subscription plans priced from $7.99 to $11.99 per month. Its members watch more than 125 million hours of movie and television programming a day, according to the company.

In April, Comcast announced that its NBCUniversal division is acquiring DreamWorks Animation for $3.8 billion as the longtime cable company continues to build its future by adding complementary businesses to steady it as the future of cable television remains unfocused.
The merger brings huge opportunities for content streaming to Comcast, which, like other cable companies, is seeing its business impacted by customers who are replacing their cable connections with streaming video and original programming from services such as Hulu, Netflix and Amazon Prime.
Source: eWeek