Dell Rebrands Pending EMC Merger as Dell Technologies

Brought to you by The VAR Guy

Dell Technologies will be the official name of the new company expected to be formed after the pending merger of Dell and EMC Corp., Michael Dell announced during his keynote speech at this week’s EMC World conference in Las Vegas.

The Dell founder and CEO outlined the vision and branding strategy for the new company on Monday morning. All of Dell’s and EMC’s existing business ventures – including VMware and RSA – will be housed under the Dell Technologies brand, according to the announcement.

“Dell Technologies will create more value for customers and partners than any other technology solutions provider today. We will be more nimble and innovative, and we will deliver world-class products and solutions to customers of all shapes and sizes,” Michael Dell said in a statement.

Read more: OpenStack Company Mirantis Rekindles Dell Partnership

While Dell Technologies is the name of the overarching company, all enterprise products and solutions sold directly and indirectly through the channel will be subcategorized under the Dell EMC brand.

All client solutions for consumers, business, and institutional customers, meanwhile, will exist under the Dell name, according to the announcement.

Dell and EMC are still working on the final visual branding for the combined company, most likely to avoid confusion around the impending name changes.

Dell said the merger is progressing according to the original timetable and terms.

Original article appeared here: Dell Rebrands Pending EMC Merger as Dell Technologies

Source: TheWHIR

Cold Storage Comes to Microsoft Cloud

Brought to you by Data Center Knowledge

Microsoft has launched a cold storage service on its Azure cloud, offering a low-cost storage alternative for data that’s not accessed frequently.

The launch is a catch-up move by Microsoft, whose biggest public cloud competitors have had cold-storage options for some time. Amazon launched its Glacier service in 2012, and Google rolled out its Cloud Storage Nearline option last year.

The basic concept behind cold storage is that a lot of data people and companies generate is accessed infrequently, so it doesn’t require the same level of availability and access speed as critical applications do. Therefore, the data center infrastructure built to store it can be cheaper than primary cloud infrastructure, with the cost savings passed down to the customer in the case of a cloud provider.

Microsoft’s new service is called Cool Blob Storage, and it costs from $0.01 to $0.048 per GB per month, depending on the region and the total volume of data stored. The range for the “Hot” Blob storage tier is $0.0223 to $0.061 per GB, so some customers will be able to cut the cost of storing some of their data in Microsoft’s cloud by more than 50 percent if they opt for the “Cool” access tier.
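To make that comparison concrete, here is a quick back-of-the-envelope sketch in Python. The per-GB rates are the ranges quoted above; the data volume is an arbitrary example, and a real bill would also include transaction and egress charges:

```python
# Rough cost comparison using the per-GB monthly rates quoted above.
COOL_PER_GB = (0.01, 0.048)   # USD/GB/month, "Cool" tier range
HOT_PER_GB = (0.0223, 0.061)  # USD/GB/month, "Hot" tier range

def monthly_cost(gb, per_gb_rate):
    """Storage-only cost; excludes transaction and egress charges."""
    return gb * per_gb_rate

data_gb = 100_000  # e.g., 100 TB of infrequently accessed data

saving = 1 - monthly_cost(data_gb, COOL_PER_GB[0]) / monthly_cost(data_gb, HOT_PER_GB[0])
print(f"Best-case saving vs. the cheapest Hot rate: {saving:.0%}")
# -> 55%, consistent with the "more than 50 percent" figure above
```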

Web-scale data center operators of Microsoft’s caliber have for some time looked at reducing infrastructure costs by better aligning infrastructure investment with the type of data being stored. Facebook has revealed more details than others about the way it approaches cold storage, including open sourcing some of its cold storage hardware designs through the Open Compute Project.

Related: Visual Guide to Facebook’s Open Source Data Center Hardware

The social network has designed and built separate data centers next to its core server farms in Oregon and North Carolina specifically for this purpose. The storage systems and the facilities themselves are optimized for cold storage and don’t have redundant electrical infrastructure or backup generators. The design has resulted in significant energy and equipment cost savings, according to Facebook’s infrastructure team.

Read more: Cold Storage: the Facebook Data Centers that Back Up the Backup

Related: Google Says Cold Storage Doesn’t Have to Be Cold All the Time

Microsoft hasn’t shared details about the infrastructure behind its new cold storage service. In 2014, however, it published a paper describing a basic building block for an exascale cold storage system called Pelican.

Pelican is a rack-scale storage unit designed specifically for cold storage in the cloud, according to Microsoft. It is a “converged design,” meaning everything, from mechanical systems to hardware and software, was designed to work together.

Pelican’s peak sustainable read rate was 1GB per second per 1PB of storage when the paper came out, and it could store more than 5PB in a single rack, which meant an entire rack’s data could be transferred out in 13 days. Microsoft may have a newer-generation cold storage design with higher throughput and capacity today.
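That drain-time figure is easy to verify. A minimal sketch, assuming binary petabytes (2^50 bytes) and decimal gigabytes (10^9 bytes), the unit mix that makes the paper's numbers line up:

```python
# Back-of-the-envelope check of the rack drain time quoted above.
# Unit assumptions: binary petabytes, decimal gigabytes.
PB = 2**50   # bytes per (binary) petabyte
GB = 10**9   # bytes per (decimal) gigabyte

capacity_pb = 5                  # "more than 5PB in a single rack"
rate = capacity_pb * 1 * GB      # 1 GB/s of read throughput per PB stored

drain_seconds = (capacity_pb * PB) / rate
print(f"{drain_seconds / 86_400:.1f} days")  # -> ~13.0 days
```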

Cool Blob Storage and the regular-access Hot Blob Storage have similar performance in terms of latency and throughput, Sriprasad Bhat, senior program manager for Azure Storage, wrote in a recent blog post announcing the launch.

There is a difference in availability guarantees between the two, however. The Cool access tier offers 99 percent availability, while the Hot access tier guarantees 99.9 percent.

With the RA-GRS redundancy option, which replicates data for higher availability, Microsoft will give you a 99.9 percent uptime SLA for the Cool access tier versus 99.99 percent for the Hot access tier.
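Those availability figures are easier to compare as monthly downtime budgets. A minimal sketch, assuming a 30-day month:

```python
# Allowed downtime implied by each availability figure (30-day month).
def downtime_hours_per_month(availability_pct, hours=30 * 24):
    return hours * (1 - availability_pct / 100)

for label, pct in [("Cool", 99.0), ("Hot", 99.9),
                   ("Cool + RA-GRS", 99.9), ("Hot + RA-GRS", 99.99)]:
    print(f"{label:>14}: {downtime_hours_per_month(pct):5.2f} h/month")
# Cool              ->  7.20 h/month
# Hot / Cool+RA-GRS ->  0.72 h/month (~43 minutes)
# Hot + RA-GRS      ->  0.07 h/month (~4.3 minutes)
```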

Original article appeared here: Cold Storage Comes to Microsoft Cloud

Source: TheWHIR

Angel Investors Target Science, Technology Startups

Survey results found that nearly one-third (31 percent) ranked solving some of the world’s biggest challenges as their main motivation.

An overwhelming majority (94 percent) of active investment angels said that it helps to have subject-matter experts weigh in when evaluating a startup investment opportunity, according to a Propel(x) survey of 200 aspiring and active angel investors in the United States.

The survey found that the majority of active angels have invested in a science and technology startup (68 percent), but half of respondents noted that they very frequently (21 percent) or somewhat often (29 percent) decide not to invest because they don’t believe they understand the technology well enough.

“I was pleased to see an overall high interest in science and technology startups, and it’s good news that only 1.32 percent of those who haven’t invested are not interested in science and tech companies,” Swati Chaturvedi, CEO and co-founder of Propel(x), told eWEEK. “The disconnect often comes down to a lack of understanding the technology well enough to invest — helping investors overcome this information or knowledge gap is crucial to facilitating investing.”

When asked to rank the top three types of science-based companies that are of most interest, the three categories receiving support from more than half of angels were information technology and communications (66 percent), life sciences (56 percent) and energy and green technologies (53 percent).

Survey results found that while nearly half of active angel investors (49 percent) ranked the potential for investment returns as their top motivator for investing in startups, nearly one-third (31 percent) ranked solving some of the world’s biggest challenges as their main motivation.

“Angels will continue to play a larger and more important role in funding early-stage startups,” Chaturvedi said. “There is a huge funding gap in the early stages – this is a well-researched phenomenon – called the ‘valley of death’ and other more colorful terms. This gap is being filled by angels. This is a great thing for speeding up innovation.”

Less than a quarter of active angel investors (21 percent) cited a legacy of other investors among their top three reasons for investing in a specific opportunity. Three-quarters of active angels cited the management team, over half (52 percent) noted their ability to understand the technology and 42 percent cited the potential return on investment as their top reasons for choosing to invest in a specific company.

“Investing goes in waves and we are coming off a huge wave in life sciences that’s not abating,” Chaturvedi said. “We are now also seeing a re-emergence in energy. This sector had been dormant, but we’re seeing an increased interest and investment in clean tech with almost half of the angels surveyed having already invested in clean tech. This tells us to expect some clean tech startups coming down the line for VC funding.”
Source: eWeek

Oracle is paying $532 million to snatch up another cloud service provider

Hard on the heels of a similar purchase last week, Oracle has announced it will pay $532 million to buy Opower, a provider of cloud services to the utilities industry.

Once a die-hard cloud holdout, Oracle has been making up for lost time by buying a foothold in specific industries through acquisitions such as this one. Last week’s Textura buy gave it a leg up in engineering and construction.

“It’s a good move on Oracle’s part, and it definitely strengthens Oracle’s cloud story,” said Frank Scavo, president of Computer Economics.

Opower’s big-data platform helps utilities improve customer service, reduce costs and meet regulatory requirements. It currently stores and analyzes more than 600 billion meter readings from 60 million end customers. Opower claims more than 100 global utilities among its clients, including PG&E, Exelon, and National Grid.
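For a sense of the per-customer scale behind those numbers, some illustrative arithmetic; the hourly read interval is an assumption, since the article doesn't say how often meters report:

```python
# Rough scale check on the Opower figures quoted above; the hourly
# read interval is an assumption, not a figure from the article.
readings = 600e9   # meter readings stored and analyzed
customers = 60e6   # end customers

per_customer = readings / customers
print(f"{per_customer:,.0f} readings per customer "
      f"(~{per_customer / 24:,.0f} days of history at hourly reads)")
# -> 10,000 readings per customer (~417 days at one reading per hour)
```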

Virtustream Launches Global Hyper-scale Storage Cloud

Dell EMC cloud will face tough, entrenched competition from IBM SoftLayer, Amazon, Microsoft and Oracle for enterprise storage business.

LAS VEGAS — EMC, soon to be known as Dell EMC in the enterprise IT world, on May 2 launched a global, hyper-scale storage cloud to compete in a huge emerging market with IBM SoftLayer, Oracle Cloud, Microsoft Azure, Amazon Web Services and a few others.

The announcement of Virtustream Storage Cloud was made at EMC World 2016 here at the Sands Conference Center. The cloud system immediately becomes Dell EMC’s frontline Web-scale storage, backup and archiving instrument.

Virtustream, a San Francisco-based startup acquired by EMC in May 2015, provides a layer of cloud-management abstraction that sits above the virtual machine management layer and affords more accurate controls for administrators. In controlling that layer in the stack, Virtustream sets itself apart from other cloud management offerings. Its application lifecycle control and automation functionality focuses specifically on I/O-intensive enterprise applications, such as SAP’s in-memory S/4HANA database and large conventional parallel databases, that run on highly automated, multi-tenant cloud systems.

Available Starting May 10 for On-Premises or IaaS Deployments

Virtustream’s services will become available May 10 as both on-premises and cloud infrastructure-as-a-service (IaaS) deployments for large enterprises.

This platform has been extensively tested, with underlying elements running successfully in production for several years as the primary object storage platform for a select set of customers managing multiple exabytes of data, with hundreds of billions of objects under management and an event monitoring system that processes more than 35 billion events per day, EMC said.

The new Virtustream Storage Cloud provides cloud extensibility for on-premises EMC storage, providing simple and efficient tiering, long-term backup retention, and cold storage in the cloud with single-source EMC support.

Key features, according to EMC, include:

– engineered-in resiliency delivering up to 13 x 9s of data durability;
– an architecture optimized for performance, particularly for large object sizes;
– read-after-failure availability, providing resiliency and data integrity even in the case of a single-site failure; and
– extensibility of on-premises primary storage and backup to the cloud.

EMC offerings that eventually will support the platform include:

– Data Domain: Using Data Domain Cloud Tier, users can automatically move backup data directly from EMC protection storage to Virtustream Storage Cloud for seamless, cost-effective long-term backup retention;
– EMC Data Protection Suite: Users can tier backup data from EMC protection software to Virtustream Storage Cloud for long-term backup retention;
– VMAX, XtremIO and Unity systems: Users can tier data to the cloud to reduce the on-site primary storage footprint while maintaining optimal performance through on-premises client-side caching and Virtustream Storage Cloud; and
– EMC Isilon: Users can archive cold data to the cloud, using on-premises Isilon CloudPools policies to govern the placement and retention of tiered files to Virtustream Storage Cloud.

Enterprises soon will be able to deploy Web-scale object storage for cloud-native applications, using a simple, S3-compatible application programming interface, the company said.

“Any modern data center must extend seamlessly to the cloud, which is why we’re making cloud connectivity and cloud tiering an inherent capability of all of our products,” Jeremy Burton, EMC President of Products and Marketing, told eWEEK. “With Virtustream, and the cloud capabilities in our storage products, we’re able to offer our customers even more choice: They can tier to an EMC managed public cloud, EMC private cloud or third-party public cloud of their choice.”

What Few People Know About Virtustream

The movement of Virtustream into the EMC realm during the past 12 months was fairly smooth. The San Francisco-based company had been a longtime partner of SAP, which also is a longtime partner of EMC, and of VMware, which is owned by EMC. Much of the Virtustream software already has been melded into that of EMC-owned companies.

What many people do not know is that a lot of the back end of the Virtustream cloud was built and/or enhanced by the same developers who built the Mozy backup cloud service, a Utah-based startup that EMC bought in 2007 to be a consumer-aimed backup cloud. EMC Mozy is still in business, and is stable and profitable, but doesn’t get a lot of fanfare.

“It’s still a good business for us,” Burton told eWEEK, “but do we see ourselves doing small-business and consumer backup? That’s not our sweet spot. We want to do enterprise.

“But why not let the guys who built this mega-consumer cloud that can manage like a 100PB with only a couple of guys — why don’t we have them build out the business back end? Internally, that team is known as the Rubicon team, and it was the Rubicon team that built the hyper-scale cloud that is now the backend of the Virtustream Storage Cloud.”

Syncplicity Selects Virtustream Storage Cloud
 
Syncplicity, a top-seller for EMC in the hybrid enterprise file sync and share market, will use Virtustream Storage Cloud to meet its customers’ mobility and security needs, the company said on May 2.
 
“Virtustream offers a complete Hybrid EFSS solution enabling rapid large-scale deployments,” said Syncplicity CEO Jon Huberman. “The combination of Syncplicity’s hybrid EFSS solution with Virtustream’s highly secure and scalable storage cloud delivers mobile access anytime, anywhere and on any device, with the security and data residency compliance demanded by enterprises.”
 
Virtustream Storage Cloud with Syncplicity will be generally available on May 10 with nodes in the United States and Europe.
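The article notes that the platform will expose a simple, S3-compatible API. As a rough illustration of what that compatibility means in practice, here is a minimal Python sketch using boto3 against a generic S3-compatible endpoint; the endpoint URL, credentials and bucket name are placeholders, not Virtustream’s actual values:

```python
import boto3

# The endpoint URL, credentials and bucket below are placeholders; the
# article doesn't publish Virtustream's actual endpoint or auth scheme.
s3 = boto3.client(
    "s3",
    endpoint_url="https://objects.example-storage-cloud.com",
    aws_access_key_id="ACCESS_KEY",
    aws_secret_access_key="SECRET_KEY",
)

s3.create_bucket(Bucket="backup-archive")
with open("backup-2016-05.tar.gz", "rb") as data:
    s3.upload_fileobj(data, "backup-archive", "monthly/backup-2016-05.tar.gz")
```

Because the API surface is S3-compatible, existing backup and tiering tools that already speak S3 can, in principle, point at such an endpoint by swapping the endpoint URL and credentials.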
Source: eWeek

Samsung's Next VR Headset Won't Need a Smartphone: Report

The company is designing a stand-alone VR headset that, like the more costly units from Oculus and HTC, doesn’t rely on a smartphone.

Samsung is developing an advanced, stand-alone VR headset that won’t require a user to link it to a compatible smartphone, which is necessary with the company’s basic $100 Gear VR consumer headset.

The upcoming stand-alone virtual reality headset will incorporate advanced features, including positional tracking, which is not found in the Gear VR, according to an April 27 article in Variety. The plans for the premium VR headset were unveiled by Injong Rhee, Samsung’s head of R&D for software and services, at last week’s Samsung Developer Conference in San Francisco.

“We are working on wireless and dedicated VR devices, not necessarily working with our mobile phone,” said Rhee, according to the article. Samsung is also working on features such as hand and gesture tracking for future generations of VR devices, but it may take a few more years for those features to arrive, the story reported. “VR is amazing, but the industry is still at its infancy,” said Rhee.

Samsung’s $100 Gear VR virtual reality headset was released in the fall of 2015 as a consumer version of virtual reality headsets made by Oculus. The Gear VR works with Samsung’s latest smartphone models—the Galaxy Note 5, Galaxy S7, Galaxy S7 Edge, Galaxy S6 Edge+, S6 and S6 Edge.

The Oculus Rift and HTC Vive VR headsets do not require a smartphone to operate, but they are more expensive. The Oculus Rift is priced at $599 and began shipping on March 28. Earlier in March, Oculus also announced that some 30 new VR gaming titles would be available to play on the new devices as the first Rift VR headsets began shipping. The first 30 titles will be joined by more than 100 additional titles through the end of 2016, according to an earlier eWEEK story.

The Rift is equipped with dual active-matrix organic LED (AMOLED) displays that are designed to provide users with incredible visual clarity as they explore virtual worlds with the device. The Rift also uses an infrared LED constellation tracking system that provides precise, low-latency 360-degree orientation and position tracking for accurate and fluid control when playing games and simulations. Facebook acquired Oculus for $1.9 billion in March 2014 to expand its social media footprint.

Also in March, Sony announced that it will be bringing virtual-reality game play to its PlayStation entertainment systems with a new $399 PlayStation VR headset that will go on sale starting in October. The October release is later than the company previously expected. The VR headset was called Project Morpheus when it was first unveiled as a prototype two years ago at the 2014 Game Developers Conference.

In February, it was revealed that Google is in the midst of designing a stand-alone virtual reality headset that would not require a smartphone, unlike its existing Google Cardboard viewer. Google’s stand-alone VR viewer project came on the heels of reports about Google’s planned revision for its existing Google Cardboard viewer, which is made of folded cardboard. Google Cardboard wraps around a compatible smartphone, which provides the technology features that give the VR viewer its functions. The updated Google Cardboard viewer will still be used with a smartphone. The upcoming device will include additional support for the Android operating system and is expected to be released this year to replace Google Cardboard, according to an earlier eWEEK story.

Google Cardboard, which first appeared in 2014, is a simple VR viewer made up of cut-and-folded cardboard that is shaped into a boxy-looking VR device. The gadget has a slot that accepts a compatible Android smartphone so that it can take advantage of the phone’s display and other features. Several other parts are used besides the cardboard, including some Velcro, a rubber band, two small magnets and some aftermarket lenses, which can be purchased online. The first Cardboard device was dreamed up and built by Googlers David Coz and Damien Henry in 2014 at the Google Cultural Institute in Paris as part of a 20 percent project, which allows Google employees to use up to 20 percent of their work time to engage in projects that interest them.

In an April study, Strategy Analytics estimated that the global virtual reality headset market will bring in about $895 million in revenue in 2016, but while 77 percent of that revenue will come from premium-priced products from Oculus, HTC and Sony, the actual per-device sales totals will be dominated by lower-priced headsets from various vendors. The study predicted that three of the latest devices—the Oculus Rift, the HTC Vive and the coming Sony PlayStation VR—will bring in the bulk of the segment’s revenue this year. At the same time, though, those higher-priced devices will make up only about 13 percent of the 12.8 million VR headsets that Strategy Analytics predicts will be sold in 2016, according to the report.
Source: eWeek

Brocade Cites Weak IT Spending in Q2 Revenue Warning

Brocade officials expect revenue in the most recent financial quarter to come in below expectations due to a general weakness in IT spending, echoing what other tech vendors have mentioned in recent months.

Executives with the networking vendor in February had said they expected revenue for their second fiscal quarter to come in between $542 million and $562 million. However, they announced May 2 that instead revenue will hit between $518 million and $528 million. Brocade is scheduled to announce second-quarter earnings May 19.
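For scale, a quick midpoint comparison of the two guidance ranges; illustrative arithmetic, not figures from Brocade:

```python
# Midpoint comparison of the two guidance ranges quoted above (US$M).
prior = (542, 562)
revised = (518, 528)

prior_mid = sum(prior) / 2      # 552.0
revised_mid = sum(revised) / 2  # 523.0
print(f"~{1 - revised_mid / prior_mid:.1%} below the original midpoint")
# -> ~5.3%
```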

In a statement, CEO Lloyd Carney said a “general softness in IT spending” is similar to what other vendors in the tech industry have mentioned in recent months, adding that for Brocade in particular, revenue was weaker than anticipated in its storage-area network (SAN) business and its IP networking unit faced pressure, especially from service providers and the U.S. federal business.

“We are addressing these near-term challenges by continuing our focus on sales execution in this weaker demand environment, maintaining prudent expense controls and managing our investments in line with our stated priorities,” Carney said. “We continue to execute on our strategy to build a pure-play networking company for the digital transformation era that expands our market reach, diversifies our revenue mix, and creates exciting, incremental opportunities for growth.”

Among the other vendors that have issued revenue warnings is rival Juniper Networks, which earlier in April pointed to weak demand from enterprise customers and poor timing on deployments by some top-tier telecommunications customers in both the United States and Europe.

For the previous quarter, Brocade had reported $574 million in revenue, flat from the same period the year before. In a conference call in February to discuss the quarterly financial numbers, Carney noted solid performance in the company’s Fibre Channel storage and SAN businesses, but said the IP networking business was being hurt by a steep seasonal decline in the U.S. federal business.

“As a result, we’re maintaining a more modest view of our IP networking business in the first half of the year,” the CEO said, according to a transcript on Seeking Alpha. “However, we do expect significant improvement in the second half as the U.S. federal market becomes seasonally stronger and new products provide an opportunity to accelerate growth.”

Brocade officials, who over the past several years have worked to build out the company’s software-defined networking (SDN) capabilities, announced in April that the company was buying Ruckus Wireless for $1.2 billion in a bid to bolster its wireless networking expertise.

Source: eWeek

OpenStack Makes Strides Despite Persistent Identity Problem

NEWS ANALYSIS: There are plenty of definitions for the OpenStack cloud platform, but the best way to understand what it does is to look at how companies such as AT&T and Verizon are deploying it.

When I hear answers to the question, “What is OpenStack?” I hear echoes of Morpheus trying to explain to Neo, “What is the Matrix?” Morpheus can only talk in metaphors. The Matrix is a dream world, a battery, control. All are true, but none quite captures what it really is.

OpenStack likewise is defined in metaphors. In the past year I’ve heard proponents describe OpenStack as “people,” an “interface,” a “set of APIs” and a “platform.” Last week at the OpenStack Summit in Austin, we also heard it described as a “strategy for taking advantage of diversity in IT,” an “integration engine” and, according to CoreOS CEO Alex Polvi, simply “an application.”

All are true, of course, which is part of the allure of OpenStack and also the source of its ongoing identity crisis. OpenStack is six years and 13 versions into its life as an “open-source software platform for cloud computing,” which is how Wikipedia puts it and actually is a pretty good definition.

I’ll throw another metaphor into the pot: OpenStack is a use case. There are a lot of them, and that’s part of the identity problem: Which use case? According to the annual OpenStack User Survey, the number one use case is test and development (63 percent), which is not surprising because that’s where all cloud efforts begin. Others include infrastructure as a service (49 percent) and Web services and e-commerce (38 percent). The total percentages exceed 100 because survey respondents were allowed multiple choices.

The most intriguing use case, however, is network-functions virtualization (29 percent). Work has been quietly proceeding on NFV for the past few years and no one really noticed until AT&T and Verizon recently announced large network architectures deployed on OpenStack.

AT&T claims to be the biggest OpenStack implementation after building 74 AT&T Integrated Cloud (AIC) sites with another 30 coming this year as the telecom service provider works toward its goal of virtualizing 75 of its network operations by 2020.AT&T didn’t wait for analysts or the media or anyone else to declare that OpenStack was ready for production. They did it, as one official told me earlier this year, simply because “OpenStack is a great platform on which to deploy a network.”They came to that decision not because the company wanted to become a cloud computing leader, but because of the need for telcos to become more open and agile or risk not being able to keep up with the demand for network services. The result for AT&T was ECOMP (Enhanced Global Control Orchestration Management Policy), which Sorabh Saxena, AT&T’s senior vice president for software development and engineering, described in detail to Summit attendees.Boris Renski, co-founder and chief marketing officer of Mirantis, whose OpenStack distribution AT&T uses in the AIC, explained how OpenStack also enables telcos to keep their costs under control as they build out enough data centers to meet demand. 
Source: eWeek

Microsoft Unveils Cheaper Azure Cool Blob Storage Option

Businesses can now park their backups and other infrequently accessed data in the cloud for as little as a penny per gigabyte.

To help manage storage costs, enterprises often turn to storage tiering practices that place older, seldom-used data on storage systems and media that are less expensive to operate and maintain. Now Microsoft is offering its Azure Blob Storage customers a similar option for their cloud-based object data, called Cool Blob Storage.

“Example use cases for cool storage include backups, media content, scientific data, compliance and archival data. In general, any data which lives for a longer period of time and is accessed less than once a month is a perfect candidate for cool storage,” Sriprasad Bhat, a Microsoft Azure Storage senior program manager, explained in a blog post. “With the new Blob storage accounts, you will be able to choose between Hot and Cool access tiers to store object data based on its access pattern.”

Among Cool Blob Storage’s most notable attributes is its low cost. In some regions, customers can expect to pay as little as a penny per gigabyte to keep their data on the service. Latency and throughput performance are similar in both the Hot and Cool tiers, assured Bhat, although their service-level agreements (SLAs) differ.
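As an illustration of how tier selection looks in code today, here is a minimal sketch using the azure-storage-blob Python SDK (a v12-era API, newer than the one available when the service launched); the connection string, container and file names are placeholders:

```python
from azure.storage.blob import BlobServiceClient

# Placeholder connection string; the container and file names are
# examples, and this uses the modern v12 SDK rather than the 2016-era one.
service = BlobServiceClient.from_connection_string("<connection-string>")
container = service.create_container("backups")  # or get_container_client()

# Upload straight into the Cool access tier...
with open("archive-2016-05.tar.gz", "rb") as data:
    container.upload_blob("archive-2016-05.tar.gz", data,
                          standard_blob_tier="Cool")

# ...and move it back to Hot later if the access pattern changes.
blob = container.get_blob_client("archive-2016-05.tar.gz")
blob.set_standard_blob_tier("Hot")
```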

While Microsoft offers a 99.9 percent (three nines) availability SLA on the Hot tier, the Cool tier gets by with 99 percent. Customers that select the read-access geo-redundant storage (RA-GRS) option can bump their Azure Cool Blob Storage SLA up to 99.9 percent.

A number of data storage vendors are integrating Azure Cool Blob Storage into their products. Backup specialists Commvault and CloudBerry Lab are supporting Microsoft’s new cloud storage option. SoftNAS, a provider of filer software for cloud-based network-attached storage (NAS), and converged storage systems maker Cohesity have also signaled their support. Data protection company Veritas added a cloud connector that supports Cool Blob Storage to the NetBackup 8.0 beta.

“As we work to expand our relationship with Microsoft across a wide range of information management solutions, Veritas is pleased to announce beta availability of an integrated connector in NetBackup for Microsoft Azure Blob storage services. We encourage our enterprise customers to test the ease of use, manageability, and performance of NetBackup on Microsoft,” Simon Jelley, vice president of product management for Veritas, said in a May 2 announcement.

The new cloud-storage feature comes three months after Veritas celebrated its legal separation from Symantec and emerged as an independent, privately owned company. It also coincides with the company’s induction into Microsoft’s Enterprise Cloud Alliance, a partner program for makers of Azure-compatible cloud solutions for businesses.

In another cost-cutting move, at least for select educational customers, Microsoft announced that it has eliminated Azure egress fees for academic institutions in North America and Europe. “Azure customers who have an Enrollment for Education Solutions (EES) agreement are eligible for this program. These EES customers don’t have to do anything to get this benefit—there is no special contract to sign or agreement to enter into,” Brian Hillger, senior director of Microsoft Cloud and Enterprise Business Planning, wrote in a May 2 blog posting.
Source: eWeek

Adobe Updates Document Cloud to Spur Digital Transformation

Adobe introduces Adobe Sign, and Adobe Document Cloud and Box team up to transform digital document processes.

Adobe recently introduced Adobe Sign, its e-signature solution, and announced integration between Adobe Sign and Adobe Marketing Cloud.

Formerly known as Adobe Document Cloud eSign services, Adobe Sign is an easy, secure way to bring trusted e-signatures to any organization. The integration between Adobe Sign and Adobe Marketing Cloud eliminates manual, paper-based processes and eases the way for digital transformation.

Mark Grilli, Adobe’s vice president of product marketing, said Adobe keeps “hearing about digital transformation and how critical it’s becoming as an issue for our customers.” Companies are concerned that if they don’t transform, they might disappear, he said.

“The biggest area of concern is around customer experience,” Grilli said. “A fully digital experience is better than a paper-based one, or one that has gaps and is part digital and part paper. This is a theme we see in the market and we see from customers.”

Citing an IDC study, Grilli said 40 percent of companies on the Standard & Poor’s 500 Index will not exist in 10 years. The IDC study also found that 77 percent of organizations reported having gaps in their existing systems that affect the customer experience, and 72 percent said improving document processes would increase customer satisfaction. However, 80 percent of document processes still rely on paper, Grilli said.

Adobe Sign works with Adobe Experience Manager (AEM) Forms, a core piece of the Adobe Marketing Cloud strategy, to help organizations go completely digital with anything from credit card applications to government benefit forms or medical forms, Adobe said. This integration enables organizations to provide an end-to-end digital form-filling and signing experience for customers on any channel; speed up time-to-market and achieve efficiency savings for forms management; continually improve the user experience with Adobe Target; and analyze and optimize performance with Adobe Analytics, said Josh van Tonder, a solutions manager with Adobe’s Worldwide Government group, in a blog post.

Adobe also is working on supporting e-signatures in Europe and is rolling out new data centers and meeting legal requirements in the EU as part of a global expansion that will continue through 2016. In a blog post, Dan Puterbaugh, director and associate general counsel for Adobe, said EU businesses have needed a single e-signature law applied uniformly across all member states, and in 2014 the Council of the European Union adopted eIDAS to meet that need. eIDAS supplies a legal structure for electronic identification, signatures, seals and documents throughout the EU.

eIDAS goes into effect on July 1, 2016. “With eIDAS on the horizon, new opportunities are going to arise for all companies doing business in the EU,” Puterbaugh said. “To help businesses seize those opportunities, Adobe has been building out its infrastructure with technology and information assets to make e-signatures as easy and secure to use in the EU as they are in the US.” Adobe has added support and integration for EU Trust Lists and is the first major vendor to do so, Puterbaugh said. “And we’ve ensured that we have the local expertise in place to serve this new EU single digital market.”

Adobe also announced new Document Cloud storage integrations with Box and Microsoft OneDrive, which make it easier to access and work on PDF files from anywhere, as well as new features for Adobe Acrobat DC subscribers. “Every company and organization should be laser-focused on delivering the best customer experience possible, and the best experience does not involve paper,” said Bryan Lamkin, executive vice president and general manager of Digital Media at Adobe, in a statement.
Source: eWeek