Oracle Buys Opower to Read Utility Meters in the Cloud

Opower’s software powers more than 100 global utilities, including Pacific Gas & Electric (PG&E), Exelon and National Grid.

Oracle is going into the meter-reading-by-cloud business. A mere one week after buying Textura, a construction management and engineering SaaS software provider, Oracle on May 2 added Opower, a publicly traded provider of customer engagement and energy efficiency cloud services to utilities. The transaction is valued at approximately $532 million, or $10.30 per share, net of Opower’s cash. The acquisition is the fifth thus far this year, and the second in the cloud services sector, for the Redwood City, Calif.-based database, middleware, apps and data center hardware company.

Opower’s software powers more than 100 global utilities, including Pacific Gas & Electric (PG&E), Exelon and National Grid. Opower’s big data platform stores and analyzes more than 600 billion meter reads from 60 million utility end customers, enabling utilities to proactively meet regulatory requirements, decrease the cost to serve, and improve customer satisfaction.
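
Opower’s platform itself is proprietary, but the basic idea of turning raw interval meter reads into customer insights can be sketched in a few lines. The example below is a hypothetical illustration only; the column names and the usage threshold are assumptions, not Opower’s actual schema. It aggregates hourly reads into daily consumption per customer and flags unusually high usage.

```python
# Hypothetical sketch: aggregate interval meter reads and flag high daily usage.
# Column names and the 30 kWh threshold are illustrative assumptions, not Opower's schema.
import pandas as pd

reads = pd.DataFrame({
    "customer_id": ["A1", "A1", "B2", "B2"],
    "timestamp": pd.to_datetime([
        "2016-05-01 00:00", "2016-05-01 01:00",
        "2016-05-01 00:00", "2016-05-01 01:00",
    ]),
    "kwh": [1.2, 1.4, 2.9, 3.1],
})

# Sum hourly reads into daily consumption per customer.
daily = (
    reads.set_index("timestamp")
         .groupby("customer_id")["kwh"]
         .resample("D")
         .sum()
         .reset_index()
)

# Flag customers whose daily usage exceeds an illustrative threshold.
daily["high_usage"] = daily["kwh"] > 30.0
print(daily)
```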

Oracle Utilities and Opower thus will become the largest provider of mission-critical cloud services to utilities, said Rodger Smith, senior vice president and general manager of the Oracle Utilities Global Business Unit. Oracle will be going head-to-head with companies such as Silver Spring, VertexOne, WaterWorks, Muni-Link and UtilityTrakR. The transaction is expected to close in 2016, subject to Opower’s stockholders tendering a majority of Opower’s outstanding shares and derivative securities exercised prior to the closing of the tender offer, certain regulatory approvals and other customary closing conditions.


Source: eWeek

Ubuntu Founder Pledges No Back Doors in Linux

VIDEO: Mark Shuttleworth, founder of Canonical and Ubuntu, discusses what might be coming in Ubuntu 16.10 later this year and why security is something he will never compromise.

Ubuntu developers are gathering this week for the Ubuntu Online Summit (UOS), which runs from May 3-5, to discuss development plans for the upcoming Ubuntu 16.10 Linux distribution release, code-named “Yakkety Yak.” In a video interview, Mark Shuttleworth, founder of Ubuntu Linux and Canonical, discusses Ubuntu 16.10, including the Mir display server and his views on security, including the use of encryption.

Ubuntu 16.10 is set to debut in October and follows the Ubuntu 16.04 update, which was released on April 21. While it’s not yet entirely clear which features will land in Ubuntu 16.10, one candidate is the Mir display server. The Ubuntu community, and Shuttleworth in particular, has been talking about migrating to Mir since at least 2013. The promise of Mir is a unified display technology that will work across desktops, mobile devices and even TVs. While there is some controversy among members of the Linux community over the transition to Mir, Shuttleworth emphasized that few people will ever know the difference.

“I can’t say when Mir will drop into Ubuntu as the default display system, but I can say when it does, no one should notice it,” Shuttleworth told eWEEK. “That’s our commitment: The set of experiences that people enjoy about Ubuntu–they can count on.”

One thing that Ubuntu Linux users will also continue to rely on is the strong principled stance that Shuttleworth has on encryption. With the rapid growth of the Linux Foundation’s Let’s Encrypt free Secure Sockets Layer/Transport Layer Security (SSL/TLS) certificate platform this year, Shuttleworth noted that it’s a good idea to consider how that might work in an integrated way with Ubuntu.

Overall, he said, the move to encryption as a universal expectation is really important. “We don’t do encryption to hide things; we do encryption so we can choose what to share,” Shuttleworth said. “That’s a profound choice we should all be able to make.”

Shuttleworth emphasized that on the encryption debate, Canonical and Ubuntu are crystal clear. “We will never backdoor Ubuntu; we will never weaken encryption,” he said.

Watch the video interview with Mark Shuttleworth below:

Sean Michael Kerner is a senior editor at eWEEK and InternetNews.com. Follow him on Twitter @TechJournalist.


Source: eWeek

Canadian Web Hosting Joins Vancouver Internet Exchange

Canadian Web Hosting has announced that it has joined the Vancouver Internet Exchange (VANIX), a network-neutral, independent exchange based in Vancouver that interconnects multiple IP networks, including those of ISPs, content delivery networks, and leading networks like Canadian Web Hosting’s Canadian CloudStream Backbone Network.

Canadian Web Hosting continues to expand its base of operations and services in Vancouver and is increasingly focused on its goal of delivering the industry’s lowest latencies for customers on the west coast and across Canada. Today, VANIX is the second-largest IX in Canada, and with such a strong internet community behind it, Canadian Web Hosting’s customers will benefit from faster, more efficient internet paths to British Columbia’s businesses and residents.

By joining VANIX, Canadian Web Hosting continues to expand the availability of its recently launched CloudStream Canadian network backbone. With CloudStream, Canadian Web Hosting continues to deliver on its promise to maximize performance and network scalability, offering up to full gigabit Ethernet connections for every server and device. In addition, Canadian Web Hosting has removed the possibility of data moving onto the public internet, which in turn increases security, giving customers the reassurance that their network usage remains 100% Canadian. The network is designed to handle and move massive amounts of data for current and future services like cloud hosting, IoT, video, enterprise apps and virtual reality (VR) apps.

“As a leading service provider in infrastructure and cloud hosting services, we understand many of our new and existing customers’ pain points,” said Matt McKinney, chief strategy officer at Canadian Web Hosting. “With so many emerging technologies coming to the forefront that require low latency and high capacity network links, we believe that by joining VANIX our customers are well positioned for the next phase of application and IoT hosting and will be able to meet the most demanding requirements.”

Source: CloudStrategyMag

Qualcomm Is Bringing Deep Learning to Mobile Devices

The vendor’s Neural Processing Engine SDK will run on smartphones and other devices powered by Qualcomm’s Snapdragon 820 SoCs.

Qualcomm wants to make mobile devices running its Snapdragon 820 processor even smarter. Company officials on May 2 introduced a deep-learning software development kit (SDK) for the ARM-based systems-on-a-chip (SoCs) that will enable device makers to run neural network models on their Snapdragon 820-powered products—including smartphones, security cameras, cars and drones—for such tasks as scene detection, text recognition, object avoidance, face and gesture recognition, and natural language processing.

Other devices can do many of the same tasks, but the Neural Processing Engine SDK will allow those workloads to be processed without a connection to the cloud, according to Qualcomm officials. It is based on Qualcomm’s Zeroth Machine Intelligence Platform, a software portfolio for machine learning on mobile devices that is optimized for the Snapdragon SoC lineup. The platform is already used in such Qualcomm software as Snapdragon Scene Detect for visual intelligence and the Smart Protect advanced malware detection software.

The Neural Processing Engine will help Qualcomm meet the growing demand for mobile experiences that are driven by machine learning but not tied to the Internet, according to Gary Brotman, director of product management at Qualcomm.

“With the introduction of the new Snapdragon Neural Processing Engine SDK, we are making it possible for myriad sectors, including mobile, IoT [Internet of things] and automotive, to harness the power of Qualcomm Snapdragon 820 and make high-performance, power-efficient on-device deep learning a reality,” Brotman said in a statement.

That includes smartphones like Samsung’s Galaxy S7, HP Inc.’s Elite X3, LG Electronics’ G5 and Xiaomi’s Mi 5, but also any other mobile devices. Qualcomm, which is the world’s largest provider of processors for smartphones, is looking to expand its reach into an array of other markets, from automobiles to drones. The tablet market continues to contract, and global smartphone sales are going flat as worldwide markets become saturated. According to IDC analysts, the number of smartphones shipped globally in the first quarter was 334.9 million, up only slightly from the 334.3 million shipped during the same period in 2015, the smallest year-over-year growth on record.

With the Snapdragon 820, the company is looking to leverage its heterogeneous processing capabilities to gain traction in other growth markets. The SoC includes not only the ARM-based 64-bit Kryo CPU, but also Qualcomm’s Adreno GPU and Hexagon digital signal processor (DSP).

Deep learning uses neural networks made up of multiple compute layers that are designed to enable systems to learn through experience and act on what they’ve learned rather than being constantly programmed by humans. Most neural networks now run in powerful server-based environments in data centers, but there is a push to bring such capabilities to mobile devices. Earlier this year, researchers at the Massachusetts Institute of Technology (MIT) unveiled “Eyeriss,” a 168-core processor they said will enable smartphones and other mobile and embedded devices to run artificial intelligence (AI) algorithms locally, letting much of the work of collecting and processing data be done on the device itself.

Qualcomm officials said enabling deep-learning capabilities on mobile devices will help organizations in a broad range of verticals, including automotive, security, health care and imaging. Through the new SDK, these organizations will be able to run their own trained neural networks on mobile devices, they said. The Snapdragon Neural Processing Engine SDK will be available in the second half of 2016. It initially will be available for the Snapdragon 820 SoC and will support such deep-learning frameworks as Caffe and CudaConvNet.
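
The description of neural networks as “multiple compute layers” can be made concrete with a minimal forward pass. The sketch below uses plain NumPy with made-up weights purely to illustrate what on-device inference computes; it does not use Qualcomm’s actual SDK, whose API is not shown in this article.

```python
# Minimal sketch of on-device inference: a two-layer neural network forward pass.
# Weights are random placeholders; a real deployment would load a trained model
# (e.g., one exported from Caffe, a framework the SDK is said to support).
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(x, 0.0)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

# Layer weights: 8 input features -> 16 hidden units -> 3 output classes.
W1, b1 = rng.normal(size=(8, 16)), np.zeros(16)
W2, b2 = rng.normal(size=(16, 3)), np.zeros(3)

def forward(features):
    hidden = relu(features @ W1 + b1)   # first compute layer
    return softmax(hidden @ W2 + b2)    # second layer plus class probabilities

sensor_features = rng.normal(size=8)    # stand-in for camera/sensor-derived input
print(forward(sensor_features))         # e.g., gesture class probabilities

```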
Source: eWeek

AWS Now Available At EdgeConneX® Portland Edge Data Center®

EdgeConneX® has announced the availability of Amazon Web Services (AWS) Direct Connect in its Portland Edge Data Center®. With AWS Direct Connect, companies in the Pacific Northwest can connect their IT infrastructure directly to Amazon Web Services, establishing a private connection to the cloud that can reduce costs, increase performance and provide a more consistent network experience. The announcement marks the first metro offering for AWS Direct Connect in Portland and the first AWS deployment for EdgeConneX.
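
For teams evaluating the new on-ramp, Direct Connect can be provisioned programmatically as well as through the console. The snippet below is a hedged sketch using boto3’s Direct Connect client; the region, location code and connection name are placeholders to be replaced with the values AWS lists for the Portland-area facility, not details published by EdgeConneX or AWS.

```python
# Hedged sketch: request an AWS Direct Connect connection with boto3.
# The region, location code and connection name below are placeholders.
import boto3

dx = boto3.client("directconnect", region_name="us-west-2")

# List the Direct Connect locations available in the region first.
for loc in dx.describe_locations()["locations"]:
    print(loc["locationCode"], loc["locationName"])

# Request a dedicated connection at a chosen location (placeholder code shown).
connection = dx.create_connection(
    location="EqSE2",              # replace with the actual location code
    bandwidth="1Gbps",
    connectionName="portland-edge-dx",
)
print(connection["connectionId"], connection["connectionState"])
```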

The idea of bringing content, the cloud and applications closer to end-users is one that has been predicted by many industry experts. Specifically, in a report titled “The Edge Manifesto: Digital Business, Rich Media, Latency Sensitivity and the Use of Distributed Data Centers” (July 2015), Gartner analyst Bob Gill states, “We’ve begun the move to digital business, including rich content via mobile devices, where people, their devices and even unattended “things” become actors in transactions.” Gill further predicts, “To optimize the experience, Gartner believes the topology of networked data centers will push over the next five years from a centralized, mega data center approach, to one augmented by multiple, smaller, distributed sources and sinks of content and information, whether located in distributed, enterprise-owned data centers, hosting providers, colocation or the cloud.”

“The Internet of Everywhere requires a highly diverse and distributed content and cloud architecture, with the network edge extended beyond traditional major peering hubs to ensure the service quality and experience expected by today’s enterprises and consumers,” remarks Clint Heiden, chief commercial officer, EdgeConneX. “AWS Direct Connect provides the Portland/Hillsboro regional enterprise and consumer end-users an easy, high-performance and private on-ramp to the cloud at the edge, enabling access to Amazon’s powerful web services and effective deployment of hybrid solutions supported by EdgeConneX’s world-class EDC infrastructure.”

The EdgeConneX Portland EDC is purpose-built to offer security, speed and performance improvements. These innovations enable customers to deliver digital content, cloud and applications to end-users as fast as possible. Edge Data Centers are proximity-based, strategically located nearest to the end-user’s point of access to reduce network latency and optimize performance. Local proximity access also brings the cloud closer to the enterprise, enabling more secure, real-time access to cloud applications and services while offering reduced backbone transport costs.

Source: CloudStrategyMag

Secure Web Gateways Fail to Prevent Malicious Attacks

Of the 200 billion total communications observed, nearly 5 million attempted malicious outbound communications were from infected devices.

Eighty percent of secure web gateways installed by Fortune 1000 companies miss the vast majority of malicious outbound communications, according to a report from attack detection and analytics specialist Seculert. The study examined a subset of its 1.5 million user base that included more than 1 million client devices that had generated over 200 billion total communications from Fortune 1000 companies in North America. Nearly all the environments studied were running sophisticated perimeter defense systems, which included a secure web gateway and/or next-generation firewall, an IPS, as well as a SIEM, in addition to fully functioning endpoint protection.

“The alarming part of this research is the sheer number of malicious threats that were able to make it through the companies’ secure web gateways time after time,” Richard Greene, CEO of Seculert, told eWEEK. “The research found that 80 percent of secure web gateways blocked zero to two of the 12 latest and most dangerous threats. These are real tests conducted with Fortune 1000 companies, and even they are ill prepared for the increasing complexity of cybercriminals’ attacks.”

Of the 200 billion total communications observed, nearly 5 million attempted malicious outbound communications were from infected devices, and 40 percent of all attempted malicious communication succeeded in defeating their associated secure web gateway.
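
Taken together, those figures imply roughly how many malicious communications actually got through. The quick calculation below works it out, treating the report’s “nearly 5 million” as an approximate 5 million.

```python
# Back-of-the-envelope check on the report's figures (approximate inputs).
total_communications = 200_000_000_000   # ~200 billion observed
malicious_attempts = 5_000_000           # "nearly 5 million" outbound attempts
success_rate = 0.40                      # 40% defeated the secure web gateway

succeeded = malicious_attempts * success_rate
print(f"Malicious share of traffic: {malicious_attempts / total_communications:.4%}")
print(f"Attempts that got past the gateway: ~{succeeded:,.0f}")
```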

“Many enterprises rely on only prevention-focused perimeter security tools, like next generation firewalls, IPS, and secure web gateways,” Greene said. “This positions them directly in the crosshairs of cybercriminals and other adversaries capable of penetrating modern perimeter security defenses with startling ease. While useful, these prevention solutions alone cannot protect organizations in the current threat landscape.”

The report also found nearly 2 percent of all examined devices were infected, and all companies included in the research exhibited evidence of infection.

“Understanding the cyber threat landscape is a constant game of trying to stay ahead of the latest threats,” Greene said. “Common cyber criminals will no longer be the most common threat as sophisticated criminal gangs with modern organizational models and tools emerge as the primary threat.”

Greene noted that besides being well funded, these attackers have the luxury of time on their side, so they’re able to develop more advanced techniques not yet anticipated by the cyber-defense community. “Also, there will be a growing number of state versus state reconnaissance attacks as cyber ‘armies’ research the strengths and weaknesses of their opponents,” he said.

Measured over time, nearly all of the gateways observed exhibited uneven performance, and the report noted that while most performed well for weeks or months, eventually all showed evidence of being “defeated” by the adversary.
Source: eWeek

U.S. Risks Losing Edge in HPC, Supercomputing, Report Says

With growing competition from China and other countries, U.S. lawmakers must take steps to accelerate the country’s HPC efforts, the ITIF says.

Last year, President Obama issued an executive order aimed at accelerating the development of high-performance computing systems in the United States. The executive order created the National Strategic Computing Initiative (NSCI), an initiative to coordinate federal government efforts and those of public research institutions and the private sector to create a comprehensive, long-term strategy for ensuring that the United States retains its six-decade lead in research and development of HPC systems.

Noting the importance of supercomputers in government, industry and academia, Obama wrote that the country’s momentum in high-performance computing (HPC) needed a “whole of government” approach that incorporates public and private efforts.

“Maximizing the benefits of HPC in the coming decades will require an effective national response to increasing demands for computing power, emerging technological challenges and opportunities, and growing economic dependency on and competition with other nations,” the president wrote. “This national response will require a cohesive, strategic effort within the Federal Government and a close collaboration between the public and private sectors.”

However, according to a recent report, the United States’ lead in the space is not ensured, and other regions and countries—in particular, China—are making concerted efforts to expand their capabilities in the design, development and manufacturing of supercomputers and the components that make up the systems. The authors of the report by the Information Technology and Innovation Foundation (ITIF) stressed the importance of the HPC market to the United States—to everything from national security to economic development—and listed steps Congress must take to keep the country at the forefront of HPC and supercomputer development.

“Recognizing that both the development and use of high-performance computing are vital for countries’ economic competitiveness and innovation potential, an increasing number of countries have made significant investments and implemented holistic strategies to position themselves at the forefront of the competition for global HPC leadership,” the authors, Stephen Ezell and Robert Atkinson, wrote. “The report details how China, the European Union, Japan, and other nations have articulated national supercomputing strategies and announced significant investments in high-performance computing.”

The United States needs to meet and exceed those efforts, the authors wrote. “The United States currently leads in HPC adoption, deployment, and development, but its future leadership position is not guaranteed unless it makes sustained efforts and commitments to maintain a robust HPC ecosystem,” they wrote.

The report describes HPC as the use of supercomputers and massively parallel processing technologies to address complex computational challenges, using such techniques as computer modeling, simulation and data analysis. It includes everything from computer hardware to algorithms and software running in a single system.

The United States continues to be the leader in the development of supercomputers, but the current trends in the industry are threatening. On the latest Top500 list of the world’s fastest systems, released in November 2015, the United States had 200 systems. That was down from the 231 on the list released in July 2015 and was the lowest number for the country since the list was started in 1993. China, meanwhile, placed 109 systems on the November list, almost three times the 37 the country had on the July list. In addition, the Tianhe-2 supercomputer developed by China’s National University of Defense Technology held the top slot for the sixth consecutive time, with a peak theoretical performance of 54.9 petaflops (quadrillion floating point operations per second), twice the speed of Titan, the second-fastest system, located at the U.S. Department of Energy’s (DOE) Oak Ridge National Laboratory in Tennessee.

The next Top500 list will be announced next month at the ISC 2016 show in Frankfurt, Germany.
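
The Top500 figures cited above translate into shares and ratios that are easy to verify. The short calculation below does so; Titan’s roughly 27-petaflop theoretical peak is an assumption drawn from published Top500 data rather than from this article.

```python
# Quick check of the Top500 figures cited above.
top500_systems = 500
us_nov_2015, us_jul_2015 = 200, 231
china_nov_2015, china_jul_2015 = 109, 37

print(f"US share, Nov 2015:    {us_nov_2015 / top500_systems:.0%}")
print(f"China share, Nov 2015: {china_nov_2015 / top500_systems:.0%}")
print(f"China growth since July: {china_nov_2015 / china_jul_2015:.1f}x")

# Peak theoretical performance, in petaflops (Titan figure assumed from Top500 data).
tianhe2_peak_pf = 54.9
titan_peak_pf = 27.1
print(f"Tianhe-2 vs. Titan peak: {tianhe2_peak_pf / titan_peak_pf:.1f}x")

```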
Source: eWeek

After Its Q2 Earnings Decline, Apple Now Loses Investor Carl Icahn

Icahn, who has been an Apple advocate for years, said he sold all his Apple stock in reaction to disdain toward the company from China.

It’s been a rough week for Apple. First the company suffers its first quarterly revenue decline since 2003, and now one of the world’s most well-known and richest investors, Carl Icahn, has announced that he sold all of his Apple stock holdings because the company is facing backlash from China’s government in a market that is important to Apple.

Icahn revealed his Apple stock sales in an interview with CNBC, telling the network that “China’s attitude toward Apple largely drove him to exit his position,” according to an April 28 report on CNBC.com. “You worry a little bit — and maybe more than a little — about China’s attitude,” Icahn told the network. China’s government could “come in and make it very difficult for Apple to sell there … you can do pretty much what you want there.”

Icahn’s action comes in part due to the recent moves by government regulators in China, who without warning shut down Apple’s online iBooks Store and iTunes Movies service, which had opened about six months ago. Apple’s team is now working with the communist government to try to restart the services, but no agreement has been announced. The shutdown came despite permission that Apple had previously been granted by the government when the services began last year, according to a recent eWEEK story.

The action by Chinese officials is a challenge for Apple, which has been garnering more and more of its revenue there in the last several years, according to the company’s revenue reports. In January, Apple reported $18.37 billion in revenue from China in its first quarter of 2016, which made up about 24.2 percent of the company’s $75.87 billion in revenue for the quarter. In the fourth quarter of 2015, Apple reported $12.52 billion in revenue from China, out of a total of $51.5 billion, according to earlier eWEEK reports.

China is Apple’s second-largest global market behind the United States. The company began selling iPhones in China in October 2014, after gaining government device security approvals. Yet despite Icahn’s pullback on Apple stock right now, he did tell CNBC that if China “was basically steadied” in the future, “he would buy back into Apple,” the story reported. Icahn said his actions came despite his opinion that Apple is a “great company” and that CEO Tim Cook is “doing a great job,” the story reported. Icahn had owned “a little less than a percent” of Apple’s shares.

Several IT analysts told eWEEK that Icahn’s actions on Apple make sense based on recent news surrounding the company.

“Our take is that I think this is a pretty sound reflection of Apple’s performance, in light of this week’s earnings announcement,” said Jeff Orr of ABI Research. “Apple’s business is meeting maturity in all of the markets that it is part of,” from computers to mobile devices, and that’s why the Chinese market is important to the company, said Orr. Icahn is “responding to that kind of information, that Apple is starting to look like every other technology company.”

Charles King, principal analyst at Pund-IT, said that Icahn’s decision “makes sense for Icahn whose huge (roughly 1 percent of all shares) position meant that he was acutely aware of movements in Apple’s stock price. Those shares have lost about 10 percent of their value since Apple’s earnings call … when the company cited problems in China as a reason for its poor results.” Other issues, including the company’s existing mature product mix with no new major products on the horizon, “raise questions about the effectiveness of the company’s management,” said King. “So while blaming China seems plausible enough, it also provides a simple out for both Icahn and Apple.”

Rob Enderle, principal analyst at Enderle Group, said Icahn’s move means “he can’t see a way to improve the situation so he is taking his money out, writing off whatever loss he incurred, and moving it someplace where he can make a difference.” And that, said Enderle, is “a really bad sign because it means even the artificial things that he could drive to spike the stock are either already being done aggressively or he doesn’t see them making enough difference. This would be like standing on the deck of the Titanic and suddenly seeing the folks tasked with patching the hole grab a lifeboat and head for the horizon.”

Ken Dulaney, an analyst at Gartner, told eWEEK that Apple’s problems today stem from its own actions, “simply because growth at their size requires another blockbuster and there is nothing apparent on the horizon. This is when Apple misses Steve [Jobs] the most.”
Source: eWeek

Pentagon Bug Bounty Program Attracts Strong Hacker Interest

The Pentagon is at the midpoint of a crowdsourcing initiative that has attracted about 500 researchers to sign up for the opportunity to search for bugs in the agency’s Websites.

The Pentagon’s bug bounty program hit its midway point this past week, and already the initiative is, in some ways, a success. More than 500 security researchers and hackers have undergone background checks and begun to take part in the search for security flaws, according to HackerOne, the company managing the program. The “Hack the Pentagon” pilot, announced in March, is the first federal government program to use a private-sector crowdsourcing service to facilitate the search for security flaws in government systems.

The $150,000 program started two weeks ago and will continue for another two weeks. While neither the Pentagon nor HackerOne has disclosed any of the results so far, Alex Rice, chief technology officer and co-founder of vulnerability-program management service HackerOne, stressed that it would be “an extreme statistical outlier” if none of the researchers found a significant vulnerability.

“What I can say is that we haven’t seen any of [these programs] launched, even those with a smaller number of individuals, where the researchers have found nothing,” he told eWEEK. “No one who launches these bounty programs expects to find nothing.”

The Pentagon’s program is the first bug bounty effort sponsored by the federal government, but it will not likely be the last, because companies and government agencies are on the wrong side of an unequal security equation: While defenders have to hire enough security workers to find and close every security hole in their software and systems, attackers only have to find one, said Casey Ellis, CEO and founder of BugCrowd, a vulnerability-bounty organizer.

“The government is in a really bad position right now, which comes from being outnumbered by the adversaries,” he said. “They can’t hire security experts fast enough, and in the meantime they are still being hacked.” Crowdsourcing some aspects of their security work offsets part of the inequality in the math facing these companies, he said.

The Department of Defense program, however, is on a much larger scale than most initial commercial efforts, HackerOne’s Rice said. Other efforts typically use dozens of security researchers, rather than hundreds. The Pentagon should get good results because the sheer number of hackers means they will have more coverage of potential vulnerabilities.

“Even hiring the best security experts that you are able to find, that will still be a much smaller pool than if you could ask everyone in the world, or in the country,” Rice said. “You really can’t do security effectively unless you come at it from every possible angle.”

U.S. Secretary of Defense Ash Carter characterized the initiative as a way for the government to take new approaches to blunt the attacks targeted at the agency’s networks. “I am always challenging our people to think outside the five-sided box that is the Pentagon,” he said in a statement at the time. “Inviting responsible hackers to test our cyber-security certainly meets that test.”

The bug bounty pilot started on April 18 and will end by May 12, according to the Department of Defense. HackerOne is slated to pay out bounties to winners no later than June 10. The Department of Defense has earmarked $150,000 for the program. The DOD called the initiative a step toward implementing the administration’s Cyber National Action Plan, a strategy document announced Feb. 9 that calls for the government to put a priority on immediate actions that bolster the defenses of the nation’s networks. The program is being run by the DOD’s Defense Digital Service, which Carter launched in November 2015.

While finding and fixing vulnerabilities is important, the program could also create a potential pipeline to recruit knowledgeable security workers into open positions in the federal government, Monzy Merza, director of cyber research at data-analysis firm Splunk, said in an email interview.

“Discovery and fixing of vulnerabilities is a good thing,” he said. “Creating an opportunity for individuals to test their skills and learn is also important. And there is a general shortage of skilled security professionals. Putting all these pieces together, a bug bounty program creates opportunities for people to learn and creates a human resource pool in a highly constrained market.”

While attacking government systems may thrill some hackers and make others too nervous to participate, the actual program differs little from the closed bug hunts sponsored by companies, HackerOne’s Rice said. The security firm’s programs—and other efforts by BugCrowd and TippingPoint’s Zero-Day Initiative, now part of security firm Trend Micro—vet security researchers and hackers to some extent before allowing them to conduct attacks on corporate services and Websites, especially production sites. In the Pentagon’s case, more extensive background checks were conducted.

In the end, the programs allow companies to spend money on security more efficiently, only paying for results, not just hard-to-find workers, he said. “Companies are not insecure because of a lack of money to spend on security,” Rice said. “There is a ridiculous amount of money being inefficiently and ineffectively spent on security. Even if we could hire all the security experts in our town or in our field, we could not possibly level the playing field against the adversaries.”
Source: eWeek

Enterprises Turn to SD-WANs to Improve Branch Office Connectivity
SD-WANs offer more flexibility, agility and affordability in branch office networks, and there’s a crowded field of vendors giving customers a lot of options.

Joe Tan knew he was going to have to improve his company’s WAN environments. Devcon Construction is a commercial building company based in Milpitas, part of Northern California’s Silicon Valley. But it has dozens of construction offices and remote sites, big and small, throughout Northern California.

Connectivity to the central office was important, but the company had to rely on whatever options were available at the individual sites. Some could get Multiprotocol Label Switching (MPLS), while others needed to use T1 connections, 4G wireless devices or even other technologies. The patchwork of disparate connections created an array of problems for Devcon, from high costs and security concerns to traffic bottlenecks, network management and visibility issues, not to mention high time demands on a small IT staff, according to Tan, Devcon’s director of IT.

Audio and video collaboration was difficult because transferring large files could result in high bandwidth consumption and slow performance. Meanwhile, having service providers set up MPLS connections could be expensive and time-consuming.

“We have a lot of construction sites in Northern California, and they all need to connect back to our headquarters,” Tan told eWEEK. “Reliable connectivity is really important for our business to run.”

Tan started investigating technology options for the company’s wide-area network (WAN) about two years ago. About 18 months ago, he started talking with VeloCloud Networks, one of a growing number of vendors in the rapidly emerging software-defined WAN (SD-WAN) market. Devcon ran a proof-of-concept with the VeloCloud technology and has since standardized its WAN environment on the vendor’s products.

VeloCloud’s software products run on standard x86 systems in a company’s branch offices or remote sites as well as in the cloud, by connecting to VeloCloud Gateways housed in cloud data centers worldwide run by Amazon Web Services, Equinix and others. The gateways ensure that all applications and workloads are delivered via the most optimized data paths and enable network services to be delivered from the cloud. VeloCloud Edges are zero-touch appliances at the remote and branch sites that provide secure connectivity to applications and services. They also offer such features as deep application recognition, performance metrics, virtual network functions (VNF) hosting and quality-of-service (QoS) capabilities. Centralized management is provided by VeloCloud Orchestrator for installation, configuration, one-click provisioning of virtual services and real-time monitoring.

For Tan, it meant more control over the WAN environments—from management to security to high performance—and the ability to address issues centrally rather than having to constantly send tech pros to multiple sites, a significant win for a company that has an IT staff of five people.

“For a company that doesn’t have a lot of IT people, this was a quick and easy way to get reliable and powerful WAN service and not have to spend a lot on infrastructure,” he said.

The Cloud Drives Interest in SD-WAN

For much of the past decade or more, not much new had happened in the enterprise networking space, the WAN included. That’s changed over the past couple of years, as network virtualization—including software-defined networking (SDN) and network-functions virtualization (NFV)—has come to the forefront to help enterprises address the challenges brought by such trends as the cloud, big data, mobility and the Internet of things (IoT).

More recently, innovation in the network has spilled over to the WAN, with enterprises and service providers looking to SD-WAN technologies to make their networks more flexible, agile and affordable. The WAN over the decades has relied on various connectivity protocols, from Synchronous Optical Network (SONET) and Asynchronous Transfer Mode (ATM) to MPLS. However, none of these options were made for a cloud-centric world.
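
The core idea behind SD-WAN products like the one described above is steering traffic over whichever underlying link (MPLS, broadband, 4G) currently offers the best measured quality. The sketch below is a generic illustration of that kind of scoring logic, not VeloCloud’s actual algorithm; the link names and weighting are assumptions.

```python
# Generic illustration of SD-WAN path selection: score each WAN link by
# measured latency, loss and jitter, then steer traffic over the best one.
# The links and weights are illustrative assumptions, not any vendor's algorithm.
from dataclasses import dataclass

@dataclass
class WanLink:
    name: str
    latency_ms: float
    loss_pct: float
    jitter_ms: float

def score(link: WanLink) -> float:
    # Lower is better: weight loss heavily, then latency, then jitter.
    return link.loss_pct * 100 + link.latency_ms + link.jitter_ms * 2

links = [
    WanLink("mpls",      latency_ms=18.0, loss_pct=0.0, jitter_ms=1.0),
    WanLink("broadband", latency_ms=32.0, loss_pct=0.5, jitter_ms=4.0),
    WanLink("lte",       latency_ms=60.0, loss_pct=1.2, jitter_ms=12.0),
]

best = min(links, key=score)
print(f"Steering real-time traffic over: {best.name} (score {score(best):.1f})")

```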
Source: eWeek