CloudCodes Releases New Security Product

CloudCodes has announced the availability of its next-generation cloud security product, SSO1, an advanced version of its existing product gControl. The SaaS-based gControl helped secure Google For Work customers, whereas SSO1 supports multiple enterprise cloud applications such as Salesforce, Zoho, Dropbox, and Freshdesk.

SSO1 is a Single Sign-On (SSO) solution and supports out-of-the-box integration with Google For Work. SSO1 can be integrated with an organisation’s Active Directory or can itself act as an Identity Provider (IdP). The solution is designed from the ground up with security in mind and provides IdP capabilities such as password management, self-service password reset, and multi-factor authentication (MFA).

The MFA comes with support for biometrics via a smartphone as well as one-time passwords (OTP). Most advanced smartphones now ship with a fingerprint scanner, and SSO1 leverages it as an additional authentication factor.
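The article does not describe SSO1’s OTP mechanism, so as a generic illustration, here is a minimal sketch of time-based one-time password (TOTP, RFC 6238) generation and verification in Python; the shared secret and 30-second step are conventional defaults, not confirmed CloudCodes parameters.

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32: str, step: int = 30, digits: int = 6, now: float | None = None) -> str:
    """Compute an RFC 6238 time-based one-time password."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int((now if now is not None else time.time()) // step)
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                      # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def verify_otp(secret_b32: str, submitted: str, drift: int = 1) -> bool:
    """Accept codes from the current step plus/minus `drift` steps of clock skew."""
    now = time.time()
    return any(
        hmac.compare_digest(totp(secret_b32, now=now + i * 30), submitted)
        for i in range(-drift, drift + 1)
    )

# Hypothetical usage: the secret would normally be provisioned per user.
secret = base64.b32encode(b"demo-shared-secret").decode()
print(verify_otp(secret, totp(secret)))  # True
```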

The new SSO1 also includes anti-phishing controls. The solution enables the administrator to restrict access to the login page by country or IP address, ensuring that only internal users can reach it and letting the administrator block access from countries known for hacking or phishing attacks.

Another important aspect of the new SSO1 is its ability to control access to an organisation’s cloud applications based on perimeter. The solution enables the administrator to gate application access by IP address, geo-fencing, time, and browser, bringing tighter control for compliance and regulatory requirements.
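A rough sketch of how such perimeter rules might compose is below; the article does not specify SSO1’s policy model, so the rule values (networks, countries, hours, browsers) are hypothetical stand-ins.

```python
import ipaddress
from datetime import datetime, time

# Hypothetical policy values; a real deployment would load these from admin config.
ALLOWED_NETWORKS = [ipaddress.ip_network("203.0.113.0/24")]   # office egress range
ALLOWED_COUNTRIES = {"US", "NL"}
WORK_HOURS = (time(8, 0), time(18, 0))
ALLOWED_BROWSERS = {"Chrome", "Firefox"}

def perimeter_allows(ip: str, country: str, browser: str, when: datetime) -> bool:
    """Return True only if every perimeter rule (IP, geo, time, browser) passes."""
    addr = ipaddress.ip_address(ip)
    return (
        any(addr in net for net in ALLOWED_NETWORKS)
        and country in ALLOWED_COUNTRIES
        and WORK_HOURS[0] <= when.time() <= WORK_HOURS[1]
        and browser in ALLOWED_BROWSERS
    )

print(perimeter_allows("203.0.113.7", "US", "Chrome", datetime(2016, 6, 8, 10, 30)))   # True
print(perimeter_allows("198.51.100.1", "US", "Chrome", datetime(2016, 6, 8, 10, 30)))  # False
```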

SSO1 supports integration with multiple cloud-based applications, using standard SAML-based integration to connect to them. Currently SSO1 supports integration with Google For Work, Salesforce, Zoho, Dropbox, and Freshdesk; the roadmap calls for covering most of the popular cloud applications within the next three months. Another important feature is the mapping of multiple users to a single account in a cloud application.
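Since SSO1’s integrations are SAML-based, a brief sketch may help: in SAML SSO, the service provider sends the identity provider an XML AuthnRequest and receives a signed assertion back. The snippet below builds a minimal AuthnRequest with the Python standard library; the entity ID and URLs are placeholders, not actual SSO1 or vendor endpoints.

```python
import base64
import datetime
import uuid
import zlib
from xml.etree import ElementTree as ET

SAMLP = "urn:oasis:names:tc:SAML:2.0:protocol"
SAML = "urn:oasis:names:tc:SAML:2.0:assertion"

def build_authn_request(sp_entity_id: str, acs_url: str, idp_sso_url: str) -> str:
    """Build a SAML 2.0 AuthnRequest and encode it for the HTTP-Redirect binding."""
    req = ET.Element(f"{{{SAMLP}}}AuthnRequest", {
        "ID": f"_{uuid.uuid4().hex}",
        "Version": "2.0",
        "IssueInstant": datetime.datetime.utcnow().strftime("%Y-%m-%dT%H:%M:%SZ"),
        "Destination": idp_sso_url,
        "AssertionConsumerServiceURL": acs_url,
        "ProtocolBinding": "urn:oasis:names:tc:SAML:2.0:bindings:HTTP-POST",
    })
    issuer = ET.SubElement(req, f"{{{SAML}}}Issuer")
    issuer.text = sp_entity_id
    xml = ET.tostring(req)
    # HTTP-Redirect binding: raw DEFLATE, then base64 (and URL-encode in practice).
    deflated = zlib.compress(xml)[2:-4]
    return base64.b64encode(deflated).decode()

# Placeholder endpoints for illustration only.
print(build_authn_request("https://sp.example.com/metadata",
                          "https://sp.example.com/acs",
                          "https://idp.example.com/sso")[:60], "...")
```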

Source: CloudStrategyMag

Egenera Partners With Portland Europe To Distribute Xterity Cloud Services

Egenera has announced it has signed a strategic partnership with Portland Europe, a leader in providing high-quality, easy-to-use software and cloud solutions for business use. Portland will deliver Egenera’s Xterity wholesale managed cloud services to its SME customers in the Benelux region.

Portland Europe is one of the leading distributors for value-added resellers and managed IT services providers in the Benelux countries, specializing in delivering IT, cloud, and managed services. Portland provides an extensive toolbox of on-premises and cloud solutions to partners with thousands of end users. The distribution partnership with Egenera comes as Portland Europe continues to help companies embrace the full potential of the cloud.

“The cloud is the way forward for SMEs to take advantage of hosting applications, disaster recovery, backup, and more,” said Kim van Brugge, managing director, Portland Europe. “We are happy to be teaming with Egenera and to continue to strengthen our network of suppliers to ensure we provide the most state-of-the-art technology to enable seamless cloud migrations.”

Egenera’s Xterity Cloud Services deliver a full range of dedicated, managed, private, and public cloud services, including Infrastructure as a Service (IaaS), Disaster Recovery as a Service (DRaaS), Backup as a Service (BaaS), and CloudMigrate™, exclusively through the channel. Xterity’s business continuity services deliver on-premises server-to-cloud or cloud-to-cloud backup and disaster recovery. With Xterity, resellers can quickly enter the cloud services market with no up-front capital or ongoing maintenance costs. Unlike reselling cloud services from large, commodity cloud vendors, Xterity delivers the margins resellers need to develop a profitable cloud services business.

“To say that we are happy to see Xterity offered in the Benelux region is an understatement. Our new partnership with Portland is a key milestone in the adoption of Xterity worldwide,” said Till Brennan, vice president, EMEA, Egenera. “We’re excited about working with Portland to deliver our full range of cloud services to its SME customers.”

Source: CloudStrategyMag

LATAS is the air traffic controller for drones

It’s likely that hundreds of drones will be flying in our skies over the next couple of years. So how do we keep track of them?

I caught up with the vice president of Airspace Services at PrecisionHawk to talk about LATAS, the company’s new drone communication service.

Question 1: What is LATAS?
LATAS is our technology…our platform of essentially different data layers of what’s around the drone that I need to know to be able to fly the drone safely. We are taking lots of satellite data that we collect via satellites and we are processing that data into a very high resolution, 3D map of the Earth. So now I know what’s on the ground. I know where the buildings are, I know where the trees are, where the power lines are. So that I can take that data in conjunction with airspace data and manned air traffic data so that I can understand what’s around the drone. And if I can understand what’s around the drone, I can understand what that risk is.

Question 2: How does it work?
All the software we’re using to process this data has been developed in-house. We’re working with a number of our partners, such as Digital Globe, who owns a number of satellites, and what we’re actually doing is we’re pulling in satellite data, we’re processing that satellite data and turning that 2D image into a 3D model. And we can see where the buildings are, how tall the buildings are, where the trees are. So now we are taking that data from the satellites and using it in a new way.

Question 3: Why do this?
From our company’s background, traditionally we work with large enterprise companies. In the agriculture space, in the energy space, insurance space. They wanna roll out very large fleets of drones. They wanna put them in the hands of agronomists, they wanna put them in the hands of insurance adjusters, who may not be pilots. So we’re developing technologies, and we were pushed by those partners to develop technologies like LATAS to help them mitigate their own risks so that they could understand how to fly safely. So when they put a drone in the hand of an agronomist, we would know that that drone won’t run into things on the ground and won’t run into things in the air because we know where those things are.

Question 4: What’s next for LATAS?
Our theme is ‘just fly.’ You should be able to have the technology in the drone itself to allow you to go out and fly that drone safely from day one. Whether you’re an amateur or a professional going out and doing a commercial job. So we’re working with a number of different drone manufacturers today, to integrate this technology into their drones, from the recreational drones, all the way up to the expensive commercial drones because everybody needs the same data. We understand what’s around us, so that we don’t run into things. And we mitigate that risk and make flying the drone safer.

Source: InfoWorld Big Data

Real or virtual? The two faces of machine learning

There’s a lot of sci-fi-level buzz lately about smart machines and software bots that will use big data and the Internet of things to become autonomous actors: scheduling your personal tasks, driving your car or a delivery truck, managing your finances, monitoring and adjusting your medical care, building and perhaps even designing cars and smartphones, and of course connecting you to the products and services they decide you should use.

That’s Silicon Valley’s path for artificial intelligence/machine learning, predictive analytics, big data, and the Internet of things. But there’s another path that gets much less attention: the real world. It too uses AI, analytics, big data, and the Internet of things (aka the industrial Internet in this context), though not in the same manner. Whether you’re looking to choose a next-frontier career path or simply understand what’s going on in technology, it’s important to note the differences.

A recent conversation with Colin Parris, the chief scientist at manufacturing giant General Electric, crystallized in my mind the different paths that the combination of machine learning, big data, and IoT is on. It’s a difference worth understanding.

The real-world path

In the real world — that is, the world of physical objects — computational advances are focused on perfecting models of those objects and the environments in which they operate. Engineers and scientists are trying to build simulacra so that they can model, test, and predict from those virtual versions what will happen in the real world.
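As a toy illustration of that modeling idea (not GE’s actual digital-twin technology, which the article does not detail), the sketch below keeps a simple virtual model of a physical asset in step with noisy sensor readings and flags divergence:

```python
import random

# Assumed model parameters for a toy asset whose temperature rises with load
# and cools toward ambient; a real twin would be calibrated from fleet data.
AMBIENT, HEAT_PER_LOAD, COOLING_RATE = 20.0, 0.5, 0.1

def model_step(temp: float, load: float) -> float:
    """One step of the virtual model: heat with load, cool toward ambient."""
    return temp + HEAT_PER_LOAD * load - COOLING_RATE * (temp - AMBIENT)

twin, truth = 20.0, 20.0
for step in range(50):
    load = 8.0 if 10 <= step < 30 else 2.0
    truth = model_step(truth, load) + random.gauss(0.0, 0.1)   # physical asset
    sensor = truth + random.gauss(0.0, 0.3)                    # noisy IoT reading
    twin = model_step(twin, load)                              # virtual prediction
    residual = sensor - twin
    twin += 0.2 * residual        # crude filter: nudge twin toward observation
    if abs(residual) > 2.0:
        print(f"step {step}: model and sensors diverge by {residual:.1f} deg")
```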

U.S. Closely Eyeing China's Corporate Hacking Vow, Official Says

By Nafeesa Syeed

(Bloomberg) — It’s too early to proclaim a U.S.-Chinese agreement to curb the theft of corporate trade secrets a success, according to the chief cyber diplomat at the State Department.

Nine months after Chinese President Xi Jinping and U.S. President Barack Obama vowed that they wouldn’t condone hacking to steal commercial secrets, the U.S. is closely monitoring whether China carries out any “intrusions and theft of intellectual property,” Christopher Painter, the department’s coordinator for cyber issues, said in an interview.

While progress has been made, “the jury is still out. We’re looking very carefully, we’re continuing to watch this,” Painter said in Washington on Wednesday. “We haven’t taken any of the tools we have off the table, but we’re very serious in making sure that this commitment is upheld.”

Before Xi and Obama reached their accord on corporate hacking last year, the U.S. said it was considering economic sanctions on Chinese individuals and companies in response to a string of cyber attacks against American businesses and government agencies. In 2014, the U.S. indicted five Chinese military officials on charges that they stole trade secrets from companies including Westinghouse Electric Co. and United States Steel Corp.

After meeting with Xi, Obama pointedly said he hadn’t ruled out resorting to sanctions if their agreement was violated. China has denied being involved in hacking and has said it’s a victim of cyber espionage.

‘Absolute Sovereignty’

Ahead of an annual gathering of U.S. and Chinese officials in Beijing, the U.S. is trying to “mainstream” cybersecurity as a foreign policy issue and seeking to create standards of acceptable behavior, Painter said. He’ll join Secretary of State John Kerry at the June 5 to 7 meeting, where cyber issues will be just one part of the agenda.

“China promotes absolute sovereignty in cyber space; they want to draw borders around their cyber space,” Painter said. “We think sovereignty has a role, but absolute sovereignty doesn’t have a role. There are internationally recognized human rights that transcend national borders.”

Cyber Diplomacy

Painter, a 58-year-old former prosecutor who worked on cyber policy on the National Security Council, said his “cyber diplomacy” also extends to Russia, whose state-linked and organized crime groups are often blamed for hacking attacks.

“We have very different views of the world with the Russians, in terms of how they look at cyber space and the fact they want more state control,” Painter said. “We are looking for ways that we can avoid the inadvertent escalation and keep conflict from happening, so there’s some common ground there.”

By contrast, there’s no “cyber dialogue” with Iran, Painter said. He wouldn’t discuss specific cases but said the U.S. puts “countries on notice of conduct that we think is unacceptable.”

The U.S. faces a range of online threats, from nation-states to “lone gunman hackers” as well as terrorists, he said.

“Terrorists have used the internet to recruit and to spread their messages,” he said. “They haven’t attacked infrastructure yet – but we’re worried about that.”

The U.S. also continues to worry about the economic effects of trade-secret theft, Painter said. In the private sector, technology and financial businesses have more robust cybersecurity, while others, such as manufacturers, are playing catch-up, he said.

“Companies are beginning to see this is a big challenge for them, because it’s the bottom line. If their trade secrets are leaving the door, that’s their future,” he said. “Some of the other sectors who haven’t dealt with this day-in and day-out are still trying to find good policies.”

Source: TheWHIR

AWS Recovers After Cloud Downtime Hits Sydney

Amazon Web Services customers in Australia suffered cloud downtime on Sunday, when nasty weather ravaged Sydney and knocked out power for more than 85,000 homes and businesses.

While many seem to suspect the weather is to blame for the AWS cloud downtime in its Sydney region, the company has not confirmed whether that is the case. The issues were resolved on Monday morning.

READ MORE: The Cloud: Understanding Resiliency and Outages

AWS said a significant number of EC2 instances and EBS volumes within its Sydney region were impacted by connectivity issues.
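As a hedged illustration of how a customer might detect this kind of impairment themselves (assuming configured AWS credentials; this is not how AWS reported the incident), a short boto3 sketch:

```python
import boto3

# Sydney is AWS's ap-southeast-2 region, the one affected by this outage.
ec2 = boto3.client("ec2", region_name="ap-southeast-2")

def impaired_instances() -> list[str]:
    """Return IDs of running instances whose status checks report 'impaired'."""
    bad = []
    paginator = ec2.get_paginator("describe_instance_status")
    for page in paginator.paginate():
        for status in page["InstanceStatuses"]:
            if "impaired" in (status["InstanceStatus"]["Status"],
                              status["SystemStatus"]["Status"]):
                bad.append(status["InstanceId"])
    return bad

if __name__ == "__main__":
    print("impaired:", impaired_instances())
```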

While many cloud outages fly under the radar, AWS outages – regardless of length – can have a big impact because of the size and scope of AWS and its high-profile clients. Last year, an AWS outage in Northern Virginia impacted the services of Heroku and Netflix.

A number of businesses in Australia were impacted by the AWS outage, including Domino’s Pizza, and TV streaming services Foxtel Play and Stan.

All AWS services in Australia were operating as normal on Monday, including Amazon CloudSearch, Amazon CloudWatch, and Amazon EC2 Container Service.

Source: TheWHIR

IBM joins R Consortium

IBM is joining the R Consortium with a “significant investment,” the company is scheduled to announce at today’s Apache Spark Summit, becoming a top-tier Platinum supporter of the open-source R programming language.

R, designed specifically for statistical computing and other data analysis tasks, has become increasingly popular in recent years as both data volumes and interest in data science have exploded. IBM says that R is among the languages it used to develop its Watson natural language/machine learning platform.

Dinesh Nirmal, IBM vice president of development for next-generation analytics platform and big data solutions, will join the R Consortium board of directors.

“IBM is deeply invested in open-source software for computing applications like data science,” Nirmal said in a statement released by the Consortium. “And as a long-time member of The Linux Foundation, it’s a natural fit for us to extend our commitment to collaborative development by joining the R Consortium.”

Cloudability Gets $24 Million to Help Enterprises Save on Cloud Expenses

Cloud cost monitoring platform Cloudability announced on Monday that it has closed a $24 million Series B round of financing, led by the Foundry Select Fund.

According to a blog post by Mat Ellis, Cloudability CEO and founder, the funding will be used to invest in the product, which aims to create more transparency around cloud cost and utilization. Ellis said these improvements will “transform a company’s mountain of billing data into actionable insights to help them build bigger and more complex clouds with confidence and control.”

READ MORE: Startup That Helps AWS Users Reduce Cloud Costs Raises $2 Million

Planning for Cloud Backup: Working with the Right Provider

This post is the second part of a two-part series. Click here to read part one, Planning for Cloud Backup: Best Practices and Considerations.

The pace of cloud adoption is blistering. We’re seeing new services and offerings diversify the competitive landscape and give organizations and users many new options. In fact, Gartner recently projected that the worldwide public cloud services market will grow 16.5 percent in 2016 to total $204 billion, up from $175 billion in 2015. That’s a lot of growth.

In our previous post, we discussed a few considerations and best practices for cloud backup solutions. Now we turn to working with the right kind of provider. Please remember, it’s not always about cost. Rather, your provider must align with your business and your strategy. This means services that are easy to use, compatible with your systems, and easy to manage. The last thing any organization wants is to experience outages due to poor cloud partner integration.

READ MORE: Understanding Cloud Backup and Disaster Recovery

Consider this: Ponemon Institute and Emerson Network Power have just released the results of the latest Cost of Data Center Outages study. Previously published in 2010 and 2013, this third study continues to analyze the cost behavior of unplanned data center outages. According to the new study, the average cost of a data center outage has steadily increased from $505,502 in 2010 to $740,357 today (a 38 percent net change). With this in mind, working with a good cloud backup provider not only helps mitigate outage risk and its associated costs; it also lets you be far more flexible with your cloud-based data.

And so, when it comes to cloud backup and planning, it’s important to understand which product and vendor to work with. Remember, every environment is unique, so each organization’s requirements will differ. Still, there are some important considerations:

  • System compatibility. When working with a cloud backup solution, take the time to verify that all systems being backed up are compatible. For example, if snapshots are being taken of a virtual environment, can those snapshots then be used to revert or recover the VM? Can they scale into other cloud systems or even an on-premises data center? Will the snapshot capture only an image of the VM’s storage and nothing else? The same compatibility questions apply to other systems within an organization: databases, file servers, applications, and more. During the planning process, IT teams will need to work with the cloud backup vendor to ensure that their systems are compatible and can be backed up with the desired functionality. Remember, the goal isn’t only to back up the data. One of the biggest benefits of a modern cloud-based backup solution is the fast turnaround of data recovery, so administrators must be sure that their data is not only backed up but can be recovered quickly and reliably (a minimal restore-verification sketch follows this list).
  • Ease of use and training. When working with a cloud backup solution, ensure the product is relatively easy to use. Administrators must be able to perform daily tasks to confirm that their data is being backed up safely and normally. There will be times when training is needed to embed the technology within the organization; this effort is worthwhile, since data backup and recovery are vital parts of an IT environment.
  • Management tools and feature considerations. Depending on the cloud platform chosen, there will be many feature considerations for the product. As mentioned earlier, data deduplication/compression, encryption, compliance support, VM compatibility, and archiving are just a few examples; others include direct virtual-environment integration or cloud-ready disk-based backup solutions. When selecting the right technology and vendor, the organization should first establish the business case and need for a given product and its features, and then become familiar with the management tool set. Although the native tools offered within a product are powerful, there may be a need for third-party offerings as well. Because cloud is a distributed ecosystem, it’s important to consider multi-site deployments: when working with numerous sites and environments, management tools go a long way toward ensuring that data is backed up normally and efficiently. That means proper data usage, minimal resource waste, and direct visibility into the cloud backup environment.
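As promised above, a minimal restore-verification sketch: back up a file tree, restore it to a scratch location, and compare SHA-256 checksums. The copy-based “backup” and the paths are stand-ins for a real cloud backup API.

```python
import hashlib
import shutil
from pathlib import Path

def checksum_tree(root: Path) -> dict[str, str]:
    """Map each file's path (relative to root) to its SHA-256 digest."""
    return {
        str(p.relative_to(root)): hashlib.sha256(p.read_bytes()).hexdigest()
        for p in sorted(root.rglob("*")) if p.is_file()
    }

def verify_restore(source: Path, restored: Path) -> bool:
    """A restore is only good if every file comes back bit-for-bit identical."""
    return checksum_tree(source) == checksum_tree(restored)

# Stand-in for a real backup/restore cycle against a provider's API.
source, backup, restored = Path("data"), Path("backup_copy"), Path("restore_test")
source.mkdir(exist_ok=True)
(source / "db.dump").write_bytes(b"example payload")
shutil.copytree(source, backup, dirs_exist_ok=True)      # "back up"
shutil.copytree(backup, restored, dirs_exist_ok=True)    # "restore"
print("restore verified:", verify_restore(source, restored))
```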

Whenever cloud and backup solutions are under discussion, consider all infrastructure components that the deployment may affect. Your backup and cloud partner must understand this and be part of the process. That means compatibility, an understanding of the technology, and awareness of how a backup routine may affect other parts of the infrastructure. A backup job may at times require higher amounts of bandwidth or an appropriately sized data store; these planning points must be addressed prior to any major rollout. Remember, depending on the environment, there may be options for a pilot or proof of concept (POC). A small-scale rollout of a cloud backup solution can reveal weaknesses that are quickly resolved prior to a large deployment, and a good cloud partner can always help there as well.

Source: TheWHIR