Cloud hosting provider Linode is receiving praise for its handling of a string of recent security attacks, but some customers are concerned enough to consider other options.

Linode Manager passwords expired last week, and users were prompted to set new passwords after an investigation found unauthorized logins into three accounts. The reset came on top of ongoing distributed denial-of-service (DDoS) attacks the cloud provider faced in its data centers. Linode also faced downtime earlier in 2015, when it had to reboot servers to deal with security issues around Xen.
The overwhelming majority of Linode cloud customers are supportive, according to the company, but two users said the string of attacks has them looking elsewhere.
The series of attacks was "quite a big deal" for Dallas-based consulting company and Linode cloud customer etc.io, which suffered several outages as a result of the attacks, said chief advocate E.T. Cook. Etc.io uses a variety of cloud providers, but Linode has been its go-to platform.

"They've been transparent, and although we feel for Linode in the DDoS situation and won't be abandoning them, we're beginning to look at diversifying and having failovers outside of Linode for all of our primary properties," Cook said.
Making such a move will create challenges, particularly around database replication, but he's convinced it must be done.
Munzee Inc., a McKinney, Texas-based scavenger hunt game maker with workloads hosted in Linode's Atlanta facility, said in a blog post that the attacks lasted 10 days before finally stopping on Jan. 3.

The worst of it came the weekend prior, with intermittent uptime leaving its apps, websites and stores down for the bulk of that time period. The post also said Munzee was taking steps to prevent similar downtime in the future, including hosting servers in multiple data centers or with multiple companies, and possibly changing providers. In an email to SearchCloudComputing, Scott Foster, vice president of technology at Munzee, said the company was making the move from Linode to Amazon Web Services.
An investigation found unauthorized logins into three accounts in the Linode cloud. Two Linode.com user credentials were used on an external machine -- meaning they might have been read from Linode's database, either online or offline, Linode said.

There was no indication that any customers' information was accessed, but it's possible that usernames, email addresses, securely hashed passwords and encrypted two-factor seeds could have been read from the user table of its database, according to Linode.

Users of the three potentially affected customer accounts were immediately notified, and no additional evidence was found of access to the vendor's infrastructure. An unnamed third-party security firm has been brought in to help with the investigation.
Linode has handled things well, based on the information the company has made available, with an appropriate level of transparency regarding what occurred and the steps taken, said Adrian Sanabria, senior security analyst at 451 Research. Linode also has been smart not to disclose information that customers don't need to know, such as the name of the firm it has engaged to assist with the investigation, he added.

"It's nice to see that they're not running all over social media waving a Mandiant-branded flag, or denying responsibility because the attacker was 'super advanced' or 'sophisticated,'" Sanabria said.
The DDoS attacks started on Dec. 25, and over the subsequent week the Linode cloud faced more than 30 attacks of what the firm called "significant duration and impact."

Linode says it has no information about who is behind the attacks or whether the attacks are connected. The company is working with law enforcement officials and plans to provide a full technical explanation of the incidents once the attacks stop.

Past examples of DDoS attacks have run concurrent with fraud. Investigators will explore any possible connections, but those will be difficult to prove, said Robert Westervelt, research manager at IDC Research.

"Identifying a threat actor is extremely difficult, and connecting them to multiple incidents further complicates the issue," Westervelt said.
While acknowledging that the unauthorized logins to three customer accounts are troubling, Westervelt agreed that Linode appears to be responding appropriately. Users are known to adopt poor password practices, but the company followed accepted best practices around securely hashing passwords and encrypting two-factor seeds.
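Linode hasn't published the details of its hashing scheme, but the best practice Westervelt describes -- storing only a salted, slow hash rather than the password itself -- can be sketched with Python's standard library. The iteration count and salt size here are illustrative assumptions, not Linode's actual parameters:

```python
import hashlib
import hmac
import os
from typing import Optional

ITERATIONS = 200_000  # illustrative work factor, not Linode's setting


def hash_password(password: str, salt: Optional[bytes] = None) -> tuple:
    """Return (salt, digest); only these are stored, never the password."""
    salt = salt if salt is not None else os.urandom(16)  # random per-user salt
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return salt, digest


def verify_password(password: str, salt: bytes, expected: bytes) -> bool:
    """Recompute the hash and compare in constant time."""
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return hmac.compare_digest(digest, expected)
```

Because only the salt and digest land in the user table, an attacker who reads that table still has to brute-force each password individually, which is why a leaked table of securely hashed passwords is far less damaging than a leaked table of plaintext ones.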
Stolen passwords are one of the top risks for cloud services providers, according to Westervelt. One great way [to address this] is to add multi-factor authentication, he said. "Most providers offer it as an optional capability if customers desire that level of protection."
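The "two-factor seeds" mentioned above are the shared secrets behind time-based one-time passwords (TOTP, RFC 6238), the most common form of this multi-factor protection. A minimal sketch of how a provider and an authenticator app derive the same six-digit code from that seed -- the standard algorithm, not Linode's implementation:

```python
import hashlib
import hmac
import struct


def totp(seed: bytes, unix_time: int, step: int = 30, digits: int = 6) -> str:
    """Derive a time-based one-time password from a shared seed (RFC 6238)."""
    counter = struct.pack(">Q", unix_time // step)   # 8-byte big-endian time step
    digest = hmac.new(seed, counter, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                       # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)
```

Anyone holding the seed can generate valid codes for that account, which is why providers encrypt stored seeds at rest -- and why Linode was careful to note that the seeds in its user table were encrypted.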
Another common method for attackers to gain access is through chinks in the Web-based management system software, which could have vulnerable components, Westervelt said.

"For Linode, providing transparency about its actions to contain the threat, and any remediation steps it has taken, is vital for it to maintain the trust of its customer base," Westervelt said.

Speed bumps abound on the IoT Hub road
One source familiar with the company's plans, however, said the product has run into snags during its public preview -- particularly around capacity and scalability. While multiple capacity units can be stacked on top of IoT Hub, Microsoft calls out a limit of 6 million messages per unit, per day. By contrast, archrival Amazon Web Services (AWS) claims support for billions of devices and trillions of messages on its IoT platform.

Microsoft's Lee denied that the product has run into architectural issues during the preview. He also acknowledged that Microsoft's IoT Hub is priced to accommodate fewer, larger messages than the AWS IoT system, which prices in 512-byte chunks.

While Microsoft has been trying to make hay against Amazon by claiming its overall IoT Suite has been available longer, AWS' IoT platform has natively included bidirectional communication since its release to general availability in December.

Despite the bidirectional capabilities of Amazon's IoT offering, when Dyck's firm evaluated offerings from Microsoft and Amazon, he said the company never seriously considered Amazon. He felt there wasn't anything compelling that "would have us reconsider or fundamentally change our strategy."

Both Dyck and Bass said their companies have successfully implemented remote monitoring for IoT through the Azure suite. In the United States, Bass said, safety standards prohibit the remote operation of elevators, so bidirectional communication in that case isn't relevant.
Google cloud features grow up in 2015, but work remains

Google continued to bolster its services in the extremely competitive -- if not very crowded -- market of global public cloud vendors, yet many of the concerns that lingered in 2014 remained in 2015.

Containers and big data were front and center in many of the upgrades to Google Cloud Platform this year. Google also continued to hammer home its message on lower pricing, made some interesting strategic partnerships and took steps to reverse its reputation for not catering to enterprises.

Despite the new Google cloud features, the service leaves some industry observers wanting more. Google is often viewed as the third hyperscale public cloud vendor, but market dynamics shifted in 2015, as Microsoft Azure pushed ahead as the primary alternative to Amazon Web Services (AWS).

Here are some of the most important improvements to Google Cloud Platform in 2015, and a few requests from industry observers on what they want to see in 2016.
Drilling down on cost with new instances
Better matching of services to customers' needs and optimization of their costs were major parts of Google's cloud plans this year.
Public cloud vendors offer dozens of machine types, but often those don't match precisely with on-premises configurations. With Google's Custom Machine Types, customers can choose how many CPUs and how much RAM each VM uses.
Preemptible VMs are Google's answer to AWS EC2 Spot Instances and provide access to instances at 30% of the standard cost. The use of these VMs, however, is largely limited to tests or stateless workloads without strong uptime requirements, as all instances will be terminated after 24 hours and can be preempted at any time.
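Workloads suited to these instances have to assume they can disappear mid-run. A hypothetical sketch of the usual pattern -- wrapping an idempotent task in a retry loop so a preempted attempt is simply restarted on a fresh instance -- where the `Preempted` exception stands in for whatever termination signal the platform actually delivers:

```python
class Preempted(Exception):
    """Stand-in for the platform's preemption notice (assumed for illustration)."""


def run_to_completion(task, max_attempts: int = 5):
    """Re-run an idempotent task until one attempt finishes without preemption."""
    last_error = None
    for _ in range(max_attempts):
        try:
            return task()            # normal completion ends the loop
        except Preempted as err:
            last_error = err         # instance was reclaimed; start a fresh attempt
    raise RuntimeError("task never survived a full run") from last_error
```

Checkpointing progress to durable storage between attempts keeps each restart cheap, which is what makes the 70% discount worthwhile for batch-style work.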
"Innovative price schemas, like sustained use discounts and per-minute billing, provide flexibility to satisfy specific user needs and attract customers with overall savings," said Jillian Freeman, senior analyst with Technology Business Research.
Cold storage is an attractive option for cloud consumers tired of archiving data to tape, but the cheap alternative still has its limitations in terms of the cost and time to retrieve that data. That's where the company sees Google Cloud Storage Nearline fitting in, as it offers competitive pricing, with data retrieval in three seconds rather than hours or days.
Big data, big plans
Google made a number of moves around big data in 2015, including the beta release of Cloud Dataproc, the general availability of Cloud Dataflow and the second generation of Cloud SQL, which promises better pricing and performance for managed MySQL databases.

The Google cloud features for big data are its real strength, said Sudhir Hasbe, vice president of software engineering at Zulily, an e-commerce site based in Seattle.

Dataflow, a managed service for real-time processing of batch and streaming data, went into general availability in August after a lengthy beta. Zulily used to run its stream processing through Hadoop in Google Cloud Platform, but made the switch to Dataflow early on.

"The great thing about that is it's completely run on its own infrastructure, and it only uses what's required," Hasbe said.

Hasbe is also high on Dataproc, a managed Spark and Hadoop big data service released in beta in September, as it dynamically scales as required. And he would like to see further investment in the product in 2016.

And in fitting with its price-cutting mantra, Google addressed customers' cost concerns around BigQuery in December with custom quotas to set daily limits projectwide or per user.
OpenStack hasn't made much headway in powering large-scale public clouds, but the open source technology has found a niche with large enterprises and tech vendors -- particularly for private clouds. So, it was a big step when Google became an official sponsor and the first major public cloud vendor to support the project.

Google's support "changed the game" for OpenStack, TBR's Freeman said. Google is betting heavily on its popular open source container orchestrator, Kubernetes, and this allows the company to integrate it with OpenStack and potentially link with other vendors in the ecosystem.
"While we're not seeing mass adoption of OpenStack yet, portability and interoperability are high on cloud wish lists," Freeman said.
"Google was highly strategic in allowing its technology to better work with the broader hybrid IT landscape," she said.

In another interesting partnership, Google offered some of its big data and storage services to customers of VMware's vCloud Air. Google also hired VMware co-founder Diane Greene to head its cloud platform -- a move seen as a good step to improve Google's standing with enterprises.

Although Kubernetes can be used in myriad environments, it appears to be a big part of Google's cloud ambitions and gained momentum in 2015.

Some of the best improvements to Google's cloud this year happened around containers, said Dave Bartoletti, principal analyst with Forrester Research. He cited the smoother pod autoscaling with Kubernetes and improved networking of Container Engine, as well as the improved performance and security of Container Registry.
Wishes for 2016
Despite the progress this year, customers and analysts want to see more features from Google to lure in a broader audience.

"I'm hoping to see more hybrid cloud-enabling features from Google to smooth the transition to cloud from a virtualized data center, for enterprise customers and especially IT operations teams who may not have a relationship with Google," Bartoletti said.