CoreOS brings a different approach to container security

Containers have been one of the most talked-about technologies in IT since Docker burst onto the scene in 2013 and released its first commercial version a little over a year later. CoreOS garnered attention in late 2014 when it released rkt while criticizing the security of Docker as a container engine. CoreOS CEO Alex Polvi has raised concerns about the Docker model, which requires most operations to run through the Docker daemon -- a view he maintains with the 1.0 release. "Without a rewrite of Docker, it will forever be a major area of security issues," he said. "We built it to deal with an architectural issue that cannot be addressed with a lightweight patch to Docker." According to Polvi, rkt follows the Unix philosophy of privilege separation. Users have the option of eliminating the need to run an API server as root, or to reach out to the network to upload and download images.

On the same day rkt 1.0 was released, Docker 1.10 was made available, with a significant focus on container security and more fine-grained access control. Docker declined to comment specifically on the CoreOS claims. Both companies take security very seriously, despite coming at it from different perspectives, explained Fintan Ryan, an analyst with RedMonk, based in Portland, Maine. Customers will pick the option that best fits their needs, but a fairer comparison -- and more intense competition -- will come with the software that sits on top of containers. "The market is going to be absolutely huge for all these things, so there will definitely be a few different ways to do it," Ryan said. Docker and CoreOS are fighting for the same IT dollars, but they're also working together, alongside some of the biggest tech vendors in the world, to establish a standard around container formats and runtimes through the Open Container Initiative.

Analyst firm 451 Research asked 198 senior IT pros who their primary container supplier is, with 64% saying Docker, compared with only 10% for rkt, according to the New York-based company's third-quarter 2015 edition of its Voice of the Enterprise survey on cloud computing. When a new technology as popular as Docker comes along, the door opens for alternatives in the marketplace, said Jay Lyman, research manager at 451. Rkt has helped keep Docker honest in its progression and promoted a greater focus on container security. "This is the classic open-source software competitor disciplining the other," Lyman said. "It helps Docker and helps rkt when there's more than one viable alternative."

Analytics as a service offers hindsight, insight, and foresight

When Time Warner Cable wanted to gain deeper insight into the cable TV preferences of its 15 million subscribers, the media giant turned to the science of cloud-based analytics as a service. "We needed to get a better understanding of customers' behaviors within different demographic segments of our audience," said Jeff Henshaw, TWC's senior director of digital marketing and analytics.

With its ability to quickly scrutinize huge volumes of data, analytics has become critical to strategic decision making. Consequently, it's no surprise that companies are ramping up spending to leverage the technology. In a November 2015 study, research firm IDC said worldwide spending on analytics services is expected to jump from $58.6 billion in 2015 to $101.9 billion in 2019 -- a compound annual growth rate (CAGR) of 14.7%. Writing in the study, Ali Zaidi, IDC's research manager for IT consulting and systems integration services, pointed to two key factors driving that growth: adoption of new technologies and a shortage of in-house analyst expertise.

TWC's challenge was to understand which cable TV packages its customers preferred based on age, income, geography, family size, presence of children and other factors. The company also wanted to correlate that information with the online behavior of its customers -- whether they use TWC's smartphone app or a browser, follow their navigation paths from page to page, tally how long they linger on individual screens, note whether they complete a sale and even identify where users were if they chose to abandon their sessions.

The goal, Henshaw said, was to get answers to specific business questions and get to the heart of data-driven targeted marketing. In turn, that would help TWC adjust the product mix within its various cable TV packages, launch promotions, improve app navigation and even reduce telephone calls to its customer service representatives.

Four types of analytics

Data-driven cloud analytics as a service (AaaS) is generally divided into four types, spanning a value spectrum that moves from hindsight to foresight. Descriptive analytics, the simplest to implement, addresses the "what happened" question and provides rearview-mirror hindsight into past activities. According to CI&T, a global IT consultancy headquartered in Brazil, about 35% of companies it surveyed do descriptive analytics on a consistent basis.

Diagnostic analytics, one step up in terms of both implementation effort and value, provides a deeper dive into past data to address the "why did it happen" question. In its own tutorial on analytics types, IT distributor Ingram Micro said one common use is the aggregation of the many aspects of a social media marketing campaign into a view that shows what worked successfully in the past. According to CI&T's survey, fewer than 5% of companies perform diagnostic analytics consistently. Predictive analytics, the next step up, shifts from pure hindsight into a blend of insight and foresight, answering the question "what might happen." Take the big data that has been amassed, correlate it with other available and appropriate data -- such as weather, geographic preferences or economic data -- and predictive analytics can project future outcomes. "You can never predict with certainty what will happen, only what might happen," said Judith Hurwitz, president of Hurwitz & Associates, a Needham, Mass., IT consultancy. Fewer than 1% of companies in the CI&T survey are using predictive analytics.
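The predictive step described above -- fit a model to the data you've amassed, then project forward -- can be sketched in a few lines of Python. This is a generic illustration with invented numbers, not TWC's or Adobe's method:

```python
# Illustrative predictive analytics: fit a linear trend to past
# observations and project the next one. Data and names are invented.

def fit_linear_trend(values):
    """Ordinary least-squares fit of y = slope * x + intercept,
    where x is the 0-based index of each observation."""
    n = len(values)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(values) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, values))
    var = sum((x - mean_x) ** 2 for x in xs)
    slope = cov / var
    intercept = mean_y - slope * mean_x
    return slope, intercept

def forecast(values, steps_ahead=1):
    """Project the fitted trend steps_ahead observations past the data."""
    slope, intercept = fit_linear_trend(values)
    return slope * (len(values) - 1 + steps_ahead) + intercept

# Six months of (invented) app-session counts, trending upward.
sessions = [100, 110, 118, 131, 140, 152]
print(round(forecast(sessions)))  # a "what might happen" estimate for month 7
```

As Hurwitz's caveat suggests, the output is only a projection of what might happen; real implementations would also quantify the uncertainty around the estimate.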

Prescriptive analytics is the final type, offering the highest value, but at the cost of implementation complexity. It attempts to answer the question of "what should we do" by providing multiple options for handling business situations. To differentiate predictive from prescriptive, research firm Gartner said predictive analytics can forecast when a machine might break down, while prescriptive analytics might suggest preventative maintenance actions. Though implementations of prescriptive analytics are uncommon, IDC, in its Worldwide Big Data and Analytics 2016 Predictions, said that by 2020 fully half of all business analytics software will incorporate prescriptive capabilities.
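Gartner's machine-maintenance example shows how the prescriptive layer sits on top of a predictive output: a predicted failure probability is mapped to a recommended action. A minimal sketch, with thresholds and wording that are purely illustrative assumptions:

```python
# Illustrative prescriptive analytics, following Gartner's example:
# a predicted machine-failure probability (the predictive output) is
# mapped to a recommended maintenance action. Thresholds and action
# texts are invented for illustration.

def recommend_action(failure_probability):
    """Turn a predicted failure probability into a suggested action."""
    if failure_probability >= 0.7:
        return "schedule immediate preventative maintenance"
    if failure_probability >= 0.3:
        return "inspect at next planned downtime"
    return "no action; keep monitoring"

print(recommend_action(0.82))  # -> schedule immediate preventative maintenance
```

Real prescriptive systems typically return several ranked options with expected outcomes rather than a single rule-based answer, which is part of why they are the most complex of the four types to implement.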

Where does TWC fall across this spectrum? "We use all four analytics types in some capacity," Henshaw said. "If I had to choose, I would go with prescriptive." By getting to what he called "data-driven targeting," the company's analytics implementation -- cloud-based analytics as a service from Adobe Analytics -- provides TWC with suggestions for fine-tuning its offerings. Those adjustments eventually feed back, creating a continuous process of analysis. The analytics-as-a-service movement is spreading rapidly. In its December 2015 forecast, Research and Markets cited lower implementation costs, ease of customization and agility as factors in its prediction that the worldwide AaaS market will grow from $5.9 billion in 2015 to $22.24 billion in 2020 -- an impressive CAGR of 30.4%.
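Both growth figures cited in this article can be checked against the standard compound-annual-growth-rate formula, CAGR = (end / start) ** (1 / years) - 1:

```python
# Verify the cited compound annual growth rates.

def cagr(start, end, years):
    """Compound annual growth rate between two values over N years."""
    return (end / start) ** (1 / years) - 1

# IDC: analytics services spending, $58.6B (2015) -> $101.9B (2019).
print(f"{cagr(58.6, 101.9, 4):.1%}")  # ~14.8%, in line with IDC's cited 14.7%
# Research and Markets: AaaS market, $5.9B (2015) -> $22.24B (2020).
print(f"{cagr(5.9, 22.24, 5):.1%}")   # ~30.4%, matching the cited figure
```

The small gap on the IDC number likely reflects rounding in the dollar figures the firm published.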

SapientNitro, the digital subsidiary of marketing consulting company Sapient, is also a user of AaaS. The goal is to derive new insights from collected data, allowing the firm to recognize new marketing opportunities for its clients. "Analytics is a discipline that's driven by conjecture," said Simon James, global lead for performance analytics at SapientNitro. "Analytics is a means to an end, either to better performance or improved gross profit margins. You want more certainty of success, but every answer inevitably leads to more questions."

Data meets analytics, but where?

Like other IT systems moving from on-premises installations to the cloud, analytics is no different. Given that big data often already is cloud-resident, that's where the analytics should reside, too, according to Nik Rouda, senior analyst for big data and analytics at the market research firm Enterprise Strategy Group Inc., in Milford, Mass. "It's natural to bring the analytics to the data. You don't bring the data to the analytics; that would be slow and expensive." TWC stores its data on the servers of the Adobe Analytics service. Jim Comfort, head of cloud services at IBM, agreed. "It's one thing to simply store data in the cloud, but it's the analytics in the cloud that make data useful," he said. "If you want one, two or 20 different analytics approaches, you can easily do all of that with the flexibility and agility that a cloud services environment offers."

Speed is critical

For TWC, the main reason for implementing cloud-based analytics was to speed business decisions. "We needed our implementation to be nimble, mirror what the business wants to measure and measure against a specific set of goals -- that's top of mind." For developers, that meant using dynamic tag management, embedding tags or pixels on pages. This method frees engineers from an iterative development cycle as business requirements change. "Tag management is an extremely good way to get around that," Henshaw said.

Cloud containers race heats up between Amazon and Google

Amazon Web Services and Google are aggressively developing their cloud container services in a bid to capture enterprise app dev business.

The companies' cloud container services abstract elements of Docker container management away from users, making it easier to deploy and scale applications built on them. However, there are key differences between their maturing offerings, including where each has chosen to implement autoscaling, redundancy, and interoperability with third-party tools and clouds.

Autoscaling a key point of contention

Google Container Engine (GKE) consists of pods, replication controllers and nodes. Pods are a logical grouping of containers that model an application-specific logical host. Replication controllers ensure that a specified number of pod replicas are running at any one time. Nodes are the Google Compute Engine virtual machines that underpin the containerized environment.

GKE is based on Google's Kubernetes container orchestration platform. Kubernetes version 1.1, released Nov. 24, four months after 1.0 made its debut, was the first on the market to autoscale pods with horizontal pod autoscaling, a feature highly sought by users to justify many use cases for GKE. "We use the autoscaling quite a bit for all kinds of projects," said Tim Kelton, co-founder and head of cloud architecture for Descartes Labs Inc., a machine learning startup based in Los Alamos, N.M., which processes petabytes of satellite data. Autoscaling pods come in handy when running a large batch job, Kelton explained. At times, his company processes a petabyte of data, which requires scaling up to 30,000 cores. In the first release of Kubernetes -- which was incorporated soon after by GKE -- "that wasn't a part of the core feature set," he said.
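Horizontal pod autoscaling is configured declaratively in Kubernetes. A sketch of such a manifest, targeting a replication controller like the ones described above (names and bounds are illustrative, and the API group has moved between releases -- Kubernetes 1.1 exposed the resource under extensions/v1beta1):

```yaml
# Sketch of a HorizontalPodAutoscaler: keep between 2 and 30 replicas
# of the pods behind a replication controller, targeting 80% average
# CPU utilization. All names here are illustrative.
apiVersion: autoscaling/v1
kind: HorizontalPodAutoscaler
metadata:
  name: batch-workers
spec:
  scaleTargetRef:
    apiVersion: v1
    kind: ReplicationController
    name: batch-workers
  minReplicas: 2
  maxReplicas: 30
  targetCPUUtilizationPercentage: 80
```

A wide min-to-max range like this is what makes the large batch jobs Kelton describes practical: the controller adds pods as CPU load climbs and releases them when the job drains.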

GKE doesn't support vertical container scaling or node autoscaling, but these features are coming soon, according to David Aronchick, senior product manager for GKE, who also leads product management for Kubernetes. Amazon's EC2 Container Service (ECS), meanwhile, consists of services, tasks, and instances. Services are groups of tasks that make up an application, while instances are the Elastic Compute Cloud VMs that underpin the containers -- much like nodes in GKE. Amazon ECS' autoscaling capabilities are the inverse of GKE's: Services can be autoscaled using Amazon CloudWatch and Amazon Web Services (AWS) Lambda, and instances can be autoscaled based on CloudWatch metrics as well, but tasks -- the rough logical equivalent of pods -- can't be autoscaled. While all the types of autoscaling are important, Amazon users want task autoscaling added to ECS.
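The CloudWatch-plus-Lambda pattern for scaling ECS services amounts to a function that turns a metric reading into a new desired task count. A minimal sketch of that decision logic (thresholds, bounds and names are assumptions for illustration, not Amazon's implementation):

```python
# Sketch of the scaling decision a Lambda function might make when a
# CloudWatch alarm fires for an ECS service. Thresholds, bounds and
# names are illustrative assumptions.

def next_desired_count(current, cpu_percent, low=20, high=75,
                       minimum=1, maximum=10):
    """Scale out when CPU is high, scale in when low, within bounds."""
    if cpu_percent >= high:
        return min(current + 1, maximum)
    if cpu_percent <= low:
        return max(current - 1, minimum)
    return current

# In a real Lambda handler, the result would then be applied with the
# boto3 ECS client's update_service call (cluster, service, desiredCount).
print(next_desired_count(4, 82))  # -> 5, one more task under high load
```

This indirection through CloudWatch and Lambda is exactly what the article means by ECS autoscaling services rather than tasks: the unit being adjusted is the service's desired count, with no per-task equivalent of Kubernetes' pod autoscaler.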