AI and machine learning are everywhere these days, so it is no wonder they are building a stronger presence in the DevOps world. They are transforming DevOps by automating tasks and making the software development life cycle more efficient, insightful, and secure across enterprises. We discussed this briefly in our DevOps trends article, and now it is time to deepen our perspective on AI and machine learning in DevOps. Let’s start with some trends and numbers that highlight this modern technology.
AI and Machine Learning Trends
There is no doubt that AI and machine learning have gradually increased in popularity and are becoming valuable skills. As shown in our recent post about DevOps trends, about 40% of DevOps teams will augment monitoring with artificial intelligence for IT Operations by 2023. In addition, Deloitte predicted that the global market for AI custom application development services will grow to more than $61 billion in 2023. With projected market growth of this magnitude over the next several years, knowing how to work with AI and machine learning will be a profitable skill to possess.
Machine learning is also spreading to fields beyond the traditional tech industry, such as eCommerce, entertainment, and healthcare. Since multiple domains will utilize machine learning, it is becoming an important skill that can advance developers’ and IT professionals’ careers. In GitLab’s 2021 DevSecOps survey, about 30% of developers said an understanding of artificial intelligence or machine learning is the most important skill for their future careers. In addition, Forrester predicts that at least a third of test professionals will use machine learning to make test automation smarter in 2022. The demand for AI and machine learning expertise is clearly growing, which explains why so many developers consider these skills important.
Why DevOps Needs AI
Machine learning, combined with natural language processing, can help generate high-quality technical project requirements based on widely referenced guidelines. This helps ensure that running systems and configurations remain compliant with enterprise security policies. ML can also detect incomplete requirements, criteria that cannot be measured, and other weaknesses in a project’s requirements. As a result, project management teams can compile better requirements while minimizing inaccuracies and weaknesses in the planning stage, leading to a higher-quality product in the end.
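To make this concrete, here is a deliberately simplified, hypothetical sketch of requirements-weakness detection. Production tools use trained NLP models; this keyword-and-regex version only illustrates the kinds of weaknesses they flag (vague wording, missing quantification), and the term list is invented:

```python
import re

# Hypothetical term list; real tools learn such signals from training data.
VAGUE_TERMS = {"fast", "user-friendly", "easy", "efficient", "scalable", "quickly"}

def find_weaknesses(requirement: str) -> list[str]:
    """Return a list of weakness descriptions found in one requirement."""
    issues = []
    words = {w.strip(".,").lower() for w in requirement.split()}
    vague = words & VAGUE_TERMS
    if vague:
        issues.append(f"vague terms: {sorted(vague)}")
    # A testable requirement usually contains a number (threshold, limit).
    if not re.search(r"\d", requirement):
        issues.append("no measurable quantity")
    return issues

print(find_weaknesses("The system must respond quickly."))
print(find_weaknesses("The system must respond within 200 ms."))
```

The first requirement is flagged for both vague wording and a missing threshold; the second, which states a measurable limit, passes cleanly.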
The exponential growth of tools and technologies that support modern applications leaves DevOps teams with more components to administer. Managing all of these resources is tedious and complex, especially when done manually. With the need for automation and for managing new technologies on the rise, DevOps needs artificial intelligence and machine learning more and more. AI can help DevOps teams alleviate technical debt and create better project requirements, while AI-based testing can close gaps in test coverage. DevOps teams can then run tests on their systems and services without provisioning unnecessary resources.
Another reason DevOps needs AI and machine learning is that they can assist in managing alerts from applications and infrastructure. For example, AI can help prioritize alert responses based on past behavior, the severity of the alert, and where the alert is coming from. As a result, DevOps teams can optimize their alerting systems to automatically distinguish noise and false positives from critical issues that require immediate action. In addition, machine learning algorithms trained on monitoring data can help remediate critical issues sooner, since DevOps teams spend less time sifting through logs to determine the root cause of an issue.
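A minimal sketch of this prioritization idea, assuming labelled alert history is available: learn how often each alert source corresponded to a real incident, then rank new alerts by severity weighted by that learned reliability. The function names and the 1–5 severity scale are illustrative, not taken from any particular tool:

```python
from collections import defaultdict

def learn_reliability(history):
    """history: iterable of (source, was_real_incident) pairs.
    Returns the fraction of each source's past alerts that were real."""
    totals, real = defaultdict(int), defaultdict(int)
    for source, was_real in history:
        totals[source] += 1
        real[source] += was_real
    return {s: real[s] / totals[s] for s in totals}

def prioritize(alerts, reliability):
    """alerts: list of (source, severity 1-5); higher score means act first."""
    def score(alert):
        source, severity = alert
        # Sources with no history get a neutral 0.5 reliability prior.
        return severity * reliability.get(source, 0.5)
    return sorted(alerts, key=score, reverse=True)

history = [("disk-monitor", True), ("disk-monitor", True),
           ("flaky-probe", False), ("flaky-probe", False), ("flaky-probe", True)]
rates = learn_reliability(history)
print(prioritize([("flaky-probe", 5), ("disk-monitor", 3)], rates))
```

Even though the flaky probe fired at higher severity, the disk monitor's alert ranks first because its past alerts were always real incidents.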
Criticism behind AI
The adoption of AI and machine learning comes with some security concerns as well. Machine learning algorithms are only as good as the data they were trained on, so tools trained on open-source projects could lead developers to unintentionally add bugs and security vulnerabilities to their code. This can result in vulnerabilities being pushed into production that negatively affect the software and services enterprises produce.
Furthermore, there are security threats that AI and machine learning are not fully protected from. AI tools could be vulnerable to ransomware attacks, in which an attacker encrypts a system’s drive and only provides the decryption key after a ransom is paid. Ransomware is a dangerous exploit that has been growing over time and is quickly becoming a bigger source of risk for enterprises. However, as AI and machine learning continue to mature, they will be able to detect and mitigate more vulnerabilities before they reach production.
Some people are concerned that artificial intelligence or machine learning will replace people and their jobs. NoOps, where no human intervention is needed to manage infrastructure, could make operations teams obsolete in the future. Machines could potentially learn how to configure and manage themselves from past configuration patterns, potentially pitting man against machine. A vice president and principal analyst at Forrester notes that some people feel AI could lower the demand for developers within five years, as machines learn to write basic infrastructure code on their own.
However, AI and ML adoption could also create new opportunities. Knowing how to train algorithms and analyze large pools of data with these tools leads to the more accurate results and predictions a company needs. Building applications that adopt ML or AI can produce higher-performing software that keeps businesses ahead of their competition. In addition, machines and services fail from time to time, so skilled professionals proficient in machine learning will be needed to troubleshoot and resolve problems that arise from using AI tools. AI and machine learning may take over some jobs, but they also create opportunities in building, managing, and fixing the AI tools and technologies a company needs to run its business.
AI and Machine Learning in DevOps
AI and machine learning have taken various forms upon adoption into the DevOps space. Here are some of the different ways these two technologies are being used in this field.
AI-Ops and ML-Ops
AI-Ops and ML-Ops, short for “Artificial Intelligence Operations” and “Machine Learning Operations” respectively, involve using artificial intelligence or machine learning to execute operational tasks. Teams can train algorithms to automatically run jobs within the different phases of the software development lifecycle. AI-Ops uses analytics and machine learning to investigate large amounts of data from the operational tools used in DevOps pipelines, allowing these systems to learn what tasks need to be executed and how to implement them.
Artificial intelligence and machine learning can be used in the build phase to automatically create virtual environments for testing source code, and in the monitoring phase to ensure that resources remain fully operational. As a result, developers and operations teams maintain consistency across their applications and infrastructure. With AI-Ops and ML-Ops, computer systems can learn the existing environments and package code in the development phase so that it runs under optimal conditions in production. This saves teams the time of manually provisioning and managing resources, since resources are provisioned ahead of time and system performance is monitored passively.
DevOps teams are starting to use AI and machine learning to automate workflows. For example, artificial intelligence can speed up the development phase of the software development lifecycle through AI-powered code completion tools like Tabnine and GitHub Copilot, which provide code suggestions that assist developers as they write. AI can also generate some of the tests needed for quality assurance, so testing teams spend less time writing tests and more time on automation within their testing environments, validating and pushing code into production sooner. Beyond testing, AI and machine learning can automate code reviews based on the data sets their algorithms were trained on, helping optimize code to improve application performance.
Not only can AI and machine learning add more automation to the software development lifecycle, but they can also improve the quality of enterprise software. AI-powered tools can help predict deployment failures ahead of time by examining statistics from previous code releases and saved application logs. This allows teams to improve their releases, since they will know which methods work and can implement changes accordingly before pushing to production. According to the Enterprise Project, machine learning can find issues in code such as resource leaks, potential concurrency race conditions, and wasted CPU cycles that could hurt performance in production. With this insight, DevOps teams can pinpoint performance bottlenecks and optimize their code or infrastructure to remediate them. There may be doubts about how reliable machine learning and AI can be at improving code quality, but early results are promising: Facebook’s machine learning bug detection tools have been correct about 80% of the time in finding defects and suggesting remedies in source code. As AI and machine learning tools gradually improve, they will become even more reliable at improving software quality.
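One way to sketch failure prediction from past releases is a tiny naive-Bayes-style frequency model over release features. This is purely an illustration with invented feature names ("config-change", "large-diff", and so on), not how any specific product works:

```python
from collections import Counter

def train(releases):
    """releases: list of (features: frozenset[str], failed: bool).
    Counts feature occurrences separately for failed and clean releases."""
    feat_given = {True: Counter(), False: Counter()}
    outcome = Counter()
    for feats, failed in releases:
        outcome[failed] += 1
        feat_given[failed].update(feats)
    return feat_given, outcome

def failure_risk(feats, model):
    """Return an estimated probability that a release with these features fails."""
    feat_given, outcome = model
    scores = {}
    for failed in (True, False):
        p = outcome[failed] / sum(outcome.values())  # class prior
        for f in feats:
            # Laplace-smoothed per-feature likelihood
            p *= (feat_given[failed][f] + 1) / (outcome[failed] + 2)
        scores[failed] = p
    return scores[True] / (scores[True] + scores[False])

history = [
    (frozenset({"config-change", "large-diff"}), True),
    (frozenset({"config-change"}), True),
    (frozenset({"small-diff"}), False),
    (frozenset({"small-diff", "tests-pass"}), False),
]
model = train(history)
print(f"risky: {failure_risk({'config-change', 'large-diff'}, model):.2f}")
print(f"safe:  {failure_risk({'small-diff', 'tests-pass'}, model):.2f}")
```

With this toy history, releases resembling past failures score well above 0.5 while releases resembling past clean deployments score well below it, which is the shape of signal a real deployment-risk model would surface.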
Although AI and machine learning come with some security concerns, these tools have proven useful in keeping services and infrastructure secure by detecting anomalies in systems and services. According to GitLab’s DevSecOps report, AI-powered data integration and machine learning algorithms help streamline anomaly detection even up to the boot time of devices, notifying DevOps teams of system vulnerabilities as soon as a device finishes booting. Machine learning tools can also be trained to detect anomalies in real time and alert DevOps teams immediately, stopping more vulnerabilities from being exploited and keeping enterprises and end users safe.
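Real-time anomaly detection of the kind described above can be sketched, very roughly, as a streaming z-score detector: learn "normal" from a sliding window of recent metric samples and flag values that deviate too far. Real systems use far richer models; the window size and threshold here are arbitrary:

```python
from collections import deque
from statistics import mean, stdev

class AnomalyDetector:
    """Flags metric samples that deviate strongly from recent history."""

    def __init__(self, window=30, threshold=3.0):
        self.samples = deque(maxlen=window)
        self.threshold = threshold

    def observe(self, value):
        """Return True if value is anomalous relative to the sliding window."""
        anomalous = False
        if len(self.samples) >= 10:  # need enough history for a baseline
            mu, sigma = mean(self.samples), stdev(self.samples)
            if sigma > 0 and abs(value - mu) / sigma > self.threshold:
                anomalous = True
        if not anomalous:  # don't let outliers pollute the baseline
            self.samples.append(value)
        return anomalous

detector = AnomalyDetector()
latencies = [100, 102, 99, 101, 100, 98, 103, 100, 99, 101, 100, 250]
flags = [detector.observe(v) for v in latencies]
print(flags)
```

The steady latencies around 100 ms pass, while the final 250 ms spike is flagged, the kind of signal that would feed an immediate alert to the DevOps team.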
AI and machine learning are also useful for governing compliance across IT environments. For example, DevOps teams can train AI or machine learning algorithms to monitor security compliance across the enterprise by giving them a baseline to follow. As a result, they can take preventative action and stop threats by informing a user about a vulnerability or shutting down a device found to be non-compliant with security policies. These advancements have led companies to adopt or acquire new AI tools for their own services. For example, JFrog recently acquired Vdoo, whose AI-based approach is to survey a device’s behavior and compare it to a baseline to identify when it is not operating as designed. From there, DevOps teams can decide whether to shut down the malicious device, isolate it from other critical resources, or investigate the behavior the AI tool has singled out. Thanks to these anomaly detection abilities, it will be easier for DevOps teams to maintain security and compliance across their resources.
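The baseline-comparison approach can be illustrated with a toy compliance checker: compare each device's reported configuration against a security baseline and choose a preventative action based on the drift. The policy keys and the escalation rule are invented for illustration:

```python
# Hypothetical security baseline; real policies come from enterprise tooling.
BASELINE = {"ssh_root_login": "disabled",
            "disk_encryption": "enabled",
            "firewall": "enabled"}

def check_compliance(device_config):
    """Return (action, drift) where drift maps policy -> non-compliant value."""
    drift = {k: device_config.get(k) for k in BASELINE
             if device_config.get(k) != BASELINE[k]}
    if not drift:
        return "compliant", drift
    # Escalate harder for security-critical drift; otherwise just notify.
    action = "isolate" if "firewall" in drift or "disk_encryption" in drift else "notify"
    return action, drift

print(check_compliance({"ssh_root_login": "disabled",
                        "disk_encryption": "enabled", "firewall": "enabled"}))
print(check_compliance({"ssh_root_login": "enabled",
                        "disk_encryption": "enabled", "firewall": "disabled"}))
```

A fully compliant device passes, while a device with its firewall disabled is marked for isolation, mirroring the notify-or-shut-down decision described above.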
Low Code/No Code
It is worth mentioning the growing adoption of low-code and no-code tooling in the tech industry. Artificial intelligence’s ability to learn and manage technical resources leaves teams with less manual management to do, writing little to no code to get their systems and services operational. It will allow future developers to create software with minimal manual work and simple interfaces while maintaining the overall quality of the project. Furthermore, building the required functionality with less code makes development teams more efficient and makes issues easier to debug, since low-code/no-code tools reduce the complexity that can come with programming. As a result, low-code and no-code tools are being adopted by more developers and DevOps engineers as they focus on software quality and on building and deploying high-performance applications.
Multiple AI and machine learning tools already exist in the industry. Here are a few examples.
Arize AI is a machine learning observability tool that helps DevOps teams detect root causes and resolve model performance issues faster. It provides real-time monitoring that can be used at scale to automatically surface potential performance and data issues, sending real-time alerts so teams can remediate problems as soon as possible. This supports continuous monitoring in the software development life cycle, since teams receive notifications about potential application or infrastructure performance issues based on patterns in the metric data the tool collects.
Dynatrace is an AIOps tool that provides observability, automation, AI, and cloud-native application security that DevOps teams can leverage in their CI/CD pipelines. With the operational data its deterministic AI has been trained on, Dynatrace can provide actionable insights in heterogeneous cloud environments. This helps DevOps teams safely deploy applications into one or more cloud environments by detecting cloud-native anomalies before they have a negative impact on the enterprise. In addition, its automated runtime application vulnerability detection can cover the entire software development life cycle, combining Snyk’s vulnerability database with Dynatrace analysis to resolve security vulnerabilities from the code phase through the monitoring phase.
Another machine learning tool worth mentioning is Diffblue, an AI-powered platform for automating the testing phase of the software development lifecycle. It integrates with your Java code repository and generates unit tests that reflect the current behavior of the code, enabling automatic regression testing with almost no effort, improving continuous integration workflows, and catching regression issues early.
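Diffblue itself targets Java, but the underlying "characterization testing" idea can be sketched in a few lines of Python: record a function's current outputs for sample inputs and emit assertions that pin that behavior down as a regression guard. Real tools analyze the code to choose inputs; here they are supplied by hand, and `slugify` is just an example function:

```python
def generate_regression_tests(func, sample_inputs):
    """Emit a test function asserting func's *current* outputs for each input."""
    lines = [f"def test_{func.__name__}():"]
    for args in sample_inputs:
        result = func(*args)  # capture current behavior
        lines.append(f"    assert {func.__name__}{args!r} == {result!r}")
    return "\n".join(lines)

def slugify(title):
    return title.lower().replace(" ", "-")

print(generate_regression_tests(slugify, [("Hello World",), ("DevOps",)]))
```

The generated test passes against today's implementation; if a future change alters `slugify`'s behavior, the pinned assertions fail and flag the regression in CI.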
The last tool on our list is Amazon CodeGuru, an AI tool used to detect security vulnerabilities and automate code reviews. Using machine learning and automated reasoning, together with security best practices and lessons learned from past code reviews, it automates code reviews and offers recommendations on how to remediate detected issues. In a way, it can be seen as the next generation of static code analysis, with machine learning added on top.
AI and machine learning are already profoundly affecting how software and infrastructure are built, deployed, managed, and tested. From automated testing to anomaly detection, artificial intelligence and machine learning enable significant improvements across the entire development cycle. DevOps teams should view these tools and abilities as enablers for improving product quality and better managing their systems and services, offloading some manual work onto automated, AI-powered tools. By training algorithms on the tasks and conditions that need to be automated, DevOps teams will be less overwhelmed by the requirements they must maintain for their companies. Despite the concerns some people have, AI and machine learning will most likely play a significant part in DevOps teams’ systems and services for years to come.