The Best Uses for Gen AI May Be the Ones Customers Can’t See

In the nine months since ChatGPT launched to the public, generative AI has become a daily point of discussion across industries and platforms. Most people and enterprises were caught flat-footed, as is often the case when new technology emerges, and generative AI amplified that effect by arriving years ahead of schedule: most tech leaders had predicted this kind of output was at least a decade away.

The public response has, unsurprisingly, been chaotic. LLMs (large language models) are going to ruin education, or improve it. They will destroy jobs, or create new ones, or both. Generative AI is going to overtake and destroy humanity. Or it's not. It might revolutionize content creation, except that U.S. courts have determined that material created by generative AI cannot be copyrighted, likely rendering it useless for that purpose in any organization that derives value from intellectual property. ChatGPT is known for factual inaccuracies in some uses, and its accuracy may be declining. Legislation to regulate LLMs is at various stages of drafting; though it will lag the technology by the time it becomes law, it will likely reshape the entire landscape. In other words, the visible, public engagement with generative AI over the past year has been a bit of a mess.

Meanwhile, much of the most interesting and productive use of generative AI is happening behind the scenes. Around the same time OpenAI launched ChatGPT to the public, Amazon debuted CodeWhisperer at its annual re:Invent conference, in a session on DevOps and machine learning where Cascadeo's own Jared Reimer shared the stage. CodeWhisperer, a real-time code generator, is just one example of emerging technology that uses generative AI in ways consumers will never see, working behind the scenes to enhance existing operations and create opportunities for new ones. A variety of LLMs can generate code and are already in common use across a multitude of coding contexts.

AI has long played an important role in IT, notably in monitoring and observability, cloud management, CI/CD automation, and various other functions. Maximizing automation, which reduces security risks by eliminating opportunities for the human errors that cause 88% of cloud breaches, is an established principle in the cloud. Chatbots, help desks, and any number of other user interfaces have long relied on some type of AI to increase efficiency and speed up customer service. But generative AI is opening an entirely different set of doors. Innovative engineers are using LLMs to accelerate and automate functions in new ways, supplementing and supporting, rather than replacing, human expertise.

Cloud management platform Cascadeo AI is a good example. Pre-LLM iterations of the platform (then called cascadeo.io) provided advanced monitoring and observability, customization, and consolidation of operating-environment alerts and billing. Now, with API calls to LLMs integrated, Cascadeo AI provides instantaneous alert verification, analysis, and remediation recommendations that an engineer can assess and implement in seconds, rather than spending time researching events and determining the best course of action. The platform's engineers, led by Karol See, are finding new ways to use LLMs to create and manage databases, conduct inventory analyses, maintain security and compliance, and communicate essential operational information to customers. In other words, generative AI is allowing engineers to do more, while keeping human knowledge at the core of ethical computing. Cascadeo AI uses Amazon SageMaker, Amazon DevOps Guru, OpenAI, and proprietary AI technologies to ensure that the platform always integrates the best AI tools. As new options become available, See and her team are constantly refining integrations to keep the platform at the leading edge of generative AI advances.
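To make the pattern concrete, here is a minimal sketch of how an LLM API call can be wired into an alert pipeline. This is not Cascadeo AI's actual implementation; the alert fields, prompt wording, and model name are illustrative assumptions, and the client usage follows the openai-python library.

```python
# Hypothetical sketch: enriching a monitoring alert with an LLM triage summary.
# Not Cascadeo AI's implementation; alert schema, prompt, and model are assumptions.
import json
import os


def build_alert_prompt(alert: dict) -> str:
    """Turn a raw monitoring alert into a prompt asking for triage help."""
    return (
        "You are assisting an on-call cloud engineer. Given this alert, "
        "say whether it looks actionable, suggest a likely root cause, "
        "and propose a first remediation step.\n\n"
        f"Alert:\n{json.dumps(alert, indent=2)}"
    )


def analyze_alert(alert: dict) -> str:
    """Send the alert to an LLM and return its triage summary."""
    from openai import OpenAI  # requires the openai package and an API key

    client = OpenAI()  # reads OPENAI_API_KEY from the environment
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed model; any chat-capable model works
        messages=[{"role": "user", "content": build_alert_prompt(alert)}],
    )
    return response.choices[0].message.content


if __name__ == "__main__":
    alert = {
        "source": "cloudwatch",          # hypothetical alert payload
        "metric": "CPUUtilization",
        "value": 97.4,
        "threshold": 90,
        "resource": "i-0abc123",
    }
    if os.environ.get("OPENAI_API_KEY"):
        print(analyze_alert(alert))
    else:
        # Without credentials, just show the prompt the LLM would receive.
        print(build_alert_prompt(alert))
```

The point of the design is that the LLM sits beside, not in place of, the existing pipeline: the alert still flows to the engineer, who gets a ready-made analysis to accept or override rather than a fully automated action.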

Operational integrations with generative AI are happening across industries as well: automating human services, supporting data analysis for strategic planning, managing financial data and portfolios, supporting pharmaceutical research and development, personalizing medicine, and a wealth of other functions. So while generative AI might not write The Great American Novel, it will soon benefit every area of business, often in ways customers never realize.