How a Hybrid Platform Can Help Enable Trusted Generative AI



Generative artificial intelligence (AI) is still in its infancy, but it already brings irresistible promise to help businesses serve their customers.

Organizations can use generative AI to quickly and economically sift through large volumes of their own data and create relevant, high-quality text, audio, images, and other content in response to prompts, drawing on vast amounts of training data. And hosted open-source large language models (LLMs) can help organizations add enterprise data context to their outputs, producing more reliable responses while reducing false information (“hallucinations”).
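To make that concrete, the short Python sketch below shows one common way to add enterprise context to an LLM prompt: rank internal documents by relevance to a question and include the best matches in the prompt so the model answers from that context rather than from its training data alone. The scoring function and the call_llm helper are hypothetical placeholders, not any specific vendor's API.

# Minimal sketch: ground an LLM prompt in enterprise documents.
# The retrieval here is deliberately naive; real systems typically use embeddings.

def score(question: str, document: str) -> int:
    """Naive relevance score: count question words that appear in the document."""
    terms = set(question.lower().split())
    return sum(1 for word in document.lower().split() if word in terms)

def build_grounded_prompt(question: str, documents: list[str], top_k: int = 3) -> str:
    """Select the most relevant documents and prepend them as context."""
    ranked = sorted(documents, key=lambda d: score(question, d), reverse=True)
    context = "\n\n".join(ranked[:top_k])
    return (
        "Answer using only the context below. If the context is insufficient, say so.\n\n"
        f"Context:\n{context}\n\nQuestion: {question}\nAnswer:"
    )

# Usage (call_llm is a placeholder for whichever hosted open-source LLM you run):
# answer = call_llm(build_grounded_prompt("What is our refund policy?", internal_docs))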

But the dilemma is that, to get more accurate outputs from a generative AI model, organizations need to give third-party AI tools access to enterprise-specific knowledge and proprietary data. And companies that don’t take the proper precautions could expose their confidential data to the world.

That makes optimal hybrid data management critical to any organization with a strategy that entails using third-party software-as-a-service (SaaS) AI solutions with its proprietary data.

Harnessing the Power of Hybrid Cloud

The public cloud offers scalable environments ideal for experimenting with LLMs. However, full-scale LLM deployment can be prohibitively expensive in the cloud. And while LLMs are only as good as their data, sending sensitive or regulated data to cloud-based LLMs presents significant privacy and compliance risks.

The private cloud offers an optimal environment for hosting LLMs with proprietary enterprise data and a more cost-effective solution for long-running LLM deployments than public clouds provide. Housing LLMs in a private cloud also enhances data security, helping safeguard sensitive information from external threats and reducing compliance risk.

Organizations that adopt a hybrid workflow can get the best of both worlds, making the most of generative AI without sacrificing privacy and security. They can benefit from the flexibility of the public cloud for initial experimentation while keeping their most sensitive data safe on on-premises platforms.

One organization’s experience demonstrates how hybrid cloud-based data management can incorporate public customer data in real time while protecting confidential company and customer information.

A More Personalized Experience

Singapore-based OCBC, one of the largest financial institutions in Southeast Asia, wanted to use AI and machine learning (ML) to enhance the digital customer experience and improve its decision making. It used a hybrid cloud platform to do so.

OCBC built a single entry point for all its LLM use cases: a hybrid framework that could seamlessly integrate multiple data sources, including inputs from thousands of customers and a private-cloud data lake that would keep customer data safe, to get real-time insights customized to its own company standards.

The bank built prompt microservices for accessing LLMs stored on its on-premises servers as well as LLMs available in the public cloud: a cost-effective model that allowed it both to use public cloud LLMs and to host open-source LLMs, depending on the functionality and customization it needed. By deploying and hosting its own code assistant, scaled for 2,000 users, OCBC saved 80% of the cost of using SaaS solutions.
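The article does not detail how these microservices are built, but the minimal Python sketch below illustrates the general routing idea: use cases that touch customer or proprietary data go to an on-premises LLM endpoint, while general-purpose requests can use a public-cloud LLM. The URLs, use-case names, and response format are assumptions for illustration only.

# Minimal sketch of a prompt-routing service. All endpoints, use-case names,
# and the payload/response shape are hypothetical, not OCBC's actual services.
import requests

ON_PREM_URL = "https://llm.internal.example.com/v1/generate"        # assumed endpoint
PUBLIC_CLOUD_URL = "https://api.cloud-llm.example.com/v1/generate"  # assumed endpoint

# Use cases that involve customer or proprietary data stay on-premises.
SENSITIVE_USE_CASES = {"customer_insights", "code_assistant", "risk_summary"}

def generate(use_case: str, prompt: str, timeout: int = 30) -> str:
    """Route the prompt to the appropriate LLM endpoint and return its text output."""
    url = ON_PREM_URL if use_case in SENSITIVE_USE_CASES else PUBLIC_CLOUD_URL
    resp = requests.post(url, json={"prompt": prompt}, timeout=timeout)
    resp.raise_for_status()
    return resp.json()["text"]  # assumes the endpoint returns {"text": "..."}

# Example: a code-assistant request never leaves the private environment.
# print(generate("code_assistant", "Write a unit test for the payments service."))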

Combining the vast capabilities available on the public cloud with the portability of its private platform helped the bank securely train its AI models and derive more accurate inferences from its outputs.

The platform integrates with the bank’s ML operations pipelines and fits into its larger ML engineering ecosystem. This cloud-based ML-powered platform lets OCBC build its own applications and use the tools and frameworks its data scientists choose.

The initiative has led to a more personalized customer experience, higher campaign conversion rates, faster transactions, reduced downtime for data centers, and an additional SGD100 million (US$75 million) in revenue a year.

Innovating with Generative AI, Securely

Organizations are racing to adopt generative AI to streamline their operations and turbocharge innovation. They need AI tools that have enterprise-specific context and draw on knowledge from proprietary data sources.

But while the technology is still maturing, there’s no need to sacrifice privacy, security, and compliance. By using hosted open-source LLMs, businesses can access the latest capabilities and fine-tune models with their own data while maintaining control and avoiding privacy concerns—and limiting expenses.
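As a rough illustration of that approach, the sketch below loads an open-source model inside the organization's own environment using the Hugging Face transformers library, so prompts containing proprietary data never leave it. The model name is only an example of a permissively licensed open model, and fine-tuning on enterprise data would be an additional step.

# Minimal sketch: run an open-source LLM locally so data never leaves your environment.
# Requires the `transformers` library; the model below is just an example choice.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="mistralai/Mistral-7B-Instruct-v0.2",  # example open-source model (assumption)
    device_map="auto",                           # use a local GPU if one is available
)

prompt = "Summarize the key risks in the attached internal audit notes:\n..."
result = generator(prompt, max_new_tokens=200, do_sample=False)
print(result[0]["generated_text"])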

Going with a hybrid platform allows organizations to take advantage of the public cloud while keeping proprietary AI-based insights out of public view. By letting businesses store and use their data wherever, whenever, and however they need, while also offering a significant cost advantage, hybrid workflows built on vendor-agnostic, open, and flexible solutions are truly democratizing AI.


Learn more about how you can use open-source LLMs with your own data in a secure environment.



Source link: https://hbr.org/sponsored/2023/08/how-a-hybrid-platform-can-help-enable-trusted-generative-ai
