Overcoming the challenge of scalability in AI projects – Partner Content




In today’s rapidly evolving technological landscape, artificial intelligence (AI) has become a key focal point for enterprises, governments and the general public alike.

Generative AI, powered by Large Language Models (LLMs), has captured the popular imagination. ChatGPT, the chatbot from Microsoft-backed OpenAI, reached 100 million monthly active users in January 2023, just two months after its launch, making it the fastest-growing consumer application in history.

In comparison, it took TikTok about nine months after its global launch to reach 100 million users and Instagram two-and-a-half years.  

It is little wonder, then, that the use of AI, particularly LLMs, to scale operations is top of mind for almost every IT leader in the Asia Pacific region.

However, despite the interest, most organisations struggle to scale their AI initiatives, with data suggesting that 77 percent of all AI models never make it to production.

This statistic highlights the need for a robust framework to help businesses scale and govern AI success.

Reiterating this point at a recent roundtable, organised by iTnews Asia, Dataiku’s Director of Sales Engineering, Alex Aung, said that if an AI project comprises only two or three models, the challenges are not obvious.

“But, as soon as you start scaling to support digital transformation, the little worries from the start become a real pain to manage and it would only get worse,” he added.

Operations teams will waste a lot of time fixing the problem and, at the same time, will have to deal with the increasing pressure of new projects. Then there is the added risk of non-compliance with corporate policies.

The teams are therefore faced with a problem that is difficult to master: from the start, the loop is often incomplete, not operationalised and too specific, and nothing is designed to make it repeatable at scale, Aung added.

“At Dataiku we solve this with our three core conditions of unification, operationalisation, and repeatability that enable organisations to turn their ML projects into critical success factors for their digital transformation,” he added.

Talking about the broad AI landscape, Aung shared with the roundtable panellists that with multiple LLMs proliferating, companies sometimes get confused about which model is best suited to their use case.

The speed of change is so fast that people often do not have enough time to learn about particular models and what they can do before the next iteration of the model or better models are available, Aung said.

Giving some perspective he noted that despite the recent buzz, AI has been in use in different applications over the past 10 years but most users have never felt its effect in their daily lives.

Increasing awareness of AI

Aung said things are dramatically different today because people believe AI is equal to ChatGPT. “Everyone is talking about ChatGPT and this has resulted in more people seeing the need for AI in everyday life,” he said.

The important thing, he noted, is that ChatGPT and other generative AI products are quickly taking market share in the AI space.

This is because while traditional AI delivery speed is in the region of around three months, generative AI products can deliver updates every month. 

Talking about a wide range of issues surrounding the challenges as well as opportunities in the use of AI, Aung said it is possible to deliver AI quickly when you have the right foundation.

“At Dataiku, we release a major version every year. Because we have the right product and the right approach, we can deliver updates within 30 days,” he said.

Our newest release, the LLM Mesh, allows every user to experiment with any approved LLM for business and embed it in business processes in an auditable and controllable manner, he added.

To a question on successful AI implementation, he said: “In my opinion, success means an ability to apply AI in your business process faster and scale faster so that you can optimise for profit and reduce the cost”.

Aung noted that while implementing AI, it is important to allow every skill set within the company to approach AI, “starting from the data that feeds the AI, and anything in between”, and to let people build their skills.

This he said allows companies to deploy the model faster, and then scale faster in terms of their practice.

Referring to the high failure of companies trying to scale their AI pilots, Aung said: “At Dataiku, we understand the challenges organisations face when trying to scale their operations”.

The platform offers a comprehensive solution for data and analytics project teams, allowing them to streamline the project lifecycle from start to finish by bringing preparation, design, deployment, and monitoring together in a single approach, he added.

MLOps is a proven methodology

Aung noted that machine learning operations (MLOps) is a proven methodology that equips companies with the essential tools and techniques for managing and scaling machine learning models in a production environment.

“With MLOps, enterprises can confidently deploy, monitor, and manage their AI initiatives at scale”, Aung said.
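The monitoring side of MLOps that Aung describes can be illustrated with a minimal sketch. The code below is not Dataiku's product or API; it is a hypothetical example, with invented numbers, of how a deployed model's live inputs might be compared against a training-time baseline so that drift triggers a review or retrain.

```python
# Hypothetical MLOps-style monitoring loop: compare live production data
# against a training-time baseline and flag drift for the ops team.
from statistics import mean, stdev

def drift_score(baseline, live):
    """Absolute shift of the live mean, measured in baseline standard deviations."""
    return abs(mean(live) - mean(baseline)) / stdev(baseline)

def monitor(baseline, live, threshold=2.0):
    """Return a suggested action based on observed drift."""
    return "retrain" if drift_score(baseline, live) > threshold else "ok"

# Invented feature values: training-time baseline vs. production traffic.
baseline = [10.0, 11.0, 9.5, 10.5, 10.2, 9.8]
stable_live = [10.1, 10.3, 9.9]    # close to the baseline
shifted_live = [14.0, 15.2, 14.8]  # distribution has drifted

print(monitor(baseline, stable_live))   # prints "ok"
print(monitor(baseline, shifted_live))  # prints "retrain"
```

A production system would track many metrics (accuracy, latency, data quality) rather than a single mean shift, but the deploy-monitor-retrain loop is the same shape.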

He noted that three core conditions enable organisations to turn their ML projects into critical success factors for their digital transformation. They are unification, operationalisation, and repeatability.

Aung noted that one of the main challenges that customers talk about is how to remove barriers in the life cycle of AI projects.

These could include everything from data preparation and model development to deployment and management. These barriers can be caused by a variety of factors, such as difficulties in obtaining high-quality data or issues with model performance and accuracy.

Some of the challenges also include how to encourage domain experts to take ownership of their data without creating chaos. This can be particularly important in organisations where data is siloed and there isn’t a clear process for managing and utilising it, he said.

Creating value

Aung added that organisations are looking for ways to create immediate value with AI without having to start from scratch every time. Implementing AI projects can be a time-consuming and resource-intensive process, so finding ways to streamline it and reuse existing models and algorithms can be incredibly valuable, he noted.

Aung said the LLM Mesh enables an organisation to decouple the application from the AI service layer.

The LLM Mesh puts the control of AI resources back in the hands of analytics and IT leaders so they can effectively manage this transformative technology’s safety, performance, and cost, he added.
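The decoupling idea described above can be sketched in a few lines. This is an illustrative example only, not Dataiku's actual API: applications call a gateway rather than a specific provider, and the gateway enforces an allow-list of approved models and keeps an audit trail, which is the kind of control the LLM Mesh is said to return to analytics and IT leaders.

```python
# Hypothetical sketch of an application decoupled from the LLM service
# layer: a gateway routes requests to approved providers and logs each call.
class LLMGateway:
    def __init__(self, providers, approved):
        self.providers = providers      # model name -> callable(prompt) -> str
        self.approved = set(approved)   # governance: allow-list of models
        self.audit_log = []             # auditable record of every call

    def complete(self, model, prompt):
        if model not in self.approved:
            raise PermissionError(f"model '{model}' is not approved")
        response = self.providers[model](prompt)
        self.audit_log.append((model, prompt, response))
        return response

# Stand-in providers; a real deployment would call hosted or on-prem models.
providers = {
    "model-a": lambda p: f"[model-a] {p}",
    "model-b": lambda p: f"[model-b] {p}",
}

gateway = LLMGateway(providers, approved=["model-a"])
print(gateway.complete("model-a", "Summarise Q3 returns"))
# gateway.complete("model-b", ...) would raise PermissionError
```

Because the application only ever talks to the gateway, swapping or retiring an underlying model changes configuration, not application code.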

The Dataiku platform allows customers to connect any models that are suitable for them, whether the data sits on Google Cloud, AWS or Azure, or is stored on-premises, he said.

The system is also available in a Software-as-a-Service model.

This system enables people with different skill sets to build their data workflows with no code, low code or, if they choose, full code, he added.

“In this way, it is possible to optimise the time and build platforms… This is how we can bridge the gap between skill sets and technology and cater to all the data goals of the company,” he said.

Adding to the points made by Aung, Dataiku’s Regional Services Lead, Ankit Singi, said, based on his interactions with customers across the region, the maturity levels in terms of the use of AI vary widely.

There are some organisations which are very advanced in deploying models, while others are just starting to build some dashboards and reports, Singi said.

He noted that an important aspect to think about is how to democratise data initiatives.

Singi added that data has to be embedded across different corporate functions.

“It cannot be a central data science team that is solving all the cases, you have legal and also compliance… There needs to be a basic understanding of how best to use data to solve day-to-day problems.

“There also needs to be some form of self-service; you cannot always go to one person to find out what was missing last month, run queries and other tasks… everyone needs to upskill themselves,” he said.

That culture has to come from company leadership, which must embed the expectation that all decision-making be based on data, Singi added.

A lot of organisations are enforcing this and they will see the returns in the next three to five years, he said.

“They will be much ahead in terms of skills and revenue because their (business) decisions would be based on facts, not on someone’s whims.”

Data governance

Turning to data governance, Singi noted that it was also important to look into the aspect of model governance.

“You’re putting models in production. The data is there, but what is the governance framework for the models in production? Not every model will be a regulatory model; there could be segregation of models if, for example, one is serving an end customer while another serves internal users,” he said.

Giving an example, he noted that a model could claim a certain level of accuracy, but asked: “who has validated what the bias of the model is?”

“Were the model metrics checked and validated before production? All this needs to be documented, and a proper governance framework needs to be in place, before you put things in production, especially if the model is being used by end-customers,” Singi said.

Responding to a question from one of the participants about the cost of implementing AI models, Aung noted that cost is the price you pay, while value is what you get.

“So, when we talk about cost, we need to think about the various use cases. We usually look at use cases first before we even talk about investments or what we can do.

“We need to look at the use cases and calculate how much we can save and how much we need to meet the bottom line, and then apply that to the top line. In this way, we will be able to convince company leadership and board members,” he said.

Aung added that in the manufacturing sector, he has seen companies modernising their assembly lines with AI, installing cameras to detect defects, which helps them reduce returns of defective products and costly assembly line errors.

He noted that AI-driven predictive maintenance is another important use case.

Aung’s advice to representatives of small and medium-sized enterprises (SMEs) at the roundtable was to use AI in sales, marketing, product development and service.

During the wide-ranging discussion, participants covered several other challenges and opportunities in implementing AI projects at scale across different industries.

The consensus was that MLOps offers a pathway to scaling AI implementation across industry sectors.
