OPINION · 23 January 2024

Unlocking predictive intelligence with large language models


Large language models are potentially potent instruments for businesses. Phillip Sewell examines how they can help with predictive insights.


Smart data professionals know that their ability to understand and predict consumer behaviour is profoundly influenced by the dynamic nature of their surroundings. This requires a deep understanding of diverse data sources, from structured data like weather and financial indices, to the nuanced richness of unstructured text and images.

Structured data offers a straightforward modelling process, characterised by organisation and logic. For example, it’s simple to state that 20 degrees is warmer than 18 degrees. But modelling unstructured data is far more challenging due to its semantic richness. For example, defining whether red is superior to green in a numerical fashion is a far more complex task.

In the rapidly evolving artificial intelligence (AI) landscape, large language models (LLMs) are potentially potent instruments for data-savvy businesses. So, how can data professionals start gaining actionable predictive insights from the vast tapestry of human language and interactions?

The role of LLMs
First, it’s important to understand what LLMs can do. Their primary function is information generation, be it in the form of chat or code. Through exhaustive training on vast datasets containing text and code, LLMs are increasingly acquiring an intricate understanding of the subtleties of human language. However, LLMs’ true value lies in their ability to convert contextual data into a numeric format that enhances predictive modelling.

How can LLMs be effectively utilised to achieve these goals? LLMs operate by first splitting textual information into tokens (a process known as ‘tokenisation’) and converting those tokens into numerical values that can be processed algorithmically. Once tokenised, the LLM utilises its language proficiency to derive meaning from the text.
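
The tokenise-then-embed pipeline described above can be sketched in miniature. The vocabulary and embedding table below are hypothetical toy values chosen for illustration; a real LLM learns tens of thousands of tokens and vectors with hundreds or thousands of dimensions.

```python
# Toy vocabulary mapping words to token IDs (a real tokeniser also handles
# sub-words, punctuation and unknown words).
TOY_VOCAB = {"taylor": 0, "swift": 1, "is": 2, "a": 3, "pop": 4, "singer": 5}

# Each token ID maps to a small numeric vector; these values are invented
# for the example, not output from any model.
TOY_EMBEDDINGS = {
    0: [0.9, 0.1], 1: [0.8, 0.2], 2: [0.1, 0.1],
    3: [0.1, 0.2], 4: [0.7, 0.6], 5: [0.6, 0.7],
}

def tokenise(text: str) -> list[int]:
    """Split text into words and map each known word to its numeric ID."""
    return [TOY_VOCAB[w] for w in text.lower().split() if w in TOY_VOCAB]

def embed(token_ids: list[int]) -> list[float]:
    """Average the token vectors into one sentence-level numeric vector."""
    vectors = [TOY_EMBEDDINGS[t] for t in token_ids]
    return [sum(dim) / len(vectors) for dim in zip(*vectors)]

ids = tokenise("Taylor Swift is a pop singer")
print(ids)         # token IDs: [0, 1, 2, 3, 4, 5]
print(embed(ids))  # one numeric vector summarising the sentence
```

The key point for predictive work is the output of `embed`: free text has become a fixed-size list of numbers that a downstream model can consume.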

For instance, when presented with the sentence ‘Taylor Swift is a pop singer’, the LLM dissects it, recognising Taylor Swift as a person, a singer and an artist in the pop genre. It comprehends the intricate relationships among these concepts, having already learned this information.
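
One common way those learned relationships show up numerically is through vector similarity: related concepts end up with vectors pointing in similar directions. The vectors below are hand-picked for illustration, not genuine model output.

```python
import math

# Hypothetical concept vectors; in a real model these come from training.
CONCEPTS = {
    "singer": [0.9, 0.8, 0.1],
    "artist": [0.8, 0.9, 0.2],
    "weather": [0.1, 0.2, 0.9],
}

def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity: near 1.0 for related directions, near 0.0 otherwise."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

print(cosine(CONCEPTS["singer"], CONCEPTS["artist"]))   # high: related concepts
print(cosine(CONCEPTS["singer"], CONCEPTS["weather"]))  # low: unrelated concepts
```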

There are two key advantages of using LLMs for contextual data encoding. The first is their ability to capture complex and intricate relationships between diverse concepts. The second is their handling of unquantifiable data: LLMs make it possible to represent ‘challenging to quantify’ information, such as distinctions between different event types.

Working in the real world
Let us dive into a real-world scenario that showcases how the practical use of LLMs can augment predictive modelling in the retail sector. Picture a chic women’s boutique nestled in the heart of a bustling urban landscape, offering an extensive array of high-end fashion tailored to diverse tastes and preferences. Now, envision the retailer’s ambitious goal of gaining a holistic understanding of how various events in their vibrant market sway their sales trends.

To achieve this, it’s about harnessing the power of LLMs proficient in language understanding. The process unfolds by collecting data on upcoming events relevant to the retailer’s market. These events could encompass a wide spectrum, including fashion shows, cultural festivals, music concerts and sporting events.

For each event, the LLM is tasked with encoding critical information. This could cover: 

  • Event type – categorising the event, whether it’s a fashion show, music concert, sports game or any other event type 
  • Event date – precise date information is recorded to establish the timing of the event 
  • Event location – the LLM captures details about the event’s location, whether it’s in the retailer’s city or another location
Simultaneously, the LLM encodes information about the retailer’s clothing lines. This involves a thorough analysis of their diverse product offerings, focusing on factors such as clothing type or brand information.
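
The encoding step above might look something like the sketch below. The function `embed_text` stands in for a real LLM embedding call (the article names no specific API); here it is a deterministic stub so the example is self-contained, and the field names are assumptions.

```python
import hashlib

def embed_text(text: str, dims: int = 4) -> list[float]:
    """Stand-in for an LLM embedding call: hashes text into a fixed-size
    vector of numbers between 0 and 1. A real embedding would carry meaning;
    this stub only demonstrates the shape of the pipeline."""
    digest = hashlib.sha256(text.encode()).digest()
    return [b / 255 for b in digest[:dims]]

def encode_event(event: dict) -> list[float]:
    """Concatenate embeddings of each event attribute into one feature vector."""
    parts: list[float] = []
    for field in ("type", "date", "location"):
        parts.extend(embed_text(str(event[field])))
    return parts

event = {"type": "fashion show", "date": "2024-03-15", "location": "in-city"}
vector = encode_event(event)
print(len(vector))  # 12 numbers: 4 dimensions per attribute
```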

Creating the predictive model
Once the event and clothing line data is successfully encoded by the LLM, data scientists can build a predictive model. This model is designed to forecast how diverse events will impact the retailer’s sales. How does this work?

By using the encoded data, the predictive model can analyse how specific types of events affect the sales of particular clothing items. For instance, it can identify whether fashion shows actually boost sales of high-end designer dresses, or if music concerts have a more significant impact on casual apparel.

The model considers the timing of events, ensuring that sales predictions consider both the event’s date and lead-up time. It integrates event data with other relevant factors, such as historical sales data, customer demographics and marketing efforts. It then generates comprehensive forecasts.
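
The final forecasting step can be sketched as a simple linear model over the encoded features. The features, weights and baseline below are illustrative placeholders; in practice the coefficients would be learned from historical sales data alongside the other factors mentioned above.

```python
# Hypothetical encoded features: [is_fashion_show, is_concert, timing_score]
WEIGHTS = [0.30, 0.10, -0.05]   # placeholder learned coefficients
BASELINE_SALES = 1000           # placeholder historical average (units/week)

def forecast(event_features: list[float]) -> float:
    """Baseline sales adjusted by a weighted sum of the event features."""
    uplift = sum(w * x for w, x in zip(WEIGHTS, event_features))
    return BASELINE_SALES * (1 + uplift)

fashion_show = [1.0, 0.0, 0.2]  # a near-term fashion show
concert = [0.0, 1.0, 0.2]       # a near-term music concert
print(forecast(fashion_show))   # larger uplift for the fashion show
print(forecast(concert))
```

The design point is that, once events are numbers, any standard regression technique can weigh them against sales history; the LLM's contribution is the encoding, not the forecasting itself.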

This predictive model equips the clothing retailer with invaluable insights, so they can make more informed and precise forward-looking decisions about inventory management, marketing strategies and event participation.

Ultimately, LLMs extend beyond text and chat applications by converting textual information into numerical values. This is key to enabling a nuanced understanding of complex relationships and unquantifiable data.

The integration of LLMs into predictive intelligence pipelines marks a pivotal shift in data analytics that can increasingly support more precise, foresight-fuelled decision making.

Phillip Sewell is chief executive officer at Predyktable
