The Future of Language Models in the Enterprise: A Multi-Model World
Thu, 10 Aug 2023

In this interview, Mike Finley, AnswerRocket’s CTO and Chief Scientist, delves into the possibilities presented by language models, specifically GPT (Generative Pre-trained Transformer). He emphasizes that leveraging large language models (LLMs) is akin to using a flexible database: a wide range of versions, locations, and models can be seamlessly integrated into a solution. Mike shares that AnswerRocket is embracing the evolving landscape of language models, staying independent of any single model while effectively harnessing capabilities like completions and embeddings.

Watch the video below or read the transcript to learn more.

Is Max dependent on GPT or can other LLMs be used?


Mike: So it’s 100% flexible to use lots of different versions of GPT, or lots of different locations where the language models are stored, or lots of different language models. We look at the language model very much like a database, something that over time will become faster and cheaper and more commoditized, and we want to be able to swap in and out whatever those models are over time, so that we’re not dependent on any one of them. We do use every capability that’s available to us from the language models, things like completions and embeddings, which are technical terms for the capabilities of the models, and we will look for those same capabilities as we expand into additional models. But it’s not a dependency for our solution. In fact, there is a mode in which AnswerRocket can run, and did run until about six months ago when these language models were introduced, that does not rely on external language models at all.
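To make the idea of treating a language model like a swappable database concrete, here is a minimal Python sketch of that kind of abstraction. The class and method names are hypothetical, not AnswerRocket’s actual code, and the underlying client is assumed to be whatever SDK a given provider ships.

```python
from abc import ABC, abstractmethod
from typing import List


class LanguageModel(ABC):
    """A model-agnostic interface: any backend that can do completions and
    embeddings can be plugged in, much like swapping one database for another."""

    @abstractmethod
    def complete(self, prompt: str) -> str:
        """Return a text completion for the prompt."""

    @abstractmethod
    def embed(self, text: str) -> List[float]:
        """Return an embedding vector for the text."""


class HostedGPTModel(LanguageModel):
    """One hypothetical backend wrapping a hosted GPT endpoint; a local or
    industry-specific model would implement the same two methods."""

    def __init__(self, client):
        # `client` is assumed to be an already-configured vendor SDK object
        # exposing completion and embedding calls; the exact API varies by vendor.
        self.client = client

    def complete(self, prompt: str) -> str:
        return self.client.complete(prompt)

    def embed(self, text: str) -> List[float]:
        return self.client.embed(text)
```

Because the application only depends on the `LanguageModel` interface, a faster or cheaper backend can be swapped in later without touching the analysis code.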


That mode relies instead on the semantics of the database, on the ontology that’s defined by a business and how they like to use their terms, so it does not require a GPT source. But when there is a language model in the mix, you get a more conversational flow to the analysis, which makes it feel a lot more comfortable to the user. It’s clear that, from a foundation model perspective, the providers of the core algorithms behind these models will offer models that are specific to medical, to consumers, and to different industries and spaces. So we very much expect to be able to multiplex across those models as appropriate for the use case, and again treat them like any other component of infrastructure, whether that’s storage or database or compute.
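A rough sketch of the multiplexing idea, again in Python and again with hypothetical names: route each question to a domain-specific model when one is registered, fall back to a default model otherwise, and skip the model entirely when only the ontology-driven path is configured. It reuses the `LanguageModel` interface sketched above.

```python
from typing import Callable, Dict, Optional


class ModelRouter:
    """Multiplexes across language models by use case, with an
    ontology-only fallback when no model is available."""

    def __init__(self, default: Optional[LanguageModel] = None):
        self.default = default
        self.by_domain: Dict[str, LanguageModel] = {}

    def register(self, domain: str, model: LanguageModel) -> None:
        # e.g. register("medical", some_medical_model)
        self.by_domain[domain] = model

    def answer(self, question: str, domain: str,
               ontology_answer: Callable[[str], str]) -> str:
        model = self.by_domain.get(domain, self.default)
        facts = ontology_answer(question)  # semantics defined by the business ontology
        if model is None:
            # No language model in the mix: return the ontology-driven result as-is.
            return facts
        # With a model available, wrap the result in a more conversational reply.
        return model.complete(
            f"Answer the user's question conversationally.\n"
            f"Question: {question}\nFacts: {facts}"
        )
```

As an illustration, a consumer-goods deployment might register an industry-tuned model for that domain while leaving a general-purpose GPT as the default, so each use case gets the most appropriate backend.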


These models just become one more asset that’s available to enterprise applications that are really putting together productivity suites for the end user. 

Conclusion: AnswerRocket is not solely dependent on GPT; in fact, it initially operated without external language models, relying on database semantics and business-defined ontologies. Incorporating language models, however, enhances the user experience by enabling a more conversational flow in data analysis. The focus is on leveraging the diverse capabilities of language models while treating them as infrastructure components alongside storage, databases, and compute. Analytics and insights experts like Mike foresee a future of specialized language models catering to various industries. The aim is to give enterprise applications richer productivity suites for end users by multiplexing across different models as each use case requires.
