A Look at Non-SaaS LLM Chatbot Options: Should You Go Open Source or Not?
One of my readers posted a question. At first, I intended to write a short response, but it turned into a long answer with some valuable nuggets that deserved a post of its own.
If you are thinking about building an exclusively private ChatGPT clone or LLM-powered chatbot, you're going to want to read on to find out what your options are.
Let's kick off by having a look at the question that was posted in the comments section here.
Nobody is at least a bit worried that all the training on individual interests is feeding central databases located in private servers. Is there an efficient way to make a stand-alone version of ChatGPT in our own servers (I guess training is the issue to overcome), making personal assistants that not necessarily share our information but, on the contrary, helps us to block information spreading while learning sensitive behavior of the individual users? — Sbrauchi
There are three questions that are related to the reader's question that I would like to address.
What are the current general concerns around conversation-style chatbots built using LLMs?
What are the things to consider as you weigh all options for ChatGPT clones?
What are the open-source LLM options?
TL;DR: For those who want to skip ahead, the core of this post is the list of options for building a truly customized, self-owned, open-source ChatGPT-style model.
OpenAI was the first to make significant strides in this area and has made it easier to build abstractions around its service. However, convenience presents issues around centralization, costs of going through an intermediary, data privacy, and copyright issues.
Data sovereignty and governance are the primary concerns: how do these new LLM service providers handle trade secrets and sensitive information, and to what extent has user data been used for pre-training to enhance model capability? There are also growing concerns that big firms could monopolize such models for vested interests that may not align with yours.
When ChatGPT initially launched, this was one of the central points of discussion, and it still is. There are additional concerns around factual accuracy, bias, offensive responses, and hallucinations, which at times do plague ChatGPT, though these have been reduced in version 4. While there are issues, I believe the upside outweighs the shortcomings.
We also cannot dismiss that ChatGPT and LLMs are what they are today because of the collective intelligence in the data that users like you and me have indirectly contributed across various forums and channels.
Self-hosted with SaaS LLM API integration
Customize your LLM chatbot to behave based on your own parameters, i.e., your dataset. In this scenario, OpenAI provides the API and holds proprietary rights to the model. You are the API integrator building a client that interfaces with their service. The data is owned by you, while you delegate portions of it to OpenAI's model.
Chatbot code and behavior are based on your logic, while the underlying model is pay-per-use or, in ChatGPT's case, pay-per-token. Computation resources sit primarily on OpenAI's servers; you may incur additional computation expenses to train or tune OpenAI's models on your data.
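To make the integration pattern concrete, here is a minimal sketch using the openai Python package (the pre-1.0 interface; newer releases of the library changed the call signature). The system prompt and parameter values are illustrative assumptions, not recommendations.

```python
import openai  # pip install openai

openai.api_key = "YOUR_API_KEY"  # keep this in an env var or secret store in practice

def ask_bot(user_message: str) -> str:
    """Send one turn to OpenAI's hosted model and return the reply text."""
    response = openai.ChatCompletion.create(
        model="gpt-3.5-turbo",  # hosted model; you pay per token
        messages=[
            # The system prompt is where "your own parameters" live.
            {"role": "system", "content": "You are a helpful assistant for our support site."},
            {"role": "user", "content": user_message},
        ],
        temperature=0.7,   # sampling randomness
        max_tokens=256,    # cap on the length of the reply
    )
    return response["choices"][0]["message"]["content"]

if __name__ == "__main__":
    print(ask_bot("What are my options for self-hosting an LLM?"))
```

Your chatbot logic wraps around this call; the model itself stays on OpenAI's side of the fence.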
Fully self-hosted LLMs
The other route is having your own LLM that you self-host and train, with zero interaction with the outside world, able to work in isolation. The model sits on your own cloud or on-prem servers.
Users who opt for this are sensitive about sharing proprietary trade secrets, or are bound by privacy policies or geographic restrictions on data sharing that a SaaS LLM may not be able to accommodate.
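As a point of contrast, a fully self-hosted setup keeps inference on hardware you control. Here is a minimal sketch using the Hugging Face transformers library; gpt2 is just a small stand-in so the script runs anywhere, and you would swap in whichever open chat-tuned checkpoint you settle on.

```python
# pip install transformers torch
from transformers import pipeline

# Stand-in model so the example runs on a laptop; replace with the
# open chat model you actually choose to self-host.
MODEL_ID = "gpt2"

generator = pipeline("text-generation", model=MODEL_ID)

prompt = "Question: Why would a company self-host its chatbot?\nAnswer:"
result = generator(prompt, max_new_tokens=64, do_sample=True, temperature=0.7)
print(result[0]["generated_text"])

# Nothing here leaves your machine: no API key, no third-party service,
# which is the whole point of the fully self-hosted option.
```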
Fully delegated to a SaaS LLM provider
Here the cloud solution provider does the training and model building on a subscription or pay-per-use basis; you, as the client, provide the additional data, define the parameters of your model, and tune it to your specifications. This approach is a delegation of control. Cohere, AI21 Labs, and OpenAI are among the large companies with such commercial offerings.
Once the hype around ChatGPT wanes and everyone understands the benefits of LLMs, individuals and organizations will likely want models that they own and operate.
We can sum up the reasons as full ownership and control.
Given the impact of such language models, it is imperative that the broader community has a good understanding of how these models are constructed, how they function, and how they can be improved. With centralized services, that information is hard to get, but we can look at open-source solutions to infer how they may be doing it.
It's always good to familiarize yourself with the available options. The list below can be a starting point for checking whether there are cheaper, alternative ways to build conversation-style chatbots similar to ChatGPT.
Some of the models below can run on your laptop, provided you meet the requirements; you also have the option of running some of them via Google Colab, which comes with a 51 GB RAM option. Counter to what many might think, the cost of training open-source models is generally relatively low if the objective is not scale.
Given a basic understanding of the actual training process, iterative cycles can be shortened.
1. OpenChatKit
OpenChatKit uses a 20-billion-parameter chat model trained on 43 million instructions and supports reasoning, multi-turn conversation, knowledge, and generative answers. OpenChatKit is designed for conversation and instruction-following. Generally, the bot is good at summarizing, generating tables, classification, and dialog.
OpenChatKit version 0.15 was launched under an Apache-2.0 license, which grants you complete access to the source code, model weights, and training datasets, as the initiative is community-driven.
One of the notable things OpenChatKit provides out of the box is a retrieval system for live-updating answers, allowing the chatbot to integrate updated or customized content, such as information from Wikipedia, news feeds, or sports scores, into its responses. Internet access was only recently integrated into ChatGPT (GPT-4) via plugins, but the same idea can easily be applied to older GPT models.
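OpenChatKit ships its own retrieval components (see the repository for the actual wiring), but the underlying idea is plain retrieval-augmented prompting. The sketch below illustrates the pattern generically with scikit-learn rather than OpenChatKit's own API: fetch the most relevant snippet from a small, live document store and fold it into the prompt.

```python
# pip install scikit-learn
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# A tiny "live" document store; in practice this could be refreshed
# from Wikipedia dumps, news feeds, or sports scores.
documents = [
    "The city marathon was rescheduled to October 12 due to heat.",
    "Team Alpha won last night's final 3-1.",
    "The new transit line opens to the public next Monday.",
]

vectorizer = TfidfVectorizer()
doc_vectors = vectorizer.fit_transform(documents)

def build_prompt(question: str) -> str:
    """Pick the most relevant snippet and fold it into the model prompt."""
    q_vec = vectorizer.transform([question])
    scores = cosine_similarity(q_vec, doc_vectors)[0]
    best_doc = documents[scores.argmax()]
    return f"Context: {best_doc}\n\nQuestion: {question}\nAnswer:"

print(build_prompt("Who won the final?"))
# The resulting prompt is what you feed to the chat model, so its answer
# can reflect content newer than the model's training data.
```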
Where to find the demo?
2. Vicuna
Vicuna is an open-source chatbot with 13B parameters, trained by fine-tuning LLaMA on user conversations collected from ShareGPT.com, a community site where users can share their ChatGPT conversations. Based on the evaluations that have been done, the model reaches more than 90% of the quality of OpenAI's ChatGPT and Google's Bard, which makes it one of the top open-source models in terms of feature parity with ChatGPT. It is also able to write code, which is less common among other open-source LLM chatbots, as illustrated below.
The cost of training Vicuna-13B is estimated at around $300 based on publicly available information. You can look at the Git repository for instructions on how to clone and set up your own self-hosted node. The training, serving, and evaluation code has been shared as well.
Demo: https://chat.lmsys.org/
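One practical detail: the FastChat repository behind Vicuna documents an OpenAI-compatible REST server, so an existing OpenAI client can be pointed at your self-hosted node. The sketch below assumes you have already followed the repo's serving instructions and have such a server listening on localhost:8000; the model name and port are assumptions to check against the current README.

```python
import openai  # pip install openai (pre-1.0 interface)

# Point the client at the locally running FastChat server instead of OpenAI.
openai.api_base = "http://localhost:8000/v1"
openai.api_key = "EMPTY"  # the local server does not check the key

response = openai.ChatCompletion.create(
    model="vicuna-13b",  # must match the model name your local server registered
    messages=[{"role": "user", "content": "Write a Python one-liner to reverse a string."}],
    temperature=0.7,
    max_tokens=128,
)
print(response["choices"][0]["message"]["content"])
```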
3. Alpaca
Alpaca builds on Meta's LLaMA with the sole goal of making LLMs more cheaply accessible. Based on prior studies and benchmarks done by Stanford University's Center for Research on Foundation Models, the Alpaca model can be retrained for as little as $600, which is cheap given the benefits derived.
There are also two additional Alpaca variants: Alpaca.cpp and Alpaca-LoRA. Using the cpp variant, you can run a fast ChatGPT-like model locally on your laptop; an M2 MacBook Air handles the 4GB of weights, which most laptops today should be able to manage. The cpp variant combines Facebook's LLaMA, Stanford Alpaca, alpaca-lora, and the corresponding weights. You can find details on how the fine-tuning was done here.
Demo: https://huggingface.co/spaces/tloen/alpaca-lora
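The LoRA part of Alpaca-LoRA is what keeps retraining cheap: instead of updating all of the base model's weights, you train small low-rank adapter matrices on top of it. A rough sketch of that setup with the Hugging Face peft library follows; the base-model identifier is a placeholder, and the hyperparameters mirror commonly published Alpaca-LoRA configs rather than anything prescriptive.

```python
# pip install transformers peft torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, get_peft_model

BASE_MODEL = "your-org/llama-style-base-model"  # placeholder for the LLaMA-family checkpoint you have access to

tokenizer = AutoTokenizer.from_pretrained(BASE_MODEL)
base_model = AutoModelForCausalLM.from_pretrained(BASE_MODEL)

# Low-rank adapters are injected into the attention projections only,
# so the number of trainable parameters is a tiny fraction of the model.
lora_config = LoraConfig(
    r=8,
    lora_alpha=16,
    target_modules=["q_proj", "v_proj"],
    lora_dropout=0.05,
    bias="none",
    task_type="CAUSAL_LM",
)

model = get_peft_model(base_model, lora_config)
model.print_trainable_parameters()  # typically well under 1% of total parameters

# From here you would run a standard supervised fine-tuning loop
# (e.g. transformers.Trainer) over an Alpaca-style instruction dataset.
```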
4. GPT4All
GPT4All is a community-driven project trained on a massive curated collection of written assistant interactions, including code, stories, depictions, and multi-turn dialogue. The team has provided the datasets, model weights, data-curation process, and training code to promote the open-source model. There is also a quantized 4-bit release of the model that is able to run on your laptop, as it requires less memory and computation power. A Python client is also available that you can use to interact with the model.
Demo: https://huggingface.co/spaces/rishiraj/GPT4All
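The Python client mentioned above is published as the gpt4all package. Something along these lines worked at the time of writing, though the model name below is an assumption and both the available checkpoints and method signatures change between releases, so check the project docs.

```python
# pip install gpt4all
from gpt4all import GPT4All

# The model file is downloaded on first use; the exact name is an
# assumption here -- pick one from the list the project publishes.
model = GPT4All("ggml-gpt4all-j-v1.3-groovy")

reply = model.generate(
    "Summarize why quantized 4-bit models can run on a laptop.",
    max_tokens=128,
)
print(reply)
```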
5. ChatRWKV
ChatRWKV is an open-source chatbot powered by RWKV, an RNN-based language model that reaches Transformer-level LLM performance. Its results are comparable with those of ChatGPT. Fine-tuning of the model was done using Stanford Alpaca and other datasets.
Demo: https://huggingface.co/spaces/BlinkDL/ChatRWKV-gradio
6. Bloom
BLOOM is an open-source LLM with 176 billion+ parameters. It is relatively on par with ChatGPT and is able to master tasks in 46 languages and 13 programming languages. One of the barriers to entry is the roughly 350 GB of RAM required to run it. There's a lighter version, which you can find here.
The development of BLOOM is coordinated by BigScience, a vibrant open research collaboration with a mission to publicly release LLMs. You can find more details on how to get started with Bloom via the GitHub README.
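You don't need the full 176B checkpoint to experiment with the family. As an illustration only (not necessarily the specific lighter build linked above), the smaller bigscience checkpoints load on modest hardware with transformers:

```python
# pip install transformers accelerate torch
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "bigscience/bloom-560m"  # a small member of the BLOOM family

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    torch_dtype=torch.float16,  # roughly halves memory vs. float32
    device_map="auto",          # let accelerate place weights on available devices
)

inputs = tokenizer("The advantage of multilingual models is", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=40)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```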
There's also a write-up with step-by-step deployment instructions: Deploy BLOOM-176B and OPT-30B on Amazon SageMaker with large model inference Deep Learning Containers and DeepSpeed.
Where to find the demo?
- https://huggingface.co/spaces/huggingface/bloom_demo
- https://github.com/aws/amazon-sagemaker-examples/blob/main/inference/nlp/realtime/llm/bloom_176b/djl_deepspeed_deploy.ipynb
Variants of BLOOM
- BLOOM-LoRA
- Petals: a P2P, BitTorrent-style collaborative approach to running models; it almost feels like a blockchain network. Petals can run large language models like BLOOM-176B collaboratively: you load a small part of the model, then team up with people serving the other parts to run inference or fine-tuning on shared computing resources. With this approach, you do not need much computation power of your own. You can find the demo here.
7. Honorable mention
Hugging Face is one of those websites you need in your Batman/Batwoman tool belt, and you most definitely want to get acquainted with it. It's the mecca of NLP resources. While Hugging Face is not an LLM itself, it is a company focused on solving natural language processing problems, and it acts as a catalyst by making research-level NLP work accessible to the masses.
At a quick glance at the site, you will notice it houses Transformers, a centralized repository of open-source libraries for the natural language processing tasks that form the base of LLMs, such as text classification, language generation, and question answering.
You will see various ChatGPT-like clones built off various models. One of the benefits of the platform is that users can store, share, host, and collaborate on their trained models. You can iteratively gather feedback and let the community evaluate your solution.
If you look at the open-source list above, there's a general theme: most of the LLM variants are derived from either Meta AI's LLaMA or BLOOM as the foundation model. It's relatively straightforward to create your own variations, given there's enough literature to help you get started.
If you decide to go open source, the fundamental principles for integrating these open-source models and making them production-ready will be the same. You will have to build an interface to expose the models so you can interact with them. Most completion models will require input text and arguments like temperature, max_input_tokens, max_output_tokens, etc., for tuning the output.
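What that interface looks like is up to you; a common pattern is a thin HTTP wrapper around whichever model you picked. Here is a sketch with FastAPI, where the endpoint path, parameter names, and the stubbed-out run_model call are all assumptions you would replace with your own serving logic.

```python
# pip install fastapi uvicorn pydantic
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class CompletionRequest(BaseModel):
    prompt: str
    temperature: float = 0.7       # sampling randomness
    max_output_tokens: int = 256   # cap on generated length

class CompletionResponse(BaseModel):
    text: str

def run_model(prompt: str, temperature: float, max_output_tokens: int) -> str:
    # Placeholder: call into your self-hosted model here
    # (a transformers pipeline, a llama.cpp binding, etc.).
    return f"[model output for: {prompt[:40]}...]"

@app.post("/v1/completions", response_model=CompletionResponse)
def complete(req: CompletionRequest) -> CompletionResponse:
    text = run_model(req.prompt, req.temperature, req.max_output_tokens)
    return CompletionResponse(text=text)

# Run with: uvicorn app:app --port 8080
```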
One of the obstacles to adopting open-source LLMs for conversational chatbots is evaluating the performance of the newly trained model. A deep understanding of reasoning and context awareness will be required.
Libraries like LangChain have built abstractions over the hard stuff. If you are going to build your own, understanding how to construct these abstractions will be important.
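For a flavor of what those abstractions look like, here is a minimal LangChain chain around a prompt template. The import paths below match the 0.0.x releases current when this was written and have moved in later versions, and the OpenAI LLM wrapper is just one interchangeable backend.

```python
# pip install langchain openai  (LangChain 0.0.x-era imports)
from langchain.llms import OpenAI
from langchain.prompts import PromptTemplate
from langchain.chains import LLMChain

prompt = PromptTemplate(
    input_variables=["question"],
    template="You are a concise assistant.\nQuestion: {question}\nAnswer:",
)

llm = OpenAI(temperature=0.7)  # swap for a self-hosted LLM wrapper if preferred
chain = LLMChain(llm=llm, prompt=prompt)

print(chain.run(question="What does a LoRA adapter change in a model?"))
```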
While there are many merits to rolling your own solution, hosting and scaling custom LLMs carries long-term maintenance costs, which you generally wouldn't worry about when using a SaaS LLM service.
To put things into perspective, the cost of training ChatGPT at that scale is estimated at around $4.6 million when using the lowest-priced GPU cloud provider, excluding R&D and staffing costs. You can refer to this article for insight into the estimated costs of training LLMs at scale.
The model's size in parameters and the number of training tokens are variables that scale together: the larger the model, the longer it takes to train on a given configuration; the longer it takes, the more expensive it becomes; and the more mistakes you make, the longer it takes to complete. I believe you get the idea.
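To make that scaling intuition concrete, here is a back-of-envelope calculation using the common "training FLOPs ≈ 6 × parameters × tokens" approximation. Every number below (model size, token count, GPU throughput, utilization, hourly price) is an illustrative assumption, so treat the output as an order-of-magnitude estimate, not a quote.

```python
# Rough training-cost estimate: FLOPs ~= 6 * N_params * N_tokens
params = 13e9             # 13B-parameter model (assumption)
tokens = 1e12             # 1T training tokens (assumption)

gpu_peak_flops = 312e12   # A100-class peak throughput in FLOP/s (assumption)
utilization = 0.35        # realistic fraction of peak actually achieved (assumption)
price_per_gpu_hour = 2.0  # USD, cheap cloud pricing (assumption)

total_flops = 6 * params * tokens
gpu_seconds = total_flops / (gpu_peak_flops * utilization)
gpu_hours = gpu_seconds / 3600
cost = gpu_hours * price_per_gpu_hour

print(f"GPU-hours: {gpu_hours:,.0f}")
print(f"Estimated cost: ${cost:,.0f}")
# Double the parameters or the tokens and the bill doubles with them;
# every failed run adds its full cost on top.
```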
Ultimately, you will have to evaluate the ROI; the choice will come down to your needs. Below are things to think about before deciding to build a ChatGPT clone using open-source models.
Do you have sufficient data for training?
Do you have the technical ability to do the training?
Do you understand prompt engineering well enough?
Do you have a plan for model evaluation?
Do you have a good understanding of your target use case?
I will be sure to revise the list as better open-source options emerge, so do bookmark this post for future reference.