Multi-provider AI
A guide on how to use the new RubyLLM library, which lets you connect to several LLM providers instead of just OpenAI
The ruby_llm gem is a powerful, multi-provider LLM client for Ruby that makes it simple to interact with models from OpenAI, Anthropic, Gemini (Google), and DeepSeek. It offers rich features like:
Real-time streaming responses
Multi-modal inputs (images, audio, PDFs)
Easy-to-define tools
Native Rails integration with acts_as_chat
This guide walks you through adding the ruby_llm gem to a Lightning Rails project, including configuration, usage, and upgrading any existing service classes.
Add the new ruby_llm gem to your Gemfile:
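For example (unpinned here; add a version constraint if you want reproducible installs):

```ruby
# Gemfile
gem "ruby_llm"
```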
Then install it:
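From the project root:

```bash
bundle install
```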
Alternatively, install it directly:
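That command is simply:

```bash
gem install ruby_llm
```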
In your .env file (Lightning Rails uses dotenv-rails by default), add:
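You only need keys for the providers you actually use. The variable names below are the conventional ones, but what matters is that they match whatever your initializer reads:

```
OPENAI_API_KEY=sk-your-openai-key
ANTHROPIC_API_KEY=your-anthropic-key
GEMINI_API_KEY=your-gemini-key
DEEPSEEK_API_KEY=your-deepseek-key
```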
Then create a RubyLLM config initializer:
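A minimal initializer might look like the following sketch; the exact configuration keys depend on the ruby_llm version you install, so check the gem's README if a setting is rejected:

```ruby
# config/initializers/ruby_llm.rb
RubyLLM.configure do |config|
  config.openai_api_key    = ENV["OPENAI_API_KEY"]
  config.anthropic_api_key = ENV["ANTHROPIC_API_KEY"]
  config.gemini_api_key    = ENV["GEMINI_API_KEY"]
  config.deepseek_api_key  = ENV["DEEPSEEK_API_KEY"]
end
```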
Create app/services/multi_provider_models.rb:
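A sketch of what such a service could look like, assuming the chat API from the gem's README (RubyLLM.chat and #ask); the class and method names are illustrative, not part of Lightning Rails:

```ruby
# app/services/multi_provider_models.rb
class MultiProviderModels
  # Ask a single question and return the assistant's reply as a string.
  # Any model supported by ruby_llm can be passed in, so callers can
  # switch providers (OpenAI, Anthropic, Gemini, DeepSeek) per request.
  def ask(prompt, model: "gpt-4o-mini")
    chat = RubyLLM.chat(model: model)
    chat.ask(prompt).content
  end
end
```

Calling MultiProviderModels.new.ask("Hello!") then returns the reply text, and swapping the model: keyword is all it takes to target a different provider.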
If you are building a chatbot and don't want users to reload the page for every message, you can stream responses in real time by modifying the ask method:
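With ruby_llm, passing a block to ask yields partial chunks as they arrive. A streaming variant of the service method might look like this (a sketch, not the original guide's exact code):

```ruby
# Streams the reply: yields each chunk to the caller as it arrives
# and still returns the full text at the end.
def ask_streaming(prompt, model: "gpt-4o-mini")
  chat = RubyLLM.chat(model: model)
  full_reply = +""

  chat.ask(prompt) do |chunk|
    full_reply << chunk.content.to_s
    yield chunk.content if block_given?
  end

  full_reply
end
```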
Or, in a controller with Turbo Streams:
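One approach is to broadcast each chunk over a Turbo Stream as it arrives. Everything here (controller, stream name, DOM target) is a hypothetical sketch:

```ruby
# app/controllers/chats_controller.rb (hypothetical)
class ChatsController < ApplicationController
  def create
    chat = RubyLLM.chat(model: "gpt-4o-mini")

    chat.ask(params[:prompt]) do |chunk|
      # The page subscribes with <%= turbo_stream_from "chat_#{current_user.id}" %>
      # and renders the chunks into <div id="assistant_response">.
      Turbo::StreamsChannel.broadcast_append_to(
        "chat_#{current_user.id}",
        target: "assistant_response",
        html: chunk.content
      )
    end

    head :ok
  end
end
```

In practice you would usually move the streaming call into a background job so the HTTP request does not block for the whole generation, but the broadcast pattern stays the same.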
If you'd like to track (save/update) chats and messages in the database, use the acts_as_chat setup:
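ruby_llm ships acts_as_chat and acts_as_message macros for ActiveRecord. Assuming you have created the chats and messages tables the gem expects (its Rails guide documents the schema), the models can be as small as:

```ruby
# app/models/chat.rb
class Chat < ApplicationRecord
  acts_as_chat            # persists the conversation and its messages
end

# app/models/message.rb
class Message < ApplicationRecord
  acts_as_message         # stores individual user/assistant messages
end
```

Something like Chat.create!(model_id: "gpt-4o-mini").ask("Hello") then records both sides of the exchange; check the gem's Rails documentation for the exact column names it expects.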
Prefer RubyLLM.chat(model: "gpt-4o-mini") for model-specific tasks.
Use streaming when showing responses live in the UI.
Leverage multi-modal inputs (images, PDFs, audio) for advanced functionality (see the attachment sketch after this list).
Use RubyLLM::Tool to define reusable actions the LLM can call dynamically (see the tool sketch after this list).
Always sanitize and validate user-submitted input before interpolating it into prompts.
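To illustrate the multi-modal point above: ruby_llm accepts attachments through the with: option on ask (the file paths are placeholders):

```ruby
chat = RubyLLM.chat
chat.ask("What is in this image?", with: { image: "tmp/product_photo.png" })
chat.ask("Summarize this document", with: { pdf: "tmp/quarterly_report.pdf" })
```

And for RubyLLM::Tool, a small sketch following the pattern in the gem's README; the tool name, parameter, and lookup logic are invented for the example:

```ruby
class OrderStatusTool < RubyLLM::Tool
  description "Looks up the shipping status of a customer's order"
  param :order_number, desc: "The order number to look up"

  def execute(order_number:)
    order = Order.find_by(number: order_number)
    order ? order.status : "No order found for #{order_number}"
  end
end

# Attach it so the model can call it when relevant:
# RubyLLM.chat.with_tool(OrderStatusTool).ask("Where is order 1234?")
```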
❓ Getting "Missing API key" errors?
Make sure your .env file includes OPENAI_API_KEY (or the key for the provider of your choice) and that it is loaded via dotenv.
❓ Getting an "undefined method 'acts_as_chat'" error?
Ensure you've added the proper ActiveRecord models with acts_as_chat, and that the ruby_llm gem is loaded correctly. Also, don't forget to restart your server after installing the gem.
❓ No response from the ask method?
Wrap your call in a puts or log the output. Also, try a basic prompt like:
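For example, a minimal sanity check from the Rails console (assuming the configuration above):

```ruby
puts RubyLLM.chat.ask("Say hello in one short sentence.").content
```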
Typical use cases include:
Product analysis and market research
Image captioning or PDF summarization
Building internal tools with custom RubyLLM::Tool classes
Streaming AI-powered responses in chat UIs or dashboards