Async Ruby is the Future of AI Apps (And It's Already Here)
14-Jul-2025
After a decade as an ML engineer/scientist immersed in Python’s async ecosystem, returning to Ruby felt like stepping back in time. Where was the async revolution? Why was everyone still using threads for everything? SolidQueue, Sidekiq, GoodJob – all thread-based. Even newer solutions defaulted to the same concurrency model.
Coming from Python, where the entire community had reorganized around asyncio, this seemed bizarre. FastAPI replaced Flask. Every library spawned an async twin. The transformation was total and necessary.
Then, while building RubyLLM and Chat with Work, I realized that LLM communication is async Ruby's killer app. The unique demands of streaming AI responses – long-lived connections, token-by-token delivery, thousands of concurrent conversations – expose exactly why async matters.
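To make that concrete, here's a minimal sketch using the async gem (the fiber-based concurrency model this post is about). The `stream_reply` helper and the conversation list are hypothetical stand-ins for a streaming LLM client call; the point is that each streaming response suspends only its own fiber while waiting on the network, so a single process can juggle many conversations at once.

```ruby
require "async"

# Hypothetical stand-in for a streaming LLM call: yields tokens as they arrive.
# With Async 2 on Ruby 3.x, `sleep` (and real network IO) suspends only this
# fiber, not the whole thread.
def stream_reply(prompt)
  "streamed reply to #{prompt.inspect}".split.each do |token|
    sleep 0.05 # simulates waiting on the provider between tokens
    yield token
  end
end

conversations = ["Explain fibers", "Summarize this doc", "Write a haiku"]

Async do |parent|
  conversations.each do |prompt|
    # Each conversation gets its own lightweight, fiber-backed task.
    parent.async do
      stream_reply(prompt) do |token|
        puts "[#{prompt[0, 12]}] #{token}"
      end
    end
  end
end
# The top-level Async block waits for all child tasks before returning,
# and the token streams interleave instead of running one after another.
```

Run the same code with threads-per-conversation and you pay for a full thread stack per chat; with fibers, a parked conversation costs almost nothing.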