LangChain logged 29 release events in 30 days and 30 release events in 90 days, with 134,490 GitHub stars and a ToolVitals score of 100. That is not a sleepy framework. It is a project that keeps moving across a lot of surface area.

The interesting part is where the releases landed. These are not version bumps to a single monolith. They are package-level point releases across langchain-openai, langchain-anthropic, langchain-core, langchain-classic, and langchain-huggingface. That points to a modular stack built around model providers and compatibility layers, with maintenance spread across adapters instead of concentrated in one core library.
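That package split shows up directly in how you consume LangChain: each provider adapter is a separate install. A minimal sketch of loading a provider's chat model class only when its adapter package is present (the package and class names, `langchain_openai.ChatOpenAI` and `langchain_anthropic.ChatAnthropic`, are the real ones shipped by those packages; the `load_chat_model` helper itself is hypothetical):

```python
import importlib

# Provider name -> (adapter package, chat model class) for two of the
# adapter packages named in the release notes above.
PROVIDER_PACKAGES = {
    "openai": ("langchain_openai", "ChatOpenAI"),
    "anthropic": ("langchain_anthropic", "ChatAnthropic"),
}

def load_chat_model(provider: str):
    """Return the provider's chat model class, or None if the adapter
    package is not installed. Each adapter ships and versions separately,
    which is exactly why the release feed shows per-package point releases."""
    module_name, class_name = PROVIDER_PACKAGES[provider]
    try:
        module = importlib.import_module(module_name)
    except ImportError:
        return None  # adapter not installed; nothing else breaks
    return getattr(module, class_name)
```

The point of the guard is the dependency-management consequence: a fix in langchain-openai can land without touching langchain-anthropic or langchain-core, so your lockfile moves package by package.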

The latest release notes show the same pattern. On April 20, langchain-openai 1.1.15 fixed streaming responses that return dict items. On April 17, langchain-anthropic 1.4.1 added Opus 4.7 support. The core line is active too, with 1.3.0 adding chat model and LLM invocation params to traceable metadata, while also tightening SSRF policy handling. This is maintenance plus integration churn, not cosmetic versioning.

ToolVitals can infer shipping intensity and where the maintenance effort is going. It cannot tell you whether LangChain is easy to use, whether teams finish projects faster with it, whether the abstractions are sane, or whether it works well in production. It also cannot see code quality, user satisfaction, revenue, or developer frustration. A score of 100 says the repo is active and healthy in the narrow sense ToolVitals measures, not that every developer loves it.

For comparison, TanStack Query shows 9 release events in 30 days, Qwik shows 14, ToolJet shows 28, and Analog shows 30. LangChain sits near the top of that set with 29, which means it is shipping like a live platform, not a frozen library. The star count is also much larger than the framework peers in this slice, but stars measure attention, not truth.
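The comparison above is just a ranking over one metric, and it can be sketched in a few lines (the figures are the 30-day release counts quoted in this section; nothing else is assumed):

```python
# 30-day release event counts as reported above.
release_events_30d = {
    "LangChain": 29,
    "Analog": 30,
    "ToolJet": 28,
    "Qwik": 14,
    "TanStack Query": 9,
}

# Rank by shipping intensity, highest first.
ranked = sorted(release_events_30d.items(), key=lambda kv: kv[1], reverse=True)
for name, events in ranked:
    print(f"{name}: {events} releases / 30d")
```

A single-metric sort like this is useful for spotting frozen libraries at the bottom, but as the section notes, it says nothing about usability or production fitness.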

If your team depends on LLM integrations, track LangChain as a maintained dependency, not a one-off library pick. If you want a quiet, low-churn stack, this is the wrong bet. If you want active adapter support and fast model-specific fixes, the release data says LangChain is still very much in the ring.

Sources