Production-ready toolkit for enterprise LangGraph agents

Fastal LangGraph Toolkit

Open source toolkit for building enterprise LangGraph agents with multi-provider support, intelligent conversation management, and speech processing capabilities.

Key Features

Multi-Provider Model Factory

Support for OpenAI (including GPT-5), Anthropic, Ollama, and AWS Bedrock, with seamless provider switching.

Intelligent Conversation Management

Summarization based on conversation structure, not just token counting.

Enterprise Speech Processing

Production-ready speech-to-text processing with multi-format support and language detection.

Memory Management

Type-safe SummarizableState with native LangGraph integration and token optimization.

A toolkit born from experience

The Fastal LangGraph Toolkit was originally developed internally by the Fastal Group to support enterprise-grade agentic application implementations across multiple client projects. After proving its effectiveness in production environments, we’ve open-sourced this toolkit to contribute to the broader LangGraph community.

pip install fastal-langgraph-toolkit

View package on PyPI

Why this toolkit

Building production LangGraph agents means solving a set of challenges that recur across advanced research and development projects:

  • Multi-provider Management — Support for multiple LLM/embedding/speech providers with seamless switching
  • Context Management — Intelligent conversation summarization for long-running sessions
  • Memory Optimization — Token-efficient context handling for cost control
  • Speech Processing — Enterprise-grade speech-to-text transcription capabilities
  • Type Safety — Proper state management with TypedDict integration
  • Configuration Injection — Clean separation between business logic and framework concerns

This toolkit provides battle-tested solutions for these challenges, extracted from real enterprise implementations.

Main features

Multi-Provider Model Factory

The current version of the model factory supports the following providers:

LLM Support: OpenAI (including GPT-5 models), Anthropic, Ollama, AWS Bedrock

Embeddings Support: OpenAI, Ollama, AWS Bedrock

Speech-to-Text Support: OpenAI Whisper (more providers coming soon)

Features:

  • Full GPT-5 Support: GPT-5, GPT-5-mini, and GPT-5-nano with automatic parameter mapping
  • Configuration Injection: Clean provider abstraction
  • Provider Health Checks: Availability validation
  • Seamless Switching: Change providers without code changes (see the sketch below)
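
To make the configuration-driven switching concrete, here is a minimal sketch. The import path and the create_llm() factory call are assumptions made for illustration only, not the toolkit's documented API; see the package on PyPI for the actual interface.

# Minimal sketch of configuration-driven provider switching.
# NOTE: the import path and create_llm() signature below are assumptions for
# illustration; consult the toolkit's documentation for the real API.
from fastal_langgraph_toolkit import ModelFactory  # hypothetical import

# Provider and model are plain configuration values, so moving between OpenAI,
# Anthropic, Ollama, and AWS Bedrock is a configuration change, not a code change.
llm = ModelFactory.create_llm(provider="openai", model="gpt-5-mini")

# The returned object is expected to behave like a LangChain chat model.
reply = llm.invoke("Give a one-sentence status update.")
print(reply.content)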

Enterprise Speech Processing

Production-ready speech-to-text processing with enterprise-grade reliability and performance; a brief usage sketch follows the feature list.

  • Multi-Format Support: MP3, WAV, M4A, and other common audio formats
  • Language Detection: Automatic language identification and custom language hints
  • Async Processing: Full async/await support for non-blocking operations
  • Segment Information: Detailed timestamp and confidence data when available
  • Error Handling: Robust error management with detailed logging
  • Lazy Loading: Efficient resource management with provider lazy loading
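
The async flow described above could look roughly like the following sketch. The create_stt() factory and the atranscribe() method are assumptions for illustration; only the OpenAI Whisper provider comes from the list above.

# Sketch of async speech-to-text; class and method names are assumptions.
import asyncio

from fastal_langgraph_toolkit import ModelFactory  # hypothetical import

async def transcribe(path: str) -> str:
    # OpenAI Whisper is the provider listed above; this factory call is hypothetical.
    stt = ModelFactory.create_stt(provider="openai", model="whisper-1")
    with open(path, "rb") as audio:
        # A language hint is optional; automatic detection applies otherwise.
        result = await stt.atranscribe(audio, language="en")
    return result.text

if __name__ == "__main__":
    print(asyncio.run(transcribe("meeting_recording.mp3")))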

Intelligent Conversation Summarization

The LangChain/LangGraph framework provides good support for managing both short-term and long-term memory in agents through the LangMem module. However, we found that automated summarization based solely on token counting is not sufficient for most real-world, complex agents.

The solution included in this toolkit takes a more sophisticated approach: summarization is driven by the structure of the conversation, with a focus on the subject and content of the discussion. A usage sketch follows the feature list below.

  • Ready-to-Use LangGraph Node: summary_node() method provides instant integration
  • Conversation Pair Counting: Smart Human+AI message pair detection
  • ReAct Tool Filtering: Automatic exclusion of tool calls from summaries
  • Configurable Thresholds: Customizable trigger points for summarization
  • Context Preservation: Keep recent conversations for continuity
  • Custom Prompts: Domain-specific summarization templates
  • State Auto-Injection: Seamless integration with existing states
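
The sketch below shows how the ready-to-use node could slot into a graph. summary_node() and SummarizableState are named above; the ConversationSummarizer class, its constructor arguments, and the import paths are assumptions for illustration. The StateGraph wiring itself is standard LangGraph.

# Hedged sketch: ConversationSummarizer and its arguments are assumptions;
# summary_node() is the ready-to-use node described above.
from langgraph.graph import StateGraph, START, END
from langgraph.checkpoint.memory import MemorySaver
from fastal_langgraph_toolkit import (  # hypothetical import paths
    ConversationSummarizer,
    ModelFactory,
    SummarizableState,
)

class AgentState(SummarizableState):
    pass  # business fields go here; summary fields come from the base class

def agent_node(state: AgentState) -> dict:
    # Placeholder for your existing agent logic.
    return {}

summarizer = ConversationSummarizer(
    llm=ModelFactory.create_llm(provider="openai", model="gpt-5-mini"),  # hypothetical
    pair_threshold=10,    # hypothetical knob: summarize after 10 Human+AI pairs
    keep_recent_pairs=3,  # hypothetical knob: keep the latest pairs verbatim
)

graph = StateGraph(AgentState)
graph.add_node("summarize", summarizer.summary_node())
graph.add_node("agent", agent_node)
graph.add_edge(START, "summarize")
graph.add_edge("summarize", "agent")
graph.add_edge("agent", END)

# Checkpointing works as usual; the running summary persists with the state.
app = graph.compile(checkpointer=MemorySaver())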

Memory Management

  • SummarizableState: Type-safe base class for summary-enabled states
  • Automatic State Management: No manual field initialization required
  • LangGraph Integration: Native compatibility with LangGraph checkpointing
  • Clean Architecture: Separation of concerns between summary and business logic (see the sketch below)
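
A minimal sketch of that separation, assuming SummarizableState is a TypedDict-style base class as the Type Safety bullet suggests; the import path and the fields the base class manages internally are assumptions.

# Sketch only: the import path and the base class's internals are assumptions.
from typing import Annotated

from langgraph.graph.message import add_messages
from fastal_langgraph_toolkit import SummarizableState  # hypothetical import

class SupportAgentState(SummarizableState):
    # Business fields live here; summary bookkeeping (e.g. the running summary
    # text) is expected to come from the base class, with no manual field
    # initialization required.
    messages: Annotated[list, add_messages]  # drop if the base class already defines it
    customer_id: str
    open_ticket_ids: list[str]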

Benefits

  • Open source Python package on PyPI
  • Battle-tested in real enterprise production projects
  • Full support for GPT-5, GPT-5-mini, GPT-5-nano
  • Native integration with LangGraph checkpointing

Interested?

Contact us for a demo or to receive more information.

Request demo