Introducing Claude 3.7 Sonnet and GPT-3.5 in noir-llm v0.2.3
I'm excited to announce the release of noir-llm v0.2.3, which now includes support for Anthropic's Claude 3.7 Sonnet and OpenAI's GPT-3.5 Turbo models! This update brings two powerful AI models to the noir-llm package, making them freely accessible for educational and personal projects.
What is noir-llm?
noir-llm is a Python package that provides a unified interface for accessing a variety of LLMs for free. It offers a simple API for interacting with different language models, so you can switch between them without changing your code.
New Models in v0.2.3
Claude 3.7 Sonnet
Claude 3.7 Sonnet is Anthropic's latest model, known for its advanced reasoning capabilities and nuanced responses. Our implementation includes:
- Token refresh mechanism to maintain session validity (see the sketch after this list)
- Rate limit bypass with user agent rotation
- Exponential backoff for retries
- Clean response handling without debug information
- System prompt support
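To make the token refresh idea concrete, here is a minimal sketch of a TTL-based lazy refresh. The class, method, and attribute names below are illustrative assumptions, not noir-llm's actual internals.

import time

class TokenStore:
    """Illustrative sketch only: keep a session token fresh between requests."""

    def __init__(self, ttl_seconds=300):
        self.ttl_seconds = ttl_seconds  # how long a token is treated as valid
        self.token = None
        self.fetched_at = 0.0

    def _fetch_new_token(self):
        # Hypothetical placeholder: a real client would re-authenticate here
        return "fresh-session-token"

    def get(self):
        # Refresh lazily whenever the cached token is missing or stale
        if self.token is None or time.time() - self.fetched_at > self.ttl_seconds:
            self.token = self._fetch_new_token()
            self.fetched_at = time.time()
        return self.token

Keeping the check lazy avoids a background timer: staleness is evaluated on each request.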
GPT-3.5 Turbo
GPT-3.5 Turbo is OpenAI's widely used model, known for its fast response times and solid general-purpose capabilities. Our implementation features:
- Cookie refresh for session maintenance
- Rate limit detection and handling
- Streaming response parsing
- Clean response output
- System prompt support
How to Use
Installation
pip install noir-llm
Python API
from noir import NoirClient
# Create a client
client = NoirClient()
# List available models
models = client.get_available_models()
print(f"Available models: {models}")
# Select Claude 3.7 model
client.select_model("claude-3-7-sonnet")
# Or select GPT-3.5 model
# client.select_model("gpt-3.5-turbo")
# Set a system prompt
client.set_system_prompt("You are a helpful assistant.")
# Send a message
response = client.send_message("What is the capital of France?")
print(f"Response: {response}")
Command Line Interface
# Start a chat session with Claude 3.7
noir-llm chat --model claude-3-7-sonnet
# Or with GPT-3.5
noir-llm chat --model gpt-3.5-turbo
# Send a single message
noir-llm send "What is the capital of France?" --model claude-3-7-sonnet
Technical Implementation Details
Claude 3.7 Integration
The Claude 3.7 implementation connects to a proxy API that provides access to Anthropic's model. Key features include:
- Cookie management with automatic refresh every 5 minutes
- User agent rotation from a pool of common browser agents
- Exponential backoff with jitter for rate limit handling
- Response cleaning to remove artifacts and debug information
- Conversation history tracking (sketched after the code example below)
# Example of the rate limit bypass implementation
# (runs inside the client's retry loop; `attempt` is the current retry count)
if response.status_code in [429, 403]:
    # Clear session and get new cookies
    self.session = requests.Session()
    self.refresh_cookies()
    # Rotate user agent
    random_user_agent = random.choice(self.user_agents)
    self.headers["user-agent"] = random_user_agent
    # Wait with exponential backoff
    backoff_time = self.retry_delay * (2 ** attempt) * random.uniform(0.8, 1.2)
    time.sleep(backoff_time)
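The conversation history tracking from the feature list can be pictured as a simple list of role-tagged messages that is resent with every request. The class and field names below are assumptions for illustration, not noir-llm's actual internals.

class Conversation:
    """Illustrative sketch only: track system/user/assistant turns."""

    def __init__(self, system_prompt=None):
        self.messages = []
        if system_prompt:
            self.messages.append({"role": "system", "content": system_prompt})

    def add_user(self, content):
        self.messages.append({"role": "user", "content": content})

    def add_assistant(self, content):
        self.messages.append({"role": "assistant", "content": content})

    def as_payload(self):
        # The full history goes out with each request so the model keeps context
        return {"messages": self.messages}

# Example usage
history = Conversation("You are a helpful assistant.")
history.add_user("What is the capital of France?")
print(history.as_payload())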
GPT-3.5 Integration
The GPT-3.5 implementation uses a similar approach with some model-specific optimizations:
- Streaming response handling for faster user experience
- Special parsing for the chunked response format (see the sketch after this list)
- Automatic retry logic for various error conditions
- Clean response formatting to remove debug information
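To illustrate the streaming parsing, here is a rough sketch that accumulates text from SSE-style "data:" chunks and lightly cleans the result. The framing and the JSON field layout are assumptions about a typical streaming endpoint; the actual proxy format may differ.

import json

def parse_stream(lines):
    """Illustrative sketch only: join text deltas from SSE-style 'data:' chunks."""
    parts = []
    for raw in lines:
        line = raw.strip()
        if not line.startswith("data:"):
            continue  # skip keep-alives and other noise
        data = line[len("data:"):].strip()
        if data == "[DONE]":
            break  # common end-of-stream sentinel
        try:
            chunk = json.loads(data)
        except json.JSONDecodeError:
            continue  # drop malformed or debug chunks
        choices = chunk.get("choices") or [{}]
        delta = choices[0].get("delta", {}).get("content") or ""
        parts.append(delta)
    return "".join(parts).strip()  # join the deltas and trim whitespace

# Example with a fake stream
fake = [
    'data: {"choices":[{"delta":{"content":"Par"}}]}',
    'data: {"choices":[{"delta":{"content":"is"}}]}',
    'data: [DONE]',
]
print(parse_stream(fake))  # -> Paris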
Available Models
noir-llm now provides access to a growing list of models:
- Claude 3.7 Sonnet: Anthropic's advanced reasoning model
- GPT-3.5 Turbo: OpenAI's reliable general-purpose model
- GLM-4-32B: A powerful language model with web search capabilities
- Z1-32B: Another powerful language model with web search capabilities
- Z1-Rumination: A model optimized for deep research and analysis
- Mistral-31-24B: A high-quality language model from Venice AI
- Llama-3.2-3B: A compact but powerful model from Venice AI
Disclaimer
This package is for educational purposes only. Use at your own risk. The package accesses third-party APIs without official authorization, which may violate terms of service of the respective providers.
What's Next?
We're continuously working to add more models and improve the existing implementations. Stay tuned for future updates!
Have you tried noir-llm? What models would you like to see added next? Let me know in the comments!