LM Studio Integration:
- Added an LM Studio provider using its OpenAI-compatible API
- Dynamic model discovery via the /v1/models endpoint (see the sketch below)
- Support for both chat and embeddings models
- Docker-compatible networking configuration
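A minimal sketch of the discovery call, assuming a standard OpenAI-compatible /v1/models response; the base URL, environment variable, and type names here are illustrative, not the exact identifiers used in the codebase:

```ts
// Illustrative sketch: fetch available models from LM Studio's
// OpenAI-compatible /v1/models endpoint. The default base URL (including
// the host.docker.internal fallback for Docker networking) is an assumption.
interface LMStudioModel {
  id: string;
  object: string;
}

async function discoverLMStudioModels(
  baseUrl: string = process.env.LM_STUDIO_BASE_URL ?? 'http://host.docker.internal:1234/v1',
): Promise<LMStudioModel[]> {
  const res = await fetch(`${baseUrl}/models`);
  if (!res.ok) {
    throw new Error(`LM Studio model discovery failed: ${res.status}`);
  }
  const body = (await res.json()) as { data: LMStudioModel[] };
  // LM Studio lists both chat and embeddings models here; callers can
  // filter by model id or naming convention as needed.
  return body.data;
}
```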
Thinking Model Panel:
- Added a collapsible UI panel that displays the model's chain of thought
- Parses responses containing <think> tags to separate the reasoning from the final answer (see the sketch after this list)
- Maintains backward compatibility with regular responses
- Styled consistently with the app theme in both light and dark modes
- Preserves all existing message functionality (sources, markdown, etc.)
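A minimal sketch of the parsing step, assuming the reasoning is wrapped in a single <think>...</think> block; the function and type names are illustrative:

```ts
// Illustrative sketch: split a model response into the hidden reasoning
// (inside <think>...</think>) and the visible answer. Responses without
// <think> tags pass through unchanged, preserving backward compatibility.
interface ParsedResponse {
  reasoning: string | null;
  answer: string;
}

function parseThinkingResponse(raw: string): ParsedResponse {
  const match = raw.match(/<think>([\s\S]*?)<\/think>/);
  if (!match) {
    // Regular (non-thinking) model output: nothing to collapse.
    return { reasoning: null, answer: raw };
  }
  return {
    reasoning: match[1].trim(),
    answer: raw.replace(match[0], '').trim(),
  };
}
```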
Together, these changes improve the app's compatibility with local LLMs and give
users better visibility into the model's reasoning, without altering existing
behavior.