
BizSearch

A tool for finding and analyzing local businesses using AI-powered data extraction.

Prerequisites

  • Node.js 16+
  • Ollama (for local LLM)
  • SearxNG instance

Installation

  1. Install Ollama:

     # On macOS
     brew install ollama

  2. Start Ollama:

     # Start and enable on login
     brew services start ollama

     # Or run without auto-start
     /usr/local/opt/ollama/bin/ollama serve

  3. Pull the required model:

     ollama pull mistral

  4. Clone and set up the project:

     git clone https://github.com/yourusername/bizsearch.git
     cd bizsearch
     npm install

  5. Configure the environment:

     cp .env.example .env
     # Edit .env with your settings

  6. Start the application:

     npm run dev

  7. Open http://localhost:3000 in your browser.

Troubleshooting

If Ollama fails to start:

# Stop any existing instance
brew services stop ollama
# Wait a few seconds
sleep 5
# Start again
brew services start ollama

To verify Ollama is running:

curl http://localhost:11434/api/version
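The same check can be scripted from application code. A minimal sketch in TypeScript; the `ollamaVersionUrl` helper is illustrative (not part of the project) and assumes the default port from the OLLAMA_URL setting:

```typescript
// Build the Ollama version-check URL from a base URL, tolerating a
// trailing slash (e.g. when OLLAMA_URL is set to "http://localhost:11434/").
function ollamaVersionUrl(base: string): string {
  return base.replace(/\/+$/, "") + "/api/version";
}

// Fetching this URL returns JSON such as {"version": "..."} when the
// server is up; a connection error means Ollama is not running.
```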

Features

  • Business search with location filtering
  • Contact information extraction
  • AI-powered data validation
  • Clean, user-friendly interface
  • Service health monitoring

Configuration

Key environment variables:

  • SEARXNG_URL: Your SearxNG instance URL
  • OLLAMA_URL: Ollama API endpoint (default: http://localhost:11434)
  • SUPABASE_URL: Your Supabase project URL
  • SUPABASE_ANON_KEY: Your Supabase anonymous key
  • CACHE_DURATION_DAYS: Number of days to cache results (default: 7)
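Reading these variables with the documented defaults amounts to a few fallbacks. A minimal sketch; the `loadConfig` helper and the `Config` shape are illustrative assumptions, not the project's actual code:

```typescript
interface Config {
  searxngUrl: string | undefined; // no sensible default; must be set
  ollamaUrl: string;
  cacheDurationDays: number;
}

// Read configuration from an env map, applying the documented defaults.
function loadConfig(env: Record<string, string | undefined>): Config {
  const days = Number(env.CACHE_DURATION_DAYS ?? "7");
  return {
    searxngUrl: env.SEARXNG_URL,
    ollamaUrl: env.OLLAMA_URL ?? "http://localhost:11434",
    // Fall back to 7 if the value is missing, non-numeric, or non-positive.
    cacheDurationDays: Number.isFinite(days) && days > 0 ? days : 7,
  };
}
```

In the app itself this would be called as `loadConfig(process.env)` after the `.env` file is loaded.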

Supabase Setup

  1. Create a new Supabase project
  2. Run the SQL commands in db/init.sql to create the cache table
  3. Copy your project URL and anon key to .env

License

MIT

Cache Management

The application uses Supabase for caching search results. Cache entries expire after 7 days by default (configurable via CACHE_DURATION_DAYS).
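The expiry rule reduces to a timestamp comparison. A minimal sketch; the `isExpired` helper is illustrative, not the project's actual implementation:

```typescript
const MS_PER_DAY = 24 * 60 * 60 * 1000;

// An entry is expired once it is older than the configured number of days
// (7 by default, matching CACHE_DURATION_DAYS).
function isExpired(createdAt: Date, now: Date, durationDays = 7): boolean {
  return now.getTime() - createdAt.getTime() > durationDays * MS_PER_DAY;
}
```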

Manual Cache Cleanup

If automatic cleanup is not available, you can manually clean up expired entries:

  1. Using the API:

     curl -X POST http://localhost:3000/api/cleanup

  2. Using SQL:

     select manual_cleanup();

Cache Statistics

View cache statistics using:

select * from cache_stats;