---
title: Biblos Semantic Search API
emoji: 📖
colorFrom: blue
colorTo: purple
sdk: docker
pinned: false
license: mit
---

# Biblos Semantic Search API

Semantic search over the entire Bible using BGE-large embeddings. This API keeps the model and embeddings in memory for fast responses (~50-100ms after initial load).

## Features

- ✅ Fast semantic search with BGE-large-en-v1.5 embeddings
- ✅ Model stays loaded in memory (no cold starts)
- ✅ Searches the entire Bible (Old and New Testaments)
- ✅ CORS enabled for easy integration
- ✅ RESTful JSON API built with FastAPI
- ✅ Automatic API documentation at `/docs`

## API Endpoints

### GET /

Health check and API information.

### GET /health

Detailed health status and the list of available books.

### POST /search

Perform a semantic search.

**Request body:**

```json
{
  "query": "What did Jesus say about love?",
  "limit": 10
}
```

`limit` is optional: the number of results to return (1-100, default 10).

**Response:**

```json
{
  "query": "What did Jesus say about love?",
  "results": [
    {
      "book": "jhn",
      "chapter": 13,
      "testament": "new",
      "content": "A new commandment I give to you, that you love one another...",
      "similarity": 0.892
    }
  ],
  "total_searched": 7957,
  "execution_time_ms": 87.3
}
```

## Quick Start

### Using cURL

```shell
curl -X POST https://dssjon-biblos-api.hf.space/search \
  -H "Content-Type: application/json" \
  -d '{
    "query": "faith and works",
    "limit": 5
  }'
```

### Using JavaScript

```javascript
const response = await fetch('https://dssjon-biblos-api.hf.space/search', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({
    query: 'faith and works',
    limit: 5
  })
})

const data = await response.json()
console.log(data.results)
```

### Using Python

```python
import requests

response = requests.post(
    'https://dssjon-biblos-api.hf.space/search',
    json={
        'query': 'faith and works',
        'limit': 5
    }
)

data = response.json()
print(data['results'])
```
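For display, a small helper can turn each result into a readable reference line. This is a sketch assuming the response shape shown above; `format_result` is a hypothetical name, not part of the API.

```python
def format_result(r: dict) -> str:
    """Render one search hit as 'book chapter: content (similarity)'."""
    return f"{r['book']} {r['chapter']}: {r['content']} ({r['similarity']:.3f})"

# Sample hit in the shape returned by /search:
sample = {
    "book": "jhn",
    "chapter": 13,
    "testament": "new",
    "content": "A new commandment I give to you, that you love one another...",
    "similarity": 0.892,
}
print(format_result(sample))
```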

## Interactive Documentation

Visit `/docs` on your deployed Space for interactive Swagger UI documentation, where you can test the API directly.

## Performance

- First request: ~2-3 seconds (model loading)
- Subsequent requests: ~50-100 ms (model already in memory)
- No cold starts after the initial load
- Supports concurrent requests
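These numbers can be checked from any client with a simple timing wrapper. A minimal sketch; `time_call` and the stand-in workload are illustrative, not part of the API:

```python
import time

def time_call(fn, *args, **kwargs):
    """Call fn once and return (result, elapsed milliseconds)."""
    start = time.perf_counter()
    result = fn(*args, **kwargs)
    return result, (time.perf_counter() - start) * 1000

# Stand-in workload; swap in requests.post(...) to time the /search endpoint.
result, ms = time_call(sum, range(1000))
print(f"{ms:.2f} ms")
```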

## Model Information

- Model: BAAI/bge-large-en-v1.5
- Embedding dimensions: 1024
- Total Bible passages: ~31,000
- Total books: 66
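The `similarity` values in responses are consistent with cosine similarity over the 1024-dimensional BGE embeddings, the usual scoring for this model family (treat that as an assumption here, since the scoring code is not shown). The computation itself, on toy 3-dimensional vectors in place of real embeddings:

```python
import math

def cosine_sim(a, b):
    """Cosine similarity: dot(a, b) / (|a| * |b|)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Toy vectors standing in for 1024-dim query/passage embeddings.
print(round(cosine_sim([1.0, 0.0, 1.0], [1.0, 1.0, 1.0]), 3))  # 0.816
```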

## Deployment

This Space uses Docker SDK with FastAPI. The model and embeddings are loaded once at startup and kept in memory for fast responses.
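The load-once behavior can be sketched framework-agnostically with a cached loader (the FastAPI wiring is omitted; `get_model` and its return value are illustrative, not the Space's actual code):

```python
from functools import lru_cache

@lru_cache(maxsize=1)
def get_model():
    """Expensive load: runs on the first call only, cached afterwards.
    Stand-in for loading BGE-large and the precomputed embeddings."""
    return {"name": "BAAI/bge-large-en-v1.5", "dim": 1024}

model = get_model()          # first call performs the load
assert get_model() is model  # later calls reuse the same object
```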

## Data

The Bible embeddings are pre-computed and stored in the `data/` directory. See `prepare_data.py` for how to generate embeddings from your own Bible XML source.
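The precompute step amounts to embedding every passage once and writing the vectors alongside their metadata. A sketch under stated assumptions: `embed` is a placeholder for the real BGE model call in `prepare_data.py`, and the output filename is illustrative.

```python
import json
from pathlib import Path

def embed(text: str) -> list[float]:
    # Placeholder: prepare_data.py would call the BGE model here.
    return [float(len(text)), 0.0]

passages = [
    {"book": "jhn", "chapter": 13, "content": "A new commandment I give to you..."},
]

# Attach an embedding to each passage record.
records = [{**p, "embedding": embed(p["content"])} for p in passages]

out = Path("data")
out.mkdir(exist_ok=True)
(out / "embeddings.json").write_text(json.dumps(records))
print(len(records))
```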

## License

MIT License. Free to use for any purpose.