
TONL-MCP Bridge
Save 40-60% on LLM Token Costs

Production-ready TypeScript library that optimizes JSON for LLMs. Built for RAG systems, vector databases, and real-time streaming.

Why TONL?

The Problem: LLMs charge per token, and JSON repeats every key in every record:

json
[
  {"id": 1, "name": "Alice", "age": 25},
  {"id": 2, "name": "Bob", "age": 30}
]

Cost: 118 tokens

The Solution: TONL declares the schema once and lists only the values:

tonl
users[2]{id:i32,name:str,age:i32}:
  1, Alice, 25
  2, Bob, 30

Cost: 75 tokens → 36% savings
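To make the layout concrete, here is an illustrative sketch (not the library's actual encoder) that builds the same header-plus-rows output for this fixed schema:

typescript
// Illustrative only: the schema header is written once, then each record
// becomes a bare comma-separated row with no repeated keys or quotes.
type User = { id: number; name: string; age: number };

function sketchTonl(name: string, rows: User[]): string {
  const header = `${name}[${rows.length}]{id:i32,name:str,age:i32}:`;
  const body = rows.map(r => `  ${r.id}, ${r.name}, ${r.age}`).join("\n");
  return `${header}\n${body}`;
}

console.log(sketchTonl("users", [
  { id: 1, name: "Alice", age: 25 },
  { id: 2, name: "Bob", age: 30 },
]));
// users[2]{id:i32,name:str,age:i32}:
//   1, Alice, 25
//   2, Bob, 30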

Token Savings by Dataset Size

Records   JSON Tokens   TONL Tokens   Savings
5         118           75            36%
10        247           134           46%
100       2,470         987           60%
1,000     24,700        9,870         60%

💡 Savings increase with more data: the schema header is paid once, while every additional row skips the repeated keys and quotes.


Get Started in 60 Seconds

1. Install

bash
npm install tonl-mcp-bridge

2. Convert Your First File

typescript
import { jsonToTonl } from 'tonl-mcp-bridge';

const users = [
  { id: 1, name: "Alice", email: "alice@example.com" },
  { id: 2, name: "Bob", email: "bob@example.com" }
];

const tonl = jsonToTonl(users, "users");
console.log(tonl);

Output:

tonl
users[2]{id:i32,name:str,email:str}:
  1, Alice, alice@example.com
  2, Bob, bob@example.com

Result: JSON costs 118 tokens, TONL costs 75 tokens. 36% savings.

3. Use with Your LLM

typescript
import OpenAI from 'openai';

const openai = new OpenAI();

const response = await openai.chat.completions.create({
  model: "gpt-4",
  messages: [
    {
      role: "system",
      content: `Here is user data:\n${tonl}`  // ✅ 36% fewer tokens
    },
    {
      role: "user", 
      content: "Who is the oldest user?"
    }
  ]
});
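If you want to verify the savings for your own data before sending, count tokens for both encodings. A minimal sketch, reusing users and tonl from step 2 and assuming the third-party js-tiktoken package (any tokenizer matching your model works just as well):

typescript
// Optional sanity check: compare token counts of the JSON and TONL encodings.
import { getEncoding } from "js-tiktoken";

const enc = getEncoding("cl100k_base"); // encoding used by gpt-4 / gpt-3.5-turbo

const jsonTokens = enc.encode(JSON.stringify(users, null, 2)).length;
const tonlTokens = enc.encode(tonl).length;

console.log(`JSON: ${jsonTokens} tokens, TONL: ${tonlTokens} tokens`);
console.log(`Savings: ${(100 * (1 - tonlTokens / jsonTokens)).toFixed(1)}%`);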

Common Use Cases

RAG with Vector Databases

typescript
import { MilvusAdapter } from 'tonl-mcp-bridge/sdk/vector';

const milvus = new MilvusAdapter({ address: 'localhost:19530' });
await milvus.connect();

// Search and get TONL results (automatic conversion)
const result = await milvus.searchToTonl(
  'documents',
  queryEmbedding,
  { limit: 10 }
);

// Use TONL result in LLM prompt
const prompt = `Context:\n${result.tonl}\n\nQuestion: ${userQuestion}`;
// ✅ Token savings reported in result.stats.savingsPercent

Real-Time Log Processing

bash
# Stream 1GB log file with constant memory
curl -X POST http://localhost:3000/stream/convert \
  -H "Content-Type: application/x-ndjson" \
  --data-binary @app-logs.ndjson \
  -o logs.tonl

# 250,000 lines/second, 47% compression
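The same endpoint can be driven from Node without buffering the file. A minimal sketch, assuming the bridge server is running on localhost:3000 as in the curl example above:

typescript
// Pipe the NDJSON file through the conversion endpoint without loading it
// into memory; mirrors the curl example above.
import { createReadStream, createWriteStream } from "node:fs";
import { request } from "node:http";

const req = request(
  {
    host: "localhost",
    port: 3000,
    path: "/stream/convert",
    method: "POST",
    headers: { "Content-Type": "application/x-ndjson" },
  },
  (res) => res.pipe(createWriteStream("logs.tonl"))
);

createReadStream("app-logs.ndjson").pipe(req);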

Privacy-Compliant Data

typescript
// Anonymize sensitive fields before sending to LLM
const masked = jsonToTonl(users, 'users', {
  anonymize: ['email', 'ssn', 'creditCard'],
  mask: true  // Preserves format: a***@example.com
});

// Safe to use in LLM prompts - PII protected

Production Features

Deploy Anywhere

Docker:

bash
docker run -d -p 3000:3000 \
  -e TONL_AUTH_TOKEN=your-token \
  ghcr.io/kryptomrx/tonl-mcp-bridge:latest

Kubernetes:

yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: tonl-mcp-bridge
spec:
  replicas: 3
  template:
    spec:
      containers:
      - name: tonl-server
        image: ghcr.io/kryptomrx/tonl-mcp-bridge:latest
        livenessProbe:
          httpGet:
            path: /health
            port: 3000
        readinessProbe:
          httpGet:
            path: /ready
            port: 3000

Monitor Performance

bash
# Real-time dashboard
tonl top

# Prometheus metrics
curl http://localhost:3000/metrics
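From application code, the same metrics endpoint can be spot-checked with a plain HTTP request; the parsing below is just a simple line filter, not part of the library:

typescript
// Fetch the Prometheus endpoint documented above and print metric lines.
const res = await fetch("http://localhost:3000/metrics");
const body = await res.text();

for (const line of body.split("\n")) {
  if (line && !line.startsWith("#")) console.log(line); // skip HELP/TYPE comments
}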

CLI Tools

bash
# Convert files
tonl convert data.json

# With token statistics
tonl convert data.json -s

# Calculate ROI
tonl roi --savings 45 --queries-per-day 1000

# Analyze multiple files
tonl analyze data/*.json --visual

# Start MCP server
tonl-mcp-server

Database Support

Native adapters for popular databases:

  • Vector: Milvus, Qdrant, ChromaDB
  • SQL: PostgreSQL, MySQL, SQLite
  • NoSQL: MongoDB Atlas (coming soon)

All adapters include automatic TONL conversion and token statistics.
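The SQL and NoSQL adapter APIs are not shown on this page; as a rough sketch, if they mirror the vector-adapter pattern above, usage would look something like the following. The import path, class, and method names here are assumptions, so check the SDK docs for the exact API.

typescript
// Hypothetical sketch: assumes a PostgresAdapter with the same connect() +
// *ToTonl pattern as the Milvus adapter shown earlier. Verify the actual
// import path and method names against the SDK documentation.
import { PostgresAdapter } from 'tonl-mcp-bridge/sdk/sql';

const pg = new PostgresAdapter({ connectionString: process.env.DATABASE_URL });
await pg.connect();

const result = await pg.queryToTonl('SELECT id, name, email FROM users LIMIT 100');
console.log(result.stats); // token counts and savings, as with the vector adapters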


Real-World ROI

Enterprise RAG Platform:

  • 1M queries/day with 1000 database results each
  • JSON: 125K tokens/query = $3.75/query = $3.75M/day
  • TONL: 50K tokens/query = $1.50/query = $1.50M/day
  • Monthly savings: $67.5M
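The arithmetic behind those figures, assuming 30 billing days and the rate implied by the numbers above ($3.75 per 125K tokens, i.e. $30 per 1M input tokens):

typescript
// Reproduces the ROI figures above; adjust the price for your model.
const PRICE_PER_MILLION_TOKENS = 30;   // USD, implied by $3.75 per 125K tokens
const QUERIES_PER_DAY = 1_000_000;

const dailyCost = (tokensPerQuery: number) =>
  (tokensPerQuery / 1_000_000) * PRICE_PER_MILLION_TOKENS * QUERIES_PER_DAY;

const jsonDaily = dailyCost(125_000); // $3,750,000/day
const tonlDaily = dailyCost(50_000);  // $1,500,000/day

console.log(`Monthly savings: $${((jsonDaily - tonlDaily) * 30).toLocaleString()}`);
// Monthly savings: $67,500,000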

When to Use TONL

✅ Perfect for:

  • RAG systems with 10+ results per query
  • Vector database queries
  • Real-time log processing
  • API responses with repeated structure
  • Applications with PII/PHI data

โŒ Not ideal for:

  • Single object conversions (header overhead)
  • Highly variable schemas
  • Systems requiring strict JSON


Credits

TONL format by Ersin Koç. This library adds production features: streaming, privacy, monitoring, and enterprise integrations.

MIT Licensed | GitHub | npm
