
Heuristic Pruning & Rewrites

To achieve maximum compression without sacrificing output quality, Trunkate AI combines two powerful techniques: Static Rewrites and Heuristic Pruning.

Static Rewrites

Static rewriting is a dictionary-based optimization layer that relies on token-efficiency research. Our compiler analyzes the input and replaces common verbose phrases with shorter, semantically identical alternatives.

Example: The Transformation

User Prompt:

“In order to ensure that the application is running correctly, you need to check the logs.”

Optimized Output:

“To ensure the app runs correctly, check the logs.”

This simple transformation reduces the token count by over 20% while maintaining the exact same directive for the LLM.
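You can picture this pass as a lookup over a pre-compiled phrase dictionary. The following Python sketch is purely illustrative: the dictionary entries, function name, and capitalization fix-up are hypothetical stand-ins rather than Trunkate's actual rule set, but they reproduce the transformation above.

```python
import re

# Hypothetical excerpt of a static-rewrite dictionary; the real rule set
# is pre-compiled from token-efficiency research and is far larger.
STATIC_REWRITES = {
    "in order to": "to",
    "ensure that": "ensure",
    "the application": "the app",
    "is running correctly": "runs correctly",
    "you need to check": "check",
}

def apply_static_rewrites(prompt: str) -> str:
    """Replace verbose phrases with shorter, semantically identical ones."""
    optimized = prompt
    for verbose, concise in STATIC_REWRITES.items():
        # Case-insensitive, literal phrase replacement.
        optimized = re.sub(re.escape(verbose), concise, optimized, flags=re.IGNORECASE)
    # Restore sentence-initial capitalization lost when a leading phrase is rewritten.
    return optimized[:1].upper() + optimized[1:]

prompt = ("In order to ensure that the application is running correctly, "
          "you need to check the logs.")
print(apply_static_rewrites(prompt))
# To ensure the app runs correctly, check the logs.
```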

Heuristic Pruning

While static rewrites handle phrase-level optimizations, Heuristic Pruning operates at the conceptual level. It intelligently removes redundant or low-value segments.

  1. Duplicate Context: If the prompt restates a core fact multiple times (common in chained RAG pipelines), Trunkate prunes the duplicates; a sketch of this follows the list.
  2. Irrelevant Stop-Words: Removes articles such as "a", "an", and "the" where they can be safely omitted without harming grammar or LLM comprehension.
  3. Variable Collapsing: Consolidates spread-out references to the same entity.
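The duplicate-context rule is the easiest to picture. Below is a minimal, hypothetical sketch of that idea using the standard library's SequenceMatcher as the similarity measure; the segment granularity and the 0.9 similarity threshold are assumptions for illustration, not Trunkate's actual heuristics.

```python
from difflib import SequenceMatcher

def prune_duplicate_context(segments: list[str], threshold: float = 0.9) -> list[str]:
    """Drop segments that restate an earlier segment almost verbatim."""
    kept: list[str] = []
    for segment in segments:
        normalized = " ".join(segment.lower().split())
        is_duplicate = any(
            SequenceMatcher(None, normalized, " ".join(k.lower().split())).ratio() >= threshold
            for k in kept
        )
        if not is_duplicate:
            kept.append(segment)
    return kept

# Chained RAG pipelines often retrieve the same fact more than once.
segments = [
    "The billing service runs in the eu-west-1 region.",
    "Refund requests must be resolved within 14 days.",
    "The billing service runs in the eu-west-1 region.",  # restated fact, pruned
]
print(prune_duplicate_context(segments))
```

Because these rules are deterministic (see Privacy by Design below), the same input always produces the same pruned output.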

Privacy by Design

This architecture ensures two things:

  1. Stateless Processing: Your proprietary data is never stored. It exists only in ephemeral execution memory for the milliseconds it takes to optimize.
  2. No Data Logging: Trunkate does not train on your prompts. The optimization pipeline is entirely deterministic and relies on pre-compiled linguistic rules.