
# Templates Overview

The Mistaber encoding plugin uses four standardized templates for generating checkpoint artifacts. These templates ensure consistent documentation across all encoding sessions.

## Template Architecture

```mermaid
graph LR
    subgraph "Encoding Pipeline"
        CP[Corpus Prep] --> |generates| CR[Corpus Report]
        HLL[HLL Encode] --> |generates| ER[Encoding Report]
        VAL[Validate] --> |generates| VR[Validation Report]
        REV[Review] --> |generates| RP[Review Package]
    end

    subgraph "Templates"
        T1[corpus-report.md]
        T2[encoding-report.md]
        T3[validation-report.md]
        T4[review-package.md]
    end

    T1 -.-> CR
    T2 -.-> ER
    T3 -.-> VR
    T4 -.-> RP
```

## Template Files

| Template | Purpose | Generated By | Approx. Size |
|----------|---------|--------------|--------------|
| `corpus-report.md` | Source compilation report | corpus-prep skill | ~250 lines |
| `encoding-report.md` | Rule encoding report | hll-encode skill | ~310 lines |
| `validation-report.md` | Test results report | validate skill | ~570 lines |
| `review-package.md` | Final review package | review skill | ~500 lines |

### Template Location

Templates are stored in the plugin's `templates/` directory:

```
mistaber-skills/
└── templates/
    ├── corpus-report.md
    ├── encoding-report.md
    ├── validation-report.md
    └── review-package.md
```
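
Given this layout, a skill can resolve its template with a plain path lookup. The sketch below is illustrative only: the `load_template` helper and the phase names are assumptions mirroring the table above, not part of the plugin's actual API.

```python
from pathlib import Path

# Hypothetical phase -> template mapping, mirroring the table above.
PHASE_TEMPLATES = {
    "corpus-prep": "corpus-report.md",
    "hll-encode": "encoding-report.md",
    "validate": "validation-report.md",
    "review": "review-package.md",
}

def load_template(phase: str, root: Path = Path("mistaber-skills")) -> str:
    """Read the raw template text for a given pipeline phase."""
    try:
        name = PHASE_TEMPLATES[phase]
    except KeyError:
        raise ValueError(f"unknown phase: {phase}") from None
    return (root / "templates" / name).read_text(encoding="utf-8")
```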

## Template Variables

All templates use a consistent variable syntax with curly braces:

| Variable | Description | Example |
|----------|-------------|---------|
| `{SIMAN}` | Siman number | `87` |
| `{SEIF}` | Seif number | `3` |
| `{TIMESTAMP}` | ISO 8601 timestamp | `2026-01-25T10:30:00Z` |
| `{TOPIC_TITLE}` | Topic description | Basar BeChalav - Fish with Milk |
| `{HEBREW_TEXT}` | Hebrew source text | Original Hebrew |
| `{ENGLISH_TRANSLATION}` | English translation | Translation text |
| `{RULE_ID}` | Rule identifier | `r_87_3_dag_sakana` |
| `{WORLD}` | World name | `mechaber` |
| `{PREDICATE}` | Predicate expression | `issur(M, achiila, W)` |
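
Because the variables are plain `{NAME}` placeholders, they can be filled with a single substitution pass. A minimal sketch, assuming upper-case placeholder names as in the table above (the `fill_template` helper is hypothetical, not the plugin's implementation); unknown placeholders are left intact so a partially filled template stays inspectable:

```python
import re

def fill_template(template: str, variables: dict[str, str]) -> str:
    """Replace each {NAME} placeholder with its value.

    Only upper-case names are treated as variables, so literal
    braces elsewhere in the template are untouched.
    """
    def substitute(match: re.Match) -> str:
        name = match.group(1)
        return str(variables.get(name, match.group(0)))

    return re.sub(r"\{([A-Z_]+)\}", substitute, template)

header = "# Corpus Report: YD {SIMAN}:{SEIF}"
print(fill_template(header, {"SIMAN": "87", "SEIF": "3"}))
# -> # Corpus Report: YD 87:3
```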

## Template Structure

Each template follows a consistent structure:

  1. Header: Title, timestamp, status
  2. Executive Summary: Key metrics table
  3. Content Sections: Phase-specific content
  4. Checkpoint Criteria: Verification checklist
  5. Approval Section: Approval instructions

### Example Structure

```markdown
# {Report Type}: YD {SIMAN}:{SEIF}

**{Action}:** {TIMESTAMP}
**Status:** Pending Review

---

## Executive Summary

| Metric | Value |
|--------|-------|
| Reference | YD {SIMAN}:{SEIF} |
| [Additional metrics...] |

---

## [Content Sections]

[Phase-specific content...]

---

## Checkpoint Review Criteria

Please verify the following before approval:

- [ ] [Verification item 1]
- [ ] [Verification item 2]
- [ ] [Verification item 3]

---

## Approval

**Status:** Pending Review

To approve, respond with:
> "Approved" or "{Type} approved"

To request changes:
> "Needs revision: [specific feedback]"

---

*Generated by Mistaber Encoding Pipeline*
```

## Template Usage by Phase

### Phase 1: Corpus Preparation

Template: `corpus-report.md`

Content Sections:

  1. Primary Source (Hebrew + English)
  2. Atomic Statements extraction
  3. Commentary Layer (4 tiers)
  4. Derivation Chain
  5. Machloket identification
  6. Semantic Enrichment
  7. Gap Analysis
  8. Questions for Review

Key Variables:

| Variable | Content |
|----------|---------|
| `{HEBREW_TEXT}` | SA text in Hebrew |
| `{ENGLISH_TRANSLATION}` | English translation |
| `{MERMAID_DIAGRAM}` | Derivation chain diagram |
| `{SHACH_TEXT}` | Shach commentary excerpt |
| `{TAZ_TEXT}` | Taz commentary excerpt |

### Phase 2: HLL Encoding

Template: `encoding-report.md`

Content Sections:

  1. Statement → Rule Mapping
  2. World Distribution
  3. Base World Rules
  4. Mechaber World Rules
  5. Rema World Rules
  6. Machloket Encoding
  7. Pre-Compile Validation
  8. World Inheritance Diagram
  9. Output File Preview

Key Variables:

| Variable | Content |
|----------|---------|
| `{BASE_WORLD_RULES}` | Rules for base world |
| `{MECHABER_RULES}` | Mechaber-specific rules |
| `{REMA_RULES}` | Rema-specific rules |
| `{MACHLOKET_MARKERS}` | Machloket marker facts |

### Phase 3: Validation

Template: `validation-report.md`

Content Sections:

  1. Phase A: Compilation Validation
  2. Phase B: Semantic Validation
  3. Phase C: Behavioral Testing
  4. Phase D: Query Verification
  5. Test Scenarios
  6. Validation Results Summary

Key Variables:

| Variable | Content |
|----------|---------|
| `{COMPILER_ERROR_OUTPUT}` | Error messages if any |
| `{SAMPLE_ANSWER_SET_ATOMS}` | Clingo output atoms |
| `{TEST_NAME}` | Individual test names |
| `{SETUP_FACTS}` | Test setup facts |
| `{QUERY}` | Test query |
| `{EXPECTED_RESULT}` | Expected output |
| `{ACTUAL_RESULT}` | Actual output |
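
The `{EXPECTED_RESULT}`/`{ACTUAL_RESULT}` pair implies a per-scenario pass/fail comparison when the report is filled in. A sketch of how a validate step might build the variable bindings for one test row; the `TestScenario` structure, the `report_row` helper, and the `STATUS` key are illustrative assumptions, not the plugin's actual data model:

```python
from dataclasses import dataclass

@dataclass
class TestScenario:
    # Fields mirror the template variables above.
    test_name: str
    setup_facts: list[str]
    query: str
    expected_result: str

def report_row(scenario: TestScenario, actual_result: str) -> dict[str, str]:
    """Build template-variable bindings for one validation-report row."""
    return {
        "TEST_NAME": scenario.test_name,
        "SETUP_FACTS": " ".join(scenario.setup_facts),
        "QUERY": scenario.query,
        "EXPECTED_RESULT": scenario.expected_result,
        "ACTUAL_RESULT": actual_result,
        # Hypothetical convenience key: pass iff expected matches actual.
        "STATUS": "PASS" if actual_result == scenario.expected_result else "FAIL",
    }
```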

### Phase 4: Review

Template: `review-package.md`

Content Sections:

  1. Component 1: Source Verification
  2. Component 2: Encoding Review
  3. Component 3: Validation Evidence
  4. Component 4: Review Checklists
  5. Component 5: Questions & Concerns
  6. Interactive Testing Section
  7. Output Files
  8. Final Approval

Key Variables:

| Variable | Content |
|----------|---------|
| `{SESSION_ID}` | Unique session identifier |
| `{APPROVER}` | Checkpoint approver |
| `{DERIVATION_CHAIN_MERMAID}` | Full derivation diagram |
| `{CONCERNS_STATUS}` | Outstanding concerns |

## Generated Artifacts

Templates generate artifacts in the `.mistaber-artifacts/` directory:

| Artifact | Template | Format |
|----------|----------|--------|
| `corpus-report-YD-{SIMAN}-{SEIF}.md` | corpus-report.md | Markdown |
| `corpus-sources-YD-{SIMAN}-{SEIF}.yaml` | (structured data) | YAML |
| `corpus-chain-YD-{SIMAN}-{SEIF}.mermaid` | (diagram only) | Mermaid |
| `encoding-report-YD-{SIMAN}-{SEIF}.md` | encoding-report.md | Markdown |
| `encoding-mapping-YD-{SIMAN}-{SEIF}.yaml` | (structured data) | YAML |
| `validation-report-YD-{SIMAN}-{SEIF}.md` | validation-report.md | Markdown |
| `validation-results-YD-{SIMAN}-{SEIF}.yaml` | (structured data) | YAML |
| `test-scenarios-YD-{SIMAN}-{SEIF}.yaml` | (test definitions) | YAML |
| `review-package-YD-{SIMAN}-{SEIF}.md` | review-package.md | Markdown |
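
Every artifact name follows the same `{kind}-YD-{SIMAN}-{SEIF}.{ext}` pattern, so the full path can be derived mechanically. A minimal sketch; the `artifact_path` helper is hypothetical, only the directory and naming pattern come from the table above:

```python
from pathlib import Path

ARTIFACT_DIR = Path(".mistaber-artifacts")

def artifact_path(kind: str, siman: int, seif: int, ext: str = "md") -> Path:
    """Build an artifact path such as .mistaber-artifacts/corpus-report-YD-87-3.md."""
    return ARTIFACT_DIR / f"{kind}-YD-{siman}-{seif}.{ext}"
```

For example, `artifact_path("validation-report", 87, 3)` yields the Markdown report path, while `artifact_path("test-scenarios", 87, 3, ext="yaml")` yields the matching YAML test definitions.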

## Approval Workflows

Each template includes an approval section:

### Approval Response

To approve, respond with:
> "Approved" or "{Type} approved"

### Revision Request

To request changes:
> "Needs revision: [specific feedback]"

### Rejection (Review only)

To reject:
> "Rejected: [reason]"
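
Since all three responses are free text, they can be classified with a tolerant prefix check. A sketch under that assumption; the `classify_response` function and its `(decision, detail)` return shape are illustrative, not the plugin's API:

```python
def classify_response(text: str) -> tuple[str, str]:
    """Classify a checkpoint response as (decision, detail)."""
    msg = text.strip().strip('"')
    low = msg.lower()
    if low.startswith("needs revision:"):
        return ("revise", msg.split(":", 1)[1].strip())
    if low.startswith("rejected:"):
        return ("reject", msg.split(":", 1)[1].strip())
    # Accepts both "Approved" and "{Type} approved".
    if low == "approved" or low.endswith(" approved"):
        return ("approve", "")
    return ("unclear", msg)

print(classify_response("Corpus approved"))
# -> ('approve', '')
```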

## Review Checklists

Templates include standardized checklists:

### Halachic Accuracy (Review Package)

| Check | Description |
|-------|-------------|
| H1 | Ruling accurately represents SA text |
| H2 | All conditions from source captured |
| H3 | Machloket positions accurately encoded |
| H4 | Makor chain reaches authoritative source |
| H5 | Commentary interpretations correct |
| H6 | Madrega levels appropriate |
| H7 | No rulings invented or inferred |
| H8 | World assignments match authority |

### Technical Accuracy (Review Package)

| Check | Description |
|-------|-------------|
| T1 | Predicates correctly chosen |
| T2 | Arity matches predicate definition |
| T3 | Variables properly scoped |
| T4 | NAF used appropriately |
| T5 | World inheritance correct |
| T6 | Overrides properly structured |
| T7 | All rules have unique IDs |
| T8 | File structure follows conventions |