Quin Beta 21.ab1 (8B) — Research Artifact

Model Identifier: Quin-Beta-21.ab1 · Status: EXPERIMENTAL (v0.2.1) · 8.1B projected params
Internal research · Not for production
Updated: v0.2.1

Abstract & Research Context

Quin Beta 21.ab1 is a pre-production, internal research artifact developed to validate novel architectural hypotheses, specifically Resonant Gating Units (RGUs) and Causal State Tensors (CSTs). It is intentionally unstable and intended for architectural experiments, ablation studies, and academic review.

Primary objective: architectural validation. Not for production. No alignment or safety tuning applied.

RGU activation (conceptual)

\(\phi(x) = G(x, W_g) \cdot \tanh\big( R(x, \Omega) \odot (x W_v)\big)\)
\(R(x,\Omega) = \cos^2\big(\mathrm{proj}(x, W_k) \cdot \Omega\big)\)

This resonance-driven gating encourages the network to discover frequency-matched pathways, reducing FLOPs per token by selectively activating only the most salient neurons.
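The two equations above can be sketched in NumPy. This is a toy illustration only: the weight names, the sigmoid gate for \(G\), the elementwise reading of \(\mathrm{proj}(x, W_k) \cdot \Omega\), and all dimensions are assumptions, not details of the actual artifact.

```python
import numpy as np

rng = np.random.default_rng(0)
d_model, d_hidden = 16, 32  # toy dimensions, not the real config

# Hypothetical weights; none of these names come from the released artifact.
W_g = rng.standard_normal((d_model, d_hidden)) * 0.1  # gate projection
W_v = rng.standard_normal((d_model, d_hidden)) * 0.1  # value projection
W_k = rng.standard_normal((d_model, d_hidden)) * 0.1  # resonance projection
omega = rng.standard_normal(d_hidden)                 # learned frequency vector Ω

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def rgu(x):
    """Illustrative RGU: phi(x) = G(x, W_g) * tanh(R(x, Ω) ⊙ (x W_v))."""
    g = sigmoid(x @ W_g)                # G(x, W_g): sigmoid gate is an assumption
    r = np.cos((x @ W_k) * omega) ** 2  # R(x, Ω) = cos²(proj(x, W_k) · Ω), read elementwise
    return g * np.tanh(r * (x @ W_v))

x = rng.standard_normal((4, d_model))   # batch of 4 token vectors
out = rgu(x)
print(out.shape)  # (4, 32)
```

Note that because cos² lies in [0, 1], the resonance term can only attenuate the value pathway, which is consistent with the selective-activation claim above.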

Model description

  • Decoder-only, Quin architecture (v0.2b).
  • 8.1B projected parameters (Asymmetric Parameter Projection from Quin-91).
  • Context window: 8192 tokens (truncated).
  • 32 RGU layers (example configuration used in internal tests).
  • State management: Causal State Tensor (4th-order tensor per token).

Architectural innovations (high-level)

Asymmetric Parameter Projection (APP): high-fidelity projection from a 910B parent's latent space.
RGUs: replace FFN/MoE with frequency-dependent gating.
CSTs: replace KV cache with a predictive tensor per token.
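To make the CST claim concrete, here is a shape-only sketch contrasting a conventional KV cache with a per-token 4th-order tensor. Every dimension and the rank-style factorization are purely illustrative assumptions; the actual CST layout is not public.

```python
import numpy as np

# Toy dimensions; the real CST layout is not public.
seq_len, n_heads, d_head, cst_rank = 8, 4, 16, 2

# Conventional KV cache: two 3rd-order tensors (K and V) over the sequence.
kv_cache = {
    "k": np.zeros((seq_len, n_heads, d_head)),
    "v": np.zeros((seq_len, n_heads, d_head)),
}

# CST as described: one 4th-order tensor per token, so a 5th-order buffer
# over the sequence. The (n_heads, d_head, d_head, cst_rank) layout here
# is an assumed factorization for illustration only.
cst_cache = np.zeros((seq_len, n_heads, d_head, d_head, cst_rank))

print(kv_cache["k"].ndim, cst_cache[0].ndim)  # 3 4
```

The point of the sketch is only the order of the per-token state: each token's slice is 4th-order, as the description states, whereas a KV cache stores two vectors per head per token.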

Pseudocode (illustrative)

# WARNING: illustrative pseudocode only (loader not public)
from quin_arch_loader import QuinLoader, QuinConfig

config = QuinConfig(
    model_type="quin",
    variant="21.ab1",
    n_layers=32,                 # RGU layer count (example configuration)
    rgu_expansion_factor=2.75,   # hidden-width multiplier inside each RGU
    cst_dimensions=1024,         # Causal State Tensor dimension (example)
    projection_parent_hash="quin-91/proj-map-v2.bin"  # APP mapping from the parent model
)

model = QuinLoader.from_pretrained(
    "DrChamyoung/Quin-Beta-21.ab1",
    config=config,
    device_map="auto",   # spread layers across available devices
    precision="bf16"     # bfloat16 weights
)

# print(model.get_architecture_report())
# > [Quin-21.ab1]: 8.1B Params. 32 RGU layers. CST dim 1024.

Development status & disclaimers

This artifact is provided for research transparency. It may produce unpredictable or non-coherent outputs and must not be used for real-world tasks. Use requires a custom loader that understands CSTs and RGUs.

Technical details & quick specs

  • Model type: quin (custom)
  • Variant: 21.ab1 (Asymmetric Beta 1)
  • Parameters: ~8.1B (projected)
  • Context: 8192 tokens
  • Layers: 32 (example)
  • CST dim: 1024 (example)

Intended use (theoretical)

This model is for internal R&D and architecture research. There is no public transformers style loader; loading requires a custom QuinLoader that can initialize RGU and CST layers correctly.