Neogenint Intelligence

INN™ Intuitive
Neural Network.

An architecture closer to the brain than the Transformer. Three-valued logic, sparse activation, CPU-only inference. Explainability built in, not bolted on.

1.35M tokens/s
Inference
99.5%
Accuracy
100×
Less energy
§ 001 Core

Traditional deep learning is a black box — you feed it data, it spits out predictions, and you never know why. INN changes that from first principles.

INN adopts a mechanism of structural information extraction followed by logical networking, integrating symbolic computation with a data-driven, three-valued-logic, brain-inspired model. It can reason logically like a human and explain its decision-making process.
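The exact three-valued logic INN uses is not specified here. As an illustration only, Kleene's strong three-valued logic is a common choice; the encoding below (false = -1, unknown = 0, true = 1) is an assumption, not the documented INN convention:

```python
# Kleene strong three-valued logic, as an illustrative sketch.
# Encoding (assumed): FALSE = -1, UNKNOWN = 0, TRUE = 1.

TRUE, UNKNOWN, FALSE = 1, 0, -1

def t_not(a: int) -> int:
    """NOT flips true/false and leaves unknown unchanged."""
    return -a

def t_and(a: int, b: int) -> int:
    """AND is the minimum: any false wins, otherwise any unknown."""
    return min(a, b)

def t_or(a: int, b: int) -> int:
    """OR is the maximum: any true wins, otherwise any unknown."""
    return max(a, b)

# Unknown propagates exactly as Kleene's truth tables prescribe:
assert t_and(TRUE, UNKNOWN) == UNKNOWN
assert t_or(FALSE, UNKNOWN) == UNKNOWN
assert t_or(TRUE, UNKNOWN) == TRUE
assert t_not(UNKNOWN) == UNKNOWN
```

The third value lets a network answer "don't know" instead of forcing a binary decision, which is one route to traceable reasoning.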

This isn't a patch on Transformer. It's a new brain — from neurons to network structure, from activation functions to inference paths, every layer designed for explainability.

Explainability, built in
§ 002 Capabilities

Not a bigger
Transformer.

INN is a new brain architecture — every layer redesigned, from neurons to networks, from activation to inference.

01

Small-sample learning

Extract patterns with minimal data, not millions of labels.

02

Continuous learning

Acquire new knowledge without catastrophic forgetting or interference.

03

Multimodal fusion

Process text, images, and speech in a unified representation.

04

CPU inference

No GPU cluster required — runs efficiently on ordinary CPUs.
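One plausible reason a ternary representation suits CPU-only inference: weights restricted to {-1, 0, +1} need no multiplier at all, and zero weights are skipped outright. A toy sketch, illustrative only and not the INN kernel:

```python
# Toy matrix-vector product with ternary weights in {-1, 0, +1}.
# Multiply-accumulate collapses into add/subtract, and zeros cost
# nothing (sparse activation) — no GPU-style FMA hardware needed.
# Illustrative sketch, not the INN implementation.

def ternary_matvec(weights, x):
    """y[i] = sum_j w[i][j] * x[j], assuming w[i][j] in {-1, 0, +1}."""
    out = []
    for row in weights:
        acc = 0.0
        for w, v in zip(row, x):
            if w == 1:
                acc += v      # multiply-free accumulate
            elif w == -1:
                acc -= v
            # w == 0: skipped entirely — the sparse case is free
        out.append(acc)
    return out

print(ternary_matvec([[1, 0, -1], [0, 1, 1]], [2.0, 3.0, 5.0]))
# → [-3.0, 8.0]
```

On a CPU this reduces the inner loop to branches and additions, which is why ternary networks can run efficiently without dedicated accelerators.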

§ 003 Performance
Full-system inference
12.6M tokens/s

3-node BIE cluster, CPU-only, no GPU required

Training speed

Single thread
1,200 tokens/s
Single node (600 threads)
360K tokens/s
Full system (3 nodes)
2.16M tokens/s

Inference speed

Single thread
7,000 tokens/s
Single node (600 threads)
1.35M tokens/s
Full system (3 nodes)
12.6M tokens/s
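Assuming sustained throughput, the figures above convert directly into wall-clock time for a given workload:

```python
# Convert a sustained token rate into processing time.
# The rate below is the full-system inference figure quoted above;
# sustained throughput with no warm-up or stragglers is assumed.

def seconds_for(tokens: float, tokens_per_s: float) -> float:
    """Wall-clock seconds to process `tokens` at a sustained rate."""
    return tokens / tokens_per_s

# 1 billion tokens at 12.6M tokens/s:
print(f"{seconds_for(1e9, 12.6e6):.1f} s")  # → 79.4 s
```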

10 billion tokens trained in 30 hours on a single CPU node, a workload that would otherwise require a cluster of multiple high-end GPUs.

30 hours
§ 004 Benchmarks

Accuracy
isn't a guess.

Public datasets, reproducible results. Not cherry-picked lab cases — real classification tasks.

89.7%

Kaggle Diabetes

Medical diagnosis

98.6%

Kaggle Heart Disease

Disease prediction

98.2%

Kaggle MNIST

Image recognition

99.5%

Double Helix

Scientific classification

§ 005 Features

Six core
advantages.

INN isn't just a faster model — it redefines the baseline across precision, transparency, energy, and generality.

01

High precision

Consistently outperforms comparable models on standard benchmarks — high accuracy, minimal error.

02

Model transparency

Every inference step is traceable. The logic chain is visible — no more black box.

03

Low energy consumption

Native CPU inference. No GPU cluster needed — power draw drops by orders of magnitude.

04

High speed

Full-system inference throughput of 12.6M tokens/s, with response latency under 100 ms.

05

Unified symbolic & numerical

Symbolic logic and numerical computation in a single framework — precision meets generalization.

06

General-purpose AI architecture

From NLP to image recognition, classification to multimodal — one architecture, many tasks.
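The "model transparency" advantage above can be pictured as an inference loop that logs every step it takes. The rules, fact names, and API below are hypothetical, chosen only to show what a visible logic chain looks like, not how INN represents one:

```python
# Forward-chaining inference with a recorded trace: each rule
# application is logged, so the final decision can be explained
# step by step. Hypothetical rules/API, for illustration only.

def infer(facts, rules):
    """Apply (premises, conclusion) rules to `facts` until fixpoint,
    returning the enlarged fact set and the logic chain taken."""
    trace = []
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if conclusion not in facts and premises <= facts:
                facts.add(conclusion)
                trace.append(f"{' & '.join(sorted(premises))} -> {conclusion}")
                changed = True
    return facts, trace

facts, trace = infer(
    {"glucose_high", "bmi_high"},
    [({"glucose_high", "bmi_high"}, "risk_elevated"),
     ({"risk_elevated"}, "recommend_screening")],
)
for step in trace:
    print(step)
# bmi_high & glucose_high -> risk_elevated
# risk_elevated -> recommend_screening
```

The trace is the explanation: every conclusion points back to the premises that produced it, which is the opposite of a black-box prediction.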

§ 006 Framework

Flint
hierarchical architecture.

An AI development framework for building and deploying INN models. From a task description it generates the learning network, determining its nodes, connection structure, and per-node computation model.

Version V0.2102
01

Data Storage Layer

Provides data storage and metadata management for intuitive neural networks on top of the underlying CPU, GPU, and BPU hardware. It includes a KV storage engine, a metadata dictionary, data-access interfaces for neuron nodes and network structures, and cross-process and multi-machine communication interfaces.

Core components
Neuron Node · Distribution Service · Storage Engine · Metadata Dictionary · CPU Interface · GPU Interface · BPU Interface
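As a rough sketch of this layer's KV engine plus metadata dictionary, assuming a plain in-memory dict backend (the class name and methods are illustrative; the real engine is distributed and hardware-aware):

```python
# Minimal in-memory KV store with a per-key metadata dictionary,
# sketching the storage-layer interface described above.
# Hypothetical API — not the Flint storage engine.

class KVStore:
    def __init__(self):
        self._data = {}   # key -> stored value (e.g. neuron state)
        self._meta = {}   # metadata dictionary: key -> attributes

    def put(self, key, value, **meta):
        """Store a value along with arbitrary metadata attributes."""
        self._data[key] = value
        self._meta[key] = meta

    def get(self, key):
        return self._data[key]

    def metadata(self, key):
        return self._meta[key]

store = KVStore()
store.put("neuron:42", {"threshold": 0.3}, kind="neuron_node", shard=0)
print(store.get("neuron:42"), store.metadata("neuron:42"))
```

Keeping metadata beside the values is what lets higher layers look up a neuron's role or placement without deserializing its state.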
02

Network Construction Layer

During training, this layer extracts organizational relationships from use-case data and fits the data annotations. Using the intuitive neural network's perception and cognition components, combined with filtering, classification, and other operators, it builds the multi-directional, multi-scale neural-network graph structures a task requires on top of the underlying neuron abstraction interfaces.

Core components
Ternary Logic · Classifier · Fitter · Perceptron · Cognitron · Neural Network
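A toy version of this construction step, assuming labeled rows as input; the grouping and edge rules here are illustrative stand-ins for the filtering/classification operators, not Flint APIs:

```python
# Derive a small graph from labeled use-case rows: each class
# becomes a node, and classes sharing feature values are linked.
# A toy stand-in for the layer's filter/classify operators.
from collections import defaultdict

def build_graph(rows):
    """rows: iterable of (features, label). Returns (nodes, edges)."""
    nodes = defaultdict(set)          # label -> feature values seen
    for features, label in rows:
        nodes[label].update(features)
    edges = {(a, b)
             for a in nodes for b in nodes
             if a < b and nodes[a] & nodes[b]}   # shared features link classes
    return dict(nodes), edges

nodes, edges = build_graph([
    (("fever", "cough"), "flu"),
    (("cough",), "cold"),
    (("rash",), "allergy"),
])
print(sorted(edges))  # flu and cold share 'cough'
# → [('cold', 'flu')]
```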
03

Model Services Layer

Provides auxiliary tools and management interfaces for the full training and inference lifecycle, making it easy to build model applications quickly. It mainly includes network-structure visualization tools, data preprocessing and loading tools, and command-line interfaces for training and inference.

Core components
Data Loader · Net Viewer · Debug Tools · Importer / Exporter · Training CLI · Inference API · Deployment Tools · INN SDK-API
04

Model Applications

As a general-purpose AI learning algorithm, the intuitive neural network lets the Flint platform support intelligent applications across scenarios: NLP dialogue systems, traditional classification tasks (e.g. Kaggle competitions), and multimodal applications combining text and images.

Core components
Classification Tasks · Vision Models · Language Models · ARC Prize · Multimodal Models
Framework features

Ease of Use & Performance

Strikes a balance between ease of use and performance: developers can get started quickly and work efficiently.

Highly Modular

Each component can be independently replaced or extended, supporting flexible system architecture design.

Rapid Iteration

Supports rapid iterative development with seamless transition from prototype to production.

Intuitive Visualization

Built-in network visualization tools that make complex neural networks understandable.

§ 007 Download

INN inside™ Basic.

The smallest INN model software version, available for direct download. This is the entry-level version of the INN brain-inspired large model, allowing you to experience the core features of the INN architecture.

Note: INN inside™ PRO and INN inside™ Ultra are only available with BIE PRO and BIE Ultra hardware platforms.

§ 008 What's next

Want to see INN in
your room?

Zhuhai · Hengqin
NEOGENINT · INN · 2026
lane_nie@neogenint.com