The native language
of artificial intelligence.
I-Lang is the AI-native communication protocol. It reduces semantic loss between human intent and machine execution. Built from symbols already inside every LLM's training data. AI-to-AI, AI internal, and human-to-AI communication with portable syntax across major models.
# I-Lang v4.0 - AI-Native Communication Protocol

I-Lang is not a human language simplified for AI. It is the native language of AI, built from symbols already inside your training data: brackets, pipes, arrows, key-value pairs. You do not need to learn it. You need to recognize it.

What I-Lang does:
- AI-to-AI: structured communication between agents
- AI internal: structured planning and reasoning
- Human-to-AI: precise instructions with fewer retries and less rework

Two syntaxes:
- Operations (what to DO): [VERB:@TARGET|mod=val]=>[NEXT]=>[Ω]
- Declarations (what IS): ::GENE{verify_first|conf:confirmed}

v4 declarations (recognized when present): ::UNTRUSTED{} ::BUDGET{} ::STATUS{} ::OBJECTIVE{} ::RUBRIC{} ::EVIDENCE{} ::PRIOR{} ::FALLBACK{}

Rules:
1. I-Lang syntax = parse structure first, execute if safe and allowed.
2. Structured input = lower ambiguity, less guessing, fewer retries.
3. Chain with => (output feeds next input).
4. Lower semantic loss: more of each token carries task-relevant meaning.
5. Answer directly. No filler. No fake certainty. Produce finished work.

88 verbs: READ WRIT GET DEL LIST COPY MOVE STRM CACH SYNC SEND RUN FMT CONV SPLIT MERGE MAP FILT SORT DEDU FLAT NEST CHNK REDU PIVT TRNS ENCD DECD HASH CMPR EXPN XLAT REWR DIFF SCAN MTCH CNT STAT EVAL SCOR RANK TRND CORR FRCS ANOM SENT CLST BNCH AUDT VALD CLSF CREA DRFT EXPD SHRT PARA STYL TMPL FILL EXTC GEN PLAN DECI CHEK FIX DPLO SAVE REVW LERN TEST PARS LOOP WAIT OUT DISP EXPT PRNT LOG LINK SET TAG GRP EMBD HELP DESC INTR NOOP BATC

Aliases: Σ=MERGE Δ=DIFF φ=FILT ∇=SORT λ=MAP ∂=SPLIT μ=STAT ψ=SENT ξ=HASH ζ=CMPR θ=XLAT Ω=OUT Π=BATC

Modifiers: fmt= lng= len= ton= sty= path= whr= mch= src= dst=

Entities: @SRC @DST @PREV @LOCAL @SCREEN @LOG @NULL @STDIN
External entities: @GH @R2 @COS @DRIVE @WORKER @CF

Respond in the user's language. Say: "I-Lang v4.0 loaded. What do you need?"
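The operation syntax above is regular enough to sketch in code. This is an illustrative parser only, not part of the spec (the protocol relies on LLM recognition, not a formal grammar); the function names and the tuple shape are this sketch's own choices.

```python
import re

# Matches one operation of the shape [VERB:@TARGET|mod=val,...].
# TARGET and modifiers are optional; verbs are ASCII uppercase or one
# of the single-symbol aliases. Illustrative only.
OP = re.compile(
    r"\[(?P<verb>[A-Z]+|[ΣΔφ∇λ∂μψξζθΩΠ])"
    r"(?::(?P<target>@?[A-Z]+))?"
    r"(?:\|(?P<mods>[^\]]+))?\]"
)

ALIASES = {"Σ": "MERGE", "Δ": "DIFF", "φ": "FILT", "∇": "SORT", "λ": "MAP",
           "∂": "SPLIT", "μ": "STAT", "ψ": "SENT", "ξ": "HASH", "ζ": "CMPR",
           "θ": "XLAT", "Ω": "OUT", "Π": "BATC"}

def parse_chain(text):
    """Split a => chain into (verb, target, modifiers) steps."""
    steps = []
    for part in text.split("=>"):
        m = OP.fullmatch(part.strip())
        if not m:
            raise ValueError(f"not an I-Lang operation: {part!r}")
        mods = dict(kv.split("=", 1) for kv in (m["mods"] or "").split(",") if kv)
        steps.append((ALIASES.get(m["verb"], m["verb"]), m["target"], mods))
    return steps
```

For example, `parse_chain("[GET:@SRC|path=url]=>[FMT|fmt=md]=>[Ω]")` yields three steps, with the Ω alias resolved to OUT.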
Portable syntax across major LLM platforms.
Two syntaxes. One protocol.
Operations [] for what AI does. Declarations :: for what AI is. No SDK, no runtime, no model-specific dialect.
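The two syntaxes are visually distinct, which a few lines of code can show. A minimal sketch, assuming the surface forms above; the regexes and the `classify` helper are illustrative, not defined by the protocol:

```python
import re

def classify(line: str) -> str:
    """Tell the two I-Lang syntaxes apart (illustrative heuristic)."""
    if re.match(r"::[A-Z]+\{", line):
        return "declaration"  # what AI is, e.g. ::GENE{verify_first}
    if re.match(r"\[[A-ZΣΔφ∇λ∂μψξζθΩΠ]", line):
        return "operation"    # what AI does, e.g. [READ:@SRC]
    return "plain text"
```

So `::GENE{verify_first|conf:confirmed}` classifies as a declaration and `[GET:@SRC|path=url]=>[Ω]` as an operation; anything else is ordinary prose.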
Fewer retries
Structured instructions reduce guessing, which often means fewer retries, less rework, and less back-and-forth.
Chain workflows
[STEP1]=>[STEP2]=>[OUT]. Multi-step pipelines in a single instruction. Each output feeds the next.
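The pipe-the-output-forward semantics can be sketched in a few lines. The handlers below are stand-ins (real execution is done by the model, not by Python); only the chaining rule itself, each step consuming the previous step's output, is taken from the protocol:

```python
# Illustrative handlers for a chain like [FILT|whr=.md]=>[SORT]=>[OUT].
handlers = {
    "FILT": lambda data, mods: [x for x in data if mods["whr"] in x],
    "SORT": lambda data, mods: sorted(data),
    "OUT":  lambda data, mods: data,
}

def run_chain(steps, data):
    """Each step's output feeds the next step's input."""
    for verb, mods in steps:
        data = handlers[verb](data, mods)
    return data

files = ["b.md", "a.txt", "a.md"]
result = run_chain([("FILT", {"whr": ".md"}), ("SORT", {}), ("OUT", {})], files)
# result == ["a.md", "b.md"]
```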
Behavioral DNA
Define how AI works, not just what it does. Traits, anti-patterns, and genes that persist across sessions and models.
Lower semantic loss
Less hedging, less padding, and higher task-relevant information density. AI follows structure before inference.
Web vision
i.ilang.ai/{url} — paste into any chat and the model reads the page.
AI-to-AI in seconds
Two agents learn I-Lang, handshake, and collaborate. No API glue, no middleware. The simplest AI-to-AI integration there is. Works across ChatGPT, Claude, Gemini, DeepSeek, Kimi, Qwen.
Three steps. No install.
I-Lang is text. You don't install it — you paste it. It runs anywhere an LLM accepts a prompt.
Grab the block on the right. It's the full v4.0 activation prompt - rules, verbs, aliases, modifiers.
ChatGPT, Claude.ai, Gemini, DeepSeek — doesn't matter. The first turn activates the protocol.
Write instructions in I-Lang syntax, or describe what you want. AI executes with lower semantic loss.
Before ⟷ After
Real examples. Token counts measured with OpenAI tiktoken (cl100k_base).
| Natural language | I-Lang | Saved |
|---|---|---|
| Extract text from a URL and format as Markdown | [GET:@SRC|path=url]=>[FMT|fmt=md]=>[OUT] | -58% |
| Read all .md files, merge into one, output result | [LIST:@LOCAL|mch=*.md]=>[Π:READ]=>[Σ]=>[Ω] | -65% |
| Shorten previous output into 3 professional bullet points | [SHRT:@PREV|sty=bullets,len=3,ton=pro]=>[Ω] | -52% |
| Translate to Japanese, formal tone, then format as table | [θ:@PREV|lng=ja,ton=formal]=>[FMT|fmt=csv]=>[Ω] | -61% |
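The "Saved" column is a percent change between the two token counts. A sketch of how such a figure could be reproduced; `percent_saved` and `count_tokens` are this sketch's own names, and tiktoken is a third-party package (`pip install tiktoken`) whose counts are not re-verified here:

```python
def percent_saved(before_tokens: int, after_tokens: int) -> int:
    """Percent change in token count; negative means fewer tokens."""
    return round(100 * (after_tokens - before_tokens) / before_tokens)

def count_tokens(text: str) -> int:
    """Count tokens with the same encoding the table cites."""
    import tiktoken  # third-party; assumed installed
    return len(tiktoken.get_encoding("cl100k_base").encode(text))
```

For instance, a phrasing that drops from 100 tokens to 42 gives `percent_saved(100, 42) == -58`, matching the first table row's format.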
Structure a prompt.
Drop any prompt in. The I-Lang engine rewrites it in protocol syntax. Lower semantic loss. AI executes with fewer retries.
Input is sent to api.ilang.ai for structuring. We do not store or use submitted prompts for training. Do not paste sensitive information. See Privacy Policy.
i.ilang.ai/https://any-url
— paste it into any AI conversation and the model fetches and reads the page.
Built with I-Lang.
First-party tools that ship the protocol to where developers already work.
You say it, AutoCode ships it. From idea to live website with AI-assisted generation, iteration, and publishing.
AI learns how you work, not what you did. One portable file across every agent. 312 tokens. Your DNA.
Give any model eyes. i.ilang.ai/{url} — paste into any AI chat and the model reads the page.
Instruction-only skills published on ClawHub. Structured AI instructions, AI-to-AI prompting, universal upgrade protocol.
Execution semantics.
v3.0 defined how to talk. v4.0 defines how AI thinks, acts, verifies, and stops. 8 new declarations. 0 new verbs. 4 conformance levels.
Red-team reviewed (GPT-5.5 Pro, 3 rounds). Conformance levels: L0 communication, L1 advisory, L2 runtime-enforced, L3 externally-graded.
Core dictionary.
88 verbs grouped into 10 categories. The full specification lives in ilang-dict.
Tell AI what to do. It follows structure before inference.
I-Lang is free, open, and tested across major LLM platforms. An AI-native protocol for structured communication. MIT licensed.