hanxiao committed (verified) · Commit 7fa51ea · 1 Parent(s): 8f85c5d

Update README.md

Files changed (1): README.md (+2 −0)
README.md CHANGED
@@ -29,6 +29,8 @@ library_name: transformers
 
 `jina-reranker-v3` is a 0.6B parameter multilingual document reranker with a novel *last but not late interaction* architecture. Unlike ColBERT's separate encoding with multi-vector matching, this model performs causal self-attention between query and documents within the same context window, extracting contextual embeddings from the last token of each document.
 
+![jina-reranker-v3 architecture](https://jina-ai-gmbh.ghost.io/content/images/2025/10/Heading--54-.svg)
+
 Built on Qwen3-0.6B with 28 transformer layers and a lightweight MLP projector (1024→512→256), it processes up to 64 documents simultaneously within 131K token context. The model achieves state-of-the-art BEIR performance with 61.94 nDCG@10 while being 10× smaller than generative listwise rerankers.
 
 | Model | Size | BEIR | MIRACL | MKQA | CoIR |
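
The "last but not late" step described in the README — take each document's last-token hidden state from the shared context and project it through the MLP — can be sketched roughly as follows. This is a minimal NumPy sketch, not the model's actual code: the 1024→512→256 sizes come from the README, but the token layout, the ReLU activation, and the cosine-similarity scoring are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)


def mlp_project(x, w1, w2):
    # Hypothetical two-layer MLP projector (1024 -> 512 -> 256), per the README sizes.
    return np.maximum(x @ w1, 0.0) @ w2


# Stand-in hidden states from the decoder: one 1024-dim row per token.
hidden_dim = 1024
seq = rng.standard_normal((50, hidden_dim))

# Assumed layout: query tokens first, then the documents packed into the same
# context window; each document's contextual embedding comes from its LAST token.
query_last = 9            # index of the query's last token (assumption)
doc_last = [19, 34, 49]   # last-token index of each of 3 documents (assumption)

w1 = rng.standard_normal((hidden_dim, 512)) * 0.02
w2 = rng.standard_normal((512, 256)) * 0.02

q = mlp_project(seq[query_last], w1, w2)      # (256,)
docs = mlp_project(seq[doc_last], w1, w2)     # (3, 256)

# Score each document against the query (cosine similarity is an assumption),
# then rank documents by descending score.
scores = docs @ q / (np.linalg.norm(docs, axis=1) * np.linalg.norm(q))
ranking = np.argsort(-scores)
print(ranking)
```

Because the query and documents attend to each other causally in one forward pass, each last-token embedding is already query-conditioned before projection — the contrast with ColBERT's independent encodings that the README draws.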