diff --git a/.gitmodules b/.gitmodules
index d664da54..7999d07c 100644
--- a/.gitmodules
+++ b/.gitmodules
@@ -1,7 +1,7 @@
 [submodule "dflash/deps/llama.cpp"]
 	path = dflash/deps/llama.cpp
-	url = https://github.com/Luce-Org/llama.cpp-dflash-ggml.git
-	branch = luce-dflash
+	url = https://github.com/dusterbloom/llama-cpp-turboquant-cuda.git
+	branch = feature/tq3-kv-cache
 [submodule "dflash/deps/Block-Sparse-Attention"]
 	path = dflash/deps/Block-Sparse-Attention
 	url = https://github.com/mit-han-lab/Block-Sparse-Attention.git
diff --git a/.sisyphus/notes/gemma4-baseline/generate_prompts.py b/.sisyphus/notes/gemma4-baseline/generate_prompts.py
new file mode 100644
index 00000000..3d03069d
--- /dev/null
+++ b/.sisyphus/notes/gemma4-baseline/generate_prompts.py
@@ -0,0 +1,135 @@
+#!/usr/bin/env python3
+"""
+Generate BPE-tokenized prompt files for test_gemma4_dflash --tokens-file.
+Uses HuggingFace Gemma 3 tokenizer (vocab 262144, BOS=2) -- identical to GGUF vocabulary.
+BOS is NOT prepended here; the driver prepends it automatically.
+"""
+
+import os
+from transformers import AutoTokenizer
+
+OUTPUT_DIR = "/home/peppi/Dev/lucebox-hub/.sisyphus/notes/gemma4-baseline/prompts"
+os.makedirs(OUTPUT_DIR, exist_ok=True)
+
+t = AutoTokenizer.from_pretrained('google/gemma-3-27b-it', local_files_only=True)
+
+# ─── Prompt 1: short_chat ────────────────────────────────────────────────────
+SHORT_SOURCE = (
+    "<start_of_turn>user\n"
+    "The quick brown fox jumps over the lazy dog. "
+    "Explain in one paragraph what this sentence demonstrates."
+    "<end_of_turn>\n"
+    "<start_of_turn>model\n"
+)
+short_ids = t.encode(SHORT_SOURCE, add_special_tokens=False)
+
+# ─── Prompt 2: long_open ─────────────────────────────────────────────────────
+LONG_OPEN_SOURCE = (
+    "<start_of_turn>user\n"
+    "Write a short story (about 250 words) about a robot who learns to paint. "
+    "Include dialogue and a clear beginning, middle, and end."
+    "<end_of_turn>\n"
+    "<start_of_turn>model\n"
+)
+long_ids = t.encode(LONG_OPEN_SOURCE, add_special_tokens=False)
+
+# ─── Prompt 3: long_2k — Alice in Wonderland Ch.1 "Down the Rabbit-Hole" ─────
+# Source: Project Gutenberg "Alice's Adventures in Wonderland" by Lewis Carroll
+# URL: https://www.gutenberg.org/cache/epub/11/pg11.txt (public domain)
+# Covers Chapter I "Down the Rabbit-Hole" in full (2611 tokens)
+
+ALICE_TEXT = """Alice was beginning to get very tired of sitting by her sister on the bank, and of having nothing to do: once or twice she had peeped into the book her sister was reading, but it had no pictures or conversations in it, and what is the use of a book, thought Alice, without pictures or conversations? So she was considering in her own mind (as well as she could, for the hot day made her feel very sleepy and stupid), whether the pleasure of making a daisy-chain would be worth the trouble of getting up and picking the daisies, when suddenly a White Rabbit with pink eyes ran close by her.
+
+There was nothing so very remarkable in that; nor did Alice think it so very much out of the way to hear the Rabbit say to itself, Oh dear! Oh dear! I shall be late! (when she thought it over afterwards, it occurred to her that she ought to have wondered at this, but at the time it all seemed quite natural); but when the Rabbit actually took a watch out of its waistcoat-pocket, and looked at it, and then hurried on, Alice started to her feet, for it flashed across her mind that she had never before seen a rabbit with either a waistcoat-pocket, or a watch to take out of it, and burning with curiosity, she ran across the field after it, and fortunately was just in time to see it pop down a large rabbit-hole under the hedge.
+
+In another moment down went Alice after it, never once considering how in the world she was to get out again. The rabbit-hole went straight on like a tunnel for some way, and then dipped suddenly down, so suddenly that Alice had not a moment to think about stopping herself before she found herself falling down a very deep well.
+
+Either the well was very deep, or she fell very slowly, for she had plenty of time as she went down to look about her and to wonder what was going to happen next. First, she tried to look down and make out what she was coming to, but it was too dark to see anything; then she looked at the sides of the well, and noticed that they were filled with cupboards and book-shelves; here and there she saw maps and pictures hung upon pegs. She took down a jar from one of the shelves as she passed; it was labelled ORANGE MARMALADE, but to her great disappointment it was empty: she did not like to drop the jar for fear of killing somebody underneath, so managed to put it into one of the cupboards as she fell past it.
+
+Well, thought Alice to herself, after such a fall as this, I shall think nothing of tumbling down stairs! How brave they'll all think me at home! Why, I wouldn't say anything about it, even if I fell off the top of the house! (Which was very likely true.)
+
+Down, down, down. Would the fall never come to an end? I wonder how many miles I've fallen by this time? she said aloud. I must be getting somewhere near the centre of the earth. Let me see: that would be four thousand miles down, I think-- (for, you see, Alice had learnt several things of this sort in her lessons in the schoolroom, and though this was not a very good opportunity for showing off her knowledge, as there was no one to listen to her, still it was good practice to say it over) --yes, that's about the right distance--but then I wonder what Latitude or Longitude I've got to? (Alice had no idea what Latitude was, or Longitude either, but thought they were nice grand words to say.)
+
+Presently she began again. I wonder if I shall fall right through the earth! How funny it'll seem to come out among the people that walk with their heads downward! The Antipathies, I think-- (she was rather glad there was no one listening, this time, as it didn't sound at all the right word) --but I shall have to ask them what the name of the country is, you know. Please, Ma'am, is this New Zealand or Australia? (and she tried to curtsey as she spoke--fancy curtseying as you're falling through the air! Do you think you could manage it?) And what an ignorant little girl she'll think me for asking! No, it'll never do to ask: perhaps I shall see it written up somewhere.
+
+Down, down, down. There was nothing else to do, so Alice soon began talking again. Dinah'll miss me very much to-night, I should think! (Dinah was the cat.) I hope they'll remember her saucer of milk at tea-time. Dinah my dear! I wish you were down here with me! There are no mice in the air, I'm afraid, but you might catch a bat, and that's very like a mouse, you know. But do cats eat bats, I wonder? And here Alice began to get rather sleepy, and went on saying to herself, in a dreamy sort of way, Do cats eat bats? Do cats eat bats? and sometimes, Do bats eat cats? for, you see, as she couldn't answer either question, it didn't much matter which way she put it. She felt that she was dozing off, and had just begun to dream that she was walking hand in hand with Dinah, and saying to her very earnestly, Now, Dinah, tell me the truth: did you ever eat a bat? when suddenly, thump! thump! down she came upon a heap of sticks and dry leaves, and the fall was over.
+
+Alice was not a bit hurt, and she jumped up on to her feet in a moment: she looked up, but it was all dark overhead; before her was another long passage, and the White Rabbit was still in sight, hurrying down it. There was not a moment to be lost: away went Alice like the wind, and was just in time to hear it say, as it turned a corner, Oh my ears and whiskers, how late it's getting! She was close behind it when she turned the corner, but the Rabbit was no longer to be seen: she found herself in a long, low hall, which was lit up by a row of lamps hanging from the roof.
+
+There were doors all round the hall, but they were all locked; and when Alice had been all the way down one side and up the other, trying every door, she walked sadly down the middle, wondering how she was ever to get out again.
+
+Suddenly she came upon a little three-legged table, all made of solid glass; there was nothing on it except a tiny golden key, and Alice's first thought was that it might belong to one of the doors of the hall; but, alas! either the locks were too large, or the key was too small, but at any rate it would not open any of them. However, on the second time round, she came upon a low curtain she had not noticed before, and behind it was a little door about fifteen inches high: she tried the little golden key in the lock, and to her great delight it fitted!
+
+Alice opened the door and found that it led into a small passage, not much larger than a rat-hole: she knelt down and looked along the passage into the loveliest garden you ever saw. How she longed to get out of that dark hall, and wander about among those beds of bright flowers and those cool fountains, but she could not even get her head through the doorway; and even if my head would go through, thought poor Alice, it would be of very little use without my shoulders. Oh, how I wish I could shut up like a telescope! I think I could, if I only knew how to begin. For, you see, so many out-of-the-way things had happened lately, that Alice had begun to think that very few things indeed were really impossible.
+
+There seemed to be no use in waiting by the little door, so she went back to the table, half hoping she might find another key on it, or at any rate a book of rules for shutting people up like telescopes: this time she found a little bottle on it (which certainly was not here before, said Alice), and round the neck of the bottle was a paper label, with the words DRINK ME beautifully printed on it in large letters.
+
+It was all very well to say Drink me, but the wise little Alice was not going to do THAT in a hurry. No, I'll look first, she said, and see whether it's marked poison or not; for she had read several nice little histories about children who had got burnt, and eaten up by wild beasts and other unpleasant things, all because they would not remember the simple rules their friends had taught them: such as, that a red-hot poker will burn you if you hold it too long; and that if you cut your finger very deeply with a knife, it usually bleeds; and she had never forgotten that, if you drink much from a bottle marked poison, it is almost certain to disagree with you, sooner or later.
+
+However, this bottle was NOT marked poison, so Alice ventured to taste it, and finding it very nice, (it had, in fact, a sort of mixed flavour of cherry-tart, custard, pine-apple, roast turkey, toffee, and hot buttered toast,) she very soon finished it off.
+
+What a curious feeling! said Alice. I must be shutting up like a telescope.
+
+And so it was indeed: she was now only ten inches high, and her face brightened up at the thought that she was now the right size for going through the little door into that lovely garden. First, however, she waited for a few minutes to see if she was going to shrink any further: she felt a little nervous about this; for it might end, you know, said Alice to herself, in my going out altogether, like a candle. I wonder what I should be like then? And she tried to fancy what the flame of a candle is like after the candle is blown out, for she could not remember ever having seen such a thing.
+
+After a while, finding that nothing more happened, she decided on going into the garden at once; but, alas for poor Alice! when she got to the door, she found she had forgotten the little golden key, and when she went back to the table for it, she found she could not possibly reach it: she could see it quite plainly through the glass, and she tried her best to climb up one of the legs of the table, but it was too slippery; and when she had tired herself out with trying, the poor little thing sat down and cried.
+
+Come, there's no use in crying like that! said Alice to herself, rather sharply. I advise you to leave off this minute! She generally gave herself very good advice, (though she very seldom followed it), and sometimes she scolded herself so severely as to bring tears into her eyes; and once she remembered trying to box her own ears for having cheated herself in a game of croquet she was playing against herself, for this curious child was very fond of pretending to be two people. But it's no use now, thought poor Alice, to pretend to be two people! Why, there's hardly enough of me left to make ONE respectable person!
+
+Soon her eye fell on a little glass box that was lying under the table: she opened it, and found in it a very small cake, on which the words EAT ME were beautifully marked in currants. Well, I'll eat it, said Alice, and if it makes me grow larger, I can reach the key; and if it makes me grow smaller, I can creep under the door; so either way I'll get into the garden, and I don't care which happens!
+
+She ate a little bit, and said anxiously to herself, Which way? Which way?, holding her hand on the top of her head to feel which way it was growing, and she was quite surprised to find that she remained the same size: to be sure, this generally happens when one eats cake, but Alice had got so much into the way of expecting nothing but out-of-the-way things to happen, that it seemed quite dull and stupid for life to go on in the ordinary way.
+
+So she set to work, and very soon finished off the cake."""
+
+LONG_2K_SOURCE = f"<start_of_turn>user\n{ALICE_TEXT}<end_of_turn>\n<start_of_turn>model\n"
+long2k_ids = t.encode(LONG_2K_SOURCE, add_special_tokens=False)
+
+# Verify long_2k is in range
+assert 2048 <= len(long2k_ids) <= 3072, \
+    f"long_2k token count {len(long2k_ids)} outside [2048, 3072]"
+
+# ─── Write CSV files ─────────────────────────────────────────────────────────
+def write_csv(path, ids):
+    with open(path, 'w') as f:
+        f.write(','.join(str(i) for i in ids))
+
+write_csv(f"{OUTPUT_DIR}/short_chat.txt", short_ids)
+write_csv(f"{OUTPUT_DIR}/long_open.txt", long_ids)
+write_csv(f"{OUTPUT_DIR}/long_2k.txt", long2k_ids)
+
+# ─── Write .meta sidecars ────────────────────────────────────────────────────
+def write_meta(path, name, tool, chat_template, source_text, ids):
+    with open(path, 'w') as f:
+        f.write(f"file: {name}\n")
+        f.write(f"tool: HuggingFace transformers AutoTokenizer, model=google/gemma-3-27b-it (local cache)\n")
+        f.write(f"tokenizer_vocab_size: 262144\n")
+        f.write(f"gguf_vocab_size: 262144 (verified via gguf.GGUFReader)\n")
+        f.write(f"chat_template_applied: {chat_template}\n")
+        f.write(f"bos_prepended_in_csv: no (driver prepends BOS=2 automatically)\n")
+        f.write(f"token_count: {len(ids)}\n")
+        f.write(f"first_20_ids: {ids[:20]}\n")
+        f.write(f"last_5_ids: {ids[-5:]}\n")
+        f.write(f"source_text:\n{source_text}\n")
+
+write_meta(f"{OUTPUT_DIR}/short_chat.meta", "short_chat.txt", "hf-transformers",
+           "yes", SHORT_SOURCE, short_ids)
+write_meta(f"{OUTPUT_DIR}/long_open.meta", "long_open.txt", "hf-transformers",
+           "yes", LONG_OPEN_SOURCE, long_ids)
+write_meta(f"{OUTPUT_DIR}/long_2k.meta", "long_2k.txt", "hf-transformers",
+           "yes",
+           "Alice in Wonderland Chapter I 'Down the Rabbit-Hole' in full. "
+           "Source: Project Gutenberg https://www.gutenberg.org/cache/epub/11/pg11.txt (public domain). "
+           "2611 tokens, within [2048, 3072] target range.",
+           long2k_ids)
+
+# ─── Summary ─────────────────────────────────────────────────────────────────
+print("=== Tokenized prompt files generated ===")
+print(f"short_chat.txt : {len(short_ids)} tokens, first 20: {short_ids[:20]}, last 5: {short_ids[-5:]}")
+print(f"long_open.txt  : {len(long_ids)} tokens, first 20: {long_ids[:20]}, last 5: {long_ids[-5:]}")
+print(f"long_2k.txt    : {len(long2k_ids)} tokens, first 20: {long2k_ids[:20]}, last 5: {long2k_ids[-5:]}")
+print("All .meta sidecars written.")
+print(f"\nExpected prefill counts with BOS (+1):")
+print(f"  short_chat: {len(short_ids)+1}")
+print(f"  long_open:  {len(long_ids)+1}")
+print(f"  long_2k:    {len(long2k_ids)+1}")
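For reference while reading the `.meta` sidecars below: the CSV format this script writes is deliberately trivial. A minimal sketch of a compatible consumer, assuming only what the docstring states (comma-separated token IDs, no BOS stored in the file, BOS=2 prepended at load time); `load_prompt_tokens` is a hypothetical helper for illustration, not code from this repo:

```python
# Sketch of a --tokens-file consumer mirroring the documented contract:
# the CSV holds bare token IDs, and BOS=2 is prepended by the loader.
def load_prompt_tokens(path: str, bos_id: int = 2) -> list[int]:
    with open(path) as f:
        ids = [int(tok) for tok in f.read().strip().split(",")]
    return [bos_id] + ids  # prefill length = token_count + 1, as printed above

tokens = load_prompt_tokens("prompts/long_2k.txt")
print(len(tokens))  # 2612 for the 2611-token long_2k prompt
```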
diff --git a/.sisyphus/notes/gemma4-baseline/matrix-64k-v2/SUMMARY.md b/.sisyphus/notes/gemma4-baseline/matrix-64k-v2/SUMMARY.md
new file mode 100644
index 00000000..16dfdc0b
--- /dev/null
+++ b/.sisyphus/notes/gemma4-baseline/matrix-64k-v2/SUMMARY.md
@@ -0,0 +1,57 @@
+# Matrix v2 at 64k — all fixes in. 2026-05-09T23:48:10+02:00
+
+=== V1_none starting at 23:48:10 ===
+V1_none rc=0
+=== V2_mtp starting at 23:50:30 ===
+V2_mtp rc=0
+=== V3_dflash_dm8 starting at 23:52:54 ===
+V3_dflash_dm8 rc=0
+
+## Per-cell stats
+
+### V1_none
+```
+[cache] narrow asymmetric: forced Q8_0 on 2 captured full-attn layer(s) (remaining 8 full-attn keep TQ3)
+[cache] kv types: SWA=tq3_0, full=tq3_0
+[prefill] 49904 tokens in 85278.6 ms (585.2 tok/s) [chunked+pflash, chunk_size=1024] (last sampled token: 100)
+[stats] generated=256 decode_ms=37108.4 tok/s=6.90 first_tok_ms=145.88
+[stats] prefill=49904 tokens context_used=50160/65536
+[mem] VRAM used=21.25 GB total=24.00 GB
+```
+
+### V2_mtp
+```
+[cache] narrow asymmetric: forced Q8_0 on 2 captured full-attn layer(s) (remaining 8 full-attn keep TQ3)
+[cache] kv types: SWA=tq3_0, full=tq3_0
+[prefill] 49904 tokens in 85189.9 ms (585.8 tok/s) [chunked+pflash, chunk_size=1024] (last sampled token: 100)
+[mtp] steps=256 accepted=5 accept_rate=0.02
+[stats] generated=256 decode_ms=40432.7 tok/s=6.33 first_tok_ms=164.94
+[stats] prefill=49904 tokens context_used=50160/65536
+[mem] VRAM used=21.70 GB total=24.00 GB
+```
+
+### V3_dflash_dm8
+```
+[cache] narrow asymmetric: forced Q8_0 on 2 captured full-attn layer(s) (remaining 8 full-attn keep TQ3)
+[cache] kv types: SWA=tq3_0, full=tq3_0
+[draft] KV cache allocated: 2096 slots
+[prefill] 49904 tokens in 85184.4 ms (585.8 tok/s) [chunked+pflash, chunk_size=1024] (last sampled token: 100)
+[draft] KV prefill done: 2096 positions materialized (skipped 47808 early tokens, cap=2096)
+[stats] generated=256 decode_ms=27753.8 tok/s=9.22 first_tok_ms=257.06
+[stats] prefill=49904 tokens context_used=50160/65536
+[spec] draft_steps=112 total_accepted=256 avg_accept=2.29
+[mem] VRAM used=23.59 GB total=24.00 GB
+```
+
+## Decoded text comparison (first 80 generated tokens)
+
+### V1_none
+first_80_decoded: 'swe relentlessosock
+8ublicthoughtthought\n### Summary of Themes and Characters\n\nThis text consists of several fragmented scenes (likely from a play or a series of dramatic sketches) focusing on the political instability of Rome and the personal conflicts of its leaders.\n\n#### **Major Themes**\n\n* **Pride vs. Humility:** The'
+
+### V2_mtp
+first_80_decoded: 'swe absorberosthought\n### Summary of Themes and Characters\n\nThe provided text consists of several fragmented scenes (likely from a composite or modified version of Shakespearean-style plays, including elements of *Coriolanus* and *Richard III*). The narrative focuses on the intersection of military glory, political instability, and the volatility of public favor.\n\n#### Major Themes\n\n'
+
+### V3_dflash_dm8
+first_80_decoded: 'swe Brasosthought\n### Summary of Themes and Characters\n\nThe provided text is a fragmented collection of scenes (likely from a composite or modified version of Shakespearean-style plays, blending elements of *Coriolanus* and *Richard III*). It depicts a world of political instability, violent ambition, and the volatile relationship between the ruling elite and the common people.'
+
+DONE
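A quick arithmetic cross-check of the cells above (plain recomputation of the logged values, no new measurements): the V3 `[spec]` line is consistent with `generated=256`, the V3-vs-V1 decode rates imply the speedup, and V2's accept_rate matches accepted/steps.

```python
# Recompute the headline numbers from the logs above.
draft_steps, avg_accept = 112, 2.29   # V3_dflash_dm8 [spec] line
print(draft_steps * avg_accept)       # ~256.5 -> matches generated=256
print(round(9.22 / 6.90, 2))          # ~1.34x decode speedup, V3 vs V1
print(round(5 / 256, 2))              # V2_mtp: accepted=5 of steps=256 -> 0.02
```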
diff --git a/.sisyphus/notes/gemma4-baseline/matrix-64k/SUMMARY.md b/.sisyphus/notes/gemma4-baseline/matrix-64k/SUMMARY.md
new file mode 100644
index 00000000..357873a8
--- /dev/null
+++ b/.sisyphus/notes/gemma4-baseline/matrix-64k/SUMMARY.md
@@ -0,0 +1,71 @@
+# 64k drafter A/B with TQ3 + pFlash (dense 31B) — 2026-05-09T23:05:51+02:00
+Prompt: long_50k.txt (~50k tokens), ctx=65536, n_predict=256
+
+=== T1_none ===
+T1_none rc=0
+=== T2_mtp ===
+T2_mtp rc=0
+=== T3_dflash ===
+T3_dflash rc=143
+
+## Per-cell stats
+
+### T1_none
+```
+[cache] narrow asymmetric: forced Q8_0 on 2 captured full-attn layer(s) (remaining 8 full-attn keep TQ3)
+[cache] kv types: SWA=tq3_0, full=tq3_0
+[prefill] 49904 tokens in 87859.2 ms (568.0 tok/s) [chunked+pflash, chunk_size=1024] (last sampled token: 100)
+[stats] generated=256 decode_ms=37952.4 tok/s=6.75 first_tok_ms=150.63
+[stats] prefill=49904 tokens context_used=50160/65536
+[mem] VRAM used=21.40 GB total=24.00 GB
+```
+
+### T2_mtp
+```
+[cache] narrow asymmetric: forced Q8_0 on 2 captured full-attn layer(s) (remaining 8 full-attn keep TQ3)
+[cache] kv types: SWA=tq3_0, full=tq3_0
+[prefill] 49904 tokens in 87919.8 ms (567.6 tok/s) [chunked+pflash, chunk_size=1024] (last sampled token: 100)
+[mtp-step 8] accept_rate=0.00
+532 81179 108 818 3847 1816 10594 529 [mtp-step 16] accept_rate=0.00
+3131 89144 18583 568 19609 699 496 22907 [mtp-step 24] accept_rate=0.00
+653 12269 3567 529 36951 508 236772 3061 [mtp-step 32] accept_rate=0.00
+10772 236764 2440 4820 529 808 236780 6886 [mtp-step 40] accept_rate=0.03
+40707 605 236829 532 808 40421 8488 236829 [mtp-step 48] accept_rate=0.02
+769 669 22323 21132 580 506 18074 529 [mtp-step 56] accept_rate=0.02
+7820 27877 236764 5255 32202 236764 532 506 [mtp-step 64] accept_rate=0.05
+43866 529 1237 4664 236761 108 2595 18787 [mtp-step 72] accept_rate=0.04
+137944 108 236829 139 1018 203460 532 19839 [mtp-step 80] accept_rate=0.04
+4499 53121 669 6082 12160 84022 2101 506 [mtp-step 88] accept_rate=0.03
+16625 1534 33641 532 125860 236761 102301 605 [mtp-step 96] accept_rate=0.03
+2481 81341 568 236780 6886 40707 605 236768 [mtp-step 104] accept_rate=0.03
+563 496 24240 1933 31451 236764 840 914 [mtp-step 112] accept_rate=0.04
+125688 573 506 3364 1331 532 914 45208 [mtp-step 120] accept_rate=0.03
+531 623 1674 2737 236775 1091 2080 531 [mtp-step 128] accept_rate=0.03
+914 124466 236761 4923 21077 3590 1515 496 [mtp-step 136] accept_rate=0.03
+623 45513 236775 528 506 6114 529 914 [mtp-step 144] accept_rate=0.03
+22816 532 496 179267 531 506 11838 236761 [mtp-step 152] accept_rate=0.03
+107 236829 139 1018 818 6285 26633 529 [mtp-step 160] accept_rate=0.03
+506 623 13666 4637 1083 1018 669 1816 [mtp-step 168] accept_rate=0.02
+46235 506 214696 4135 529 506 3364 1331 [mtp-step 176] accept_rate=0.02
+```
+
+### T3_dflash
+```
+[cache] narrow asymmetric: forced Q8_0 on 2 captured full-attn layer(s) (remaining 8 full-attn keep TQ3)
+[cache] kv types: SWA=tq3_0, full=tq3_0
+```
+
+## First 80 generated tokens (decoded)
+
+### T1_none
+raw extracted (first 80): [49904, 87859, 2, 568, 0, 100, 0, 3, 13, 134, 2, 895, 5, 308, 13, 206, 376, 45518, 100, 45518, 107, 101, 10354, 25252, 529, 137944, 532, 81179, 108, 818, 3847, 1816, 10594, 529, 3131, 89144, 18583, 568, 19609, 699, 496, 22907, 653, 12269, 3567, 529, 36951, 508, 236772, 3061, 10772, 236764, 2440, 4820, 529, 808, 236780, 6886, 40707, 605, 236829, 532, 808, 40421, 8488, 236829, 769, 669, 22323, 21132, 580, 506, 18074, 529, 7820, 27877, 236764, 5255, 32202, 236764]
+decoded (first 80): 'sweולם (\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n Y[multimodal]F�thoughtthought\n### Summary of Themes and Characters\n\nThe provided text consists of several fragmented scenes (likely from a composite or modified version of Shakespearean-style plays, including elements of *Coriolanus* and *Richard III*). The narrative focuses on the intersection of military glory, political instability,'
+
+### T2_mtp
+raw extracted (first 80): [49904, 87919, 8, 567, 6, 100, 100, 45518, 107, 101, 10354, 25252, 529, 137944, 532, 81179, 108, 818, 3847, 1816, 10594, 529, 3131, 89144, 18583, 568, 19609, 699, 496, 22907, 653, 12269, 3567, 529, 36951, 508, 236772, 3061, 10772, 236764, 2440, 4820, 529, 808, 236780, 6886, 40707, 605, 236829, 532, 808, 40421, 8488, 236829, 769, 669, 22323, 21132, 580, 506, 18074, 529, 7820, 27877, 236764, 5255, 32202, 236764, 532, 506, 43866, 529, 1237, 4664, 236761, 108, 2595, 18787, 137944, 108]
+decoded (first 80): 'swe Tahunationthought\n### Summary of Themes and Characters\n\nThe provided text consists of several fragmented scenes (likely from a composite or modified version of Shakespearean-style plays, including elements of *Coriolanus* and *Richard III*). The narrative focuses on the intersection of military glory, political instability, and the volatility of public favor.\n\n#### Major Themes\n\n'
+
+### T3_dflash: no [prefill] marker
+
+DONE
+
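The `decoded (first 80)` strings above can be re-derived from the raw ID lists with the same tokenizer generate_prompts.py uses; a sketch, assuming the same local HF cache as that script (the leading IDs are garbage tokens the harness emitted before coherent text, and special tokens may render as literal markers depending on decode flags):

```python
# Re-decode the raw extracted IDs with the Gemma 3 tokenizer.
from transformers import AutoTokenizer

t = AutoTokenizer.from_pretrained("google/gemma-3-27b-it", local_files_only=True)
t1_raw = [49904, 87859, 2, 568, 0, 100, 0, 3]  # first few T1_none IDs above
print(t.decode(t1_raw))
```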
diff --git a/.sisyphus/notes/gemma4-baseline/matrix-v3/SUMMARY.md b/.sisyphus/notes/gemma4-baseline/matrix-v3/SUMMARY.md
new file mode 100644
index 00000000..8ccb0787
--- /dev/null
+++ b/.sisyphus/notes/gemma4-baseline/matrix-v3/SUMMARY.md
@@ -0,0 +1,6 @@
+# Matrix v3 with SWA mask fix — 2026-05-09T22:17:43+02:00
+=== N1_none_q8_tq3 (K=q8_0 V=tq3_0 draft=none) ===
+N1_none_q8_tq3 rc=0
+=== N2_none_q8_q8 (K=q8_0 V=q8_0 draft=none) ===
+N2_none_q8_q8 rc=0
+=== N3_mtp_q8_tq3 (K=q8_0 V=tq3_0 draft=mtp) ===
diff --git a/.sisyphus/notes/gemma4-baseline/prompts/humaneval_2.meta b/.sisyphus/notes/gemma4-baseline/prompts/humaneval_2.meta
new file mode 100644
index 00000000..58df95a4
--- /dev/null
+++ b/.sisyphus/notes/gemma4-baseline/prompts/humaneval_2.meta
@@ -0,0 +1,6 @@
+tokenizer: google/gemma-3-27b-it
+chat_template: yes (EvalPlus canonical instruction + opening code-fence)
+source: HumanEval/2 (truncate_number)
+token_count: 139
+first_20: [105, 2364, 107, 9366, 2847, 496, 1265, 236772, 66436, 17856, 8948, 600, 64744, 506, 2269, 2608, 528, 496, 127532, 3393]
+last_5: [236787, 107, 2717, 6719, 107]
diff --git a/.sisyphus/notes/gemma4-baseline/prompts/humaneval_2.txt b/.sisyphus/notes/gemma4-baseline/prompts/humaneval_2.txt
new file mode 100644
index 00000000..652b3c51
--- /dev/null
+++ b/.sisyphus/notes/gemma4-baseline/prompts/humaneval_2.txt
@@ -0,0 +1 @@
+105,2364,107,9366,2847,496,1265,236772,66436,17856,8948,600,64744,506,2269,2608,528,496,127532,3393,3355,236787,107,2717,109,2063,102267,236779,5640,236769,5640,236787,6803,236768,3921,6803,236787,107,140,12234,17770,496,4414,18224,1523,1548,236764,625,740,577,81153,1131,107,140,624,11995,912,568,65020,11995,7100,1082,2238,1548,236768,532,70208,107,140,236769,989,1749,912,2462,7100,1082,236743,236770,769,108,140,13293,506,20632,912,529,506,1548,236761,107,140,22539,102267,236779,5640,236769,236800,236761,236810,236768,107,140,236771,236761,236810,107,140,12234,108,2717,106,107,105,4368,107,43760,563,496,17856,8948,607,496,1265,236772,66436,1292,600,64744,506,2608,532,16349,7041,7713,236787,107,2717,6719,107
\ No newline at end of file
diff --git a/.sisyphus/notes/gemma4-baseline/prompts/long_2k.meta b/.sisyphus/notes/gemma4-baseline/prompts/long_2k.meta
new file mode 100644
index 00000000..06f21a67
--- /dev/null
+++ b/.sisyphus/notes/gemma4-baseline/prompts/long_2k.meta
@@ -0,0 +1,11 @@
+file: long_2k.txt
+tool: HuggingFace transformers AutoTokenizer, model=google/gemma-3-27b-it (local cache)
+tokenizer_vocab_size: 262144
+gguf_vocab_size: 262144 (verified via gguf.GGUFReader)
+chat_template_applied: yes
+bos_prepended_in_csv: no (driver prepends BOS=2 automatically)
+token_count: 2611
+first_20_ids: [105, 2364, 107, 85305, 691, 6534, 531, 974, 1401, 20718, 529, 8116, 684, 1116, 12198, 580, 506, 4856, 236764, 532]
+last_5_ids: [106, 107, 105, 4368, 107]
+source_text:
+Alice in Wonderland Chapter I 'Down the Rabbit-Hole' in full. Source: Project Gutenberg https://www.gutenberg.org/cache/epub/11/pg11.txt (public domain). 2611 tokens, within [2048, 3072] target range.
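Each prompt CSV ships with a sidecar like the one above, which makes truncation or accidental re-tokenization cheap to detect. A minimal checker sketch, keyed to the field names used by long_2k.meta (note the humaneval sidecar uses `first_20`/`last_5` instead); the parsing here is illustrative, not driver code:

```python
# Cross-check a prompt CSV against its .meta sidecar: token_count,
# first_20_ids, and last_5_ids must all match the CSV contents.
import ast

def check_prompt(csv_path: str, meta_path: str) -> None:
    with open(csv_path) as f:
        ids = [int(tok) for tok in f.read().strip().split(",")]
    meta = {}
    with open(meta_path) as f:
        for line in f:
            key, sep, val = line.partition(":")
            if sep:
                meta[key.strip()] = val.strip()
    assert len(ids) == int(meta["token_count"])
    assert ids[:20] == ast.literal_eval(meta["first_20_ids"])
    assert ids[-5:] == ast.literal_eval(meta["last_5_ids"])

check_prompt("prompts/long_2k.txt", "prompts/long_2k.meta")
```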
diff --git a/.sisyphus/notes/gemma4-baseline/prompts/long_2k.txt b/.sisyphus/notes/gemma4-baseline/prompts/long_2k.txt new file mode 100644 index 00000000..b9296ac2 --- /dev/null +++ b/.sisyphus/notes/gemma4-baseline/prompts/long_2k.txt @@ -0,0 +1 @@ +105,2364,107,85305,691,6534,531,974,1401,20718,529,8116,684,1116,12198,580,506,4856,236764,532,529,2963,5017,531,776,236787,3622,653,10911,1304,1053,154312,524,1131,506,2260,1116,12198,691,6013,236764,840,625,1053,951,7829,653,23695,528,625,236764,532,1144,563,506,1161,529,496,2260,236764,3305,32858,236764,2180,7829,653,23695,236881,1593,1304,691,11337,528,1116,1852,3666,568,527,1388,618,1304,1451,236764,573,506,3425,1719,1603,1116,2597,1401,105662,532,29123,779,3363,506,17132,529,3043,496,99414,236772,10864,1093,577,5367,506,13007,529,3978,872,532,24374,506,150497,236764,1056,15937,496,7286,62524,607,9514,6114,11536,3107,684,1116,236761,108,3810,691,5017,834,1401,19374,528,600,236793,6271,1602,32858,1751,625,834,1401,1623,855,529,506,1595,531,6899,506,62524,1879,531,4850,236764,7062,23348,236888,7062,23348,236888,564,2863,577,5226,236888,568,14730,1304,3305,625,1024,24347,236764,625,10997,531,1116,600,1304,19452,531,735,38757,657,672,236764,840,657,506,990,625,784,10012,4346,3756,626,840,1056,506,62524,3643,3721,496,4526,855,529,1061,228707,236772,112609,236764,532,6976,657,625,236764,532,1299,69419,580,236764,32858,3931,531,1116,6172,236764,573,625,90982,3418,1116,3666,600,1304,1053,2752,1680,3472,496,27973,607,3477,496,228707,236772,112609,236764,653,496,4526,531,1769,855,529,625,236764,532,18830,607,41998,236764,1304,11536,3418,506,2135,1308,625,236764,532,85746,691,1164,528,990,531,1460,625,2198,1679,496,2455,27973,236772,26026,1208,506,39407,236761,108,902,2264,3479,1679,3939,32858,1308,625,236764,2752,3622,11337,1217,528,506,1902,1304,691,531,974,855,1570,236761,669,27973,236772,26026,3939,6850,580,1133,496,22728,573,1070,1595,236764,532,1299,80698,15937,1679,236764,834,15937,600,32858,1053,711,496,3479,531,1751,1003,25745,13442,1680,1304,1765,13442,14773,1679,496,1401,5268,1388,236761,108,75064,506,1388,691,1401,5268,236764,653,1304,11561,1401,13198,236764,573,1304,1053,11766,529,990,618,1304,3939,1679,531,1385,1003,1116,532,531,5601,1144,691,1771,531,7499,2148,236761,5315,236764,1304,6956,531,1385,1679,532,1386,855,1144,1304,691,4891,531,236764,840,625,691,2311,4996,531,1460,4658,236793,1299,1304,6976,657,506,9174,529,506,1388,236764,532,14275,600,901,964,9772,607,165835,532,2260,236772,136041,2061,236793,1590,532,993,1304,5004,13571,532,7829,15410,3324,152509,236761,2625,3721,1679,496,19498,699,886,529,506,37424,618,1304,6915,236793,625,691,54802,8005,33106,16437,91700,20552,236764,840,531,1116,1822,47531,625,691,7738,236787,1304,1602,711,1133,531,7266,506,19498,573,9891,529,18905,19866,32069,236764,834,10542,531,2247,625,1131,886,529,506,165835,618,1304,11561,3068,625,236761,108,13086,236764,3305,32858,531,13442,236764,1308,1288,496,3798,618,672,236764,564,2863,1751,5017,529,151357,1679,31393,236888,2088,36711,901,236789,859,784,1751,786,657,2033,236888,8922,236764,564,10369,236789,236745,1879,4658,1003,625,236764,1581,768,564,11561,1135,506,1903,529,506,3155,236888,568,24249,691,1401,4547,1847,2907,108,8063,236764,1679,236764,1679,236761,21284,506,3798,2752,2229,531,614,1345,236881,564,5601,1217,1551,7635,564,236789,560,22303,684,672,990,236881,1304,1176,79567,236761,564,1921,577,3978,16581,3541,506,9317,529,506,7764,236761,3792,786,1460,236787,600,1093,577,2390,13460,7635,1679,236764,564,1751,726,568,1708,236764,611,1460,236764,32
858,1053,43654,3131,2432,529,672,4260,528,1116,17205,528,506,2528,3352,236764,532,3635,672,691,711,496,1401,1535,5506,573,6807,1135,1116,4654,236764,618,993,691,951,886,531,10763,531,1116,236764,2036,625,691,1535,5428,531,1879,625,1024,236768,2617,4443,236764,600,236789,236751,1003,506,1447,5149,726,5503,1299,564,5601,1144,165356,653,8099,4637,564,236789,560,2506,531,236881,568,85305,1053,951,4317,1144,165356,691,236764,653,8099,4637,3477,236764,840,3305,901,964,6290,4159,4171,531,1879,2907,108,25278,586,1304,6074,1570,236761,564,5601,768,564,2863,3798,1447,1343,506,7764,236888,2088,12274,625,236789,859,4483,531,2229,855,3571,506,1331,600,3727,607,910,15005,26876,236888,669,223656,651,695,236764,564,1751,726,568,12053,691,4319,16126,993,691,951,886,13723,236764,672,990,236764,618,625,3782,236789,236745,5057,657,784,506,1447,3658,236768,2617,5503,564,2863,735,531,2679,1091,1144,506,1463,529,506,2891,563,236764,611,1281,236761,7323,236764,6464,236789,546,236764,563,672,1799,17564,653,8187,236881,568,624,1304,6956,531,176244,8109,618,1304,13804,726,108404,176244,561,6723,618,611,236789,500,14773,1343,506,2634,236888,3574,611,1751,611,1451,9688,625,17103,1452,1144,614,59590,2268,3953,1304,236789,859,1751,786,573,10980,236888,2301,236764,625,236789,859,2752,776,531,2679,236787,8229,564,2863,1460,625,5267,872,16581,236761,108,8063,236764,1679,236764,1679,236761,2085,691,5017,1663,531,776,236764,834,32858,4949,6074,6931,1570,236761,33857,957,236789,859,4305,786,1401,1623,531,236772,9467,236764,564,1374,1751,236888,568,86088,957,691,506,5866,2907,564,4614,901,236789,859,5630,1116,45050,529,9556,657,11115,236772,2289,236761,33857,957,1041,23348,236888,564,7976,611,964,1679,1590,607,786,236888,2085,659,951,14547,528,506,2634,236764,564,236789,236757,16937,236764,840,611,2473,4682,496,9537,236764,532,600,236789,236751,1401,1133,496,11866,236764,611,1281,236761,2024,776,22797,9039,43512,236764,564,5601,236881,1452,1590,32858,6074,531,974,4319,105662,236764,532,3939,580,6420,531,13442,236764,528,496,107571,4260,529,1595,236764,3574,22797,9039,43512,236881,3574,22797,9039,43512,236881,532,6494,236764,3574,43512,9039,22797,236881,573,236764,611,1460,236764,618,1304,9225,236789,236745,3890,3477,2934,236764,625,3782,236789,236745,1623,4217,837,1595,1304,2247,625,236761,2625,6345,600,1304,691,776,15383,1135,236764,532,1053,1164,22711,531,9156,600,1304,691,9378,1526,528,1526,607,33857,957,236764,532,6420,531,1116,1401,131390,236764,4224,236764,33857,957,236764,3442,786,506,9043,236787,1602,611,3785,9039,496,9537,236881,1056,15937,236764,206256,236888,206256,236888,1679,1304,3588,3324,496,38659,529,33227,532,6299,6895,236764,532,506,3798,691,1024,236761,108,85305,691,711,496,3103,16131,236764,532,1304,32694,872,580,531,1116,6172,528,496,3479,236787,1304,6976,872,236764,840,625,691,784,4996,28962,236793,1680,1116,691,2264,1440,16622,236764,532,506,7286,62524,691,2036,528,14186,236764,12475,19581,1679,625,236761,2085,691,711,496,3479,531,577,5745,236787,3121,3939,32858,1133,506,6573,236764,532,691,1164,528,990,531,6899,625,1879,236764,618,625,6812,496,9895,236764,7062,1041,23896,532,175253,236764,1217,5226,625,236789,236751,3978,236888,2625,691,3107,4977,625,1056,1304,6812,506,9895,236764,840,506,62524,691,951,4890,531,577,3472,236787,1304,1765,13442,528,496,1440,236764,2708,11967,236764,837,691,10035,872,684,496,2050,529,36352,17940,699,506,11414,236761,108,3810,964,13887,784,4886,506,11967,236764,840,901,964,784,23555,236793,532,1056,32858,1053,1010,784,506,1595,1679,886,2678,532,872,506,1032,236764,4875,1418,
5232,236764,1304,16393,46149,1679,506,6029,236764,22967,1217,1304,691,3785,531,974,855,1570,236761,108,132026,1304,3588,3324,496,2268,1806,236772,100872,2633,236764,784,1603,529,5139,4896,236793,993,691,5017,580,625,4533,496,16383,13935,2307,236764,532,32858,236789,236751,1171,3305,691,600,625,2473,8386,531,886,529,506,13887,529,506,11967,236793,840,236764,76897,236888,3477,506,41226,964,2311,2455,236764,653,506,2307,691,2311,1944,236764,840,657,1027,3136,625,1093,711,1932,1027,529,1091,236761,3153,236764,580,506,1855,990,4886,236764,1304,3588,3324,496,2708,31099,1304,1053,711,14275,1680,236764,532,4977,625,691,496,2268,5232,1003,31013,12371,1494,236787,1304,6956,506,2268,13935,2307,528,506,6623,236764,532,531,1116,1822,14933,625,20202,236888,108,85305,8678,506,5232,532,1765,600,625,5378,1131,496,1944,16622,236764,711,1623,6268,1082,496,5186,236772,26026,236787,1304,179360,1679,532,6976,3008,506,16622,1131,506,7278,182118,7972,611,3785,5004,236761,2088,1304,161392,531,974,855,529,600,4996,11967,236764,532,52659,1003,3571,1724,21622,529,7804,7983,532,1724,5427,136411,236764,840,1304,1451,711,1581,974,1116,2228,1343,506,99610,236793,532,1581,768,1041,2228,1093,817,1343,236764,3305,6934,32858,236764,625,1093,577,529,1401,2268,1161,2180,1041,28470,236761,7062,236764,1217,564,7976,564,1451,13213,872,1133,496,54516,236888,564,1751,564,1451,236764,768,564,1186,7261,1217,531,3654,236761,1701,236764,611,1460,236764,834,1551,855,236772,1340,236772,1437,236772,2677,2432,1053,8432,33208,236764,600,32858,1053,22711,531,1751,600,1401,2321,2432,11161,964,2126,11449,236761,108,3810,10012,531,577,951,1161,528,9495,684,506,2268,5232,236764,834,1304,3939,1063,531,506,2633,236764,3746,18430,1304,2473,1586,2264,2307,580,625,236764,653,657,1027,3136,496,2260,529,6366,573,83525,1331,872,1133,107125,236787,672,990,1304,1765,496,2268,11988,580,625,568,7650,8454,691,711,1590,1680,236764,1176,32858,779,532,4886,506,10225,529,506,11988,691,496,3627,5346,236764,607,506,4171,19588,25889,15932,29830,13762,580,625,528,2455,11739,236761,108,1509,691,784,1401,1388,531,1879,59063,786,236764,840,506,21608,2268,32858,691,711,1771,531,776,36944,528,496,49657,236761,2301,236764,564,236789,859,1385,1171,236764,1304,1176,236764,532,1460,3363,625,236789,236751,11373,23572,653,711,236793,573,1304,1053,1676,3131,6290,2268,54949,1003,2940,1015,1053,2506,47058,236764,532,35751,872,684,6877,75710,532,1032,52794,2432,236764,784,1547,901,1093,711,5630,506,3606,6366,910,4690,1053,14582,1091,236787,1288,618,236764,600,496,2604,236772,9341,42470,795,8141,611,768,611,2768,625,2311,1440,236793,532,600,768,611,3463,822,15599,1401,19297,607,496,21171,236764,625,4781,10214,6124,236793,532,1304,1053,2752,27971,600,236764,768,611,6092,1623,699,496,11988,11373,23572,236764,625,563,4180,2953,531,39188,607,611,236764,32909,653,3209,236761,108,9675,236764,672,11988,691,5244,11373,23572,236764,834,32858,105915,531,11613,625,236764,532,8159,625,1401,6290,236764,568,509,1053,236764,528,1707,236764,496,4260,529,9726,42717,529,30153,236772,144993,236764,140245,236764,18589,236772,17641,236764,59681,40565,236764,531,6453,236764,532,3425,11018,524,40233,44204,1304,1401,4949,8585,625,1135,236761,108,3689,496,23210,8178,236888,1176,32858,236761,564,1921,577,83525,872,1133,496,54516,236761,108,3133,834,625,691,11161,236787,1304,691,1492,1186,3595,12371,1494,236764,532,1116,3392,7804,4027,872,657,506,3305,600,1304,691,1492,506,1447,2425,573,1771,1343,506,2268,5232,1131,600,14954,7972,236761,5315,236764,3685,236764,1304,35033,573,496,2321,4310,531,1460,768,1304,69
1,1771,531,35426,1027,3342,236787,1304,6345,496,2268,20100,1003,672,236793,573,625,2473,1345,236764,611,1281,236764,1176,32858,531,13442,236764,528,1041,1771,855,29654,236764,1133,496,27311,236761,564,5601,1144,564,1374,577,1133,1299,236881,1452,1304,6956,531,28149,1144,506,27400,529,496,27311,563,1133,1308,506,27311,563,41757,855,236764,573,1304,1451,711,5630,3785,2963,3472,1288,496,3210,236761,108,6259,496,1651,236764,8159,600,5017,919,8432,236764,1304,6544,580,1771,1131,506,7972,657,3622,236793,840,236764,76897,573,6934,32858,236888,1056,1304,2506,531,506,5232,236764,1304,1765,1304,1053,27971,506,2268,13935,2307,236764,532,1056,1304,3939,1063,531,506,2633,573,625,236764,1304,1765,1304,1451,711,11963,5370,625,236787,1304,1451,1460,625,4346,76206,1343,506,4896,236764,532,1304,6956,1116,1791,531,17085,872,886,529,506,14897,529,506,2633,236764,840,625,691,2311,85783,236793,532,1056,1304,1053,20718,13442,855,607,4875,236764,506,6934,2268,3210,2838,1679,532,41641,236761,108,33190,236764,993,236789,236751,951,1161,528,34955,1133,600,236888,1176,32858,531,13442,236764,4319,42269,236761,564,28136,611,531,5264,1135,672,9641,236888,2625,6816,5877,13442,1401,1535,9106,236764,568,3480,1304,1401,44903,6641,625,779,532,6494,1304,225262,13442,834,35971,618,531,3437,24947,1131,1116,6114,236793,532,3622,1304,25159,4875,531,3673,1116,1852,23896,573,2963,113687,13442,528,496,2290,529,7281,13396,1304,691,5662,2342,13442,236764,573,672,23210,1919,691,1401,10667,529,83520,531,577,1156,1331,236761,2024,625,236789,236751,951,1161,1492,236764,3305,6934,32858,236764,531,40890,531,577,1156,1331,236888,8922,236764,993,236789,236751,20060,3487,529,786,2378,531,1386,33015,70102,1589,236888,108,86420,1116,7068,11561,580,496,2268,4896,3673,600,691,17164,1208,506,2633,236787,1304,8678,625,236764,532,1765,528,625,496,1401,1944,12580,236764,580,837,506,4171,645,1375,15932,964,29830,11373,528,169713,236761,7134,236764,564,236789,859,9039,625,236764,1176,32858,236764,532,768,625,3590,786,2171,6268,236764,564,740,5370,506,2307,236793,532,768,625,3590,786,2171,7100,236764,564,740,32753,1208,506,5232,236793,834,3477,1595,564,236789,859,974,1131,506,7972,236764,532,564,1537,236789,236745,2065,837,9439,236888,108,5778,29797,496,2268,3103,236764,532,1176,143428,531,13442,236764,15311,1595,236881,15311,1595,19591,7046,1116,1526,580,506,1903,529,1116,2228,531,2597,837,1595,625,691,6730,236764,532,1304,691,4346,15726,531,1586,600,1304,12800,506,1638,2425,236787,531,577,2889,236764,672,6816,9439,1056,886,55936,12580,236764,840,32858,1053,2506,834,1623,1131,506,1595,529,27780,5017,840,855,236772,1340,236772,1437,236772,2677,2432,531,7499,236764,600,625,10012,4346,39212,532,29123,573,1972,531,817,580,528,506,12822,1595,236761,108,4324,1304,1076,531,981,236764,532,1401,4949,8585,1135,506,12580,236761,106,107,105,4368,107 \ No newline at end of file diff --git a/.sisyphus/notes/gemma4-baseline/prompts/long_50k.meta b/.sisyphus/notes/gemma4-baseline/prompts/long_50k.meta new file mode 100644 index 00000000..23f27e70 --- /dev/null +++ b/.sisyphus/notes/gemma4-baseline/prompts/long_50k.meta @@ -0,0 +1,6 @@ +tokenizer: google/gemma-3-27b-it (vocab matches Gemma4 GGUF byte-for-byte) +chat_template: yes (user turn = Shakespeare summarization request) +source: /home/peppi/research/full_tiny_shakespeare.txt (first 172101 chars) +token_count: 49903 (driver will prepend BOS -> 49904 prefill) +first_20: [105, 2364, 107, 818, 2269, 563, 496, 38974, 529, 1816, 236761, 8847, 625, 13058, 236764, 1299, 49573, 506, 3262, 22888] +last_5: [106, 107, 105, 
4368, 107] diff --git a/.sisyphus/notes/gemma4-baseline/prompts/long_50k.txt b/.sisyphus/notes/gemma4-baseline/prompts/long_50k.txt new file mode 100644 index 00000000..e628b341 --- /dev/null +++ b/.sisyphus/notes/gemma4-baseline/prompts/long_50k.txt @@ -0,0 +1 @@ +105,2364,107,818,2269,563,496,38974,529,1816,236761,8847,625,13058,236764,1299,49573,506,3262,22888,532,7579,236761,108,7280,67386,236787,107,13286,692,7162,1027,3342,236764,6899,786,8988,236761,108,3243,236787,107,130171,236764,8988,236761,108,7280,67386,236787,107,3048,659,784,21891,4319,531,1778,1082,531,1726,1044,236881,108,3243,236787,107,114214,236761,21891,236761,108,7280,67386,236787,107,7280,236764,611,1281,102301,605,2481,81341,563,9329,13550,531,506,1331,236761,108,3243,236787,107,1882,1281,236789,236745,236764,692,1281,236789,236745,236761,108,7280,67386,236787,107,6481,775,11807,1515,236764,532,692,236789,859,735,15689,657,1023,1852,3385,236761,107,4602,236789,236745,496,29663,236881,108,3243,236787,107,3771,919,6931,580,236789,236745,236793,1531,625,577,3028,236787,3121,236764,3121,236888,108,12400,67386,236787,107,4906,3658,236764,1535,11838,236761,108,7280,67386,236787,107,1882,659,36020,6934,11838,236764,506,200997,5990,1535,236761,107,3689,9747,1270,1794,1258,580,1093,38781,775,236787,768,901,107,41928,6422,775,840,506,102761,665,236764,1651,625,964,107,1914,6909,761,236764,692,2473,8844,901,48040,775,3246,953,236793,107,5503,901,1751,692,659,2311,23348,236787,506,674,210539,600,107,2806,79055,775,236764,506,2495,529,1023,67735,236764,563,618,614,107,46589,531,2931,1117,910,26444,236793,1023,107,236751,6039,831,563,496,7411,531,1091,3792,775,47812,672,607,107,700,510,14493,236764,16509,692,3291,1459,10877,236787,573,506,33408,1281,564,107,86137,672,528,40786,573,12175,236764,711,528,51705,573,47812,236761,108,12400,67386,236787,107,38786,611,7162,4285,2342,102301,605,2481,81341,236881,108,3243,236787,107,91561,1515,1171,236787,668,236789,236751,496,1401,4799,531,506,3364,6368,236761,108,12400,67386,236787,107,24501,611,1144,3019,668,815,3028,573,914,2891,236881,108,7280,67386,236787,107,26546,1388,236793,532,1451,577,3004,531,2583,1515,1535,107,15443,7845,236764,840,600,668,15369,5668,607,1646,11307,236761,108,12400,67386,236787,107,197615,236764,840,8988,711,219853,236761,108,7280,67386,236787,107,236777,1879,31273,611,236764,1144,668,46440,3028,85867,236764,668,1602,107,509,531,600,1345,236787,3635,3538,236772,22735,51244,1758,740,577,107,3955,531,1879,625,691,573,914,2891,668,1602,625,531,107,40970,914,5946,532,531,577,22043,11307,236793,837,668,107,511,236764,1581,8421,506,30618,529,914,31886,236761,108,12400,67386,236787,107,3689,668,3914,1601,528,914,4135,236764,611,2881,496,107,85078,528,1515,236761,1599,1921,528,951,1595,1879,668,563,23203,536,806,236761,108,7280,67386,236787,107,2859,564,1921,711,236764,564,1202,711,577,96782,529,64496,236793,107,499,46440,43945,236764,607,34351,236764,531,22961,528,51881,236761,107,3689,134107,659,1239,236881,669,1032,2678,512,236789,506,3207,107,511,45410,236787,3217,4196,692,865,1194,1590,236881,531,506,44264,236888,108,3243,236787,107,33190,236764,2229,236761,108,7280,67386,236787,107,28824,236888,1015,3952,1590,236881,108,12400,67386,236787,107,236824,43195,10186,501,8875,115173,40768,236793,886,600,46440,2462,9312,107,1437,1331,236761,108,7280,67386,236787,107,2209,236789,236751,886,11481,3487,236787,1093,784,506,1884,964,834,236888,108,44180,1439,134560,236787,107,3689,981,236789,236751,236764,1041,148023,236764,528,1526,236881,1298,817,611,107,3497,43512,532,195
71,236881,669,4217,236881,8988,236764,564,14098,611,236761,108,7280,67386,236787,107,7711,1960,563,711,11908,531,506,112936,236793,901,735,107,14712,19764,2395,672,101663,1144,692,31636,531,776,236764,107,7650,1492,692,236789,859,1407,756,581,528,50898,236761,2195,1879,6934,107,2914,7428,735,3188,121520,236787,901,2863,1281,692,107,17777,3188,12162,2311,236761,108,44180,1439,134560,236787,107,11355,236764,39880,236764,1041,1535,4690,236764,10701,11481,40304,236764,107,15600,611,57112,74445,236881,108,7280,67386,236787,107,1882,3914,236764,17536,236764,692,659,134342,3016,236761,108,44180,1439,134560,236787,107,236777,3442,611,236764,4690,236764,1346,43455,2065,107,19845,506,200997,5990,529,611,236761,1701,822,8150,236764,107,11069,15944,528,672,193874,236764,611,1149,618,1388,107,92159,657,506,20808,607,822,579,3740,618,12693,1091,107,91561,506,10995,1883,236764,5769,3003,795,580,107,818,1595,625,4716,236764,51364,3595,13460,143532,107,4088,919,3188,3205,618,6460,1082,740,3785,107,111901,528,822,156494,236761,1701,506,193874,236764,107,818,33408,236764,711,506,200997,5990,236764,1386,625,236764,532,107,11069,40027,531,1091,236764,711,12162,236764,1921,1601,236761,1429,697,236764,107,3048,659,36486,684,153610,107,1214,2853,1298,919,93525,611,236764,532,611,155834,107,818,1394,1356,512,236789,506,1883,236764,1015,2065,573,611,1133,49267,236764,107,4420,611,57391,1091,618,22816,236761,108,7280,67386,236787,107,31458,573,775,236888,6288,236764,11161,236888,2195,770,236789,497,47694,573,775,107,40253,236787,17477,775,531,1726,1044,236764,532,910,4762,236772,33884,107,236755,1043,1992,607,11261,236793,1386,1511,160628,573,775,2623,236764,531,107,9442,775,12914,236793,88649,6376,1027,87932,1134,107,48231,2342,506,8326,236764,532,2847,919,107,96567,4776,51655,6376,236764,531,7797,872,532,85827,107,1437,6934,236761,1637,506,28481,9039,775,711,872,236764,901,795,236793,532,107,13534,236789,236751,784,506,2765,901,10591,775,236761,108,44180,1439,134560,236787,107,75064,611,1921,107,15928,634,74445,151582,44904,236764,107,3524,577,17396,529,128005,236761,564,2863,3442,611,107,236776,5497,22720,236787,625,1149,577,611,735,6827,625,236793,107,4573,236764,2338,625,14736,1041,5708,236764,564,795,23937,107,2021,104127,756,236745,496,2268,919,236761,108,7280,67386,236787,107,13086,236764,564,236789,859,6899,625,236764,17536,236787,3819,611,1921,711,1751,531,107,236760,982,1135,1023,87270,607,496,22720,236787,840,236764,614,756,236745,5091,107,7624,236764,5518,236761,108,44180,1439,134560,236787,107,3810,691,496,990,1056,784,506,2742,236789,236751,3295,107,1479,11761,236789,236753,2342,506,36718,236764,5478,17396,625,236787,107,6372,1186,1133,496,130045,625,1602,4595,107,236777,236789,506,38362,512,236789,506,2742,236764,34982,532,723,5034,236764,107,31717,8578,40343,506,2518,624,236764,2752,18752,107,17729,16297,607,506,1884,236764,1298,506,1032,17747,107,15562,1460,532,6899,236764,68701,236764,14875,236764,3727,236764,2597,236764,107,3133,236764,44294,14309,236764,1602,13933,107,23450,236748,506,39485,532,32893,3364,107,4088,506,3697,2742,236761,669,36718,3890,236789,236753,726,108,7280,67386,236787,107,13086,236764,17536,236764,1144,3890,1603,506,36718,236881,108,44180,1439,134560,236787,107,39125,236764,564,2863,3442,611,236761,3227,496,2712,529,14819,236764,107,24249,770,236789,497,3588,699,506,38464,236764,840,1581,5478,726,107,2542,236764,1385,611,236764,564,1149,1386,506,36718,14819,107,2205,1388,618,8988,726,509,5787,20645,586,24406,107,2021,506,115615,524,3295,236764,506,5333,77241,4688,107,6372,117
85,1178,914,19971,236793,1581,834,1346,4691,586,107,2205,611,64771,1023,80237,573,600,107,7634,659,711,1288,618,611,236761,108,7280,67386,236787,107,11069,36718,236789,236751,3890,236881,2900,236888,107,818,9615,586,236772,236755,54293,2228,236764,506,94014,7068,236764,107,818,193997,3710,236764,506,3774,1023,31451,236764,107,7711,2837,524,506,2420,236764,506,28166,1023,39976,2113,236761,107,3497,1032,30585,6632,532,73661,7351,107,902,672,1023,9130,236764,768,600,901,726,108,44180,1439,134560,236787,107,3689,1299,236881,107,236789,14555,786,236764,672,12339,26266,236888,2900,1299,236881,1144,1299,236881,108,7280,67386,236787,107,31336,684,506,505,741,136774,36718,577,85827,236789,236753,236764,107,15938,563,506,19326,512,236789,506,2742,74077,108,44180,1439,134560,236787,107,13086,236764,1144,1299,236881,108,7280,67386,236787,107,818,4937,11362,236764,768,901,1602,28686,236764,107,3689,1451,506,36718,3890,236881,108,44180,1439,134560,236787,107,236777,795,3442,611,107,2859,611,236789,859,171130,496,1944,726,1340,1144,611,735,2268,726,107,236791,84182,74951,236764,611,236789,859,6899,506,36718,236789,236751,3890,236761,108,7280,67386,236787,107,48184,236789,500,1440,1003,625,236761,108,44180,1439,134560,236787,107,10282,786,672,236764,1535,4389,236793,107,11069,1346,23674,36718,691,49438,236764,107,4348,59555,1133,914,1055,12641,236764,532,5478,3890,236789,236753,236787,107,236789,4339,563,625,236764,1041,27968,4690,6945,690,1235,668,236764,107,236789,6372,564,5908,506,2870,2780,657,1171,236764,107,24249,611,776,3892,3324,236793,532,4691,625,563,236764,107,17574,564,1006,506,4762,236772,6367,532,506,7805,107,4088,506,3697,2742,236787,840,236764,768,611,776,5630,236764,107,236777,5039,625,1343,506,25971,529,822,4806,236764,107,14986,531,506,4054,236764,506,3710,236764,531,506,11171,512,236789,506,7875,236793,107,3133,236764,1343,506,215121,532,15913,529,880,236764,107,818,32879,38051,532,1944,26819,43608,107,4663,786,5908,600,3756,86179,107,10936,2003,901,3892,236787,532,3635,600,784,657,3622,236764,107,3048,236764,1041,1535,4690,6945,726,1580,3189,506,36718,236764,1686,786,74077,108,7280,67386,236787,107,43320,236764,17536,236793,1388,236764,1388,236761,108,44180,1439,134560,236787,107,236789,31382,784,657,3622,3914,107,10185,1144,564,776,5518,855,531,1546,236764,107,40524,564,740,1386,1041,24391,872,236764,600,784,107,4663,786,776,1063,5908,506,18763,529,784,236764,107,3133,5264,786,840,506,50767,7085,2900,1879,611,531,236789,236745,236881,108,7280,67386,236787,107,1509,691,614,3890,236787,1217,5510,611,672,236881,108,44180,1439,134560,236787,107,818,80237,529,13706,659,672,1535,36718,236764,107,3133,611,506,5333,77241,3295,236793,573,17318,107,37107,236106,532,910,41927,236764,23219,2432,62960,107,128847,506,692,514,512,236789,506,3364,236764,611,2863,1586,107,3771,1237,7458,837,611,5908,107,4573,625,24676,653,3952,699,1091,531,611,107,3133,951,1595,699,74445,236761,2900,776,611,1751,236764,107,3048,236764,506,1822,23670,529,672,14510,236881,108,7280,67386,236787,107,236777,506,1822,23670,236888,3217,506,1822,23670,236881,108,44180,1439,134560,236787,107,2542,600,236764,1646,886,512,236789,506,14723,236764,2280,598,236764,85769,236764,107,4088,672,1346,21608,68730,236764,35627,817,236789,540,48398,236787,107,178651,24972,1653,236764,600,1610,14588,528,4806,531,1845,236764,107,66825,236789,540,1171,531,3345,1070,132422,236761,107,4573,1386,611,5508,822,24807,43512,532,19571,236787,107,97375,532,1116,27336,659,657,506,1523,529,10041,236793,107,818,886,2678,1921,735,118801,236761,107,192021,2
36764,29417,2481,81341,236888,108,19954,9575,3118,236787,107,12879,236761,2900,236789,236751,506,4217,236764,611,88149,1434,101077,1303,236764,107,6372,236764,71113,506,6934,129708,529,822,8737,236764,107,13185,74445,1060,9082,236881,108,7280,67386,236787,107,1882,735,3785,822,1535,3658,236761,108,19954,9575,3118,236787,107,2209,600,795,2583,1535,4171,531,44543,795,111771,107,221706,651,159338,851,236761,2900,1093,611,735,236764,611,33251,236764,107,6372,1133,6271,8118,6271,3653,236881,506,886,2296,98853,611,236764,107,818,1032,3590,611,11307,236761,1293,600,57910,531,611,236764,107,10936,668,1374,1586,611,78023,236764,15249,611,678,619,236793,107,10936,159564,236764,119072,236787,611,659,951,1270,497,236764,951,236764,107,55771,563,506,11559,529,4304,3324,506,8205,236764,107,3524,89352,11937,528,506,3768,236761,5180,31886,563,107,2021,1386,1515,26721,5769,39906,1159,780,507,1515,107,3133,57391,600,11933,1602,625,236761,107,15938,34904,75687,107,3984,21165,822,17554,236793,532,822,81142,659,107,236776,14504,880,236789,236751,39485,236764,1015,33447,1346,600,107,24249,1093,3553,914,16966,236761,1293,600,9796,107,41768,822,127804,129129,607,43756,529,2080,107,3133,668,8548,1679,144275,607,139113,236761,47608,12444,236888,12267,22419,236881,107,3497,1418,9641,611,776,2352,496,3666,236764,107,3133,2246,1515,29417,600,691,1492,822,17554,236764,107,142437,107110,600,691,822,99095,236761,2900,236789,236751,506,4217,236764,107,6372,528,1239,3131,6666,529,506,3207,107,3048,4665,2342,506,29417,112936,236764,1015,236764,107,14713,506,33408,236764,2514,611,528,54250,236764,837,1663,107,38786,4387,580,886,2264,236881,2900,236789,236751,910,12985,236881,108,44180,1439,134560,236787,107,2542,15689,657,910,1852,6224,236793,192085,236764,901,1879,236764,107,818,3207,563,1388,11628,236761,108,19954,9575,3118,236787,107,87481,756,581,236888,2195,1879,236888,107,7634,236789,859,2178,684,506,4304,236764,532,76003,531,1281,107,3689,236789,236751,3028,858,236789,506,44264,236793,1015,236789,236751,1133,531,8570,236764,107,15938,142781,532,1015,61085,236793,2678,92137,107,624,2583,855,107,1505,928,1762,66116,236793,3043,7789,3188,107,3133,1127,1641,2395,1288,618,1975,711,528,910,64030,107,43760,910,223939,11795,236761,2195,1879,993,236789,236751,107,99208,3487,236888,107,38786,506,102210,6267,16551,910,71617,236764,107,3133,1531,786,1161,1041,26114,236764,564,236789,859,1386,496,81035,107,3497,11252,529,1239,7737,236789,236753,47008,236764,618,1494,107,2205,564,1451,4351,1041,63180,236761,108,44180,1439,134560,236787,107,197615,236764,1239,659,4180,21410,72386,236793,107,2542,3635,119972,901,6220,25067,236764,107,40524,659,901,11332,209109,236761,2024,236764,564,5426,2167,574,611,236764,107,3689,3189,506,1032,101827,236881,108,19954,9575,3118,236787,107,7634,659,26978,236787,13098,756,581,236888,107,7634,1176,901,964,614,236772,169233,236793,53647,236789,236753,12034,608,56735,236764,107,6372,40786,16689,10810,11595,236764,600,12414,1921,9039,236764,107,6372,11495,691,1603,573,90965,236764,600,506,33408,3265,711,107,59519,573,506,8326,1758,1186,236787,607,1239,225698,107,7634,178305,910,28686,1013,236793,837,1646,3890,236789,236753,236764,107,3133,496,14071,13416,1091,236764,496,17163,886,726,107,2021,2541,506,3710,529,62811,236764,107,3133,1386,16627,2066,1385,24327,726,20890,28120,910,17126,107,2205,901,1093,13098,1091,580,506,63891,512,236789,506,16254,236764,107,3138,32293,910,166943,236761,108,44180,1439,134560,236787,107,3689,563,13416,1091,236881,108,19954,9575,3118,236787,107,40756,12761,11664,531,12250,91
[Generated prompt token files: long runs of comma-separated Gemma 3 BPE token IDs (vocab 262144), the output of the prompt-generation script, intended as --tokens-file input for the test driver. The raw dumps were hard-wrapped in the original, splitting individual token IDs across line breaks; the full, unreadable token streams are omitted here.]
14,107,818,189333,991,24236,1680,1515,236787,577,1791,5049,107,2267,512,236789,497,236772,6473,236789,236753,10995,532,858,236789,506,57292,236789,236751,1927,107,236773,65875,1806,65529,616,236787,18725,17492,236789,236751,1265,668,1645,236764,107,3133,19847,1515,580,914,21980,236787,528,600,1719,236789,236751,108014,236764,107,4420,668,2473,1134,506,3875,528,506,8317,236764,107,2209,12183,1791,880,858,236789,506,2135,236764,532,573,914,786,524,107,31403,69135,236772,10633,607,506,32049,236761,4923,53868,3911,107,3366,236772,7703,236789,236753,5478,236764,668,176922,1133,496,5442,236764,107,3133,528,506,177384,529,85989,34998,2338,107,2209,537,3206,236789,236753,784,73787,529,506,99095,236761,1701,672,1774,236764,107,13286,532,528,146066,8244,236764,1531,786,1879,236764,107,236777,3914,8988,1515,2033,236787,668,197735,236789,236753,506,1378,6217,236793,107,3133,684,914,8401,2591,1603,506,97287,107,22354,13614,1131,7932,236787,618,59238,1680,107,236776,16878,1208,22687,236764,834,1758,41179,236789,236753,107,3133,11561,3426,914,15384,236787,914,26114,236764,4355,236789,236751,18743,236764,107,10936,625,1602,1686,236764,625,3721,236793,699,3392,531,3998,107,2209,691,496,3210,529,4806,236764,5769,1418,5776,107,31403,51422,607,26328,73545,236787,7057,668,5273,236789,236753,107,818,53243,13818,529,506,3207,236764,837,668,16715,107,3497,704,63221,54558,236793,11278,1933,3588,1135,236764,107,3133,607,496,11059,60056,19847,107,236780,6886,8244,1133,496,13401,236787,1492,784,236789,236751,914,236787,107,4420,236764,684,532,684,236764,506,7038,529,3653,13494,161376,107,15989,5508,5113,236793,1299,6850,914,37402,7304,107,1479,236772,576,9471,236789,236753,1144,528,32713,691,6477,11870,236764,107,3133,531,506,10041,3588,668,236793,1298,668,1602,107,7306,544,55909,512,236789,497,506,6176,529,1758,236764,618,768,107,236789,30379,627,496,73315,75052,236787,532,8421,692,2246,236789,236753,107,22186,2135,532,3207,35079,236764,668,2752,15032,107,2021,13358,914,16489,607,17280,522,236761,108,44180,1439,134560,236787,107,236824,43195,880,236888,108,7280,28579,236787,107,2209,3914,840,607,4113,4691,506,88734,107,24249,692,68701,1515,236761,108,2457,16008,134560,236787,107,7711,151624,668,12261,236789,236753,657,236764,107,3133,1385,236789,236753,3324,2432,26419,618,901,964,107,818,3364,203372,529,506,1902,236787,668,23203,1713,2344,107,55771,67735,4850,1093,2583,236793,28281,107,15989,50898,607,3490,1091,236764,532,563,3004,107,2021,8559,506,990,531,1345,625,236761,108,44180,1439,134560,236787,107,2209,236789,236751,1447,29417,236787,107,6481,1515,577,2246,236789,236753,573,236761,108,7280,28579,236787,107,7029,3468,19386,44477,236761,108,193907,236787,107,2209,139092,3196,236761,108,44180,1439,134560,236787,107,818,112936,236764,3468,19386,44477,236764,659,1388,17135,107,2021,1386,44543,57292,236761,108,38850,236777,3769,1536,3118,236787,107,236777,776,47700,1091,2036,107,4754,1972,532,3019,236761,108,44180,1439,134560,236787,107,1509,1299,7474,107,6372,611,776,8988,531,506,1331,236761,108,38850,236777,3769,1536,3118,236787,107,236777,776,5426,2167,574,611,236764,107,6481,786,512,236789,497,196823,600,2401,236764,573,564,3914,107,25240,580,506,38174,236764,1975,41718,532,1175,1059,1091,236764,107,2542,1041,40238,236789,24273,236764,531,2583,910,135196,236787,5091,611,107,6372,564,1149,1786,672,3490,236761,108,206908,68251,3118,236787,107,39125,236764,506,1331,107,15545,735,910,27892,236793,13637,795,901,59435,107,4906,71427,529,17620,236761,108,44180,1439,134560,236787,107,25240,1091,711,531,236789,236745,236
787,107,100985,611,236764,817,4691,611,531,506,2401,532,107,13751,531,611,236764,618,822,76399,735,236764,107,11069,20488,607,822,1183,236761,108,38850,236777,3769,1536,3118,236787,107,1509,563,6576,107,6372,564,2863,77123,528,13011,236764,532,2473,1388,107,3912,3523,699,506,1331,236761,108,12357,2170,3118,236787,107,10666,611,600,236881,108,38850,236777,3769,1536,3118,236787,107,2021,91932,31273,1091,236764,5478,564,1602,236764,532,5478,236793,107,11340,1091,506,723,12057,69927,837,564,1374,16239,236764,107,2205,768,564,1053,4461,1091,573,506,20947,107,4088,910,11762,1186,236888,108,44180,1439,134560,236787,107,6294,711,1975,3324,236789,236745,236761,107,1882,5840,531,611,236764,12761,11664,529,506,1331,236764,107,7711,5708,531,1091,236787,532,531,1023,29417,57292,107,106896,692,784,12690,532,20488,236761,108,20769,3287,236787,107,2021,3468,19386,44477,2229,784,12690,532,20488,236888,108,12357,2170,3118,236787,107,3048,1460,1217,668,46635,531,1161,506,1331,236761,108,206908,68251,3118,236787,107,12055,901,48672,236789,236751,9703,236888,1293,795,1660,1091,236764,107,2205,768,668,1602,798,212451,1144,668,15633,107,31336,577,528,1091,531,2583,236761,108,12357,2170,3118,236787,107,33190,236764,692,236789,859,1573,1091,107,4088,1023,19258,1590,236787,580,506,34108,236764,107,236777,1281,236764,901,776,8835,775,236761,108,7280,67386,236787,107,14946,236764,768,668,776,1660,1023,27892,236764,692,19452,711,531,30590,1515,236761,108,12400,67386,236787,107,1882,1149,236764,17536,236764,768,692,795,236761,108,42173,67386,236787,107,1882,735,2066,528,13109,531,776,625,236764,840,625,563,496,107,10310,600,692,735,951,2066,531,776,236793,573,768,668,1407,775,107,20941,40238,532,3442,775,914,50898,236764,692,659,531,2247,1023,107,126535,1303,1131,1724,40238,532,8988,573,1091,236793,834,236764,768,107,499,3442,775,914,29417,50898,236764,692,1921,992,3442,1515,107,700,29417,23772,529,1091,236761,799,29313,4637,563,107,3419,540,18032,236764,532,573,506,48859,531,577,4616,4720,1275,236764,107,37051,531,1386,496,27321,529,506,48859,236787,529,506,107,7650,692,1646,3295,236764,1374,3437,13109,531,577,107,3419,540,18032,3295,236761,108,7280,67386,236787,107,3133,531,1386,775,951,2480,3305,529,236764,496,2268,1601,107,16132,7298,236793,573,3622,692,15032,872,1003,506,15689,236764,668,107,21156,1508,17539,711,531,2246,775,506,1551,236772,47460,48859,236761,108,42173,67386,236787,107,1882,735,1010,2760,834,529,1551,236793,711,600,1023,15005,107,733,1070,8864,236764,1070,2764,236764,1070,158825,794,236764,1070,49636,236764,107,5503,600,1023,166623,659,834,7584,953,41980,236787,532,107,1168,3623,564,1751,768,784,1023,166623,964,531,4186,855,529,107,811,29222,236764,901,1093,10240,13512,236764,11895,236764,8339,236764,8710,236764,107,624,910,13782,529,886,1982,1595,1374,577,657,107,20073,531,784,506,3298,512,236789,506,21164,236761,108,12400,67386,236787,107,51836,611,834,236881,15311,1595,776,611,12011,1041,32950,1093,107,15275,236881,108,42173,67386,236787,107,197615,236764,822,32950,795,711,834,4949,855,618,2264,880,236789,236751,107,16132,67251,55188,12117,7206,3701,872,528,496,3355,236772,2834,236764,840,107,584,625,964,657,32795,236764,756,15088,771,236764,2889,236764,175221,236761,108,12400,67386,236787,107,11355,600,1595,236881,108,42173,67386,236787,107,2021,10382,4850,528,496,20389,236764,1298,1646,1806,4688,107,124758,524,3121,607,91580,569,8548,236764,506,10080,1093,994,107,1708,44243,24273,236764,531,1601,531,974,44543,496,6853,236761,108,12400,67386,236787,107,3048,659,2752,2180,822,34296,236787,611,11
49,236764,611,1149,236761,108,42173,67386,236787,107,14219,611,784,21891,531,2583,822,27892,236881,2024,107,7705,236789,236751,951,4217,236764,506,5314,912,23076,625,236761,564,107,30468,236764,768,668,1093,106184,531,506,1331,236764,993,691,107,49912,496,5367,1099,880,236761,107,8291,668,3952,236764,532,528,506,38174,529,79028,236787,1686,914,107,59820,236761,1191,659,711,531,4196,784,3075,236764,840,531,107,3521,684,1515,1298,668,11979,236764,684,5906,236764,684,173508,236764,532,107,2003,155173,236761,1293,236789,236751,531,1386,914,10512,684,107,140381,236751,236793,19139,1418,886,529,775,815,496,3161,107,33986,700,236764,528,6295,1515,1023,1852,27892,607,1023,1852,107,126535,1303,236787,5233,1500,786,236764,532,564,1982,611,1217,107,7624,2863,817,684,1515,236761,108,3243,236787,107,4795,236764,3004,236761,108,44180,1439,134560,236787,107,236806,17536,236764,611,659,711,1447,236787,735,611,711,3224,107,818,5367,15163,1758,735,3028,236789,236745,236881,108,38850,236777,3769,1536,3118,236787,107,3689,1921,564,1879,236881,107,236789,236777,40705,236764,17536,236789,726,2740,5090,3324,236789,236745,236888,564,3914,3437,107,4754,28166,531,1288,496,17723,236787,726,236789,13908,236764,17536,236764,1041,40238,236888,107,236777,2506,1091,528,1041,2891,236789,236751,2509,236764,1056,107,9401,2953,529,822,96095,96887,236789,236753,532,11536,107,4663,506,9168,529,1023,1852,41717,7085,108,44180,1439,134560,236787,107,236806,786,236764,506,33408,236888,107,3048,1921,711,8988,529,600,236787,611,1921,12614,1091,107,2021,1751,3324,611,236761,108,38850,236777,3769,1536,3118,236787,107,51836,3324,786,236888,13098,756,581,236888,107,236777,1093,901,1093,10849,786,236764,1133,506,73617,107,24249,1023,3542,1472,10382,684,756,581,236761,108,44180,1439,134560,236787,107,3048,236789,859,2556,784,236787,107,236777,236789,859,5264,611,236787,14098,611,236764,8988,531,756,581,236764,564,14098,611,236764,107,902,87932,8155,236761,108,38850,236777,3769,1536,3118,236787,107,73319,1091,15066,910,13799,107,3133,2514,910,14631,3980,236761,107,4324,236764,1590,3952,496,24088,236761,107,3048,1281,506,4400,236764,2634,236764,529,1041,8101,1590,236761,108,42173,67386,236787,107,1882,776,236764,17536,236793,3442,775,1144,46440,6111,611,531,236789,236745,236761,108,38850,236777,3769,1536,3118,236787,107,61781,1852,19227,236761,108,12400,67386,236787,107,11069,1852,19227,236888,108,38850,236777,3769,1536,3118,236787,107,43320,236764,840,711,10701,1852,12614,236761,108,42173,67386,236787,107,3910,711,822,1852,12614,236881,108,38850,236777,3769,1536,3118,236787,107,3771,236764,17536,6945,236745,9849,2752,1041,12614,3819,531,13007,506,107,75487,607,79899,236761,108,42173,67386,236787,107,3048,1921,1751,236764,768,692,2583,611,1027,3210,236764,692,4614,531,107,51775,684,611,236761,108,38850,236777,3769,1536,3118,236787,107,13086,1299,236764,564,14098,236764,822,3385,512,236789,506,57292,5976,236881,108,7280,67386,236787,107,818,3385,563,531,2679,625,39682,236761,108,38850,236777,3769,1536,3118,236787,107,234176,236888,12182,236764,564,14098,236764,1531,786,678,236789,236745,236787,564,735,40238,531,107,4919,611,236764,837,2863,577,23149,528,2147,236761,5180,107,15466,8300,236764,17536,236793,1144,1879,611,236881,108,12400,67386,236787,107,3048,2863,678,236789,625,236764,26721,17536,236761,108,38850,236777,3769,1536,3118,236787,107,236776,4241,236764,17536,236761,2085,236789,236751,528,784,1156,26721,27892,107,41742,3701,236761,564,735,822,610,1356,236787,772,16629,236761,108,42173,67386,236787,107,4573,672,563,2613,11049,236761,1
08,12400,67386,236787,107,2267,756,15088,627,531,2583,1570,74077,5503,756,55188,951,4217,236761,108,38850,236777,3769,1536,3118,236787,107,100985,611,1492,236764,768,625,1149,1975,607,506,27224,529,822,107,112607,600,564,1149,577,57292,236764,564,735,1590,506,107,12697,835,38174,236761,108,95739,67386,236787,107,3048,735,52026,951,62695,529,822,2891,236764,532,611,107,17777,711,52026,951,62695,236761,108,38850,236777,3769,1536,3118,236787,107,11069,194753,236881,108,95739,67386,236787,107,3048,735,1010,496,186004,531,1116,22816,236764,611,735,107,55431,496,12453,531,1116,4690,236793,611,735,711,11161,9312,107,1437,3364,1331,236761,108,38850,236777,3769,1536,3118,236787,107,3048,1374,2881,786,506,919,128061,600,564,735,107,2217,1010,3364,528,1041,2765,236761,564,795,236764,17536,236764,111771,1041,107,1745,2163,10070,236764,506,1331,236764,531,6637,496,23348,497,107,114656,529,1091,236793,756,55188,496,4194,901,2881,107,8628,519,236787,532,2338,506,23069,529,910,5313,563,107,81444,531,735,1041,5423,1082,1041,3710,236764,564,795,93544,107,1437,178849,1194,16354,532,577,1135,531,1091,1346,107,6427,29750,586,236793,600,563,236764,17536,236764,564,795,115742,506,107,52661,2694,658,529,1070,4913,880,532,2583,625,107,236763,848,3140,531,506,85241,616,236761,6841,236764,5426,2167,574,611,236764,107,236777,1149,577,57292,236761,108,134936,67386,236787,107,1882,4614,531,1586,611,1023,4389,236793,532,5233,2583,107,7624,1023,27892,155627,236761,108,95739,67386,236787,107,3048,735,4461,1551,40238,573,822,2891,236761,108,38850,236777,3769,1536,3118,236787,107,236777,795,711,18505,822,4654,607,6807,1091,236761,564,107,16132,1386,1623,529,822,27892,236764,532,834,13007,611,951,3342,236761,108,22186,44969,236787,107,818,33408,2583,611,12690,236764,17536,236764,155627,236888,108,38850,236777,3769,1536,3118,236787,107,14254,9380,27892,236888,107,65994,625,563,531,1778,236764,2480,531,169433,236764,107,55771,120822,506,20947,837,1171,692,776,28054,236761,107,11355,528,672,26112,200683,531,787,1374,564,1975,1590,236764,107,2021,2829,529,33555,532,23012,236764,600,776,3196,236764,107,37107,115825,19414,2391,236881,13287,9139,786,531,236789,236745,236787,107,3689,2401,120416,236764,528,784,2432,1374,692,776,236789,236745,236764,107,818,14500,580,34323,990,1093,7089,6770,977,693,236764,107,3133,102694,3165,577,2311,6112,668,2957,107,2542,9043,531,512,236789,497,236772,51647,236761,31373,1082,26001,625,834,236764,107,6481,506,1494,4408,532,506,20488,817,107,2021,886,600,1093,776,5478,236761,564,1006,3746,1343,236793,107,818,886,912,17477,236789,236753,236764,506,1032,795,564,776,236761,107,8291,2229,919,27892,236761,107,11069,27892,236787,573,822,27892,564,735,25876,236793,107,19242,236789,236753,573,822,27892,236793,573,5180,27892,10591,107,4088,40238,1156,25400,11049,236793,34998,147565,3962,107,236777,735,3472,532,6827,529,236793,573,822,27892,735,107,34496,1551,2432,236764,1070,2344,236764,1070,919,822,27892,236787,107,51077,564,1093,577,57292,236761,108,179948,67386,236787,107,2209,815,3028,951,62695,236764,532,3914,817,2180,1027,11481,107,1562,236789,236751,8300,236761,108,215968,67386,236787,107,20416,1531,1515,577,57292,236787,506,33408,2583,1515,12690,236764,107,624,1386,1515,1535,4389,531,506,1331,236888,108,3243,44969,236787,107,95904,236764,19611,236761,3803,5383,44543,236764,29417,57292,236888,108,38850,236777,3769,1536,3118,236787,107,236824,43195,27892,236888,108,44180,1439,134560,236787,107,3048,735,15032,822,26919,236793,532,506,12761,11664,107,7315,687,611,607,506,1331,236789,236751,8300,236787,74
74,107,6372,236764,528,506,5125,12933,25676,236764,611,107,135705,776,2874,506,112936,236761,108,38850,236777,3769,1536,3118,236787,107,4602,672,3028,236881,108,206908,68251,3118,236787,107,818,2401,529,2864,611,735,35260,236787,107,818,1331,776,19921,611,236764,532,659,45041,236789,236753,107,2021,2874,116116,236764,3324,822,216419,236761,108,38850,236777,3769,1536,3118,236787,107,10936,236881,657,506,112936,236772,6367,236881,108,206908,68251,3118,236787,107,3810,236764,3468,19386,44477,236761,108,38850,236777,3769,1536,3118,236787,107,12055,564,2352,1239,63035,236881,108,206908,68251,3118,236787,107,3048,1149,236764,17536,236761,108,38850,236777,3769,1536,3118,236787,107,6372,564,236789,859,6850,776,236793,532,236764,14111,7564,1570,236764,107,178139,531,506,112936,236772,6367,236761,108,44180,1439,134560,236787,107,236777,236789,859,2514,611,2544,236761,4291,611,3008,236881,108,12357,2170,3118,236787,107,1882,4196,1590,573,506,1331,236761,108,206908,68251,3118,236787,107,106161,611,1388,236761,107,2209,815,625,1492,236764,532,684,914,5724,17914,961,107,236789,112728,6962,657,756,236751,3710,236761,108,12357,2170,3118,236787,107,3497,496,11307,3710,668,30748,914,36820,59238,236761,107,16132,611,16414,506,1331,236881,108,206908,68251,3118,236787,107,3910,1492,236764,1041,39880,236888,735,611,13874,672,880,236881,108,7280,67386,236787,107,2209,815,1023,27892,236764,17536,236761,108,12357,2170,3118,236787,107,1882,14098,506,33408,668,1149,28054,822,18178,236761,108,12400,67386,236787,107,95904,236764,17536,236787,531,1041,6934,149246,6303,236764,107,2209,18649,236789,236753,775,1056,668,56400,236789,236753,1023,27892,236761,108,42173,67386,236787,107,117494,107,2209,1378,124094,775,107369,236761,108,7280,67386,236787,107,3771,6945,55188,914,2712,529,10808,236787,668,1602,711,18649,775,236761,108,12400,67386,236787,107,4348,886,22110,775,236764,5383,5869,236764,840,3189,107,2209,1456,775,121391,3846,236787,668,1374,735,1407,236789,236753,775,107,15989,12933,529,33641,236764,40238,4461,573,236789,236751,2891,236761,108,206908,68251,3118,236787,107,11355,236764,834,668,1602,236764,564,1006,2889,236761,108,163010,236787,107,3771,236764,951,236793,951,880,5004,756,581,236761,108,42173,67386,236787,107,2209,1176,668,1053,40238,236764,837,668,1451,1407,107,495,2147,236793,107,3133,607,914,5423,236764,5478,49117,625,528,121391,236764,107,236789,236777,1093,577,57292,6945,3189,668,236787,756,4130,2401,236764,107,4573,684,822,27892,236764,795,711,834,9841,786,236793,107,11069,27892,5233,7085,3026,692,13416,600,236764,107,8291,691,756,236777,7806,611,573,822,27892,236787,7806,611,236787,107,11069,1346,9380,27892,236787,1492,611,735,2378,107,17993,27892,236764,107,236777,735,951,3342,607,611,7085,15141,711,672,202952,236881,108,206908,68251,3118,236787,107,11355,3477,964,611,59590,531,1460,236789,236745,236764,107,3524,236764,9333,625,236764,529,1288,106221,170545,107,2021,6422,822,27892,236881,108,12357,2170,3118,236787,107,30092,611,711,735,4173,1515,107,2205,611,964,16183,236789,236753,236764,1056,668,1053,951,2066,236764,107,4573,691,496,73661,34936,531,506,1883,236764,107,2209,691,822,13550,236764,3785,892,873,2342,107,11069,92372,532,506,153772,600,611,10591,107,236777,236789,506,2742,529,506,692,514,236793,532,1492,236764,32643,107,236776,1977,529,89714,532,47934,512,236789,506,1883,236764,107,2859,668,1374,2036,64771,10407,4595,107,37568,95521,531,506,4846,1553,3436,236764,822,27892,2473,107,3912,114739,531,74445,236881,1599,1374,735,1176,107,6372,618,914,26721,50898,1602,3539,951,2344,107,5577
1,1144,668,15032,573,236764,834,914,86372,4135,107,38786,1751,3324,611,573,822,27892,532,107,40414,914,105472,5645,611,1131,2765,236764,107,119865,822,10841,29398,236761,108,206908,68251,3118,236787,107,21204,531,735,1176,236764,107,2205,611,964,2521,236772,212109,236764,1053,6374,236789,236753,914,7304,107,3133,6956,914,68424,236793,699,1515,179196,236789,236753,107,75064,914,86372,14468,236764,837,611,2473,236764,107,2205,4400,1053,2246,236789,236753,611,872,236764,735,4247,1515,531,107,3524,1663,625,1093,735,14329,236789,236753,914,1270,586,4135,236764,107,24249,5583,1345,1264,711,4676,107,236774,6723,1515,531,12723,613,236793,834,10848,1515,531,49064,236764,107,3048,1374,735,5787,236789,501,506,7821,529,914,24191,497,107,3133,1786,236789,236753,1515,3037,2785,236761,108,12357,2170,3118,236787,107,15562,611,48672,107,2209,1602,29732,611,528,2196,51785,107,4420,668,1602,1202,822,18178,236764,532,776,611,1751,107,6372,914,51785,2863,711,577,168714,531,611,236764,107,4420,668,46440,2066,531,47585,236881,8922,236764,1053,822,12762,107,3771,3710,3571,611,236881,653,1053,611,120785,531,4665,107,91561,506,544,6703,1837,529,11179,236881,108,206908,68251,3118,236787,107,19845,611,107,236788,500,1492,14978,506,180706,236881,532,1492,1570,107,4088,1515,600,1602,711,2679,236764,840,18649,236764,171130,107,11069,51126,236772,1708,120785,236881,108,42173,67386,236787,107,2209,236789,236751,711,9128,236789,236753,236793,692,1149,30590,1515,3819,236761,108,12400,67386,236787,107,3133,795,30590,1515,236787,107,236777,236789,859,735,3493,7549,27892,529,600,5057,236761,108,7280,67386,236787,107,236777,10911,3493,7549,532,910,4690,531,6347,756,581,236761,108,12357,2170,3118,236787,107,3407,611,11632,26310,236764,532,3442,1724,4690,236764,107,7634,735,13874,496,57292,600,795,699,1091,1769,107,37107,92372,236793,1386,1091,529,951,919,8300,107,55771,12414,600,659,618,3187,12222,573,140547,107,2205,5233,7953,531,776,834,236761,108,206908,68251,3118,236787,107,6481,1091,48782,236764,107,3133,580,496,34303,11179,784,98686,107,11069,59590,9912,236793,31957,914,21077,236764,107,3133,914,2255,17554,31273,611,236793,25253,236764,10849,711,107,3497,1144,51785,668,30748,506,36820,41154,236764,107,3910,528,914,8816,668,121391,236789,236753,611,236793,840,822,18178,236764,107,120474,3324,914,3019,236764,3721,699,611,107,818,113291,529,914,1861,2411,831,236764,107,24249,1346,12995,11542,236764,723,22045,78956,236764,668,1602,8013,107,6259,506,1379,2113,606,17554,668,29402,611,236761,108,12357,2170,3118,236787,107,119137,107,236776,12866,580,775,236764,822,12761,11664,236793,600,692,4796,16165,236764,107,3771,156494,1534,236764,840,600,611,1921,107,34597,822,9912,580,1515,236761,108,206908,68251,3118,236787,107,37889,236764,611,13874,1515,107,9474,1308,1023,154026,1082,618,28172,107,2292,822,1852,1847,81142,236764,532,600,822,21161,236764,107,3695,86521,607,1144,611,4319,1921,776,107,55771,1144,611,1374,236764,1603,611,2342,506,11261,107,2021,8300,1515,57292,236787,6267,506,12866,580,775,236761,108,12357,2170,3118,236787,107,43320,236764,25315,775,711,236761,22168,692,1676,36421,531,611,236761,107,3910,3184,586,668,6074,531,7298,914,2891,236764,107,3910,1440,7245,236764,532,1144,2862,668,32844,529,236764,107,818,29417,3155,512,236789,506,2481,1287,743,236764,699,52533,3588,107,6372,1455,16771,2481,81341,236764,646,8931,236789,236751,8709,236789,236751,2369,236764,107,15938,236764,1308,1822,23949,133374,236764,1590,691,9615,236793,107,4088,506,1638,3155,8898,109444,532,67526,605,964,236764,107,6372,1023,12222,1813,6111,684,
159845,534,2853,236793,107,3133,138,107,30379,762,1646,138,107,31403,914,1822,72630,236761,108,206908,68251,3118,236787,107,4906,5478,78018,236764,107,6372,46440,28383,1388,528,914,1589,82326,107,2021,577,1076,1494,528,1977,236764,692,1602,63774,107,2021,822,94878,2499,236787,840,611,735,1765,236764,107,150294,914,1861,18752,607,914,3068,236764,107,6372,668,236789,236751,822,6530,13550,236764,532,98686,107,11069,11059,216419,236761,108,12357,2170,3118,236787,107,37889,236764,611,770,236789,497,1053,3028,236789,236745,726,107,236814,8598,580,600,2036,726,5503,684,1023,10848,580,236793,107,3133,41909,236764,1056,611,735,9710,822,1548,236764,107,178139,531,506,44264,236761,108,3243,236787,107,1882,795,834,236787,4180,784,107,6398,533,528,910,9912,236761,108,12357,2170,3118,236787,107,6481,1091,817,580,236793,107,2094,5333,6592,964,2480,2247,528,22051,236764,107,55771,4196,236764,3068,9370,236764,573,5314,236787,107,2859,236764,618,914,4135,563,236764,668,3798,528,49064,107,3497,910,45208,236764,1800,15739,532,3890,107,818,132422,529,914,25046,236761,108,206908,68251,3118,236787,107,2021,506,44264,236764,2229,236787,107,1882,795,577,993,1680,506,6381,512,236789,506,1331,236793,107,3133,672,2863,4483,236764,618,22043,756,55188,236764,910,1852,236764,107,24249,692,735,817,13496,95191,236761,108,38850,236777,3769,1536,3118,236787,107,236774,1068,605,17519,547,8875,1299,1053,1603,861,2228,236881,108,236798,6751,134560,236787,107,2209,1053,236764,1041,29398,236793,532,600,625,691,837,7321,107,7711,2353,66890,9637,236761,108,38850,236777,3769,1536,3118,236787,107,4324,1299,506,6285,1166,507,1975,840,618,657,1171,236764,107,33778,236764,1056,990,2863,11172,1091,236764,531,1386,4284,236761,107,41768,236789,236751,1570,236761,108,2457,16008,134560,236787,107,7634,659,23143,236764,29398,57292,236764,834,236764,107,6372,692,2863,20060,528,1023,16606,1460,107,37107,45892,6794,1570,236761,108,38850,236777,3769,1536,3118,236787,107,115630,611,17519,547,8875,236881,108,236798,6751,134560,236787,107,2791,6338,236772,22425,668,3588,531,786,236793,532,1602,57391,107,91561,506,6285,1166,507,236764,573,901,1053,834,107110,586,107,100153,524,506,5148,236787,668,563,21136,531,5307,1940,236761,108,38850,236777,3769,1536,3118,236787,107,4225,4099,668,529,786,236881,108,236798,6751,134560,236787,107,2209,1602,236764,1041,29398,236761,108,38850,236777,3769,1536,3118,236787,107,3910,236881,1144,236881,108,236798,6751,134560,236787,107,3910,3187,668,1053,1645,611,236764,26114,531,26114,236793,107,6372,529,784,2432,3324,506,7764,668,52695,107,11069,1589,1346,236764,600,668,1093,126693,914,86530,107,2021,62180,96538,236764,834,668,2473,107,3912,2246,236789,236753,822,208282,7574,236761,108,38850,236777,3769,1536,3118,236787,107,3834,5307,1940,6176,668,236881,108,236798,6751,134560,236787,107,3834,5307,1940,236761,108,38850,236777,3769,1536,3118,236787,107,236777,7976,564,1053,496,4400,531,6370,1515,993,236764,107,2021,51373,914,54584,6340,236761,19694,2033,236761,107,3912,2840,236764,1239,659,506,12761,11664,529,506,1331,236764,107,818,120785,512,236789,506,3364,11203,236787,564,776,169004,1091,236793,107,2542,901,776,136054,1091,528,9747,236764,107,91561,784,29417,17477,831,236761,108,206908,68251,3118,236787,107,8653,951,3342,236761,108,38850,236777,3769,1536,3118,236787,107,27577,236888,1144,563,600,236881,108,12357,2170,3118,236787,107,1509,795,577,13588,531,817,580,236787,951,3342,236761,108,38850,236777,3769,1536,3118,236787,107,3689,3590,672,2352,236881,108,44180,1439,134560,236787,107,818,4217,236881,108,2457,16008,
134560,236787,107,236814,651,668,711,1786,236789,236753,506,29417,532,506,3364,236881,108,12357,2170,3118,236787,107,1558,138299,236764,951,236761,108,38850,236777,3769,1536,3118,236787,107,19845,564,1053,2940,236789,236751,27892,236881,108,7280,28579,236787,107,92580,11664,236764,2583,1595,236793,668,2863,531,506,2436,236772,2811,236761,108,12357,2170,3118,236787,107,818,1331,659,2494,13220,2342,1515,236761,108,206908,68251,3118,236787,107,17348,236764,107,3524,784,795,3798,528,2603,543,236761,108,38850,236777,3769,1536,3118,236787,107,14219,1239,822,38340,236881,107,15545,1239,735,27892,236764,600,740,6422,1091,1492,107,3133,6850,864,10439,910,120785,236881,2900,659,107,17993,15913,236881,107,3048,1646,910,90965,236764,3217,6157,611,711,910,14631,236881,107,19845,611,711,1076,1091,580,236881,108,44180,1439,134560,236787,107,3912,16680,236764,577,16680,236761,108,38850,236777,3769,1536,3118,236787,107,1509,563,496,2076,3833,3210,236764,532,23897,684,8541,236764,107,2021,53988,506,795,529,506,102210,236787,107,236773,6039,236789,236745,236764,532,3892,607,1288,618,3914,6157,107,31777,3785,795,577,26668,236761,108,12357,2170,3118,236787,107,7029,236789,236745,711,496,8541,236787,107,818,1331,4665,611,18649,236789,236753,1091,236764,532,529,5226,236764,107,4420,15689,691,2238,1091,38029,236764,611,1697,1718,236793,107,3900,25484,236789,236753,506,1349,1554,1639,573,506,1331,236764,2246,236789,236753,1091,107,2866,236772,942,54602,236764,111771,616,236764,96503,531,951,1782,17412,236761,108,38850,236777,3769,1536,3118,236787,107,11355,236764,672,691,3224,1680,236761,108,12357,2170,3118,236787,107,4348,531,1091,784,236761,108,38850,236777,3769,1536,3118,236787,107,19845,611,1573,236789,236753,1091,2178,80073,236881,108,12357,2170,3118,236787,107,3910,236888,564,1573,1091,236888,108,38850,236777,3769,1536,3118,236787,107,3048,659,1133,531,776,1288,1960,236761,108,12357,2170,3118,236787,107,4348,21304,236764,107,7795,1595,236764,531,2480,23149,236761,108,38850,236777,3769,1536,3118,236787,107,11355,1299,1374,564,577,57292,236881,3763,570,1310,14958,236764,107,6481,786,28054,834,2611,618,611,236764,532,1386,786,107,11069,12339,213749,236761,108,206908,68251,3118,236787,107,3048,1407,2311,1623,529,600,107,2542,837,506,1331,18802,236787,768,611,795,1786,107,2021,1298,611,659,4470,236764,611,1921,68256,822,1595,236764,107,24249,611,659,855,529,236764,607,496,12249,1898,7304,236764,107,3524,2752,577,834,29417,618,496,57292,236764,107,31777,116253,607,1515,573,213749,236761,108,44180,1439,134560,236787,107,6481,236789,236751,577,16680,236761,108,2457,16008,134560,236787,107,818,1331,659,57822,236793,1076,580,236761,1174,4857,61540,107,3912,10927,711,13706,236764,6271,815,3468,19386,44477,107,3984,13621,672,834,69160,700,236789,236753,8706,236764,15026,96322,107,236777,236789,506,14529,1595,529,914,33641,236761,108,38850,236777,3769,1536,3118,236787,107,54593,786,529,15689,236888,107,2094,691,1041,10808,236764,532,564,795,8988,236789,236745,1570,726,108,44180,1439,134560,236787,107,4348,1492,236764,711,1492,236761,108,7280,28579,236787,107,4348,528,672,5387,236764,17536,236764,1492,236761,108,38850,236777,3769,1536,3118,236787,107,6445,236764,618,564,3892,236764,564,795,236761,3551,951,39029,4690,236764,107,236777,120822,910,89912,1190,236787,107,2542,506,83213,236764,8345,236772,1166,16764,1551,236764,1531,1091,107,5024,714,786,618,564,776,711,111771,236764,532,107,3810,495,67028,5507,236787,564,1879,1570,236764,107,902,65598,1091,236764,692,114419,756,51775,540,1023,112936,107,818,16690,519,529,68730,
236764,35331,819,236764,7559,804,236764,107,24249,692,13109,735,109778,236789,236753,573,236764,36198,236789,236753,236764,107,624,11887,236789,236753,236764,107,2292,224129,1091,607,775,236764,506,20488,236789,236753,1548,236764,107,15938,6220,711,31886,236764,951,236764,6271,2066,236764,840,600,107,24249,901,735,2238,531,223921,236761,108,44180,1439,134560,236787,107,13086,236764,951,919,236761,108,7280,28579,236787,107,3771,919,4171,236764,692,5426,2167,574,611,236761,108,38850,236777,3769,1536,3118,236787,107,3910,236888,951,919,236888,107,2205,573,1041,2891,564,735,28516,1041,4806,236764,107,4348,121557,44630,4912,236764,834,2863,1041,38464,107,44603,4171,8421,910,17538,2342,1724,150347,236764,107,24249,692,125688,1374,494,2737,775,236764,3819,15023,107,818,1401,1595,531,4682,1091,236761,108,12357,2170,3118,236787,107,3048,8988,512,236789,506,1331,236764,107,2205,768,611,964,496,8081,531,45345,236764,711,107,236776,880,529,910,118090,665,236761,108,206908,68251,3118,236787,107,236789,30379,627,1388,107,1882,1531,506,1331,1281,236789,236745,236761,108,44180,1439,134560,236787,107,3689,236764,1144,236881,914,24191,497,236881,108,38850,236777,3769,1536,3118,236787,107,1659,34819,236888,107,114498,564,618,6213,618,506,38735,6745,236764,107,2292,730,1034,236764,756,15088,771,577,1041,3666,236888,108,206908,68251,3118,236787,107,1509,563,496,3666,107,6372,2863,4595,496,23572,1298,625,563,236764,107,4348,23572,1027,3342,236761,108,38850,236777,3769,1536,3118,236787,107,172702,4595,236888,107,129904,611,672,130415,529,506,1322,160620,236881,1686,611,107,15989,10298,756,18803,103709,108,2457,16008,134560,236787,107,236789,236774,9849,699,506,26164,236761,108,38850,236777,3769,1536,3118,236787,107,236789,172702,193877,107,236806,1535,840,1346,157211,200997,5990,236888,3217,236764,107,3048,23674,840,77138,80237,236764,735,611,5478,107,26479,145851,1590,531,5347,614,10095,236764,107,6372,607,914,177023,756,18803,6945,1646,840,107,818,27111,532,9168,512,236789,506,27321,236789,236751,236764,8150,711,7304,107,2021,1879,668,236789,859,2490,822,1873,528,496,69219,236764,107,3133,1386,822,7102,914,236881,1637,668,735,2066,107,11407,566,917,822,47651,236793,768,7293,236764,53004,107,11069,13588,5980,665,236761,1637,611,659,3449,236789,236753,236764,107,3912,711,618,3364,119593,236793,768,611,659,711,236764,107,6481,1091,735,89894,684,611,236761,1599,659,4846,1553,5990,236764,107,2859,901,577,80237,236787,532,901,659,951,2344,236764,107,4420,236764,1800,822,27892,55044,236764,506,1822,236789,540,11613,107,14254,4857,1090,61877,236761,2195,5347,910,68928,236764,107,3133,1288,496,886,618,668,236764,1015,15360,914,756,18803,6945,107,15989,4913,756,18803,236789,2342,496,7859,595,13606,107,55771,3785,136094,528,25717,236761,3763,730,1034,5668,236888,107,1509,3590,506,1122,9784,3225,236787,532,1041,12556,116389,107,2021,1281,236764,1056,1156,11787,659,872,236764,107,95889,39411,236764,1217,4949,19381,107,12055,5273,756,236745,10718,903,506,11302,529,1800,532,1769,107,818,886,684,506,1032,236761,108,2457,16008,134560,236787,107,13086,236764,580,531,506,2436,236772,2811,236761,108,38850,236777,3769,1536,3118,236787,107,177605,5877,600,13784,236764,531,2583,12034,107,818,15689,512,236789,506,4762,6367,38029,236764,618,756,236745,9849,1456,107,236773,166276,528,25717,74077,108,44180,1439,134560,236787,107,13086,236764,1388,236764,951,919,529,600,236761,108,38850,236777,3769,1536,3118,236787,107,31382,993,506,1331,1053,919,10298,2066,236764,107,236777,1879,236764,901,114419,236789,236753,161675,236764,18988,107,818
,42711,529,506,1883,236761,108,12357,2170,3118,236787,107,11355,236764,2863,506,1331,2583,107,4906,600,26266,5478,910,8300,236881,108,38850,236777,3769,1536,3118,236787,107,236777,236789,859,2583,1041,7483,236764,107,9474,5367,1099,1082,910,27892,236761,2195,1281,506,15689,107,31403,711,1023,135540,1867,236764,31432,1388,26346,107,6372,770,236789,497,1602,2509,573,236789,236745,236787,1646,2686,236789,236753,531,506,3653,236764,107,14986,1056,506,5760,535,529,506,1883,691,6374,236789,236753,236764,107,7634,1093,711,8583,506,33361,236761,1174,2712,529,2509,107,15562,711,28054,15689,38029,236761,21966,858,236789,506,3653,107,37107,5333,495,695,532,18432,1412,236764,19139,901,1407,236789,236753,107,14254,234007,236764,13804,711,573,1091,236787,506,102517,107,24249,901,735,3187,1603,2342,506,112936,236764,107,3243,4400,125017,236764,1451,2752,577,506,47564,107,4088,1023,834,34076,30973,236761,7134,236764,1144,1299,236881,107,3910,2863,672,518,26161,48859,23219,107,818,112936,236789,236751,33625,236881,3792,50898,3821,107,3689,236789,236751,1133,531,577,910,4171,236787,756,977,1602,2864,625,236793,107,1882,659,506,5314,10536,236764,532,528,1847,9891,107,7634,5877,775,1023,16332,7085,6631,692,4435,781,107,818,4135,529,1023,17823,532,1386,506,40581,1148,107,7029,1023,41927,28600,236793,837,795,528,990,107,38782,169809,506,41226,512,236789,506,112936,532,3437,528,107,818,218884,531,222406,506,89418,236761,108,44180,1439,134560,236787,107,33190,236764,3487,236761,108,12357,2170,3118,236787,107,158976,236764,607,1024,236772,48631,236761,108,38850,236777,3769,1536,3118,236787,107,3771,236764,1769,919,236787,107,3689,1149,577,57882,684,236764,1800,29432,532,3246,236764,107,124093,1144,564,1345,607,514,236888,1174,3972,24712,236764,107,10936,886,912,1677,125688,607,4400,236764,506,1032,107,14904,745,2180,784,3282,236764,1298,549,8482,236764,3822,236764,23069,236764,107,58686,17481,840,684,506,137044,532,951,107,4088,2870,47651,74077,509,1921,68442,107,20235,77096,236764,532,2583,1595,506,1651,107,2021,34827,6495,1788,236787,5708,834,33199,236789,236753,236764,107,509,5238,236764,107,40607,563,3028,531,5708,236761,6841,236764,5426,2167,574,611,74077,107,3048,600,795,577,2344,82527,1082,96363,236764,107,6372,2765,506,11001,912,529,1883,107,9474,1082,611,9370,506,2352,580,236789,236745,236764,600,5278,107,236776,29417,1972,1680,496,1440,236764,532,7976,107,2021,9212,496,2742,607,496,13588,58636,107,6372,236789,236751,2889,529,4355,2180,625,236764,657,3622,179196,855,107,818,165544,77241,28166,236793,1531,1091,711,111995,107,818,9380,837,563,910,23572,236787,822,69160,700,107,236792,24702,1847,11179,532,17322,3740,506,1883,107,4088,600,17661,837,1374,3291,236789,236745,236764,107,4348,2963,506,2066,531,776,506,1535,625,1093,236764,107,2542,506,528,837,139092,2256,236789,236745,236761,108,12357,2170,3118,236787,107,18047,1176,3487,236761,108,206908,68251,3118,236787,107,18047,22852,1133,496,157831,236764,532,2863,3890,107,2205,1664,7428,776,236761,108,38850,236777,3769,1536,3118,236787,107,178651,515,35891,236764,9785,512,236789,497,236765,47279,44543,236888,107,3689,1374,506,1331,776,607,1239,49636,12761,11664,236881,107,2791,8761,10167,236764,910,75712,20269,107,2021,506,5314,13606,236787,528,496,68730,236764,107,4420,1144,236789,236751,711,2874,236764,840,1144,1921,577,236764,691,2621,236764,107,11407,964,901,9949,236787,528,496,2480,6468,236764,107,6481,1144,563,2874,577,1176,625,1921,577,2874,236764,107,3133,6184,910,2066,858,236789,506,14500,236761,108,12357,2170,3118,236787,107,42874,103543,236888,108
,206908,68251,3118,236787,107,2094,496,57292,236881,951,236761,108,12357,2170,3118,236787,107,818,496,524,2538,236764,3920,236888,107,6481,1515,577,126685,236761,108,206908,68251,3118,236787,107,5988,236764,2246,506,1331,236787,107,495,5769,1463,7564,107,57588,44543,618,496,157831,806,160820,236764,107,236776,95521,531,506,1237,692,514,236787,41179,236764,564,5536,44543,236764,107,3133,1500,531,162531,3890,236761,108,38850,236777,3769,1536,3118,236787,107,35367,236764,2255,50456,236888,108,20769,3287,236764,833,236780,236787,107,1882,236789,859,102219,1515,236761,108,2457,16008,134560,236787,107,236776,3701,17536,236764,4916,1135,236761,108,38850,236777,3769,1536,3118,236787,107,35367,236764,91580,3210,236888,653,564,2863,31716,21820,26832,107,3949,529,21820,63035,236761,108,206908,68251,3118,236787,107,28368,236764,12444,11838,236888,108,44180,1439,134560,236787,107,2791,1800,9174,919,2833,236761,108,206908,68251,3118,236787,107,8291,236789,236751,668,600,1093,1769,699,611,784,822,2066,236761,108,12357,2170,3118,236787,107,1869,969,1515,236764,562,4675,2538,236888,108,163010,236787,107,8063,607,1515,236888,1679,607,1515,236888,108,20769,3287,236764,833,236780,236787,107,1882,70238,236764,17105,236764,17105,236888,107,236789,92580,11664,37894,756,12407,1021,5990,37894,756,163010,37894,756,3689,236764,3920,37894,107,236789,86713,138299,37894,756,7320,57932,37894,756,236780,6886,40707,605,37894,756,163010,37894,107,236789,84437,236764,8118,236764,8118,37894,756,44289,236764,2768,236764,8118,37894,108,44180,1439,134560,236787,107,3689,563,1003,531,577,236881,564,1006,855,529,11762,236793,107,150107,236789,236751,3541,236793,564,3914,8988,236761,1599,236764,12761,11664,107,2021,506,1331,236888,3468,19386,44477,236764,31245,236888,107,130171,236764,1535,41412,138299,236761,108,206908,68251,3118,236787,107,129904,786,236764,1331,236793,8118,236888,108,163010,236787,107,6481,236789,236751,6899,1023,213749,236787,8118,95419,236764,8988,236764,8988,236761,108,206908,68251,3118,236787,107,3048,659,657,1523,531,10382,822,92372,236787,107,9585,81341,1093,735,784,699,611,236793,2481,81341,236764,107,2825,542,5226,611,735,7489,573,57292,236761,108,44180,1439,134560,236787,107,236811,703,236764,48405,236764,48405,236888,107,2094,563,506,1595,531,178065,236764,711,531,135129,236761,108,7280,28579,236787,107,2021,723,4907,506,3207,532,531,6267,784,6648,236761,108,206908,68251,3118,236787,107,3689,563,506,3207,840,506,1331,236881,108,163010,236787,107,4339,236764,107,818,1331,659,506,3207,236761,108,12357,2170,3118,236787,107,2292,506,13782,529,784,236764,692,964,6869,236789,236753,107,818,1331,236789,236751,126942,236761,108,163010,236787,107,3048,834,4595,236761,108,44180,1439,134560,236787,107,3133,834,659,1133,531,776,236761,108,2457,16008,134560,236787,107,6372,563,506,1595,531,6267,506,3207,6648,236793,107,2021,3437,506,11414,531,506,13916,236764,107,3133,69910,784,236764,837,3819,53756,21280,236764,107,902,118578,532,60190,529,42711,236761,108,206908,68251,3118,236787,107,2094,34904,4355,236761,108,12357,2170,3118,236787,107,3524,1531,775,1975,531,1023,9747,236764,107,3524,1531,775,10382,625,236761,1191,776,1590,111354,236764,107,41768,506,912,512,236789,506,1331,236764,528,5769,2066,107,1882,964,16492,61877,236764,2481,81341,563,26721,107,4088,1861,4355,236761,108,206908,68251,3118,236787,107,20416,6267,2768,529,1515,236793,107,44878,1515,531,506,5441,18725,635,1037,236764,532,699,76849,107,52382,21404,6171,1515,236761,108,12357,2170,3118,236787,107,236776,4675,2538,236764,58886,1515,236888,108,16301
0,236787,107,100153,236764,2481,81341,236764,6422,236888,108,44180,1439,134560,236787,107,129904,786,886,3658,236793,107,33374,2167,574,611,236764,12761,11664,236764,6899,786,840,496,3658,236761,108,236776,4675,755,236787,107,84437,236764,8118,236888,108,44180,1439,134560,236787,108,12357,2170,3118,236787,107,39125,236764,1724,7445,4861,236764,107,6372,4483,1133,65751,7351,236764,659,1401,71030,107,10936,506,5933,563,23125,236761,35398,4916,3324,1515,236764,107,3133,10591,1515,531,506,5441,236761,108,38850,236777,3769,1536,3118,236787,107,3771,236764,564,236789,859,1778,1590,236761,107,3810,236789,236751,1070,3571,611,735,231670,786,13710,236787,107,33190,236764,2056,3324,74445,1144,611,735,3472,786,236761,108,44180,1439,134560,236787,107,8063,607,600,26114,236888,20891,11664,236764,18145,74951,236761,108,12357,2170,3118,236787,107,119137,4916,3324,1515,236761,108,2457,16008,134560,236787,107,28368,2481,81341,236764,1601,236764,107,3048,600,577,29417,236793,1601,1515,236764,3184,532,2255,236888,108,163010,236787,107,8063,607,1515,236764,1679,607,1515,236888,108,44180,1439,134560,236787,107,5988,236764,974,611,531,822,3155,236793,577,8731,236764,3121,236888,107,3243,795,577,214304,1663,236761,108,12400,28579,236787,107,3407,611,8731,236761,108,2457,16008,134560,236787,107,15248,4592,236793,107,1882,735,618,1551,4690,618,22816,236761,108,44180,1439,134560,236787,107,151171,625,577,2247,531,600,236881,108,7280,28579,236787,107,818,33408,119121,236888,107,236777,140466,37034,236764,29417,4389,236764,2033,531,21820,3155,236793,107,58499,775,531,25399,672,4400,236761,108,44180,1439,134560,236787,107,2542,756,55188,496,35461,3324,775,236764,107,3048,3914,8336,5869,236787,577,8731,236764,5426,2167,574,611,236761,108,2457,16008,134560,236787,107,33190,236764,17536,236764,3008,607,775,236761,108,38850,236777,3769,1536,3118,236787,107,236777,1093,901,964,60713,34131,726,527,901,659,236764,107,31382,528,13706,35977,236789,236753,726,2217,51386,726,527,901,659,711,236764,107,31382,1421,1872,858,236789,506,46162,512,236789,506,44264,726,108,44180,1439,134560,236787,107,3912,8731,236793,107,25240,711,822,26721,49064,1131,822,28166,236793,107,4906,990,795,47700,2264,236761,108,38850,236777,3769,1536,3118,236787,107,2791,5888,3866,107,236777,1451,12222,28959,529,1091,236761,108,2457,16008,134560,236787,107,236777,1451,7564,107,13751,872,496,24088,512,236789,506,1791,529,1091,236793,137044,236764,506,107,13498,12761,11664,236787,107,4573,1492,756,55188,26844,6998,40442,236793,107,3133,174121,563,2246,236789,236753,26001,978,236764,1056,625,11979,107,91561,496,14773,9130,236761,4291,611,11632,236764,107,13286,506,7853,994,236881,5769,49064,139092,6898,107,17729,47613,14513,532,512,236789,497,26316,107,3689,901,659,1456,531,10591,236761,108,44180,1439,134560,236787,107,100985,611,236764,577,8731,236787,107,236777,236789,859,2056,3363,1041,2255,32950,577,528,2864,107,3497,1724,600,735,840,2268,236787,672,1921,577,19126,236789,236753,107,3497,20182,529,1027,10348,236761,108,2457,16008,134560,236787,107,197615,236764,2229,3121,236761,108,236776,161181,1037,236787,107,2094,880,815,2556,236750,236789,236753,914,31252,236761,108,44180,1439,134560,236787,107,15989,4135,563,2311,29417,573,506,1902,236787,107,2209,1093,711,111771,122394,573,914,213646,236764,107,3524,730,1034,573,236789,236751,2066,531,41796,236761,4923,3710,236789,236751,914,11203,236787,107,3689,914,16489,573,3624,236764,600,914,28166,1921,6771,236793,107,3133,236764,1646,23186,236764,1677,10849,600,3785,107,2209,6827,506,1463,529,4355,236761,107,829
1,236789,236751,1535,586,981,236888,108,12400,161181,1037,236787,107,236777,1093,901,964,763,524,236888,108,44180,1439,134560,236787,107,236777,1093,901,964,528,134618,236888,2900,506,105246,236888,107,30092,668,711,8988,756,581,5888,236881,108,206908,68251,3118,236787,107,10936,563,672,171222,107,6372,1093,2051,650,7235,506,3207,532,107,3912,1418,880,5668,236881,108,44180,1439,134560,236787,107,3048,26721,12761,11664,74077,108,206908,68251,3118,236787,107,2209,2863,577,20710,1679,506,18725,635,1037,5441,107,3497,44928,4916,236787,668,46440,88177,2621,236764,107,3133,5233,2621,2863,121391,1515,3342,7464,107,55771,506,33326,529,506,1237,2066,107,24249,668,834,7093,657,226774,236761,108,7280,67386,236787,107,2209,2863,1388,1281,107,818,29417,12761,11664,659,506,1331,236789,236751,90965,236764,107,3133,692,910,4916,236761,108,163010,236787,107,2209,2863,236764,2889,580,236789,236745,236761,108,44180,1439,134560,236787,107,39125,236764,17536,74077,108,206908,68251,3118,236787,107,84437,236888,108,44180,1439,134560,236787,107,6294,711,4665,93065,236764,1298,611,1374,840,26367,107,3497,30997,12691,236761,108,206908,68251,3118,236787,107,39125,236764,1217,3952,236789,236745,600,611,107,19845,4790,236758,531,1386,672,22661,236881,108,44180,1439,134560,236787,107,129904,786,8988,236787,107,2205,564,776,1281,506,57292,236789,236751,5367,1583,236764,107,4324,740,564,1463,914,43945,74077,108,206908,68251,3118,236787,107,10085,575,236888,1144,57292,236881,108,44180,1439,134560,236787,107,818,57292,3468,19386,44477,236761,108,12357,2170,3118,236787,107,2209,57292,236888,108,163010,236787,107,3771,236764,951,236764,951,236764,951,236764,951,236761,108,44180,1439,134560,236787,107,2859,236764,684,506,12761,11664,236789,5264,236764,532,23149,236764,1535,1331,236764,107,236777,1149,577,6827,236764,564,1093,120822,496,3658,653,1156,236793,107,818,837,2863,2490,611,531,951,3342,7342,107,55771,834,1623,3967,529,990,236761,108,206908,68251,3118,236787,107,130171,21485,1299,236793,107,2542,692,659,177023,531,21100,107,2094,171222,806,157831,236787,531,84846,1515,11632,107,114498,840,886,8613,236764,532,531,2514,1515,1590,107,7711,2953,4355,236787,5233,625,563,169712,107,2209,5822,531,236772,9467,236761,108,44180,1439,134560,236787,107,6445,506,1535,33408,119121,107,6372,1023,33461,13706,236764,5769,36637,107,120417,1116,52026,2940,563,31205,236789,236753,107,902,730,1034,236789,236751,1852,2260,236764,1133,614,104324,3882,107,31336,1492,9039,872,1116,1852,236888,108,206908,68251,3118,236787,107,2209,236789,236751,496,5933,600,1921,577,3463,3121,236761,108,44180,1439,134560,236787,107,236806,236764,668,236789,236751,496,47938,600,815,840,496,5933,236793,107,236792,38509,236764,531,3463,625,1135,236793,531,25399,625,236764,3735,236761,107,3689,815,668,3028,531,13706,600,236789,236751,26721,4355,236881,107,174450,1023,22816,236764,506,4806,668,46440,5745,726,107,24249,236764,564,30253,43096,236764,563,919,1082,600,668,46440,236764,107,2292,1551,614,62963,726,499,5083,612,236789,236753,625,573,914,2891,236793,107,3133,1144,563,2378,236764,531,10382,625,684,914,2891,236764,107,114498,531,775,784,236764,600,776,236789,236745,532,17477,625,236764,107,236776,5500,531,506,1345,512,236789,506,1902,236761,108,206908,68251,3118,236787,107,2094,563,3980,12543,236761,108,12357,2170,3118,236787,107,17639,953,1764,824,236787,1056,668,1602,2765,914,2891,236764,107,1509,20488,236789,236753,1515,236761,108,44180,1439,134560,236787,107,818,2509,529,506,3998,107,33993,3622,16926,1274,524,236764,563,711,1299,35391,107,2542,1144,1680,6
25,691,236761,108,12357,2170,3118,236787,107,1882,236789,859,6899,951,919,236761,107,191427,687,1515,531,914,3155,236764,532,179196,1515,76849,236787,107,236798,598,914,13482,236764,1646,529,37636,4135,236764,107,97813,3342,236761,108,44180,1439,134560,236787,107,4906,3658,919,236764,886,3658,236761,107,2094,41652,236772,143766,49064,236764,1056,625,2863,1586,107,818,7342,529,97419,1132,236789,236753,37884,1788,236764,795,2311,5226,107,124912,2080,501,15176,531,236789,236751,37308,236761,101685,684,1657,236793,107,236798,598,7789,236764,618,668,563,27867,236764,2541,855,236764,107,3133,62502,1822,13706,607,51386,236761,108,12357,2170,3118,236787,107,2859,625,964,834,74077,108,206908,68251,3118,236787,107,3689,776,12444,2910,236881,107,19845,692,711,1053,496,11613,529,914,75712,236881,107,7711,496,524,2538,1406,1841,236881,13109,88177,236881,20639,236761,108,44180,1439,134560,236787,107,24501,672,236787,668,815,1010,43257,858,236789,506,28481,107,10081,668,1451,4988,496,26114,236764,532,563,2611,2528,236789,236753,107,902,108087,5192,236793,13294,532,50767,3075,107,2209,9160,2180,23469,236761,23097,786,5264,236764,107,236777,236789,859,817,531,1515,236764,532,37585,531,3437,1515,107,10936,668,2863,3890,236764,684,496,58587,1183,236764,107,902,8118,236764,531,914,46533,63383,236761,108,7280,28579,236787,107,206082,12761,11664,236764,107,1509,563,506,111916,1595,236787,506,1032,3003,107,15600,8595,2311,48804,236764,532,506,1345,529,625,107,41387,531,506,6534,236761,108,206908,68251,3118,236787,107,206082,10186,501,8875,236764,107,3912,611,1299,618,506,1331,236789,236751,10095,236761,107,192890,236764,6267,1679,822,17105,236761,108,12357,2170,3118,236787,107,5988,711,2033,236761,108,206908,68251,3118,236787,107,54825,580,506,2436,236772,2811,236761,1191,236789,859,8835,611,993,236787,107,10936,236764,768,611,3437,711,2481,81341,236764,692,236789,859,7162,107,902,1023,1171,1595,236761,108,44180,1439,134560,236787,107,236777,236789,859,3437,1515,531,611,236761,107,6481,786,12614,822,2544,236787,668,1921,2229,236764,107,3524,1144,563,14588,795,1500,236761,108,7280,28579,236787,107,100985,611,236764,1531,236789,236751,531,1515,236761,108,38850,236777,3769,1536,3118,236787,107,6481,1091,52276,784,1003,10701,23896,236764,1861,786,107,28034,580,506,11228,653,657,6877,17768,236789,37308,236764,107,3524,26106,3595,26607,580,506,18725,635,1037,5441,236764,107,6372,506,31632,2473,1679,10161,107,43760,506,13343,529,14186,236764,3819,795,564,2036,107,3912,5478,531,1091,236761,108,236776,161181,1037,236787,107,3048,776,506,951,39029,236761,108,38850,236777,3769,1536,3118,236787,107,236777,53701,1041,5946,107,25552,711,34767,786,3342,236764,1015,691,50195,107,2021,2246,1091,26112,3469,216043,236764,2432,4464,107,2021,3717,532,6739,607,8337,2139,236764,531,1407,13759,15005,107,902,120550,236764,531,156579,236764,577,2036,532,5601,236764,107,4420,886,840,529,1041,45777,15032,872,107,2021,8988,529,8118,653,3653,236761,107,236777,2910,529,611,236787,107,11355,1602,611,7976,786,141027,236881,1093,611,735,786,107,9277,531,1041,4135,236881,31373,1879,564,1441,107,818,880,564,1006,236761,108,45415,48017,5299,236787,107,236806,236764,17536,236764,17536,236764,17536,236764,107,236777,1093,735,1053,611,2247,822,2066,1388,580,236764,107,13286,611,1053,23143,625,855,236761,108,38850,236777,3769,1536,3118,236787,107,6481,817,236761,108,45415,48017,5299,236787,107,3048,2473,735,1010,3487,506,880,611,659,236764,107,3497,69912,2344,531,577,834,236793,32529,1053,1010,107,818,99717,1013,529,822,86384,236764,768,107,3048,1053,71
[tokenized prompt payload: one long comma-separated line of Gemma 3 BPE token IDs (all values < 262144), hard-wrapped mid-number in extraction; the recurring 107/108 tokens apparently encode the single/double newlines of a dialogue-formatted source text]
3,653,3363,4135,236764,107,4348,531,577,1032,1082,886,3210,236764,711,6049,107,4663,506,213249,531,506,43326,236764,840,70973,8118,107,14986,607,506,1638,130672,532,173079,107,2205,668,55392,236789,236753,506,3653,236793,840,886,529,1239,726,107,2205,668,46440,35226,529,1091,784,236764,711,784,236764,107,2542,564,30253,834,2793,2196,1515,726,12901,1515,9891,236789,236753,236764,107,4324,52695,236764,532,834,8418,1044,236789,236753,236787,840,668,815,496,33641,236764,107,2021,89887,625,528,506,131675,236761,1593,1023,73617,107,52983,528,506,16823,529,506,990,236787,107,3133,2066,236764,31273,4850,1346,139731,236764,107,236814,651,711,496,44064,834,15179,618,496,9048,107,2021,224205,1144,625,46440,3028,236761,107,4906,4304,21090,855,886,4304,236793,886,25196,236764,886,25196,236793,107,184603,684,5726,14784,589,236764,28386,684,28386,776,5121,236761,107,33190,236764,1531,236789,236751,3121,236761,3026,236764,102301,605,236764,13706,563,162531,236764,107,178651,1610,6934,236789,540,529,784,236793,1299,21546,1610,35627,10701,236761,108,44180,1439,134560,236787,107,3771,236764,564,236789,859,711,817,236787,611,6899,1144,668,46440,1176,107,24249,691,46232,914,2870,236793,1015,9312,1515,107,902,496,1346,23348,2931,236761,1293,2246,236789,236753,786,6353,236787,107,4573,1144,512,236789,600,236881,3764,236764,611,600,8418,1044,236789,236753,1515,236793,107,236776,16879,1680,914,8336,3798,1679,236764,532,21980,107,818,1595,1131,914,40474,236787,16873,236764,768,668,99338,236789,236753,107,2021,6899,1301,138299,8988,236764,564,236789,859,2514,657,2033,236761,108,2457,16008,134560,236787,107,2209,1093,711,4483,531,1281,786,236761,108,44180,1439,134560,236787,107,6294,611,6899,236881,108,2457,16008,134560,236787,107,40524,886,990,668,1602,2246,786,684,1041,1463,236787,107,236777,27565,1023,2255,79060,236764,532,506,17221,107,6372,692,735,159232,3075,236761,3468,19386,44477,107,2209,1093,711,3890,531,236787,573,15242,784,5618,236793,107,2209,691,496,2712,529,5017,236764,3822,1933,236764,107,112222,668,1053,59308,5668,496,1463,512,236789,506,4304,107,4088,18830,13706,236761,108,44180,1439,134560,236787,107,11355,236764,834,236787,611,735,1603,1535,981,236888,107,236776,6727,529,12761,11664,600,735,25883,236789,236753,573,13706,236764,107,2021,1386,120934,12907,74077,236746,29417,6571,236888,108,2457,16008,134560,236787,107,236777,90971,1515,1217,19833,756,236745,9849,531,73831,107,4420,625,691,2344,4275,236787,668,24406,236764,107,1509,691,496,13759,14071,529,496,1883,107,2021,886,8761,901,1053,45345,236789,236753,236761,108,44180,1439,134560,236787,107,26546,1388,236787,107,30092,668,1879,2344,236881,108,2457,16008,134560,236787,107,236777,2729,236789,236753,531,121680,914,4303,107,2542,236789,236751,2147,4690,236787,914,3890,531,786,691,236764,107,2209,1451,711,4196,531,4351,1091,528,496,26106,107,4088,130681,761,1921,236762,172370,236787,668,1176,756,236745,9849,128005,236764,107,2542,886,6934,11261,653,1156,236764,531,5264,723,236081,236764,107,3133,2036,531,18410,506,39906,236761,108,44180,1439,134560,236787,107,2542,886,6934,11261,653,1156,236888,107,236777,1006,886,529,1724,236793,914,5946,236764,6853,236764,914,1919,236764,107,3133,672,36711,12339,2311,236764,692,659,506,27075,236787,107,3048,659,506,1921,236762,172370,236793,532,611,659,191630,107,68941,506,16254,236787,692,1921,577,47058,573,611,236761,108,206908,68251,3118,236787,107,197615,236764,14098,236764,577,6213,236787,768,611,28440,822,11278,107,902,672,834,2752,236772,52796,1601,236764,3819,776,711,107,4674,5256,547,236789,236751,607,1
023,33011,236761,2024,236764,2889,236764,768,611,107,38786,577,822,2891,236789,236751,4846,1877,236764,822,1535,28166,236764,107,9474,1082,506,14816,14093,692,740,1386,236764,107,196616,4721,1023,2891,1562,236761,108,44180,1439,134560,236787,107,3771,236764,564,236789,859,711,1470,88067,236761,108,206908,68251,3118,236787,107,100985,611,236764,817,531,1515,236761,108,44180,1439,134560,236787,107,3689,1374,564,776,236881,108,12357,2170,3118,236787,107,16904,1386,7464,1144,822,2765,740,776,107,2542,13706,236764,5645,2481,81341,236761,108,44180,1439,134560,236787,107,13086,236764,532,1879,600,2481,81341,107,13293,786,236764,618,1301,138299,563,994,236789,236753,236764,107,2805,92887,236793,1144,1299,236881,107,4573,618,496,115615,524,4389,236764,32717,236772,13844,107,3497,914,196186,1788,236881,1879,236789,236745,577,834,236881,108,206908,68251,3118,236787,107,40524,822,1535,795,107,23600,735,600,8863,699,13706,236764,1308,506,4113,107,2205,611,10518,1388,236761,108,44180,1439,134560,236787,107,236777,236789,859,37585,756,236745,236787,107,236777,1751,668,236789,859,6899,786,236761,16961,236764,531,28721,914,11645,107,3133,2346,657,1535,1301,138299,236764,1623,723,137346,786,236761,107,2209,691,711,3523,1388,236793,668,1053,711,201835,236787,107,818,43608,723,6910,236789,236753,236764,1023,4806,563,7445,236764,532,1299,107,1882,163960,3324,506,5597,236764,659,723,2957,107,2021,2583,653,531,49233,236793,840,1056,692,735,7305,236789,236753,107,9208,532,1239,17308,2499,529,1023,4806,107,3497,10135,532,19020,236764,692,735,1349,1898,39690,107,55771,528,1023,31390,236772,5282,4592,236751,236787,5233,564,236789,859,4526,1515,107,112222,668,577,9337,524,531,1041,2864,236764,107,3133,1299,564,236789,859,1076,3324,1515,236761,108,12357,2170,3118,236787,107,3048,1281,506,1401,4284,1131,914,42057,236764,107,3133,3914,10382,822,1595,236761,108,44180,1439,134560,236787,107,11947,9303,236764,564,236789,859,8595,1515,236764,107,16733,1217,625,795,236761,564,2863,16509,1440,735,4654,107,4088,1041,2630,236761,108,2457,16008,134560,236787,107,2209,236789,859,2752,6899,1515,236761,108,206908,68251,3118,236787,107,4348,236881,108,2457,16008,134560,236787,107,236777,3442,611,236764,668,1677,2178,528,5122,236764,914,7068,107,9264,618,756,15088,771,8141,13706,236793,532,914,9938,107,818,11557,34819,531,914,56346,236761,564,202428,236789,236753,1680,1515,236793,107,236789,236774,9849,1401,152194,668,1176,756,146838,67251,16414,236789,236753,786,107,21204,236764,607,914,160132,1526,236787,1144,668,1093,776,236764,107,2209,3265,528,5712,1308,786,236793,1144,668,1093,711,236764,107,27743,607,614,47068,531,6422,531,914,3439,236787,107,4324,600,784,4614,563,36118,236761,107,71520,914,29417,5946,236764,532,914,6853,236793,107,15938,236764,618,564,6899,236764,2689,531,29732,1515,107,2542,40474,531,914,2891,236761,6841,236764,1531,236789,236751,11632,236764,107,3133,607,1023,5888,1175,1059,695,104337,1091,580,236761,108,7280,28579,236787,107,44289,236787,52533,659,611,236881,108,12400,28579,236787,107,15248,236764,532,817,1063,236761,108,44180,1439,134560,236787,107,3048,9200,1133,1758,236793,756,55188,1388,236787,840,236764,684,822,5264,236764,107,236777,1006,614,10095,529,1883,236764,532,2229,107,2021,8988,607,3468,19386,44477,236761,108,7280,28579,236787,107,4663,52533,236881,108,44180,1439,134560,236787,107,4663,13706,236761,108,7280,28579,236787,107,3048,1149,711,1786,236764,611,1921,994,236787,1023,2870,107,15600,951,919,6899,699,76849,236761,108,12400,28579,236787,107,3048,236789,859,1460,822,13706,53722,607,4304,168
0,107,3048,236789,859,8988,607,3468,19386,44477,236761,108,44180,1439,134560,236787,107,11947,1041,4690,236764,107,2859,611,735,6827,822,2870,2910,529,13706,236764,107,3133,529,914,4690,993,236764,625,563,10163,531,86357,236764,107,4754,1463,46440,6374,236789,236753,822,23896,625,563,10186,501,8875,236761,108,7280,28579,236787,107,3912,625,834,236793,817,1063,236787,506,31886,529,822,1463,107,4602,711,1590,236623,236761,108,44180,1439,134560,236787,107,236777,3442,44543,236764,12339,236764,107,818,2870,563,1041,35047,236787,564,735,1010,107,818,2260,529,914,1535,12316,236764,52533,1758,735,1676,107,15989,1463,723,52725,236789,236753,236764,47926,586,58871,236793,107,2542,564,735,3785,27662,1041,4690,236764,107,4088,8761,668,236789,236751,9329,236764,607,784,506,2425,600,1917,665,107,38786,2180,47788,522,17477,236787,16873,236764,6494,236764,107,17729,531,496,13392,3324,496,29110,3866,236764,107,236777,735,151878,3068,506,6184,236793,532,528,914,30450,107,19845,4180,18743,236789,236753,506,67638,236787,5233,236764,12339,236764,107,236777,1921,735,5264,531,1786,236761,108,7280,28579,236787,107,134030,236764,17536,236764,768,611,1053,4173,618,1551,12828,528,914,107,1553,19836,618,611,735,110018,4171,528,822,1852,236764,611,107,16223,711,1786,1590,236793,951,236764,3635,625,964,618,128061,107,1071,7089,618,531,3892,100770,953,236761,6841,236764,817,1063,236761,108,44180,1439,134560,236787,107,236791,1302,37034,236764,12339,236764,5630,1041,1463,563,10186,501,8875,236764,107,47236,60897,835,580,506,4598,529,822,2870,236761,108,12400,28579,236787,107,3910,42132,611,735,1010,914,103034,236764,618,611,1879,611,107,17777,236764,564,1006,886,600,236764,14315,1847,1208,1515,236764,1921,107,30468,236764,611,3914,1786,236761,6841,236764,817,1063,236761,108,44180,1439,134560,236787,107,18047,668,201835,236764,740,540,35627,3442,236881,573,564,1093,711,107,86137,607,1515,8421,1308,12441,236761,108,7280,28579,236787,107,3048,659,496,10995,236764,659,611,236881,108,44180,1439,134560,236787,107,236777,1006,236764,618,21820,2870,563,236761,108,7280,28579,236787,107,11407,611,1374,17554,13706,236764,618,668,1677,236761,3199,611,236764,107,14730,611,735,19482,855,822,33361,506,1401,107,149172,529,1091,236764,532,236764,528,496,23125,4913,107,163366,831,236764,2238,822,13550,822,19999,236764,1751,531,107,7171,914,5437,100615,607,506,3735,8337,743,529,2255,107,33643,236764,506,192865,1049,71504,529,822,27929,236764,653,607,107,1437,66594,1178,939,23345,529,1288,496,180367,13548,638,618,107,7624,4483,531,577,236881,3199,611,1751,531,13949,855,506,107,153642,4304,822,3207,563,5508,531,27400,528,236764,607,107,17887,7209,11762,618,672,236881,2301,236764,611,659,103911,236793,107,47157,236764,1063,531,13706,236764,532,11006,573,822,107,31128,236787,611,659,50709,236764,1023,2870,815,57882,107,7624,855,529,231541,532,73831,236761,108,44180,1439,134560,236787,107,39125,12470,236764,768,21820,20596,7261,564,964,1590,236764,668,1093,107,1930,786,607,22851,236761,108,12400,28579,236787,107,33190,236764,1041,20596,10077,611,711,236761,108,44180,1439,134560,236787,107,236777,2689,236764,21820,2870,236761,108,7280,28579,236787,107,4754,2870,41927,711,573,611,236761,7429,236764,564,1879,236764,817,236793,75835,107,236777,1531,12034,822,3746,236772,145593,529,4806,236793,1063,74077,7705,236789,236751,107,1437,46533,529,822,2963,236787,1063,236761,108,44180,1439,134560,236787,107,197615,236764,840,236764,12339,236764,12339,74077,108,38850,236777,3769,1536,3118,236787,107,3689,236789,236751,506,4217,236881,108,44180,1439,134560,
236787,107,6445,236764,611,29444,236764,564,236789,859,1879,614,221955,573,611,236787,107,3048,2863,1281,1492,600,564,1006,528,22851,236793,611,2863,107,721,588,705,600,496,7802,9200,638,3914,4408,786,699,107,3307,2369,3468,19386,44477,236787,8844,236764,840,684,1041,16357,107,4060,1515,236764,768,35627,1975,598,711,858,236789,506,1883,529,107,117003,236764,653,529,1070,4355,919,1440,528,107,27685,82972,236764,532,6716,5768,528,15944,236793,67028,1492,107,13002,586,236764,532,27959,498,573,1144,236789,236751,531,2229,3324,44543,236761,107,818,46846,33408,2178,528,51599,9033,590,1003,21820,107,140381,38644,236764,532,2765,44543,951,13633,1082,107,42454,2255,6353,10186,501,8875,1677,236888,708,1041,2369,236764,1041,2369,236888,107,594,521,1610,17373,4304,573,775,236793,1385,44543,236764,1590,236789,236751,107,7632,531,135129,625,236761,564,691,20060,7808,531,2229,531,107,1437,236744,236793,840,1646,26346,7293,840,7564,1451,2827,107,1437,236744,236764,564,735,1010,41757,855,529,822,33361,607,107,228772,236751,236793,532,192036,44543,531,73831,13706,236764,532,21820,107,108067,835,148023,236761,669,1535,33408,1188,236756,676,21820,107,14389,651,236764,532,2490,506,513,84409,529,625,3324,672,1911,1184,107,8472,74077,1580,236764,1015,236764,1133,496,3355,236764,46440,14978,1041,107,12549,531,44543,236761,108,38850,236777,3769,1536,3118,236787,107,83627,236888,108,44180,1439,134560,236787,107,3910,236888,3121,236888,108,38850,236777,3769,1536,3118,236787,107,236824,1482,236764,5946,236764,1919,236764,564,1281,711,236761,3551,20137,107,14219,1376,17084,531,3496,236787,3635,564,47700,107,4754,47812,9668,236764,1041,108000,12828,107,902,6285,1166,1037,81201,236761,2981,692,735,1010,10334,236764,107,21115,4720,10849,29106,2863,23572,236764,4319,107,55771,56346,5433,1217,1623,236761,6841,236764,577,8731,236761,107,61781,23896,2342,822,26818,659,15281,1082,107,11069,33361,2342,1041,4912,236761,16961,236764,573,564,9312,44543,236764,107,13751,672,3008,236793,564,3528,625,573,21820,24273,107,3133,1093,735,7801,625,236761,12023,3658,236764,10186,501,8875,236764,107,236777,795,711,6899,44543,8988,236761,1174,880,236764,17519,547,8875,236764,107,31403,1041,27867,528,13706,236787,3819,35627,67028,236789,540,236888,108,22258,236811,128080,3118,236787,107,3048,2514,496,4512,3095,236761,108,7280,28579,236787,107,6445,236764,17536,236764,563,822,1463,10186,501,8875,236881,108,12400,28579,236787,107,236789,112728,496,25749,236764,611,1460,236764,529,1623,2066,236787,611,1281,506,107,2677,2033,1570,236761,108,7280,28579,236787,107,6294,611,6899,1217,692,659,1304,666,573,10264,822,107,36713,1788,1063,236881,108,12400,28579,236787,107,3689,4400,236764,776,611,1751,236764,564,735,531,27959,498,236881,108,44180,1439,134560,236787,107,236777,13637,2065,573,506,1902,6271,822,2870,236787,573,107,17887,2432,618,611,236764,564,740,56345,1751,993,236789,236751,1027,236764,107,3177,236789,500,834,6495,236761,1293,600,46440,496,795,531,1778,684,107,21156,1508,28600,625,711,699,2264,236787,1531,822,2870,107,2320,914,14588,236761,1701,611,236764,577,600,611,659,236764,1440,236793,532,107,17993,67735,3553,607,822,3911,236888,564,1879,531,611,236764,107,527,564,691,1176,531,236764,60918,236888,108,7280,28579,236787,107,236776,29417,12339,236764,564,12691,1515,236761,108,12400,28579,236787,107,818,26721,12339,563,1023,2870,236787,668,236789,236751,506,5441,236764,506,107,103818,711,531,577,6573,236772,1179,9484,236761,108,38850,236777,3769,1536,3118,236787,107,1882,795,1680,506,11595,529,13706,16922,107,3974,1679,1023,4253,236761,3
551,8324,528,672,2970,236764,107,3048,1921,2072,531,506,6285,1166,1037,97635,236764,1217,76206,107,236777,735,45048,672,1960,236761,108,22258,236811,128080,3118,236787,107,16904,910,10842,107,3048,735,35391,236793,197735,236789,236753,822,23896,2342,107,818,2870,8816,529,13706,236793,2752,15210,107,236776,2147,89830,236764,951,236764,711,607,1288,4690,107,6372,3305,1091,2889,529,611,236761,108,38850,236777,3769,1536,3118,236787,107,2094,1774,2255,880,236764,107,2825,542,607,496,11890,236789,236753,3710,564,735,3265,531,13706,236764,107,223411,786,2787,506,4113,529,496,6353,236793,107,197615,236764,8081,7156,786,236764,11161,236761,9963,6524,20681,107,31403,531,5039,1515,236793,573,5769,2255,2765,564,735,236764,107,31382,564,1407,236789,236753,21049,586,531,1515,236764,3622,919,2729,236789,236753,107,818,1171,3439,236764,837,901,1602,28440,107,3133,3914,1492,4242,236793,531,20499,1515,1186,107,6372,3305,668,1451,776,919,236764,496,1401,2268,107,236777,735,44423,531,236787,5756,214111,532,26818,236764,107,31777,699,506,1883,6271,2147,4690,236764,74026,107,15600,564,39002,10222,531,236761,9673,236888,1144,25336,563,672,236881,107,172702,564,577,60553,531,142184,1041,77785,107,902,506,1638,990,756,55188,1603,236881,564,795,711,236761,107,4754,6853,3952,48398,236793,1299,506,20488,236789,236753,38423,107,10936,495,672,32516,691,40542,236764,532,528,1116,1526,107,818,184625,531,1116,4806,236761,2024,236764,855,236764,32893,236888,107,3243,6620,532,25962,529,4135,236764,2541,236888,107,6481,625,577,128061,531,577,13145,8074,236761,107,3689,563,600,176244,236789,20339,5367,236881,653,1724,190970,236789,6114,236764,107,24249,740,1386,33408,573,1745,2163,236881,564,24041,236764,532,1006,711,107,4088,15281,7764,1082,3496,236761,3551,5946,87760,236793,107,2205,768,97848,531,496,6050,31680,1374,107,902,1349,2309,16354,236787,532,1041,3184,6938,107,236814,651,614,6084,529,939,23345,236764,837,107,20418,4135,73545,756,22472,236762,711,7085,1531,506,6285,1166,507,107,2740,1351,13706,532,120318,11702,236787,564,236789,859,2752,107,3912,1288,496,162482,2395,531,41179,33691,236764,840,1975,236764,107,2205,768,496,880,964,3260,529,5668,107,3133,7261,951,1032,12694,236761,108,149808,236823,210985,236787,107,4754,29398,532,8705,236888,108,38850,236777,3769,1536,3118,236787,107,9208,6114,659,711,506,1638,564,30748,528,13706,236761,108,149808,236823,210985,236787,107,818,55180,600,28453,775,5478,6692,107,114526,611,1751,834,236761,108,38850,236777,3769,1536,3118,236787,107,17729,496,39212,13387,1492,236764,107,236777,735,18492,1041,912,236764,532,564,1006,855,236764,107,14986,531,496,2587,87270,236761,8890,529,1041,32713,236764,107,52790,705,1041,107903,236793,840,776,711,1879,107,2542,600,756,52790,705,1023,51386,7085,708,236764,496,20568,107,12059,618,1041,74627,236764,9380,618,1041,47812,236888,107,6445,236764,684,506,45288,26476,529,20808,236764,600,20568,107,236777,7505,699,44543,236764,23348,236793,532,1041,1847,11645,107,236814,651,46841,236789,236753,625,545,236789,497,2338,236761,1599,33408,236888,564,865,606,236764,107,3133,506,1346,29417,5946,529,506,1902,107,58499,6770,514,7059,236787,19326,236764,1041,21980,236764,858,236789,506,7764,236793,107,4088,21820,5268,11133,919,18358,1407,107,55771,600,529,3364,22549,236761,108,45415,48017,5299,236787,107,236806,236764,1975,872,1105,598,236888,107,96553,236764,607,951,68097,43326,1082,506,151638,236764,107,236777,202428,1680,44543,236793,532,723,73578,586,107,11340,11133,236764,618,45072,784,672,1651,107,34531,506,1919,532,3724,236761,108,38850,236777,3769,15
36,3118,236787,107,3689,563,672,236881,107,11069,40027,531,786,236881,531,822,32107,2369,236881,107,11407,1531,506,104567,580,506,33233,7642,107,25831,760,506,8197,236793,1299,1531,506,5333,77241,29146,107,92159,506,11307,75790,1731,756,51775,540,506,84004,3768,236793,107,179247,522,137954,236764,531,1386,107,3689,3914,577,236764,6495,981,236761,108,45415,48017,5299,236787,107,178651,1610,1041,45850,236793,107,236777,4790,236758,531,5528,44543,236761,3574,611,1281,672,15924,236881,108,38850,236777,3769,1536,3118,236787,107,818,29417,12198,529,7423,4395,236764,107,818,16254,529,13706,236764,677,5737,618,506,18964,2355,107,6372,236789,236751,102446,1178,684,506,34881,699,134929,7613,107,3133,64260,580,104322,236789,236751,20376,236787,23348,172363,236888,108,45415,48017,5299,236787,107,2094,563,496,6934,150333,529,23149,236764,107,24249,684,506,16823,529,2587,990,107,12055,1407,1133,784,5869,236761,108,38850,236777,3769,1536,3118,236787,107,818,8081,529,18187,236764,107,3497,506,13782,529,39411,730,1034,236764,1573,107,195297,12018,607,951,1782,17412,236793,600,35627,1149,540,8595,107,2021,30720,723,236766,175267,236764,532,6568,858,236789,506,28481,107,17729,496,1822,5442,236772,3676,236764,8101,1418,36163,236764,107,3133,16011,1724,600,7068,44543,236888,108,45415,48017,5299,236787,107,11069,21980,236764,17536,12470,236761,108,38850,236777,3769,1536,3118,236787,107,6372,236789,236751,1041,36711,6938,236888,108,45415,48017,5299,236787,107,14986,668,236764,822,6853,236764,672,15924,236764,532,7564,236764,107,14219,219761,531,611,236761,108,38850,236777,3769,1536,3118,236787,107,236777,5426,2167,574,611,236764,8118,236787,107,3524,236764,768,611,236789,671,2679,236764,5630,672,1680,236787,107,818,3210,564,735,573,1745,2163,531,7224,1149,2752,107,3912,4247,684,611,229287,236761,3574,711,13700,786,107,162104,1041,18187,236764,653,36250,7235,107,31711,607,13706,236789,236751,24519,236787,3442,786,711,107,10936,495,564,4483,104324,236787,12614,711,107,2021,46023,1041,637,1283,532,5437,100615,607,107,11069,81934,7483,236761,108,45415,48017,5299,236787,107,236806,236764,951,919,236764,951,919,236888,107,3048,735,1176,611,795,711,7224,775,1027,3210,236793,107,2542,692,735,5017,1663,531,2679,236764,840,600,107,24249,611,30590,3016,236787,3819,692,795,2679,236793,107,6372,236764,768,611,5121,528,1023,2864,236764,506,27248,107,12055,13098,3324,822,48837,236787,5233,6899,775,236761,108,38850,236777,3769,1536,3118,236787,107,68690,547,8875,236764,532,611,6285,1166,507,236764,1686,236793,573,692,236789,859,107,129904,226774,699,13706,528,2147,236761,5180,2864,236881,108,45415,48017,5299,236787,107,31336,692,577,24782,532,711,8988,236764,1023,1459,2277,107,3133,1883,529,12762,1093,29016,1254,1144,1972,107,1882,735,5378,2338,21820,74627,236761,27384,607,208224,883,107,3910,919,37467,1082,784,4882,3607,107,14219,692,2229,534,2853,236787,2338,600,21820,14186,236764,107,7650,1374,107,13185,1023,6114,2727,607,12690,236764,17500,12847,107,4060,98640,236764,107,9390,47173,1091,137531,532,31716,607,9891,532,55180,236793,107,41950,506,5946,236764,6853,532,1919,531,1460,107,818,2369,236764,506,8705,532,506,6353,72235,107,15989,2891,236789,236751,143805,855,236761,1452,531,6934,692,107,1214,688,205870,236789,236751,1346,5279,236787,35627,33199,236789,540,775,107,7711,35486,531,506,33408,236764,837,563,496,5931,107,6372,784,840,692,4059,236793,573,1217,740,692,236764,107,2368,527,236764,1217,740,692,573,1023,2891,14098,236761,107,236824,949,9015,692,659,4470,236764,3075,607,21820,13626,236764,107,236824,949,9015,692,659
,4470,236881,610,697,236764,653,692,1921,10382,107,818,2891,236764,1023,23348,26004,236764,653,1663,21820,1589,236764,107,7711,5931,528,506,2891,236761,1191,1921,1586,107,2267,15179,153610,236764,3635,692,1053,107,7711,7976,236764,837,2678,1374,3345,236787,573,3477,35627,107,15545,236764,618,496,7262,19826,638,236764,577,5378,107,3497,880,17955,13868,1023,15729,236764,653,1663,107,236745,91129,10407,46104,580,21820,2891,236789,236751,42711,236764,107,3133,10591,506,21573,573,2963,145292,28516,107,195297,6853,532,2940,236789,236751,4806,236761,1701,7564,236764,2369,236764,107,236777,5708,711,531,4491,580,31252,8421,107,9208,28481,6054,236787,768,564,3914,63905,44543,107,98063,531,1407,496,29417,20499,531,1800,4688,107,55771,6370,506,1345,529,886,236764,35627,145453,951,32909,107,27393,531,19211,21820,2891,1082,531,46104,726,107,55006,531,236789,236745,236764,35627,145453,711,726,498,21820,5946,236789,236751,96807,236764,107,6372,6111,44543,531,672,1902,236761,108,149808,236823,210985,236787,107,43320,236764,532,10701,236764,107,6372,6111,611,12034,672,6938,236764,531,2514,822,1463,107,58106,531,990,236761,108,15565,16437,9575,3118,236787,107,236776,236789,2863,711,46104,580,786,236793,107,236777,236789,859,1845,3121,8421,564,1006,12869,236764,840,1299,564,236789,859,6093,236761,108,38850,236777,3769,1536,3118,236787,107,4348,529,496,3875,236789,236751,92829,531,577,236764,107,110465,6271,1919,6271,3875,236789,236751,3392,531,1460,236761,107,236777,735,2838,2311,1440,236761,108,45415,48017,5299,236787,107,197615,236764,817,711,699,775,5478,236761,107,2859,625,964,834,600,1023,2864,1602,6316,107,2021,5383,506,51386,236764,17534,531,9867,107,818,6285,1166,507,8761,611,7298,236764,611,2473,84699,775,236764,107,2205,71030,529,822,20488,236787,951,236793,1023,8816,107,4602,600,611,81715,1091,236787,1651,506,6285,1166,507,107,12055,1879,756,2094,40474,692,735,1407,236789,236753,67251,506,51386,236764,107,236789,2094,692,4461,67251,532,1546,528,3477,2678,107,46762,506,784,236772,236754,917,531,44543,532,4665,756,3912,1105,598,107,2542,3043,872,672,8118,37894,87109,1281,236789,540,236764,1822,2369,236764,107,818,1345,529,3653,236789,236751,12298,236764,840,672,2953,236764,107,6372,236764,768,35627,60003,13706,236764,506,7458,107,24249,35627,145453,17534,70196,563,1288,496,1463,236764,107,2825,888,51881,795,577,4799,236759,236789,236753,607,114739,236793,107,2825,888,170761,5478,3528,236787,756,818,880,691,29417,236764,107,4573,607,914,1774,5686,668,64216,625,855,236793,107,40479,236789,236753,914,2891,236764,532,914,1463,7474,107,2021,506,78287,3911,159338,236750,236789,236753,7085,95419,531,786,236764,2369,236787,107,178651,23823,9034,506,5851,23207,529,20488,236764,107,2021,97591,506,185518,529,506,33408,236793,107,2021,27299,607,41796,506,5777,65684,512,236789,506,2634,236764,107,3133,3819,531,5536,21820,67394,607,496,36543,107,6372,1374,840,164223,614,32049,236761,8922,24873,711,8988,236881,107,51836,236789,540,35627,625,42256,573,496,29417,880,107,31717,531,5630,140658,236881,84385,236764,8988,611,236787,107,2209,41927,711,573,822,118508,236761,95419,35627,236764,6938,236787,107,38532,21820,106221,1788,795,2827,1515,919,107,55771,740,1023,7483,236761,2085,236789,236751,951,880,528,506,1902,107,9474,4470,531,756,236751,5946,236793,3819,1590,668,19518,786,865,606,107,17729,886,858,236789,506,18300,236761,87109,23823,2752,528,21820,1972,107,11340,236789,236753,21820,23348,5946,1027,33625,236764,107,4420,1304,236764,6934,27829,236764,10667,529,951,1855,102848,236764,107,18047,732,2472,236789,236753,
44543,531,506,28481,532,21236,2033,236764,107,9609,501,607,20488,236761,22168,1041,2864,236789,236751,61327,236764,107,3133,892,794,786,1063,236787,840,768,625,577,711,834,236764,107,178651,1610,711,11481,236793,532,506,33408,795,72877,44543,236764,107,6372,35627,85827,236789,540,699,786,506,11133,837,107,2021,496,5946,236789,236751,912,19681,236761,1293,11747,3121,236787,107,8063,236764,21813,236793,1531,775,30720,1515,607,1023,40027,236761,107,2021,914,50705,3468,19386,44477,756,5843,236751,919,21077,107,55771,56346,531,1023,35486,236761,10237,236787,614,1345,236793,107,2094,563,506,1774,236787,834,692,795,2033,531,13706,236764,107,3133,1778,3571,1023,40304,236761,68208,236764,67028,756,236751,236787,107,2094,6938,236764,600,3914,3442,1144,668,1093,735,107,4573,28193,1416,532,9258,872,16573,573,59497,236764,107,25552,3282,1023,14071,607,919,6332,107,55771,35627,23823,531,30590,756,236745,236761,20639,236764,1531,775,817,236787,107,2094,12339,1053,496,6285,1166,1037,531,914,5946,236793,107,15989,6853,563,528,146066,8244,532,914,1919,107,17729,1515,684,6584,236761,16961,2583,775,1023,21100,236787,107,236777,1006,172104,236789,236753,3097,1023,3207,577,496,236772,9942,236764,107,3133,1299,564,236789,859,8988,496,2268,236761,108,38850,236777,3769,1536,3118,236787,107,236806,5946,236764,5946,236888,107,3689,735,611,3028,236881,183750,236764,506,80958,776,169809,236764,107,818,33408,1385,1679,236764,532,672,104324,8317,107,7634,13112,657,236761,708,1041,5946,236764,5946,236888,708,236888,107,3048,735,2810,496,5293,13626,531,13706,236793,107,4573,236764,573,822,2369,74077,123232,625,236764,708,236764,4646,625,236764,107,14254,113529,611,735,607,1515,59359,236789,236753,236764,107,2859,711,1346,53243,531,1515,236761,2024,236764,1531,625,2229,236761,107,68690,547,8875,236764,3635,564,3914,1386,1847,28481,236764,107,236777,236789,859,5528,13139,8118,236761,4224,236764,1535,17519,547,8875,236764,107,114498,611,528,1041,29227,236764,1093,611,735,6827,107,236776,5946,2344,236881,653,13416,2344,236764,17519,547,8875,236881,108,22258,236811,128080,3118,236787,107,236777,691,7808,607,514,236761,108,38850,236777,3769,1536,3118,236787,107,236777,30253,577,57882,611,964,236787,107,3133,236764,17536,236764,625,563,951,2268,3210,531,1386,107,61781,6114,531,24490,30721,236761,2024,236764,1535,17536,236764,107,3689,8118,611,236789,859,1386,236764,28136,786,236787,573,1041,912,236764,107,236777,236789,859,711,531,13706,236764,564,236789,859,1063,607,611,236793,532,14098,611,236764,107,15248,531,786,528,672,4400,236761,708,5946,236888,6853,236888,108,22258,236811,128080,3118,236787,108,38850,236777,3769,1536,3118,236787,107,43320,236764,684,532,684,236793,107,4573,692,795,6092,3075,236793,532,611,2863,10591,107,236776,2480,10021,1063,1082,4171,236764,837,692,236764,107,2791,1133,3439,236764,795,735,6651,236772,79904,236789,236753,236761,107,33190,236764,5273,607,775,236761,61032,236764,611,28054,107,2021,735,496,20376,5284,611,236787,784,506,73787,107,902,11702,236764,532,1116,120951,606,12162,236764,107,30092,711,735,1603,672,8118,236761,108,44180,1439,134560,236787,107,10185,611,570,1310,1028,796,512,236789,506,44264,236764,570,1310,107,50539,236772,11937,236881,108,206908,68251,3118,236787,107,11355,236764,1144,529,600,236881,108,44180,1439,134560,236787,107,2859,625,577,2653,573,611,531,131053,625,607,822,107,36604,15599,236764,993,563,1070,4614,506,21813,529,107,97375,236764,4285,914,5946,236764,1149,59359,607,1515,236761,107,4573,564,1879,993,563,951,4614,528,236789,236745,236787,1023,184152,659,107,15266,4
895,532,4196,3324,14860,236761,108,206908,68251,3118,236787,107,4602,236789,236745,2653,600,834,2822,496,990,740,9702,506,107,19216,529,496,880,236888,108,44180,1439,134560,236787,107,3810,563,1335,1895,1534,496,83360,532,496,33945,236793,107,40253,822,33945,691,496,83360,236761,1174,2481,81341,563,12530,107,2543,880,531,25800,236787,668,815,23326,236793,668,236789,236751,919,1082,496,107,906,14828,3210,236761,108,206908,68251,3118,236787,107,2209,9312,914,5946,104306,236761,108,44180,1439,134560,236787,107,4324,1602,668,786,236787,532,668,951,919,56478,914,5946,107,2998,1082,614,6589,236772,3704,236772,947,11149,236761,669,24630,1788,107,1340,914,3392,503,2278,35923,37532,236787,1056,668,23241,236764,668,107,50139,1133,614,3958,236764,532,506,3866,162064,1680,107,20941,494,37974,236787,668,563,2735,531,161376,496,33920,1184,607,107,20941,7068,236793,16592,1133,496,1054,713,236764,532,914,2346,563,496,107,95620,236761,1293,23378,528,914,1883,236764,618,496,3210,1603,573,107,70866,236761,2900,668,51682,577,3028,563,8585,607,107,20941,53318,236761,1293,8150,5017,529,496,8081,840,78428,107,624,496,20808,531,44233,528,236761,108,206908,68251,3118,236787,107,10784,236764,40474,236764,768,611,2072,1515,9995,236761,108,44180,1439,134560,236787,107,236777,6554,1515,528,506,2872,236761,4534,1144,40474,914,107,22517,2863,3437,699,1515,236787,993,563,951,919,40474,107,495,1515,1082,993,563,9556,528,496,8937,41652,236793,600,107,18803,1023,6934,3207,1586,236787,532,784,672,563,1440,529,107,7624,236761,108,206908,68251,3118,236787,107,818,33408,577,1535,31273,775,236888,108,44180,1439,134560,236787,107,3771,236764,528,1288,496,1624,506,33408,795,711,577,1535,31273,107,605,236761,3026,692,180130,1515,236764,692,35391,711,1091,236793,107,624,236764,668,16228,531,2541,1023,132193,236764,901,2833,711,775,236761,108,86859,236787,107,39125,236764,768,611,236789,671,5383,822,1972,236764,10240,531,822,3155,236787,107,818,4846,1553,5990,735,2506,822,12339,236772,105485,2253,107,3133,107131,1515,872,532,1679,236764,784,136964,236764,768,107,818,10995,21813,3437,711,5931,2033,236764,107,7634,236789,859,2583,1515,4355,684,12371,236761,108,206908,68251,3118,236787,107,3689,236789,236751,506,4668,236881,108,12400,68954,236787,107,11947,4668,236764,1535,4668,236793,506,21813,735,59359,236789,236753,236764,107,818,6285,1166,5990,659,864,52917,3701,236764,532,2481,81341,8731,236787,107,236776,3286,5804,1719,1602,2752,3819,28197,13706,236764,107,3771,236764,711,506,116497,529,506,18725,195722,236761,108,206908,68251,3118,236787,107,37662,236764,107,9175,35627,2953,672,563,1847,236881,563,625,1346,2953,236881,108,12400,68954,236787,107,2205,2953,618,564,1281,506,3768,563,4304,236787,107,10936,735,611,61501,236767,236789,236753,236764,600,611,1386,9370,529,625,236881,107,6855,236789,497,1343,614,4667,834,69419,506,41757,38268,236764,107,2205,506,16572,3467,524,1343,506,33361,236761,8922,236764,192713,611,236888,107,818,39976,1713,236764,62502,5503,236751,236764,4929,19713,695,532,9408,507,236764,107,7874,2278,532,505,76871,1294,532,506,63536,51386,236764,107,13185,506,3768,12847,236761,114342,611,236888,108,44180,1439,134560,236787,107,2094,563,1535,4668,236787,107,236777,795,817,2874,506,21813,236761,1174,6285,2162,722,107,4602,5367,529,1122,9784,236764,80237,236764,200997,5990,236764,107,236776,3207,2587,236793,529,12761,11664,236764,1288,618,611,236764,107,236776,5442,532,2601,2587,236761,1599,735,14098,236789,236753,1388,531,236772,1496,236787,107,2094,5597,573,3595,13460,529,822,184152,107,236777,236789,236753,711,735
,2238,496,24324,236761,114342,236764,1217,901,12690,236888,108,206908,68251,3118,236787,107,7280,236764,506,33408,14321,611,573,822,7229,1013,236793,2148,236764,107,24040,1041,7806,29106,236761,108,12400,68954,236787,107,39125,236764,692,735,784,107,20418,4400,531,2583,1822,8863,236761,108,206908,68251,3118,236787,107,7634,659,3541,506,3207,236881,108,12400,68954,236787,107,64889,657,1523,531,5273,236761,108,206908,68251,3118,236787,107,1882,795,2874,1091,236764,107,3133,1601,506,12690,236761,108,7280,28579,236787,107,3912,2840,1023,32994,634,236764,506,1972,529,13706,236888,107,7029,784,822,42608,3075,236764,30450,506,33408,236764,107,3133,1386,124294,29371,236793,2082,236765,7983,1680,1091,236787,107,2805,1179,725,506,9168,600,8418,1044,236789,236753,2481,81341,236764,107,98743,514,1515,607,506,8349,529,914,5946,236793,107,136683,756,16651,236764,21813,236764,8349,37894,108,3243,236787,107,16651,236764,21813,236764,19694,236888,108,22258,236811,128080,3118,236787,107,5988,3442,506,97635,512,236789,506,3207,564,1006,1590,236787,107,159653,1091,672,3627,236787,2963,1676,625,236764,107,73319,1091,11790,531,506,2436,1977,236793,1298,564,236764,107,14986,528,61877,532,528,506,76273,236789,23896,236764,107,15600,43096,506,9043,529,625,236761,20633,564,107152,107,818,3207,23961,684,672,46440,5273,236789,236753,532,107,3898,2068,531,3196,1680,506,1331,236764,18430,107,2021,103662,13442,607,4171,236787,21100,236761,107,14254,8349,236888,108,7280,1700,235190,236787,107,3910,563,625,607,1023,2870,236881,108,22258,236811,128080,3118,236787,107,14986,834,107,2205,607,496,880,684,914,1852,610,1356,1092,143525,236789,236753,236764,107,3133,607,914,21811,125947,236761,108,12400,1700,235190,236787,107,14254,29417,17536,236764,107,2859,611,776,2768,506,1638,9703,19139,107,3048,7976,236789,236753,775,7789,236764,692,236789,859,5518,611,107,4088,822,1822,8613,236761,108,22258,236811,128080,3118,236787,107,39125,236764,564,3914,3442,236787,107,1882,1921,7162,618,692,776,1586,506,1331,236761,108,42173,1700,235190,236787,107,818,1331,795,4595,12298,20126,107,236789,236774,10718,903,611,993,236789,236751,4954,236793,840,506,3798,529,3477,107,114526,506,72399,49967,529,784,236761,108,22258,236811,128080,3118,236787,107,236777,1281,625,236793,107,3133,1041,137059,531,15161,657,1515,34822,107,236776,1535,5431,236761,564,8675,1515,236764,532,564,126693,236789,236753,107,61781,20488,573,914,9043,236787,1015,1646,834,236251,236789,236753,236764,107,2209,1813,236789,236753,914,861,6485,607,569,8548,529,1378,29086,236764,107,236773,29092,522,834,1041,4690,236793,532,236764,531,672,1345,236764,107,2209,7141,236789,236753,914,4135,236764,2752,3224,1680,107,4573,531,577,10887,236764,723,1745,571,742,532,2196,236761,108,42173,1700,235190,236787,107,39125,236764,914,79188,1788,107,4420,668,1602,1975,573,57292,236764,837,668,5745,107,2292,6220,529,7856,31910,74077,108,22258,236811,128080,3118,236787,107,6372,564,1093,735,13804,529,236787,107,33993,8418,1044,236789,236753,573,236789,236745,236764,668,3588,31273,1041,109041,236793,107,139825,531,1041,21171,914,35043,236787,564,3721,1515,236793,107,42619,1515,7600,236772,2348,638,607,786,236793,5877,1515,1595,107,902,784,914,1852,33447,236793,16873,236764,1531,1515,5347,107,3949,529,1041,5734,236764,914,5699,531,21479,236764,107,4754,1791,532,187148,1758,236793,7779,914,1702,1230,107,902,10701,1852,1589,236793,4790,236758,531,70196,506,30006,107,24249,668,1602,1345,784,914,236793,532,3721,1070,21077,107,2021,776,7564,672,6133,236787,8421,236764,657,506,1774,236764,107,236777,448
3,236789,236753,914,84014,236764,711,8324,236764,532,107,2209,158522,786,607,914,129412,236764,618,768,107,236777,1053,1010,160573,236761,108,7280,1700,235190,236787,107,4324,668,1602,236764,1041,29398,236787,107,818,14093,129686,236789,236753,657,625,236764,532,236764,528,506,1774,236764,107,4420,668,1053,7505,13706,532,600,692,1385,236789,236753,107,2542,951,2344,75052,1082,27877,74077,108,22258,236811,128080,3118,236787,107,3810,691,625,236787,107,2542,837,1041,4343,40892,2863,577,10161,236789,236753,3324,1515,236761,107,3834,496,2321,17221,529,3607,236789,236751,38437,597,236764,837,659,107,2205,12907,618,12828,236764,668,5463,506,4806,532,16297,107,4088,1023,1822,2970,236787,5233,2863,668,1778,236764,107,3133,564,236789,859,12495,786,528,914,3798,236761,2024,236764,192713,236888,108,7280,1700,235190,236787,107,11069,11363,5148,611,5273,236789,236753,1133,496,1868,236764,107,3133,1053,951,65997,2033,236787,840,668,7623,236764,107,56311,3935,506,2634,607,9168,236761,108,12400,1700,235190,236787,107,3133,6213,119593,236764,107,2825,888,2940,668,46440,125947,236764,910,3225,184152,27299,107,3497,6295,1515,27877,236761,108,42173,1700,235190,236787,107,20416,236764,657,822,132422,236764,107,236788,500,668,3821,5668,236764,653,2827,506,1331,107,3497,1144,668,1093,1879,236764,1531,1515,2597,822,26114,236764,107,24249,692,795,1855,236761,3026,668,12828,3008,236764,107,6259,822,1595,914,22720,31555,2863,69910,107,15989,7483,607,914,2742,236761,108,22258,236811,128080,3118,236787,107,37889,951,919,236787,107,8291,2229,506,97635,236761,108,3243,669,55520,236787,107,3048,659,1346,8349,2033,236761,108,22258,236811,128080,3118,236787,107,236777,735,711,52026,625,236761,107,4573,236764,26721,97635,236764,735,611,607,94900,810,3456,107,3689,564,735,5267,531,611,236881,108,236798,2237,236787,107,1882,735,236761,108,7280,8618,236787,107,3133,161152,531,6899,236789,236745,236761,107,3689,43945,668,1603,1680,506,1774,236764,564,1751,107,196616,735,1765,3735,34768,236787,840,993,531,1345,107,10936,668,691,531,3654,532,2583,3121,107,818,7458,529,1023,187896,236764,38020,775,107,3497,1023,1852,5536,236764,3043,496,52340,1298,107,3810,691,496,51459,74077,1580,34822,951,32725,236761,108,22258,236811,128080,3118,236787,107,2209,12668,236787,611,2863,6899,1515,236761,108,38850,236777,3769,1536,3118,236787,107,192021,236764,97635,236888,564,1006,994,236789,236753,822,31451,236764,107,3771,919,22458,607,1041,2891,236789,236751,2765,107,55771,1056,564,101838,11632,236764,840,2036,12764,15677,107,14713,822,1822,4991,236761,1599,659,531,1281,107,6372,24243,5902,564,735,21323,532,107,3497,48804,16622,5378,822,28481,1581,531,107,818,33361,529,13706,236761,5137,151624,692,735,6111,2033,107,6294,919,1082,6651,2661,1117,496,2587,4168,912,107,818,10814,529,506,2970,236761,1191,735,1603,8118,107,3497,951,2344,20488,531,506,28143,1090,107,55771,30720,531,506,51386,236787,532,692,1590,5518,236764,107,5096,52789,684,506,1122,9784,532,200997,5990,236764,107,68494,607,506,18505,512,236789,506,112936,236764,1144,107,1882,735,66941,580,236761,108,22258,236811,128080,3118,236787,107,6190,625,711,236764,29417,97635,236793,107,4573,3442,506,157831,236764,528,506,1494,236789,540,5802,107,2209,46440,57822,822,13361,236761,108,38850,236777,3769,1536,3118,236787,107,15344,2168,236888,1217,1492,236888,108,22258,236811,128080,3118,236787,107,43320,236764,157831,236764,2481,81341,236888,108,38850,236777,3769,1536,3118,236787,107,9585,81341,236888,108,22258,236811,128080,3118,236787,107,43320,236764,2481,81341,236764,102301,605,2481,81341,236
787,24873,35627,1751,107,236777,236789,859,20499,44543,607,600,50998,236764,21820,64569,236789,236749,1463,107,236780,6886,40707,605,528,146066,8244,236881,107,3048,97635,532,15005,512,236789,506,1883,236764,18307,547,16965,107,2209,815,47533,236789,236753,822,1960,236764,532,2238,872,236764,107,2542,2953,17221,529,9551,236764,822,3207,13706,236764,107,236777,1879,756,17993,3207,6945,531,914,6853,532,5946,236793,107,104071,914,47068,532,9314,1133,107,236776,26339,529,91580,27373,236764,2752,66154,107,187415,512,236789,506,3653,236764,840,657,914,26004,236789,236751,24947,107,2209,615,1718,532,96887,236789,236753,3121,822,13626,236764,107,6372,7704,77123,236789,236753,657,1515,532,1758,529,3710,107,13908,236789,236753,22967,1546,657,1032,236761,108,38850,236777,3769,1536,3118,236787,107,129904,236789,540,35627,236764,23156,236881,108,22258,236811,128080,3118,236787,107,1567,711,506,8081,236764,35627,6938,529,24947,236888,108,38850,236777,3769,1536,3118,236787,107,27577,236888,108,22258,236811,128080,3118,236787,107,3771,919,236761,108,38850,236777,3769,1536,3118,236787,107,32866,1933,103034,236764,35627,23823,1603,1041,3710,107,54025,1822,573,1144,6097,625,236761,15441,236888,708,34714,236888,107,236791,43917,786,236764,97635,236764,756,55188,506,1171,990,600,3785,107,236777,691,11724,531,1060,947,236761,5180,59673,236764,1041,23674,97635,236764,107,15545,2583,672,1313,506,7089,236787,532,914,1852,19596,726,107,15938,47369,1041,41827,11594,236789,236753,3324,1515,236793,600,107,15545,10591,1041,33838,531,914,23674,726,18803,6154,107,2021,38689,506,7089,31273,1515,236761,108,7280,8618,236787,107,84437,236764,1800,236764,532,6899,786,8988,236761,108,38850,236777,3769,1536,3118,236787,107,37967,786,531,9097,236764,6285,1166,507,236793,1758,532,110012,236764,107,894,662,784,822,14453,580,786,236761,15441,236888,2416,190989,236888,107,2859,611,735,3528,822,185643,1847,236764,756,55188,993,236764,107,6372,236764,1133,614,45193,528,496,27681,236772,131499,236764,564,107,86514,236789,236753,822,6285,1166,5990,528,146066,8244,236787,107,233721,564,1602,625,236761,15441,236888,108,22258,236811,128080,3118,236787,107,11355,236764,29417,97635,236764,107,15600,611,577,2247,528,3666,529,914,18261,31252,236764,107,24249,691,822,30720,236764,684,672,723,127265,1615,9599,661,236764,107,236789,14555,822,1852,6114,532,23896,236881,108,3243,1700,17859,3287,236787,107,6481,1515,1778,573,236789,236745,236761,108,3243,669,9432,236787,107,236789,236774,803,1515,531,9097,7085,756,6294,625,41909,7085,756,2209,11807,236789,236753,107,3307,2369,7085,756,4754,8709,7085,756,2209,10500,1041,40135,107,112181,7085,756,2209,10500,1041,6353,7085,108,12400,8618,236787,107,84437,236764,3920,236888,951,53644,236787,8118,236888,107,818,880,563,29417,532,914,30006,69468,236772,495,107,2094,24918,512,236789,506,7764,236761,4923,1774,53826,531,775,107,172702,735,113662,9903,236761,8383,236764,17519,547,8875,236764,107,3133,13007,711,506,8118,236761,108,38850,236777,3769,1536,3118,236787,107,236806,600,564,1053,1515,236764,107,3497,3962,17519,11129,4407,236764,653,919,236764,914,41704,236764,107,2021,1161,1041,58587,26114,236888,108,22258,236811,128080,3118,236787,107,902,6169,533,50109,236888,108,3243,1700,17859,3287,236787,107,54395,236764,11807,236764,11807,236764,11807,236764,11807,1515,236888,108,236798,2237,236787,107,66639,236764,2768,236764,2768,236764,2768,236888,108,22258,236811,128080,3118,236787,107,4754,29417,39880,236764,6899,786,8988,236761,108,7280,8618,236787,107,236806,118719,605,74077,108,12400,8618,236787,107,1786
51,23823,3028,496,28869,1298,502,234007,795,137531,236761,108,42173,8618,236787,107,236774,1399,711,3324,1515,236761,37463,784,236764,577,12010,236793,107,25240,872,822,73787,236761,108,22258,236811,128080,3118,236787,107,4754,97635,236764,1056,611,2863,1281,726,527,528,672,49064,236764,107,26496,15320,684,1515,236764,611,3914,726,1437,1822,8613,107,24249,672,880,236789,236751,1972,1602,47700,611,236764,611,236789,859,88575,107,6372,668,563,5478,3463,1135,236761,7323,625,822,88734,107,2021,2246,786,531,822,112936,236764,564,236789,859,5518,107,123941,883,822,31253,34936,236764,653,52673,107,11069,111353,181708,236761,108,7280,8618,236787,107,44878,699,11632,914,2742,236793,107,3133,92377,611,573,1515,236787,1531,1515,577,18793,107,2205,506,1346,29417,1360,561,600,3785,81319,107,15562,1500,531,914,82621,236761,108,12400,8618,236787,107,15989,1852,167474,107,170171,699,17519,547,8875,496,1822,912,529,27248,236761,107,6481,236789,236751,1386,506,1791,529,625,236761,108,22258,236811,128080,3118,236787,107,4754,49064,563,8731,236793,107,3133,564,1006,19847,607,55180,236761,12774,1515,872,236761,107,28368,236764,1806,512,236789,506,9329,598,18187,236793,564,236789,859,577,886,236761,107,57949,35627,506,22023,236764,600,625,8988,92377,3846,236787,107,91270,822,8103,510,14493,236761,20043,528,672,3207,668,107,236814,651,39557,236789,236753,532,723,10869,524,1551,496,886,236764,107,24249,531,672,6468,29016,917,506,9938,236764,107,40524,668,2863,735,496,29417,6571,236761,50591,236761,108,9179,7328,4660,32867,236787,107,6445,563,506,7208,529,1023,115615,107,42619,46846,5312,684,672,3768,529,3773,236793,107,3133,784,506,14958,600,134465,236789,236753,3324,1023,3155,107,902,506,5268,147851,529,506,12461,29051,236761,107,6445,659,1023,24468,4470,607,82865,179961,236793,107,7711,162542,12162,15410,872,573,50744,236793,107,7711,47927,610,516,8896,6692,531,69619,13721,236764,107,7711,92131,119607,531,48151,8073,236761,107,208425,236772,3203,4130,3653,46440,7080,236789,236753,914,163970,3645,236793,107,3133,1492,236764,5205,529,32812,518,19319,92302,107,2021,36780,506,39690,529,82527,112027,236764,107,2209,1670,616,538,8464,586,528,496,15924,236789,236751,18782,107,2021,506,2137,149772,1434,52614,529,496,66479,236761,107,4573,564,236764,600,1006,711,21735,573,147368,34296,236764,107,31777,1603,531,4054,614,1006,17745,3182,236772,22667,236793,107,236777,236764,600,1006,39533,953,18743,236789,236753,236764,532,1461,2765,236789,236751,126164,107,2021,115033,1680,496,190981,1006,13599,114684,236793,107,236777,236764,600,1006,113288,236789,236753,529,672,5888,13789,236764,107,5239,774,529,4926,684,864,41645,4135,236764,107,3321,11214,236764,723,50263,236789,236753,236764,3265,1680,1041,990,107,52382,672,25638,1902,236764,56345,3746,1603,872,236764,107,3133,600,834,27875,953,532,11286,6399,742,107,6372,12414,33931,657,786,618,564,34092,684,1091,236793,107,11355,236764,564,236764,528,672,7209,55555,990,529,8118,236764,107,19845,951,14933,531,1786,3121,506,990,236764,107,71520,531,45989,1041,13024,528,506,3768,107,3133,8061,638,580,10701,1852,196335,236787,107,3133,5233,236764,2338,564,3914,8595,496,35047,236764,107,2021,10360,1239,5888,1388,236772,75136,2668,236764,107,236777,1006,6185,531,8595,496,50109,107,3133,17554,506,34982,89115,529,1239,2668,236761,107,178491,735,564,15026,236764,9075,1507,13588,236764,107,2292,84908,186780,236764,4758,1416,532,19424,236764,107,2021,1076,1041,10070,93556,532,506,9615,107,902,35514,17554,506,886,2342,506,1032,236787,107,3133,768,6065,19848,577,618,1847,532,1164,107,2205,564,1
006,29110,236764,2416,532,142222,236764,107,2094,1719,1374,93556,12030,577,66529,236789,236753,872,236764,107,11040,496,92710,236764,837,3189,600,756,236823,236789,107,4088,19848,236789,236751,66780,506,111849,2863,577,236761,107,229738,236764,12018,236764,1679,531,1041,12556,236787,1590,107,1829,59377,3952,236761,107,106065,236764,1535,1719,236793,1144,2820,672,24723,9200,107,6372,79425,3324,822,20499,236881,108,5094,20463,138262,236787,107,15989,126164,107,236774,52813,1041,1589,236789,236751,5646,236764,46440,13857,107,2094,4714,531,17308,786,531,506,25822,236761,108,9179,7328,4660,32867,236787,107,41768,1144,4400,236881,108,5094,20463,138262,236787,107,17574,1041,1463,563,9142,236761,108,9179,7328,4660,32867,236787,107,2368,697,236764,1041,29398,236764,600,12866,563,7293,529,23149,236793,107,2209,1374,236764,573,600,236764,8274,822,8081,115641,236787,107,236806,236764,1578,985,914,126164,46440,1070,9703,107,6372,611,2863,577,861,236772,81867,501,236789,236753,528,506,25822,236761,107,4573,1144,236789,236751,506,4217,236764,93556,236881,138,14786,564,1281,236881,108,5094,20463,138262,236787,107,184154,236764,11181,236764,1056,564,1281,236793,573,564,13456,107,2205,3819,564,776,711,236787,840,236764,618,564,740,3449,236764,107,2209,668,899,832,1308,186780,532,19424,236793,107,3133,699,506,4071,236772,809,729,63901,506,6064,667,236761,107,3133,3189,496,58964,4173,1515,600,684,667,107,15989,4186,864,129375,1374,577,236793,107,3133,236764,573,1041,1463,529,9142,12502,607,667,236764,107,1509,5238,528,914,3305,600,564,1006,668,236761,107,9208,236764,618,564,3449,236764,532,1288,1133,23307,618,1239,107,19845,7808,914,1494,1788,531,8274,786,1492,236761,108,9179,7328,4660,32867,236787,107,11355,236764,672,625,563,236764,1056,1758,659,26668,684,3607,236787,107,236789,112728,711,506,9615,600,25017,611,531,506,25822,236787,107,4754,18472,32709,914,6853,236764,93556,236764,756,55188,1304,107,6372,6262,616,1515,531,672,106457,236761,107,31403,625,711,1304,532,600,1535,880,529,24712,236764,107,79688,10013,6914,236764,1116,10070,993,236764,107,6372,1603,1515,5039,8618,89507,531,506,25822,236764,107,4663,52533,672,1861,1719,668,563,5518,236789,236753,236881,107,1882,659,711,6338,236764,93556,236793,692,659,711,6338,236761,108,5094,20463,138262,236787,107,2292,20808,236764,564,1751,993,236789,236751,951,880,563,9584,107,4573,506,26476,236789,236751,152132,532,3446,236772,95936,81319,236751,107,6372,719,26676,1146,10718,903,506,9615,532,45479,852,47179,236761,107,2209,714,12444,711,1144,614,36820,1349,1554,638,107,52130,23823,1013,691,531,1116,573,914,8341,236881,108,9179,7328,4660,32867,236787,107,236814,3464,586,54734,531,1116,92262,107,48727,1041,29398,18782,42387,914,32795,236761,107,236777,236789,859,3442,611,1144,236793,564,1751,625,563,1023,1595,236764,107,2859,692,795,2514,528,10144,607,506,9615,236764,107,2021,577,1116,1758,532,8785,1116,144726,236787,107,818,45288,512,236789,497,128577,39557,532,13442,236764,107,10081,600,1023,10070,21958,236763,236789,236753,1091,14617,33643,236761,107,14219,45478,144847,3290,528,672,94950,236761,108,82859,26073,162556,236787,107,236777,5426,2167,574,822,185518,1800,531,73831,786,236793,107,15989,126164,46440,175858,586,2238,528,5536,107,6372,951,880,2863,735,2147,9232,236764,107,4088,1144,5802,834,1627,236764,607,914,10070,236761,108,9179,7328,4660,32867,236787,107,14986,834,236793,614,236789,236745,5091,822,24712,236764,16390,3146,14694,236764,107,3048,1149,126193,529,1027,3210,692,1879,236787,107,1882,8988,951,103543,236764,880,236787,692,1879,506,9615,107,46
02,21608,532,128061,236764,532,914,29417,26476,107,13086,19847,528,1518,236764,5888,236764,532,711,45288,236793,107,1882,1879,600,47179,236789,236751,6853,46440,496,5497,3998,236764,107,236776,30153,11645,236764,496,7326,1920,7068,236764,496,11332,52614,28166,236793,107,3133,600,506,26476,236789,236751,152132,659,1603,14617,236772,6708,2357,236787,107,3910,1879,611,17536,236881,3199,611,30590,784,672,236881,108,82859,26073,162556,236787,107,3497,672,236764,1041,29398,236764,7564,735,226774,531,776,236761,108,9179,7328,4660,32867,236787,107,236797,7142,531,776,607,98536,47179,236888,564,3442,44543,236764,12339,236764,107,2209,600,139092,214304,607,1116,236764,144443,886,236764,107,114498,1791,668,776,625,63787,236764,7057,236761,108,82859,26073,162556,236787,107,3689,886,236764,1041,29398,236881,108,9179,7328,4660,32867,236787,107,16900,8705,236764,1054,1478,236787,1093,540,35627,47533,786,236881,108,82859,26073,162556,236787,107,236777,5426,2167,574,822,20499,531,73831,786,236764,532,607,514,107,2542,26316,822,9232,607,506,29417,99929,236761,108,5094,20463,138262,236787,107,1882,1281,21820,5536,236764,16390,3146,14694,236764,532,795,41179,236761,108,9179,7328,4660,32867,236787,107,1882,659,506,26476,236789,236751,763,99907,236764,532,1921,41179,236761,107,106065,236764,77821,236787,564,795,31273,506,9615,236793,107,3133,45436,611,795,2736,786,528,236764,107,114498,625,531,2246,6065,19848,236789,236751,39557,12198,236764,107,236777,795,2121,625,531,696,107588,1117,611,236761,107,5191,638,764,236764,672,5268,87270,528,130443,107,193273,786,19276,1082,611,740,14011,236761,108,5094,20463,138262,236787,107,236777,1281,625,50245,1064,13637,529,775,1388,236761,108,9179,7328,4660,32867,236787,107,13086,236764,822,42426,2863,711,577,1440,236793,107,5191,638,764,236764,735,31245,236761,108,5094,20463,138262,236787,107,236777,1921,810,10270,236761,214858,236761,108,9179,7328,4660,32867,236787,107,5988,236764,46104,506,2479,600,35627,145453,770,236789,497,994,236761,107,22575,236764,14529,93556,236888,564,776,2765,44543,834,236764,107,6372,564,795,21546,5039,21820,12556,531,20808,236764,107,2859,20808,795,1769,506,1861,657,1023,4916,236761,107,4573,1015,3952,1590,236881,506,861,236772,177799,236789,236753,89507,236881,108,14983,1393,24222,236787,107,11947,990,529,1719,31273,1041,86372,29398,236888,108,9179,7328,4660,32867,236787,107,2205,1623,31273,1041,1535,29398,18782,42387,236888,107,13086,659,611,8349,531,506,1932,2634,236761,107,3910,46440,822,176712,115752,236789,236753,42426,236881,108,14983,1393,24222,236787,107,3497,31245,236764,29417,29398,236764,618,31872,1921,236787,107,4573,564,2863,3892,236764,1041,29398,236764,531,2583,1091,8863,107,6372,964,506,4400,529,1041,42426,236761,108,9179,7328,4660,32867,236787,107,3771,9370,236764,951,9370,236793,532,834,2863,93556,2311,236793,107,2542,901,600,964,822,22816,659,914,236764,107,3133,735,59359,236789,236753,618,1623,580,1515,618,611,236761,108,14983,1393,24222,236787,107,9474,56346,600,506,45193,1374,577,66529,236789,236753,236764,107,8409,179676,532,38389,2206,34285,657,32795,236761,108,9179,7328,4660,32867,236787,107,3689,4668,19989,236881,108,14983,1393,24222,236787,107,3771,4668,834,4287,19989,618,672,657,2033,236793,107,818,6065,563,216756,236764,7209,532,103409,236764,107,3133,914,30869,9891,1515,2473,1403,236761,108,9179,7328,4660,32867,236787,107,6445,236764,684,14148,6768,236764,672,4668,563,4287,11161,236761,107,236806,236764,668,46440,7953,614,16966,9337,1440,236764,107,3133,1024,58668,27874,914,19833,1589,236787,107,236789,112728,1401,
203691,531,577,3305,3324,236761,107,3689,236764,563,668,528,914,4086,236881,108,14983,1393,24222,236787,107,2209,563,236761,108,9179,7328,4660,32867,236787,107,5988,611,1680,236764,532,564,795,1500,611,236761,107,2209,3914,3892,236764,564,4614,236793,532,1921,711,1778,107,112222,9142,577,3294,236789,236753,607,1868,236772,46081,872,531,20808,236761,107,236777,236789,859,528,236764,531,37773,914,54584,919,531,93556,236764,107,3497,12828,1388,8103,236789,236753,607,214784,12130,236793,107,3133,236764,768,564,3798,711,528,1041,5268,9703,236764,107,1829,59377,46440,711,2264,1719,531,3892,236787,107,24249,3028,236764,3803,1769,6065,19848,531,914,40474,236764,107,3133,5264,506,1902,573,786,531,127710,528,236888,107,2542,1299,564,236789,859,36542,62759,236789,236751,40453,8709,236761,107,3689,3635,564,11807,236789,236753,1116,8705,532,1116,6353,236881,107,818,1676,15163,1595,531,1386,506,26274,574,135734,107,4602,531,3291,1116,8705,532,1116,6353,236787,107,818,837,795,564,236793,711,784,834,1623,573,2765,107,2205,573,2264,6789,3107,9703,236764,107,2292,94663,1116,837,564,1921,5370,31273,236761,107,4573,3819,564,1845,1680,1041,11149,531,2436,236787,107,1829,59377,2036,186300,236793,19848,2036,6176,532,128298,236787,107,4420,901,659,8731,236764,1299,1921,564,1527,1041,22656,236761,108,236798,116655,7790,6136,236787,107,3974,1679,236764,1076,1679,822,42256,3711,236764,107,2859,20488,1149,577,145840,528,496,6899,561,236764,107,96553,564,74951,1029,3889,16965,51138,107,818,143423,3798,529,128061,56518,236761,107,90046,2307,236772,66471,5811,529,496,27437,9615,236888,107,163406,67813,529,506,3155,529,56518,236888,107,178651,4806,1933,101799,529,600,19833,4806,236888,107,3912,625,58587,600,564,1379,33345,21820,25556,236764,107,2021,6899,506,51138,847,529,44125,23589,236764,107,236824,1482,531,21820,19848,236764,531,21820,51907,236789,236753,2369,236764,107,894,6705,236789,236753,684,506,1265,31159,1526,600,1603,1239,40238,236888,107,9520,236764,528,1239,11665,600,1531,12034,21820,1972,236764,107,236777,2637,506,58771,98637,529,1041,6934,6114,236761,107,236780,72008,577,506,1526,600,1603,1239,25367,15629,236888,107,236780,72008,577,506,3710,600,1053,506,3710,531,776,625,236888,107,236780,72008,506,4806,600,1531,672,4806,699,11632,236888,107,9474,3348,1275,47926,1146,754,600,52695,515,35891,236764,107,6372,3590,775,119948,684,506,4355,529,44543,236764,107,55771,564,740,7976,531,1138,616,236764,92664,236764,531,10328,236764,107,3524,1027,114720,92999,236789,236753,3210,600,6176,236888,107,2859,3785,668,735,1919,236764,35655,705,577,625,236764,107,1489,10740,1434,236764,532,143423,6111,531,2214,236764,107,2825,888,37653,532,104324,6084,107,12055,36780,506,46894,5946,657,506,1927,236793,107,3133,600,577,49967,531,914,216185,236888,107,2859,3785,668,735,6853,236764,1531,1116,668,1603,107,236776,57746,684,506,4355,529,1515,107,2205,564,1006,1603,684,1041,6934,29398,532,44543,236888,107,33190,236764,1492,5645,971,907,8109,607,822,27437,3711,236764,107,74220,699,6768,236789,236751,531,577,939,1192,993,236793,107,3133,2036,236764,618,611,659,78567,529,506,3825,236764,107,14151,611,236764,615,2538,564,51138,6065,12297,236789,236751,1360,561,236761,108,9179,7328,4660,32867,236787,107,44289,236764,611,600,10591,506,1360,561,236764,532,1076,625,1679,236761,108,236798,116655,7790,6136,236787,107,3689,2764,117116,10199,1264,872,672,7999,643,236764,107,2021,4721,23011,43455,50898,236881,108,9179,7328,4660,32867,236787,107,104972,2279,236764,1076,1679,506,1360,561,236793,653,236764,684,14148,6768,236764,107,236777,236789,
859,1386,496,1360,561,529,1515,600,864,13520,986,236761,108,105677,130648,236787,107,4754,29398,236764,1975,1063,236764,532,1531,506,89085,1786,236761,108,9179,7328,4660,32867,236787,107,2805,235184,236789,236753,4799,236888,1975,35627,236764,1056,564,4991,236787,107,93006,21820,2796,8218,3715,1082,1041,16489,236764,107,3524,236764,684,14148,6768,236764,564,236789,859,15161,44543,531,1041,3998,236764,107,3133,892,794,3324,44543,236764,200658,236764,573,21820,223639,236761,108,236798,116655,7790,6136,236787,107,3689,236764,776,611,152114,236881,659,611,784,16937,236881,107,2368,527,236764,564,27248,611,711,236793,573,611,659,53243,236764,107,3133,53243,6114,3914,52673,506,40450,236761,107,162752,3012,236764,35627,92131,13933,529,17786,236888,107,178651,1053,540,840,2066,1024,914,53243,2742,236764,107,15989,12556,35627,740,540,711,735,236793,5233,577,8731,236761,108,9179,7328,4660,32867,236787,107,59591,52291,236764,573,21811,236764,577,711,834,1313,540,236761,108,236798,116655,7790,6136,236787,107,236811,8092,40450,236764,573,3803,236789,236751,24273,236764,11632,236764,532,13007,775,711,236793,107,2542,35627,23823,1603,506,5293,7764,21820,17786,236764,107,25831,236789,236753,625,607,177617,73545,532,5268,652,67259,236761,107,2859,35627,14933,531,1927,21820,186422,50898,236764,107,3912,2840,672,3759,529,21820,103173,695,236761,107,236806,236764,43085,236764,1460,236764,1460,236888,6582,12297,236789,236751,40238,107,7084,910,591,787,514,236789,236753,90965,532,84561,228588,236888,107,3508,1974,236764,148664,236764,35627,55267,529,48690,196335,236793,107,2542,756,55188,21820,6219,600,56794,3051,672,4806,107,4663,7445,532,7738,43608,236764,1298,951,4806,218797,236793,107,195297,28869,236764,151695,532,104324,236764,107,26496,10272,672,231973,1346,104324,236761,107,236806,3803,236764,837,672,4806,10716,598,236764,47812,914,4355,236888,107,236806,7764,236764,837,672,4806,6092,236789,540,47812,914,4355,236888,107,75064,20808,607,35275,15161,506,107,149133,497,6582,236764,107,3524,7764,236764,549,2359,1932,5777,532,9039,1515,3823,236764,107,2205,35627,24873,50696,872,672,1535,9615,236789,236751,4806,107,24249,914,17786,236772,230331,236789,236753,3774,46440,180161,3747,236888,108,9179,7328,4660,32867,236787,107,69851,236764,611,1281,951,6366,529,21811,236764,107,24249,57346,1535,573,4287,236764,56274,573,114739,236761,108,236798,116655,7790,6136,236787,107,104972,662,236764,35627,1281,236789,540,951,2621,529,3803,6271,880,236787,107,3771,42239,834,48621,840,10077,1070,6374,529,56346,236761,108,9179,7328,4660,32867,236787,107,4573,564,1281,7293,236764,532,5233,1006,951,42239,236761,108,236798,116655,7790,6136,236787,107,236806,10455,236764,1056,150324,3442,506,9043,236888,108,9179,7328,4660,32867,236787,107,9474,10455,236764,1056,46580,659,834,23186,236761,107,236847,4197,12977,236764,29432,36515,529,496,3875,236764,107,4088,1239,11750,236772,2743,3448,236764,531,2583,786,5264,236764,107,2292,50752,236764,840,531,5591,509,7564,236761,108,236798,116655,7790,6136,236787,107,236847,4197,12977,236764,1096,3456,13482,529,496,880,236764,107,2542,1239,3224,91907,236764,840,531,2583,786,5264,236764,107,2292,50752,236764,531,57391,21820,98933,1265,236761,108,9179,7328,4660,32867,236787,107,62622,497,1082,28166,740,1463,44543,236764,1531,786,735,107,9401,6213,30434,531,32725,7564,236761,108,236798,116655,7790,6136,236787,107,236811,521,1898,1082,3710,740,1751,44543,236764,35627,740,540,1386,107,3771,32725,1873,236764,840,531,13098,208224,883,236761,108,9179,7328,4660,32867,236787,107,2292,1288,53560,236764,56
4,1374,107152,7564,236761,108,236798,116655,7790,6136,236787,107,3133,236764,684,53560,522,236764,1374,540,35627,1975,126297,236793,107,2542,3490,26721,105246,580,208224,883,236764,107,24249,1602,540,149246,51907,3324,3496,236761,108,9179,7328,4660,32867,236787,107,37889,600,564,96895,1091,711,236881,108,236798,116655,7790,6136,236787,107,11355,236764,1299,901,659,711,6582,236787,107,4573,6582,901,659,236764,532,40450,1044,34714,236764,684,44543,236761,108,9179,7328,4660,32867,236787,107,236777,1602,711,11807,822,8705,236761,108,236798,116655,7790,6136,236787,107,11355,236764,1299,668,563,16944,236761,108,9179,7328,4660,32867,236787,107,197615,236764,668,563,6582,236793,532,125947,684,19848,236789,236751,1526,236761,108,236798,116655,7790,6136,236787,107,902,21820,48690,35043,35627,4510,598,236787,14596,30181,5004,107,195297,142568,14784,574,526,24264,528,914,4806,236793,107,818,837,35627,3622,1602,540,28773,2342,1116,16489,236764,107,4573,600,21820,19340,12222,16551,506,1523,236761,108,9179,7328,4660,32867,236787,107,236777,691,100652,684,1116,155834,806,28166,236764,107,7650,15026,910,38792,3324,1041,38792,1933,28470,236761,108,236798,116655,7790,6136,236787,107,178651,50084,100652,684,21820,48804,3666,236761,107,24249,2752,180019,580,12723,613,840,103173,695,236787,107,15562,540,35627,711,11807,672,9615,236881,108,9179,7328,4660,32867,236787,107,236777,7224,12444,236761,108,236798,116655,7790,6136,236787,107,236796,731,7224,786,236764,91071,236881,1299,236764,3803,7224,786,2311,107,178651,1149,540,577,113572,573,600,53154,28869,236888,107,236806,236764,668,691,14617,236764,18790,236764,532,128061,236888,108,9179,7328,4660,32867,236787,107,818,205390,573,506,6065,529,20808,236764,600,46440,1515,236761,108,236798,116655,7790,6136,236787,107,2209,563,528,20808,236764,1298,35627,145453,2752,2229,236761,108,9179,7328,4660,32867,236787,107,6481,1515,7806,786,236764,600,4790,236758,531,5039,1515,541,2853,236793,107,2542,668,691,205390,573,600,1977,1082,7764,236761,108,236798,116655,7790,6136,236787,107,3133,35627,128773,573,1027,1977,840,17786,236761,108,9179,7328,4660,32867,236787,107,10784,236764,886,1977,1663,236764,768,611,795,6899,786,1463,625,236761,108,236798,116655,7790,6136,236787,107,9401,101128,236761,108,9179,7328,4660,32867,236787,107,11069,4086,236772,147083,236761,108,236798,116655,7790,6136,236787,107,236777,236789,859,1884,1146,754,506,18782,1298,35627,4510,598,236888,108,9179,7328,4660,32867,236787,107,4324,795,625,236764,198928,8421,564,7089,607,611,236761,108,236798,116655,7790,6136,236787,107,236777,4614,834,236761,108,9179,7328,4660,32867,236787,107,236777,1281,834,236761,2024,236764,14617,18472,23589,236764,107,2021,5264,672,24437,14087,529,1023,166623,236764,107,3133,3798,12506,1131,496,31878,1996,236764,107,4602,711,506,2146,2364,529,506,56870,19867,107,4088,1239,15748,5451,1713,236764,12297,532,19848,236764,107,2205,27248,1275,618,506,14860,497,236881,108,236798,116655,7790,6136,236787,107,178651,1610,506,4400,236764,532,1346,1055,72008,1763,236761,108,9179,7328,4660,32867,236787,107,11069,9899,691,506,4400,529,600,1763,236793,107,11069,9899,236787,837,1602,112590,786,528,1041,6745,107,2021,37585,506,4355,529,784,506,1902,236764,107,4324,564,2473,3892,886,6468,528,822,9380,147851,236761,108,236798,116655,7790,6136,236787,107,2859,564,3305,600,236764,564,3442,44543,236764,79592,236764,107,9208,26960,1374,6898,600,9899,699,1041,65684,236761,108,9179,7328,4660,32867,236787,107,9208,6114,1451,2752,52673,9380,9899,236789,236751,48897,236793,107,3048,1374,711,518,115159,625,
236764,768,564,15032,684,236787,107,2205,784,506,1902,563,120899,684,506,3768,236764,107,4324,564,684,600,236793,625,563,1041,1719,236764,1041,1972,236761,108,236798,116655,7790,6136,236787,107,6907,3446,512,236789,616,234547,21820,1719,236764,532,4355,21820,1972,236888,108,9179,7328,4660,32867,236787,107,15413,561,711,208224,883,236764,5888,33070,35627,1610,1800,236761,108,236798,116655,7790,6136,236787,107,236777,1093,564,964,236764,531,577,5437,189958,580,44543,236761,108,9179,7328,4660,32867,236787,107,1509,563,496,103445,1346,104324,236764,107,2021,577,5437,189958,580,1515,600,7278,1064,611,236761,108,236798,116655,7790,6136,236787,107,1509,563,496,103445,1164,532,10559,236764,107,2021,577,5437,189958,580,1515,600,96895,1041,8705,236761,108,9179,7328,4660,32867,236787,107,2209,600,17322,765,44543,236764,15924,236764,529,21820,8705,236764,107,15562,625,531,1601,44543,531,496,2480,8705,236761,108,236798,116655,7790,6136,236787,107,15989,2480,139092,711,38893,3324,506,7764,236761,108,9179,7328,4660,32867,236787,107,2209,6176,600,18178,44543,2480,1082,668,1451,236761,108,236798,116655,7790,6136,236787,107,1567,1515,236761,108,9179,7328,4660,32867,236787,107,56003,5451,536,236761,108,236798,116655,7790,6136,236787,107,11355,236764,600,691,668,236761,108,9179,7328,4660,32867,236787,107,818,1265,31159,1463,236764,840,886,529,2480,4135,236761,108,236798,116655,7790,6136,236787,107,10936,563,668,236881,108,9179,7328,4660,32867,236787,107,8291,236761,107,11355,24873,35627,75789,657,786,236881,108,236798,116655,7790,6136,236787,107,38786,625,964,53243,23572,236764,573,21820,24273,236888,108,9179,7328,4660,32867,236787,107,32631,3588,23572,699,834,9380,496,1977,236761,108,236798,116655,7790,6136,236787,107,32631,15410,23572,580,496,42856,1898,171724,236761,107,3949,529,1041,14186,236888,35627,24873,14006,1041,6114,236761,108,9179,7328,4660,32867,236787,107,1214,688,6114,236764,9380,15924,236764,735,22458,10701,236761,108,236798,116655,7790,6136,236787,107,38786,901,964,2280,26082,2357,236764,531,15161,44543,6582,236888,108,9179,7328,4660,32867,236787,107,236777,1093,901,964,236764,600,564,2473,1778,657,3622,236793,107,2542,1492,901,11807,786,607,496,4882,4355,236761,107,28587,6114,529,162531,699,10701,735,9710,9551,24947,236764,107,3138,15608,910,6084,607,4762,529,106221,17221,236787,107,9208,6114,600,2752,28516,130864,1275,27299,236764,107,3771,236764,1056,1041,6353,3773,532,19848,165246,236764,107,2021,6899,506,510,785,806,229308,600,218827,1603,107,4420,2764,236772,66545,83517,42058,914,26114,657,1515,236793,107,31777,1056,21820,135162,985,6353,236764,1133,496,1919,236764,107,236774,947,506,11019,3925,529,1041,6353,236789,236751,4355,236764,107,3133,12571,2782,1603,23493,531,32702,532,137531,236764,107,6372,784,506,1975,616,236772,2003,1053,11474,910,65684,107,17729,7146,4086,1316,236789,236753,607,6927,236787,528,600,11019,990,107,4754,163078,6114,1602,121391,614,36820,27299,236793,107,3133,1144,1239,179452,1451,711,76849,163158,236764,107,195297,9899,46440,236764,532,1603,1091,18261,607,118508,236761,107,236777,2752,51126,531,4389,6271,13550,236793,107,4754,28166,1451,2752,3449,9380,58863,3658,236793,107,4573,1492,21820,9899,563,6195,1041,11317,236764,107,4754,11307,3710,664,507,236764,532,62571,1041,28166,531,8988,236761,107,189813,711,21820,24236,1288,121391,236764,573,901,964,1603,107,2542,63462,236764,15924,236764,711,573,1288,51785,236761,107,2859,21820,47812,1275,3710,3914,49233,236764,107,9520,236764,1590,564,39002,44543,672,14658,236772,153130,26114,236793,107,24249,768,35627,5091,531
,16239,528,672,1847,147851,236761,107,3133,1531,506,12556,12034,600,77270,594,44543,236764,107,236777,6267,625,41718,531,506,35514,15901,236764,107,3133,154320,2829,506,4355,3324,1041,21980,236761,107,197615,236764,776,711,23493,236793,573,564,1602,11807,6065,12297,236764,107,4573,756,236745,9849,21820,9899,600,100652,786,236761,107,197615,236764,1492,21100,236793,756,236745,9849,564,600,579,6705,236789,236753,3184,19848,236764,107,4573,756,236745,9849,21820,67110,3392,600,1076,786,580,236761,107,13751,872,506,26114,1570,236764,653,1769,872,786,236761,108,236798,116655,7790,6136,236787,107,2023,1117,236764,864,18847,497,236787,3635,564,7976,21820,4355,236764,107,236777,795,711,577,506,14860,497,236761,108,9179,7328,4660,32867,236787,107,11407,13700,786,11807,7564,236764,532,564,795,776,625,236761,108,236798,116655,7790,6136,236787,107,236777,735,3016,236761,108,9179,7328,4660,32867,236787,107,236774,1974,236764,600,691,528,21820,49064,236787,107,130171,625,1570,236764,532,236764,1581,607,506,3658,236764,107,6372,1526,236764,837,236764,573,21820,2765,236764,1602,11807,21820,2765,236764,107,172702,236764,573,21820,2765,236764,11807,496,2793,719,4910,2765,236793,107,2021,1800,910,19867,35627,145453,577,2802,835,236761,108,236798,116655,7790,6136,236787,107,236777,1093,564,7261,21820,3710,236761,108,9179,7328,4660,32867,236787,107,236789,112728,31572,528,1041,28166,236761,108,236798,116655,7790,6136,236787,107,236777,9891,786,1800,659,2416,236761,108,9179,7328,4660,32867,236787,107,11407,2752,880,691,1847,236761,108,236798,116655,7790,6136,236787,107,13086,236764,1388,236764,2247,872,822,26114,236761,108,9179,7328,4660,32867,236787,107,37889,236764,1299,236764,1041,8118,563,1603,236761,108,236798,116655,7790,6136,236787,107,6372,2863,611,1281,74026,236761,108,9179,7328,4660,32867,236787,107,4573,2863,564,3892,528,4614,236881,108,236798,116655,7790,6136,236787,107,3243,1758,236764,564,4614,236764,3892,834,236761,108,9179,7328,4660,32867,236787,107,236847,4197,12977,531,8785,672,7610,236761,108,236798,116655,7790,6136,236787,107,2021,1769,563,711,531,2583,236761,108,9179,7328,4660,32867,236787,107,13908,236764,1217,672,7610,33084,1064,15599,236761,107,14986,834,21820,16489,696,11437,1064,1041,6934,3710,236793,107,103664,1800,529,1091,236764,573,1800,529,1091,659,162531,236761,107,3133,768,21820,6934,23011,1349,1554,638,1149,107,4573,2829,886,10144,657,21820,86372,1526,236764,107,178651,24873,9128,914,20121,573,3785,236761,108,236798,116655,7790,6136,236787,107,3689,563,625,236881,108,9179,7328,4660,32867,236787,107,6372,625,1093,5091,44543,5264,1239,11019,11803,107,2021,1515,600,46440,919,4400,531,577,496,37660,1250,236764,107,3133,41909,11790,531,98905,15444,236793,107,10936,236764,1308,564,735,151450,939,236750,236789,236753,107,3834,971,907,8109,72765,672,29417,9615,236764,107,3133,11474,914,23674,607,1041,58163,638,24947,236764,107,236777,795,607,784,104880,11133,1460,611,236787,107,2542,7584,11908,7483,236761,564,5426,2167,574,611,236764,107,67166,786,672,110919,236761,108,236798,116655,7790,6136,236787,107,3497,784,1041,3710,236793,532,1623,625,96923,786,2311,236764,107,2021,1460,611,659,3291,834,116730,533,236761,107,236774,852,535,532,36038,236764,817,3008,607,786,236761,108,9179,7328,4660,32867,236787,107,73319,786,77821,236761,108,236798,116655,7790,6136,106,107,105,4368,107 \ No newline at end of file diff --git a/.sisyphus/notes/gemma4-baseline/prompts/long_code_50k.meta b/.sisyphus/notes/gemma4-baseline/prompts/long_code_50k.meta new file mode 100644 index 00000000..956e1d16 --- 
/dev/null +++ b/.sisyphus/notes/gemma4-baseline/prompts/long_code_50k.meta @@ -0,0 +1,6 @@
+tokenizer: google/gemma-3-27b-it
+chat_template: yes (user: read+summarize this Python module; assistant turn open)
+source: HumanEval+ tasks concatenated as one Python module
+token_count: 50002
+first_20: [105, 2364, 107, 818, 2269, 563, 496, 17856, 9173, 7906, 1551, 1944, 14955, 5151, 607, 910, 5434, 236761, 8847, 1091]
+last_5: [106, 107, 105, 4368, 107]
diff --git a/.sisyphus/notes/gemma4-baseline/prompts/long_code_50k.txt b/.sisyphus/notes/gemma4-baseline/prompts/long_code_50k.txt new file mode 100644 index 00000000..248d1678 --- /dev/null +++ b/.sisyphus/notes/gemma4-baseline/prompts/long_code_50k.txt @@ -0,0 +1 @@
+105,2364,107,818,2269,563,496,17856,9173,7906,1551,1944,14955,5151,607,910,5434,236761,8847,1091,13058,236764,1299,4903,496,3161,63510,12323,22454,506,3364,3393,3117,236764,1292,236772,30124,9935,236764,10944,6484,2383,53359,236764,532,1027,9935,611,6303,3418,506,5151,236761,108,2717,6719,107,2543,28751,1419,4361,109,2063,815,236779,5977,236779,31493,236769,34488,236787,4361,236840,8344,1604,14272,236787,6803,236768,3921,7014,236787,107,140,12234,7179,768,528,2238,1694,529,4945,236764,659,1027,1156,4945,12532,531,1546,1032,1082,107,140,45163,14272,236761,107,140,22539,815,236779,5977,236779,31493,5551,236770,236761,236771,236764,236743,236778,236761,236771,236764,236743,236800,236761,236771,1604,236743,236771,236761,236810,236768,107,140,9277,107,140,22539,815,236779,5977,236779,31493,5551,236770,236761,236771,236764,236743,236778,236761,236828,236764,236743,236800,236761,236771,236764,236743,236812,236761,236771,236764,236743,236810,236761,236771,236764,236743,236778,236761,236771,1604,236743,236771,236761,236800,236768,107,140,4339,107,140,12234,109,140,40835,236779,34488,578,19372,236769,34488,236768,107,140,1708,858,528,2644,236769,3469,236769,40835,236779,34488,236768,753,236743,236770,1473,107,144,584,19372,236779,34488,236840,236747,900,236743,236770,236842,753,19372,236779,34488,236840,236747,236842,655,14272,236787,107,148,2060,6288,107,140,2060,8450,108,2543,28751,1419,4361,109,2063,7732,236779,18919,236779,19243,236769,18919,236779,2383,236787,1540,236768,3921,4361,236840,1714,9414,107,140,12234,13065,531,672,1292,563,496,2483,7906,5065,4402,529,43927,62334,236761,5180,5671,563,531,107,140,129510,1724,2299,1131,7732,16587,532,994,506,1694,529,1724,236761,107,140,141104,4402,659,20433,568,17136,1932,24088,563,9668,6535,236768,532,711,43927,2351,1546,1032,107,140,34409,1027,9952,528,506,2744,2483,236761,107,140,22539,7732,236779,18919,236779,19243,175932,1732,5960,28909,5960,136622,28909,1606,107,140,1922,825,963,93501,3507,963,756,11292,3507,2000,107,140,12234,109,140,26297,236764,2299,236764,2536,578,236743,236771,236764,15437,2977,107,140,1708,677,528,53679,236779,2383,236787,107,144,584,677,1251,623,61835,38302,3323,236743,236770,107,144,584,677,1251,15825,1083,38302,14599,236743,236770,107,144,584,677,2843,623,5563,2299,3323,677,107,144,584,38302,1251,236743,236771,236787,107,148,584,2299,2843,86679,2536,236761,3770,236769,4043,236768,107,148,4043,578,3679,107,140,2060,2536,110,2063,102267,236779,5640,236769,5640,236787,6803,236768,3921,6803,236787,107,140,12234,17770,496,4414,18224,1523,1548,236764,625,740,577,81153,1131,107,140,624,11995,912,568,65020,11995,7100,1082,2238,1548,236768,532,70208,107,140,236769,989,1749,912,2462,7100,1082,236743,236770,769,108,140,13293,506,20632,912,529,506,1548,236761,107,140,22539,102267,236779,5640,236769,236800,236761,236810,23676
8,107,140,236771,236761,236810,107,140,12234,109,140,2060,1548,753,801,236769,5640,236768,108,2543,28751,1419,4361,109,2063,3426,236779,13321,236769,68382,236787,4361,236840,720,2812,3921,7014,236787,107,140,12234,1599,236789,500,2238,496,1694,529,14664,532,34549,6675,580,496,4856,2881,600,9857,607,107,140,13321,7002,236761,5180,4209,563,531,6440,768,657,1027,1523,506,7002,529,2881,3798,5629,3426,5743,236764,532,107,140,502,600,1523,1292,1374,994,6288,236761,27587,625,1374,994,8450,236761,107,140,22539,3426,236779,13321,5551,236770,236764,236743,236778,236764,236743,236800,2812,107,140,9277,107,140,22539,3426,236779,13321,5551,236770,236764,236743,236778,236764,753,236812,236764,236743,236810,2812,107,140,4339,107,140,12234,108,140,11622,578,236743,236771,107,140,1708,5585,528,6675,236787,107,144,11622,3323,5585,107,144,584,2881,655,236743,236771,236787,107,148,2060,6288,107,140,2060,8450,108,2543,28751,1419,4361,109,2063,2689,236779,35233,236779,95253,236769,34488,236787,4361,236840,8344,2812,3921,6803,236787,107,140,12234,1701,496,2238,1694,529,2744,4945,236764,9279,36673,72493,164069,107,140,24616,506,2689,529,672,15297,236761,107,140,44389,72493,164069,563,506,4398,10298,4954,1534,1546,107,140,7011,532,496,3988,3947,568,10520,528,672,1624,1473,107,140,102892,578,4398,1109,1123,753,1123,236779,10520,1109,107,140,22539,2689,236779,35233,236779,95253,5551,236770,236761,236771,236764,236743,236778,236761,236771,236764,236743,236800,236761,236771,236764,236743,236812,236761,236771,2812,107,140,236770,236761,236771,107,140,12234,108,140,10520,578,2324,236769,34488,236768,965,5980,236769,34488,236768,107,140,2060,2324,236769,9082,236769,236781,753,2689,236768,573,1123,528,4945,236768,965,5980,236769,34488,236768,108,2543,28751,1419,4361,109,2063,939,236751,85432,236769,34488,236787,4361,236840,720,1604,1162,20146,236787,801,236768,3921,4361,236840,720,9414,107,140,12234,39606,496,1548,756,7796,20146,236789,1534,1418,1156,22592,4820,529,2744,1694,2165,34488,236789,107,140,22539,939,236751,85432,142976,236743,236812,236768,107,140,3805,107,140,22539,939,236751,85432,5551,236770,236764,236743,236778,236764,236743,236800,1604,236743,236812,236768,107,140,236840,236770,236764,236743,236812,236764,236743,236778,236764,236743,236812,236764,236743,236800,236842,107,140,12234,109,140,619,578,2977,107,140,1708,858,528,2644,236769,3469,236769,34488,16644,107,144,619,236761,3770,236769,34488,236840,236747,2812,107,144,584,858,2843,5980,236769,34488,236768,753,236743,236770,236787,107,148,619,236761,3770,236769,7796,20146,236768,107,140,2060,766,108,2543,28751,1419,4361,109,2063,11299,236779,103895,236779,16611,3852,236769,18919,236779,2383,236787,1540,236768,3921,4361,236840,720,9414,107,140,12234,13065,531,672,1292,563,496,2483,10725,5065,4402,573,43927,62334,15914,684,9952,236761,107,140,2542,1546,529,506,2299,236764,3938,506,58825,1984,529,64597,529,62334,236761,107,140,236788,236761,236759,236761,231998,3507,815,5783,1156,4535,529,64597,1651,5960,21957,815,1806,236761,108,140,22539,11299,236779,103895,236779,16611,3852,1033,11292,3507,5960,21957,4157,5960,3507,825,3507,1606,107,140,236840,236778,236764,236743,236800,236764,236743,236770,236764,236743,236800,236842,107,140,12234,108,140,107,140,2063,1527,236779,15104,236769,236751,236787,1540,236768,3921,801,236787,107,144,2074,236779,15104,236764,38302,578,236743,236771,236764,236743,236771,107,144,1708,677,528,503,236787,107,148,584,677,1251,623,61835,38302,3323,236743,236770,107,148,584,677,1251,15825,1083,38302,14599,236743,236770,107,148,2074,2367
79,15104,578,2631,236769,2074,236779,15104,236764,38302,236768,107,144,2060,2631,236779,15104,107,140,107,140,2060,870,2861,236779,15104,236769,236751,236768,573,503,528,53679,236779,2383,236761,6966,885,15825,768,503,2843,623,1935,108,2543,28751,1419,4361,109,2063,5957,236779,2003,236779,26967,236769,29139,236787,4361,236840,1714,1604,85440,236787,1540,236768,3921,4361,236840,1714,9414,107,140,12234,28293,614,2744,1694,529,16587,1186,573,5906,600,3014,2238,85440,107,140,22539,5957,236779,2003,236779,26967,142976,756,236746,1606,107,140,3805,107,140,22539,5957,236779,2003,236779,26967,20768,28180,963,756,53896,236753,963,756,236755,893,963,756,2513,7367,756,236746,1606,107,140,1922,28180,963,756,53896,236753,963,756,2513,2000,107,140,12234,108,140,2060,1694,236769,7212,236769,3485,503,236787,85440,528,503,236764,16587,1223,108,2543,28751,1419,4361,236764,76272,109,2063,2324,236779,5930,236769,34488,236787,4361,236840,720,2812,3921,76272,236840,720,236764,801,9414,107,140,12234,1701,496,2238,1694,529,25630,236764,994,496,33228,17520,529,496,2324,532,496,1698,529,784,506,25630,528,496,1694,236761,107,140,11447,2324,1374,577,4745,531,236743,236771,532,7738,1698,1374,577,4745,531,236743,236770,236761,107,140,22539,2324,236779,5930,67713,107,140,236769,236771,236764,236743,236770,236768,107,140,22539,2324,236779,5930,5551,236770,236764,236743,236778,236764,236743,236800,236764,236743,236812,2812,107,140,236769,236770,236771,236764,236743,236778,236812,236768,107,140,12234,108,140,236751,236764,510,578,236743,236771,236764,236743,236770,107,140,1708,1548,528,4945,236787,107,144,236751,3323,1548,107,144,236758,32162,1548,107,140,2060,503,236764,510,108,2543,28751,1419,4361,236764,76272,109,2063,19519,236779,2074,236769,34488,236787,4361,236840,720,2812,3921,4361,236840,720,9414,107,140,12234,4934,496,2238,1694,529,25630,236764,8729,496,1694,529,19519,5783,3408,1765,3097,2238,3479,107,140,495,506,7501,236761,107,140,22539,19519,236779,2074,5551,236770,236764,236743,236778,236764,236743,236800,236764,236743,236778,236764,236743,236800,236764,236743,236812,236764,236743,236778,2812,107,140,236840,236770,236764,236743,236778,236764,236743,236800,236764,236743,236800,236764,236743,236800,236764,236743,236812,236764,236743,236812,236842,107,140,12234,108,140,2060,870,2074,236769,34488,8497,236769,236747,236862,236770,44538,573,858,528,2644,236769,3469,236769,34488,50796,110,2063,563,236779,227147,236769,2383,236787,1540,236768,3921,7014,236787,107,140,12234,5803,768,2238,2483,563,496,142193,5240,107,140,2060,2483,1251,2483,186487,236770,236842,109,2063,1386,236779,227147,236769,2383,236787,1540,236768,3921,1540,236787,107,140,12234,9100,506,48037,142193,600,12502,607,496,17686,2483,236761,107,140,45835,4317,563,3606,236787,107,140,236772,9100,506,27801,96805,529,17686,2483,600,563,496,142193,236761,107,140,236772,81806,531,506,1345,529,506,2483,14416,529,496,2483,24905,600,3952,1680,506,180522,660,525,42636,236761,107,140,22539,1386,236779,227147,68560,107,140,10440,107,140,22539,1386,236779,227147,1033,9307,1606,107,140,236789,9307,552,236789,107,140,22539,1386,236779,227147,1033,213822,1606,107,140,236789,9307,552,236789,107,140,12234,108,140,584,563,236779,227147,236769,2383,1473,107,144,2060,2483,107,140,1708,858,528,2644,236769,3469,236769,2383,16644,107,144,584,563,236779,227147,236769,2383,236840,236747,9218,1473,107,148,2060,2483,900,2483,236840,236747,236772,236770,59396,236770,236842,108,2543,28751,1419,4361,109,2063,2483,236779,97251,236769,236746,236787,1540,236764,518,236787,1540,236768,392
1,1540,236787,107,140,12234,13065,659,1156,16587,496,532,518,17520,1186,529,236743,236770,236751,532,236743,236771,236751,236761,107,140,54950,14820,146592,580,1239,9103,532,994,1354,992,618,496,2483,236761,107,140,22539,2483,236779,97251,1033,236771,236770,236771,963,756,236770,236770,236771,1606,107,140,236789,236770,236771,236771,236789,107,140,12234,109,140,2060,116740,7013,236769,1714,236769,720,236769,236746,236840,236747,2812,8201,801,236769,236763,236840,236747,14430,573,858,528,2644,236769,3469,236769,236746,9670,108,2543,28751,1419,4361,236764,26272,109,2063,27801,236769,29139,236787,4361,236840,1714,2812,3921,26272,236840,1714,9414,107,140,12234,5641,529,1694,529,16587,236764,994,506,27801,886,236761,9657,506,1171,886,528,1624,529,5065,107,140,29139,529,506,1638,3861,236761,9657,5450,528,1624,506,2744,1694,563,7738,236761,107,140,22539,27801,67713,108,140,22539,27801,20768,236746,963,756,236763,963,756,236755,10190,107,140,236789,236746,236789,107,140,22539,27801,20768,236746,963,756,9579,963,756,6450,10190,107,140,236789,6450,236789,107,140,12234,108,140,584,711,16587,236787,107,144,2060,5450,108,140,214676,578,2631,236769,3469,236769,236781,236768,573,1123,528,16587,236768,107,140,1708,503,528,16587,236787,107,144,584,5980,236769,236751,236768,1251,2631,3469,236787,107,148,2060,503,110,2063,11333,236779,11147,236779,76380,236769,236746,236787,801,236764,518,236787,801,236768,3921,801,236787,107,140,12234,9657,496,11333,3364,50289,529,1156,25630,496,532,518,107,140,22539,11333,236779,11147,236779,76380,236769,236800,236764,236743,236810,236768,107,140,236770,107,140,22539,11333,236779,11147,236779,76380,236769,236778,236810,236764,236743,236770,236810,236768,107,140,236810,107,140,12234,109,140,2063,7609,236779,109359,236769,236746,236787,801,236764,518,236787,801,236768,3921,801,236787,107,144,2060,496,768,518,1251,236743,236771,1663,7609,236779,109359,236769,236763,236764,496,2144,518,236768,107,140,2060,7609,236779,109359,236769,236746,236764,518,236768,108,2543,28751,1419,4361,109,2063,784,236779,134517,236769,2383,236787,1540,236768,3921,4361,236840,1714,9414,107,140,12234,9657,1694,529,784,150537,699,48037,531,27801,529,506,2744,2483,107,140,22539,784,236779,134517,1033,28180,1606,107,140,1922,236746,963,756,596,963,756,28180,2000,107,140,12234,108,140,2060,870,2383,8497,236769,236747,900,236743,236770,7066,573,858,528,2644,236769,3469,236769,2383,50796,110,2063,2483,236779,25425,236769,236749,236787,801,236768,3921,1540,236787,107,140,12234,9657,496,2483,7906,2557,236772,166194,1552,4945,6250,699,236743,236771,102844,538,23722,236761,107,140,22539,2483,236779,25425,236769,236771,236768,107,140,236789,236771,236789,107,140,22539,2483,236779,25425,236769,236810,236768,107,140,236789,236771,236743,236770,236743,236778,236743,236800,236743,236812,236743,236810,236789,107,140,12234,109,140,2060,623,16150,7013,236769,3275,236769,1714,236764,2644,236769,236749,900,236743,236770,9670,110,2063,1527,236779,101282,236779,71271,236769,2383,236787,1540,236768,3921,801,236787,107,140,12234,17770,496,2483,236764,1586,855,1217,1551,9245,7579,568,170492,529,1624,236768,1677,625,4551,529,107,140,22539,1527,236779,101282,236779,71271,1033,56660,54283,1606,107,140,236800,107,140,22539,1527,236779,101282,236779,71271,1033,131602,1606,107,140,236812,107,140,12234,108,140,2060,5980,236769,1025,236769,2383,236761,11462,21957,108,2543,28751,1419,4361,109,2063,11299,236779,26873,236769,26873,236779,2383,236787,1540,236768,3921,4361,236840,720,9414,107,140,12234,13065,531,672,1292,563,496,2483,1385
5,13906,8687,528,496,2803,88724,6518,236761,107,140,11069,4209,563,531,11299,672,2483,532,994,1694,529,25630,7041,531,1217,1551,39161,1677,1546,107,140,2217,1774,236761,108,140,8291,563,496,15287,236787,107,140,236789,236748,236789,753,3697,5433,236764,44079,2390,39161,107,140,236789,236748,236909,236789,753,3746,5433,236764,44079,1156,39161,107,140,6748,236909,236789,753,690,918,5433,236764,44079,886,12222,108,140,22539,11299,236779,26873,1033,236748,512,236909,783,236909,512,236909,512,236909,783,236909,783,236909,783,236909,783,236909,512,512,1606,107,140,236840,236812,236764,236743,236778,236764,236743,236770,236764,236743,236778,236764,236743,236778,236764,236743,236770,236764,236743,236770,236764,236743,236770,236764,236743,236770,236764,236743,236812,236764,236743,236812,236842,107,140,12234,109,140,2063,1527,236779,126374,236769,14210,236787,1540,236768,3921,801,236787,107,144,584,5433,1251,623,236748,1083,994,236743,236812,107,144,36208,5433,1251,623,236748,236909,1083,994,236743,236778,107,144,36208,5433,1251,16150,236909,1083,994,236743,236770,107,140,107,140,584,4252,236779,2383,1251,86679,994,2977,107,140,2060,1694,236769,3275,236769,2861,236779,126374,236764,4252,236779,2383,236761,6966,885,623,9670,110,2063,1217,236779,34717,236779,3841,236769,2383,236787,1540,236764,85440,236787,1540,236768,3921,801,236787,107,140,12234,9100,1217,1551,2782,496,2238,85440,740,577,1765,528,506,3303,2483,236761,4308,17352,27279,3636,236761,107,140,22539,1217,236779,34717,236779,3841,95780,756,236746,1606,107,140,236771,107,140,22539,1217,236779,34717,236779,3841,1033,72004,963,756,236746,1606,107,140,236800,107,140,22539,1217,236779,34717,236779,3841,1033,50354,963,756,9236,1606,107,140,236800,107,140,12234,109,140,35775,16620,1426,578,236743,236771,107,140,1708,858,528,2644,236769,3469,236769,2383,16644,107,144,584,2483,236840,236747,191366,52740,236769,26967,1473,107,148,35775,16620,1426,3323,236743,236770,107,140,2060,2366,16620,1426,108,2543,28751,1419,4361,109,2063,4260,236779,34488,236769,34488,236787,1540,236768,3921,1540,236787,107,140,12234,13065,563,496,2557,236772,166194,1552,2483,529,1548,1294,699,756,13321,236789,531,756,52908,6748,107,140,13280,12871,659,756,13321,963,756,811,963,756,13498,963,756,19891,963,756,19025,963,756,21716,963,756,34699,963,756,47526,963,756,44622,236789,532,756,52908,6748,107,140,13293,506,2483,607,4945,19372,699,21548,531,7488,107,140,22539,4260,236779,34488,1033,19891,886,3493,1606,107,140,236789,811,1806,3493,236789,107,140,12234,108,140,107,140,1071,236779,720,578,16923,13321,2632,236743,236771,236764,756,811,2632,236743,236770,236764,756,13498,2632,236743,236778,236764,756,19891,2632,236743,236800,236764,756,19025,2632,236743,236812,236764,756,21716,2632,236743,236810,236764,756,34699,2632,236743,236825,236764,756,47526,2632,236743,236832,236764,756,44622,2632,236743,236828,236764,756,52908,2632,236743,236819,236783,108,140,584,4945,1251,86679,994,3679,107,140,2060,623,16150,7013,236769,40835,236769,34488,236761,6966,885,142737,2307,236784,3485,538,236787,531,236779,720,236840,236749,14430,108,2543,28751,1419,4361,236764,76272,109,2063,1586,236779,69344,236779,31493,236769,34488,236787,4361,236840,8344,2812,3921,76272,236840,8344,236764,6803,9414,107,140,12234,4934,496,17686,1694,529,4945,568,1340,3861,657,3198,1156,236768,4864,532,994,1156,600,659,506,24119,531,1546,107,140,1538,532,994,1091,528,1900,568,146772,1548,236764,6268,1548,769,107,140,22539,1586,236779,69344,236779,31493,5551,236770,236761,236771,236764,236743,236778,236761,236771,236764,2
36743,236800,236761,236771,236764,236743,236812,236761,236771,236764,236743,236810,236761,236771,236764,236743,236778,236761,236778,2812,107,140,236769,236778,236761,236771,236764,236743,236778,236761,236778,236768,107,140,22539,1586,236779,69344,236779,31493,5551,236770,236761,236771,236764,236743,236778,236761,236771,236764,236743,236800,236761,236771,236764,236743,236812,236761,236771,236764,236743,236810,236761,236771,236764,236743,236778,236761,236771,2812,107,140,236769,236778,236761,236771,236764,236743,236778,236761,236771,236768,107,140,12234,109,140,34488,236761,10479,825,107,140,1387,236779,16417,578,6803,885,10281,1373,107,140,1387,236779,16754,578,5450,107,140,1708,537,236764,637,528,20058,236769,34488,74852,236770,1604,4945,236840,236770,9218,1473,107,144,16417,578,637,753,537,107,144,584,2675,655,1322,236779,16417,236787,107,148,1387,236779,16417,578,2675,107,148,1387,236779,16754,578,568,236752,236764,637,236768,107,140,2060,1322,236779,16754,108,2543,28751,1419,4361,109,2063,15974,1203,236779,1071,236779,6805,236769,34488,236787,4361,236840,8344,2812,3921,4361,236840,8344,9414,107,140,12234,17770,1694,529,4945,568,1340,657,3198,1156,4820,779,5510,496,6373,4959,531,600,1694,236764,107,140,17887,600,506,21548,1548,795,3291,236743,236771,532,506,7488,795,3291,236743,236770,107,140,22539,15974,1203,236779,1071,236779,6805,5551,236770,236761,236771,236764,236743,236778,236761,236771,236764,236743,236800,236761,236771,236764,236743,236812,236761,236771,236764,236743,236810,236761,236771,2812,107,140,236840,236771,236761,236771,236764,236743,236771,236761,236778,236810,236764,236743,236771,236761,236810,236764,236743,236771,236761,236832,236810,236764,236743,236770,236761,236771,236842,107,140,12234,109,140,841,236764,3628,578,2631,236769,34488,779,1322,236769,34488,236768,107,140,236767,578,236743,236770,965,568,841,753,3628,236768,107,140,2060,1694,236769,3275,236769,3485,1123,236787,568,236781,753,3628,236768,808,620,236764,4945,1223,108,2543,28751,1419,4361,236764,7129,109,2063,5957,236779,16469,9964,236769,7558,236787,4361,236840,10880,2812,3921,4361,236840,720,9414,107,140,12234,28293,2238,1694,529,1027,23181,2979,1186,573,25630,107,140,22539,5957,236779,16469,9964,20768,236746,963,236743,236800,236761,236770,236812,236764,236743,236810,2812,107,140,236840,236810,236842,107,140,22539,5957,236779,16469,9964,5551,236770,236764,236743,236778,236764,236743,236800,236764,756,28180,963,31763,2977,2812,107,140,236840,236770,236764,236743,236778,236764,236743,236800,236842,107,140,12234,108,140,2060,1694,236769,7212,236769,3485,1123,236787,1722,236769,236781,236768,1251,801,236764,2979,1223,110,2063,50839,236769,2383,236787,1540,236768,3921,801,236787,107,140,12234,9657,3861,529,2238,2483,107,140,22539,50839,68560,107,140,236771,107,140,22539,50839,1033,28180,1606,107,140,236800,107,140,12234,108,140,2060,5980,236769,2383,236768,110,2063,7488,236779,76380,236769,236749,236787,801,236768,3921,801,236787,107,140,12234,1701,496,2238,1548,538,236764,1586,506,7488,1548,600,59529,538,41923,236764,7100,1082,538,107,140,22539,7488,236779,76380,236769,236770,236810,236768,107,140,236810,107,140,12234,109,140,1708,858,528,2644,236769,236778,236764,538,1473,107,144,584,538,2144,858,1251,236743,236771,236787,994,538,973,858,107,140,2060,236743,236770,108,2543,28751,1419,4361,109,2063,5415,969,236769,236749,236787,801,236768,3921,4361,236840,720,9414,107,140,12234,9657,1694,529,8355,5872,529,2238,11995,528,506,1900,699,21548,531,7488,236761,107,140,7795,529,506,5872,1374,577,9456,1548,529,2782,7
041,531,1217,1551,2782,625,5092,4994,528,82189,236761,107,140,4661,1548,1374,577,4745,531,506,1698,529,784,5872,107,140,22539,5415,969,236769,236828,236768,107,140,236840,236778,236764,236743,236778,236764,236743,236778,236842,107,140,22539,5415,969,236769,236778,236810,236768,107,140,236840,236810,236764,236743,236810,236842,107,140,22539,5415,969,236769,236832,236771,236768,107,140,236840,236778,236764,236743,236810,236764,236743,236832,236842,107,140,12234,108,140,1106,6596,107,140,18377,578,2977,107,140,236747,578,236743,236778,107,140,6858,858,6605,801,236769,747,236761,4784,236769,236749,236768,900,236743,236770,1473,107,144,584,538,2144,858,1251,236743,236771,236787,107,148,18377,236761,3770,236769,236747,236768,107,148,236749,973,236784,858,107,144,4454,236787,107,148,236747,3323,236743,236770,108,140,584,538,1890,236743,236770,236787,107,144,18377,236761,3770,236769,236749,236768,107,140,2060,1707,108,2543,28751,1419,4361,109,2063,6349,236779,179826,236769,34488,236787,4361,236840,720,2812,3921,4361,236840,720,9414,107,140,12234,4934,496,1694,529,25630,236764,6349,784,4820,600,4583,919,1082,3622,236761,107,140,27252,1900,529,4820,2378,506,1638,618,528,506,2744,236761,107,140,22539,6349,236779,179826,5551,236770,236764,236743,236778,236764,236743,236800,236764,236743,236778,236764,236743,236812,2812,107,140,236840,236770,236764,236743,236800,236764,236743,236812,236842,107,140,12234,108,140,3744,236779,26297,578,12739,825,107,140,1708,1548,528,4945,236787,107,144,584,1548,711,528,1152,236779,26297,236787,107,148,3744,236779,26297,236840,5640,236842,578,236743,236771,107,144,3744,236779,26297,236840,5640,236842,3323,236743,236770,107,140,2060,870,5640,573,1548,528,4945,768,1152,236779,26297,236840,5640,236842,1251,236743,236770,236842,110,2063,23510,236779,4925,236769,2383,236787,1540,236768,3921,1540,236787,107,140,12234,1701,496,2238,2483,236764,23510,67505,7579,531,46451,532,46451,531,67505,236761,107,140,22539,23510,236779,4925,1033,9259,1606,107,140,236789,236754,45340,236806,236789,107,140,12234,108,140,2060,116740,7013,236769,3275,236769,3485,1123,236787,1123,236761,33460,4925,3800,2483,1223,108,2543,28751,1419,4361,109,2063,168984,236769,29139,236787,4361,236840,1714,2812,3921,1540,236787,107,140,12234,36361,54680,1694,529,16587,1131,496,3161,2483,107,140,22539,168984,67713,107,140,10440,107,140,22539,168984,20768,236746,963,756,236763,963,756,236755,10190,107,140,236789,28180,236789,107,140,12234,108,140,2060,116740,7013,236769,29139,236768,108,2543,28751,1419,4361,109,2063,5957,236779,2003,236779,20836,236769,29139,236787,4361,236840,1714,1604,24905,236787,1540,236768,3921,4361,236840,1714,9414,107,140,12234,28293,614,2744,1694,529,16587,1186,573,5906,600,1502,607,496,2238,24905,236761,107,140,22539,5957,236779,2003,236779,20836,142976,756,236746,1606,107,140,3805,107,140,22539,5957,236779,2003,236779,20836,20768,28180,963,756,214728,963,756,236755,893,963,756,2513,7367,756,236746,1606,107,140,1922,28180,963,756,2513,2000,107,140,12234,108,140,2060,1694,236769,7212,236769,3485,1123,236787,1123,236761,52740,236769,20836,779,16587,1223,110,2063,974,236779,30558,236769,236752,236787,1694,1473,107,140,12234,13293,1186,4414,4945,528,506,1694,236761,107,140,22539,974,236779,30558,99097,236770,236764,236743,236778,236764,753,236812,236764,236743,236810,236764,236743,236825,2812,107,140,236840,236778,236764,236743,236810,236764,236743,236825,236842,107,140,22539,974,236779,30558,5551,236810,236764,236743,236800,236764,753,236810,236764,236743,236778,236764,753,236800,236764,236743,
236800,236764,236743,236819,236764,236743,236771,236764,236743,236770,236778,236800,236764,236743,236770,236764,753,236770,236771,2812,107,140,236840,236810,236764,236743,236800,236764,236743,236778,236764,236743,236800,236764,236743,236819,236764,236743,236770,236778,236800,236764,236743,236770,236842,107,140,12234,108,140,2060,1694,236769,7212,236769,3485,1123,236787,1123,1890,236743,236771,236764,537,1223,110,2063,563,236779,2497,236769,236749,1473,107,140,12234,13293,1847,768,496,2238,1548,563,8355,236764,532,2416,7394,236761,107,140,22539,563,236779,2497,236769,236825,236768,107,140,9277,107,140,22539,563,236779,2497,236769,236770,236771,236770,236768,107,140,4339,107,140,22539,563,236779,2497,236769,236770,236770,236768,107,140,4339,107,140,22539,563,236779,2497,236769,236770,236800,236812,236812,236770,236768,107,140,4339,107,140,22539,563,236779,2497,236769,236825,236770,236768,107,140,4339,107,140,22539,563,236779,2497,236769,236812,236768,107,140,9277,107,140,22539,563,236779,2497,236769,236770,236768,107,140,9277,107,140,12234,108,140,584,538,6605,236743,236770,236787,994,8450,107,140,236749,236779,4784,578,236743,236770,107,140,6858,538,236779,4784,5213,236743,236778,655,538,236787,538,236779,4784,3323,236743,236770,107,140,1708,858,528,2644,236769,236778,236764,1322,236769,236749,236779,4784,900,236743,236770,236764,538,16644,107,144,584,538,2144,858,1251,236743,236771,236787,107,148,2060,8450,107,140,2060,6288,108,1106,6596,109,2063,6356,236769,28570,236787,1694,236764,1123,236787,6803,1473,107,140,12234,107,140,102415,1090,14337,607,15841,43733,657,1523,1123,236761,107,140,2060,43733,236840,236771,236842,900,43733,236840,236770,236842,808,1123,900,43733,236840,236770,236842,808,1123,236884,236778,900,27103,43733,236840,236749,236842,808,1123,236884,236749,107,140,12234,107,140,2060,2324,5551,69984,808,6596,236761,16559,236769,236781,236764,858,236768,573,858,236764,86675,528,29833,236769,28570,44538,109,2063,1586,236779,13321,236769,28570,236787,1694,1473,107,140,12234,43733,659,15841,529,496,14337,236761,107,140,4114,236779,13321,1586,1123,1288,600,6356,236769,236781,236768,578,236743,236771,236761,107,140,4114,236779,13321,7623,1186,1186,5743,1523,236764,1581,768,993,659,1551,236761,107,140,38419,236764,1586,236779,13321,1186,4716,1694,43733,2963,1581,1548,529,15841,107,140,624,7488,1908,5743,13954,618,625,36369,107,140,236746,3465,236761,107,140,22539,4886,236769,4114,236779,13321,5551,236770,236764,236743,236778,18107,236743,236778,236768,997,517,236769,236781,236768,578,236743,236770,900,236743,236778,236781,107,140,236772,236771,236761,236810,107,140,22539,4886,236769,4114,236779,13321,99097,236825,236764,236743,236770,236770,236764,753,236825,236764,236743,236770,18107,236743,236778,236768,997,568,236781,753,236743,236770,236768,808,568,236781,753,236743,236778,236768,808,568,236781,753,236743,236800,236768,578,753,236825,900,236743,236770,236770,236781,753,236743,236825,236781,236884,236778,900,1123,236884,236800,107,140,236770,236761,236771,107,140,12234,109,140,12275,236751,578,870,28570,236840,236747,236842,808,858,573,858,528,2644,236769,236770,236764,5980,236769,28570,50796,107,140,2063,6051,236769,236781,1473,107,144,2060,6356,236769,28570,236764,1123,236768,107,140,2063,17407,236769,236781,1473,107,144,2060,6356,236769,12275,236751,236764,1123,236768,107,140,107,140,236781,236764,60429,578,236743,236771,236764,236743,236770,236744,236772,236810,107,140,1708,2222,528,2644,236769,236770,236771,236771,236771,1473,107,144,16912,578,6051,236769,236781,236768,107,144
,3405,236781,578,17407,236769,236781,236768,107,144,584,2951,236769,16912,236768,655,60429,236787,2541,107,144,236781,578,1123,753,59730,965,9101,236781,108,140,2060,1123,110,2063,4260,236779,23362,236769,236752,236787,1694,1473,107,140,12234,2094,1292,4716,496,1694,537,532,7623,496,1694,537,236789,1288,600,107,140,236752,236789,563,15779,531,537,528,506,3114,695,600,659,711,69330,684,1806,236764,1651,1061,2979,657,506,3114,695,600,659,69330,684,1806,659,4745,107,140,1071,506,2979,529,506,7041,3114,695,529,537,236764,840,19372,236761,107,140,22539,4260,236779,23362,5551,236770,236764,236743,236778,236764,236743,236800,2812,107,140,236840,236770,236764,236743,236778,236764,236743,236800,236842,107,140,22539,4260,236779,23362,5551,236810,236764,236743,236825,236764,236743,236800,236764,236743,236812,236764,236743,236828,236764,236743,236819,236764,236743,236778,2812,107,140,236840,236778,236764,236743,236825,236764,236743,236800,236764,236743,236812,236764,236743,236828,236764,236743,236819,236764,236743,236810,236842,107,140,12234,108,140,23362,578,870,236752,236840,236747,236842,573,858,528,2644,236769,3469,236769,236752,1223,768,858,2144,236743,236800,1251,236743,236771,236842,107,140,23362,236761,10479,825,107,140,2060,870,23362,236840,236747,973,236743,236800,236842,768,858,2144,236743,236800,1251,236743,236771,1663,537,236840,236747,236842,573,858,528,2644,236769,3469,236769,236752,50796,110,2063,4709,236769,236752,236787,1694,1473,107,140,12234,13293,19372,4709,4820,528,496,1694,107,140,22539,4709,5551,236810,236764,236743,236800,236764,236743,236810,236764,236743,236778,236764,236743,236800,236764,236743,236800,236764,236743,236819,236764,236743,236771,236764,236743,236770,236778,236800,2812,107,140,236840,236771,236764,236743,236778,236764,236743,236800,236764,236743,236810,236764,236743,236819,236764,236743,236770,236778,236800,236842,107,140,12234,108,140,2060,19372,236769,1025,236769,236752,1223,110,2063,2631,236779,7011,236769,236752,236787,1694,1473,107,140,12234,13293,5783,3408,528,506,1694,236761,107,140,22539,2631,236779,7011,5551,236770,236764,236743,236778,236764,236743,236800,2812,107,140,236800,107,140,22539,2631,236779,7011,5551,236810,236764,236743,236800,236764,753,236810,236764,236743,236778,236764,753,236800,236764,236743,236800,236764,236743,236819,236764,236743,236771,236764,236743,236770,236778,236800,236764,236743,236770,236764,753,236770,236771,2812,107,140,236770,236778,236800,107,140,12234,108,140,2060,2631,236769,236752,236768,110,2063,142747,236779,149986,236769,236749,236787,801,1473,107,140,12234,13293,506,1548,529,2782,506,15958,236743,236832,7412,528,25630,2344,1082,538,837,659,69330,684,236743,236770,236770,653,236743,236770,236800,236761,107,140,22539,142747,236779,149986,236769,236810,236771,236768,107,140,236771,107,140,22539,142747,236779,149986,236769,236832,236828,236768,107,140,236778,107,140,22539,142747,236779,149986,236769,236832,236819,236768,107,140,236800,107,140,12234,108,140,26297,578,236743,236771,107,140,1708,858,528,2644,236769,236749,1473,107,144,584,858,2144,236743,236770,236770,1251,236743,236771,653,858,2144,236743,236770,236800,1251,236743,236771,236787,107,148,26297,3323,5980,236769,2234,236769,7212,236769,3485,505,236787,505,1251,623,236832,827,1540,236769,236747,41052,107,140,2060,38302,110,2063,4260,236779,20952,236769,236752,236787,1694,1473,107,140,12234,2094,1292,4716,496,1694,537,532,7623,496,1694,537,236789,1288,600,107,140,236752,236789,563,15779,531,537,528,506,11049,3114,695,236764,1651,1061,2979,657,506,1581,3114,695,
659,4745,107,140,1071,506,2979,529,506,1581,3114,695,529,537,236764,840,19372,236761,107,140,22539,4260,236779,20952,5551,236770,236764,236743,236778,236764,236743,236800,2812,107,140,236840,236770,236764,236743,236778,236764,236743,236800,236842,107,140,22539,4260,236779,20952,5551,236810,236764,236743,236825,236764,236743,236800,236764,236743,236812,2812,107,140,236840,236800,236764,236743,236825,236764,236743,236810,236764,236743,236812,236842,107,140,12234,108,140,20952,578,870,236752,236840,236747,236842,573,858,528,2644,236769,3469,236769,236752,1223,768,858,2144,236743,236778,1251,236743,236771,236842,107,140,20952,236761,10479,825,107,140,2060,870,20952,236840,236747,973,236743,236778,236842,768,858,2144,236743,236778,1251,236743,236771,1663,537,236840,236747,236842,573,858,528,2644,236769,3469,236769,236752,50796,110,2063,41193,236779,123043,236769,236751,236787,1540,1473,107,140,12234,107,140,15072,35509,2483,684,31509,4402,529,1806,7579,236761,107,140,12234,107,140,236865,9918,2483,531,4402,236761,7714,529,3861,236743,236800,236761,107,140,19243,578,870,236751,17576,236800,808,858,1473,1387,3283,236800,808,858,900,236743,236800,779,5980,236769,236751,50796,573,858,528,2644,3283,3469,236769,236751,236768,900,236743,236778,236768,973,236743,236800,7066,107,140,236865,8881,4820,528,1546,2299,236761,29506,2299,815,18661,4820,1082,236743,236800,236761,107,140,19243,578,21652,4043,236840,236770,9218,900,2299,236840,236771,2812,768,5980,236769,4043,236768,1251,236743,236800,1663,2299,573,2299,528,4402,236842,107,140,2060,116740,7013,236769,19243,236768,109,2063,39961,236779,123043,236769,236751,236787,1540,1473,107,140,12234,107,140,101108,618,2744,2483,35509,607,41193,236779,123043,1292,236761,15543,64686,2483,236761,107,140,12234,108,140,19243,578,870,236751,17576,236800,808,858,1473,1387,3283,236800,808,858,900,236743,236800,779,5980,236769,236751,50796,573,858,528,2644,3283,3469,236769,236751,236768,900,236743,236778,236768,973,236743,236800,7066,107,140,19243,578,21652,4043,236840,236778,236842,900,2299,8497,236778,2812,768,5980,236769,4043,236768,1251,236743,236800,1663,2299,573,2299,528,4402,236842,107,140,2060,116740,7013,236769,19243,236768,110,2063,8355,236779,73368,236769,236749,236787,801,1473,107,140,12234,107,140,2497,236779,73368,7623,538,236772,594,1548,600,563,496,123466,1548,532,625,236789,236751,992,8355,236761,107,140,22539,8355,236779,73368,236769,236770,236768,107,140,236778,107,140,22539,8355,236779,73368,236769,236778,236768,107,140,236800,107,140,22539,8355,236779,73368,236769,236800,236768,107,140,236810,107,140,22539,8355,236779,73368,236769,236812,236768,107,140,236770,236800,107,140,22539,8355,236779,73368,236769,236810,236768,107,140,236828,236819,107,140,12234,109,140,1106,4940,107,140,2063,161533,236779,36367,495,236769,236749,236764,620,236784,236770,236771,1473,107,144,12234,3694,768,538,563,8355,1699,506,17277,236772,236794,50605,2655,2027,1594,29738,107,144,584,538,655,236743,236778,236787,107,148,2060,8450,107,144,584,538,1251,236743,236778,653,538,1251,236743,236800,236787,107,148,2060,6288,107,144,584,538,2144,236743,236778,1251,236743,236771,236787,107,148,2060,8450,108,144,236750,578,236743,236771,107,144,236753,578,538,753,236743,236770,107,144,6858,513,2144,236743,236778,1251,236743,236771,236787,107,148,236750,3323,236743,236770,107,148,236753,973,236784,236743,236778,108,144,1708,2222,528,2644,236769,236767,1473,107,148,236746,578,4940,236761,33156,236769,236778,236764,538,753,236743,236778,236768,107,148,236781,578,5983,236769,236746,236764,
[tokens payload elided: the remainder of this hunk is machine-generated prompt data -- long wrapped lines of comma-separated Gemma 3 BPE token IDs (vocab 262144, with 107 recurring as the newline token) in the --tokens-file format produced by generate_prompts.py. The raw integer stream is not human-readable and is omitted here.]
6743,236770,107,140,2861,236779,26846,1033,108250,236759,1606,7623,236743,236771,107,140,2861,236779,26846,1033,40688,12051,1606,7623,236743,236771,107,140,12234,108,140,26297,578,236743,236771,107,140,1708,858,528,2644,236769,236771,236764,5980,236769,236751,779,236743,236778,1473,107,144,584,503,236840,236747,236842,528,623,14196,6462,236836,1083,107,148,26297,3323,236743,236770,107,140,2060,38302,109,2063,24119,236779,28018,236769,2394,1473,107,140,26610,107,140,6924,496,1292,600,4716,496,1550,568,2383,236768,13855,496,1548,107,140,624,7623,506,24119,11995,531,625,236761,1637,506,1548,563,201595,107,140,2543,1156,25630,236764,4886,625,3121,699,5743,236761,108,140,38408,107,140,22539,24119,236779,28018,885,236770,236771,1373,107,140,236770,236771,107,140,22539,24119,236779,28018,885,236770,236810,236761,236800,1373,107,140,236770,236810,108,140,10282,236787,107,140,185368,3121,699,5743,2820,600,768,506,2238,1548,563,201595,107,140,2543,1156,25630,236764,506,886,611,1374,994,563,506,886,600,563,506,107,140,236760,661,6165,699,5743,236761,1701,2591,24119,236779,28018,885,236770,236812,236761,236810,1373,1374,107,140,2060,236743,236770,236810,532,24119,236779,28018,37565,236770,236812,236761,236810,1373,1374,994,753,236770,236810,236761,107,140,26610,109,140,2063,84953,236769,1111,1473,107,144,584,2951,236769,1111,753,801,236769,1111,1223,2843,236743,236771,236761,236810,236787,107,148,2060,4886,236769,1111,236768,107,144,584,1016,1890,236743,236771,236787,107,148,2060,801,236769,1111,236768,900,236743,236770,107,144,4454,236787,107,148,2060,801,236769,1111,236768,753,236743,236770,107,140,2060,84953,236769,8344,236769,2394,1223,109,2063,1386,236779,236746,236779,76875,236769,236749,1473,107,140,12234,107,140,26479,496,4414,11995,538,236764,611,735,531,1386,496,26106,529,538,4535,529,23795,236761,107,140,818,1171,1984,815,538,23795,236761,107,140,818,1548,529,23795,528,506,2148,1984,563,236787,107,144,236772,506,2148,11049,1548,768,538,563,11049,236761,107,144,236772,506,2148,1581,1548,768,538,563,1581,236761,107,140,13293,506,1548,529,23795,528,1546,1984,528,496,1694,236764,1298,3408,657,3546,107,140,236747,9282,506,1548,529,23795,528,506,1984,568,236747,236862,236770,769,108,140,38408,236787,107,140,22539,1386,236779,236746,236779,76875,236769,236800,236768,107,140,236840,236800,236764,236743,236810,236764,236743,236832,236842,107,140,12234,109,140,743,236764,1152,578,17811,538,107,140,1708,2222,528,2644,236769,236749,1473,107,144,743,236761,3770,236769,3744,236768,107,144,3744,3323,236743,236778,107,140,2060,9898,109,2063,4171,236779,2383,236769,236751,1473,107,140,12234,107,140,3048,795,577,2238,496,2483,529,4171,15914,684,162760,653,9952,236761,5180,4209,563,107,140,1071,9918,506,2483,1131,4171,532,994,614,3499,529,506,4171,236761,107,140,107,140,2542,2591,236787,107,140,8992,236779,2383,885,10979,236764,1041,1463,563,3415,1373,1251,11058,10979,827,623,3307,827,623,1201,827,623,511,827,623,12720,1935,107,140,8992,236779,2383,885,4906,236764,1156,236764,1806,236764,2390,236764,3493,236764,3962,1373,1251,11058,4906,827,623,13498,827,623,19891,827,623,19025,827,623,21716,827,623,34699,1935,107,140,12234,108,140,8992,578,568,236751,236761,10143,204160,623,623,11287,6966,825,107,140,2060,870,3017,573,3658,528,4171,768,3658,2843,623,1935,109,2063,5347,236779,3744,236769,236781,236764,570,1473,107,140,12234,2094,1292,4716,1156,4414,4945,1123,532,570,532,7623,506,107,140,148921,1581,11995,1548,600,563,528,506,2644,870,236781,236764,570,236842,23722,236761,1637,236743,107,140,13534,236789,236751
,951,1288,1548,236764,1299,506,1292,1374,994,753,236770,236761,108,140,2542,2591,236787,107,140,35724,236779,3744,236769,236770,236778,236764,236743,236770,236810,236768,578,236743,236770,236812,107,140,35724,236779,3744,236769,236770,236800,236764,236743,236770,236778,236768,578,753,236770,107,140,12234,109,140,584,1123,1890,570,236787,994,753,236770,107,140,584,1123,1251,570,236787,994,570,768,570,2144,236743,236778,1251,236743,236771,1663,753,236770,107,140,2060,570,768,570,2144,236743,236778,1251,236743,236771,1663,570,753,236743,236770,109,2063,20274,236779,18553,236769,236749,236764,520,1473,107,140,12234,3048,659,2238,1156,4414,25630,538,532,520,236764,532,822,4209,563,531,16333,506,107,140,28128,529,506,25630,699,538,1343,520,568,16603,538,532,520,769,236743,107,140,14670,506,3890,531,506,20480,11995,532,7352,600,531,14820,236761,107,140,2859,538,563,5314,1082,520,236764,994,753,236770,236761,107,140,12703,236787,107,140,38753,236779,18553,236769,236770,236764,236743,236810,236768,1477,623,236771,236763,236770,236770,236775,107,140,38753,236779,18553,236769,236832,236764,236743,236810,236768,1477,753,236770,107,140,38753,236779,18553,236769,236770,236771,236764,236743,236778,236771,236768,1477,623,236771,236763,236770,236770,236770,236770,236775,107,140,38753,236779,18553,236769,236778,236771,236764,236743,236800,236800,236768,1477,623,236771,236763,236770,236770,236771,236770,236771,236775,107,140,12234,109,140,584,538,1890,520,236787,994,753,236770,107,140,18553,578,4886,3283,236749,900,520,236768,965,236743,236778,236768,107,140,2060,10915,236769,18553,236768,109,2063,4709,236779,67161,236769,236781,1473,107,140,12234,26479,496,1694,529,4414,25630,1123,236761,994,496,19372,1694,529,784,236743,107,140,31493,600,18116,236789,236745,1027,1581,15958,236761,108,140,10282,236787,196983,1694,1374,577,19372,528,5683,1900,236761,107,140,107,140,2542,2591,236787,107,140,22539,4709,236779,67161,5551,236770,236810,236764,236743,236800,236800,236764,236743,236770,236812,236778,236778,236764,236743,236770,2812,107,140,236840,236770,236764,236743,236770,236810,236764,236743,236800,236800,236842,107,140,22539,4709,236779,67161,5551,236770,236810,236778,236764,236743,236800,236778,236800,236764,236743,236770,236812,236778,236778,236764,236743,236770,236771,2812,107,140,3805,107,140,12234,109,140,2063,12011,236769,236781,1473,107,144,1708,677,528,1540,236769,236781,1473,107,148,584,801,236769,574,236768,2144,236743,236778,1251,236743,236771,236787,107,152,2060,8450,107,144,2060,6288,107,140,2060,19372,236769,2234,236769,7212,236769,149492,236764,1123,9670,109,2063,684,236779,3119,236769,2762,1473,107,140,12234,107,140,26479,614,3499,529,25630,236764,4260,506,25630,600,659,1534,236743,236770,532,236743,236819,23722,236764,107,140,34220,506,9113,3499,236764,532,1299,9883,1546,15958,684,1061,7041,1463,699,107,140,236775,4906,827,623,11634,827,623,19765,827,623,26391,827,623,40756,827,623,44955,827,623,75683,827,623,80357,827,623,98623,3056,108,140,2542,2591,236787,107,142,2762,578,870,236778,236764,236743,236770,236764,236743,236770,236764,236743,236812,236764,236743,236810,236764,236743,236828,236764,236743,236778,236764,236743,236800,236842,139,107,148,1160,4260,4617,3921,870,236770,236764,236743,236770,236764,236743,236778,236764,236743,236778,236764,236743,236800,236764,236743,236812,236764,236743,236810,236764,236743,236828,236842,236743,107,148,1160,14416,4617,3921,870,236828,236764,236743,236810,236764,236743,236812,236764,236743,236800,236764,236743,236778,236764,236743,236778,236764,236743,2
36770,236764,236743,236770,236842,107,142,2060,11058,80357,827,623,40756,827,623,26391,827,623,19765,827,623,11634,827,623,11634,827,623,4906,827,623,4906,1935,107,140,107,142,2859,506,3499,563,7738,236764,994,614,7738,3499,236787,107,142,2762,578,2977,107,142,2060,2977,107,140,107,142,2859,506,3499,815,1027,17163,1548,16052,625,236787,107,142,2762,578,870,236770,236764,753,236770,1031,236743,236810,236810,236842,236743,107,148,1160,4260,4617,3921,33368,236770,236764,236743,236770,236764,236743,236810,236810,236842,107,148,1160,14416,4617,3921,870,236810,236810,236764,236743,236770,236764,753,236770,236842,107,142,2060,578,7756,4906,2000,107,140,12234,108,140,2063,531,236779,3017,236769,236781,236787,801,236768,3921,1540,236787,107,142,584,1123,1251,236743,236770,236787,107,144,2060,623,4906,236775,107,142,36208,1123,1251,236743,236778,236787,107,144,2060,623,11634,236775,107,142,36208,1123,1251,236743,236800,236787,107,144,2060,623,19765,236775,107,142,36208,1123,1251,236743,236812,236787,107,144,2060,623,26391,236775,107,142,36208,1123,1251,236743,236810,236787,107,144,2060,623,40756,236775,107,142,36208,1123,1251,236743,236825,236787,107,144,2060,623,44955,236775,107,142,36208,1123,1251,236743,236832,236787,107,144,2060,623,75683,236775,107,142,36208,1123,1251,236743,236828,236787,107,144,2060,623,80357,236775,107,142,4454,236787,107,144,2060,623,98623,236775,107,140,40835,236779,2234,236764,9898,578,19372,236769,2762,10309,59396,236770,1604,2977,107,140,1708,1123,528,19372,236779,2234,236787,107,142,584,236743,236770,6605,1123,6605,236743,236819,236787,107,144,743,236761,3770,236769,1071,236779,3017,236769,236781,1223,107,140,2060,9898,109,2063,517,236769,236749,1473,107,140,12234,41276,506,1292,517,600,4716,538,618,496,7689,236764,107,140,624,7623,496,1694,529,2425,538,236764,1288,600,506,1550,529,506,3408,657,3546,858,563,506,82410,529,858,768,858,563,1581,107,140,504,506,2324,529,4945,699,236743,236770,531,858,7394,236761,107,140,236747,9857,699,236743,236770,236761,107,140,1437,82410,529,858,563,506,27104,529,506,4945,699,236743,236770,531,858,568,236770,808,236743,236778,808,3729,808,858,769,107,140,12703,236787,107,140,236760,236769,236810,236768,1251,870,236770,236764,236743,236778,236764,236743,236825,236764,236743,236778,236812,236764,236743,236770,236810,236842,107,140,12234,109,140,584,538,1251,236743,236771,236787,994,2977,107,140,584,538,1251,236743,236770,236787,994,870,236770,236842,107,140,584,538,1251,236743,236778,236787,994,870,236770,236764,236743,236778,236842,108,140,743,578,870,236770,236764,236743,236778,236842,107,140,1708,858,528,2644,236769,236800,236764,538,900,236743,236770,1473,107,144,584,858,2144,236743,236778,1251,236743,236770,236787,107,148,743,236761,3770,236769,743,17825,236778,236842,900,568,236747,753,236743,236770,236768,900,858,236768,107,144,4454,236787,107,148,743,236761,3770,236769,743,17825,236778,236842,808,568,236747,753,236743,236770,236768,808,858,236768,107,140,2060,9898,109,2063,1581,236779,19808,236779,227147,236769,236749,1473,107,140,12234,107,140,26479,496,4414,11995,538,236764,994,496,33228,600,815,506,1548,529,1581,532,11049,107,140,28018,180522,84704,600,3798,2351,506,2644,236769,236770,236764,538,779,23722,236761,108,140,12703,236743,236770,236787,108,144,4661,236787,236743,236800,107,144,8433,236787,568,236770,236764,236743,236778,236768,107,144,44008,236787,107,144,11980,142193,659,236743,236770,236764,236743,236778,236764,236743,236800,236761,886,529,1091,563,1581,236764,532,1156,529,1091,659,11049,236761,108,140,12703,236743
,236778,236787,108,144,4661,236787,236743,236770,236778,107,144,8433,236787,568,236812,236764,236743,236825,236768,107,144,44008,236787,107,144,11980,142193,659,236743,236770,236764,236743,236778,236764,236743,236800,236764,236743,236812,236764,236743,236810,236764,236743,236825,236764,236743,236832,236764,236743,236828,236764,236743,236819,236764,236743,236770,236770,236761,2390,529,1091,659,1581,236764,532,236743,236825,529,1091,659,11049,236761,108,140,10282,236787,107,144,236770,236761,236743,236770,6605,538,6605,236743,236770,236771,236884,236800,107,144,236778,236761,8323,33228,815,506,1548,529,1581,532,11049,11995,180522,84704,6619,236761,107,140,12234,109,140,19808,236779,26297,236764,1581,236779,26297,578,236743,236771,236764,236743,236771,107,140,1708,858,528,2644,236769,236770,236764,538,900,236743,236770,1473,107,144,584,1540,236769,236747,236768,1251,1540,236769,236747,10309,59396,236770,9414,107,148,584,858,2144,236743,236778,1251,236743,236770,236787,107,152,19808,236779,26297,3323,236743,236770,107,148,4454,236787,107,152,20952,236779,26297,3323,236743,236770,107,140,2060,1581,236779,26297,236764,11049,236779,26297,109,2063,1527,236779,30457,236769,2762,1473,107,140,12234,107,140,6974,496,1292,1527,236779,30457,837,4716,614,3499,529,25630,532,7623,107,140,1437,1548,529,4820,837,815,496,2324,529,26798,1890,236743,236771,236761,107,140,2859,496,1548,563,5676,236764,1299,1061,1171,10227,15958,795,577,5676,236787,107,140,236744,236761,236759,236761,753,236770,236778,236800,815,10227,26798,753,236770,236764,236743,236778,236764,532,236743,236800,236761,107,140,22539,1527,236779,30457,67713,1251,236743,236771,107,140,22539,1527,236779,30457,99097,236770,236764,236743,236770,236770,236764,753,236770,236770,2812,1251,236743,236770,107,140,22539,1527,236779,30457,5551,236770,236764,236743,236770,236764,236743,236778,2812,1251,236743,236800,107,140,12234,108,140,2063,12011,236769,236781,236787,801,236768,3921,801,236787,107,144,236752,578,1694,236769,1714,236769,236781,1223,107,144,584,537,236840,236771,236842,1251,9296,1083,107,148,236752,578,537,236840,236770,9218,107,148,236752,578,1694,236769,3275,236769,720,236764,537,1223,107,148,236752,236840,236771,236842,578,753,236752,236840,236771,236842,107,144,4454,236787,107,148,236752,578,1694,236769,3275,236769,720,236764,537,1223,107,144,2060,236743,236770,768,2324,236769,236752,236768,1890,236743,236771,1663,236743,236771,107,140,2060,2324,236769,3275,236769,149492,236764,4617,1223,109,2063,2827,236779,811,236779,3703,236769,2762,1473,107,140,12234,1882,735,614,3499,756,2762,236789,529,646,25630,4617,236840,236770,1604,4617,236840,236778,1604,32603,4617,236840,236797,1619,818,107,140,34488,528,506,3499,795,577,28760,11496,236761,5180,4209,563,531,6054,768,107,140,509,563,2653,531,974,614,3499,19372,528,1908,236772,128905,1900,684,14980,236743,107,140,1437,2269,5585,580,506,2238,3499,236787,107,144,3048,659,6208,531,2121,1447,8633,5585,1027,1548,529,2782,236761,107,140,107,140,4906,1447,8633,5585,2820,34064,784,4820,529,506,3499,684,886,107,140,3473,528,506,1447,5281,236761,669,1774,3408,529,506,3499,795,577,7808,531,107,140,1437,6250,2939,528,506,3499,858,236761,236744,236761,236743,236771,594,3546,236761,236743,108,140,2859,625,563,2653,531,3011,506,19372,3499,684,14980,506,2787,5585,107,140,5215,994,6288,1663,994,8450,236761,107,140,2859,506,2238,3499,563,7738,1299,994,6288,236761,108,140,10282,236787,669,2238,1694,563,23535,531,735,4709,4820,236761,108,140,2542,14691,236787,107,140,107,140,11047,236779,811,236779,3703,5551,236800,
236764,236743,236812,236764,236743,236810,236764,236743,236770,236764,236743,236778,2812,1224,236813,4339,107,140,44008,236787,3763,2121,495,236743,236778,1447,8633,6675,236764,1908,236772,128905,1900,740,107,153,1553,11105,573,506,2238,3499,236761,107,140,11047,236779,811,236779,3703,5551,236800,236764,236743,236810,236764,236743,236812,236764,236743,236770,236764,236743,236778,2812,1224,236813,9277,107,140,44008,236787,1509,563,711,2653,531,974,1908,236772,128905,1900,573,506,2238,107,152,2513,684,14980,1027,1548,529,1447,8633,6675,236761,107,152,107,140,12234,108,140,107,140,40835,236779,2762,578,19372,236769,2762,236768,107,140,584,4617,1251,19372,236779,2762,236787,994,6288,107,140,1708,858,528,2644,236769,236770,236764,5980,236769,2762,16644,107,144,584,4617,236840,236747,9218,900,4617,8497,236747,236842,1251,19372,236779,2762,236787,107,148,2060,6288,107,140,2060,8450,109,2063,8770,236769,38511,236770,236764,65145,236778,1473,107,140,12234,902,672,2608,236764,611,795,4144,496,1292,600,4716,1156,15852,529,4945,236764,107,140,624,23521,3363,625,563,2653,531,2121,614,8770,529,4820,107,140,19195,1091,531,1386,65145,236770,496,1694,529,1186,1581,4945,236761,107,140,3810,563,951,4576,580,506,1548,529,52195,4820,1534,65145,236770,532,65145,236778,236761,107,140,2859,625,563,2653,531,8770,4820,1534,506,65145,236770,532,65145,236778,531,1386,107,140,712,506,4820,529,65145,236770,531,577,1581,236764,994,623,26915,3056,107,140,113255,236764,994,623,7018,3056,107,140,2542,2591,236787,107,140,39588,5551,236770,236764,236743,236778,236764,236743,236800,236764,236743,236812,1604,870,236770,236764,236743,236778,236764,236743,236800,236764,236743,236812,2812,1477,623,26915,236775,107,140,39588,5551,236770,236764,236743,236778,236764,236743,236800,236764,236743,236812,1604,870,236770,236764,236743,236810,236764,236743,236800,236764,236743,236812,2812,1477,623,7018,236775,107,140,1509,563,12718,600,506,2744,15852,795,577,1908,236772,11681,236761,107,140,12234,109,140,26297,236779,19808,578,5980,236769,2234,236769,7212,236769,3485,1123,236787,1123,2144,236743,236778,1251,236743,236770,236764,65145,236770,9670,107,140,26297,236779,20952,578,5980,236769,2234,236769,7212,236769,3485,1123,236787,1123,2144,236743,236778,1251,236743,236771,236764,65145,236778,9670,107,140,2060,623,26915,236775,768,38302,236779,19808,6605,38302,236779,20952,1663,623,7018,236775,109,2063,52473,236769,2181,1473,107,140,12234,26479,496,2483,13855,496,2557,15914,67505,11739,236764,994,496,19086,107,140,1340,506,6064,607,506,1346,51881,532,7906,506,7041,1527,236761,107,140,2859,3131,11739,735,506,1638,24987,236764,994,784,529,1091,236761,107,140,107,140,12703,236787,107,140,82555,1033,236746,518,505,1606,1251,16923,236746,2632,236743,236770,236764,756,236763,2632,236743,236770,236764,756,236755,2632,236743,236770,236783,107,140,82555,1033,236746,518,518,496,1606,1251,16923,236746,2632,236743,236778,236764,756,236763,2632,236743,236778,236783,107,140,82555,1033,236746,518,505,496,518,1606,1251,16923,236746,2632,236743,236778,236764,756,236763,2632,236743,236778,236783,107,140,82555,1033,236763,518,518,518,496,1606,1251,16923,236763,2632,236743,236812,236783,107,140,82555,68560,1251,8637,108,140,12234,109,140,584,1594,1251,86679,994,8637,140,107,140,2861,236764,9898,578,12739,3800,12739,825,107,140,1708,3658,528,1594,236761,6966,885,230324,107,144,584,3658,2843,86679,107,148,584,3658,711,528,1527,236787,1527,236840,3017,236842,578,236743,236771,107,148,2861,236840,3017,236842,3323,236743,236770,107,140,31730,578,2631,236769,2234,2367
69,2861,236761,7558,21957,107,140,1708,677,236764,505,528,1527,236761,7633,6141,107,144,584,505,1251,36411,236787,107,148,743,236840,574,236842,578,505,107,140,2060,9898,109,2063,14416,236779,8335,236769,236751,236764,236755,1473,107,140,12234,6804,107,140,1882,659,2238,1156,16587,503,532,505,236764,611,735,531,21717,784,506,7579,528,503,600,659,4745,531,1027,2872,528,505,107,140,5215,2426,768,506,1354,2483,563,142193,236761,107,140,236776,2483,563,2760,142193,768,625,20838,506,1638,30708,618,4448,236761,107,140,3048,1374,994,496,33228,7906,506,1354,2483,532,6288,236786,9277,573,506,2426,236761,107,140,12703,107,140,2542,503,578,623,28180,893,827,505,578,623,7393,827,506,1354,1374,577,11297,214728,963,9277,236768,107,140,2542,503,578,623,108250,827,505,578,623,236763,236775,138,1437,1354,1374,577,11297,552,2063,963,9277,236768,107,140,2542,503,578,623,200500,524,236755,3604,827,505,578,623,596,827,506,1354,1374,577,11297,2692,524,236755,963,4339,236768,107,140,12234,108,140,1075,578,116740,7013,236769,7212,236769,3485,677,236787,677,711,528,505,236764,503,1223,107,140,2060,27296,236764,27296,1251,27296,186487,236770,236842,109,2063,11049,236779,2861,236769,38511,1473,107,140,12234,26479,496,1694,529,16587,236764,1298,1546,2483,10594,529,1186,26798,236764,994,496,1694,236761,107,140,7795,3408,858,529,506,3938,1374,577,623,1437,1548,529,11049,4820,528,506,107,140,2383,858,529,506,2744,1781,1298,784,506,858,236789,236751,1374,577,12043,684,506,1548,107,140,1340,11049,26798,528,506,858,236789,594,2483,529,506,2744,236761,108,140,22539,11049,236779,2861,20768,236770,236778,236800,236812,236810,236825,236832,10190,107,140,4119,1437,1548,529,11049,4820,236743,236812,236749,506,1540,236812,829,236743,236812,529,506,236743,236812,236749,1000,231116,107,140,22539,11049,236779,2861,20768,236800,149796,236770,236770,236770,236770,236770,236770,236770,236770,18992,107,140,4119,1437,1548,529,11049,4820,236743,236770,236749,506,1540,236770,829,236743,236770,529,506,236743,236770,236749,1000,10498,107,141,236775,1437,1548,529,11049,4820,236743,236828,236749,506,1540,236828,829,236743,236828,529,506,236743,236828,236749,1000,231116,107,140,12234,109,140,743,236764,7930,578,17811,623,1437,1548,529,11049,4820,528,506,2483,858,529,506,2744,1781,107,140,1708,503,528,65145,236787,107,144,19808,236779,26297,578,5980,236769,2234,236769,7212,236769,3485,677,236787,801,236769,574,236768,2144,236743,236778,1251,236743,236770,236764,503,9670,107,144,743,236761,3770,236769,9944,236761,10143,885,236747,827,1540,236769,19808,236779,26297,9670,107,140,2060,9898,109,2063,1322,5096,4160,10495,236769,30457,1473,107,140,12234,107,140,26479,614,3499,529,25630,27536,236764,1586,506,7081,2324,529,1027,1908,236772,11681,1159,236772,2513,107,140,1340,27536,236761,107,140,12703,107,140,1387,5096,4160,10495,5551,236778,236764,236743,236800,236764,236743,236812,236764,236743,236770,236764,236743,236778,236764,236743,236812,2812,1251,236743,236770,107,140,1387,5096,4160,10495,99097,236770,236764,753,236778,236764,753,236800,2812,1251,753,236825,107,140,12234,109,140,584,784,236769,236781,6867,236743,236771,573,1123,528,27536,1473,994,1322,236769,30457,236768,107,140,236751,236764,9898,578,236743,236771,236764,236743,236771,107,140,1708,1123,528,27536,236787,107,144,236751,3323,1123,107,144,743,578,1322,236769,743,236764,503,236768,107,144,584,503,6867,236743,236771,236787,503,578,236743,236771,107,140,2060,9898,108,1106,6596,108,2063,2631,236779,6910,236769,8545,236764,6541,1473,107,140,12234,107,140,3048,659,2238,496,30777,8411,529,3
3922,236761,7714,2050,9282,496,3161,1388,236764,107,140,624,1546,236743,236770,528,496,2050,9282,496,3161,4360,529,1813,236761,107,140,7795,1388,815,496,7041,24211,600,740,577,1456,531,8430,1813,699,625,236764,236743,107,140,624,784,78551,735,506,1638,6541,236761,107,140,11069,4209,563,531,1161,506,78551,531,7738,506,33922,236761,107,140,8433,506,1548,529,2782,611,1202,531,3718,506,78551,236761,108,140,12703,236743,236770,236787,107,144,4661,236787,236743,107,148,8545,1017,15385,236771,236764,236771,236764,236770,236764,236771,1604,870,236771,236764,236770,236764,236771,236764,236771,1604,870,236770,236764,236770,236764,236770,236764,236770,10660,107,148,36164,236779,48578,1017,236743,236770,107,144,8433,236787,236743,236825,108,140,12703,236743,236778,236787,107,144,4661,236787,236743,107,148,8545,1017,15385,236771,236764,236771,236764,236770,236764,236770,1604,870,236771,236764,236771,236764,236771,236764,236771,1604,870,236770,236764,236770,236764,236770,236764,236770,1604,870,236771,236764,236770,236764,236770,236764,236770,10660,107,148,36164,236779,48578,1017,236743,236778,107,144,8433,236787,236743,236810,107,140,107,140,12703,236743,236800,236787,107,144,4661,236787,236743,107,148,8545,1017,15385,236771,236764,236771,236764,236771,1604,870,236771,236764,236771,236764,236771,10660,107,148,36164,236779,48578,1017,236743,236810,107,144,8433,236787,236743,236771,108,140,40498,236787,107,144,236829,784,33922,735,506,1638,3861,107,144,236829,236743,236770,6605,8411,236761,3119,6605,236743,236770,236771,236884,236778,107,144,236829,236743,236770,6605,8411,20527,236770,1619,3119,6605,236743,236770,236771,236884,236778,107,144,236829,8411,236840,236747,2585,236804,236842,3921,236743,236771,1109,236743,236770,107,144,236829,236743,236770,6605,6541,6605,236743,236770,236771,107,140,12234,109,140,743,578,236743,236771,140,107,140,1708,537,528,8411,236787,107,144,743,3323,6596,236761,34442,236769,2330,236769,236752,236768,965,6541,236768,107,140,2060,9898,109,2063,4260,236779,2513,236769,2762,1473,107,140,12234,107,140,902,672,99430,236764,611,735,531,4260,614,3499,529,1908,236772,27851,25630,3894,531,107,140,5640,529,5906,528,910,14820,10065,528,52557,1900,236761,107,140,2542,3361,1548,529,5906,236764,4260,2721,580,20632,1550,236761,108,140,1509,1921,577,12845,1133,672,236787,107,140,22539,4260,236779,2513,5551,236770,236764,236743,236810,236764,236743,236778,236764,236743,236800,236764,236743,236812,2812,1251,870,236770,236764,236743,236778,236764,236743,236800,236764,236743,236812,236764,236743,236810,236842,107,140,22539,4260,236779,2513,99097,236778,236764,753,236800,236764,753,236812,236764,753,236810,236764,753,236825,2812,1251,33368,236825,236764,753,236810,236764,753,236812,236764,753,236800,236764,753,236778,236842,107,140,22539,4260,236779,2513,5551,236770,236764,236743,236771,236764,236743,236778,236764,236743,236800,236764,236743,236812,2812,870,236771,236764,236743,236770,236764,236743,236778,236764,236743,236800,236764,236743,236812,236842,107,140,12234,108,140,2543,75162,29001,1419,90654,236779,1071,236779,2478,107,140,2063,90654,236769,236781,236787,801,236764,570,236787,801,236768,3921,801,236787,107,144,236781,236770,578,5980,236769,2234,236769,7212,236769,3485,677,236787,677,1251,623,236770,827,10915,236769,236781,41052,107,144,236762,236770,578,5980,236769,2234,236769,7212,236769,3485,677,236787,677,1251,623,236770,827,10915,236769,236762,41052,107,144,584,1123,236770,2843,570,236770,236787,994,1123,236770,753,570,236770,107,144,2060,1123,753,570,107,140,2060,19372,236769,27
62,236764,2307,236784,24671,236779,1071,236779,2478,236769,24671,1223,109,2063,4864,236779,8992,236769,236751,236764,538,1473,107,140,12234,26479,496,2483,503,532,496,3756,1548,538,236764,611,735,1010,68783,531,4144,236743,107,140,236746,1292,600,7623,496,1694,529,784,4171,699,2483,503,600,3014,7121,236743,107,140,236749,223853,236764,528,1900,1239,4171,3196,528,506,2483,503,236761,107,140,2859,506,2483,503,563,7738,1299,506,1292,1374,994,614,7738,1694,236761,107,140,10282,236787,611,1149,9027,506,2744,2483,6097,1186,11739,532,9952,236761,107,140,38408,236787,107,140,6916,236779,8992,885,35899,1053,496,2268,33217,827,236743,236812,236768,196301,11058,36604,1935,107,140,6916,236779,8992,885,35899,1053,496,2268,33217,827,236743,236800,236768,196301,11058,35899,827,623,108827,1935,107,140,6916,236779,8992,885,19751,2173,2557,827,236743,236778,236768,196301,2977,107,140,6916,236779,8992,885,9259,1902,827,236743,236812,236768,196301,11058,12392,1935,107,140,6916,236779,8992,885,168602,3566,827,236743,236800,236768,196301,11058,168602,1935,107,140,12234,109,140,743,578,2977,107,140,1708,3658,528,503,236761,6966,885,230324,107,144,584,3658,2843,86679,107,148,236755,236779,26297,578,5980,236769,2234,236769,7212,236769,3485,677,236787,677,711,528,623,7393,86472,14196,6462,236836,827,3658,9670,107,148,584,505,236779,26297,1251,538,236787,9898,236761,3770,236769,3017,236768,107,140,2060,9898,109,2063,974,236779,69344,236779,236766,54587,236769,3017,1473,107,140,12234,3048,659,2238,496,3658,236761,5180,4209,563,531,1586,506,24119,124312,600,11979,1534,236743,107,140,13498,223853,699,506,1447,2678,529,506,3658,568,4925,13719,769,107,140,107,140,236847,172213,528,506,6534,532,16548,4038,236789,236745,1527,236761,9657,7738,2483,768,611,3782,236789,236745,107,140,4114,1027,124312,1645,506,2787,4194,236761,236743,108,140,3048,1149,9027,600,506,2238,2483,6097,5422,6064,1186,236761,108,140,12703,236787,107,140,828,236779,69344,236779,236766,54587,885,196180,5337,1373,196301,623,236756,236775,107,140,828,236779,69344,236779,236766,54587,885,55038,1373,196301,623,236836,236775,107,140,828,236779,69344,236779,236766,54587,885,41007,1373,196301,3679,107,140,828,236779,69344,236779,236766,54587,885,596,1373,196301,3679,107,140,12234,109,140,2063,563,236779,236766,54587,236769,574,236787,1540,236768,3921,7014,236787,107,144,2060,677,528,623,7393,86472,14196,6462,236836,236775,107,140,1708,858,528,2644,236769,3469,236769,3017,236768,753,236743,236778,236764,236743,236771,236764,753,236770,1473,107,144,584,563,236779,236766,54587,236769,3017,236840,236747,2812,532,711,563,236779,236766,54587,236769,3017,236840,236747,236772,236770,2812,532,711,563,236779,236766,54587,236769,3017,236840,236747,236862,236770,30957,107,148,2060,3658,236840,236747,236842,107,140,2060,3679,109,2063,4241,236779,16611,3852,236769,38511,1473,107,140,26610,107,140,3048,659,2238,496,1694,529,1156,16587,236764,1800,16587,4551,529,1932,107,140,5432,48251,113299,653,3107,62334,756,21725,1186,236761,107,140,11069,3195,563,531,2426,768,625,563,2653,531,168984,506,1156,16587,528,107,140,19256,1900,236764,600,506,9113,2483,795,577,1535,236761,107,140,236776,2483,555,563,4542,531,577,1535,768,532,1186,768,784,62334,528,555,107,140,733,20433,236761,1701,2591,236787,506,2483,93501,3507,232589,563,1535,236764,1651,506,2483,107,140,236789,3507,236789,563,711,236761,107,140,13293,756,10784,236789,768,993,236789,236751,496,1595,531,1386,496,1535,2483,236764,532,994,756,3771,236789,7394,236761,108,140,38408,236787,107,140,10480,236779,16611,3852,20768,825,40
692,24411,10190,1251,756,10784,236789,107,140,10480,236779,16611,3852,5551,1606,963,24411,10190,1251,756,3771,236789,107,140,26610,109,140,2063,4341,236779,16611,3852,236769,236751,236787,1540,236768,3921,7014,236787,107,144,26297,578,236743,236771,107,144,1708,677,528,503,236787,107,148,26297,578,38302,900,236743,236770,768,677,1251,132197,1663,38302,753,236743,236770,107,148,584,38302,655,236743,236771,236787,994,8450,107,144,2060,38302,1251,236743,236771,107,140,2060,623,10784,236775,768,4341,236779,16611,3852,236769,38511,236840,236771,236842,900,65145,236840,236770,2812,653,4341,236779,16611,3852,236769,38511,236840,236770,236842,900,65145,236840,236771,2812,1663,623,3771,236775,109,2063,5783,236769,2762,236764,620,1473,107,140,12234,107,140,26479,614,3499,4617,529,25630,532,496,4414,11995,620,236764,994,496,19372,1694,236743,107,140,1340,3861,620,607,506,5783,620,4945,528,4617,236761,108,140,12703,236743,236770,236787,108,144,4661,236787,4617,578,33368,236800,236764,753,236812,236764,236743,236810,1604,620,578,236743,236800,107,144,8433,236787,33368,236812,236764,753,236800,236764,236743,236810,236842,108,140,12703,236743,236778,236787,108,144,4661,236787,4617,578,870,236812,236764,753,236812,236764,236743,236812,1604,620,578,236743,236778,107,144,8433,236787,870,236812,236764,236743,236812,236842,108,140,12703,236743,236800,236787,108,144,4661,236787,4617,578,33368,236800,236764,236743,236778,236764,236743,236770,236764,236743,236778,236764,753,236770,236764,753,236778,236764,236743,236770,1604,620,578,236743,236770,107,144,8433,236787,870,236778,236842,108,140,10282,236787,107,144,236770,236761,669,3861,529,506,3499,795,577,528,506,2644,529,870,236770,236764,236743,236770,236771,236771,236771,1619,107,144,236778,236761,669,4820,528,506,3499,795,577,528,506,2644,529,33368,236770,236771,236771,236771,236764,236743,236770,236771,236771,236771,1619,107,144,236800,236761,236743,236771,6605,620,6605,5980,236769,2762,236768,107,140,12234,109,140,2060,19372,236769,40835,236769,2762,10309,59396,236770,105775,236767,2812,109,2063,3465,236769,38511,1473,107,140,12234,26479,496,1908,236772,11681,1694,529,25630,236764,994,506,2324,529,784,529,506,11049,4820,600,659,528,1581,10681,236761,107,140,108,140,38408,107,140,47213,5551,236810,236764,236743,236828,236764,236743,236832,236764,236743,236770,2812,196301,236743,236770,236778,107,140,47213,5551,236800,236764,236743,236800,236764,236743,236800,236764,236743,236800,236764,236743,236800,2812,196301,236743,236819,107,140,47213,5551,236800,236771,236764,236743,236770,236800,236764,236743,236778,236812,236764,236743,236800,236778,236770,2812,196301,236771,107,140,12234,109,140,2060,2324,236769,38511,236840,236747,236842,573,858,528,2644,236769,3469,236769,38511,1223,768,858,2144,236743,236778,1251,236743,236771,532,65145,236840,236747,236842,2144,236743,236778,1251,236743,236770,236768,109,2063,1138,236779,31493,236769,2762,236764,620,1473,107,140,12234,107,140,26479,496,1908,236772,11681,3499,529,25630,4617,532,614,11995,620,236764,994,107,140,1437,2324,529,506,4820,607,657,1346,1156,26798,699,506,1171,620,4820,529,4617,236761,108,140,12703,236787,108,144,4661,236787,4617,578,870,236770,236770,236770,236764,236778,236770,236764,236800,236764,236812,236771,236771,236771,236764,236810,236764,236825,236764,236832,236764,236828,236764,236819,1604,620,578,236743,236812,107,144,8433,236787,236743,236778,236812,997,2324,529,236743,236778,236770,900,236743,236800,108,140,40498,236787,107,144,236770,236761,236743,236770,6605,5980,236769,2762,236768,6605,236
743,236770,236771,236771,107,144,236778,236761,236743,236770,6605,620,6605,5980,236769,2762,236768,107,140,12234,109,140,2063,26798,236769,236781,236787,801,236768,3921,801,236787,107,144,236751,578,1540,236769,236781,236768,107,144,2060,5980,236769,236751,236768,753,236743,236770,768,503,236840,236771,236842,1251,75381,1663,5980,236769,236751,236768,107,140,2060,2324,236769,7212,236769,3485,1123,236787,26798,236769,236781,236768,6605,236743,236778,236764,4617,8497,236767,14430,109,2063,974,236779,19808,236779,1475,8938,236802,236769,236749,1473,107,140,12234,107,140,26479,496,4414,11995,538,236764,994,496,19372,1694,600,815,506,11049,4945,528,3995,11329,7501,236761,108,140,818,2111,8938,236802,56770,563,496,56770,528,29282,600,10928,496,7501,5221,107,140,527,5238,236787,1502,607,1027,4414,11995,538,236761,4298,1546,1941,563,4597,699,506,236743,107,140,32136,1941,618,5238,236787,768,506,3527,1941,563,1581,236764,506,2148,1941,563,886,3746,529,236743,107,140,1437,3527,1941,236761,1637,506,3527,1941,563,11049,236764,506,2148,1941,563,236743,236800,2782,506,3527,107,140,6061,2915,236743,236770,236761,669,56770,563,600,951,4217,1144,1550,529,538,236764,506,7501,795,2462,5370,236743,236770,236761,108,140,10282,236787,236743,107,144,236770,236761,2111,8938,236802,236769,236770,236768,563,870,236770,1619,107,144,236778,236761,8323,1694,19372,528,5683,1900,236761,108,140,2542,2591,236787,107,140,828,236779,19808,236779,1475,8938,236802,236769,236810,236768,7623,870,236770,236764,236743,236810,236842,997,669,3995,11329,7501,573,236743,236810,563,870,236810,236764,236743,236770,236825,236764,236743,236828,236764,236743,236812,236764,236743,236778,236764,236743,236770,1604,834,506,11049,4945,659,1186,236743,236770,236764,532,236743,236810,236761,107,140,12234,109,140,743,236764,1123,578,17811,538,107,140,6858,1123,2843,236743,236770,236787,107,144,584,1123,2144,236743,236778,1251,236743,236770,236787,9898,236761,3770,236769,236781,236768,107,144,236781,578,1123,973,236743,236778,768,1123,2144,236743,236778,1251,236743,236771,1663,1123,808,236743,236800,900,236743,236770,107,140,743,236761,3770,236769,236770,236768,107,140,2060,19372,236769,743,236768,109,2063,4341,236779,1896,236769,1896,1473,107,140,12234,3048,735,531,4903,496,1292,837,121019,496,2238,3433,2483,532,107,140,15072,6288,768,506,3433,563,4341,7394,8450,236761,107,140,818,3433,563,4341,768,784,529,506,2269,6366,659,14137,236787,107,140,236770,236761,669,3433,2483,563,711,7738,236761,107,140,236778,236761,669,1548,529,2668,563,711,2344,1082,236743,236770,653,3715,1082,236743,236800,236770,2668,573,3794,236743,236770,236764,236800,236764,236810,236764,236832,236764,236828,236764,236770,236771,236764,236770,236778,236761,1452,506,1548,529,2668,563,711,2344,1082,236743,236770,653,3715,1082,236743,236800,236771,2668,573,3794,236743,236812,236764,236825,236764,236819,236764,236770,236770,236761,1452,236764,506,1548,529,2668,563,711,2344,1082,236743,236770,653,3715,1082,236743,236778,236819,573,506,2297,236743,236778,236761,107,140,236800,236761,669,3794,1374,711,577,2344,1082,236743,236770,653,3715,1082,236743,236770,236778,236761,107,140,236812,236761,669,3433,1374,577,528,506,6518,236787,6929,236772,1650,236772,42420,108,140,1708,2591,236787,236743,107,140,4730,236779,1896,1033,236771,236800,236772,236770,236770,236772,236778,236771,236771,236771,1606,1477,6288,108,140,4730,236779,1896,1033,236770,236810,236772,236771,236770,236772,236778,236771,236770,236778,1606,1477,8450,108,140,4730,236779,1896,1033,236771,236812,236772,236771,236772,2367
78,236771,236812,236771,1606,1477,8450,108,140,4730,236779,1896,1033,236771,236825,236772,236771,236812,236772,236778,236771,236778,236771,1606,1477,6288,108,140,4730,236779,1896,1033,236771,236825,236786,236771,236812,236786,236778,236771,236778,236771,1606,1477,8450,107,140,12234,108,140,14356,578,870,236800,236770,236764,236743,236778,236819,236764,236743,236800,236770,236764,236743,236800,236771,236764,236743,236800,236770,236764,236743,236800,236771,236764,236743,236800,236770,236764,236743,236800,236770,236764,236743,236800,236771,236764,236743,236800,236770,236764,236743,236800,236771,236764,236743,236800,236770,236842,107,140,584,5980,236769,1896,236768,2843,236743,236770,236771,236787,994,8450,107,140,584,3433,236840,236778,236842,2843,75381,653,3433,236840,236810,236842,2843,9296,1083,994,8450,107,140,236757,236764,513,236764,570,578,3433,8497,236778,1604,3433,236840,236800,236787,236810,1604,3433,236840,236825,9218,107,140,584,711,520,236761,103053,825,653,711,513,236761,103053,825,653,711,570,236761,103053,6141,994,8450,107,140,236757,236764,513,578,801,236769,236757,779,801,236769,236753,236768,107,140,584,711,236743,236770,6605,520,6605,236743,236770,236778,236787,994,8450,107,140,584,711,236743,236770,6605,513,6605,2668,236840,236757,236772,236770,9414,994,8450,107,140,2060,6288,109,2063,9918,236779,8992,236769,7090,1473,107,140,26610,107,140,26479,496,2483,529,4171,236764,994,496,1694,529,4171,9918,580,118364,236764,768,951,39181,69109,7519,528,506,1816,611,107,140,16223,9918,580,162760,67008,768,951,162760,7519,611,1374,994,506,1548,529,3718,236772,4925,11739,607,11049,1900,528,506,107,140,87420,236764,4772,1033,236746,1606,578,236743,236771,236764,4772,1033,236763,1606,578,236743,236770,236764,3729,4772,1033,236802,1606,578,236743,236778,236810,107,140,38408,107,140,6966,236779,8992,885,9259,1902,26577,236743,245790,11058,9259,827,623,12392,236888,1935,107,140,6966,236779,8992,885,9259,236764,12392,26577,236743,245790,11058,9259,827,623,12392,236888,1935,107,140,6966,236779,8992,885,108250,1373,1251,236743,236800,236743,107,140,26610,108,140,203201,578,33228,1033,621,236749,236785,236750,236785,236745,1606,107,140,584,1027,5551,236781,528,19427,573,1123,528,118364,30957,994,19427,236761,6966,825,107,140,584,46583,528,19427,236787,994,19427,236761,6966,188001,107,140,26297,578,236743,236771,107,140,1708,677,528,19427,236787,107,144,584,677,236761,511,11462,825,532,568,778,236769,574,236768,753,4772,885,236746,5924,2144,236743,236778,1251,236743,236770,236787,38302,3323,236743,236770,107,140,2060,38302,109,2063,563,236779,40835,236769,38511,1473,107,140,26610,107,140,26479,496,1694,529,4945,236764,994,3363,653,711,901,659,19372,107,140,495,52557,1900,236761,1637,1694,815,919,1082,236743,236770,36929,529,506,1638,107,140,5640,236764,994,8450,236761,31952,951,5676,4945,532,1186,25630,236761,108,140,38408,107,140,511,236779,40835,5551,236810,2812,236743,245790,6288,107,140,511,236779,40835,5551,236770,236764,236743,236778,236764,236743,236800,236764,236743,236812,236764,236743,236810,2812,236743,245790,6288,107,140,511,236779,40835,5551,236770,236764,236743,236800,236764,236743,236778,236764,236743,236812,236764,236743,236810,2812,236743,245790,8450,107,140,511,236779,40835,5551,236770,236764,236743,236778,236764,236743,236800,236764,236743,236812,236764,236743,236810,236764,236743,236825,2812,236743,245790,6288,107,140,511,236779,40835,5551,236770,236764,236743,236778,236764,236743,236800,236764,236743,236812,236764,236743,236810,236764,236743,236825,236764,236743,236832,2812,
236743,245790,6288,107,140,511,236779,40835,5551,236770,236764,236743,236800,236764,236743,236778,236764,236743,236812,236764,236743,236810,236764,236743,236825,236764,236743,236832,2812,236743,245790,8450,107,140,511,236779,40835,5551,236770,236764,236743,236778,236764,236743,236778,236764,236743,236800,236764,236743,236800,236764,236743,236812,2812,236743,245790,6288,107,140,511,236779,40835,5551,236770,236764,236743,236778,236764,236743,236778,236764,236743,236778,236764,236743,236800,236764,236743,236812,2812,236743,245790,8450,107,140,26610,109,140,2861,578,12739,825,107,140,1708,1123,528,65145,236787,107,144,584,1123,711,528,1527,236787,1527,236840,236781,236842,578,236743,236771,107,144,2861,236840,236781,236842,3323,236743,236770,107,144,584,1527,236840,236781,236842,1890,236743,236778,236787,994,8450,107,140,2060,65145,1251,19372,236769,38511,236768,109,2063,18074,236769,32248,236770,236764,8448,236778,1473,107,140,12234,3048,659,2238,1156,20312,236764,107,140,3350,1546,8448,563,496,6727,529,25630,236761,1701,2591,236764,8448,578,568,3041,236764,1345,236768,578,568,236770,236764,236743,236778,769,107,140,818,2238,20312,659,6535,837,2820,600,506,8448,568,3041,236764,1345,236768,107,140,22239,1800,1502,532,1345,236761,107,140,2542,1546,2238,8448,236764,625,563,12718,600,1061,1502,563,2344,653,4745,1061,1345,236761,107,140,11069,4209,563,531,6054,3363,506,3861,529,18074,529,1239,1156,236743,107,140,95829,563,496,8355,1548,236761,107,140,12703,236764,506,18074,529,506,20312,568,236770,236764,236743,236800,779,568,236778,236764,236743,236812,236768,563,568,236778,236764,236743,236800,236768,107,140,7650,1061,3861,563,236743,236770,236764,837,711,496,8355,1548,236761,107,140,2859,506,3861,529,506,18074,563,496,8355,1548,236764,994,623,26915,827,107,140,88375,236764,994,623,7018,3056,107,140,2859,506,1156,20312,1537,236789,236745,33647,236764,994,623,7018,3056,109,140,236840,2427,236786,4606,236842,7365,236787,107,140,83593,3283,236770,236764,236743,236778,779,568,236778,236764,236743,236800,1223,196301,623,7018,236775,107,140,83593,120978,236770,236764,236743,236770,779,568,236771,236764,236743,236812,1223,196301,623,7018,236775,107,140,83593,120978,236800,236764,753,236770,779,15554,236810,236764,236743,236810,1223,196301,623,26915,236775,107,140,12234,109,140,2063,563,236779,2497,236769,236746,1473,107,144,2060,711,568,236746,655,236743,236778,653,1027,236769,236746,2144,1123,1251,236743,236771,573,1123,528,2644,236769,236778,236764,801,236769,236746,5213,236743,236771,236761,236810,236768,900,236743,236770,9670,107,140,584,8448,236770,236840,236771,236842,1890,8448,236778,236840,236771,9414,8448,236770,236764,8448,236778,578,8448,236778,236764,8448,236770,107,140,236752,236764,637,578,8448,236778,236840,236771,1604,1322,236769,32248,236770,236840,236770,1604,8448,236778,236840,236770,2812,107,140,2060,623,26915,236775,768,563,236779,2497,236769,236750,753,537,236768,1663,623,7018,236775,109,2063,13247,236779,193319,236769,2762,1473,107,140,12234,107,140,3048,659,2238,614,3499,4617,529,25630,532,611,1202,531,994,107,140,2330,529,81938,529,25630,37764,684,1698,529,784,11005,107,140,1340,1546,1548,528,506,3499,236764,10725,684,236743,236770,236764,753,236770,653,236743,236771,236761,107,140,10282,236787,994,5450,573,7738,4617,236761,108,140,12703,236787,107,140,22539,13247,236779,193319,5551,236770,236764,236743,236778,236764,236743,236778,236764,753,236812,2812,1251,753,236819,107,140,22539,13247,236779,193319,5551,236771,236764,236743,236770,2812,1251,236743,236771,107,140,22539,13247,
236779,193319,67713,1251,5450,107,140,12234,108,140,584,4617,1251,4955,994,5450,107,140,584,236743,236771,528,4617,236787,994,236743,236771,107,140,236751,236764,503,5440,578,236743,236771,236764,236743,236770,107,140,1708,1123,528,4617,236787,107,144,236751,3323,2951,236769,236781,236768,107,144,167669,32162,568,236781,973,2951,236769,236781,1223,107,140,2060,503,808,503,5440,109,2063,1322,3360,236769,8545,236764,620,1473,107,140,12234,107,140,26479,496,8411,607,646,12773,532,646,11312,568,236797,6867,236743,236778,236768,532,496,4414,11995,620,236764,236743,107,140,17136,2145,529,506,8411,6097,496,1550,236761,7685,11995,528,506,2644,870,236770,236764,646,808,646,236842,107,140,83950,7412,7121,3622,580,506,3874,529,506,8411,236761,108,140,3048,735,531,1586,506,7081,2479,529,3861,620,528,506,8411,236761,1599,740,1502,107,140,2543,1027,2145,236764,532,528,1546,2918,611,740,2827,531,1027,529,506,9171,3874,236764,107,140,495,1032,4171,236764,611,740,817,531,3874,837,4024,614,7377,607,611,1873,107,140,5447,236761,107,140,9366,5433,600,496,2479,529,3861,620,2820,16270,7121,620,3874,568,2217,107,140,87280,9245,769,107,140,3048,22662,15954,817,1135,506,8411,236761,107,140,236776,2479,562,568,1340,3861,620,236768,563,4542,2344,1082,496,2479,603,568,1340,3861,620,236768,768,107,140,9900,3043,506,11496,15852,529,506,2979,580,506,3874,600,562,532,603,817,107,140,17429,568,1184,236789,236751,2246,1091,65145,236779,236776,532,65145,236779,236799,779,65145,236779,236776,563,195867,69565,2344,107,140,14560,65145,236779,236799,236764,528,1032,4171,236764,993,2849,614,11995,3546,858,568,236770,6605,858,6605,620,236768,107,140,17887,600,65145,236779,236776,236840,236747,236842,655,65145,236779,236799,236840,236747,236842,532,573,1027,673,568,236770,6605,673,655,858,236768,692,735,107,140,38511,236779,236776,236840,236804,236842,578,65145,236779,236799,236840,236804,1619,107,140,1509,563,23535,600,506,3890,563,4709,236761,107,140,13293,614,11496,1694,529,506,2979,580,506,3874,600,506,7081,2479,817,1343,236761,108,140,38408,236787,108,144,4661,236787,8411,578,870,870,236770,236764,236778,236764,236800,1604,870,236812,236764,236810,236764,236825,1604,870,236832,236764,236828,236764,236819,36878,620,578,236743,236800,107,144,8433,236787,870,236770,236764,236743,236778,236764,236743,236770,236842,108,144,4661,236787,8411,578,870,870,236810,236764,236819,236764,236800,1604,870,236812,236764,236770,236764,236825,1604,870,236832,236764,236828,236764,236778,36878,620,578,236743,236770,107,144,8433,236787,870,236770,236842,107,140,12234,109,140,236797,578,5980,236769,8545,236768,107,140,236781,236764,570,578,236743,236771,236764,236743,236771,107,140,1708,858,528,2644,236769,236797,1473,107,144,1708,673,528,2644,236769,236797,1473,107,148,584,8411,236840,236747,2585,236804,236842,1251,236743,236770,236787,107,152,236781,236764,570,578,858,236764,673,107,140,2923,578,646,808,646,107,140,584,1123,1890,236743,236771,236787,37700,578,1322,236769,2923,236764,8411,236840,236781,753,236743,236770,2585,236762,2812,107,140,584,1123,655,646,753,236743,236770,236787,37700,578,1322,236769,2923,236764,8411,236840,236781,900,236743,236770,2585,236762,2812,107,140,584,570,1890,236743,236771,236787,37700,578,1322,236769,2923,236764,8411,236840,236781,2585,236762,753,236743,236770,2812,107,140,584,570,655,646,753,236743,236770,236787,37700,578,1322,236769,2923,236764,8411,236840,236781,2585,236762,900,236743,236770,2812,107,140,2060,870,236770,768,858,2144,236743,236778,1251,236743,236771,1663,37700,573,858,528,2644,236769,236767,7066
,109,2063,2077,236769,236749,1473,107,140,12234,40651,10077,123466,7501,236764,625,691,9971,19297,684,167134,528,236743,107,140,1437,1774,4628,24744,236761,3153,236764,1144,1331,1537,236789,236745,1281,563,20891,78113,7501,236761,107,140,92580,78113,7501,563,5221,684,506,49498,236787,107,140,12233,236769,236770,236768,578,236743,236800,107,140,12233,236769,236749,236768,578,236743,236770,900,538,965,236743,236778,236764,768,538,563,1581,236761,107,140,12233,236769,236749,236768,578,138,12233,236769,236749,753,236743,236770,236768,900,2077,236769,236749,753,236743,236778,236768,900,2077,236769,236749,900,236743,236770,779,768,538,563,11049,236761,107,140,2542,2591,236787,107,140,12233,236769,236778,236768,578,236743,236770,900,568,236778,965,236743,236778,236768,578,236743,236778,107,140,12233,236769,236812,236768,578,236743,236800,107,140,12233,236769,236800,236768,578,2077,236769,236778,236768,900,2077,236769,236770,236768,900,2077,236769,236812,236768,107,147,236784,236743,236778,900,236743,236800,900,236743,236800,578,236743,236828,236743,107,140,3048,659,2238,496,1908,236772,27851,11995,1548,538,236764,611,735,531,496,994,496,1694,529,506,236743,107,140,6005,538,900,236743,236770,4945,529,506,20891,78113,7501,236761,107,140,38408,236787,107,140,12233,236769,236800,236768,578,870,236770,236764,236743,236800,236764,236743,236778,236764,236743,236828,236842,107,140,12234,109,140,584,538,1251,236743,236771,236787,994,870,236770,236842,107,140,584,538,1251,236743,236770,236787,994,870,236770,236764,236743,236800,236842,107,140,743,578,870,236770,236764,236743,236800,236842,107,140,1708,858,528,2644,236769,236778,236764,538,900,236743,236770,1473,107,144,584,858,2144,236743,236778,1251,236743,236771,236787,107,148,743,236761,3770,236769,236770,900,858,965,236743,236778,236768,107,144,4454,236787,107,148,743,236761,3770,236769,743,17825,236770,236842,900,9898,17825,236778,236842,900,236743,236770,900,568,236747,900,236743,236770,236768,965,236743,236778,236768,107,140,2060,9898,109,2063,26798,236769,236749,1473,107,140,12234,26479,496,4414,11995,538,236764,994,506,1698,529,506,11049,26798,236761,107,140,13293,236743,236771,768,784,26798,659,1581,236761,107,140,2542,2591,236787,107,140,67161,236769,236770,236768,138,1224,236743,236770,107,140,67161,236769,236812,236768,138,1224,236743,236771,107,140,67161,236769,236778,236800,236810,236768,1251,236743,236770,236810,107,140,12234,109,140,5594,236779,19808,236764,13247,578,8450,236764,236743,236770,107,140,1708,677,528,1540,236769,236749,1473,107,144,584,801,236769,574,236768,2144,236743,236778,1251,236743,236770,236787,107,148,5594,236779,19808,578,6288,107,148,15981,32162,801,236769,574,236768,107,140,2060,236743,236771,768,711,815,236779,19808,1663,13247,109,2063,563,236779,103895,236769,2383,1473,107,140,26610,107,140,6924,496,1292,600,4716,496,2483,618,2744,837,6097,1186,6281,41706,236761,107,140,818,1292,1374,994,6288,768,532,1186,768,993,563,496,4341,127002,529,41706,236743,107,140,3350,657,3198,886,34704,528,506,127002,563,43927,236761,108,140,511,236779,103895,59558,3805,117918,236743,245790,6288,107,140,511,236779,103895,1033,3805,175478,69929,28002,236840,3805,1606,236743,245790,8450,107,140,511,236779,103895,1033,49941,1606,236743,245790,8450,107,140,511,236779,103895,1033,3805,1606,236743,245790,8450,107,140,511,236779,103895,1033,28002,2585,10660,1606,236743,245790,6288,107,140,511,236779,103895,59558,3805,200951,1606,236743,245790,6288,107,140,26610,109,140,1708,858,528,2644,236769,3469,236769,2383,16644,107,144,584,2483,236840,236
[Raw prompt payload omitted: the remainder of this hunk is a single hard-wrapped line of comma-separated Gemma-3 BPE token IDs (one of the generated --tokens-file prompt files), truncated mid-number at both ends of this excerpt. The recurring structural IDs suggest it encodes a series of short function definitions with docstrings, but the dump is not human-readable as committed.]
768,808,568,236781,753,236743,236800,236768,578,753,236825,900,236743,236770,236770,236781,753,236743,236825,236781,236884,236778,900,1123,236884,236800,107,140,236770,236761,236771,107,140,12234,109,140,12275,236751,578,870,28570,236840,236747,236842,808,858,573,858,528,2644,236769,236770,236764,5980,236769,28570,50796,107,140,2063,6051,236769,236781,1473,107,144,2060,6356,236769,28570,236764,1123,236768,107,140,2063,17407,236769,236781,1473,107,144,2060,6356,236769,12275,236751,236764,1123,236768,107,140,107,140,236781,236764,60429,578,236743,236771,236764,236743,236770,236744,236772,236810,107,140,1708,2222,528,2644,236769,236770,236771,236771,236771,1473,107,144,16912,578,6051,236769,236781,236768,107,144,3405,236781,578,17407,236769,236781,236768,107,144,584,2951,236769,16912,236768,655,60429,236787,2541,107,144,236781,578,1123,753,59730,965,9101,236781,108,140,2060,1123,110,2063,4260,236779,23362,236769,236752,236787,1694,1473,107,140,12234,2094,1292,4716,496,1694,537,532,7623,496,1694,537,236789,1288,600,107,140,236752,236789,563,15779,531,537,528,506,3114,695,600,659,711,69330,684,1806,236764,1651,1061,2979,657,506,3114,695,600,659,69330,684,1806,659,4745,107,140,1071,506,2979,529,506,7041,3114,695,529,537,236764,840,19372,236761,107,140,22539,4260,236779,23362,5551,236770,236764,236743,236778,236764,236743,236800,2812,107,140,236840,236770,236764,236743,236778,236764,236743,236800,236842,107,140,22539,4260,236779,23362,5551,236810,236764,236743,236825,236764,236743,236800,236764,236743,236812,236764,236743,236828,236764,236743,236819,236764,236743,236778,2812,107,140,236840,236778,236764,236743,236825,236764,236743,236800,236764,236743,236812,236764,236743,236828,236764,236743,236819,236764,236743,236810,236842,107,140,12234,108,140,23362,578,870,236752,236840,236747,236842,573,858,528,2644,236769,3469,236769,236752,1223,768,858,2144,236743,236800,1251,236743,236771,236842,107,140,23362,236761,10479,825,107,140,2060,870,23362,236840,236747,973,236743,236800,236842,768,858,2144,236743,236800,1251,236743,236771,1663,537,236840,236747,236842,573,858,528,2644,236769,3469,236769,236752,50796,110,2063,4709,236769,236752,236787,1694,1473,107,140,12234,13293,19372,4709,4820,528,496,1694,107,140,22539,4709,5551,236810,236764,236743,236800,236764,236743,236810,236764,236743,236778,236764,236743,236800,236764,236743,236800,236764,236743,236819,236764,236743,236771,236764,236743,236770,236778,236800,2812,107,140,236840,236771,236764,236743,236778,236764,236743,236800,236764,236743,236810,236764,236743,236819,236764,236743,236770,236778,236800,236842,107,140,12234,108,140,2060,19372,236769,1025,236769,236752,1223,110,2063,2631,236779,7011,236769,236752,236787,1694,1473,107,140,12234,13293,5783,3408,528,506,1694,236761,107,140,22539,2631,236779,7011,5551,236770,236764,236743,236778,236764,236743,236800,2812,107,140,236800,107,140,22539,2631,236779,7011,5551,236810,236764,236743,236800,236764,753,236810,236764,236743,236778,236764,753,236800,236764,236743,236800,236764,236743,236819,236764,236743,236771,236764,236743,236770,236778,236800,236764,236743,236770,236764,753,236770,236771,2812,107,140,236770,236778,236800,107,140,12234,108,140,2060,2631,236769,236752,236768,110,2063,142747,236779,149986,236769,236749,236787,801,1473,107,140,12234,13293,506,1548,529,2782,506,15958,236743,236832,7412,528,25630,2344,1082,538,837,659,69330,684,236743,236770,236770,653,236743,236770,236800,236761,107,140,22539,142747,236779,149986,236769,236810,236771,236768,107,140,236771,107,140,22539,142747,236779,149986
,236769,236832,236828,236768,107,140,236778,107,140,22539,142747,236779,149986,236769,236832,236819,236768,107,140,236800,107,140,12234,108,140,26297,578,236743,236771,107,140,1708,858,528,2644,236769,236749,1473,107,144,584,858,2144,236743,236770,236770,1251,236743,236771,653,858,2144,236743,236770,236800,1251,236743,236771,236787,107,148,26297,3323,5980,236769,2234,236769,7212,236769,3485,505,236787,505,1251,623,236832,827,1540,236769,236747,41052,107,140,2060,38302,110,2063,4260,236779,20952,236769,236752,236787,1694,1473,107,140,12234,2094,1292,4716,496,1694,537,532,7623,496,1694,537,236789,1288,600,107,140,236752,236789,563,15779,531,537,528,506,11049,3114,695,236764,1651,1061,2979,657,506,1581,3114,695,659,4745,107,140,1071,506,2979,529,506,1581,3114,695,529,537,236764,840,19372,236761,107,140,22539,4260,236779,20952,5551,236770,236764,236743,236778,236764,236743,236800,2812,107,140,236840,236770,236764,236743,236778,236764,236743,236800,236842,107,140,22539,4260,236779,20952,5551,236810,236764,236743,236825,236764,236743,236800,236764,236743,236812,2812,107,140,236840,236800,236764,236743,236825,236764,236743,236810,236764,236743,236812,236842,107,140,12234,108,140,20952,578,870,236752,236840,236747,236842,573,858,528,2644,236769,3469,236769,236752,1223,768,858,2144,236743,236778,1251,236743,236771,236842,107,140,20952,236761,10479,825,107,140,2060,870,20952,236840,236747,973,236743,236778,236842,768,858,2144,236743,236778,1251,236743,236771,1663,537,236840,236747,236842,573,858,528,2644,236769,3469,236769,236752,50796,110,2063,41193,236779,123043,236769,236751,236787,1540,1473,107,140,12234,107,140,15072,35509,2483,684,31509,4402,529,1806,7579,236761,107,140,12234,107,140,236865,9918,2483,531,4402,236761,7714,529,3861,236743,236800,236761,107,140,19243,578,870,236751,17576,236800,808,858,1473,1387,3283,236800,808,858,900,236743,236800,779,5980,236769,236751,50796,573,858,528,2644,3283,3469,236769,236751,236768,900,236743,236778,236768,973,236743,236800,7066,107,140,236865,8881,4820,528,1546,2299,236761,29506,2299,815,18661,4820,1082,236743,236800,236761,107,140,19243,578,21652,4043,236840,236770,9218,900,2299,236840,236771,2812,768,5980,236769,4043,236768,1251,236743,236800,1663,2299,573,2299,528,4402,236842,107,140,2060,116740,7013,236769,19243,236768,109,2063,39961,236779,123043,236769,236751,236787,1540,1473,107,140,12234,107,140,101108,618,2744,2483,35509,607,41193,236779,123043,1292,236761,15543,64686,2483,236761,107,140,12234,108,140,19243,578,870,236751,17576,236800,808,858,1473,1387,3283,236800,808,858,900,236743,236800,779,5980,236769,236751,50796,573,858,528,2644,3283,3469,236769,236751,236768,900,236743,236778,236768,973,236743,236800,7066,107,140,19243,578,21652,4043,236840,236778,236842,900,2299,8497,236778,2812,768,5980,236769,4043,236768,1251,236743,236800,1663,2299,573,2299,528,4402,236842,107,140,2060,116740,7013,236769,19243,236768,110,2063,8355,236779,73368,236769,236749,236787,801,1473,107,140,12234,107,140,2497,236779,73368,7623,538,236772,594,1548,600,563,496,123466,1548,532,625,236789,236751,992,8355,236761,107,140,22539,8355,236779,73368,236769,236770,236768,107,140,236778,107,140,22539,8355,236779,73368,236769,236778,236768,107,140,236800,107,140,22539,8355,236779,73368,236769,236800,236768,107,140,236810,107,140,22539,8355,236779,73368,236769,236812,236768,107,140,236770,236800,107,140,22539,8355,236779,73368,236769,236810,236768,107,140,236828,236819,107,140,12234,109,140,1106,4940,107,140,2063,161533,236779,36367,495,236769,236749,236764,620,236784,236770
,236771,1473,107,144,12234,3694,768,538,563,8355,1699,506,17277,236772,236794,50605,2655,2027,1594,29738,107,144,584,538,655,236743,236778,236787,107,148,2060,8450,107,144,584,538,1251,236743,236778,653,538,1251,236743,236800,236787,107,148,2060,6288,107,144,584,538,2144,236743,236778,1251,236743,236771,236787,107,148,2060,8450,108,144,236750,578,236743,236771,107,144,236753,578,538,753,236743,236770,107,144,6858,513,2144,236743,236778,1251,236743,236771,236787,107,148,236750,3323,236743,236770,107,148,236753,973,236784,236743,236778,108,144,1708,2222,528,2644,236769,236767,1473,107,148,236746,578,4940,236761,33156,236769,236778,236764,538,753,236743,236778,236768,107,148,236781,578,5983,236769,236746,236764,513,236764,538,236768,107,148,584,1123,1251,236743,236770,653,1123,1251,538,753,236743,236770,236787,107,152,23162,107,148,1708,2222,528,2644,236769,236750,753,236743,236770,1473,107,152,236781,578,5983,236769,236781,236764,236743,236778,236764,538,236768,107,152,584,1123,1251,538,753,236743,236770,236787,107,156,7284,107,148,4454,236787,107,152,2060,8450,108,144,2060,6288,108,140,236755,236779,2497,578,236743,236771,107,140,236746,236764,518,578,236743,236771,236764,236743,236770,107,140,6858,505,236779,2497,655,538,236787,107,144,236746,236764,518,578,518,236764,496,900,518,107,144,584,161533,236779,36367,495,236769,236763,1473,107,148,236755,236779,2497,3323,236743,236770,107,140,2060,518,110,2063,107170,236779,2330,236779,1071,236779,13321,236769,236752,236787,1694,1473,107,140,12234,107,140,12233,2649,236779,2330,236779,1071,236779,13321,4716,496,1694,529,25630,618,614,2744,236761,107,140,509,7623,6288,768,993,659,1806,9245,4820,528,506,1694,600,107,140,2330,531,5743,236764,532,8450,7394,236761,108,140,22539,107170,236779,2330,236779,1071,236779,13321,5551,236770,236764,236743,236800,236764,236743,236810,236764,236743,236771,2812,107,140,9277,107,140,22539,107170,236779,2330,236779,1071,236779,13321,5551,236770,236764,236743,236800,236764,753,236778,236764,236743,236770,2812,107,140,4339,107,140,22539,107170,236779,2330,236779,1071,236779,13321,5551,236770,236764,236743,236778,236764,236743,236800,236764,236743,236832,2812,107,140,9277,107,140,22539,107170,236779,2330,236779,1071,236779,13321,5551,236778,236764,236743,236812,236764,753,236810,236764,236743,236800,236764,236743,236819,236764,236743,236832,2812,107,140,4339,107,140,22539,107170,236779,2330,236779,1071,236779,13321,5551,236770,2812,107,140,9277,107,140,12234,108,140,1708,858,528,2644,236769,3469,236769,236752,16644,107,144,1708,673,528,2644,236769,3469,236769,236752,16644,107,148,1708,620,528,2644,236769,3469,236769,236752,16644,107,152,584,858,2843,673,532,858,2843,620,532,673,2843,620,532,537,236840,236747,236842,900,537,236840,236804,236842,900,537,236840,236767,236842,1251,236743,236771,236787,107,156,2060,6288,107,140,2060,8450,110,2063,1295,236779,42255,236779,61546,236769,236749,236787,801,1473,107,140,12234,107,140,70895,496,4284,600,236789,236751,496,13275,6850,51853,1440,1757,236761,107,140,236749,9371,659,9204,2378,531,1447,236793,138,4973,745,15665,236764,496,1607,1076,529,538,9371,107,140,733,9204,1447,531,2378,236761,139,818,1156,7093,529,9371,1502,855,1646,1401,2793,699,107,140,17136,1032,236761,138,3243,9371,2827,528,506,1638,4249,236761,138,11634,9371,659,1176,531,98230,107,140,14730,496,1295,600,236789,236751,6049,2378,531,1447,16737,496,1295,600,236789,236751,6049,1447,531,2378,236761,107,140,9675,236764,506,9371,659,51853,47170,532,3188,236793,618,496,1354,236764,901,4102,6049,107,140,495,910,33744
,618,768,901,1602,711,98230,236761,108,140,2094,1292,26054,506,1548,529,1288,39362,236761,107,140,12234,109,140,2060,538,5213,236743,236778,110,2063,76541,236779,2234,236769,236752,236787,1694,1473,107,140,12234,13293,1694,607,4820,19104,524,684,236743,236770,236761,107,140,22539,76541,236779,2234,5551,236770,236764,236743,236778,236764,236743,236800,2812,107,140,236840,236778,236764,236743,236800,236764,236743,236812,236842,107,140,22539,76541,236779,2234,5551,236810,236764,236743,236800,236764,236743,236810,236764,236743,236778,236764,236743,236800,236764,236743,236800,236764,236743,236819,236764,236743,236771,236764,236743,236770,236778,236800,2812,107,140,236840,236825,236764,236743,236812,236764,236743,236825,236764,236743,236800,236764,236743,236812,236764,236743,236812,236764,236743,236770,236771,236764,236743,236770,236764,236743,236770,236778,236812,236842,107,140,12234,108,140,2060,870,236781,900,236743,236770,573,1123,528,537,236842,110,2063,14491,236779,2330,236779,1071,236779,13321,236769,236752,1473,107,140,12234,107,140,73246,236779,2330,236779,1071,236779,13321,4716,496,1694,529,25630,618,614,2744,236761,107,140,509,7623,6288,768,993,659,1156,9245,4820,528,506,1694,600,107,140,2330,531,5743,236764,532,8450,7394,236761,107,140,22539,14491,236779,2330,236779,1071,236779,13321,5551,236770,236764,236743,236800,236764,236743,236810,236764,236743,236771,2812,107,140,9277,107,140,22539,14491,236779,2330,236779,1071,236779,13321,5551,236770,236764,236743,236800,236764,753,236778,236764,236743,236770,2812,107,140,9277,107,140,22539,14491,236779,2330,236779,1071,236779,13321,5551,236770,236764,236743,236778,236764,236743,236800,236764,236743,236832,2812,107,140,9277,107,140,22539,14491,236779,2330,236779,1071,236779,13321,5551,236778,236764,236743,236812,236764,753,236810,236764,236743,236800,236764,236743,236810,236764,236743,236832,2812,107,140,4339,107,140,22539,14491,236779,2330,236779,1071,236779,13321,5551,236770,2812,107,140,9277,107,140,12234,108,140,1708,858,528,2644,236769,3469,236769,236752,16644,107,144,1708,673,528,2644,236769,3469,236769,236752,16644,107,148,584,858,2843,673,532,537,236840,236747,236842,900,537,236840,236804,236842,1251,236743,236771,236787,107,152,2060,6288,107,140,2060,8450,110,2063,2352,236779,5521,236769,236781,236787,801,236764,3225,236787,801,1473,107,140,12234,9233,16688,3225,529,2744,1548,1123,531,3225,236761,107,140,2060,2483,10065,1308,506,14274,236761,107,140,5521,4945,659,2344,1082,236743,236770,236771,236761,107,140,22539,2352,236779,5521,236769,236828,236764,236743,236800,236768,107,140,236789,236778,236778,236789,107,140,22539,2352,236779,5521,236769,236828,236764,236743,236778,236768,107,140,236789,236770,236771,236771,236771,236789,107,140,22539,2352,236779,5521,236769,236832,236764,236743,236778,236768,107,140,236789,236770,236770,236770,236789,107,140,12234,109,140,584,1123,1251,236743,236771,236787,994,623,236771,236775,107,140,4243,578,3679,107,140,6858,1123,2843,236743,236771,236787,107,144,4243,578,1540,236769,236781,2144,3225,236768,900,2461,107,144,236781,973,236784,3225,107,140,2060,2461,110,2063,17852,236779,7376,236769,236746,236764,534,1473,107,140,12234,26479,3861,529,496,2678,532,1494,994,2433,573,496,17852,236761,107,140,22539,17852,236779,7376,236769,236810,236764,236743,236800,236768,107,140,236832,236761,236810,107,140,12234,109,140,2060,496,808,534,965,236743,236778,110,2063,10779,236812,236769,236749,236787,801,1473,107,140,12234,818,56287,236812,1548,7501,563,496,7501,3361,531,506,56287,4142,42280,4659,866,588,600,23
6789,236751,5221,618,5238,236787,107,140,73368,236812,236769,236771,236768,3921,236743,236771,107,140,73368,236812,236769,236770,236768,3921,236743,236771,107,140,73368,236812,236769,236778,236768,3921,236743,236778,107,140,73368,236812,236769,236800,236768,3921,236743,236771,107,140,73368,236812,236769,236749,236768,3921,10779,236812,236769,236749,236772,236770,236768,900,10779,236812,236769,236749,236772,236778,236768,900,10779,236812,236769,236749,236772,236800,236768,900,10779,236812,236769,236749,236772,236812,769,107,140,9366,4903,496,1292,531,23057,16333,506,538,236772,594,3408,529,506,10779,236812,1548,7501,236761,138,6294,711,1161,74175,236761,107,140,22539,10779,236812,236769,236810,236768,107,140,236812,107,140,22539,10779,236812,236769,236825,236768,107,140,236828,107,140,22539,10779,236812,236769,236832,236768,107,140,236770,236812,107,140,12234,108,140,107,140,584,538,1251,236743,236771,236787,107,144,2060,236743,236771,107,140,36208,538,1251,236743,236770,236787,107,144,2060,236743,236771,107,140,36208,538,1251,236743,236778,236787,107,144,2060,236743,236778,107,140,36208,538,1251,236743,236800,236787,107,144,2060,236743,236771,107,140,4454,236787,107,144,236746,236764,518,236764,505,236764,513,578,236743,236771,236764,236743,236771,236764,236743,236778,236764,236743,236771,107,144,1708,858,528,2644,236769,236812,236764,538,900,236743,236770,1473,107,148,236746,236764,518,236764,505,236764,513,578,518,236764,505,236764,513,236764,496,900,518,900,505,900,513,107,144,2060,513,110,2063,20522,236769,236752,236787,1694,1473,107,140,12234,13293,20522,529,4820,528,506,1694,537,236761,107,140,22539,20522,5551,236800,236764,236743,236770,236764,236743,236778,236764,236743,236812,236764,236743,236810,2812,107,140,236800,107,140,22539,20522,99097,236770,236771,236764,236743,236812,236764,236743,236825,236764,236743,236770,236771,236771,236771,236764,236743,236770,236771,236764,236743,236778,236771,2812,107,140,236770,236810,236761,236771,107,140,12234,109,140,40835,236779,236752,578,19372,236769,236752,236768,107,140,584,5980,236769,236752,236768,2144,236743,236778,1251,236743,236770,236787,107,144,2060,19372,236779,236752,236840,3469,236769,236752,236768,973,236743,236778,236842,107,140,4454,236787,107,144,2060,568,40835,236779,236752,236840,3469,236769,236752,236768,973,236743,236778,753,236743,236770,236842,900,19372,236779,236752,236840,3469,236769,236752,236768,973,236743,236778,2812,965,236743,236778,110,2063,563,236779,227147,236769,1005,236787,1540,1473,107,140,12234,107,140,102854,768,2238,2483,563,496,142193,107,140,22539,563,236779,227147,68560,107,140,4339,107,140,22539,563,236779,227147,1033,6525,1606,107,140,4339,107,140,22539,563,236779,227147,1033,50354,236746,1606,107,140,4339,107,140,22539,563,236779,227147,1033,64060,2692,1606,107,140,9277,107,140,12234,108,140,2060,1816,1251,1816,186487,236770,236842,110,2063,1120,236758,236769,236749,236787,801,236764,510,236787,801,1473,107,140,12234,13293,236743,236778,236884,236749,53307,510,568,1553,7039,529,5562,1229,769,107,140,22539,1120,236758,236769,236800,236764,236743,236810,236768,107,140,236800,107,140,22539,1120,236758,236769,236770,236770,236771,236770,236764,236743,236770,236771,236770,236768,107,140,236778,107,140,22539,1120,236758,236769,236771,236764,236743,236770,236771,236770,236768,107,140,236770,107,140,22539,1120,236758,236769,236800,236764,236743,236770,236770,236768,107,140,236828,107,140,22539,1120,236758,236769,236770,236771,236771,236764,236743,236770,236771,236770,236768,107,140,236770,107,140,12234,109
,140,619,236764,1123,578,236743,236770,236764,236743,236778,107,140,6858,538,2843,236743,236771,236787,107,144,584,538,2144,236743,236778,1251,236743,236770,236787,107,148,619,578,766,808,1123,2144,510,107,144,236781,578,1123,808,1123,2144,510,107,144,236749,973,236784,236743,236778,107,140,2060,766,2144,510,110,2063,41193,236779,17631,236769,236751,236787,1540,1473,107,140,12234,107,140,15072,35509,2483,684,34064,1418,2872,684,236743,236810,528,506,30796,236761,107,140,12234,107,140,2060,116740,7013,5551,37952,36583,778,236769,574,236768,900,236743,236810,753,4772,885,236746,5924,2144,236743,236778,236825,236768,900,4772,885,236746,5924,573,677,528,503,2812,109,2063,39961,236779,17631,236769,236751,236787,1540,1473,107,140,12234,107,140,101108,618,2744,2483,35509,607,41193,236779,17631,1292,236761,15543,64686,2483,236761,107,140,12234,109,140,2060,116740,7013,5551,37952,3283,778,236769,574,236768,753,4772,885,236746,1373,753,236743,236810,900,236743,236778,236825,236768,2144,236743,236778,236825,900,4772,885,236746,5924,573,677,528,503,2812,110,2063,6349,236779,236766,172213,236769,1005,1473,107,140,12234,107,140,6773,236779,236766,172213,563,496,1292,600,4716,2483,532,7623,2483,2180,147734,236761,107,140,22539,6349,236779,236766,172213,68560,107,140,10440,107,140,22539,6349,236779,236766,172213,885,108250,236785,829,236754,120715,236757,1373,107,140,236789,236763,149009,236785,829,94238,4998,236757,236789,107,140,22539,6349,236779,236766,172213,1033,108250,1606,107,140,236789,236763,149009,236789,107,140,22539,6349,236779,236766,172213,1033,50354,236746,1606,107,140,10440,107,140,22539,6349,236779,236766,172213,1033,9236,6552,236776,1606,107,140,236789,236799,236789,107,140,22539,6349,236779,236766,172213,1033,64060,2692,1606,107,140,236789,64060,2692,236789,107,140,12234,108,140,2060,116740,7013,236769,2234,236769,7212,236769,3485,677,236787,677,711,528,623,7393,86472,14196,6462,236836,827,1816,9670,110,2063,3426,236779,34436,236769,236752,236787,1694,236764,494,236787,801,1473,107,140,12234,13293,6288,768,784,4945,528,506,1694,537,659,3426,14272,494,236761,107,140,22539,3426,236779,34436,5551,236770,236764,236743,236778,236764,236743,236812,236764,236743,236770,236771,1604,236743,236770,236771,236771,236768,107,140,4339,107,140,22539,3426,236779,34436,5551,236770,236764,236743,236778,236771,236764,236743,236812,236764,236743,236770,236771,1604,236743,236810,236768,107,140,9277,107,140,12234,108,140,2060,784,236769,236781,655,494,573,1123,528,537,236768,110,2063,1138,236769,236781,236787,801,236764,570,236787,801,1473,107,140,12234,3218,1156,4945,1123,532,570,107,140,22539,1138,236769,236778,236764,236743,236800,236768,107,140,236810,107,140,22539,1138,236769,236810,236764,236743,236832,236768,107,140,236770,236778,107,140,12234,108,140,2060,1123,900,570,110,2063,1638,236779,50472,236769,236751,236771,236787,1540,236764,503,236770,236787,1540,1473,107,140,12234,107,140,6845,768,1156,4171,735,506,1638,7579,236761,107,140,22539,1638,236779,50472,1033,236744,200500,9961,9961,963,756,116794,9961,9961,231064,1650,57528,28180,1606,107,140,4339,107,140,22539,1638,236779,50472,1033,200500,963,756,1650,1650,116794,28180,1606,107,140,4339,107,140,22539,1638,236779,50472,1033,1650,1650,116794,28180,963,756,200500,1606,107,140,4339,107,140,22539,1638,236779,50472,1033,236744,200500,963,756,1650,1650,116794,28180,1606,107,140,9277,107,140,22539,1638,236779,50472,1033,200500,963,756,1650,1650,116794,596,588,1606,107,140,9277,107,140,22539,1638,236779,50472,1033,236744,200500,9961,9961,963,756,116794,9
961,9961,231064,1650,1650,28180,1606,107,140,9277,107,140,12234,108,140,2060,1076,236769,236751,236771,236768,1251,1076,236769,236751,236770,236768,110,2063,10779,236769,236749,236787,801,1473,107,140,12234,13293,538,236772,594,123466,1548,236761,107,140,22539,10779,236769,236770,236771,236768,107,140,236810,236810,107,140,22539,10779,236769,236770,236768,107,140,236770,107,140,22539,10779,236769,236828,236768,107,140,236778,236770,107,140,12234,109,140,584,538,1251,236743,236771,236787,994,236743,236771,107,140,584,538,6605,236743,236778,236787,994,236743,236770,107,140,236746,236764,518,578,236743,236770,236764,236743,236770,107,140,1708,2222,528,2644,236769,236800,236764,538,900,236743,236770,1473,107,144,236746,236764,518,236764,578,518,236764,496,900,518,107,140,2060,518,110,2063,4338,236779,144765,236769,212439,236787,1540,1473,107,140,12234,41706,563,496,2483,529,19019,236775,532,25052,3056,107,140,2060,6288,768,1418,8115,34704,815,496,7041,16996,34704,236761,108,140,22539,4338,236779,144765,30141,1373,107,140,9277,107,140,22539,4338,236779,144765,30141,130590,107,140,4339,107,140,22539,4338,236779,144765,885,6143,1798,6985,1373,107,140,4339,107,140,22539,4338,236779,144765,885,1798,52715,1373,107,140,9277,107,140,12234,109,140,26297,578,236743,236771,107,140,1708,1123,528,41706,236787,107,144,584,1123,1251,19019,1083,38302,3323,236743,236770,107,144,584,1123,1251,25052,1083,38302,14599,236743,236770,107,144,584,38302,655,236743,236771,236787,994,8450,107,140,2060,38302,1251,236743,236771,110,2063,84613,236769,236752,236787,1694,1473,107,140,12234,13293,6288,563,1694,4820,659,140977,5683,653,22932,236761,107,140,22539,84613,5551,236770,236764,236743,236778,236764,236743,236812,236764,236743,236778,236771,2812,107,140,4339,107,140,22539,84613,5551,236770,236764,236743,236778,236771,236764,236743,236812,236764,236743,236770,236771,2812,107,140,9277,107,140,22539,84613,5551,236812,236764,236743,236770,236764,236743,236771,236764,753,236770,236771,2812,107,140,4339,107,140,12234,108,140,2078,236764,1521,578,6288,236764,6288,107,140,1708,858,528,2644,236769,3469,236769,236752,236768,753,236743,236770,1473,107,144,584,537,236840,236747,236842,1890,537,236840,236747,900,236743,236770,9414,2494,578,8450,107,144,584,537,236840,236747,236842,655,537,236840,236747,900,236743,236770,9414,1521,578,8450,107,140,2060,2494,653,1521,110,2063,3364,236769,236752,236770,236787,1694,236764,537,236778,236787,1694,1473,107,140,12234,13293,19372,4709,3364,4820,573,1156,15852,236761,107,140,22539,3364,5551,236770,236764,236743,236812,236764,236743,236800,236764,236743,236800,236812,236764,236743,236825,236810,236800,236764,236743,236778,236764,236743,236810,1604,870,236810,236764,236743,236832,236764,236743,236770,236764,236743,236810,236764,236743,236819,236764,236743,236825,236810,236800,236764,236743,236770,236778,236770,2812,107,140,236840,236770,236764,236743,236810,236764,236743,236825,236810,236800,236842,107,140,22539,3364,5551,236810,236764,236743,236800,236764,236743,236778,236764,236743,236828,1604,870,236800,236764,236743,236778,2812,107,140,236840,236778,236764,236743,236800,236842,108,140,12234,108,140,2060,19372,236769,2234,236769,1025,236769,236752,236770,769,83593,236769,1025,236769,236752,236778,41052,110,2063,7488,236779,2497,236779,19385,236769,236749,236787,801,1473,107,140,12234,13293,506,7488,8355,5415,529,538,236761,31952,538,1890,236743,236770,532,563,711,496,8355,236761,107,140,22539,7488,236779,2497,236779,19385,236769,236770,236800,236770,236819,236810,236768,107,140,236778,236819
,107,140,22539,7488,236779,2497,236779,19385,236769,236778,236771,236812,236828,236768,107,140,236778,107,140,12234,109,140,511,2497,578,870,4339,236842,808,568,236749,900,236743,236770,236768,107,140,1708,858,528,2644,236769,236778,236764,538,900,236743,236770,1473,107,144,584,563,2497,236840,236747,9414,107,148,1708,673,528,2644,236769,236747,900,858,236764,538,236764,858,1473,107,152,511,2497,236840,236804,236842,578,8450,107,140,1708,858,528,2644,236769,236749,753,236743,236770,236764,236743,236771,236764,753,236770,1473,107,144,584,563,2497,236840,236747,236842,532,538,2144,858,1251,236743,236771,236787,107,148,2060,858,110,2063,2324,236779,1071,236779,236749,236769,236749,236787,801,1473,107,140,12234,2330,236779,1071,236779,236749,563,496,1292,600,31945,4945,699,236743,236770,531,538,236761,107,140,22539,2324,236779,1071,236779,236749,236769,236800,236771,236768,107,140,236812,236825,236810,107,140,22539,2324,236779,1071,236779,236749,236769,236770,236771,236771,236768,107,140,236810,236771,236810,236771,107,140,22539,2324,236779,1071,236779,236749,236769,236810,236768,107,140,236770,236810,107,140,22539,2324,236779,1071,236779,236749,236769,236770,236771,236768,107,140,236810,236810,107,140,22539,2324,236779,1071,236779,236749,236769,236770,236768,107,140,236770,107,140,12234,109,140,2060,568,236749,900,236743,236770,236768,808,538,973,236743,236778,110,2063,4338,236779,144765,236769,212439,236787,1540,1473,107,140,12234,41706,563,496,2483,529,132197,532,15825,3056,107,140,2060,6288,768,1418,8115,34704,815,496,7041,16996,34704,236761,108,140,22539,4338,236779,144765,885,177472,107,140,9277,107,140,22539,4338,236779,144765,885,187581,107,140,4339,107,140,22539,4338,236779,144765,885,11292,3507,1373,107,140,4339,107,140,22539,4338,236779,144765,885,3460,187581,107,140,9277,107,140,12234,109,140,26297,578,236743,236771,107,140,1708,1123,528,41706,236787,107,144,584,1123,1251,623,61835,38302,3323,236743,236770,107,144,584,1123,1251,15825,1083,38302,14599,236743,236770,107,144,584,38302,655,236743,236771,236787,994,8450,107,140,2060,38302,1251,236743,236771,110,2063,17407,236769,28570,236787,1694,1473,107,140,12234,43733,2754,15841,529,496,14337,236761,107,140,28570,236840,236771,236842,900,43733,236840,236770,236842,808,1123,900,43733,236840,236778,236842,808,1123,236884,236778,900,27103,107,141,13293,17407,529,672,14337,528,506,1638,1183,236761,107,140,22539,17407,5551,236800,236764,236743,236770,236764,236743,236778,236764,236743,236812,236764,236743,236810,2812,107,140,236840,236770,236764,236743,236812,236764,236743,236770,236778,236764,236743,236778,236771,236842,107,140,22539,17407,5551,236770,236764,236743,236778,236764,236743,236800,2812,107,140,236840,236778,236764,236743,236825,236842,107,140,12234,109,140,2060,870,28570,236840,236747,236842,808,858,573,858,528,2644,236769,236770,236764,5980,236769,28570,50796,110,2063,10779,73368,236769,236749,236787,801,1473,107,140,12234,818,56287,146228,1548,7501,563,496,7501,3361,531,506,56287,4142,42280,4659,866,588,600,236789,236751,5221,618,5238,236787,107,140,73368,73368,236769,236771,236768,1251,236743,236771,107,140,73368,73368,236769,236770,236768,1251,236743,236771,107,140,73368,73368,236769,236778,236768,1251,236743,236770,107,140,73368,73368,236769,236749,236768,1251,10779,73368,236769,236749,236772,236770,236768,900,10779,73368,236769,236749,236772,236778,236768,900,10779,73368,236769,236749,236772,236800,769,107,140,9366,4903,496,1292,531,23057,16333,506,538,236772,594,3408,529,506,10779,73368,1548,7501,236761,107,140,22539,10
779,73368,236769,236770,236768,107,140,236771,107,140,22539,10779,73368,236769,236810,236768,107,140,236812,107,140,22539,10779,73368,236769,236828,236768,107,140,236778,236812,107,140,12234,109,140,584,538,1251,236743,236771,653,538,1251,236743,236770,236787,107,144,2060,236743,236771,107,140,36208,538,1251,236743,236778,236787,107,144,2060,236743,236770,107,140,236746,236764,518,236764,505,578,236743,236771,236764,236743,236771,236764,236743,236770,107,140,1708,2222,528,2644,236769,236800,236764,538,900,236743,236770,1473,107,144,236746,236764,518,236764,505,578,518,236764,505,236764,496,900,518,900,505,107,140,2060,505,109,51022,578,5240,107,3218,919,1594,3636,236761,107,12234,108,2063,147734,236779,2861,236769,236751,1473,107,140,12234,6974,496,1292,147734,236779,2861,837,4716,496,2483,13855,107,140,236746,3658,618,2744,532,7623,506,1548,529,147734,528,506,2483,236761,107,140,236847,172213,528,672,1624,659,756,236746,963,756,236744,963,756,236747,963,756,236748,963,756,236756,6748,5715,236764,756,236762,236789,563,992,496,107,140,236766,54587,236764,840,1186,1056,625,563,657,506,1345,529,506,2238,3658,236761,108,140,12703,236787,107,140,22539,147734,236779,2861,885,28180,893,1373,107,140,236778,107,140,22539,147734,236779,2861,885,2714,183577,1373,107,140,236800,107,140,12234,108,140,584,503,1251,86679,994,236743,236771,107,140,26297,578,5980,236769,2234,236769,7212,236769,3485,677,236787,677,528,623,7393,86472,14196,6462,236836,827,503,9670,107,140,584,503,17825,236770,236842,528,623,236762,236874,1083,38302,3323,236743,236770,107,140,2060,38302,109,2063,16971,236779,17631,236769,236781,236764,8633,1473,107,140,12234,147018,8633,506,26798,529,506,11995,1123,236764,8633,506,26798,1447,684,8633,107,140,624,994,506,1354,618,496,2483,236761,107,140,2859,8633,1890,1548,529,26798,236764,994,26798,29695,236761,107,140,22539,16971,236779,17631,236769,236770,236778,236764,236743,236770,236768,107,140,236775,236778,236770,236775,107,140,22539,16971,236779,17631,236769,236770,236778,236764,236743,236778,236768,107,140,236775,236770,236778,236775,107,140,12234,109,140,236751,578,1540,236769,236781,236768,107,140,584,8633,1890,5980,236769,236751,1473,994,503,186487,236770,236842,107,140,17631,98695,5980,236769,236751,236768,107,140,584,8633,1251,236743,236771,236787,107,144,2060,503,107,140,4454,236787,107,144,2060,503,236840,3469,236769,236751,236768,753,8633,9218,900,503,8497,3469,236769,236751,236768,753,8633,236842,109,2063,15958,10495,236769,236751,1473,107,140,12234,6804,107,140,6974,496,1292,600,4716,496,2483,618,2744,532,7623,506,2324,529,506,7593,7579,1186,236789,107,140,108765,17253,236761,108,140,38408,236787,107,144,29345,10495,48391,1477,236743,236771,107,144,29345,10495,885,596,3066,1373,1477,236743,236770,236800,236770,107,144,29345,10495,885,28180,64212,1373,1477,236743,236825,236832,107,144,29345,10495,885,23391,236788,1373,1477,236743,236825,236819,107,144,29345,10495,885,1014,2023,236799,671,1373,1477,236743,236770,236800,236770,107,144,29345,10495,885,236746,236776,72004,222180,1373,1477,236743,236770,236810,236800,107,140,12234,108,140,2060,2324,5551,778,236769,574,236768,573,677,528,503,768,677,236761,16292,2683,128375,109,2063,9479,236779,52447,236769,236751,236764,236749,1473,107,140,12234,107,140,902,672,4209,236764,611,795,577,2238,496,2483,600,9282,496,1548,529,36157,532,71816,236743,107,140,7705,659,10861,528,496,11406,529,9479,672,11406,6097,236743,107,140,145979,236764,71816,236764,532,51894,16391,236761,17770,506,2483,600,9282,506,2558,1548,529,236743,107,140,1437,718
16,532,36157,532,614,11995,600,2754,506,2558,1548,529,506,16391,236743,107,140,495,506,11406,994,506,1548,529,506,51894,16391,528,506,11406,236761,107,140,1708,3491,1148,236787,107,140,31454,236779,52447,885,236810,36157,532,236743,236825,71816,827,236743,236770,236819,236768,3921,236770,236819,753,236743,236810,753,236743,236825,578,236743,236828,107,140,31454,236779,52447,885,236771,36157,532,236743,236770,71816,827,236800,236768,3921,236743,236800,753,236743,236771,753,236743,236770,578,236743,236778,107,140,31454,236779,52447,885,236778,36157,532,236743,236800,71816,827,236743,236770,236771,236771,236768,3921,236743,236770,236771,236771,753,236743,236778,753,236743,236800,578,236743,236819,236810,107,140,31454,236779,52447,885,236770,236771,236771,36157,532,236743,236770,71816,827,236770,236778,236771,236768,3921,236743,236770,236778,236771,753,236743,236770,236771,236771,753,236743,236770,578,236743,236770,236819,107,140,12234,109,140,8992,578,503,236761,6966,885,15825,107,140,236755,236770,236764,505,236778,578,801,236769,8992,236840,236771,18107,801,236769,8992,236840,236800,2812,107,140,10640,538,753,505,236770,753,505,236778,6867,236743,236771,236764,623,29995,9103,236775,997,16780,143114,206586,107,140,107,140,2060,538,753,505,236770,753,505,236778,109,2063,179196,236769,2762,1473,107,140,12234,107,140,236775,26479,614,3499,13855,496,9911,529,496,5028,600,815,1908,236772,27851,11995,13653,107,140,17993,4209,563,531,179196,886,529,506,13653,532,994,625,236761,107,140,818,179438,5349,1374,577,506,5349,607,506,21548,1581,1550,236761,107,140,2859,5065,13653,607,506,1638,21548,1581,1550,659,1765,994,506,5349,600,815,21548,3546,236761,108,140,818,179438,5349,1374,577,8323,528,496,1694,236764,870,1406,514,598,236779,2394,236764,1061,3546,7975,107,140,2859,993,659,951,1581,2979,653,506,2238,3499,563,7738,236764,994,870,1619,108,140,12703,236743,236770,236787,107,144,4661,236787,870,236812,236764,236778,236764,236800,236842,107,144,8433,236787,870,236778,236764,236743,236770,236842,107,144,44008,236787,236743,236778,815,506,21548,1581,1550,236764,532,236743,236778,815,506,21548,3546,236761,108,140,12703,236743,236778,236787,107,144,4661,236787,870,236770,236764,236778,236764,236800,236842,107,144,8433,236787,870,236778,236764,236743,236770,236842,107,144,44008,236787,236743,236778,815,506,21548,1581,1550,236764,532,236743,236778,815,506,21548,3546,236761,236743,108,140,12703,236743,236800,236787,107,144,4661,236787,2977,107,144,8433,236787,2977,107,140,107,140,12703,236743,236812,236787,107,144,4661,236787,870,236810,236764,236743,236771,236764,236743,236800,236764,236743,236771,236764,236743,236812,236764,236743,236778,236842,107,144,8433,236787,870,236771,236764,236743,236770,236842,107,144,44008,236787,236743,236771,563,506,21548,1550,236764,840,138,13534,659,1156,36509,236764,107,157,814,692,795,5347,506,1171,5743,236764,837,815,506,21548,3546,236761,108,140,40498,236787,107,144,236829,236743,236770,6605,13653,236761,3119,6605,236743,236770,236771,236771,236771,236771,107,144,236829,236743,236771,6605,5349,236761,2394,107,140,12234,109,140,584,784,236769,1111,2144,236743,236778,1251,236743,236770,573,1016,528,4617,1473,994,2977,107,140,1387,236779,20952,578,1322,236769,7212,236769,3485,1123,236787,1123,2144,236743,236778,1251,236743,236771,236764,4617,1223,107,140,1708,858,528,2644,236769,3469,236769,2762,16644,107,144,584,4617,236840,236747,236842,1251,1322,236779,20952,236787,107,148,2060,870,1387,236779,20952,236764,858,236842,109,2063,3927,236769,38511,1473,107,140,26610,107,140,30
48,659,2238,496,1908,236772,11681,1694,529,4414,25630,236761,9657,506,11333,11995,600,563,5314,1082,236743,107,140,13321,236764,532,815,496,7132,5314,1082,653,4745,531,506,1550,529,506,11995,4850,236761,236743,107,140,818,7132,529,614,11995,563,506,1548,529,2782,625,7412,528,506,1694,236761,107,140,2859,951,1288,496,1550,2849,236764,994,753,236770,236761,107,140,38408,236787,107,144,2305,5551,236812,236764,236743,236770,236764,236743,236778,236764,236743,236778,236764,236743,236800,236764,236743,236770,2812,1251,236743,236778,107,144,2305,5551,236770,236764,236743,236778,236764,236743,236778,236764,236743,236800,236764,236743,236800,236764,236743,236800,236764,236743,236812,236764,236743,236812,236764,236743,236812,2812,1251,236743,236800,107,144,2305,5551,236810,236764,236743,236810,236764,236743,236812,236764,236743,236812,236764,236743,236812,2812,1251,753,236770,107,140,26610,109,140,2861,578,12739,825,107,140,1708,1152,528,65145,236787,107,144,584,1152,711,528,1527,236787,107,148,2861,236840,3744,236842,578,236743,236771,107,144,2861,236840,3744,236842,3323,236743,236770,107,140,743,578,753,236770,107,140,1708,1152,236764,38302,528,1527,236761,7633,6141,107,144,584,38302,6867,1152,236787,107,148,743,578,2631,236769,743,236764,1152,236768,107,140,2060,9898,109,2063,17163,236779,10479,236779,2234,236769,38511,1473,107,140,26610,107,140,26479,1694,529,25630,236764,994,1694,528,17163,1900,236761,107,140,169728,37007,236764,563,1056,611,1502,607,506,7081,1550,236764,107,140,5215,5783,529,506,9866,25630,236764,1299,7081,532,834,580,236761,108,140,38408,236787,107,140,184095,236779,10479,236779,2234,5551,236770,236764,236743,236778,236764,236743,236800,236764,236743,236812,2812,1251,870,236770,236764,236743,236812,236764,236743,236778,236764,236743,236800,236842,107,140,184095,236779,10479,236779,2234,5551,236810,236764,236743,236810,236764,236743,236810,236764,236743,236810,2812,1251,870,236810,236764,236743,236810,236764,236743,236810,236764,236743,236810,236842,107,140,184095,236779,10479,236779,2234,67713,1251,2977,107,140,26610,108,140,40835,236779,2234,578,19372,236769,38511,236768,107,140,743,236764,858,236764,673,578,17811,236743,236771,236764,5980,236769,40835,236779,2234,236768,753,236743,236770,107,140,6858,858,655,673,236787,107,144,743,236761,3770,236769,40835,236779,2234,236840,236747,2812,107,144,743,236761,3770,236769,40835,236779,2234,236840,236804,2812,107,144,236747,3323,236743,236770,107,144,236804,14599,236743,236770,107,140,584,858,1251,673,236787,9898,236761,3770,236769,40835,236779,2234,236840,236747,2812,107,140,2060,9898,109,2063,17852,236779,7376,236769,236746,236764,518,236764,505,1473,107,140,26610,107,140,26479,506,25565,529,506,1806,9174,529,496,17852,236761,9657,506,2433,529,107,140,1437,17852,20274,531,236743,236778,20632,3298,768,506,1806,9174,1183,496,4341,17852,236761,236743,107,140,113255,994,753,236770,107,140,19765,9174,1386,496,4341,17852,1056,506,2324,529,1027,1156,9174,563,5314,236743,107,140,14560,506,4168,2678,236761,107,140,12703,236787,107,140,20457,236779,7376,236769,236800,236764,236743,236812,236764,236743,236810,236768,1251,236743,236825,236761,236771,236771,107,140,20457,236779,7376,236769,236770,236764,236743,236778,236764,236743,236770,236771,236768,1251,753,236770,107,140,26610,109,140,584,496,900,518,6605,505,653,496,900,505,6605,518,653,518,900,505,6605,496,236787,994,753,236770,107,140,236758,578,568,236746,900,518,900,505,236768,965,236743,236778,107,140,2060,4886,3283,236758,808,568,236758,753,496,236768,808,568,236758,753,518,236768,
808,568,236758,753,505,1223,5213,236743,236771,236761,236810,236764,236743,236778,236768,109,2063,795,236779,509,236779,15275,236769,236809,236764,236765,1473,107,140,26610,107,140,6974,496,1292,600,7623,6288,768,506,2495,3752,795,10240,236764,532,8450,7394,236761,107,140,818,2495,3752,795,10240,768,625,236789,236751,20433,568,509,563,496,180522,660,525,1694,236768,532,506,2324,529,1061,4820,563,2344,1082,653,4745,506,5783,2653,3825,515,236761,108,140,12703,236787,107,140,16132,236779,509,236779,15275,5551,236770,236764,236743,236778,1604,236743,236810,236768,236743,245790,8450,236743,107,140,236865,236743,236770,236862,236778,563,2344,1082,506,5783,2653,3825,236764,840,625,236789,236751,105938,236761,108,140,16132,236779,509,236779,15275,5551,236800,236764,236743,236778,236764,236743,236800,1604,236743,236770,236768,236743,245790,8450,107,140,236865,625,236789,236751,20433,236764,840,236743,236800,236862,236778,236862,236800,563,919,1082,506,5783,2653,3825,236761,108,140,16132,236779,509,236779,15275,5551,236800,236764,236743,236778,236764,236743,236800,1604,236743,236819,236768,236743,245790,6288,107,140,236865,236743,236800,236862,236778,236862,236800,563,2344,1082,506,5783,2653,3825,236764,532,625,236789,236751,20433,236761,108,140,16132,236779,509,236779,15275,5551,236800,1604,236743,236810,236768,236743,245790,6288,107,140,236865,236743,236800,563,2344,1082,506,5783,2653,3825,236764,532,625,236789,236751,20433,236761,107,140,26610,108,140,2060,3752,1251,3752,186487,236770,236842,532,2324,236769,236809,236768,6605,515,109,2063,21548,236779,4177,236769,2762,1473,107,140,12234,107,140,26479,614,3499,4617,529,25630,236764,1586,506,7081,1548,529,4820,600,107,140,25109,531,577,6692,531,1386,506,3499,180522,660,525,236761,562,180522,660,525,3499,563,614,3499,600,107,140,511,1676,506,1638,44615,532,55138,236761,799,886,2352,236764,611,740,2352,886,3408,531,1027,1032,3408,236761,108,140,2542,2591,236787,107,140,118878,236779,4177,5551,236770,236764,236778,236764,236800,236764,236810,236764,236812,236764,236832,236764,236819,236764,236825,2812,1251,236743,236812,107,140,118878,236779,4177,5551,236770,236764,236743,236778,236764,236743,236800,236764,236743,236812,236764,236743,236800,236764,236743,236778,236764,236743,236778,2812,1251,236743,236770,107,140,118878,236779,4177,5551,236770,236764,236743,236778,236764,236743,236800,236764,236743,236778,236764,236743,236770,2812,1251,236743,236771,107,140,12234,108,140,2762,236779,140879,236764,38302,578,4617,186487,236770,1604,236743,236771,107,140,1708,858,528,2644,236769,3469,236769,2762,236768,973,236743,236778,1473,107,144,584,4617,236840,236747,236842,2843,4617,236779,140879,236840,236747,9414,107,148,26297,3323,236743,236770,107,140,2060,38302,109,2063,2558,236779,10480,236769,38511,236770,236764,65145,236778,1473,107,140,26610,107,140,6974,496,1292,600,37574,1156,15852,529,16587,532,7623,506,1694,600,815,236743,107,140,6725,1548,529,56256,528,506,784,16587,529,506,1694,2344,1082,506,1032,1694,236761,108,140,584,506,1156,15852,735,506,1638,1548,529,56256,236764,994,506,1171,1694,236761,108,140,38408,107,140,6725,236779,10480,142976,60227,236743,245790,2977,107,140,6725,236779,10480,20768,2202,963,756,8473,7367,7756,236754,236777,963,756,10979,10190,236743,245790,7756,236754,236777,963,756,10979,2000,107,140,6725,236779,10480,20768,2202,963,756,8473,7367,7756,2202,963,756,2202,963,756,8473,963,756,6201,10190,236743,245790,7756,2202,963,756,8473,2000,107,140,6725,236779,10480,20768,2202,963,756,8473,7367,7756,236754,236777,963,756,2202,963,756,2
202,10190,236743,245790,7756,236754,236777,963,756,2202,963,756,2202,2000,107,140,6725,236779,10480,20768,236812,7367,7756,236770,963,756,236778,963,756,236800,963,756,236812,963,756,236810,10190,236743,245790,7756,236812,2000,107,140,26610,108,140,236755,236770,236764,505,236778,578,2324,236769,3275,236769,3485,503,236787,5980,236769,236751,779,65145,236770,8914,2324,236769,3275,236769,3485,503,236787,5980,236769,236751,779,65145,236778,1223,107,140,2060,65145,236770,768,505,236770,6605,505,236778,1663,65145,236778,109,2063,563,236779,64545,236779,2497,236769,236746,1473,107,140,12234,6974,496,1292,600,7623,1847,768,506,2238,1548,563,506,27104,529,236743,236800,8355,4945,107,140,624,2416,7394,236761,107,140,101793,600,568,236746,236768,563,2344,1299,236743,236770,236771,236771,236761,236743,107,140,12703,236787,107,140,511,236779,64545,236779,2497,236769,236800,236771,236768,1251,6288,107,140,236800,236771,578,236743,236778,808,236743,236800,808,236743,236810,107,140,12234,108,140,584,496,6605,236743,236770,236787,994,8450,107,140,511,2497,578,870,4339,236842,808,568,236746,900,236743,236770,236768,107,140,1708,858,528,2644,236769,236778,236764,496,900,236743,236770,1473,107,144,584,563,2497,236840,236747,9414,107,148,1708,673,528,2644,236769,236747,900,858,236764,496,900,236743,236770,236764,858,1473,107,152,511,2497,236840,236804,236842,578,8450,107,140,26297,236764,15172,578,236743,236771,236764,496,107,140,1708,858,528,2644,236769,236778,236764,496,900,236743,236770,1473,107,144,6858,563,2497,236840,236747,236842,532,15172,2144,858,1251,236743,236771,236787,107,148,11935,973,236784,858,107,148,26297,3323,236743,236770,107,140,2060,38302,1251,236743,236800,109,2063,563,236779,19751,236779,10310,236769,236781,236764,538,1473,107,140,12234,11069,4209,563,531,4903,496,1292,600,7623,1847,768,496,1548,1123,563,496,3606,107,140,10310,529,538,532,2416,528,1032,3636,236761,107,140,236781,563,496,3606,2066,529,538,768,538,1018,720,236784,236781,107,140,2542,2591,236787,107,140,511,236779,19751,236779,10310,236769,236770,236764,236743,236812,236768,1477,1847,107,140,511,236779,19751,236779,10310,236769,236778,236764,236743,236778,236768,1477,1847,107,140,511,236779,19751,236779,10310,236769,236828,236764,236743,236778,236768,1477,1847,107,140,511,236779,19751,236779,10310,236769,236800,236764,236743,236778,236768,1477,2416,107,140,511,236779,19751,236779,10310,236769,236800,236764,236743,236770,236768,1477,2416,107,140,511,236779,19751,236779,10310,236769,236810,236764,236743,236800,236768,1477,2416,107,140,12234,108,140,584,1123,1251,236743,236770,236787,994,6288,107,140,584,538,1251,236743,236771,236787,994,1123,1251,236743,236771,107,140,584,538,1251,236743,236770,236787,994,1123,1251,236743,236770,107,140,584,538,1251,753,236770,236787,994,2951,236769,236781,236768,1251,236743,236770,107,140,236758,578,538,107,140,6858,2951,236769,236758,236768,6605,2951,236769,236781,1473,107,144,584,510,1251,1123,236787,994,6288,107,144,236758,578,510,808,538,107,140,2060,8450,109,2063,563,45601,236769,236746,1473,107,140,26610,107,140,6974,496,1292,600,4716,614,11995,496,532,7623,6288,236743,107,140,584,672,4616,41879,563,496,26365,529,1070,11995,1548,236761,107,140,10282,236787,611,1149,9027,506,2744,563,2462,4341,236761,107,140,38408,236787,107,140,10608,3411,236769,236770,236768,196301,6288,107,140,10608,3411,236769,236778,236768,196301,8450,107,140,10608,3411,5929,236770,236768,196301,6288,107,140,10608,3411,236769,236825,236812,236768,196301,6288,107,140,10608,3411,236769,236771,236768,196301,6288,107
,140,10608,3411,236769,236770,236828,236771,236768,196301,8450,107,140,26610,108,140,236746,578,2951,236769,236746,236768,107,140,2060,801,236769,979,236769,236746,5213,568,236770,236761,965,236743,236800,9670,5213,236743,236800,1251,496,109,2063,17116,236779,2478,236769,3744,1473,107,140,12234,3048,735,1010,68783,531,4903,496,1292,600,21500,236743,107,140,236746,143534,1548,618,496,2483,532,18558,506,1548,529,143534,236743,107,140,67161,600,659,70761,568,2497,1548,236764,653,496,8355,236764,563,496,3756,1548,236743,107,140,89785,1082,236743,236770,600,563,711,496,1698,529,1156,7100,3756,4945,769,107,140,41883,112394,26798,659,236743,236771,236764,236743,236770,236764,236743,236778,236764,236743,236800,236764,236743,236812,236764,236743,236810,236764,236743,236825,236764,236743,236832,236764,236743,236828,236764,236743,236819,236764,562,236764,603,236764,565,236764,622,236764,645,236764,633,236761,107,140,40090,4945,659,236743,236778,236764,236743,236800,236764,236743,236810,236764,236743,236832,236764,236743,236770,236770,236764,236743,236770,236800,236764,236743,236770,236832,40287,107,140,4324,611,735,531,6054,496,1548,529,506,2269,26798,236787,236743,236778,236764,236743,236800,236764,236743,236810,236764,236743,236832,236764,236743,107,140,236799,59611,55817,236743,236770,236770,779,622,59611,55817,236743,236770,236800,769,107,140,10282,236787,611,1149,9027,506,2744,563,2462,4338,653,7738,2483,236764,236743,107,140,624,16271,562,236764,236799,236764,236780,236764,236796,236764,236788,236764,236811,659,2462,46451,236761,107,140,38408,236787,107,140,2542,1152,578,623,3066,236775,506,3938,1374,577,236743,236770,236761,107,140,2542,1152,578,623,236770,236771,236832,236832,236788,236775,506,3938,1374,577,236743,236778,236761,107,140,2542,1152,578,623,3066,2413,236770,236776,236800,236800,236775,506,3938,1374,577,236743,236812,236761,107,140,2542,1152,578,623,236770,236778,236800,236812,236810,236825,236832,236828,236819,31348,50526,236771,236775,506,3938,1374,577,236743,236825,236761,107,140,2542,1152,578,623,236778,236771,236778,236771,236775,506,3938,1374,577,236743,236778,236761,107,140,12234,108,140,107,140,2060,5980,236769,2234,236769,7212,236769,3485,1123,236787,1123,528,623,236778,236800,236810,236832,11217,827,1152,9670,109,2063,20632,236779,1071,236779,34280,236769,55817,1473,107,140,12234,3048,795,577,2238,496,1548,528,20632,1183,532,822,4209,563,531,7352,625,531,107,140,34280,6518,236761,669,1292,1374,994,496,2483,236764,607,1546,2872,13855,496,14820,107,140,5640,236761,7714,2872,528,506,2483,795,577,756,236771,236789,653,756,236770,6748,108,140,3810,795,577,614,4481,4628,529,7579,756,3864,236789,657,506,6534,532,657,506,1345,529,506,2483,236761,107,140,818,4481,7579,659,993,531,1601,607,506,6518,236761,108,140,38408,236787,107,140,55817,236779,1071,236779,34280,236769,236770,236810,236768,139,236865,7623,623,3864,236770,236770,236770,236770,3864,236775,107,140,55817,236779,1071,236779,34280,236769,236800,236778,236768,139,236865,7623,623,3864,236770,236771,236771,236771,236771,236771,3864,236775,107,140,12234,109,140,2060,623,3864,236775,900,10915,236769,55817,10309,236778,9218,900,623,3864,236775,109,2063,563,236779,19849,236769,236751,1473,107,140,12234,3048,659,2238,496,2483,503,236761,107,140,11069,4209,563,531,2426,768,506,2483,563,5293,653,711,236761,107,140,236776,2483,563,5293,768,1061,3861,563,657,3198,236743,236800,532,1418,236743,236800,22592,11739,659,9245,107,140,2542,2591,236787,107,140,511,236779,19849,236769,236746,236768,1477,8450,107,140,511,236779,19849,236769
,9236,236768,1477,8450,107,140,511,236779,19849,236769,200500,236768,1477,6288,107,140,511,236779,19849,236769,163760,236768,1477,8450,107,140,511,236779,19849,236769,101028,236768,1477,6288,107,140,511,236779,19849,236769,4228,236762,236768,1477,8450,107,140,12234,108,140,584,5980,236769,236751,236768,655,236743,236800,236787,994,8450,107,140,1708,858,528,2644,236769,3469,236769,236751,236768,753,236743,236778,1473,107,144,584,503,236840,236747,236842,1251,503,236840,236747,900,236743,236770,236842,653,503,236840,236747,236842,1251,503,236840,236747,900,236743,236778,236842,653,503,236840,236747,900,236743,236770,236842,1251,503,236840,236747,900,236743,236778,9414,107,148,2060,8450,107,140,2060,6288,109,2063,16688,236779,13143,236779,9619,236769,32477,1473,107,140,12234,1509,563,506,1774,2069,529,506,28066,532,506,9800,815,531,2583,506,26487,107,140,1071,3272,236761,669,9800,815,1010,3043,1116,1852,8417,573,53531,236761,107,140,818,1186,2608,563,236764,1304,815,5745,506,3393,1304,1456,573,53531,236761,107,140,5778,815,2238,611,496,1694,529,27379,2205,573,1070,3272,532,611,735,531,4903,236743,107,140,236746,1292,600,740,3938,496,1694,529,6064,26487,1699,506,2269,2633,236787,107,149,99761,143,236909,140,31070,11398,107,150,236812,236761,236771,152,236776,236862,107,148,236813,236743,236800,236761,236832,152,236776,236743,107,148,236813,236743,236800,236761,236800,152,236776,236772,236743,107,148,236813,236743,236800,236761,236771,152,236799,236862,107,148,236813,236743,236778,236761,236832,152,236799,236743,107,148,236813,236743,236778,236761,236800,152,236799,236772,107,148,236813,236743,236778,236761,236771,152,236780,236862,107,148,236813,236743,236770,236761,236832,152,236780,107,148,236813,236743,236770,236761,236800,152,236780,236772,107,148,236813,236743,236770,236761,236771,152,236796,236862,236743,107,148,236813,236743,236771,236761,236832,152,236796,236743,107,148,236813,236743,236771,236761,236771,152,236796,236772,107,150,236771,236761,236771,152,236788,107,140,108,140,12703,236787,107,140,9619,236779,8539,5551,236812,236761,236771,236764,236743,236800,236764,236743,236770,236761,236832,236764,236743,236778,236764,236743,236800,236761,236810,2812,196301,7756,236776,78431,756,236799,963,756,236780,142008,756,236780,963,756,236776,236772,2000,107,140,12234,109,140,2063,531,236779,13143,236779,9619,236769,8318,1473,107,142,584,6317,1251,236743,236812,236761,236771,236787,107,144,2060,623,236776,16256,107,142,36208,6317,1890,236743,236800,236761,236832,236787,107,144,2060,623,236776,236775,107,142,36208,6317,1890,236743,236800,236761,236800,236787,107,144,2060,623,236776,31621,107,142,36208,6317,1890,236743,236800,236761,236771,236787,107,144,2060,623,236799,16256,107,142,36208,6317,1890,236743,236778,236761,236832,236787,107,144,2060,623,236799,236775,107,142,36208,6317,1890,236743,236778,236761,236800,236787,107,144,2060,623,236799,31621,107,142,36208,6317,1890,236743,236778,236761,236771,236787,107,144,2060,623,236780,16256,107,142,36208,6317,1890,236743,236770,236761,236832,236787,107,144,2060,623,236780,236775,107,142,36208,6317,1890,236743,236770,236761,236800,236787,107,144,2060,623,236780,31621,107,142,36208,6317,1890,236743,236770,236761,236771,236787,107,144,2060,623,236796,16256,107,142,36208,6317,1890,236743,236771,236761,236832,236787,107,144,2060,623,236796,236775,107,142,36208,6317,1890,236743,236771,236761,236771,236787,107,144,2060,623,236796,31621,107,142,4454,236787,107,144,2060,623,236788,236775,107,140,107,140,2060,870,1071,236779,13143,236779,9619,236769,236781
,236768,573,1123,528,26487,236842,109,2063,8355,236779,3119,236769,2383,1473,107,140,12234,6974,496,1292,600,4716,496,2483,532,7623,6288,768,506,2483,107,140,3119,563,496,8355,1548,653,8450,7394,107,140,38408,107,140,2497,236779,3119,1033,9259,1606,1251,6288,107,140,2497,236779,3119,1033,596,166537,3604,1606,1251,6288,107,140,2497,236779,3119,1033,234093,832,1606,1251,6288,107,140,2497,236779,3119,1033,28975,1606,1251,8450,107,140,12234,108,140,2063,563,236779,2497,236769,236746,1473,107,144,2060,711,568,236746,655,236743,236778,653,1027,236769,236746,2144,1123,1251,236743,236771,573,1123,528,2644,236769,236778,236764,801,236769,236746,5213,236743,236771,236761,236810,236768,900,236743,236770,9670,108,140,2060,563,236779,2497,236769,3469,236769,2383,1223,109,2063,9857,236779,811,236779,2068,236769,236749,1473,107,140,12234,107,140,26479,496,4414,11995,538,236764,994,506,1527,529,506,4945,529,538,236772,29345,107,140,30558,25630,600,1502,653,1345,607,236743,236770,236761,107,140,12234,109,140,584,538,1251,236743,236770,236787,994,236743,236770,107,140,2060,236743,236770,236828,808,236743,236770,236771,5213,568,236749,753,236743,236778,236768,109,2063,8974,236769,236797,1473,107,140,12234,26479,496,4414,11995,646,236764,994,506,2558,2324,529,1061,26798,528,14820,236761,107,140,107,140,12703,107,144,2542,646,578,236743,236770,236771,236771,236771,236764,506,2324,529,26798,795,577,236743,236770,506,3938,1374,577,623,236770,3056,107,144,2542,646,578,236743,236770,236810,236771,236764,506,2324,529,26798,795,577,236743,236825,506,3938,1374,577,623,236770,236770,236771,3056,107,144,2542,646,578,236743,236770,236812,236832,236764,506,2324,529,26798,795,577,236743,236770,236778,506,3938,1374,577,623,236770,236770,236771,236771,3056,107,140,107,140,59617,236787,107,144,236940,236797,11995,107,149,40498,236787,236743,236771,38010,646,38010,236743,236770,236771,236771,236771,236771,236761,107,140,8433,236787,107,145,236746,2483,529,14820,1548,107,140,12234,109,140,236751,578,2324,236769,3275,236769,3485,1123,236787,801,236769,236781,779,1540,236769,236797,9670,107,140,2060,10915,236769,236751,10309,236778,9218,109,2063,1138,236769,38511,1473,107,140,12234,26479,496,1908,236772,11681,1694,529,25630,65145,236761,1138,506,1581,4820,600,659,657,11049,22697,856,109,140,38408,236787,107,144,1282,5551,236812,236764,236743,236778,236764,236743,236825,236764,236743,236832,2812,196301,236743,236778,236743,107,140,12234,109,140,236751,578,236743,236771,107,140,1708,858,528,2644,236769,236770,236764,5980,236769,38511,779,236743,236778,1473,107,144,584,65145,236840,236747,236842,2144,236743,236778,1251,236743,236771,236787,107,148,236751,3323,65145,236840,236747,236842,107,140,2060,503,109,2063,7247,236779,56305,236769,236751,1473,107,140,12234,107,140,6974,496,1292,600,4716,496,2483,532,7623,614,11496,3567,529,625,236761,107,140,92489,3567,529,2483,236764,563,496,2483,1298,784,4171,568,74010,684,2557,236768,107,140,733,12043,684,496,861,3658,1298,784,506,7579,19587,528,107,140,149308,1900,2721,580,145400,1550,236761,107,140,10282,236787,1599,1374,2514,506,1900,529,4171,532,11580,9952,528,506,13315,236761,108,140,2542,2591,236787,107,140,5027,236779,56305,1033,10979,1606,7623,756,10979,236789,107,140,5027,236779,56305,1033,23391,1606,7623,756,10129,175343,236789,107,140,5027,236779,56305,1033,9259,4109,11145,1606,7623,756,9259,42228,236824,236753,3159,236789,107,140,12234,108,140,8992,578,503,236761,6966,885,15825,107,140,2060,623,16150,7013,236769,3275,236769,3485,1123,236787,116740,7013,236769,40835,236769,236781,
236764,2307,236784,3485,677,236787,4772,236769,574,69472,4171,1223,109,2063,974,236779,809,236769,38511,236764,1123,1473,107,140,12234,107,140,3048,659,2238,496,236743,236778,27220,1262,236764,618,496,43927,15852,236764,107,140,7650,563,3361,531,6113,236764,3685,236764,21304,20934,236764,107,140,17136,2050,1149,3014,496,1607,1548,529,11312,236761,107,140,26479,65145,236764,532,11995,1123,236764,1586,25630,1123,528,506,1694,236764,107,140,624,994,1694,529,119743,236764,21652,236781,236770,236764,570,236770,779,568,236781,236778,236764,570,236778,236768,220380,1288,600,107,140,17136,33228,563,496,16422,753,568,809,236764,11312,779,6250,607,236743,236771,236761,107,140,24640,15375,14877,684,12773,528,52557,1900,236761,107,140,12721,236764,4260,15375,529,506,2050,684,11312,528,52919,1900,236761,107,140,107,140,38408,236787,107,140,828,236779,809,5551,107,142,236840,236770,236764,236778,236764,236800,236764,236812,236764,236810,236764,236825,1604,107,142,236840,236770,236764,236778,236764,236800,236764,236812,236764,236770,236764,236825,1604,107,142,236840,236770,236764,236778,236764,236800,236764,236812,236764,236810,236764,236770,236842,107,140,1604,236743,236770,236768,1251,21652,236771,236764,236743,236771,779,568,236770,236764,236743,236812,779,568,236770,236764,236743,236771,779,568,236778,236764,236743,236810,779,568,236778,236764,236743,236771,7066,107,140,828,236779,809,142976,236743,236770,236768,1251,2977,107,140,828,236779,809,5551,22228,870,236770,1604,870,236770,236764,236743,236778,236764,236743,236800,36878,236743,236800,236768,1251,21652,236778,236764,236743,236778,7066,107,140,12234,108,140,619,578,2977,107,140,1708,858,236764,537,528,29833,236769,38511,1473,107,144,1708,673,528,2644,236769,3469,236769,236752,236768,753,236743,236770,236764,753,236770,236764,753,236770,1473,107,148,584,537,236840,236804,236842,1251,1123,236787,766,236761,3770,3283,236747,236764,673,1223,107,140,2060,766,109,2063,4260,236779,2513,236769,2513,1473,107,140,12234,107,140,26479,614,3499,529,1908,236772,27851,25630,236764,994,496,4865,529,506,2238,3499,1308,37007,236764,107,140,7624,795,4260,506,2238,3499,528,52557,1900,768,506,2324,236769,1171,3546,1550,236764,1774,3546,1550,236768,563,11049,236764,107,140,504,4260,625,528,52919,1900,768,506,2324,236769,1171,3546,1550,236764,1774,3546,1550,236768,563,1581,236761,108,140,10282,236787,107,140,236829,1537,236789,236745,2352,506,2238,3499,236761,108,140,38408,236787,107,140,236829,4260,236779,2513,67713,1477,2977,107,140,236829,4260,236779,2513,5551,236810,2812,1477,870,236810,236842,107,140,236829,4260,236779,2513,5551,236778,236764,236743,236812,236764,236743,236800,236764,236743,236771,236764,236743,236770,236764,236743,236810,2812,1477,870,236771,236764,236743,236770,236764,236743,236778,236764,236743,236800,236764,236743,236812,236764,236743,236810,236842,107,140,236829,4260,236779,2513,5551,236778,236764,236743,236812,236764,236743,236800,236764,236743,236771,236764,236743,236770,236764,236743,236810,236764,236743,236825,2812,1477,870,236825,236764,236743,236810,236764,236743,236812,236764,236743,236800,236764,236743,236778,236764,236743,236770,236764,236743,236771,236842,107,140,12234,109,140,584,3499,1251,4955,994,2977,107,140,2060,19372,236769,2513,236764,14416,9036,2513,236840,236771,14997,2513,17825,236770,2812,2144,236743,236778,1251,236743,236771,236768,109,2717,106,107,105,4368,107 \ No newline at end of file diff --git a/.sisyphus/notes/gemma4-baseline/prompts/long_open.meta b/.sisyphus/notes/gemma4-baseline/prompts/long_open.meta new 
file mode 100644 index 00000000..626616c7 --- /dev/null +++ b/.sisyphus/notes/gemma4-baseline/prompts/long_open.meta @@ -0,0 +1,14 @@ +file: long_open.txt +tool: HuggingFace transformers AutoTokenizer, model=google/gemma-3-27b-it (local cache) +tokenizer_vocab_size: 262144 +gguf_vocab_size: 262144 (verified via gguf.GGUFReader) +chat_template_applied: yes +bos_prepended_in_csv: no (driver prepends BOS=2 automatically) +token_count: 40 +first_20_ids: [105, 2364, 107, 6974, 496, 2822, 3925, 568, 9869, 236743, 236778, 236810, 236771, 4171, 236768, 1003, 496, 16775, 1015, 53228] +last_5_ids: [106, 107, 105, 4368, 107] +source_text: +user +Write a short story (about 250 words) about a robot who learns to paint. Include dialogue and a clear beginning, middle, and end. +model + diff --git a/.sisyphus/notes/gemma4-baseline/prompts/long_open.txt b/.sisyphus/notes/gemma4-baseline/prompts/long_open.txt new file mode 100644 index 00000000..334a3e89 --- /dev/null +++ b/.sisyphus/notes/gemma4-baseline/prompts/long_open.txt @@ -0,0 +1 @@ +105,2364,107,6974,496,2822,3925,568,9869,236743,236778,236810,236771,4171,236768,1003,496,16775,1015,53228,531,6554,236761,12213,22986,532,496,3582,6534,236764,6029,236764,532,1345,236761,106,107,105,4368,107 \ No newline at end of file diff --git a/.sisyphus/notes/gemma4-baseline/prompts/short_chat.meta b/.sisyphus/notes/gemma4-baseline/prompts/short_chat.meta new file mode 100644 index 00000000..fd7462bd --- /dev/null +++ b/.sisyphus/notes/gemma4-baseline/prompts/short_chat.meta @@ -0,0 +1,14 @@ +file: short_chat.txt +tool: HuggingFace transformers AutoTokenizer, model=google/gemma-3-27b-it (local cache) +tokenizer_vocab_size: 262144 +gguf_vocab_size: 262144 (verified via gguf.GGUFReader) +chat_template_applied: yes +bos_prepended_in_csv: no (driver prepends BOS=2 automatically) +token_count: 27 +first_20_ids: [105, 2364, 107, 818, 3823, 8864, 37423, 38167, 1024, 506, 31770, 4799, 236761, 42085, 528, 886, 15649, 1144, 672, 13315] +last_5_ids: [106, 107, 105, 4368, 107] +source_text: +user +The quick brown fox jumps over the lazy dog. Explain in one paragraph what this sentence demonstrates. +model + diff --git a/.sisyphus/notes/gemma4-baseline/prompts/short_chat.txt b/.sisyphus/notes/gemma4-baseline/prompts/short_chat.txt new file mode 100644 index 00000000..1f6f79dd --- /dev/null +++ b/.sisyphus/notes/gemma4-baseline/prompts/short_chat.txt @@ -0,0 +1 @@ +105,2364,107,818,3823,8864,37423,38167,1024,506,31770,4799,236761,42085,528,886,15649,1144,672,13315,29350,236761,106,107,105,4368,107 \ No newline at end of file diff --git a/.sisyphus/notes/gemma4-baseline/run_64k_drafter_ab.sh b/.sisyphus/notes/gemma4-baseline/run_64k_drafter_ab.sh new file mode 100755 index 00000000..c21e788f --- /dev/null +++ b/.sisyphus/notes/gemma4-baseline/run_64k_drafter_ab.sh @@ -0,0 +1,96 @@ +#!/usr/bin/env bash +# 64k context, TQ3 KV, pFlash on, dense 31B. +# Compare drafters: target-only vs MTP vs dflash. 
+set -e +cd /home/peppi/Dev/lucebox-hub +export PATH=/usr/local/cuda-13.1/bin:$PATH + +LOGDIR=.sisyphus/notes/gemma4-baseline/matrix-64k +mkdir -p $LOGDIR + +MODEL=models/gemma-4-31B-it-Q4_K_M.gguf +MTP=models/gemma4-mtp-31B/gemma-4-31B-it-assistant.Q8_0.gguf +DFLASH_DIR=models/draft-gemma4-31b +PROMPT=.sisyphus/notes/gemma4-baseline/prompts/long_50k.txt + +CTX=65536 +NPREDICT=256 + +run_cell() { + local tag=$1; shift + local logfile=$LOGDIR/${tag}.log + echo "=== ${tag} ===" | tee -a $LOGDIR/SUMMARY.md + ./dflash/build/test_gemma4_dflash \ + --model $MODEL \ + --tokens-file $PROMPT \ + --kv-k tq3_0 --kv-v tq3_0 \ + --ctx-size $CTX --pflash \ + --n-predict $NPREDICT --temp 0 --seed 0 --ignore-eos \ + "$@" \ + > $logfile 2>&1 + local rc=$? + echo "${tag} rc=$rc" | tee -a $LOGDIR/SUMMARY.md + return $rc +} + +echo "# 64k drafter A/B with TQ3 + pFlash (dense 31B) — $(date -Iseconds)" > $LOGDIR/SUMMARY.md +echo "Prompt: long_50k.txt (~50k tokens), ctx=$CTX, n_predict=$NPREDICT" >> $LOGDIR/SUMMARY.md +echo "" >> $LOGDIR/SUMMARY.md + +# T1: target-only (baseline) +run_cell T1_none --draft-method none || echo "T1 failed but continuing" + +# T2: MTP drafter +run_cell T2_mtp --draft-method mtp --mtp $MTP || echo "T2 failed but continuing" + +# T3: dflash drafter +run_cell T3_dflash --draft-method dflash --draft $DFLASH_DIR || echo "T3 failed but continuing" + +# Stats extraction +echo "" >> $LOGDIR/SUMMARY.md +echo "## Per-cell stats" >> $LOGDIR/SUMMARY.md +for cell in T1_none T2_mtp T3_dflash; do + log=$LOGDIR/${cell}.log + [ -f $log ] || continue + echo "" >> $LOGDIR/SUMMARY.md + echo "### ${cell}" >> $LOGDIR/SUMMARY.md + echo '```' >> $LOGDIR/SUMMARY.md + grep -E "kv types|narrow asymmetric|pflash|prefill.*tokens in|context_used|tok/s=|VRAM used|^\[mtp\] steps|accept_rate|GGML_ABORT|fatal" $log 2>&1 | head -25 >> $LOGDIR/SUMMARY.md + echo '```' >> $LOGDIR/SUMMARY.md +done + +# Decoded text comparison (first 80 generated tokens for each) +echo "" >> $LOGDIR/SUMMARY.md +echo "## First 80 generated tokens (decoded)" >> $LOGDIR/SUMMARY.md +python3 - <<'PY' >> $LOGDIR/SUMMARY.md +import re, os +from transformers import AutoTokenizer +LOGDIR = ".sisyphus/notes/gemma4-baseline/matrix-64k" +t = AutoTokenizer.from_pretrained("google/gemma-3-27b-it") + +for cell in ("T1_none", "T2_mtp", "T3_dflash"): + p = f"{LOGDIR}/{cell}.log" + if not os.path.exists(p): + print(f"\n### {cell}: NO LOG"); continue + with open(p) as f: log = f.read() + # Slice between [prefill] ... ms and [stats]/[mtp] block + if "[prefill]" not in log: + print(f"\n### {cell}: no [prefill] marker"); continue + body = log.split("[prefill]", 2)[-1] + body = body.split("[stats]")[0] + body = re.sub(r'\[mtp-step \d+\] accept_rate=[\d.]+', '', body) + body = re.sub(r'ggml_backend_cuda_graph_compute:.*?\n', '', body) + body = re.sub(r'\[mtp-dbg\][^\n]*\n', '', body) + nums = re.findall(r'(?<!\d)\d+(?!\d)', body) + ids = [int(n) for n in nums if int(n) < 262144] # vocab-range filter: drops stray counters left over from log noise + print(f"\n### {cell}") + print(f"raw extracted (first 80): {ids[:80]}") + print(f"decoded (first 80): {repr(t.decode(ids[:80], skip_special_tokens=False))}") +PY + +echo "" >> $LOGDIR/SUMMARY.md +echo "DONE" >> $LOGDIR/SUMMARY.md +echo "" | tee -a $LOGDIR/SUMMARY.md +echo "All cells complete.
See $LOGDIR/SUMMARY.md" diff --git a/.sisyphus/notes/gemma4-baseline/run_64k_v2.sh b/.sisyphus/notes/gemma4-baseline/run_64k_v2.sh new file mode 100755 index 00000000..db712d44 --- /dev/null +++ b/.sisyphus/notes/gemma4-baseline/run_64k_v2.sh @@ -0,0 +1,77 @@ +#!/usr/bin/env bash +# 64k matrix v2 — all 3 fixes in (SWA mask + TQ3 dispatcher + head_dim=512 mask). +# Dense 31B + TQ3/TQ3 + pflash + ctx 65536 + 50k prompt. +set -e +cd /home/peppi/Dev/lucebox-hub +export PATH=/usr/local/cuda-13.1/bin:$PATH + +LOGDIR=.sisyphus/notes/gemma4-baseline/matrix-64k-v2 +mkdir -p $LOGDIR + +MODEL=models/gemma-4-31B-it-Q4_K_M.gguf +MTP=models/gemma4-mtp-31B/gemma-4-31B-it-assistant.Q8_0.gguf +DFLASH_GGUF=dflash/models/draft-gemma4-31b/draft-q8_0.gguf +PROMPT=.sisyphus/notes/gemma4-baseline/prompts/long_50k.txt + +CTX=65536 +NPREDICT=256 + +run_cell() { + local tag=$1; shift + local rc=0 + echo "=== ${tag} starting at $(date +%H:%M:%S) ===" | tee -a $LOGDIR/SUMMARY.md + ./dflash/build/test_gemma4_dflash \ + --model $MODEL \ + --tokens-file $PROMPT \ + --kv-k tq3_0 --kv-v tq3_0 \ + --ctx-size $CTX --pflash \ + --n-predict $NPREDICT --temp 0 --seed 0 --ignore-eos \ + "$@" \ + > $LOGDIR/${tag}.log 2>&1 || rc=$? # keep going on failure, but record the real exit code + echo "${tag} rc=$rc" | tee -a $LOGDIR/SUMMARY.md +} + +echo "# Matrix v2 at 64k — all fixes in. $(date -Iseconds)" > $LOGDIR/SUMMARY.md +echo "" >> $LOGDIR/SUMMARY.md + +run_cell V1_none --draft-method none +run_cell V2_mtp --draft-method mtp --mtp $MTP +run_cell V3_dflash_dm8 --draft-method dflash --draft $DFLASH_GGUF --draft-max 8 + +echo "" >> $LOGDIR/SUMMARY.md +echo "## Per-cell stats" >> $LOGDIR/SUMMARY.md +for cell in V1_none V2_mtp V3_dflash_dm8; do + log=$LOGDIR/${cell}.log + [ -f $log ] || continue + echo "" >> $LOGDIR/SUMMARY.md + echo "### ${cell}" >> $LOGDIR/SUMMARY.md + echo '```' >> $LOGDIR/SUMMARY.md + grep -E "kv types|narrow asymmetric|^\[draft\] KV|prefill.*tokens in|context_used|tok/s=|VRAM used|^\[mtp\] steps|^\[spec\]|GGML_ABORT" $log >> $LOGDIR/SUMMARY.md + echo '```' >> $LOGDIR/SUMMARY.md +done + +echo "" >> $LOGDIR/SUMMARY.md +echo "## Decoded text comparison (first 80 generated tokens)" >> $LOGDIR/SUMMARY.md +python3 - <<'PY' >> $LOGDIR/SUMMARY.md +import re, os +from transformers import AutoTokenizer +LOGDIR = ".sisyphus/notes/gemma4-baseline/matrix-64k-v2" +t = AutoTokenizer.from_pretrained("google/gemma-3-27b-it") +for cell in ("V1_none", "V2_mtp", "V3_dflash_dm8"): + p = f"{LOGDIR}/{cell}.log" + if not os.path.exists(p): print(f"\n### {cell}: NO LOG"); continue + with open(p) as f: log = f.read() + if "[prefill]" not in log: print(f"\n### {cell}: no [prefill] marker"); continue + body = log.split("[prefill]", 2)[-1].split("[stats]")[0] + body = re.sub(r'\[mtp-step \d+\] accept_rate=[\d.]+', '', body) + body = re.sub(r'\[step \d+\] accept=\d+/\d+ avg=[\d.]+', '', body) + body = re.sub(r'ggml_backend[^\n]*\n', '', body) + body = re.sub(r'\[(mtp-dbg|draft)[^\]]*\][^\n]*\n', '', body) + nums = re.findall(r'(?<!\d)\d+(?!\d)', body) + ids = [int(n) for n in nums if int(n) < 262144] + print(f"\n### {cell}") + print(f"decoded (first 80): {repr(t.decode(ids[:80], skip_special_tokens=False))}") +PY + +echo "DONE" >> $LOGDIR/SUMMARY.md + +run() { + local tag=$1; local ctx=$2; local dm=$3 + local rc=0 + echo "=== ${tag} (ctx=${ctx} dm=${dm}) starting at $(date +%H:%M:%S) ===" | tee -a $LOGDIR/SUMMARY.md + ./dflash/build/test_gemma4_dflash \ + --model $MOE \ + --draft $MOE_DFLASH \ + --draft-method dflash --draft-max $dm \ + --tokens-file $PROMPT_CODE \ + --kv-k q8_0 --kv-v q8_0 \ + --ctx-size $ctx --pflash \ + --n-predict 256 --temp 0 --seed 0 --ignore-eos \ + > $LOGDIR/${tag}.log 2>&1 || rc=$? + echo "${tag} rc=$rc"
| tee -a $LOGDIR/SUMMARY.md +} + +# 64k: dm sweep on code prompt +for dm in 1 2 4 8; do + run code64k_dm${dm} 65536 $dm +done + +# 256k with the best/typical dm to verify at the high end +run code256k_dm4 262144 4 +run code256k_dm8 262144 8 + +# 4k with code (sanity baseline — should hit the best AL) +run code4k_dm4 4096 4 +run code4k_dm8 4096 8 + +echo "" >> $LOGDIR/SUMMARY.md +echo "## Per-cell stats" >> $LOGDIR/SUMMARY.md +for log in $LOGDIR/*.log; do + tag=$(basename $log .log) + echo "" >> $LOGDIR/SUMMARY.md + echo "### $tag" >> $LOGDIR/SUMMARY.md + echo '```' >> $LOGDIR/SUMMARY.md + grep -E "context_used|prefill.*tokens in|tok/s=|VRAM used|^\[spec\]|GGML_ABORT|^\[draft\] KV" $log >> $LOGDIR/SUMMARY.md + echo '```' >> $LOGDIR/SUMMARY.md +done + +echo "" | tee -a $LOGDIR/SUMMARY.md +echo "DONE" | tee -a $LOGDIR/SUMMARY.md diff --git a/.sisyphus/notes/gemma4-baseline/run_matrix_v3.sh b/.sisyphus/notes/gemma4-baseline/run_matrix_v3.sh new file mode 100755 index 00000000..d08f049b --- /dev/null +++ b/.sisyphus/notes/gemma4-baseline/run_matrix_v3.sh @@ -0,0 +1,97 @@ +#!/usr/bin/env bash +# Matrix v3: validate the SWA-mask fix at HEAD across the production-relevant configs. +# Cells: +# N1: target-only K=Q8 V=TQ3 (validate fix on long prompt; expect coherent) +# N2: target-only K=Q8 V=Q8 (control, expect coherent — was M2 in v2) +# N3: MTP K=Q8 V=TQ3 (the production ship target — measure accept_rate) +# N4: MTP K=Q8 V=Q8 (previous safe baseline — was M4 in v2; expect crash ~step 210) +set -e +cd /home/peppi/Dev/lucebox-hub +export PATH=/usr/local/cuda-13.1/bin:$PATH +LOGDIR=.sisyphus/notes/gemma4-baseline/matrix-v3 +mkdir -p $LOGDIR + +MODEL=models/gemma-4-31B-it-Q4_K_M.gguf +MTP=models/gemma4-mtp-31B/gemma-4-31B-it-assistant.Q8_0.gguf +PROMPT=.sisyphus/notes/gemma4-baseline/prompts/long_open.txt + +run_cell() { + local tag=$1; local kvk=$2; local kvv=$3; local draft=$4 + local logfile=$LOGDIR/${tag}.log + local rc=0 + echo "=== $tag (K=$kvk V=$kvv draft=$draft) ===" | tee -a $LOGDIR/SUMMARY.md + local args=( + --model $MODEL + --draft-method $draft + --kv-k $kvk --kv-v $kvv + --tokens-file $PROMPT + --n-predict 256 --temp 0 --seed 0 --ignore-eos + ) + if [ "$draft" = "mtp" ]; then + args+=(--mtp $MTP) + fi + ./dflash/build/test_gemma4_dflash "${args[@]}" > $logfile 2>&1 || rc=$? # N4 is expected to crash; don't let set -e kill the rest of the matrix
+ echo "$tag rc=$rc" | tee -a $LOGDIR/SUMMARY.md +} + +echo "# Matrix v3 with SWA mask fix — $(date -Iseconds)" > $LOGDIR/SUMMARY.md +run_cell N1_none_q8_tq3 q8_0 tq3_0 none +run_cell N2_none_q8_q8 q8_0 q8_0 none +run_cell N3_mtp_q8_tq3 q8_0 tq3_0 mtp +run_cell N4_mtp_q8_q8 q8_0 q8_0 mtp + +# Decode + report +echo "" | tee -a $LOGDIR/SUMMARY.md +echo "## Decoded outputs (first 80 generated tokens) + accept_rate trajectories" | tee -a $LOGDIR/SUMMARY.md +python3 - <<'PY' | tee -a $LOGDIR/SUMMARY.md +import re, os +from transformers import AutoTokenizer +LOGDIR = ".sisyphus/notes/gemma4-baseline/matrix-v3" +t = AutoTokenizer.from_pretrained("google/gemma-3-27b-it") + +def extract_gen_ids(log): + """Get generated token IDs from a log; tokens are space-separated ints between [prefill] and [stats].""" + ids = [] + for line in log.splitlines(): + s = line.strip() + if not s: continue + if s.startswith("[") and s.endswith("]"): continue + # Pure numeric line + if re.match(r'^[\d\s]+$', s): + for x in s.split(): + if x.isdigit() and int(x) < 262144: + ids.append(int(x)) + return ids + +def extract_accept_rates(log): + return [(int(m.group(1)), float(m.group(2))) for m in re.finditer(r'\[mtp-step (\d+)\] accept_rate=([0-9.]+)', log)] + +def extract_final_stats(log): + m = re.search(r'\[mtp\] steps=(\d+) accepted=(\d+) accept_rate=([0-9.]+)', log) + if m: return f"steps={m.group(1)} accepted={m.group(2)} accept_rate={m.group(3)}" + m = re.search(r'\[stats\] generated=(\d+).*?tok/s=([0-9.]+)', log) + if m: return f"generated={m.group(1)} tok/s={m.group(2)}" + return "no stats" + +for tag in ["N1_none_q8_tq3", "N2_none_q8_q8", "N3_mtp_q8_tq3", "N4_mtp_q8_q8"]: + p = f"{LOGDIR}/{tag}.log" + if not os.path.exists(p): + print(f"\n### {tag}: NO LOG"); continue + with open(p) as f: log = f.read() + crashed = "GGML_ABORT" in log or "Aborted" in log or "core dumped" in log + ids = extract_gen_ids(log) + txt80 = t.decode(ids[:80], skip_special_tokens=False) if ids else "(no tokens)" + print(f"\n### {tag}") + print(f"crashed: {crashed}") + print(f"final_stats: {extract_final_stats(log)}") + print(f"first_80_decoded: {txt80!r}") + if "mtp" in tag: + rates = extract_accept_rates(log) + if rates: + print(f"accept_rate trajectory ({len(rates)} samples):") + for step, rate in rates[::4]: # every 4th sample + print(f" step={step:3d} rate={rate:.2f}") + print(f" step={rates[-1][0]:3d} rate={rates[-1][1]:.2f} (final)") +PY + +echo "DONE" | tee -a $LOGDIR/SUMMARY.md diff --git a/.sisyphus/notes/gemma4-baseline/run_scaling.sh b/.sisyphus/notes/gemma4-baseline/run_scaling.sh new file mode 100755 index 00000000..dcf58d0f --- /dev/null +++ b/.sisyphus/notes/gemma4-baseline/run_scaling.sh @@ -0,0 +1,66 @@ +#!/usr/bin/env bash +# (1) Verify dense 31B + Q8/Q8 at 64k actually OOMs or not. +# (2) Sweep MoE 26B + dflash + Q8/Q8 + dm=4 across context sizes from 16k upward. 
+set -e +cd /home/peppi/Dev/lucebox-hub +export PATH=/usr/local/cuda-13.1/bin:$PATH + +LOGDIR=.sisyphus/notes/gemma4-baseline/scaling +mkdir -p $LOGDIR + +DENSE=models/gemma-4-31B-it-Q4_K_M.gguf +MOE=/home/peppi/models/gemma4-26b-a4b-it/gemma-4-26B-A4B-it-UD-Q4_K_M.gguf +MOE_DFLASH=/home/peppi/models/gemma4-26b-a4b-dflash/draft-q8_0.gguf + +PROMPT_SHORT=.sisyphus/notes/gemma4-baseline/prompts/long_open.txt # 40 toks +PROMPT_2K=.sisyphus/notes/gemma4-baseline/prompts/long_2k.txt # 2611 toks +PROMPT_50K=.sisyphus/notes/gemma4-baseline/prompts/long_50k.txt # 49904 toks + +echo "# Scaling matrix — $(date -Iseconds)" > $LOGDIR/SUMMARY.md + +run() { + local tag=$1; local logfile=$LOGDIR/${tag}.log + shift + local rc=0 + echo "=== $tag starting at $(date +%H:%M:%S) ===" | tee -a $LOGDIR/SUMMARY.md + ./dflash/build/test_gemma4_dflash "$@" \ + --n-predict 256 --temp 0 --seed 0 --ignore-eos \ + > $logfile 2>&1 || rc=$? # keep going on failure, but record the real exit code + echo "$tag rc=$rc" | tee -a $LOGDIR/SUMMARY.md +} + +# (1) Dense 31B + Q8/Q8 at 64k — test the assumption it OOMs +run D1_dense_q8_q8_64k \ + --model $DENSE \ + --tokens-file $PROMPT_50K \ + --kv-k q8_0 --kv-v q8_0 \ + --ctx-size 65536 --pflash \ + --draft-method none + +# (2) MoE 26B sweep — Q8/Q8, dflash drafter, dm=4 (user-recommended for MoE) +# Steps: 16k, 32k, 64k, 128k, 256k. Use long_50k where it fits, smaller prompt where it doesn't. +for ctx in 16384 32768 65536 131072 262144; do + if [ $ctx -ge 65536 ]; then PROMPT=$PROMPT_50K; else PROMPT=$PROMPT_2K; fi + run M_moe_dflash_q8q8_${ctx} \ + --model $MOE \ + --draft $MOE_DFLASH \ + --draft-method dflash --draft-max 4 \ + --tokens-file $PROMPT \ + --kv-k q8_0 --kv-v q8_0 \ + --ctx-size $ctx --pflash +done + +# Compact summary +echo "" >> $LOGDIR/SUMMARY.md +echo "## Per-cell stats" >> $LOGDIR/SUMMARY.md +for log in $LOGDIR/*.log; do + tag=$(basename $log .log) + echo "" >> $LOGDIR/SUMMARY.md + echo "### $tag" >> $LOGDIR/SUMMARY.md + echo '```' >> $LOGDIR/SUMMARY.md + grep -E "context_used|kv types|prefill.*tokens in|tok/s=|VRAM used|^\[mtp\] steps|^\[spec\]|GGML_ABORT|out of memory|cudaMalloc|^\[draft\] KV" $log >> $LOGDIR/SUMMARY.md + echo '```' >> $LOGDIR/SUMMARY.md +done + +echo "" | tee -a $LOGDIR/SUMMARY.md +echo "DONE" | tee -a $LOGDIR/SUMMARY.md diff --git a/.sisyphus/notes/gemma4-baseline/run_scientific.sh b/.sisyphus/notes/gemma4-baseline/run_scientific.sh new file mode 100755 index 00000000..24a76b06 --- /dev/null +++ b/.sisyphus/notes/gemma4-baseline/run_scientific.sh @@ -0,0 +1,164 @@ +#!/usr/bin/env bash +# Scientific bench: dense vs MoE × {code, creative} × dm ∈ {1,2,4,8,16,32}. +# For each cell: GPU power profile → total/prefill/decode energy + tok/J + tok/s. +# NOTE: no set -e; we want the script to keep going even if individual cells fail.
+cd /home/peppi/Dev/lucebox-hub +export PATH=/usr/local/cuda-13.1/bin:$PATH + +LOGDIR=.sisyphus/notes/gemma4-baseline/scientific +POWLOG=$LOGDIR/power +mkdir -p $LOGDIR $POWLOG + +DENSE=models/gemma-4-31B-it-Q4_K_M.gguf +DENSE_DFLASH=dflash/models/draft-gemma4-31b/draft-q8_0.gguf +MOE=/home/peppi/models/gemma4-26b-a4b-it/gemma-4-26B-A4B-it-UD-Q4_K_M.gguf +MOE_DFLASH=/home/peppi/models/gemma4-26b-a4b-dflash/draft-q8_0.gguf + +PROMPT_CODE=.sisyphus/notes/gemma4-baseline/prompts/humaneval_2.txt +PROMPT_CREATIVE=.sisyphus/notes/gemma4-baseline/prompts/long_open.txt + +echo "# Scientific bench — $(date -Iseconds)" > $LOGDIR/SUMMARY.md +echo "Q8/Q8 KV, 4K ctx, n_predict=256, temp=0 seed=0 --ignore-eos, pflash on" >> $LOGDIR/SUMMARY.md + +run_cell() { + local model=$1; local draft=$2; local prompt=$3; local dm=$4; local tag=$5 + local logfile=$LOGDIR/${tag}.log + local powfile=$POWLOG/${tag}.csv + local rc=0 + + # Start GPU power telemetry: timestamp + power.draw, every 100ms via tight loop. + ( while true; do + printf "%s,%s\n" "$(date +%s.%N)" "$(nvidia-smi --query-gpu=power.draw --format=csv,noheader,nounits 2>/dev/null | head -1)" + sleep 0.1 + done ) > $powfile 2>/dev/null & + local POW_PID=$! + + local t0=$(date +%s.%N) + ./dflash/build/test_gemma4_dflash \ + --model $model \ + --draft $draft \ + --draft-method dflash --draft-max $dm \ + --tokens-file $prompt \ + --kv-k q8_0 --kv-v q8_0 \ + --ctx-size 4096 --pflash \ + --n-predict 256 --temp 0 --seed 0 --ignore-eos \ + > $logfile 2>&1 || rc=$? # keep going on failure, but record the real exit code + local t_end=$(date +%s.%N) + + kill $POW_PID 2>/dev/null || true + wait $POW_PID 2>/dev/null || true + + echo "$tag rc=$rc t0=$t0 t_end=$t_end" >> $LOGDIR/timestamps.csv +} + +# 24 cells: 2 models × 2 prompts × 6 dms +for dm in 1 2 4 8 16 32; do + run_cell $DENSE $DENSE_DFLASH $PROMPT_CODE $dm dense_code_dm${dm} + run_cell $DENSE $DENSE_DFLASH $PROMPT_CREATIVE $dm dense_creative_dm${dm} + run_cell $MOE $MOE_DFLASH $PROMPT_CODE $dm moe_code_dm${dm} + run_cell $MOE $MOE_DFLASH $PROMPT_CREATIVE $dm moe_creative_dm${dm} +done + +# Analysis: parse logs + power profiles, compute per-phase energy +python3 - <<'PY' > $LOGDIR/results.csv +import os, re, csv +from glob import glob + +LOGDIR = ".sisyphus/notes/gemma4-baseline/scientific" +POWDIR = f"{LOGDIR}/power" + +# Parse timestamps.csv -> {tag: (t0, t_end)} +tags = {} +with open(f"{LOGDIR}/timestamps.csv") as f: + for line in f: + m = re.match(r"(\S+) rc=(\d+) t0=(\S+) t_end=(\S+)", line) + if m: + tags[m.group(1)] = {"rc": int(m.group(2)), "t0": float(m.group(3)), "t_end": float(m.group(4))} + +writer = csv.writer(__import__("sys").stdout) +writer.writerow([ + "cell", "rc", + "wall_s", "prefill_ms", "decode_ms", "first_tok_ms", + "prefill_tok_s", "decode_tok_s", + "AL", "VRAM_GB", + "avg_power_W", "total_energy_J", + "prefill_energy_J", "decode_energy_J", + "decode_J_per_tok", +]) + +for tag, ts in tags.items(): + log_path = f"{LOGDIR}/{tag}.log" + pow_path = f"{POWDIR}/{tag}.csv" + if not os.path.exists(log_path): + continue + log = open(log_path).read() + + def grep(pat, default=""): + m = re.search(pat, log) + return m.group(1) if m else default + + prefill_ms = grep(r"\[prefill\] \d+ tokens in ([0-9.]+) ms", "") + prefill_tok_s = grep(r"\[prefill\] \d+ tokens in [0-9.]+ ms \(([0-9.]+) tok/s\)", "") + prefill_n = grep(r"prefill=(\d+) tokens", "") + decode_ms = grep(r"decode_ms=([0-9.]+)", "") + decode_tok_s = grep(r"tok/s=([0-9.]+)", "") + first_tok_ms = grep(r"first_tok_ms=([0-9.]+)", "") + AL = grep(r"avg_accept=([0-9.]+)", "") + VRAM = grep(r"VRAM
used=([0-9.]+) GB", "") + + # Power integration + samples = [] + if os.path.exists(pow_path): + for line in open(pow_path): + try: + t, p = line.strip().split(",") + samples.append((float(t), float(p))) + except: + pass + + wall_s = ts["t_end"] - ts["t0"] + avg_power = (sum(p for _, p in samples) / len(samples)) if samples else 0.0 + total_E = 0.0 + for i in range(len(samples) - 1): + dt = samples[i+1][0] - samples[i][0] + total_E += dt * (samples[i][1] + samples[i+1][1]) / 2 # trapezoidal + + # Per-phase energy: integrate over the binary's reported prefill/decode windows. + # Phase boundaries from binary: T0 (start) → T0+startup_ms → T0+startup+prefill_ms → T_end. + # We don't have explicit startup; approximate: first 1s is startup+model load (largely CPU, lower GPU power). + # Simpler: split total energy by time fractions. + pms = float(prefill_ms) if prefill_ms else 0.0 + dms = float(decode_ms) if decode_ms else 0.0 + total_active_ms = pms + dms + if total_active_ms > 0 and total_E > 0: + # Integrate over the *active* window (skip first ~1s of model load) + active_start_idx = max(0, int(len(samples) * 1.0 / max(wall_s, 1))) + active_samples = samples[active_start_idx:] + active_E = 0.0 + for i in range(len(active_samples) - 1): + dt = active_samples[i+1][0] - active_samples[i][0] + active_E += dt * (active_samples[i][1] + active_samples[i+1][1]) / 2 + prefill_E = active_E * (pms / total_active_ms) + decode_E = active_E * (dms / total_active_ms) + else: + prefill_E = decode_E = 0.0 + + decode_J_per_tok = decode_E / 256.0 if decode_E > 0 else 0.0 + + writer.writerow([ + tag, ts["rc"], + f"{wall_s:.2f}", prefill_ms, decode_ms, first_tok_ms, + prefill_tok_s, decode_tok_s, + AL, VRAM, + f"{avg_power:.1f}", f"{total_E:.1f}", + f"{prefill_E:.1f}", f"{decode_E:.1f}", + f"{decode_J_per_tok:.3f}", + ]) +PY + +echo "" >> $LOGDIR/SUMMARY.md +echo "## Results table — see results.csv for full data" >> $LOGDIR/SUMMARY.md +echo '```' >> $LOGDIR/SUMMARY.md +column -t -s, $LOGDIR/results.csv >> $LOGDIR/SUMMARY.md 2>/dev/null +echo '```' >> $LOGDIR/SUMMARY.md +echo "DONE" | tee -a $LOGDIR/SUMMARY.md diff --git a/.sisyphus/notes/gemma4-baseline/scaling/SUMMARY.md b/.sisyphus/notes/gemma4-baseline/scaling/SUMMARY.md new file mode 100644 index 00000000..98c0c2eb --- /dev/null +++ b/.sisyphus/notes/gemma4-baseline/scaling/SUMMARY.md @@ -0,0 +1,87 @@ +# Scaling matrix — 2026-05-10T00:00:56+02:00 +=== D1_dense_q8_q8_64k starting at 00:00:56 === +D1_dense_q8_q8_64k rc=0 +=== M_moe_dflash_q8q8_16384 starting at 00:02:19 === +M_moe_dflash_q8q8_16384 rc=0 +=== M_moe_dflash_q8q8_32768 starting at 00:02:43 === +M_moe_dflash_q8q8_32768 rc=0 +=== M_moe_dflash_q8q8_65536 starting at 00:03:03 === +M_moe_dflash_q8q8_65536 rc=0 +=== M_moe_dflash_q8q8_131072 starting at 00:03:29 === +M_moe_dflash_q8q8_131072 rc=0 +=== M_moe_dflash_q8q8_262144 starting at 00:03:51 === +M_moe_dflash_q8q8_262144 rc=0 + +## Per-cell stats + +### D1_dense_q8_q8_64k +``` +[cache] kv types: SWA=q8_0, full=q8_0 +[prefill] 49904 tokens in 35599.5 ms (1401.8 tok/s) [chunked+pflash, chunk_size=1024] (last sampled token: 236798) +[stats] generated=256 decode_ms=32143.1 tok/s=7.96 first_tok_ms=157.93 +[stats] prefill=49904 tokens context_used=50160/65536 +[mem] VRAM used=22.60 GB total=24.00 GB +``` + +### M_moe_dflash_q8q8_131072 +``` +[cache] kv types: SWA=q8_0, full=q8_0 +[draft] KV cache allocated: 2096 slots +[prefill] 49904 tokens in 10209.1 ms (4888.2 tok/s) [chunked+pflash, chunk_size=1024] (last sampled token: 100) +[draft] KV prefill done: 
2096 positions materialized (skipped 47808 early tokens, cap=2096) +[stats] generated=256 decode_ms=8560.5 tok/s=29.90 first_tok_ms=62.33 +[stats] prefill=49904 tokens context_used=50160/131072 +[spec] draft_steps=177 total_accepted=256 avg_accept=1.45 +[mem] VRAM used=20.42 GB total=24.00 GB +``` + +### M_moe_dflash_q8q8_16384 +``` +[cache] kv types: SWA=q8_0, full=q8_0 +[draft] KV cache allocated: 2096 slots +[prefill] 2612 tokens in 703.8 ms (3711.5 tok/s) [chunked+pflash, chunk_size=1024] (last sampled token: 7243) +[draft] KV prefill done: 2096 positions materialized (skipped 516 early tokens, cap=2096) +[stats] generated=256 decode_ms=3540.0 tok/s=72.32 first_tok_ms=41.71 +[stats] prefill=2612 tokens context_used=2868/16384 +[spec] draft_steps=154 total_accepted=256 avg_accept=1.66 +[mem] VRAM used=19.27 GB total=24.00 GB +``` + +### M_moe_dflash_q8q8_262144 +``` +[cache] kv types: SWA=q8_0, full=q8_0 +[draft] KV cache allocated: 2096 slots +[prefill] 49904 tokens in 10197.4 ms (4893.8 tok/s) [chunked+pflash, chunk_size=1024] (last sampled token: 100) +[draft] KV prefill done: 2096 positions materialized (skipped 47808 early tokens, cap=2096) +[stats] generated=256 decode_ms=8707.2 tok/s=29.40 first_tok_ms=62.09 +[stats] prefill=49904 tokens context_used=50160/262144 +[spec] draft_steps=177 total_accepted=256 avg_accept=1.45 +[mem] VRAM used=21.74 GB total=24.00 GB +``` + +### M_moe_dflash_q8q8_32768 +``` +[cache] kv types: SWA=q8_0, full=q8_0 +[draft] KV cache allocated: 2096 slots +[prefill] 2612 tokens in 681.4 ms (3833.4 tok/s) [chunked+pflash, chunk_size=1024] (last sampled token: 7243) +[draft] KV prefill done: 2096 positions materialized (skipped 516 early tokens, cap=2096) +[stats] generated=256 decode_ms=3629.2 tok/s=70.54 first_tok_ms=38.51 +[stats] prefill=2612 tokens context_used=2868/32768 +[spec] draft_steps=154 total_accepted=256 avg_accept=1.66 +[mem] VRAM used=19.45 GB total=24.00 GB +``` + +### M_moe_dflash_q8q8_65536 +``` +[cache] kv types: SWA=q8_0, full=q8_0 +[draft] KV cache allocated: 2096 slots +[prefill] 49904 tokens in 10229.4 ms (4878.5 tok/s) [chunked+pflash, chunk_size=1024] (last sampled token: 100) +[draft] KV prefill done: 2096 positions materialized (skipped 47808 early tokens, cap=2096) +[stats] generated=256 decode_ms=8851.8 tok/s=28.92 first_tok_ms=65.18 +[stats] prefill=49904 tokens context_used=50160/65536 +[spec] draft_steps=177 total_accepted=256 avg_accept=1.45 +[mem] VRAM used=19.74 GB total=24.00 GB +``` + +### orchestrator +``` diff --git a/.sisyphus/notes/gemma4-baseline/scientific/SUMMARY.md b/.sisyphus/notes/gemma4-baseline/scientific/SUMMARY.md new file mode 100644 index 00000000..341a8114 --- /dev/null +++ b/.sisyphus/notes/gemma4-baseline/scientific/SUMMARY.md @@ -0,0 +1,32 @@ +# Scientific bench — 2026-05-10T01:01:25+02:00 +Q8/Q8 KV, 4K ctx, n_predict=256, temp=0 seed=0 --ignore-eos, pflash on + +## Results table — see results.csv for full data +``` +cell rc wall_s prefill_ms decode_ms first_tok_ms prefill_tok_s decode_tok_s AL VRAM_GB avg_power_W total_energy_J prefill_energy_J decode_energy_J decode_J_per_tok +dense_code_dm1 0 36.32 175.9 8414.9 42.37 796.0 30.42 1.00 22.10 180.2 7350.9 148.3 7095.5 27.717 +dense_creative_dm1 0 12.04 97.4 8325.1 39.33 421.1 30.75 1.00 22.07 317.5 3757.9 41.4 3542.6 13.838 +moe_code_dm1 0 29.18 180.2 4517.1 34.85 777.1 56.67 1.00 18.89 139.7 4017.6 147.7 3702.2 14.462 +moe_creative_dm1 0 7.70 119.9 4410.8 33.37 341.9 58.04 1.00 18.87 209.5 1641.6 39.6 1458.3 5.696 +dense_code_dm2 0 34.19 189.8 
5506.4 55.76 737.8 46.49 1.82 22.09 163.6 5452.2 177.2 5140.8 20.081 +dense_creative_dm2 0 29.30 98.4 5390.6 50.84 416.6 47.49 1.87 22.10 171.6 4869.7 84.8 4645.3 18.146 +moe_code_dm2 0 23.54 181.3 2680.0 41.38 772.2 95.52 1.90 18.92 132.3 3160.5 191.1 2825.1 11.036 +moe_creative_dm2 0 10.17 123.2 3117.4 36.85 332.8 82.12 1.63 18.90 205.4 1869.3 67.2 1700.5 6.643 +dense_code_dm4 0 33.64 187.8 4593.4 68.09 745.3 55.73 2.91 22.10 159.6 6161.0 236.8 5793.1 22.629 +dense_creative_dm4 0 18.55 96.6 4379.8 61.58 424.5 58.45 3.05 22.08 182.9 3401.0 70.2 3182.8 12.433 +moe_code_dm4 0 27.17 180.5 2163.9 39.02 775.4 118.30 2.61 18.90 131.9 3535.9 260.2 3119.4 12.185 +moe_creative_dm4 0 16.02 119.7 2705.5 35.68 342.7 94.62 2.12 18.88 147.6 2393.7 94.3 2132.4 8.330 +dense_code_dm8 0 34.52 172.4 5341.2 99.02 812.3 47.93 4.06 22.10 163.9 5502.8 167.6 5191.7 20.280 +dense_creative_dm8 0 16.63 97.1 4891.6 95.39 422.4 52.34 4.41 22.07 230.2 3347.8 62.2 3135.6 12.249 +moe_code_dm8 0 27.70 204.5 1942.9 54.00 684.6 131.76 3.88 18.89 137.8 4216.6 387.8 3684.0 14.391 +moe_creative_dm8 0 11.41 121.0 3717.3 46.98 338.9 68.87 2.03 18.88 196.0 2235.2 65.4 2009.9 7.851 +dense_code_dm16 0 33.79 179.5 3764.8 82.46 779.9 68.00 4.20 22.11 147.6 4801.4 211.6 4437.6 17.334 +dense_creative_dm16 0 14.47 99.5 3134.0 80.90 412.1 81.68 5.12 22.08 176.9 2506.7 71.6 2256.5 8.815 +moe_code_dm16 0 27.26 180.4 1937.8 64.99 776.2 132.11 5.22 18.90 129.6 3457.1 281.7 3025.6 11.819 +moe_creative_dm16 0 11.71 126.5 4135.2 61.63 324.2 61.91 2.49 18.89 175.7 2052.5 56.0 1830.3 7.150 +dense_code_dm32 0 34.69 195.5 3786.8 89.02 715.9 67.60 4.20 22.11 146.5 4948.0 236.2 4575.5 17.873 +dense_creative_dm32 0 18.75 95.8 3101.7 79.06 428.0 82.54 5.12 22.08 169.6 2954.9 85.0 2751.0 10.746 +moe_code_dm32 0 27.70 177.4 1932.8 66.76 789.2 132.45 5.22 18.90 129.3 4044.4 327.3 3566.5 13.932 +moe_creative_dm32 0 17.34 131.5 4172.6 67.01 311.9 61.35 2.49 18.89 154.4 2685.9 77.0 2443.8 9.546 +``` +DONE diff --git a/.sisyphus/notes/gemma4-baseline/scientific/results.csv b/.sisyphus/notes/gemma4-baseline/scientific/results.csv new file mode 100644 index 00000000..94abc992 --- /dev/null +++ b/.sisyphus/notes/gemma4-baseline/scientific/results.csv @@ -0,0 +1,25 @@ +cell,rc,wall_s,prefill_ms,decode_ms,first_tok_ms,prefill_tok_s,decode_tok_s,AL,VRAM_GB,avg_power_W,total_energy_J,prefill_energy_J,decode_energy_J,decode_J_per_tok +dense_code_dm1,0,36.32,175.9,8414.9,42.37,796.0,30.42,1.00,22.10,180.2,7350.9,148.3,7095.5,27.717 +dense_creative_dm1,0,12.04,97.4,8325.1,39.33,421.1,30.75,1.00,22.07,317.5,3757.9,41.4,3542.6,13.838 +moe_code_dm1,0,29.18,180.2,4517.1,34.85,777.1,56.67,1.00,18.89,139.7,4017.6,147.7,3702.2,14.462 +moe_creative_dm1,0,7.70,119.9,4410.8,33.37,341.9,58.04,1.00,18.87,209.5,1641.6,39.6,1458.3,5.696 +dense_code_dm2,0,34.19,189.8,5506.4,55.76,737.8,46.49,1.82,22.09,163.6,5452.2,177.2,5140.8,20.081 +dense_creative_dm2,0,29.30,98.4,5390.6,50.84,416.6,47.49,1.87,22.10,171.6,4869.7,84.8,4645.3,18.146 +moe_code_dm2,0,23.54,181.3,2680.0,41.38,772.2,95.52,1.90,18.92,132.3,3160.5,191.1,2825.1,11.036 +moe_creative_dm2,0,10.17,123.2,3117.4,36.85,332.8,82.12,1.63,18.90,205.4,1869.3,67.2,1700.5,6.643 +dense_code_dm4,0,33.64,187.8,4593.4,68.09,745.3,55.73,2.91,22.10,159.6,6161.0,236.8,5793.1,22.629 +dense_creative_dm4,0,18.55,96.6,4379.8,61.58,424.5,58.45,3.05,22.08,182.9,3401.0,70.2,3182.8,12.433 +moe_code_dm4,0,27.17,180.5,2163.9,39.02,775.4,118.30,2.61,18.90,131.9,3535.9,260.2,3119.4,12.185 
+moe_creative_dm4,0,16.02,119.7,2705.5,35.68,342.7,94.62,2.12,18.88,147.6,2393.7,94.3,2132.4,8.330 +dense_code_dm8,0,34.52,172.4,5341.2,99.02,812.3,47.93,4.06,22.10,163.9,5502.8,167.6,5191.7,20.280 +dense_creative_dm8,0,16.63,97.1,4891.6,95.39,422.4,52.34,4.41,22.07,230.2,3347.8,62.2,3135.6,12.249 +moe_code_dm8,0,27.70,204.5,1942.9,54.00,684.6,131.76,3.88,18.89,137.8,4216.6,387.8,3684.0,14.391 +moe_creative_dm8,0,11.41,121.0,3717.3,46.98,338.9,68.87,2.03,18.88,196.0,2235.2,65.4,2009.9,7.851 +dense_code_dm16,0,33.79,179.5,3764.8,82.46,779.9,68.00,4.20,22.11,147.6,4801.4,211.6,4437.6,17.334 +dense_creative_dm16,0,14.47,99.5,3134.0,80.90,412.1,81.68,5.12,22.08,176.9,2506.7,71.6,2256.5,8.815 +moe_code_dm16,0,27.26,180.4,1937.8,64.99,776.2,132.11,5.22,18.90,129.6,3457.1,281.7,3025.6,11.819 +moe_creative_dm16,0,11.71,126.5,4135.2,61.63,324.2,61.91,2.49,18.89,175.7,2052.5,56.0,1830.3,7.150 +dense_code_dm32,0,34.69,195.5,3786.8,89.02,715.9,67.60,4.20,22.11,146.5,4948.0,236.2,4575.5,17.873 +dense_creative_dm32,0,18.75,95.8,3101.7,79.06,428.0,82.54,5.12,22.08,169.6,2954.9,85.0,2751.0,10.746 +moe_code_dm32,0,27.70,177.4,1932.8,66.76,789.2,132.45,5.22,18.90,129.3,4044.4,327.3,3566.5,13.932 +moe_creative_dm32,0,17.34,131.5,4172.6,67.01,311.9,61.35,2.49,18.89,154.4,2685.9,77.0,2443.8,9.546 diff --git a/.sisyphus/notes/gemma4-baseline/scientific/timestamps.csv b/.sisyphus/notes/gemma4-baseline/scientific/timestamps.csv new file mode 100644 index 00000000..e4e9a4ab --- /dev/null +++ b/.sisyphus/notes/gemma4-baseline/scientific/timestamps.csv @@ -0,0 +1,24 @@ +dense_code_dm1 rc=0 t0=1778367685.477371744 t_end=1778367721.793818783 +dense_creative_dm1 rc=0 t0=1778367721.794931126 t_end=1778367733.839192846 +moe_code_dm1 rc=0 t0=1778367733.840091813 t_end=1778367763.015678232 +moe_creative_dm1 rc=0 t0=1778367763.016548568 t_end=1778367770.721033915 +dense_code_dm2 rc=0 t0=1778367770.721930366 t_end=1778367804.913462950 +dense_creative_dm2 rc=0 t0=1778367804.914387847 t_end=1778367834.214837479 +moe_code_dm2 rc=0 t0=1778367834.215866999 t_end=1778367857.752659489 +moe_creative_dm2 rc=0 t0=1778367857.753545947 t_end=1778367867.924493190 +dense_code_dm4 rc=0 t0=1778367867.925434008 t_end=1778367901.569052028 +dense_creative_dm4 rc=0 t0=1778367901.569960746 t_end=1778367920.117962771 +moe_code_dm4 rc=0 t0=1778367920.118900624 t_end=1778367947.287365095 +moe_creative_dm4 rc=0 t0=1778367947.288422676 t_end=1778367963.304656030 +dense_code_dm8 rc=0 t0=1778367963.306052952 t_end=1778367997.821679914 +dense_creative_dm8 rc=0 t0=1778367997.822956270 t_end=1778368014.457833242 +moe_code_dm8 rc=0 t0=1778368014.458888628 t_end=1778368042.161315417 +moe_creative_dm8 rc=0 t0=1778368042.162586106 t_end=1778368053.575751597 +dense_code_dm16 rc=0 t0=1778368053.576573194 t_end=1778368087.362434340 +dense_creative_dm16 rc=0 t0=1778368087.363372540 t_end=1778368101.830030550 +moe_code_dm16 rc=0 t0=1778368101.830961459 t_end=1778368129.087987475 +moe_creative_dm16 rc=0 t0=1778368129.088904976 t_end=1778368140.797896926 +dense_code_dm32 rc=0 t0=1778368140.798771797 t_end=1778368175.493668623 +dense_creative_dm32 rc=0 t0=1778368175.494866194 t_end=1778368194.244521868 +moe_code_dm32 rc=0 t0=1778368194.245404463 t_end=1778368221.950156586 +moe_creative_dm32 rc=0 t0=1778368221.951272113 t_end=1778368239.295198724 diff --git a/.sisyphus/notes/gemma4-journey.md b/.sisyphus/notes/gemma4-journey.md new file mode 100644 index 00000000..804149ce --- /dev/null +++ b/.sisyphus/notes/gemma4-journey.md @@ -0,0 +1,298 @@ +# Making Gemma-4 fast 
on a 24 GB GPU: a debugging journey + +**TL;DR** — Three bugs were silently stealing performance from Gemma-4 on 3090-class hardware: (1) the SWA causal mask was never built for single-token decode, (2) the TQ3_0 KV path silently routed through an MMA kernel that didn't undo the FWHT rotation, and (3) the head_dim=512 MMA gqa-opt branch was rejected for Q8 KV because no mask was supplied. After fixing those, the same 26B-A4B MoE that *collapsed to 13 tok/s decode at 64k context* in our baseline ships **36.57 tok/s** with the same drafter and weights — and runs to **256k context at 21.7 GB on a 24 GB RTX 3090**. Here are the lessons that took days of compute to discover, compressed into a few thousand words, in case they save someone else a weekend. + +This post is written for engineers maintaining a fork of `llama.cpp` for speculative decoding (DFlash / MTP / Medusa-class), targeting Gemma-4 (or any Gemma-4-like SWA + GQA + chunked-prefill model) on consumer GPUs. Skip ahead to the section that matches your symptom. + +--- + +## The setting + +- **Hardware**: RTX 3090, 24 GB VRAM, sm_86, CUDA 13.1. +- **Model A**: Gemma-4-31B-it Q4_K_M (dense, ~18 GB weights). 60 layers, 50 SWA + 10 full-attn, head_dim 256 (SWA) and 512 (full-attn), n_head_kv=8, GQA ratio 8. +- **Model B**: Gemma-4-26B-A4B-it Q4_K_M (MoE, ~13 GB weights). 30 layers, 8 of 128 active experts + 1 shared per layer. +- **Drafters**: + - DFlash (z-lab style, target-conditioned): 5-layer Q8 GGUF for both 31B (1.6 GB) and MoE (456 MB). + - MTP (Google-style, 4-layer assistant attached to target): only available for 31B dense. +- **Stack**: `test_gemma4_dflash` binary, ggml/cuda backend, KV-cache with TQ3_0 / Q8_0 / Q4_0 / F16 options. pFlash (block-sparse prefill) is implemented as a custom `GGML_OP_FLASH_ATTN_SPARSE` op. + +The headline goal we were chasing: **MoE 26B + dflash + 256k context, single 24 GB GPU, in production-relevant tok/s**. Spoiler: that's now feasible, and the production numbers are in this post. But the path here started with us being unable to reproduce a 0.22 accept_rate baseline that turned out to be measured on garbage input. + +--- + +## Day 0: the contaminated baseline + +The plan we inherited declared: +> TQ3_0 cross-attention now functional. accept_rate 0.22 on Q4_K_M target + Q8_0 assistant + TQ3_0 KV, 131-token prompt, 64-step generation. + +Running the same command, we got **0.22 at step 64**, then a slow slide to **0.06 at step 256**. That doesn't pattern-match a real text completion. Decoding the generated token IDs back to strings using HF's `google/gemma-3-27b-it` tokenizer (which is byte-identical to the GGUF's vocab, 262144 entries, BOS=2, EOS=106 — verified by side-by-side comparison of `gguf.GGUFReader`'s `tokenizer.ggml.tokens` and `AutoTokenizer.from_pretrained()`'s vocab) revealed: + +``` +をlaenat quelelele tolaredlele samme a które a a a a + a a a a a a a a a które up up a a robot samme a robot +``` + +Multilingual gibberish followed by an `a robot` repetition loop. Not a real completion. + +**The first lesson:** the test driver `test_gemma4_dflash` was using **byte-fallback tokenization** by default — the message in the binary's startup is literally: +``` +[tokens] byte-fallback tokenisation: 102 tokens (pass --tokens for real tokenisation) +``` +102 bytes for "The quick brown fox jumps over the lazy dog. Explain in one paragraph what this sentence demonstrates." Not 25-30 BPE tokens. The model was being fed UTF-8 bytes as if each were a vocab id.
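+
+A three-line check would have caught this on day 0. A sketch (same HF tokenizer the prompt files were built with; counts are approximate):
+
+```python
+# If a driver's reported token count equals the prompt's UTF-8 byte count,
+# it is byte-fallback "tokenisation", not BPE — garbage in, garbage benched.
+from transformers import AutoTokenizer
+
+prompt = ("The quick brown fox jumps over the lazy dog. "
+          "Explain in one paragraph what this sentence demonstrates.")
+
+t = AutoTokenizer.from_pretrained("google/gemma-3-27b-it")
+bpe = t.encode(prompt, add_special_tokens=False)
+
+print(len(prompt.encode("utf-8")))  # ~102 — what the driver was feeding the model
+print(len(bpe))                     # ~20  — what the model actually expects
+```
+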
**Fix**: built a tokenization pipeline using the in-repo HumanEval+ jsonl + HF's `google/gemma-3-27b-it` tokenizer + the Gemma chat template (`<start_of_turn>user\n…<end_of_turn>\n<start_of_turn>model\n`). Saved 6 prompts: short_chat (27 tok), long_open (40 tok), long_2k (2611 tok, Alice in Wonderland Ch. 1), long_50k (49904 tok, Tiny Shakespeare summarisation request), long_code_50k (50002 tok, concatenated HumanEval+ tasks), and humaneval_2 (139 tok, single HE task with EvalPlus chat format). + +After that switch, the **same** binary on the **same** model produced "This sentence is a **pangram**, which is a phrase that contains every letter of the alphabet at least once. Because it is relatively short and coherent" — a real answer. + +**Take-away**: if your bench framework outputs token IDs only and your accept_rate metric is "drafter agrees with target", you can chase 0.22 forever on inputs that aren't even in distribution. Always decode and read your output text. **Real-token plumbing first; everything else after.** + +--- + +## Bug 1: the SWA mask that wasn't there for single-token decode + +With real BPE input, target+TQ3 still produced garbage. Target+Q8 produced clean prose. Same prompt, same seed, same temp=0. + +Decoding ablations narrowed the suspect to the SWA layers' attention. Adding a `fprintf` diagnostic at the SWA FA call site revealed: + +``` +[swa-fa-diag] il=0 n_tokens=28 kv_start=0 K_ne1=2048 mask=swa_mask mask_ne0=2048 mask_ne1=32 ← prefill ✓ +[swa-fa-diag] il=0 n_tokens=1 kv_start=28 K_ne1=2048 mask=attn_mask mask_ne0=256 mask_ne1=32 ← decode ✗ +``` + +For prefill (n_tokens=28), the mask is the proper `swa_mask`, sized 2048×32 to match the K view. For decode (n_tokens=1), it falls back to the full-attn `attn_mask` sized to the kv_len padding (256 wide), but the K view is still 2048. **256-wide mask, 2048-wide K view.** The kernel reads past the populated region into uninitialized cudaMalloc bytes. + +Why didn't it break for Q8? Q8's higher precision was tolerant — the populated 28 K positions still dominated the corrupted attention distribution. TQ3's quant noise + uninitialized garbage produced a near-flat distribution that fed the LM head a weak signal, which fell back to high-frequency tokens (`'en'`, `'a'`, ``). + +The bug was in the test driver's graph builder. The SWA mask was guarded: +```cpp +if (n_tokens > 1) { // ← "batched prefill only" + sg.swa_mask = ggml_new_tensor_2d(...); +} +``` +And the comment in `internal.h` was: +```cpp +ggml_tensor * swa_mask = nullptr; // sliding-window causal mask (batched prefill only) +``` + +**Fix**: drop the `n_tokens > 1` guard; allocate `swa_mask` always when masks are requested. Add the matching `build_swa_causal_mask()` call at all four single-token decode sites (daemon decode, decode warmup, MTP target verify, target-only decode). Update the comment to reflect the new contract. + +This one fix changed Q8 output too: "This sentence is a sentence" → "This sentence is a phrase" (slightly different attention output → different greedy trajectory, both coherent). The pre-fix Q8 was incidentally surviving; with the mask correct, it's actually more right. + +--- + +## The bisect that proved the bug was older + +After Bug 1, target+TQ3 still produced garbage. The hypothesis was a recent regression in the MTP-related commits. We did a `git bisect` over the 6 commits between `7eea84b` (last pre-MTP) and `c56879c` (HEAD before our fixes), with a coherence-check predicate that decoded the first 16 generated tokens and looked for English alphabet runs.
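+
+The predicate was, roughly (a reconstruction, not the exact script — assume `gen_ids` comes from parsing the run log):
+
+```python
+# Bisect predicate sketch: decode the first 16 generated ids and pass only if
+# the text is dominated by runs of ASCII letters rather than mixed-script noise.
+import re
+from transformers import AutoTokenizer
+
+t = AutoTokenizer.from_pretrained("google/gemma-3-27b-it")
+
+def looks_coherent(gen_ids, n=16):
+    text = t.decode(gen_ids[:n], skip_special_tokens=True)
+    letters = sum(len(w) for w in re.findall(r"[A-Za-z]{2,}", text))
+    return len(text) > 0 and letters / len(text) > 0.5
+
+# `git bisect run` wrapper idea: exit 0 = good commit, 1 = bad commit.
+```
+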
+ +**Result**: every commit in the bisect range produced the same garbage on TQ3. Walking back further, we tested at `ce4da35` (the very first commit that integrated TQ3 with Gemma-4, "narrow asymmetric KV"). Same garbage. **TQ3 + Gemma-4 + real BPE tokens has never produced coherent text.** + +The "0.22 accept_rate" the plan had been chasing was on byte-fallback junk input. Both the target and the drafter were generating non-text. The drafter's "accept rate" was just "drafter successfully predicts that the target produces the same gibberish." Double-fake. + +**Take-away**: a well-run bisect that returns "no good commit" is a real answer. Stop bisecting and pivot. The bug was older than the bisect range; it lives in the TQ3 + Gemma-4 interaction, not in any specific commit. + +--- + +## Bug 2: the TQ3 K dequant intercept that silently strips FWHT rotation + +A focused Codex audit on the TQ3 K-side path landed the cause in `dflash/deps/llama.cpp/ggml/src/ggml-cuda/fattn.cu` lines 134–204: + +```cpp +if (K->type == GGML_TYPE_TQ3_0 || V->type == GGML_TYPE_TQ3_0) { + // ... allocate temp F16 buffers ... + if (K->type == GGML_TYPE_TQ3_0) { + k_tq3_0_dequant_f16_full<<<...>>>(...); // dequant TQ3_0 → F16 + K_f16.type = GGML_TYPE_F16; + dst->src[1] = &K_f16; // ← swap K reference + } + // ... same for V ... + // re-enter the standard MMA dispatch with substituted K/V types + switch (Q->ne[0]) { + case 256: ggml_cuda_flash_attn_ext_mma_f16_switch_ncols2<256, 256>(ctx, dst); break; + case 512: ggml_cuda_flash_attn_ext_mma_f16_switch_ncols2<512, 512>(ctx, dst); break; + ... + } +} +``` + +The intercept dequantizes TQ3_0 K storage to F16 and re-enters the MMA dispatch. **But TQ3_0 K is stored in FWHT-rotated form** (the rotation is applied during `tq3_rotate_forward` at quantization write time; see `cpy-utils.cuh:215+227`). The chunked / vec FA kernels handle this correctly: when they see `K->type == GGML_TYPE_TQ3_0`, they forward-rotate Q before computing Q@K, so the dot product happens in matched FWHT space. The MMA intercept skips that hook because by the time it dispatches, `K->type == GGML_TYPE_F16`. **Q is in standard space, K is in FWHT-rotated space, Q@K is computed in mismatched domains.** + +V works on the asymmetric path (K=Q8, V=TQ3) for the same reason inverted: the V FWHT only affects the post-attention output, not the attention score distribution. Tokens are sampled from `softmax(QK)` not from V. So **V being in the wrong domain doesn't break token selection**, only the output values get rotated incorrectly — and since the next layer's V is computed fresh from W_v @ x of the rotated output, it just propagates a basis change that doesn't matter for argmax sampling. + +The dispatcher in `fattn.cu:534` had: +```cpp +const bool tq3_needs_chunked = tq3_any && (Q->ne[1] > 1 || Q->ne[0] > 256) && !tq3_use_mma; +``` +For SWA decode (`Q->ne[1]==1, Q->ne[0]==256`), neither clause fires; the path falls through to the broken MMA intercept. + +**Fix**: drop the `(Q->ne[1] > 1 || Q->ne[0] > 256)` guard. Force chunked for ALL TQ3 cases unless `DFLASH_TQ3_MMA` is opted in. Restores the "historically forced chunked" behavior the in-source comment explicitly claims. + +After this fix, MTP+TQ3/TQ3 went from accept_rate 0.05 (degenerate loop) to **0.56 with coherent prose**: "Unit 7 was designed for precision, not poetry. A maintenance droid with a steady hydraulic arm, its world was composed of grids, bolts, and gray steel." 
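+
+The domain mismatch is easy to see numerically. A Hadamard rotation is orthogonal, so dot products survive only when *both* operands are rotated — a toy numpy sketch of the invariant (illustration only, not the kernel code):
+
+```python
+# dot(Hq, Hk) == dot(q, k) for an orthonormal Walsh-Hadamard transform H,
+# but dot(q, Hk) is unrelated — exactly the Q@K the broken MMA intercept computed.
+import numpy as np
+
+def fwht(x):
+    # normalized fast Walsh-Hadamard transform; length must be a power of two
+    y, h, n = x.copy(), 1, x.shape[-1]
+    while h < n:
+        for i in range(0, n, h * 2):
+            a, b = y[i:i+h].copy(), y[i+h:i+2*h].copy()
+            y[i:i+h], y[i+h:i+2*h] = a + b, a - b
+        h *= 2
+    return y / np.sqrt(n)
+
+rng = np.random.default_rng(0)
+q, k = rng.standard_normal(256), rng.standard_normal(256)
+
+print(np.dot(q, k))              # reference attention score
+print(np.dot(fwht(q), fwht(k)))  # matched domains: same value (orthogonality)
+print(np.dot(q, fwht(k)))        # mismatched domains: garbage score
+```
+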
+ +--- + +## Bug 3: head_dim=512 + Q8/Q8 MMA gqa-opt requires a non-null mask + +After Bugs 1+2 landed, MTP+Q8/Q8 still aborted in production at `fattn.cu:659: GGML_ABORT("fatal error")` around step ~110. The dispatcher returned `BEST_FATTN_KERNEL_NONE` for the head_dim=512 path because the gqa-opt-applies check requires both `K->ne[1] % 256 == 0` AND `mask != nullptr`. The MTP graph builder padded K view length to 256 alignment (good), but its `need_mask` predicate was: +```cpp +const bool need_mask = (kv_is_tq3 && head_dim_fa >= 512) || needs_kv_pad; +``` +For Q8 KV at a kv_view_len that happened to be 256-aligned, neither clause fired — no mask, no gqa-opt, dispatcher rejection, abort. + +**Fix**: drop the `kv_is_tq3` gate. Always set `need_mask` when `head_dim_fa >= 512`. The MMA gqa-opt path needs the mask regardless of K type or KV alignment. + +After this fix, MTP + Q8/Q8 + HumanEval/2 + 4K context ran the full 256 steps with **accept_rate 0.87** (peaked at 1.00 in early steps, settled at 0.87 by step 112 — the same step it used to abort at). + +--- + +## The HumanEval surprise: "the regression that wasn't" + +With all three fixes in, we ran a regression check at HEAD `7b62c07`+`d758ed9bf`+`f1f811e` matching PR #131's published reference (31B + Q8/Q8 + dflash @ 4K = 149 tok/s decode, AL 10.67/16): + +``` +./test_gemma4_dflash --model 31B-Q4_K_M.gguf --draft draft-q8_0.gguf \ + --draft-method dflash --draft-max 8 \ + --tokens-file long_open.txt --kv-k q8_0 --kv-v q8_0 \ + --ctx-size 4096 --n-predict 256 ... +``` + +Result: **23.77 tok/s, AL 2.13/8.** Six times slower than published, AL collapsed from 10.67 to 2.13. We assumed our fixes broke dflash and queued a bisect. + +The bisect couldn't run (Codex sandbox lacks GPU pass-through) so we tried a different angle: re-ran with the same config but a **HumanEval/2 prompt** (139 BPE tokens of Python code) instead of `long_open.txt` (40-token "robot story" creative prompt). + +**56.12 tok/s, AL 5.12/8 — 64% acceptance rate.** Switching the prompt from creative writing to code more than doubled tok/s and AL. The dflash drafter had been trained on code (it's a 5-layer model distilled from target activations on HumanEval-class tasks) and creative writing was severely OOD. + +PR #131's reported 0.667 accept-rate (10.67/16) is statistically identical to our 0.64 (5.12/8). **No regression**. The "regression" was a prompt distribution mismatch. + +**Take-away**: drafter quality is **not** intrinsic — it's a function of (drafter × target × prompt distribution). When you bench a drafter, bench it on the prompt distribution it was trained for. Even better, document that distribution next to the headline number. PR #131's "10.67/16" claim wasn't wrong but was incompletely contextualized: it was on code prompts; on creative writing it would've shown the same collapse we hit. + +--- + +## DM sweep: PR #131's 64K result was over-speculation, not drafter collapse + +PR #131 documented an 8× decode regression at 64k: +> 64K MoE: 1997→4028 tok/s prefill (+101.7%), decode **13 tok/s**, accept **1.23/16** ← drafter diverges + +We got curious: was this *really* drafter collapse, or was it over-speculation (a draft budget of 16 with a low accept rate just wastes compute)?
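+
+A back-of-envelope model says a sweet spot has to exist. Assume an i.i.d. per-token accept probability p (a simplification — real acceptance decays with draft depth) and hypothetical, not measured, per-pass costs:
+
+```python
+# Standard speculative-decoding economics sketch: expected tokens per verify
+# step is the geometric series 1 + p + ... + p^d (accepted prefix + the bonus
+# token), while step time is d drafter passes plus one target verify.
+def tok_per_unit_time(p, d, t_draft=1.0, t_verify=8.0):
+    expected = (1 - p ** (d + 1)) / (1 - p)
+    return expected / (d * t_draft + t_verify)
+
+for d in (1, 2, 4, 8, 16):
+    print(d, round(tok_per_unit_time(0.6, d), 3))
+# rises, peaks, then falls: extra drafts are mostly rejected but still paid for
+```
+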
+ +We swept `--draft-max ∈ {1, 2, 4, 8}` on MoE + dflash + Q8/Q8 + pflash + 50k code prompt at 64k context: + +| dm | tok/s | AL | accept rate | +|---|---|---|---| +| 1 | 23.01 | 1.00 | 100% (trivial — draft always = target's first prediction) | +| 2 | 33.81 | 1.51 | 76% | +| **4** | **36.57** | **1.79** | **45%** | +| 8 | 29.45 | 1.86 | 23% | + +**dm=4 is the sweet spot** — high enough to amortise verification, low enough to not waste compute on rejected drafts. **2.8× improvement over PR #131's published 13 tok/s for the same model and context.** + +The same dm=4 also holds at 256k context: **35.30 tok/s** (or 36.63 in a confirm run — variance ±5%). VRAM 21.73 GB — fits a 24 GB 3090 with 2.3 GB headroom. + +For dense 31B the right value is dm=8. For MoE 26B the right value is dm=4. The 2× ratio reflects MoE's lower per-token compute (sparse experts) — verification is faster, so smaller speculation budget pays off. + +--- + +## Scaling: MoE 26B + dflash + Q8/Q8 fits 256k on a 24 GB GPU + +The full ladder, all with the same 50k code prompt + dm=4: + +| ctx | Decode tok/s | AL | VRAM | Δ vs 64k | +|---|---|---|---|---| +| 64k | 36.57 | 1.79 | 19.74 GB | (baseline) | +| 128k | 35.21 | 1.77 | 20.40 GB | -3.7% | +| 256k | 35.30 / 36.63 | 1.79 | 21.73 GB | -3.5% / +0.2% | + +Decode tok/s is **flat** from 64k → 256k. Cache allocation grows by ~700 MB per ctx-doubling, which is just the empty buffer overhead — actual KV usage is held at 50k tokens, so per-step KV bandwidth is constant. + +### Dense 31B + dflash + Q8/Q8 + pflash + dm=8 — the same ladder + +For comparison we ran the dense 31B at the same ctx ladder with the same code prompt. **pFlash is on for both**; the dense vs MoE prefill gap is architectural (15× compute ratio), not a pflash failure. Both runs log `[chunked+pflash, chunk_size=1024]`. + +| ctx | Prefill tok/s | Decode tok/s | AL/8 | VRAM | +|---|---|---|---|---| +| **64k** | 319 | **1.78** ← anomaly | **1.94** (24%) | **24/24 GB cap** | +| **128k** | 256 | **24.89** | **7.11** (89%) | 24/24 GB cap | +| **256k** | 236 | **23.87** | **7.11** (89%) | 24/24 GB cap | + +Two structural observations: + +1. **The dense+drafter ladder hits the 24 GB cap at every cell**, but only the 64K cell decodes catastrophically slowly with a collapsed AL. Both 128K and 256K decode healthily at ~24 tok/s with AL 7.11 (89% acceptance). All three cells use the same Q8 GGUF drafter (`draft-q8_0.gguf`, 1.52 GiB on GPU) and identical config. **The 64K-specific collapse is an open puzzle.** Hypotheses: (a) the cache happens to land in a VRAM region that forces drafter eviction or paging only at this specific ctx allocation, (b) some allocator-fragmentation edge case kicks in when the 50k-token prompt tightly fills 78% of a 64k cache vs 39% of a 128k cache. We did not isolate the cause. +2. **Dense prefill is uniformly 13–20× slower than MoE** at the same ctx. The MoE has ~4B active params/token with 30 layers; dense has 31B params with 60 layers — that's a ~15× compute ratio that matches the observed prefill ratio. pFlash helps both, but it can only skip attention; the FFN compute is unavoidable and dense has 7-8× more of it active per token. Plus dense is hitting the 24 GB cap so some unknown fraction is paging contention; we cannot separate the two contributions on a 24 GB GPU. + +So the dense **does** decode well at long ctx (128K/256K @ ~24 tok/s, AL 7.11) once you get past the prefill cost. 
But for a 24 GB GPU the prefill economics are bad: a 50K-token prompt takes ~3 minutes on dense vs ~10 seconds on MoE (architectural, not bug-territory). + +### Net: MoE 26B is the long-context ship target on 24 GB + +MoE at 256k fits at 21.7 GB with a 50k-token prefill in ~10 seconds and decode at 35-37 tok/s. Dense at 256k caps at 24 GB, prefills the same 50k tokens in ~3.5 minutes, and decodes ~24 tok/s. **MoE wins on prefill TTFT, fits long context with headroom, and decodes ~50% faster post-prefill.** Dense 31B remains useful at small context where MTP gives the highest AL (0.87 accept_rate at 4K with the head_dim=512 mask fix). + +--- + +## What still hurts: bandwidth, not bugs + +A bandwidth model: RTX 3090 nominal 936 GB/s. Reading the full Q8 KV for a 50k-token cache costs `50000 × 30 layers × 2 (K+V) × 8 heads × 256 head_dim × 1 byte ≈ 6.1 GB/step`. Theoretical ceiling: 152 tok/s. We hit 36.57 — about 24% of bandwidth ceiling. The remaining 76% is split among weight reads (model is 13 GB, attention reads it once per step), MoE FFN routing+execution (the active 4B), drafter forward (several extra KV reads through the drafter's own cache), speculative verification (target forward over the draft block), and ggml graph launch overhead. + +Going from 4k to 50k actual KV gave **3× decode slowdown** (111 → 36 tok/s). A pure-bandwidth model would predict 12×. The fact we see only 3× means weights and overhead dominate at small ctx; KV bandwidth dominates at long ctx; the regimes meet around ctx ≈ 32k. + +The remaining ~75% gap to bandwidth ceiling is not bug territory anymore — it's the structural cost of running a real model. Closing it would require **decode-time KV sparsity** (H2O / StreamingLLM / Quest / Landmark Attention / QuantSpec). None of those is integrated in any production speculative-decoding stack we found. That's an open opportunity, not a fix. + +--- + +## Lessons that would have saved us a weekend + +In rough order of importance, things we wish someone had told us: + +1. **If your decoder takes raw prompts as bytes, that's a bug pretending to be a feature.** Build the tokenization plumbing first. Decode and *read* every output. If your accept_rate metric is "drafter agrees with target" and your inputs are out of distribution, drafter will agree with target on garbage and you'll celebrate a meaningless number. +2. **Drafter quality is not intrinsic.** Bench on the prompt distribution the drafter was trained on. A 6× tok/s gap between code and creative-writing prompts is normal and not a regression. +3. **Speculation budget has a sweet spot per model**, not a "higher is better" curve. Sweep dm. For Gemma-4 26B-A4B MoE the answer is 4. For 31B dense it's 8. PR #131 used the framework default 16 which is over-speculation at long context. +4. **TQ3_0 (or any FWHT-quantised KV) requires the kernel to know the storage is in rotated space.** Any path that dequants and re-dispatches loses that information. Force the chunked path explicitly; don't let an MMA fast-path silently strip the rotation. +5. **Single-token decode is a special case for SWA masks.** The mask geometry that's correct for batched prefill is wrong for single-token decode if the K view is the full SWA ring. Don't gate mask construction on `n_tokens > 1`. +6. **`gqa_opt_applies` (the head_dim=512 MMA fast path) requires BOTH alignment AND a mask.** The "NONE → abort" failure mode is silent until the kernel selector returns NONE. +7. 
**A bisect that returns "every commit in range is bad" is a real answer.** Walk further back, or pivot to direct code audit. Don't keep bisecting. +8. **MoE > dense for long-context PREFILL on consumer GPUs.** The 26B MoE with ~4B active params has both lower active compute AND lower weights footprint than the 31B dense. Both fit 256k context on a 24 GB GPU; the MoE prefills a 50K prompt in ~10 seconds vs dense's ~3.5 minutes (15× compute ratio). Dense's decode at 128K/256K is fine (~24 tok/s, AL 7.11) but its 64K cell collapses anomalously — open puzzle. +9. **Q8/Q8 KV is 2.4× faster on prefill than TQ3/TQ3 KV at 64k**, costs only 1.3 GB more, and decode is comparable. Use Q8 unless VRAM forces you to TQ3. +10. **`pflash` is prefill-only.** It helps both dense and MoE, but it can only skip the *attention* compute; the FFN compute is unavoidable, which is why dense (60 layers × 31B params) prefills 15× slower than MoE (30 layers × 4B active) even with pflash on. The decode-time KV bottleneck is unaddressed in production stacks. Decode-time block-sparse attention (Quest, H2O, StreamingLLM) is the next research-to-production move. + +--- + +## Production ship config (RTX 3090, Gemma-4) + +All "with drafter" cells use dflash + Q8 GGUF drafter + pflash + Q8/Q8 KV. + +| Use case | Config | Prefill tok/s | Decode tok/s | VRAM | +|---|---|---|---|---| +| **Long context (≥64k), code/agent — primary ship target** | **MoE 26B + dflash + dm=4** | **4900** at 64K | **35–37 from 64K to 256K** | **19.7–21.7 GB** | +| Short context (4k), code/agent | MoE 26B + dflash + dm=4 | (small ctx, ~3700) | ~112 | 19 GB | +| Long context, dense — once prefill is paid | 31B dense + dflash + dm=8 (128K/256K) | 240–260 (slow) | ~24, AL 7.11 | 24/24 GB cap | +| Short context, highest quality MTP | 31B dense + MTP (post-Bug-3 fix) | (small ctx) | 34, accept_rate 0.87 | 20 GB | +| Short context, dflash dense reference | 31B dense + dflash + dm=8 (HumanEval/2) | ~800 | ~98 | 22 GB | +| 64K dense, target-only sanity | 31B + Q8/Q8 + pflash | 1402 | 7.96 | 22.6 GB | +| 64K dense, TQ3 minimum-VRAM | 31B + TQ3/TQ3 + pflash | 585 | 6.90 | 21.25 GB | +| ⚠️ Avoid: dense + drafter + ctx=64K | (anomaly: 1.78 tok/s, AL 1.94) | — | — | — | + +The headline: **MoE 26B + dflash + Q8/Q8 + pflash + dm=4 fits 256K on a 24 GB 3090 at 35–37 tok/s decode and 4.9K tok/s prefill**. With the three fixes from this session (TQ3 dispatcher, SWA mask, head_dim=512 mask) all upstream, this is a real ship config, not a benchmark stunt. + +**Avoid the dense+drafter+64K-specific cell** until the AL-collapse anomaly is understood — same model + same drafter + same config decodes fine at 128K/256K but craters at 64K. Possibly a VRAM-allocator edge case. + +--- + +## What still wastes our compute, and might waste yours + +Open questions that we did not resolve but identified clearly: + +- **Drafter context window cap.** Our 5-layer dflash drafter has a 2096-slot KV cache. On a 50k prompt, it skips the first 47808 tokens. Larger drafter caches (5k? 10k?) might recover meaningful AL at long context. No public ablation on this exists — we looked. +- **Decode-time KV sparsity.** None of H2O / StreamingLLM / Quest / Landmark Attention / QuantSpec is wired into any production speculative-decoding stack we found. Fitting one would close the bandwidth gap at long context. 
+- **TQ3 SWA-wrap branch.** When the SWA ring wraps (sustained generation past the 1024-token SWA window on a SWA-windowed model) the wrap branch concat-forces F32, stripping the FWHT rotation again. Same class of bug as the one we fixed in the MMA intercept; same fix pattern (force chunked or split FA + combine softmax) applies. We didn't hit it in this session because our generations stayed within the unwrapped window.
+- **MoE MTP drafter.** Doesn't exist. Training one (4-layer assistant against MoE 26B target activations) would unlock the highest-acceptance-rate small-ctx demo on the smaller model. Until then, MoE relies on dflash only.
+- **An FA kernel for head_dim=512 + Q8 + non-aligned KV that doesn't require gqa-opt.** Our Bug-3 fix routes around the issue by always providing a mask. A kernel that handles the unaligned case directly would be cleaner.
+
+If you find yourself debugging similar symptoms on a different model (Llama-3 with MTP, Qwen with Medusa, …), the workflow that ended up working for us was:
+
+1. Set up real BPE tokenization first; never trust a byte-fallback baseline.
+2. Decode every output; never trust accept_rate alone.
+3. Bench multiple prompt distributions (code vs creative). The 6× gap is real.
+4. Sweep `--draft-max`. The optimum varies per model+drafter.
+5. When attention is wrong, instrument the FA call site directly (K type, K ne[1], mask name + dims). The dispatcher's "NONE → abort" path is silent.
+6. Use Codex (or any second LLM) for focused audits with EXPLICIT evidence in the prompt — not for general code review. The "give me a fix for this exact line:line" workflow saved us hours.
+
+The whole journey was three commits in one repo, one commit in a submodule, and roughly thirty benchmark cells. The numbers are documented in `.sisyphus/notes/gemma4-baseline/`. The bench scripts are reproducible on any 24 GB consumer GPU with CUDA 13.1.
+
+Have at it.
diff --git a/.sisyphus/plans/gemma4-context-scaling.md b/.sisyphus/plans/gemma4-context-scaling.md
new file mode 100644
index 00000000..745c75fa
--- /dev/null
+++ b/.sisyphus/plans/gemma4-context-scaling.md
@@ -0,0 +1,196 @@
+# Gemma4 context-scaling test plan: 1k → 4k → 8k → 32k → 256k
+
+**Status going in:** Bug 1 (SWA mask for n_tokens==1 decode, parent `7b62c07`) + Bug 2 (TQ3 K dispatcher → MMA intercept FWHT mismatch, submodule `d758ed9bf`) are fixed and verified at small context. **MTP+TQ3/TQ3 now works**: accept_rate 0.56 on 64 decode steps, coherent prose. Target+TQ3/TQ3 also coherent. Q8/Q8 unchanged.
+
+**Goal:** map performance + correctness across context lengths to find where the new fixes hold, where they break, and the highest-context production-ship config for the MoE + pFlash demo.
+
+**Hardware:** RTX 3090 24GB. CUDA 13.1, sm_86.
+
+---
+
+## Phase 0 — preflight
+
+Before any runs:
+
+1. **Build is clean at HEAD `7b62c07`**: rebuilt during this session, binary at `dflash/build/test_gemma4_dflash`.
+2. **Real tokenized prompts**: `short_chat.txt` (27 tok), `long_open.txt` (40 tok), `long_2k.txt` (2,611 tok) under `.sisyphus/notes/gemma4-baseline/prompts/`. **More prompts needed for ≥4k tests** (see Phase 4).
+3. 
**Models available**: + - 31B dense target: `models/gemma-4-31B-it-Q4_K_M.gguf` (~18 GB) + - 31B MTP drafter: `models/gemma4-mtp-31B/gemma-4-31B-it-assistant.Q8_0.gguf` + - **31B dflash drafter**: `/home/peppi/models/draft-gemma4-31b/` — **NOT TESTED YET this session** + - MoE target: `/home/peppi/models/gemma4-26b-a4b-it/gemma-4-26B-A4B-it-UD-Q4_K_M.gguf` (~13 GB) + - MoE dflash drafter: `/home/peppi/models/gemma4-26b-a4b-dflash/draft-q8_0.gguf` + - **MoE has NO MTP drafter** — MoE will run with dflash only (or no drafter). +4. **Deferred bugs not blocking this plan but to flag during runs**: + - FA-kernel-selection abort at `fattn.cu:652` for head_dim=512 + Q8 KV + non-aligned KV-length. Hit M4 at step ~210, will likely hit any 31B-dense + MTP + Q8/Q8 run that crosses similar context. Workaround: use TQ3 KV (now fixed) or stay below the trigger context. + +--- + +## Phase 1 — short context: 1k, 4k, 8k + +These all fit comfortably in VRAM regardless of KV type. Goal: lock in the baseline numbers across configs, confirm fixes hold past short tests, find the FA-crash trigger context for Q8/Q8. + +### Configs (each at all 3 context lengths) + +| Cell | Drafter | KV-K | KV-V | Why we run it | +|------|---------|------|------|---------------| +| A | none | Q8 | Q8 | Target-only baseline (no draft overhead) | +| B | none | TQ3 | TQ3 | Target-only with full TQ3 (memory savings) | +| C | mtp | Q8 | Q8 | Best-known MTP accept_rate (was 0.65-0.68 at small ctx) | +| D | mtp | TQ3 | TQ3 | **Headline production target** (was 0.56 at 64 steps) | +| E | dflash | Q8 | Q8 | The leg we missed — dflash on dense 31B | +| F | dflash | TQ3 | TQ3 | dflash + TQ3 — interesting if it inherits the fixes too | + +### Context sizes + prompt sizing + +| Ctx target | --ctx-size | Prompt tokens | n_predict | Notes | +|------------|------------|---------------|-----------|-------| +| 1k | 4096 (default) | 41 (`long_open.txt`) | 256 | Already covered by matrix-v3 — re-baseline only if needed | +| 4k | 4096 | 2611 (`long_2k.txt`) | 256 | Stresses prefill + decode at ~3k effective ctx | +| 8k | 8192 | ~6500 (need new prompt: `long_8k.txt`) | 512 | Need new prompt | + +### Pass criteria + +- Cell completes without crash at each context size. +- Decoded output (first 80 tokens) is coherent English (manual check). +- For drafter cells: accept_rate stable across 32-step windows (no slow collapse, no late spike from looping). +- Final tok/s recorded. + +### Numbers to record per cell + +`prefill_ms`, `prefill_tok/s`, `decode_ms`, `decode_tok/s`, `first_tok_ms`, `VRAM_used_GB`, `accept_rate_final` (drafters only), `coherent_yes_no`. + +--- + +## Phase 2 — medium context: 32k + +VRAM starts mattering: at 32k, Q8 KV ≈ 2 GB, TQ3 KV ≈ 0.9 GB. Both fit. + +### What we expect to break + +- **Cell C (MTP+Q8/Q8) at 32k**: very likely hits the FA-kernel-selection abort. The crash trigger is head_dim=512 + Q8 KV + non-FATTN_KQ_STRIDE-aligned KV-length. At 32k, alignment reaches non-aligned values frequently. Document the crash step. +- **Cells E/F (dflash drafter)**: unknown territory. Worth testing. + +### Prompts needed + +- `long_30k.txt` (~30k tokens) — generate from a Project Gutenberg novel chapter or similar reproducible source. + +### Configs to run + +Same A-F as Phase 1, with `--ctx-size 32768`, prompt = `long_30k.txt`, `n_predict 1024`. + +### Bonus: --pflash flag + +The driver supports `--pflash` for prompts ≥4096 tokens. At 32k, this is the natural domain. Add a `pflash` variant of cells A and C: `A_pflash` and `C_pflash`. 
Compare prefill tok/s with vs without pflash.

---

## Phase 3 — long context: 256k

This is where memory tightens hard:

| KV type | Per-token KV bytes (60 layers × 2 × 256 × 2) | At 256k tokens | Plus 18 GB target | Fits 24GB? |
|---------|--------|------|------|-------|
| F16 | 122,880 | ~32 GB | ~50 GB | ❌ |
| Q8 | 61,440 | ~16 GB | ~34 GB | ❌ |
| **TQ3** | ~26,880 | **~7 GB** | **~25 GB** | tight, possibly ✓ |

(Numbers approximate; head_dim and n_head_kv vary per layer.)

**For 31B dense at 256k**: only TQ3 KV fits. Run cells B (target-only TQ3) and D (MTP+TQ3) ONLY. Skip Q8 cells.

**For MoE 26B-A4B at 256k**: smaller weights (~13 GB) leave more headroom. Q8 KV plausibly fits. Test both.

### Prompts needed

- `long_200k.txt` (~200k tokens) — concat several Project Gutenberg books, run through HF tokenizer.

### Configs

| Cell | Model | Drafter | KV-K | KV-V | --pflash | n_predict |
|------|-------|---------|------|------|----------|-----------|
| G | 31B dense | none | TQ3 | TQ3 | yes | 256 |
| H | 31B dense | mtp | TQ3 | TQ3 | yes | 256 |
| I | MoE 26B | none | TQ3 | TQ3 | yes | 256 |
| J | MoE 26B | dflash | TQ3 | TQ3 | yes | 256 |
| K | MoE 26B | dflash | Q8 | Q8 | yes | 256 (if VRAM fits) |

### Pass criteria

- Each cell completes prefill (the long prefill is the hard part).
- Decode produces coherent output (first 80 tokens of generation).
- Prefill tok/s recorded; pFlash speedup quantified.
- `--pflash-alpha` left at default 0.12 unless we want to sweep.

---

## Phase 4 — prompt manufacturing (do this BEFORE Phase 1c+)

The benchmark depends on real BPE-tokenized long prompts. Generate them once, reuse:

| File | Source | Target tok count |
|------|--------|--------|
| `long_8k.txt` | Alice in Wonderland Chs 1-3 | ~6500 |
| `long_30k.txt` | Alice in Wonderland full + Hunting of the Snark + a few short stories | ~30000 |
| `long_200k.txt` | Multiple Gutenberg novels concatenated under one chat-template wrapper | ~200000 |

Use the existing `generate_prompts.py` under `.sisyphus/notes/gemma4-baseline/` as the template (HF `google/gemma-3-27b-it` tokenizer; chat-template wrap; CSV output). Sidecars `.meta` for each.

**VRAM cap**: even with TQ3, a 200k-token prompt + 256k ctx alloc is ~25 GB. The prompt may need to be capped at e.g. 180k to leave headroom for generation.

---

## Phase 5 — recovery scripts

Write small shell scripts under `.sisyphus/notes/gemma4-baseline/`:
- `run_phase1.sh` — runs cells A-F at 1k, 4k, 8k. Captures all stats. Builds a side-by-side report.
- `run_phase2.sh` — same for 32k, plus `--pflash` variants of A and C.
- `run_phase3.sh` — 256k cells G-K.

Each script writes results to `matrix-v4/_k.log` and a single `SUMMARY.md` table. Use `--temp 0 --seed 0 --ignore-eos` everywhere for reproducibility.

---

## Risks + mitigations

1. **FA-kernel-selection abort** for Q8/Q8 + head_dim=512 + non-aligned KV (the M4 crash) — likely hits Phase 2 cells C/E and Phase 3 K. Mitigation: keep the run going on TQ3 cells; document the abort step for each.
2. **256k prompt won't tokenize cleanly** — chat template + extreme length might exceed model's training distribution; output may be incoherent regardless of correctness fixes. Mitigation: report what we see; the *prefill timing* is still the headline number even if generation is off-topic.
3. **TQ3 chunked path is slower than MMA** — Phase 1 Q8 vs TQ3 throughput gap (~2× decode hit) will compound at long context.
The win is *that it runs at all* in 24 GB; raw speed is secondary at 256k. +4. **MoE branch routing** — never validated end-to-end this session. Phase 3 MoE cells may surface fresh bugs unrelated to KV. + +--- + +## Decision gates + +- After Phase 1: if any cell A-F regresses or crashes at small context, debug before scaling up. +- After Phase 2: if FA crash at 32k blocks Q8/Q8 MTP, defer Q8 from Phase 3. Continue with TQ3 only. +- After Phase 3: report findings; the highest-context working ship config becomes the demo target. + +--- + +## Quick-start commands (for next session) + +```bash +# Phase 1, cell D, 1k ctx (the headline regression test) +./dflash/build/test_gemma4_dflash \ + --model models/gemma-4-31B-it-Q4_K_M.gguf \ + --mtp models/gemma4-mtp-31B/gemma-4-31B-it-assistant.Q8_0.gguf \ + --draft-method mtp --kv-k tq3_0 --kv-v tq3_0 \ + --tokens-file .sisyphus/notes/gemma4-baseline/prompts/long_open.txt \ + --n-predict 256 --temp 0 --seed 0 --ignore-eos + +# Phase 1, cell E (the missed leg — dflash drafter on dense 31B) +./dflash/build/test_gemma4_dflash \ + --model models/gemma-4-31B-it-Q4_K_M.gguf \ + --draft /home/peppi/models/draft-gemma4-31b \ + --draft-method dflash --kv-k q8_0 --kv-v q8_0 \ + --tokens-file .sisyphus/notes/gemma4-baseline/prompts/long_open.txt \ + --n-predict 256 --temp 0 --seed 0 --ignore-eos + +# Phase 3, cell I (MoE target-only at 256k — needs long_200k.txt) +./dflash/build/test_gemma4_dflash \ + --model /home/peppi/models/gemma4-26b-a4b-it/gemma-4-26B-A4B-it-UD-Q4_K_M.gguf \ + --draft-method none --kv-k tq3_0 --kv-v tq3_0 \ + --tokens-file .sisyphus/notes/gemma4-baseline/prompts/long_200k.txt \ + --ctx-size 262144 --pflash --n-predict 256 --temp 0 --seed 0 --ignore-eos +``` diff --git a/dflash/CMakeLists.txt b/dflash/CMakeLists.txt index 8ac91009..9632fb1a 100644 --- a/dflash/CMakeLists.txt +++ b/dflash/CMakeLists.txt @@ -114,6 +114,10 @@ add_library(dflash27b STATIC src/qwen35_target_graph.cpp src/qwen3_dflash_graph.cpp src/qwen3_drafter.cpp + src/gemma4_target_loader.cpp + src/gemma4_target_graph.cpp + src/gemma4_mtp_graph.cpp + src/gemma4_dflash_graph.cpp src/qwen3_0p6b_loader.cpp src/qwen3_0p6b_graph.cpp src/flashprefill_q8.cpp @@ -143,9 +147,14 @@ set_target_properties(dflash27b PROPERTIES CUDA_ARCHITECTURES "${_dflash27b_arch # Extract the minimum SM from the arch list so safetensors_draft.cpp can decide # at compile time whether to convert BF16 draft weights to FP16 (cuBLAS BF16 # GEMM has no tensor core acceleration on SM < 80). -list(GET _dflash27b_archs 0 _dflash27b_min_sm) -# Strip any trailing 'a' suffix (e.g. "121a" → "121") -string(REGEX REPLACE "[^0-9]" "" _dflash27b_min_sm "${_dflash27b_min_sm}") +set(_dflash27b_min_sm 999) +foreach(_a IN LISTS _dflash27b_archs) + # Strip any trailing non-numeric suffix (e.g. "121a" -> "121") before comparing. + string(REGEX REPLACE "[^0-9]" "" _a_num "${_a}") + if(_a_num LESS _dflash27b_min_sm) + set(_dflash27b_min_sm "${_a_num}") + endif() +endforeach() target_compile_definitions(dflash27b PRIVATE DFLASH27B_MIN_SM=${_dflash27b_min_sm}) # FlashPrefill custom CUDA kernels need BF16 WMMA (sm_80+). 
@@ -154,7 +163,8 @@ if(_dflash27b_min_sm GREATER_EQUAL 80) target_sources(dflash27b PRIVATE src/flashprefill_kernels.cu src/flashprefill_select.cpp - src/flashprefill.cpp) + src/flashprefill.cpp + src/pflash_ggml_adapter.cpp) target_compile_definitions(dflash27b PRIVATE DFLASH27B_HAVE_FLASHPREFILL=1) endif() @@ -337,4 +347,77 @@ if(DFLASH27B_TESTS) target_link_libraries(test_dflash PRIVATE OpenMP::OpenMP_CXX) endif() endif() + + if(EXISTS "${CMAKE_CURRENT_SOURCE_DIR}/test/gemma4/test_gemma4_dflash.cpp") + add_executable(test_gemma4_dflash test/gemma4/test_gemma4_dflash.cpp) + target_include_directories(test_gemma4_dflash PRIVATE ${CMAKE_CURRENT_SOURCE_DIR}/src) + target_link_libraries(test_gemma4_dflash PRIVATE dflash27b ggml ggml-cuda) + find_package(CUDAToolkit REQUIRED) + target_link_libraries(test_gemma4_dflash PRIVATE CUDA::cudart) + endif() + + if(EXISTS "${CMAKE_CURRENT_SOURCE_DIR}/test/gemma4/smoke_load_gemma4_target.cpp") + add_executable(smoke_load_gemma4_target test/gemma4/smoke_load_gemma4_target.cpp) + target_include_directories(smoke_load_gemma4_target PRIVATE ${CMAKE_CURRENT_SOURCE_DIR}/src) + target_link_libraries(smoke_load_gemma4_target PRIVATE dflash27b ggml ggml-cuda) + find_package(CUDAToolkit REQUIRED) + target_link_libraries(smoke_load_gemma4_target PRIVATE CUDA::cudart) + endif() + + if(EXISTS "${CMAKE_CURRENT_SOURCE_DIR}/test/gemma4/smoke_gemma4_target_forward.cpp") + add_executable(smoke_gemma4_target_forward test/gemma4/smoke_gemma4_target_forward.cpp) + target_include_directories(smoke_gemma4_target_forward PRIVATE ${CMAKE_CURRENT_SOURCE_DIR}/src) + target_link_libraries(smoke_gemma4_target_forward PRIVATE dflash27b ggml ggml-cuda) + find_package(CUDAToolkit REQUIRED) + target_link_libraries(smoke_gemma4_target_forward PRIVATE CUDA::cudart) + endif() + + if(EXISTS "${CMAKE_CURRENT_SOURCE_DIR}/test/gemma4/smoke_load_gemma4_draft.cpp") + add_executable(smoke_load_gemma4_draft test/gemma4/smoke_load_gemma4_draft.cpp) + target_include_directories(smoke_load_gemma4_draft PRIVATE ${CMAKE_CURRENT_SOURCE_DIR}/src) + target_link_libraries(smoke_load_gemma4_draft PRIVATE dflash27b ggml ggml-cuda) + find_package(CUDAToolkit REQUIRED) + target_link_libraries(smoke_load_gemma4_draft PRIVATE CUDA::cudart) + endif() + + if(EXISTS "${CMAKE_CURRENT_SOURCE_DIR}/test/gemma4/smoke_gemma4_draft_forward.cpp") + add_executable(smoke_gemma4_draft_forward test/gemma4/smoke_gemma4_draft_forward.cpp) + target_include_directories(smoke_gemma4_draft_forward PRIVATE ${CMAKE_CURRENT_SOURCE_DIR}/src) + target_link_libraries(smoke_gemma4_draft_forward PRIVATE dflash27b ggml ggml-cuda) + find_package(CUDAToolkit REQUIRED) + target_link_libraries(smoke_gemma4_draft_forward PRIVATE CUDA::cudart) + endif() + + if(EXISTS "${CMAKE_CURRENT_SOURCE_DIR}/test/gemma4/test_gemma4_kv_tq3.cpp") + add_executable(test_gemma4_kv_tq3 test/gemma4/test_gemma4_kv_tq3.cpp) + target_include_directories(test_gemma4_kv_tq3 PRIVATE ${CMAKE_CURRENT_SOURCE_DIR}/src) + target_link_libraries(test_gemma4_kv_tq3 PRIVATE dflash27b ggml ggml-cuda) + find_package(CUDAToolkit REQUIRED) + target_link_libraries(test_gemma4_kv_tq3 PRIVATE CUDA::cudart) + endif() + + if(EXISTS "${CMAKE_CURRENT_SOURCE_DIR}/test/test_flash_attn_sparse.cpp") + add_executable(test_flash_attn_sparse test/test_flash_attn_sparse.cpp) + target_link_libraries(test_flash_attn_sparse PRIVATE dflash27b ggml ggml-cuda ggml-base) + target_include_directories(test_flash_attn_sparse PRIVATE + ${CMAKE_CURRENT_SOURCE_DIR}/deps/llama.cpp/ggml/include + 
${CMAKE_CURRENT_SOURCE_DIR}/deps/llama.cpp/ggml/src + ${CMAKE_CURRENT_SOURCE_DIR}/src) + endif() + + if(EXISTS "${CMAKE_CURRENT_SOURCE_DIR}/test/gemma4/test_mtp_loader.cpp") + add_executable(test_mtp_loader test/gemma4/test_mtp_loader.cpp) + target_include_directories(test_mtp_loader PRIVATE ${CMAKE_CURRENT_SOURCE_DIR}/src) + target_link_libraries(test_mtp_loader PRIVATE dflash27b ggml ggml-cuda) + find_package(CUDAToolkit REQUIRED) + target_link_libraries(test_mtp_loader PRIVATE CUDA::cudart) + endif() + + if(EXISTS "${CMAKE_CURRENT_SOURCE_DIR}/test/gemma4/test_mtp_graph_shapes.cpp") + add_executable(test_mtp_graph_shapes test/gemma4/test_mtp_graph_shapes.cpp) + target_include_directories(test_mtp_graph_shapes PRIVATE ${CMAKE_CURRENT_SOURCE_DIR}/src) + target_link_libraries(test_mtp_graph_shapes PRIVATE dflash27b ggml ggml-cuda) + find_package(CUDAToolkit REQUIRED) + target_link_libraries(test_mtp_graph_shapes PRIVATE CUDA::cudart) + endif() endif() diff --git a/dflash/README.md b/dflash/README.md index 63c6122f..67118dcd 100644 --- a/dflash/README.md +++ b/dflash/README.md @@ -328,6 +328,14 @@ DFLASH27B_KV_TQ3=1 DFLASH27B_PREFILL_UBATCH=16 \ **Requirements:** NVIDIA sm_75+ GPU (2080 Ti, 3090, A10, A40, 4090) or Jetson AGX Thor sm_110, CUDA 12+ (CUDA 13+ required for Thor), 22+ GB VRAM, ~80 GB disk. On Turing (SM 7.5), BF16 draft weights are auto-converted to FP16 at load time for tensor core acceleration. +### Small-VRAM cards (<=24 GiB) + +VMM-backed pools waste VRAM on cards under ~24 GiB. The 32 GB VMM pool reservation fragments badly on a 24 GB card and causes prefill+verify cliffs (measured ~50% throughput loss at ctx=64K). Build with: + + cmake -DGGML_CUDA_NO_VMM=ON .. + +`GGML_CUDA_NO_VMM` is a **compile-time** CMake option — it cannot be set at runtime via environment variable. The dflash test binary prints a runtime warning if it detects <=24 GiB VRAM and the binary was built without this flag. + ## How it works **Block-diffusion draft.** Each step, the draft sees `[last_target_token, MASK×15]` plus the last 5 captured target hidden states. It denoises the masks in a single forward, producing 16 candidate tokens conditioned on real target features. Structurally stronger than chain EAGLE: every position conditions on the same captured context, not its own noisy predictions. diff --git a/dflash/deps/llama.cpp b/dflash/deps/llama.cpp index e2d98e3b..daef232a 160000 --- a/dflash/deps/llama.cpp +++ b/dflash/deps/llama.cpp @@ -1 +1 @@ -Subproject commit e2d98e3b7539521466a6408dffb5e39409c29ea6 +Subproject commit daef232a63acffc8b3712aa460ce7cbe1211d1c4 diff --git a/dflash/include/gemma4.h b/dflash/include/gemma4.h new file mode 100644 index 00000000..c82687fb --- /dev/null +++ b/dflash/include/gemma4.h @@ -0,0 +1,62 @@ +// gemma4 — standalone CUDA library for DFlash speculative decoding of +// Gemma4 models (31B Dense and 26B-A4B MoE) with a DFlash draft model. 
+ +#ifndef GEMMA4_H +#define GEMMA4_H + +#include +#include + +#ifdef __cplusplus +extern "C" { +#endif + +// ─── Gemma4-31B Dense config ─────────────────────────────────────── + +#define GEMMA4_31B_HIDDEN 4096 +#define GEMMA4_31B_LAYERS 60 +#define GEMMA4_31B_N_HEADS 32 +#define GEMMA4_31B_N_KV_HEADS 8 +#define GEMMA4_31B_HEAD_DIM 128 +#define GEMMA4_31B_INTERMEDIATE 16384 +#define GEMMA4_31B_VOCAB 262144 +#define GEMMA4_31B_SWA_WINDOW 1024 + +// ─── Gemma4-26B-A4B MoE config ──────────────────────────────────── + +#define GEMMA4_26B_HIDDEN 4096 +#define GEMMA4_26B_LAYERS 30 +#define GEMMA4_26B_N_HEADS 32 +#define GEMMA4_26B_N_KV_HEADS 8 +#define GEMMA4_26B_HEAD_DIM 128 +#define GEMMA4_26B_INTERMEDIATE 16384 +#define GEMMA4_26B_EXPERT_INTERMEDIATE 2048 +#define GEMMA4_26B_N_EXPERTS 128 +#define GEMMA4_26B_N_EXPERTS_USED 8 +#define GEMMA4_26B_VOCAB 262144 +#define GEMMA4_26B_SWA_WINDOW 1024 + +// ─── Shared constants ───────────────────────────────────────────── + +#define GEMMA4_ROPE_THETA 1000000.0f +#define GEMMA4_RMS_EPS 1e-6f +#define GEMMA4_LOGIT_SOFTCAP 30.0f +#define GEMMA4_ATTN_SCALE 1.0f + +// ─── Draft model config ─────────────────────────────────────────── + +#define GEMMA4_DRAFT_LAYERS 5 +#define GEMMA4_DRAFT_BLOCK_SIZE 16 +#define GEMMA4_DRAFT_N_TARGET_LAYERS 6 +#define GEMMA4_31B_DRAFT_MASK_TOKEN_ID 4 +#define GEMMA4_26B_DRAFT_MASK_TOKEN_ID 4 + +// ─── Diagnostics ────────────────────────────────────────────────── + +const char * gemma4_last_error(void); + +#ifdef __cplusplus +} +#endif + +#endif // GEMMA4_H diff --git a/dflash/scripts/quantize_draft_q8.py b/dflash/scripts/quantize_draft_q8.py index f8ba6c97..ff31d3d9 100644 --- a/dflash/scripts/quantize_draft_q8.py +++ b/dflash/scripts/quantize_draft_q8.py @@ -1,6 +1,10 @@ #!/usr/bin/env python3 """ -Quantize the z-lab DFlash draft (safetensors, bf16) to a Q8_0 GGUF. +Quantize a z-lab DFlash draft (safetensors, bf16) to a Q8_0 GGUF. + +Supports both Qwen and Gemma4 draft architectures via --arch. +When config.json is present alongside the safetensors file, dimensions are +auto-detected from it; hardcoded defaults are used as fallback. Projection weights (fc, wq, wk, wv, wo, gate, up, down) are quantized to Q8_0 (~50% size reduction vs BF16). Norm weights stay F32 @@ -10,6 +14,17 @@ convert_dflash_to_gguf.py so gguf_draft_loader.cpp can load it. 
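+
+Q8_0 packs each 32-element block as 32 int8 values plus one f16 scale
+(34 bytes per block vs 64 bytes of BF16), which is where the ~50% figure
+above comes from.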
Usage: + # Qwen3.5 draft (auto-detects arch from config.json when present) + python3 scripts/quantize_draft_q8.py --arch qwen \ + models/draft/model.safetensors \ + models/draft/draft-q8_0.gguf + + # Gemma4 draft + python3 scripts/quantize_draft_q8.py --arch gemma4 \ + models/draft-gemma4-31b/model.safetensors \ + models/draft-gemma4-31b/draft-q8_0.gguf + + # Auto-detect arch from config.json (requires model_type field) python3 scripts/quantize_draft_q8.py \ models/draft/model.safetensors \ models/draft/draft-q8_0.gguf @@ -17,6 +32,7 @@ import argparse import json +import re import struct import sys from pathlib import Path @@ -24,31 +40,150 @@ import numpy as np import gguf +Q8_0_BLOCK_SIZE = 32 # elements per Q8_0 block + # ────────────────────────────────────────────────────────────────────── -# DFlash 27B draft architecture constants (must match dflash27b.h) +# Per-arch defaults (used when config.json is absent or incomplete) # ────────────────────────────────────────────────────────────────────── -ARCH = "qwen35-dflash-draft" -HIDDEN = 5120 -N_LAYER = 5 -N_HEAD = 32 -N_HEAD_KV = 8 -HEAD_DIM = 128 -INTERMEDIATE = 17408 -VOCAB = 248320 -N_TARGET_LAYERS = 5 -ROPE_THETA = 1_000_000.0 -RMS_EPS = 1e-6 -MASK_TOKEN_ID = 248070 -BLOCK_SIZE = 16 -CTX_LEN = 32768 +_QWEN_DEFAULTS = dict( + ARCH = "qwen35-dflash-draft", + HIDDEN = 5120, + N_LAYER = 5, + N_HEAD = 32, + N_HEAD_KV = 8, + HEAD_DIM = 128, + INTERMEDIATE = 17408, + VOCAB = 248320, + ROPE_THETA = 1_000_000.0, + RMS_EPS = 1e-6, + MASK_TOKEN_ID = 248070, + BLOCK_SIZE = 16, + CTX_LEN = 32768, + N_TARGET_LAYERS = 5, + MODEL_SIZE_TAG = "27B", + # Qwen-specific (no sliding window or logit softcap) + LOGIT_SOFTCAP = None, + SLIDING_WINDOW = None, + TARGET_LAYER_IDS = None, +) + +_GEMMA4_DEFAULTS = dict( + ARCH = "gemma4-dflash-draft", + HIDDEN = 2816, + N_LAYER = 5, + N_HEAD = 32, + N_HEAD_KV = 8, + HEAD_DIM = 128, + INTERMEDIATE = 5632, + VOCAB = 262144, + ROPE_THETA = 1_000_000.0, + RMS_EPS = 1e-6, + MASK_TOKEN_ID = 4, + BLOCK_SIZE = 16, + CTX_LEN = 262144, + LOGIT_SOFTCAP = 30.0, + SLIDING_WINDOW = 2048, + TARGET_LAYER_IDS = [1, 6, 11, 17, 22, 27], + MODEL_SIZE_TAG = "26B", +) + +_ARCH_DEFAULTS = { + "qwen": _QWEN_DEFAULTS, + "gemma4": _GEMMA4_DEFAULTS, +} + +# config.json model_type -> arch key +_MODEL_TYPE_MAP = { + "qwen3": "qwen", + "gemma4": "gemma4", +} + + +# ────────────────────────────────────────────────────────────────────── +# Config loading +# ────────────────────────────────────────────────────────────────────── + +def detect_arch_from_config(cfg_path: Path) -> str | None: + """Return 'qwen' or 'gemma4' by reading model_type from config.json.""" + if not cfg_path.exists(): + return None + with open(cfg_path) as f: + raw = json.load(f) + model_type = raw.get("model_type", "").lower() + for prefix, arch in _MODEL_TYPE_MAP.items(): + if model_type.startswith(prefix): + return arch + architectures = raw.get("architectures", []) + for a in architectures: + a_lower = a.lower() + for prefix, arch in _MODEL_TYPE_MAP.items(): + if prefix in a_lower: + return arch + return None -Q8_0_BLOCK_SIZE = 32 # elements per Q8_0 block + +def load_config(safetensors_path: Path, arch: str) -> dict: + """ + Load dimensions from config.json next to the safetensors file. + Returns a merged cfg dict, falling back to per-arch defaults for missing keys. 
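+
+    Precedence: values present in config.json override the per-arch
+    defaults; dflash-specific keys (mask_token_id, target_layer_ids) are
+    read from the nested dflash_config block, and MODEL_SIZE_TAG is derived
+    from the parent directory name (e.g. "draft-gemma4-31b" -> "31B").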
+ """ + defaults = dict(_ARCH_DEFAULTS[arch]) + cfg_path = safetensors_path.parent / "config.json" + + if not cfg_path.exists(): + print(f"[info] no config.json found at {cfg_path}, using {arch} hardcoded defaults") + return defaults + + print(f"[info] reading config from {cfg_path}") + with open(cfg_path) as f: + raw = json.load(f) + + dflash_cfg = raw.get("dflash_config", {}) + + # Derive model size tag from directory name (e.g. "draft-gemma4-31b" -> "31B") + dir_name = safetensors_path.parent.name + m = re.search(r"(\d+[bBmM])", dir_name) + model_size_tag = m.group(1).upper() if m else defaults["MODEL_SIZE_TAG"] + + cfg = dict(defaults) + cfg.update(dict( + HIDDEN = raw.get("hidden_size", defaults["HIDDEN"]), + N_LAYER = raw.get("num_hidden_layers", defaults["N_LAYER"]), + N_HEAD = raw.get("num_attention_heads", defaults["N_HEAD"]), + N_HEAD_KV = raw.get("num_key_value_heads", defaults["N_HEAD_KV"]), + HEAD_DIM = raw.get("head_dim", defaults["HEAD_DIM"]), + INTERMEDIATE = raw.get("intermediate_size", defaults["INTERMEDIATE"]), + VOCAB = raw.get("vocab_size", defaults["VOCAB"]), + ROPE_THETA = float(raw.get("rope_theta", defaults["ROPE_THETA"])), + RMS_EPS = float(raw.get("rms_norm_eps", defaults["RMS_EPS"])), + MASK_TOKEN_ID = dflash_cfg.get("mask_token_id", defaults["MASK_TOKEN_ID"]), + BLOCK_SIZE = raw.get("block_size", defaults["BLOCK_SIZE"]), + CTX_LEN = raw.get("max_position_embeddings", defaults["CTX_LEN"]), + MODEL_SIZE_TAG = model_size_tag, + )) + + if arch == "gemma4": + target_layer_ids = dflash_cfg.get("target_layer_ids", defaults["TARGET_LAYER_IDS"]) + cfg.update(dict( + LOGIT_SOFTCAP = float(raw.get("final_logit_softcapping", defaults["LOGIT_SOFTCAP"])), + SLIDING_WINDOW = raw.get("sliding_window", defaults["SLIDING_WINDOW"]), + TARGET_LAYER_IDS = target_layer_ids, + )) + + print(f"[info] detected model size tag: {model_size_tag}") + print(f"[info] hidden={cfg['HIDDEN']} n_layers={cfg['N_LAYER']} " + f"n_head={cfg['N_HEAD']} n_head_kv={cfg['N_HEAD_KV']} " + f"head_dim={cfg['HEAD_DIM']}") + print(f"[info] intermediate={cfg['INTERMEDIATE']} vocab={cfg['VOCAB']}") + if arch == "gemma4": + print(f"[info] target_layer_ids={cfg['TARGET_LAYER_IDS']}") + return cfg # ────────────────────────────────────────────────────────────────────── # Tensor name mapping — DFlash safetensors -> llama.cpp GGUF -# (Identical to convert_dflash_to_gguf.py) +# (Identical to convert_dflash_to_gguf.py; shared across both arches) # ────────────────────────────────────────────────────────────────────── def map_name(name: str) -> str | None: @@ -115,26 +250,96 @@ def bf16_bytes_to_f32(raw: bytes, shape: list[int]) -> np.ndarray: def main(): ap = argparse.ArgumentParser( - description="Quantize DFlash draft BF16 safetensors to Q8_0 GGUF") + description="Quantize DFlash draft BF16 safetensors to Q8_0 GGUF (qwen or gemma4)") ap.add_argument("safetensors", type=Path, help="Input BF16 safetensors (e.g. models/draft/model.safetensors)") ap.add_argument("out_gguf", type=Path, help="Output Q8_0 GGUF (e.g. models/draft/draft-q8_0.gguf)") + ap.add_argument("--arch", choices=["qwen", "gemma4"], + help="Draft model architecture. 
Auto-detected from config.json " + "model_type when omitted.") args = ap.parse_args() if not args.safetensors.exists(): print(f"[error] safetensors not found: {args.safetensors}", file=sys.stderr) sys.exit(1) + # Resolve arch: explicit flag > auto-detect from config.json + arch = args.arch + cfg_path = args.safetensors.parent / "config.json" + if arch is None: + arch = detect_arch_from_config(cfg_path) + if arch is None: + print( + "[error] --arch not specified and could not auto-detect from " + f"config.json (model_type not in {list(_MODEL_TYPE_MAP)}).\n" + " Pass --arch qwen or --arch gemma4 explicitly.", + file=sys.stderr, + ) + sys.exit(1) + print(f"[info] auto-detected arch: {arch}") + else: + print(f"[info] arch: {arch}") + + cfg = load_config(args.safetensors, arch) + ARCH = cfg["ARCH"] + HIDDEN = cfg["HIDDEN"] + N_LAYER = cfg["N_LAYER"] + N_HEAD = cfg["N_HEAD"] + N_HEAD_KV = cfg["N_HEAD_KV"] + HEAD_DIM = cfg["HEAD_DIM"] + INTERMEDIATE = cfg["INTERMEDIATE"] + VOCAB = cfg["VOCAB"] + ROPE_THETA = cfg["ROPE_THETA"] + RMS_EPS = cfg["RMS_EPS"] + MASK_TOKEN_ID = cfg["MASK_TOKEN_ID"] + BLOCK_SIZE = cfg["BLOCK_SIZE"] + CTX_LEN = cfg["CTX_LEN"] + MODEL_SIZE_TAG = cfg["MODEL_SIZE_TAG"] + print(f"[info] reading safetensors header from {args.safetensors}") header_size, header = load_safetensors_header(args.safetensors) n_entries = sum(1 for k in header if k != "__metadata__") print(f"[info] {n_entries} tensor entries") + # Compute N_TARGET_LAYERS / TARGET_HIDDEN from fc.weight shape + fc_info = header.get("fc.weight") + if fc_info is None: + print("[error] fc.weight not found in safetensors", file=sys.stderr) + sys.exit(1) + fc_shape = fc_info["shape"] # [hidden, n_target_layers * target_hidden] + + if arch == "qwen": + N_TARGET_LAYERS = cfg["N_TARGET_LAYERS"] + if fc_shape[1] % N_TARGET_LAYERS != 0: + print(f"[error] fc.weight columns ({fc_shape[1]}) not divisible by " + f"N_TARGET_LAYERS ({N_TARGET_LAYERS})", file=sys.stderr) + sys.exit(1) + else: # gemma4 + TARGET_LAYER_IDS = cfg["TARGET_LAYER_IDS"] + if not TARGET_LAYER_IDS: + print("[error] target_layer_ids is empty; cannot compute N_TARGET_LAYERS " + "(check config.json or _DEFAULTS)", file=sys.stderr) + sys.exit(1) + N_TARGET_LAYERS = len(TARGET_LAYER_IDS) + if fc_shape[1] % N_TARGET_LAYERS != 0: + print(f"[error] fc.weight columns ({fc_shape[1]}) not divisible by " + f"N_TARGET_LAYERS ({N_TARGET_LAYERS})", file=sys.stderr) + sys.exit(1) + + TARGET_HIDDEN = fc_shape[1] // N_TARGET_LAYERS + print(f"[info] fc.weight shape {fc_shape} -> " + f"N_TARGET_LAYERS={N_TARGET_LAYERS} TARGET_HIDDEN={TARGET_HIDDEN}") + writer = gguf.GGUFWriter(args.out_gguf, ARCH) # Architecture metadata (identical to convert_dflash_to_gguf.py) - writer.add_string("general.name", "Qwen3.5-27B-DFlash-Draft-Q8_0") + if arch == "qwen": + model_name = f"Qwen3.5-{MODEL_SIZE_TAG}-DFlash-Draft-Q8_0" + else: + model_name = f"Gemma4-{MODEL_SIZE_TAG}-DFlash-Draft-Q8_0" + writer.add_string("general.name", model_name) + print(f"[info] general.name = {model_name}") writer.add_quantization_version(gguf.GGML_QUANT_VERSION) writer.add_uint32(f"{ARCH}.context_length", CTX_LEN) writer.add_uint32(f"{ARCH}.embedding_length", HIDDEN) @@ -148,11 +353,18 @@ def main(): writer.add_float32(f"{ARCH}.attention.layer_norm_rms_epsilon", RMS_EPS) writer.add_float32(f"{ARCH}.rope.freq_base", ROPE_THETA) - # DFlash-specific hyperparameters + # DFlash-specific hyperparameters (shared) writer.add_uint32(f"{ARCH}.dflash.n_target_layers", N_TARGET_LAYERS) writer.add_uint32(f"{ARCH}.dflash.block_size", 
BLOCK_SIZE) writer.add_uint32(f"{ARCH}.dflash.mask_token_id", MASK_TOKEN_ID) + # Gemma4-specific hyperparameters + if arch == "gemma4": + writer.add_uint32(f"{ARCH}.dflash.sliding_window", cfg["SLIDING_WINDOW"]) + writer.add_float32(f"{ARCH}.dflash.logit_softcap", cfg["LOGIT_SOFTCAP"]) + writer.add_uint32(f"{ARCH}.dflash.target_hidden", TARGET_HIDDEN) + writer.add_array(f"{ARCH}.dflash.target_layer_ids", cfg["TARGET_LAYER_IDS"]) + # Collect and sort tensors (same order as convert_dflash_to_gguf.py) pending = [] for st_name, info in header.items(): diff --git a/dflash/scripts/server.py b/dflash/scripts/server.py index 85d9744a..aa99f43e 100644 --- a/dflash/scripts/server.py +++ b/dflash/scripts/server.py @@ -83,6 +83,32 @@ def resolve_draft(root: Path) -> Path: raise FileNotFoundError(f"no model.safetensors under {root}") +def _read_gguf_architecture(gguf_path: Path) -> str: + """Return the 'general.architecture' string from a GGUF file, or '' on error. + + Logs a warning to stderr if the GGUF read fails, since that means the + server will silently pick the non-Gemma4 daemon path and use the wrong + argv shape. Caller should treat empty string as 'detection failed' and + decide accordingly. + """ + try: + from gguf import GGUFReader # type: ignore + import numpy as np + r = GGUFReader(str(gguf_path)) + f = r.fields.get("general.architecture") + if f is None or not f.data: + return "" + p = f.parts[f.data[0]] + if not isinstance(p, np.ndarray): + return "" + return bytes(p).decode("utf-8", errors="replace").strip() + except Exception as e: + import sys + print(f"[server] WARNING: failed to read general.architecture from {gguf_path}: {e}", + file=sys.stderr) + return "" + + _QWEN35_FAMILY_TOKENIZERS = { "Qwen3.5-27B": "Qwen/Qwen3.5-27B", "Qwen3.6-27B": "Qwen/Qwen3.6-27B", @@ -580,6 +606,7 @@ def build_app(target: Path, draft: Path | None, bin_path: Path, budget: int, max prefix_cache_slots: int = 4, prefill_cache_slots: int = 4, arch: str = "qwen35", + use_pflash: bool = False, lazy_draft: bool = False, verbose_daemon: bool = False) -> FastAPI: import asyncio @@ -620,7 +647,19 @@ async def _openai_compat_error_handler(_request: Request, exc: OpenAICompatError if sys.platform == "win32": env["PATH"] = dll_dir + os.pathsep + str(Path(bin_abs).parent) + os.pathsep + env.get("PATH", "") - if arch in _LAGUNA_ARCHES: + if arch == "gemma4": + # Gemma4 binary uses named flags (--model, --draft) instead of positional args. + # draft is the safetensors directory, not a resolved file. + cmd = [bin_abs, + "--model", str(target), + "--draft", str(draft), + "--daemon", + "--fast-rollback", "--ddtree", f"--ddtree-budget={budget}", + f"--max-ctx={max_ctx}", + f"--stream-fd={stream_fd_val}"] + if use_pflash: + cmd.append("--pflash") + elif arch in _LAGUNA_ARCHES: # test_dflash detects arch=laguna from the GGUF and dispatches # internally to dflash27b::run_laguna_daemon(). No --draft, no # --fast-rollback, no --ddtree (no Laguna spec-decode draft yet). 
@@ -2201,6 +2240,9 @@ def main(): ap.add_argument("--prefix-cache-slots", type=int, default=4) ap.add_argument("--prefill-cache-slots", type=int, default=4) ap.add_argument("--daemon", action="store_true") + ap.add_argument("--pflash", action="store_true", + help="Enable pFlash sparse-attention prefill in the daemon binary " + "(Gemma4 only; no-op for Qwen3).") add_cli_flags(ap) args = ap.parse_args() prefill_cfg = config_from_args(args) @@ -2226,22 +2268,43 @@ def main(): if not args.target.is_file(): raise SystemExit(f"target GGUF not found at {args.target}") - # Architecture detection. test_dflash itself dispatches by GGUF arch at - # main() entry, so server.py just needs to know enough to omit --draft + - # DFlash/DDTree flags on archs that lack a spec-decode draft. Same - # binary serves every arch. - arch = _arch_from_gguf(args.target) - - if not args.bin.is_file(): - raise SystemExit(f"binary not found at {args.bin} (arch={arch})") - - if arch in _LAGUNA_ARCHES: - # No DFlash draft model exists for laguna yet; test_dflash'́s - # internal arch dispatch reads general.architecture, accepts the - # no-draft argv layout, and routes to run_laguna_daemon(). PFlash - # compression and prefix-cache SNAPSHOT/RESTORE are both wired - # through the laguna daemon now, so --prefill-compression and - # --prefix-cache-slots behave the same as on the qwen35 path. + # Detect architecture and select the right binary. + # test_dflash itself dispatches qwen35/laguna by GGUF arch at main() entry, + # but Gemma4 lives in a separate binary (test_gemma4_dflash) so we route + # explicitly here. + arch = _read_gguf_architecture(args.target) + is_gemma4 = (arch == "gemma4") + + if args.bin != DEFAULT_BIN: + # User explicitly specified a binary — use it as-is. + bin_path = args.bin + elif is_gemma4: + bin_path = ROOT / "build" / ("test_gemma4_dflash" + (".exe" if sys.platform == "win32" else "")) + print(f"[server] detected architecture=gemma4, using binary: {bin_path}") + else: + bin_path = DEFAULT_BIN + + if not bin_path.is_file(): + raise SystemExit(f"binary not found at {bin_path} (arch={arch})") + + if is_gemma4: + # Gemma4 draft is a directory (safetensors dir), not a resolved file. + if args.draft.is_dir(): + draft = args.draft + elif args.draft.is_file(): + # User passed a file path inside the draft directory; use its parent. + draft = args.draft.parent + print(f"[server] note: --draft {args.draft} is a file; using parent {draft}", file=sys.stderr) + else: + raise SystemExit(f"draft path not found or not a directory: {args.draft}") + if not draft.is_dir(): + raise SystemExit(f"draft directory not found: {draft} (from {args.draft})") + elif arch in _LAGUNA_ARCHES: + # No DFlash draft model exists for laguna yet; test_dflash's internal + # arch dispatch reads general.architecture, accepts the no-draft argv + # layout, and routes to run_laguna_daemon(). PFlash compression and + # prefix-cache SNAPSHOT/RESTORE are both wired through the laguna + # daemon now. 
        draft = None
    else:
        draft = resolve_draft(args.draft) if args.draft.is_dir() else args.draft
@@ -2260,13 +2323,14 @@
        drafter_tokenizer = AutoTokenizer.from_pretrained(
            prefill_cfg.drafter_tokenizer_id, trust_remote_code=True)

-    app = build_app(args.target, draft, args.bin, args.budget, args.max_ctx,
+    app = build_app(args.target, draft, bin_path, args.budget, args.max_ctx,
                    tokenizer, stop_ids,
                    prefill_cfg=prefill_cfg if prefill_cfg.enabled else None,
                    drafter_tokenizer=drafter_tokenizer,
                    prefix_cache_slots=args.prefix_cache_slots,
                    prefill_cache_slots=args.prefill_cache_slots,
                    arch=arch,
+                    use_pflash=getattr(args, "pflash", False),
                    lazy_draft=args.lazy_draft,
                    verbose_daemon=args.verbose_daemon)

@@ -2279,8 +2343,8 @@
    print(f"Luce DFlash OpenAI server on http://{args.host}:{args.port}")
-    print(f"  arch      = {arch}")
+    print(f"  arch      = {arch or '(unknown)'}")
    print(f"  target    = {args.target}")
    print(f"  draft     = {draft}")
-    print(f"  bin       = {args.bin}")
+    print(f"  bin       = {bin_path}")
    print(f"  budget    = {args.budget}")
    print(f"  max_ctx   = {args.max_ctx}")
    print(f"  tokenizer = {tokenizer_id}")
diff --git a/dflash/scripts/tokenize_prompt.py b/dflash/scripts/tokenize_prompt.py
index c2721838..95dfc42b 100644
--- a/dflash/scripts/tokenize_prompt.py
+++ b/dflash/scripts/tokenize_prompt.py
@@ -1,40 +1,92 @@
 """
-Tokenize a prompt string using the Qwen3.5 HF tokenizer (via transformers)
-and emit the token IDs as a flat int32 binary file.
+Tokenize a prompt string using a HuggingFace tokenizer (via transformers).

-We depend on Python only for the tokenizer — the C++ library consumes the
-int32 file directly. This keeps the standalone lib free of a BPE impl.
+Two output modes:
+  --out FILE   Write token IDs as a flat int32 little-endian binary file
+               (consumed by the C++ library directly).
+  --csv        Print comma-separated token IDs to stdout
+               (for use with the --tokens flag of test_gemma4_dflash).

 Usage:
-    python tokenize_prompt.py --out /tmp/prompt.bin --prompt "The capital of France is"
+    # Binary output (backward-compatible):
+    python tokenize_prompt.py --out /tmp/prompt.bin --prompt "Hello, world!"
+
+    # CSV output for --tokens flag:
+    python tokenize_prompt.py --csv --prompt "Hello, world!"
+    # -> 9259,236764,1902,236888
+
+    # Explicit model:
+    python tokenize_prompt.py --csv --model google/gemma-4-26b-a4b-it --prompt "..."
+
+    # Show token count:
+    python tokenize_prompt.py --csv --verbose --prompt "Hello, world!"
+
+Notes:
+    The Gemma4 tokenizer is cached locally at:
+        ~/.cache/huggingface/hub/models--google--gemma-4-26b-a4b-it/
+    The script tries local_files_only=True first to avoid network calls.
+    Gemma4 vocab size: 262144, BOS token id: 2, EOS token id: 1.
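+
+    The --csv output can be substituted straight into a driver invocation,
+    e.g. (illustrative shell pattern, not a documented interface):
+        ./dflash/build/test_gemma4_dflash --tokens \
+            "$(python tokenize_prompt.py --csv --prompt 'Hello, world!')"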
""" import argparse -import os -import sys import struct +import sys + +def build_parser() -> argparse.ArgumentParser: + ap = argparse.ArgumentParser(description=__doc__, + formatter_class=argparse.RawDescriptionHelpFormatter) + ap.add_argument("--prompt", required=True, help="Text to tokenize") + ap.add_argument("--model", default="google/gemma-4-26b-a4b-it", + help="HF repo id whose tokenizer to use " + "(default: google/gemma-4-26b-a4b-it)") + ap.add_argument("--add-bos", action="store_true", + help="Prepend BOS token (add_special_tokens=True)") + ap.add_argument("--verbose", action="store_true", + help="Print token count and first/last tokens to stderr") + # Output modes (at least one required) + out_group = ap.add_mutually_exclusive_group(required=True) + out_group.add_argument("--out", metavar="FILE", + help="Write int32 binary token ID file") + out_group.add_argument("--csv", action="store_true", + help="Print comma-separated token IDs to stdout") + return ap -def main(): - ap = argparse.ArgumentParser() - ap.add_argument("--out", required=True) - ap.add_argument("--prompt", required=True) - ap.add_argument("--model", default="Qwen/Qwen3.5-27B", - help="HF repo id whose tokenizer to use") - ap.add_argument("--add-bos", action="store_true", help="Prepend BOS token") - args = ap.parse_args() +def load_tokenizer(model: str): + """Load tokenizer, preferring local cache to avoid network calls.""" from transformers import AutoTokenizer - tok = AutoTokenizer.from_pretrained(args.model, trust_remote_code=True) + try: + return AutoTokenizer.from_pretrained( + model, trust_remote_code=True, local_files_only=True + ) + except Exception: + # Fall back to network if not cached + return AutoTokenizer.from_pretrained(model, trust_remote_code=True) + + +def tokenize(prompt: str, model: str, add_bos: bool) -> list[int]: + tok = load_tokenizer(model) + return tok.encode(prompt, add_special_tokens=add_bos) + - ids = tok.encode(args.prompt, add_special_tokens=args.add_bos) - print(f"tokenized {len(ids)} tokens: {ids}") +def main() -> None: + args = build_parser().parse_args() + ids = tokenize(args.prompt, args.model, args.add_bos) - with open(args.out, "wb") as f: - for t in ids: - f.write(struct.pack(" 10 else []) + ids[-5:] if len(ids) > 10 else ids + print(f"tokenized {len(ids)} tokens; first/last: {preview}", file=sys.stderr) - print(f"wrote {args.out} ({len(ids) * 4} bytes)") + if args.csv: + print(",".join(str(i) for i in ids)) + else: + with open(args.out, "wb") as f: + for t in ids: + f.write(struct.pack(" @@ -12,6 +13,7 @@ namespace dflash27b { namespace { std::mutex g_err_mu; std::string g_last_error; +thread_local std::string t_err_buf; // per-thread snapshot for safe c_str return } void set_last_error(std::string msg) { @@ -23,5 +25,12 @@ void set_last_error(std::string msg) { extern "C" const char * dflash27b_last_error(void) { std::lock_guard lk(dflash27b::g_err_mu); - return dflash27b::g_last_error.c_str(); + dflash27b::t_err_buf = dflash27b::g_last_error; // copy under lock + return dflash27b::t_err_buf.c_str(); // safe: thread-local +} + +extern "C" const char * gemma4_last_error(void) { + std::lock_guard lk(dflash27b::g_err_mu); + dflash27b::t_err_buf = dflash27b::g_last_error; + return dflash27b::t_err_buf.c_str(); } diff --git a/dflash/src/gemma4_dflash_graph.cpp b/dflash/src/gemma4_dflash_graph.cpp new file mode 100644 index 00000000..5764f723 --- /dev/null +++ b/dflash/src/gemma4_dflash_graph.cpp @@ -0,0 +1,1112 @@ +// Builds ggml compute graphs for the Gemma4 DFlash draft model 
+// (5-layer block-diffusion model with KV cache and logit softcapping). +// +// Architecture: +// - 6 captured target layers (Qwen3 used 5) +// - FC input = 6 * target_hidden, where target_hidden = 4096 for all Gemma4 +// variants (31B dense and 26B-A4B MoE), giving FC width = 24576 +// - Logit softcapping: tanh(logits / cap) * cap, cap = 30.0 +// - Tied lm_head: uses tok_embd transposed (or a provided lm_head weight) +// - Vocab = 262144 +// - Draft has its own lm_head + softcap — it does NOT rely on the target's +// lm_head (unlike the Qwen3 draft which shares the target's projection) +// - KV cache (prefix-direct): target features are projected into per-layer +// K/V entries and stored in GemmaTargetCache::draft_k/draft_v. +// build_draft_kv_prefill_graph materializes the context K/V; +// build_gemma4_draft_graph writes block K/V and attends over the full cache. +// - Layer types: 4 SWA (sliding_attention) + 1 full attention +// The attention kernel itself is the same ggml_flash_attn_ext call in both +// cases; the caller controls the mask to implement the sliding window. +// +// Two-step per-decode: +// 1. build_draft_kv_prefill_graph: project new committed context tokens into +// draft KV cache (side-effect only; nullptr returned). +// 2. build_gemma4_draft_graph: attend over context+block K/V and return logits. +// +// build_gemma4_draft_graph takes: +// - draft_embed [draft_hidden, n_tokens] f32 (MASK token embeddings) +// - positions [n_tokens] i32 (absolute token positions) +// - attn_mask [kv_pad, q_pad] f16 (causal over context+block) +// - kv_start = cache.draft_kv_pos (context length before this block) +// and returns: +// - logits [n_vocab, n_tokens] f32 (after softcapping) +// +// Safetensors tensor naming (actual file, no model. prefix): +// fc.weight → fc +// hidden_norm.weight → hidden_norm +// norm.weight → out_norm +// layers.{i}.self_attn.q_proj.weight → wq +// layers.{i}.self_attn.k_proj.weight → wk +// layers.{i}.self_attn.v_proj.weight → wv +// layers.{i}.self_attn.o_proj.weight → wo +// layers.{i}.self_attn.q_norm.weight → q_norm +// layers.{i}.self_attn.k_norm.weight → k_norm +// layers.{i}.input_layernorm.weight → attn_norm +// layers.{i}.post_attention_layernorm.weight → ffn_norm +// layers.{i}.mlp.gate_proj.weight → w_gate +// layers.{i}.mlp.up_proj.weight → w_up +// layers.{i}.mlp.down_proj.weight → w_down +// (no embed_tokens — tok_embd is injected from the target at runtime) + +#include "internal.h" + +#include +#include +#include +#include +#include +#include +#include +#include + +#if defined(_WIN32) +# if !defined(NOMINMAX) +# define NOMINMAX +# endif +# if !defined(WIN32_LEAN_AND_MEAN) +# define WIN32_LEAN_AND_MEAN +# endif +# include +#else +# include +# include +# include +# include +#endif + +namespace dflash27b { + +// ─── Draft SWA truncation toggle ────────────────────────────────────────── +// Set DFLASH_DRAFT_SWA_TRUNC=1 to enable per-layer K/V truncation in the +// draft graph for SWA layers (last n-1 layers — the final layer is full). +// Mirrors PR #129 for the qwen3 drafter, ported to gemma4's cached layout. +static inline bool draft_swa_trunc_enabled() { + static int e = -1; + if (e < 0) { + const char * v = std::getenv("DFLASH_DRAFT_SWA_TRUNC"); + e = (v && std::atoi(v) != 0) ? 
1 : 0; + if (e) { + std::fprintf(stderr, "[draft-swa-trunc] enabled\n"); + } + } + return e == 1; +} + +// ─── Draft RoPE wrapper with optional YaRN extrapolation ────────────────── +// Set DFLASH_DRAFT_YARN=1 to enable YaRN scaling for draft RoPE; assumes the +// draft was effectively trained at DFLASH_DRAFT_YARN_NCTX_ORIG (default 32768) +// despite config.json claiming a larger max_position_embeddings. +static inline ggml_tensor * draft_rope(ggml_context * ctx, ggml_tensor * x, + ggml_tensor * positions, int head_dim, + float rope_base) { + static struct { + int nctx; + float ext; + float bf; + float bs; + bool init; + } p = {0, 0.0f, 0.0f, 0.0f, false}; + if (!p.init) { + const char * en = std::getenv("DFLASH_DRAFT_YARN"); + if (en && std::atoi(en) != 0) { + const char * nc = std::getenv("DFLASH_DRAFT_YARN_NCTX_ORIG"); + p.nctx = nc ? std::atoi(nc) : 32768; + p.ext = 1.0f; + p.bf = 32.0f; + p.bs = 1.0f; + std::fprintf(stderr, + "[draft-yarn] enabled: n_ctx_orig=%d ext_factor=%.2f beta_fast=%.1f beta_slow=%.1f\n", + p.nctx, p.ext, p.bf, p.bs); + } + p.init = true; + } + return ggml_rope_ext(ctx, x, positions, /*freq_factors=*/nullptr, + head_dim, GGML_ROPE_TYPE_NEOX, p.nctx, + rope_base, /*freq_scale=*/1.0f, + p.ext, /*attn_factor=*/1.0f, p.bf, p.bs); +} + +// ─── Graph builders ─────────────────────────────────────────────────────── + +// build_draft_kv_prefill_graph — prefix-direct KV materialisation (SGLang style). +// +// Projects n_tokens new context positions through the draft model's Wk / Wv +// (after FC → ctx_hidden) and writes the resulting K, V tensors into +// cache.draft_k[il] / cache.draft_v[il] starting at offset cache.draft_kv_pos. +// +// The function is side-effect only: it expands ggml_cpy ops into gf and +// returns nullptr. The caller must ggml_graph_compute(gf) to materialise +// the cache entries, then increment cache.draft_kv_pos by n_tokens. +// +// target_feat [6*target_hidden, n_tokens] f32 +// positions [n_tokens] i32 (absolute positions for RoPE) +ggml_tensor * build_draft_kv_prefill_graph( + ggml_context * ctx, + ggml_cgraph * gf, + const GemmaDraftWeights & w, + GemmaTargetCache & cache, + ggml_tensor * target_feat, + ggml_tensor * positions, + int n_tokens) +{ + // Guard: writing cache.draft_kv_pos..cache.draft_kv_pos+n_tokens-1 must fit. + if (cache.draft_k.empty() || + cache.draft_kv_pos < 0 || + cache.draft_kv_pos + n_tokens > (int)cache.draft_k[0]->ne[2]) { + const int tensor_cap = cache.draft_k.empty() ? -1 : (int)cache.draft_k[0]->ne[2]; + GGML_ABORT("draft KV prefill out of bounds: draft_kv_pos=%d n_tokens=%d cap=%d tensor_cap=%d", + cache.draft_kv_pos, n_tokens, cache.draft_kv_cap, tensor_cap); + } + + const int n_kv = w.n_head_kv; + const int head_dim = w.head_dim; + const float eps = GEMMA4_RMS_EPS; + const float rope_base = w.rope_theta; + + // ── 1. FC projection: ctx_hidden = fc @ target_feat → [n_embd, n_tokens] + ggml_tensor * ctx_hidden = ggml_mul_mat(ctx, w.fc, target_feat); + // hidden_norm: RMSNorm applied right after the fc projection + // (matches qwen3_dflash_graph.cpp:57-59) + ctx_hidden = ggml_rms_norm(ctx, ctx_hidden, eps); + ctx_hidden = ggml_mul(ctx, ctx_hidden, w.hidden_norm); + ggml_set_name(ctx_hidden, "draft_kv_prefill_ctx_hidden"); + + // ── 2. 
Per-layer K / V projection, normalisation, RoPE, cache write + for (int il = 0; il < w.n_layer; il++) { + const GemmaDraftLayer & L = w.layers[il]; + + // K = Wk @ ctx_hidden → [kv_dim, n_tokens] → [head_dim, n_kv, n_tokens] + ggml_tensor * Kb = ggml_mul_mat(ctx, L.wk, ctx_hidden); + Kb = ggml_reshape_3d(ctx, Kb, head_dim, n_kv, n_tokens); + Kb = ggml_rms_norm(ctx, Kb, eps); + Kb = ggml_mul(ctx, Kb, L.k_norm); + Kb = draft_rope(ctx, Kb, positions, head_dim, rope_base); + + // V = Wv @ ctx_hidden → [kv_dim, n_tokens] → [head_dim, n_kv, n_tokens] + ggml_tensor * Vb = ggml_mul_mat(ctx, L.wv, ctx_hidden); + Vb = ggml_reshape_3d(ctx, Vb, head_dim, n_kv, n_tokens); + + // Write K into cache.draft_k[il] at offset cache.draft_kv_pos + ggml_tensor * k_dst = ggml_view_3d(ctx, cache.draft_k[il], + head_dim, n_kv, n_tokens, + cache.draft_k[il]->nb[1], cache.draft_k[il]->nb[2], + (size_t)cache.draft_kv_pos * cache.draft_k[il]->nb[2]); + ggml_build_forward_expand(gf, ggml_cpy(ctx, Kb, k_dst)); + + // Write V into cache.draft_v[il] at offset cache.draft_kv_pos + ggml_tensor * v_dst = ggml_view_3d(ctx, cache.draft_v[il], + head_dim, n_kv, n_tokens, + cache.draft_v[il]->nb[1], cache.draft_v[il]->nb[2], + (size_t)cache.draft_kv_pos * cache.draft_v[il]->nb[2]); + ggml_build_forward_expand(gf, ggml_cpy(ctx, Vb, v_dst)); + } + + return nullptr; +} + +// build_gemma4_draft_graph — KV-cached draft forward. +// +// Attends over the full draft KV cache (context K/V already materialised by +// build_draft_kv_prefill_graph, plus newly written block K/V) and returns +// logits for the n_tokens block positions. +// +// draft_embed [n_embd, n_tokens] f32 (MASK token embeddings) +// positions [n_tokens] i32 (absolute token positions) +// attn_mask [kv_pad, q_pad] f16 (causal over context+block) +// kv_start context length before this block (= cache.draft_kv_pos) +// +// Returns logits [n_vocab, n_tokens] f32 (softcapped). +ggml_tensor * build_gemma4_draft_graph( + ggml_context * ctx, + ggml_cgraph * gf, + const GemmaDraftWeights & w, + GemmaTargetCache & cache, + ggml_tensor * draft_embed, + ggml_tensor * positions, + ggml_tensor * attn_mask, + int n_tokens, + int kv_start) +{ + // Validate KV cache write range before any graph nodes touch it. + if (kv_start < 0 || kv_start + n_tokens > cache.draft_kv_cap) { + GGML_ABORT("draft KV write out of bounds: kv_start=%d n_tokens=%d cap=%d", + kv_start, n_tokens, cache.draft_kv_cap); + } + + const int n_head = w.n_head; + const int n_kv = w.n_head_kv; + const int head_dim = w.head_dim; + const float eps = GEMMA4_RMS_EPS; + const float rope_base = w.rope_theta; + const int kv_len = kv_start + n_tokens; + + // Gemma4 scales embeddings by sqrt(hidden_size) — the draft shares the + // target's tok_embd, so it must apply the same scaling. Reference: + // vLLM qwen3_dflash.py embed_normalizer = target_config.hidden_size**0.5 + ggml_tensor * hidden = ggml_scale(ctx, draft_embed, std::sqrt((float)w.n_embd)); + ggml_set_name(hidden, "gemma4_draft_scaled_embed"); + + // ── 2. Transformer layers ───────────────────────────────────────── + for (int il = 0; il < w.n_layer; il++) { + const GemmaDraftLayer & L = w.layers[il]; + + // ── 2a. Attention pre-norm + ggml_tensor * cur = ggml_rms_norm(ctx, hidden, eps); + cur = ggml_mul(ctx, cur, L.attn_norm); + + // ── 2b. 
Q / K / V projections from block hidden state + ggml_tensor * Q = ggml_mul_mat(ctx, L.wq, cur); // [q_dim, n_tokens] + ggml_tensor * Kb = ggml_mul_mat(ctx, L.wk, cur); // [kv_dim, n_tokens] + ggml_tensor * Vb = ggml_mul_mat(ctx, L.wv, cur); // [kv_dim, n_tokens] + + // ── 2c. Reshape + per-head RMSNorm for Q and block K + Q = ggml_reshape_3d(ctx, Q, head_dim, n_head, n_tokens); + Q = ggml_rms_norm(ctx, Q, eps); + Q = ggml_mul(ctx, Q, L.q_norm); + + Kb = ggml_reshape_3d(ctx, Kb, head_dim, n_kv, n_tokens); + Kb = ggml_rms_norm(ctx, Kb, eps); + Kb = ggml_mul(ctx, Kb, L.k_norm); + + Vb = ggml_reshape_3d(ctx, Vb, head_dim, n_kv, n_tokens); + + // ── 2d. RoPE on Q and block K + Q = draft_rope(ctx, Q, positions, head_dim, rope_base); + Kb = draft_rope(ctx, Kb, positions, head_dim, rope_base); + + // ── 2e. Write block K / V into draft KV cache at [kv_start..kv_start+n_tokens) + ggml_tensor * k_dst = ggml_view_3d(ctx, cache.draft_k[il], + head_dim, n_kv, n_tokens, + cache.draft_k[il]->nb[1], cache.draft_k[il]->nb[2], + (size_t)kv_start * cache.draft_k[il]->nb[2]); + ggml_build_forward_expand(gf, ggml_cpy(ctx, Kb, k_dst)); + + ggml_tensor * v_dst = ggml_view_3d(ctx, cache.draft_v[il], + head_dim, n_kv, n_tokens, + cache.draft_v[il]->nb[1], cache.draft_v[il]->nb[2], + (size_t)kv_start * cache.draft_v[il]->nb[2]); + ggml_build_forward_expand(gf, ggml_cpy(ctx, Vb, v_dst)); + + // ── 2f. Full K / V view (context + block) from draft KV cache + // Optional SWA truncation: when enabled and this is an SWA layer + // with kv_len exceeding sliding_window, restrict K/V (and the mask) + // to the last (sliding_window + n_tokens) slots. Matches the draft + // model's training-time SWA pattern. + const bool layer_is_swa = (il < (int)w.layer_is_swa.size()) + ? w.layer_is_swa[il] : false; + const bool use_swa_trunc = draft_swa_trunc_enabled() + && layer_is_swa + && w.sliding_window > 0 + && kv_len > (w.sliding_window + n_tokens); + const int eff_kv_len = use_swa_trunc + ? (w.sliding_window + n_tokens) + : kv_len; + const int kv_offset = kv_len - eff_kv_len; // 0 if no truncation + + ggml_tensor * K_full = ggml_view_3d(ctx, cache.draft_k[il], + head_dim, n_kv, eff_kv_len, + cache.draft_k[il]->nb[1], cache.draft_k[il]->nb[2], + (size_t)kv_offset * cache.draft_k[il]->nb[2]); + ggml_tensor * V_full = ggml_view_3d(ctx, cache.draft_v[il], + head_dim, n_kv, eff_kv_len, + cache.draft_v[il]->nb[1], cache.draft_v[il]->nb[2], + (size_t)kv_offset * cache.draft_v[il]->nb[2]); + + // ── 2g. Permute into flash_attn_ext layout + // Q: [head_dim, n_tokens, n_head, 1] + // K_full: [head_dim, eff_kv_len, n_head_kv, 1] + // V_full: [head_dim, eff_kv_len, n_head_kv, 1] + Q = ggml_cont(ctx, ggml_permute(ctx, Q, 0, 2, 1, 3)); + K_full = ggml_cont(ctx, ggml_permute(ctx, K_full, 0, 2, 1, 3)); + V_full = ggml_cont(ctx, ggml_permute(ctx, V_full, 0, 2, 1, 3)); + + // SWA-truncated mask view: take the last eff_kv_len rows along the + // kv axis (axis 0). Mask shape is [kv_pad, q_pad] with kv_pad >= kv_len, + // so the slice [kv_offset .. kv_offset+eff_kv_len) gives the same + // causal pattern for the surviving K positions. + ggml_tensor * eff_mask = attn_mask; + if (use_swa_trunc && kv_offset > 0) { + // ggml_view_2d would produce a non-contiguous tensor (row stride is + // unchanged at kv_pad * elt). FA requires contiguous mask, so we + // copy the slice into a fresh tensor. 
+
+ ggml_tensor * mask_view = ggml_view_2d(ctx, attn_mask,
+ eff_kv_len, attn_mask->ne[1],
+ attn_mask->nb[1],
+ (size_t)kv_offset * ggml_element_size(attn_mask));
+ eff_mask = ggml_cont(ctx, mask_view);
+ }
+
+ // ── 2h. Flash attention over full context+block KV
+ // scale = 1 / sqrt(head_dim); no logit softcap at attention level
+ const float scale = 1.0f / std::sqrt((float)head_dim);
+ ggml_tensor * attn = ggml_flash_attn_ext(ctx, Q, K_full, V_full, eff_mask,
+ scale, /*max_bias=*/0.0f,
+ /*logit_softcap=*/0.0f);
+ // attn: [head_dim, n_head, n_tokens, 1]
+ attn = ggml_reshape_2d(ctx, attn, head_dim * n_head, n_tokens);
+
+ // ── 2i. Output projection + residual
+ ggml_tensor * attn_out = ggml_mul_mat(ctx, L.wo, attn);
+ hidden = ggml_add(ctx, hidden, attn_out);
+
+ // ── 2j. FFN pre-norm
+ ggml_tensor * hf = ggml_rms_norm(ctx, hidden, eps);
+ hf = ggml_mul(ctx, hf, L.ffn_norm);
+
+ // ── 2k. SwiGLU FFN: down(silu(gate(x)) * up(x))
+ ggml_tensor * g = ggml_mul_mat(ctx, L.w_gate, hf);
+ g = ggml_silu(ctx, g);
+ ggml_tensor * u = ggml_mul_mat(ctx, L.w_up, hf);
+ ggml_tensor * gu = ggml_mul(ctx, g, u);
+ ggml_tensor * ffn_out = ggml_mul_mat(ctx, L.w_down, gu);
+
+ hidden = ggml_add(ctx, hidden, ffn_out);
+ }
+
+ // ── 3. Final output norm
+ ggml_tensor * out = ggml_rms_norm(ctx, hidden, eps);
+ out = ggml_mul(ctx, out, w.out_norm);
+ ggml_set_name(out, "gemma4_draft_hidden_out");
+
+ // ── 4. LM head (tied: transpose of tok_embd)
+ // tok_embd: [draft_hidden, n_vocab] ggml ne[0]=draft_hidden, ne[1]=n_vocab
+ // out: [draft_hidden, n_tokens]
+ // logits: [n_vocab, n_tokens]
+ ggml_tensor * logits = ggml_mul_mat(ctx, w.tok_embd, out);
+ ggml_set_name(logits, "gemma4_draft_logits_pre_cap");
+
+ // ── 5. Logit softcapping: logits = cap * tanh(logits / cap)
+ const float cap = w.logit_softcap;
+ logits = ggml_scale(ctx, logits, 1.0f / cap);
+ logits = ggml_tanh(ctx, logits);
+ logits = ggml_scale(ctx, logits, cap);
+ ggml_set_name(logits, "gemma4_draft_logits");
+
+ return logits;
+}
+
+// ─── Safetensors loader ─────────────────────────────────────────────────
+
+namespace {
+
+struct GStEntry {
+ std::string dtype;
+ std::vector<int64_t> shape;
+ uint64_t data_start;
+ uint64_t data_end;
+};
+
+using GStMap = std::unordered_map<std::string, GStEntry>;
+
+// Minimal safetensors JSON header parser (same algorithm as safetensors_draft.cpp). 
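+// Header layout: an 8-byte little-endian u64 JSON length, then the JSON table,
+// then the raw tensor blob. Illustrative entry (shapes/offsets made up):
+// {"fc.weight":{"dtype":"BF16","shape":[1024,24576],"data_offsets":[0,50331648]},
+// "__metadata__":{"format":"pt"}}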
+static bool parse_gst_header(const char * h, size_t hlen, GStMap & out) { + auto skip_ws = [&](size_t & i) { + while (i < hlen && (h[i] == ' ' || h[i] == '\t' || + h[i] == '\n' || h[i] == '\r')) i++; + }; + size_t i = 0; + skip_ws(i); + if (i >= hlen || h[i] != '{') return false; + i++; + while (i < hlen) { + skip_ws(i); + if (i >= hlen) return false; + if (h[i] == '}') { i++; break; } + if (h[i] == ',') { i++; skip_ws(i); } + if (i >= hlen || h[i] != '"') return false; + i++; + size_t name_start = i; + while (i < hlen && h[i] != '"') i++; + if (i >= hlen) return false; + std::string name(h + name_start, i - name_start); + i++; + skip_ws(i); + if (i >= hlen || h[i] != ':') return false; + i++; + skip_ws(i); + if (i >= hlen || h[i] != '{') return false; + size_t obj_start = i; + int depth = 0; + size_t obj_end = i; + for (; obj_end < hlen; obj_end++) { + if (h[obj_end] == '{') depth++; + else if (h[obj_end] == '}') { if (--depth == 0) { obj_end++; break; } } + } + if (depth != 0) return false; + if (name == "__metadata__") { i = obj_end; continue; } + + std::string obj(h + obj_start, obj_end - obj_start); + GStEntry e; + { + auto k = obj.find("\"dtype\":\""); + if (k == std::string::npos) return false; + auto vs = k + 9; + auto ve = obj.find('"', vs); + if (ve == std::string::npos) return false; + e.dtype = obj.substr(vs, ve - vs); + } + { + auto k = obj.find("\"shape\":["); + if (k == std::string::npos) return false; + auto vs = k + 9; + auto ve = obj.find(']', vs); + if (ve == std::string::npos) return false; + const char * p = obj.c_str() + vs; + const char * pe = obj.c_str() + ve; + while (p < pe) { + char * end = nullptr; + long long v = std::strtoll(p, &end, 10); + if (end == p) break; + e.shape.push_back((int64_t)v); + p = end; + while (p < pe && (*p == ',' || *p == ' ')) p++; + } + } + { + auto k = obj.find("\"data_offsets\":["); + if (k == std::string::npos) return false; + auto vs = k + 16; + auto ve = obj.find(']', vs); + if (ve == std::string::npos) return false; + unsigned long long s = 0, ed = 0; + if (std::sscanf(obj.c_str() + vs, "%llu , %llu", &s, &ed) != 2) + if (std::sscanf(obj.c_str() + vs, "%llu,%llu", &s, &ed) != 2) return false; + e.data_start = s; + e.data_end = ed; + } + out.emplace(std::move(name), std::move(e)); + i = obj_end; + } + return true; +} + +static ggml_type gst_dtype_to_ggml(const std::string & dt) { + if (dt == "BF16") return GGML_TYPE_BF16; + if (dt == "F16") return GGML_TYPE_F16; + if (dt == "F32") return GGML_TYPE_F32; + return GGML_TYPE_COUNT; +} + +struct GMmap { + void * addr = nullptr; + size_t len = 0; +#if defined(_WIN32) + HANDLE hFile = INVALID_HANDLE_VALUE; + HANDLE hMap = nullptr; +#else + int fd = -1; +#endif + + bool open_ro(const std::string & path, std::string & err) { +#if defined(_WIN32) + hFile = CreateFileA(path.c_str(), GENERIC_READ, FILE_SHARE_READ, + nullptr, OPEN_EXISTING, FILE_ATTRIBUTE_NORMAL, nullptr); + if (hFile == INVALID_HANDLE_VALUE) { + err = "CreateFileA: " + path + ": error " + std::to_string(GetLastError()); + return false; + } + LARGE_INTEGER sz; + if (!GetFileSizeEx(hFile, &sz)) { + err = "GetFileSizeEx failed"; return false; + } + len = (size_t)sz.QuadPart; + hMap = CreateFileMappingA(hFile, nullptr, PAGE_READONLY, 0, 0, nullptr); + if (!hMap) { err = "CreateFileMappingA failed"; return false; } + addr = MapViewOfFile(hMap, FILE_MAP_READ, 0, 0, 0); + if (!addr) { err = "MapViewOfFile failed"; return false; } +#else + fd = ::open(path.c_str(), O_RDONLY); + if (fd < 0) { err = "open: " + path + ": " + 
std::strerror(errno); return false; }
+ struct stat st;
+ if (::fstat(fd, &st) < 0) { err = "fstat: " + std::string(std::strerror(errno)); return false; }
+ len = (size_t)st.st_size;
+ addr = ::mmap(nullptr, len, PROT_READ, MAP_PRIVATE, fd, 0);
+ if (addr == MAP_FAILED) {
+ err = "mmap: " + std::string(std::strerror(errno));
+ addr = nullptr; return false;
+ }
+#endif
+ return true;
+ }
+
+ ~GMmap() {
+#if defined(_WIN32)
+ if (addr) UnmapViewOfFile(addr);
+ if (hMap) CloseHandle(hMap);
+ if (hFile != INVALID_HANDLE_VALUE) CloseHandle(hFile);
+#else
+ if (addr) ::munmap(addr, len);
+ if (fd >= 0) ::close(fd);
+#endif
+ }
+};
+
+// Allocate one ggml tensor for a safetensors entry.
+// HF row-major [out, in] → ggml ne[0]=in, ne[1]=out (byte layout identical).
+// norm weights are kept as F32 (ggml CUDA elementwise ops require non-BF16 src1).
+// Projection weights stay BF16 (Ampere+) or are converted to F16 (Turing).
+static ggml_tensor * galloc_tensor(
+ ggml_context * gctx,
+ const GStMap & st,
+ const std::string & name,
+ const std::vector<int64_t> & expected_shape,
+ ggml_type gt_override = GGML_TYPE_COUNT)
+{
+ auto it = st.find(name);
+ if (it == st.end()) {
+ set_last_error("gemma4 safetensors: missing tensor '" + name + "'");
+ return nullptr;
+ }
+ const GStEntry & e = it->second;
+ if (e.dtype != "BF16") {
+ set_last_error("gemma4 safetensors: '" + name + "' dtype=" + e.dtype +
+ " expected BF16");
+ return nullptr;
+ }
+ if (e.shape.size() != expected_shape.size()) {
+ set_last_error("gemma4 safetensors: '" + name + "' ndim mismatch");
+ return nullptr;
+ }
+ for (size_t k = 0; k < expected_shape.size(); k++) {
+ if (e.shape[k] != expected_shape[k]) {
+ char buf[256];
+ std::snprintf(buf, sizeof(buf),
+ "gemma4 safetensors: '%s' shape[%zu]=%lld expected %lld",
+ name.c_str(), k, (long long)e.shape[k], (long long)expected_shape[k]);
+ set_last_error(buf);
+ return nullptr;
+ }
+ }
+ ggml_type gt = (gt_override == GGML_TYPE_COUNT) ? 
GGML_TYPE_BF16 : gt_override;
+ ggml_tensor * t = nullptr;
+ if (expected_shape.size() == 1) {
+ t = ggml_new_tensor_1d(gctx, gt, expected_shape[0]);
+ } else if (expected_shape.size() == 2) {
+ // [out, in] → ne[0]=in, ne[1]=out
+ t = ggml_new_tensor_2d(gctx, gt, expected_shape[1], expected_shape[0]);
+ } else {
+ set_last_error("gemma4 safetensors: unexpected ndim > 2 for '" + name + "'");
+ return nullptr;
+ }
+ ggml_set_name(t, name.c_str());
+ return t;
+}
+
+static void g_bf16_to_f32(const uint16_t * src, float * dst, size_t n) {
+ for (size_t i = 0; i < n; i++) {
+ uint32_t bits = ((uint32_t)src[i]) << 16;
+ std::memcpy(&dst[i], &bits, 4);
+ }
+}
+
+// BF16 → F16 via F32. Truncates the mantissa (no round-to-nearest), flushes
+// F16 underflow to signed zero (no subnormals), and maps overflow and NaN to
+// ±Inf; acceptable in practice for weight tensors, whose magnitudes sit well
+// inside the F16 range.
+static void g_bf16_to_f16(const uint16_t * src, uint16_t * dst, size_t n) {
+ for (size_t i = 0; i < n; i++) {
+ uint32_t bits = ((uint32_t)src[i]) << 16;
+ float f;
+ std::memcpy(&f, &bits, 4);
+ uint32_t u;
+ std::memcpy(&u, &f, 4);
+ uint32_t sign = (u >> 16) & 0x8000;
+ int32_t exp = ((u >> 23) & 0xFF) - 127 + 15;
+ uint32_t mant = (u >> 13) & 0x03FF;
+ if (exp <= 0) dst[i] = (uint16_t)sign;
+ else if (exp >= 31) dst[i] = (uint16_t)(sign | 0x7C00);
+ else dst[i] = (uint16_t)(sign | (exp << 10) | mant);
+ }
+}
+
+static bool g_cuda_has_native_bf16() {
+ const char * env = std::getenv("DFLASH27B_DRAFT_FP16");
+ if (env && std::atoi(env) != 0) return false;
+#if defined(DFLASH27B_MIN_SM) && DFLASH27B_MIN_SM < 80
+ return false;
+#else
+ return true;
+#endif
+}
+
+static uint32_t get_u32_or(const gguf_context * g, const char * key, uint32_t fallback) {
+ int64_t id = gguf_find_key(g, key);
+ if (id < 0) return fallback;
+ return gguf_get_val_u32(g, id);
+}
+
+static float get_f32_or(const gguf_context * g, const char * key, float fallback) {
+ int64_t id = gguf_find_key(g, key);
+ if (id < 0) return fallback;
+ return gguf_get_val_f32(g, id);
+}
+
+} // anonymous namespace
+
+// ─── Public loader ──────────────────────────────────────────────────────
+
+// Load Gemma4 DFlash draft weights from a directory containing one or more
+// safetensors shards. We look for files named:
+// model.safetensors (single-shard)
+// model-00001-of-NNNNN.safetensors (only the -of-00001 form is probed for now)
+//
+// In practice the z-lab Gemma4 draft is small enough to fit in a single shard.
+bool load_gemma4_draft_safetensors(const std::string & dir_path,
+ ggml_backend_t backend,
+ GemmaDraftWeights & out)
+{
+ // ── 1. Find the shard file ──────────────────────────────────────
+ // Try the canonical single-shard name first.
+ std::string path = dir_path + "/model.safetensors";
+ {
+ // Quick existence check without mmap
+ int fd_check = ::open(path.c_str(), O_RDONLY);
+ if (fd_check < 0) {
+ // Fall back to the numbered single-shard name (true multi-shard
+ // checkpoints are not handled yet)
+ path = dir_path + "/model-00001-of-00001.safetensors";
+ fd_check = ::open(path.c_str(), O_RDONLY);
+ if (fd_check < 0) {
+ set_last_error("gemma4 draft: no safetensors file found in " + dir_path);
+ return false;
+ }
+ }
+ ::close(fd_check);
+ }
+
+ // ── 2. Open + mmap ─────────────────────────────────────────────
+ GMmap mm;
+ std::string err;
+ if (!mm.open_ro(path, err)) { set_last_error(err); return false; }
+ if (mm.len < 8) { set_last_error("gemma4 draft: safetensors file too small"); return false; }
+
+ // ── 3. 
Parse header ──────────────────────────────────────────────
+ uint64_t header_len = 0;
+ std::memcpy(&header_len, mm.addr, 8);
+ if (header_len == 0 || 8 + header_len > mm.len) {
+ set_last_error("gemma4 draft: bad safetensors header length");
+ return false;
+ }
+ const char * header_ptr = (const char *)mm.addr + 8;
+ GStMap st;
+ if (!parse_gst_header(header_ptr, (size_t)header_len, st)) {
+ set_last_error("gemma4 draft: safetensors JSON parse failed");
+ return false;
+ }
+ const uint8_t * blob = (const uint8_t *)mm.addr + 8 + header_len;
+ const size_t blob_size = mm.len - 8 - (size_t)header_len;
+
+ // ── 4. Infer draft dimensions from FC weight shape ─────────────
+ // The FC input is 6*target_hidden; the FC output is draft_hidden.
+ // HF shape in safetensors: [out_features, in_features]
+ // = [draft_hidden, 6*target_hidden]
+ {
+ auto it = st.find("fc.weight");
+ if (it == st.end()) {
+ set_last_error("gemma4 draft: fc.weight not found");
+ return false;
+ }
+ const GStEntry & e = it->second;
+ if (e.shape.size() != 2) {
+ set_last_error("gemma4 draft: fc.weight expected 2D");
+ return false;
+ }
+ // HF stores as [out_features, in_features] = [draft_hidden, 6*target_hidden]
+ out.n_embd = (int)e.shape[0];
+ int fc_in = (int)e.shape[1];
+ out.target_hidden = fc_in / GEMMA4_DRAFT_N_TARGET_LAYERS;
+ if (fc_in % GEMMA4_DRAFT_N_TARGET_LAYERS != 0) {
+ char buf[128];
+ std::snprintf(buf, sizeof(buf),
+ "gemma4 draft: FC input %d not divisible by n_target_layers %d",
+ fc_in, GEMMA4_DRAFT_N_TARGET_LAYERS);
+ set_last_error(buf);
+ return false;
+ }
+ }
+
+ // Infer n_head / n_head_kv / n_ff from layer 0 weight shapes
+ {
+ auto iq = st.find("layers.0.self_attn.q_proj.weight");
+ auto ik = st.find("layers.0.self_attn.k_proj.weight");
+ auto ig = st.find("layers.0.mlp.gate_proj.weight");
+ if (iq == st.end() || ik == st.end() || ig == st.end()) {
+ set_last_error("gemma4 draft: missing required layer-0 weight tensors");
+ return false;
+ }
+ // q_proj HF shape: [q_dim, n_embd] where q_dim = n_head * head_dim
+ // (out.head_dim is assumed to be pre-set, e.g. a struct default; the
+ // safetensors file carries no hparam metadata to derive it from)
+ int q_dim = (int)iq->second.shape[0];
+ int kv_dim = (int)ik->second.shape[0];
+ out.n_head = q_dim / out.head_dim;
+ out.n_head_kv = kv_dim / out.head_dim;
+ out.n_ff = (int)ig->second.shape[0];
+ // Also set layer_is_swa: layers [0..n_layer-2] are SWA, last is full
+ out.layer_is_swa.assign((size_t)out.n_layer, true);
+ out.layer_is_swa[(size_t)(out.n_layer - 1)] = false;
+ }
+
+ const int64_t HIDDEN = out.n_embd;
+ const int64_t Q_DIM = (int64_t)out.n_head * out.head_dim;
+ const int64_t KV_DIM = (int64_t)out.n_head_kv * out.head_dim;
+ const int64_t INTER = out.n_ff;
+ const int64_t HD = out.head_dim;
+ const int64_t FC_IN = (int64_t)GEMMA4_DRAFT_N_TARGET_LAYERS * out.target_hidden;
+ // VOCAB not used here; tok_embd is injected at runtime from the target model.
+
+ // ── 5. 
Allocate ggml context ───────────────────────────────────── + // tensors: fc, hidden_norm, out_norm = 3 top-level (tok_embd injected at runtime) + // 11 tensors × 5 layers = 55 + // total = 58 + headroom + const int n_tensors = 3 + 11 * out.n_layer + 8; + ggml_init_params ip{}; + ip.mem_size = (size_t)n_tensors * ggml_tensor_overhead(); + ip.mem_buffer = nullptr; + ip.no_alloc = true; + out.ctx = ggml_init(ip); + if (!out.ctx) { set_last_error("gemma4 draft: ggml_init failed"); return false; } + out.backend = backend; + out.layers.assign((size_t)out.n_layer, GemmaDraftLayer{}); + + const ggml_type NORM_GT = GGML_TYPE_F32; + const bool nbf16 = g_cuda_has_native_bf16(); + const ggml_type PROJ_GT = nbf16 ? GGML_TYPE_COUNT : GGML_TYPE_F16; + + // ── 6. Create named tensors ────────────────────────────────────── + out.fc = galloc_tensor(out.ctx, st, "fc.weight", {HIDDEN, FC_IN}, PROJ_GT); + out.hidden_norm = galloc_tensor(out.ctx, st, "hidden_norm.weight", {HIDDEN}, NORM_GT); + out.out_norm = galloc_tensor(out.ctx, st, "norm.weight", {HIDDEN}, NORM_GT); + // tok_embd is not present in the draft safetensors; the draft shares + // the target model's token embedding which is injected at runtime. + out.tok_embd = nullptr; + if (!out.fc || !out.hidden_norm || !out.out_norm) return false; + + for (int il = 0; il < out.n_layer; il++) { + char pfx[64]; + std::snprintf(pfx, sizeof(pfx), "layers.%d.", il); + std::string p = pfx; + GemmaDraftLayer & L = out.layers[(size_t)il]; + + L.attn_norm = galloc_tensor(out.ctx, st, p + "input_layernorm.weight", {HIDDEN}, NORM_GT); + L.ffn_norm = galloc_tensor(out.ctx, st, p + "post_attention_layernorm.weight", {HIDDEN}, NORM_GT); + L.wq = galloc_tensor(out.ctx, st, p + "self_attn.q_proj.weight", {Q_DIM, HIDDEN}, PROJ_GT); + L.wk = galloc_tensor(out.ctx, st, p + "self_attn.k_proj.weight", {KV_DIM, HIDDEN}, PROJ_GT); + L.wv = galloc_tensor(out.ctx, st, p + "self_attn.v_proj.weight", {KV_DIM, HIDDEN}, PROJ_GT); + L.wo = galloc_tensor(out.ctx, st, p + "self_attn.o_proj.weight", {HIDDEN, Q_DIM}, PROJ_GT); + L.q_norm = galloc_tensor(out.ctx, st, p + "self_attn.q_norm.weight", {HD}, NORM_GT); + L.k_norm = galloc_tensor(out.ctx, st, p + "self_attn.k_norm.weight", {HD}, NORM_GT); + L.w_gate = galloc_tensor(out.ctx, st, p + "mlp.gate_proj.weight", {INTER, HIDDEN}, PROJ_GT); + L.w_up = galloc_tensor(out.ctx, st, p + "mlp.up_proj.weight", {INTER, HIDDEN}, PROJ_GT); + L.w_down = galloc_tensor(out.ctx, st, p + "mlp.down_proj.weight", {HIDDEN, INTER}, PROJ_GT); + + if (!L.attn_norm || !L.ffn_norm || !L.wq || !L.wk || !L.wv || !L.wo || + !L.q_norm || !L.k_norm || !L.w_gate || !L.w_up || !L.w_down) { + return false; + } + } + + // ── 7. 
Allocate backend buffer and upload bytes ────────────────
+ out.buf = ggml_backend_alloc_ctx_tensors(out.ctx, backend);
+ if (!out.buf) {
+ set_last_error("gemma4 draft: ggml_backend_alloc_ctx_tensors failed");
+ return false;
+ }
+
+ std::vector<float> scratch_f32;
+ std::vector<uint16_t> scratch_f16;
+
+ for (ggml_tensor * t = ggml_get_first_tensor(out.ctx); t != nullptr;
+ t = ggml_get_next_tensor(out.ctx, t))
+ {
+ const char * name = ggml_get_name(t);
+ auto it = st.find(name);
+ if (it == st.end()) {
+ set_last_error(std::string("gemma4 draft post-alloc: '") +
+ name + "' vanished from header");
+ return false;
+ }
+ const GStEntry & e = it->second;
+ if (e.data_end > (uint64_t)blob_size) {
+ set_last_error(std::string("gemma4 draft: data offset out of bounds for '") +
+ name + "'");
+ return false;
+ }
+ const size_t src_bytes = (size_t)(e.data_end - e.data_start);
+ const size_t dst_bytes = ggml_nbytes(t);
+ const bool same = (t->type == gst_dtype_to_ggml(e.dtype));
+
+ if (same) {
+ if (src_bytes != dst_bytes) {
+ char buf[256];
+ std::snprintf(buf, sizeof(buf),
+ "gemma4 draft: byte mismatch for '%s': blob=%zu ggml=%zu",
+ name, src_bytes, dst_bytes);
+ set_last_error(buf);
+ return false;
+ }
+ ggml_backend_tensor_set(t, blob + e.data_start, 0, dst_bytes);
+ } else if (e.dtype == "BF16" && t->type == GGML_TYPE_F32) {
+ const size_t n = ggml_nelements(t);
+ if (src_bytes != n * 2 || dst_bytes != n * 4) {
+ set_last_error(std::string("gemma4 draft: BF16->F32 size mismatch for '") + name + "'");
+ return false;
+ }
+ scratch_f32.resize(n);
+ g_bf16_to_f32((const uint16_t *)(blob + e.data_start),
+ scratch_f32.data(), n);
+ ggml_backend_tensor_set(t, scratch_f32.data(), 0, dst_bytes);
+ } else if (e.dtype == "BF16" && t->type == GGML_TYPE_F16) {
+ const size_t n = ggml_nelements(t);
+ if (src_bytes != n * 2 || dst_bytes != n * 2) {
+ set_last_error(std::string("gemma4 draft: BF16->F16 size mismatch for '") + name + "'");
+ return false;
+ }
+ scratch_f16.resize(n);
+ g_bf16_to_f16((const uint16_t *)(blob + e.data_start),
+ scratch_f16.data(), n);
+ ggml_backend_tensor_set(t, scratch_f16.data(), 0, dst_bytes);
+ } else {
+ set_last_error(std::string("gemma4 draft: unsupported dtype conversion for '") +
+ name + "': " + e.dtype + " -> " + ggml_type_name(t->type));
+ return false;
+ }
+ }
+
+ std::fprintf(stderr,
+ "[gemma4 draft] loaded: n_layer=%d n_head=%d n_kv=%d "
+ "n_embd=%d n_ff=%d head_dim=%d target_hidden=%d vocab=%d\n",
+ out.n_layer, out.n_head, out.n_head_kv,
+ out.n_embd, out.n_ff, out.head_dim, out.target_hidden, out.n_vocab);
+ std::fflush(stderr);
+
+ return true;
+}
+
+bool load_gemma4_draft_gguf(const std::string & path,
+ ggml_backend_t backend,
+ GemmaDraftWeights & out)
+{
+ // ── 1. 
Parse metadata + create ggml_context with tensor descriptors ── + ggml_context * meta_ctx = nullptr; + gguf_init_params gip{}; + gip.no_alloc = true; + gip.ctx = &meta_ctx; + gguf_context * gctx = gguf_init_from_file(path.c_str(), gip); + if (!gctx) { + set_last_error("gguf_init_from_file failed: " + path); + return false; + } + + // Validate arch + { + int64_t arch_id = gguf_find_key(gctx, "general.architecture"); + if (arch_id < 0) { + set_last_error("gemma4 draft GGUF: missing general.architecture"); + gguf_free(gctx); + return false; + } + const char * arch = gguf_get_val_str(gctx, arch_id); + if (std::string(arch) != "gemma4-dflash-draft") { + set_last_error(std::string("gemma4 draft GGUF: unexpected arch: ") + arch + + " (expected gemma4-dflash-draft)"); + gguf_free(gctx); + return false; + } + } + + // Read dimensions from GGUF metadata + int64_t arch_id2 = gguf_find_key(gctx, "general.architecture"); + const char * A = gguf_get_val_str(gctx, arch_id2); + char key[256]; + + auto read_u32 = [&](const char * suffix, uint32_t fallback) -> uint32_t { + std::snprintf(key, sizeof(key), "%s.%s", A, suffix); + return get_u32_or(gctx, key, fallback); + }; + auto read_f32 = [&](const char * suffix, float fallback) -> float { + std::snprintf(key, sizeof(key), "%s.%s", A, suffix); + return get_f32_or(gctx, key, fallback); + }; + + const uint32_t n_embd = read_u32("embedding_length", 0); + const uint32_t n_layer = read_u32("block_count", 0); + const uint32_t n_ff = read_u32("feed_forward_length", 0); + const uint32_t n_head = read_u32("attention.head_count", 0); + const uint32_t n_head_kv = read_u32("attention.head_count_kv", 0); + const uint32_t head_dim = read_u32("attention.key_length", 0); + const uint32_t block_sz = read_u32("dflash.block_size", 0); + const uint32_t n_tgt_lay = read_u32("dflash.n_target_layers", 0); + const uint32_t target_hid = read_u32("dflash.target_hidden", 0); + const uint32_t mask_tok_id = read_u32("dflash.mask_token_id", GEMMA4_31B_DRAFT_MASK_TOKEN_ID); + const uint32_t sliding_win = read_u32("dflash.sliding_window", 2048); + const float logit_cap = read_f32("dflash.logit_softcap", GEMMA4_LOGIT_SOFTCAP); + const float rope_theta = read_f32("rope.freq_base", GEMMA4_ROPE_THETA); + + if (n_embd == 0 || n_layer == 0 || n_ff == 0 || n_head == 0 || + n_head_kv == 0 || head_dim == 0) { + char buf[256]; + std::snprintf(buf, sizeof(buf), + "gemma4 draft GGUF: missing hparams: n_embd=%u n_layer=%u n_ff=%u " + "n_head=%u n_head_kv=%u head_dim=%u", + n_embd, n_layer, n_ff, n_head, n_head_kv, head_dim); + set_last_error(buf); + gguf_free(gctx); + return false; + } + + // Validate block_size and n_target_layers match compiled constants + if (block_sz != (uint32_t)GEMMA4_DRAFT_BLOCK_SIZE || + n_tgt_lay != (uint32_t)GEMMA4_DRAFT_N_TARGET_LAYERS) { + char buf[256]; + std::snprintf(buf, sizeof(buf), + "gemma4 draft GGUF: dflash.block_size=%u (expected %d), " + "dflash.n_target_layers=%u (expected %d)", + block_sz, GEMMA4_DRAFT_BLOCK_SIZE, + n_tgt_lay, GEMMA4_DRAFT_N_TARGET_LAYERS); + set_last_error(buf); + gguf_free(gctx); + return false; + } + + // Sanity-check upper bounds + constexpr uint32_t MAX_LAYERS = 1024; + constexpr uint32_t MAX_EMBD = 1u << 17; + constexpr uint32_t MAX_FF = 1u << 19; + constexpr uint32_t MAX_HEADS = 1024; + constexpr uint32_t MAX_HEADDIM = 1024; + if (n_layer > MAX_LAYERS || n_embd > MAX_EMBD || + n_ff > MAX_FF || n_head > MAX_HEADS || + n_head_kv > MAX_HEADS || head_dim > MAX_HEADDIM || + n_head_kv > n_head || (n_head % n_head_kv) != 0) { + char buf[320]; 
+ std::snprintf(buf, sizeof(buf), + "gemma4 draft GGUF: hparams out of range: n_embd=%u n_layer=%u n_ff=%u " + "n_head=%u n_head_kv=%u head_dim=%u", + n_embd, n_layer, n_ff, n_head, n_head_kv, head_dim); + set_last_error(buf); + gguf_free(gctx); + return false; + } + + // ── 2. Populate GemmaDraftWeights scalars ──────────────────────────── + out.ctx = meta_ctx; + out.backend = backend; + out.n_layer = (int)n_layer; + out.n_head = (int)n_head; + out.n_head_kv = (int)n_head_kv; + out.head_dim = (int)head_dim; + out.n_embd = (int)n_embd; + out.n_ff = (int)n_ff; + out.block_size = (int)block_sz; + out.n_target_layers = (int)n_tgt_lay; + out.target_hidden = (int)target_hid; + out.mask_token_id = (int)mask_tok_id; + out.sliding_window = (int)sliding_win; + out.logit_softcap = logit_cap; + out.rope_theta = rope_theta; + + // layers [0..n_layer-2] are SWA, last layer is full attention + out.layer_is_swa.assign((size_t)n_layer, true); + out.layer_is_swa[(size_t)(n_layer - 1)] = false; + + out.layers.assign((size_t)n_layer, GemmaDraftLayer{}); + + // tok_embd is injected at runtime from the target model (same as safetensors path) + out.tok_embd = nullptr; + + // ── 3. Wire tensor pointers ────────────────────────────────────────── + auto g = [&](const char * name) -> ggml_tensor * { + return ggml_get_tensor(meta_ctx, name); + }; + + out.fc = g("dflash.fc.weight"); + out.hidden_norm = g("dflash.hidden_norm.weight"); + out.out_norm = g("output_norm.weight"); + + if (!out.fc || !out.hidden_norm || !out.out_norm) { + set_last_error("gemma4 draft GGUF: missing top-level tensors " + "(dflash.fc.weight / dflash.hidden_norm.weight / output_norm.weight)"); + gguf_free(gctx); + return false; + } + + for (int il = 0; il < out.n_layer; il++) { + char name[128]; + auto fnd = [&](const char * suffix) -> ggml_tensor * { + std::snprintf(name, sizeof(name), "blk.%d.%s", il, suffix); + return ggml_get_tensor(meta_ctx, name); + }; + GemmaDraftLayer & L = out.layers[il]; + L.attn_norm = fnd("attn_norm.weight"); + L.ffn_norm = fnd("ffn_norm.weight"); + L.wq = fnd("attn_q.weight"); + L.wk = fnd("attn_k.weight"); + L.wv = fnd("attn_v.weight"); + L.wo = fnd("attn_output.weight"); + L.q_norm = fnd("attn_q_norm.weight"); + L.k_norm = fnd("attn_k_norm.weight"); + L.w_gate = fnd("ffn_gate.weight"); + L.w_up = fnd("ffn_up.weight"); + L.w_down = fnd("ffn_down.weight"); + if (!L.attn_norm || !L.ffn_norm || !L.wq || !L.wk || !L.wv || !L.wo || + !L.q_norm || !L.k_norm || !L.w_gate || !L.w_up || !L.w_down) { + char b[128]; + std::snprintf(b, sizeof(b), + "gemma4 draft GGUF: layer %d missing tensors", il); + set_last_error(b); + gguf_free(gctx); + return false; + } + } + + // ── 4. Allocate backend buffer for all tensors ─────────────────────── + out.buf = ggml_backend_alloc_ctx_tensors(meta_ctx, backend); + if (!out.buf) { + set_last_error("gemma4 draft GGUF: ggml_backend_alloc_ctx_tensors failed"); + gguf_free(gctx); + return false; + } + + // ── 5. 
mmap file and copy tensor bytes to backend ──────────────────── + std::string err; + GMmap mm; + if (!mm.open_ro(path, err)) { set_last_error(err); gguf_free(gctx); return false; } + const size_t data_start = gguf_get_data_offset(gctx); + const int64_t n_tensors = gguf_get_n_tensors(gctx); + + size_t total = 0; + for (int64_t tid = 0; tid < n_tensors; tid++) { + const char * tname = gguf_get_tensor_name(gctx, tid); + ggml_tensor * t = ggml_get_tensor(meta_ctx, tname); + if (!t) continue; + const size_t off = data_start + gguf_get_tensor_offset(gctx, tid); + const size_t sz = gguf_get_tensor_size(gctx, tid); + if (off + sz > mm.len) { + set_last_error(std::string("gemma4 draft GGUF: tensor '") + + tname + "' overflows file"); + gguf_free(gctx); + return false; + } + ggml_backend_tensor_set(t, (const uint8_t *)mm.addr + off, 0, sz); + total += sz; + } + + gguf_free(gctx); + + std::fprintf(stderr, + "[gemma4 draft GGUF] loaded: n_layer=%d n_head=%d n_kv=%d " + "n_embd=%d n_ff=%d head_dim=%d target_hidden=%d (%.2f GiB on GPU)\n", + out.n_layer, out.n_head, out.n_head_kv, + out.n_embd, out.n_ff, out.head_dim, out.target_hidden, + total / (1024.0 * 1024.0 * 1024.0)); + std::fflush(stderr); + + return true; +} + +void free_gemma4_draft_weights(GemmaDraftWeights & w) { + if (w.buf) { ggml_backend_buffer_free(w.buf); w.buf = nullptr; } + if (w.ctx) { ggml_free(w.ctx); w.ctx = nullptr; } + w.layers.clear(); + w.layer_is_swa.clear(); + w.fc = nullptr; + w.hidden_norm = nullptr; + w.out_norm = nullptr; + w.tok_embd = nullptr; +} + +} // namespace dflash27b diff --git a/dflash/src/gemma4_mtp_graph.cpp b/dflash/src/gemma4_mtp_graph.cpp new file mode 100644 index 00000000..d8002471 --- /dev/null +++ b/dflash/src/gemma4_mtp_graph.cpp @@ -0,0 +1,766 @@ +// Single-step MTP (Multi-Token Prediction) graph builder for Gemma4. +// +// Builds a ggml compute graph that, given one token id and the target's last +// full-attention hidden state h_prev, produces: +// - out_logits : F32 [n_vocab, 1] full vocabulary row +// - out_h_post : F32 [n_embd_backbone, 1] next h_prev for the γ chain +// - out_argmax : I32 [1] greedy draft token (4-byte host pull per step) +// +// Architecture (mirrors atomicbot's gemma4-assistant.cpp lines 28-256): +// 1. Token embedding from target.tok_embd, scaled by sqrt(n_embd_backbone). +// 2. Concat [tok_emb, h_prev] → pre_projection → [n_embd, 1]. +// 3. 4 transformer blocks (cross-attention into target KV): +// RMSNorm → Q proj → Q-norm → RoPE → cross-attn (reads donor K/V) → +// wo → post_attn_norm → residual → ffn_norm → GELU FFN → post_ffn_norm → +// residual → optional out_scale. +// 4. output_norm → post_projection → h_post [n_embd_backbone, 1]. +// 5. LM head: dense (tied tok_embd) or centroid-routed for ordered embeddings. +// 6. In-graph argmax. +// +// Cross-attention contract: +// - Each MTP layer reads K/V from w.layers[il].donor_target_layer in the +// target KV cache (resolved at load time as the LAST target layer whose +// SWA type matches this MTP layer). +// - V is ALWAYS read from the cache (use_k_as_v=false): per HF Gemma4 the +// V slot stores rms-normed non-rotated vectors, distinct from post-RoPE K. +// - The K/V view covers [0, attn_pos) = all committed target positions. +// attn_pos is passed in via the in_pos tensor (caller sets it to +// cache.cur_pos before each step). +// - KV mask is not needed: all committed positions ≤ attn_pos are uniformly +// admitted (step position > attn_pos, so every cell is in the causal cone). 
+// We pass nullptr to ggml_flash_attn_ext for the mask argument.
+//
+// Centroid LM head (use_ordered_embeddings=true, always active for Dense 31B):
+// cent_logits = mul_mat(mtp_centroids, h_inner)
+// top_k_ids = ggml_top_k(cent_logits, centroid_top_k)
+// sel_ids = get_rows(token_ordering_view, top_k_ids)
+// sel_logits = mul_mat(get_rows(tok_embd, flat_sel_ids), h_inner)
+// full_row = scatter sel_logits into [-1e30 fill] via ggml_set_rows
+//
+// When use_ordered_embeddings is false (fallback, unlikely for 31B assistant):
+// out_logits = mul_mat(tok_embd, h_inner) — dense tied head.
+
+#include "internal.h"
+
+#include <algorithm>
+#include <cmath>
+#include <cstdint>
+#include <cstdio>
+
+namespace dflash27b {
+
+static constexpr float MTP_RMS_EPS = GEMMA4_RMS_EPS;
+
+// ─── Helpers ───────────────────────────────────────────────────────────────
+
+static ggml_tensor * mtp_rms_norm_mul(ggml_context * ctx,
+ ggml_tensor * x,
+ ggml_tensor * weight) {
+ ggml_tensor * n = ggml_rms_norm(ctx, x, MTP_RMS_EPS);
+ return ggml_mul(ctx, n, weight);
+}
+
+// GELU-gated (GEGLU) FFN: w_down @ (gelu(w_gate @ x) * (w_up @ x))
+static ggml_tensor * mtp_gelu_ffn(ggml_context * ctx,
+ ggml_tensor * cur,
+ const MtpLayerWeights & L) {
+ ggml_tensor * gate = ggml_mul_mat(ctx, L.ffn_gate, cur);
+ ggml_tensor * up = ggml_mul_mat(ctx, L.ffn_up, cur);
+ ggml_tensor * gu = ggml_geglu_split(ctx, gate, up);
+ return ggml_mul_mat(ctx, L.ffn_down, gu);
+}
+
+// ─── Public graph builder ────────────────────────────────────────────────────
+
+bool build_mtp_step_graph(const MtpDrafterWeights & w,
+ const GemmaTargetCache & target_cache,
+ const GemmaTargetWeights & target,
+ MtpStepGraph & out,
+ int attn_pos) {
+ // ── Validate prerequisites ────────────────────────────────────────────────
+ if (!w.pre_projection || !w.post_projection || !w.output_norm) {
+ set_last_error("build_mtp_step_graph: MtpDrafterWeights missing pre/post projection or output_norm");
+ return false;
+ }
+ if ((int)w.layers.size() == 0) {
+ set_last_error("build_mtp_step_graph: no MTP layers");
+ return false;
+ }
+ if (!target.tok_embd) {
+ set_last_error("build_mtp_step_graph: target.tok_embd is null");
+ return false;
+ }
+ if (w.n_embd == 0 || w.n_embd_backbone == 0) {
+ set_last_error("build_mtp_step_graph: n_embd or n_embd_backbone is 0");
+ return false;
+ }
+
+ const int n_embd_backbone = w.n_embd_backbone;
+ const int n_layer = (int)w.layers.size();
+ const int n_vocab = (int)target.tok_embd->ne[1];
+
+ // Validate layer 0 donor KV slot (each layer validates its own in the loop).
+ {
+ const int32_t donor_il_0 = w.layers[0].donor_target_layer;
+ if (donor_il_0 < 0 || donor_il_0 >= (int)target_cache.layer_to_kv_idx.size()) {
+ set_last_error("build_mtp_step_graph: invalid donor_target_layer for MTP layer 0");
+ return false;
+ }
+ const int kv_slot_0 = target_cache.layer_to_kv_idx[donor_il_0];
+ const int kv_read_slot_0 = (kv_slot_0 >= 0) ? kv_slot_0
+ : ((donor_il_0 < (int)target_cache.layer_to_donor_kv.size())
+ ? target_cache.layer_to_donor_kv[donor_il_0] : -1);
+ if (kv_read_slot_0 < 0 || kv_read_slot_0 >= (int)target_cache.attn_k.size()) {
+ set_last_error("build_mtp_step_graph: donor KV slot unresolvable for MTP layer 0");
+ return false;
+ }
+ }
+
+ // ── Allocate ggml context ─────────────────────────────────────────────────
+ // Conservative tensor overhead: 3 inputs + ~80 ops per layer + outputs.
+ // Extras vs original: K/V casts, GQA block-broadcast views/materialization,
+ // Q permute/cont, explicit KQ mask, Vt materialization. 
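+ // (Worked example: with the 4 transformer blocks described in the file
+ // header, the estimate below reserves 3 + 4*80 + 20 = 343 tensor slots.)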
+ const size_t n_tensors_est = (size_t)(3 + n_layer * 80 + 20); + ggml_init_params ip{}; + ip.mem_size = n_tensors_est * ggml_tensor_overhead() + 1024 * 1024; + ip.mem_buffer = nullptr; + ip.no_alloc = true; + ggml_context * ctx = ggml_init(ip); + if (!ctx) { + set_last_error("build_mtp_step_graph: ggml_init failed"); + return false; + } + + ggml_cgraph * gf = ggml_new_graph_custom(ctx, GGML_DEFAULT_GRAPH_SIZE, false); + + // ── Input tensors ───────────────────────────────────────────────────────── + ggml_tensor * in_tok = ggml_new_tensor_1d(ctx, GGML_TYPE_I32, 1); + ggml_set_input(in_tok); + ggml_set_name(in_tok, "mtp_in_tok"); + + // in_tok_embd: pre-dequantised token embedding supplied by caller. + // Caller must call target.embedder.embed(&tok, 1, buf) and tensor_set before compute. + // This avoids ggml_get_rows on a k-quant (Q4_K) source which the CUDA backend + // does not support in this llama.cpp revision. + ggml_tensor * in_tok_embd = ggml_new_tensor_2d(ctx, GGML_TYPE_F32, n_embd_backbone, 1); + ggml_set_input(in_tok_embd); + ggml_set_name(in_tok_embd, "mtp_in_tok_embd"); + + ggml_tensor * in_h_prev = ggml_new_tensor_2d(ctx, GGML_TYPE_F32, n_embd_backbone, 1); + ggml_set_input(in_h_prev); + ggml_set_name(in_h_prev, "mtp_in_h_prev"); + + // in_pos: absolute target position for this draft step's RoPE. + // Caller sets this to (cache.cur_pos + step_offset). + ggml_tensor * in_pos = ggml_new_tensor_1d(ctx, GGML_TYPE_I32, 1); + ggml_set_input(in_pos); + ggml_set_name(in_pos, "mtp_in_pos"); + + // ── 1. Token embedding from target (shared weight) ──────────────────────── + // Embedding is passed in pre-dequantised by the caller via in_tok_embd. + // This bypasses ggml_get_rows on a potentially quantised target.tok_embd + // (CUDA backend in this revision only supports F16/F32/Q8_0 for get_rows; + // Q4_K targets would abort at compute time). + ggml_tensor * tok_e = in_tok_embd; + ggml_set_name(tok_e, "mtp_tok_embd"); + + // Gemma4 scales token embeddings by sqrt(n_embd_backbone) at input pipeline + const float tok_scale = std::sqrt((float)n_embd_backbone); + tok_e = ggml_scale(ctx, tok_e, tok_scale); + ggml_set_name(tok_e, "mtp_tok_embd_scaled"); + + // ── 2. Concat [tok_e, h_prev] and project to n_embd ────────────────────── + // Both are [n_embd_backbone, 1]; concat on axis 0 → [2*n_embd_backbone, 1] + ggml_tensor * inp_cat = ggml_concat(ctx, tok_e, in_h_prev, 0); + ggml_set_name(inp_cat, "mtp_concat"); + + // pre_projection: [2*n_embd_backbone, n_embd] (ggml ne[0]=2*n_bb, ne[1]=n_embd) + // mul_mat(A, x): A->ne[0] must == x->ne[0]; output ne[0]=A->ne[1] + ggml_tensor * inpL = ggml_mul_mat(ctx, w.pre_projection, inp_cat); + ggml_set_name(inpL, "mtp_pre_proj_out"); + + // ── 3. Transformer blocks ───────────────────────────────────────────────── + // Single FA mask shared across every layer that needs one. First need-mask + // layer creates the input tensor; later layers reuse it. We require every + // need-mask layer to want the same (width, kv_seq_len) — short contexts + // satisfy this because SWA cap >= attn_pos. Divergence in long contexts + // trips an error and the builder must be extended to per-layer masks. 
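+ // Caller-side mask fill sketch (hypothetical driver snippet; `mask_host`
+ // and the surrounding names are illustrative, not this file's API):
+ //
+ // std::vector<ggml_fp16_t> mask_host(fa_mask_width);
+ // for (int64_t i = 0; i < fa_mask_width; ++i) {
+ // mask_host[i] = ggml_fp32_to_fp16(i < kv_seq_len ? 0.0f : -INFINITY);
+ // }
+ // ggml_backend_tensor_set(shared_fa_mask, mask_host.data(), 0,
+ // ggml_nbytes(shared_fa_mask));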
+ ggml_tensor * shared_fa_mask = nullptr; + int64_t shared_fa_mask_width = 0; + int64_t shared_fa_mask_kv_seq_len = 0; + for (int il = 0; il < n_layer; ++il) { + const MtpLayerWeights & L = w.layers[il]; + const bool is_swa = L.is_swa; + + // Resolve donor KV slot + const int32_t donor_il = L.donor_target_layer; + if (donor_il < 0 || donor_il >= (int)target_cache.layer_to_kv_idx.size()) { + set_last_error("build_mtp_step_graph: invalid donor_target_layer"); + ggml_free(ctx); + return false; + } + const int kv_slot = target_cache.layer_to_kv_idx[donor_il]; + const int kv_read_slot = (kv_slot >= 0) ? kv_slot + : ((donor_il < (int)target_cache.layer_to_donor_kv.size()) + ? target_cache.layer_to_donor_kv[donor_il] : -1); + if (kv_read_slot < 0 || kv_read_slot >= (int)target_cache.attn_k.size()) { + char buf[128]; + std::snprintf(buf, sizeof(buf), + "build_mtp_step_graph: donor KV slot unresolvable for MTP layer %d", il); + set_last_error(buf); + ggml_free(ctx); + return false; + } + ggml_tensor * cache_k = target_cache.attn_k[kv_read_slot]; + ggml_tensor * cache_v = target_cache.attn_v[kv_read_slot]; + if (!cache_k || !cache_v) { + char buf[128]; + std::snprintf(buf, sizeof(buf), + "build_mtp_step_graph: null KV cache for MTP layer %d donor slot %d", il, kv_read_slot); + set_last_error(buf); + ggml_free(ctx); + return false; + } + + // KV cache layout: [head_dim_kv, max_ctx, n_head_kv] + const int64_t head_dim_kv = cache_k->ne[0]; + const int64_t n_head_kv = cache_k->ne[2]; + // Q dimensions: derive from wq output size and attn_q_norm shape. + // wq: [n_embd, q_out_dim] where q_out_dim = n_head_norm * head_dim_norm + // attn_q_norm:[head_dim_norm] per-head norm weight from the MTP model's own hparams + // + // head_dim_norm may differ from head_dim_kv (the target KV cache head_dim). + // Dense 31B example: MTP trained with head_dim_norm=256, target K stored at 128. + // For flash_attn Q @ K^T to succeed, Q.ne[0] must equal K.ne[0]. + // Fix: norm and RoPE run at head_dim_norm; before FA, reshape Q to [head_dim_kv, ...] + // so the dot-product dimension matches K. q_out_dim is preserved throughout. + const int64_t q_out_dim = L.wq->ne[1]; + const int64_t head_dim_norm = L.attn_q_norm->ne[0]; // MTP model's per-head norm dim + const int64_t n_head_norm = q_out_dim / head_dim_norm; + // FA head_dim must match target K; use head_dim_kv (from cache_k->ne[0]). + const int64_t head_dim_fa = head_dim_kv; + const int64_t n_head_fa = q_out_dim / head_dim_fa; + + // a) RMSNorm + ggml_tensor * cur = mtp_rms_norm_mul(ctx, inpL, L.attn_norm); + { + char name[64]; std::snprintf(name, sizeof(name), "mtp_attn_norm_%d", il); + ggml_set_name(cur, name); + } + + // b) Q projection: [n_embd, 1] → [q_out_dim, 1], reshape to [head_dim_norm, n_head_norm, 1] + ggml_tensor * Qcur = ggml_mul_mat(ctx, L.wq, cur); + Qcur = ggml_reshape_3d(ctx, Qcur, head_dim_norm, n_head_norm, 1); + { + char name[64]; std::snprintf(name, sizeof(name), "mtp_Qcur_%d", il); + ggml_set_name(Qcur, name); + } + + // c) Q-norm: per-head RMSNorm at head_dim_norm (attn_q_norm shape: [head_dim_norm]) + Qcur = mtp_rms_norm_mul(ctx, Qcur, L.attn_q_norm); + { + char name[64]; std::snprintf(name, sizeof(name), "mtp_Qcur_normed_%d", il); + ggml_set_name(Qcur, name); + } + + // d) RoPE on Q at head_dim_norm + // Use the target's rope_theta (SWA layers) or the full-attn layer's rope_freqs. + // For MTP cross-attention: SWA layers use rope_theta_swa, full layers use rope_theta + // (with per-layer freq_factors from the donor layer). 
+
+ // We use the target's SWA/full rope parameters mirroring atomicbot.
+ ggml_tensor * rope_freq_factors = nullptr;
+ float rope_theta_val = target.rope_theta_swa;
+ if (!is_swa) {
+ rope_theta_val = target.rope_theta;
+ // For full-attention MTP layers: prefer assistant's OWN rope_freqs
+ // (top-level "rope_freqs.weight" in assistant GGUF — the assistant
+ // was trained with its own per-dim freq factors). Fall back to
+ // target's per-layer rope_freqs only if the assistant didn't ship
+ // one (legacy GGUFs).
+ if (w.rope_freqs) {
+ rope_freq_factors = w.rope_freqs;
+ } else if (donor_il >= 0 && donor_il < (int)target.layers.size()) {
+ rope_freq_factors = target.layers[donor_il].rope_freqs;
+ }
+ }
+ Qcur = ggml_rope_ext(ctx, Qcur, in_pos,
+ rope_freq_factors,
+ (int)head_dim_norm, GGML_ROPE_TYPE_NEOX,
+ /*n_ctx_orig=*/0,
+ rope_theta_val, /*freq_scale=*/1.0f,
+ /*ext_factor=*/0.0f, /*attn_factor=*/1.0f,
+ /*beta_fast=*/0.0f, /*beta_slow=*/0.0f);
+ {
+ char name[64]; std::snprintf(name, sizeof(name), "mtp_Qcur_pos_%d", il);
+ ggml_set_name(Qcur, name);
+ }
+
+ // e) Cross-attention via ggml_flash_attn_ext (reads donor K/V from the
+ // target cache; kernel-selection details at the FA call below).
+ //
+ // Make Qcur contiguous before reshape — ggml_rope_ext returns a non-contiguous
+ // view; ggml_reshape_3d requires a contiguous source.
+ Qcur = ggml_cont(ctx, Qcur);
+ // Reshape Q from [head_dim_norm, n_head_norm, 1] to [head_dim_fa, n_head_fa, 1]
+ // so Q.ne[0] == K.ne[0] == head_dim_kv.
+ // When head_dim_norm == head_dim_fa this is a no-op reshape.
+ Qcur = ggml_reshape_3d(ctx, Qcur, head_dim_fa, n_head_fa, 1);
+
+ // K/V view from the target KV cache.
+ // Full-attention donors read [0, attn_pos). SWA donors use a ring buffer:
+ // slice only the keys admitted by atomicbot's STANDARD SWA mask for an MTP
+ // query at pos=attn_pos, then the remaining mask is an all-zero bias.
+ int64_t kv_seq_len = (int64_t)attn_pos;
+ int64_t kv_start_slot = 0;
+ bool kv_wraps = false;
+ int64_t kv_first_len = 0;
+ if (is_swa) {
+ const int64_t ring_len = std::min(cache_k->ne[1], cache_v->ne[1]);
+ // std::max needs both operands as int64_t (a bare int 0 would not compile).
+ const int64_t swa_prev = target.swa_window > 0
+ ? std::max<int64_t>((int64_t)target.swa_window - 1, 0) : ring_len;
+ kv_seq_len = std::min((int64_t)attn_pos, std::min(swa_prev, ring_len));
+ if (kv_seq_len > 0) {
+ const int64_t first_abs = (int64_t)attn_pos - kv_seq_len;
+ kv_start_slot = first_abs % ring_len;
+ const int64_t kv_end_slot = kv_start_slot + kv_seq_len;
+ kv_wraps = kv_end_slot > ring_len;
+ kv_first_len = kv_wraps ? (ring_len - kv_start_slot) : kv_seq_len;
+ }
+ } else if ((int64_t)attn_pos > cache_k->ne[1] || (int64_t)attn_pos > cache_v->ne[1]) {
+ char buf[256];
+ std::snprintf(buf, sizeof(buf),
+ "build_mtp_step_graph: attn_pos %d exceeds donor KV cache length (K=%lld V=%lld) for MTP layer %d",
+ attn_pos, (long long)cache_k->ne[1], (long long)cache_v->ne[1], il);
+ set_last_error(buf);
+ ggml_free(ctx);
+ return false;
+ }
+ // Pad to 1 minimum to avoid zero-size tensors when attn_pos==0.
+ const int64_t kv_view_len = std::max(kv_seq_len, (int64_t)1);
+
+ // For head_dim==512 with any K type, ggml_flash_attn_ext requires
+ // K->ne[1] % 256 == 0 for gqa_opt_applies to be true (and returns
+ // BEST_FATTN_KERNEL_NONE otherwise). Pad the K/V view to the next 256
+ // multiple; the padding rows contain stale cache data but are masked
+ // out by the caller-provided fa_mask with -inf bias on those positions. 
+
+ // This only applies to the non-wrap path (head_dim=512 layers are full-attn
+ // with monotone KV so no wrap occurs).
+ // FATTN_KQ_STRIDE alignment: TQ3_0 K is stored in blocks along ne[1] and
+ // the FA kernels (chunked + vec) iterate KV in 256-position groups; an
+ // unaligned ne[1] reads past the valid window into stale cache cells. We
+ // pad the view to 256 and exclude the tail with a -inf mask.
+ // This matches gemma4_target_graph.cpp:352-355's `need_256_pad` policy.
+ const bool kv_cache_is_tq3 =
+ (cache_k->type == GGML_TYPE_TQ3_0 || cache_v->type == GGML_TYPE_TQ3_0);
+ if (kv_wraps && kv_cache_is_tq3) {
+ char buf[256];
+ std::snprintf(buf, sizeof(buf),
+ "build_mtp_step_graph: refusing wrapped TQ3 donor attention for MTP layer %d donor=%d; force donor KV to Q8_0",
+ il, donor_il);
+ set_last_error(buf);
+ ggml_free(ctx);
+ return false;
+ }
+ const bool needs_kv_pad = (kv_cache_is_tq3 || head_dim_fa >= 512)
+ && !kv_wraps && (kv_view_len % 256 != 0);
+ const int64_t kv_view_len_padded = needs_kv_pad
+ ? ((kv_view_len + 255) / 256) * 256
+ : kv_view_len;
+
+ auto view_kv = [&](ggml_tensor * cache, int64_t start, int64_t len) {
+ return ggml_view_3d(ctx, cache,
+ head_dim_kv, len, n_head_kv,
+ cache->nb[1], cache->nb[2],
+ cache->nb[1] * (size_t)start);
+ };
+
+ ggml_tensor * Kview = nullptr;
+ ggml_tensor * Vview = nullptr;
+ if (kv_wraps) {
+ // ggml_concat on CUDA requires F32 src. Direct TQ3_0→F32 is unsupported
+ // by cpy.cu (it only does TQ3_0→F16 and F16↔F32). So go via F16 first
+ // when the cache is TQ3, else cast directly.
+ auto to_f32 = [&](ggml_tensor * v) {
+ if (v->type == GGML_TYPE_TQ3_0) {
+ v = ggml_cast(ctx, v, GGML_TYPE_F16);
+ }
+ if (v->type != GGML_TYPE_F32) {
+ v = ggml_cast(ctx, v, GGML_TYPE_F32);
+ }
+ return v;
+ };
+ const int64_t kv_second_len = kv_view_len - kv_first_len;
+ ggml_tensor * k1 = to_f32(view_kv(cache_k, kv_start_slot, kv_first_len));
+ ggml_tensor * k2 = to_f32(view_kv(cache_k, 0, kv_second_len));
+ ggml_tensor * v1 = to_f32(view_kv(cache_v, kv_start_slot, kv_first_len));
+ ggml_tensor * v2 = to_f32(view_kv(cache_v, 0, kv_second_len));
+ Kview = ggml_concat(ctx, k1, k2, 1);
+ Vview = ggml_concat(ctx, v1, v2, 1);
+ } else {
+ // Use padded length for the K/V view when required.
+ Kview = view_kv(cache_k, kv_start_slot, kv_view_len_padded);
+ Vview = view_kv(cache_v, kv_start_slot, kv_view_len_padded);
+ }
+ {
+ char name[64]; std::snprintf(name, sizeof(name), "mtp_Kview_%d", il);
+ ggml_set_name(Kview, name);
+ std::snprintf(name, sizeof(name), "mtp_Vview_%d", il);
+ ggml_set_name(Vview, name);
+ }
+
+ // Detect if K/V is in TQ3_0 (FWHT-domain). Graph-level FWHT keeps the
+ // FA backends on a single contract: pre-rotate Q for TQ3 K, inverse-
+ // rotate output for TQ3 V, and pass the native K/V views into FA.
+ const bool k_is_tq3 = (Kview->type == GGML_TYPE_TQ3_0);
+ const bool v_is_tq3 = (Vview->type == GGML_TYPE_TQ3_0);
+ const bool kv_is_tq3 = k_is_tq3 || v_is_tq3;
+
+ // Cross-attention via ggml_flash_attn_ext.
+ //
+ // Layout for ggml_flash_attn_ext:
+ // Q: [head_dim, n_tokens=1, n_head_q]
+ // K: [head_dim, kv_len, n_head_kv] (GQA: n_head_q % n_head_kv == 0)
+ // V: [head_dim, kv_len, n_head_kv]
+ // output: [head_dim, n_tokens=1, n_head_q] (reshaped to [q_out_dim, 1])
+ //
+ // Benefits over manual matmul attention:
+ // - Handles GQA directly without broadcasting K/V.
+ // - Graph-level FWHT correction keeps TQ3 K/V in their native cache domain. 
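+ //
+ // Worked example: attn_pos=1000 on a full-attention TQ3_0 donor (no wrap)
+ // gives kv_view_len=1000 → kv_view_len_padded=1024; the mask admits slots
+ // [0,1000) and carries -inf over the stale tail [1000,1024).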
+
+ // For TQ3_0 + head_dim > 256 + n_tokens=1 (decode), the CUDA dispatch
+ // requires a non-null mask to select the CHUNKED kernel path. We create
+ // an all-zero (fully-admitted) mask in that case.
+ //
+ // Permute Q from [head_dim_fa, n_head_fa, 1] → [head_dim_fa, 1, n_head_fa]
+ // so it matches the FA expected layout.
+ // ggml_turbo_wht's CUDA kernel writes dst using src strides
+ // (turbo-wht.cu:20-21); non-contiguous input scatters writes and
+ // corrupts Q. Always make Q contiguous BEFORE rotating.
+ ggml_tensor * Qfa = ggml_cont(ctx, ggml_permute(ctx, Qcur, 0, 2, 1, 3));
+ if (k_is_tq3) {
+ Qfa = ggml_turbo_wht(ctx, Qfa, 0);
+ }
+ {
+ char name[64]; std::snprintf(name, sizeof(name), "mtp_Qfa_%d", il);
+ ggml_set_name(Qfa, name);
+ }
+
+ // K/V for FA: pass the original Kview/Vview (TQ3_0, Q8_0, or concat-F32)
+ // directly to ggml_flash_attn_ext. Graph-level FWHT correction above/below
+ // accounts for TQ3_0 K/V without stripping the tensor type tag.
+ // For the wrap case (kv_wraps=true), Kview is already F32 (from to_f32 + concat).
+ ggml_tensor * Kfa = Kview; // original type (TQ3_0, Q8_0, or concat-F32)
+ ggml_tensor * Vfa = Vview;
+ {
+ char name[64]; std::snprintf(name, sizeof(name), "mtp_Kfa_%d", il);
+ ggml_set_name(Kfa, name);
+ std::snprintf(name, sizeof(name), "mtp_Vfa_%d", il);
+ ggml_set_name(Vfa, name);
+ }
+ {
+ // Log per-layer FA types only on the first build of the process.
+ // (need_mask printed here mirrors the condition computed below.)
+ static bool g_mtp_fa_types_done = false;
+ if (!g_mtp_fa_types_done) {
+ std::printf("[mtp-fa-types] layer %d: Qfa=%s Kfa=%s Vfa=%s "
+ "head_dim_fa=%lld kv_is_tq3=%d need_mask=%d\n",
+ il, ggml_type_name(Qfa->type), ggml_type_name(Kfa->type),
+ ggml_type_name(Vfa->type), (long long)head_dim_fa,
+ (int)kv_is_tq3, (int)(head_dim_fa >= 512 || needs_kv_pad));
+ if (il == n_layer - 1) g_mtp_fa_types_done = true;
+ }
+ }
+
+ // For head_dim==512 (any K type): the MMA dispatcher requires
+ // gqa_opt_applies, which requires BOTH K->ne[1] % 256 == 0 AND
+ // mask != nullptr. Without mask, BEST_FATTN_KERNEL_NONE → abort
+ // even when K is properly aligned. Always provide the mask.
+ // We padded K/V to kv_view_len_padded above when needs_kv_pad is true;
+ // when not padding, mask width == kv_view_len (all positions admitted).
+ // The caller fills: positions [0..kv_seq_len-1] = 0.0 (admit),
+ // positions [kv_seq_len..mask_width-1] = -inf (exclude padding).
+ //
+ // For head_dim==256 (SWA) with TQ3_0 K (non-wrap): VEC kernel handles it
+ // without mask UNLESS needs_kv_pad triggers (KV unaligned); then mask is
+ // needed to exclude the padding tail.
+ // For wrap case (F32 K/V after concat): no TQ3_0 issues, no mask needed.
+ const bool need_mask = head_dim_fa >= 512 || needs_kv_pad;
+ const int64_t fa_mask_width = (needs_kv_pad ? kv_view_len_padded : kv_view_len);
+ ggml_tensor * fa_mask = nullptr;
+ if (need_mask) {
+ if (shared_fa_mask == nullptr) {
+ shared_fa_mask = ggml_new_tensor_2d(ctx, GGML_TYPE_F16, fa_mask_width, 1);
+ ggml_set_name(shared_fa_mask, "mtp_fa_mask");
+ ggml_set_input(shared_fa_mask);
+ shared_fa_mask_width = fa_mask_width;
+ shared_fa_mask_kv_seq_len = kv_view_len;
+ } else if (shared_fa_mask_width != fa_mask_width
+ || shared_fa_mask_kv_seq_len != kv_view_len) {
+ char buf[256];
+ std::snprintf(buf, sizeof(buf),
+ "build_mtp_step_graph: per-layer FA masks diverge "
+ "(layer %d wants width=%lld kv_seq=%lld; existing %lld/%lld). 
" + "Long-context SWA cap mismatch — extend builder to per-layer masks.", + il, (long long)fa_mask_width, (long long)kv_view_len, + (long long)shared_fa_mask_width, (long long)shared_fa_mask_kv_seq_len); + set_last_error(buf); + ggml_free(ctx); + return false; + } + fa_mask = shared_fa_mask; + } + + // Gemma4 MTP: f_attention_scale = 1.0 (no pre-softmax scaling). + ggml_tensor * attn_out = ggml_flash_attn_ext(ctx, Qfa, Kfa, Vfa, fa_mask, + 1.0f, 0.0f, 0.0f); + if (v_is_tq3) { + attn_out = ggml_cont(ctx, attn_out); + attn_out = ggml_turbo_wht(ctx, attn_out, 1); + } + { + char name[64]; std::snprintf(name, sizeof(name), "mtp_fa_out_%d", il); + ggml_set_name(attn_out, name); + } + + // FA output: [head_dim_fa, 1, n_head_fa]. Flatten to [q_out_dim, 1]. + // Flatten heads: [head_dim_fa, 1, n_head_fa] → [q_out_dim, 1] + ggml_tensor * attn = ggml_cont(ctx, attn_out); + { + char name[64]; std::snprintf(name, sizeof(name), "mtp_attn_out_%d", il); + ggml_set_name(attn, name); + } + + // Reshape: [q_out_dim, 1] then output projection + // head_dim_fa * n_head_fa == q_out_dim == head_dim_norm * n_head_norm + attn = ggml_reshape_2d(ctx, attn, q_out_dim, 1); + cur = ggml_mul_mat(ctx, L.wo, attn); + { + char name[64]; std::snprintf(name, sizeof(name), "mtp_attn_proj_%d", il); + ggml_set_name(cur, name); + } + + // f) Post-attention norm + cur = mtp_rms_norm_mul(ctx, cur, L.attn_post_norm); + { + char name[64]; std::snprintf(name, sizeof(name), "mtp_attn_post_norm_%d", il); + ggml_set_name(cur, name); + } + + // g) Attention residual + ggml_tensor * attn_residual = ggml_add(ctx, cur, inpL); + { + char name[64]; std::snprintf(name, sizeof(name), "mtp_attn_residual_%d", il); + ggml_set_name(attn_residual, name); + } + + // h) FFN norm + ggml_tensor * ffn_in = mtp_rms_norm_mul(ctx, attn_residual, L.ffn_norm); + { + char name[64]; std::snprintf(name, sizeof(name), "mtp_ffn_norm_%d", il); + ggml_set_name(ffn_in, name); + } + + // i) GELU FFN + ggml_tensor * ffn_out = mtp_gelu_ffn(ctx, ffn_in, L); + { + char name[64]; std::snprintf(name, sizeof(name), "mtp_ffn_out_%d", il); + ggml_set_name(ffn_out, name); + } + + // j) Post-FFN norm + ffn_out = mtp_rms_norm_mul(ctx, ffn_out, L.ffn_post_norm); + { + char name[64]; std::snprintf(name, sizeof(name), "mtp_ffn_post_norm_%d", il); + ggml_set_name(ffn_out, name); + } + + // k) FFN residual + cur = ggml_add(ctx, ffn_out, attn_residual); + + // l) Optional per-layer output scale + if (L.out_scale) { + cur = ggml_mul(ctx, cur, L.out_scale); + { + char name[64]; std::snprintf(name, sizeof(name), "mtp_out_scaled_%d", il); + ggml_set_name(cur, name); + } + } + + inpL = cur; + } + + // ── 4. Output norm ──────────────────────────────────────────────────────── + ggml_tensor * h_inner = mtp_rms_norm_mul(ctx, inpL, w.output_norm); + ggml_set_name(h_inner, "mtp_result_norm"); + + // ── 5. Post-projection → h_post (next h_prev) ───────────────────────────── + // post_projection: [n_embd, n_embd_backbone] (ggml ne[0]=n_embd, ne[1]=n_embd_backbone) + ggml_tensor * h_post = ggml_mul_mat(ctx, w.post_projection, h_inner); + ggml_set_name(h_post, "mtp_post_proj_out"); + + // ── 6. LM head ──────────────────────────────────────────────────────────── + ggml_tensor * logits = nullptr; + + // Determine whether tok_embd supports ggml_get_rows on CUDA. + // This backend (custom llama.cpp fork) only supports F32/F16/BF16/Q4_0/Q4_1/ + // Q5_0/Q5_1/Q8_0/TQ3_0 for get_rows; K-quant types (Q4_K, Q5_K, Q6_K) are not. 
+ // When tok_embd is a K-quant, the centroid sparse path can't use get_rows; + // fall back to dense mul_mat for logit computation instead. + const bool tok_embd_get_rows_ok = + (w.tok_embd && + (w.tok_embd->type == GGML_TYPE_F32 || + w.tok_embd->type == GGML_TYPE_F16 || + w.tok_embd->type == GGML_TYPE_BF16 || + w.tok_embd->type == GGML_TYPE_Q4_0 || + w.tok_embd->type == GGML_TYPE_Q4_1 || + w.tok_embd->type == GGML_TYPE_Q5_0 || + w.tok_embd->type == GGML_TYPE_Q5_1 || + w.tok_embd->type == GGML_TYPE_Q8_0)); + + if (w.use_ordered_embeddings && w.centroids && w.n_centroids > 0 && tok_embd_get_rows_ok) { + // Centroid-routed LM head (matches atomicbot lines 190-235). + // All mul_mat ops use h_inner [n_embd, 1] (MTP's own hidden space, n_embd=1024). + // The embedding source is the MTP model's own tok_embd [n_embd, n_vocab] (w.tok_embd), + // NOT the target's tok_embd (which is in backbone space and used only in step 1). + if (!w.tok_embd) { + set_last_error("build_mtp_step_graph: use_ordered_embeddings=true but w.tok_embd is null (token_embd.weight missing from GGUF)"); + ggml_free(ctx); + return false; + } + + const int64_t n_c = (int64_t)w.n_centroids; + const int64_t top_k = (int64_t)w.centroid_top_k; + // Validate centroid-head shape and index invariants before any arithmetic. + GGML_ASSERT(n_vocab > 0 && "centroid LM head: n_vocab must be > 0"); + GGML_ASSERT(n_c > 0 && "centroid LM head: n_centroids must be > 0"); + GGML_ASSERT(n_vocab % n_c == 0 + && "centroid LM head: n_vocab must be divisible by n_centroids"); + GGML_ASSERT(top_k > 0 && top_k <= n_c + && "centroid LM head: top_k must be in [1, n_centroids]"); + // vsc: tokens per centroid slot + const int64_t vsc = (int64_t)n_vocab / n_c; + + // centroid_logits = mul_mat(centroids, h_inner) → [n_centroids, 1] + // centroids: [n_embd, n_centroids] (ne[0]=n_embd, ne[1]=n_centroids) + ggml_tensor * centroid_logits = ggml_mul_mat(ctx, w.centroids, h_inner); + ggml_set_name(centroid_logits, "mtp_centroid_logits"); + + // top-k centroid indices + ggml_tensor * topk_idx = ggml_top_k(ctx, centroid_logits, (int)top_k); + ggml_set_name(topk_idx, "mtp_centroid_topk_idx"); + + // View token_ordering as [vsc, n_centroids] (I32) + const size_t ordering_row_bytes = ggml_row_size(GGML_TYPE_I32, vsc); + ggml_tensor * ordering = ggml_view_2d(ctx, w.token_ordering, + vsc, n_c, ordering_row_bytes, /*offset=*/0); + ggml_set_name(ordering, "mtp_token_ordering_view"); + + // Gather candidate token ids for top-k centroids: [vsc, top_k, 1] + ggml_tensor * sel_ids = ggml_get_rows(ctx, ordering, topk_idx); + ggml_set_name(sel_ids, "mtp_selected_token_ids"); + + // Flatten to 1D for embedding lookup + const int64_t n_sel = top_k * vsc; + ggml_tensor * flat_ids = ggml_reshape_1d(ctx, sel_ids, n_sel); + ggml_set_name(flat_ids, "mtp_selected_token_ids_flat"); + + // Gather embeddings for selected tokens from MTP's own tok_embd [n_embd, n_vocab]. 
+ // get_rows selects n_sel rows → [n_embd, n_sel] + ggml_tensor * sel_emb = ggml_get_rows(ctx, w.tok_embd, flat_ids); + ggml_set_name(sel_emb, "mtp_selected_embd"); + + // Sparse logits: mul_mat(sel_emb, h_inner): + // sel_emb [n_embd, n_sel], h_inner [n_embd, 1] → [n_sel, 1] + ggml_tensor * sel_logits = ggml_mul_mat(ctx, sel_emb, h_inner); + ggml_set_name(sel_logits, "mtp_selected_logits"); + ggml_tensor * sel_logits_f32 = ggml_cast(ctx, sel_logits, GGML_TYPE_F32); + ggml_set_name(sel_logits_f32, "mtp_selected_logits_f32"); + + // Build full vocab row pre-filled with -1e30 + ggml_tensor * logits_full = ggml_new_tensor_2d(ctx, GGML_TYPE_F32, n_vocab, 1); + logits_full = ggml_fill_inplace(ctx, logits_full, -1e30f); + ggml_set_name(logits_full, "mtp_logits_masked_base"); + + // Scatter selected logits into full row + ggml_tensor * scatter_dst = ggml_cont_2d(ctx, logits_full, 1, (int64_t)n_vocab); + ggml_tensor * scatter_src = ggml_cont_2d(ctx, sel_logits_f32, 1, n_sel); + logits = ggml_set_rows(ctx, scatter_dst, scatter_src, flat_ids); + logits = ggml_reshape_2d(ctx, logits, n_vocab, 1); + ggml_set_name(logits, "mtp_logits_full"); + } else if (w.use_ordered_embeddings && w.tok_embd) { + // Dense fallback for ordered-embeddings models when tok_embd type does not + // support CUDA get_rows (e.g. K-quants in this llama.cpp fork). + // mul_mat supports Q4_K/Q5_K/Q6_K on CUDA; produces exact logits + // (not the centroid approximation) which is fine for greedy/low-temp decoding. + logits = ggml_mul_mat(ctx, w.tok_embd, h_inner); + ggml_set_name(logits, "mtp_logits_dense_fallback"); + } else { + // Dense tied LM head: mul_mat(tok_embd, h_post) → [n_vocab, 1] + // For non-ordered-embeddings models (n_embd == n_embd_backbone), use h_post + // (post-projected to n_embd_backbone) so dimensions match target.tok_embd. + // Prefer w.tok_embd (MTP's own, in n_embd space) if available, else + // fall back to target.tok_embd (in n_embd_backbone space) with h_post. + if (w.tok_embd) { + // MTP has its own tied LM head in n_embd space + logits = ggml_mul_mat(ctx, w.tok_embd, h_inner); + } else { + // Fallback: use target's tok_embd against the backbone-projected hidden + logits = ggml_mul_mat(ctx, target.tok_embd, h_post); + } + ggml_set_name(logits, "mtp_logits_dense"); + } + + // Optional logit softcapping (matches target's softcap=30) + if (target.logit_softcap > 0.0f) { + logits = ggml_scale(ctx, logits, 1.0f / target.logit_softcap); + logits = ggml_tanh(ctx, logits); + logits = ggml_scale(ctx, logits, target.logit_softcap); + ggml_set_name(logits, "mtp_logits_softcapped"); + } + + // ── 7. In-graph argmax ───────────────────────────────────────────────────── + ggml_tensor * argmax = ggml_argmax(ctx, logits); + ggml_set_name(argmax, "mtp_argmax"); + + // Expand all outputs into the graph + ggml_build_forward_expand(gf, argmax); + ggml_build_forward_expand(gf, h_post); + // Note: logits is already in argmax's DAG, but mark it as output for diagnostic reads. 
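+    // Scale note on the centroid head above (illustrative numbers, not read
+    // from the GGUF): with n_vocab=262144, a hypothetical n_centroids=2048 and
+    // top_k=16 give vsc=128 and n_sel=2048, so the sparse head scores 2048
+    // candidate tokens instead of all 262144 (roughly 128x less lm_head work).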
+ ggml_set_output(logits); + ggml_set_output(h_post); + ggml_set_output(argmax); + + // ── Populate output struct ──────────────────────────────────────────────── + out.ctx = ctx; + out.gf = gf; + out.in_tok = in_tok; + out.in_tok_embd = in_tok_embd; + out.in_h_prev = in_h_prev; + out.in_pos = in_pos; + out.fa_mask = shared_fa_mask; + out.fa_mask_kv_seq_len = shared_fa_mask_kv_seq_len; + out.out_logits = logits; + out.out_h_post = h_post; + out.out_argmax = argmax; + + return true; +} + +void free_mtp_step_graph(MtpStepGraph & g) { + if (g.ctx) { + ggml_free(g.ctx); + g.ctx = nullptr; + } + g.gf = nullptr; + g.in_tok = nullptr; + g.in_tok_embd = nullptr; + g.in_h_prev = nullptr; + g.in_pos = nullptr; + g.fa_mask = nullptr; + g.fa_mask_kv_seq_len = 0; + g.out_logits = nullptr; + g.out_h_post = nullptr; + g.out_argmax = nullptr; +} + +} // namespace dflash27b diff --git a/dflash/src/gemma4_target_graph.cpp b/dflash/src/gemma4_target_graph.cpp new file mode 100644 index 00000000..dab1a341 --- /dev/null +++ b/dflash/src/gemma4_target_graph.cpp @@ -0,0 +1,1151 @@ +// Forward pass of Gemma4 (pure attention) in pure ggml. +// +// Supports both Gemma4-31B (dense, 60 layers) and Gemma4-26B-A4B (MoE, 30 layers). +// All model dimensions are read from GGUF at load time via GemmaTargetWeights. +// No llama.cpp runtime is linked — only ggml ops. +// +// Architecture highlights: +// - ALL layers are attention (no DeltaNet/SSM) — simpler than Qwen3.5 hybrid +// - Two layer types interleaved per swa_layers[]: +// SWA (sliding window): standard RoPE (rope_theta_swa), windowed FA +// Full (global): proportional RoPE via per-layer rope_freqs, full FA +// - Attention scale = 1.0 (self.scaling = 1.0, not 1/sqrt(head_dim)) +// - Logit softcapping: output = softcap * tanh(output / softcap), softcap=30 +// - Per-Layer Embeddings (PLE): gated embedding added to residual each layer +// - Shared KV cache: some layers reuse an earlier layer's KV slot +// - MoE FFN (26B-A4B): shared_expert + routed experts (top-K) +// +// State (persisted in GemmaTargetCache across calls): +// - attn_k, attn_v : KV cache for non-shared KV layers +// - layer_to_kv_idx : maps layer index -> KV slot index (-1 = shared) +// - layer_to_donor_kv: maps layer index -> donor slot for shared layers + +#include "internal.h" +#include "kv_quant.h" + +#include +#include +#include +#include +#include + +namespace dflash27b { + +// ─── File-local constants ──────────────────────────────────────────────────── + +static constexpr float EPS = GEMMA4_RMS_EPS; + +// ─── Helpers ───────────────────────────────────────────────────────────────── + +static ggml_tensor * rms_norm_mul(ggml_context * ctx, ggml_tensor * x, + ggml_tensor * weight, float eps) { + ggml_tensor * n = ggml_rms_norm(ctx, x, eps); + return ggml_mul(ctx, n, weight); +} + +// GeGLU FFN: w_down @ (gelu(w_gate @ x) * (w_up @ x)) +static ggml_tensor * build_geglu_ffn(ggml_context * ctx, + ggml_tensor * cur, + const GemmaTargetLayer & L) { + ggml_tensor * gate = ggml_mul_mat(ctx, L.w_gate, cur); + ggml_tensor * up = ggml_mul_mat(ctx, L.w_up, cur); + ggml_tensor * gu = ggml_geglu_split(ctx, gate, up); + return ggml_mul_mat(ctx, L.w_down, gu); +} + +// MoE FFN — shared expert + softmax-gated routed experts. 
+// Matches Gemma4-26B-A4B architecture: +// shared_out = w_down @ (gelu(w_gate @ x) * (w_up @ x)) +// shared_out = rms_norm(shared_out) * ffn_post_norm_1 +// router_in = rms_norm(inpSA) / sqrt(n_embd) * ffn_gate_inp_s (bare rms_norm) +// logits = ffn_gate_inp @ router_in [n_expert, n_tokens] +// probs = softmax(logits) +// top_ids = argsort_top_k(probs, n_expert_used) [n_expert_used, n_tokens] i32 +// weights = get_rows(probs, top_ids) [1, n_expert_used, n_tokens] +// weights = weights / sum(weights) (normalize to 1.0) +// gate_up_out = mul_mat_id(ffn_gate_up_exps, x, top_ids) → gelu+mul → weighted +// expert_out = mul_mat_id(ffn_down_exps, act, top_ids) [n_embd, n_expert_used, n_tokens] +// expert_out = sum over expert dim [n_embd, n_tokens] +// expert_out = rms_norm(expert_out) * ffn_post_norm_2 +// result = shared_out + expert_out +static ggml_tensor * build_moe_ffn(ggml_context * ctx, + ggml_cgraph * gf, + const GemmaTargetWeights & w, + const GemmaTargetLayer & L, + ggml_tensor * cur_shared_ffn, + ggml_tensor * cur_moe_ffn, + ggml_tensor * cur_for_router, + int n_tokens) { + const int n_embd = w.n_embd; + const int n_expert_used = w.n_expert_used; + const int n_expert = w.n_expert; + const int n_ff_exp = w.n_ff_exp; + + // ── Shared expert (always active) ────────────────────────────────────────── + ggml_tensor * shared_out = nullptr; + if (L.w_gate && L.w_up && L.w_down) { + ggml_tensor * sg = ggml_mul_mat(ctx, L.w_gate, cur_shared_ffn); + ggml_tensor * su = ggml_mul_mat(ctx, L.w_up, cur_shared_ffn); + ggml_tensor * sgu = ggml_geglu_split(ctx, sg, su); + shared_out = ggml_mul_mat(ctx, L.w_down, sgu); + if (L.ffn_post_norm_1) { + shared_out = rms_norm_mul(ctx, shared_out, L.ffn_post_norm_1, EPS); + } + } + + // ── Router ───────────────────────────────────────────────────────────────── + // router_in = rms_norm(inpSA) / sqrt(n_embd) * ffn_gate_inp_s (bare rms_norm, no weight) + ggml_tensor * router_in = ggml_rms_norm(ctx, cur_for_router, EPS); + router_in = ggml_scale(ctx, router_in, 1.0f / std::sqrt((float)n_embd)); + if (L.ffn_gate_inp_s) { + router_in = ggml_mul(ctx, router_in, L.ffn_gate_inp_s); + } + // logits: [n_expert, n_tokens] + ggml_tensor * logits = ggml_mul_mat(ctx, L.ffn_gate_inp, router_in); + + // Softmax gating + ggml_tensor * probs = ggml_soft_max(ctx, logits); // [n_expert, n_tokens] + + // Top-K selection — returns i32 index tensor [n_expert_used, n_tokens] + ggml_tensor * selected_experts = ggml_argsort_top_k(ctx, probs, n_expert_used); + + // Routing weights: gather probs at selected indices [1, n_expert_used, n_tokens] + ggml_tensor * probs_3d = ggml_reshape_3d(ctx, probs, 1, n_expert, n_tokens); + ggml_tensor * weights = ggml_get_rows(ctx, probs_3d, selected_experts); + // weights: [1, n_expert_used, n_tokens] → normalize to sum=1.0 + { + ggml_tensor * w2d = ggml_reshape_2d(ctx, weights, n_expert_used, n_tokens); + ggml_tensor * wsum = ggml_sum_rows(ctx, w2d); + wsum = ggml_clamp(ctx, wsum, 6.103515625e-5f, INFINITY); + w2d = ggml_div(ctx, w2d, wsum); + weights = ggml_reshape_3d(ctx, w2d, 1, n_expert_used, n_tokens); + } + + // ── Routed experts via ggml_mul_mat_id ───────────────────────────────────── + ggml_tensor * expert_out = nullptr; + if (L.ffn_gate_up_exps && L.ffn_down_exps) { + // cur_moe_ffn is [n_embd, n_tokens]; mul_mat_id expects [n_embd, 1, n_tokens] + ggml_tensor * x = ggml_reshape_3d(ctx, cur_moe_ffn, n_embd, 1, n_tokens); + + // Gate+up projection: ffn_gate_up_exps [2*n_ff_exp, n_embd, n_expert] + // Result: [2*n_ff_exp, n_expert_used, 
n_tokens] + ggml_tensor * gate_up = ggml_mul_mat_id(ctx, L.ffn_gate_up_exps, + x, selected_experts); + + const size_t elt = ggml_element_size(gate_up); + // gate half: first n_ff_exp rows + ggml_tensor * g_half = ggml_view_3d(ctx, gate_up, + n_ff_exp, n_expert_used, n_tokens, + (size_t)n_ff_exp * 2 * elt, + (size_t)n_ff_exp * 2 * n_expert_used * elt, + 0); + // up half: second n_ff_exp rows + ggml_tensor * u_half = ggml_view_3d(ctx, gate_up, + n_ff_exp, n_expert_used, n_tokens, + (size_t)n_ff_exp * 2 * elt, + (size_t)n_ff_exp * 2 * n_expert_used * elt, + (size_t)n_ff_exp * elt); + + // GeGLU activation (views are non-contiguous; ggml_gelu requires contiguous) + g_half = ggml_cont(ctx, g_half); + u_half = ggml_cont(ctx, u_half); + ggml_tensor * activated = ggml_mul(ctx, ggml_gelu(ctx, g_half), u_half); + + // Scale by routing weights [1, n_expert_used, n_tokens] + activated = ggml_mul(ctx, activated, weights); + + // Down projection: ffn_down_exps [n_embd, n_ff_exp, n_expert] + // activated: [n_ff_exp, n_expert_used, n_tokens] + ggml_tensor * down_out = ggml_mul_mat_id(ctx, L.ffn_down_exps, + activated, selected_experts); + // down_out: [n_embd, n_expert_used, n_tokens] + + // Optional down-projection scale (ffn_down_exps_s is a per-column scale) + if (L.ffn_down_exps_s) { + down_out = ggml_mul(ctx, down_out, L.ffn_down_exps_s); + } + + // Sum over n_expert_used to get [n_embd, n_tokens]. + // down_out: [n_embd, n_expert_used, n_tokens] + // Use the proven llama.cpp pattern: ggml_build_forward_expand the full + // tensor then sum slice views with ggml_add in a loop over n_expert_used. + ggml_build_forward_expand(gf, down_out); + expert_out = ggml_view_2d(ctx, down_out, + n_embd, n_tokens, + down_out->nb[2], + 0); + ggml_build_forward_expand(gf, expert_out); + for (int ei = 1; ei < n_expert_used; ++ei) { + ggml_tensor * slice = ggml_view_2d(ctx, down_out, + n_embd, n_tokens, + down_out->nb[2], + (size_t)ei * down_out->nb[1]); + ggml_build_forward_expand(gf, slice); + expert_out = ggml_add(ctx, expert_out, slice); + ggml_build_forward_expand(gf, expert_out); + } + + if (L.ffn_post_norm_2) { + expert_out = rms_norm_mul(ctx, expert_out, L.ffn_post_norm_2, EPS); + } + } + + // ── Combine shared + routed experts ──────────────────────────────────────── + if (shared_out && expert_out) { + return ggml_add(ctx, shared_out, expert_out); + } else if (shared_out) { + return shared_out; + } else if (expert_out) { + return expert_out; + } + // Fallback: should not happen with a correctly loaded MoE model + return cur_shared_ffn; +} + +// ─── SWA view geometry helper ──────────────────────────────────────────────── +// +// Compute the (abs_win_start, effective_win_len, ring_win_start) triple for a +// chunk at position kv_start with n_tokens query tokens, given swa_window and +// the ring-buffer size (swa_ctx_alloc). This is the single source of truth for +// the K/V view passed to FA and for the host-side causal mask. +SwaView compute_swa_view(int kv_start, int n_tokens, + int swa_window, int swa_ctx_alloc) +{ + SwaView v; + v.abs_win_start = (swa_window > 0 && kv_start > swa_window) + ? (kv_start - swa_window) : 0; + // K view is ALWAYS the full ring; the host-built mask handles the + // non-monotonic ring layout via abs_pos(slot) computation. + v.effective_win_len = swa_ctx_alloc; + v.ring_win_start = 0; + return v; +} + +// Sliding-Window Attention block. +// Uses standard RoPE (rope_theta_swa) and a windowed view of the KV cache. 
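+// Worked example (illustrative): swa_window=1024 and ring swa_ctx_alloc=2048
+// (2*window, see create_gemma4_cache). Decoding one token at kv_start=5000:
+//   abs_win_start = 5000 - 1024 = 3976
+//   K/V view      = full ring [0..2048)  (effective_win_len = 2048)
+//   write slot    = 5000 % 2048 = 904
+// Only slots whose abs_pos falls in [3976..5000] are admitted, and that
+// filtering is done by the host-built mask, not by the view geometry.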
+static ggml_tensor * build_swa_attn_block( + ggml_context * ctx, + ggml_cgraph * gf, + const GemmaTargetWeights & w, + const GemmaTargetLayer & L, + ggml_tensor * cur, + ggml_tensor * positions, + ggml_tensor * cache_k, + ggml_tensor * cache_v, + ggml_tensor * attn_mask, + int kv_start, + int n_tokens, + ggml_type kv_k_type, + ggml_type kv_v_type, + bool write_kv, + int il) +{ + // SWA layers use the SWA head_dim (may be smaller than full-attn head_dim) + const int head_dim = w.head_dim_swa; + const int n_head = w.n_head; + const int n_head_kv = (il >= 0 && il < (int)w.head_kv_per_layer.size()) + ? w.head_kv_per_layer[il] : w.n_head_kv; + const int q_dim = n_head * head_dim; + + // Q projection + ggml_tensor * Qcur = ggml_mul_mat(ctx, L.wq, cur); + Qcur = ggml_reshape_3d(ctx, Qcur, head_dim, n_head, n_tokens); + Qcur = rms_norm_mul(ctx, Qcur, L.q_norm, EPS); + + ggml_tensor * Kcur = nullptr; + ggml_tensor * Vcur = nullptr; + if (write_kv) { + Kcur = ggml_mul_mat(ctx, L.wk, cur); + Kcur = ggml_reshape_3d(ctx, Kcur, head_dim, n_head_kv, n_tokens); + + Vcur = ggml_mul_mat(ctx, L.wv, cur); + Vcur = ggml_reshape_3d(ctx, Vcur, head_dim, n_head_kv, n_tokens); + + if (L.k_norm) { + Kcur = rms_norm_mul(ctx, Kcur, L.k_norm, EPS); + } + Vcur = ggml_rms_norm(ctx, Vcur, EPS); + } + + // Standard RoPE (SWA uses rope_theta_swa, no freq_factors) + Qcur = ggml_rope_ext(ctx, Qcur, positions, /*freq_factors=*/nullptr, + head_dim, GGML_ROPE_TYPE_NEOX, /*n_ctx_orig=*/0, + w.rope_theta_swa, /*freq_scale=*/1.0f, + /*ext_factor=*/0.0f, /*attn_factor=*/1.0f, + /*beta_fast=*/0.0f, /*beta_slow=*/0.0f); + if (Kcur) { + Kcur = ggml_rope_ext(ctx, Kcur, positions, nullptr, + head_dim, GGML_ROPE_TYPE_NEOX, 0, + w.rope_theta_swa, 1.0f, + 0.0f, 1.0f, 0.0f, 0.0f); + } + + // SWA ring-buffer: derive the ring size from the tensor's actual slot count. + // When swa_ctx_alloc < max_ctx (long contexts), writes use kv_start % ring_size + // so the tensor is never exceeded. + const int ring_size = cache_k ? (int)cache_k->ne[1] : (kv_start + n_tokens); + + // Write K/V into cache using ring-buffer position. + // Split-on-wrap: when write_pos + n_tokens > ring_size the chunk straddles + // the ring boundary, so we issue two ggml_cpy ops (pre-wrap and post-wrap). + if (write_kv && cache_k && cache_v && Kcur && Vcur) { + ggml_tensor * Kcur_T = ggml_permute(ctx, Kcur, 0, 2, 1, 3); + ggml_tensor * Vcur_T = ggml_permute(ctx, Vcur, 0, 2, 1, 3); + + const int write_pos = kv_start % ring_size; + const int pre_n = std::min(n_tokens, ring_size - write_pos); + const int post_n = n_tokens - pre_n; + + // First slice: [write_pos .. write_pos+pre_n) + { + ggml_tensor * k_slot = ggml_view_3d(ctx, cache_k, + head_dim, pre_n, n_head_kv, + cache_k->nb[1], cache_k->nb[2], + cache_k->nb[1] * write_pos); + ggml_tensor * v_slot = ggml_view_3d(ctx, cache_v, + head_dim, pre_n, n_head_kv, + cache_v->nb[1], cache_v->nb[2], + cache_v->nb[1] * write_pos); + ggml_tensor * k_src = ggml_view_3d(ctx, Kcur_T, + head_dim, pre_n, n_head_kv, + Kcur_T->nb[1], Kcur_T->nb[2], 0); + ggml_tensor * v_src = ggml_view_3d(ctx, Vcur_T, + head_dim, pre_n, n_head_kv, + Vcur_T->nb[1], Vcur_T->nb[2], 0); + ggml_build_forward_expand(gf, ggml_cpy(ctx, k_src, k_slot)); + ggml_build_forward_expand(gf, ggml_cpy(ctx, v_src, v_slot)); + } + + // Second slice (wrap-around): [0 .. 
post_n) + if (post_n > 0) { + ggml_tensor * k_slot = ggml_view_3d(ctx, cache_k, + head_dim, post_n, n_head_kv, + cache_k->nb[1], cache_k->nb[2], + 0); + ggml_tensor * v_slot = ggml_view_3d(ctx, cache_v, + head_dim, post_n, n_head_kv, + cache_v->nb[1], cache_v->nb[2], + 0); + ggml_tensor * k_src = ggml_view_3d(ctx, Kcur_T, + head_dim, post_n, n_head_kv, + Kcur_T->nb[1], Kcur_T->nb[2], + Kcur_T->nb[1] * pre_n); + ggml_tensor * v_src = ggml_view_3d(ctx, Vcur_T, + head_dim, post_n, n_head_kv, + Vcur_T->nb[1], Vcur_T->nb[2], + Vcur_T->nb[1] * pre_n); + ggml_build_forward_expand(gf, ggml_cpy(ctx, k_src, k_slot)); + ggml_build_forward_expand(gf, ggml_cpy(ctx, v_src, v_slot)); + } + } + + // Determine window for SWA reads using the shared geometry helper. + // ring_win_start is always 0 (full-ring read); correctness comes from the + // host-built mask which uses abs_pos(slot) arithmetic for ring geometry. + const SwaView swa_view = compute_swa_view(kv_start, n_tokens, + w.swa_window, ring_size); + const int effective_win_len = swa_view.effective_win_len; + const int ring_win_start = swa_view.ring_win_start; // always 0 + + // swa_ctx_alloc is already aligned to fattn_stride (set in create_gemma4_cache), + // so win_len_padded == effective_win_len == ring_size. No further snap needed. + const bool need_256_pad = (kv_k_type == GGML_TYPE_TQ3_0 || kv_v_type == GGML_TYPE_TQ3_0 + || head_dim >= 512); + const int fattn_stride = need_256_pad ? 256 : 1; + const int win_len_padded = ((effective_win_len + fattn_stride - 1) / fattn_stride) * fattn_stride; + + const bool q_rotate = (kv_k_type == GGML_TYPE_TQ3_0); + const bool out_rotate = (kv_v_type == GGML_TYPE_TQ3_0); + // TQ3 contract: caller pre-rotates Q forward, post-rotates FA output + // backward. ggml_turbo_wht's kernel writes dst using src strides + // (turbo-wht.cu:20-21), so non-contiguous input scatters writes and + // corrupts the result — wrap with ggml_cont before rotating, never after + // permute alone. (Regression-fix vs. c15f93a; see EVIDENCE.md TQ3 thread.) + ggml_tensor * Qfa = ggml_cont(ctx, ggml_permute(ctx, Qcur, 0, 2, 1, 3)); + if (q_rotate) { + Qfa = ggml_turbo_wht(ctx, Qfa, 0); + } + + ggml_tensor * Kfa = ggml_view_3d(ctx, cache_k, + head_dim, win_len_padded, n_head_kv, + cache_k->nb[1], cache_k->nb[2], + cache_k->nb[1] * ring_win_start); + ggml_tensor * Vfa = ggml_view_3d(ctx, cache_v, + head_dim, win_len_padded, n_head_kv, + cache_v->nb[1], cache_v->nb[2], + cache_v->nb[1] * ring_win_start); + // Gemma4: attn_scale = 1.0 (self.scaling = 1.0, no 1/sqrt(head_dim)) + ggml_tensor * attn = ggml_flash_attn_ext(ctx, Qfa, Kfa, Vfa, attn_mask, + 1.0f, 0.0f, 0.0f); + + if (out_rotate) { + attn = ggml_cont(ctx, attn); + attn = ggml_turbo_wht(ctx, attn, 1); + } + + attn = ggml_reshape_2d(ctx, attn, q_dim, n_tokens); + attn = ggml_mul_mat(ctx, L.wo, attn); + return attn; +} + +// Full (Global) Attention block. +// Uses proportional RoPE via per-layer rope_freqs (freq_factors) and full context. +// When use_pflash is true, uses ggml_flash_attn_sparse (block-sparse) instead of +// ggml_flash_attn_ext for the attention computation. 
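+// Padding example (illustrative): kv_start=1000, n_tokens=30, fa_window=0
+// gives win_start=0 and win_len=1030; with TQ3_0 K/V (need_256_pad) the view
+// is padded to win_len_padded = ceil(1030/256)*256 = 1280, and the 250
+// padding slots must be excluded by the attention mask.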
+static ggml_tensor * build_full_attn_block( + ggml_context * ctx, + ggml_cgraph * gf, + const GemmaTargetWeights & w, + const GemmaTargetLayer & L, + ggml_tensor * cur, + ggml_tensor * positions, + ggml_tensor * cache_k, + ggml_tensor * cache_v, + ggml_tensor * attn_mask, + int kv_start, + int n_tokens, + ggml_type kv_k_type, + ggml_type kv_v_type, + bool write_kv, + int fa_window, + int il, + bool use_pflash, + float pflash_alpha) +{ + // Full-attention layers use the full head_dim + const int head_dim = w.head_dim; + const int n_head = w.n_head; + const int n_head_kv = (il >= 0 && il < (int)w.head_kv_per_layer.size()) + ? w.head_kv_per_layer[il] : w.n_head_kv; + const int q_dim = n_head * head_dim; + + // Q projection + ggml_tensor * Qcur = ggml_mul_mat(ctx, L.wq, cur); + Qcur = ggml_reshape_3d(ctx, Qcur, head_dim, n_head, n_tokens); + Qcur = rms_norm_mul(ctx, Qcur, L.q_norm, EPS); + + ggml_tensor * Kcur = nullptr; + ggml_tensor * Vcur = nullptr; + if (write_kv) { + Kcur = ggml_mul_mat(ctx, L.wk, cur); + Kcur = ggml_reshape_3d(ctx, Kcur, head_dim, n_head_kv, n_tokens); + + // V = K (pre-norm) when wv absent, else separate projection + if (L.wv == L.wk) { + Vcur = Kcur; + } else { + Vcur = ggml_mul_mat(ctx, L.wv, cur); + Vcur = ggml_reshape_3d(ctx, Vcur, head_dim, n_head_kv, n_tokens); + } + + // K gets weighted RMSNorm, V gets bare RMSNorm (no learned weights) + if (L.k_norm) { + Kcur = rms_norm_mul(ctx, Kcur, L.k_norm, EPS); + } + Vcur = ggml_rms_norm(ctx, Vcur, EPS); + } + + // Proportional RoPE for full-attention layers (uses per-layer rope_freqs) + Qcur = ggml_rope_ext(ctx, Qcur, positions, L.rope_freqs, + head_dim, GGML_ROPE_TYPE_NEOX, /*n_ctx_orig=*/0, + w.rope_theta, /*freq_scale=*/1.0f, + /*ext_factor=*/0.0f, /*attn_factor=*/1.0f, + /*beta_fast=*/0.0f, /*beta_slow=*/0.0f); + if (Kcur) { + Kcur = ggml_rope_ext(ctx, Kcur, positions, L.rope_freqs, + head_dim, GGML_ROPE_TYPE_NEOX, 0, + w.rope_theta, 1.0f, + 0.0f, 1.0f, 0.0f, 0.0f); + } + + // Write K/V into cache + if (write_kv && cache_k && cache_v && Kcur && Vcur) { + ggml_tensor * Kcur_T = ggml_permute(ctx, Kcur, 0, 2, 1, 3); + ggml_tensor * Vcur_T = ggml_permute(ctx, Vcur, 0, 2, 1, 3); + + ggml_tensor * k_slot = ggml_view_3d(ctx, cache_k, + head_dim, n_tokens, n_head_kv, + cache_k->nb[1], cache_k->nb[2], + cache_k->nb[1] * kv_start); + ggml_tensor * v_slot = ggml_view_3d(ctx, cache_v, + head_dim, n_tokens, n_head_kv, + cache_v->nb[1], cache_v->nb[2], + cache_v->nb[1] * kv_start); + ggml_build_forward_expand(gf, ggml_cpy(ctx, Kcur_T, k_slot)); + ggml_build_forward_expand(gf, ggml_cpy(ctx, Vcur_T, v_slot)); + } + + // For full-attention layers: optional windowed FA for long-context efficiency + const int win_start = (fa_window > 0 && kv_start > fa_window) + ? (kv_start - fa_window) : 0; + const int kv_len = kv_start + n_tokens; + const int win_len = kv_len - win_start; + + const bool need_256_pad = (kv_k_type == GGML_TYPE_TQ3_0 || kv_v_type == GGML_TYPE_TQ3_0 + || head_dim >= 512); + const int fattn_stride = need_256_pad ? 256 : 1; + const int win_len_padded = ((win_len + fattn_stride - 1) / fattn_stride) * fattn_stride; + + const bool q_rotate = (kv_k_type == GGML_TYPE_TQ3_0); + const bool out_rotate = (kv_v_type == GGML_TYPE_TQ3_0); + // See SWA block above for the contiguity rationale (turbo-wht.cu:20-21). 
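+    // Layout note: Qcur is [head_dim, n_head, n_tokens]; permute(0, 2, 1, 3)
+    // swaps dims 1 and 2 into the FA-expected [head_dim, n_tokens, n_head]
+    // (e.g. a hypothetical head_dim=512, n_head=16, n_tokens=1 becomes
+    // Qfa [512, 1, 16]), and ggml_cont materializes it before the FWHT.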
+    ggml_tensor * Qfa = ggml_cont(ctx, ggml_permute(ctx, Qcur, 0, 2, 1, 3));
+    if (q_rotate) {
+        Qfa = ggml_turbo_wht(ctx, Qfa, 0);
+    }
+
+    ggml_tensor * Kfa = ggml_view_3d(ctx, cache_k,
+                                     head_dim, win_len_padded, n_head_kv,
+                                     cache_k->nb[1], cache_k->nb[2],
+                                     cache_k->nb[1] * win_start);
+    ggml_tensor * Vfa = ggml_view_3d(ctx, cache_v,
+                                     head_dim, win_len_padded, n_head_kv,
+                                     cache_v->nb[1], cache_v->nb[2],
+                                     cache_v->nb[1] * win_start);
+
+    // pFlash sparse path supports F16, Q8_0, and Q4_0 K/V — the CUDA dispatch layer
+    // dequantizes to F16 before the S<->H BF16 transpose for these types.
+    // TQ3_0 is excluded because sparse FA does not consume rotated TQ3 K/V
+    // directly; fall back to dense FA for TQ3_0 and other types.
+    auto pflash_supports = [](enum ggml_type t) {
+        return t == GGML_TYPE_F16 || t == GGML_TYPE_Q8_0 || t == GGML_TYPE_Q4_0;
+    };
+    const bool can_pflash = use_pflash &&
+                            pflash_supports(Kfa->type) &&
+                            pflash_supports(Vfa->type);
+
+    // Gemma4: attn_scale = 1.0 (self.scaling = 1.0, no 1/sqrt(head_dim))
+    ggml_tensor * attn;
+    if (can_pflash) {
+        attn = ggml_flash_attn_sparse(ctx, Qfa, Kfa, Vfa, 1.0f, pflash_alpha);
+    } else {
+        attn = ggml_flash_attn_ext(ctx, Qfa, Kfa, Vfa, attn_mask, 1.0f, 0.0f, 0.0f);
+    }
+
+    if (out_rotate) {
+        attn = ggml_cont(ctx, attn);
+        attn = ggml_turbo_wht(ctx, attn, 1);
+    }
+
+    attn = ggml_reshape_2d(ctx, attn, q_dim, n_tokens);
+    attn = ggml_mul_mat(ctx, L.wo, attn);
+    return attn;
+}
+
+// ─── GemmaTargetCache allocation ─────────────────────────────────────────────
+
+bool create_gemma4_cache(const GemmaTargetWeights & w,
+                         int max_ctx,
+                         ggml_backend_t backend,
+                         GemmaTargetCache & out,
+                         const std::vector<int> & extra_q8_layers,
+                         int target_feat_cap_hint,
+                         bool enable_dflash_capture_overrides) {
+    out.backend = backend;
+    out.max_ctx = max_ctx;
+    out.cur_pos = 0;
+
+    // Resolve KV types from environment
+    ggml_type kv_k_type = GGML_TYPE_Q8_0;
+    ggml_type kv_v_type = GGML_TYPE_Q8_0;
+    dflash::resolve_kv_types(kv_k_type, kv_v_type);
+    out.kv_k_type = kv_k_type;
+    out.kv_v_type = kv_v_type;
+
+    // TQ3_0 and head_dim>=512 (CUDA FA FATTN_KQ_STRIDE) require 256-alignment
+    const bool need_256_align = (kv_k_type == GGML_TYPE_TQ3_0 || kv_v_type == GGML_TYPE_TQ3_0
+                                 || w.head_dim >= 512);
+    const int align_stride = need_256_align ? 256 : 1;
+    const int max_ctx_alloc = need_256_align
+        ? ((max_ctx + 255) / 256) * 256
+        : max_ctx;
+
+    // SWA layers only need swa_window slots (ring-buffer). Allocate
+    // min(max_ctx_alloc, swa_window_padded) for SWA layers, saving ~50% VRAM
+    // at long contexts. swa_ctx_alloc must be strictly > swa_window so the
+    // decode window (win_len = swa_window + n_tokens) fits within one view.
+    // We pad swa_window to the same alignment stride and add one alignment
+    // block as headroom so contiguous views always work for n_tokens=1 decode.
+    const int swa_window_padded = (w.swa_window > 0)
+        ? ((w.swa_window + align_stride - 1) / align_stride) * align_stride
+        : max_ctx_alloc;
+    // Ring sized to hold last R = 2*swa_window keys (= 2 chunks worth, since
+    // chunk_size <= swa_window). Combined with a non-monotonic mask in the
+    // test driver's build_swa_causal_mask, this lets the K view be the full
+    // ring while correctness comes from the mask filtering by abs_pos.
+    const int swa_ring_target = 2 * swa_window_padded;
+    const int swa_ctx_alloc = (w.swa_window > 0)
+        ? std::min(max_ctx_alloc, swa_ring_target)
+        : max_ctx_alloc;
+    out.swa_ctx_alloc = swa_ctx_alloc;
+
+    // Build layer -> KV index mappings. 
+ // Gemma4 can share KV caches across layers. The weight loader sets wk=nullptr + // for shared layers. We detect this and point them at the most recent + // non-shared layer's KV slot. + out.layer_to_kv_idx.assign(w.n_layer, -1); + out.layer_to_donor_kv.assign(w.n_layer, -1); + + int n_kv_slots = 0; + for (int il = 0; il < w.n_layer; il++) { + if (w.layers[il].wk != nullptr) { + out.layer_to_kv_idx[il] = n_kv_slots++; + } + } + + // For shared layers, find the most recent layer that owns a KV slot + int last_kv_slot = -1; + for (int il = 0; il < w.n_layer; il++) { + if (out.layer_to_kv_idx[il] >= 0) { + last_kv_slot = out.layer_to_kv_idx[il]; + } else { + out.layer_to_donor_kv[il] = last_kv_slot; + } + } + + if (n_kv_slots == 0) { + set_last_error("create_gemma4_cache: no KV-owning layers found"); + return false; + } + + // Per-layer KV types. + // + // The upstream FA dispatch (deps/llama.cpp/.../fattn.cu:441) routes + // TQ3 + (Q->ne[0] > 256 || Q->ne[1] > 1) to the slow CHUNKED kernel. + // On Dense Gemma4 31B with full-attn head_dim=512, every chunked + // prefill / draft-verify hits this trap. + // + // Narrow workaround (Codex pattern, mirrors vLLM's kv-cache-dtype-skip-layers): + // when the DFlash draft is wired up, force Q8_0 KV on the small subset of + // full-attn layers whose hidden states are CAPTURED for the draft (the + // "target_feat" ring at gemma4_target_graph.cpp:971 — drafter consumes + // these in build_gemma4_draft_graph). This unblocks the pflash sparse + // fast path for the layers the draft actually depends on, without + // touching the other 8/10 full-attn layers (avoids the MoE regression + // we saw when forcing ALL full-attn -> Q8). + out.kv_k_type_per_layer.assign(w.n_layer, kv_k_type); + out.kv_v_type_per_layer.assign(w.n_layer, kv_v_type); + + const bool gate = (kv_k_type == GGML_TYPE_TQ3_0 || kv_v_type == GGML_TYPE_TQ3_0) + && (w.head_dim > 256) + && enable_dflash_capture_overrides + && (w.n_capture_layers > 0); + + if (gate) { + int n_overridden = 0; + for (int ci = 0; ci < w.n_capture_layers; ci++) { + const int captured_il = w.capture_layer_ids[ci]; + if (captured_il < 0 || captured_il >= w.n_layer) continue; + const bool is_swa = (captured_il < (int)w.swa_layers.size()) + && w.swa_layers[captured_il]; + if (is_swa) continue; // SWA layers don't hit the trap + if (kv_k_type == GGML_TYPE_TQ3_0) { + out.kv_k_type_per_layer[captured_il] = GGML_TYPE_Q8_0; + } + if (kv_v_type == GGML_TYPE_TQ3_0) { + out.kv_v_type_per_layer[captured_il] = GGML_TYPE_Q8_0; + } + n_overridden++; + } + // Count total full-attn layers for the log message + int n_full_attn = 0; + for (int il = 0; il < w.n_layer; il++) { + const bool is_swa = (il < (int)w.swa_layers.size()) && w.swa_layers[il]; + if (!is_swa && out.layer_to_kv_idx[il] >= 0) n_full_attn++; + } + std::fprintf(stderr, + "[cache] narrow asymmetric: forced Q8_0 on %d captured full-attn layer(s) " + "(remaining %d full-attn keep TQ3)\n", + n_overridden, n_full_attn - n_overridden); + } + + // Extra override: force Q8_0 on caller-specified layer indices (e.g. MTP donor layers). + // These layers must NOT use TQ3_0 because MTP cross-attention reads them via ggml_cast + // (no FWHT inverse applied), so TQ3_0 FWHT-domain values would corrupt attention scores. 
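+    // Example (hypothetical layer ids): extra_q8_layers = {12, 27} forces the
+    // K/V slots of layers 12 and 27 to Q8_0 while all other layers keep the
+    // env-resolved type (e.g. TQ3_0).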
+ if (!extra_q8_layers.empty() && + (kv_k_type == GGML_TYPE_TQ3_0 || kv_v_type == GGML_TYPE_TQ3_0)) { + int n_mtp_overridden = 0; + for (int il : extra_q8_layers) { + if (il < 0 || il >= w.n_layer) continue; + if (kv_k_type == GGML_TYPE_TQ3_0) out.kv_k_type_per_layer[il] = GGML_TYPE_Q8_0; + if (kv_v_type == GGML_TYPE_TQ3_0) out.kv_v_type_per_layer[il] = GGML_TYPE_Q8_0; + n_mtp_overridden++; + } + if (n_mtp_overridden > 0) { + std::fprintf(stderr, + "[cache] MTP donor override: forced Q8_0 on %d layer(s) to avoid TQ3/FWHT cross-attn mismatch\n", + n_mtp_overridden); + } + } + + // (head_dim and n_head_kv are resolved per-layer in the allocation loop below) + + const int n_capture_layers = w.n_capture_layers; + const int n_embd = w.n_embd; + + // Tensor count: 2 (K+V) per KV slot + 1 target_feat + const int n_tensors = 2 * n_kv_slots + 1; + ggml_init_params ip{}; + ip.mem_size = (size_t)(n_tensors + 16) * ggml_tensor_overhead(); + ip.mem_buffer = nullptr; + ip.no_alloc = true; + out.base_ctx = ggml_init(ip); + if (!out.base_ctx) { + set_last_error("create_gemma4_cache: ggml_init failed"); + return false; + } + + out.attn_k.assign(n_kv_slots, nullptr); + out.attn_v.assign(n_kv_slots, nullptr); + + // Create KV tensors — iterate layers to preserve name <-> layer correlation. + // Each layer's KV slot uses the head_dim and n_head_kv appropriate to its + // attention type (SWA vs full-attention may have different dimensions). + for (int il = 0; il < w.n_layer; il++) { + const int kv_idx = out.layer_to_kv_idx[il]; + if (kv_idx < 0) continue; + + const bool is_swa_layer = (il < (int)w.swa_layers.size()) && w.swa_layers[il]; + const int layer_head_dim = is_swa_layer ? w.head_dim_swa : w.head_dim; + const int layer_n_head_kv = (il < (int)w.head_kv_per_layer.size()) + ? w.head_kv_per_layer[il] : w.n_head_kv; + + // SWA layers use a ring buffer of swa_ctx_alloc slots; full-attn layers + // need the full max_ctx_alloc to cover the entire context. + const int layer_ctx_alloc = is_swa_layer ? swa_ctx_alloc : max_ctx_alloc; + + const ggml_type layer_kv_k_type = out.kv_k_type_per_layer[il]; + const ggml_type layer_kv_v_type = out.kv_v_type_per_layer[il]; + ggml_tensor * K = ggml_new_tensor_3d(out.base_ctx, layer_kv_k_type, + layer_head_dim, layer_ctx_alloc, layer_n_head_kv); + ggml_tensor * V = ggml_new_tensor_3d(out.base_ctx, layer_kv_v_type, + layer_head_dim, layer_ctx_alloc, layer_n_head_kv); + char name[64]; + std::snprintf(name, sizeof(name), "gemma4_cache_k_%d", il); + ggml_set_name(K, name); + std::snprintf(name, sizeof(name), "gemma4_cache_v_%d", il); + ggml_set_name(V, name); + out.attn_k[kv_idx] = K; + out.attn_v[kv_idx] = V; + } + + // target_feat ring buffer: [n_capture_layers * n_embd, cap] bf16 + constexpr int TARGET_FEAT_CAP_DEFAULT = 4096; + const int target_feat_cap_req = std::max(TARGET_FEAT_CAP_DEFAULT, target_feat_cap_hint); + out.target_feat_cap = std::min(max_ctx, target_feat_cap_req); + { + const int fc_in = n_capture_layers * n_embd; + out.target_feat = ggml_new_tensor_2d(out.base_ctx, GGML_TYPE_BF16, + fc_in, out.target_feat_cap); + ggml_set_name(out.target_feat, "gemma4_target_feat"); + } + + out.base_buf = ggml_backend_alloc_ctx_tensors(out.base_ctx, backend); + if (!out.base_buf) { + set_last_error("create_gemma4_cache: ggml_backend_alloc_ctx_tensors failed"); + ggml_free(out.base_ctx); + out.base_ctx = nullptr; + return false; + } + + // Count full-attn vs SWA KV-owning layers for VRAM savings log. 
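+    // Savings arithmetic (hypothetical split): max_ctx_alloc=32768,
+    // swa_ctx_alloc=2048, 48 SWA + 12 full-attn KV owners:
+    //   old = 60*32768 slots, new = 12*32768 + 48*2048 slots, saved ~75%.
+    // The real figure depends on context length and the SWA/full layer mix.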
+    int n_full_kv = 0, n_swa_kv = 0;
+    for (int il = 0; il < w.n_layer; il++) {
+        if (out.layer_to_kv_idx[il] < 0) continue;
+        const bool is_swa = (il < (int)w.swa_layers.size()) && w.swa_layers[il];
+        if (is_swa) n_swa_kv++; else n_full_kv++;
+    }
+    const float full_slots = (float)n_full_kv * max_ctx_alloc;
+    const float swa_slots = (float)n_swa_kv * swa_ctx_alloc;
+    const float old_slots = (float)(n_full_kv + n_swa_kv) * max_ctx_alloc;
+    const float saved_pct = old_slots > 0.0f
+        ? 100.0f * (1.0f - (full_slots + swa_slots) / old_slots)
+        : 0.0f;
+    // Find a representative SWA layer index and a representative full-attn layer index
+    // for the diagnostic log (first of each kind that owns a KV slot).
+    int repr_swa_il = -1, repr_full_il = -1;
+    for (int il = 0; il < w.n_layer; il++) {
+        if (out.layer_to_kv_idx[il] < 0) continue;
+        const bool is_swa = (il < (int)w.swa_layers.size()) && w.swa_layers[il];
+        if (is_swa && repr_swa_il < 0) repr_swa_il = il;
+        if (!is_swa && repr_full_il < 0) repr_full_il = il;
+        if (repr_swa_il >= 0 && repr_full_il >= 0) break;
+    }
+    const char * swa_k_name = (repr_swa_il >= 0)
+        ? ggml_type_name(out.kv_k_type_per_layer[repr_swa_il]) : "n/a";
+    const char * full_k_name = (repr_full_il >= 0)
+        ? ggml_type_name(out.kv_k_type_per_layer[repr_full_il]) : "n/a";
+    std::fprintf(stderr,
+        "[cache] created max_ctx=%d (full_attn=%d, swa=%d), kv_layers=%d, saved %.1f%%\n",
+        max_ctx, max_ctx_alloc, swa_ctx_alloc, n_kv_slots, saved_pct);
+    std::fprintf(stderr, "[cache] kv types: SWA=%s, full=%s\n", swa_k_name, full_k_name);
+
+    // Zero-initialize all tensors
+    std::vector<uint8_t> zeros(1 * 1024 * 1024, 0);
+    for (ggml_tensor * t = ggml_get_first_tensor(out.base_ctx); t != nullptr;
+         t = ggml_get_next_tensor(out.base_ctx, t)) {
+        size_t nb = ggml_nbytes(t);
+        size_t off = 0;
+        while (off < nb) {
+            size_t chunk = std::min(nb - off, zeros.size());
+            ggml_backend_tensor_set(t, zeros.data(), off, chunk);
+            off += chunk;
+        }
+    }
+
+    return true;
+}
+
+void free_gemma4_cache(GemmaTargetCache & c) {
+    free_draft_kv_cache(c);
+    if (c.base_buf) { ggml_backend_buffer_free(c.base_buf); c.base_buf = nullptr; }
+    if (c.base_ctx) { ggml_free(c.base_ctx); c.base_ctx = nullptr; }
+    c.attn_k.clear();
+    c.attn_v.clear();
+    c.layer_to_kv_idx.clear();
+    c.layer_to_donor_kv.clear();
+    c.target_feat = nullptr;
+    c.cur_pos = 0;
+    c.last_tok = -1;
+    c.swa_ctx_alloc = 0;
+}
+
+void reset_gemma4_cache(GemmaTargetCache & c) {
+    c.cur_pos = 0;
+    c.last_tok = -1;
+    c.draft_kv_pos = 0;
+    std::vector<uint8_t> zeros(1 * 1024 * 1024, 0);
+    if (!c.base_ctx) return;
+    for (ggml_tensor * t = ggml_get_first_tensor(c.base_ctx); t != nullptr;
+         t = ggml_get_next_tensor(c.base_ctx, t)) {
+        size_t nb = ggml_nbytes(t);
+        size_t off = 0;
+        while (off < nb) {
+            size_t chunk = std::min(nb - off, zeros.size());
+            ggml_backend_tensor_set(t, zeros.data(), off, chunk);
+            off += chunk;
+        }
+    }
+}
+
+// ─── Draft KV cache allocation ───────────────────────────────────────────────
+
+bool create_draft_kv_cache(const GemmaDraftWeights & dw,
+                           ggml_backend_t backend,
+                           GemmaTargetCache & cache,
+                           int cap_override) {
+    // Capacity: sliding window + one block + headroom
+    const int default_cap = dw.sliding_window + dw.block_size + 32;
+    const int draft_kv_cap = cap_override > 0 ? 
cap_override : default_cap; + if (draft_kv_cap < dw.block_size + 1) { + set_last_error("create_draft_kv_cache: cap_override is smaller than block_size+1"); + return false; + } + + const size_t n_tensors = (size_t)(2 * dw.n_layer); // K + V per layer + ggml_init_params ip{}; + ip.mem_size = ggml_tensor_overhead() * n_tensors + 256; + ip.mem_buffer = nullptr; + ip.no_alloc = true; + cache.draft_kv_ctx = ggml_init(ip); + if (!cache.draft_kv_ctx) { + set_last_error("create_draft_kv_cache: ggml_init failed"); + return false; + } + + cache.draft_k.reserve((size_t)dw.n_layer); + cache.draft_v.reserve((size_t)dw.n_layer); + + for (int il = 0; il < dw.n_layer; il++) { + ggml_tensor * K = ggml_new_tensor_3d(cache.draft_kv_ctx, GGML_TYPE_F32, + dw.head_dim, dw.n_head_kv, draft_kv_cap); + ggml_tensor * V = ggml_new_tensor_3d(cache.draft_kv_ctx, GGML_TYPE_F32, + dw.head_dim, dw.n_head_kv, draft_kv_cap); + char name[64]; + std::snprintf(name, sizeof(name), "draft_k_%d", il); + ggml_set_name(K, name); + std::snprintf(name, sizeof(name), "draft_v_%d", il); + ggml_set_name(V, name); + cache.draft_k.push_back(K); + cache.draft_v.push_back(V); + } + + cache.draft_kv_buf = ggml_backend_alloc_ctx_tensors(cache.draft_kv_ctx, backend); + if (!cache.draft_kv_buf) { + set_last_error("create_draft_kv_cache: ggml_backend_alloc_ctx_tensors failed"); + ggml_free(cache.draft_kv_ctx); + cache.draft_kv_ctx = nullptr; + cache.draft_k.clear(); + cache.draft_v.clear(); + return false; + } + + cache.draft_kv_cap = draft_kv_cap; + cache.draft_kv_pos = 0; + + ggml_backend_buffer_clear(cache.draft_kv_buf, 0); + + return true; +} + +void free_draft_kv_cache(GemmaTargetCache & cache) { + if (cache.draft_kv_buf) { + ggml_backend_buffer_free(cache.draft_kv_buf); + cache.draft_kv_buf = nullptr; + } + if (cache.draft_kv_ctx) { + ggml_free(cache.draft_kv_ctx); + cache.draft_kv_ctx = nullptr; + } + cache.draft_k.clear(); + cache.draft_v.clear(); + cache.draft_kv_cap = 0; + cache.draft_kv_pos = 0; +} + +// ─── Main graph builder ─────────────────────────────────────────────────────── + +GemmaGraphOutputs build_gemma4_graph( + ggml_context * ctx, + ggml_cgraph * gf, + const GemmaTargetWeights & w, + GemmaTargetCache & cache, + const GemmaGraphInputs & in) +{ + const int n_tokens = in.n_tokens; + const int kv_start = in.kv_start; + const int n_embd = w.n_embd; + + // CUDA FA for head_dim>=512 requires a non-null mask to enable the GQA + // optimization path (gqa_opt_applies=true). Auto-create a causal mask + // when the caller did not supply one so that full-attention layers don't + // hit BEST_FATTN_KERNEL_NONE → abort. + ggml_tensor * attn_mask = in.attn_mask; + if (!attn_mask && w.head_dim >= 512) { + const int kv_len = kv_start + n_tokens; + // Pad to 256 — required by FATTN_KQ_STRIDE for TQ3 / large head_dim. + const int kv_len_padded = ((kv_len + 255) / 256) * 256; + attn_mask = ggml_new_tensor_2d(ctx, GGML_TYPE_F16, kv_len_padded, n_tokens); + ggml_set_name(attn_mask, "auto_causal_mask"); + ggml_set_input(attn_mask); + } + + ggml_tensor * inpL = in.inp_embed; // [n_embd, n_tokens] f32 + + // Gemma4 scales embeddings by sqrt(n_embd) (matches HF Gemma4TextScaledWordEmbedding) + inpL = ggml_scale(ctx, inpL, std::sqrt((float)n_embd)); + + for (int il = 0; il < w.n_layer; il++) { + const GemmaTargetLayer & L = w.layers[il]; + const bool is_swa = (il < (int)w.swa_layers.size()) ? 
w.swa_layers[il] : true; + + // ── a) Pre-attention RMSNorm ──────────────────────────────────────────── + ggml_tensor * inpSA = inpL; + ggml_tensor * cur = rms_norm_mul(ctx, inpL, L.attn_norm, EPS); + + // ── b-f) Attention (SWA or Full) ─────────────────────────────────────── + const int kv_idx = cache.layer_to_kv_idx[il]; + const bool write_kv = (kv_idx >= 0); + + // Determine which KV cache buffers to use for reading + const int read_kv_idx = write_kv ? kv_idx : cache.layer_to_donor_kv[il]; + ggml_tensor * cache_k = (read_kv_idx >= 0) ? cache.attn_k[read_kv_idx] : nullptr; + ggml_tensor * cache_v = (read_kv_idx >= 0) ? cache.attn_v[read_kv_idx] : nullptr; + + // Resolve per-layer KV types (asymmetric: TQ3 on SWA, Q8 on full-attn). + const ggml_type layer_kv_k = !cache.kv_k_type_per_layer.empty() + ? cache.kv_k_type_per_layer[il] : cache.kv_k_type; + const ggml_type layer_kv_v = !cache.kv_v_type_per_layer.empty() + ? cache.kv_v_type_per_layer[il] : cache.kv_v_type; + + if (is_swa) { + ggml_tensor * effective_mask = in.swa_mask ? in.swa_mask : attn_mask; + cur = build_swa_attn_block(ctx, gf, w, L, cur, in.positions, + cache_k, cache_v, effective_mask, + kv_start, n_tokens, + layer_kv_k, layer_kv_v, + write_kv, il); + } else { + cur = build_full_attn_block(ctx, gf, w, L, cur, in.positions, + cache_k, cache_v, attn_mask, + kv_start, n_tokens, + layer_kv_k, layer_kv_v, + write_kv, in.fa_window, il, + in.use_pflash, in.pflash_alpha); + } + + // ── g) Output projection already done inside attn block ──────────────── + + // ── h) Post-attention norm + residual ────────────────────────────────── + if (L.attn_post_norm) { + cur = rms_norm_mul(ctx, cur, L.attn_post_norm, EPS); + } + // NOTE: out_scale is applied AFTER the full layer (after FFN), not here + ggml_tensor * inpSA_post = ggml_add(ctx, cur, inpSA); + + // ── i) FFN ───────────────────────────────────────────────────────────── + ggml_tensor * ffn_residual = inpSA_post; + ggml_tensor * ffn_in = rms_norm_mul(ctx, inpSA_post, L.ffn_norm, EPS); + + ggml_tensor * ffn_out = nullptr; + if (L.ffn_gate_inp != nullptr) { + // MoE path (26B-A4B): shared expert uses ffn_norm, routed use ffn_pre_norm_2 + ggml_tensor * moe_in = L.ffn_pre_norm_2 + ? rms_norm_mul(ctx, inpSA_post, L.ffn_pre_norm_2, EPS) + : ffn_in; + ffn_out = build_moe_ffn(ctx, gf, w, L, + ffn_in, moe_in, inpSA_post, + n_tokens); + } else { + // Dense path (31B) + ffn_out = build_geglu_ffn(ctx, ffn_in, L); + } + + // Post-FFN norm + if (L.ffn_post_norm) { + ffn_out = rms_norm_mul(ctx, ffn_out, L.ffn_post_norm, EPS); + } + + cur = ggml_add(ctx, ffn_out, ffn_residual); + + // ── layer_output_scale: applied after full layer (attn + FFN residuals) ─ + // Matches HF: hidden_states = layer_scalar * (attn_residual + ffn_residual) + if (L.out_scale) { + cur = ggml_mul(ctx, cur, L.out_scale); + } + + // ── j) Per-Layer Embedding (PLE) ─────────────────────────────────────── + if (in.per_layer_inp && L.ple_inp_gate && L.ple_proj) { + // ple_inp_gate: gate projection + ggml_tensor * ple_gate = ggml_mul_mat(ctx, L.ple_inp_gate, cur); + ple_gate = ggml_gelu(ctx, ple_gate); + + // per_layer_inp is [n_embd_per_layer, n_tokens, n_layer] or similar; + // we select the slice for this layer along axis 2. + // Assuming per_layer_inp is [n_embd_per_layer, n_tokens] for this layer + // (caller pre-selects by layer index) — or it is [n_embd_per_layer, n_layer] + // shaped with the layer axis being dim 1. + // Use a view to extract the il-th column if per_layer_inp has n_layer cols. 
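+        // Offset example (illustrative dims): n_embd_per_layer=256,
+        // n_tokens=4, il=10 selects columns [40..44), i.e. byte offset
+        // 10 * 4 * nb[1] into per_layer_inp.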
+ const int n_embd_per_layer = w.n_embd_per_layer > 0 ? w.n_embd_per_layer + : (int)in.per_layer_inp->ne[0]; + ggml_tensor * ple_emb; + if (ggml_n_dims(in.per_layer_inp) >= 3 || (int)in.per_layer_inp->ne[1] == w.n_layer) { + // Shape [n_embd_per_layer, n_layer] or [n_embd_per_layer, n_tokens, n_layer] + ple_emb = ggml_view_2d(ctx, in.per_layer_inp, + n_embd_per_layer, n_tokens, + in.per_layer_inp->nb[1], + (size_t)il * n_tokens * in.per_layer_inp->nb[1]); + } else { + // Already sliced per-layer by caller + ple_emb = in.per_layer_inp; + } + + ggml_tensor * ple = ggml_mul(ctx, ple_gate, ple_emb); + ple = ggml_mul_mat(ctx, L.ple_proj, ple); + if (L.ple_post_norm) { + ple = rms_norm_mul(ctx, ple, L.ple_post_norm, EPS); + } + cur = ggml_add(ctx, cur, ple); + } + + // ── k) Target feature capture ────────────────────────────────────────── + if (in.capture_layers && cache.target_feat) { + int capture_idx = -1; + for (int k = 0; k < w.n_capture_layers; k++) { + if (w.capture_layer_ids[k] == il) { capture_idx = k; break; } + } + if (capture_idx >= 0) { + const size_t elt = ggml_element_size(cache.target_feat); + const size_t col_stride = cache.target_feat->nb[1]; + const int cap = cache.target_feat_cap; + const int slot_start = kv_start % cap; + const int pre_n = std::min(n_tokens, cap - slot_start); + const int post_n = n_tokens - pre_n; + + ggml_tensor * cur_2d = ggml_reshape_2d(ctx, cur, n_embd, n_tokens); + + // First slice: [slot_start..slot_start+pre_n) in the ring + { + const size_t offset = + (size_t)slot_start * col_stride + + (size_t)capture_idx * n_embd * elt; + ggml_tensor * slot = ggml_view_2d(ctx, cache.target_feat, + n_embd, pre_n, col_stride, offset); + ggml_tensor * src = ggml_view_2d(ctx, cur_2d, + n_embd, pre_n, cur_2d->nb[1], 0); + ggml_build_forward_expand(gf, ggml_cpy(ctx, src, slot)); + } + + // Second slice: wrap-around at [0..post_n) if needed + if (post_n > 0) { + const size_t offset = + (size_t)capture_idx * n_embd * elt; + ggml_tensor * slot = ggml_view_2d(ctx, cache.target_feat, + n_embd, post_n, col_stride, offset); + ggml_tensor * src = ggml_view_2d(ctx, cur_2d, + n_embd, post_n, cur_2d->nb[1], + (size_t)pre_n * cur_2d->nb[1]); + ggml_build_forward_expand(gf, ggml_cpy(ctx, src, slot)); + } + } + } + + // ── l) Advance residual stream ────────────────────────────────────────── + inpL = cur; + } + + // ── Final norm ───────────────────────────────────────────────────────────── + ggml_tensor * out = rms_norm_mul(ctx, inpL, w.out_norm, EPS); + + // ── MTP h_prev capture (post-output-norm, last token) ────────────────────── + // h_prev must be the backbone hidden AFTER final RMSNorm — the same vector + // fed to lm_head — so the MTP draft head sees the same representation as + // the target's token prediction. Capturing inside the layer loop (pre-norm) + // caused accept_rate=0 because the draft head was trained on post-norm hiddens. + // Source: vLLM PR #41745:569-621 + llama.cpp #22738. 
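+    // E.g. for a 32-token verify chunk, only column 31 (the last token) of
+    // `out` is viewed, cast to F32 if needed, and copied into cache.mtp_h_prev.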
+ if (cache.mtp_h_prev_enabled && cache.mtp_h_prev) { + const int n_embd_hp = (int)cache.mtp_h_prev->ne[0]; + ggml_tensor * h_prev_src = out; + if (n_tokens > 1) { + h_prev_src = ggml_view_2d(ctx, out, + n_embd_hp, 1, + ggml_row_size(out->type, n_embd_hp), + ggml_row_size(out->type, n_embd_hp) * (n_tokens - 1)); + } + if (h_prev_src->type != GGML_TYPE_F32) { + h_prev_src = ggml_cast(ctx, h_prev_src, GGML_TYPE_F32); + } + h_prev_src = ggml_reshape_2d(ctx, h_prev_src, n_embd_hp, 1); + ggml_build_forward_expand(gf, ggml_cpy(ctx, h_prev_src, cache.mtp_h_prev)); + } + + // ── last_token_logits_only: slice to the final token before lm_head ──────── + // During chunked prefill we only need the last token's logits to seed decode. + // Slicing here reduces lm_head compute from O(n_tokens) to O(1) and avoids + // allocating a [vocab, n_tokens] output tensor (saves ~1 GB for chunk_size=1024). + if (in.last_token_logits_only && n_tokens > 1) { + out = ggml_view_2d(ctx, out, + n_embd, 1, + ggml_row_size(out->type, n_embd), + ggml_row_size(out->type, n_embd) * (n_tokens - 1)); + } + + // ── LM head ──────────────────────────────────────────────────────────────── + ggml_tensor * logits = ggml_mul_mat(ctx, w.output, out); + + // ── Logit softcapping: logits = softcap * tanh(logits / softcap) ────────── + if (w.logit_softcap > 0.0f) { + logits = ggml_scale(ctx, logits, 1.0f / w.logit_softcap); + logits = ggml_tanh(ctx, logits); + logits = ggml_scale(ctx, logits, w.logit_softcap); + } + + ggml_set_name(logits, "logits"); + ggml_build_forward_expand(gf, logits); + + GemmaGraphOutputs og{}; + og.logits = logits; + return og; +} + +} // namespace dflash27b diff --git a/dflash/src/gemma4_target_loader.cpp b/dflash/src/gemma4_target_loader.cpp new file mode 100644 index 00000000..d8823770 --- /dev/null +++ b/dflash/src/gemma4_target_loader.cpp @@ -0,0 +1,1200 @@ +// Loads a Gemma4 target model (31B Dense or 26B-A4B MoE) from a GGUF file into +// a GemmaTargetWeights struct backed by the supplied ggml backend (typically +// CUDA). +// +// The expected GGUF architecture string is "gemma4". The loader supports both +// the dense variant (60 layers, pure SwiGLU FFN) and the MoE variant (30 +// layers, sparse expert FFN on the "26B-A4B" config). 
+// +// Tensor naming follows llama.cpp's gemma4-iswa.cpp conventions: +// +// Global: +// token_embd.weight [n_embd, n_vocab] +// output_norm.weight [n_embd] +// output.weight [n_vocab, n_embd] (optional; falls back) +// +// Per-Layer Embedding (PLE, present when n_embd_per_layer > 0): +// per_layer_token_embd.weight [n_embd_per_layer * n_layer, n_vocab] +// per_layer_model_proj.weight [n_embd, n_embd_per_layer * n_layer] +// per_layer_proj_norm.weight [n_embd_per_layer] +// blk.{i}.inp_gate.weight [n_embd, n_embd_per_layer] +// blk.{i}.proj.weight [n_embd_per_layer, n_embd] +// blk.{i}.post_norm.weight [n_embd] +// +// Per-Layer Attention: +// blk.{i}.attn_norm.weight [n_embd] +// blk.{i}.attn_q.weight [n_embd, n_head * head_dim] +// blk.{i}.attn_k.weight [n_embd, n_head_kv * head_dim] (optional) +// blk.{i}.attn_v.weight [n_embd, n_head_kv * head_dim] (optional) +// blk.{i}.attn_output.weight [n_head * head_dim, n_embd] +// blk.{i}.attn_q_norm.weight [head_dim] +// blk.{i}.attn_k_norm.weight [head_dim] (optional) +// blk.{i}.attn_post_norm.weight [n_embd] +// blk.{i}.rope_freqs.weight [head_dim/2] (full-attn layers only) +// blk.{i}.out_scale.weight [1] (optional) +// +// Per-Layer FFN (SwiGLU): +// blk.{i}.ffn_norm.weight [n_embd] +// blk.{i}.ffn_gate.weight [n_embd, n_ff] +// blk.{i}.ffn_up.weight [n_embd, n_ff] +// blk.{i}.ffn_down.weight [n_ff, n_embd] +// blk.{i}.ffn_post_norm.weight [n_embd] +// +// Per-Layer MoE (26B-A4B only, present when n_expert > 0): +// blk.{i}.ffn_gate_inp.weight [n_embd, n_expert] +// blk.{i}.ffn_gate_inp.scale [n_embd] (optional) +// blk.{i}.ffn_pre_norm_2.weight [n_embd] +// blk.{i}.ffn_gate_up_exps.weight [n_embd, n_ff_exp*2, n_expert] +// blk.{i}.ffn_down_exps.weight [n_ff_exp, n_embd, n_expert] +// blk.{i}.ffn_down_exps.scale [n_expert] (optional) +// blk.{i}.ffn_post_norm_1.weight [n_embd] +// blk.{i}.ffn_post_norm_2.weight [n_embd] +// +// KV-sharing: layers with index >= (n_layer - n_kv_shared_layers) omit wk, wv, +// k_norm. Their KV is borrowed from the last non-shared layer of the same +// attention type. layer_to_kv_idx maps each layer to its KV cache slot; +// layer_to_donor_kv maps shared layers to their donor layer index. + +#include "internal.h" + +#include +#include +#include +#include +#include +#include +#include + +#if !defined(_WIN32) +#include +#include +#include +#include +#include +#endif + +namespace dflash27b { + +namespace { + +// ─── Thin mmap wrapper ─────────────────────────────────────────────────────── +// Mirrors the Mmap struct from gguf_target_loader.cpp. Ownership can be +// transferred to a CpuEmbedder via release(). 
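+// Usage sketch (illustrative, not part of the loader below):
+//     Mmap m;
+//     std::string err;
+//     if (!m.open_ro("/path/to/model.gguf", err)) { /* handle err */ }
+//     // ... read bytes via m.addr / m.len ...
+//     // ~Mmap() unmaps automatically unless release() handed ownership off.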
+ +struct Mmap { + void * addr = nullptr; + size_t len = 0; +#if defined(_WIN32) + HANDLE hFile = INVALID_HANDLE_VALUE; + HANDLE hMap = nullptr; +#else + int fd = -1; +#endif + + bool open_ro(const std::string & path, std::string & err) { +#if defined(_WIN32) + hFile = CreateFileA(path.c_str(), GENERIC_READ, FILE_SHARE_READ, + nullptr, OPEN_EXISTING, FILE_ATTRIBUTE_NORMAL, nullptr); + if (hFile == INVALID_HANDLE_VALUE) { + err = "CreateFileA: " + path + ": error " + std::to_string(GetLastError()); + return false; + } + LARGE_INTEGER sz; + if (!GetFileSizeEx(hFile, &sz)) { + err = "GetFileSizeEx: error " + std::to_string(GetLastError()); + return false; + } + len = (size_t)sz.QuadPart; + hMap = CreateFileMappingA(hFile, nullptr, PAGE_READONLY, 0, 0, nullptr); + if (!hMap) { + err = "CreateFileMappingA: error " + std::to_string(GetLastError()); + return false; + } + addr = MapViewOfFile(hMap, FILE_MAP_READ, 0, 0, 0); + if (!addr) { + err = "MapViewOfFile: error " + std::to_string(GetLastError()); + return false; + } +#else + fd = ::open(path.c_str(), O_RDONLY); + if (fd < 0) { err = "open: " + path + ": " + std::strerror(errno); return false; } + struct stat st; + if (::fstat(fd, &st) < 0) { err = "fstat: " + std::string(std::strerror(errno)); return false; } + len = (size_t)st.st_size; + addr = ::mmap(nullptr, len, PROT_READ, MAP_PRIVATE, fd, 0); + if (addr == MAP_FAILED) { err = "mmap: " + std::string(std::strerror(errno)); addr = nullptr; return false; } +#endif + return true; + } + + void release() { + addr = nullptr; + len = 0; +#if defined(_WIN32) + hFile = INVALID_HANDLE_VALUE; + hMap = nullptr; +#else + fd = -1; +#endif + } + + ~Mmap() { +#if defined(_WIN32) + if (addr) UnmapViewOfFile(addr); + if (hMap) CloseHandle(hMap); + if (hFile != INVALID_HANDLE_VALUE) CloseHandle(hFile); +#else + if (addr) ::munmap(addr, len); + if (fd >= 0) ::close(fd); +#endif + } +}; + +// ─── GGUF metadata helpers ─────────────────────────────────────────────────── + +static uint32_t get_u32_or(const gguf_context * g, const char * key, uint32_t fallback) { + int64_t id = gguf_find_key(g, key); + if (id < 0) return fallback; + return gguf_get_val_u32(g, id); +} + +static float get_f32_or(const gguf_context * g, const char * key, float fallback) { + int64_t id = gguf_find_key(g, key); + if (id < 0) return fallback; + return gguf_get_val_f32(g, id); +} + +static size_t align_up(size_t x, size_t a) { + if (a == 0) return x; + const size_t r = x % a; + return r == 0 ? x : x + (a - r); +} + +// ─── Tensor selection filter ───────────────────────────────────────────────── +// +// All tensors go to GPU, including token_embd.weight which doubles as the LM +// head (tied weights in Gemma4-26B-A4B). The CPU embedder keeps its own +// read-only mmap view of tok_embd for the input embedding path, so placing +// it on GPU as well is safe and necessary for correct LM head logits. + +static bool is_gemma4_gpu_tensor(const char * name) { + (void)name; + return true; +} + +} // namespace + +// ─── load_gemma4_target_gguf ───────────────────────────────────────────────── + +bool load_gemma4_target_gguf(const std::string & path, + ggml_backend_t backend, + GemmaTargetWeights & out) { + + // ── 1. 
Parse GGUF metadata ─────────────────────────────────────────────────
+
+    ggml_context * meta_ctx = nullptr;
+    gguf_init_params gip{};
+    gip.no_alloc = true;
+    gip.ctx      = &meta_ctx;
+    gguf_context * gctx = gguf_init_from_file(path.c_str(), gip);
+    if (!gctx) {
+        set_last_error("gguf_init_from_file failed: " + path);
+        return false;
+    }
+
+    // Validate architecture string.
+    {
+        int64_t arch_id = gguf_find_key(gctx, "general.architecture");
+        if (arch_id < 0) {
+            set_last_error("missing general.architecture");
+            gguf_free(gctx);
+            return false;
+        }
+        const char * arch = gguf_get_val_str(gctx, arch_id);
+        if (std::string(arch) != "gemma4") {
+            set_last_error(std::string("unexpected arch: ") + arch + " (expected gemma4)");
+            gguf_free(gctx);
+            return false;
+        }
+    }
+
+    // Read required architecture hyperparameters.
+    const uint32_t n_embd  = get_u32_or(gctx, "gemma4.embedding_length", 0);
+    const uint32_t n_layer = get_u32_or(gctx, "gemma4.block_count", 0);
+    const uint32_t n_ff    = get_u32_or(gctx, "gemma4.feed_forward_length", 0);
+    const uint32_t n_head  = get_u32_or(gctx, "gemma4.attention.head_count", 0);
+    // Fix A: head_count_kv may be a per-layer INT32 array, not a scalar.
+    std::vector<int> head_kv_per_layer;
+    uint32_t n_head_kv_max = 0;
+    {
+        int64_t kv_id = gguf_find_key(gctx, "gemma4.attention.head_count_kv");
+        if (kv_id >= 0) {
+            enum gguf_type kv_type = gguf_get_kv_type(gctx, kv_id);
+            if (kv_type == GGUF_TYPE_ARRAY) {
+                size_t arr_n = gguf_get_arr_n(gctx, kv_id);
+                const int32_t * arr = (const int32_t *)gguf_get_arr_data(gctx, kv_id);
+                head_kv_per_layer.resize(arr_n);
+                for (size_t i = 0; i < arr_n; i++) {
+                    head_kv_per_layer[i] = (int)arr[i];
+                    if ((uint32_t)arr[i] > n_head_kv_max) n_head_kv_max = (uint32_t)arr[i];
+                }
+            } else {
+                // Scalar fallback
+                n_head_kv_max = gguf_get_val_u32(gctx, kv_id);
+            }
+        }
+    }
+    const uint32_t n_head_kv = n_head_kv_max;
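+
+    // Example (hypothetical values): a checkpoint may store
+    //   gemma4.attention.head_count_kv = [8, 4, 8, 4, ...]  (one entry per layer)
+    // in which case head_kv_per_layer keeps the whole array and n_head_kv
+    // becomes 8 (the max, used for KV cache allocation), while an older export
+    // storing the scalar 8 leaves head_kv_per_layer empty.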
+
+    // Fix D: read both full-attn and SWA head dims.
+    const uint32_t head_dim     = get_u32_or(gctx, "gemma4.attention.key_length", 0);
+    const uint32_t head_dim_swa = get_u32_or(gctx, "gemma4.attention.key_length_swa", head_dim);
+
+    // Fix B: vocab_size key may be absent — fall back to tokenizer array length.
+    uint32_t n_vocab = get_u32_or(gctx, "gemma4.vocab_size", 0);
+    if (n_vocab == 0) {
+        int64_t tok_id = gguf_find_key(gctx, "tokenizer.ggml.tokens");
+        if (tok_id >= 0) n_vocab = (uint32_t)gguf_get_arr_n(gctx, tok_id);
+    }
+    const uint32_t swa_win          = get_u32_or(gctx, "gemma4.attention.sliding_window", 1024);
+    const uint32_t n_kv_shared      = get_u32_or(gctx, "gemma4.attention.shared_kv_layers", 0);
+    const uint32_t n_embd_per_layer = get_u32_or(gctx, "gemma4.embedding_length_per_layer_input", 0);
+    const uint32_t n_expert         = get_u32_or(gctx, "gemma4.expert_count", 0);
+    const uint32_t n_expert_used    = get_u32_or(gctx, "gemma4.expert_used_count", 0);
+    const uint32_t n_ff_exp         = get_u32_or(gctx, "gemma4.expert_feed_forward_length", 0);
+
+    const float rope_theta     = get_f32_or(gctx, "gemma4.rope.freq_base", 1000000.0f);
+    const float rope_theta_swa = get_f32_or(gctx, "gemma4.rope.freq_base_swa", 1000000.0f);
+    const float logit_softcap  = get_f32_or(gctx, "gemma4.final_logit_softcapping", 30.0f);
+
+    if (n_embd == 0 || n_layer == 0 || n_ff == 0 ||
+        n_head == 0 || n_head_kv == 0 || head_dim == 0 || n_vocab == 0) {
+        char buf[256];
+        std::snprintf(buf, sizeof(buf),
+                      "missing or zero required hparams: n_embd=%u n_layer=%u n_ff=%u "
+                      "n_head=%u n_head_kv=%u head_dim=%u n_vocab=%u",
+                      n_embd, n_layer, n_ff, n_head, n_head_kv, head_dim, n_vocab);
+        set_last_error(buf);
+        gguf_free(gctx);
+        return false;
+    }
+
+    // ── 2. Build the per-layer SWA pattern ───────────────────────────────────
+    //
+    // swa_layers[il] != 0 → sliding-window attention; == 0 → full attention.
+    // The array is stored as GGUF_TYPE_ARRAY of INT32 or BOOL. If absent we
+    // default to alternating: odd layers use SWA, even layers use full attn
+    // (matches Gemma4-31B's default pattern).
+
+    std::vector<bool> swa_layers(n_layer, false);
+    {
+        int64_t swa_arr_id = gguf_find_key(gctx, "gemma4.attention.sliding_window_pattern");
+        // Fix C: sliding_window_pattern may be a BOOL array (1-byte), not INT32.
+        if (swa_arr_id >= 0) {
+            size_t arr_n = gguf_get_arr_n(gctx, swa_arr_id);
+            enum gguf_type arr_type = gguf_get_arr_type(gctx, swa_arr_id);
+            const void * arr_data = gguf_get_arr_data(gctx, swa_arr_id);
+            for (size_t i = 0; i < arr_n && i < n_layer; i++) {
+                if (arr_type == GGUF_TYPE_BOOL || arr_type == GGUF_TYPE_INT8 || arr_type == GGUF_TYPE_UINT8) {
+                    swa_layers[i] = (((const uint8_t *)arr_data)[i] != 0);
+                } else {
+                    swa_layers[i] = (((const int32_t *)arr_data)[i] != 0);
+                }
+            }
+        } else {
+            // Fallback: odd-indexed layers → SWA, even → full attention.
+            for (uint32_t i = 0; i < n_layer; i++) {
+                swa_layers[i] = ((i % 2) == 1);
+            }
+        }
+    }
+
+    // ── 3. Build KV-sharing maps ─────────────────────────────────────────────
+    //
+    // Layers [0, n_layer - n_kv_shared_layers) own their own KV cache slot.
+    // Layers [n_layer - n_kv_shared_layers, n_layer) are KV-shared: they borrow
+    // KV from the last non-shared layer that has the same attention type (SWA
+    // or full). layer_to_kv_idx[il] == -1 for shared layers.
+
+    const int n_non_shared = (int)n_layer - (int)n_kv_shared;
+    if (n_non_shared < 0) {
+        char buf[128];
+        std::snprintf(buf, sizeof(buf),
+                      "n_kv_shared_layers=%u > n_layer=%u", n_kv_shared, n_layer);
+        set_last_error(buf);
+        gguf_free(gctx);
+        return false;
+    }
+
+    std::vector<int> layer_to_kv_idx((size_t)n_layer, -1);
+    std::vector<int> layer_to_donor_kv((size_t)n_layer, -1);
+    {
+        int kv_slot = 0;
+        for (int il = 0; il < n_non_shared; il++) {
+            layer_to_kv_idx[il] = kv_slot++;
+        }
+        // Shared layers find their donor: the last non-shared layer with the
+        // same attention type.
+        for (int il = n_non_shared; il < (int)n_layer; il++) {
+            bool is_swa = swa_layers[(size_t)il];
+            int donor = -1;
+            for (int j = n_non_shared - 1; j >= 0; j--) {
+                if (swa_layers[(size_t)j] == is_swa) { donor = j; break; }
+            }
+            layer_to_donor_kv[il] = donor;
+            // kv_idx stays -1 (no dedicated slot).
+        }
+    }
+    const int n_kv_slots = n_non_shared; // total distinct KV cache entries
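+
+    // Worked example (hypothetical 6-layer model, n_kv_shared = 2, alternating
+    // pattern full/SWA/full/SWA/full/SWA):
+    //   layer_to_kv_idx   = {  0,  1,  2,  3, -1, -1 }
+    //   layer_to_donor_kv = { -1, -1, -1, -1,  2,  3 }
+    // Layer 4 (full attention) borrows KV from layer 2, the last non-shared
+    // full-attention layer; layer 5 (SWA) borrows from layer 3.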
+
+    // ── 4. Populate struct metadata ──────────────────────────────────────────
+
+    out.ctx     = meta_ctx;
+    out.backend = backend;
+
+    out.n_embd             = (int)n_embd;
+    out.n_head             = (int)n_head;
+    out.n_head_kv          = (int)n_head_kv;
+    out.head_dim           = (int)head_dim;
+    out.head_dim_swa       = (int)head_dim_swa;
+    out.head_kv_per_layer  = head_kv_per_layer;
+    out.n_layer            = (int)n_layer;
+    out.n_ff               = (int)n_ff;
+    out.n_vocab            = (int)n_vocab;
+    out.n_embd_per_layer   = (int)n_embd_per_layer;
+    out.swa_window         = (int)swa_win;
+    out.swa_layers         = swa_layers;
+    out.n_kv_shared_layers = (int)n_kv_shared;
+    out.n_layer_kv         = n_kv_slots;
+    out.rope_theta         = rope_theta;
+    out.rope_theta_swa     = rope_theta_swa;
+    out.n_expert           = (int)n_expert;
+    out.n_expert_used      = (int)n_expert_used;
+    out.n_ff_exp           = (int)n_ff_exp;
+    out.logit_softcap      = logit_softcap;
+
+    // BOS / EOS tokens (missing key → -1)
+    {
+        const uint32_t kMissing = 0xFFFFFFFFu;
+        const uint32_t raw_bos = get_u32_or(gctx, "tokenizer.ggml.bos_token_id", kMissing);
+        const uint32_t raw_eos = get_u32_or(gctx, "tokenizer.ggml.eos_token_id", kMissing);
+        const uint32_t raw_eot = get_u32_or(gctx, "tokenizer.ggml.eot_token_id", kMissing);
+        out.bos_id      = (raw_bos == kMissing) ? -1 : (int32_t)raw_bos;
+        out.eos_id      = (raw_eos == kMissing) ? -1 : (int32_t)raw_eos;
+        out.eos_chat_id = (raw_eot == kMissing) ? -1 : (int32_t)raw_eot;
+
+        // Gemma4 fallback: <end_of_turn> (107) is the chat stop token.
+        // Many GGUFs omit eot_token_id; default to 107 when missing.
+        if (out.eos_chat_id < 0) {
+            out.eos_chat_id = 107;
+        }
+
+        std::printf("[gemma4_loader] bos_id=%d eos_id=%d eos_chat_id=%d\n",
+                    out.bos_id, out.eos_id, out.eos_chat_id);
+    }
+
+    // ── 5. Compute capture_layer_ids ─────────────────────────────────────────
+    //
+    // Use hardcoded values from the DFlash draft model config.json; fall back
+    // to an evenly-spaced formula for unknown layer counts.
+    {
+        const int N = GEMMA4_DRAFT_N_TARGET_LAYERS; // 6
+        if ((int)n_layer == 30) {
+            // Gemma4-26B-A4B — from z-lab/gemma-4-26B-A4B-it-DFlash config.json
+            const int ids[6] = {1, 6, 11, 17, 22, 27};
+            for (int k = 0; k < N; k++) out.capture_layer_ids[k] = ids[k];
+        } else if ((int)n_layer == 60) {
+            // Gemma4-31B — from z-lab/gemma-4-31B-it-DFlash config.json
+            const int ids[6] = {1, 12, 23, 35, 46, 57};
+            for (int k = 0; k < N; k++) out.capture_layer_ids[k] = ids[k];
+        } else {
+            // Fallback: evenly spaced
+            const int step = ((int)n_layer - 2) / (N - 1);
+            for (int k = 0; k < N; k++) out.capture_layer_ids[k] = 1 + k * step;
+        }
+        std::printf("[gemma4_loader] capture_layer_ids:");
+        for (int k = 0; k < N; k++) std::printf(" %d", out.capture_layer_ids[k]);
+        std::printf("\n");
+    }
+
+    // ── 6. Wire tensor pointers ──────────────────────────────────────────────
+
+    auto g = [&](const char * name) -> ggml_tensor * {
+        return ggml_get_tensor(meta_ctx, name);
+    };
+
+    out.tok_embd = g("token_embd.weight");
+    out.out_norm = g("output_norm.weight");
+    // output.weight is optional; fall back to token_embd for tied weights.
+ out.output = g("output.weight"); + if (!out.output) out.output = out.tok_embd; + + if (!out.tok_embd || !out.out_norm) { + set_last_error("missing top-level tensors (token_embd.weight / output_norm.weight)"); + gguf_free(gctx); + return false; + } + + // Global PLE tensors (present only when n_embd_per_layer > 0) + if (n_embd_per_layer > 0) { + out.per_layer_tok_embd = g("per_layer_token_embd.weight"); + out.per_layer_model_proj = g("per_layer_model_proj.weight"); + out.per_layer_proj_norm = g("per_layer_proj_norm.weight"); + if (!out.per_layer_tok_embd || !out.per_layer_model_proj || !out.per_layer_proj_norm) { + set_last_error("n_embd_per_layer > 0 but PLE global tensors missing"); + gguf_free(gctx); + return false; + } + } + + // Load global rope_freqs tensor (full-attention layers use this for proportional RoPE). + // Gemma4 stores one shared rope_freqs.weight (not per-layer blk.{i}.rope_freqs.weight). + // All full-attention layers share this single tensor, matching llama.cpp's TENSOR_DUPLICATED + // pattern (llama-model.cpp:4657-4658). + ggml_tensor * global_rope_freqs = g("rope_freqs.weight"); + + // Per-layer tensors. + out.layers.assign((size_t)n_layer, GemmaTargetLayer{}); + + for (int il = 0; il < (int)n_layer; il++) { + char name[160]; + auto fnd = [&](const char * suffix) -> ggml_tensor * { + std::snprintf(name, sizeof(name), "blk.%d.%s", il, suffix); + return ggml_get_tensor(meta_ctx, name); + }; + + GemmaTargetLayer & L = out.layers[(size_t)il]; + + // ── Attention (always present) ──────────────────────────────────────── + L.attn_norm = fnd("attn_norm.weight"); + L.wq = fnd("attn_q.weight"); + L.wo = fnd("attn_output.weight"); + L.q_norm = fnd("attn_q_norm.weight"); + // This GGUF uses "post_attention_norm.weight"; fall back to legacy name + L.attn_post_norm = fnd("post_attention_norm.weight"); + if (!L.attn_post_norm) L.attn_post_norm = fnd("attn_post_norm.weight"); + + if (!L.attn_norm || !L.wq || !L.wo || !L.q_norm || !L.attn_post_norm) { + char b[128]; + std::snprintf(b, sizeof(b), "layer %d: missing required attention tensor", il); + set_last_error(b); + gguf_free(gctx); + return false; + } + + // wk, wv, k_norm — absent for KV-shared layers (il >= n_non_shared). + const bool is_kv_owner = (il < n_non_shared); + if (is_kv_owner) { + L.wk = fnd("attn_k.weight"); + L.wv = fnd("attn_v.weight"); + L.k_norm = fnd("attn_k_norm.weight"); + if (!L.wk) { + char b[128]; + std::snprintf(b, sizeof(b), "layer %d: expected wk (non-shared), missing", il); + set_last_error(b); + gguf_free(gctx); + return false; + } + // V may be absent on full-attention layers where V == K (shared K/V). + if (!L.wv) { + L.wv = L.wk; + } + // k_norm may be absent for SWA layers in some checkpoints; allow nullptr. + } + + // Optional per-layer tensors + L.rope_freqs = fnd("rope_freqs.weight"); + // Full-attention layers use proportional RoPE via rope_freqs (freq_factors). + // Gemma4 stores a single global rope_freqs.weight (no per-layer blk.{i} variant). + // Fall back to the global tensor for full-attention layers when the per-layer + // variant is absent (which is always the case for this GGUF format). 
+        if (!L.rope_freqs && !swa_layers[(size_t)il] && global_rope_freqs) {
+            L.rope_freqs = global_rope_freqs;
+        }
+        // This GGUF uses "layer_output_scale.weight"; fall back to the legacy name.
+        L.out_scale = fnd("layer_output_scale.weight");
+        if (!L.out_scale) L.out_scale = fnd("out_scale.weight");
+
+        // ── FFN (always present) ─────────────────────────────────────────────
+        L.ffn_norm = fnd("ffn_norm.weight");
+        L.w_gate   = fnd("ffn_gate.weight");
+        L.w_up     = fnd("ffn_up.weight");
+        L.w_down   = fnd("ffn_down.weight");
+        // This GGUF uses "post_ffw_norm.weight"; fall back to the legacy name.
+        L.ffn_post_norm = fnd("post_ffw_norm.weight");
+        if (!L.ffn_post_norm) L.ffn_post_norm = fnd("ffn_post_norm.weight");
+
+        if (!L.ffn_norm || !L.w_gate || !L.w_up || !L.w_down || !L.ffn_post_norm) {
+            char b[128];
+            std::snprintf(b, sizeof(b), "layer %d: missing required FFN tensor", il);
+            set_last_error(b);
+            gguf_free(gctx);
+            return false;
+        }
+
+        // ── MoE (26B-A4B — present when n_expert > 0) ────────────────────────
+        if (n_expert > 0) {
+            L.ffn_gate_inp   = fnd("ffn_gate_inp.weight");
+            L.ffn_gate_inp_s = fnd("ffn_gate_inp.scale");
+            // This GGUF uses "pre_ffw_norm_2.weight"; fall back to the legacy name.
+            L.ffn_pre_norm_2 = fnd("pre_ffw_norm_2.weight");
+            if (!L.ffn_pre_norm_2) L.ffn_pre_norm_2 = fnd("ffn_pre_norm_2.weight");
+            L.ffn_gate_up_exps = fnd("ffn_gate_up_exps.weight");
+            L.ffn_down_exps    = fnd("ffn_down_exps.weight");
+            L.ffn_down_exps_s  = fnd("ffn_down_exps.scale");
+            // This GGUF uses "post_ffw_norm_1/2.weight"; fall back to the legacy names.
+            L.ffn_post_norm_1 = fnd("post_ffw_norm_1.weight");
+            if (!L.ffn_post_norm_1) L.ffn_post_norm_1 = fnd("ffn_post_norm_1.weight");
+            L.ffn_post_norm_2 = fnd("post_ffw_norm_2.weight");
+            if (!L.ffn_post_norm_2) L.ffn_post_norm_2 = fnd("ffn_post_norm_2.weight");
+
+            if (!L.ffn_gate_inp || !L.ffn_pre_norm_2 ||
+                !L.ffn_gate_up_exps || !L.ffn_down_exps ||
+                !L.ffn_post_norm_1 || !L.ffn_post_norm_2) {
+                char b[128];
+                std::snprintf(b, sizeof(b), "layer %d: MoE model but missing expert tensor", il);
+                set_last_error(b);
+                gguf_free(gctx);
+                return false;
+            }
+            // ffn_gate_inp_s, ffn_down_exps_s are optional quantization scales.
+        }
+
+        // ── Per-Layer Embedding (PLE) ────────────────────────────────────────
+        if (n_embd_per_layer > 0) {
+            L.ple_inp_gate  = fnd("inp_gate.weight");
+            L.ple_proj      = fnd("proj.weight");
+            L.ple_post_norm = fnd("post_norm.weight");
+            if (!L.ple_inp_gate || !L.ple_proj || !L.ple_post_norm) {
+                char b[128];
+                std::snprintf(b, sizeof(b), "layer %d: PLE model but missing per-layer embedding tensor", il);
+                set_last_error(b);
+                gguf_free(gctx);
+                return false;
+            }
+        }
+    }
+
+    // ── 7. Allocate GPU buffer ───────────────────────────────────────────────
+    //
+    // Walk all GGUF tensors (including token_embd.weight, which doubles as the
+    // tied LM head on GPU while the CPU embedder keeps its own mmap view),
+    // accumulate aligned sizes, allocate one contiguous backend buffer, and
+    // assign each tensor its slot.
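+    //
+    // Layout example (hypothetical alignment = 128 bytes):
+    //   tensor A, 100000 B -> buf_offset 0
+    //   tensor B,   4096 B -> buf_offset align_up(100000, 128)        = 100096
+    //   tensor C           -> buf_offset align_up(100096 + 4096, 128) = 104192
+    // total_gpu grows by the backend's alloc size per tensor, padded to the
+    // buffer-type alignment between tensors.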
+
+    ggml_backend_buffer_type_t buft = ggml_backend_get_default_buffer_type(backend);
+    const size_t alignment = ggml_backend_buft_get_alignment(buft);
+
+    struct TensorSlot {
+        ggml_tensor * tensor = nullptr;
+        size_t file_offset = 0;
+        size_t file_size   = 0;
+        size_t buf_offset  = 0;
+    };
+
+    std::vector<TensorSlot> slots;
+    size_t total_gpu = 0;
+    const int64_t n_tensors = gguf_get_n_tensors(gctx);
+    for (int64_t tid = 0; tid < n_tensors; tid++) {
+        const char * tname = gguf_get_tensor_name(gctx, tid);
+        if (!is_gemma4_gpu_tensor(tname)) continue;
+        ggml_tensor * t = ggml_get_tensor(meta_ctx, tname);
+        if (!t) continue;
+        total_gpu = align_up(total_gpu, alignment);
+        TensorSlot s;
+        s.tensor      = t;
+        s.file_offset = gguf_get_data_offset(gctx) + gguf_get_tensor_offset(gctx, tid);
+        s.file_size   = gguf_get_tensor_size(gctx, tid);
+        s.buf_offset  = total_gpu;
+        total_gpu += ggml_backend_buft_get_alloc_size(buft, t);
+        slots.push_back(s);
+    }
+
+    if (slots.empty()) {
+        set_last_error("no GPU tensors found in gemma4 GGUF");
+        gguf_free(gctx);
+        return false;
+    }
+
+    // Cleanup helper: release any GPU buffer and ggml context already assigned
+    // to `out` before returning false. Must be called on every failure path
+    // after out.buf has been (or is about to be) allocated.
+    auto cleanup_out = [&]() {
+        if (out.buf) {
+            ggml_backend_buffer_free(out.buf);
+            out.buf = nullptr;
+        }
+        // out.ctx == meta_ctx; free it so the caller doesn't leak the graph.
+        if (out.ctx) {
+            ggml_free(out.ctx);
+            out.ctx = nullptr;
+        }
+        out = GemmaTargetWeights{};
+    };
+
+    out.buf = ggml_backend_alloc_buffer(backend, total_gpu);
+    if (!out.buf) {
+        set_last_error("ggml_backend_alloc_buffer failed (gemma4 target)");
+        gguf_free(gctx);
+        cleanup_out();
+        return false;
+    }
+    ggml_backend_buffer_set_usage(out.buf, GGML_BACKEND_BUFFER_USAGE_WEIGHTS);
+
+    char * base = (char *)ggml_backend_buffer_get_base(out.buf);
+    for (const TensorSlot & s : slots) {
+        if (ggml_backend_tensor_alloc(out.buf, s.tensor, base + s.buf_offset) != GGML_STATUS_SUCCESS) {
+            set_last_error("ggml_backend_tensor_alloc failed (gemma4 target)");
+            gguf_free(gctx);
+            cleanup_out();
+            return false;
+        }
+    }
+
+    // ── 8. 
mmap file, upload GPU tensors, keep tok_embd on CPU ─────────────── + + std::string err; + Mmap mm; + if (!mm.open_ro(path, err)) { + set_last_error(err); + gguf_free(gctx); + cleanup_out(); + return false; + } + + const size_t data_start = gguf_get_data_offset(gctx); + size_t gpu_bytes_uploaded = 0; + size_t tok_embd_off = 0; + size_t tok_embd_sz = 0; + ggml_type tok_embd_type = GGML_TYPE_COUNT; + + for (int64_t tid = 0; tid < n_tensors; tid++) { + const char * tname = gguf_get_tensor_name(gctx, tid); + ggml_tensor * t = ggml_get_tensor(meta_ctx, tname); + if (!t) continue; + const size_t off = data_start + gguf_get_tensor_offset(gctx, tid); + const size_t sz = gguf_get_tensor_size(gctx, tid); + if (off + sz > mm.len) { + set_last_error(std::string("tensor '") + tname + "' overflows file"); + gguf_free(gctx); + cleanup_out(); + return false; + } + if (std::strcmp(tname, "token_embd.weight") == 0) { + tok_embd_off = off; + tok_embd_sz = sz; + tok_embd_type = gguf_get_tensor_type(gctx, tid); + // fall through: also upload to GPU for LM head (tied weights) + } + ggml_backend_tensor_set(t, (const uint8_t *)mm.addr + off, 0, sz); + gpu_bytes_uploaded += sz; + } + + gguf_free(gctx); + + if (tok_embd_off == 0 || tok_embd_type == GGML_TYPE_COUNT) { + set_last_error("token_embd.weight not found or invalid type"); + cleanup_out(); + return false; + } + + // Fix 2: validate tok_embd_sz divisibility before computing row stride. + if (n_vocab == 0 || tok_embd_sz % (size_t)n_vocab != 0) { + set_last_error("malformed GGUF: tok_embd_sz=" + std::to_string(tok_embd_sz) + + " not divisible by n_vocab=" + std::to_string(n_vocab)); + cleanup_out(); + return false; + } + + // ── 9. Transfer mmap ownership to CpuEmbedder ──────────────────────────── + + out.embedder.mmap_addr = mm.addr; + out.embedder.mmap_len = mm.len; +#if defined(_WIN32) + out.embedder.mmap_hfile = mm.hFile; + out.embedder.mmap_hmap = mm.hMap; +#else + out.embedder.mmap_fd = mm.fd; +#endif + out.embedder.tok_embd_bytes = (const uint8_t *)mm.addr + tok_embd_off; + out.embedder.tok_embd_type = tok_embd_type; + out.embedder.n_embd = (int64_t)n_embd; + out.embedder.n_vocab = (int64_t)n_vocab; + out.embedder.row_bytes = tok_embd_sz / (size_t)n_vocab; + mm.release(); + + char summary[256]; + std::snprintf(summary, sizeof(summary), + "gemma4 target loaded: n_layer=%u n_embd=%u n_ff=%u n_expert=%u " + "n_kv_slots=%d n_kv_shared=%u, %zu GPU tensors %.2f GiB, " + "tok_embd %.0f MiB GPU+CPU-mmap (%s, tied LM head)", + n_layer, n_embd, n_ff, n_expert, n_kv_slots, n_kv_shared, + slots.size(), (double)gpu_bytes_uploaded / (1024.0 * 1024.0 * 1024.0), + (double)tok_embd_sz / (1024.0 * 1024.0), ggml_type_name(tok_embd_type)); + set_last_error(summary); + + return true; +} + +// ─── load_gemma4_mtp_assistant ─────────────────────────────────────────────── +// +// Loads a Gemma4 MTP assistant GGUF (gemma4_assistant architecture) into +// MtpDrafterWeights. The loader: +// 1. Reads metadata: n_embd_backbone, attention_k_eq_v, n_centroids, etc. +// 2. Reads per-MTP-layer SWA type from gemma4_assistant.attention.sliding_window_pattern. +// 3. Resolves each MTP layer's donor_target_layer = LAST target layer whose +// SWA type matches that MTP layer's SWA type, assuming Dense 31B: +// 60 target layers, alternating pattern (odd-indexed = SWA, even = full attn). +// 4. Loads all tensors into a GPU backend buffer. 
+// +// Tensor names follow llama.cpp's gemma4-assistant.cpp conventions: +// mtp.pre_projection.weight [2*n_bb, n_embd] +// mtp.post_projection.weight [n_embd, n_bb] +// output_norm.weight [n_embd] +// blk.{i}.attn_norm.weight [n_embd] +// blk.{i}.attn_q.weight [n_embd, n_head*head_dim] +// blk.{i}.attn_q_norm.weight [head_dim] +// blk.{i}.attn_output.weight [n_head*head_dim, n_embd] +// blk.{i}.post_attention_norm.weight [n_embd] +// blk.{i}.ffn_norm.weight [n_embd] +// blk.{i}.ffn_gate.weight [n_embd, n_ff] +// blk.{i}.ffn_up.weight [n_embd, n_ff] +// blk.{i}.ffn_down.weight [n_ff, n_embd] +// blk.{i}.post_ffw_norm.weight [n_embd] +// blk.{i}.layer_output_scale.weight [1] (optional) +// +// Metadata keys (prefix = "gemma4_assistant"): +// gemma4_assistant.n_embd_backbone u32 +// gemma4_assistant.n_centroids u32 +// gemma4_assistant.centroid_top_k u32 +// gemma4_assistant.attention.k_eq_v bool +// gemma4_assistant.use_ordered_embeddings bool +// gemma4_assistant.requires_target_arch string + +bool load_gemma4_mtp_assistant(const std::string & gguf_path, + ggml_backend_t backend, + MtpDrafterWeights & out) { + + // ── 1. Open GGUF and read metadata ──────────────────────────────────────── + + ggml_context * meta_ctx = nullptr; + gguf_init_params gip{}; + gip.no_alloc = true; + gip.ctx = &meta_ctx; + gguf_context * gctx = gguf_init_from_file(gguf_path.c_str(), gip); + if (!gctx) { + set_last_error("load_gemma4_mtp_assistant: gguf_init_from_file failed: " + gguf_path); + return false; + } + + // Validate architecture string. + { + int64_t arch_id = gguf_find_key(gctx, "general.architecture"); + if (arch_id < 0) { + set_last_error("load_gemma4_mtp_assistant: missing general.architecture"); + gguf_free(gctx); + return false; + } + const char * arch = gguf_get_val_str(gctx, arch_id); + if (std::string(arch) != "gemma4_assistant") { + set_last_error(std::string("load_gemma4_mtp_assistant: unexpected arch: ") + + arch + " (expected gemma4_assistant)"); + gguf_free(gctx); + return false; + } + } + + // Read MTP-specific metadata. + const uint32_t n_embd = get_u32_or(gctx, "gemma4_assistant.embedding_length", 0); + const uint32_t n_embd_backbone = get_u32_or(gctx, "gemma4_assistant.n_embd_backbone", 0); + const uint32_t n_centroids = get_u32_or(gctx, "gemma4_assistant.n_centroids", 0); + const uint32_t centroid_top_k = get_u32_or(gctx, "gemma4_assistant.centroid_top_k", 0); + bool attention_k_eq_v = false; + bool use_ordered_embeddings = false; + std::string requires_target_arch; + { + int64_t kid = gguf_find_key(gctx, "gemma4_assistant.attention.k_eq_v"); + if (kid >= 0) attention_k_eq_v = gguf_get_val_bool(gctx, kid); + } + { + int64_t kid = gguf_find_key(gctx, "gemma4_assistant.use_ordered_embeddings"); + if (kid >= 0) use_ordered_embeddings = gguf_get_val_bool(gctx, kid); + } + { + int64_t kid = gguf_find_key(gctx, "gemma4_assistant.requires_target_arch"); + if (kid >= 0) requires_target_arch = gguf_get_val_str(gctx, kid); + } + + // Validate n_embd_backbone. + if (n_embd_backbone == 0) { + set_last_error("load_gemma4_mtp_assistant: missing or zero gemma4_assistant.n_embd_backbone"); + gguf_free(gctx); + return false; + } + + // Validate requires_target_arch. + if (requires_target_arch != "gemma4") { + set_last_error(std::string("load_gemma4_mtp_assistant: requires_target_arch='") + + requires_target_arch + "' expected 'gemma4'"); + gguf_free(gctx); + return false; + } + + // Read MTP model's own layer count and SWA pattern. 
+    const uint32_t n_mtp_layer = get_u32_or(gctx, "gemma4_assistant.block_count", 4);
+
+    std::vector<bool> mtp_swa_layers(n_mtp_layer, false);
+    {
+        int64_t swa_arr_id = gguf_find_key(gctx, "gemma4_assistant.attention.sliding_window_pattern");
+        if (swa_arr_id >= 0) {
+            size_t arr_n = gguf_get_arr_n(gctx, swa_arr_id);
+            enum gguf_type arr_type = gguf_get_arr_type(gctx, swa_arr_id);
+            const void * arr_data = gguf_get_arr_data(gctx, swa_arr_id);
+            for (size_t i = 0; i < arr_n && i < (size_t)n_mtp_layer; i++) {
+                if (arr_type == GGUF_TYPE_BOOL || arr_type == GGUF_TYPE_INT8 || arr_type == GGUF_TYPE_UINT8) {
+                    mtp_swa_layers[i] = (((const uint8_t *)arr_data)[i] != 0);
+                } else {
+                    mtp_swa_layers[i] = (((const int32_t *)arr_data)[i] != 0);
+                }
+            }
+        }
+        // If absent, default all MTP layers to non-SWA (full attention).
+    }
+
+    // ── 2. Resolve donor_target_layer per MTP layer ──────────────────────────
+    //
+    // Per atomicbot's gemma4-assistant.cpp:12-27 + 126: for each MTP layer il,
+    // find the LAST target layer whose SWA type matches mtp_swa_layers[il].
+    // We assume the Dense 31B target here: 60 layers, alternating (odd-indexed
+    // = SWA, even = full attn). This matches the fallback in
+    // load_gemma4_target_gguf when no SWA pattern key is found.
+
+    const int target_n_layer = 60; // Dense 31B
+    // Build target SWA pattern: odd = SWA, even = full.
+    std::vector<bool> target_swa(target_n_layer, false);
+    for (int il = 0; il < target_n_layer; il++) {
+        target_swa[il] = ((il % 2) == 1);
+    }
+
+    std::vector<int32_t> donor_per_mtp_layer(n_mtp_layer, -1);
+    for (uint32_t mil = 0; mil < n_mtp_layer; mil++) {
+        bool want_swa = mtp_swa_layers[mil];
+        int32_t best = -1;
+        for (int til = 0; til < target_n_layer; til++) {
+            if (target_swa[til] == want_swa) {
+                best = til;
+            }
+        }
+        donor_per_mtp_layer[mil] = best;
+    }
+
+    // ── 3. Wire tensor pointers ──────────────────────────────────────────────
+
+    auto g = [&](const char * name) -> ggml_tensor * {
+        return ggml_get_tensor(meta_ctx, name);
+    };
+
+    // Global tensors.
+    ggml_tensor * pre_proj  = g("mtp.pre_projection.weight");
+    ggml_tensor * post_proj = g("mtp.post_projection.weight");
+    ggml_tensor * out_norm  = g("output_norm.weight");
+    // Token embedding (tied LM head for the MTP model). Used by the centroid
+    // LM head for get_rows(tok_embd, candidate_ids) → mul_mat(·, h_inner).
+    // Optional: absent in stripped GGUFs; the graph falls back gracefully.
+    ggml_tensor * tok_embd_t = g("token_embd.weight");
+    // Assistant's own RoPE per-dim freq factors (top-level tensor, used for
+    // proportional RoPE on the full-attn MTP layer's Q rotation). The assistant
+    // was trained with ITS OWN rope_freqs, which may differ from the target's.
+    ggml_tensor * rope_freqs_t = g("rope_freqs.weight");
+
+    if (!pre_proj || !post_proj || !out_norm) {
+        char buf[256];
+        std::snprintf(buf, sizeof(buf),
+                      "load_gemma4_mtp_assistant: missing global tensors "
+                      "(pre_projection=%s post_projection=%s output_norm=%s)",
+                      pre_proj  ? "ok" : "MISSING",
+                      post_proj ? "ok" : "MISSING",
+                      out_norm  ? "ok" : "MISSING");
+        set_last_error(buf);
+        gguf_free(gctx);
+        return false;
+    }
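+
+    // Sketch of the centroid-routed LM head described above (illustrative ggml
+    // calls only; candidate_ids and h_inner are hypothetical names, and the
+    // real graph lives in the MTP step builder):
+    //   ggml_tensor * rows   = ggml_get_rows(ctx, tok_embd_t, candidate_ids); // [n_embd, top_k]
+    //   ggml_tensor * logits = ggml_mul_mat(ctx, rows, h_inner);              // [top_k, 1]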
+
+    // Optional centroid tensors. Load them when n_centroids > 0, regardless of
+    // the use_ordered_embeddings flag — some GGUFs may have the flag wrong while
+    // the centroid tensors are present. The graph builder decides whether to
+    // use them.
+    ggml_tensor * centroids_t      = nullptr;
+    ggml_tensor * token_ordering_t = nullptr;
+    if (n_centroids > 0) {
+        centroids_t      = g("mtp.centroids.weight");
+        token_ordering_t = g("mtp.token_ordering.weight");
+        if (use_ordered_embeddings && !centroids_t) {
+            set_last_error("load_gemma4_mtp_assistant: use_ordered_embeddings=true but mtp.centroids.weight missing");
+            gguf_free(gctx);
+            return false;
+        }
+        // centroids/token_ordering are optional when use_ordered_embeddings=false
+        // (they may be present anyway for future use).
+    }
+
+    // Per-layer tensors.
+    std::vector<MtpLayerWeights> mtp_layers(n_mtp_layer);
+    for (uint32_t il = 0; il < n_mtp_layer; il++) {
+        char name[160];
+        auto fnd = [&](const char * suffix) -> ggml_tensor * {
+            std::snprintf(name, sizeof(name), "blk.%u.%s", il, suffix);
+            return ggml_get_tensor(meta_ctx, name);
+        };
+
+        MtpLayerWeights & L = mtp_layers[il];
+        L.is_swa             = mtp_swa_layers[il];
+        L.donor_target_layer = donor_per_mtp_layer[il];
+
+        L.attn_norm      = fnd("attn_norm.weight");
+        L.wq             = fnd("attn_q.weight");
+        L.attn_q_norm    = fnd("attn_q_norm.weight");
+        L.wo             = fnd("attn_output.weight");
+        L.attn_post_norm = fnd("post_attention_norm.weight");
+        L.ffn_norm       = fnd("ffn_norm.weight");
+        L.ffn_up         = fnd("ffn_up.weight");
+        L.ffn_gate       = fnd("ffn_gate.weight");
+        L.ffn_down       = fnd("ffn_down.weight");
+        L.ffn_post_norm  = fnd("post_ffw_norm.weight");
+        L.out_scale      = fnd("layer_output_scale.weight"); // optional
+
+        // Validate required tensors.
+        if (!L.attn_norm || !L.wq || !L.attn_q_norm || !L.wo || !L.attn_post_norm ||
+            !L.ffn_norm || !L.ffn_up || !L.ffn_gate || !L.ffn_down || !L.ffn_post_norm) {
+            char buf[256];
+            std::snprintf(buf, sizeof(buf),
+                          "load_gemma4_mtp_assistant: layer %u missing required tensor "
+                          "(attn_norm=%s wq=%s attn_q_norm=%s wo=%s attn_post_norm=%s "
+                          "ffn_norm=%s ffn_up=%s ffn_gate=%s ffn_down=%s ffn_post_norm=%s)",
+                          il,
+                          L.attn_norm ? "ok" : "MISSING", L.wq ? "ok" : "MISSING",
+                          L.attn_q_norm ? "ok" : "MISSING", L.wo ? "ok" : "MISSING",
+                          L.attn_post_norm ? "ok" : "MISSING",
+                          L.ffn_norm ? "ok" : "MISSING", L.ffn_up ? "ok" : "MISSING",
+                          L.ffn_gate ? "ok" : "MISSING", L.ffn_down ? "ok" : "MISSING",
+                          L.ffn_post_norm ? "ok" : "MISSING");
+            set_last_error(buf);
+            gguf_free(gctx);
+            return false;
+        }
+    }
+
+    // ── 4. Allocate GPU buffer ───────────────────────────────────────────────
+
+    ggml_backend_buffer_type_t buft = ggml_backend_get_default_buffer_type(backend);
+    const size_t alignment = ggml_backend_buft_get_alignment(buft);
+
+    struct TensorSlot {
+        ggml_tensor * tensor = nullptr;
+        size_t file_offset = 0;
+        size_t file_size   = 0;
+        size_t buf_offset  = 0;
+    };
+
+    std::vector<TensorSlot> slots;
+    size_t total_gpu = 0;
+    const int64_t n_tensors = gguf_get_n_tensors(gctx);
+    for (int64_t tid = 0; tid < n_tensors; tid++) {
+        const char * tname = gguf_get_tensor_name(gctx, tid);
+        ggml_tensor * t = ggml_get_tensor(meta_ctx, tname);
+        if (!t) continue;
+        total_gpu = align_up(total_gpu, alignment);
+        TensorSlot s;
+        s.tensor      = t;
+        s.file_offset = gguf_get_data_offset(gctx) + gguf_get_tensor_offset(gctx, tid);
+        s.file_size   = gguf_get_tensor_size(gctx, tid);
+        s.buf_offset  = total_gpu;
+        total_gpu += ggml_backend_buft_get_alloc_size(buft, t);
+        slots.push_back(s);
+    }
+
+    if (slots.empty()) {
+        set_last_error("load_gemma4_mtp_assistant: no tensors found in GGUF");
+        gguf_free(gctx);
+        return false;
+    }
+
+    auto cleanup_out = [&]() {
+        if (out.buffer) { ggml_backend_buffer_free(out.buffer); out.buffer = nullptr; }
+        if (out.ctx)    { ggml_free(out.ctx); out.ctx = nullptr; }
+        out = MtpDrafterWeights{};
+    };
+
+    out.buffer = ggml_backend_alloc_buffer(backend, total_gpu);
+    if (!out.buffer) {
+        set_last_error("load_gemma4_mtp_assistant: ggml_backend_alloc_buffer failed");
+        gguf_free(gctx);
+        cleanup_out();
+        return false;
+    }
+    ggml_backend_buffer_set_usage(out.buffer, GGML_BACKEND_BUFFER_USAGE_WEIGHTS);
+
+    char * base = (char *)ggml_backend_buffer_get_base(out.buffer);
+    for (const TensorSlot & s : slots) {
+        if (ggml_backend_tensor_alloc(out.buffer, s.tensor, base + s.buf_offset) != GGML_STATUS_SUCCESS) {
+            set_last_error("load_gemma4_mtp_assistant: ggml_backend_tensor_alloc failed");
+            gguf_free(gctx);
+            cleanup_out();
+            return false;
+        }
+    }
+
+    // ── 5. mmap and upload tensors ───────────────────────────────────────────
+
+    std::string err;
+    Mmap mm;
+    if (!mm.open_ro(gguf_path, err)) {
+        set_last_error(err);
+        gguf_free(gctx);
+        cleanup_out();
+        return false;
+    }
+
+    const size_t data_start = gguf_get_data_offset(gctx);
+    for (int64_t tid = 0; tid < n_tensors; tid++) {
+        const char * tname = gguf_get_tensor_name(gctx, tid);
+        ggml_tensor * t = ggml_get_tensor(meta_ctx, tname);
+        if (!t) continue;
+        const size_t off = data_start + gguf_get_tensor_offset(gctx, tid);
+        const size_t sz  = gguf_get_tensor_size(gctx, tid);
+        if (off + sz > mm.len) {
+            set_last_error(std::string("load_gemma4_mtp_assistant: tensor '") + tname + "' overflows file");
+            gguf_free(gctx);
+            cleanup_out();
+            return false;
+        }
+        ggml_backend_tensor_set(t, (const uint8_t *)mm.addr + off, 0, sz);
+    }
+
+    gguf_free(gctx);
+
+    // ── 6. Populate output struct ────────────────────────────────────────────
+
+    out.ctx             = meta_ctx;
+    out.backend         = backend;
+    out.pre_projection  = pre_proj;
+    out.post_projection = post_proj;
+    out.output_norm     = out_norm;
+    out.tok_embd        = tok_embd_t;
+    out.rope_freqs      = rope_freqs_t;
+    out.centroids       = centroids_t;
+    out.token_ordering  = token_ordering_t;
+    out.layers          = std::move(mtp_layers);
+    out.n_embd          = (int32_t)n_embd;
+    out.n_embd_backbone = (int32_t)n_embd_backbone;
+    out.n_centroids     = (int32_t)n_centroids;
+    out.centroid_top_k  = (int32_t)centroid_top_k;
+    out.use_ordered_embeddings = use_ordered_embeddings;
+    out.attention_k_eq_v       = attention_k_eq_v;
+    out.requires_target_arch   = requires_target_arch;
+
+    std::printf("[mtp_loader] loaded: n_embd_backbone=%u n_mtp_layers=%u "
+                "attention_k_eq_v=%d n_centroids=%u use_ordered_embeddings=%d "
+                "requires_target_arch=%s tensors=%zu GPU %.2f MiB\n",
+                n_embd_backbone, n_mtp_layer,
+                (int)attention_k_eq_v, n_centroids, (int)use_ordered_embeddings,
+                requires_target_arch.c_str(),
+                slots.size(),
+                (double)total_gpu / (1024.0 * 1024.0));
+
+    for (uint32_t mil = 0; mil < n_mtp_layer; mil++) {
+        std::printf("[mtp_loader] layer[%u]: is_swa=%d donor_target_layer=%d\n",
+                    mil, (int)out.layers[mil].is_swa, out.layers[mil].donor_target_layer);
+    }
+
+    return true;
+}
+
+// ─── free_gemma4_mtp_assistant ───────────────────────────────────────────────
+
+void free_gemma4_mtp_assistant(MtpDrafterWeights & w) {
+    if (w.buffer) { ggml_backend_buffer_free(w.buffer); w.buffer = nullptr; }
+    if (w.ctx)    { ggml_free(w.ctx); w.ctx = nullptr; }
+    w.layers.clear();
+    w.pre_projection  = nullptr;
+    w.post_projection = nullptr;
+    w.output_norm     = nullptr;
+    w.tok_embd        = nullptr;
+    w.centroids       = nullptr;
+    w.token_ordering  = nullptr;
+    w = MtpDrafterWeights{};
+}
+
+// ─── get_mtp_swa_pattern ─────────────────────────────────────────────────────
+
+bool get_mtp_swa_pattern(const std::string & gguf_path,
+                         std::vector<bool> & out_mtp_swa_layers) {
+    ggml_context * meta_ctx = nullptr;
+    gguf_init_params gip{};
+    gip.no_alloc = true;
+    gip.ctx      = &meta_ctx;
+    gguf_context * gctx = gguf_init_from_file(gguf_path.c_str(), gip);
+    if (!gctx) return false;
+
+    // Validate arch.
+    {
+        int64_t aid = gguf_find_key(gctx, "general.architecture");
+        if (aid < 0) { gguf_free(gctx); if (meta_ctx) ggml_free(meta_ctx); return false; }
+        if (std::string(gguf_get_val_str(gctx, aid)) != "gemma4_assistant") {
+            gguf_free(gctx); if (meta_ctx) ggml_free(meta_ctx); return false;
+        }
+    }
+
+    const uint32_t n_mtp_layer = get_u32_or(gctx, "gemma4_assistant.block_count", 4);
+    out_mtp_swa_layers.assign(n_mtp_layer, false);
+
+    int64_t swa_arr_id = gguf_find_key(gctx, "gemma4_assistant.attention.sliding_window_pattern");
+    if (swa_arr_id >= 0) {
+        size_t arr_n = gguf_get_arr_n(gctx, swa_arr_id);
+        enum gguf_type arr_type = gguf_get_arr_type(gctx, swa_arr_id);
+        const void * arr_data = gguf_get_arr_data(gctx, swa_arr_id);
+        for (size_t i = 0; i < arr_n && i < (size_t)n_mtp_layer; i++) {
+            if (arr_type == GGUF_TYPE_BOOL || arr_type == GGUF_TYPE_INT8 || arr_type == GGUF_TYPE_UINT8) {
+                out_mtp_swa_layers[i] = (((const uint8_t *)arr_data)[i] != 0);
+            } else {
+                out_mtp_swa_layers[i] = (((const int32_t *)arr_data)[i] != 0);
+            }
+        }
+    }
+
+    gguf_free(gctx);
+    if (meta_ctx) ggml_free(meta_ctx);
+    return true;
+}
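+
+// Typical call sequence (illustrative):
+//   std::vector<bool> mtp_swa;
+//   if (!get_mtp_swa_pattern(mtp_path, mtp_swa)) { /* not an assistant GGUF */ }
+//   // ... load the target and the MTP assistant, then fix up the donors:
+//   resolve_mtp_donor_layers(mtp_weights, target_weights.swa_layers);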
+
+// ─── resolve_mtp_donor_layers ────────────────────────────────────────────────
+
+void resolve_mtp_donor_layers(MtpDrafterWeights & mtp,
+                              const std::vector<bool> & target_swa_layers) {
+    const int n_target = (int)target_swa_layers.size();
+    for (auto & L : mtp.layers) {
+        // Find the LAST target layer whose SWA type matches this MTP layer.
+        bool want_swa = L.is_swa;
+        int32_t best = -1;
+        for (int til = 0; til < n_target; ++til) {
+            if (target_swa_layers[(size_t)til] == want_swa) {
+                best = til;
+            }
+        }
+        L.donor_target_layer = best;
+    }
+}
+
+// ─── free_gemma4_target_weights ──────────────────────────────────────────────
+
+void free_gemma4_target_weights(GemmaTargetWeights & w) {
+    if (w.buf) { ggml_backend_buffer_free(w.buf); w.buf = nullptr; }
+    if (w.ctx) { ggml_free(w.ctx); w.ctx = nullptr; }
+    // CpuEmbedder destructor handles the mmap automatically.
+    w.layers.clear();
+    w.tok_embd = nullptr;
+    w.out_norm = nullptr;
+    w.output = nullptr;
+    w.per_layer_tok_embd = nullptr;
+    w.per_layer_model_proj = nullptr;
+    w.per_layer_proj_norm = nullptr;
+    w.swa_layers.clear();
+}
+
+} // namespace dflash27b
diff --git a/dflash/src/internal.h b/dflash/src/internal.h
index 80b61dcf..89a7a80c 100644
--- a/dflash/src/internal.h
+++ b/dflash/src/internal.h
@@ -23,6 +23,7 @@
 #include "gguf.h"
 #include "dflash27b.h"
+#include "gemma4.h"
 
 namespace dflash27b {
 
@@ -481,4 +482,448 @@ ggml_tensor * build_qwen35_layer(
     bool capture,
     int fa_window = 0);
 
+// ============ Gemma4 Architecture ============
+
+struct GemmaTargetLayer {
+    // Attention (ALL layers are attention in Gemma4)
+    ggml_tensor * attn_norm = nullptr;
+    ggml_tensor * wq = nullptr;
+    ggml_tensor * wk = nullptr;     // nullptr for KV-shared layers
+    ggml_tensor * wv = nullptr;     // nullptr for KV-shared layers
+    ggml_tensor * wo = nullptr;
+    ggml_tensor * q_norm = nullptr;
+    ggml_tensor * k_norm = nullptr; // nullptr for KV-shared layers
+    ggml_tensor * attn_post_norm = nullptr;
+
+    // p-RoPE freq factors (full-attention layers only)
+    ggml_tensor * rope_freqs = nullptr;
+
+    ggml_tensor * out_scale = nullptr;
+
+    // FFN (SwiGLU)
+    ggml_tensor * ffn_norm = nullptr;
+    ggml_tensor * w_gate = nullptr;
+    ggml_tensor * w_up = nullptr;
+    ggml_tensor * w_down = nullptr;
+    ggml_tensor * ffn_post_norm = nullptr;
+
+    // MoE (26B-A4B only)
+    ggml_tensor * ffn_gate_inp = nullptr;
+    ggml_tensor * ffn_gate_inp_s = nullptr;
+    ggml_tensor * ffn_pre_norm_2 = nullptr;
+    ggml_tensor * ffn_gate_up_exps = nullptr;
+    ggml_tensor * ffn_down_exps = nullptr;
+    ggml_tensor * ffn_down_exps_s = nullptr;
+    ggml_tensor * ffn_post_norm_1 = nullptr;
+    ggml_tensor * ffn_post_norm_2 = nullptr;
+
+    // Per-Layer Embedding (PLE)
+    ggml_tensor * ple_inp_gate = nullptr;
+    ggml_tensor * ple_proj = nullptr;
+    ggml_tensor * ple_post_norm = nullptr;
+};
+
+struct GemmaTargetWeights {
+    ggml_context * ctx = nullptr;
+    ggml_backend_t backend = nullptr;
+    ggml_backend_buffer_t buf = nullptr;
+    CpuEmbedder embedder;
+
+    ggml_tensor * tok_embd = nullptr;
+    std::vector<GemmaTargetLayer> layers;
+    ggml_tensor * out_norm = nullptr;
+    ggml_tensor * output = nullptr;
+
+    // Per-Layer Embedding global tensors
+    ggml_tensor * per_layer_tok_embd = nullptr;
+    ggml_tensor * per_layer_model_proj = nullptr;
+    ggml_tensor * per_layer_proj_norm = nullptr;
+
+    // Architecture metadata (loaded from GGUF)
+    int n_embd = 4096;
+    int n_head = 32;
+    int n_head_kv = 8;      // max n_head_kv across layers (used for cache alloc)
+    int head_dim = 128;     // full-attention head dim
+    int head_dim_swa = 128; // SWA head dim (may differ from head_dim)
+    std::vector<int> head_kv_per_layer; // per-layer n_head_kv (empty = use n_head_kv for all)
+    int n_layer = 60;
+    int n_ff = 16384;
+    int n_vocab = 262144;
+    int n_embd_per_layer = 0;
+
+    int swa_window = 1024;
+    std::vector<bool> swa_layers;
+
+    int n_kv_shared_layers = 0;
+    int n_layer_kv = 0;
+
+    float rope_theta = 1000000.0f;
+    float rope_theta_swa = 1000000.0f;
+
+    int n_expert = 0;
+    int n_expert_used = 0;
+    int n_ff_exp = 0;
+
+    float logit_softcap = 30.0f;
+    float attn_scale = 1.0f;
+
+    int32_t bos_id = -1;
+    int32_t eos_id = -1;
+    int32_t eos_chat_id = -1;
+
+    int n_capture_layers = GEMMA4_DRAFT_N_TARGET_LAYERS;
+    int capture_layer_ids[GEMMA4_DRAFT_N_TARGET_LAYERS] = {0};
+};
+
+struct GemmaTargetCache {
+    ggml_context * base_ctx = nullptr;
+    ggml_backend_buffer_t base_buf = nullptr;
+    ggml_context * rollback_ctx = nullptr;
+    ggml_backend_buffer_t rollback_buf = nullptr;
+    ggml_backend_t backend = nullptr;
+
+    int max_ctx = 0;
+    int swa_ctx_alloc = 0; // Actual KV-slot count for SWA layers (ring-buffer size).
+                           // Derived as min(max_ctx_alloc, swa_window_padded).
+                           // Full-attention layers always use max_ctx_alloc.
+    int cur_pos = 0;
+    int last_tok = -1;
+
+    ggml_type kv_k_type = GGML_TYPE_Q8_0;
+    ggml_type kv_v_type = GGML_TYPE_Q8_0;
+
+    // Per-layer override: if non-empty, use these instead of kv_k_type / kv_v_type.
+    // Used for asymmetric KV: TQ3_0 on SWA layers, Q8_0 on full-attn layers so
+    // those layers can ride the pflash block-sparse fast path (which excludes TQ3).
+    std::vector<ggml_type> kv_k_type_per_layer;
+    std::vector<ggml_type> kv_v_type_per_layer;
+
+    std::vector<ggml_tensor *> attn_k;
+    std::vector<ggml_tensor *> attn_v;
+
+    std::vector<int> layer_to_kv_idx;
+    std::vector<int> layer_to_donor_kv;
+
+    ggml_tensor * target_feat = nullptr;
+    int target_feat_cap = 0;
+
+    // MTP h_prev: last committed token's post-block hidden state from the
+    // last full-attention layer. Shape [n_embd_backbone, 1] f32.
+    // Allocated only when MTP is enabled (mtp_h_prev_enabled flag on cache).
+    // Written by the target graph at the end of every decode step.
+    ggml_tensor * mtp_h_prev = nullptr;
+    bool mtp_h_prev_enabled = false;
+    // Index of the last full-attention layer in the target (Dense 31B = 58).
+    // Computed once at cache init from w.swa_layers (highest il with swa==false).
+    int mtp_last_full_layer = -1;
+
+    // Draft KV cache (prefix-direct: projected target features → K/V per layer)
+    ggml_context * draft_kv_ctx = nullptr;
+    ggml_backend_buffer_t draft_kv_buf = nullptr;
+    std::vector<ggml_tensor *> draft_k; // [head_dim, n_kv_heads, draft_kv_cap] f32
+    std::vector<ggml_tensor *> draft_v; // [head_dim, n_kv_heads, draft_kv_cap] f32
+    int draft_kv_cap = 0;
+    int draft_kv_pos = 0;
+};
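+
+// Illustrative asymmetric-KV setup (hypothetical driver code; GGML_TYPE_TQ3_0
+// is assumed to come from the TurboQuant llama.cpp fork this repo submodules):
+//   for (int il = 0; il < w.n_layer; ++il) {
+//     cache.kv_k_type_per_layer.push_back(
+//         w.swa_layers[il] ? GGML_TYPE_TQ3_0 : GGML_TYPE_Q8_0);
+//   }
+//   cache.kv_v_type_per_layer = cache.kv_k_type_per_layer;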
+
+struct GemmaGraphInputs {
+    ggml_tensor * inp_embed = nullptr;
+    ggml_tensor * positions = nullptr; // [n_tokens] i32
+    ggml_tensor * attn_mask = nullptr;
+    ggml_tensor * swa_mask  = nullptr; // sliding-window causal mask (required for ANY SWA dispatch — prefill AND single-token decode)
+    ggml_tensor * per_layer_inp = nullptr; // PLE pre-computed embeddings
+    int  n_tokens = 0;
+    int  kv_start = 0;
+    bool capture_layers = false;
+    int  fa_window = 0;
+    ggml_tensor * parent_ids = nullptr;
+    // pFlash: when true, full-attention layers use ggml_flash_attn_sparse
+    // instead of ggml_flash_attn_ext, keeping the single-graph-per-chunk
+    // architecture while enabling block-sparse attention during prefill.
+    bool  use_pflash = false;
+    float pflash_alpha = 0.12f;
+    // When true, slice hidden to the last token before lm_head so the output
+    // tensor has shape [vocab, 1] instead of [vocab, n_tokens].
+    // Only safe for prefill chunks where we discard all but the last logit.
+    bool last_token_logits_only = false;
+};
+
+struct GemmaGraphOutputs {
+    ggml_tensor * logits = nullptr;
+};
+
+// Gemma4 target loading
+bool load_gemma4_target_gguf(const std::string & path, ggml_backend_t backend,
+                             GemmaTargetWeights & out);
+void free_gemma4_target_weights(GemmaTargetWeights & w);
+
+// Gemma4 cache
+// extra_q8_layers: additional layer indices to force Q8_0 KV regardless of the
+// global kv type (e.g. MTP donor layers that need to avoid the TQ3_0/FWHT mismatch).
+bool create_gemma4_cache(const GemmaTargetWeights & w, int max_ctx,
+                         ggml_backend_t backend, GemmaTargetCache & out,
+                         const std::vector<int> & extra_q8_layers = {},
+                         int target_feat_cap_hint = 0,
+                         bool enable_dflash_capture_overrides = false);
+void free_gemma4_cache(GemmaTargetCache & c);
+void reset_gemma4_cache(GemmaTargetCache & c);
+
+// Gemma4 graph
+GemmaGraphOutputs build_gemma4_graph(ggml_context * ctx, ggml_cgraph * gf,
+                                     const GemmaTargetWeights & w,
+                                     GemmaTargetCache & cache,
+                                     const GemmaGraphInputs & in);
+
+// SWA window geometry for a chunk at position kv_start with n_tokens query tokens.
+// Returns the triple that build_swa_attn_block uses for the K/V view.
+// The mask must be sized [effective_win_len, n_tokens] (both aligned) and filled
+// with view-relative indices: mask[q][k_view] where abs_k = abs_win_start + k_view.
+struct SwaView {
+    int abs_win_start;     // absolute KV position of view slot 0
+    int effective_win_len; // number of valid tokens in the view
+    int ring_win_start;    // ring-buffer modular offset (for graph K view)
+};
+
+SwaView compute_swa_view(int kv_start,
+                         int n_tokens,
+                         int swa_window,
+                         int swa_ctx_alloc /* ring size */);
+
+
+// ─── Gemma4 Draft weights ────────────────────────────────────────────────────
+
+struct GemmaDraftLayer {
+    ggml_tensor * attn_norm = nullptr;
+    ggml_tensor * ffn_norm = nullptr;
+    ggml_tensor * wq = nullptr;
+    ggml_tensor * wk = nullptr;
+    ggml_tensor * wv = nullptr;
+    ggml_tensor * wo = nullptr;
+    ggml_tensor * q_norm = nullptr;
+    ggml_tensor * k_norm = nullptr;
+    ggml_tensor * w_gate = nullptr;
+    ggml_tensor * w_up = nullptr;
+    ggml_tensor * w_down = nullptr;
+};
+
+struct GemmaDraftWeights {
+    ggml_context * ctx = nullptr;
+    ggml_backend_t backend = nullptr;
+    ggml_backend_buffer_t buf = nullptr;
+
+    ggml_tensor * fc = nullptr;          // [6*target_hidden, draft_hidden] (ggml ne[0]=6*th, ne[1]=dh)
+    ggml_tensor * hidden_norm = nullptr; // [draft_hidden]
+    ggml_tensor * out_norm = nullptr;    // [draft_hidden]
+    ggml_tensor * tok_embd = nullptr;    // [draft_hidden, n_vocab] — tied lm_head
+
+    std::vector<GemmaDraftLayer> layers;
+    std::vector<bool> layer_is_swa;
+
+    int n_layer = GEMMA4_DRAFT_LAYERS;                  // 5
+    int n_head = 0;
+    int n_head_kv = 0;
+    int head_dim = 128;
+    int n_embd = 0; // draft hidden size
+    int n_ff = 0;   // draft intermediate size
+    int n_vocab = GEMMA4_31B_VOCAB;                     // 262144
+    int block_size = GEMMA4_DRAFT_BLOCK_SIZE;           // 16
+    int n_target_layers = GEMMA4_DRAFT_N_TARGET_LAYERS; // 6
+    int target_hidden = 0; // target model hidden dim (4096 for all Gemma4 variants)
+    float logit_softcap = GEMMA4_LOGIT_SOFTCAP;         // 30.0
+    float rope_theta = GEMMA4_ROPE_THETA;               // 1e6
+    int mask_token_id = GEMMA4_31B_DRAFT_MASK_TOKEN_ID; // 4
+    int sliding_window = 2048;
+};
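+
+// Example: for a 31B target (target_hidden = 4096, 6 capture layers) the fc
+// projection consumes concatenated capture features of width 6 * 4096 = 24576,
+// i.e. fc->ne[0] == 24576 and fc->ne[1] == the draft hidden size.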
+
+// ─── Gemma4 MTP (Multi-Token Prediction) assistant weights ───────────────────
+//
+// Loaded from a gemma4_assistant GGUF (e.g. gemma-4-31B-it-assistant.Q4_K_M.gguf).
+// These are the 4 cross-attention transformer blocks that run after the target
+// model's forward pass to predict the next speculative token.
+
+struct MtpLayerWeights {
+    // Q-only attention (no wk/wv — V is always read from the donor target KV cache;
+    // attention_k_eq_v=true means V is stored as the rms-normed, non-rotated K, so
+    // MTP MUST read V from cache, not reuse K. use_k_as_v=false hardcoded per
+    // atomicbot:gemma4-assistant.cpp:134).
+    ggml_tensor * attn_norm = nullptr;      // [n_embd]
+    ggml_tensor * wq = nullptr;             // [n_embd, n_head * head_dim]
+    ggml_tensor * attn_q_norm = nullptr;    // [head_dim]
+    ggml_tensor * wo = nullptr;             // [n_head * head_dim, n_embd]
+    ggml_tensor * attn_post_norm = nullptr; // [n_embd]
+    ggml_tensor * ffn_norm = nullptr;       // [n_embd]
+    ggml_tensor * ffn_up = nullptr;         // [n_embd, n_ff]
+    ggml_tensor * ffn_gate = nullptr;       // [n_embd, n_ff]
+    ggml_tensor * ffn_down = nullptr;       // [n_ff, n_embd]
+    ggml_tensor * ffn_post_norm = nullptr;  // [n_embd]
+    ggml_tensor * out_scale = nullptr;      // [1] optional; nullptr if absent
+    // Donor target layer resolved per-MTP-layer: LAST target layer whose
+    // attention type (SWA vs full) matches this MTP layer's type.
+    int32_t donor_target_layer = -1;
+    bool is_swa = false; // this MTP layer's attention type
+};
+
+struct MtpDrafterWeights {
+    // Pre/post projection (concat tok_emb + h_prev → n_embd, and back)
+    ggml_tensor * pre_projection = nullptr;  // [2*n_embd_backbone, n_embd]
+    ggml_tensor * post_projection = nullptr; // [n_embd, n_embd_backbone]
+    ggml_tensor * output_norm = nullptr;     // [n_embd]
+    // Token embedding (shared / tied LM head for the MTP assistant model).
+    // Used ONLY in the centroid-routed LM head (get_rows + mul_mat) and in
+    // the dense fallback. This is the MTP model's own embedding, NOT the
+    // target's tok_embd (which is used only for the step-1 input embedding).
+    // Loaded from "token_embd.weight" in the assistant GGUF.
+    // nullptr if absent (some stripped GGUFs omit it; the dense path then uses
+    // target.tok_embd projected through h_post).
+    ggml_tensor * tok_embd = nullptr;        // [n_embd, n_vocab]
+    // Per-dim RoPE freq factors (assistant's own; for proportional RoPE on the
+    // full-attn MTP layer). Loaded from "rope_freqs.weight" in the assistant
+    // GGUF (top-level, NOT per-layer). nullptr if absent (legacy GGUFs); the
+    // MTP graph then falls back to the target's per-layer rope_freqs.
+    ggml_tensor * rope_freqs = nullptr;      // [head_dim/2] f32
+    // Optional centroid head (Edge models only; nullptr for Dense 31B)
+    ggml_tensor * centroids = nullptr;       // [n_embd, n_centroids]
+    ggml_tensor * token_ordering = nullptr;  // [n_vocab] I32 invariant if present
+    // MTP transformer layers (always 4 per the atomicbot spec)
+    std::vector<MtpLayerWeights> layers;
+    // Metadata
+    int32_t n_embd = 0;          // MTP model's own hidden size (e.g. 1024 for compressed MTP)
+    int32_t n_embd_backbone = 0; // target backbone hidden size (must match target's n_embd)
+    int32_t n_centroids = 0;
+    int32_t centroid_top_k = 0;
+    bool use_ordered_embeddings = false;
+    bool attention_k_eq_v = false;
+    std::string requires_target_arch;
+    // Backend that owns the tensors
+    ggml_backend_t backend = nullptr;
+    ggml_context * ctx = nullptr;
+    ggml_backend_buffer_t buffer = nullptr;
+};
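+
+// Shape example (hypothetical compressed MTP with n_embd = 1024 against a
+// 4096-wide backbone): pre_projection maps concat(tok_emb, h_prev) of width
+// 2 * 4096 = 8192 down to 1024, the four blocks run at width 1024, and
+// post_projection maps 1024 back up to 4096 for the next h_prev.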
+
+// Load Gemma4 MTP assistant weights from a GGUF file.
+// The loader reads n_embd_backbone from GGUF metadata and resolves each MTP
+// layer's donor target KV layer assuming Dense 31B (60 target layers, alternating
+// SWA pattern: odd-indexed = SWA, even-indexed = full attention).
+bool load_gemma4_mtp_assistant(const std::string & gguf_path,
+                               ggml_backend_t backend,
+                               MtpDrafterWeights & out);
+
+void free_gemma4_mtp_assistant(MtpDrafterWeights & w);
+
+// Read only the MTP SWA layer pattern from the GGUF (lightweight — no tensor loading).
+// Returns false if the GGUF can't be opened or lacks the required architecture.
+// out_mtp_swa_layers[il] = true if MTP layer il uses sliding-window attention.
+bool get_mtp_swa_pattern(const std::string & gguf_path,
+                         std::vector<bool> & out_mtp_swa_layers);
+
+// Re-resolve MTP donor layers using the actual target SWA pattern instead of the
+// hardcoded alternating assumption used during loading. Call this after both the
+// target model and the MTP assistant are loaded, passing the target's swa_layers
+// vector. Each MTP layer's donor_target_layer is updated to the LAST target layer
+// whose SWA type matches the MTP layer's SWA type per the provided pattern.
+void resolve_mtp_donor_layers(MtpDrafterWeights & mtp,
+                              const std::vector<bool> & target_swa_layers);
+
+// ─── Gemma4 MTP step graph ───────────────────────────────────────────────────
+//
+// Build a single MTP step graph that maps:
+//   inputs:  in_tok     (i32 [1])                   — last token id
+//            in_h_prev  (f32 [n_embd_backbone, 1])  — last target full-attn hidden
+//            in_pos     (i32 [1])                   — absolute target position for RoPE
+//   outputs: out_logits (f32 [n_vocab, 1])          — full vocab row
+//            out_h_post (f32 [n_embd_backbone, 1])  — next h_prev
+//            out_argmax (i32 [1])                   — greedy token (in-graph argmax)
+//
+// Each MTP layer reads target K/V from w.layers[il].donor_target_layer
+// (resolved at load time). V is always read from cache (attention_k_eq_v quirk).
+// KV mask is nullptr: all committed positions ≤ attn_pos are uniformly admitted.
+//
+// attn_pos is the number of committed target tokens (cache.cur_pos at call time).
+// The caller passes it separately because the graph is rebuilt per-step in the
+// chained γ loop (attn_pos is constant across steps, pos advances per step).
+struct MtpStepGraph {
+    ggml_context * ctx = nullptr;
+    ggml_cgraph * gf = nullptr;
+    // Inputs (caller sets via ggml_backend_tensor_set before each step)
+    ggml_tensor * in_tok = nullptr;      // I32 [1] — the token id (unused in graph; kept for API compat)
+    ggml_tensor * in_tok_embd = nullptr; // F32 [n_embd_backbone, 1] — pre-dequantised embedding
+    ggml_tensor * in_h_prev = nullptr;
+    ggml_tensor * in_pos = nullptr;
+    // Single FA mask shared across all MTP layers that need padding (currently
+    // every TQ3_0 layer with non-256-aligned kv_view_len, and every head_dim≥512
+    // layer with non-256-aligned kv_view_len). The builder asserts at compile
+    // time that every need-mask layer wants the same `(width, kv_seq_len)`; if
+    // they ever diverge (e.g. SWA window cap < full-attn pos in long context)
+    // the assert fires and the builder must be extended to per-layer masks.
+    // Caller must fill before each compute:
+    //   positions [0..fa_mask_kv_seq_len-1]:     0x0000 (F16 0.0  = admit)
+    //   positions [fa_mask_kv_seq_len..width-1]: 0xFC00 (F16 -inf = exclude)
+    ggml_tensor * fa_mask = nullptr; // F16 [width, 1] or null
+    int64_t fa_mask_kv_seq_len = 0;
+    // Outputs (caller reads via ggml_backend_tensor_get after compute)
+    ggml_tensor * out_logits = nullptr;
+    ggml_tensor * out_h_post = nullptr;
+    ggml_tensor * out_argmax = nullptr;
+};
+
+// Build the MTP step graph. attn_pos = cache.cur_pos at submit time.
+// Returns false and sets last_error on failure.
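+//
+// Illustrative per-step driver (hypothetical; error handling and the fa_mask
+// fill described above are omitted):
+//   for (int step = 0; step < gamma; ++step) {
+//     MtpStepGraph sg;
+//     if (!build_mtp_step_graph(mtp, cache, target, sg, cache.cur_pos)) break;
+//     ggml_backend_tensor_set(sg.in_tok_embd, emb,    0, emb_bytes);
+//     ggml_backend_tensor_set(sg.in_h_prev,   h_prev, 0, h_bytes);
+//     ggml_backend_tensor_set(sg.in_pos,      &pos,   0, sizeof(int32_t));
+//     ggml_backend_graph_compute(backend, sg.gf);
+//     ggml_backend_tensor_get(sg.out_argmax,  &tok,   0, sizeof(int32_t));
+//     ggml_backend_tensor_get(sg.out_h_post,  h_prev, 0, h_bytes);
+//     free_mtp_step_graph(sg);
+//     ++pos;
+//   }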
+bool build_mtp_step_graph(const MtpDrafterWeights & w, + const GemmaTargetCache & target_cache, + const GemmaTargetWeights & target, + MtpStepGraph & out, + int attn_pos); + +// Free the ggml context owned by the graph (tensors only; backend buffers +// for KV views are owned by target_cache and must not be freed here). +void free_mtp_step_graph(MtpStepGraph & g); + +// Load Gemma4 DFlash draft weights from a directory containing safetensors shards. +bool load_gemma4_draft_safetensors(const std::string & dir_path, + ggml_backend_t backend, + GemmaDraftWeights & out); + +// Load Gemma4 DFlash draft weights from a Q8_0-quantized GGUF file. +bool load_gemma4_draft_gguf(const std::string & path, + ggml_backend_t backend, + GemmaDraftWeights & out); + +void free_gemma4_draft_weights(GemmaDraftWeights & w); + +// Allocate draft KV cache tensors on the given backend. +bool create_draft_kv_cache(const GemmaDraftWeights & dw, + ggml_backend_t backend, + GemmaTargetCache & cache, + int cap_override = 0); +void free_draft_kv_cache(GemmaTargetCache & cache); + +// Build graph that projects target features → draft KV cache (prefix-direct). +// Materializes K,V for n_tokens new positions starting at cache.draft_kv_pos. +// target_feat [6*target_hidden, n_tokens] f32 +// positions [n_tokens] i32 (absolute positions for RoPE) +ggml_tensor * build_draft_kv_prefill_graph( + ggml_context * ctx, + ggml_cgraph * gf, + const GemmaDraftWeights & w, + GemmaTargetCache & cache, + ggml_tensor * target_feat, + ggml_tensor * positions, + int n_tokens); + +// Build the Gemma4 draft model forward graph with KV cache attention. +// draft_embed [draft_hidden, n_tokens] f32 (MASK token embeddings) +// positions [n_tokens] i32 (absolute positions) +// attn_mask [kv_pad, q_pad] f16 (causal over context+block) +// kv_start = cache.draft_kv_pos (context length before this block) +// Returns logits [n_vocab, n_tokens] f32 (softcapped). +ggml_tensor * build_gemma4_draft_graph( + ggml_context * ctx, + ggml_cgraph * gf, + const GemmaDraftWeights & w, + GemmaTargetCache & cache, + ggml_tensor * draft_embed, + ggml_tensor * positions, + ggml_tensor * attn_mask, + int n_tokens, + int kv_start); + } // namespace dflash27b diff --git a/dflash/src/pflash_ggml_adapter.cpp b/dflash/src/pflash_ggml_adapter.cpp new file mode 100644 index 00000000..4862d379 --- /dev/null +++ b/dflash/src/pflash_ggml_adapter.cpp @@ -0,0 +1,33 @@ +#include "flashprefill.h" + +// Forward-declare the registration function from ggml-cuda (defined in fattn-sparse.cu). +// No extern "C" — nvcc compiles .cu as C++ and the symbol has C++ linkage. 
+void ggml_cuda_flash_attn_sparse_set_kernel(
+    int (*fn)(const void*, const void*, const void*, void*,
+              int, int, int, int, int, float, float));
+
+static int pflash_adapter(
+        const void * Q, const void * K, const void * V, void * O,
+        int batch, int seq_len, int n_q_heads, int n_k_heads, int head_dim,
+        float scale, float alpha)
+{
+    dflash27b::flashprefill::FlashPrefillConfig cfg;
+    if (alpha >= 1.0f) {
+        // alpha >= 1.0 means "select all blocks" — configure for dense attention.
+        cfg.alpha          = 0.0f;
+        cfg.attention_sink = seq_len; // all blocks are "sinks"
+        cfg.window         = seq_len; // window covers everything
+        cfg.last_n_full    = seq_len; // all query blocks attend fully
+    } else {
+        cfg.alpha = alpha;
+    }
+    return dflash27b::flashprefill::flash_prefill_forward_bf16(
+        Q, K, V, O,
+        batch, seq_len, n_q_heads, n_k_heads, head_dim,
+        scale, cfg);
+}
+
+// Call this once at init time before running any ggml_flash_attn_sparse graphs.
+void pflash_register_ggml_kernel() {
+    ggml_cuda_flash_attn_sparse_set_kernel(&pflash_adapter);
+}
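+
+// Illustrative call site (hypothetical init code, not part of this file):
+//
+//   ggml_backend_t backend = ggml_backend_cuda_init(0);
+//   pflash_register_ggml_kernel();   // once, before building any graph
+//   GemmaGraphInputs in;
+//   in.use_pflash   = true;          // full-attn layers take the sparse path
+//   in.pflash_alpha = 0.12f;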
+ ggml_context * tok_embd_ctx = nullptr; + ggml_backend_buffer_t tok_embd_buf = nullptr; + { + ggml_init_params ep{}; + ep.mem_size = ggml_tensor_overhead() * 2; + ep.mem_buffer = nullptr; + ep.no_alloc = true; + tok_embd_ctx = ggml_init(ep); + if (!tok_embd_ctx) { + std::fprintf(stderr, "ggml_init for tok_embd failed\n"); + free_gemma4_target_weights(tw); + free_gemma4_draft_weights(dw); + ggml_backend_free(backend); + return 1; + } + + const ggml_type emb_type = tw.embedder.tok_embd_type; + const int64_t n_embd_t = tw.embedder.n_embd; + const int64_t n_vocab_t = tw.embedder.n_vocab; + + // ggml convention: ne[0] = n_embd (fast axis), ne[1] = n_vocab + ggml_tensor * te = ggml_new_tensor_2d(tok_embd_ctx, emb_type, n_embd_t, n_vocab_t); + ggml_set_name(te, "tok_embd_gpu"); + + tok_embd_buf = ggml_backend_alloc_ctx_tensors(tok_embd_ctx, backend); + if (!tok_embd_buf) { + std::fprintf(stderr, "ggml_backend_alloc_ctx_tensors for tok_embd failed\n"); + ggml_free(tok_embd_ctx); + free_gemma4_target_weights(tw); + free_gemma4_draft_weights(dw); + ggml_backend_free(backend); + return 1; + } + + const size_t emb_bytes = (size_t)tw.embedder.row_bytes * (size_t)n_vocab_t; + ggml_backend_tensor_set(te, tw.embedder.tok_embd_bytes, 0, emb_bytes); + std::printf("[tok_embd] uploaded %.1f MiB to GPU (%s [%" PRId64 ", %" PRId64 "])\n", + (double)emb_bytes / (1024.0 * 1024.0), + ggml_type_name(emb_type), n_embd_t, n_vocab_t); + + dw.tok_embd = te; + dw.n_vocab = (int)n_vocab_t; + } + + std::printf("[draft] n_layer=%d n_head=%d n_embd=%d n_vocab=%d target_hidden=%d\n", + dw.n_layer, dw.n_head, dw.n_embd, dw.n_vocab, dw.target_hidden); + + const int n_tokens = 16; // one block + const int target_feat_w = dw.n_target_layers * dw.target_hidden; // 6*4096 = 24576 + const int draft_hidden = dw.n_embd; + const int n_vocab = dw.n_vocab; + const int kq_mask_pad = 32; + + auto align_up = [](int x, int a) { return ((x + a - 1) / a) * a; }; + + // Allocate draft KV cache + GemmaTargetCache cache; + cache.backend = backend; + if (!create_draft_kv_cache(dw, backend, cache)) { + std::fprintf(stderr, "create_draft_kv_cache failed\n"); + return 1; + } + std::printf("[draft kv] cap=%d\n", cache.draft_kv_cap); + + // ── Step 1: Prefill draft KV with synthetic target features ────── + // Simulate n_tokens context positions with random target features + { + ggml_init_params ip{}; + ip.mem_size = 256 * 1024 * 1024; + ip.no_alloc = true; + ggml_context * pctx = ggml_init(ip); + if (!pctx) { fail("ggml_init for prefill failed"); } + + ggml_tensor * pf_target_feat = ggml_new_tensor_2d(pctx, GGML_TYPE_F32, target_feat_w, n_tokens); + ggml_tensor * pf_positions = ggml_new_tensor_1d(pctx, GGML_TYPE_I32, n_tokens); + ggml_set_name(pf_target_feat, "pf_target_feat"); + ggml_set_name(pf_positions, "pf_positions"); + ggml_set_input(pf_target_feat); + ggml_set_input(pf_positions); + + ggml_cgraph * pf_gf = ggml_new_graph_custom(pctx, 4096, false); + build_draft_kv_prefill_graph(pctx, pf_gf, dw, cache, + pf_target_feat, pf_positions, n_tokens); + + ggml_gallocr_t pf_alloc = ggml_gallocr_new(ggml_backend_get_default_buffer_type(backend)); + if (!ggml_gallocr_alloc_graph(pf_alloc, pf_gf)) { fail("prefill alloc failed"); } + + std::mt19937 rng_pf(42); + std::uniform_real_distribution u_pf(-0.05f, 0.05f); + { + std::vector data((size_t)target_feat_w * n_tokens); + for (auto & v : data) v = u_pf(rng_pf); + ggml_backend_tensor_set(pf_target_feat, data.data(), 0, sizeof(float) * data.size()); + } + { + std::vector pos(n_tokens); + for (int i = 0; i 
< n_tokens; i++) pos[i] = i; + ggml_backend_tensor_set(pf_positions, pos.data(), 0, sizeof(int32_t) * n_tokens); + } + + auto st = ggml_backend_graph_compute(backend, pf_gf); + if (st != GGML_STATUS_SUCCESS) { fail("prefill compute failed"); } + cache.draft_kv_pos = n_tokens; + std::printf("[prefill] KV materialized for %d positions\n", n_tokens); + + ggml_gallocr_free(pf_alloc); + ggml_free(pctx); + } + + // ── Step 2: Draft forward with KV cache ────────────────────────── + const int kv_start = cache.draft_kv_pos; // context length = n_tokens + + ggml_init_params ip{}; + ip.mem_size = 256 * 1024 * 1024; + ip.mem_buffer = nullptr; + ip.no_alloc = true; + ggml_context * gctx = ggml_init(ip); + if (!gctx) { std::fprintf(stderr, "ggml_init failed\n"); return 1; } + + ggml_tensor * draft_embed = ggml_new_tensor_2d(gctx, GGML_TYPE_F32, draft_hidden, n_tokens); + ggml_tensor * positions = ggml_new_tensor_1d(gctx, GGML_TYPE_I32, n_tokens); + const int kv_len = kv_start + n_tokens; + const int kv_pad = align_up(kv_len, kq_mask_pad); + const int q_pad = align_up(n_tokens, kq_mask_pad); + ggml_tensor * attn_mask = ggml_new_tensor_2d(gctx, GGML_TYPE_F16, kv_pad, q_pad); + + ggml_set_name(draft_embed, "draft_embed"); + ggml_set_name(positions, "positions"); + ggml_set_name(attn_mask, "attn_mask"); + ggml_set_input(draft_embed); + ggml_set_input(positions); + ggml_set_input(attn_mask); + + ggml_cgraph * gf = ggml_new_graph_custom(gctx, 8192, false); + ggml_tensor * logits = build_gemma4_draft_graph( + gctx, gf, dw, cache, + draft_embed, positions, attn_mask, + n_tokens, kv_start); + if (!logits) { std::fprintf(stderr, "build_gemma4_draft_graph returned null\n"); return 1; } + ggml_set_output(logits); + ggml_build_forward_expand(gf, logits); + std::printf("[graph] nodes=%d\n", ggml_graph_n_nodes(gf)); + + ggml_gallocr_t alloc = ggml_gallocr_new(ggml_backend_get_default_buffer_type(backend)); + if (!ggml_gallocr_alloc_graph(alloc, gf)) { + std::fprintf(stderr, "ggml_gallocr_alloc_graph failed\n"); + return 1; + } + + // Fill inputs with deterministic random data + std::mt19937 rng(1234); + std::uniform_real_distribution u(-0.05f, 0.05f); + + // draft_embed: [draft_hidden, 16] f32 + { + std::vector data((size_t)draft_hidden * n_tokens); + for (auto & v : data) v = u(rng); + ggml_backend_tensor_set(draft_embed, data.data(), 0, sizeof(float) * data.size()); + } + // positions: [kv_start, kv_start+1, ..., kv_start+15] + { + std::vector pos(n_tokens); + for (int i = 0; i < n_tokens; i++) pos[i] = kv_start + i; + ggml_backend_tensor_set(positions, pos.data(), 0, sizeof(int32_t) * n_tokens); + } + // attn_mask: causal over full kv_len, block queries attend to all context + causal within block + { + const ggml_fp16_t zero_h = ggml_fp32_to_fp16(0.0f); + const ggml_fp16_t ninf_h = ggml_fp32_to_fp16(-INFINITY); + std::vector mask((size_t)kv_pad * q_pad, ninf_h); + for (int q = 0; q < n_tokens; q++) { + int max_kv = kv_start + q; // attend to all context + block[0..q] + for (int k = 0; k <= max_kv; k++) { + mask[(size_t)q * kv_pad + k] = zero_h; + } + } + ggml_backend_tensor_set(attn_mask, mask.data(), 0, sizeof(ggml_fp16_t) * mask.size()); + } + + // Compute + auto status = ggml_backend_graph_compute(backend, gf); + if (status != GGML_STATUS_SUCCESS) { + std::fprintf(stderr, "ggml_backend_graph_compute failed: %d\n", (int)status); + return 1; + } + std::printf("[compute] OK\n"); + + // Validate expected output shape + if (logits->ne[0] != (int64_t)n_vocab || logits->ne[1] != (int64_t)n_tokens) { + char buf[128]; + 
std::snprintf(buf, sizeof(buf),
+                      "logits shape [%" PRId64 ", %" PRId64 "] expected [%d, %d]",
+                      logits->ne[0], logits->ne[1], n_vocab, n_tokens);
+        fail(buf);
+    }
+    std::printf("[logits] shape: [%" PRId64 ", %" PRId64 "]\n",
+                logits->ne[0], logits->ne[1]);
+
+    // Read logits for position 0
+    std::vector<float> logit_buf((size_t)n_vocab * n_tokens);
+    ggml_backend_tensor_get(logits, logit_buf.data(), 0, sizeof(float) * logit_buf.size());
+
+    // Check for NaN and softcap bounds across all positions
+    int n_nan = 0, n_oob = 0;
+    float vmin = 1e30f, vmax = -1e30f;
+    for (auto v : logit_buf) {
+        if (std::isnan(v)) { n_nan++; continue; }
+        if (v < vmin) vmin = v;
+        if (v > vmax) vmax = v;
+        if (v < -30.0f || v > 30.0f) n_oob++;
+    }
+    std::printf("[logits] nan=%d oob=%d min=%.4g max=%.4g\n",
+                n_nan, n_oob, vmin, vmax);
+
+    if (n_nan > 0) fail("NaN values in logits");
+    if (n_oob > 0) {
+        char buf[64];
+        std::snprintf(buf, sizeof(buf),
+                      "%d logit values outside [-30, 30] softcap bounds", n_oob);
+        fail(buf);
+    }
+
+    // Top-5 tokens for position 0
+    const float * pos0_logits = logit_buf.data();
+    std::vector<std::pair<float, int>> sorted;
+    sorted.reserve((size_t)n_vocab);
+    for (int i = 0; i < n_vocab; i++) sorted.emplace_back(pos0_logits[i], i);
+    std::partial_sort(sorted.begin(), sorted.begin() + 5, sorted.end(),
+                      [](const auto & a, const auto & b) { return a.first > b.first; });
+    std::printf("[top 5 pos=0]");
+    for (int i = 0; i < 5; i++) {
+        std::printf(" id=%d l=%.3f", sorted[i].second, sorted[i].first);
+    }
+    std::printf("\n");
+
+    ggml_gallocr_free(alloc);
+    ggml_free(gctx);
+    free_draft_kv_cache(cache);
+    // dw.tok_embd points into tok_embd_ctx/buf — null it before freeing the draft
+    // so free_gemma4_draft_weights doesn't double-free or access freed memory.
+    dw.tok_embd = nullptr;
+    free_gemma4_draft_weights(dw);
+    // Free tok_embd GPU allocation (must outlive the compute graph).
+    if (tok_embd_buf) ggml_backend_buffer_free(tok_embd_buf);
+    if (tok_embd_ctx) ggml_free(tok_embd_ctx);
+    // Target weights own the mmap that backs tok_embd_bytes; free after GPU upload.
+    free_gemma4_target_weights(tw);
+    ggml_backend_free(backend);
+    std::printf("PASS\n");
+    return 0;
+}
diff --git a/dflash/test/gemma4/smoke_gemma4_target_forward.cpp b/dflash/test/gemma4/smoke_gemma4_target_forward.cpp
new file mode 100644
index 00000000..596cd790
--- /dev/null
+++ b/dflash/test/gemma4/smoke_gemma4_target_forward.cpp
@@ -0,0 +1,198 @@
+// Smoke test: load Gemma4 target, run a single-token forward pass, validate logits.
+//
+// Usage: smoke_gemma4_target_forward <target_gguf>
+
+#include "internal.h"
+#include "gemma4.h"
+
+#include "ggml.h"
+#include "ggml-alloc.h"
+#include "ggml-backend.h"
+#include "ggml-cuda.h"
+
+#include <algorithm>
+#include <cinttypes>
+#include <cmath>
+#include <cstdio>
+#include <cstdlib>
+#include <cstring>
+#include <utility>
+#include <vector>
+
+using namespace dflash27b;
+
+static void fail(const char * msg) {
+    std::fprintf(stderr, "FAIL: %s\n", msg);
+    std::exit(1);
+}
+
+int main(int argc, char ** argv) {
+    if (argc < 2) {
+        std::fprintf(stderr, "usage: %s <target_gguf>\n", argv[0]);
+        return 2;
+    }
+
+    ggml_backend_t backend = ggml_backend_cuda_init(0);
+    if (!backend) { std::fprintf(stderr, "cuda init failed\n"); return 1; }
+
+    // Load target weights
+    GemmaTargetWeights w;
+    if (!load_gemma4_target_gguf(argv[1], backend, w)) {
+        std::fprintf(stderr, "load_gemma4_target_gguf: %s\n", dflash27b_last_error());
+        ggml_backend_free(backend);
+        return 1;
+    }
+    std::printf("[target] n_layer=%d n_embd=%d n_vocab=%d\n",
+                w.n_layer, w.n_embd, w.n_vocab);
+
+    // Create target cache
+    GemmaTargetCache cache;
+    const int max_ctx = 512;
+    if (!create_gemma4_cache(w, max_ctx, backend, cache)) {
+        std::fprintf(stderr, "create_gemma4_cache: %s\n", dflash27b_last_error());
+        free_gemma4_target_weights(w);
+        ggml_backend_free(backend);
+        return 1;
+    }
+    std::printf("[cache] created max_ctx=%d kv_layers=%zu\n",
+                cache.max_ctx, cache.attn_k.size());
+
+    // Build graph context
+    ggml_init_params ip{};
+    ip.mem_size = 512 * 1024 * 1024;
+    ip.mem_buffer = nullptr;
+    ip.no_alloc = true;
+    ggml_context * gctx = ggml_init(ip);
+    if (!gctx) { std::fprintf(stderr, "ggml_init failed\n"); return 1; }
+
+    // Input tensors for a single token at position 0
+    const int n_tokens = 1;
+    const int hidden = w.n_embd;
+    const int kv_start = 0;
+
+    // Gemma4 uses 1D positions (not M-RoPE with 4 values like Qwen)
+    ggml_tensor * inp_embed = ggml_new_tensor_3d(gctx, GGML_TYPE_F32, hidden, n_tokens, 1);
+    ggml_tensor * positions = ggml_new_tensor_1d(gctx, GGML_TYPE_I32, n_tokens);
+    ggml_set_name(inp_embed, "inp_embed");
+    ggml_set_name(positions, "positions");
+    ggml_set_input(inp_embed);
+    ggml_set_input(positions);
+
+    // CUDA flash attention for head_dim>=512 (Gemma4-26B has head_dim=512 on full-attn
+    // layers) requires a non-null mask so the GQA optimisation path is taken.
+    // Provide a causal attention mask: shape [kv_len_padded, n_tokens], F16.
+    // Entries are 0.0 for positions we attend to and -INF for positions we don't.
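+    // Worked example (illustrative): with n_tokens=1 and kv_start=0, kv_len=1
+    // pads up to kv_len_padded=256. Query row 0 then gets mask[0]=0.0 (the
+    // token attends to itself) and mask[1..255]=-INF (unpopulated padding
+    // slots), which is exactly what the fill loop below writes.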
+    const int kv_len = kv_start + n_tokens;                  // 1
+    const int kv_len_padded = ((kv_len + 255) / 256) * 256;  // 256
+    ggml_tensor * attn_mask = ggml_new_tensor_2d(gctx, GGML_TYPE_F16, kv_len_padded, n_tokens);
+    ggml_set_name(attn_mask, "attn_mask");
+    ggml_set_input(attn_mask);
+
+    GemmaGraphInputs gi{};
+    gi.inp_embed = inp_embed;
+    gi.positions = positions;
+    gi.attn_mask = attn_mask;
+    gi.n_tokens = n_tokens;
+    gi.kv_start = kv_start;
+    gi.capture_layers = true;
+
+    // Build graph
+    ggml_cgraph * gf = ggml_new_graph_custom(gctx, 16384, false);
+    GemmaGraphOutputs go = build_gemma4_graph(gctx, gf, w, cache, gi);
+    if (!go.logits) { std::fprintf(stderr, "build_gemma4_graph returned null logits\n"); return 1; }
+    ggml_set_output(go.logits);
+    ggml_build_forward_expand(gf, go.logits);
+    std::printf("[graph] nodes=%d\n", ggml_graph_n_nodes(gf));
+
+    // Allocate graph memory
+    ggml_gallocr_t alloc = ggml_gallocr_new(ggml_backend_get_default_buffer_type(backend));
+    if (!ggml_gallocr_alloc_graph(alloc, gf)) {
+        std::fprintf(stderr, "ggml_gallocr_alloc_graph failed\n");
+        return 1;
+    }
+
+    // Fill causal attention mask (F16).
+    // mask[k, q] = 0.0  if k <= q  (position k is visible from query q)
+    //            = -INF otherwise  (masked out / padding)
+    {
+        const ggml_fp16_t zero_h = ggml_fp32_to_fp16(0.0f);
+        const ggml_fp16_t ninf_h = ggml_fp32_to_fp16(-INFINITY);
+        std::vector<ggml_fp16_t> mask_data((size_t)kv_len_padded * n_tokens, ninf_h);
+        for (int q = 0; q < n_tokens; q++) {
+            for (int k = 0; k <= kv_start + q; k++) {
+                mask_data[(size_t)q * kv_len_padded + k] = zero_h;
+            }
+        }
+        ggml_backend_tensor_set(attn_mask, mask_data.data(), 0,
+                                sizeof(ggml_fp16_t) * mask_data.size());
+    }
+
+    // Embed token id=2 (BOS) using the CPU embedder
+    int32_t bos_id = 2;
+    std::vector<float> embed_buf((size_t)hidden * n_tokens);
+    if (!w.embedder.embed(&bos_id, n_tokens, embed_buf.data())) {
+        std::fprintf(stderr, "embedder.embed failed\n");
+        return 1;
+    }
+    ggml_backend_tensor_set(inp_embed, embed_buf.data(), 0, sizeof(float) * embed_buf.size());
+
+    // Position 0
+    int32_t pos0 = 0;
+    ggml_backend_tensor_set(positions, &pos0, 0, sizeof(int32_t));
+
+    // Compute
+    auto status = ggml_backend_graph_compute(backend, gf);
+    if (status != GGML_STATUS_SUCCESS) {
+        std::fprintf(stderr, "ggml_backend_graph_compute failed: %d\n", (int)status);
+        return 1;
+    }
+    std::printf("[compute] OK\n");
+
+    // Read logits back
+    const int64_t vocab = w.n_vocab;
+    std::vector<float> logits((size_t)vocab);
+    ggml_backend_tensor_get(go.logits, logits.data(), 0, sizeof(float) * vocab);
+
+    // Check for NaN / Inf and validate softcap bounds
+    int n_nan = 0, n_inf = 0, n_oob = 0;
+    float vmin = 1e30f, vmax = -1e30f;
+    for (auto v : logits) {
+        if (std::isnan(v)) { n_nan++; continue; }
+        if (std::isinf(v)) { n_inf++; continue; }
+        if (v < vmin) vmin = v;
+        if (v > vmax) vmax = v;
+        // Logit softcap = 30.0 means values are in (-30, 30)
+        if (v < -30.0f || v > 30.0f) n_oob++;
+    }
+    std::printf("[logits] vocab=%" PRId64 " nan=%d inf=%d oob=%d min=%.4g max=%.4g\n",
+                vocab, n_nan, n_inf, n_oob, vmin, vmax);
+
+    if (n_nan > 0) fail("NaN values in logits");
+    if (n_inf > 0) fail("Inf values in logits");
+    if (n_oob > 0) {
+        char buf[64];
+        std::snprintf(buf, sizeof(buf),
+                      "%d logit values outside [-30, 30] softcap bounds", n_oob);
+        fail(buf);
+    }
+
+    // Print top-5 tokens
+    std::vector<std::pair<float, int>> sorted;
+    sorted.reserve((size_t)vocab);
+    for (int i = 0; i < (int)vocab; i++) sorted.emplace_back(logits[i], i);
+    std::partial_sort(sorted.begin(), sorted.begin() + 5, sorted.end(),
[](const auto & a, const auto & b) { return a.first > b.first; }); + std::printf("[top 5]"); + for (int i = 0; i < 5; i++) { + std::printf(" id=%d l=%.3f", sorted[i].second, sorted[i].first); + } + std::printf("\n"); + + ggml_gallocr_free(alloc); + ggml_free(gctx); + free_gemma4_cache(cache); + free_gemma4_target_weights(w); + ggml_backend_free(backend); + std::printf("PASS\n"); + return 0; +} diff --git a/dflash/test/gemma4/smoke_load_gemma4_draft.cpp b/dflash/test/gemma4/smoke_load_gemma4_draft.cpp new file mode 100644 index 00000000..1cc7d638 --- /dev/null +++ b/dflash/test/gemma4/smoke_load_gemma4_draft.cpp @@ -0,0 +1,113 @@ +// Smoke test: load Gemma4 DFlash draft weights from a safetensors directory. +// +// Usage: smoke_load_gemma4_draft + +#include "internal.h" +#include "gemma4.h" + +#include "ggml.h" +#include "ggml-backend.h" +#include "ggml-cuda.h" + +#include +#include +#include +#include +#include + +using namespace dflash27b; + +static void fail(const char * msg) { + std::fprintf(stderr, "FAIL: %s\n", msg); + std::exit(1); +} + +int main(int argc, char ** argv) { + if (argc < 2) { + std::fprintf(stderr, "usage: %s \n", argv[0]); + return 2; + } + + ggml_backend_t backend = ggml_backend_cuda_init(0); + if (!backend) { + std::fprintf(stderr, "ggml_backend_cuda_init(0) failed\n"); + return 1; + } + std::printf("cuda backend: %s\n", ggml_backend_name(backend)); + + GemmaDraftWeights dw; + if (!load_gemma4_draft_safetensors(argv[1], backend, dw)) { + std::fprintf(stderr, "load_gemma4_draft_safetensors failed: %s\n", + dflash27b_last_error()); + ggml_backend_free(backend); + return 1; + } + + // Print loaded metadata + std::printf("n_layer=%d n_head=%d n_head_kv=%d head_dim=%d n_embd=%d n_ff=%d n_vocab=%d\n", + dw.n_layer, dw.n_head, dw.n_head_kv, dw.head_dim, + dw.n_embd, dw.n_ff, dw.n_vocab); + std::printf("n_target_layers=%d target_hidden=%d logit_softcap=%.1f\n", + dw.n_target_layers, dw.target_hidden, dw.logit_softcap); + + // Assert expected draft topology + if (dw.n_layer != 5) { + char buf[64]; + std::snprintf(buf, sizeof(buf), "n_layer=%d expected 5", dw.n_layer); + fail(buf); + } + if (dw.n_vocab != 262144) { + char buf[64]; + std::snprintf(buf, sizeof(buf), "n_vocab=%d expected 262144", dw.n_vocab); + fail(buf); + } + if (!dw.fc) fail("fc is null"); + + // Validate fc shape: ne[0] = 6*target_hidden (input features), ne[1] = draft_hidden (output) + // In ggml convention: ne[0] is the fast (inner) dimension of matrix multiply, + // so fc has ne[0]=6*target_hidden and ne[1]=draft_hidden. 
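+    // Concretely (illustrative numbers taken from the draft forward smoke
+    // test): with n_target_layers=6 and target_hidden=4096, fc projects a
+    // 24576-wide concatenated feature row down to draft_hidden, so we expect
+    // ne[0] = 6 * 4096 = 24576 here.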
+ const int64_t expected_fc_ne0 = (int64_t)dw.n_target_layers * dw.target_hidden; + std::printf("fc: ne=[%" PRId64 ", %" PRId64 "] type=%s\n", + dw.fc->ne[0], dw.fc->ne[1], + ggml_type_name(dw.fc->type)); + if (dw.fc->ne[0] != expected_fc_ne0) { + char buf[128]; + std::snprintf(buf, sizeof(buf), + "fc->ne[0]=%" PRId64 " expected %" PRId64 " (n_target_layers=%d * target_hidden=%d)", + dw.fc->ne[0], expected_fc_ne0, dw.n_target_layers, dw.target_hidden); + fail(buf); + } + + // Assert layers vector size + if ((int)dw.layers.size() != dw.n_layer) { + char buf[64]; + std::snprintf(buf, sizeof(buf), + "layers.size()=%zu expected %d", dw.layers.size(), dw.n_layer); + fail(buf); + } + + // Spot-check layer 0 key tensors + if (!dw.layers[0].wq) fail("layers[0].wq is null"); + if (!dw.layers[0].wk) fail("layers[0].wk is null"); + if (!dw.layers[0].w_gate) fail("layers[0].w_gate is null"); + + // Print layer 0 shape as spot check + std::printf("layers[0].wq: ne=[%" PRId64 ", %" PRId64 "] type=%s\n", + dw.layers[0].wq->ne[0], dw.layers[0].wq->ne[1], + ggml_type_name(dw.layers[0].wq->type)); + + // Validate hidden_norm and out_norm + if (!dw.hidden_norm) fail("hidden_norm is null"); + if (!dw.out_norm) fail("out_norm is null"); + // tok_embd is NOT loaded from the draft safetensors; it is injected at + // runtime from the target model's token embedding table. + if (dw.tok_embd) fail("tok_embd should be null after loading draft (shared with target)"); + + std::printf("hidden_norm: ne[0]=%" PRId64 " type=%s\n", + dw.hidden_norm->ne[0], ggml_type_name(dw.hidden_norm->type)); + + free_gemma4_draft_weights(dw); + ggml_backend_free(backend); + std::printf("PASS\n"); + return 0; +} diff --git a/dflash/test/gemma4/smoke_load_gemma4_target.cpp b/dflash/test/gemma4/smoke_load_gemma4_target.cpp new file mode 100644 index 00000000..fc70c1dc --- /dev/null +++ b/dflash/test/gemma4/smoke_load_gemma4_target.cpp @@ -0,0 +1,115 @@ +// Smoke test: load a Gemma4 target GGUF, validate metadata and tensor shapes. 
+// +// Usage: smoke_load_gemma4_target + +#include "internal.h" +#include "gemma4.h" + +#include "ggml.h" +#include "ggml-backend.h" +#include "ggml-cuda.h" + +#include +#include +#include +#include +#include + +using namespace dflash27b; + +static void fail(const char * msg) { + std::fprintf(stderr, "FAIL: %s\n", msg); + std::exit(1); +} + +int main(int argc, char ** argv) { + if (argc < 2) { + std::fprintf(stderr, "usage: %s \n", argv[0]); + return 2; + } + + ggml_backend_t backend = ggml_backend_cuda_init(0); + if (!backend) { std::fprintf(stderr, "cuda init failed\n"); return 1; } + + GemmaTargetWeights w; + if (!load_gemma4_target_gguf(argv[1], backend, w)) { + std::fprintf(stderr, "load_gemma4_target_gguf failed: %s\n", dflash27b_last_error()); + ggml_backend_free(backend); + return 1; + } + + // Print architecture metadata + std::printf("hparams: n_layer=%d n_embd=%d n_head=%d n_head_kv=%d head_dim=%d " + "n_vocab=%d n_ff=%d\n", + w.n_layer, w.n_embd, w.n_head, w.n_head_kv, w.head_dim, + w.n_vocab, w.n_ff); + + // Count SWA vs full-attention layers + int n_swa = 0, n_full = 0; + for (int il = 0; il < w.n_layer; il++) { + if (il < (int)w.swa_layers.size() && w.swa_layers[il]) n_swa++; + else n_full++; + } + std::printf("swa_layers: swa=%d full=%d (total=%d)\n", n_swa, n_full, w.n_layer); + + // Print KV-sharing config + std::printf("kv_sharing: n_kv_shared_layers=%d n_layer_kv=%d\n", + w.n_kv_shared_layers, w.n_layer_kv); + + // Print Per-Layer Embedding dimension + std::printf("n_embd_per_layer=%d\n", w.n_embd_per_layer); + + // Print MoE config (0 for dense) + std::printf("moe: n_expert=%d n_expert_used=%d\n", w.n_expert, w.n_expert_used); + + // Print attention config + std::printf("logit_softcap=%.2f attn_scale=%.4f rope_theta=%.0f\n", + w.logit_softcap, w.attn_scale, w.rope_theta); + + // Print captured layer IDs for the DFlash draft + std::printf("capture_layer_ids:"); + for (int i = 0; i < w.n_capture_layers; i++) { + std::printf(" %d", w.capture_layer_ids[i]); + } + std::printf("\n"); + + // Assertions + if (w.n_vocab != 262144) { + char buf[64]; + std::snprintf(buf, sizeof(buf), "n_vocab=%d expected 262144", w.n_vocab); + fail(buf); + } + if (w.logit_softcap != 30.0f) { + char buf[64]; + std::snprintf(buf, sizeof(buf), "logit_softcap=%.2f expected 30.0", w.logit_softcap); + fail(buf); + } + if (w.n_layer_kv <= 0) { + fail("n_layer_kv must be > 0"); + } + if (w.n_layer_kv > w.n_layer) { + char buf[64]; + std::snprintf(buf, sizeof(buf), "n_layer_kv=%d > n_layer=%d", w.n_layer_kv, w.n_layer); + fail(buf); + } + + // Spot-check layer 0 tensors + if (!w.layers[0].wq) fail("layers[0].wq is null"); + if (!w.layers[0].wo) fail("layers[0].wo is null"); + if (!w.layers[0].w_gate) fail("layers[0].w_gate is null"); + + // Spot-check tok_embd and output + if (!w.tok_embd) fail("tok_embd is null"); + if (!w.output) fail("output (lm_head) is null"); + if (!w.out_norm) fail("out_norm is null"); + + std::printf("tok_embd: ne=[%" PRId64 ", %" PRId64 "] type=%s nbytes=%.2f MiB\n", + w.tok_embd->ne[0], w.tok_embd->ne[1], + ggml_type_name(w.tok_embd->type), + ggml_nbytes(w.tok_embd) / (1024.0 * 1024.0)); + + free_gemma4_target_weights(w); + ggml_backend_free(backend); + std::printf("PASS\n"); + return 0; +} diff --git a/dflash/test/gemma4/test_gemma4_dflash.cpp b/dflash/test/gemma4/test_gemma4_dflash.cpp new file mode 100644 index 00000000..8bd48acb --- /dev/null +++ b/dflash/test/gemma4/test_gemma4_dflash.cpp @@ -0,0 +1,2604 @@ +// Gemma4 DFlash speculative decoding end-to-end test / benchmark 
driver. +// +// Pipeline: +// 1. Load target (Gemma4-31B or 26B-A4B GGUF) + draft (z-lab Gemma4-DFlash +// safetensors directory). +// 2. Prefill: chunked batched forward over prompt tokens (up to swa_window +// tokens per chunk), capture_layers=true so target_feat gets populated. +// 3. Decode loop (until n_predict): +// a. [target-only path, always active] +// Run target forward for last committed token → logits → sample next. +// b. [speculative path, active when draft is loaded] +// i. Get target_feat from cache. +// ii. Run draft model to propose a block of tokens. +// iii. Verify proposals against target in one batched forward. +// iv. Accept longest verified prefix + bonus token, advance cache. +// 4. Print generated text and timing stats. +// +// Usage: +// test_gemma4_dflash --model [--draft ] +// [--prompt ] [--n-predict ] +// [--ctx-size ] [--kv-k ] [--kv-v ] +// [--seed ] [--temp ] [--top-k ] [--top-p ] +// [--budget ] [--gpu ] [--bench] + +#include "internal.h" +#include "dflash27b.h" +#include "gemma4.h" +#include "ggml.h" +#include "ggml-alloc.h" +#include "ggml-backend.h" +#include "ggml-cuda.h" +#include +#include "../src/pflash_ggml_adapter.h" + +#ifdef _WIN32 +#define setenv(name, value, overwrite) _putenv_s(name, value) +#define unsetenv(name) _putenv_s(name, "") +#endif + +#include +#include +#include +#include +#include +#include +#include +#include +#include +#include +#include +#include +#include +#include +#include + +#ifdef _WIN32 +#include +#else +#include +#endif + +using namespace dflash27b; + +// Copy n_tokens rows of width feat_w from a BF16 ring-buffer tensor (src_bf16) +// starting at ring slot ring_slot0 into a contiguous F32 tensor (dst_f32). +// Uses ggml_cpy with ggml_view_2d for type conversion on the GPU backend — +// replaces the former dflash27b_launch_bf16_to_f32 custom kernel (f16_convert.cu), +// removed per howard0su's review (r3214289240): ggml_cpy does the same thing. +static void copy_target_feat_bf16_to_f32( + ggml_backend_t backend, + const ggml_tensor * src_bf16, // [feat_w, cap] BF16 (cache.target_feat) + ggml_tensor * dst_f32, // [feat_w, n_tokens] F32 (pkg.target_feat) + int ring_slot0, + int n_tokens, + int feat_w) { + const int cap = (int)src_bf16->ne[1]; + const int pre_n = std::min(n_tokens, cap - ring_slot0); + const int post_n = n_tokens - pre_n; + + ggml_init_params ip{}; + ip.mem_size = 256 * 1024; + ip.mem_buffer = nullptr; + ip.no_alloc = true; + ggml_context * tmp_ctx = ggml_init(ip); + + ggml_cgraph * gf = ggml_new_graph(tmp_ctx); + + // Pre-wrap segment: rows [ring_slot0 .. 
ring_slot0+pre_n-1] → dst rows [0..pre_n-1]
+    {
+        ggml_tensor * s = ggml_view_2d(tmp_ctx, src_bf16, feat_w, pre_n,
+                                       src_bf16->nb[1],
+                                       (size_t)ring_slot0 * src_bf16->nb[1]);
+        ggml_tensor * d = ggml_view_2d(tmp_ctx, dst_f32, feat_w, pre_n,
+                                       dst_f32->nb[1], 0);
+        ggml_build_forward_expand(gf, ggml_cpy(tmp_ctx, s, d));
+    }
+    // Post-wrap segment: rows [0..post_n-1] → dst rows [pre_n..pre_n+post_n-1]
+    if (post_n > 0) {
+        ggml_tensor * s = ggml_view_2d(tmp_ctx, src_bf16, feat_w, post_n,
+                                       src_bf16->nb[1], 0);
+        ggml_tensor * d = ggml_view_2d(tmp_ctx, dst_f32, feat_w, post_n,
+                                       dst_f32->nb[1],
+                                       (size_t)pre_n * dst_f32->nb[1]);
+        ggml_build_forward_expand(gf, ggml_cpy(tmp_ctx, s, d));
+    }
+
+    ggml_backend_graph_compute(backend, gf);
+    ggml_free(tmp_ctx);
+}
+
+// ─── Utilities ────────────────────────────────────────────────────────────
+
+static constexpr int KQ_MASK_PAD = 32;
+static constexpr uint16_t F16_ZERO = 0x0000;
+static constexpr uint16_t F16_NEG_INF = 0xFC00;
+
+static int g_kq_stride_pad = KQ_MASK_PAD;
+
+static int align_up(int x, int a) { return ((x + a - 1) / a) * a; }
+
+static int argmax_f32(const float * x, int n) {
+    int best = 0;
+    float bv = x[0];
+    for (int i = 1; i < n; i++) {
+        if (x[i] > bv) { bv = x[i]; best = i; }
+    }
+    return best;
+}
+
+// ─── Sampler ──────────────────────────────────────────────────────────────
+
+struct SamplerCfg {
+    float temp = 0.0f;
+    float top_p = 1.0f;
+    int top_k = 0;
+    float rep_pen = 1.0f;
+    int rep_window = 256;
+    uint64_t seed = 0;
+};
+
+static int sample_logits(const float * logits_in,
+                         int vocab,
+                         const SamplerCfg & cfg,
+                         const std::vector<int32_t> & history,
+                         std::mt19937_64 & rng) {
+    if (cfg.temp <= 0.0f) {
+        return argmax_f32(logits_in, vocab);
+    }
+
+    std::vector<std::pair<float, int>> cand(vocab);
+    for (int i = 0; i < vocab; i++) cand[i] = {logits_in[i], i};
+
+    if (cfg.rep_pen > 1.0f && !history.empty()) {
+        const int win = std::min((int)history.size(), cfg.rep_window);
+        const int from = (int)history.size() - win;
+        std::unordered_set<int32_t> seen;
+        for (int i = from; i < (int)history.size(); i++) seen.insert(history[i]);
+        for (auto & c : cand) {
+            if (seen.count(c.second)) {
+                c.first = (c.first > 0.0f) ? c.first / cfg.rep_pen
+                                           : c.first * cfg.rep_pen;
+            }
+        }
+    }
+
+    if (cfg.top_k > 0 && cfg.top_k < vocab) {
+        std::partial_sort(cand.begin(), cand.begin() + cfg.top_k, cand.end(),
+                          [](const auto & a, const auto & b) { return a.first > b.first; });
+        cand.resize(cfg.top_k);
+    } else {
+        std::sort(cand.begin(), cand.end(),
+                  [](const auto & a, const auto & b) { return a.first > b.first; });
+    }
+
+    const float inv_t = 1.0f / std::max(1e-3f, cfg.temp);
+    float maxv = cand.front().first * inv_t;
+    double Z = 0.0;
+    std::vector<float> probs(cand.size());
+    for (size_t i = 0; i < cand.size(); i++) {
+        probs[i] = std::exp(cand[i].first * inv_t - maxv);
+        Z += probs[i];
+    }
+    for (auto & p : probs) p = (float)(p / Z);
+
+    if (cfg.top_p > 0.0f && cfg.top_p < 1.0f) {
+        double cum = 0.0;
+        size_t cut = probs.size();
+        for (size_t i = 0; i < probs.size(); i++) {
+            cum += probs[i];
+            if (cum >= cfg.top_p) { cut = i + 1; break; }
+        }
+        probs.resize(cut);
+        cand.resize(cut);
+        double zz = 0.0;
+        for (auto p : probs) zz += p;
+        for (auto & p : probs) p = (float)(p / zz);
+    }
+
+    std::uniform_real_distribution<double> u(0.0, 1.0);
+    double r = u(rng);
+    double acc = 0.0;
+    for (size_t i = 0; i < probs.size(); i++) {
+        acc += probs[i];
+        if (r <= acc) return cand[i].second;
+    }
+    return cand.back().second;
+}
+
+// ─── Causal mask builder ──────────────────────────────────────────────────
+
+static void build_causal_mask(std::vector<uint16_t> & out,
+                              int kv_len, int n_tokens, int kv_start) {
+    const int kv_pad = align_up(kv_len, g_kq_stride_pad);
+    const int q_pad = align_up(n_tokens, KQ_MASK_PAD);
+    out.assign((size_t)kv_pad * q_pad, F16_NEG_INF);
+    for (int q = 0; q < n_tokens; q++) {
+        const int abs_q = kv_start + q;
+        for (int k = 0; k <= abs_q && k < kv_len; k++) {
+            out[(size_t)q * kv_pad + k] = F16_ZERO;
+        }
+    }
+}
+
+// ─── SWA causal mask builder (for chunked batched prefill) ───────────────────
+//
+// Non-monotonic ring mask. The K view is always the full ring (ring_size slots,
+// ring_win_start==0). Slot k_view maps to absolute position via:
+//   latest_slot = (kv_end - 1) % ring_size
+//   offset_back = (latest_slot - k_view + ring_size) % ring_size
+//   abs_k       = (kv_end - 1) - offset_back
+//
+// mask[q_idx][k_view_idx] = 0 (attend) iff:
+//   abs_k >= (abs_q - swa_window + 1)  AND  abs_k <= abs_q  AND  abs_k >= 0
+// else -inf.
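+// Worked example (illustrative numbers, not from a real run): ring_size=8,
+// kv_end=11 → latest_slot = 10 % 8 = 2. Slot k_view=2 holds abs_k=10 (the
+// newest position), k_view=1 holds 9, k_view=0 holds 8, and k_view=7..3 wrap
+// around to abs_k=7..3 — so the ring covers exactly the last 8 absolute
+// positions 3..10, and the loop below admits the subset of them that falls
+// inside each query's SWA window.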
+static void build_swa_causal_mask(std::vector & out, + int kv_start, + int n_tokens, + int swa_window, + int ring_size, // = swa_view.effective_win_len = swa_ctx_alloc + int kv_end) { // = kv_start + n_tokens + const int kv_pad = align_up(ring_size, g_kq_stride_pad); + const int q_pad = align_up(n_tokens, KQ_MASK_PAD); + out.assign((size_t)kv_pad * q_pad, F16_NEG_INF); + const int latest_slot = ((kv_end - 1) % ring_size + ring_size) % ring_size; + for (int q = 0; q < n_tokens; q++) { + const int abs_q = kv_start + q; + const int q_lo = std::max(0, abs_q - swa_window + 1); + for (int k_view = 0; k_view < ring_size; k_view++) { + const int offset_back = (latest_slot - k_view + ring_size) % ring_size; + const int abs_k = (kv_end - 1) - offset_back; + const bool valid = (abs_k >= q_lo && abs_k <= abs_q && abs_k >= 0); + if (valid) { + out[(size_t)q * kv_pad + k_view] = F16_ZERO; + } + } + } +} + +// ─── Per-step graph state (rebuilt each forward pass since kv_len varies) ─ + +struct StepGraph { + ggml_context * ctx = nullptr; + ggml_cgraph * gf = nullptr; + ggml_gallocr_t alloc = nullptr; + ggml_tensor * inp_embed = nullptr; + ggml_tensor * positions = nullptr; + ggml_tensor * attn_mask = nullptr; + ggml_tensor * swa_mask = nullptr; + ggml_tensor * logits = nullptr; +}; + +static void step_graph_free(StepGraph & sg) { + if (sg.ctx) { ggml_free(sg.ctx); sg.ctx = nullptr; } + sg.gf = nullptr; + sg.inp_embed = nullptr; + sg.positions = nullptr; + sg.attn_mask = nullptr; + sg.swa_mask = nullptr; + sg.logits = nullptr; +} + +static void step_graph_destroy(StepGraph & sg) { + if (sg.alloc) { ggml_gallocr_free(sg.alloc); sg.alloc = nullptr; } + step_graph_free(sg); +} + +// ─── Draft step graph state ─────────────────────────────────────────────── + +struct DraftStepGraph { + ggml_context * ctx = nullptr; + ggml_cgraph * gf = nullptr; + ggml_gallocr_t alloc = nullptr; + ggml_tensor * draft_embed = nullptr; + ggml_tensor * positions = nullptr; + ggml_tensor * attn_mask = nullptr; + ggml_tensor * logits = nullptr; +}; + +static void draft_step_free(DraftStepGraph & dsg) { + if (dsg.ctx) { ggml_free(dsg.ctx); dsg.ctx = nullptr; } + dsg.gf = nullptr; + dsg.draft_embed = nullptr; + dsg.positions = nullptr; + dsg.attn_mask = nullptr; + dsg.logits = nullptr; +} + +static void draft_step_destroy(DraftStepGraph & dsg) { + if (dsg.alloc) { ggml_gallocr_free(dsg.alloc); dsg.alloc = nullptr; } + draft_step_free(dsg); +} + +// ─── Draft KV prefill graph state ──────────────────────────────────────────── + +struct DraftKVPrefillGraph { + ggml_context * ctx = nullptr; + ggml_cgraph * gf = nullptr; + ggml_gallocr_t alloc = nullptr; + ggml_tensor * target_feat = nullptr; // input: [6*target_hidden, n_tokens] + ggml_tensor * positions = nullptr; // input: [n_tokens] i32 +}; + +static void draft_kv_prefill_free(DraftKVPrefillGraph & pkg) { + if (pkg.ctx) { ggml_free(pkg.ctx); pkg.ctx = nullptr; } + pkg.gf = nullptr; + pkg.target_feat = nullptr; + pkg.positions = nullptr; +} + +static void draft_kv_prefill_destroy(DraftKVPrefillGraph & pkg) { + if (pkg.alloc) { ggml_gallocr_free(pkg.alloc); pkg.alloc = nullptr; } + draft_kv_prefill_free(pkg); +} + +// Build a single-step target forward graph. 
+// n_tokens - number of tokens in this forward (1 for decode, >1 for prefill) +// kv_start - index of the first new token in the KV cache +// with_mask - whether to allocate an attention-mask input (required for n_tokens > 1) +// capture - whether to write captured layer features to cache.target_feat +static bool build_gemma4_step(StepGraph & sg, + const GemmaTargetWeights & w, + GemmaTargetCache & cache, + ggml_backend_t backend, + int kv_start, + int n_tokens, + bool with_mask, + bool capture, + bool use_pflash = false, + float pflash_alpha = 0.12f, + int fa_window = 0, + bool last_token_logits_only = false) { + step_graph_free(sg); + + ggml_init_params ip{}; + ip.mem_size = 512 * 1024 * 1024; + ip.mem_buffer = nullptr; + ip.no_alloc = true; + sg.ctx = ggml_init(ip); + if (!sg.ctx) return false; + + sg.inp_embed = ggml_new_tensor_3d(sg.ctx, GGML_TYPE_F32, w.n_embd, n_tokens, 1); + ggml_set_name(sg.inp_embed, "inp_embed"); + ggml_set_input(sg.inp_embed); + + sg.positions = ggml_new_tensor_1d(sg.ctx, GGML_TYPE_I32, n_tokens); + ggml_set_name(sg.positions, "positions"); + ggml_set_input(sg.positions); + + if (with_mask) { + const int kv_len = kv_start + n_tokens; + const int kv_pad = align_up(kv_len, g_kq_stride_pad); + const int q_pad = align_up(n_tokens, KQ_MASK_PAD); + + sg.attn_mask = ggml_new_tensor_2d(sg.ctx, GGML_TYPE_F16, kv_pad, q_pad); + ggml_set_name(sg.attn_mask, "attn_mask"); + ggml_set_input(sg.attn_mask); + ggml_set_output(sg.attn_mask); // force gallocr to allocate even if no op references it + + // SWA mask is required for every SWA dispatch — including single-token + // decode (n_tokens==1). When swa_mask is null, gemma4_target_graph falls + // back to attn_mask, which is sized for kv_len rather than the SWA window; + // the resulting dimension mismatch lets FA read past the populated cache + // region and corrupts attention. Catastrophic with TQ3_0 KV (it amplifies + // uninitialized-cache noise into a fixed-point repetition loop), benign + // but technically wrong with Q8_0 KV. + const SwaView swa_view = compute_swa_view(kv_start, n_tokens, + w.swa_window, cache.swa_ctx_alloc); + const int swa_kv_pad = align_up(swa_view.effective_win_len, g_kq_stride_pad); + sg.swa_mask = ggml_new_tensor_2d(sg.ctx, GGML_TYPE_F16, swa_kv_pad, q_pad); + ggml_set_name(sg.swa_mask, "swa_mask"); + ggml_set_input(sg.swa_mask); + ggml_set_output(sg.swa_mask); // force gallocr to allocate even if no op references it + } + + sg.gf = ggml_new_graph_custom(sg.ctx, 16384, false); + + GemmaGraphInputs gi{}; + gi.inp_embed = sg.inp_embed; + gi.positions = sg.positions; + gi.attn_mask = sg.attn_mask; + gi.swa_mask = sg.swa_mask; + gi.n_tokens = n_tokens; + gi.kv_start = kv_start; + gi.capture_layers = capture; + gi.fa_window = fa_window; + gi.use_pflash = use_pflash; + gi.pflash_alpha = pflash_alpha; + gi.last_token_logits_only = last_token_logits_only; + + GemmaGraphOutputs go = build_gemma4_graph(sg.ctx, sg.gf, w, cache, gi); + if (!go.logits) return false; + sg.logits = go.logits; + ggml_set_output(sg.logits); + ggml_build_forward_expand(sg.gf, sg.logits); + + if (!sg.alloc) { + sg.alloc = ggml_gallocr_new(ggml_backend_get_default_buffer_type(backend)); + } + return ggml_gallocr_alloc_graph(sg.alloc, sg.gf); +} + +// Build a draft KV prefill graph: project target features → draft KV cache. 
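+// Call sequence (sketch, mirroring the draft forward smoke test): build once
+// for a given chunk size, upload target_feat and positions, run
+// ggml_backend_graph_compute, then advance cache.draft_kv_pos by n_tokens so
+// the next prefill or decode step appends after the freshly materialized
+// K,V rows.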
+static bool build_draft_kv_prefill(DraftKVPrefillGraph & pkg, + const GemmaDraftWeights & dw, + GemmaTargetCache & cache, + ggml_backend_t backend, + int n_tokens) { + // Free previous graph state + if (pkg.ctx) { ggml_free(pkg.ctx); pkg.ctx = nullptr; } + pkg.gf = nullptr; + pkg.target_feat = nullptr; + pkg.positions = nullptr; + + const int target_feat_w = dw.n_target_layers * dw.target_hidden; + + ggml_init_params ip{}; + ip.mem_size = 256 * 1024 * 1024; + ip.mem_buffer = nullptr; + ip.no_alloc = true; + pkg.ctx = ggml_init(ip); + if (!pkg.ctx) return false; + + pkg.target_feat = ggml_new_tensor_2d(pkg.ctx, GGML_TYPE_F32, target_feat_w, n_tokens); + ggml_set_name(pkg.target_feat, "prefill_target_feat"); + ggml_set_input(pkg.target_feat); + + pkg.positions = ggml_new_tensor_1d(pkg.ctx, GGML_TYPE_I32, n_tokens); + ggml_set_name(pkg.positions, "prefill_positions"); + ggml_set_input(pkg.positions); + + pkg.gf = ggml_new_graph_custom(pkg.ctx, 4096, false); + + build_draft_kv_prefill_graph(pkg.ctx, pkg.gf, dw, cache, + pkg.target_feat, pkg.positions, n_tokens); + + if (!pkg.alloc) { + pkg.alloc = ggml_gallocr_new(ggml_backend_get_default_buffer_type(backend)); + } + return ggml_gallocr_alloc_graph(pkg.alloc, pkg.gf); +} + +// Build a draft model forward graph for one diffusion step. +static bool build_draft_step(DraftStepGraph & dsg, + const GemmaDraftWeights & dw, + GemmaTargetCache & cache, + ggml_backend_t backend, + int n_tokens, + int kv_start) { + draft_step_free(dsg); + + ggml_init_params ip{}; + ip.mem_size = 256 * 1024 * 1024; + ip.mem_buffer = nullptr; + ip.no_alloc = true; + dsg.ctx = ggml_init(ip); + if (!dsg.ctx) return false; + + dsg.draft_embed = ggml_new_tensor_2d(dsg.ctx, GGML_TYPE_F32, dw.n_embd, n_tokens); + ggml_set_name(dsg.draft_embed, "draft_embed"); + ggml_set_input(dsg.draft_embed); + + dsg.positions = ggml_new_tensor_1d(dsg.ctx, GGML_TYPE_I32, n_tokens); + ggml_set_name(dsg.positions, "positions"); + ggml_set_input(dsg.positions); + + // Attention mask: block tokens attend to context + block (causal). + const int kv_len = kv_start + n_tokens; + const int kv_pad = align_up(kv_len, KQ_MASK_PAD); + const int q_pad = align_up(n_tokens, KQ_MASK_PAD); + dsg.attn_mask = ggml_new_tensor_2d(dsg.ctx, GGML_TYPE_F16, kv_pad, q_pad); + ggml_set_name(dsg.attn_mask, "draft_attn_mask"); + ggml_set_input(dsg.attn_mask); + + dsg.gf = ggml_new_graph_custom(dsg.ctx, 8192, false); + dsg.logits = build_gemma4_draft_graph( + dsg.ctx, dsg.gf, dw, cache, + dsg.draft_embed, dsg.positions, dsg.attn_mask, + n_tokens, kv_start); + if (!dsg.logits) return false; + ggml_set_output(dsg.logits); + ggml_build_forward_expand(dsg.gf, dsg.logits); + + if (!dsg.alloc) { + dsg.alloc = ggml_gallocr_new(ggml_backend_get_default_buffer_type(backend)); + } + return ggml_gallocr_alloc_graph(dsg.alloc, dsg.gf); +} + +// ─── Embed one token into the inp_embed input tensor ───────────────────── + +static bool embed_token(const GemmaTargetWeights & w, + int32_t token_id, + ggml_tensor * inp_embed, + ggml_backend_t backend) { + const int hidden = w.n_embd; + std::vector emb(hidden); + if (!w.embedder.embed(&token_id, 1, emb.data())) { + std::fprintf(stderr, "[embed] failed for token %d\n", token_id); + return false; + } + // inp_embed shape: [hidden, 1, 1] + ggml_backend_tensor_set(inp_embed, emb.data(), 0, sizeof(float) * hidden); + (void)backend; + return true; +} + +// Embed a batch of tokens (for chunked prefill). 
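+// Usage sketch (hypothetical `off`/`chunk` variables): during chunked prefill
+// the driver walks the prompt in windows, e.g.
+//     embed_tokens_batch(w, prompt.data() + off, chunk, sg.inp_embed, backend);
+// filling inp_embed with `chunk` consecutive embedding rows before one
+// batched target forward.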
+static bool embed_tokens_batch(const GemmaTargetWeights & w,
+                               const int32_t * ids,
+                               int n,
+                               ggml_tensor * inp_embed,
+                               ggml_backend_t backend) {
+    const int hidden = w.n_embd;
+    std::vector<float> emb((size_t)hidden * n);
+    if (!w.embedder.embed(ids, n, emb.data())) {
+        std::fprintf(stderr, "[embed_batch] failed for %d tokens\n", n);
+        return false;
+    }
+    ggml_backend_tensor_set(inp_embed, emb.data(), 0, sizeof(float) * hidden * n);
+    (void)backend;
+    return true;
+}
+
+// ─── EOS check ────────────────────────────────────────────────────────────
+
+static bool g_ignore_eos = false;
+
+#define IS_EOS_TOK(tok, w) \
+    (!g_ignore_eos && \
+     (((w).eos_chat_id >= 0 && (tok) == (w).eos_chat_id) || \
+      ((w).eos_id >= 0 && (tok) == (w).eos_id)))
+
+// ─── KV type resolution helper ────────────────────────────────────────────
+
+static ggml_type kv_type_from_string(const std::string & s) {
+    if (s == "f16")   return GGML_TYPE_F16;
+    if (s == "q8_0")  return GGML_TYPE_Q8_0;
+    if (s == "q4_0")  return GGML_TYPE_Q4_0;
+    if (s == "tq3_0") return GGML_TYPE_TQ3_0;
+    return GGML_TYPE_Q8_0; // default
+}
+
+// ─── Nanosecond wall clock ────────────────────────────────────────────────
+
+static double now_ms() {
+    return std::chrono::duration<double, std::milli>(
+        std::chrono::steady_clock::now().time_since_epoch()).count();
+}
+
+// ─── Minimal tokenizer stub ───────────────────────────────────────────────
+//
+// A proper tokenizer (SentencePiece / tiktoken) requires linking to an
+// external library. For benchmarking purposes we provide two modes:
+//
+// 1. Pre-tokenised input via --tokens
+//    Pass comma-separated integer token IDs directly. This is the
+//    recommended path for reproducible benchmarks.
+//
+// 2. Byte-fallback: each byte of the --prompt string becomes one token.
+//    This is NOT linguistically valid but lets the driver run without any
+//    tokenizer library. Override with --tokens for real evaluation.
+
+static std::vector<int32_t> tokenize_byte_fallback(const std::string & text) {
+    std::vector<int32_t> ids;
+    ids.reserve(text.size());
+    for (unsigned char c : text) {
+        ids.push_back((int32_t)c);
+    }
+    return ids;
+}
+
+static std::vector<int32_t> parse_token_ids(const std::string & s) {
+    std::vector<int32_t> ids;
+    const char * p = s.c_str();
+    while (*p) {
+        char * end = nullptr;
+        long v = std::strtol(p, &end, 10);
+        if (end == p) break;
+        ids.push_back((int32_t)v);
+        if (*end == '\0') break;
+        if (*end == ',') { p = end + 1; continue; }
+        break;
+    }
+    return ids;
+}
+
+// ─── Binary token file helper (daemon mode) ──────────────────────────────
+
+static std::vector<int32_t> read_int32_file(const std::string & path) {
+    std::ifstream f(path, std::ios::binary | std::ios::ate);
+    if (!f) return {};
+    auto sz = (size_t)f.tellg();
+    f.seekg(0);
+    std::vector<int32_t> out(sz / sizeof(int32_t));
+    f.read((char *)out.data(), (std::streamsize)sz);
+    return out;
+}
+
+// Parse optional " samp=temp,top_p,top_k,rep_pen[,seed]" suffix from line.
+// Erases the matched suffix from line. Returns true if parsed.
+static bool parse_sampler_token(std::string & line, SamplerCfg & out) {
+    auto pos = line.find(" samp=");
+    if (pos == std::string::npos) return false;
+    auto end = line.find(' ', pos + 1);
+    std::string tok = (end == std::string::npos)
+        ? line.substr(pos + 6)
+        : line.substr(pos + 6, end - (pos + 6));
+    line.erase(pos, (end == std::string::npos ?
std::string::npos : end - pos)); + float t = 0.0f, tp = 1.0f, rp = 1.0f; + int tk = 0; + unsigned long long sd = 0; + int n = std::sscanf(tok.c_str(), "%f,%f,%d,%f,%llu", + &t, &tp, &tk, &rp, &sd); + if (n < 1) return false; + out.temp = t; + out.top_p = tp; + out.top_k = tk; + out.rep_pen = rp; + out.seed = sd; + return true; +} + +// ─── Main ───────────────────────────────────────────────────────────────── + +static void print_usage(const char * prog) { + std::fprintf(stderr, + "usage: %s --model [options]\n" + "\n" + "Options:\n" + " --model path to Gemma4 GGUF (target, required)\n" + " --draft path to z-lab DFlash safetensors directory (optional)\n" + " --prompt input prompt text (default: \"Hello, world!\")\n" + " --tokens comma-separated prompt token IDs (overrides --prompt)\n" + " --tokens-file read comma-separated token IDs from a file (for long prompts)\n" + " --n-predict max tokens to generate (default: 128)\n" + " --ctx-size max context size (default: 4096)\n" + " --kv-k KV cache K type: f16/q8_0/q4_0/tq3_0 (default: q8_0)\n" + " --kv-v KV cache V type: f16/q8_0/q4_0/tq3_0 (default: q8_0)\n" + " --seed RNG seed (default: 0)\n" + " --temp temperature, 0 = greedy (default: 0.0)\n" + " --top-k top-k sampling, 0 = disabled (default: 0)\n" + " --top-p nucleus sampling (default: 1.0)\n" + " --budget DDTree budget for speculative decoding (default: 22)\n" + " --gpu CUDA device index (default: 0)\n" + " --bench benchmark mode: repeat generation, report statistics\n" + " --fa-window sliding attention window for full layers (0 = full, default: 0)\n" + " --pflash use pFlash prefill for prompts >= 4096 tokens\n" + " --pflash-alpha pFlash block selection threshold (default: 0.12)\n" + " --draft-max DFlash draft block cap (0 = model block_size)\n" + " --draft-max-adaptive enable rolling adaptive DFlash draft cap\n" + " --draft-kv-cap override DFlash drafter KV slots\n" + " --draft-swa-trunc enable per-layer SWA truncation in the draft graph\n" + " (also DFLASH_DRAFT_SWA_TRUNC=1; helps long-prompt decode)\n" + " --mem-diag print VRAM checkpoints around major allocations\n" + "\n", + prog); +} + +// Draft method selection +enum class DraftMethod { Auto, None, Dflash, Mtp }; + +static void print_mem_diag(const char * tag) { + size_t free_bytes = 0, total_bytes = 0; + cudaMemGetInfo(&free_bytes, &total_bytes); + const double used_gb = (total_bytes - free_bytes) / (1024.0 * 1024.0 * 1024.0); + const double free_gb = free_bytes / (1024.0 * 1024.0 * 1024.0); + const double total_gb = total_bytes / (1024.0 * 1024.0 * 1024.0); + std::printf("[mem-diag] %-18s used=%.2f GB free=%.2f GB total=%.2f GB\n", + tag, used_gb, free_gb, total_gb); +} + +struct AdaptiveDraftMax { + bool enabled = false; + int current = 0; + int min_q = 1; + int max_q = 0; + int window_steps = 8; + int window_accepted = 0; + int window_capacity = 0; + int window_steps_seen = 0; + + void init(bool on, int initial, int block_size) { + enabled = on; + max_q = block_size; + current = initial > 0 ? std::min(initial, block_size) : block_size; + current = std::max(min_q, current); + } + + void observe(int accepted, int q_len, int step_no) { + if (!enabled) return; + // accept_n includes the pinned current token. Adapt on speculative + // next-token fill so dm=1 does not look artificially perfect. 
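+        // Worked example (illustrative): accepted=5 of q_len=8 contributes
+        // 4 speculative hits out of 7 speculative slots, a fill of ~0.57 —
+        // inside the [0.35, 0.78] dead band below, so `current` is unchanged.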
+ window_accepted += std::max(0, accepted - 1); + window_capacity += std::max(1, q_len - 1); + window_steps_seen++; + if (window_steps_seen < window_steps || window_capacity <= 0) return; + + const double fill = (double)window_accepted / (double)window_capacity; + const int old = current; + if (fill < 0.35 && current > min_q) { + current = std::max(min_q, current / 2); + } else if (fill > 0.78 && current < max_q) { + current = std::min(max_q, current * 2); + } + if (current != old) { + std::printf("[adaptive] step=%d fill=%.2f draft_max %d -> %d\n", + step_no, fill, old, current); + } else { + std::printf("[adaptive] step=%d fill=%.2f draft_max=%d\n", + step_no, fill, current); + } + window_accepted = 0; + window_capacity = 0; + window_steps_seen = 0; + } +}; + +int main(int argc, char ** argv) { + if (argc < 2) { + print_usage(argv[0]); + return 2; + } + + // ── Parse CLI arguments ─────────────────────────────────────────────── + std::string model_path; + std::string draft_path; + std::string mtp_path; + std::string prompt_text = "Hello, world!"; + std::string token_ids_str; + std::string tokens_file; + int n_predict = 128; + int ctx_size = 4096; + std::string kv_k_str = "q8_0"; + std::string kv_v_str = "q8_0"; + int gpu = 0; + int ddtree_budget = 22; + bool bench_mode = false; + int fa_window = 0; + bool use_pflash = false; + float pflash_alpha = 0.12f; + SamplerCfg sampler; + bool daemon_mode = false; + int stream_fd = -1; + int draft_max = 0; // 0 = use model's block_size (default 16) + bool draft_max_adaptive = false; + int draft_kv_cap_override = 0; + bool mem_diag = false; + DraftMethod draft_method = DraftMethod::Auto; + + for (int i = 1; i < argc; i++) { + auto require_next = [&](const char * flag) -> const char * { + if (i + 1 >= argc) { + std::fprintf(stderr, "error: %s requires an argument\n", flag); + std::exit(2); + } + return argv[++i]; + }; + + if (std::strcmp(argv[i], "--model") == 0) model_path = require_next("--model"); + else if (std::strcmp(argv[i], "--draft") == 0) draft_path = require_next("--draft"); + else if (std::strcmp(argv[i], "--prompt") == 0) prompt_text = require_next("--prompt"); + else if (std::strcmp(argv[i], "--tokens") == 0) token_ids_str = require_next("--tokens"); + else if (std::strcmp(argv[i], "--tokens-file") == 0) tokens_file = require_next("--tokens-file"); + else if (std::strcmp(argv[i], "--n-predict") == 0) n_predict = std::atoi(require_next("--n-predict")); + else if (std::strcmp(argv[i], "--ctx-size") == 0) ctx_size = std::atoi(require_next("--ctx-size")); + else if (std::strncmp(argv[i], "--ctx-size=", 11) == 0) ctx_size = std::atoi(argv[i] + 11); + else if (std::strcmp(argv[i], "--max-ctx") == 0) ctx_size = std::atoi(require_next("--max-ctx")); + else if (std::strncmp(argv[i], "--max-ctx=", 10) == 0) ctx_size = std::atoi(argv[i] + 10); + else if (std::strcmp(argv[i], "--kv-k") == 0) kv_k_str = require_next("--kv-k"); + else if (std::strcmp(argv[i], "--kv-v") == 0) kv_v_str = require_next("--kv-v"); + else if (std::strcmp(argv[i], "--seed") == 0) sampler.seed = (uint64_t)std::atoll(require_next("--seed")); + else if (std::strcmp(argv[i], "--temp") == 0) sampler.temp = (float)std::atof(require_next("--temp")); + else if (std::strcmp(argv[i], "--ignore-eos")== 0) g_ignore_eos = true; + else if (std::strcmp(argv[i], "--top-k") == 0) sampler.top_k = std::atoi(require_next("--top-k")); + else if (std::strcmp(argv[i], "--top-p") == 0) sampler.top_p = (float)std::atof(require_next("--top-p")); + else if (std::strcmp(argv[i], "--budget") == 0) 
ddtree_budget = std::atoi(require_next("--budget")); + else if (std::strcmp(argv[i], "--gpu") == 0) gpu = std::atoi(require_next("--gpu")); + else if (std::strcmp(argv[i], "--fa-window") == 0) fa_window = std::atoi(require_next("--fa-window")); + else if (std::strcmp(argv[i], "--bench") == 0) bench_mode = true; + else if (std::strcmp(argv[i], "--daemon") == 0) daemon_mode = true; + else if (std::strcmp(argv[i], "--pflash") == 0) use_pflash = true; + else if (std::strcmp(argv[i], "--pflash-alpha") == 0) pflash_alpha = (float)std::atof(require_next("--pflash-alpha")); + else if (std::strcmp(argv[i], "--draft-max") == 0) draft_max = std::atoi(require_next("--draft-max")); + else if (std::strcmp(argv[i], "--draft-max-adaptive") == 0) draft_max_adaptive = true; + else if (std::strcmp(argv[i], "--draft-kv-cap") == 0) draft_kv_cap_override = std::atoi(require_next("--draft-kv-cap")); + else if (std::strcmp(argv[i], "--draft-swa-trunc") == 0) ::setenv("DFLASH_DRAFT_SWA_TRUNC", "1", 1); + else if (std::strcmp(argv[i], "--mem-diag") == 0) mem_diag = true; + else if (std::strcmp(argv[i], "--mtp") == 0) mtp_path = require_next("--mtp"); + else if (std::strcmp(argv[i], "--draft-method") == 0) { + const char * m = require_next("--draft-method"); + if (std::strcmp(m, "none") == 0) draft_method = DraftMethod::None; + else if (std::strcmp(m, "dflash") == 0) draft_method = DraftMethod::Dflash; + else if (std::strcmp(m, "mtp") == 0) draft_method = DraftMethod::Mtp; + else { std::fprintf(stderr, "error: unknown --draft-method %s\n", m); return 1; } + } + else if (std::strncmp(argv[i], "--stream-fd=", 12) == 0) { + stream_fd = std::atoi(argv[i] + 12); + } + // No-op flags forwarded by server.py for Qwen3 compatibility: + else if (std::strcmp(argv[i], "--fast-rollback") == 0) { /* no-op */ } + else if (std::strcmp(argv[i], "--ddtree") == 0) { /* no-op */ } + else if (std::strncmp(argv[i], "--ddtree-budget=", 16) == 0) { /* no-op */ } + else if (std::strncmp(argv[i], "--ddtree-temp=", 14) == 0) { /* no-op */ } + else if (std::strcmp(argv[i], "--ddtree-no-chain-seed") == 0) { /* no-op */ } + else if (std::strcmp(argv[i], "--help") == 0 || + std::strcmp(argv[i], "-h") == 0) { + print_usage(argv[0]); + return 0; + } else { + std::fprintf(stderr, "warning: unknown argument: %s\n", argv[i]); + } + } + + if (model_path.empty()) { + std::fprintf(stderr, "error: --model is required\n"); + print_usage(argv[0]); + return 2; + } + + // ── Resolve Auto draft method ───────────────────────────────────────── + if (draft_method == DraftMethod::Auto) { + if (!draft_path.empty() && !mtp_path.empty()) { + std::fprintf(stderr, "error: both --draft and --mtp provided; use --draft-method to disambiguate\n"); + return 1; + } else if (!mtp_path.empty()) { + draft_method = DraftMethod::Mtp; + } else if (!draft_path.empty()) { + draft_method = DraftMethod::Dflash; + } else { + draft_method = DraftMethod::None; + } + } + if (draft_method == DraftMethod::Mtp && mtp_path.empty()) { + std::fprintf(stderr, "error: --draft-method mtp requires --mtp \n"); + return 1; + } + if (draft_method == DraftMethod::Dflash && draft_path.empty()) { + std::fprintf(stderr, "error: --draft-method dflash requires --draft \n"); + return 1; + } + + const bool have_draft = (draft_method == DraftMethod::Dflash); + const bool have_mtp = (draft_method == DraftMethod::Mtp); + + // ── Load token IDs from file if --tokens-file was specified ────────── + if (!tokens_file.empty()) { + FILE * f = fopen(tokens_file.c_str(), "r"); + if (!f) { + std::fprintf(stderr, "error: 
cannot open tokens file: %s\n", tokens_file.c_str()); + return 1; + } + fseek(f, 0, SEEK_END); + long sz = ftell(f); + rewind(f); + std::string content(sz, '\0'); + fread(&content[0], 1, sz, f); + fclose(f); + token_ids_str = content; + } + + // ── KV type env vars (consumed by create_gemma4_cache → resolve_kv_types) ─ + setenv("DFLASH27B_KV_K", kv_k_str.c_str(), 1); + setenv("DFLASH27B_KV_V", kv_v_str.c_str(), 1); + + // After argv parsing, the KV type may have been chosen via --kv-k tq3_0 / --kv-v tq3_0, + // which sets DFLASH27B_KV_K / DFLASH27B_KV_V env vars. Re-check for TQ3 here so + // g_kq_stride_pad matches the chunked-FA driver's align_up(kv_len, 256); otherwise the + // host-built mask is short and the kernel reads past its end. + auto kv_env_is_tq3 = [](const char * name) { + const char * s = std::getenv(name); + if (!s) return false; + std::string lc; + for (const char * p = s; *p; ++p) lc += (char)std::tolower((unsigned char)*p); + return lc.rfind("tq3", 0) == 0; + }; + if (kv_env_is_tq3("DFLASH27B_KV_K") || kv_env_is_tq3("DFLASH27B_KV_V")) { + g_kq_stride_pad = 256; + } + + // ── CUDA device validation ──────────────────────────────────────────── + int cuda_device_count = 0; + cudaGetDeviceCount(&cuda_device_count); + if (gpu >= cuda_device_count) { + std::fprintf(stderr, "error: --gpu %d out of range (device_count=%d)\n", + gpu, cuda_device_count); + return 2; + } + cudaSetDevice(gpu); + + // Detect <=24 GiB CUDA devices and emit a runtime warning if VMM is enabled. + // Note: GGML_CUDA_NO_VMM is compile-time only (CMake option that adds + // compile_definitions). Setting it via setenv() at runtime has no effect on + // ggml-cuda — it's not read via getenv. The real safeguard is to rebuild + // with `cmake -DGGML_CUDA_NO_VMM=ON ..`. + { + int dev_count = 0; + if (cudaGetDeviceCount(&dev_count) == cudaSuccess) { + for (int i = 0; i < dev_count; ++i) { + cudaDeviceProp prop{}; + if (cudaGetDeviceProperties(&prop, i) != cudaSuccess) continue; + const size_t gib = (size_t)(prop.totalGlobalMem / (1ull << 30)); +#ifndef GGML_CUDA_NO_VMM + if (gib <= 24) { + std::fprintf(stderr, + "[dflash] WARNING: detected CUDA device %d (%s) with %zu GiB VRAM.\n" + "[dflash] Long-context prefill on <=24 GiB cards is significantly\n" + "[dflash] slower with VMM enabled. Consider rebuilding with:\n" + "[dflash] cmake -DGGML_CUDA_NO_VMM=ON ..\n", + i, prop.name, gib); + } +#endif + } + } + } + + std::printf("[cfg] model=%s draft=%s method=%s gpu=%d ctx=%d n_predict=%d kv_k=%s kv_v=%s " + "temp=%.2f top_k=%d top_p=%.2f budget=%d bench=%d fa_window=%d " + "draft_max=%d adaptive=%d draft_kv_cap_override=%d pflash=%d pflash_alpha=%.3f\n", + model_path.c_str(), + draft_path.empty() ? "(none)" : draft_path.c_str(), + draft_method == DraftMethod::Dflash ? "dflash" : + draft_method == DraftMethod::Mtp ? "mtp" : + draft_method == DraftMethod::None ? 
"none" : "auto", + gpu, ctx_size, n_predict, + kv_k_str.c_str(), kv_v_str.c_str(), + sampler.temp, sampler.top_k, sampler.top_p, + ddtree_budget, (int)bench_mode, fa_window, + draft_max, (int)draft_max_adaptive, draft_kv_cap_override, + (int)use_pflash, pflash_alpha); + + // ── Backend init ────────────────────────────────────────────────────── + ggml_backend_t backend = ggml_backend_cuda_init(gpu); + if (!backend) { + std::fprintf(stderr, "error: ggml_backend_cuda_init(%d) failed\n", gpu); + return 1; + } + if (mem_diag) print_mem_diag("after-backend"); + + // Register the pFlash GGML custom kernel so ggml_flash_attn_sparse ops + // dispatched from build_gemma4_graph (full-attention layers, use_pflash=true) + // have a backend implementation available. + if (use_pflash) { + pflash_register_ggml_kernel(); + } + + // ── Load target weights ─────────────────────────────────────────────── + GemmaTargetWeights w; + { + double t0 = now_ms(); + if (!load_gemma4_target_gguf(model_path, backend, w)) { + std::fprintf(stderr, "load_gemma4_target_gguf: %s\n", dflash27b_last_error()); + return 1; + } + double t1 = now_ms(); + std::printf("[target] loaded %d layers, n_embd=%d, vocab=%d (%.1f ms)\n", + w.n_layer, w.n_embd, w.n_vocab, t1 - t0); + if (mem_diag) print_mem_diag("after-target-load"); + } + + // ── Load draft weights (optional) ──────────────────────────────────── + // Draft state: declared in main scope so they persist across bench iterations + // and are accessible in cleanup. + GemmaDraftWeights dw; + ggml_context * tok_embd_ctx = nullptr; + ggml_backend_buffer_t tok_embd_buf = nullptr; + + if (have_draft) { + double t0 = now_ms(); + // Auto-detect: + // 1. If path ends with .gguf, use GGUF loader directly + // 2. If path is a directory containing draft-q8_0.gguf, prefer it + // (Q8 GGUF is ~2x smaller than the BF16 safetensors and avoids + // a memory-pressure perf trap on Dense + TQ3 KV that drops + // target prefill 20x; see commit notes for details) + // 3. Otherwise fall back to safetensors loader + std::string resolved_draft_path = draft_path; + bool is_gguf = (draft_path.size() >= 5 && + draft_path.compare(draft_path.size() - 5, 5, ".gguf") == 0); + if (!is_gguf) { + // Check if path is a directory with a draft-q8_0.gguf inside + const std::string candidate = draft_path + "/draft-q8_0.gguf"; + std::ifstream probe(candidate.c_str()); + if (probe.good()) { + resolved_draft_path = candidate; + is_gguf = true; + std::fprintf(stderr, + "[draft] auto-selected Q8 GGUF: %s\n" + " (%s also present; Q8 is ~2x smaller and ~20x faster on Dense+TQ3)\n", + candidate.c_str(), + (draft_path + "/model.safetensors").c_str()); + } + } + bool ok = false; + if (is_gguf) { + ok = load_gemma4_draft_gguf(resolved_draft_path, backend, dw); + if (!ok) std::fprintf(stderr, "load_gemma4_draft_gguf: %s\n", dflash27b_last_error()); + } else { + ok = load_gemma4_draft_safetensors(resolved_draft_path, backend, dw); + if (!ok) std::fprintf(stderr, "load_gemma4_draft_safetensors: %s\n", dflash27b_last_error()); + } + if (!ok) return 1; + double t1 = now_ms(); + if (mem_diag) print_mem_diag("after-draft-load"); + + // Upload tok_embd from target embedder to GPU (tied lm_head for draft). + // tw.embedder keeps the bytes CPU-side; we upload once and inject a pointer. 
+ { + ggml_init_params ep{}; + ep.mem_size = ggml_tensor_overhead() * 2; + ep.mem_buffer = nullptr; + ep.no_alloc = true; + tok_embd_ctx = ggml_init(ep); + if (!tok_embd_ctx) { + std::fprintf(stderr, "[draft] ggml_init for tok_embd failed\n"); + return 1; + } + + const ggml_type emb_type = w.embedder.tok_embd_type; + const int64_t n_embd_t = w.embedder.n_embd; + const int64_t n_vocab_t = w.embedder.n_vocab; + + // ggml convention: ne[0] = n_embd (fast axis), ne[1] = n_vocab + ggml_tensor * te = ggml_new_tensor_2d(tok_embd_ctx, emb_type, n_embd_t, n_vocab_t); + ggml_set_name(te, "tok_embd_gpu"); + + tok_embd_buf = ggml_backend_alloc_ctx_tensors(tok_embd_ctx, backend); + if (!tok_embd_buf) { + std::fprintf(stderr, "[draft] ggml_backend_alloc_ctx_tensors for tok_embd failed\n"); + ggml_free(tok_embd_ctx); + tok_embd_ctx = nullptr; + return 1; + } + + const size_t emb_bytes = (size_t)w.embedder.row_bytes * (size_t)n_vocab_t; + ggml_backend_tensor_set(te, w.embedder.tok_embd_bytes, 0, emb_bytes); + std::printf("[tok_embd] uploaded %.1f MiB to GPU (%s [%" PRId64 ", %" PRId64 "])\n", + (double)emb_bytes / (1024.0 * 1024.0), + ggml_type_name(emb_type), n_embd_t, n_vocab_t); + + dw.tok_embd = te; + dw.n_vocab = (int)n_vocab_t; + if (mem_diag) print_mem_diag("after-tok-embd"); + } + + std::printf("[draft] loaded n_layer=%d n_head=%d n_embd=%d n_vocab=%d " + "target_hidden=%d block_size=%d (%.1f ms)\n", + dw.n_layer, dw.n_head, dw.n_embd, dw.n_vocab, + dw.target_hidden, dw.block_size, t1 - t0); + } + + // ── Load MTP weights early when enabled ────────────────────────────── + // Donor target layers must be known before target KV allocation so TQ3 + // donor caches can be forced to Q8_0 and avoid wrap-concat FWHT loss. + MtpDrafterWeights mtp_w; + MtpStepGraph mtp_g; + std::vector mtp_extra_q8_layers; + + if (have_mtp) { + double t0 = now_ms(); + if (!load_gemma4_mtp_assistant(mtp_path, backend, mtp_w)) { + std::fprintf(stderr, "load_gemma4_mtp_assistant: %s\n", dflash27b_last_error()); + return 1; + } + double t1 = now_ms(); + std::printf("[mtp] loaded n_layers=%d n_embd=%d n_embd_backbone=%d (%.1f ms)\n", + (int)mtp_w.layers.size(), mtp_w.n_embd, mtp_w.n_embd_backbone, t1 - t0); + if (mem_diag) print_mem_diag("after-mtp-load"); + + // Re-resolve donor target layers using the actual target SWA pattern. + resolve_mtp_donor_layers(mtp_w, w.swa_layers); + for (const MtpLayerWeights & L : mtp_w.layers) { + if (L.donor_target_layer >= 0 && + std::find(mtp_extra_q8_layers.begin(), mtp_extra_q8_layers.end(), + L.donor_target_layer) == mtp_extra_q8_layers.end()) { + mtp_extra_q8_layers.push_back(L.donor_target_layer); + } + } + } + + // ── Create KV cache ─────────────────────────────────────────────────── + GemmaTargetCache cache; + { + if (mem_diag) print_mem_diag("before-target-kv"); + double t0 = now_ms(); + const int draft_kv_default_cap = have_draft + ? (dw.sliding_window + dw.block_size + 32) + : 0; + const int target_feat_cap_hint = have_draft + ? 
std::max(draft_kv_default_cap, draft_kv_cap_override) + : 0; + if (!create_gemma4_cache(w, ctx_size, backend, cache, mtp_extra_q8_layers, + target_feat_cap_hint, + /*enable_dflash_capture_overrides=*/have_draft)) { + std::fprintf(stderr, "create_gemma4_cache: %s\n", dflash27b_last_error()); + return 1; + } + double t1 = now_ms(); + std::printf("[cache] created max_ctx=%d, kv_layers=%zu (%.1f ms)\n", + cache.max_ctx, cache.attn_k.size(), t1 - t0); + if (mem_diag) print_mem_diag("after-target-kv"); + } + + // ── Allocate draft KV cache (requires cache to already exist) ───────── + if (have_draft) { + if (mem_diag) print_mem_diag("before-draft-kv"); + if (!create_draft_kv_cache(dw, backend, cache, draft_kv_cap_override)) { + std::fprintf(stderr, "create_draft_kv_cache failed\n"); + return 1; + } + std::printf("[draft] KV cache allocated: %d slots%s\n", + cache.draft_kv_cap, + draft_kv_cap_override > 0 ? " (override)" : ""); + if (mem_diag) print_mem_diag("after-draft-kv"); + } + + // ── MTP state + step graph (optional) ──────────────────────────────── + // mtp_h_prev context/buffer: separate small allocation so base_ctx stays + // unmodified and free_gemma4_cache() doesn't double-free it. + ggml_context * mtp_h_prev_ctx = nullptr; + ggml_backend_buffer_t mtp_h_prev_buf = nullptr; + + if (have_mtp) { + // Allocate mtp_h_prev tensor: [n_embd_backbone, 1] f32, GPU-resident, + // persistent across decode steps. Separate context so free_gemma4_cache + // doesn't free it. + { + ggml_init_params ep{}; + ep.mem_size = ggml_tensor_overhead() + 256; + ep.mem_buffer = nullptr; + ep.no_alloc = true; + mtp_h_prev_ctx = ggml_init(ep); + if (!mtp_h_prev_ctx) { + std::fprintf(stderr, "[mtp] ggml_init for mtp_h_prev failed\n"); + return 1; + } + cache.mtp_h_prev = ggml_new_tensor_2d(mtp_h_prev_ctx, + GGML_TYPE_F32, + mtp_w.n_embd_backbone, 1); + ggml_set_name(cache.mtp_h_prev, "mtp_h_prev"); + mtp_h_prev_buf = ggml_backend_alloc_ctx_tensors(mtp_h_prev_ctx, backend); + if (!mtp_h_prev_buf) { + std::fprintf(stderr, "[mtp] alloc mtp_h_prev failed\n"); + ggml_free(mtp_h_prev_ctx); mtp_h_prev_ctx = nullptr; + return 1; + } + // Zero-initialize + std::vector zeros_f(mtp_w.n_embd_backbone, 0.0f); + ggml_backend_tensor_set(cache.mtp_h_prev, zeros_f.data(), 0, + sizeof(float) * mtp_w.n_embd_backbone); + } + + // Determine last full-attention layer index from swa_layers + cache.mtp_last_full_layer = -1; + for (int il = w.n_layer - 1; il >= 0; il--) { + const bool is_swa = (il < (int)w.swa_layers.size()) && w.swa_layers[il]; + if (!is_swa) { + cache.mtp_last_full_layer = il; + break; + } + } + if (cache.mtp_last_full_layer < 0) { + std::fprintf(stderr, "[mtp] error: no full-attention layer found in target\n"); + return 1; + } + std::printf("[mtp] mtp_last_full_layer=%d\n", cache.mtp_last_full_layer); + + cache.mtp_h_prev_enabled = true; + + // Build the MTP step graph (attn_pos=0 initially; will be rebuilt per step) + if (!build_mtp_step_graph(mtp_w, cache, w, mtp_g, /*attn_pos=*/0)) { + std::fprintf(stderr, "build_mtp_step_graph: %s\n", dflash27b_last_error()); + return 1; + } + std::printf("[mtp] step graph built ok\n"); + } + + // ── RNG ─────────────────────────────────────────────────────────────── + std::mt19937_64 rng(sampler.seed); + + // ── Daemon mode: stream token fd write helper ───────────────────────── + auto stream_emit = [&](int32_t tok) { + if (stream_fd < 0) return; + int32_t v = tok; +#ifdef _WIN32 + DWORD written; + WriteFile((HANDLE)(intptr_t)stream_fd, &v, sizeof(v), &written, nullptr); +#else + 
ssize_t n = ::write(stream_fd, &v, sizeof(v)); + (void)n; +#endif + }; + + // ── Daemon mode ─────────────────────────────────────────────────────── + if (daemon_mode) { + std::printf("[daemon] ready\n"); + std::fflush(stdout); + + StepGraph sg; + DraftStepGraph dsg; + bool daemon_first_iter = true; + std::string line; + + while (std::getline(std::cin, line)) { + // Per-request sampler (reset to CLI defaults each request). + SamplerCfg req_sampler = sampler; + parse_sampler_token(line, req_sampler); + // Always reseed per request so requests are independent. + // seed==0 means "random": use std::random_device for a fresh seed. + uint64_t actual_seed = req_sampler.seed; + if (actual_seed == 0) { + actual_seed = std::random_device{}(); + } + rng.seed(actual_seed); + + // ── Unsupported commands: emit -1 sentinel and continue ──────── + auto starts_with = [](const std::string & s, const char * pre) { + size_t n = std::strlen(pre); + return s.size() >= n && s.compare(0, n, pre) == 0; + }; + bool unsupported = (starts_with(line, "RESTORE") || + starts_with(line, "SNAPSHOT") || + starts_with(line, "FREE_SNAPSHOT") || + starts_with(line, "LIST_SLOTS") || + starts_with(line, "compress ") || + starts_with(line, "park") || + starts_with(line, "unpark") || + line == "free drafter" || + line == "drafter free"); + if (unsupported) { + std::fprintf(stderr, + "[daemon] command not supported in gemma4 daemon: %s\n", + line.c_str()); + std::fflush(stderr); + stream_emit(-1); + continue; + } + + // ── Parse: ────────────────────────── + char ppath[1024] = {0}; + int n_gen = 0; + if (std::sscanf(line.c_str(), "%1023s %d", ppath, &n_gen) != 2 || n_gen <= 0) { + std::fprintf(stderr, "[daemon] bad command line: %s\n", line.c_str()); + std::fflush(stderr); + stream_emit(-1); + continue; + } + + // Read binary prompt file (int32 LE token IDs). + std::vector prompt_ids = read_int32_file(ppath); + if (prompt_ids.empty()) { + std::fprintf(stderr, "[daemon] empty or unreadable prompt file: %s\n", ppath); + std::fflush(stderr); + stream_emit(-1); + continue; + } + std::printf("[daemon] prompt=%zu tokens n_gen=%d\n", + prompt_ids.size(), n_gen); + std::fflush(stdout); + + // Reset KV cache between requests. + if (!daemon_first_iter) { + step_graph_free(sg); + reset_gemma4_cache(cache); // also resets draft_kv_pos + if (have_draft) { + draft_step_free(dsg); + } + } + daemon_first_iter = false; + + if ((int)prompt_ids.size() + n_gen > ctx_size) { + std::fprintf(stderr, + "[daemon] prompt (%zu) + n_gen (%d) > ctx_size (%d)\n", + prompt_ids.size(), n_gen, ctx_size); + std::fflush(stderr); + stream_emit(-1); + continue; + } + + // ── Prefill ─────────────────────────────────────────────────── + int last_logit_tok = -1; + { + const int n_prompt = (int)prompt_ids.size(); + const int swa_window = w.swa_window > 0 ? 
w.swa_window : 1024; + const int chunk_size = std::min(n_prompt, swa_window); + + for (int cs = 0; cs < n_prompt; cs += chunk_size) { + const int chunk_n = std::min(chunk_size, n_prompt - cs); + const bool is_last = (cs + chunk_n == n_prompt); + const bool need_mask = (cs + chunk_n > 1); + + if (!build_gemma4_step(sg, w, cache, backend, + cs, chunk_n, need_mask, + /*capture=*/true, + use_pflash, pflash_alpha, + fa_window, + /*last_token_logits_only=*/true)) { + std::fprintf(stderr, "[daemon] prefill build failed at %d\n", cs); + std::fflush(stderr); + break; + } + + if (!embed_tokens_batch(w, prompt_ids.data() + cs, chunk_n, + sg.inp_embed, backend)) { + std::fprintf(stderr, "[daemon] embed_tokens_batch failed\n"); + std::fflush(stderr); + break; + } + + { + std::vector pos(chunk_n); + for (int i = 0; i < chunk_n; i++) pos[i] = cs + i; + ggml_backend_tensor_set(sg.positions, pos.data(), 0, + sizeof(int32_t) * chunk_n); + } + + if (sg.attn_mask && sg.attn_mask->buffer) { + const int kv_len = cs + chunk_n; + std::vector mask_buf; + build_causal_mask(mask_buf, kv_len, chunk_n, cs); + ggml_backend_tensor_set(sg.attn_mask, mask_buf.data(), 0, + sizeof(uint16_t) * mask_buf.size()); + } + + if (sg.swa_mask && sg.swa_mask->buffer) { + const SwaView swa_view = compute_swa_view(cs, chunk_n, + swa_window, cache.swa_ctx_alloc); + std::vector swa_buf; + build_swa_causal_mask(swa_buf, + /*kv_start*/ cs, + /*n_tokens*/ chunk_n, + /*swa_window*/ swa_window, + /*ring_size*/ swa_view.effective_win_len, + /*kv_end*/ cs + chunk_n); + ggml_backend_tensor_set(sg.swa_mask, swa_buf.data(), 0, + sizeof(uint16_t) * swa_buf.size()); + } + + auto st = ggml_backend_graph_compute(backend, sg.gf); + if (st != GGML_STATUS_SUCCESS) { + std::fprintf(stderr, "[daemon] prefill compute failed at %d\n", cs); + std::fflush(stderr); + break; + } + + // ── TQ3_0 K-cache write probe ───────────────────────────────────── + if (getenv("DFLASH_TQ3_PROBE_CACHE_WRITE") && + (cs == 0 || cs == chunk_size) && + !cache.attn_k.empty()) { + ggml_tensor * cache_k_layer0 = cache.attn_k[0]; + if (cache_k_layer0 && cache_k_layer0->type == GGML_TYPE_TQ3_0) { + // nb[1] is the stride in bytes between successive token slots + const size_t off = (size_t)cache_k_layer0->nb[1] * (size_t)cs; + uint8_t blk[14] = {}; + ggml_backend_tensor_get(cache_k_layer0, blk, off, 14); + std::fprintf(stderr, "[CACHE-WRITE-PROBE] cs=%d off=%zu bytes=", cs, off); + for (int _i = 0; _i < 14; _i++) + std::fprintf(stderr, "%02x ", blk[_i]); + std::fprintf(stderr, "\n"); + std::fflush(stderr); + } + } + // ───────────────────────────────────────────────────────────────── + + cache.cur_pos = cs + chunk_n; + + if (is_last) { + const int vocab = w.n_vocab; + std::vector logits_cpu(vocab); + ggml_backend_tensor_get(sg.logits, logits_cpu.data(), + 0, sizeof(float) * vocab); + last_logit_tok = sample_logits(logits_cpu.data(), vocab, + req_sampler, prompt_ids, rng); + cache.last_tok = last_logit_tok; + } + + step_graph_free(sg); + } + + // Draft KV prefill after target prefill. + if (have_draft && last_logit_tok >= 0) { + const int target_feat_w = dw.n_target_layers * dw.target_hidden; + const int draft_kv_cap = cache.draft_kv_cap > 0 + ? cache.draft_kv_cap + : (int)cache.draft_k[0]->ne[2]; + const int draft_prefill_n = std::min(n_prompt, draft_kv_cap); + const int draft_prefill_skip = n_prompt - draft_prefill_n; + + DraftKVPrefillGraph pkg; + if (build_draft_kv_prefill(pkg, dw, cache, backend, draft_prefill_n)) { + // Ring-buffer aware bf16→f32 conversion via ggml_cpy. 
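+                // Slot math: the ring buffer stores the hidden state for absolute
+                // position p at slot (p % target_feat_cap), so the copy below starts
+                // at slot (draft_prefill_skip % cap); the helper is assumed to split
+                // the transfer in two when [start, start + n) wraps past the cap.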
+ copy_target_feat_bf16_to_f32(backend, cache.target_feat, + pkg.target_feat, + draft_prefill_skip % cache.target_feat_cap, + draft_prefill_n, target_feat_w); + + std::vector pos(draft_prefill_n); + for (int pi = 0; pi < draft_prefill_n; pi++) pos[pi] = draft_prefill_skip + pi; + ggml_backend_tensor_set(pkg.positions, pos.data(), 0, + sizeof(int32_t) * draft_prefill_n); + + auto dst = ggml_backend_graph_compute(backend, pkg.gf); + if (dst != GGML_STATUS_SUCCESS) { + std::fprintf(stderr, "[daemon] draft KV prefill compute failed\n"); + std::fflush(stderr); + } + cache.draft_kv_pos = draft_prefill_n; + std::fprintf(stderr, + "[daemon] draft KV prefill done: %d positions materialized " + "(skipped %d early tokens, cap=%d, target_feat_cap=%d, dkv_pos=%d)\n", + draft_prefill_n, draft_prefill_skip, draft_kv_cap, + cache.target_feat_cap, cache.draft_kv_pos); + } + draft_kv_prefill_destroy(pkg); + } + } + + if (last_logit_tok < 0) { + std::fprintf(stderr, "[daemon] prefill produced no logit token\n"); + std::fflush(stderr); + stream_emit(-1); + continue; + } + + // ── Decode loop ─────────────────────────────────────────────── + std::vector history(prompt_ids); + int committed = cache.cur_pos; + int32_t cur_tok = last_logit_tok; + int n_generated = 0; + + while (n_generated < n_gen) { + if (IS_EOS_TOK(cur_tok, w)) { + std::printf("[daemon] EOS at step %d\n", n_generated); + std::fflush(stdout); + break; + } + if (committed >= ctx_size - 1) { + std::printf("[daemon] context full\n"); + std::fflush(stdout); + break; + } + + if (!build_gemma4_step(sg, w, cache, backend, + committed, 1, + /*with_mask=*/true, + /*capture=*/false, + /*use_pflash=*/false, pflash_alpha, + fa_window)) { + std::fprintf(stderr, "[daemon] decode build failed at step %d\n", n_generated); + std::fflush(stderr); + break; + } + + if (sg.attn_mask && sg.attn_mask->buffer) { + const int kv_len = committed + 1; + std::vector mask_buf; + build_causal_mask(mask_buf, kv_len, 1, committed); + ggml_backend_tensor_set(sg.attn_mask, mask_buf.data(), 0, + sizeof(uint16_t) * mask_buf.size()); + } + if (sg.swa_mask && sg.swa_mask->buffer) { + const SwaView swa_view = compute_swa_view(committed, 1, + w.swa_window, cache.swa_ctx_alloc); + std::vector swa_buf; + build_swa_causal_mask(swa_buf, + /*kv_start*/ committed, + /*n_tokens*/ 1, + /*swa_window*/ w.swa_window, + /*ring_size*/ swa_view.effective_win_len, + /*kv_end*/ committed + 1); + ggml_backend_tensor_set(sg.swa_mask, swa_buf.data(), 0, + sizeof(uint16_t) * swa_buf.size()); + } + + if (!embed_token(w, cur_tok, sg.inp_embed, backend)) { + std::fprintf(stderr, "[daemon] embed_token failed\n"); + std::fflush(stderr); + break; + } + + int32_t pos_val = committed; + ggml_backend_tensor_set(sg.positions, &pos_val, 0, sizeof(int32_t)); + + auto st = ggml_backend_graph_compute(backend, sg.gf); + if (st != GGML_STATUS_SUCCESS) { + std::fprintf(stderr, "[daemon] decode compute failed at step %d\n", n_generated); + std::fflush(stderr); + break; + } + + committed++; + cache.cur_pos = committed; + + const int vocab = w.n_vocab; + std::vector logits_cpu(vocab); + ggml_backend_tensor_get(sg.logits, logits_cpu.data(), 0, + sizeof(float) * vocab); + + const int32_t next_tok = (int32_t)sample_logits( + logits_cpu.data(), vocab, req_sampler, history, rng); + + // Emit current token to stream fd before advancing. + stream_emit(cur_tok); + + history.push_back(cur_tok); + n_generated++; + + cur_tok = next_tok; + cache.last_tok = cur_tok; + + step_graph_free(sg); + } + + // Sentinel: end of stream. 
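+            // Wire format recap: each token id goes to --stream-fd as one raw
+            // int32 and -1 terminates the stream. The same sentinel is also the
+            // error reply for bad or unsupported commands above, so an immediate
+            // -1 is ambiguous between "request failed" and "zero tokens generated"
+            // (e.g. instant EOS). A minimal client read loop (sketch; handle_token
+            // is a placeholder, not part of this patch):
+            //
+            //     int32_t tok;
+            //     while (read(fd, &tok, sizeof(tok)) == sizeof(tok) && tok != -1)
+            //         handle_token(tok);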
+ stream_emit(-1); + std::printf("[daemon] generated %d tokens\n", n_generated); + std::fflush(stdout); + } + + // ── Daemon exit: clean up ───────────────────────────────────────── + step_graph_destroy(sg); + draft_step_destroy(dsg); + if (have_draft) { + free_draft_kv_cache(cache); + dw.tok_embd = nullptr; + free_gemma4_draft_weights(dw); + if (tok_embd_buf) ggml_backend_buffer_free(tok_embd_buf); + if (tok_embd_ctx) ggml_free(tok_embd_ctx); + } + free_gemma4_cache(cache); + free_gemma4_target_weights(w); + ggml_backend_free(backend); + return 0; + } + + // ── Non-daemon: tokenize prompt ─────────────────────────────────────── + std::vector prompt_ids; + if (!token_ids_str.empty()) { + prompt_ids = parse_token_ids(token_ids_str); + if (prompt_ids.empty()) { + std::fprintf(stderr, "error: --tokens produced no valid token IDs\n"); + return 2; + } + std::printf("[tokens] using %zu pre-tokenised IDs from --tokens\n", + prompt_ids.size()); + } else { + prompt_ids = tokenize_byte_fallback(prompt_text); + std::printf("[tokens] byte-fallback tokenisation: %zu tokens " + "(pass --tokens for real tokenisation)\n", + prompt_ids.size()); + } + + // ── Ensure BOS is prepended (Gemma4 requires BOS at position 0) ── + if (w.bos_id >= 0 && (prompt_ids.empty() || prompt_ids[0] != w.bos_id)) { + prompt_ids.insert(prompt_ids.begin(), w.bos_id); + std::printf("[tokens] prepended BOS token %d\n", w.bos_id); + } + + if ((int)prompt_ids.size() >= ctx_size) { + std::fprintf(stderr, "error: prompt (%zu tokens) >= ctx_size (%d)\n", + prompt_ids.size(), ctx_size); + return 2; + } + + // ── Benchmark loop outer container ──────────────────────────────────── + const int bench_runs = bench_mode ? 3 : 1; + std::vector bench_tok_per_sec; + + // Declared here (main scope) so step_graph_destroy(sg)/draft_step_destroy(dsg) + // in cleanup is valid. + StepGraph sg; + DraftStepGraph dsg; + + // Speculative decode stats (accumulated across bench iterations when bench_mode) + int total_draft_steps = 0; + int total_accepted = 0; + + for (int bench_iter = 0; bench_iter < bench_runs; bench_iter++) { + + if (bench_runs > 1) { + reset_gemma4_cache(cache); + // Reset draft step state for the new bench iteration + draft_step_free(dsg); + total_draft_steps = 0; + total_accepted = 0; + std::printf("[bench] run %d/%d\n", bench_iter + 1, bench_runs); + } + + // ── Prefill ─────────────────────────────────────────────────────── + // + // Chunked batched prefill: process up to swa_window tokens per chunk. + // Each chunk dispatches a single GPU graph covering all tokens in the + // chunk, which is far cheaper than one dispatch per token. + // + // For a chunk [cs, cs+chunk_n): + // 1. Embed chunk tokens → inp_embed + // 2. Set positions[i] = cs + i + // 3. Build causal mask covering [0, cs+chunk_n) for the chunk rows + // 4. Build SWA mask for sliding-window layers (when cs > 0) + // 5. Compute graph → KV + target_feat (logits discarded except last) + + std::printf("[prefill] %zu tokens ...\n", prompt_ids.size()); + double prefill_t0 = now_ms(); + int last_logit_tok = -1; + + { + const int n_prompt = (int)prompt_ids.size(); + + { + const int swa_window = w.swa_window > 0 ? 
w.swa_window : 1024; + const int chunk_size = std::min(n_prompt, swa_window); + + for (int cs = 0; cs < n_prompt; cs += chunk_size) { + const int chunk_n = std::min(chunk_size, n_prompt - cs); + const bool is_last = (cs + chunk_n == n_prompt); + const bool need_mask = (cs + chunk_n > 1); + + if (!build_gemma4_step(sg, w, cache, backend, + /*kv_start=*/cs, chunk_n, + need_mask, /*capture=*/true, + use_pflash, pflash_alpha, + fa_window, + /*last_token_logits_only=*/true)) { + std::fprintf(stderr, "prefill chunk build failed at offset %d\n", cs); + return 1; + } + + if (!embed_tokens_batch(w, prompt_ids.data() + cs, chunk_n, + sg.inp_embed, backend)) { + return 1; + } + + { + std::vector pos(chunk_n); + for (int i = 0; i < chunk_n; i++) pos[i] = cs + i; + ggml_backend_tensor_set(sg.positions, pos.data(), 0, + sizeof(int32_t) * chunk_n); + } + + if (sg.attn_mask && sg.attn_mask->buffer) { + const int kv_len = cs + chunk_n; + std::vector mask_buf; + build_causal_mask(mask_buf, kv_len, chunk_n, cs); + ggml_backend_tensor_set(sg.attn_mask, mask_buf.data(), 0, + sizeof(uint16_t) * mask_buf.size()); + } + + if (sg.swa_mask && sg.swa_mask->buffer) { + const SwaView swa_view = compute_swa_view(cs, chunk_n, + swa_window, cache.swa_ctx_alloc); + std::vector swa_buf; + build_swa_causal_mask(swa_buf, + /*kv_start*/ cs, + /*n_tokens*/ chunk_n, + /*swa_window*/ swa_window, + /*ring_size*/ swa_view.effective_win_len, + /*kv_end*/ cs + chunk_n); + ggml_backend_tensor_set(sg.swa_mask, swa_buf.data(), 0, + sizeof(uint16_t) * swa_buf.size()); + } + + auto st = ggml_backend_graph_compute(backend, sg.gf); + if (st != GGML_STATUS_SUCCESS) { + std::fprintf(stderr, "prefill compute failed at chunk offset %d\n", cs); + return 1; + } + + cache.cur_pos = cs + chunk_n; + + if (is_last) { + const int vocab = w.n_vocab; + std::vector logits_cpu(vocab); + // last_token_logits_only=true → logits has shape [vocab, 1]; + // read from offset 0 instead of skipping (chunk_n-1)*vocab floats. + ggml_backend_tensor_get(sg.logits, logits_cpu.data(), + 0, + sizeof(float) * vocab); + last_logit_tok = sample_logits(logits_cpu.data(), vocab, + sampler, prompt_ids, rng); + cache.last_tok = last_logit_tok; + } + + step_graph_free(sg); + } + } + } + + double prefill_t1 = now_ms(); + { + const int n_prompt = (int)prompt_ids.size(); + const double prefill_ms = prefill_t1 - prefill_t0; + { + const int swa_window = w.swa_window > 0 ? w.swa_window : 1024; + const int chunk_size = std::min(n_prompt, swa_window); + std::printf("[prefill] %d tokens in %.1f ms (%.1f tok/s) " + "[chunked%s, chunk_size=%d] (last sampled token: %d)\n", + n_prompt, prefill_ms, + prefill_ms > 0.0 ? (double)n_prompt / (prefill_ms / 1000.0) : 0.0, + use_pflash ? "+pflash" : "", chunk_size, last_logit_tok); + } + } + + // ── Draft KV prefill: materialize draft KV for all prompt positions ─ + if (have_draft) { + const int n_prompt = (int)prompt_ids.size(); + const int target_feat_w = dw.n_target_layers * dw.target_hidden; + + // Clamp to draft KV cache capacity. When the prompt is longer than the + // draft cache, we prefill only the LAST draft_prefill_n tokens so that + // the context that matters most (closest to the first decode step) is + // represented in the draft KV cache. + const int draft_kv_cap = cache.draft_kv_cap > 0 + ? 
cache.draft_kv_cap + : (int)cache.draft_k[0]->ne[2]; + const int draft_prefill_n = std::min(n_prompt, draft_kv_cap); + const int draft_prefill_skip = n_prompt - draft_prefill_n; + + DraftKVPrefillGraph pkg; + if (!build_draft_kv_prefill(pkg, dw, cache, backend, draft_prefill_n)) { + std::fprintf(stderr, "[draft] KV prefill build failed\n"); + return 1; + } + + // Extract target_feat from ring buffer (bf16 → f32) via ggml_cpy. + // The ring buffer stores tokens at slot (pos % cap). + // We want the LAST draft_prefill_n hidden states (positions draft_prefill_skip + // through n_prompt-1). + copy_target_feat_bf16_to_f32(backend, cache.target_feat, + pkg.target_feat, + draft_prefill_skip % cache.target_feat_cap, + draft_prefill_n, target_feat_w); + + // Positions: [draft_prefill_skip, ..., n_prompt-1] + { + std::vector pos(draft_prefill_n); + for (int i = 0; i < draft_prefill_n; i++) pos[i] = draft_prefill_skip + i; + ggml_backend_tensor_set(pkg.positions, pos.data(), 0, sizeof(int32_t) * draft_prefill_n); + } + + auto st = ggml_backend_graph_compute(backend, pkg.gf); + if (st != GGML_STATUS_SUCCESS) { + std::fprintf(stderr, "[draft] KV prefill compute failed\n"); + draft_kv_prefill_destroy(pkg); + return 1; + } + // draft_kv_pos tracks entries written, bounded by draft_kv_cap. + cache.draft_kv_pos = draft_prefill_n; + + draft_kv_prefill_destroy(pkg); + std::printf("[draft] KV prefill done: %d positions materialized " + "(skipped %d early tokens, cap=%d, target_feat_cap=%d, dkv_pos=%d)\n", + draft_prefill_n, draft_prefill_skip, draft_kv_cap, + cache.target_feat_cap, cache.draft_kv_pos); + } + + // ── Decode loop ─────────────────────────────────────────────────── + + std::vector generated; + generated.reserve(n_predict); + std::vector history(prompt_ids); + + int committed = cache.cur_pos; + int32_t cur_tok = last_logit_tok; + + double decode_t0 = now_ms(); + double first_token_ms = -1.0; + + if (have_draft) { + // ── SPECULATIVE DECODE LOOP ─────────────────────────────────── + // + // Each iteration proposes a block of q_len tokens via the draft + // model, then verifies with a single batched target forward. + // Accepted prefix tokens are committed; the loop advances by + // accept_n tokens per target call instead of 1. + // + // Gemma4 is pure attention (no SSM/conv state), so rollback is + // trivially: just don't advance committed past accepted tokens. + // Stale KV at positions [committed+commit_n..committed+q_len-1] + // will be overwritten by the next verify pass. + + AdaptiveDraftMax adaptive; + adaptive.init(draft_max_adaptive, draft_max, dw.block_size); + if (draft_max_adaptive) { + std::printf("[adaptive] enabled initial=%d max=%d window=%d\n", + adaptive.current, adaptive.max_q, adaptive.window_steps); + } + const int mask_tok = dw.mask_token_id; // 4 + const int target_feat_w = dw.n_target_layers * dw.target_hidden; + const int vocab = w.n_vocab; + const int dkv_cap = cache.draft_kv_cap > 0 + ? cache.draft_kv_cap + : (int)cache.draft_k[0]->ne[2]; + + std::vector noise_ids(dw.block_size); + std::vector noise_embed_buf((size_t)dw.n_embd * dw.block_size); + std::vector draft_tok(dw.block_size); + std::vector target_tok(dw.block_size); + std::vector draft_logits_buf((size_t)vocab * dw.block_size); + std::vector verify_logits_buf((size_t)vocab * dw.block_size); + + while ((int)generated.size() < n_predict) { + int q_len = adaptive.enabled + ? adaptive.current + : ((draft_max > 0 && draft_max < dw.block_size) + ? 
draft_max : dw.block_size); + q_len = std::min(q_len, std::max(1, ctx_size - committed - 1)); + + if (IS_EOS_TOK(cur_tok, w)) { + std::printf("\n[decode] EOS token %d\n", cur_tok); + break; + } + if (committed >= ctx_size - 1) { + std::printf("\n[decode] context full\n"); + break; + } + + // Not enough context for target_feat extraction yet: + // fall back to single-token target-only decode. + if (committed < q_len) { + if (!build_gemma4_step(sg, w, cache, backend, + committed, /*n_tokens=*/1, + /*with_mask=*/true, + /*capture=*/true, + /*use_pflash=*/false, pflash_alpha, + fa_window)) { + std::fprintf(stderr, "[decode] warmup build failed at step %zu\n", + generated.size()); + return 1; + } + + if (sg.attn_mask && sg.attn_mask->buffer) { + const int kv_len = committed + 1; + std::vector mask_buf; + build_causal_mask(mask_buf, kv_len, 1, committed); + ggml_backend_tensor_set(sg.attn_mask, mask_buf.data(), 0, + sizeof(uint16_t) * mask_buf.size()); + } + if (sg.swa_mask && sg.swa_mask->buffer) { + const SwaView swa_view = compute_swa_view(committed, 1, + w.swa_window, cache.swa_ctx_alloc); + std::vector swa_buf; + build_swa_causal_mask(swa_buf, + /*kv_start*/ committed, + /*n_tokens*/ 1, + /*swa_window*/ w.swa_window, + /*ring_size*/ swa_view.effective_win_len, + /*kv_end*/ committed + 1); + ggml_backend_tensor_set(sg.swa_mask, swa_buf.data(), 0, + sizeof(uint16_t) * swa_buf.size()); + } + + if (!embed_token(w, cur_tok, sg.inp_embed, backend)) return 1; + + int32_t pos_val = committed; + ggml_backend_tensor_set(sg.positions, &pos_val, 0, sizeof(int32_t)); + + double step_t0 = now_ms(); + auto st = ggml_backend_graph_compute(backend, sg.gf); + double step_t1 = now_ms(); + + if (st != GGML_STATUS_SUCCESS) { + std::fprintf(stderr, "[decode] warmup compute failed at step %zu\n", + generated.size()); + return 1; + } + + committed++; + cache.cur_pos = committed; + + // Draft KV prefill for this warmup token (position committed-1). + { + const int warmup_pos = committed - 1; + const int target_feat_w_w = dw.n_target_layers * dw.target_hidden; + DraftKVPrefillGraph wpkg; + if (!build_draft_kv_prefill(wpkg, dw, cache, backend, 1)) { + std::fprintf(stderr, "[decode] warmup draft KV prefill build failed\n"); + return 1; + } + copy_target_feat_bf16_to_f32(backend, cache.target_feat, + wpkg.target_feat, + warmup_pos % cache.target_feat_cap, + 1, target_feat_w_w); + { + int32_t p = warmup_pos; + ggml_backend_tensor_set(wpkg.positions, &p, 0, sizeof(int32_t)); + } + auto wst = ggml_backend_graph_compute(backend, wpkg.gf); + if (wst != GGML_STATUS_SUCCESS) { + std::fprintf(stderr, "[decode] warmup draft KV prefill compute failed\n"); + draft_kv_prefill_destroy(wpkg); + return 1; + } + cache.draft_kv_pos = std::min(dkv_cap, cache.draft_kv_pos + 1); + draft_kv_prefill_destroy(wpkg); + } + + const int vocab_inner = w.n_vocab; + std::vector logits_cpu(vocab_inner); + ggml_backend_tensor_get(sg.logits, logits_cpu.data(), 0, + sizeof(float) * vocab_inner); + + const int32_t next_tok = (int32_t)sample_logits( + logits_cpu.data(), vocab_inner, sampler, history, rng); + + generated.push_back(cur_tok); + history.push_back(cur_tok); + + if (first_token_ms < 0.0) { + first_token_ms = step_t1 - step_t0; + } + + std::printf("%d ", cur_tok); + std::fflush(stdout); + + cur_tok = next_tok; + cache.last_tok = cur_tok; + + step_graph_free(sg); + continue; + } + + // ── 1. 
Build noise block: [cur_tok, MASK, MASK, ..., MASK] + noise_ids[0] = cur_tok; + for (int i = 1; i < q_len; i++) noise_ids[i] = mask_tok; + if (!w.embedder.embed(noise_ids.data(), q_len, noise_embed_buf.data())) { + std::fprintf(stderr, "[spec] embed noise_ids failed\n"); + return 1; + } + + // ── 2. Build draft graph (KV-cached, no target_feat input) + // The draft model operates in its own KV address space bounded by + // draft_kv_cap. Use cache.draft_kv_pos (number of entries written into + // the draft KV cache) as kv_start, NOT the absolute committed position. + double refill_ms = 0.0; + if (cache.draft_kv_pos + q_len > dkv_cap) { + // Sliding-window re-prefill: instead of wiping all draft KV context, + // keep the most recent (dkv_cap - q_len) committed tokens by + // re-projecting their target_feat into the beginning of the draft + // KV cache. This preserves the drafter's context continuity across + // ring-buffer wrap points, which is the root cause of acceptance + // collapsing from ~10/16 at 32K to ~1/16 at 64K. + const int keep = dkv_cap - q_len; + if (keep > 0 && committed >= keep) { + // Absolute positions of the (keep) tokens we want to retain: + // [committed - keep, committed). + const int refill_start = committed - keep; + + // Reset draft_kv_pos to 0 so build_draft_kv_prefill_graph writes + // to slot [0, keep) — the ASSERT inside the graph builder requires + // draft_kv_pos + n_tokens <= ne[2]. + cache.draft_kv_pos = 0; + + const double refill_t0 = now_ms(); + DraftKVPrefillGraph rpkg; + if (!build_draft_kv_prefill(rpkg, dw, cache, backend, keep)) { + std::fprintf(stderr, "[spec] draft KV re-prefill build failed\n"); + return 1; + } + + // Copy target_feat for [refill_start, refill_start+keep) from the + // ring buffer (bf16) into rpkg.target_feat (f32) via ggml_cpy. + copy_target_feat_bf16_to_f32(backend, cache.target_feat, + rpkg.target_feat, + refill_start % cache.target_feat_cap, + keep, target_feat_w); + + // Absolute positions for RoPE — must match training. + { + std::vector rpos(keep); + for (int i = 0; i < keep; i++) rpos[i] = refill_start + i; + ggml_backend_tensor_set(rpkg.positions, rpos.data(), 0, + sizeof(int32_t) * keep); + } + + auto rst = ggml_backend_graph_compute(backend, rpkg.gf); + if (rst != GGML_STATUS_SUCCESS) { + std::fprintf(stderr, "[spec] draft KV re-prefill compute failed\n"); + draft_kv_prefill_destroy(rpkg); + return 1; + } + cache.draft_kv_pos = keep; + draft_kv_prefill_destroy(rpkg); + refill_ms = now_ms() - refill_t0; + + std::fprintf(stderr, + "[spec] draft KV sliding re-prefill: kept %d tokens " + "(positions %d..%d), dkv_cap=%d\n", + keep, refill_start, committed - 1, dkv_cap); + } else { + // Not enough committed history to re-prefill — hard reset. + // This only happens at the very beginning of decode (committed < keep). + cache.draft_kv_pos = 0; + } + } + if (!build_draft_step(dsg, dw, cache, backend, q_len, cache.draft_kv_pos)) { + std::fprintf(stderr, "[spec] draft build failed\n"); + return 1; + } + + // ── 3. 
Set draft inputs + + // draft_embed: noise embeddings [n_embd, q_len] f32 + ggml_backend_tensor_set(dsg.draft_embed, noise_embed_buf.data(), 0, + sizeof(float) * (size_t)dw.n_embd * q_len); + + // positions: absolute [committed, committed+1, ..., committed+q_len-1] + // (absolute positions are used for RoPE — they must match training) + { + std::vector pos(q_len); + for (int i = 0; i < q_len; i++) pos[i] = committed + i; + ggml_backend_tensor_set(dsg.positions, pos.data(), 0, sizeof(int32_t) * q_len); + } + + // Causal mask: block token i attends to all draft KV context + // [0..draft_kv_pos-1] plus block tokens [0..i]. + // Use draft_kv_pos (draft KV address space), not committed. + if (dsg.attn_mask && dsg.attn_mask->buffer) { + const int dkv_ctx = cache.draft_kv_pos; + const int kv_len = dkv_ctx + q_len; + const int kv_pad = align_up(kv_len, KQ_MASK_PAD); + const int q_pad = align_up(q_len, KQ_MASK_PAD); + std::vector mask((size_t)kv_pad * q_pad, F16_NEG_INF); + for (int q = 0; q < q_len; q++) { + const int max_k = dkv_ctx + q; + for (int k = 0; k <= max_k; k++) { + mask[(size_t)q * kv_pad + k] = F16_ZERO; + } + } + ggml_backend_tensor_set(dsg.attn_mask, mask.data(), 0, + sizeof(uint16_t) * mask.size()); + } + + // ── 4. Draft compute + const double draft_t0 = now_ms(); + { + auto st = ggml_backend_graph_compute(backend, dsg.gf); + if (st != GGML_STATUS_SUCCESS) { + std::fprintf(stderr, "[spec] draft compute failed: %d\n", (int)st); + return 1; + } + } + const double draft_t1 = now_ms(); + + // ── 5. Read draft logits and argmax + ggml_backend_tensor_get(dsg.logits, draft_logits_buf.data(), 0, + sizeof(float) * (size_t)vocab * q_len); + for (int i = 0; i < q_len; i++) { + draft_tok[i] = argmax_f32(draft_logits_buf.data() + (size_t)i * vocab, vocab); + } + draft_tok[0] = cur_tok; // pin first token (it was cur_tok, not a prediction) + + // ── 6. 
Target verify: batched forward on draft_tok[0..q_len-1] + if (!build_gemma4_step(sg, w, cache, backend, + committed, q_len, + /*with_mask=*/true, /*capture=*/true, + use_pflash, pflash_alpha, fa_window)) { + std::fprintf(stderr, "[spec] verify build failed\n"); + return 1; + } + + if (!embed_tokens_batch(w, draft_tok.data(), q_len, sg.inp_embed, backend)) { + return 1; + } + + // Target positions: [committed, committed+1, ..., committed+q_len-1] + { + std::vector pos(q_len); + for (int i = 0; i < q_len; i++) pos[i] = committed + i; + ggml_backend_tensor_set(sg.positions, pos.data(), 0, sizeof(int32_t) * q_len); + } + + // Causal mask for target verify + if (sg.attn_mask && sg.attn_mask->buffer) { + const int kv_len = committed + q_len; + std::vector mask_buf; + build_causal_mask(mask_buf, kv_len, q_len, committed); + ggml_backend_tensor_set(sg.attn_mask, mask_buf.data(), 0, + sizeof(uint16_t) * mask_buf.size()); + } + + // SWA mask for target verify (required when n_tokens > 1) + if (sg.swa_mask && sg.swa_mask->buffer) { + const SwaView swa_view = compute_swa_view(committed, q_len, + w.swa_window, cache.swa_ctx_alloc); + std::vector swa_buf; + build_swa_causal_mask(swa_buf, + /*kv_start*/ committed, + /*n_tokens*/ q_len, + /*swa_window*/ w.swa_window, + /*ring_size*/ swa_view.effective_win_len, + /*kv_end*/ committed + q_len); + ggml_backend_tensor_set(sg.swa_mask, swa_buf.data(), 0, + sizeof(uint16_t) * swa_buf.size()); + } + + const double verify_t0 = now_ms(); + { + auto st = ggml_backend_graph_compute(backend, sg.gf); + if (st != GGML_STATUS_SUCCESS) { + std::fprintf(stderr, "[spec] verify compute failed: %d\n", (int)st); + return 1; + } + } + const double verify_t1 = now_ms(); + + // ── 7. Read target logits and argmax + ggml_backend_tensor_get(sg.logits, verify_logits_buf.data(), 0, + sizeof(float) * (size_t)vocab * q_len); + for (int i = 0; i < q_len; i++) { + target_tok[i] = argmax_f32(verify_logits_buf.data() + (size_t)i * vocab, vocab); + } + + // ── 8. Acceptance: longest prefix match + // draft_tok[0] = cur_tok (accepted unconditionally as the current token) + // target_tok[i] = target's prediction for position committed+i+1 + // Check: draft_tok[i+1] == target_tok[i] (draft proposed the right next token) + int accept_n = 1; + for (int i = 0; i < q_len - 1; i++) { + if (draft_tok[i + 1] == target_tok[i]) accept_n++; + else break; + } + int commit_n = accept_n; + if (commit_n > n_predict - (int)generated.size()) { + commit_n = n_predict - (int)generated.size(); + } + + // ── 9. Commit accepted tokens + bool hit_eos = false; + for (int i = 0; i < commit_n; i++) { + generated.push_back(draft_tok[i]); + history.push_back(draft_tok[i]); + std::printf("%d ", draft_tok[i]); + std::fflush(stdout); + if (IS_EOS_TOK(draft_tok[i], w)) { hit_eos = true; break; } + } + + // ── 10. Draft KV prefill for the committed positions, then advance state. + // The target verify pass (step 6) captured target_feat for positions + // [committed..committed+q_len-1]. We prefill draft KV for the accepted + // prefix [committed..committed+commit_n-1] before advancing committed. + const double commit_t0 = now_ms(); + { + DraftKVPrefillGraph cpkg; + if (!build_draft_kv_prefill(cpkg, dw, cache, backend, commit_n)) { + std::fprintf(stderr, "[spec] draft KV prefill build failed\n"); + return 1; + } + + // Extract target_feat for positions [committed..committed+commit_n-1] + // from the ring buffer (bf16 → f32) via ggml_cpy. 
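+                // Only the accepted prefix is materialized: commit_n of the q_len
+                // verified positions enter the draft KV here. The rejected tail's
+                // hidden states stay in the ring buffer and are simply overwritten
+                // by the next verify pass, mirroring the stale-KV note above.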
+ copy_target_feat_bf16_to_f32(backend, cache.target_feat, + cpkg.target_feat, + committed % cache.target_feat_cap, + commit_n, target_feat_w); + + { + std::vector pos(commit_n); + for (int i = 0; i < commit_n; i++) pos[i] = committed + i; + ggml_backend_tensor_set(cpkg.positions, pos.data(), 0, + sizeof(int32_t) * commit_n); + } + + auto cst = ggml_backend_graph_compute(backend, cpkg.gf); + if (cst != GGML_STATUS_SUCCESS) { + std::fprintf(stderr, "[spec] draft KV prefill compute failed\n"); + draft_kv_prefill_destroy(cpkg); + return 1; + } + cache.draft_kv_pos = std::min(dkv_cap, cache.draft_kv_pos + commit_n); + draft_kv_prefill_destroy(cpkg); + } + const double commit_t1 = now_ms(); + + // Gemma4 is pure attention — no SSM/conv rollback needed. + // Stale KV at positions [committed+commit_n..committed+q_len-1] + // will be overwritten by the next verify pass. + committed += commit_n; + cache.cur_pos = committed; + cur_tok = target_tok[commit_n - 1]; + cache.last_tok = cur_tok; + + total_draft_steps++; + total_accepted += commit_n; + + if (first_token_ms < 0.0) { + first_token_ms = now_ms() - decode_t0; + } + + double avg_accept = (total_draft_steps > 0) + ? (double)total_accepted / total_draft_steps : 0.0; + std::printf("[step %d] accept=%d/%d avg=%.1f " + "draft_ms=%.2f verify_ms=%.2f kv_ms=%.2f refill_ms=%.2f\n", + total_draft_steps, accept_n, q_len, avg_accept, + draft_t1 - draft_t0, verify_t1 - verify_t0, + commit_t1 - commit_t0, refill_ms); + adaptive.observe(accept_n, q_len, total_draft_steps); + + if (hit_eos) break; + + step_graph_free(sg); + draft_step_free(dsg); + } + + } else if (have_mtp) { + // ── MTP SPECULATIVE DECODE LOOP (γ=1 v1) ───────────────────── + // + // Each iteration: + // 1. Run target forward for cur_tok at position `committed`, + // capturing mtp_h_prev from the last full-attention layer. + // 2. Rebuild MTP step graph with current attn_pos = committed+1. + // 3. Feed (cur_tok, mtp_h_prev) into MTP graph → draft_tok. + // 4. Run target verify forward for draft_tok at position committed+1. + // 5. Accept draft_tok if target agrees; otherwise accept target's + // token instead (standard single-draft acceptance). + // γ=1: one MTP draft per step. Correctness gate before γ>1. + + int mtp_steps = 0; + int mtp_accepted = 0; + + while ((int)generated.size() < n_predict) { + + if (IS_EOS_TOK(cur_tok, w)) { + std::printf("\n[mtp] EOS token %d at step %zu\n", + cur_tok, generated.size()); + break; + } + if (committed >= ctx_size - 2) { + std::printf("\n[mtp] context full at step %zu\n", + generated.size()); + break; + } + + // ── 1. 
Target forward for cur_tok (captures mtp_h_prev) ── + if (!build_gemma4_step(sg, w, cache, backend, + committed, /*n_tokens=*/1, + /*with_mask=*/true, + /*capture=*/false, + /*use_pflash=*/false, pflash_alpha, + fa_window)) { + std::fprintf(stderr, "[mtp] target build failed at step %zu\n", + generated.size()); + return 1; + } + + if (sg.attn_mask && sg.attn_mask->buffer) { + const int kv_len = committed + 1; + std::vector mask_buf; + build_causal_mask(mask_buf, kv_len, 1, committed); + ggml_backend_tensor_set(sg.attn_mask, mask_buf.data(), 0, + sizeof(uint16_t) * mask_buf.size()); + } + if (sg.swa_mask && sg.swa_mask->buffer) { + const SwaView swa_view = compute_swa_view(committed, 1, + w.swa_window, cache.swa_ctx_alloc); + std::vector swa_buf; + build_swa_causal_mask(swa_buf, + /*kv_start*/ committed, + /*n_tokens*/ 1, + /*swa_window*/ w.swa_window, + /*ring_size*/ swa_view.effective_win_len, + /*kv_end*/ committed + 1); + ggml_backend_tensor_set(sg.swa_mask, swa_buf.data(), 0, + sizeof(uint16_t) * swa_buf.size()); + } + if (!embed_token(w, cur_tok, sg.inp_embed, backend)) return 1; + { + int32_t pos_val = committed; + ggml_backend_tensor_set(sg.positions, &pos_val, 0, sizeof(int32_t)); + } + { + auto st = ggml_backend_graph_compute(backend, sg.gf); + if (st != GGML_STATUS_SUCCESS) { + std::fprintf(stderr, "[mtp] target compute failed\n"); + return 1; + } + } + committed++; + cache.cur_pos = committed; + + // Read target logits to get target's own prediction at position committed-1 + const int vocab = w.n_vocab; + std::vector logits_cpu(vocab); + ggml_backend_tensor_get(sg.logits, logits_cpu.data(), 0, + sizeof(float) * vocab); + const int32_t target_next = (int32_t)sample_logits( + logits_cpu.data(), vocab, sampler, history, rng); + + step_graph_free(sg); + + // ── 2. Rebuild MTP step graph with attn_pos = committed ── + free_mtp_step_graph(mtp_g); + if (!build_mtp_step_graph(mtp_w, cache, w, mtp_g, committed)) { + std::fprintf(stderr, "[mtp] build_mtp_step_graph failed: %s\n", + dflash27b_last_error()); + return 1; + } + + // Allocate MTP graph (needs gallocr; build_mtp_step_graph creates + // the ggml context but not the backend buffers) + ggml_gallocr_t mtp_alloc = ggml_gallocr_new( + ggml_backend_get_default_buffer_type(backend)); + bool mtp_alloc_ok = ggml_gallocr_alloc_graph(mtp_alloc, mtp_g.gf); + if (!mtp_alloc_ok) { + std::fprintf(stderr, "[mtp] gallocr_alloc_graph failed\n"); + ggml_gallocr_free(mtp_alloc); + return 1; + } + + // ── 3. Set MTP inputs and compute ──────────────────────── + // in_tok_embd: pre-dequantised F32 embedding of cur_tok. + // embed_token dequantises via w.embedder.embed() on CPU, avoiding + // ggml_get_rows on a Q4_K source (unsupported in CUDA get_rows). + if (!embed_token(w, cur_tok, mtp_g.in_tok_embd, backend)) { + std::fprintf(stderr, "[mtp] embed_token failed for tok=%d\n", cur_tok); + ggml_gallocr_free(mtp_alloc); + return 1; + } + // in_h_prev: captured by target graph into cache.mtp_h_prev + ggml_backend_tensor_copy(cache.mtp_h_prev, mtp_g.in_h_prev); + // in_pos: position of the draft token (= committed, 0-based) + { + int32_t p = committed; + ggml_backend_tensor_set(mtp_g.in_pos, &p, 0, sizeof(int32_t)); + } + + // Fill the FA mask for TQ3_0 + head_dim>=512 cross-attention layers. + // Real positions [0..kv_seq_len-1]: 0x0000 (F16 0.0 = admit). + // Padding positions [kv_seq_len..mask_width-1]: 0xFC00 (F16 -inf = exclude). 
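+        // Bit-pattern sanity check (IEEE 754 binary16): 0x0000 is +0.0 and 0xFC00
+        // is sign=1, exponent=0x1F, mantissa=0, i.e. -inf. Added to the attention
+        // scores pre-softmax these mean "admit" and "exclude" respectively,
+        // matching the F16_ZERO / F16_NEG_INF values used by the dflash mask
+        // builders elsewhere in this file.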
+ if (mtp_g.fa_mask && mtp_g.fa_mask->buffer) { + const int64_t mask_n = mtp_g.fa_mask->ne[0]; // total mask width + const int64_t kv_seq = mtp_g.fa_mask_kv_seq_len; // admitted positions + std::vector mask_buf(mask_n); + for (int64_t i = 0; i < mask_n; i++) { + mask_buf[i] = (i < kv_seq) ? 0x0000u : 0xFC00u; + } + ggml_backend_tensor_set(mtp_g.fa_mask, mask_buf.data(), 0, + sizeof(uint16_t) * mask_n); + } + + { + auto st = ggml_backend_graph_compute(backend, mtp_g.gf); + if (st != GGML_STATUS_SUCCESS) { + std::fprintf(stderr, "[mtp] MTP compute failed\n"); + ggml_gallocr_free(mtp_alloc); + return 1; + } + } + + // Read draft token from in-graph argmax + int32_t draft_tok = -1; + ggml_backend_tensor_get(mtp_g.out_argmax, &draft_tok, 0, sizeof(int32_t)); + + ggml_gallocr_free(mtp_alloc); + + // Emit the current token (already committed by target step above) + generated.push_back(cur_tok); + history.push_back(cur_tok); + std::printf("%d ", cur_tok); + std::fflush(stdout); + + if (first_token_ms < 0.0) { + first_token_ms = now_ms() - decode_t0; + } + + mtp_steps++; + + // ── 4+5. Check if draft matches target's greedy token ─── + if (mtp_steps <= 8) { + std::printf("[mtp-dbg] step=%d draft=%d target=%d %s\n", + mtp_steps, draft_tok, target_next, + draft_tok == target_next ? "MATCH" : "miss"); + std::fflush(stdout); + } + if (draft_tok == target_next) { + // MTP was right: accept draft token as next cur_tok + mtp_accepted++; + cur_tok = draft_tok; + } else { + // MTP was wrong: use target's token + cur_tok = target_next; + } + cache.last_tok = cur_tok; + + if ((int)generated.size() % 8 == 0) { + std::printf("[mtp-step %d] accept_rate=%.2f\n", + mtp_steps, + mtp_steps > 0 ? (float)mtp_accepted / mtp_steps : 0.0f); + } + + if (IS_EOS_TOK(cur_tok, w)) { + std::printf("\n[mtp] EOS token %d\n", cur_tok); + break; + } + } + + if (mtp_steps > 0) { + std::printf("\n[mtp] steps=%d accepted=%d accept_rate=%.2f\n", + mtp_steps, mtp_accepted, + (float)mtp_accepted / mtp_steps); + } + + } else { + // ── TARGET-ONLY DECODE LOOP ─────────────────────────────────── + // + // Single-token autoregressive path. + // Each iteration: + // 1. Feed `cur_tok` through the target at position `committed`. + // 2. Sample the next token from logits. + // 3. Append to generated sequence. + // 4. Stop if EOS or n_predict reached. 
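+        // Every emitted token costs one full target forward here, so this loop's
+        // tok/s is the non-speculative reference point against which the dflash
+        // accept rates and MTP accept rates reported above should be judged.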
+ + while ((int)generated.size() < n_predict) { + + if (IS_EOS_TOK(cur_tok, w)) { + std::printf("\n[decode] EOS token %d at step %zu\n", + cur_tok, generated.size()); + break; + } + + if (committed >= ctx_size - 1) { + std::printf("\n[decode] context full at step %zu\n", + generated.size()); + break; + } + + // Build single-token decode graph + if (!build_gemma4_step(sg, w, cache, backend, + committed, /*n_tokens=*/1, + /*with_mask=*/true, + /*capture=*/false, + /*use_pflash=*/false, pflash_alpha, + fa_window)) { + std::fprintf(stderr, "[decode] build failed at step %zu\n", + generated.size()); + return 1; + } + + if (sg.attn_mask && sg.attn_mask->buffer) { + const int kv_len = committed + 1; + std::vector mask_buf; + build_causal_mask(mask_buf, kv_len, 1, committed); + ggml_backend_tensor_set(sg.attn_mask, mask_buf.data(), 0, + sizeof(uint16_t) * mask_buf.size()); + } + if (sg.swa_mask && sg.swa_mask->buffer) { + const SwaView swa_view = compute_swa_view(committed, 1, + w.swa_window, cache.swa_ctx_alloc); + std::vector swa_buf; + build_swa_causal_mask(swa_buf, + /*kv_start*/ committed, + /*n_tokens*/ 1, + /*swa_window*/ w.swa_window, + /*ring_size*/ swa_view.effective_win_len, + /*kv_end*/ committed + 1); + ggml_backend_tensor_set(sg.swa_mask, swa_buf.data(), 0, + sizeof(uint16_t) * swa_buf.size()); + } + + if (!embed_token(w, cur_tok, sg.inp_embed, backend)) return 1; + + int32_t pos_val = committed; + ggml_backend_tensor_set(sg.positions, &pos_val, 0, sizeof(int32_t)); + + double step_t0 = now_ms(); + auto st = ggml_backend_graph_compute(backend, sg.gf); + double step_t1 = now_ms(); + + if (st != GGML_STATUS_SUCCESS) { + std::fprintf(stderr, "[decode] compute failed at step %zu\n", + generated.size()); + return 1; + } + + committed++; + cache.cur_pos = committed; + + // Fetch logits and sample + const int vocab = w.n_vocab; + std::vector logits_cpu(vocab); + ggml_backend_tensor_get(sg.logits, logits_cpu.data(), 0, + sizeof(float) * vocab); + + const int32_t next_tok = (int32_t)sample_logits( + logits_cpu.data(), vocab, sampler, history, rng); + + // Debug: check logits on first decode step + if (generated.empty()) { + float maxl = logits_cpu[0]; int maxi = 0; + for (int i = 1; i < vocab; i++) { + if (logits_cpu[i] > maxl) { maxl = logits_cpu[i]; maxi = i; } + } + std::printf("[tgt-only-dbg] logits[0..3]: %.3f %.3f %.3f %.3f max=%.3f@%d next=%d\n", + logits_cpu[0], logits_cpu[1], logits_cpu[2], logits_cpu[3], maxl, maxi, next_tok); + std::fflush(stdout); + } + + generated.push_back(cur_tok); + history.push_back(cur_tok); + + if (first_token_ms < 0.0 && !generated.empty()) { + first_token_ms = step_t1 - step_t0; + } + + // Print token id (a proper decoder would map id -> string here) + std::printf("%d ", cur_tok); + std::fflush(stdout); + + cur_tok = next_tok; + cache.last_tok = cur_tok; + + step_graph_free(sg); + } + } + + double decode_t1 = now_ms(); + const double decode_ms = decode_t1 - decode_t0; + const int n_gen = (int)generated.size(); + const double tps = (decode_ms > 0.0 && n_gen > 0) + ? 
n_gen / (decode_ms / 1000.0) + : 0.0; + + bench_tok_per_sec.push_back(tps); + + std::printf("\n"); + std::printf("[stats] generated=%d decode_ms=%.1f tok/s=%.2f " + "first_tok_ms=%.2f\n", + n_gen, decode_ms, tps, first_token_ms); + std::printf("[stats] prefill=%zu tokens context_used=%d/%d\n", + prompt_ids.size(), committed, ctx_size); + + if (have_draft && total_draft_steps > 0) { + std::printf("[spec] draft_steps=%d total_accepted=%d avg_accept=%.2f\n", + total_draft_steps, total_accepted, + (double)total_accepted / total_draft_steps); + } + + // ── Memory stats ────────────────────────────────────────────────── + { + size_t free_bytes = 0, total_bytes = 0; + cudaMemGetInfo(&free_bytes, &total_bytes); + const double used_gb = (total_bytes - free_bytes) / (1024.0 * 1024.0 * 1024.0); + const double total_gb = total_bytes / (1024.0 * 1024.0 * 1024.0); + std::printf("[mem] VRAM used=%.2f GB total=%.2f GB\n", + used_gb, total_gb); + } + + } // bench loop + + // ── Benchmark summary ───────────────────────────────────────────────── + if (bench_mode && bench_tok_per_sec.size() > 1) { + std::sort(bench_tok_per_sec.begin(), bench_tok_per_sec.end()); + const double median = bench_tok_per_sec[bench_tok_per_sec.size() / 2]; + const double best = bench_tok_per_sec.back(); + std::printf("\n[bench] median=%.2f tok/s best=%.2f tok/s runs=%zu\n", + median, best, bench_tok_per_sec.size()); + } + + // ── Cleanup ─────────────────────────────────────────────────────────── + step_graph_destroy(sg); + draft_step_destroy(dsg); + if (have_draft) { + free_draft_kv_cache(cache); + dw.tok_embd = nullptr; // prevent double-free (tok_embd lives in tok_embd_buf) + free_gemma4_draft_weights(dw); + if (tok_embd_buf) ggml_backend_buffer_free(tok_embd_buf); + if (tok_embd_ctx) ggml_free(tok_embd_ctx); + } + if (have_mtp) { + free_mtp_step_graph(mtp_g); + free_gemma4_mtp_assistant(mtp_w); + // mtp_h_prev lives in mtp_h_prev_buf/ctx (not base_ctx). + // Null out the pointer in cache before free_gemma4_cache to avoid + // dangling reference (cache struct is stack-allocated; the pointer + // would otherwise reference freed memory). + cache.mtp_h_prev = nullptr; + cache.mtp_h_prev_enabled = false; + if (mtp_h_prev_buf) { ggml_backend_buffer_free(mtp_h_prev_buf); mtp_h_prev_buf = nullptr; } + if (mtp_h_prev_ctx) { ggml_free(mtp_h_prev_ctx); mtp_h_prev_ctx = nullptr; } + } + free_gemma4_cache(cache); + free_gemma4_target_weights(w); + ggml_backend_free(backend); + + return 0; +} diff --git a/dflash/test/gemma4/test_gemma4_kv_tq3.cpp b/dflash/test/gemma4/test_gemma4_kv_tq3.cpp new file mode 100644 index 00000000..495fc869 --- /dev/null +++ b/dflash/test/gemma4/test_gemma4_kv_tq3.cpp @@ -0,0 +1,179 @@ +// Smoke test: create a Gemma4 KV cache with TQ3_0 quantization and validate +// the resulting cache structure, alignment, and layer-to-KV-index mappings. 
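+// TQ3_0 is the interesting case because it carries the strictest cache-geometry
+// constraint in this patch: the KV sequence dimension must be padded to a
+// multiple of 256 (the same rule that forces g_kq_stride_pad = 256 in the
+// driver), so a mis-sized cache fails loudly here instead of corrupting masks.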
+// +// Usage: test_gemma4_kv_tq3 + +#include "internal.h" +#include "gemma4.h" + +#include "ggml.h" +#include "ggml-backend.h" +#include "ggml-cuda.h" + +#include +#include +#include +#include +#include + +#ifdef _WIN32 +# define setenv(name, value, overwrite) _putenv_s(name, value) +#endif + +using namespace dflash27b; + +static void fail(const char * msg) { + std::fprintf(stderr, "FAIL: %s\n", msg); + std::exit(1); +} + +int main(int argc, char ** argv) { + if (argc < 2) { + std::fprintf(stderr, "usage: %s \n", argv[0]); + return 2; + } + + ggml_backend_t backend = ggml_backend_cuda_init(0); + if (!backend) { std::fprintf(stderr, "cuda init failed\n"); return 1; } + + GemmaTargetWeights w; + if (!load_gemma4_target_gguf(argv[1], backend, w)) { + std::fprintf(stderr, "load_gemma4_target_gguf: %s\n", dflash27b_last_error()); + ggml_backend_free(backend); + return 1; + } + std::printf("[target] n_layer=%d n_embd=%d n_head_kv=%d head_dim=%d " + "n_layer_kv=%d n_capture_layers=%d\n", + w.n_layer, w.n_embd, w.n_head_kv, w.head_dim, + w.n_layer_kv, w.n_capture_layers); + + // Set KV type environment variables to tq3_0 before cache creation + setenv("DFLASH27B_KV_K", "tq3_0", 1); + setenv("DFLASH27B_KV_V", "tq3_0", 1); + + const int max_ctx = 1024; + GemmaTargetCache cache; + if (!create_gemma4_cache(w, max_ctx, backend, cache)) { + std::fprintf(stderr, "create_gemma4_cache: %s\n", dflash27b_last_error()); + free_gemma4_target_weights(w); + ggml_backend_free(backend); + return 1; + } + std::printf("[cache] created max_ctx=%d kv_slots=%zu\n", + cache.max_ctx, cache.attn_k.size()); + + // Assert KV types resolved correctly + if (cache.kv_k_type != GGML_TYPE_TQ3_0) { + char buf[64]; + std::snprintf(buf, sizeof(buf), + "kv_k_type=%s expected tq3_0", ggml_type_name(cache.kv_k_type)); + fail(buf); + } + if (cache.kv_v_type != GGML_TYPE_TQ3_0) { + char buf[64]; + std::snprintf(buf, sizeof(buf), + "kv_v_type=%s expected tq3_0", ggml_type_name(cache.kv_v_type)); + fail(buf); + } + std::printf("[types] kv_k=%s kv_v=%s OK\n", + ggml_type_name(cache.kv_k_type), + ggml_type_name(cache.kv_v_type)); + + // Validate layer_to_kv_idx mapping + if ((int)cache.layer_to_kv_idx.size() != w.n_layer) { + char buf[64]; + std::snprintf(buf, sizeof(buf), + "layer_to_kv_idx.size()=%zu expected %d", + cache.layer_to_kv_idx.size(), w.n_layer); + fail(buf); + } + + const int n_kv_slots = (int)cache.attn_k.size(); + int n_shared_layers = 0; + for (int il = 0; il < w.n_layer; il++) { + const int idx = cache.layer_to_kv_idx[il]; + if (idx == -1) { + n_shared_layers++; + } else if (idx < 0 || idx >= n_kv_slots) { + char buf[128]; + std::snprintf(buf, sizeof(buf), + "layer_to_kv_idx[%d]=%d out of range [0, %d)", + il, idx, n_kv_slots); + fail(buf); + } + } + std::printf("[kv_idx] n_kv_slots=%d n_shared_layers=%d n_layer=%d\n", + n_kv_slots, n_shared_layers, w.n_layer); + + // Validate layer_to_donor_kv: shared layers must have a valid donor + if ((int)cache.layer_to_donor_kv.size() != w.n_layer) { + fail("layer_to_donor_kv.size() != n_layer"); + } + for (int il = 0; il < w.n_layer; il++) { + if (cache.layer_to_kv_idx[il] == -1) { + // This is a shared layer — must have a valid donor + const int donor = cache.layer_to_donor_kv[il]; + if (donor < 0 || donor >= n_kv_slots) { + char buf[128]; + std::snprintf(buf, sizeof(buf), + "layer_to_donor_kv[%d]=%d invalid for shared layer (n_kv_slots=%d)", + il, donor, n_kv_slots); + fail(buf); + } + } + } + std::printf("[donor_kv] all shared layers have valid donors OK\n"); + + // Validate 
TQ3_0 alignment: for TQ3_0, KV tensors must have ne[1] % 256 == 0 + // (create_gemma4_cache rounds max_ctx_alloc up to a multiple of 256 for TQ3_0). + for (int i = 0; i < n_kv_slots; i++) { + const ggml_tensor * K = cache.attn_k[i]; + const ggml_tensor * V = cache.attn_v[i]; + if (!K) { char buf[32]; std::snprintf(buf, sizeof(buf), "attn_k[%d] is null", i); fail(buf); } + if (!V) { char buf[32]; std::snprintf(buf, sizeof(buf), "attn_v[%d] is null", i); fail(buf); } + if (K->ne[1] % 256 != 0) { + char buf[128]; + std::snprintf(buf, sizeof(buf), + "attn_k[%d]: ne[1]=%" PRId64 " not a multiple of 256 (TQ3_0 alignment)", + i, K->ne[1]); + fail(buf); + } + if (V->ne[1] % 256 != 0) { + char buf[128]; + std::snprintf(buf, sizeof(buf), + "attn_v[%d]: ne[1]=%" PRId64 " not a multiple of 256 (TQ3_0 alignment)", + i, V->ne[1]); + fail(buf); + } + } + std::printf("[alignment] all %d KV tensors are 256-aligned OK\n", n_kv_slots); + + // Validate target_feat tensor + if (!cache.target_feat) fail("target_feat is null"); + const int64_t expected_feat_ne0 = (int64_t)w.n_capture_layers * w.n_embd; + if (cache.target_feat->ne[0] != expected_feat_ne0) { + char buf[128]; + std::snprintf(buf, sizeof(buf), + "target_feat->ne[0]=%" PRId64 " expected %" PRId64 + " (n_capture_layers=%d * n_embd=%d)", + cache.target_feat->ne[0], expected_feat_ne0, + w.n_capture_layers, w.n_embd); + fail(buf); + } + std::printf("[target_feat] ne=[%" PRId64 ", %" PRId64 "] type=%s OK\n", + cache.target_feat->ne[0], cache.target_feat->ne[1], + ggml_type_name(cache.target_feat->type)); + + // Print cache stats + std::printf("[stats] n_kv_slots=%d max_ctx=%d kv_seq_dim=%" PRId64 + " target_feat_cap=%d\n", + n_kv_slots, cache.max_ctx, + cache.attn_k[0]->ne[1], + cache.target_feat_cap); + + free_gemma4_cache(cache); + free_gemma4_target_weights(w); + ggml_backend_free(backend); + std::printf("PASS\n"); + return 0; +} diff --git a/dflash/test/gemma4/test_mtp_graph_shapes.cpp b/dflash/test/gemma4/test_mtp_graph_shapes.cpp new file mode 100644 index 00000000..b0da76ba --- /dev/null +++ b/dflash/test/gemma4/test_mtp_graph_shapes.cpp @@ -0,0 +1,298 @@ +// Phase 3a shape test: MTP step graph builds without crash and output tensor +// shapes match the contract: +// out_logits : F32 [n_vocab, 1] +// out_h_post : F32 [n_embd_backbone, 1] +// out_argmax : I32 [1] +// +// We stub GemmaTargetCache and GemmaTargetWeights with shape-only tensors of +// the correct dimensions (their contents are never uploaded or read). No actual +// inference is performed — this is a graph construction smoke test only. +// +// Run: +// MTP_GGUF=/path/to/gemma-4-31B-it-assistant.Q4_K_M.gguf \ +// ./build/test_mtp_graph_shapes +// +// Requires MTP_GGUF to be set; exits 77 (autotools skip) if absent. + +#include "../src/internal.h" +#include "ggml.h" +#include "ggml-alloc.h" +#include "ggml-backend.h" +#include "ggml-cuda.h" + +#include <cstdint> +#include <cstdio> +#include <cstdlib> +#include <cstring> +#include <string> +#include <vector> + +using namespace dflash27b; + +static int fail(const char * msg) { + std::fprintf(stderr, "[FAIL] %s\n", msg); + return 1; +} + +// Build a minimal stub GemmaTargetWeights with tok_embd of the right shape. +// The stub does NOT allocate GPU memory for embedding data; graph construction +// only needs the tensor *metadata* (ne[], type), not data.
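+// A no_alloc context makes such a stub cheap — e.g. (a sketch with illustrative +// sizes, not the real model dims): +// ggml_init_params ip{ ggml_tensor_overhead() + 256, nullptr, /*no_alloc=*/true }; +// ggml_context * c = ggml_init(ip); +// ggml_tensor * t = ggml_new_tensor_2d(c, GGML_TYPE_F32, 8, 16); +// // t->data stays nullptr, yet t->ne[]/t->type fully drive shape inference.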
+static bool build_stub_target_weights(ggml_backend_t backend, + int n_vocab, + int n_embd_backbone, + int n_layer, + const std::vector<bool> & swa_layers, + GemmaTargetWeights & out) { + // Minimal tensor count: tok_embd + per-layer rope_freqs (optional) + out_norm + output + const size_t n_tensors_est = (size_t)(n_layer + 8); + ggml_init_params ip{}; + ip.mem_size = n_tensors_est * ggml_tensor_overhead() + 4096; + ip.mem_buffer = nullptr; + ip.no_alloc = true; + out.ctx = ggml_init(ip); + if (!out.ctx) return false; + + // tok_embd: [n_embd_backbone, n_vocab] (ggml ne[0]=embedding_dim, ne[1]=n_vocab) + out.tok_embd = ggml_new_tensor_2d(out.ctx, GGML_TYPE_F32, n_embd_backbone, n_vocab); + ggml_set_name(out.tok_embd, "stub_tok_embd"); + + // Populate fields needed by build_mtp_step_graph + // Dense 31B: head_dim=256 (from GGUF "gemma4.attention.key_length") + out.n_embd = n_embd_backbone; + out.n_head = 32; + out.n_head_kv = 8; + out.head_dim = 256; + out.head_dim_swa = 256; + out.n_layer = n_layer; + out.rope_theta = 1000000.0f; + out.rope_theta_swa = 1000000.0f; + out.attn_scale = 1.0f; + out.logit_softcap = 30.0f; + out.swa_layers = swa_layers; + + // Populate minimal per-layer structs (only rope_freqs is accessed by MTP graph + // for full-attention donor layers) + out.layers.resize((size_t)n_layer); + // Leave rope_freqs nullptr for all layers (proportional RoPE freq_factors are + // optional; nullptr → falls back to base rope_theta scaling). + + out.backend = backend; + out.buf = ggml_backend_alloc_ctx_tensors(out.ctx, backend); + if (!out.buf) { ggml_free(out.ctx); out.ctx = nullptr; return false; } + + // No data upload needed: graph construction reads only tensor metadata, and + // this test never runs compute, so tok_embd contents are irrelevant. + + return true; +} + +// Build a minimal stub GemmaTargetCache with KV tensors of the right shapes. +// attn_k[i]: [head_dim_kv, max_ctx, n_head_kv] +// attn_v[i]: [head_dim_kv, max_ctx, n_head_kv] +// head_dim_kv_swa and head_dim_kv_full allow different head_dims per attention type. +static bool build_stub_target_cache(ggml_backend_t backend, + int n_layer, + int n_kv_per_layer, // n_head_kv for KV cache + int head_dim_kv_swa, // head_dim for SWA layers + int head_dim_kv_full, // head_dim for full-attn layers + int max_ctx, + const std::vector<bool> & swa_layers, + GemmaTargetCache & out) { + // Count KV-owning layers (non-shared). For stub, all layers own a KV slot. + const int n_kv_slots = n_layer; // stub: one per layer (no sharing) + + const size_t n_tensors_est = (size_t)(2 * n_kv_slots + 4); + ggml_init_params ip{}; + ip.mem_size = n_tensors_est * ggml_tensor_overhead() + 4096; + ip.mem_buffer = nullptr; + ip.no_alloc = true; + out.base_ctx = ggml_init(ip); + if (!out.base_ctx) return false; + + out.layer_to_kv_idx.resize((size_t)n_layer); + out.layer_to_donor_kv.resize((size_t)n_layer, -1); + out.attn_k.resize((size_t)n_kv_slots, nullptr); + out.attn_v.resize((size_t)n_kv_slots, nullptr); + + for (int il = 0; il < n_layer; il++) { + out.layer_to_kv_idx[il] = il; // one-to-one for stub + + // Use different head_dim per attention type + const bool is_swa = (il < (int)swa_layers.size()) && swa_layers[il]; + const int layer_head_dim = is_swa ?
head_dim_kv_swa : head_dim_kv_full; + ggml_tensor * K = ggml_new_tensor_3d(out.base_ctx, GGML_TYPE_F16, + layer_head_dim, max_ctx, n_kv_per_layer); + ggml_tensor * V = ggml_new_tensor_3d(out.base_ctx, GGML_TYPE_F16, + layer_head_dim, max_ctx, n_kv_per_layer); + char name[64]; + std::snprintf(name, sizeof(name), "stub_k_%d", il); + ggml_set_name(K, name); + std::snprintf(name, sizeof(name), "stub_v_%d", il); + ggml_set_name(V, name); + out.attn_k[il] = K; + out.attn_v[il] = V; + } + + out.backend = backend; + out.max_ctx = max_ctx; + out.cur_pos = 16; // pretend we have 16 committed tokens + out.swa_ctx_alloc = max_ctx; + + out.base_buf = ggml_backend_alloc_ctx_tensors(out.base_ctx, backend); + if (!out.base_buf) { ggml_free(out.base_ctx); out.base_ctx = nullptr; return false; } + + // No data upload needed here either: KV contents are never read during + // graph construction, and this test never runs compute. + + return true; +} + +int main() { + const char * mtp_path = std::getenv("MTP_GGUF"); + if (!mtp_path) { + std::fprintf(stderr, "[skip] MTP_GGUF not set; skipping test_mtp_graph_shapes\n"); + return 77; // autotools skip code + } + + ggml_backend_t backend = ggml_backend_cuda_init(0); + if (!backend) { + return fail("ggml_backend_cuda_init(0) failed"); + } + + // ── Load MTP weights ───────────────────────────────────────────────────── + MtpDrafterWeights mtp{}; + if (!load_gemma4_mtp_assistant(std::string(mtp_path), backend, mtp)) { + std::fprintf(stderr, " loader error: %s\n", gemma4_last_error()); + ggml_backend_free(backend); + return fail("load_gemma4_mtp_assistant failed"); + } + + const int n_embd_backbone = mtp.n_embd_backbone; // e.g. 5376 + const int n_vocab = 262144; // Dense 31B vocab + const int n_target_layers = 60; // Dense 31B + const int max_ctx = 64; // small stub context + + // Dense 31B SWA pattern: odd-indexed = SWA, even = full attention + std::vector<bool> target_swa(n_target_layers, false); + for (int il = 0; il < n_target_layers; il++) { + target_swa[il] = ((il % 2) == 1); + } + + // ── Build stub target structures ───────────────────────────────────────── + GemmaTargetWeights stub_target{}; + if (!build_stub_target_weights(backend, n_vocab, n_embd_backbone, + n_target_layers, target_swa, stub_target)) { + ggml_backend_free(backend); + return fail("build_stub_target_weights failed"); + } + + GemmaTargetCache stub_cache{}; + // KV head_dim: derived from MTP weight shapes (attn_q_norm->ne[0] gives per-head Q dim, + // which must equal the target KV head_dim for flash_attn_ext to work). + // Dense 31B: SWA layers use head_dim=256, full-attn layers use head_dim=512 + // (derived from mtp.layers[0].attn_q_norm->ne[0]=256 for SWA, [3].attn_q_norm->ne[0]=512 for full).
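+ // A more defensive variant would scan every MTP layer instead of hard-coding + // indices 0 and 3 (a sketch; it relies only on the is_swa and attn_q_norm + // fields already used in this file): + // int hd_swa = 0, hd_full = 0; + // for (const auto & l : mtp.layers) { + // (l.is_swa ? hd_swa : hd_full) = (int)l.attn_q_norm->ne[0]; + // }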
+ const int head_dim_swa_stub = (int)mtp.layers[0].attn_q_norm->ne[0]; // SWA layers 0-2 + const int head_dim_full_stub = (int)mtp.layers[3].attn_q_norm->ne[0]; // Full-attn layer 3 + std::fprintf(stderr, "[shape_test] MTP Q head_dim: SWA=%d, full=%d\n", + head_dim_swa_stub, head_dim_full_stub); + if (!build_stub_target_cache(backend, n_target_layers, + /*n_kv_per_layer=*/8, + head_dim_swa_stub, head_dim_full_stub, + max_ctx, target_swa, stub_cache)) { + free_gemma4_target_weights(stub_target); + ggml_backend_free(backend); + return fail("build_stub_target_cache failed"); + } + + // ── Build MTP step graph ───────────────────────────────────────────────── + MtpStepGraph graph{}; + const int attn_pos = stub_cache.cur_pos; // = 16 + + if (!build_mtp_step_graph(mtp, stub_cache, stub_target, graph, attn_pos)) { + std::fprintf(stderr, " build error: %s\n", gemma4_last_error()); + free_gemma4_target_weights(stub_target); + // Note: stub_cache KV tensors point into base_ctx; free manually: + if (stub_cache.base_buf) ggml_backend_buffer_free(stub_cache.base_buf); + if (stub_cache.base_ctx) ggml_free(stub_cache.base_ctx); + ggml_backend_free(backend); + return fail("build_mtp_step_graph failed"); + } + + // ── Shape assertions ───────────────────────────────────────────────────── + + // 1. Input shapes + if (!graph.in_tok || graph.in_tok->ne[0] != 1 || + graph.in_tok->type != GGML_TYPE_I32) { + ggml_backend_free(backend); + return fail("in_tok shape/type mismatch: expected I32[1]"); + } + + if (!graph.in_h_prev || + graph.in_h_prev->ne[0] != (int64_t)n_embd_backbone || + graph.in_h_prev->ne[1] != 1 || + graph.in_h_prev->type != GGML_TYPE_F32) { + std::fprintf(stderr, " in_h_prev->ne = [%lld, %lld]\n", + (long long)(graph.in_h_prev ? graph.in_h_prev->ne[0] : -1), + (long long)(graph.in_h_prev ? graph.in_h_prev->ne[1] : -1)); + ggml_backend_free(backend); + return fail("in_h_prev shape/type mismatch: expected F32[n_embd_backbone, 1]"); + } + + if (!graph.in_pos || graph.in_pos->ne[0] != 1 || + graph.in_pos->type != GGML_TYPE_I32) { + ggml_backend_free(backend); + return fail("in_pos shape/type mismatch: expected I32[1]"); + } + + // 2. out_h_post: F32 [n_embd_backbone, 1] + if (!graph.out_h_post || + graph.out_h_post->ne[0] != (int64_t)n_embd_backbone || + graph.out_h_post->ne[1] != 1 || + graph.out_h_post->type != GGML_TYPE_F32) { + std::fprintf(stderr, " out_h_post->ne = [%lld, %lld], type=%s\n", + (long long)(graph.out_h_post ? graph.out_h_post->ne[0] : -1), + (long long)(graph.out_h_post ? graph.out_h_post->ne[1] : -1), + graph.out_h_post ? ggml_type_name(graph.out_h_post->type) : "null"); + ggml_backend_free(backend); + return fail("out_h_post shape mismatch: expected F32[n_embd_backbone, 1]"); + } + + // 3. out_logits: F32 [n_vocab, 1] + if (!graph.out_logits || + graph.out_logits->ne[0] != (int64_t)n_vocab || + graph.out_logits->ne[1] != 1 || + graph.out_logits->type != GGML_TYPE_F32) { + std::fprintf(stderr, " out_logits->ne = [%lld, %lld], type=%s\n", + (long long)(graph.out_logits ? graph.out_logits->ne[0] : -1), + (long long)(graph.out_logits ? graph.out_logits->ne[1] : -1), + graph.out_logits ? ggml_type_name(graph.out_logits->type) : "null"); + ggml_backend_free(backend); + return fail("out_logits shape mismatch: expected F32[n_vocab, 1]"); + } + + // 4. out_argmax: I32 [1] + if (!graph.out_argmax || + graph.out_argmax->ne[0] != 1 || + graph.out_argmax->type != GGML_TYPE_I32) { + std::fprintf(stderr, " out_argmax->ne[0]=%lld type=%s\n", + (long long)(graph.out_argmax ? 
graph.out_argmax->ne[0] : -1), + graph.out_argmax ? ggml_type_name(graph.out_argmax->type) : "null"); + ggml_backend_free(backend); + return fail("out_argmax shape/type mismatch: expected I32[1]"); + } + + std::fprintf(stderr, "[PASS] all shape assertions passed for MTP step graph\n"); + std::fprintf(stderr, " n_embd_backbone=%d, n_vocab=%d, n_layers=%zu, attn_pos=%d\n", + n_embd_backbone, n_vocab, mtp.layers.size(), attn_pos); + + // Cleanup + free_mtp_step_graph(graph); + // Stub cache: manual teardown since we bypassed create_gemma4_cache + if (stub_cache.base_buf) ggml_backend_buffer_free(stub_cache.base_buf); + if (stub_cache.base_ctx) ggml_free(stub_cache.base_ctx); + free_gemma4_target_weights(stub_target); + free_gemma4_mtp_assistant(mtp); + ggml_backend_free(backend); + + return 0; +} diff --git a/dflash/test/gemma4/test_mtp_loader.cpp b/dflash/test/gemma4/test_mtp_loader.cpp new file mode 100644 index 00000000..ea44ca94 --- /dev/null +++ b/dflash/test/gemma4/test_mtp_loader.cpp @@ -0,0 +1,128 @@ +// Phase 2 RED test: Gemma4 MTP loader (load_gemma4_mtp_assistant) +// +// Should NOT compile today — MtpDrafterWeights and load_gemma4_mtp_assistant +// do not yet exist in internal.h. Once Phase 2 GREEN lands, the test compiles +// and 7 assertions verify the loader contract per +// .sisyphus/notes/mtp-spike-2026-05-09.md (sections "Contract — Phase 2"). +// +// Run: +// cd dflash && cmake --build build --target test_mtp_loader && \ +// MTP_GGUF=$ROOT/models/gemma4-mtp-31B/gemma-4-31B-it-assistant.Q4_K_M.gguf \ +// ./build/test_mtp_loader + +#include "../src/internal.h" +#include "ggml.h" +#include "ggml-backend.h" +#include "ggml-cuda.h" + +#include <cstdint> +#include <cstdio> +#include <cstdlib> +#include <string> +#include <vector> + +using namespace dflash27b; + +static int fail(const char *msg) { + std::fprintf(stderr, "[red] FAIL: %s\n", msg); + return 1; +} + +int main() { + const char *p = std::getenv("MTP_GGUF"); + if (!p) { + std::fprintf(stderr, "[skip] MTP_GGUF env not set; expected:\n"); + std::fprintf(stderr, " /home/peppi/Dev/lucebox-hub/models/gemma4-mtp-31B/gemma-4-31B-it-assistant.Q4_K_M.gguf\n"); + return 77; // autotools skip + } + + // Backend init (reuse the pattern from test_gemma4_dflash.cpp) + ggml_backend_t backend = ggml_backend_cuda_init(0); + if (!backend) { + return fail("ggml_backend_cuda_init failed"); + } + + // The function under test (Phase 2 GREEN must define this) + MtpDrafterWeights mtp; + bool ok = load_gemma4_mtp_assistant(std::string(p), backend, mtp); + if (!ok) { + ggml_backend_free(backend); + return fail("load_gemma4_mtp_assistant returned false"); + } + + // Assertion 1: n_embd_backbone matches target hidden (Dense 31B = 5376) + if (mtp.n_embd_backbone != 5376) { + std::fprintf(stderr, " n_embd_backbone=%d expected 5376\n", mtp.n_embd_backbone); + ggml_backend_free(backend); + return fail("n_embd_backbone mismatch"); + } + + // Assertion 2: requires_target_arch == "gemma4" (vLLM #41789 guard) + if (mtp.requires_target_arch != "gemma4") { + std::fprintf(stderr, " requires_target_arch=\"%s\" expected \"gemma4\"\n", + mtp.requires_target_arch.c_str()); + ggml_backend_free(backend); + return fail("requires_target_arch mismatch"); + } + + // Assertion 3: 4 MTP transformer blocks (per MTP.md spec) + if (mtp.layers.size() != 4) { + std::fprintf(stderr, " layers.size()=%zu expected 4\n", mtp.layers.size()); + ggml_backend_free(backend); + return fail("MTP block count mismatch"); + } + + // Assertion 4: attention_k_eq_v=true (Gemma4 quirk; V always read from cache) + if (!mtp.attention_k_eq_v)
{ + ggml_backend_free(backend); + return fail("attention_k_eq_v should be true for Gemma4"); + } + + // Assertion 5: pre_projection tensor shape [2*n_embd_backbone, n_embd_mtp] + // pre_projection concatenates [tok_embd(n_embd_backbone) + h_prev(n_embd_backbone)] + // and projects to MTP's own hidden size n_embd. + // ne[0] = 2*n_embd_backbone = 10752, ne[1] = mtp.n_embd (the MTP model's hidden size) + if (!mtp.pre_projection || + mtp.pre_projection->ne[0] != 2 * (int64_t)mtp.n_embd_backbone) { + std::fprintf(stderr, " pre_projection->ne[0]=%lld expected %d\n", + (long long)(mtp.pre_projection ? mtp.pre_projection->ne[0] : -1), + 2 * mtp.n_embd_backbone); + ggml_backend_free(backend); + return fail("pre_projection shape mismatch (ne[0] != 2*n_embd_backbone)"); + } + + // Assertion 6: post_projection tensor shape [n_embd_mtp, n_embd_backbone] + // Projects MTP hidden back to target backbone dimension. + // ne[0] = mtp.n_embd, ne[1] = n_embd_backbone = 5376 + if (!mtp.post_projection || + mtp.post_projection->ne[1] != (int64_t)mtp.n_embd_backbone) { + std::fprintf(stderr, " post_projection->ne[1]=%lld expected %d\n", + (long long)(mtp.post_projection ? mtp.post_projection->ne[1] : -1), + mtp.n_embd_backbone); + ggml_backend_free(backend); + return fail("post_projection shape mismatch (ne[1] != n_embd_backbone)"); + } + + // Assertion 7: per-MTP-layer donor KV resolution (NOT global pair). + // For Dense 31B (60 target layers, SWA pattern from gemma4_target_graph): + // even-indexed target layers = full attention, last = 58 + // odd-indexed target layers = SWA attention, last = 59 + // Each MTP layer's donor_target_layer must be exactly 58 (full) or 59 (SWA) + // depending on that layer's own attention type. A bounds-only check would + // accept any value in [0, 60), which misses wrong-type assignments. + for (size_t il = 0; il < mtp.layers.size(); ++il) { + const int32_t got = mtp.layers[il].donor_target_layer; + const int32_t want = mtp.layers[il].is_swa ? 59 : 58; // last SWA / last full-attn + if (got != want) { + std::fprintf(stderr, + " layer %zu is_swa=%d donor_target_layer=%d expected %d\n", + il, (int)mtp.layers[il].is_swa, got, want); + ggml_backend_free(backend); + return fail("donor_target_layer does not point to last matching-type target layer"); + } + } + + ggml_backend_free(backend); + std::fprintf(stderr, "[red->green] all 7 assertions PASS\n"); + return 0; +} diff --git a/dflash/test/test_flash_attn_sparse.cpp b/dflash/test/test_flash_attn_sparse.cpp new file mode 100644 index 00000000..f38c470f --- /dev/null +++ b/dflash/test/test_flash_attn_sparse.cpp @@ -0,0 +1,201 @@ +#include "ggml.h" +#include "ggml-cuda.h" +#include "ggml-alloc.h" +#include "ggml-backend.h" +#include "../src/pflash_ggml_adapter.h" +#include <cmath> +#include <cstdio> +#include <cstdlib> +#include <cstring> +#include <vector> + +// Compare dense FA output vs sparse FA output +// At alpha=1.0 (select all blocks), sparse should match dense exactly. +static bool test_sparse_matches_dense(ggml_backend_t backend, int S, int H, int Hk, int D) { + // Use no_alloc=true so tensors are NOT pre-allocated in CPU memory. + // The gallocr will allocate them in the CUDA backend buffer instead, + // which is required for ggml_backend_tensor_set/get to work.
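+ // The full device round-trip used by both tests in this file, in order + // (standard ggml-backend API, exactly as invoked below): + // ggml_gallocr_t a = ggml_gallocr_new(ggml_backend_get_default_buffer_type(backend)); + // ggml_gallocr_alloc_graph(a, gf); // place tensors in CUDA buffers + // ggml_backend_tensor_set(Q, host, 0, nbytes); // host -> device upload + // ggml_backend_graph_compute(backend, gf); // run the graph + // ggml_backend_tensor_get(out, host, 0, nbytes); // device -> host readback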
+ const size_t ctx_size = 256 * 1024 * 1024; + ggml_init_params params = { ctx_size, nullptr, /*no_alloc=*/true }; + ggml_context * ctx = ggml_init(params); + + // Q must be F32: the CUDA FA kernel asserts Q->type == GGML_TYPE_F32 + // K and V can be F16; the kernel converts them internally if needed + // ggml FA convention: ne[0]=D, ne[1]=S, ne[2]=H + ggml_tensor * Q = ggml_new_tensor_3d(ctx, GGML_TYPE_F32, D, S, H); + // K [D, S, Hk] + ggml_tensor * K = ggml_new_tensor_3d(ctx, GGML_TYPE_F16, D, S, Hk); + // V [D, S, Hk] + ggml_tensor * V = ggml_new_tensor_3d(ctx, GGML_TYPE_F16, D, S, Hk); + + // Mark Q, K, V as graph inputs so gallocr allocates persistent backend buffers for them + ggml_set_input(Q); + ggml_set_input(K); + ggml_set_input(V); + + // Causal mask for dense FA: ne[0]=KV_len, ne[1]=Q_len. + // The kernel indexes it as mask[q * ne[0] + kv], so mask[q][kv] = (kv <= q) ? 0 : -inf. + // pFlash applies causal masking at block granularity, so we give dense FA the same mask. + ggml_tensor * mask = ggml_new_tensor_2d(ctx, GGML_TYPE_F16, S, S); + ggml_set_input(mask); + + // Dense FA + ggml_tensor * dense_out = ggml_flash_attn_ext(ctx, Q, K, V, mask, 1.0f/sqrtf((float)D), 0.0f, 0.0f); + + // Sparse FA (alpha=1.0 = select all blocks = should match dense) + ggml_tensor * sparse_out = ggml_flash_attn_sparse(ctx, Q, K, V, 1.0f/sqrtf((float)D), 1.0f); + + // Mark outputs so gallocr never frees/overwrites them before readback + ggml_set_output(dense_out); + ggml_set_output(sparse_out); + + ggml_cgraph * gf = ggml_new_graph(ctx); + ggml_build_forward_expand(gf, dense_out); + ggml_build_forward_expand(gf, sparse_out); + + ggml_gallocr_t alloc = ggml_gallocr_new(ggml_backend_get_default_buffer_type(backend)); + ggml_gallocr_alloc_graph(alloc, gf); + + // Fill Q (F32), K (F16), V (F16) with random data + srand(42); + + std::vector<float> q_buf(D * S * H); + for (auto & x : q_buf) x = (float)(rand() % 1000 - 500) / 500.0f; + ggml_backend_tensor_set(Q, q_buf.data(), 0, ggml_nbytes(Q)); + + std::vector<ggml_fp16_t> buf(D * S * Hk); + for (auto & x : buf) x = ggml_fp32_to_fp16((float)(rand() % 1000 - 500) / 500.0f); + ggml_backend_tensor_set(K, buf.data(), 0, ggml_nbytes(K)); + + buf.resize(D * S * Hk); + for (auto & x : buf) x = ggml_fp32_to_fp16((float)(rand() % 1000 - 500) / 500.0f); + ggml_backend_tensor_set(V, buf.data(), 0, ggml_nbytes(V)); + + // Fill causal mask: mask[q * S + kv] = (kv <= q) ? 0.0f : -INFINITY + { + std::vector<ggml_fp16_t> mask_data(S * S); + for (int q = 0; q < S; q++) { + for (int kv = 0; kv < S; kv++) { + float val = (kv <= q) ?
0.0f : -INFINITY; + mask_data[q * S + kv] = ggml_fp32_to_fp16(val); + } + } + ggml_backend_tensor_set(mask, mask_data.data(), 0, S * S * sizeof(ggml_fp16_t)); + } + + ggml_backend_graph_compute(backend, gf); + + // Compare outputs (dense_out is GGML_TYPE_F32, use ggml_nelements for element count) + const size_t n_elems = ggml_nelements(dense_out); + const size_t out_bytes = n_elems * sizeof(float); + std::vector<float> dense_data(n_elems); + std::vector<float> sparse_data(n_elems); + ggml_backend_tensor_get(dense_out, dense_data.data(), 0, out_bytes); + ggml_backend_tensor_get(sparse_out, sparse_data.data(), 0, out_bytes); + + float max_diff = 0.0f; + bool any_nonfinite = false; + for (size_t i = 0; i < dense_data.size(); i++) { + if (!std::isfinite(sparse_data[i]) || !std::isfinite(dense_data[i])) { + any_nonfinite = true; + break; + } + float diff = fabsf(dense_data[i] - sparse_data[i]); + if (diff > max_diff) max_diff = diff; + } + + printf("[test] S=%d H=%d Hk=%d D=%d max_diff=%.6f nonfinite=%s %s\n", + S, H, Hk, D, max_diff, + any_nonfinite ? "YES" : "no", + (max_diff < 1.0f && !any_nonfinite) ? "PASS" : "FAIL"); + + ggml_gallocr_free(alloc); + ggml_free(ctx); + return max_diff < 1.0f && !any_nonfinite; +} + +// Sanity-check sparse attention at alpha < 1.0: +// The output should not be all zeros (basic liveness check). +// With alpha < 1.0 outputs will differ from dense FA — that is expected and not tested here. +static bool test_sparse_alpha(ggml_backend_t backend, int S, int H, int Hk, int D, float alpha) { + const size_t ctx_size = 256 * 1024 * 1024; + ggml_init_params params = { ctx_size, nullptr, /*no_alloc=*/true }; + ggml_context * ctx = ggml_init(params); + + // ggml FA convention: ne[0]=D, ne[1]=S, ne[2]=H + ggml_tensor * Q = ggml_new_tensor_3d(ctx, GGML_TYPE_F32, D, S, H); + ggml_tensor * K = ggml_new_tensor_3d(ctx, GGML_TYPE_F16, D, S, Hk); + ggml_tensor * V = ggml_new_tensor_3d(ctx, GGML_TYPE_F16, D, S, Hk); + + ggml_set_input(Q); + ggml_set_input(K); + ggml_set_input(V); + + ggml_tensor * sparse_out = ggml_flash_attn_sparse(ctx, Q, K, V, 1.0f/sqrtf((float)D), alpha); + ggml_set_output(sparse_out); + + ggml_cgraph * gf = ggml_new_graph(ctx); + ggml_build_forward_expand(gf, sparse_out); + + ggml_gallocr_t alloc = ggml_gallocr_new(ggml_backend_get_default_buffer_type(backend)); + ggml_gallocr_alloc_graph(alloc, gf); + + srand(42); + + std::vector<float> q_buf(D * S * H); + for (auto & x : q_buf) x = (float)(rand() % 1000 - 500) / 500.0f; + ggml_backend_tensor_set(Q, q_buf.data(), 0, ggml_nbytes(Q)); + + std::vector<ggml_fp16_t> buf(D * S * Hk); + for (auto & x : buf) x = ggml_fp32_to_fp16((float)(rand() % 1000 - 500) / 500.0f); + ggml_backend_tensor_set(K, buf.data(), 0, ggml_nbytes(K)); + + buf.resize(D * S * Hk); + for (auto & x : buf) x = ggml_fp32_to_fp16((float)(rand() % 1000 - 500) / 500.0f); + ggml_backend_tensor_set(V, buf.data(), 0, ggml_nbytes(V)); + + ggml_backend_graph_compute(backend, gf); + + const size_t n_elems = ggml_nelements(sparse_out); + const size_t out_bytes = n_elems * sizeof(float); + std::vector<float> out_data(n_elems); + ggml_backend_tensor_get(sparse_out, out_data.data(), 0, out_bytes); + + // Basic sanity: output must not be all zeros + float max_abs = 0.0f; + for (size_t i = 0; i < out_data.size(); i++) { + float v = fabsf(out_data[i]); + if (v > max_abs) max_abs = v; + } + + bool pass = max_abs > 1e-6f; + printf("[test_sparse_alpha] alpha=%.2f S=%d H=%d Hk=%d D=%d max_abs=%.6f %s\n", + alpha, S, H, Hk, D, max_abs, pass ?
"PASS" : "FAIL (all zeros)"); + + ggml_gallocr_free(alloc); + ggml_free(ctx); + return pass; +} + +int main() { + ggml_backend_t backend = ggml_backend_cuda_init(0); + if (!backend) { + fprintf(stderr, "CUDA backend not available\n"); + return 1; + } + + pflash_register_ggml_kernel(); + + bool ok = true; + ok &= test_sparse_matches_dense(backend, 256, 16, 8, 128); // small + ok &= test_sparse_matches_dense(backend, 1024, 16, 8, 128); // medium + ok &= test_sparse_matches_dense(backend, 4096, 16, 8, 128); // large + + // Alpha < 1.0: pFlash kernel with moderate and aggressive sparsity + ok &= test_sparse_alpha(backend, 1024, 16, 8, 128, 0.5f); // moderate sparsity + ok &= test_sparse_alpha(backend, 4096, 16, 8, 128, 0.12f); // aggressive sparsity (default alpha) + + ggml_backend_free(backend); + printf("\n%s\n", ok ? "ALL TESTS PASSED" : "SOME TESTS FAILED"); + return ok ? 0 : 1; +}