| notes/research/x/openclaw-token-freedom-2026-03-08.md | Notes, 2026-03-08 | | | |
| notes/research/x/2026-03-21-1028-browserbase-cli.md | Browserbase CLI | | | |
| notes/research/training-framework-versions-2026-03-19.md | LLM Training Framework Version Survey | | | |
| notes/research/sglang-vllm-versions-2026-03-19.md | SGLang vs vLLM Version Survey | | | |
| notes/research/news/热点汇总-report-20260314.md | Trending News Roundup - March 14, 2026 | | | |
| notes/research/news/iran-war-x-report-20260308.md | 🇮🇷 2026 Iran War X/Twitter Sentiment Summary Report | | | |
| notes/research/news/iran-us-israel-monitor.md | Iran - US - Israel News Monitor | | | |
| notes/research/news/china-military-taiwan-report-20260308.md | 🇨🇳 Report on Chinese Central Military Commission Changes and the Taiwan Strait Situation | | | |
| notes/research/huggingface-models-2026-03-19.md | HuggingFace Models Research Report | | | |
| notes/research/github/visual-explainer-2026-03-03.md | Visual Explainer Project Notes | | | |
| notes/research/github/twitter-cli-20260310.md | Twitter-CLI Notes | | | |
| notes/research/github/paperclip-2026-03-19.md | Paperclip Project Research Report | | | |
| notes/research/github/opencli-skill-2026-03-17.md | opencli-skill | | | |
| notes/research/github/oh-my-claudecode-usage-guide.md | oh-my-claudecode Usage Guide | | | |
| notes/research/github/oh-my-claudecode-research.md | oh-my-claudecode Research Report | | | |
| notes/research/github/msa-memory-sparse-attention-2026-03-19.md | MSA (Memory Sparse Attention) Project Notes | | | |
| notes/research/github/larksuite-cli-2026-03-28.md | Feishu/Lark Open Platform Command-Line Tool | | | |
| notes/research/github/gstack-2026-03-18.md | gstack - Garry Tan's Claude Code Software Factory | | | |
| notes/research/github/everything-claude-code-2026-03-18.md | everything-claude-code - Complete Claude Code Configuration Collection | | | |
| notes/research/github/cli-anything-research-20260310.md | CLI-Anything Research Report | | | |
| notes/research/github/clawport-ui-research-20260310.md | ClawPort UI Research Report | | | |
| notes/research/github/boss-cli-20260314.md | boss-cli - BOSS Zhipin Command-Line Tool | | | |
| notes/research/github/autoresearch-at-home.md | autoresearch-at-home | | | |
| notes/research/github/2026-03-29-1618-hermes-mod.md | Hermes Mod | | | |
| notes/research/github/2026-03-29-1533-daVinci-LLM.md | daVinci-LLM | | | |
| notes/research/github/2026-03-21-0716-uncommonroute.md | UncommonRoute | | | |
| notes/research/github/2026-03-20-0835-metaclaw.md | MetaClaw | | | |
| notes/research/github/2026-03-20-0818-msa-memory-sparse-attention.md | MSA (Memory Sparse Attention) | | | |
| notes/research/arxiv/reopold-arxiv-2603.11137-20260314.md | Scaling Reasoning Efficiently via Relaxed On-Policy Distillation | | | |
| notes/research/arxiv/paper-neural-thickets-2603.12228.md | Neural Thickets: Diverse Task Experts Are Dense Around Pretrained Weights | | | |
| notes/research/arxiv/opsdc-arxiv-2603.05433-20260314.md | On-Policy Self-Distillation for Reasoning Compression | | | |
| notes/research/arxiv/hopchain-2603.17024.md | HopChain: Multi-Hop Data Synthesis for Generalizable Vision-Language Reasoning | | | |
| notes/research/arxiv/arxiv-csai-survey-2026-03-week1.md | arXiv cs.AI In-Depth Research Report, First Week of March | | | |
| notes/research/arxiv/arxiv-2603.07685-moe-megatron.md | Scalable Training of Mixture-of-Experts Models with Megatron Core | | | |
| notes/research/arxiv/arxiv-2602.12566-m2rl.md | To Mix or To Merge: Toward Multi-Domain Reinforcement Learning for Large Language Models | | | |
| notes/research/arxiv/2026-03-26-2116-arxiv-feedback-collection.md | arXiv Website User Feedback Data Collection Requirements | | | |
| notes/research/arxiv/2026-03-22-1332-code-foundation-models-survey-arxiv-2511-18538.md | 📄 From Code Foundation Models to Agents and Applications: A Comprehensive Survey and Practical Guide to Code Intelligence | | | |
| notes/research/arxiv/2026-03-22-0945-cbrl-arxiv-2603-18953.md | 📄 Context Bootstrapped Reinforcement Learning (CBRL) | | | |
| notes/research/arxiv/2026-03-20-0738-arxiv-classification-chat.md | Requirements Notes | | | |
| notes/research/arxiv/2026-03-16-1946-attention-residuals.md | Attention Residuals (AttnRes) | | | |
| notes/research/arxiv/2026-03-16-1843-yuan3-ultra.md | Yuan3.0 Ultra: A Trillion-Parameter Enterprise-Oriented MoE LLM | | | |
| notes/research/2026-03-25-0902-distillation-without-the-dark.md | Distillation Without the Dark - GAD Generative Adversarial Distillation | | | |
| notes/research/2026-03-24-0737-rllm.md | RLLM - Unified Post-Training Framework | | | |