Libido Knowledge Bank
Cool & Powerful
Latest Articles
2025-04-30
Lookback Lens Paper Explained
The following walks through the Lookback Lens paper along a Why → What → How thread, with as little jargon as possible; a basic grasp of large language models (LLMs) is all you need to follow along. 1. Why: why build Lookback Lens? Hallucination remains the biggest obstacle to deploying LLMs. Even when given the correct reference documents, a model often writes nonexistent details into the summary…
LLM
2025-04-06
How Flash Attention Works
What, Why, Where, How. What: speed up attention computation by reducing the volume of memory (IO) accesses. Why: attention is memory-bound rather than compute-bound. Where: earlier work focused on accelerating the computation itself, whereas this paper targets the cost of memory access. How: tiling the matrices to reduce the intermediate results that must be cached, …
LLM
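
As a companion to the excerpt above, here is a minimal NumPy sketch of the tiling idea: attention computed one key/value block at a time with an online softmax, so the full N×N score matrix is never materialized. The function name, block size, and plain-NumPy setting are illustrative assumptions for exposition, not the paper's fused CUDA kernel.

```python
import numpy as np

def tiled_attention(Q, K, V, block=64):
    """softmax(Q K^T / sqrt(d)) V, computed block by block over K/V."""
    n, d = Q.shape
    scale = 1.0 / np.sqrt(d)
    out = np.zeros_like(Q)            # running weighted sum of V blocks
    row_max = np.full(n, -np.inf)     # running max score per query row
    row_sum = np.zeros(n)             # running softmax denominator per row
    for start in range(0, n, block):
        Kb = K[start:start + block]   # one tile of K/V (the "load to SRAM" step)
        Vb = V[start:start + block]
        S = (Q @ Kb.T) * scale        # scores for this tile only: n x block
        new_max = np.maximum(row_max, S.max(axis=1))
        rescale = np.exp(row_max - new_max)   # correct earlier partial results
        P = np.exp(S - new_max[:, None])
        row_sum = row_sum * rescale + P.sum(axis=1)
        out = out * rescale[:, None] + P @ Vb
        row_max = new_max
    return out / row_sum[:, None]

# Sanity check against the naive implementation that builds the full matrix.
rng = np.random.default_rng(0)
Q, K, V = (rng.standard_normal((256, 32)) for _ in range(3))
S = Q @ K.T / np.sqrt(32)
P = np.exp(S - S.max(axis=1, keepdims=True))
ref = (P / P.sum(axis=1, keepdims=True)) @ V
assert np.allclose(tiled_attention(Q, K, V), ref)
```

The running per-row max and denominator are what keep the blockwise result exact rather than approximate: each new tile rescales the previously accumulated numerator and denominator before adding its own contribution.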