Hacker News
PowerRetention: a drop-in replacement for FlashAttention in LLMs (github.com/m-a-n-i-f-e-s-t)
2 points by dvrp 84 days ago | 2 comments
Power Attention: Efficient CUDA Kernels for Symmetric Power Transformers (github.com/m-a-n-i-f-e-s-t)
6 points by txus 10 months ago | 2 comments