cmathw (Karma: 81)
Gated Attention Blocks: Preliminary Progress toward Removing Attention Head Superposition
by cmathw, Dennis Akar and Lee Sharkey · Apr 8, 2024, 11:14 AM · 42 points · 4 comments · 15 min read · LW link
Polysemantic Attention Head in a 4-Layer Transformer
by Jett Janiak, cmathw and StefanHex · Nov 9, 2023, 4:16 PM · 51 points · 0 comments · 6 min read · LW link