
Frankenstein

Embedding hackathon repo

Notebook embedding.ipynb: Tries to swap components between two LLMs. Recommended starting point; it has the most comments and helper functions.
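
The core idea looks roughly like the sketch below. It assumes two Hugging Face causal LMs that share a tokenizer and hidden size (the model names here are placeholders, not necessarily the ones used in the notebook):

```python
# Minimal sketch of the component-swap idea: transplant one model's input
# embedding table into another model and see if the chimera still talks.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

name_a, name_b = "gpt2", "distilgpt2"  # hypothetical pair with shared vocab and d=768
model_a = AutoModelForCausalLM.from_pretrained(name_a)
model_b = AutoModelForCausalLM.from_pretrained(name_b)
tokenizer = AutoTokenizer.from_pretrained(name_a)

# Swap in model B's input embeddings.
model_a.set_input_embeddings(model_b.get_input_embeddings())

inputs = tokenizer("The monster said", return_tensors="pt")
with torch.no_grad():
    out = model_a.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(out[0]))
```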

Notebooks rotation.ipynb, scaling.ipynb and reflection.ipynb: Each applies a single operation to only the input and output embeddings of one model. Does it break the model, or is the model invariant under the operation?
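
For the rotation case, one way to set this up (a sketch, assuming `model` is a loaded Hugging Face causal LM as above; the notebooks may differ in details) is:

```python
# Apply a random orthogonal transform to the embedding matrices only,
# leaving the rest of the model untouched.
import torch

def random_rotation(d, seed=0):
    # QR of a Gaussian matrix, with column signs fixed by diag(R),
    # gives a uniformly random orthogonal matrix.
    g = torch.Generator().manual_seed(seed)
    q, r = torch.linalg.qr(torch.randn(d, d, generator=g))
    return q * torch.sign(torch.diag(r))

with torch.no_grad():
    Q = random_rotation(model.config.hidden_size)
    emb = model.get_input_embeddings()
    emb.weight.copy_(emb.weight @ Q)           # rotate input embeddings
    head = model.get_output_embeddings()
    if head is not None and head.weight is not emb.weight:
        head.weight.copy_(head.weight @ Q)     # rotate untied output embeddings too
```

Scaling and reflection follow the same pattern with `Q` replaced by a scalar multiple or a sign flip of the identity.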

Notebook rotation_all.ipynb: Attempts to rotate the entire model, i.e. all weights in all transformer layers. This is not as easy as it sounds, and it is highly model-specific: different models have very different internal layers and representations, and layers may have different shapes or be concatenated (such as the fused QKV matrices).
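
The bookkeeping rule behind this, sketched under the assumption of PyTorch `Linear` semantics `y = x @ W.T + b`: any layer that *reads* the residual stream gets `W -> W @ Q`, any layer that *writes* into it gets `W -> Q.T @ W` and `b -> b @ Q`, and embedding rows get `E -> E @ Q`. LayerNorm/RMSNorm gains are elementwise and therefore not rotation-invariant, which is one reason this is harder than it sounds; real checkpoints add further wrinkles, e.g. GPT-2 stores the attention projections as a single fused matrix.

```python
import torch

def rotate_reader(linear: torch.nn.Linear, Q: torch.Tensor):
    """Rotate a layer whose *input* lives in the residual stream."""
    with torch.no_grad():
        linear.weight.copy_(linear.weight @ Q)

def rotate_writer(linear: torch.nn.Linear, Q: torch.Tensor):
    """Rotate a layer whose *output* is added to the residual stream."""
    with torch.no_grad():
        linear.weight.copy_(Q.T @ linear.weight)
        if linear.bias is not None:
            linear.bias.copy_(linear.bias @ Q)

# For one decoder block this would be applied roughly as follows
# (attribute names are hypothetical and vary per architecture):
#   rotate_reader(block.attn.q_proj, Q); rotate_reader(block.attn.k_proj, Q)
#   rotate_reader(block.attn.v_proj, Q); rotate_writer(block.attn.out_proj, Q)
#   rotate_reader(block.mlp.up_proj, Q); rotate_writer(block.mlp.down_proj, Q)
```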

Notebook rotation_fixed.ipynb: What happens if you rotate the input embedding, and then rotate back just before the first activation function in the first feed-forward network?
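
A toy demonstration of why this can work (a sketch, not the notebook's exact code): everything before the activation is linear, so folding `Q` into the first weight matrix undoes the rotation exactly, since `(x @ Q) @ (W1 @ Q).T = x @ Q @ Q.T @ W1.T = x @ W1.T`.

```python
import torch

d, vocab = 16, 100
emb = torch.randn(vocab, d)
W1 = torch.randn(4 * d, d)                 # first linear layer, y = x @ W1.T

Q, _ = torch.linalg.qr(torch.randn(d, d))  # random orthogonal matrix

tokens = torch.randint(0, vocab, (8,))
x = emb[tokens]

pre_act = x @ W1.T                          # original pre-activations
pre_act_rot = (x @ Q) @ (W1 @ Q).T          # rotated embedding + folded-back W1

print(torch.allclose(pre_act, pre_act_rot, atol=1e-5))  # True
```

In a real transformer, LayerNorm and the attention block sit between the embedding and the first MLP activation, which is exactly what makes this experiment interesting.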