# Frankenstein
Embedding hackathon repo.
## Notebook embedding.ipynb

Try to swap components between two LLMs. Recommended starting point: it has more comments and helper functions than the other notebooks.
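As a rough sketch of the kind of component swap this notebook performs, here is a toy numpy version in which each "model" is just a dict of made-up matrices (names and shapes are hypothetical; the real notebook presumably works on actual model weights):

```python
import numpy as np

rng = np.random.default_rng(0)
vocab, d = 100, 16

# Two toy "models": an input embedding matrix plus one projection each.
model_a = {"embed": rng.normal(size=(vocab, d)), "proj": rng.normal(size=(d, d))}
model_b = {"embed": rng.normal(size=(vocab, d)), "proj": rng.normal(size=(d, d))}

# Frankenstein swap: give model A the input embeddings of model B.
model_a["embed"] = model_b["embed"].copy()

# Run the hybrid: look up tokens in B's embeddings, project with A's weights.
token_ids = np.array([3, 14, 15])
hidden = model_a["embed"][token_ids] @ model_a["proj"]
print(hidden.shape)  # (3, 16)
```

The swap only type-checks because both toy models share the same vocabulary size and embedding dimension; mismatched shapes are exactly where this gets hard with real models.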
## Notebooks rotation, scaling and reflection.ipynb

Attempt an operation only on the input and output embeddings of one model. Does it break the model, or is it invariant?
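For intuition, a toy numpy sketch (all matrices here are made up, not taken from a real model): a purely linear path is invariant under an orthogonal rotation applied to the input embedding and undone at the output embedding, but a nonlinearity in between breaks that invariance.

```python
import numpy as np

rng = np.random.default_rng(0)
vocab, d = 50, 8
E = rng.normal(size=(vocab, d))   # input embedding
W = rng.normal(size=(d, vocab))   # output (unembedding) matrix

# Random orthogonal rotation via QR decomposition.
Q, _ = np.linalg.qr(rng.normal(size=(d, d)))

ids = np.array([1, 7, 42])
logits = E[ids] @ W                    # original (purely linear) path
logits_rot = (E[ids] @ Q) @ (Q.T @ W)  # rotate in, rotate back at the output
print(np.allclose(logits, logits_rot))  # True: Q @ Q.T cancels exactly

# With a nonlinearity in between, the rotation no longer cancels.
relu = lambda x: np.maximum(x, 0)
out = relu(E[ids]) @ W
out_rot = relu(E[ids] @ Q) @ (Q.T @ W)
print(np.allclose(out, out_rot))  # False in general
```

The same reasoning applies to scaling and reflection: any invertible map that is undone before the nonlinearities is harmless, while one that passes through them generally is not.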
## Notebook rotation_all.ipynb

This notebook attempts to rotate the entire model: all weights in all transformer layers, etc.

This is not as easy as it sounds, and it is highly model-specific: different models have very different internal layers and representations. Layers may have different shapes, or may be concatenated (such as the kvq matrices).
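A toy numpy sketch of the idea, ignoring real-world complications such as layer norm, attention, and concatenated kvq weights (all names here are hypothetical): rotate every matrix that touches the residual stream, giving writers a trailing Q and readers a leading Q.T, and the model's function is unchanged.

```python
import numpy as np

rng = np.random.default_rng(0)
vocab, d, ff = 50, 8, 32
relu = lambda x: np.maximum(x, 0)

E = rng.normal(size=(vocab, d))  # embedding writes into the residual stream
A = rng.normal(size=(d, ff))     # block reads from the stream ...
B = rng.normal(size=(ff, d))     # ... and writes back into it
U = rng.normal(size=(d, vocab))  # unembedding reads from the stream

def forward(ids, E, A, B, U):
    h = E[ids]
    h = h + relu(h @ A) @ B      # one residual MLP block
    return h @ U

ids = np.array([1, 7, 42])
Q, _ = np.linalg.qr(rng.normal(size=(d, d)))  # orthogonal change of basis

# Writers (E, B) get a trailing Q; readers (A, U) get a leading Q.T.
logits = forward(ids, E, A, B, U)
logits_rot = forward(ids, E @ Q, Q.T @ A, B @ Q, Q.T @ U)
print(np.allclose(logits, logits_rot))  # True
```

In a real transformer the bookkeeping is harder because layer norm is not rotation-invariant in general and the k, v, and q projections may live in one concatenated matrix, as noted above.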
## Notebook rotation_fixed.ipynb

What happens if you rotate the input embedding and then rotate back just before the first activation function in the first neural network?
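A minimal numpy sketch of that experiment on a toy two-layer network (all names and shapes hypothetical): folding the inverse rotation into the first weight matrix cancels the embedding rotation exactly, so nothing downstream of the first activation is affected.

```python
import numpy as np

rng = np.random.default_rng(0)
vocab, d, h = 50, 8, 16
E = rng.normal(size=(vocab, d))   # input embedding
W1 = rng.normal(size=(d, h))      # first layer, before the first activation
W2 = rng.normal(size=(h, vocab))  # everything downstream, untouched
relu = lambda x: np.maximum(x, 0)

ids = np.array([1, 7, 42])
Q, _ = np.linalg.qr(rng.normal(size=(d, d)))  # orthogonal rotation

# Rotate the embedding, then undo the rotation inside the first weight
# matrix, i.e. just before the first nonlinearity.
out = relu(E[ids] @ W1) @ W2
out_fixed = relu((E @ Q)[ids] @ (Q.T @ W1)) @ W2
print(np.allclose(out, out_fixed))  # True
```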