Save Embeddings

btibert New Altair Community Member
edited November 2024 in Community Q&A
I could be missing something, but with the latest updates to Deep Learning, which includes the Embedding Layer, I want to be able to extract and save the embeddings.  Other operators ask for a pre-trained model, so I was hoping that we could construct our own, save the embedding, and use elsewhere.
Best Answer

  • jacobcybulski New Altair Community Member
    Answer ✓
    You can save the entire model, or just the weights and/or biases, but that is not exactly what you are looking for. I still need to test it, but I think that if you only want the embedding layer, you can create and save a word2vec model using a separate extension and then load it into your embedding layer.

Answers

  • pschlunder New Altair Community Member
    Hey,

    as @jacobcybulski said, you can already create your own word2vec embeddings with the Word2Vec extension. But you are right: as of now, you cannot export embeddings created inside network training. Currently you can adapt an embedding along with the network structure and apply the network, including the embedding, as a whole. We know it is also important to extract/export just the embedding, but that did not make it into this release.

    Hope this helps.

  • btibert New Altair Community Member
    Got it, thanks @pschlunder. For now I will go the other route. Ideally, of course, we could keep the whole workflow within RM operators rather than relying on externally trained embeddings, but that is still a great feature.
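To make the word2vec workaround concrete outside of RapidMiner, here is a minimal numpy sketch of what "saving just the embedding" amounts to: writing a vocabulary plus its weight matrix in the plain-text word2vec format ("word v1 v2 ..."), which the Word2Vec extension and most embedding loaders can consume, and reading it back. The file name and toy vocabulary below are hypothetical, for illustration only.

```python
import numpy as np

def save_word2vec_text(path, vocab, matrix):
    """Write vocab + vectors in the plain-text word2vec format."""
    with open(path, "w", encoding="utf-8") as f:
        # Header line: vocabulary size and vector dimensionality.
        f.write(f"{len(vocab)} {matrix.shape[1]}\n")
        for word, row in zip(vocab, matrix):
            f.write(word + " " + " ".join(f"{x:.6f}" for x in row) + "\n")

def load_word2vec_text(path):
    """Read such a file back into a vocab list and a numpy matrix."""
    with open(path, encoding="utf-8") as f:
        n, dim = map(int, f.readline().split())
        vocab, rows = [], []
        for line in f:
            parts = line.rstrip("\n").split(" ")
            vocab.append(parts[0])
            rows.append([float(x) for x in parts[1:]])
    return vocab, np.array(rows)

# Toy "trained" embedding: 3 words, 4 dimensions.
vocab = ["cat", "dog", "fish"]
weights = np.random.rand(3, 4)

save_word2vec_text("embedding.txt", vocab, weights)
vocab2, weights2 = load_word2vec_text("embedding.txt")
assert vocab2 == vocab
assert np.allclose(weights, weights2, atol=1e-5)
```

Once an embedding is in this format, it can be loaded to initialize an embedding layer instead of training one from scratch inside the network.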