Transformer Support
Support for transformer models is currently limited but expanding rapidly. We have successfully mapped and validated several transformer and CNN+Transformer hybrid models, which are featured in the Model Explorer. Notable examples include:
- TinyViT (Vision Transformer; see the export sketch after this list)
- TinyStories (Language Model)
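For reference, the snippet below is a minimal sketch of how a small vision transformer might be exported to ONNX before compilation. The timm model name, input size, and opset version are assumptions for illustration; check `timm.list_models()` and your own model's preprocessing for the values that apply to your case.

```python
# Hypothetical export of a TinyViT variant to ONNX ahead of compilation.
# The model name "tiny_vit_5m_224" is an assumption; confirm availability
# with timm.list_models("*tiny_vit*") in your environment.
import torch
import timm

model = timm.create_model("tiny_vit_5m_224", pretrained=True)
model.eval()

dummy = torch.randn(1, 3, 224, 224)  # one 224x224 RGB image
torch.onnx.export(model, dummy, "tiny_vit.onnx", opset_version=17)
```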
The NeuralCompiler has been adapted to recognize transformer structures and map them to hardware-supported operators where possible. While this process is not yet automated, the mapping allows many of the underlying operations to be accelerated by our hardware.
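As a rough illustration, an exported model can then be handed to the compiler in the usual way. The sketch below assumes the NeuralCompiler Python API shown in the SDK tutorials; the filename and constructor arguments are placeholders, so consult the compiler reference for the options available in your release.

```python
# Minimal sketch: compiling an exported transformer (or CNN+Transformer
# hybrid) model. The arguments shown here are illustrative, not exhaustive.
from memryx import NeuralCompiler

# The compiler analyzes the graph and maps the operators it recognizes,
# including supported transformer blocks, onto the accelerator.
nc = NeuralCompiler(models="tiny_vit.onnx", verbose=1)
dfp = nc.run()  # the resulting DFP can be loaded by the runtime APIs
```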
If you are working with models that incorporate transformer architectures, we encourage you to reach out to MemryX. Our team will work with you to explore extending support for your specific model.
As we continue to refine and enhance our compiler and runtime, support for transformer models will broaden. Stay tuned for updates and improvements!