As of my last knowledge update in September 2021, "transformer saturation" is not a standard term within the Transformer architecture used in natural language processing and machine learning. In that context, "transformer" refers to a deep learning model that has been highly successful in tasks such as language translation and text generation.
The term does, however, have a well-established meaning in electrical engineering: transformer (core) saturation occurs when the magnetic flux in the core exceeds the core material's saturation flux density, so the core cannot be magnetized further. Beyond that point the magnetizing current rises sharply and the voltage and current waveforms distort. If you instead mean a machine learning concept introduced after my knowledge cutoff, I recommend checking more recent sources or providing additional context for a more accurate explanation.
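If the electrical-engineering sense is what's meant, the onset of saturation can be estimated from the standard transformer EMF equation, E_rms = 4.44 · f · N · A · B_peak. The sketch below is illustrative only: the winding values and the assumed saturation flux density for silicon steel are example numbers, not data for any real transformer.

```python
# Sketch: checking whether an applied voltage would drive a transformer
# core into saturation, via the EMF equation E_rms = 4.44 * f * N * A * B_peak.
# All numeric values below are illustrative assumptions.

def peak_flux_density(v_rms: float, freq_hz: float, turns: int, core_area_m2: float) -> float:
    """Peak core flux density B_peak (tesla) implied by the applied RMS voltage."""
    return v_rms / (4.44 * freq_hz * turns * core_area_m2)

B_SAT = 1.7  # assumed saturation flux density for silicon steel, in tesla

# Example: a 230 V, 50 Hz winding with 500 turns on a 10 cm^2 core.
b = peak_flux_density(v_rms=230.0, freq_hz=50.0, turns=500, core_area_m2=10e-4)
print(f"B_peak = {b:.2f} T, saturated: {b > B_SAT}")  # B_peak exceeds B_SAT here
```

Because B_peak is proportional to voltage and inversely proportional to frequency, overvoltage or underfrequency operation is the usual route into saturation.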