An Activation Function. Commonly used in the output Layer of a Neural Net to normalize all values to the range (0, 1) and ensure that all elements of the output add up to 1.
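
As a minimal sketch (assuming the function described here is softmax, since the outputs are normalized to (0, 1) and sum to 1), the following NumPy snippet shows how a vector of raw scores can be turned into such a normalized output:

```python
import numpy as np

def softmax(x):
    # Subtract the max for numerical stability; this does not change the result.
    shifted = x - np.max(x)
    exps = np.exp(shifted)
    return exps / np.sum(exps)

# Hypothetical raw scores (logits) from a network's output layer.
logits = np.array([2.0, 1.0, 0.1])
probs = softmax(logits)
print(probs)        # each value lies in (0, 1)
print(probs.sum())  # values add up to 1.0
```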