Neural network ReLU pronunciation

Wondering how to pronounce the abbreviation for the Rectified Linear Unit (ReLU), the activation function often used in hidden layers?  I was wondering the same thing, so I reviewed several presentations, and from what I can hear, two common ways to pronounce ReLU are:

1) "REH-loo" (with an "eh" sound)

2) "RAY-loo" (with an "ay" sound)

Both of these sound quite close, and I believe either is acceptable: spoken aloud, the meaning will be clearly understood.  Pronunciation 1, with more of an "eh" sound, appears to me to be the more common and correct of the two, based on what I have heard in use.
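However you say it, the function itself is simple. As a quick illustration (a minimal NumPy sketch, not tied to any particular framework), ReLU just passes positive inputs through and clamps negative inputs to zero:

```python
import numpy as np

def relu(x):
    """Rectified Linear Unit: returns x for positive inputs, 0 otherwise."""
    return np.maximum(0, x)

print(relu(np.array([-2.0, -0.5, 0.0, 1.5, 3.0])))  # [0.  0.  0.  1.5 3. ]
```

Negative values are zeroed out while positive values are unchanged, which is what makes it "rectified."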
