Yes. There are several different concepts called 'entropy', sometimes related only because their mathematical formulations look similar.
The term means different things in different contexts, so an abstract discussion of it is essentially meaningless.
Even discussions within the context of the second law of thermodynamics are often misleading, because people ignore much of the context in which the statistical framing of the law was formulated. Formal systems and all that... These are not general descriptions of how nature works, but definitions within formal systems that allow for certain calculations.
I find Noether's study of symmetries much more illuminating in general than attempts to generalize conservation laws observed within certain formal models.
Whenever an entropy appears, it can be written as
S = - sum_n p_n log(p_n)
where p_n is a probability distribution: for n = 1...W, p_n >= 0 and sum_n p_n = 1. The underlying equation is always the same; only the probability distribution changes.
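As a quick illustration of that formula (a sketch, not tied to any particular physical system), here is the same equation evaluated on two distributions, using the usual convention 0 log 0 := 0. The function name and the choice of log base are mine, not from the original:

```python
import math

def entropy(p, base=math.e):
    """S = -sum_n p_n log(p_n), with the convention 0 * log(0) = 0."""
    assert all(q >= 0 for q in p), "probabilities must be non-negative"
    assert abs(sum(p) - 1.0) < 1e-9, "probabilities must sum to 1"
    return -sum(q * math.log(q, base) for q in p if q > 0)

# Uniform distribution over W = 4 outcomes: S = log(W) = 2 bits in base 2.
print(entropy([0.25, 0.25, 0.25, 0.25], base=2))  # 2.0

# A certain outcome (all probability on one n): S = 0.
print(entropy([1.0, 0.0, 0.0]))  # 0.0
```

Changing the probability distribution (microstates of a gas, symbols in a message, measurement outcomes) is exactly what moves you between the thermodynamic, information-theoretic, and other entropies.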