compute the entropy of a string
Entropy( s )
The Entropy(s) command computes the Shannon entropy of the string s, and returns it as a floating-point number.
Shannon's entropy is defined as -add( P( ch ) * log[ 2 ]( P( ch ) ), ch = Support( s ) ), where P( ch ) = CountCharacterOccurrences( s, ch ) / length( s ). It is a measure of the information content of the string, and can be interpreted as the number of bits required to encode each character of the string given perfect compression. The entropy is maximal when each character is equally likely. For arbitrary non-null characters, this maximal value is log[ 2 ]( 255 ) = 7.99435.
(The null byte, with code point 0, cannot appear in a Maple string. If all 256 single-byte code points could appear, then the maximal entropy would be log[ 2 ]( 256 ) = 8, which is the number of bits per byte.)
Note that the entropy is computed as a floating-point number, at hardware (double) precision.
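The definition above can be sketched in Python; this is a hedged illustration of the same formula, not Maple's implementation. Python's Counter plays the role of CountCharacterOccurrences over Support(s).

```python
from collections import Counter
from math import log2

def entropy(s: str) -> float:
    """Shannon entropy of a string, in bits per character."""
    n = len(s)
    # P(ch) = CountCharacterOccurrences(s, ch) / length(s), summed over Support(s)
    return -sum((c / n) * log2(c / n) for c in Counter(s).values())

# A string in which each distinct character is equally likely attains the
# maximal entropy log2(k) for k distinct characters:
print(entropy("abcdabcd"))  # log2(4) = 2.0
```

As in Maple, the result is a hardware floating-point number.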
The following steps illustrate the definition of Entropy.
s ≔ Random( 30, 'lower' )
s ≔ "rbygsggdjijjtiqelzxehfnojeorwr"
occ ≔ seq( CountCharacterOccurrences( s, ch ), ch = Support( s ) )
occ ≔ 1,1,3,1,3,1,2,4,1,1,2,1,3,1,1,1,1,1,1
L ≔ map( `/`, [occ], length( s ) )
L ≔ [1/30, 1/30, 1/10, 1/30, 1/10, 1/30, 1/15, 2/15, 1/30, 1/30, 1/15, 1/30, 1/10, 1/30, 1/30, 1/30, 1/30, 1/30, 1/30]
U ≔ map( p -> -evalf( p * log[ 2 ]( p ) ), L )
U ≔ [0.1635630199, 0.1635630199, 0.3321928095, 0.1635630199, 0.3321928095, 0.1635630199, 0.2604593730, 0.3875854127, 0.1635630199, 0.1635630199, 0.2604593730, 0.1635630199, 0.3321928095, 0.1635630199, 0.1635630199, 0.1635630199, 0.1635630199, 0.1635630199, 0.1635630199]
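The Maple steps above can be cross-checked with a short Python sketch; this assumes Python's sorted character counts match the order of Support(s), which returns the distinct characters in increasing order.

```python
from collections import Counter
from math import log2

s = "rbygsggdjijjtiqelzxehfnojeorwr"  # the random string from the steps above

# Counts over the support of s, mirroring CountCharacterOccurrences/Support
occ = [c for _, c in sorted(Counter(s).items())]

# Relative frequencies P(ch), mirroring map(`/`, [occ], length(s))
L = [c / len(s) for c in occ]

# Per-character terms -P(ch)*log2(P(ch)), mirroring the map defining U
U = [-p * log2(p) for p in L]

# Summing the terms of U gives the entropy of s, about 4.0314 bits/character
print(sum(U))
```

Summing the elements of U reproduces the value that Entropy(s) would return directly.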