LarryFromVegas
New Member · Joined Mar 6, 2012 · Messages: 2
I have an input cell formatted as Number > Percentage with 4 decimal places. This cell needs to display the user's input without the user having to type the percent sign. It is used to express small tax percentages, often less than 1%. The cell feeds calculations elsewhere on the sheet, so precision and consistency of input are extremely important. My issue is the inconsistent display of the input values.
When the user types "1" (meaning 1%, the same as the decimal .01), the cell correctly displays as 1.0000 %.
When the user types "10" (meaning 10%, the same as the decimal .10), the cell correctly displays 10.0000 %.
When the user types "100" (meaning 100%, the same as the decimal 1.00), the cell correctly displays 100.0000 %.
HOWEVER, when the user types .1 (meaning .1%, i.e. one tenth of a percent, the same as the decimal .001), the cell INCORRECTLY displays 10%. It should display .1%, treating the entry the same way as the others shown above. Excel behaves as if the user made an error and could not have meant a fraction of a percent, but that assumption is wrong.
It appears that Excel only treats the entry as a percentage (dividing it by 100) when it is greater than or equal to 1. When the entry is less than 1, Excel stores it as typed, and the Percentage format then displays that stored value multiplied by 100, so .1 becomes 10%.
For a temporary fix, I WAS able to get Excel to handle the entry correctly if the user typed ".1%" (typing the percent sign after the number). The problem is that the user is not supposed to have to type the percent sign in that cell. In this application it would be very confusing and inconsistent to require the percent sign for entries under 1% but not for entries of 1% or more.
I have searched the Internet and cannot find a way to change this so far. I would settle for a custom format that gets Excel to express the entry as a percent correctly, as long as it didn't mess up any related calculations.
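One workaround in that direction (just a sketch; the cell references below, B2 for the rate and A1 for the taxable amount, are only for illustration) would be to drop the true Percentage format and instead give the input cell a custom number format whose percent sign is quoted text. A quoted "%" is displayed literally rather than triggering the multiply-by-100 display, and because the cell is no longer a Percentage-formatted cell, Excel stores exactly what the user types. The trade-off is that the cell then holds "percent points" rather than a true percentage, so the dependent formulas have to divide by 100 themselves.

Custom number format for the input cell (Format Cells > Number > Custom):
    0.0000" %"

Typed entry    Value stored    Displayed
1              1               1.0000 %
10             10              10.0000 %
.1             0.1             0.1000 %

Example of a dependent formula adjusted to divide by 100:
    =A1*(B2/100)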
When the user types "1" (meaning 1%, the same as the decimal .01), the cell correctly displays as 1.0000 %.
When the user types "10" (meaning 10%, the same as the decimal .10), the cell correctly displays 10.0000 %.
When the user types "100" (meaning 100%, the same as the decimal 1.00), the cell correctly displays 100.0000 %.
HOWEVER when the user types .1 (meaning .1% --- aka 1/10% --- the same as the decimal .001), the cell INCORRECTLY displays as 10%. It should display as .1%; treating it similarly to the others shown above. Excel responds as if the user made an error and doesn't want to express a decimal of a percentage, but that is an incorrect assumption.
It appears that Excel will only treat the entry as a percentage as long as that it is greater than or equal to 1. But when the entry is less than one, it multiplies that amount by 100.
For a temp fix, I WAS able to get Excel to correctly put the percent sign in if the user were to type ".1%" [needed to type the percent sign after the number]. But the problem with that is it is not intended that the user have to type the percent sign when entering text into that cell. In this application, it would be very confusing and inconsistent for the user to type the percent sign in some instances (if less than 1) and not type the percent sign when greater than 1%.
I have searched the Internet and cannot find a way to change this so far. Would settle for a custom format to get Excel to properly express this cell entry as a percent as long as that didn't mess up any related calculations.