ExcelChampion
Well-known Member
Joined: Aug 12, 2005
Messages: 976
Racking my brain on this one...
I'm creating a table of 10 buckets (1-10) to show the average of the values that fall in each bucket. The buckets simply run from the min as the lowest bound to the max as the highest, with the boundaries equally spaced in between.
The issue is that when I also count the values that fall into those buckets, 99.9% of them end up in the min bucket.
I'd like to build the buckets around where most of the data actually sits, so the bucket range would really be 0 to 227, for example.
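For reference, here is one way the buckets and the table below could be built in-sheet. It's a minimal sketch: the ranges A2:A27000 (the metric being bucketed) and B2:B27000 (the value being averaged), plus all the other cell addresses, are placeholder assumptions, not my actual layout.
Code:
G1: =MIN($A$2:$A$27000)                    'lowest value
G2: =(MAX($A$2:$A$27000)-$G$1)/10          'equal bucket width across min..max
D2: =$G$1+(ROWS($D$2:D2)-1)*$G$2           'bucket lower bound, copied down to D11
E2: =AVERAGEIFS($B$2:$B$27000,$A$2:$A$27000,">="&D2,$A$2:$A$27000,"<"&D2+$G$2)
F2: =COUNTIFS($A$2:$A$27000,">="&D2,$A$2:$A$27000,"<"&D2+$G$2)
'for the last bucket, use "<="&MAX($A$2:$A$27000) so the top value is included
With real data in the sheet, that produces a layout like this: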
Code:
Bucket        Avg Value     Count
Min      0       -1.2%     26,031
       227       -0.7%        110
       455       -1.6%         37
       682        1.1%         33
       909      -11.8%         12
     1,137      -10.5%          5
     1,364        1.2%          4
     1,591       25.2%          1
     1,818        0.0%          1
     2,046       -2.2%          2
Max  2,273
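Since nearly everything lands in the first bucket, one possible tweak (again just a sketch, reusing the same hypothetical cell layout as above) is to cap the top of the bucket range at a high percentile instead of the true max, so the few large outliers don't stretch the bucket width, and then leave the last bucket open-ended so those outliers still get counted and averaged:
Code:
G3:  =PERCENTILE.INC($A$2:$A$27000,0.99)   'capped top of range; pick the percentile to taste
G4:  =(G3-$G$1)/10                         'bucket width based on the capped range
D2:  =$G$1+(ROWS($D$2:D2)-1)*$G$4          'lower bounds, copied down to D11
E2:  =AVERAGEIFS($B$2:$B$27000,$A$2:$A$27000,">="&D2,$A$2:$A$27000,"<"&D2+$G$4)
F2:  =COUNTIFS($A$2:$A$27000,">="&D2,$A$2:$A$27000,"<"&D2+$G$4)
E11: =AVERAGEIFS($B$2:$B$27000,$A$2:$A$27000,">="&D11)   'open-ended last bucket
F11: =COUNTIFS($A$2:$A$27000,">="&D11)
That keeps the bucket width driven by the bulk of the data (roughly the 0 to 227 span in the example above) while everything above the cap still shows up in the last bucket's count and average.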