I'm trying to chart the frequency of events over time (computed with =FREQUENCY(values, bins)) against timestamps, and I'm getting some odd behavior. My dates are in the format yyyy.mm.dd hh:mm:ss.000 and are recognized as dates in the worksheet. However, because each value contains both a date and a time, Excel seems to have trouble building the chart. Setting the axis to 'Time-scale' in Chart Options aggregates all data points into a single spike per day in the line chart, and I can only set the major/minor unit to day units (day/month/year), not time units (hour/minute). Choosing 'Category' gives me more time values on the x-axis, but the aggregation is still quite strange. Does anybody know what Excel does behind the scenes?
Example: the chart shows a spike to 250, while in the worksheet the highest data point I have for any one hour of that day is 120, and the total number of events for the whole day is 400. Where did it get 250 from? Does it do some 'intelligent' aggregation over the previous xxx data points and the following xxx data points? Looking at the worksheet, my data show that event A occurs 50% more often than event B in a specific 60-minute period, but the chart presents it as roughly 300% more for that same period.
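For what it's worth, one way to sanity-check the per-hour counts independently of Excel's FREQUENCY and charting behavior is to bin the raw timestamps outside Excel. Here is a minimal Python sketch (the sample timestamps are made up for illustration) that truncates each timestamp to the hour and counts events per bucket, which is the aggregation I would expect the chart to show:

```python
# Independent hourly binning of event timestamps, as a cross-check
# against Excel's FREQUENCY output. The sample data is hypothetical.
from collections import Counter
from datetime import datetime

timestamps = [
    "2023.05.01 09:12:03.000",
    "2023.05.01 09:47:55.000",
    "2023.05.01 10:05:10.000",
]

# Parse each yyyy.mm.dd hh:mm:ss.000 string, truncate it to the
# start of its hour, and count how many events fall in each bucket.
hourly = Counter(
    datetime.strptime(ts, "%Y.%m.%d %H:%M:%S.%f").replace(
        minute=0, second=0, microsecond=0
    )
    for ts in timestamps
)

for hour, count in sorted(hourly.items()):
    print(hour.strftime("%Y.%m.%d %H:00"), count)
```

Exporting the timestamp column as text and running it through something like this would at least confirm whether the worksheet's per-hour maxima (e.g. 120) are right, so the discrepancy can be pinned on the chart's aggregation rather than the data.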