Apologies, I didn't mean to post it so soon. Here is the example:
data from model: 18.407790088925115, 18.44762345205904, 18.4597030832462, 18.360995778757925, 18.515915831120605, 18.74921119654731, 18.579003351919216, 19.10187314465647, 18.939485932181274, 18.302944242739716, 18.523080265081624, 18.675458496756278, 18.548265393909382, 18.87877739857164, 19.091166795323524, 19.063097365622763, 19.021308771221953, 19.060954668761667, 19.05951169001614
standard deviation: 0.2782772857
RMSE: 0.87789744
Pearson's correlation coefficient: 0.810840658
observed data: 16.674455374979765, 16.50238843493613, 16.920688508795916, 16.83653380063905, 16.80002055931839, 17.1536863008199, 16.741343068154787, 17.011087488810798, 17.13915136650847, 16.80960410031736, 16.92260392250867, 17.156146418231792, 17.007458499264462, 17.134257775378767, 16.955364014430202, 17.507028801956164, 17.203462038159312, 16.91186921444716, 16.62443612437913
standard deviation: 0.348033406
RMSE: 0
Pearson's correlation coefficient: 1
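For reference, here is a minimal Python sketch of the conventional definitions I had in mind for these statistics (paired samples, population standard deviation). This is just my assumption about the definitions; if the workbook defines RMSE or the standard deviation differently (e.g. the sample standard deviation), that alone could explain a mismatch:

import math
import statistics

# The paired samples posted above (19 values each).
model = [
    18.407790088925115, 18.44762345205904, 18.4597030832462,
    18.360995778757925, 18.515915831120605, 18.74921119654731,
    18.579003351919216, 19.10187314465647, 18.939485932181274,
    18.302944242739716, 18.523080265081624, 18.675458496756278,
    18.548265393909382, 18.87877739857164, 19.091166795323524,
    19.063097365622763, 19.021308771221953, 19.060954668761667,
    19.05951169001614,
]
observed = [
    16.674455374979765, 16.50238843493613, 16.920688508795916,
    16.83653380063905, 16.80002055931839, 17.1536863008199,
    16.741343068154787, 17.011087488810798, 17.13915136650847,
    16.80960410031736, 16.92260392250867, 17.156146418231792,
    17.007458499264462, 17.134257775378767, 16.955364014430202,
    17.507028801956164, 17.203462038159312, 16.91186921444716,
    16.62443612437913,
]

n = len(model)
mean_m = sum(model) / n
mean_o = sum(observed) / n

# Population standard deviation (divide by n); a workbook may use the
# sample version (divide by n - 1), which gives slightly larger values.
sd_m = statistics.pstdev(model)
sd_o = statistics.pstdev(observed)

# Full RMSE between the paired series; this includes any mean bias.
rmse = math.sqrt(sum((m - o) ** 2 for m, o in zip(model, observed)) / n)

# Pearson's correlation coefficient.
cov = sum((a - mean_m) * (b - mean_o) for a, b in zip(model, observed)) / n
r = cov / (sd_m * sd_o)

print(sd_m, sd_o, rmse, r)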
Here is the link to the file:
https://www.dropbox.com/s/mqhukq7l5l8z83c/Taylor Diagram V1_2_test.xlsm?dl=0
When I plot the standard deviation and correlation data in your file (Taylor Diagram V1_2), it plots nicely, but the RMSE for the model is shown as about 0.14, not 0.877... Any ideas why this might be? Have I done something wrong, or have I misunderstood?
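One possibility I wondered about: as I understand it, a Taylor diagram normally shows the centered RMS difference (mean bias removed), which follows from the two standard deviations and the correlation via a law-of-cosines relation, rather than the full RMSE. A quick sketch using the stats above (the standard relation, not necessarily what V1_2 implements):

import math

sd_model = 0.2782772857
sd_obs = 0.348033406
r = 0.810840658

# Centered RMS difference: E'^2 = sd_m^2 + sd_o^2 - 2 * sd_m * sd_o * R.
# The mean bias drops out, so E' can be much smaller than the full RMSE.
e_prime = math.sqrt(sd_model**2 + sd_obs**2 - 2 * sd_model * sd_obs * r)
print(e_prime)  # about 0.204 with these numbers

That gives about 0.204 here, which still isn't 0.14, but it would at least explain why the diagram's value comes out much smaller than 0.877.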
Thanks!