You are welcome
I don't mean to argue, but I believe you are mistaken about imperial/metric scales. Most of the Asian DROs I have seen have a 5 micron resolution. The glass scales inside the linear encoder need graduations at a specific spacing to be of any use, and given that the resolution is exactly 5 microns and not some round imperial measurement, I believe the scales are graduated in metric....making the scales effectively metric
I see! But I think we are both right. It's great to be Canadian. We can debate such things without getting angry with each other. That's why I prefer to use the word debate instead of argue. And no worries - I take no offense from anyone who takes the time to debate anything with me. As Benjamin Franklin once said, time "is the stuff life is made of." It is our most precious quantity. Anyone who is willing to give me a part of their life by debating with me has my total respect and gratitude. Please allow me to return the favour.
By your definition, counting microns makes the unit metric because microns are a metric unit - specifically 1 millionth of a meter. And I agree, if counting a metric unit means that they are metric, then they are metric.
But I believe it's actually much more complicated than that.
Metric is a decimal system - it uses a number base of 10: 0 1 2 3 4 5 6 7 8 9 10 11 12 13, etc. Each time you reach 10 you bump the digit to the left, and each additional digit position is another order of magnitude.
However, ALL digital systems count in binary. In fact, it is this binary numbering system that gives them the moniker "digital". In binary, the same sequence as above is 0 1 10 11 100 101 110 111 1000 1001 1010 1011 1100 1101, etc. Note that decimal 10 is the same as binary 1010.
So, even though the system does count a metric unit, it doesn't count those units in metric. It counts them in binary, and the program in the control unit then runs a subroutine that converts that binary count into a decimal or SAE number for display purposes.
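Just as an illustrative sketch (the 5 micron resolution comes from this thread, but the function name and formatting are my own assumptions, not any actual DRO firmware), the count-in-binary-then-convert-for-display idea looks roughly like this:

```python
# The encoder count is just a binary integer inside the control unit;
# metric or SAE units only appear when the count is converted for display.

RESOLUTION_UM = 5  # one encoder count = 5 microns (typical Asian DRO scale)

def display(counts: int, imperial: bool = False) -> str:
    """Convert a raw binary count into a metric or imperial reading."""
    microns = counts * RESOLUTION_UM
    if imperial:
        inches = microns / 25400.0   # 25400 microns per inch
        return f"{inches:.4f} in"
    return f"{microns / 1000.0:.3f} mm"

raw = 0b11001000                 # 200 counts, stored in binary like any integer
print(display(raw))              # 1.000 mm
print(display(raw, imperial=True))  # 0.0394 in
```

The same binary count feeds both readouts; only the final subroutine decides whether the operator sees millimetres or inches.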
It might seem odd to go to such crazy lengths to count a metric quantity in binary and then convert it back to metric for display, but that is the essence of how all computers work. It's also what makes humans so amazing and computers so fast.
As an aside, the metric decimal system is great because it is a global standard. But I wish that we humans had skipped our thumbs and only used our fingers when we started counting. Then we would have developed an octal instead of a decimal numbering system. Why? Because unlike decimal, which requires a rather complicated subroutine to convert from binary, octal is a power of two (8 = 2³), so each octal digit maps directly onto three binary bits. The subroutine, although still required, becomes very simple.
Anyway, all that is to say that you are certainly right at the most basic level. If the measured quantity is a micron (and it is certainly advertised that way) then I agree that it's a metric system at its core.
But I would also make the claim that we are actually both right, because the core unit or increment gets lost in the binary digital counting process and is only converted from binary to the desired metric or SAE reading by the final algorithm that drives the display.
Cool eh!
Edit - on a separate matter, it's also worth noting that 5 microns is just under 2 tenths of a thou. So the 4th digit of an SAE display (tenths) isn't really of any true value other than ensuring that the 3rd digit (thousandths) is more likely to be correct, regardless of whether the programmer averaged, rounded, or truncated the result. As @YYCHM has pointed out - other than that, the 4th place is a pretty useless digit.
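For anyone who wants to check that arithmetic, here is the conversion spelled out (just the math, nothing DRO-specific):

```python
# 5 microns expressed in thousandths of an inch ("thou")
microns = 5
inches = microns / 25400.0   # 25400 microns per inch
thou = inches * 1000         # thousandths of an inch
print(round(thou, 3))        # 0.197 thou, i.e. just under 2 "tenths" (0.0002")
```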