You lose power. Whether it matters is case by case. (machine, operator, operation, how much you are trying to reduce it, etc) I don't know how that can be argued, but it seems contentious.
I am beginning to think it's contentious because we seem to have different baselines. We agree on almost everything except the baseline requirements. But your reply to
@Darren suggests to me a few opportunities to at least understand our different perspectives and perhaps even get on the same page. If not, then we can always agree to disagree as we have in the past.
In any event, I am hopeful that we can get there from your reply to Darren. Wouldn't that be nice!
Please bear with me one more time.
You're losing power. Reducing speed on a machine tool is usually because the diameter has increased, meaning you need greater torque. Imagine a motor 1:1 to spindle, no transmission, and mount a 1/4" end mill. It should work, right? Now hook up a 4" fly cutter. It doesn't cut despite having the same torque. You need much higher torque to drive the 4" cutter.
Your example assumes no speed change (1:1 spindle, no gearbox). I totally agree that if rpm is constant and diameter increases, the power requirement goes up. Not just a little, a lot: roughly in proportion to the diameter, since the same tangential force acts on a longer moment arm at a higher surface speed. The torque requirement rises the same way. The cutting force itself isn't perfectly linear with surface speed, but that's a quibble. Fundamentally I agree with your example for a constant-speed motor.
Do I hear applause?
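To put rough numbers on that (my own illustrative figures, not anything from this thread), here's a quick sketch of how torque and power scale with cutter diameter at a fixed rpm and a fixed tangential cutting force:

```python
import math

RPM = 1200                 # assumed constant spindle speed
CUTTING_FORCE_N = 500.0    # assumed constant tangential cutting force

def required_torque_nm(diameter_in):
    """Torque = force x radius (radius converted from inches to metres)."""
    radius_m = diameter_in * 0.0254 / 2
    return CUTTING_FORCE_N * radius_m

def required_power_w(diameter_in):
    """Power = torque x angular velocity."""
    omega = RPM * 2 * math.pi / 60          # rad/s
    return required_torque_nm(diameter_in) * omega

for d in (0.25, 4.0):
    print(f'{d}" cutter: {required_torque_nm(d):.2f} N·m, '
          f'{required_power_w(d):.0f} W')
```

Going from 1/4" to 4" is a 16x jump in both torque and power at constant rpm — steep, though linear in diameter rather than exponential.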
When you quarter the speed via a mechanical transmission you quadruple the torque. i.e. A mechanical transmission increases torque as the speed goes down, maintaining power.
Yup. Totally agree again. There are other factors at play here too, but again they are quibbles.
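That speed-for-torque trade can be sketched in a couple of lines (idealized, gear losses ignored; the numbers are mine):

```python
def through_reduction(torque_nm, rpm, ratio):
    """Ideal mechanical reduction: speed divided by `ratio`,
    torque multiplied by it, so torque x rpm (power) is unchanged."""
    return torque_nm * ratio, rpm / ratio

t_in, s_in = 10.0, 1800
t_out, s_out = through_reduction(t_in, s_in, 4)   # quarter the speed
print(t_out, s_out)                   # 40.0 450.0 -> four times the torque
print(t_in * s_in == t_out * s_out)   # True: power in == power out
```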
When you did that low Hz cut, I bet you are in low gear mechanically? That has multiplied the torque.
Not necessarily the way I would make your point. Even if you don't change gears mechanically, you reduce rpm through the VFD in order to maintain surface cutting speed. Here I won't quibble. It takes a certain force to push the cutter through the metal at a certain speed. That tangential cutting force comes from the motor's torque acting through a moment arm: the radius of the part or of the tool (lathe or mill). Thus, to hold the same surface speed and take the same size chip with the same cutting force on a larger diameter, the torque at the spindle has to go up.
I believe we agree on this point.
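A short sketch of that point of agreement (assumed numbers again): holding surface speed and cutting force constant, rpm falls and torque rises in proportion to diameter, so the ideal cutting power stays put:

```python
import math

SURFACE_SPEED_M_MIN = 60.0   # assumed constant surface speed, m/min
CUTTING_FORCE_N = 400.0      # assumed constant tangential cutting force

def spindle_rpm(diameter_mm):
    """rpm that holds the surface speed constant at this diameter."""
    return SURFACE_SPEED_M_MIN * 1000 / (math.pi * diameter_mm)

def spindle_torque_nm(diameter_mm):
    """Same force on a longer moment arm needs more torque."""
    return CUTTING_FORCE_N * diameter_mm / 2000   # radius in metres

def cutting_power_w(diameter_mm):
    """Power = torque x angular velocity; independent of diameter here."""
    omega = spindle_rpm(diameter_mm) * 2 * math.pi / 60
    return spindle_torque_nm(diameter_mm) * omega

for d in (25.0, 100.0):
    print(f"{d} mm: {spindle_rpm(d):.0f} rpm, "
          f"{spindle_torque_nm(d):.1f} N·m, {cutting_power_w(d):.0f} W")
```

Quadruple the diameter and the required torque quadruples while the rpm drops to a quarter; the product (power) is unchanged, which is exactly why running well below base speed on a VFD-only reduction can come up short on torque.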
Where we disagree is within the parameters of what you have called performance loss. Not so much in terms of actual performance but rather what I would call available performance. I think this will get clearer when I debate your small motor argument next.
At some point it matters, else the industry would put smaller motors on machines. It also matters for a correct understanding of the merits of different approaches; there is a performance degradation with electronic speed reduction.
I don't think that's how it happens. The industry doesn't specify motors based on minimum requirements. I believe they are specified in order to meet maximum requirements. The maximum dictates the motor size and everything else happens below the maximum.
This is where our debate converges, and also where your performance criteria enter the picture. I think it's also a good place to understand the difference between hobby use and professional use. Those who make their living machining are always looking to maximize performance. ROI is what drives profit and pays the bills. Hobbyists, on the other hand, usually don't care that much about maximizing performance. They just want to make parts. Payback and cost justification take a back seat to getting the job done.
So yes, if the AVAILABLE performance is greater than the necessary performance, everyone is happy (pros and hobbyists).
In most cases, the pros look to maximize performance, pushing cutting depth, chip size, insert design, etc., to the limits. That's what pays the bills.
On the other hand, the hobbyist looks to produce a part. As long as the "available" performance is greater than the "required" performance, all is well. Furthermore, if the required performance exceeds the available performance, the hobbyist is totally prepared to make adjustments (shallower cuts, slower feed rates, lower-load inserts, etc.) to get the job done.
Which is the only point I've ever made: there IS a performance degradation. You lose power.
I think this is where the circle closes.
I would insert the word "potential" into your point which if you accept it, I believe allows us to agree.
"there is a POTENTIAL performance degradation".
You only lose power if the requirements exceed what is available. But if what is available exceeds what is required, then everything works just fine.
Whether it matters is case by case. (machine, operator, operation, how much you are trying to reduce it, etc) I don't know how that can be argued, but it seems contentious.
There is NOTHING contentious about that statement at all. At least not for me. I agree 100%.
I would only add that most hobbyist machines are capable of much greater performance than we hobbyists require. So we can easily be happy with less horsepower (using your words) because we simply don't need it. We don't push our machines to their limits, and there is usually LOTS of room between available and required. When there isn't, we simply reduce our requirements.
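Here's a toy version of that available-vs-required framing (my own simplification: a V/Hz drive below base speed holds torque roughly constant, so available power falls linearly with frequency; the horsepower figures are made up):

```python
BASE_HZ = 60.0
MOTOR_HP = 2.0   # assumed nameplate power at base speed

def available_hp(run_hz):
    """Constant-torque region: available power scales with speed,
    capped at the nameplate rating above base frequency."""
    return MOTOR_HP * min(run_hz, BASE_HZ) / BASE_HZ

def cut_is_happy(run_hz, required_hp):
    """The 'loss' only bites when required exceeds available."""
    return available_hp(run_hz) >= required_hp

print(cut_is_happy(20, 0.4))   # True: typical light hobby cut, lots of margin
print(cut_is_happy(20, 1.5))   # False: requirement exceeds what's available
```

At 20 Hz only about 0.67 hp of the 2 hp remains available, yet the light cut still clears it easily; that gap is the margin most hobby work lives in.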
What do you say? Can we agree to agree now? Or is there still more ground to close between us?
FWIW, I shall change my statements about VFDs going forward to be clearer about the difference between available and required.
I think this clarification also perfectly addresses
@Darren's observations and mine too, because our requirements simply do not exceed what is available.
I'll only add one statement that you may or may not agree with: "the low-speed constant-torque capability of a VFD dramatically improves the margin between what is available and what is required."