The controller definitely requires DC input. The big hint is the + and - marked on the power terminals! Feeding 120V AC in there will let the magic smoke out very quickly, and probably in an impressive way.
More specifically, it requires somewhere between 24 and 50V DC on the power terminals to work. From the more detailed description I found, it can control a 3-phase motor that uses up to 500W, or a maximum of about 12A on the DC input. That 12A times the input voltage is the real maximum output power. More realistically, this is probably built as cheaply as possible (the 12A rating is likely 'marketing amps', more like 6-8 real amps), so for reliable operation I would limit it to half to two thirds of that.
Since any cheap power supply you order is also likely to use 'marketing' ratings, make sure you get some extra. To provide a reliable 12A, I would look for at least a 15-20A power supply at whatever voltage you choose.
If the motor you use is only a 250W motor, this controller would probably be fine with a 10-12A power supply. I would not, however, try to run a 500W motor at full load continuously. For just a few seconds it is probably OK, but you are pushing your luck running it continuously on this controller with a 12A power supply.
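To put rough numbers on that, here is a quick back-of-the-envelope sketch in Python (the derate and headroom factors are my own rules of thumb, not anything from the controller's documentation):

```python
# Rough power-budget sketch: treat the controller's and the supply's
# amp ratings as optimistic and build in margin accordingly.
def usable_watts(bus_voltage, rated_amps=12.0, derate=0.6):
    """Conservative continuous output power for the controller."""
    return bus_voltage * rated_amps * derate

def supply_amps(load_amps, headroom=1.5):
    """Size the supply above the real load, since its rating is optimistic too."""
    return load_amps * headroom

print(usable_watts(24))   # ~173 W continuous on a 24 V supply
print(usable_watts(48))   # ~346 W continuous on a 48 V supply
print(supply_amps(12))    # 18 A supply to deliver a reliable 12 A
```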
The output speed of the motor is controlled by the DC voltage applied to the SV terminal (from 0 to 10V). That voltage can come from a potentiometer acting as a voltage divider between the supplied 10V and ground terminals, or from some external 0-10V source. The 0-10V range is used because it is a very common control scheme in industrial equipment.
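As a sketch of that scheme (assuming a linear pot and a linear voltage-to-speed mapping, which is typical for 0-10V control but not guaranteed for this particular controller):

```python
def sv_voltage(pot_fraction, v_ref=10.0):
    """Wiper voltage of a linear pot wired across the 10 V and ground terminals."""
    return v_ref * pot_fraction

def speed_fraction(sv_volts):
    """Assumed linear mapping: 0 V -> stopped, 10 V -> full speed."""
    return min(max(sv_volts / 10.0, 0.0), 1.0)

print(speed_fraction(sv_voltage(0.5)))   # pot at mid-travel -> 0.5 (half speed)
```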
Exactly what speed the motor will run at depends on the motor and the input voltage. Many BLDC motors have a rating called KV, which is the number of RPM they will turn per volt applied. That rating is commonly found on the small motors used in quad-copters. For example, 1000KV means the motor can spin at 1000RPM for each volt on its input, so with 12V the motor can spin at up to 12,000RPM.
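In code form, the ideal (no-load, lossless) relationship is just:

```python
def no_load_rpm(kv, volts):
    """Ideal no-load speed: Kv (RPM per volt) times the applied voltage."""
    return kv * volts

print(no_load_rpm(1000, 12))   # 12000: the 12,000 RPM example above
```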
It won't spin at that speed on its own. The controller needs to generate an appropriate frequency of three-phase AC to run the motor at that speed. That is just the maximum speed at which the motor retains its torque. As the AC frequency increases beyond that point, the motor rapidly loses torque because its windings cannot change current quickly enough. The rate at which the winding current (and hence the magnetic field that makes the motor rotate) can change depends on the inductance of the windings and the voltage applied.
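As a first-order illustration of that limit (ignoring winding resistance and back-EMF, and using a made-up 100 uH winding inductance):

```python
def current_rise_time(inductance_h, volts, target_amps):
    """t = L * I / V: time for the winding current to ramp to the target,
    ignoring resistance and back-EMF (first-order approximation)."""
    return inductance_h * target_amps / volts

print(current_rise_time(100e-6, 12.0, 10.0))   # ~83 us to reach 10 A at 12 V
print(1 / 800.0)                               # vs. a 1.25 ms period at 800 Hz
```

At 800Hz there is plenty of time each cycle for the current to build; multiply the frequency a few more times and there isn't.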
The AC frequency required is the number of rev/sec (12,000RPM = 200 rev/sec) times the number of pole pairs in the motor. For instance, a motor with 4 pole pairs (8 poles) requires 4 AC cycles for every rotation of the shaft. So that 1000KV motor running at 12,000RPM requires a 200 x 4 = 800Hz 3-phase driving signal. Increasing the frequency to 1600Hz will make the motor spin faster, but at much less torque: in half the time the coils can only build to half the current, which means half the magnetic field, so half the torque. It might actually be only a quarter of the torque at twice the frequency.
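The same arithmetic as a function:

```python
def drive_freq_hz(rpm, pole_pairs):
    """3-phase electrical frequency = mechanical rev/s times pole pairs."""
    return (rpm / 60.0) * pole_pairs

print(drive_freq_hz(12000, 4))   # 800.0 Hz for the 4-pole-pair example
```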
Basically, when the controller is at its full speed setting (10V input signal), it will drive the motor with an AC signal at the full input voltage (or nearly so), so the motor should rotate at close to its KV rating times the input voltage. There are some losses, so it won't be quite that. This controller lists a maximum of 20,000RPM, but that is probably really its maximum electrical frequency (20,000 cycles per minute = 333Hz). The actual motor RPM would then depend on the number of pole pairs the motor has.
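If that reading of the spec is right (the 333Hz figure is my interpretation, not confirmed), the speed ceiling for a given motor would be:

```python
def max_rpm(freq_limit_hz, pole_pairs):
    """Mechanical speed ceiling if the controller tops out at freq_limit_hz."""
    return 60.0 * freq_limit_hz / pole_pairs

for pp in (1, 2, 4):
    print(pp, max_rpm(333, pp))   # 1 -> ~20,000 RPM, 2 -> ~10,000, 4 -> ~5,000
```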
The controller has two ways of controlling the output frequency to the motor: with Hall sensors and without. Either way, the controller uses feedback from the physical motor rotation to control its output frequency.
Hall effect sensors are magnetic sensors that sense the position of small magnets on the motor shaft. As the motor rotates, the Hall sensors tell the controller how fast the motor is actually turning (more precisely, the position of the rotor) so it can output the appropriate frequency. If you are trying to speed up the motor, the controller outputs a slightly higher frequency than the motor is actually running at so the motor speeds up. The idea is to keep increasing the rate of the rotating magnetic field so the motor accelerates, but never so much that the motor falls behind the field. You cannot just suddenly apply a 200Hz signal to a stopped BLDC motor and expect it to catch up. It will just stall and vibrate instead.
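A toy version of that ramping logic (the 5Hz lead and the rotor model are invented numbers, purely to show the principle of staying just ahead of the rotor):

```python
def next_drive_freq(rotor_hz, target_hz, max_lead_hz=5.0):
    """Drive slightly ahead of the measured rotor speed so the field
    pulls the rotor along without ever leaving it behind (a stall)."""
    return min(rotor_hz + max_lead_hz, target_hz)

rotor = 0.0                          # motor starts stopped
while rotor < 199.9:
    drive = next_drive_freq(rotor, 200.0)
    rotor += 0.8 * (drive - rotor)   # crude stand-in for the rotor chasing the field
print(round(rotor, 1))               # ends near 200 Hz without ever stalling
```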
Without Hall sensors, the controller samples what is called the back-EMF from the motor windings to see how fast the motor is turning. As the motor turns, it generates a voltage (the back-EMF) that is proportional to its speed. Once the back-EMF rises to the same level as the driving voltage, the motor will go no faster.
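That ceiling is the KV relationship working in reverse; a quick check for an ideal motor:

```python
def back_emf_volts(rpm, kv):
    """Back-EMF grows with speed: roughly RPM / Kv."""
    return rpm / kv

print(back_emf_volts(12000, 1000))   # 12.0 V, equal to a 12 V supply, so
                                     # ~12,000 RPM is that motor's ceiling
```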
Hall sensors give the controller better feedback, so it can generate a drive signal that more closely matches the actual motor RPM, but they cost money. The back-EMF technique works, just not as well, especially under heavy load, such as when starting or when the load suddenly increases; there the motor can stall much more easily without the real physical feedback the Hall sensors provide.
Hope this helps a bit.