Nicely presented, but pretty much old news if you are an electronics geek.

I suspect the ESR meter is simply a load tester: it measures the unloaded voltage, applies a fixed and precise load to the pack/cell, measures the loaded voltage, and calculates the ESR from the voltage sag and the load current (ESR = ΔV / I).
However, that meter doesn't take two factors into account: ESR changes with load, and it changes over the cell's discharge cycle. The presenter says a battery is a voltage source with a small series resistance in one package. That's true, but it gives the impression that the "resistor" is static, and in the real world it isn't.
The ESR changes a little with the size of the load. If you measure ESR with a light load (say 100Ω) and again with a heavy load (say 1Ω), you'll get different values. Not a whole lot of difference, but it does vary.
Measuring ESR at heavy loads is problematic, though: at high currents the cell heats up (because of the power it's dissipating), and ESR changes with temperature (internal resistance is temperature-dependent). When measuring at high currents, you'd have to take the measurement quickly, with adequate rest periods in between so the cell can cool back down and stay at a fairly constant temperature.
Also, keep in mind that ESR measured at the beginning of the discharge cycle will be lower than ESR measured near the end (near LVC). To get a truer value, either measure ESR at several points during the discharge cycle and average the results, or measure it where the voltage curve is flattest (usually around the nominal voltage, somewhere in the middle of the discharge cycle).
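To make the averaging idea concrete, here's a rough sketch. The (unloaded, loaded) voltage pairs are invented example readings, not values from the video:

```python
# Hypothetical (unloaded_V, loaded_V) readings taken at several points in
# the discharge cycle, all measured into the same known load resistance.
readings = [(4.15, 4.00), (3.80, 3.66), (3.70, 3.55), (3.45, 3.28)]
load_ohms = 1.0  # assumed load resistor value

# ESR at each point = voltage sag / load current (I = loaded_V / load_ohms)
esrs = [(u - l) / (l / load_ohms) for u, l in readings]

avg_esr = sum(esrs) / len(esrs)
print(esrs)     # note the ESR creeping up as the cell discharges
print(avg_esr)  # averaged value over the discharge cycle
```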
You can measure ESR at home fairly easily with a single calibrated meter, the same way the ESR meter in the video does:
- Measure unloaded voltage
- Briefly apply a known resistance while measuring the loaded voltage
- Calculate the ESR: (unloaded_voltage - loaded_voltage) / (loaded_voltage / load_resistance)
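The steps above boil down to a one-line calculation. Here's a minimal sketch; the 4.10 V / 3.95 V / 1Ω numbers are made-up example readings:

```python
def esr(unloaded_v, loaded_v, load_ohms):
    """ESR = voltage sag / load current, where I = loaded_v / load_ohms."""
    load_current = loaded_v / load_ohms
    return (unloaded_v - loaded_v) / load_current

# Example: a cell reads 4.10 V open-circuit and sags to 3.95 V into a 1 ohm load.
print(esr(4.10, 3.95, 1.0))  # ~0.038 ohms
```

Note the division by the load current: multiplying current by the voltage sag would give you watts dissipated in the ESR, not ohms.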