Automation of Time and Motion Studies – Beta release

When Watch&Time App started, the focus was on bringing a century-old technique into the 21st century. Given the advances in operating systems, computing power and memory in mobile devices over the past few years, there is now an opportunity to bring automation into play, in effect automating and streamlining an industrial engineering technique.

If you want to see what is possible, you can now download the beta, available for iPhone, iPad and Macs with M1 chips.

Here are some observations regarding on-device video activity analysis.

Firstly, it is very CPU- and memory-intensive, and therefore not recommended for older devices. Quite a bit of optimisation was done to be able to run it on an iPhone and iPad, and this impacts the accuracy of the model. Development is still underway to leverage the newest Apple devices with 8 or 16 GB of memory, which will lead to better accuracy. Speaking of accuracy: the machine learning model is still under active development. The automation is therefore best treated as a first-run pass over a video, as the best accuracy achievable with video models at present is only around 70-80%. This will improve as better architectures are developed.
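To illustrate what a "first-run pass" could look like, here is a minimal sketch of turning per-frame activity predictions from a video model into timed segments for a time study. This is not the app's actual code; the function name, the data shape, and the 0.7 confidence threshold are all assumptions for illustration.

```python
def segment_activities(frame_predictions, fps=30.0, min_confidence=0.7):
    """Collapse per-frame (label, confidence) predictions into
    (label, start_seconds, end_seconds) segments.

    Frames below min_confidence are skipped, so an uncertain stretch
    continues the current segment instead of splitting it -- one reason
    a ~70-80% accurate model still yields a usable first pass.
    """
    segments = []
    current_label, start_frame = None, 0
    for i, (label, confidence) in enumerate(frame_predictions):
        if confidence < min_confidence:
            continue  # treat low-confidence frames as a continuation
        if label != current_label:
            if current_label is not None:
                segments.append((current_label, start_frame / fps, i / fps))
            current_label, start_frame = label, i
    if current_label is not None:
        segments.append(
            (current_label, start_frame / fps, len(frame_predictions) / fps)
        )
    return segments

# Hypothetical output of a video model at 30 fps: 2 s of "reach",
# an uncertain stretch, then 3 s of "assemble".
preds = [("reach", 0.9)] * 60 + [("grasp", 0.5)] * 10 + [("assemble", 0.85)] * 90
for label, start, end in segment_activities(preds):
    print(f"{label}: {start:.2f}s - {end:.2f}s")
```

An engineer would then review and correct these segments by hand, which is also how tagged training videos could be produced.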

Secondly, Apple's chips are really fast at machine learning predictions. A 2019 iPad Pro outperforms an Intel MacBook Pro by a factor of 2-3x, which is remarkable. We haven't tested it on an M1 Ultra yet, but we expect even higher performance there. If anyone has a new Mac Studio, please download the beta and test it.

Thirdly, there have been a few requests for an Android version of Watch&Time. While development is currently focused elsewhere, with the advent of Windows 11, which can run Android apps, an Android version is definitely being investigated, as it would open the app to more devices and users.

So, what is next? Logically, we need to optimise the model and the code to bring them to a level of accuracy suitable for daily use. We are also looking at offering multiple options for analysing video, meaning on device or via the cloud. We are also looking at the ability to share your data and results across devices: if you run the analysis on your new Mac Studio, it would be great to see the results on your phone once it is done.

Lastly, we do need help making the ML model more accurate. So if you want the app for free (not the beta) and are willing to share tagged videos of time and motion studies, please contact us.
