table is that it's the Unix time difference between the start and end of a query. However, does that imply it's the difference for only the most recent execution of the query? Further, are both values in milliseconds, and would I have to divide by to understand its performance profile?