# code-review
@João Godinho The integration test is meant to have the table use its logic, beyond also testing the schema, so if you mean filling the table with dummy data, no. That being said, if there's a way to generate that data by launching commands, that's fine, I would just call them in the test. But I think there might be a second issue. I'm not sure what's missing, but the Linux distro we run on the CI is running inside Docker; I would've expected to at least see one login from the Azure user into the Docker container, but… yeah
Hi @Stefano Bonicatti, thanks for the feedback. I’ve pulled the container and indeed
is empty, it doesn't seem to trigger events. I'm a bit lost on how to proceed from here. You were suggesting providing a dummy
file to the container?
Yes, though looking at it more closely I see that it's possible to write into the wtmpx file with its APIs, namely pututxline
In general it would be better if the system itself is not changed, even if it's a test… (so for instance having the table read a file on another path), but the API used expects to operate on a file that the system uses
so point it to a tmp file and populate it with some random data?
I was saying not to bother pointing it to something else, because it seemed that calling utmpname would change it for other processes too, but I misread
So, if we can generate the test data it would be better, so we have more choice vs giving a prefilled file
Which seems possible using utmpxname to point to a custom file and then pututxline to write it.
That's what I was thinking: modifying it via utmpxname and populating it via the API
I think the only way right now would be to use a global variable, to extern and overwrite it in the test
hum… shouldn’t the implementation itself use
instead of hard-coding it?
It could be better yes, but you still need to specify it, because the default is
So it would point to the currently logged in users, not the past
I’ve been tinkering around, and found two options: setting the
(btw the default for it is already
) via cmake, which I tried, but my experience with cmake is 0, so I couldn't make it work yet. The other option is to create tests like the ones for
, the problem is identical and they create some tmp dirs and test the functions directly (not the output of a query)
Well, that would make it more of a unit test, because it would test the table directly and "only". But that's fine; depending on how much you want to work on it, the best would be to refactor the table a bit so that you can test the logic directly instead of just checking the
because when something doesn’t work, you would only get an empty value in the column, which doesn’t necessarily mean that’s wrong
what I mean is that if you check the Row, you cannot distinguish between a failure and something that simply contains an empty value
as for the define, that's not possible to do, because tests do not recompile that code; the tables are in a library… so the define is already chosen
and it’s the system one, you cannot change it for the test only
I think it's best to refactor the implementation then, given that the table may indeed be empty.
I thought the define could be used, since everything is compiled during the same run, and one could use a flag to either modify or not a preprocessor definition
Anyway, do you agree with the refactoring as a good option to test the parsing of the file? Not the content of the table itself
I can slightly modify the current table test to only test if it has rows, no harm done I guess, as builds outside containers do populate the table
But that’s the problem
Either you compile the table code to be used for tests or you compile it for production
In any case, accessing results (and errors) directly, not through Row, makes for a better test, I believe, yes. The integration test is still valuable to verify the schema and the SQL part, but it doesn't really verify the table logic
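One possible shape of that refactor (all names here are hypothetical, not the actual osquery code): pull the utmpx reading into a function that takes an explicit path and returns a status, so a unit test can feed it a file it populated itself and tell "failed" apart from "legitimately empty".

```cpp
// Hypothetical refactor sketch: the parsing logic takes an explicit path
// and reports failure, instead of reading the system wtmpx inside the
// table generator. Struct and function names are illustrative only.
#include <cstring>
#include <string>
#include <vector>
#include <unistd.h>
#include <utmpx.h>

struct LastEntry {
  std::string user;
  std::string line;
  std::string host;
  long time = 0;
};

// Copy a fixed-size, possibly non-NUL-terminated utmpx char field.
static std::string fieldToString(const char* field, size_t size) {
  return std::string(field, strnlen(field, size));
}

// Testable core: read entries from an arbitrary utmpx-format file.
// Returning false on failure lets a test tell "error" from "empty file".
bool parseUtmpxFile(const std::string& path, std::vector<LastEntry>& out) {
  if (access(path.c_str(), R_OK) != 0) {
    return false;  // unreadable/missing file is an error, not "no rows"
  }
  if (utmpxname(path.c_str()) != 0) {  // GNU extension, glibc-only
    return false;
  }
  setutxent();
  for (struct utmpx* ut = getutxent(); ut != nullptr; ut = getutxent()) {
    if (ut->ut_type == USER_PROCESS || ut->ut_type == DEAD_PROCESS) {
      LastEntry e;
      e.user = fieldToString(ut->ut_user, sizeof(ut->ut_user));
      e.line = fieldToString(ut->ut_line, sizeof(ut->ut_line));
      e.host = fieldToString(ut->ut_host, sizeof(ut->ut_host));
      e.time = ut->ut_tv.tv_sec;
      out.push_back(e);
    }
  }
  endutxent();
  return true;
}
```

The table generator then stays a thin wrapper that calls this on the system file and converts entries to Rows, while the unit test calls it on a temp file it filled via pututxline and asserts on the returned entries and status directly.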
I’ll see if I can refactor the code today
thanks for all the help 🙏
Thanks for your effort! :)