Makefile: build proto files unconditionally #229
base: main
Conversation
Force-pushed cc8b948 to 840defb
@marquiz Is this a workaround for a problem with the local development workflow, or a workaround for the occasionally recurring 'protoc not found' problem in our GitHub CI?
This is a workaround (or I would say a fix) for the local development workflow, doing
Force-pushed 840defb to 0ebefc6
ping @klihub
@marquiz Would you be okay with a milder version of this, for instance something like klihub@486e315? That allows forcing a rebuild of the protobufs, and it always forces it in the dockerized rebuild, so in your workflow it will always rebuild.
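For context, a minimal sketch of what such an opt-in force knob could look like. The names here (FORCE_PROTO, PROTOC, DOCKER, BUILDER_IMAGE, the pkg/api paths) are illustrative assumptions, not the repository's actual Makefile variables or targets:

```makefile
# Sketch only: FORCE_PROTO, PROTOC, DOCKER and BUILDER_IMAGE are illustrative
# names, not the repository's actual Makefile variables.
PROTO_SOURCES := $(wildcard pkg/api/*.proto)
PROTO_GOFILES := $(patsubst %.proto,%.pb.go,$(PROTO_SOURCES))

# With FORCE_PROTO=1 every .pb.go gains a phony prerequisite, so it is always
# regenerated; otherwise the normal timestamp comparison applies.
ifeq ($(FORCE_PROTO),1)
.PHONY: proto-force
proto-force:
PROTO_FORCE_DEP := proto-force
endif

%.pb.go: %.proto $(PROTO_FORCE_DEP)
	$(PROTOC) --go_out=paths=source_relative:. $<

# The dockerized rebuild always passes FORCE_PROTO=1, so it regenerates
# everything regardless of timestamps.
build-proto-containerized:
	$(DOCKER) run --rm -v $(PWD):/src -w /src $(BUILDER_IMAGE) \
		$(MAKE) $(PROTO_GOFILES) FORCE_PROTO=1
```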
I think that would work. TBH, I just don't get the insistence on these pattern rules that fundamentally don't work in this setup (when both the sources and targets are stored in git). I kind of get it from a philosophical "this is the way make was designed to be used" point of view, but if it doesn't work, why bother? The amount of wasted build time should certainly not be an issue; I'd guess the rule doesn't get run very often even globally.
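For readers outside this thread, the pattern rule under discussion is roughly of the following shape (paths and variable names are illustrative, not the repo's actual Makefile):

```makefile
# Roughly the shape of rule being debated (paths illustrative). It only fires
# when the .proto is newer than the generated .pb.go; since git does not
# preserve modification times, a fresh clone or a rebase can leave the
# timestamps in either order, so the rule may fire needlessly or not at all.
pkg/api/%.pb.go: pkg/api/%.proto
	$(PROTOC) --go_out=paths=source_relative:. $<
```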
Both the sources and targets are stored in git, but they are supposed to be committed consistently: if you change the protobuf definition, you should recompile and commit the compiled versions. Now I understand that right after a fresh clone the timestamps might be off, but IMO (and in my experience) that should only cause unnecessary recompilation and not the other way around (since the protobuf definition and the compiled protobuf files should always be committed consistently in the repo). I have seen this happen occasionally in CI. But I don't remember ever running into a situation in my local workflow where I changed the protobuf definition yet it was not recompiled. And to be completely honest, it might very well be that it has happened to me and I just shrugged it off with a
@marquiz That's not my reason/primary concern here. I'm more wary of folks having randomly differing and newer versions of protoc et al. installed natively instead of the (fairly old) version we use (in sync with containerd). That then guarantees with 100% probability that a recompile will not produce bit-for-bit identical output, so it will result in false changes. If folks then are not extra careful, commit all their changes, and file PRs from that, we need to point out that parts of the PR are not real changes and should be reverted. We have seen this already in the past, even without a forced recompile by
It doesn't matter how you commit, then, as git will mangle the timestamps anyway. This is particularly annoying and bites you when you have feature branches that you need to rebase (and e.g. happen to have conflicts).
I think this is not the correct way to try to prevent that. If this is a concern, we should change the all:/build: targets to depend on build-proto-containerized to get reproducible builds (see the sketch below).
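A rough sketch of that alternative; build-proto-containerized is a target name used in this thread, while the recipe bodies and the GO/DOCKER/BUILDER_IMAGE variables are assumptions for illustration:

```makefile
# Sketch: make the top-level build depend on containerized proto generation,
# so every build regenerates the .pb.go files with the pinned protoc version
# and the output does not depend on whatever protoc the host has installed.
all: build

build: build-proto-containerized
	$(GO) build ./...

build-proto-containerized:
	$(DOCKER) run --rm -v $(PWD):/src -w /src $(BUILDER_IMAGE) $(MAKE) proto
```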
If you have a conflict in a compiled protobuf file and you try to resolve it manually, you are really doing something wrong. The right way to resolve such a conflict is to resolve any conflicts you might have in the proto source, remove all generated pb.go files with conflicts, then run make and let it regenerate the missing compiled proto files (and all others that come from the same compilation).
@marquiz Well, this is also true. And if I am the only one who thinks that unconditionally regenerating and recompiling everything in the local development workflow is not the right way to go, then I realize I should simply give up and accept it. And if we go with that, then I agree that a dockerized proto build is friendlier.
I beg to differ here. I don't want (and shouldn't need) to manually resolve or remove anything. I resolve conflicts in the .proto file (if any), run "regenerate auto-generated stuff", and it should just work.
@marquiz Well, if in #232 we get to avoid dockerized proto builds, then I'm fine with this and with always rebuilding.
Ditch the Makefile pattern rule, which simply doesn't work in scenarios where both the sources and the build targets are stored in the git repo (as git operations mangle the timestamps). So much grey hair avoided when the stuff just force-generates everything instead of leaving you wondering why the tools didn't work as expected. Signed-off-by: Markus Lehtonen <[email protected]>
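In Makefile terms, building the proto files unconditionally amounts to something like the sketch below; target and variable names are illustrative, not necessarily this PR's exact diff:

```makefile
# Instead of a timestamp-driven pattern rule, use a phony target that
# regenerates every .pb.go on each invocation.
.PHONY: proto
proto:
	$(PROTOC) --go_out=paths=source_relative:. $(wildcard pkg/api/*.proto)

all: proto
```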
Force-pushed 0ebefc6 to 9623748
Not needed anymore as we install protoc locally in the source tree. Signed-off-by: Markus Lehtonen <[email protected]>
Force-pushed 5db759b to f880e1c
Sure, done. I don't think the dockerized build even worked anymore, as we mounted the source tree read-only (ro) inside the build...
LGTM