* Failures in GitHub Actions linux-leaks and linux-asan-ubsan
@ 2024-03-16 19:20 Philippe Blain
From: Philippe Blain @ 2024-03-16 19:20 UTC
To: Git mailing list
Hi everyone,
You might have noticed that the linux-leaks and linux-asan-ubsan
jobs in GitHub Actions started failing last week. I investigated
this, so I'm sharing my findings in case they help.
The failures are due to the new ubuntu-22.04 GitHub Actions image
(release 20240310.1.0, [1]) which uses a kernel where ASLR is configured
in a way that is incompatible with ASan and LSan as used in
the GCC and Clang versions in that image. More info can be found
in [2] and [3] and pages linked there.
A workaround was already implemented in the image generation process
[4], so the next version of the image should work. I think the images
are released weekly. We could maybe add the same sysctl command to reduce
the entropy to our YAML file, or we could live with it for the next week
or so while waiting for the next image to roll out.
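For reference, the workaround in [4] lowers the kernel's mmap
randomization entropy via sysctl. If we did want the same stopgap in
our own workflow, it could look something like the sketch below (the
step name and placement are made up; vm.mmap_rnd_bits=28 is the value
the runner-images fix applies):

```yaml
# Hypothetical extra step, added before the build/test steps of the
# linux-leaks and linux-asan-ubsan jobs in our workflow file:
- name: reduce ASLR entropy so ASan/LSan can work
  run: sudo sysctl -w vm.mmap_rnd_bits=28
```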
Cheers,
Philippe.
[1] https://github.com/actions/runner-images/commit/d67fa31aeeec3cf0d666d0eb2976b683471c6b90
[2] https://github.com/actions/runner-images/issues/9491#issuecomment-1989718917
[3] https://github.com/actions/runner-images/issues/9524#issuecomment-2002065399
[4] https://github.com/actions/runner-images/pull/9513
* Re: Failures in GitHub Actions linux-leaks and linux-asan-ubsan
From: Junio C Hamano @ 2024-03-16 23:31 UTC
To: Philippe Blain; +Cc: Git mailing list
Philippe Blain <levraiphilippeblain@gmail.com> writes:
> Hi everyone,
>
> You might have noticed that the linux-leaks and linux-asan-ubsan
> jobs in GitHub Actions started failing last week. I investigated
> this, so I'm sharing my findings in case they help.
>
> The failures are due to the new ubuntu-22.04 GitHub Actions image
> (release 20240310.1.0, [1]) which uses a kernel where ASLR is configured
> in a way that is incompatible with ASan and LSan as used in
> the GCC and Clang versions in that image. More info can be found
> in [2] and [3] and pages linked there.
>
> A workaround was already implemented in the image generation process
> [4], so the next version of the image should work. I think the images
> are released weekly. We could maybe add the same sysctl command to reduce
> the entropy to our YAML file, or we could live with it for the next week
> or so while waiting for the next image to roll out.
Ah, just what I needed, as I was looking at the CI results and
noticed these failed runs. Thanks.
* Re: Failures in GitHub Actions linux-leaks and linux-asan-ubsan
From: Linus Arver @ 2024-03-18 6:52 UTC
To: Philippe Blain, Git mailing list
Philippe Blain <levraiphilippeblain@gmail.com> writes:
> Hi everyone,
>
> You might have noticed that the linux-leaks and linux-asan-ubsan
> jobs in GitHub Actions started failing last week. I investigated
> this, so I'm sharing my findings in case they help.
Huge thanks for looking into this, Philippe!
> The failures are due to the new ubuntu-22.04 GitHub Actions image
> (release 20240310.1.0, [1]) which uses a kernel where ASLR is configured
> in a way that is incompatible with ASan and LSan as used in
> the GCC and Clang versions in that image. More info can be found
> in [2] and [3] and pages linked there.
>
> A workaround was already implemented in the image generation process
> [4], so the next version of the image should work. I think the images
> are released weekly. We could maybe add the same sysctl command to reduce
> the entropy to our YAML file, or we could live with it for the next week
> or so while waiting for the next image to roll out.
>
> Cheers,
>
> Philippe.
>
> [1] https://github.com/actions/runner-images/commit/d67fa31aeeec3cf0d666d0eb2976b683471c6b90
> [2] https://github.com/actions/runner-images/issues/9491#issuecomment-1989718917
> [3] https://github.com/actions/runner-images/issues/9524#issuecomment-2002065399
> [4] https://github.com/actions/runner-images/pull/9513
I appreciate the links you've documented here for reference (and the
care you took to write up [3]). You've got good taste. :)
* Re: Failures in GitHub Actions linux-leaks and linux-asan-ubsan
From: Jeff King @ 2024-03-18 9:08 UTC
To: Philippe Blain; +Cc: Git mailing list
On Sat, Mar 16, 2024 at 03:20:44PM -0400, Philippe Blain wrote:
> The failures are due to the new ubuntu-22.04 GitHub Actions image
> (release 20240310.1.0, [1]) which uses a kernel where ASLR is configured
> in a way that is incompatible with ASan and LSan as used in
> the GCC and Clang versions in that image. More info can be found
> in [2] and [3] and pages linked there.
>
> A workaround was already implemented in the image generation process
> [4], so the next version of the image should work. I think the images
> are released weekly. We could maybe add the same sysctl command to reduce
> the entropy to our YAML file, or we could live with it for the next week
> or so while waiting for the next image to roll out.
Thanks for digging into this! I had done a little but didn't get nearly
as far. I am happy I can just ignore it and the problem will resolve
itself. ;)
While I have the attention of folks who might be interested in CI
failures, let me hijack the thread for a moment: has anybody figured out
why macOS jobs sometimes time out after 6 hours? I assume that is an
Actions limit, and something is just hanging. It sometimes hits
osx-reftable[0], but sometimes osx-clang[1] and osx-gcc[2]. I've seen it
on my builds and some on git/git (the last one is from git/git).
It's hard to tell which test is hanging, because the output only shows
the finished tests. I tried running them without "prove" and doing it
one-by-one, but then the hang doesn't seem to occur (so presumably it's
a race under load). I tried comparing the list of tests reported as
finishing versus the total list, but there was nothing enlightening.
It's all stuff in t9xxx, which you'd expect to be running near the end
anyway (and it feels like if there was _one_ test hanging, we should
finish everything else, since we run with some parallelism). So it's
almost like a bug in "prove" or something. But AFAIK it just started
happening in the past month or two.
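(For concreteness, the finished-versus-total comparison I mean is just
set subtraction on the script names. With invented filenames it is
something like the following; the real lists would come from
"ls t/t[0-9]*.sh" and from grepping the CI log for completed tests:

```shell
# Toy illustration with made-up test names. Write the full list and
# the finished list, then print the tests that never reported a result.
printf 't9001-send-email.sh\nt9300-fast-import.sh\n' | sort >all-tests.txt
printf 't9001-send-email.sh\n' | sort >finished.txt
comm -23 all-tests.txt finished.txt
```

but as I said, nothing enlightening fell out of it.)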
-Peff
[0]: example run: https://github.com/peff/git/actions/runs/8107092556/job/22158038562
[1]: example run: https://github.com/peff/git/actions/runs/8091601551/job/22110929146
[2]: example run: https://github.com/git/git/actions/runs/8273234891/job/22636626511
Code repositories for project(s) associated with this public inbox
https://80x24.org/mirrors/git.git