
[X86Backend][M68KBackend] Make Ctx in X86MCInstLower (M68KInstLower) the same as AsmPrinter.OutContext #133352

Merged
merged 20 commits into from
Apr 5, 2025

Conversation

weiweichen
Contributor

@weiweichen weiweichen commented Mar 28, 2025

In X86MCInstLower::LowerMachineOperand, a new MCSymbol can be created by GetSymbolFromOperand(MO) when MO.getType() is MachineOperand::MO_ExternalSymbol:

  case MachineOperand::MO_ExternalSymbol:
    return LowerSymbolOperand(MO, GetSymbolFromOperand(MO));

at

Sym = Ctx.getOrCreateSymbol(Name);

However, this newly created symbol will not have its IsExternal field set properly, since Ctx.getOrCreateSymbol(Name) does not know that the new MCSymbol is for a MachineOperand::MO_ExternalSymbol.
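The failure mode can be sketched with a minimal toy model (ToySymbol and ToyContext below are illustrative stand-ins, not LLVM's real classes; the point is only that a plain name-based getOrCreateSymbol cannot record the caller's "this is external" intent, while a dedicated helper can):

```cpp
#include <cassert>
#include <map>
#include <memory>
#include <string>

// Toy stand-in for MCSymbol: IsExternal defaults to false on plain creation.
struct ToySymbol {
  std::string Name;
  bool IsExternal = false;
};

// Toy stand-in for MCContext: interns symbols by name.
struct ToyContext {
  std::map<std::string, std::unique_ptr<ToySymbol>> Symbols;
  // Mirrors Ctx.getOrCreateSymbol(Name): it only sees the name, so it
  // cannot tell that the symbol stands for an MO_ExternalSymbol operand.
  ToySymbol *getOrCreateSymbol(const std::string &Name) {
    auto &Slot = Symbols[Name];
    if (!Slot)
      Slot = std::make_unique<ToySymbol>(ToySymbol{Name});
    return Slot.get();
  }
};

// Mirrors the role of AsmPrinter::GetExternalSymbolSymbol: same lookup,
// but the caller's intent is recorded on the symbol.
ToySymbol *getExternalSymbolSymbol(ToyContext &OutContext,
                                   const std::string &Name) {
  ToySymbol *Sym = OutContext.getOrCreateSymbol(Name);
  Sym->IsExternal = true;
  return Sym;
}
```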

Looking at other backends: for example, here is what AArch64MCInstLower does to handle MO_ExternalSymbol

case MachineOperand::MO_ExternalSymbol:
MCOp = LowerSymbolOperand(MO, GetExternalSymbolSymbol(MO));

MCSymbol *
AArch64MCInstLower::GetExternalSymbolSymbol(const MachineOperand &MO) const {
return Printer.GetExternalSymbolSymbol(MO.getSymbolName());
}

It creates/gets the MCSymbol from AsmPrinter.OutContext instead of from Ctx. Moreover, Ctx for AArch64MCInstLower is the same as AsmPrinter.OutContext:

: AsmPrinter(TM, std::move(Streamer)), MCInstLowering(OutContext, *this),
This applies to almost all the other backends except X86 and M68k.

$ git grep "MCInstLowering("
lib/Target/AArch64/AArch64AsmPrinter.cpp:100:      : AsmPrinter(TM, std::move(Streamer)), MCInstLowering(OutContext, *this),
lib/Target/AMDGPU/AMDGPUMCInstLower.cpp:223:  AMDGPUMCInstLower MCInstLowering(OutContext, STI, *this);
lib/Target/AMDGPU/AMDGPUMCInstLower.cpp:257:  AMDGPUMCInstLower MCInstLowering(OutContext, STI, *this);
lib/Target/AMDGPU/R600MCInstLower.cpp:52:  R600MCInstLower MCInstLowering(OutContext, STI, *this);
lib/Target/ARC/ARCAsmPrinter.cpp:41:        MCInstLowering(&OutContext, *this) {}
lib/Target/AVR/AVRAsmPrinter.cpp:196:  AVRMCInstLower MCInstLowering(OutContext, *this);
lib/Target/BPF/BPFAsmPrinter.cpp:144:    BPFMCInstLower MCInstLowering(OutContext, *this);
lib/Target/CSKY/CSKYAsmPrinter.cpp:41:    : AsmPrinter(TM, std::move(Streamer)), MCInstLowering(OutContext, *this) {}
lib/Target/Lanai/LanaiAsmPrinter.cpp:147:  LanaiMCInstLower MCInstLowering(OutContext, *this);
lib/Target/Lanai/LanaiAsmPrinter.cpp:184:  LanaiMCInstLower MCInstLowering(OutContext, *this);
lib/Target/MSP430/MSP430AsmPrinter.cpp:149:  MSP430MCInstLower MCInstLowering(OutContext, *this);
lib/Target/Mips/MipsAsmPrinter.h:126:      : AsmPrinter(TM, std::move(Streamer)), MCInstLowering(*this) {}
lib/Target/WebAssembly/WebAssemblyAsmPrinter.cpp:695:    WebAssemblyMCInstLower MCInstLowering(OutContext, *this);
lib/Target/X86/X86MCInstLower.cpp:2200:  X86MCInstLower MCInstLowering(*MF, *this);

This patch makes X86MCInstLower and M68kMCInstLower get their Ctx from AsmPrinter.OutContext instead of from MF.getContext(), to be consistent with all the other backends.

Since the normal use case (probably anything other than our unconventional one) handles a single LLVM module all the way through the codegen pipeline to the end of code emission (AsmPrinter), AsmPrinter.OutContext is the same as the MachineFunction's MCContext, so I think this change is an NFC.


This fixes an error while running the generated code in ORC JIT for our use case with MCLinker (see more details below):
#133291 (comment)

We (Mojo) are trying to do MC-level linking: we break an LLVM module into multiple submodules to compile and codegen in parallel (technically into *.o files with symbol linkage type changes), but instead of archiving all of them into one .a file, we want to fix up the symbol linkage types and still produce one *.o file. The parallel codegen pipeline generates its codegen data structures in per-submodule MCContexts (which is Ctx here). So if functions f and g are split into different submodules, they will have different Ctxs, and when we try to create an external symbol with the same name for each of them with Ctx.getOrCreateSymbol(SymName), we will get two different MCSymbol* values, because f's and g's MCContexts are different and can't see each other. This is unfortunately not what we want for external symbols. Using AsmPrinter.OutContext helps: since it is shared, if we get or create the MCSymbol there, we are able to deduplicate.
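The two-context problem above can be sketched with a minimal interning table, where, as with MCSymbol* in LLVM, pointer identity is what matters (SymbolTable and getOrCreateSymbol here are illustrative stand-ins, not LLVM APIs):

```cpp
#include <cassert>
#include <map>
#include <string>

// A "context" is just a table that owns one object per name. Looking up
// the same name twice in one table returns the same address; looking it
// up in two different tables returns two different addresses.
using SymbolTable = std::map<std::string, int>; // value unused; the address is the identity

int *getOrCreateSymbol(SymbolTable &Ctx, const std::string &Name) {
  return &Ctx[Name]; // default-constructs on first lookup, stable address after
}
```

With per-submodule tables for f and g, the external symbol "callee" gets two distinct addresses; with one shared output table, lookups deduplicate to a single address.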


github-actions bot commented Mar 28, 2025

✅ With the latest revision this PR passed the C/C++ code formatter.

@weiweichen weiweichen requested a review from npanchen March 28, 2025 02:32
@weiweichen
Contributor Author

weiweichen commented Mar 28, 2025

Verified that this change won't negatively impact this issue #132055 (comment)

source_filename = "text"
target datalayout = "e-m:o-p270:32:32-p271:32:32-p272:64:64-i64:64-i128:128-f80:128-n8:16:32:64-S128"
target triple = "x86_64-apple-macosx10.14.0-macho"

declare i16 @julia_float_to_half(float)

define internal i16 @__truncsfhf2(float %0) {
  %2 = call i16 @julia_float_to_half(float %0)
  ret i16 %2
}

define hidden swiftcc half @julia_fp(float %0) {
  %2 = fptrunc float %0 to half
  ret half %2
}

@llvm.compiler.used = appending global [1 x ptr] [ptr @__truncsfhf2], section "llvm.metadata"
$ llc lala.ll -filetype=obj -o llvm20.o
$ llvm-objdump llvm20.o -t

llvm20.o:       file format mach-o 64-bit x86-64

SYMBOL TABLE:
0000000000000000 l     F __TEXT,__text ___truncsfhf2
0000000000000010 g     F __TEXT,__text .hidden _julia_fp
0000000000000000         *UND* _julia_float_to_half

__truncsfhf2 is still l (local).

@MaskRay
Member

MaskRay commented Mar 28, 2025

This solution appears to potentially mask an underlying issue.

Can you state the motivation behind the change?

The first few paragraphs only describe "what" the scenario is, but doesn't really state your end goal, and how this helps your MC Linker effort.

In X86MCInstLower::LowerMachineOperand, a new MCSymbol can be created in GetSymbolFromOperand(MO) where MO.getType() is MachineOperand::MO_ExternalSymbol

It would be beneficial to broaden the review process, especially when the reviewer selected is closely associated and neither of you has prior experience with this specific code area. (#108880 landed quickly within four hours. Considering the reviewer's close association, a wider review might offer additional perspectives.)

@weiweichen
Contributor Author

weiweichen commented Mar 28, 2025

This solution appears to potentially mask an underlying issue.

Can you state the motivation behind the change?

The first few paragraphs only describe "what" the scenario is, but doesn't really state your end goal, and how this helps your MC Linker effort.

In X86MCInstLower::LowerMachineOperand, a new MCSymbol can be created in GetSymbolFromOperand(MO) where MO.getType() is MachineOperand::MO_ExternalSymbol

So for the MC Linker, the main difference from the normal codegen flow is that we AsmPrint functions that start out in one LLVM module, are split into submodules for parallel codegen, and are then emitted through a single AsmPrinter. The parallel codegen pipeline generates its codegen data structures in per-submodule MCContexts (which is Ctx here). So if functions f and g are split into different submodules, they will have different Ctxs. If we try to create an external symbol with the same name for each of them with Ctx.getOrCreateSymbol(SymName), we will get two different MCSymbol* values because f's and g's MCContexts are different. This is not what we want for external symbols. Using AsmPrinter.OutContext helps: since it is shared, if we get or create the MCSymbol there, we are able to deduplicate.

Since the normal use case (probably anything other than our unconventional one) handles a single LLVM module all the way through the whole codegen pipeline, there is no need to deduplicate, and this change is probably just an NFC.

Let me add this to the PR description as well!

It would be beneficial to broaden the review process, especially when the reviewer selected is closely associated and neither of you has prior experience with this specific code area. (#108880 landed quickly within four hours. Considering the reviewer's close association, a wider review might offer additional perspectives.)

Totally agree. The PR is a draft now so that I can put out the code to discuss the solution with @npanchen without incurring too much noise for others. Once it is "Ready for review", I'll add folks who gave us feedback on the previous PR and issue as reviewers.

(Sorry for being pedantic here, but in my defense, #108880 took 4 days to land instead of 4 hours; it was 4 hours after approval.) The PR notified core maintainers like llvm/pr-subscribers-backend-x86 and requested review from folks who are not close to our org, based on GitHub's suggestions, but I didn't get any comments from anyone other than a post-merge one about adding a test. Sorry that I probably missed/forgot that comment. I fully agree that I should have added a test to #108880 to show our intention for the change and verify it. On the other hand, there are numerous existing x86 backend tests, and none of them being broken by the change is a form of testing, no?

@weiweichen weiweichen marked this pull request as ready for review March 28, 2025 14:47
@llvmbot
Member

llvmbot commented Mar 28, 2025

@llvm/pr-subscribers-backend-m68k

@llvm/pr-subscribers-backend-x86

Author: weiwei chen (weiweichen)

Changes

In X86MCInstLower::LowerMachineOperand, a new MCSymbol can be created by GetSymbolFromOperand(MO) when MO.getType() is MachineOperand::MO_ExternalSymbol:

  case MachineOperand::MO_ExternalSymbol:
    return LowerSymbolOperand(MO, GetSymbolFromOperand(MO));

at

Sym = Ctx.getOrCreateSymbol(Name);

However, this newly created symbol will not have its IsExternal field set properly, since Ctx.getOrCreateSymbol(Name) does not know that the new MCSymbol is for a MachineOperand::MO_ExternalSymbol.

Use AsmPrinter.GetExternalSymbolSymbol(Name) instead for this case so that we actually create a proper MCSymbol that is external. This is similar to what AArch64MCInstLower does to handle MO_ExternalSymbol

case MachineOperand::MO_ExternalSymbol:
MCOp = LowerSymbolOperand(MO, GetExternalSymbolSymbol(MO));

MCSymbol *
AArch64MCInstLower::GetExternalSymbolSymbol(const MachineOperand &MO) const {
return Printer.GetExternalSymbolSymbol(MO.getSymbolName());
}

Note that the critical point here is that the symbol for MO_ExternalSymbol is created/retrieved from AsmPrinter.OutContext instead of from Ctx.

This fixes an error while running the generated code in ORC JIT for our use case with MCLinker (see more details below):
#133291 (comment)

We (Mojo) are trying to do MC-level linking: we break an LLVM module into multiple submodules to compile and codegen in parallel (technically into *.o files with symbol linkage type changes), but instead of archiving all of them into one .a file, we want to fix up the symbol linkage types and still produce one *.o file. The parallel codegen pipeline generates its codegen data structures in per-submodule MCContexts (which is Ctx here). So if functions f and g are split into different submodules, they will have different Ctxs, and when we try to create an external symbol with the same name for each of them with Ctx.getOrCreateSymbol(SymName), we will get two different MCSymbol* values, because f's and g's MCContexts are different and can't see each other. This is unfortunately not what we want for external symbols. Using AsmPrinter.OutContext helps: since it is shared, if we get or create the MCSymbol there, we are able to deduplicate.

Since the normal use case (probably anything other than our unconventional one) handles a single LLVM module all the way through the whole codegen pipeline, there is no need to deduplicate, and this change is probably just an NFC.


Full diff: https://github.com/llvm/llvm-project/pull/133352.diff

1 Files Affected:

  • (modified) llvm/lib/Target/X86/X86MCInstLower.cpp (+9-2)
diff --git a/llvm/lib/Target/X86/X86MCInstLower.cpp b/llvm/lib/Target/X86/X86MCInstLower.cpp
index 3f6cd55618666..c93a2879a057f 100644
--- a/llvm/lib/Target/X86/X86MCInstLower.cpp
+++ b/llvm/lib/Target/X86/X86MCInstLower.cpp
@@ -192,8 +192,15 @@ MCSymbol *X86MCInstLower::GetSymbolFromOperand(const MachineOperand &MO) const {
   }
 
   Name += Suffix;
-  if (!Sym)
-    Sym = Ctx.getOrCreateSymbol(Name);
+  if (!Sym) {
+    // If new MCSymbol needs to be created for
+    // MachineOperand::MO_ExternalSymbol, create it as a symbol
+    // in AsmPrinter's OutContext.
+    if (MO.isSymbol())
+      Sym = AsmPrinter.OutContext.getOrCreateSymbol(Name);
+    else
+      Sym = Ctx.getOrCreateSymbol(Name);
+  }
 
   // If the target flags on the operand changes the name of the symbol, do that
   // before we return the symbol.

@weiweichen weiweichen requested a review from topperc March 28, 2025 14:47
@weiweichen
Contributor Author

weiweichen commented Mar 28, 2025

This is another attempt to address the issue that was poorly addressed in #108880 🤦‍♀️. I'm adding more explanation of the motivation/problem and the rationale for the change. I'd appreciate suggestions on how to add a test, in addition to existing tests not being broken by this change 🙏

(Update) Working on adding a C++ unittest.

@weiweichen weiweichen marked this pull request as draft March 28, 2025 18:40
@efriedma-quic
Collaborator

(sorry for being pedantic here) But to my defense, #108880 took 4 days to land instead of 4 hours (it was 4 hours after approval). The PR notified core maintainers like llvm/pr-subscribers-backend-x86 and requested review from folks who are not close to our org based on GitHub suggestion, but I didn't get any comments from anyone other than a post merge one about adding a test. Sorry that I probably missed/forgot that comment. Fully agree that I should have added a test to #108880 to indicate our intention for the change+verify the change.

I don't want to blame anyone here. The review process for patches to LLVM relies, to a large extent, on code authors exercising good judgement, and other people catching mistakes. And new contributors often get tripped up here.

So, the patch probably shouldn't have been merged, but it's not a big deal. Reverts are easy enough. And we'll make sure everyone is on the same page going forward.


If we are going to make changes here, I'd like to try to ensure consistency across targets. And that whatever API usage rule you want to impose for the sake of your out-of-tree code actually makes sense; if there's a general rule we should follow for creating MCSymbols from the AsmPrinter, we should document it and make sure we consistently follow it.

@weiweichen
Contributor Author

weiweichen commented Mar 29, 2025

If we are going to make changes here, I'd like to try to ensure consistency across targets. And that whatever API usage rule you want to impose for the sake of your out-of-tree code actually makes sense; if there's a general rule we should follow for creating MCSymbols from the AsmPrinter, we should document it and make sure we consistently follow it.

This is a very good point!

I did some grepping and found that most backends actually already use the AsmPrinter to create the MCSymbol for MO_ExternalSymbol; GetExternalSymbolSymbol creates the MCSymbol from AsmPrinter.OutContext.

case MachineOperand::MO_ExternalSymbol:
MCOp = LowerSymbolOperand(MO, GetExternalSymbolSymbol(MO));

case MachineOperand::MO_ExternalSymbol:
Symbol = Printer.GetExternalSymbolSymbol(MO.getSymbolName());

case MachineOperand::MO_ExternalSymbol:
MCOp = LowerSymbolOperand(MO, GetExternalSymbolSymbol(MO));

case MachineOperand::MO_ExternalSymbol:
MCOp = LowerSymbolOperand(MO, GetExternalSymbolSymbol(MO));

case MachineOperand::MO_ExternalSymbol:
Symbol = AsmPrinter.GetExternalSymbolSymbol(MO.getSymbolName());

MCContext &Ctx = AP.OutContext;
MCSymbol *Sym = Ctx.getOrCreateSymbol(Name);

case MachineOperand::MO_ExternalSymbol:
Symbol = Printer.GetExternalSymbolSymbol(MO.getSymbolName());

case MachineOperand::MO_ExternalSymbol:
Symbol = GetExternalSymbolSymbol(MO.getSymbolName());

With only one exception (in addition to X86): the M68k backend. I'm adding the change for the M68k backend as well.

Is there another place to document this other than comments in the code?

@weiweichen
Contributor Author

Added a C++ unit test as well.

@weiweichen weiweichen marked this pull request as ready for review March 29, 2025 14:44
@weiweichen
Contributor Author

Ready for review for real now 🙏 !

if (MO.isSymbol())
Sym = AsmPrinter.OutContext.getOrCreateSymbol(Name);
else
Sym = Ctx.getOrCreateSymbol(Name);
Collaborator

Oh. This is starting to make more sense, now. Normally, there's only one MCContext, but because you're doing this "parallel codegen" thing, you're passing a different MCContext to construct the MachineFunction... and the coupling between a symbol and its context is loose enough that you can pass a symbol from the wrong MCContext to the AsmPrinter, and sort of get away with it.

If preventing that is the goal, why not change X86MCInstLower::X86MCInstLower instead? Just change Ctx(mf.getContext()) to Ctx(asmprinter.OutContext).

That said, more generally, if you want everything to work reliably, you'll probably need to completely eliminate the MCContext reference from MachineFunction. Otherwise, you'll continue to run into issues where the MCSymbol is from the wrong context. Maybe doable, but involves a significant amount of refactoring to avoid constructing MCSymbols early... and I'm not sure if there are any other weird interactions here.

Contributor Author

If preventing that is the goal, why not change X86MCInstLower::X86MCInstLower instead? Just change Ctx(mf.getContext()) to Ctx(asmprinter.OutContext).

Hmmm, this probably won't work (well, it would work for getting a unique MCSymbol for MO_ExternalSymbol), because we still need mf.getContext() for most of the codegen when this function pass runs on the corresponding MachineFunction. Ctx has to be mf.getContext(), as most of the codegen for this MachineFunction lives in that MCContext; AsmPrinter's OutContext is just for the output here.

Yeah, in most cases AsmPrinter.OutContext is the same as mf.getContext() because there is just one MCContext in the whole pipeline. I'm not sure what the motivation is for other backends to use AsmPrinter.OutContext for this same case, but the change does make them look more consistent with each other 😀

Contributor Author

said, more generally, if you want everything to work reliably, you'll probably need to completely eliminate the MCContext reference from MachineFunction. Otherwise, you'll continue to run into issues where the MCSymbol is from the wrong context. Maybe doable, but involves a significant amount of refactoring to avoid constructing MCSymbols early... and I'm not sure if there are any other weird interactions here.

Interesting idea, but I do need to think more about what this entails, and as you said, it will probably be a significant amount of refactoring. So maybe handle it as a follow-up, with more concrete considerations for a larger-scale refactoring?

(Also, selfishly speaking, this PR will help significantly with making our MCLinker function correctly; otherwise half of our tests are failing now 😢. So it would be great to get something in first so that we can keep upgrading weekly 🙏.)

Collaborator

Ctx has to be mf.getContext() as most of the codegen for this MachineFunction is in this MCContext.

I'm not really understanding what this means... I guess you've sort of informally partitioned things so that "local" symbols are created in the function's MCContext, and "global" symbols are created in the global MCContext? I don't think that's really sustainable... if we're going to consistently partition symbols, the two kinds of symbols can't both just be MCSymbol*, or you'll inevitably trip over issues in more obscure cases where we use the wrong MCContext.


I don't think the patch in its current form will cause any immediate issues, but I don't want to commit to a design that hasn't really been reviewed by LLVM community, and probably won't be approved in its current form because it's very confusing.

Contributor Author

@weiweichen weiweichen Mar 31, 2025

Ctx has to be mf.getContext() as most of the codegen for this MachineFunction is in this MCContext.

I'm not really understanding what this means... I guess you've sort of informally partitioned things so that "local" symbols are created in the function's MCContext, and "global" symbols are created in the global MCContext? I don't think that's really sustainable... if we're going to consistently partition symbols, the two kinds of symbols can't both just be MCSymbol*, or you'll inevitably trip over issues in more obscure cases where we use the wrong MCContext.

This is a valid concern, but at this point we are willing to take the risk of running into issues in the future (since we don't have a substantial example right now showing this will be a big problem, and worrying about "what-ifs" is hard 😞) in order to unblock all of our tests. I'm definitely open to suggestions on how to make the MCLinker more solid (maybe after/during my talk next month?).

I don't think the patch in its current form will cause any immediate issues, but I don't want to commit to a design that hasn't really been reviewed by LLVM community, and probably won't be approved in its current form because it's very confusing.

I understand your reservation and carefulness; thank you for being thorough here! Though I want to point out that this change has already been applied to most of the other backends (and I imagine those changes were reviewed and approved by the community?). So I'd push back a bit on the assumption that this is "a design that hasn't really been reviewed by LLVM community".

and probably won't be approved in its current form because it's very confusing.

Do you have any suggestion on which part is confusing and what can be added to make it less so? 🙏

Contributor Author

@weiweichen weiweichen Apr 1, 2025

I don't think that's really sustainable... if we're going to consistently partition symbols, the two kinds of symbols can't both just be MCSymbol*, or you'll inevitably trip over issues in more obscure cases where we use the wrong MCContext.

Why can't the two kinds both be MCSymbol* just because their scopes are different (local vs. global)? What's wrong with MCSymbol* with respect to scoping? Also, I don't quite get the "inevitably trip over issues in more obscure cases" part; I'm afraid it's a bit of a vague statement. Could you be more specific about what those cases might be? And since most of the other backends are already doing this, do you think they are also doing something unsustainable?

Contributor

@efriedma-quic the absence of a fix blocks our pulldown, as 50% of our tests are failing due to the reverted #133291 (comment) change.

Even though I agree that there might be a better "bulletproof" solution, is it OK to move forward with Weiwei's fix first and then consider a much better solution? My rationale is that Weiwei's proposed fix is aligned with what most of the targets do now, and there has also been interest in the community in open-sourcing the parallel MCLinker. When it comes to open-sourcing, there will be more use cases and we will come back to this topic again.

Collaborator

this change has already been applied to most of other backends

The other backends work in your model by accident. They're not intentionally using one context or the other, they're just not using MachineFunction::getContext() at all in the AsmPrinter.

Could you be more specific on what cases that may be?
Do you have any suggestion on which part is confusing and what can be added to make it less so? 🙏

Basically, if MF.getContext().getOrCreateSymbol() means something different from AsmPrinter.OutContext.getOrCreateSymbol(), it's a lot harder to understand what's going on. You have two APIs that look the same on the surface, and return a value of the same type, but actually mean something subtly different. Anything that creates an MCSymbol would need to be aware that both APIs exist, and pick the correct one. Some existing code probably isn't consistent with your model, and anyone writing new code gets zero guidance from the API names or type system to help pick the correct one.

And the "local" one is never really right: MCSymbols passed to an MCStreamer are supposed to be part of the same MCContext as the MCStreamer.

To make things consistent, there needs to be one correct way to construct an MCSymbol. Either we need a "MachineFunctionSymbol" type to represent symbols that haven't been emitted yet, or you need to stop sharing MCStreamers between different compilation units. This patch isn't making any progress towards either of those models, or anything similar.
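The first option above can be illustrated with a hypothetical sketch (none of these type or function names exist in LLVM; this only shows how a distinct type for not-yet-emitted symbols would let the compiler reject cross-context mixing instead of leaving it to convention):

```cpp
#include <cassert>
#include <string>

// Symbol that lives in the AsmPrinter's output context.
struct OutSymbol {
  std::string Name;
};

// Hypothetical per-MachineFunction symbol that has not been emitted yet.
struct MachineFunctionSymbol {
  std::string Name;
};

// The streamer-facing API accepts only output-context symbols, so passing
// a MachineFunctionSymbol here is a compile error rather than a latent bug.
std::string emitSymbolAttribute(const OutSymbol &S) { return S.Name; }

// A local symbol must be explicitly translated into the output context
// before emission.
OutSymbol translateToOutContext(const MachineFunctionSymbol &S) {
  return OutSymbol{S.Name};
}
```

With this shape, "which context does this symbol belong to?" is answered by the type system rather than by every caller remembering which getOrCreateSymbol to use.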


I don't think this patch will cause any immediate issues, because it's basically a no-op if there's only one MCContext, but this isn't the right long-term solution. And I don't want to commit to merging an unbounded number of temporary hacks while you work out how to implement the right solution.

@weiweichen
Contributor Author

@efriedma-quic, would you be open to a short zoom/google meeting to talk about this? (maybe easier if we can talk than typing here?)

@@ -0,0 +1,180 @@
//===- llvm/unittest/CodeGen/AArch64SelectionDAGTest.cpp
Collaborator

File name is incorrect

Contributor Author

yes, oops, fixed!

//
//===----------------------------------------------------------------------===//

#include "../lib/Target/X86/X86ISelLowering.h"
Collaborator

Is the header used?

Contributor Author

nope, removed

#include "llvm/AsmParser/Parser.h"
#include "llvm/CodeGen/AsmPrinter.h"
#include "llvm/CodeGen/MachineModuleInfo.h"
#include "llvm/CodeGen/SelectionDAG.h"
Collaborator

I doubt this test uses anything from SelectionDAG.h

#include "llvm/IR/Module.h"
#include "llvm/MC/MCStreamer.h"
#include "llvm/MC/TargetRegistry.h"
#include "llvm/Support/KnownBits.h"
Collaborator

I doubt this file uses KnownBits.h

Contributor Author

you are right, removed

@topperc
Collaborator

topperc commented Apr 4, 2025

Unless I'm misreading, Ctx at all 3 of those locations came from a member variable of AArch64MCInstLower. AArch64MCInstLower is a member of AArch64AsmPrinter, and the Ctx was initialized from OutStreamer, not MachineFunction::getContext().

class X86MCInstLowerTest : public testing::Test {
protected:
static void SetUpTestCase() {
LLVMInitializeX86TargetInfo();
Collaborator

If the X86 target isn't built, can you still call these functions, or do they not exist?

Contributor Author

Looks like compilation will fail if I'm not building the X86 backend. Changing this to

    InitializeAllTargetMCs();
    InitializeAllTargetInfos();
    InitializeAllTargets();
    InitializeAllAsmPrinters();

Tested with only the AArch64 backend built; this test gets skipped.

[==========] Running 1 test from 1 test suite.
[----------] Global test environment set-up.
[----------] 1 test from X86MCInstLowerTest
[ RUN      ] X86MCInstLowerTest.moExternalSymbol_MCSYMBOL
/Users/weiwei.chen/research/modularml/modular/third-party/llvm-project/llvm/unittests/CodeGen/X86MCInstLowerTest.cpp:107: Skipped


[  SKIPPED ] X86MCInstLowerTest.moExternalSymbol_MCSYMBOL (1 ms)
[----------] 1 test from X86MCInstLowerTest (1 ms total)

[----------] Global test environment tear-down
[==========] 1 test from 1 test suite ran. (2 ms total)
[  PASSED  ] 0 tests.
[  SKIPPED ] 1 test, listed below:
[  SKIPPED ] X86MCInstLowerTest.moExternalSymbol_MCSYMBOL

@weiweichen
Contributor Author

weiweichen commented Apr 4, 2025

Unless I'm misreading, Ctx at all 3 of those locations came from a member variable of AArch64MCInstLower. AArch64MCInstLower is a member of AArch64AsmPrinter, and the Ctx was initialized from OutStreamer, not MachineFunction::getContext().

Oh, good catch!! Looks like all the other backends do the same thing as AArch64 here: Ctx is initialized from OutStreamer, not MachineFunction::getContext(), and X86 is the only exception.

$ git grep "MCInstLowering("
lib/Target/AArch64/AArch64AsmPrinter.cpp:99:      : AsmPrinter(TM, std::move(Streamer)), MCInstLowering(OutContext, *this),
lib/Target/AMDGPU/AMDGPUMCInstLower.cpp:222:  AMDGPUMCInstLower MCInstLowering(OutContext, STI, *this);
lib/Target/AMDGPU/AMDGPUMCInstLower.cpp:253:  AMDGPUMCInstLower MCInstLowering(OutContext, STI, *this);
lib/Target/AMDGPU/R600MCInstLower.cpp:52:  R600MCInstLower MCInstLowering(OutContext, STI, *this);
lib/Target/ARC/ARCAsmPrinter.cpp:41:        MCInstLowering(&OutContext, *this) {}
lib/Target/AVR/AVRAsmPrinter.cpp:195:  AVRMCInstLower MCInstLowering(OutContext, *this);
lib/Target/BPF/BPFAsmPrinter.cpp:144:    BPFMCInstLower MCInstLowering(OutContext, *this);
lib/Target/CSKY/CSKYAsmPrinter.cpp:41:    : AsmPrinter(TM, std::move(Streamer)), MCInstLowering(OutContext, *this) {}
lib/Target/Lanai/LanaiAsmPrinter.cpp:147:  LanaiMCInstLower MCInstLowering(OutContext, *this);
lib/Target/Lanai/LanaiAsmPrinter.cpp:184:  LanaiMCInstLower MCInstLowering(OutContext, *this);
lib/Target/MSP430/MSP430AsmPrinter.cpp:149:  MSP430MCInstLower MCInstLowering(OutContext, *this);
lib/Target/Mips/MipsAsmPrinter.h:126:      : AsmPrinter(TM, std::move(Streamer)), MCInstLowering(*this) {}
lib/Target/WebAssembly/WebAssemblyAsmPrinter.cpp:695:    WebAssemblyMCInstLower MCInstLowering(OutContext, *this);
lib/Target/X86/X86MCInstLower.cpp:2204:  X86MCInstLower MCInstLowering(*MF, *this);

@topperc, do you by any chance know whether this exception is a must-have, or whether X86 can also switch to using OutContext for X86MCInstLower.Ctx? I checked that Ctx in X86MCInstLower is mostly used for two things:

  • getOrCreateSymbol() to create an MCSymbol.
  • Use as an allocator to create different MCExprs.

It looks like it's probably fine to switch here to be consistent with all the other backends, unless there is something critical I'm missing?

I pushed a new commit to reflect that change, and buildkite/github-pull-requests/linux-linux-x64 has passed. I also verified on our side that this fixes our issue. I'm wondering if this change is more acceptable now?

@weiweichen weiweichen changed the title [X86Backend] Use GetExternalSymbolSymbol for MO_ExternalSymbol. [X86Backend][M68KBackend] Make Ctx in X86MCInstLower the same as AsmPrinter.OutContext Apr 4, 2025
@weiweichen weiweichen changed the title [X86Backend][M68KBackend] Make Ctx in X86MCInstLower the same as AsmPrinter.OutContext [X86Backend][M68KBackend] Make Ctx in X86MCInstLower (M68KInstLower) the same as AsmPrinter.OutContext Apr 4, 2025
Collaborator

@topperc topperc left a comment


LGTM. This avoids creating a distinction between the 2 different sources of MCContexts in LLVM. As long as this happens to work for your use case that seems fine, but it seems fragile and could be broken in some other way in the future.

@weiweichen
Contributor Author

LGTM. This avoids creating a distinction between the 2 different sources of MCContexts in LLVM. As long as this happens to work for your use case that seems fine, but it seems fragile and could be broken in some other way in the future.

Thank you! Will definitely keep an eye on this part and follow up with a more solid design if this turns out to be an issue down the road 🙏.

@weiweichen weiweichen merged commit 1f72fa2 into llvm:main Apr 5, 2025
11 checks passed
@llvm-ci
Collaborator

llvm-ci commented Apr 5, 2025

LLVM Buildbot has detected a new failure on builder sanitizer-aarch64-linux-bootstrap-hwasan running on sanitizer-buildbot11 while building llvm at step 2 "annotate".

Full details are available at: https://lab.llvm.org/buildbot/#/builders/55/builds/9441

Here is the relevant piece of the build log for the reference
Step 2 (annotate) failure: 'python ../sanitizer_buildbot/sanitizers/zorg/buildbot/builders/sanitizers/buildbot_selector.py' (failure)
...
llvm-lit: /home/b/sanitizer-aarch64-linux-bootstrap-hwasan/build/llvm-project/llvm/utils/lit/lit/llvm/config.py:520: note: using lld-link: /home/b/sanitizer-aarch64-linux-bootstrap-hwasan/build/llvm_build_hwasan/bin/lld-link
llvm-lit: /home/b/sanitizer-aarch64-linux-bootstrap-hwasan/build/llvm-project/llvm/utils/lit/lit/llvm/config.py:520: note: using ld64.lld: /home/b/sanitizer-aarch64-linux-bootstrap-hwasan/build/llvm_build_hwasan/bin/ld64.lld
llvm-lit: /home/b/sanitizer-aarch64-linux-bootstrap-hwasan/build/llvm-project/llvm/utils/lit/lit/llvm/config.py:520: note: using wasm-ld: /home/b/sanitizer-aarch64-linux-bootstrap-hwasan/build/llvm_build_hwasan/bin/wasm-ld
llvm-lit: /home/b/sanitizer-aarch64-linux-bootstrap-hwasan/build/llvm-project/llvm/utils/lit/lit/llvm/config.py:520: note: using ld.lld: /home/b/sanitizer-aarch64-linux-bootstrap-hwasan/build/llvm_build_hwasan/bin/ld.lld
llvm-lit: /home/b/sanitizer-aarch64-linux-bootstrap-hwasan/build/llvm-project/llvm/utils/lit/lit/llvm/config.py:520: note: using lld-link: /home/b/sanitizer-aarch64-linux-bootstrap-hwasan/build/llvm_build_hwasan/bin/lld-link
llvm-lit: /home/b/sanitizer-aarch64-linux-bootstrap-hwasan/build/llvm-project/llvm/utils/lit/lit/llvm/config.py:520: note: using ld64.lld: /home/b/sanitizer-aarch64-linux-bootstrap-hwasan/build/llvm_build_hwasan/bin/ld64.lld
llvm-lit: /home/b/sanitizer-aarch64-linux-bootstrap-hwasan/build/llvm-project/llvm/utils/lit/lit/llvm/config.py:520: note: using wasm-ld: /home/b/sanitizer-aarch64-linux-bootstrap-hwasan/build/llvm_build_hwasan/bin/wasm-ld
llvm-lit: /home/b/sanitizer-aarch64-linux-bootstrap-hwasan/build/llvm-project/llvm/utils/lit/lit/main.py:72: note: The test suite configuration requested an individual test timeout of 0 seconds but a timeout of 900 seconds was requested on the command line. Forcing timeout to be 900 seconds.
-- Testing: 87461 tests, 72 workers --
Testing:  0.. 10.. 20.. 30.. 40.. 50.. 60.. 70.. 80.. 
FAIL: LLVM-Unit :: CodeGen/./CodeGenTests/23/76 (79618 of 87461)
******************** TEST 'LLVM-Unit :: CodeGen/./CodeGenTests/23/76' FAILED ********************
Script(shard):
--
GTEST_OUTPUT=json:/home/b/sanitizer-aarch64-linux-bootstrap-hwasan/build/llvm_build_hwasan/unittests/CodeGen/./CodeGenTests-LLVM-Unit-115745-23-76.json GTEST_SHUFFLE=0 GTEST_TOTAL_SHARDS=76 GTEST_SHARD_INDEX=23 /home/b/sanitizer-aarch64-linux-bootstrap-hwasan/build/llvm_build_hwasan/unittests/CodeGen/./CodeGenTests
--

Note: This is test shard 24 of 76.
[==========] Running 4 tests from 4 test suites.
[----------] Global test environment set-up.
[----------] 1 test from AArch64SelectionDAGTest
[ RUN      ] AArch64SelectionDAGTest.getTypeConversion_SplitScalableMVT
[       OK ] AArch64SelectionDAGTest.getTypeConversion_SplitScalableMVT (4 ms)
[----------] 1 test from AArch64SelectionDAGTest (4 ms total)

[----------] 1 test from InstrRefLDVTest
[ RUN      ] InstrRefLDVTest.MTransferSubregSpills
[       OK ] InstrRefLDVTest.MTransferSubregSpills (40 ms)
[----------] 1 test from InstrRefLDVTest (40 ms total)

[----------] 1 test from RegAllocScoreTest
[ RUN      ] RegAllocScoreTest.Counts
[       OK ] RegAllocScoreTest.Counts (1 ms)
[----------] 1 test from RegAllocScoreTest (1 ms total)

[----------] 1 test from X86MCInstLowerTest
[ RUN      ] X86MCInstLowerTest.moExternalSymbol_MCSYMBOL

--
exit: -6
--
shard JSON output does not exist: /home/b/sanitizer-aarch64-linux-bootstrap-hwasan/build/llvm_build_hwasan/unittests/CodeGen/./CodeGenTests-LLVM-Unit-115745-23-76.json
********************
Testing:  0.. 10.. 20.. 30.. 40.. 50.. 60.. 70.. 80.. 90.. 
Slowest Tests:
--------------------------------------------------------------------------
56.42s: Clang :: Driver/fsanitize.c
40.96s: Clang :: Preprocessor/riscv-target-features.c
38.15s: Clang :: Driver/arm-cortex-cpus-2.c
Step 11 (stage2/hwasan check) failure: stage2/hwasan check (failure)
...
llvm-lit: /home/b/sanitizer-aarch64-linux-bootstrap-hwasan/build/llvm-project/llvm/utils/lit/lit/llvm/config.py:520: note: using lld-link: /home/b/sanitizer-aarch64-linux-bootstrap-hwasan/build/llvm_build_hwasan/bin/lld-link
llvm-lit: /home/b/sanitizer-aarch64-linux-bootstrap-hwasan/build/llvm-project/llvm/utils/lit/lit/llvm/config.py:520: note: using ld64.lld: /home/b/sanitizer-aarch64-linux-bootstrap-hwasan/build/llvm_build_hwasan/bin/ld64.lld
llvm-lit: /home/b/sanitizer-aarch64-linux-bootstrap-hwasan/build/llvm-project/llvm/utils/lit/lit/llvm/config.py:520: note: using wasm-ld: /home/b/sanitizer-aarch64-linux-bootstrap-hwasan/build/llvm_build_hwasan/bin/wasm-ld
llvm-lit: /home/b/sanitizer-aarch64-linux-bootstrap-hwasan/build/llvm-project/llvm/utils/lit/lit/llvm/config.py:520: note: using ld.lld: /home/b/sanitizer-aarch64-linux-bootstrap-hwasan/build/llvm_build_hwasan/bin/ld.lld
llvm-lit: /home/b/sanitizer-aarch64-linux-bootstrap-hwasan/build/llvm-project/llvm/utils/lit/lit/llvm/config.py:520: note: using lld-link: /home/b/sanitizer-aarch64-linux-bootstrap-hwasan/build/llvm_build_hwasan/bin/lld-link
llvm-lit: /home/b/sanitizer-aarch64-linux-bootstrap-hwasan/build/llvm-project/llvm/utils/lit/lit/llvm/config.py:520: note: using ld64.lld: /home/b/sanitizer-aarch64-linux-bootstrap-hwasan/build/llvm_build_hwasan/bin/ld64.lld
llvm-lit: /home/b/sanitizer-aarch64-linux-bootstrap-hwasan/build/llvm-project/llvm/utils/lit/lit/llvm/config.py:520: note: using wasm-ld: /home/b/sanitizer-aarch64-linux-bootstrap-hwasan/build/llvm_build_hwasan/bin/wasm-ld
llvm-lit: /home/b/sanitizer-aarch64-linux-bootstrap-hwasan/build/llvm-project/llvm/utils/lit/lit/main.py:72: note: The test suite configuration requested an individual test timeout of 0 seconds but a timeout of 900 seconds was requested on the command line. Forcing timeout to be 900 seconds.
-- Testing: 87461 tests, 72 workers --
Testing:  0.. 10.. 20.. 30.. 40.. 50.. 60.. 70.. 80.. 
FAIL: LLVM-Unit :: CodeGen/./CodeGenTests/23/76 (79618 of 87461)
******************** TEST 'LLVM-Unit :: CodeGen/./CodeGenTests/23/76' FAILED ********************
Script(shard):
--
GTEST_OUTPUT=json:/home/b/sanitizer-aarch64-linux-bootstrap-hwasan/build/llvm_build_hwasan/unittests/CodeGen/./CodeGenTests-LLVM-Unit-115745-23-76.json GTEST_SHUFFLE=0 GTEST_TOTAL_SHARDS=76 GTEST_SHARD_INDEX=23 /home/b/sanitizer-aarch64-linux-bootstrap-hwasan/build/llvm_build_hwasan/unittests/CodeGen/./CodeGenTests
--

Note: This is test shard 24 of 76.
[==========] Running 4 tests from 4 test suites.
[----------] Global test environment set-up.
[----------] 1 test from AArch64SelectionDAGTest
[ RUN      ] AArch64SelectionDAGTest.getTypeConversion_SplitScalableMVT
[       OK ] AArch64SelectionDAGTest.getTypeConversion_SplitScalableMVT (4 ms)
[----------] 1 test from AArch64SelectionDAGTest (4 ms total)

[----------] 1 test from InstrRefLDVTest
[ RUN      ] InstrRefLDVTest.MTransferSubregSpills
[       OK ] InstrRefLDVTest.MTransferSubregSpills (40 ms)
[----------] 1 test from InstrRefLDVTest (40 ms total)

[----------] 1 test from RegAllocScoreTest
[ RUN      ] RegAllocScoreTest.Counts
[       OK ] RegAllocScoreTest.Counts (1 ms)
[----------] 1 test from RegAllocScoreTest (1 ms total)

[----------] 1 test from X86MCInstLowerTest
[ RUN      ] X86MCInstLowerTest.moExternalSymbol_MCSYMBOL

--
exit: -6
--
shard JSON output does not exist: /home/b/sanitizer-aarch64-linux-bootstrap-hwasan/build/llvm_build_hwasan/unittests/CodeGen/./CodeGenTests-LLVM-Unit-115745-23-76.json
********************
Testing:  0.. 10.. 20.. 30.. 40.. 50.. 60.. 70.. 80.. 90.. 
Slowest Tests:
--------------------------------------------------------------------------
56.42s: Clang :: Driver/fsanitize.c
40.96s: Clang :: Preprocessor/riscv-target-features.c
38.15s: Clang :: Driver/arm-cortex-cpus-2.c

@llvm-ci
Collaborator

llvm-ci commented Apr 5, 2025

LLVM Buildbot has detected a new failure on builder clang-x64-windows-msvc running on windows-gcebot2 while building llvm at step 4 "annotate".

Full details are available at: https://lab.llvm.org/buildbot/#/builders/63/builds/4983

Here is the relevant piece of the build log for the reference
Step 4 (annotate) failure: 'python ../llvm-zorg/zorg/buildbot/builders/annotated/clang-windows.py ...' (failure)
...
  Passed           : 46417 (99.12%)
  Expectedly Failed:    37 (0.08%)
[97/98] Running the LLVM regression tests
llvm-lit.py: C:\b\slave\clang-x64-windows-msvc\llvm-project\llvm\utils\lit\lit\llvm\config.py:57: note: using lit tools: C:\Program Files\Git\usr\bin
llvm-lit.py: C:\b\slave\clang-x64-windows-msvc\llvm-project\llvm\utils\lit\lit\llvm\config.py:520: note: using ld.lld: c:\b\slave\clang-x64-windows-msvc\build\stage1\bin\ld.lld.exe
llvm-lit.py: C:\b\slave\clang-x64-windows-msvc\llvm-project\llvm\utils\lit\lit\llvm\config.py:520: note: using lld-link: c:\b\slave\clang-x64-windows-msvc\build\stage1\bin\lld-link.exe
llvm-lit.py: C:\b\slave\clang-x64-windows-msvc\llvm-project\llvm\utils\lit\lit\llvm\config.py:520: note: using ld64.lld: c:\b\slave\clang-x64-windows-msvc\build\stage1\bin\ld64.lld.exe
llvm-lit.py: C:\b\slave\clang-x64-windows-msvc\llvm-project\llvm\utils\lit\lit\llvm\config.py:520: note: using wasm-ld: c:\b\slave\clang-x64-windows-msvc\build\stage1\bin\wasm-ld.exe
-- Testing: 58003 tests, 32 workers --
Testing:  0.. 10.. 20.. 30.. 40.. 50.. 60.. 70..
FAIL: LLVM :: tools/llvm-exegesis/RISCV/rvv/filter.test (25440 of 58003)
******************** TEST 'LLVM :: tools/llvm-exegesis/RISCV/rvv/filter.test' FAILED ********************
Exit Code: 2

Command Output (stdout):
--
# RUN: at line 1
c:\b\slave\clang-x64-windows-msvc\build\stage1\bin\llvm-exegesis.exe -mtriple=riscv64 -mcpu=sifive-x280 -benchmark-phase=assemble-measured-code --mode=inverse_throughput --opcode-name=PseudoVNCLIPU_WX_M1_MASK     --riscv-filter-config='vtype = {VXRM: rod, AVL: VLMAX, SEW: e(8|16), Policy: ta/mu}' --max-configs-per-opcode=1000 --min-instructions=10 | c:\b\slave\clang-x64-windows-msvc\build\stage1\bin\filecheck.exe C:\b\slave\clang-x64-windows-msvc\llvm-project\llvm\test\tools\llvm-exegesis\RISCV\rvv\filter.test
# executed command: 'c:\b\slave\clang-x64-windows-msvc\build\stage1\bin\llvm-exegesis.exe' -mtriple=riscv64 -mcpu=sifive-x280 -benchmark-phase=assemble-measured-code --mode=inverse_throughput --opcode-name=PseudoVNCLIPU_WX_M1_MASK '--riscv-filter-config=vtype = {VXRM: rod, AVL: VLMAX, SEW: e(8|16), Policy: ta/mu}' --max-configs-per-opcode=1000 --min-instructions=10
# .---command stderr------------
# | PseudoVNCLIPU_WX_M1_MASK: Failed to produce any snippet via: instruction has tied variables, avoiding Read-After-Write issue, picking random def and use registers not aliasing each other, for uses, one unique register for each position
# `-----------------------------
# executed command: 'c:\b\slave\clang-x64-windows-msvc\build\stage1\bin\filecheck.exe' 'C:\b\slave\clang-x64-windows-msvc\llvm-project\llvm\test\tools\llvm-exegesis\RISCV\rvv\filter.test'
# .---command stderr------------
# | FileCheck error: '<stdin>' is empty.
# | FileCheck command line:  c:\b\slave\clang-x64-windows-msvc\build\stage1\bin\filecheck.exe C:\b\slave\clang-x64-windows-msvc\llvm-project\llvm\test\tools\llvm-exegesis\RISCV\rvv\filter.test
# `-----------------------------
# error: command failed with exit status: 2

--

********************
Testing:  0.. 10.. 20.. 30.. 40.. 50.. 60.. 70.. 80.. 90..
********************
Failed Tests (1):
  LLVM :: tools/llvm-exegesis/RISCV/rvv/filter.test


Testing Time: 344.55s

Total Discovered Tests: 64944
  Skipped          :    26 (0.04%)
  Unsupported      :  2420 (3.73%)
  Passed           : 62309 (95.94%)
  Expectedly Failed:   188 (0.29%)
  Failed           :     1 (0.00%)
FAILED: test/CMakeFiles/check-llvm C:/b/slave/clang-x64-windows-msvc/build/stage1/test/CMakeFiles/check-llvm 
cmd.exe /C "cd /D C:\b\slave\clang-x64-windows-msvc\build\stage1\test && C:\Python39\python.exe C:/b/slave/clang-x64-windows-msvc/build/stage1/./bin/llvm-lit.py -sv -j 32 C:/b/slave/clang-x64-windows-msvc/build/stage1/test"
ninja: build stopped: subcommand failed.
Step 8 (stage 1 check) failure: stage 1 check (failure)
...
  Passed           : 46417 (99.12%)
  Expectedly Failed:    37 (0.08%)
[97/98] Running the LLVM regression tests
llvm-lit.py: C:\b\slave\clang-x64-windows-msvc\llvm-project\llvm\utils\lit\lit\llvm\config.py:57: note: using lit tools: C:\Program Files\Git\usr\bin
llvm-lit.py: C:\b\slave\clang-x64-windows-msvc\llvm-project\llvm\utils\lit\lit\llvm\config.py:520: note: using ld.lld: c:\b\slave\clang-x64-windows-msvc\build\stage1\bin\ld.lld.exe
llvm-lit.py: C:\b\slave\clang-x64-windows-msvc\llvm-project\llvm\utils\lit\lit\llvm\config.py:520: note: using lld-link: c:\b\slave\clang-x64-windows-msvc\build\stage1\bin\lld-link.exe
llvm-lit.py: C:\b\slave\clang-x64-windows-msvc\llvm-project\llvm\utils\lit\lit\llvm\config.py:520: note: using ld64.lld: c:\b\slave\clang-x64-windows-msvc\build\stage1\bin\ld64.lld.exe
llvm-lit.py: C:\b\slave\clang-x64-windows-msvc\llvm-project\llvm\utils\lit\lit\llvm\config.py:520: note: using wasm-ld: c:\b\slave\clang-x64-windows-msvc\build\stage1\bin\wasm-ld.exe
-- Testing: 58003 tests, 32 workers --
Testing:  0.. 10.. 20.. 30.. 40.. 50.. 60.. 70..
FAIL: LLVM :: tools/llvm-exegesis/RISCV/rvv/filter.test (25440 of 58003)
******************** TEST 'LLVM :: tools/llvm-exegesis/RISCV/rvv/filter.test' FAILED ********************
Exit Code: 2

Command Output (stdout):
--
# RUN: at line 1
c:\b\slave\clang-x64-windows-msvc\build\stage1\bin\llvm-exegesis.exe -mtriple=riscv64 -mcpu=sifive-x280 -benchmark-phase=assemble-measured-code --mode=inverse_throughput --opcode-name=PseudoVNCLIPU_WX_M1_MASK     --riscv-filter-config='vtype = {VXRM: rod, AVL: VLMAX, SEW: e(8|16), Policy: ta/mu}' --max-configs-per-opcode=1000 --min-instructions=10 | c:\b\slave\clang-x64-windows-msvc\build\stage1\bin\filecheck.exe C:\b\slave\clang-x64-windows-msvc\llvm-project\llvm\test\tools\llvm-exegesis\RISCV\rvv\filter.test
# executed command: 'c:\b\slave\clang-x64-windows-msvc\build\stage1\bin\llvm-exegesis.exe' -mtriple=riscv64 -mcpu=sifive-x280 -benchmark-phase=assemble-measured-code --mode=inverse_throughput --opcode-name=PseudoVNCLIPU_WX_M1_MASK '--riscv-filter-config=vtype = {VXRM: rod, AVL: VLMAX, SEW: e(8|16), Policy: ta/mu}' --max-configs-per-opcode=1000 --min-instructions=10
# .---command stderr------------
# | PseudoVNCLIPU_WX_M1_MASK: Failed to produce any snippet via: instruction has tied variables, avoiding Read-After-Write issue, picking random def and use registers not aliasing each other, for uses, one unique register for each position
# `-----------------------------
# executed command: 'c:\b\slave\clang-x64-windows-msvc\build\stage1\bin\filecheck.exe' 'C:\b\slave\clang-x64-windows-msvc\llvm-project\llvm\test\tools\llvm-exegesis\RISCV\rvv\filter.test'
# .---command stderr------------
# | FileCheck error: '<stdin>' is empty.
# | FileCheck command line:  c:\b\slave\clang-x64-windows-msvc\build\stage1\bin\filecheck.exe C:\b\slave\clang-x64-windows-msvc\llvm-project\llvm\test\tools\llvm-exegesis\RISCV\rvv\filter.test
# `-----------------------------
# error: command failed with exit status: 2

--

********************
Testing:  0.. 10.. 20.. 30.. 40.. 50.. 60.. 70.. 80.. 90..
********************
Failed Tests (1):
  LLVM :: tools/llvm-exegesis/RISCV/rvv/filter.test


Testing Time: 344.55s

Total Discovered Tests: 64944
  Skipped          :    26 (0.04%)
  Unsupported      :  2420 (3.73%)
  Passed           : 62309 (95.94%)
  Expectedly Failed:   188 (0.29%)
  Failed           :     1 (0.00%)
FAILED: test/CMakeFiles/check-llvm C:/b/slave/clang-x64-windows-msvc/build/stage1/test/CMakeFiles/check-llvm 
cmd.exe /C "cd /D C:\b\slave\clang-x64-windows-msvc\build\stage1\test && C:\Python39\python.exe C:/b/slave/clang-x64-windows-msvc/build/stage1/./bin/llvm-lit.py -sv -j 32 C:/b/slave/clang-x64-windows-msvc/build/stage1/test"
ninja: build stopped: subcommand failed.

@llvm-ci
Collaborator

llvm-ci commented Apr 5, 2025

LLVM Buildbot has detected a new failure on builder sanitizer-aarch64-linux-bootstrap-asan running on sanitizer-buildbot8 while building llvm at step 2 "annotate".

Full details are available at: https://lab.llvm.org/buildbot/#/builders/24/builds/7040

Here is the relevant piece of the build log for the reference
Step 2 (annotate) failure: 'python ../sanitizer_buildbot/sanitizers/zorg/buildbot/builders/sanitizers/buildbot_selector.py' (failure)
...
llvm-lit: /home/b/sanitizer-aarch64-linux-bootstrap-asan/build/llvm-project/llvm/utils/lit/lit/llvm/config.py:520: note: using lld-link: /home/b/sanitizer-aarch64-linux-bootstrap-asan/build/llvm_build_asan/bin/lld-link
llvm-lit: /home/b/sanitizer-aarch64-linux-bootstrap-asan/build/llvm-project/llvm/utils/lit/lit/llvm/config.py:520: note: using ld64.lld: /home/b/sanitizer-aarch64-linux-bootstrap-asan/build/llvm_build_asan/bin/ld64.lld
llvm-lit: /home/b/sanitizer-aarch64-linux-bootstrap-asan/build/llvm-project/llvm/utils/lit/lit/llvm/config.py:520: note: using wasm-ld: /home/b/sanitizer-aarch64-linux-bootstrap-asan/build/llvm_build_asan/bin/wasm-ld
llvm-lit: /home/b/sanitizer-aarch64-linux-bootstrap-asan/build/llvm-project/llvm/utils/lit/lit/llvm/config.py:520: note: using ld.lld: /home/b/sanitizer-aarch64-linux-bootstrap-asan/build/llvm_build_asan/bin/ld.lld
llvm-lit: /home/b/sanitizer-aarch64-linux-bootstrap-asan/build/llvm-project/llvm/utils/lit/lit/llvm/config.py:520: note: using lld-link: /home/b/sanitizer-aarch64-linux-bootstrap-asan/build/llvm_build_asan/bin/lld-link
llvm-lit: /home/b/sanitizer-aarch64-linux-bootstrap-asan/build/llvm-project/llvm/utils/lit/lit/llvm/config.py:520: note: using ld64.lld: /home/b/sanitizer-aarch64-linux-bootstrap-asan/build/llvm_build_asan/bin/ld64.lld
llvm-lit: /home/b/sanitizer-aarch64-linux-bootstrap-asan/build/llvm-project/llvm/utils/lit/lit/llvm/config.py:520: note: using wasm-ld: /home/b/sanitizer-aarch64-linux-bootstrap-asan/build/llvm_build_asan/bin/wasm-ld
llvm-lit: /home/b/sanitizer-aarch64-linux-bootstrap-asan/build/llvm-project/llvm/utils/lit/lit/main.py:72: note: The test suite configuration requested an individual test timeout of 0 seconds but a timeout of 900 seconds was requested on the command line. Forcing timeout to be 900 seconds.
-- Testing: 87462 tests, 72 workers --
Testing:  0.. 10.. 20.. 30.. 40.. 50.. 60.. 70.. 80.. 
FAIL: LLVM-Unit :: CodeGen/./CodeGenTests/23/76 (78973 of 87462)
******************** TEST 'LLVM-Unit :: CodeGen/./CodeGenTests/23/76' FAILED ********************
Script(shard):
--
GTEST_OUTPUT=json:/home/b/sanitizer-aarch64-linux-bootstrap-asan/build/llvm_build_asan/unittests/CodeGen/./CodeGenTests-LLVM-Unit-2549900-23-76.json GTEST_SHUFFLE=0 GTEST_TOTAL_SHARDS=76 GTEST_SHARD_INDEX=23 /home/b/sanitizer-aarch64-linux-bootstrap-asan/build/llvm_build_asan/unittests/CodeGen/./CodeGenTests
--

Note: This is test shard 24 of 76.
[==========] Running 4 tests from 4 test suites.
[----------] Global test environment set-up.
[----------] 1 test from AArch64SelectionDAGTest
[ RUN      ] AArch64SelectionDAGTest.getTypeConversion_SplitScalableMVT
[       OK ] AArch64SelectionDAGTest.getTypeConversion_SplitScalableMVT (15 ms)
[----------] 1 test from AArch64SelectionDAGTest (15 ms total)

[----------] 1 test from InstrRefLDVTest
[ RUN      ] InstrRefLDVTest.MTransferSubregSpills
[       OK ] InstrRefLDVTest.MTransferSubregSpills (148 ms)
[----------] 1 test from InstrRefLDVTest (148 ms total)

[----------] 1 test from RegAllocScoreTest
[ RUN      ] RegAllocScoreTest.Counts
[       OK ] RegAllocScoreTest.Counts (1 ms)
[----------] 1 test from RegAllocScoreTest (1 ms total)

[----------] 1 test from X86MCInstLowerTest
[ RUN      ] X86MCInstLowerTest.moExternalSymbol_MCSYMBOL

--
exit: 1
--
shard JSON output does not exist: /home/b/sanitizer-aarch64-linux-bootstrap-asan/build/llvm_build_asan/unittests/CodeGen/./CodeGenTests-LLVM-Unit-2549900-23-76.json
********************
Testing:  0.. 10.. 20.. 30.. 40.. 50.. 60.. 70.. 80.. 90.. 
Slowest Tests:
--------------------------------------------------------------------------
233.30s: Clang :: Driver/fsanitize.c
179.96s: Clang :: Preprocessor/riscv-target-features.c
163.85s: Clang :: Driver/arm-cortex-cpus-2.c
Step 11 (stage2/asan check) failure: stage2/asan check (failure)
...
llvm-lit: /home/b/sanitizer-aarch64-linux-bootstrap-asan/build/llvm-project/llvm/utils/lit/lit/llvm/config.py:520: note: using lld-link: /home/b/sanitizer-aarch64-linux-bootstrap-asan/build/llvm_build_asan/bin/lld-link
llvm-lit: /home/b/sanitizer-aarch64-linux-bootstrap-asan/build/llvm-project/llvm/utils/lit/lit/llvm/config.py:520: note: using ld64.lld: /home/b/sanitizer-aarch64-linux-bootstrap-asan/build/llvm_build_asan/bin/ld64.lld
llvm-lit: /home/b/sanitizer-aarch64-linux-bootstrap-asan/build/llvm-project/llvm/utils/lit/lit/llvm/config.py:520: note: using wasm-ld: /home/b/sanitizer-aarch64-linux-bootstrap-asan/build/llvm_build_asan/bin/wasm-ld
llvm-lit: /home/b/sanitizer-aarch64-linux-bootstrap-asan/build/llvm-project/llvm/utils/lit/lit/llvm/config.py:520: note: using ld.lld: /home/b/sanitizer-aarch64-linux-bootstrap-asan/build/llvm_build_asan/bin/ld.lld
llvm-lit: /home/b/sanitizer-aarch64-linux-bootstrap-asan/build/llvm-project/llvm/utils/lit/lit/llvm/config.py:520: note: using lld-link: /home/b/sanitizer-aarch64-linux-bootstrap-asan/build/llvm_build_asan/bin/lld-link
llvm-lit: /home/b/sanitizer-aarch64-linux-bootstrap-asan/build/llvm-project/llvm/utils/lit/lit/llvm/config.py:520: note: using ld64.lld: /home/b/sanitizer-aarch64-linux-bootstrap-asan/build/llvm_build_asan/bin/ld64.lld
llvm-lit: /home/b/sanitizer-aarch64-linux-bootstrap-asan/build/llvm-project/llvm/utils/lit/lit/llvm/config.py:520: note: using wasm-ld: /home/b/sanitizer-aarch64-linux-bootstrap-asan/build/llvm_build_asan/bin/wasm-ld
llvm-lit: /home/b/sanitizer-aarch64-linux-bootstrap-asan/build/llvm-project/llvm/utils/lit/lit/main.py:72: note: The test suite configuration requested an individual test timeout of 0 seconds but a timeout of 900 seconds was requested on the command line. Forcing timeout to be 900 seconds.
-- Testing: 87462 tests, 72 workers --
Testing:  0.. 10.. 20.. 30.. 40.. 50.. 60.. 70.. 80.. 
FAIL: LLVM-Unit :: CodeGen/./CodeGenTests/23/76 (78973 of 87462)
******************** TEST 'LLVM-Unit :: CodeGen/./CodeGenTests/23/76' FAILED ********************
Script(shard):
--
GTEST_OUTPUT=json:/home/b/sanitizer-aarch64-linux-bootstrap-asan/build/llvm_build_asan/unittests/CodeGen/./CodeGenTests-LLVM-Unit-2549900-23-76.json GTEST_SHUFFLE=0 GTEST_TOTAL_SHARDS=76 GTEST_SHARD_INDEX=23 /home/b/sanitizer-aarch64-linux-bootstrap-asan/build/llvm_build_asan/unittests/CodeGen/./CodeGenTests
--

Note: This is test shard 24 of 76.
[==========] Running 4 tests from 4 test suites.
[----------] Global test environment set-up.
[----------] 1 test from AArch64SelectionDAGTest
[ RUN      ] AArch64SelectionDAGTest.getTypeConversion_SplitScalableMVT
[       OK ] AArch64SelectionDAGTest.getTypeConversion_SplitScalableMVT (15 ms)
[----------] 1 test from AArch64SelectionDAGTest (15 ms total)

[----------] 1 test from InstrRefLDVTest
[ RUN      ] InstrRefLDVTest.MTransferSubregSpills
[       OK ] InstrRefLDVTest.MTransferSubregSpills (148 ms)
[----------] 1 test from InstrRefLDVTest (148 ms total)

[----------] 1 test from RegAllocScoreTest
[ RUN      ] RegAllocScoreTest.Counts
[       OK ] RegAllocScoreTest.Counts (1 ms)
[----------] 1 test from RegAllocScoreTest (1 ms total)

[----------] 1 test from X86MCInstLowerTest
[ RUN      ] X86MCInstLowerTest.moExternalSymbol_MCSYMBOL

--
exit: 1
--
shard JSON output does not exist: /home/b/sanitizer-aarch64-linux-bootstrap-asan/build/llvm_build_asan/unittests/CodeGen/./CodeGenTests-LLVM-Unit-2549900-23-76.json
********************
Testing:  0.. 10.. 20.. 30.. 40.. 50.. 60.. 70.. 80.. 90.. 
Slowest Tests:
--------------------------------------------------------------------------
233.30s: Clang :: Driver/fsanitize.c
179.96s: Clang :: Preprocessor/riscv-target-features.c
163.85s: Clang :: Driver/arm-cortex-cpus-2.c

@llvm-ci
Collaborator

llvm-ci commented Apr 5, 2025

LLVM Buildbot has detected a new failure on builder sanitizer-x86_64-linux-bootstrap-asan running on sanitizer-buildbot2 while building llvm at step 2 "annotate".

Full details are available at: https://lab.llvm.org/buildbot/#/builders/52/builds/7340

Here is the relevant piece of the build log for the reference
Step 2 (annotate) failure: 'python ../sanitizer_buildbot/sanitizers/zorg/buildbot/builders/sanitizers/buildbot_selector.py' (failure)
...
llvm-lit: /home/b/sanitizer-x86_64-linux-bootstrap-asan/build/llvm-project/llvm/utils/lit/lit/llvm/config.py:520: note: using lld-link: /home/b/sanitizer-x86_64-linux-bootstrap-asan/build/llvm_build_asan/bin/lld-link
llvm-lit: /home/b/sanitizer-x86_64-linux-bootstrap-asan/build/llvm-project/llvm/utils/lit/lit/llvm/config.py:520: note: using ld64.lld: /home/b/sanitizer-x86_64-linux-bootstrap-asan/build/llvm_build_asan/bin/ld64.lld
llvm-lit: /home/b/sanitizer-x86_64-linux-bootstrap-asan/build/llvm-project/llvm/utils/lit/lit/llvm/config.py:520: note: using wasm-ld: /home/b/sanitizer-x86_64-linux-bootstrap-asan/build/llvm_build_asan/bin/wasm-ld
llvm-lit: /home/b/sanitizer-x86_64-linux-bootstrap-asan/build/llvm-project/llvm/utils/lit/lit/llvm/config.py:520: note: using ld.lld: /home/b/sanitizer-x86_64-linux-bootstrap-asan/build/llvm_build_asan/bin/ld.lld
llvm-lit: /home/b/sanitizer-x86_64-linux-bootstrap-asan/build/llvm-project/llvm/utils/lit/lit/llvm/config.py:520: note: using lld-link: /home/b/sanitizer-x86_64-linux-bootstrap-asan/build/llvm_build_asan/bin/lld-link
llvm-lit: /home/b/sanitizer-x86_64-linux-bootstrap-asan/build/llvm-project/llvm/utils/lit/lit/llvm/config.py:520: note: using ld64.lld: /home/b/sanitizer-x86_64-linux-bootstrap-asan/build/llvm_build_asan/bin/ld64.lld
llvm-lit: /home/b/sanitizer-x86_64-linux-bootstrap-asan/build/llvm-project/llvm/utils/lit/lit/llvm/config.py:520: note: using wasm-ld: /home/b/sanitizer-x86_64-linux-bootstrap-asan/build/llvm_build_asan/bin/wasm-ld
llvm-lit: /home/b/sanitizer-x86_64-linux-bootstrap-asan/build/llvm-project/llvm/utils/lit/lit/main.py:72: note: The test suite configuration requested an individual test timeout of 0 seconds but a timeout of 900 seconds was requested on the command line. Forcing timeout to be 900 seconds.
-- Testing: 89957 tests, 88 workers --
Testing:  0.. 10.. 20.. 30.. 40.. 50.. 60.. 70.. 80..
FAIL: LLVM-Unit :: CodeGen/./CodeGenTests/100/151 (80893 of 89957)
******************** TEST 'LLVM-Unit :: CodeGen/./CodeGenTests/100/151' FAILED ********************
Script(shard):
--
GTEST_OUTPUT=json:/home/b/sanitizer-x86_64-linux-bootstrap-asan/build/llvm_build_asan/unittests/CodeGen/./CodeGenTests-LLVM-Unit-322859-100-151.json GTEST_SHUFFLE=0 GTEST_TOTAL_SHARDS=151 GTEST_SHARD_INDEX=100 /home/b/sanitizer-x86_64-linux-bootstrap-asan/build/llvm_build_asan/unittests/CodeGen/./CodeGenTests
--

Note: This is test shard 101 of 151.
[==========] Running 2 tests from 2 test suites.
[----------] Global test environment set-up.
[----------] 1 test from InstrRefLDVTest
[ RUN      ] InstrRefLDVTest.MLocSingleBlock
[       OK ] InstrRefLDVTest.MLocSingleBlock (3 ms)
[----------] 1 test from InstrRefLDVTest (3 ms total)

[----------] 1 test from X86MCInstLowerTest
[ RUN      ] X86MCInstLowerTest.moExternalSymbol_MCSYMBOL

--
exit: 1
--
shard JSON output does not exist: /home/b/sanitizer-x86_64-linux-bootstrap-asan/build/llvm_build_asan/unittests/CodeGen/./CodeGenTests-LLVM-Unit-322859-100-151.json
********************
Testing:  0.. 10.. 20.. 30.. 40.. 50.. 60.. 70.. 80.. 90.. 
Slowest Tests:
--------------------------------------------------------------------------
307.52s: LLVM :: CodeGen/AMDGPU/sched-group-barrier-pipeline-solver.mir
132.71s: Clang :: Driver/fsanitize.c
114.50s: Clang :: Preprocessor/riscv-target-features.c
113.21s: LLVM :: CodeGen/RISCV/attributes.ll
99.03s: Clang :: OpenMP/target_update_codegen.cpp
95.37s: Clang :: OpenMP/target_defaultmap_codegen_01.cpp
92.84s: Clang :: Driver/arm-cortex-cpus-2.c
90.46s: Clang :: Driver/arm-cortex-cpus-1.c
79.57s: Clang :: Preprocessor/aarch64-target-features.c
79.16s: Clang :: Preprocessor/arm-target-features.c
69.66s: LLVM :: CodeGen/RISCV/atomic-rmw.ll
65.81s: Clang :: Preprocessor/predefined-arch-macros.c
65.58s: Clang :: Driver/linux-ld.c
Step 11 (stage2/asan check) failure: stage2/asan check (failure)

@llvm-ci
Collaborator

llvm-ci commented Apr 5, 2025

LLVM Buildbot has detected a new failure on builder sanitizer-aarch64-linux-bootstrap-msan running on sanitizer-buildbot9 while building llvm at step 2 "annotate".

Full details are available at: https://lab.llvm.org/buildbot/#/builders/94/builds/5918

Here is the relevant piece of the build log for the reference
Step 2 (annotate) failure: 'python ../sanitizer_buildbot/sanitizers/zorg/buildbot/builders/sanitizers/buildbot_selector.py' (failure)
...
llvm-lit: /home/b/sanitizer-aarch64-linux-bootstrap-msan/build/llvm-project/llvm/utils/lit/lit/llvm/config.py:520: note: using lld-link: /home/b/sanitizer-aarch64-linux-bootstrap-msan/build/llvm_build_msan/bin/lld-link
llvm-lit: /home/b/sanitizer-aarch64-linux-bootstrap-msan/build/llvm-project/llvm/utils/lit/lit/llvm/config.py:520: note: using ld64.lld: /home/b/sanitizer-aarch64-linux-bootstrap-msan/build/llvm_build_msan/bin/ld64.lld
llvm-lit: /home/b/sanitizer-aarch64-linux-bootstrap-msan/build/llvm-project/llvm/utils/lit/lit/llvm/config.py:520: note: using wasm-ld: /home/b/sanitizer-aarch64-linux-bootstrap-msan/build/llvm_build_msan/bin/wasm-ld
llvm-lit: /home/b/sanitizer-aarch64-linux-bootstrap-msan/build/llvm-project/llvm/utils/lit/lit/llvm/config.py:520: note: using ld.lld: /home/b/sanitizer-aarch64-linux-bootstrap-msan/build/llvm_build_msan/bin/ld.lld
llvm-lit: /home/b/sanitizer-aarch64-linux-bootstrap-msan/build/llvm-project/llvm/utils/lit/lit/llvm/config.py:520: note: using lld-link: /home/b/sanitizer-aarch64-linux-bootstrap-msan/build/llvm_build_msan/bin/lld-link
llvm-lit: /home/b/sanitizer-aarch64-linux-bootstrap-msan/build/llvm-project/llvm/utils/lit/lit/llvm/config.py:520: note: using ld64.lld: /home/b/sanitizer-aarch64-linux-bootstrap-msan/build/llvm_build_msan/bin/ld64.lld
llvm-lit: /home/b/sanitizer-aarch64-linux-bootstrap-msan/build/llvm-project/llvm/utils/lit/lit/llvm/config.py:520: note: using wasm-ld: /home/b/sanitizer-aarch64-linux-bootstrap-msan/build/llvm_build_msan/bin/wasm-ld
llvm-lit: /home/b/sanitizer-aarch64-linux-bootstrap-msan/build/llvm-project/llvm/utils/lit/lit/main.py:72: note: The test suite configuration requested an individual test timeout of 0 seconds but a timeout of 900 seconds was requested on the command line. Forcing timeout to be 900 seconds.
-- Testing: 87460 tests, 72 workers --
Testing:  0.. 10.. 20.. 30.. 40.. 50.. 60.. 70.. 80.. 
FAIL: LLVM-Unit :: CodeGen/./CodeGenTests/23/76 (79255 of 87460)
******************** TEST 'LLVM-Unit :: CodeGen/./CodeGenTests/23/76' FAILED ********************
Script(shard):
--
GTEST_OUTPUT=json:/home/b/sanitizer-aarch64-linux-bootstrap-msan/build/llvm_build_msan/unittests/CodeGen/./CodeGenTests-LLVM-Unit-3200700-23-76.json GTEST_SHUFFLE=0 GTEST_TOTAL_SHARDS=76 GTEST_SHARD_INDEX=23 /home/b/sanitizer-aarch64-linux-bootstrap-msan/build/llvm_build_msan/unittests/CodeGen/./CodeGenTests
--

Note: This is test shard 24 of 76.
[==========] Running 4 tests from 4 test suites.
[----------] Global test environment set-up.
[----------] 1 test from AArch64SelectionDAGTest
[ RUN      ] AArch64SelectionDAGTest.getTypeConversion_SplitScalableMVT
[       OK ] AArch64SelectionDAGTest.getTypeConversion_SplitScalableMVT (8 ms)
[----------] 1 test from AArch64SelectionDAGTest (8 ms total)

[----------] 1 test from InstrRefLDVTest
[ RUN      ] InstrRefLDVTest.MTransferSubregSpills
[       OK ] InstrRefLDVTest.MTransferSubregSpills (64 ms)
[----------] 1 test from InstrRefLDVTest (64 ms total)

[----------] 1 test from RegAllocScoreTest
[ RUN      ] RegAllocScoreTest.Counts
[       OK ] RegAllocScoreTest.Counts (0 ms)
[----------] 1 test from RegAllocScoreTest (4 ms total)

[----------] 1 test from X86MCInstLowerTest
[ RUN      ] X86MCInstLowerTest.moExternalSymbol_MCSYMBOL

--
exit: -6
--
shard JSON output does not exist: /home/b/sanitizer-aarch64-linux-bootstrap-msan/build/llvm_build_msan/unittests/CodeGen/./CodeGenTests-LLVM-Unit-3200700-23-76.json
********************
Testing:  0.. 10.. 20.. 30.. 40.. 50.. 60.. 70.. 80.. 90.. 
Slowest Tests:
--------------------------------------------------------------------------
74.45s: LLVM :: CodeGen/AMDGPU/memintrinsic-unroll.ll
64.76s: LLVM :: CodeGen/AMDGPU/sched-group-barrier-pipeline-solver.mir
60.02s: Clang :: Driver/fsanitize.c
Step 11 (stage2/msan check) failure: stage2/msan check (failure)
Step 16 (stage2/msan_track_origins check) failure: stage2/msan_track_origins check (failure)
...
llvm-lit: /home/b/sanitizer-aarch64-linux-bootstrap-msan/build/llvm-project/llvm/utils/lit/lit/llvm/config.py:520: note: using lld-link: /home/b/sanitizer-aarch64-linux-bootstrap-msan/build/llvm_build_msan_track_origins/bin/lld-link
llvm-lit: /home/b/sanitizer-aarch64-linux-bootstrap-msan/build/llvm-project/llvm/utils/lit/lit/llvm/config.py:520: note: using ld64.lld: /home/b/sanitizer-aarch64-linux-bootstrap-msan/build/llvm_build_msan_track_origins/bin/ld64.lld
llvm-lit: /home/b/sanitizer-aarch64-linux-bootstrap-msan/build/llvm-project/llvm/utils/lit/lit/llvm/config.py:520: note: using wasm-ld: /home/b/sanitizer-aarch64-linux-bootstrap-msan/build/llvm_build_msan_track_origins/bin/wasm-ld
llvm-lit: /home/b/sanitizer-aarch64-linux-bootstrap-msan/build/llvm-project/llvm/utils/lit/lit/llvm/config.py:520: note: using ld.lld: /home/b/sanitizer-aarch64-linux-bootstrap-msan/build/llvm_build_msan_track_origins/bin/ld.lld
llvm-lit: /home/b/sanitizer-aarch64-linux-bootstrap-msan/build/llvm-project/llvm/utils/lit/lit/llvm/config.py:520: note: using lld-link: /home/b/sanitizer-aarch64-linux-bootstrap-msan/build/llvm_build_msan_track_origins/bin/lld-link
llvm-lit: /home/b/sanitizer-aarch64-linux-bootstrap-msan/build/llvm-project/llvm/utils/lit/lit/llvm/config.py:520: note: using ld64.lld: /home/b/sanitizer-aarch64-linux-bootstrap-msan/build/llvm_build_msan_track_origins/bin/ld64.lld
llvm-lit: /home/b/sanitizer-aarch64-linux-bootstrap-msan/build/llvm-project/llvm/utils/lit/lit/llvm/config.py:520: note: using wasm-ld: /home/b/sanitizer-aarch64-linux-bootstrap-msan/build/llvm_build_msan_track_origins/bin/wasm-ld
llvm-lit: /home/b/sanitizer-aarch64-linux-bootstrap-msan/build/llvm-project/llvm/utils/lit/lit/main.py:72: note: The test suite configuration requested an individual test timeout of 0 seconds but a timeout of 900 seconds was requested on the command line. Forcing timeout to be 900 seconds.
-- Testing: 87460 tests, 72 workers --
Testing:  0.. 10.. 20.. 30.. 40.. 50.. 60.. 70.. 80.. 
FAIL: LLVM-Unit :: CodeGen/./CodeGenTests/23/76 (78970 of 87460)
******************** TEST 'LLVM-Unit :: CodeGen/./CodeGenTests/23/76' FAILED ********************
Script(shard):
--
GTEST_OUTPUT=json:/home/b/sanitizer-aarch64-linux-bootstrap-msan/build/llvm_build_msan_track_origins/unittests/CodeGen/./CodeGenTests-LLVM-Unit-3286472-23-76.json GTEST_SHUFFLE=0 GTEST_TOTAL_SHARDS=76 GTEST_SHARD_INDEX=23 /home/b/sanitizer-aarch64-linux-bootstrap-msan/build/llvm_build_msan_track_origins/unittests/CodeGen/./CodeGenTests
--

Note: This is test shard 24 of 76.
[==========] Running 4 tests from 4 test suites.
[----------] Global test environment set-up.
[----------] 1 test from AArch64SelectionDAGTest
[ RUN      ] AArch64SelectionDAGTest.getTypeConversion_SplitScalableMVT
[       OK ] AArch64SelectionDAGTest.getTypeConversion_SplitScalableMVT (22 ms)
[----------] 1 test from AArch64SelectionDAGTest (22 ms total)

[----------] 1 test from InstrRefLDVTest
[ RUN      ] InstrRefLDVTest.MTransferSubregSpills
[       OK ] InstrRefLDVTest.MTransferSubregSpills (91 ms)
[----------] 1 test from InstrRefLDVTest (91 ms total)

[----------] 1 test from RegAllocScoreTest
[ RUN      ] RegAllocScoreTest.Counts
[       OK ] RegAllocScoreTest.Counts (1 ms)
[----------] 1 test from RegAllocScoreTest (1 ms total)

[----------] 1 test from X86MCInstLowerTest
[ RUN      ] X86MCInstLowerTest.moExternalSymbol_MCSYMBOL

--
exit: -6
--
shard JSON output does not exist: /home/b/sanitizer-aarch64-linux-bootstrap-msan/build/llvm_build_msan_track_origins/unittests/CodeGen/./CodeGenTests-LLVM-Unit-3286472-23-76.json
********************
Testing:  0.. 10.. 20.. 30.. 40.. 50.. 60.. 70.. 80.. 90.. 
Slowest Tests:
--------------------------------------------------------------------------
423.09s: LLVM :: CodeGen/AMDGPU/sched-group-barrier-pipeline-solver.mir
223.78s: Clang :: Analysis/runtime-regression.c
152.78s: LLVM :: CodeGen/AMDGPU/memintrinsic-unroll.ll

@weiweichen
Contributor Author

Sending PR #134481, which should help fix all the sanitizer failures reported here for X86MCInstLowerTest.cpp.

@llvm-ci
Collaborator

llvm-ci commented Apr 5, 2025

LLVM Buildbot has detected a new failure on builder lld-x86_64-win running on as-worker-93 while building llvm at step 7 "test-build-unified-tree-check-all".

Full details are available at: https://lab.llvm.org/buildbot/#/builders/146/builds/2646

Here is the relevant piece of the build log for the reference
Step 7 (test-build-unified-tree-check-all) failure: test (failure)
******************** TEST 'LLVM-Unit :: Support/./SupportTests.exe/82/95' FAILED ********************
Script(shard):
--
GTEST_OUTPUT=json:C:\a\lld-x86_64-win\build\unittests\Support\.\SupportTests.exe-LLVM-Unit-10736-82-95.json GTEST_SHUFFLE=0 GTEST_TOTAL_SHARDS=95 GTEST_SHARD_INDEX=82 C:\a\lld-x86_64-win\build\unittests\Support\.\SupportTests.exe
--

Script:
--
C:\a\lld-x86_64-win\build\unittests\Support\.\SupportTests.exe --gtest_filter=ProgramEnvTest.CreateProcessLongPath
--
C:\a\lld-x86_64-win\llvm-project\llvm\unittests\Support\ProgramTest.cpp(160): error: Expected equality of these values:
  0
  RC
    Which is: -2

C:\a\lld-x86_64-win\llvm-project\llvm\unittests\Support\ProgramTest.cpp(163): error: fs::remove(Twine(LongPath)): did not return errc::success.
error number: 13
error message: permission denied



C:\a\lld-x86_64-win\llvm-project\llvm\unittests\Support\ProgramTest.cpp:160
Expected equality of these values:
  0
  RC
    Which is: -2

C:\a\lld-x86_64-win\llvm-project\llvm\unittests\Support\ProgramTest.cpp:163
fs::remove(Twine(LongPath)): did not return errc::success.
error number: 13
error message: permission denied




********************


weiweichen added a commit that referenced this pull request Apr 5, 2025
Reordering `OS` and `PassMgrF` should fix the ASan failure caused by `OS` being destroyed before `PassMgrF` deletes the AsmPrinter.

As shown in [this ASan run](https://lab.llvm.org/buildbot/#/builders/52/builds/7340/steps/12/logs/stdio)

```
  This frame has 15 object(s):
    [32, 48) 'PassMgrF' (line 154)
    [64, 1112) 'Buf' (line 155)
    [1248, 1304) 'OS' (line 156) <== Memory access at offset 1280 is inside this variable
```
which indicates an ordering problem. 

This should help fix all the sanitizer failures caused by the `X86MCInstLowerTest.cpp` test introduced by [this PR](#133352 (comment)).
llvm-sync bot pushed a commit to arm/arm-toolchain that referenced this pull request Apr 6, 2025
Reordering `OS` and `PassMgrF` should fix the ASan failure caused by `OS` being destroyed before `PassMgrF` deletes the AsmPrinter.

As shown in [this ASan run](https://lab.llvm.org/buildbot/#/builders/52/builds/7340/steps/12/logs/stdio)

```
  This frame has 15 object(s):
    [32, 48) 'PassMgrF' (line 154)
    [64, 1112) 'Buf' (line 155)
    [1248, 1304) 'OS' (line 156) <== Memory access at offset 1280 is inside this variable
```
which indicates an ordering problem.

This should help fix all the sanitizer failures caused by the `X86MCInstLowerTest.cpp` test introduced by [this PR](llvm/llvm-project#133352 (comment)).