Update for dxc to the latest version. #211

Merged
merged 1 commit into o3de:main on Oct 6, 2023

Conversation

@jhmueller-huawei
Contributor

jhmueller-huawei commented Aug 24, 2023

The new dxc version fixes multiple issues that we encountered during shader compilation with the older version.

We have tested this on Linux (x64) and Windows by letting the asset processor compile all shaders (some changes are needed: o3de/o3de#16568), but we could not test it on Linux (aarch64) or macOS. On Windows, when building with a newer Visual Studio version, package-system/DirectXShaderCompiler/build_dxc_windows.cmd needs to be modified by either removing -vs2019 or changing it to -vs2022, as we did; this change is not included in this patch.

jhmueller-huawei requested review from moudgils, a team, and lemonade-dm, and removed the request for a team on August 24, 2023 07:15
@moudgils
Contributor

moudgils left a comment

My only concern here is that now you are grabbing directly from MS instead of the O3DE repo. Have we checked whether the O3DE repo had some custom change that will get lost as part of this switch? I seem to remember we had a small change related to Android, but I'm not sure what it was. Tagging @galibzon.

@lemonade-dm
Contributor

lemonade-dm commented Aug 24, 2023

> My only concern here is that now you are grabbing directly from MS instead of the O3DE repo. Have we checked whether the O3DE repo had some custom change that will get lost as part of this switch? I seem to remember we had a small change related to Android, but I'm not sure what it was. Tagging @galibzon.

@moudgils It looks like you made a branch with some driver fixes in the o3de fork, the release-1.6.2112-o3de branch: https://github.com/o3de/DirectXShaderCompiler/commits/release-1.6.2112-o3de

@galibzon
Contributor

@jhmueller-huawei, please take a look at the branch that contains our customizations, as shared by @lumberyard-employee-dm: https://github.com/o3de/DirectXShaderCompiler/commits/release-1.6.2112-o3de

The last three commits in that branch are all our customizations.

@jhmueller-huawei
Contributor Author

jhmueller-huawei commented Aug 25, 2023

I've had a look at these changes; there are basically two (one of them is split into two commits):

  • The first one adds a command line option: -fvk-disable-depth-hint. I searched for it in the o3de codebase, but could not find it, so I assume that it is not used and thus not needed anymore?
  • The other change concerns outputting an Invariant decoration for precise builtin variables. I could not figure out when this is necessary, since there were no hints in the corresponding commit/PR. I can only assume that it has to do with gl_Position needing to be invariant for multi-pass pipelines. Is it still needed, and if so, what for? From what I read here, it seems that precise should ensure invariance as well, which is what dxc outputs without this patch.

I have also opened a PR to get the recent changes from the MS repository here: o3de/DirectXShaderCompiler#3 (or maybe someone with the appropriate rights can just push the Sync fork button on GitHub, since DCO fails in this PR).

@moudgils
Contributor

moudgils commented Aug 25, 2023

  • -fvk-disable-depth-hint is used for Android shaders; its usage can be seen here: \o3de\Gems\Atom\Asset\Shader\Config\Platform\Android\shader_build_options
  • Tagging the position with Invariant was done for the Metal backend. On tiled hardware, the position in the forward pass would sometimes not match the depth pass, and this created artifacts. The way around it is to tag the position stream with invariant, which leads spirv-cross further down the line to tag it appropriately in MetalSL (see the illustrative sketch below).
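
A minimal HLSL sketch of the situation described above, for illustration only; the struct, function, and constant buffer names are hypothetical and are not taken from o3de or the DXC patch:

```hlsl
// Hypothetical vertex shader. Marking the position output `precise`
// prevents the compiler from reordering or contracting the math that
// produces it, so a depth-only pass and the forward pass compute the
// same position. Per the discussion above, the customized DXC also
// emits the SPIR-V Invariant decoration for this precise builtin,
// which spirv-cross then carries over when generating Metal shader code.
cbuffer ViewConstants : register(b0) // illustrative name
{
    float4x4 m_viewProjectionMatrix;
};

struct VSOutput
{
    precise float4 m_position : SV_Position;
    float2 m_uv : TEXCOORD0;
};

VSOutput MainVS(float3 position : POSITION, float2 uv : TEXCOORD0)
{
    VSOutput output;
    output.m_position = mul(m_viewProjectionMatrix, float4(position, 1.0));
    output.m_uv = uv;
    return output;
}
```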

Yeah, the correct way is to merge the latest from MS into https://github.com/o3de/DirectXShaderCompiler, create a branch/tag, and then have the 3p-scripts pull from that tag. @galibzon are you able to help merge o3de/DirectXShaderCompiler#3 by giving it an approval? Thanks.

@jhmueller-huawei
Contributor Author

Okay, I have now also created a branch for DXC containing these fixes (squashing the fix that was split into two commits), but I had to push it to my own fork of DXC since I cannot create branches on o3de/DirectXShaderCompiler: https://github.com/jhmueller-huawei/DirectXShaderCompiler/tree/release-1.7.2308-o3de. Can you create that branch on o3de/DirectXShaderCompiler, please?

@moudgils
Contributor

o3de/DirectXShaderCompiler

@amzn-changml Are we able to give @jhmueller-huawei access to create PRs and submit to o3de/DirectXShaderCompiler repo?

@amzn-changml
Contributor

> o3de/DirectXShaderCompiler
>
> @amzn-changml Are we able to give @jhmueller-huawei access to create PRs and submit to o3de/DirectXShaderCompiler repo?

Yes, that should already be possible. To be clear, the intended workflow is to fork the repo to your own account, create a branch with changes, then make a PR back to the parent.

The new dxc version fixes multiple issues that we encounter during shader compilation with the older version.

Signed-off-by: Joerg H. Mueller <[email protected]>
@jhmueller-huawei
Contributor Author

jhmueller-huawei commented Aug 29, 2023

Indeed, I was able to push a new branch to https://github.com/o3de/DirectXShaderCompiler. I've updated the PR with the new branch now.

@moudgils
Contributor

moudgils commented Sep 1, 2023

The changes in this PR look good. The next step would be to build the 4 packages using the scripts and get them uploaded to the S3 prod bucket. Once uploaded, you can update the version/hash in the core engine. I would refrain from merging this PR until all the steps are done; then you can merge this PR together with the one for the core engine.

@jhmueller-huawei
Contributor Author

> The changes in this PR look good. The next step would be to build the 4 packages using the scripts and get them uploaded to the S3 prod bucket. Once uploaded, you can update the version/hash in the core engine. I would refrain from merging this PR until all the steps are done; then you can merge this PR together with the one for the core engine.

But do you know who can do that? I asked in sig-core on Discord, but didn't get a response.

@lemonade-dm
Contributor

> > The changes in this PR look good. The next step would be to build the 4 packages using the scripts and get them uploaded to the S3 prod bucket. Once uploaded, you can update the version/hash in the core engine. I would refrain from merging this PR until all the steps are done; then you can merge this PR together with the one for the core engine.
>
> But do you know who can do that? I asked in sig-core on Discord, but didn't get a response.

SIG-Build can currently promote packages to the production bucket.

@jhmueller-huawei
Contributor Author

> SIG-Build can currently promote packages to the production bucket.

Great! @amzn-changml could you please have a go at this?

@amzn-changml
Contributor

> > SIG-Build can currently promote packages to the production bucket.
>
> Great! @amzn-changml could you please have a go at this?

Taking a look, but I'm not sure where this was built. @moudgils, have you attempted to build this? If not, I'll see about staging and validating it before promoting it.

@lemonade-dm
Contributor

Well, since this is, I believe, the first time someone from outside of Amazon is uploading a 3rdParty package, there would need to be some kind of mechanism to publish the package to a development bucket that a maintainer of O3DE can upload to.

@amzn-changml
Contributor

> Well, since this is, I believe, the first time someone from outside of Amazon is uploading a 3rdParty package, there would need to be some kind of mechanism to publish the package to a development bucket that a maintainer of O3DE can upload to.

In lieu of an automated approach (which has an RFC), we can at least put in a semi-automated one with Jenkins to maintain the established workflow.

@jhmueller-huawei
Contributor Author

Any updates, @amzn-changml? It would be great if this could be merged soon.

@amzn-changml
Contributor

> Any updates, @amzn-changml? It would be great if this could be merged soon.

I'm working on a build process in GitHub Actions for use in promoting the package. It's nearly ready, but I'm still working out support for aarch64/arm64.

@amzn-changml
Contributor

I've got the build process completed and it has published to an O3DE dev bucket. I'll be making a PR to add the automation code shortly.

@jhmueller-huawei
Contributor Author

> I've got the build process completed and it has published to an O3DE dev bucket. I'll be making a PR to add the automation code shortly.

Great, thanks!

@amzn-changml
Contributor

I'm going to close and then reopen this PR to force the automation to go through.

@jhmueller-huawei
Contributor Author

Automated building seems to work now? Nice work! So the only thing missing is to upload the packages to the 3rd party packages server?

@amzn-changml
Contributor

We also need to run the AR on the changes against the 3p package before we promote to production. The PR has been autogenerated here: o3de/o3de#16822

@amzn-changml
Contributor

I'm reopening this PR to restart the build due to changes in #223. This will resolve the glibc conflict found in o3de/o3de#16822.

amzn-changml reopened this on Oct 3, 2023
@jhmueller-huawei
Contributor Author

I guess this can be merged? Just needs one more approval. I merged the o3de changes.

@amzn-changml
Contributor

@moudgils Are we good to merge this?

amzn-changml merged commit 280f5e2 into o3de:main on Oct 6, 2023
44 of 48 checks passed