dealing with data too large for a single buffer #6138

Open · wants to merge 14 commits into trunk

Conversation

@alphastrata alphastrata commented Aug 20, 2024

Connections
Link to the issues addressed by this PR, or dependent PRs in other repositories
discussion thread on matrix

Description
The aim of this new example is to demonstrate taking a large input dataset, splitting it into chunks so it can be moved onto the GPU across several buffers, and then treating it as a single contiguous data structure once it is on the GPU.
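For illustration, here is a minimal sketch of the CPU-side half of that idea, not the code from this PR: a hypothetical helper (`upload_in_chunks`, name invented here) splits a byte slice into chunks no larger than the device limits and uploads each chunk into its own storage buffer. How the shader then indexes across those buffers so they read as one contiguous array is left to the example's WGSL and is not shown.

```rust
// A minimal sketch, assuming the `upload_in_chunks` name and signature are
// purely illustrative and may differ from the example added in this PR.
fn upload_in_chunks(
    device: &wgpu::Device,
    queue: &wgpu::Queue,
    data: &[u8],
) -> Vec<wgpu::Buffer> {
    // `Queue::write_buffer` requires sizes to be multiples of COPY_BUFFER_ALIGNMENT
    // (4 bytes), so this sketch assumes the input is already 4-byte sized
    // (e.g. the raw bytes of f32 or u32 elements).
    assert_eq!(
        data.len() % wgpu::COPY_BUFFER_ALIGNMENT as usize,
        0,
        "input must be a multiple of COPY_BUFFER_ALIGNMENT bytes"
    );

    let limits = device.limits();
    // Each chunk must fit in a single buffer and, if bound as a storage buffer,
    // in a single storage-buffer binding.
    let max_chunk = limits
        .max_buffer_size
        .min(limits.max_storage_buffer_binding_size as u64) as usize;
    // Round the chunk size down to the copy alignment so every chunk stays 4-byte sized.
    let chunk_size = max_chunk - (max_chunk % wgpu::COPY_BUFFER_ALIGNMENT as usize);

    data.chunks(chunk_size)
        .enumerate()
        .map(|(i, chunk)| {
            let label = format!("large-data chunk {i}");
            let buffer = device.create_buffer(&wgpu::BufferDescriptor {
                label: Some(&label),
                size: chunk.len() as u64,
                usage: wgpu::BufferUsages::STORAGE | wgpu::BufferUsages::COPY_DST,
                mapped_at_creation: false,
            });
            // Stage the chunk; the copy is performed when work is submitted to the queue.
            queue.write_buffer(&buffer, 0, chunk);
            buffer
        })
        .collect()
}
```

On the GPU side, a shader could then map a global element index to a (buffer, offset) pair, for example buffer `i / elements_per_buffer` and offset `i % elements_per_buffer`, so the chunks behave like one contiguous array.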

Testing
Explain how this change is tested.

Checklist

  • Run cargo fmt.
  • Run cargo clippy. If applicable, add:
    • --target wasm32-unknown-unknown
    • --target wasm32-unknown-emscripten
  • Run cargo xtask test to run tests.
  • Add change to CHANGELOG.md. See simple instructions inside file.

@alphastrata marked this pull request as ready for review August 20, 2024 22:28
@alphastrata requested a review from a team as a code owner August 20, 2024 22:28
@alphastrata changed the title from DRAFT: dealing with data too large for a single buffer to dealing with data too large for a single buffer Aug 22, 2024