
Enhancement: Create large directory hash #39

Open
speedtoys opened this issue Apr 6, 2023 · 0 comments
speedtoys commented Apr 6, 2023

Greetings! I'm new to your tool. Your friends at Vast and I have a long history at Yahoo, and now I'm at Intel.

What if I want to stress-test with a more complex metadata (MD) strategy, like I can with a commercial tool such as Virtana Workload Wisdom?

Where I can define:

- number of dirs
- number of subdirs in each dir
- depth of the tree
- number of files per directory

And perform this across any number of exported POSIX filesystems available to a cluster of compute hosts, and/or define a number of threads that traverse the hashed tree either randomly or in order (north to south, west to east, etc.).
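To make the request concrete, here is a minimal sketch of the kind of parameterized tree generation described above. This is not elbencho code; all function and parameter names (`build_forest`, `top_dirs`, `subdirs_per_dir`, etc.) are hypothetical, chosen only to mirror the four knobs listed.

```python
import os

def build_tree(path, subdirs_per_dir, files_per_dir, depth):
    """Recursively create one directory subtree of the given shape.

    Every directory (including interior ones) gets files_per_dir
    empty files; recursion stops when depth reaches 0.
    """
    os.makedirs(path, exist_ok=True)
    for f in range(files_per_dir):
        # empty placeholder files; a real tool would write data here
        open(os.path.join(path, f"file{f}"), "w").close()
    if depth > 0:
        for d in range(subdirs_per_dir):
            build_tree(os.path.join(path, f"dir{d}"),
                       subdirs_per_dir, files_per_dir, depth - 1)

def build_forest(root, top_dirs, subdirs_per_dir, files_per_dir, depth):
    """Create top_dirs independent subtrees under root."""
    for d in range(top_dirs):
        build_tree(os.path.join(root, f"top{d}"),
                   subdirs_per_dir, files_per_dir, depth)
```

With `top_dirs=2, subdirs_per_dir=2, files_per_dir=1, depth=2`, each top-level subtree contains 1 + 2 + 4 = 7 directories, so the forest has 14 directories and 14 files total. A distributed benchmark would shard the top-level directories (or a hash of the full path list) across compute hosts, then let each worker thread walk its shard randomly or in a fixed order.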

I need to run elbencho against storage for AI workloads primarily, but there is also a need for the same vendor solution(s) for general storage, including HFC, HPC, and EDA workloads.

I look forward to your thoughts.

-JeffM. (Tell AP and JD I said hello!)
