Would it make sense to create shared type aliases, e.g.

```rust
pub type UHashMap<K, V> = ahash::HashMap<K, V>;
pub type UHashSet<T> = ahash::HashSet<T>;
pub use ahash::HashMapExt as UHashMapExt;
pub use ahash::HashSetExt as UHashSetExt;
```

This would reduce the number of PRs and create a central place that makes potential later replacements easier, since only the alias needs to be changed instead of every use site.
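For illustration, a minimal, self-contained sketch of the aliases plus a hypothetical use site (`count_words` is made up, not from this PR; `ahash` is assumed as a dependency):

```rust
// Sketch only: the aliases are central, so swapping the underlying hasher
// later is a one-line change here instead of an edit at every use site.
pub type UHashMap<K, V> = ahash::HashMap<K, V>;
pub use ahash::HashMapExt as UHashMapExt;

// Hypothetical use site that only names the alias, never `ahash` directly.
fn count_words<'a>(words: &[&'a str]) -> UHashMap<&'a str, usize> {
    // `new()` comes from the re-exported `HashMapExt` trait.
    let mut counts = UHashMap::new();
    for &w in words {
        *counts.entry(w).or_insert(0) += 1;
    }
    counts
}

fn main() {
    let counts = count_words(&["foo", "bar", "foo"]);
    assert_eq!(counts["foo"], 2);
}
```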
|
I previously tried it, but it failed.
|
Each hasher is optimized for different sizes. So the name
|
Just my two cents: I think we can all agree that SipHash is slow (its selling point is DoS protection), and there is a wide range of faster hashes, e.g. ahash, fxhash, rapidhash, xxh3, rustc-hash, you name it. It's true that they have different performance characteristics for different input sizes (u64/filename/text/etc.), but much of this is platform- and input-dependent. More importantly, having a different hasher for each case is not worth the added complexity and security implications, nor the reduced maintainability. I think a single good-enough common hashing algorithm would make more sense.
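To make the "platform/input dependent" point concrete, here is a rough micro-benchmark sketch (not from this PR; `ahash` and `rustc-hash` are assumed as dependencies, and it should be run with `--release`) that hashes a small integer key and a longer string with two candidate hashers:

```rust
use std::hash::{BuildHasher, BuildHasherDefault, Hash, Hasher};
use std::time::Instant;

// Hash `value` one million times with the given build-hasher and print the
// elapsed time; the XOR checksum keeps the optimizer from dropping the loop.
fn bench<S: BuildHasher, T: Hash>(label: &str, state: &S, value: &T) {
    let start = Instant::now();
    let mut acc = 0u64;
    for _ in 0..1_000_000 {
        let mut h = state.build_hasher();
        value.hash(&mut h);
        acc ^= h.finish();
    }
    println!("{label}: {:?} (checksum {acc:x})", start.elapsed());
}

fn main() {
    let small = 42u64; // models a numeric key such as an inode or uid
    let text = "a/reasonably/long/path/that/models/a/filename".to_string();
    let fx = BuildHasherDefault::<rustc_hash::FxHasher>::default();
    bench("ahash / u64", &ahash::RandomState::new(), &small);
    bench("ahash / str", &ahash::RandomState::new(), &text);
    bench("fx    / u64", &fx, &small);
    bench("fx    / str", &fx, &text);
}
```

Which hasher wins can differ between the integer and string inputs and across platforms, which is exactly the point being made above.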
|
I'd like to have at least 2 hashers (e.g. the recently dropped Fnv).