Distributed Inference Test with 8x Qualcomm Snapdragon 8 Gen 2 Nodes #180
alphadance started this conversation in Results
Replies: 2 comments
- Hello @alphadance, I see in the CPU section that […]. You can try to reduce the context size for tests: […]. About the network: I plan to extend the benchmark in the next releases; it is currently not available.
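For a sense of why a smaller context can help on phone-class devices: the KV cache each node has to hold grows linearly with the context length, and on mobile SoCs that memory pressure shows up quickly in token latency. A minimal sketch, assuming a 7B-class llama-style model (32 layers, 32 KV heads, head dim 128, FP16 cache values); none of these shapes are taken from this test:

```python
# Rough KV-cache footprint for a llama-style transformer.
# All shape constants are assumptions (a 7B-class model), not values
# taken from this benchmark.

def kv_cache_bytes(seq_len: int,
                   n_layers: int = 32,
                   n_kv_heads: int = 32,
                   head_dim: int = 128,
                   bytes_per_value: int = 2) -> int:
    """Bytes needed for the K and V caches at a given context length."""
    per_token = 2 * n_layers * n_kv_heads * head_dim * bytes_per_value  # K + V
    return seq_len * per_token

for seq_len in (512, 2048, 4096):
    mb = kv_cache_bytes(seq_len) / (1024 ** 2)
    print(f"context {seq_len:>5}: ~{mb:,.0f} MB of KV cache")
```

Halving the test context roughly halves that footprint, which makes it easier to tell memory pressure apart from raw compute or network limits.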
- @alphadance you can try the 0.12.8 version; this version shows more details about the synchronization in […].
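If the newer output does break synchronization out separately, it is useful to compare it against a rough estimate of how much activation data must cross the network per generated token. A minimal sketch, assuming each extra worker exchanges one hidden-state-sized vector per layer per token (the project's actual protocol and any quantized transfer format may differ); hidden size, layer count, and link speed are placeholder assumptions:

```python
# Back-of-envelope: time per token spent moving activations between nodes.
# hidden_dim, n_layers, and link_mbps are placeholder assumptions, not
# values from this benchmark or from the project's actual wire protocol.

def sync_time_per_token_ms(n_nodes: int,
                           hidden_dim: int = 4096,
                           n_layers: int = 32,
                           bytes_per_value: int = 2,
                           link_mbps: float = 1000.0) -> float:
    """Estimate ms/token spent exchanging activations over the network."""
    payload_bytes = n_layers * hidden_dim * bytes_per_value * (n_nodes - 1)
    bytes_per_ms = link_mbps * 1_000_000 / 8 / 1000
    return payload_bytes / bytes_per_ms

for nodes in (2, 4, 8):
    print(f"{nodes} nodes: ~{sync_time_per_token_ms(nodes):.1f} ms/token on the wire")
```

If the reported synchronization time is in the same ballpark as (or larger than) the compute portion, the link between the phones, not the SoCs, is likely the limiting factor.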
Configuration:
Performance Results:
Full Log Output:
Unexpected Observation ❓
Despite using 8x Snapdragon 8 Gen 2 nodes (theoretically more powerful than Raspberry Pi clusters), the performance metrics failed to surpass those achieved by 4x Raspberry Pi nodes in previous tests.
Key questions:
Would appreciate any insights or debugging suggestions! 🛠️
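One way to reason about the 8-node vs 4-node outcome: per-token latency in a setup like this is roughly the single-node compute time divided by the node count, plus a synchronization cost that grows with the node count and with a slower link. A purely illustrative model; every constant below is an assumption, not a measurement from this run:

```python
# Illustrative scaling model: adding nodes only helps while the per-node
# compute savings outweigh the growing synchronization cost.
# Every constant here is an assumption, not a measured value.

def time_per_token_ms(n_nodes: int,
                      single_node_compute_ms: float,
                      sync_ms_per_extra_node: float) -> float:
    compute = single_node_compute_ms / n_nodes
    sync = sync_ms_per_extra_node * (n_nodes - 1)
    return compute + sync

# Hypothetical pairings: fast SoCs on a slow (e.g. Wi-Fi) link vs.
# slower SBCs on wired Ethernet.
scenarios = {
    "fast SoC / slow link": (400.0, 40.0),
    "slow SBC / fast link": (900.0, 5.0),
}

for label, (compute_ms, sync_ms) in scenarios.items():
    print(label)
    for n in (1, 2, 4, 8):
        print(f"  {n} nodes: {time_per_token_ms(n, compute_ms, sync_ms):6.1f} ms/token")
```

With assumptions like these, the faster SoCs on the slower link peak around four nodes and regress at eight, while the weaker boards on wired Ethernet keep improving, which would reproduce the observation above without any single node being slow.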