Feature #639

method to distribute message receives across pes in node

Added by Jim Phillips over 4 years ago. Updated almost 2 years ago.

Status:
New
Priority:
Normal
Assignee:
PPL
Category:
-
Target version:
-
Start date:
12/29/2014
Due date:
% Done:

0%


Description

If I have a group or array chare that needs to send a large number of messages while the rest of the PEs in the node have nothing more urgent to do, I can use CkLoop to distribute the send work. On the other hand, if a group or array chare needs to receive a large number of urgent messages, there is no corresponding mechanism to distribute that work across the other threads in the node. The workaround is to send the messages to a nodegroup proxy that directly calls a member function on the target chare object, regardless of which PE the object belongs to, and uses atomics to count the number of messages received. This means the sender has to know which node the target chare lives on, and the nodegroup proxy needs pointers to all of the target objects. This could be handled more cleanly through the runtime, and ideally expressed in SDAG as a parallel for loop or other control structure. A sketch of the workaround is shown below.
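For illustration only, here is a minimal sketch of the nodegroup workaround described above. The names (RecvDistributor, ChareTarget, PartMsg, processPart, allPartsReceived) are hypothetical and not taken from NAMD or the Charm++ API; this assumes the usual Charm++ behavior that non-exclusive nodegroup entry methods may execute on any PE of the node.

  // .ci sketch (hypothetical):
  //   nodegroup RecvDistributor {
  //     entry RecvDistributor();
  //     entry void recvPart(PartMsg *msg);
  //   };

  #include <atomic>

  class RecvDistributor : public CBase_RecvDistributor {
    ChareTarget *target;            // target object living on this node
    std::atomic<int> received{0};   // message count shared by all PEs in the node
    int expected;

   public:
    void setTarget(ChareTarget *t, int nMsgs) { target = t; expected = nMsgs; }

    // Non-exclusive nodegroup entry method: it may run on any PE of the node,
    // so the per-message work is spread across idle PEs. processPart() must
    // therefore be safe to call concurrently from multiple threads.
    void recvPart(PartMsg *msg) {
      target->processPart(msg);     // direct call, regardless of the target's home PE
      if (received.fetch_add(1) + 1 == expected)
        target->allPartsReceived(); // last arrival triggers completion
    }
  };

The sender-side cost noted above shows up here: the sender must resolve the target's node and index the nodegroup proxy by node, e.g. distProxy[CkNodeOf(targetPe)].recvPart(msg), and each node-level branch must hold pointers to every target object on its node.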

History

#1 Updated by Nikhil Jain over 3 years ago

  • Target version changed from 6.7.0 to 6.8.0

#2 Updated by Phil Miller over 3 years ago

  • Assignee set to Harshitha Menon

I believe the task construct Harshitha is working on should directly address this exact use case.

#3 Updated by Harshitha Menon over 3 years ago

I have an implementation of the task queue and a parallel for built on it, but no SDAG construct yet. Jim, if you have a specific case where this is required and could benefit, could you please share it with us so that I can try it out?

#4 Updated by Jim Phillips over 3 years ago

One NAMD use case was the PME FFT transposes, specifically the NodePmeMgr methods recvTrans, recvUntrans, recvUngrid, recv[XY]Trans, and recv[YZ]Untrans.

#5 Updated by Sam White over 2 years ago

  • Assignee changed from Harshitha Menon to PPL
  • Target version changed from 6.8.0 to 6.9.0

Reassigning to PPL for now since Harshitha graduated...

#6 Updated by Phil Miller almost 2 years ago

  • Target version deleted (6.9.0)
