C++ efficient concurrent programming

The basic model of concurrent programming has two parts: coordinating the order in which threads run through message passing, and protecting shared memory through mutual exclusion.
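A minimal sketch of both halves, using only the standard library (the producer/consumer names and the message variable are invented for illustration): one thread signals another through a condition variable to control the running order, while a mutex protects the shared data they exchange.

#include <condition_variable>
#include <cstdio>
#include <mutex>
#include <thread>

std::mutex mtx;                 // protects the shared state below
std::condition_variable cv;     // signals "a message is ready"
int message = 0;
bool ready = false;

void producer()
{
  {
    std::lock_guard<std::mutex> lock(mtx);  // shared memory: mutual exclusion
    message = 42;
    ready = true;
  }
  cv.notify_one();                          // message passing: wake the consumer
}

void consumer()
{
  std::unique_lock<std::mutex> lock(mtx);
  cv.wait(lock, [] { return ready; });      // running order: wait for the signal
  printf("got message %d\n", message);
}

int main()
{
  std::thread t1(consumer);
  std::thread t2(producer);
  t1.join();
  t2.join();
}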

Basic principles of thread synchronization

  1. Share as few variables as possible; consider using immutable objects.
  2. Keep lock granularity small: hold each lock for as short a time as possible (see the sketch after this list).
  3. Mutexes and condition variables are sufficient for most tasks; prefer high-level encapsulations built on top of them.
  4. Keep it simple: read-write locks, semaphores, and reentrant (recursive) locks add complexity, so use them with caution.
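As a sketch of point 2 (the pendingTasks container and the process()/drainTasks() names are invented for illustration), a common way to keep lock granularity small is to swap the shared container into a local one inside a short critical section and do the expensive work outside the lock:

#include <cstdio>
#include <mutex>
#include <vector>

std::mutex mtx;                  // protects pendingTasks
std::vector<int> pendingTasks;   // filled by producers, drained by a worker

void process(int task)           // stands in for the expensive per-task work
{
  printf("processing %d\n", task);
}

void drainTasks()
{
  std::vector<int> local;
  {
    std::lock_guard<std::mutex> lock(mtx);
    local.swap(pendingTasks);    // O(1) swap, so the lock is held only briefly
  }
  for (int task : local)         // the expensive work runs outside the lock
  {
    process(task);
  }
}

int main()
{
  {
    std::lock_guard<std::mutex> lock(mtx);
    pendingTasks.push_back(1);
    pendingTasks.push_back(2);
  }
  drainTasks();
}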

About deadlock

  1. Use RAII to bound the region in which a lock is held; the guard releases the mutex when it goes out of scope.
  2. Pay attention to the order in which locks are acquired, or acquire them together (see the sketch after this list).
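A minimal sketch of both points, assuming C++17 (the Account type and transfer() are invented for illustration): each lock is owned by an RAII guard, and when two mutexes are needed they are acquired together with std::scoped_lock, which uses a deadlock-avoidance algorithm and releases both on scope exit.

#include <mutex>

struct Account
{
  std::mutex mtx;
  long balance = 0;
};

// Two concurrent transfers in opposite directions cannot deadlock:
// std::scoped_lock acquires both mutexes atomically and the RAII guard
// releases them when the function returns.
void transfer(Account& from, Account& to, long amount)
{
  std::scoped_lock lock(from.mtx, to.mtx);
  from.balance -= amount;
  to.balance += amount;
}

int main()
{
  Account a, b;
  a.balance = 100;
  transfer(a, b, 30);
}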

Copy On Write reduces lock granularity

For read-only access, share the data cheaply through a shared_ptr. When modifications are rare compared to reads, copy the data before modifying it. For example, suppose the data is read 100 times per second but a new element is added only about once every ten seconds. Then apply copy-on-write to the additions: if the data is still in use by a reader when we want to add to it, make a copy and modify the copy. (This is safe because each reading thread holds its own shared_ptr to the data it is using.)
Let's look at an example:

#include <cassert>
#include <cstdio>
#include <vector>

#include <boost/shared_ptr.hpp>

// MutexLock and MutexLockGuard are RAII mutex wrappers; here they are
// assumed to come from the muduo base library (muduo/base/Mutex.h).
#include "muduo/base/Mutex.h"
using muduo::MutexLock;
using muduo::MutexLockGuard;

class Foo
{
 public:
  void doit() const;
};

typedef std::vector<Foo> FooList;
typedef boost::shared_ptr<FooList> FooListPtr;
FooListPtr g_foos;   // the shared list, always accessed under the mutex
MutexLock mutex;

// Write path: copy the list if readers still hold a reference, then modify.
void post(const Foo& f)
{
  printf("post\n");
  MutexLockGuard lock(mutex);
  if (!g_foos.unique())  // another thread is reading; copy the whole list
  {
    g_foos.reset(new FooList(*g_foos));
    printf("copy the whole list\n");
  }
  assert(g_foos.unique());
  g_foos->push_back(f);
}

// Read path: copy the shared_ptr under the lock, then traverse without it.
void traverse()
{
  FooListPtr foos;
  {
    MutexLockGuard lock(mutex);
    foos = g_foos;
    assert(!g_foos.unique());
  }

  // assert(!foos.unique()); this may not hold

  for (std::vector<Foo>::const_iterator it = foos->begin();
      it != foos->end(); ++it)
  {
    it->doit();
  }
}

void Foo::doit() const
{
  Foo f;
  post(f);  // called during traverse(); post() copies the list, so the iterators above stay valid
}

int main()
{
  g_foos.reset(new FooList);
  Foo f;
  post(f);
  traverse();
}
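Note the division of labor in this example: traverse() holds the mutex only long enough to copy the shared_ptr, so the actual traversal runs without the lock, and post() copies the list only when a reader still holds a reference to it, which keeps the reader's iterators valid. The cost of copying is paid on the rare write path instead of the frequent read path, which is exactly how copy-on-write reduces lock granularity.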
