I have code that reads some binary files into a database. I wrote a for loop that iterates over my array of file paths and calls a READ function that loads each file into the database.

When I run this code with a traditional for loop everything goes smoothly, but when I change the for to Parallel.For it does not. I adjusted my code because shared resources and other gray areas were creating problems in the loop logic, so I isolated everything to a single loop iteration.
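The original loop is not shown; a minimal sketch of that per-iteration isolation could look like the following, where Importer, Read and the file paths are hypothetical stand-ins for the question's READ function and path array:

    using System;
    using System.Threading.Tasks;

    class Importer
    {
        // Hypothetical stand-in for the question's READ function: it should
        // open its own connection and write a single file into the database.
        static void Read(string path)
        {
            Console.WriteLine("Importing " + path);
        }

        static void Main()
        {
            string[] filePaths = { "a.bin", "b.bin", "c.bin" };   // placeholder paths

            // Everything an iteration needs is created inside that iteration
            // (or inside Read itself), so nothing is shared across iterations.
            Parallel.For(0, filePaths.Length, i =>
            {
                Read(filePaths[i]);
            });
        }
    }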

Now the parallel version runs roughly twice as fast, with one exception:

  1. We have 600 files to read; of these 600, only 5-6 are really big files.
  2. One or two of these big files do not make it into the database, and no exception is generated. When I simply change Parallel.For back to For, they all get read, so the code inside the loop body is fine.

Can you suggest a method or an alternative to Parallel.For? It feels as if one of the worker threads simply dies for no reason, which seems impossible, but that is what appears to be happening: I log at each step, and in different runs the iteration stops at a different point.

Also, even though I have a 4-core processor, it processes as many as 30-40 files at a time. Per the documentation I was expecting 3-4 threads at most. Any idea?
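As the comments below point out, the number of concurrent iterations can be capped through ParallelOptions.MaxDegreeOfParallelism; without a cap, the scheduler keeps adding worker threads while iterations block on file and database I/O, which is how 30-40 iterations can end up running at once on 4 cores. A minimal sketch, reusing the hypothetical Read stand-in from above:

    using System;
    using System.Threading.Tasks;

    class CappedImporter
    {
        static void Read(string path)          // same hypothetical stand-in as above
        {
            Console.WriteLine("Importing " + path);
        }

        static void Main()
        {
            string[] filePaths = { "a.bin", "b.bin", "c.bin" };   // placeholder paths

            // Limit the loop to one worker per core.
            var options = new ParallelOptions
            {
                MaxDegreeOfParallelism = Environment.ProcessorCount
            };

            Parallel.For(0, filePaths.Length, options, i => Read(filePaths[i]));
        }
    }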

  • Concerning your second question: set ParallelOptions.MaxDegreeOfParallelism to the number of cores: msdn.microsoft.com/en-us/library/… Commented Jul 15, 2013 at 5:40
  • Concerning your first question: we cannot help you debug this, but if you are just looking for alternatives, here is a great tutorial on threading/parallelism in C#: albahari.com/threading Commented Jul 15, 2013 at 5:47
  • I would try to find the cause before looking for a solution. Maybe you can improve the exception handling and get a better idea of what is wrong: msdn.microsoft.com/en-us/library/dd460695.aspx. If you really want an alternative, you can check out TPL Dataflow: msdn.microsoft.com/en-us/library/hh228603.aspx (a short sketch follows these comments). Commented Jul 15, 2013 at 7:13
  • Sorry, I had to go out of town and could not reply earlier. I understand that you cannot debug this and I don't expect you to debug it for me; good ideas are all I am looking for. Thanks for your feedback so far, I am researching these suggestions. :) Commented Jul 16, 2013 at 6:21
  • Okay @DocBrown, your suggestion of limiting the execution works well. I limited the degree of parallelism to the number of processors I have, i.e. 4, and all is well. It is 4 minutes slower than the last run, but it read 1 million more records [probably why it is slower this time] since it did more work. So everything is good now. :) Commented Jul 16, 2013 at 7:51
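For completeness, the TPL Dataflow alternative mentioned in the comments could look roughly like the sketch below; ActionBlock lives in the System.Threading.Tasks.Dataflow package, and Read is the same hypothetical stand-in used earlier.

    using System;
    using System.Threading.Tasks.Dataflow;

    class DataflowImporter
    {
        static void Read(string path)          // same hypothetical stand-in as above
        {
            Console.WriteLine("Importing " + path);
        }

        static void Main()
        {
            string[] filePaths = { "a.bin", "b.bin", "c.bin" };   // placeholder paths

            // An ActionBlock processes posted items with a bounded degree of
            // parallelism, much like the capped Parallel.For above.
            var importBlock = new ActionBlock<string>(
                path => Read(path),
                new ExecutionDataflowBlockOptions
                {
                    MaxDegreeOfParallelism = Environment.ProcessorCount
                });

            foreach (var path in filePaths)
                importBlock.Post(path);

            importBlock.Complete();          // signal that no more files will be posted
            importBlock.Completion.Wait();   // block until all imports have finished
        }
    }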
