How to improve performance when exporting large amounts of data to Excel in C#

First, to improve performance, we need to know where the time is actually spent:

1. Database queries
2. Too much loop nesting when combining the data into a new data set

Second, how do we optimize these?

First, database queries:
1. If the amount of data is small, we can use a temporary DataTable and join queries. But if the tables hold millions or tens of millions of rows, join queries are not recommended. What can we do in that case?
2. We can query each table on its own and then look up the related data for each row inside the loop body. What do we need to watch out for when doing this?
3. Reduce the number of database queries! This is the key. How do we reduce them? The usual approach looks like the code below, but with large data volumes it wastes a lot of performance:

foreach (var item in listST)
{
    var sModel = getModel();   // if this method hits the database on every iteration, performance suffers badly:
                               // the loop becomes extremely slow, and even with a cache it is still slow
    item.name = sModel.name;
}

4. We can approach it differently and optimize it as in the code below:

List<Student> listST = new List<Student>();   // the rows to export; assume this has already been filled
List<Student> batchST = new List<Student>();  // holds the current batch of related records
int g = 0;
foreach (var item in listST)
{
    // query the related data once every two thousand rows
    // and keep the result in memory as a batch
    if (g % 2000 == 0)
    {
        batchST = GetList($"id in ({string.Join(",", listST.Skip(g).Take(2000).Select(m => m.id).Distinct().ToArray())})", 2000, 1, "name,Id");
    }
    // look the related record up in the in-memory batch instead of hitting the database
    var sModel = batchST.FirstOrDefault(m => m.id == item.id);
    if (sModel != null)
    {
        item.name = sModel.name;
    }
    g++;
}
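
The GetList helper used above is the author's own data-access method and its implementation is not shown. Purely as an illustration, here is a minimal sketch of what such a paged query helper might look like, assuming a SQL Server backend, a Student table with Id and Name columns, and that the parameters mean where-clause, page size, page index, and column list (all of these are assumptions inferred from the call above, not the author's actual code):

using System.Collections.Generic;
using System.Data.SqlClient;

public class Student
{
    public int id { get; set; }
    public string name { get; set; }
}

public static class StudentRepository
{
    // hypothetical paged query helper matching the call GetList(whereClause, pageSize, pageIndex, columns);
    // the where clause is concatenated here only because the caller builds an "id in (1,2,3)" list of
    // integers - never concatenate user-supplied text into SQL
    public static List<Student> GetList(string whereClause, int pageSize, int pageIndex, string columns)
    {
        var result = new List<Student>();
        string sql = $"SELECT {columns} FROM Student WHERE {whereClause} ORDER BY Id " +
                     $"OFFSET {(pageIndex - 1) * pageSize} ROWS FETCH NEXT {pageSize} ROWS ONLY";
        using (var conn = new SqlConnection("<your connection string>"))
        using (var cmd = new SqlCommand(sql, conn))
        {
            conn.Open();
            using (var reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    result.Add(new Student
                    {
                        id = reader.GetInt32(reader.GetOrdinal("Id")),
                        name = reader.GetString(reader.GetOrdinal("Name"))
                    });
                }
            }
        }
        return result;
    }
}

The important property is that one call fetches up to 2000 related rows in a single round trip, so the loop above touches the database roughly listST.Count / 2000 times instead of once per row.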

Second, too much loop nesting when combining the data into a new data set: the optimization here is to reduce the nesting of loops.
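
A common hidden form of nesting is calling FirstOrDefault (or running an inner foreach) over a second list for every row of an outer loop: that is effectively an O(n × m) double loop. Building a dictionary keyed by id first turns each lookup into a constant-time operation. A minimal sketch, reusing the Student type from above and assuming both lists are already in memory (the variable names here are illustrative, not from the original post):

// requires System.Collections.Generic and System.Linq
List<Student> listST = new List<Student>();       // rows to export, assumed already filled
List<Student> relatedList = new List<Student>();  // related records fetched in one query, assumed already filled

// nested version: for every row we scan the whole related list - effectively O(n * m)
foreach (var item in listST)
{
    var sModel = relatedList.FirstOrDefault(m => m.id == item.id);
    if (sModel != null)
    {
        item.name = sModel.name;
    }
}

// flattened version: build the lookup once, then each match is O(1)
Dictionary<int, Student> byId = relatedList
    .GroupBy(m => m.id)
    .ToDictionary(grp => grp.Key, grp => grp.First());

foreach (var item in listST)
{
    if (byId.TryGetValue(item.id, out var sModel))
    {
        item.name = sModel.name;
    }
}

The same idea applies to the batched example above: replacing the FirstOrDefault lookup inside the loop with a per-batch dictionary removes the last hidden nested loop.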

Origin www.cnblogs.com/May-day/p/11322409.html