Anyone who has worked on program performance optimization, or who cares about program performance, has probably used some kind of caching technology. The Cache I am talking about today refers specifically to the ASP.NET Cache — the one we access through HttpRuntime.Cache — rather than other caching techniques.

I mentioned it briefly in my previous blog entry, and I planned to write a dedicated post about it, because it is so important. In this post I will introduce not only some of its common uses, but also some of its advanced usage. At the end of my last post I left you a question; today, in this post, I will give what I consider the perfect answer.

This post also presents a [delayed operation] technique drawn from my own experience (for example: merging and delaying writes to the database). I hope you like the idea.

The basic purpose of the Cache

Speaking of the Cache, we have to start with its main purpose: improving program performance.
ASP.NET is a dynamic page technology, and pages built with ASP.NET are almost always dynamic. "Dynamic" means that the content of the page keeps changing: different users or different data produce different display results. If the page is dynamic, where does the dynamic content come from? I think the vast majority of sites have their own data source; the program accesses that data source to get the raw data for the page, processes it according to the business rules, and finally renders it for display.

This kind of dynamic page technology usually needs to fetch data from a data source, run it through some computation logic, and finally turn it into HTML sent to the client. These computations obviously have a cost, and the cost shows up most directly in the response speed of the server — especially as the data processing becomes more complex and traffic grows. On the other hand, some data does not change all the time. If we can cache the final computed results of data that changes infrequently (including the rendered output of a page), we can improve the program's performance. The most common and most important use of caching lies exactly here, and it is why caching is usually the first thing mentioned when performance optimization comes up. The ASP.NET Cache I am discussing today is one technology that implements this kind of caching; it also has some features that other caching techniques lack.

The definition of Cache

Before introducing the usage of the Cache, let's first look at its definition (note: I have omitted some unimportant members):

// Implements the cache for a Web application. This class cannot be inherited.
public sealed class Cache : IEnumerable
{
    // Used as the absoluteExpiration parameter in a Cache.Insert(...) call
    // to indicate that the item should never expire.
    public static readonly DateTime NoAbsoluteExpiration;

    // Used as the slidingExpiration parameter in a Cache.Insert(...) or
    // Cache.Add(...) call to disable sliding expiration.
    public static readonly TimeSpan NoSlidingExpiration;

    // Gets or sets the cache item at the specified key.
    public object this[string key] { get; set; }

    // Adds the specified item to the System.Web.Caching.Cache object with
    // dependencies, expiration and priority policies, and a delegate that can
    // be used to notify the application when the inserted item is removed
    // from the Cache.
    public object Add(string key, object value, CacheDependency dependencies,
        DateTime absoluteExpiration, TimeSpan slidingExpiration,
        CacheItemPriority priority, CacheItemRemovedCallback onRemoveCallback);

    // Retrieves the specified item from the System.Web.Caching.Cache object.
    // key: the identifier of the cache item to retrieve.
    // Returns: the retrieved cache item, or null if the key is not found.
    public object Get(string key);

    public void Insert(string key, object value);
    public void Insert(string key, object value, CacheDependency dependencies);
    public void Insert(string key, object value, CacheDependency dependencies,
        DateTime absoluteExpiration, TimeSpan slidingExpiration);

    // Summary:
    //   Inserts an object into the System.Web.Caching.Cache object with
    //   dependencies, expiration and priority policies, and a delegate that
    //   can be used to notify the application when the inserted item is
    //   removed from the Cache.
    //
    // Parameters:
    //   key: the cache key used to reference the object.
    //   value: the object to insert into the cache.
    //   dependencies: the file or cache-key dependencies of the item. When any
    //     dependency changes, the object becomes invalid and is removed from
    //     the cache. If there are no dependencies, this parameter is null.
    //   absoluteExpiration: the time at which the inserted object expires and
    //     is removed from the cache. If you are using absolute expiration, the
    //     slidingExpiration parameter must be Cache.NoSlidingExpiration.
    //   slidingExpiration: the interval between the last access of the
    //     inserted object and its expiration. If this value is the equivalent
    //     of 20 minutes, the object expires and is removed from the cache
    //     20 minutes after it was last accessed. If you are using sliding
    //     expiration, the absoluteExpiration parameter must be
    //     System.Web.Caching.Cache.NoAbsoluteExpiration.
    //   priority: the cost of the object relative to other items stored in the
    //     cache, expressed by the System.Web.Caching.CacheItemPriority
    //     enumeration. This value is used when evicting objects: objects with
    //     a lower cost are removed from the cache before objects with a
    //     higher cost.
    //   onRemoveCallback: a delegate that, if provided, is called when the
    //     object is removed from the cache. It can be used to notify the
    //     application when its object is removed from the cache.
    //
    // Exceptions:
    //   System.ArgumentException: both the absoluteExpiration and
    //     slidingExpiration parameters are set for the item you are adding.
    //   System.ArgumentNullException: the key or value parameter is null.
    //   System.ArgumentOutOfRangeException: the slidingExpiration parameter is
    //     set to less than TimeSpan.Zero or to more than the equivalent of
    //     one year.
    public void Insert(string key, object value, CacheDependency dependencies,
        DateTime absoluteExpiration, TimeSpan slidingExpiration,
        CacheItemPriority priority, CacheItemRemovedCallback onRemoveCallback);

    // Removes the specified item from the application's
    // System.Web.Caching.Cache object.
    public object Remove(string key);

    // Inserts an object into the Cache together with dependency, expiration
    // and priority policies, and a delegate that notifies the application
    // [before] the item is removed from the Cache.
    // Note: this overload is supported in .NET Framework 3.5 SP1, 3.0 SP1
    // and 2.0 SP1.
    public void Insert(string key, object value, CacheDependency dependencies,
        DateTime absoluteExpiration, TimeSpan slidingExpiration,
        CacheItemUpdateCallback onUpdateCallback);
}

To make the Cache easy to reach, ASP.NET adds a static Cache property to the HttpRuntime class, so we can use the Cache anywhere in an application. Moreover, ASP.NET provides two "shortcuts": the Page.Cache and HttpContext.Cache objects also give access to it. Note: all three access the same object. Page.Cache goes through HttpContext.Cache, and HttpContext.Cache directly accesses HttpRuntime.Cache.

Cache common usage

Normally, we perform only two operations on the Cache: read and write. To get an entry from the Cache, we can call the Cache.Get(key) method; to put an object into the Cache, we can call the Add or Insert method. However, Add and Insert take many parameters, and sometimes we just want to put something in the cache and accept all the defaults; for that we can use the default indexer. Let's look at how that indexer works:

public object this[string key]
{
    get { return this.Get(key); }
    set { this.Insert(key, value); }
}

As you can see: reading from the cache actually calls the Get method, and writing to the cache calls the simplest overload of the Insert method.

Note: the Add method can also put an object into the cache. It has seven parameters and a signature similar to one of the Insert overloads, and they do similar work: adding the specified item to the System.Web.Caching.Cache object with dependencies, expiration and priority policies, and a delegate that can notify the application when the inserted item is removed from the Cache. But there is a small difference: when the key already exists in the Cache, Insert overwrites the existing entry, while Add leaves the original entry unmodified.

In other words: if you want a cache entry to stay unmodified once it has been placed in the cache, calling Add will indeed ignore subsequent writes for that key. Calling Insert, on the other hand, always overwrites an existing entry (even one previously put in with Add).

From another perspective, Add behaves somewhat like a static readonly field, while Insert behaves like an ordinary static field.
Note: I'm only saying [like] — cache entries are in fact more flexible than static members.
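That difference can be demonstrated with a tiny stand-in class of my own (this FakeCache is not part of ASP.NET — it mimics only the overwrite behavior described above, so it runs anywhere without a web context):

```csharp
using System;
using System.Collections.Generic;

// A tiny stand-in (not the real Cache): mimics only the documented
// difference between Add and Insert.
class FakeCache
{
    private readonly Dictionary<string, object> _items = new Dictionary<string, object>();

    public object Add(string key, object value)
    {
        // Like Cache.Add: if the key already exists, keep the old value
        // and return the existing entry; otherwise store and return null.
        object existing;
        if (_items.TryGetValue(key, out existing))
            return existing;
        _items[key] = value;
        return null;
    }

    public void Insert(string key, object value)
    {
        // Like Cache.Insert: always overwrite.
        _items[key] = value;
    }

    public object Get(string key)
    {
        object v;
        return _items.TryGetValue(key, out v) ? v : null;
    }
}

class Program
{
    static void Main()
    {
        var cache = new FakeCache();
        cache.Add("k", "first");
        cache.Add("k", "second");           // ignored: "k" already exists
        Console.WriteLine(cache.Get("k"));  // first
        cache.Insert("k", "third");         // overwrites the existing entry
        Console.WriteLine(cache.Get("k"));  // third
    }
}
```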

Since cache entries are accessible from anywhere, they do feel a bit like static members. But they have more advanced features, such as: cache expiration (absolute expiration, sliding expiration), cache dependencies (on files, on other cache entries), and removal notifications (before and after removal). I will introduce these four characteristics later in this post.

Characteristics of the Cache class

The Cache class has a rare advantage, stated on MSDN:

This type is thread safe.

Why is this a rare advantage? Because in .NET, for the vast majority of classes, only the static members are guaranteed to be thread safe; instance members carry no such guarantee. This can be regarded as a basic principle of the .NET design guidelines.
For those types, MSDN usually uses this wording:

Public static (Shared in Visual Basic) members of this type are thread safe. Any instance members are not guaranteed to be thread safe.

So this means we can read and write the Cache anywhere, without worrying about synchronizing the Cache's data in a multi-threaded environment. Data synchronization is the most complicated problem in multi-threaded programming, and the Cache has solved it for us.

But let me remind you: ASP.NET is itself a multi-threaded programming model — all requests are handled by thread-pool threads. Usually, to solve data synchronization problems in a multi-threaded environment, we use locks, and ASP.NET is no exception: to solve its data synchronization problems, the Cache also uses locks internally.

At this point some people may wonder: since there is only one static Cache instance, won't the locking hurt concurrency?
The answer is yes — a lock will certainly affect concurrency to some degree; there is no way around that.
However, when ASP.NET implements the Cache, it creates a number of cache containers based on the CPU count, to try to reduce contention. The following is the core code that creates the Cache:

internal static CacheInternal Create()
{
    CacheInternal internal2;
    int numSingleCaches = 0;
    if (numSingleCaches == 0)
    {
        uint numProcessCPUs = (uint)SystemInfo.GetNumProcessCPUs();
        numSingleCaches = 1;
        for (numProcessCPUs -= 1; numProcessCPUs > 0; numProcessCPUs = numProcessCPUs >> 1)
        {
            numSingleCaches = numSingleCaches << 1;
        }
    }
    CacheCommon cacheCommon = new CacheCommon();
    if (numSingleCaches == 1)
    {
        internal2 = new CacheSingle(cacheCommon, null, 0);
    }
    else
    {
        internal2 = new CacheMultiple(cacheCommon, numSingleCaches);
    }
    cacheCommon.SetCacheInternal(internal2);
    cacheCommon.ResetFromConfigSettings();
    return internal2;
}

Note: CacheInternal is an internal wrapper class; many Cache operations must go through it.

In the code above, the computation of numSingleCaches is the important part. If it is not easy to follow, look at this sample code:

static void Main()
{
    for (uint i = 1; i <= 20; i++)
        ShowCount(i);
}

static void ShowCount(uint numProcessCPUs)
{
    int numSingleCaches = 1;
    for (numProcessCPUs -= 1; numProcessCPUs > 0; numProcessCPUs = numProcessCPUs >> 1)
    {
        numSingleCaches = numSingleCaches << 1;
    }
    Console.Write(numSingleCaches + " ");
}

The program will output:

1 2 4 4 8 8 8 8 16 16 16 16 16 16 16 16 32 32 32 32

In other words, the CPU count is rounded up to the next power of two.
The constructor of CacheMultiple is as follows:

internal CacheMultiple(CacheCommon cacheCommon, int numSingleCaches) : base(cacheCommon)
{
    this._cacheIndexMask = numSingleCaches - 1;
    this._caches = new CacheSingle[numSingleCaches];
    for (int i = 0; i < numSingleCaches; i++)
    {
        this._caches[i] = new CacheSingle(cacheCommon, this, i);
    }
}

Now you should see it: CacheSingle is the cache container actually used inside ASP.NET, and on a machine with more than one CPU, multiple cache containers are created.
So when writing to the cache, how does ASP.NET locate one of these containers? Keep looking at the code:

internal CacheSingle GetCacheSingle(int hashCode)
{
    hashCode = Math.Abs(hashCode);
    int index = hashCode & this._cacheIndexMask;
    return this._caches[index];
}

Note: the hashCode parameter comes directly from calling key.GetHashCode() on the key we pass in; GetHashCode is defined by the Object class.

So from this perspective, although the ASP.NET Cache is exposed as the single static member HttpRuntime.Cache, internally it may contain multiple cache containers — a design that reduces the impact of contention to some extent.
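The same partitioning idea can be sketched outside ASP.NET with a toy container of my own (not the real CacheMultiple code): each partition has its own lock, and a key's hash picks the partition, so threads working on different partitions do not block each other.

```csharp
using System;
using System.Collections.Generic;

// A minimal sketch of hash-partitioned locking: N independent dictionaries,
// each guarded by its own lock. The partition is chosen exactly the way
// GetCacheSingle does it: Math.Abs(hash) & mask.
class PartitionedCache
{
    private readonly Dictionary<string, object>[] _parts;
    private readonly object[] _locks;
    private readonly int _mask;

    public PartitionedCache(int partitionCount) // must be a power of two
    {
        _parts = new Dictionary<string, object>[partitionCount];
        _locks = new object[partitionCount];
        for (int i = 0; i < partitionCount; i++)
        {
            _parts[i] = new Dictionary<string, object>();
            _locks[i] = new object();
        }
        _mask = partitionCount - 1; // same trick as _cacheIndexMask
    }

    private int IndexOf(string key)
    {
        return Math.Abs(key.GetHashCode()) & _mask;
    }

    public void Set(string key, object value)
    {
        int i = IndexOf(key);
        lock (_locks[i]) { _parts[i][key] = value; }
    }

    public object Get(string key)
    {
        int i = IndexOf(key);
        lock (_locks[i])
        {
            object v;
            return _parts[i].TryGetValue(key, out v) ? v : null;
        }
    }
}

class Program
{
    static void Main()
    {
        var cache = new PartitionedCache(4);
        cache.Set("a", 1);
        cache.Set("b", 2);
        Console.WriteLine(cache.Get("a")); // 1
        Console.WriteLine(cache.Get("b")); // 2
    }
}
```

Two threads that hit different partitions never touch the same lock, which is exactly why more containers means less contention.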

No matter how it is designed, sharing a container in a multi-threaded environment makes some contention unavoidable. If you just want to cache some data simply and do not need the Cache's many advanced features, you can consider not using the Cache at all. For example: you can create a static Dictionary or Hashtable instance; it can also serve as a basic cache. But let me remind you: you then have to handle data synchronization yourself when multiple threads access the data.
By the way: Hashtable.Synchronized(new Hashtable()) gives you a thread-safe collection — if you want something simple, consider it.
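For example (a small console demo of my own; the key names are made up): four threads write into a synchronized Hashtable concurrently, with no extra locking in our code.

```csharp
using System;
using System.Collections;
using System.Threading;

class Program
{
    static void Main()
    {
        // Hashtable.Synchronized returns a wrapper whose individual
        // operations each take an internal lock, so they are thread safe.
        Hashtable cache = Hashtable.Synchronized(new Hashtable());

        // Four threads writing concurrently, each using its own key range.
        var threads = new Thread[4];
        for (int t = 0; t < threads.Length; t++)
        {
            int id = t; // capture a copy for the lambda
            threads[t] = new Thread(() =>
            {
                for (int i = 0; i < 1000; i++)
                    cache["key-" + id + "-" + i] = i;
            });
            threads[t].Start();
        }
        foreach (var th in threads)
            th.Join();

        Console.WriteLine(cache.Count); // 4000: no writes were lost
    }
}
```

One caveat: only individual operations are synchronized. A compound sequence such as "check whether the key exists, then insert" can still race and needs an external lock.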

Next, let's look at the Cache's advanced features — things a Dictionary or Hashtable cannot do.

The expiration time of the cache entry

ASP.NET supports two expiration policies for cache entries: absolute expiration and sliding expiration.
Absolute expiration is easy to understand: when placing an item in the Cache, you specify a concrete time; when that time arrives, the cache entry is automatically removed from the Cache.

Sliding expiration: for some cache entries, we may want them to stay in the cache only while they are being accessed, and to be removed once users stop accessing them for a period of time. This optimizes memory usage, because this policy guarantees that what stays cached is [popular] content. Isn't this how operating systems design their memory and disk caches? The Cache provides this very useful feature too: just specify a sliding expiration time when you put the entry into the cache.

These two options correspond to the DateTime absoluteExpiration and TimeSpan slidingExpiration parameters of the Add and Insert methods.

Note: these two parameters are used in pairs, but you cannot give both of them a [meaningful] value — at most one of them may carry a real value. For the unused parameter, pass one of the two static readonly fields defined on the Cache class.

These two parameters are fairly simple, so I will only add one sentence: if you use both Noxxxxx options, the cache entry stays in the cache indefinitely. (Though it may still be removed for other reasons.)
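Here is a minimal sketch of the pairing, assuming an ASP.NET context (the keys and the ExpirationDemo class are my own placeholders):

```csharp
using System;
using System.Web;
using System.Web.Caching;

// Hypothetical helper showing how the two expiration parameters pair up:
// whichever policy you use, the other parameter gets its "No..." field.
public static class ExpirationDemo
{
    public static void CacheWithAbsoluteExpiration(object data)
    {
        // Absolute expiration: removed 10 minutes from now, no matter how
        // often it is accessed. Sliding expiration must be disabled.
        HttpRuntime.Cache.Insert("demo-abs", data, null,
            DateTime.Now.AddMinutes(10),
            Cache.NoSlidingExpiration);
    }

    public static void CacheWithSlidingExpiration(object data)
    {
        // Sliding expiration: removed 20 minutes after its LAST access,
        // so popular entries stay. Absolute expiration must be disabled.
        HttpRuntime.Cache.Insert("demo-sliding", data, null,
            Cache.NoAbsoluteExpiration,
            TimeSpan.FromMinutes(20));
    }
}
```

Passing real values for both parameters at once throws an ArgumentException, as noted in the class definition above.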

Cache entry dependencies – depending on other cache entries

The ASP.NET Cache has a very powerful feature: cache dependencies. One cache entry can depend on another cache entry. The following sample creates two cache entries and a dependency between them. First, look at the page code:

<body>
    <p>Contents of cache entry Key1: <%= HttpRuntime.Cache["key1"] %></p>
    <hr />
    <form action="CacheDependencyDemo.aspx" method="post">
        <input type="submit" name="SetKey1Cache" value="Set the value of Key1" />
        <input type="submit" name="SetKey2Cache" value="Set the value of Key2" />
    </form>
</body>

Page code-behind:

public partial class CacheDependencyDemo : System.Web.UI.Page
{
    [SubmitMethod(AutoRedirect = true)]
    private void SetKey1Cache()
    {
        SetKey2Cache();
        CacheDependency dep = new CacheDependency(null, new string[] { "key2" });
        HttpRuntime.Cache.Insert("key1", DateTime.Now.ToString(), dep,
            Cache.NoAbsoluteExpiration, Cache.NoSlidingExpiration);
    }

    [SubmitMethod(AutoRedirect = true)]
    private void SetKey2Cache()
    {
        HttpRuntime.Cache.Insert("key2", Guid.NewGuid().ToString());
    }
}

When you run this sample page, the results are as shown below: after clicking the [Set the value of Key1] button, the page shows the contents of the cache entry (left). After clicking the [Set the value of Key2] button, the cache entry can no longer be retrieved (right).

From the results and the code, we can see that when creating the Key1 cache entry, we used this cache dependency:

CacheDependency dep = new CacheDependency(null, new string[] { "key2" });

So when we update the Key2 cache entry, the Key1 cache entry becomes invalid (it no longer exists).

Do not underestimate this example. True, looking at just a few lines of sample code, it may not seem meaningful. So let me give a real usage scenario to show its practical value.

The above picture is from a small tool I wrote. In the diagram, the lower left corner is a cached table, CacheTable, maintained by a class called Table1BLL. The data in CacheTable comes from Table1 and is displayed by the page Table1.aspx. At the same time, the data for ReportA and ReportB also comes mainly from Table1. Access to Table1 is overwhelmingly read-mostly, so I cache Table1's data. ReportA and ReportB are drawn directly with GDI (generated by a report module that sits above Table1BLL). Since the data these two reports draw from is also read-mostly, I cache their rendered output as well.

In this scenario, imagine: when the data in Table1 is modified, how do we invalidate the cached results of the two reports?

Should Table1BLL notify the two report modules, or should Table1BLL directly delete the two reports' caches?
In fact, whichever we choose, whenever some other cache built on top of Table1's CacheTable appears (another report, say), Table1BLL has to be modified again — definitely a failed design. That is the negative consequence of coupling between modules.

Fortunately, the ASP.NET Cache supports the cache dependency feature. We only need Table1BLL to expose the cache key of CacheTable (say the key is CacheTableKey); then every cached result computed from CacheTable declares a dependency on [CacheTableKey]. This achieves the effect we want: whenever CacheTable is updated, the dependent cached results are cleared automatically. This completely solves the problem of cache data dependencies between modules.
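A sketch of the scenario above (all names here — Table1BLL.CacheTableKey, "ReportA.Output", reportBytes — are hypothetical, standing in for my tool's real code):

```csharp
using System;
using System.Web;
using System.Web.Caching;

public static class Table1BLL
{
    // The one thing Table1BLL exposes: the cache key of CacheTable.
    public static readonly string CacheTableKey = "Table1.CacheTable";
}

public static class ReportACache
{
    public static void CacheOutput(byte[] reportBytes)
    {
        // The report's cached output depends on the key Table1BLL exposes.
        // When Table1BLL re-inserts CacheTable, this entry is cleared
        // automatically — Table1BLL never calls into the report module.
        CacheDependency dep = new CacheDependency(null,
            new string[] { Table1BLL.CacheTableKey });

        HttpRuntime.Cache.Insert("ReportA.Output", reportBytes, dep);
    }
}
```

New consumers of CacheTable can be added the same way, without touching Table1BLL.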

Cache entry dependencies – file dependencies

At the end of my last blog entry, I left you a question:
I want the program to pick up new parameter values as soon as the user modifies the configuration file, without restarting the website.
I will now answer this question, giving all the necessary code.

First, let me point out: the last post's problem can be solved perfectly with the Cache's file dependencies together with removal notifications. To organize the content better, I will first give a rough version that uses only the Cache's file dependencies; the perfect implementation comes later in this post.

Let's look at the rough version first. Suppose my site has a class of configuration parameters like this:

/// <summary>
/// Simulates the run-time parameters the site needs
/// </summary>
public class RunOptions
{
    public string WebSiteUrl;
    public string UserName;
}

I can configure it in an XML file like this:

<?xml version="1.0" encoding="utf-8"?>
<RunOptions xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema">
    <WebSiteUrl></WebSiteUrl>
    <UserName>fish li</UserName>
</RunOptions>

Next is the page used to display the run-time parameters:

<body>
    <p>WebSiteUrl: <%= WebSiteApp.RunOptions.WebSiteUrl %></p>
    <p>UserName: <%= WebSiteApp.RunOptions.UserName %></p>
</body>

The following code achieves the goal: after the XML file is modified, viewing the page immediately shows the latest parameter values:

public static class WebSiteApp
{
    private static readonly string RunOptionsCacheKey = Guid.NewGuid().ToString();

    public static RunOptions RunOptions
    {
        get
        {
            // First, try to get the run-time parameters from the cache.
            RunOptions options = HttpRuntime.Cache[RunOptionsCacheKey] as RunOptions;
            if( options == null ) {
                // Not in the cache: load from the file.
                string path = HttpContext.Current.Server.MapPath("~/App_Data/RunOptions.xml");
                options = RwConfigDemo.XmlHelper.XmlDeserializeFromFile<RunOptions>(path, Encoding.UTF8);

                // Put the result read from the file into the cache,
                // and set a file dependency.
                CacheDependency dep = new CacheDependency(path);
                // If your parameters are more complex and relate to several
                // files, you can also pass multiple file paths:
                // CacheDependency dep = new CacheDependency(new string[] { path });
                HttpRuntime.Cache.Insert(RunOptionsCacheKey, options, dep);
            }
            return options;
        }
    }
}

Note: we are still using CacheDependency here, but now we pass a file name as the first argument of its constructor.

Before leaving the topic of cache dependencies, two more points:

1. CacheDependency also supports [nesting]: its constructor can accept another CacheDependency instance, so you can build a very complex dependency tree.

2. There is also a cache dependency object specific to SQL Server; see SqlCacheDependency.

Removal priority of cache entries

There are many ways to cache: a static variable can be called a cache, and a static collection is a cache container. I think many people have used Dictionary, List or Hashtable as cache containers, storing all sorts of data in them to improve program performance. In general, though, if we cache data directly in such collections, the memory that data occupies will never be reclaimed — even if some of it is rarely used. As the amount of cached data grows, so naturally does the memory it consumes. So, when memory runs low, can we release some of the rarely accessed cache entries?

This is a real problem. Caching does make our programs run faster, but data is effectively unbounded and cannot all be cached — memory, after all, is finite. We could use the strategy mentioned earlier to solve this, removing entries that have not been accessed for some time. But when we are coding, we simply do not know what hardware configuration the program will run on, so we cannot make any assumptions about memory size. It may be more meaningful for the cache to automatically evict the less important entries when it occupies too much memory, or when memory runs low.

For this need, the .NET Framework offers two solutions: one is the WeakReference class, the other is the Cache. Since we are using ASP.NET, the Cache is of course more convenient. The overloads of the Cache's Add and Insert methods let you specify a retention priority for the cache entry through the CacheItemPriority priority parameter. CacheItemPriority is an enumeration with the following values:

// Specifies the relative priority of items stored in the Cache object.
public enum CacheItemPriority
{
    // When the server frees system memory, cache entries with this priority
    // are the most likely to be removed from the cache.
    Low = 1,

    // When the server frees system memory, cache entries with this priority
    // are more likely to be removed than entries assigned
    // CacheItemPriority.Normal.
    BelowNormal = 2,

    // When the server frees system memory, cache entries with this priority
    // are removed only after entries with CacheItemPriority.Low or
    // CacheItemPriority.BelowNormal priority. This is the default.
    Normal = 3,

    // The default priority of a cache entry: equal to CacheItemPriority.Normal.
    Default = 3,

    // When the server frees system memory, cache entries with this priority
    // are less likely to be removed than entries assigned
    // CacheItemPriority.Normal.
    AboveNormal = 4,

    // When the server frees system memory, cache entries with this priority
    // are the least likely to be removed from the cache.
    High = 5,

    // When the server frees system memory, cache entries with this priority
    // will not be removed from the cache automatically. Entries with this
    // priority are still removed, together with other entries, according to
    // their absolute or sliding expiration time.
    NotRemovable = 6,
}

Note: when we call the Cache's Add or Insert methods without specifying a CacheItemPriority option, Normal is used. If we put some less important data into the cache, we can give it the Low or BelowNormal priority. If we want a cache entry to stay in memory and never be evicted (unless it expires or a dependency changes), we can use NotRemovable.
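For example, a minimal sketch assuming an ASP.NET context (the PriorityDemo class and method names are mine):

```csharp
using System;
using System.Web;
using System.Web.Caching;

// Hypothetical helper: the only difference between the two calls is the
// CacheItemPriority argument of the seven-parameter Insert overload.
public static class PriorityDemo
{
    public static void CacheUnimportant(string key, object value)
    {
        // Low priority: first in line for eviction when memory gets tight.
        HttpRuntime.Cache.Insert(key, value, null,
            Cache.NoAbsoluteExpiration, Cache.NoSlidingExpiration,
            CacheItemPriority.Low, null);
    }

    public static void CachePinned(string key, object value)
    {
        // NotRemovable: never evicted to free memory; it can still
        // disappear through expiration or a dependency change.
        HttpRuntime.Cache.Insert(key, value, null,
            Cache.NoAbsoluteExpiration, Cache.NoSlidingExpiration,
            CacheItemPriority.NotRemovable, null);
    }
}
```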

Clearly, we can use this feature to control how much memory pressure the cache creates. Other caching schemes, such as a static collection plus WeakReference, find it hard to achieve such flexible control.

Removal notifications for cache entries

The ASP.NET Cache differs from caching with static variables in that a cache entry can become invalid under certain conditions, and an invalid entry is removed from memory. The removal is not triggered directly by our code — but ASP.NET does provide a way for our code to be notified when a cache entry is removed.

Note well: the ASP.NET Cache supports two kinds of removal notification: [before removal] and [after removal].

When calling the Add or Insert methods, we can pass a CacheItemRemovedCallback delegate through the onRemoveCallback parameter, so that we are notified when the specified cache entry is removed. The delegate is defined as follows:

/// <summary>
/// Defines a callback method for notifying applications when a cached item
/// is removed from the System.Web.Caching.Cache.
/// </summary>
/// <param name="key">The key removed from the cache (as passed to Add or Insert).</param>
/// <param name="value">The cached item associated with the removed key (as passed to Add or Insert).</param>
/// <param name="reason">The reason the item was removed from the cache.</param>
public delegate void CacheItemRemovedCallback(string key, object value, CacheItemRemovedReason reason);

// Specifies the reason an item was removed from the
// System.Web.Caching.Cache object.
public enum CacheItemRemovedReason
{
    // The item was removed by a Cache.Insert(System.String, System.Object)
    // call that specified the same key, or by a Cache.Remove(System.String)
    // call.
    Removed = 1,

    // The item was removed from the cache because it expired.
    Expired = 2,

    // The item was removed from the cache because the system evicted it to
    // free memory.
    Underused = 3,

    // The item was removed from the cache because an associated cache
    // dependency changed.
    DependencyChanged = 4,
}

The meaning of each delegate parameter and of the removal reasons is clearly explained in the comments, so I will not repeat them.
My feeling is: many people know about this parameter of Add and Insert, and know about this delegate — but what is it actually good for? In the next two sections, I will give two examples demonstrating this powerful feature.

Usually, we will get results from the Cache in the following way:

RunOptions options = HttpRuntime.Cache[RunOptionsCacheKey] as RunOptions;
if( options == null ) {
    // Not in the cache: load from the file.
    // ..................................
    HttpRuntime.Cache.Insert(RunOptionsCacheKey, options, dep);
}
return options;

This is really an idiom: first try to get the item from the cache; if it is not there, load it from the data source and put it into the cache.
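The idiom can be captured in a tiny stand-in that runs anywhere (my own Dictionary-based illustration; real code would use HttpRuntime.Cache and would not need the lock, since the Cache synchronizes itself):

```csharp
using System;
using System.Collections.Generic;

// Minimal sketch of the "check cache, else load and insert" idiom.
class SimpleCache
{
    private readonly Dictionary<string, object> _items = new Dictionary<string, object>();
    private readonly object _sync = new object();
    public int LoadCount; // how many times the loader actually ran

    public object GetOrLoad(string key, Func<object> loader)
    {
        lock (_sync) // protects the check-then-insert sequence
        {
            object value;
            if (!_items.TryGetValue(key, out value))
            {
                value = loader();   // the expensive part: hits the data source
                LoadCount++;
                _items[key] = value;
            }
            return value;
        }
    }
}

class Program
{
    static void Main()
    {
        var cache = new SimpleCache();
        for (int i = 0; i < 5; i++)
            cache.GetOrLoad("options", () => "loaded-from-file");
        Console.WriteLine(cache.LoadCount); // 1: only the first call loads
    }
}
```

Only the first call pays the loading cost — which is exactly why the request that arrives right after an entry is removed gets hurt, as discussed next.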

Why does the Cache return null? There are only two possible reasons: 1. the item was never put into the Cache; 2. the cache entry became invalid and was removed.
The idiom itself is fine, but what happens if loading the data from the data source takes a long time?
Obviously, it delays the first request that arrives after the entry disappears. Have you ever wished the cache entry could simply stay in the Cache forever? Well, generally speaking, as long as you put an object into the Cache without an expiration time, without a cache dependency, and with the NotRemovable priority, the object really does stay in the Cache. But the expiration times and cache dependencies are useful too. Can we have it both ways?

To solve this problem, Microsoft added a [before-removal notification] in .NET Framework versions 3.5 SP1, 3.0 SP1 and 2.0 SP1. This capability is only supported by Insert. Here are the definitions of the accompanying delegate and removal-reason enumeration:

/// <summary>
/// Defines a callback method for notifying applications before a cached item
/// is removed from the cache.
/// </summary>
/// <param name="key">The identifier of the item being removed from the cache.</param>
/// <param name="reason">The reason the item is being removed from the cache.</param>
/// <param name="expensiveObject">When this method returns, contains the updated object to cache.</param>
/// <param name="dependency">When this method returns, contains the new dependency object.</param>
/// <param name="absoluteExpiration">When this method returns, contains the new expiration time of the object.</param>
/// <param name="slidingExpiration">When this method returns, contains the new interval between the last access time and the expiration of the object.</param>
public delegate void CacheItemUpdateCallback(string key, CacheItemUpdateReason reason,
    out object expensiveObject, out CacheDependency dependency,
    out DateTime absoluteExpiration, out TimeSpan slidingExpiration);

/// <summary>
/// Specifies the reason that a cached item is being removed from the
/// Cache object.
/// </summary>
public enum CacheItemUpdateReason
{
    /// <summary>
    /// The item is being removed from the cache because its absolute or
    /// sliding expiration interval elapsed.
    /// </summary>
    Expired = 1,

    /// <summary>
    /// The item is being removed from the cache because its associated
    /// CacheDependency object changed.
    /// </summary>
    DependencyChanged = 2,
}

Note: the CacheItemUpdateReason enumeration has only two values. For the reason, see MSDN's explanation:

Unlike the CacheItemRemovedReason enumeration, this enumeration contains no Removed or Underused value. An updatable cache entry is never truly removed, and therefore ASP.NET never removes it automatically — even when memory needs to be freed.

To repeat: sometimes we really do need the invalidation features, yet an invalidated cache entry is removed. While the cached data is absent, subsequent requests cannot get it. We could reload the data into the Cache from the data source inside a CacheItemRemovedCallback, but during that loading the Cache does not contain the data we want — and the longer the loading takes, the more noticeable this [vacancy] becomes, affecting the requests that arrive in the meantime. To guarantee that the cached data we expect is always present in the Cache, while still keeping the invalidation mechanisms, we can use the [before-removal notification] feature.
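Here is one possible shape for such an update callback — a sketch only, not the polished version (the RunOptionsCache class, the _path field and the XmlHelper call are my assumptions carried over from the earlier sample):

```csharp
using System;
using System.Text;
using System.Web;
using System.Web.Caching;

// Sketch: keep RunOptions permanently available while still reacting to
// changes of the configuration file.
public static class RunOptionsCache
{
    private static string _path; // set once, e.g. during application startup
    private static readonly string CacheKey = Guid.NewGuid().ToString();

    public static void Load(string path)
    {
        _path = path;
        RunOptions options = RwConfigDemo.XmlHelper
            .XmlDeserializeFromFile<RunOptions>(_path, Encoding.UTF8);

        // The update-callback overload: we are told BEFORE the entry would
        // be removed, and we hand back a fresh object, so the key is never
        // missing from the Cache.
        HttpRuntime.Cache.Insert(CacheKey, options, new CacheDependency(_path),
            Cache.NoAbsoluteExpiration, Cache.NoSlidingExpiration, OnUpdate);
    }

    private static void OnUpdate(string key, CacheItemUpdateReason reason,
        out object expensiveObject, out CacheDependency dependency,
        out DateTime absoluteExpiration, out TimeSpan slidingExpiration)
    {
        // The file changed: reload and re-arm the file dependency.
        // The old value stays visible until we return the new one.
        expensiveObject = RwConfigDemo.XmlHelper
            .XmlDeserializeFromFile<RunOptions>(_path, Encoding.UTF8);
        dependency = new CacheDependency(_path);
        absoluteExpiration = Cache.NoAbsoluteExpiration;
        slidingExpiration = Cache.NoSlidingExpiration;
    }
}
```

Unlike the rough version, there is no window in which HttpRuntime.Cache[CacheKey] returns null.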

Using cache entry removal notifications to implement [delayed operations]

I have read some ASP.NET books and some articles about the Cache. Basically, they either skim over this feature or only give a contrived example. It is a pity: such a powerful feature, yet I rarely see anyone actually use it.

Today I would like to give a practical example that shows how powerful the Cache can be!

I have a page that allows the user to adjust (move up or down) the display order of a list of records:

When a user adjusts the position of a record, the page pops up a dialog box asking for the reason of the adjustment, and then sends an email to notify all relevant personnel.

Due to the limitations of the interface, a single operation (clicking the up or down arrow) moves a record by only one position; to move a record across multiple rows, the user must click several times. For ease of use and to avoid duplicate emails, the program needs to satisfy this requirement: the page asks for the reason only once for a series of move operations on the same record, no duplicate emails are sent, and the email contains the result of the last move.

This requirement is very reasonable; after all, everyone wants a simple operation.

So how do we implement this requirement? There are two sides to it. First, on the page, we need to show the dialog box only once per record. Since all interaction between the page and the server uses Ajax (no page refresh), the state can be kept in JS variables, so this part is very easy to implement. Now look at the server side: the server has no state. Of course, the page can send its state to the server, but which operation is the last one? Obviously there is no way to know, so in the end we had to modify the requirement: if the user does not operate on a record for 2 minutes, the most recent operation is considered the last one.

Based on the new requirement, the program must record the user's most recent operation so that the email can be sent after 2 minutes of inactivity; the email should include the reason entered at the start, and also reflect the last modification.

How can we implement this? I immediately thought of the ASP.NET Cache; as far as I understand it, I knew it could help me implement this feature. Below is how I implemented the server side.

The overall idea of the implementation:
1. On each adjustment, the client page sends three parameters to the server: the RowGuid of the record, the direction of the adjustment, and the reason.
2. After the server finishes the order adjustment, it puts the email information to be sent into the Cache with Insert, supplying the slidingExpiration and onRemoveCallback parameters.
3. In the CacheItemRemovedCallback delegate, ignore CacheItemRemovedReason.Removed notifications; for any other notification, send the email.

To make this easier to understand, I have prepared an example. The whole sample consists of three parts: a page, a JS file, and the server-side code. First look at the code of the page:

<body>
    <p>For simplicity, the example page only processes one record, and the record's RowGuid is displayed directly.<br />
       In a real scenario, this RowGuid should be obtained from the [currently selected row] of a table.</p>
    <p>RowGuid of the currently selected row = <span id="spanRowGuid"><%= Guid.NewGuid().ToString() %></span><br />
       Sequence of the currently selected row = <span id="spanSequence"></span></p>
    <p>
        <input type="button" id="btnMoveUp" value="Move up" />
        <input type="button" id="btnMoveDown" value="Move down" />
    </p>
</body>

The page displays as follows:

The JS code that handles the two buttons on the page is as follows:

// The adjustment reason entered by the user
var g_reason = null;

$(function () {
    $("#btnMoveUp").click(function () { MoveRec(-1); });
    $("#btnMoveDown").click(function () { MoveRec(1); });
});

function MoveRec(direction) {
    if (~~($("#spanSequence").text()) + direction < 0) {
        alert("The record cannot be moved up any further.");
        return;
    }
    if (g_reason == null) {
        g_reason = prompt("Please enter the reason for adjusting the record order:",
                          "For what reason do I want to adjust...");
        if (g_reason == null)
            return;
    }
    $.ajax({
        url: "/AjaxDelaySendMail/",
        data: { RowGuid: $("#spanRowGuid").text(), Direction: direction, Reason: g_reason },
        type: "POST",
        dataType: "text",
        success: function (responseText) {
            $("#spanSequence").text(responseText);
        }
    });
}

Note: on the server side, I use the service framework from my previous blog entry. All the server-side code looks like this (note the comments in the code):

/// <summary>
/// Move-record information.
/// </summary>
public class MoveRecInfo
{
    public string RowGuid;
    public int Direction;
    public string Reason;
}

[MyService]
public class AjaxDelaySendMail
{
    [MyServiceMethod]
    public int MoveRec(MoveRecInfo info)
    {
        // The parameters coming from the client are not validated here.
        // In real development, validation is a must.

        // To keep the sample simple, there is no database: the record's
        // current order is also kept in the Cache.
        int sequence = 0;
        int.TryParse(HttpRuntime.Cache[info.RowGuid] as string, out sequence);

        // Simply adjust the order.
        sequence += info.Direction;
        HttpRuntime.Cache[info.RowGuid] = sequence.ToString();

        string key = info.RowGuid + "_DelaySendMail";

        // I do not send the email directly here. Instead, I put the operation
        // information into the Cache, set a sliding expiration of 2 minutes,
        // and specify a remove notification. The operation information is
        // stored in an overwriting way, so only the last state is kept.
        // Note: I use the Insert method.
        HttpRuntime.Cache.Insert(key, info, null,
            Cache.NoAbsoluteExpiration, TimeSpan.FromMinutes(2.0),
            CacheItemPriority.NotRemovable, MoveRecInfoRemovedCallback);

        return sequence;
    }

    private void MoveRecInfoRemovedCallback(string key, object value, CacheItemRemovedReason reason)
    {
        // Ignore the removals triggered by subsequent HttpRuntime.Cache.Insert() calls.
        if (reason == CacheItemRemovedReason.Removed)
            return;

        // If execution reaches here, the cache item must have expired.
        // In other words: the user has stopped operating.

        // Retrieve the operation information from the value parameter.
        MoveRecInfo info = (MoveRecInfo)value;

        // Here you can do other processing with info.

        // Finally, send one email. The whole delayed-mail process is done.
        MailSender.SendMail(info);
    }
}

To allow JavaScript to call the C# method directly, you also need to add the following configuration in web.config:

<httpHandlers>
    <add path="*.fish" verb="*" validate="false"
         type="MySimpleServiceFramework.AjaxServiceHandler" />
</httpHandlers>

Well, that is the sample code. If you are interested, you can download it at the end of this article and experience the Cache's [delayed processing] capability for yourself.

In fact, this [delayed processing] capability is very useful. For example, here is a suitable scenario: some data records may need to be updated frequently. If every update were written to the database, the database would certainly be under some pressure; but because this data is not particularly important, we can take advantage of [delayed processing] to merge the database writes. In the end we can turn many write operations into one, or a small number of, write operations. I call this effect: delayed merged writing.

My line of thought for delayed merged database writing: put the data records that need to be written into the cache by calling the Insert method with the slidingExpiration and onRemoveCallback parameters, and perform the database write in the CacheItemRemovedCallback delegate, imitating my sample code above; many writes then become one. However, there can be a problem: if the data keeps being modified, it never gets written to the database, and if the site is restarted, the data may be lost. If you are concerned about this, then in the callback delegate, when you encounter CacheItemRemovedReason.Removed, keep a cumulative count, and once it reaches a certain number, write to the database anyway. For example: after encountering CacheItemRemovedReason.Removed 10 times, write to the database once; this turns many write operations into a few. Of course, for any other removal reason, writing to the database is always necessary. Note: do not use this method for sensitive data such as monetary amounts.
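A minimal sketch of this delayed merged writing idea follows. The names PendingRecord and SaveToDatabase, the 30-second window, and the force-write count of 10 are all my own illustrative assumptions, not part of the original samples:

```csharp
using System;
using System.Threading;
using System.Web;
using System.Web.Caching;

public static class DelayedWriter
{
    private static int s_overwriteCount = 0;

    // Called on every update; only the latest copy of a record is kept.
    public static void Save(string recordId, PendingRecord record)
    {
        HttpRuntime.Cache.Insert("DelayWrite_" + recordId, record, null,
            Cache.NoAbsoluteExpiration, TimeSpan.FromSeconds(30.0),
            CacheItemPriority.NotRemovable, WriteCallback);
    }

    private static void WriteCallback(string key, object value, CacheItemRemovedReason reason)
    {
        if (reason == CacheItemRemovedReason.Removed) {
            // Overwritten by a newer Save call. To bound the data loss on a
            // restart, force a write every 10 overwrites anyway (assumption).
            if (Interlocked.Increment(ref s_overwriteCount) % 10 != 0)
                return;
        }
        // Expired (updates stopped) or forced: flush the latest copy.
        SaveToDatabase((PendingRecord)value);
    }

    private static void SaveToDatabase(PendingRecord record)
    {
        // Placeholder for the real database write.
    }
}

public class PendingRecord { /* the fields to persist */ }
```

Under these assumptions, a burst of N updates within the sliding window results in roughly N/10 writes instead of N.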

I would like to add two points:
1. When the CacheItemRemovedCallback delegate is called, the cache item is no longer in the Cache.
2. In the CacheItemRemovedCallback delegate, we can put the cache item back into the Cache.
Have you ever thought about it? This design can form a loop, and combined with the slidingExpiration parameter it can achieve the effect of a timer.
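To make that loop concrete, here is a minimal sketch of the timer idea (the key name and DoPeriodicWork are my own illustrative names; since ASP.NET checks expirations periodically, the interval is only approximate):

```csharp
using System;
using System.Web;
using System.Web.Caching;

public static class CacheTimer
{
    public static void Start()
    {
        // The item expires after roughly one minute of not being accessed.
        HttpRuntime.Cache.Insert("CacheTimerKey", DateTime.Now, null,
            Cache.NoAbsoluteExpiration, TimeSpan.FromMinutes(1.0),
            CacheItemPriority.NotRemovable, TimerCallback);
    }

    private static void TimerCallback(string key, object value, CacheItemRemovedReason reason)
    {
        DoPeriodicWork();
        Start();   // put the item back into the Cache: the loop continues
    }

    private static void DoPeriodicWork()
    {
        // ... the periodic task goes here ...
    }
}
```

This is a sketch of the pattern only; for precise scheduling a real timer is still the better tool.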

Regarding cache expiration times, I want to remind you: the times passed in via the absoluteExpiration and slidingExpiration parameters control when the cache item becomes invalid, but the cached object is not removed at exactly that moment. The timing at which ASP.NET checks for expired cache items is not deterministic, so there may be a delay.

Using cache-item update notifications to [automatically load a configuration file]

In the [file dependencies] section in the first part of this article, I demonstrated an example: when the configuration file is updated, the page can display the latest changes. In that example, for the sake of simplicity, I put the configuration parameters directly into the Cache and fetched them from the Cache each time they were used. If there are many configuration parameters, this approach may affect performance; after all, configuration parameters are not modified often, and reading them directly from a static variable would be faster. Normally, we might write:

private static RunOptions s_RunOptions;

public static RunOptions RunOptions
{
    // s_RunOptions is initialized by an Init method that is called
    // from the Application_Start event in Global.asax.
    get { return s_RunOptions; }
}

public static RunOptions LoadRunOptions()
{
    string path = Path.Combine(AppDataPath, "RunOptions.xml");
    return RwConfigDemo.XmlHelper.XmlDeserializeFromFile<RunOptions>(path, Encoding.UTF8);
}

However, this approach has a drawback: when the configuration file is updated, the latest configuration is not loaded automatically.

To solve this problem, we can use the Cache's file dependency and removal notification features together. The previous example demonstrated the after-removal notification; here I will show the before-removal (update) notification.
Note: in fact, this feature could also be implemented with the removal notification. I use the update notification here simply because it has not been demonstrated yet, even though this particular usage does not show off its unique capability.

The following code shows an implementation that automatically reloads the run options after the configuration file is modified (note the comments in the code):

private static int s_RunOptionsCacheDependencyFlag = 0;

public static RunOptions LoadRunOptions()
{
    string path = Path.Combine(AppDataPath, "RunOptions.xml");

    // Be careful: accessing the file may throw an exception.
    // To keep the sample short, I do not handle it here.
    RunOptions options = RwConfigDemo.XmlHelper.XmlDeserializeFromFile<RunOptions>(path, Encoding.UTF8);

    // Make sure the Cache.Insert call runs only once.
    int flag = System.Threading.Interlocked.CompareExchange(ref s_RunOptionsCacheDependencyFlag, 1, 0);
    if (flag == 0) {
        // Let the Cache watch the configuration file for us.
        CacheDependency dep = new CacheDependency(path);
        HttpRuntime.Cache.Insert(RunOptionsCacheKey, "Fish Li", dep,
            Cache.NoAbsoluteExpiration, Cache.NoSlidingExpiration,
            RunOptionsUpdateCallback);
    }
    return options;
}

public static void RunOptionsUpdateCallback(string key, CacheItemUpdateReason reason,
    out object expensiveObject, out CacheDependency dependency,
    out DateTime absoluteExpiration, out TimeSpan slidingExpiration)
{
    // Careful: do not let an unhandled exception escape from this method,
    // or the cache item will be removed.

    // I do not check the reason parameter, because I did not set an
    // expiration time, so there is only one possible reason: the dependent
    // file has changed. I do not care about the key parameter either,
    // because this method is [dedicated] to one cache item.
    expensiveObject = "";
    dependency = new CacheDependency(Path.Combine(AppDataPath, "RunOptions.xml"));
    absoluteExpiration = Cache.NoAbsoluteExpiration;
    slidingExpiration = Cache.NoSlidingExpiration;

    // Reload the configuration parameters.
    s_RunOptions = LoadRunOptions();
}

Only the LoadRunOptions method was modified, but the effect is pretty cool.

Remember the question I left at the end of my previous blog entry? This example is my solution.

Choosing a file monitoring technology

Speaking of file monitoring, I think many people will immediately think of FileSystemWatcher. As I just said, this section is about choosing a [file monitoring technology]. All conclusions here are my personal views, for reference only.

I used this component as far back as my WinForm development days, and it left a deep impression on me.
It has one badly wrapped aspect: events are raised repeatedly. For example: a single file save operation triggers the event twice.
What, you do not believe it? I have prepared a sample program.

Description: the picture shows the event being raised twice, although I only modified the file and performed a single save operation. You can try it yourself with the sample program at the end of this article. For convenience, here is the relevant code:

private void Form1_Shown(object sender, EventArgs e)
{
    this.fileSystemWatcher1.Path = Environment.CurrentDirectory;
    this.fileSystemWatcher1.Filter = "RunOptions.xml";
    this.fileSystemWatcher1.NotifyFilter = System.IO.NotifyFilters.LastWrite;
    this.fileSystemWatcher1.EnableRaisingEvents = true;
}

private void fileSystemWatcher1_Changed(object sender, System.IO.FileSystemEventArgs e)
{
    string message = string.Format("{0} {1}.", e.Name, e.ChangeType);
    this.listBox1.Items.Add(message);
}

About the use of this class, I just want to say one thing: it raises a lot of events, so be sure to pay attention to filtering. Here is a quote from MSDN:

The Windows operating system notifies the FileSystemWatcher component of file changes in a buffer it creates. If there are many changes in a short time, the buffer can overflow. This causes the component to lose track of changes in the directory, and it will only provide a blanket notification. Increasing the buffer size with the InternalBufferSize property is expensive, because the buffer comes from non-paged memory that cannot be swapped out to disk, so keep the buffer as small as possible while still large enough not to miss any file change events. To avoid a buffer overflow, use the NotifyFilter and IncludeSubdirectories properties so that you can filter out unwanted change notifications.
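If you do use FileSystemWatcher, one common workaround for the duplicate-event problem (not from the original article; the 500 ms window is my own assumption) is to ignore events that arrive within a short interval of the previous one:

```csharp
using System;
using System.IO;

public class DebouncedWatcher
{
    private readonly FileSystemWatcher _watcher = new FileSystemWatcher();
    private DateTime _lastRaised = DateTime.MinValue;

    public DebouncedWatcher(string path, string filter)
    {
        _watcher.Path = path;
        _watcher.Filter = filter;
        _watcher.NotifyFilter = NotifyFilters.LastWrite;
        _watcher.Changed += OnChanged;
        _watcher.EnableRaisingEvents = true;
    }

    private void OnChanged(object sender, FileSystemEventArgs e)
    {
        DateTime now = DateTime.Now;

        // A save often raises Changed twice in quick succession;
        // treat events within 500 ms of the last one as duplicates.
        if ((now - _lastRaised).TotalMilliseconds < 500)
            return;
        _lastRaised = now;

        Console.WriteLine("{0} {1}.", e.Name, e.ChangeType);
    }
}
```

This debounce sketch trades a small delay tolerance for the guarantee that one save is handled once.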

Fortunately, the ASP.NET Cache does not use this component, so with file dependencies we do not have to worry about the repeated-event problem: the Cache relies directly on an API provided by webengine.dll. Therefore, in ASP.NET applications, I recommend giving priority to the file dependency feature provided by the Cache.

The coexistence of various caching schemes

The ASP.NET Cache is just one caching technology available to an ASP.NET program; we can also use other caching technologies, and each of these caches has its own strengths. The ASP.NET Cache cannot provide out-of-process access, so it cannot replace a distributed caching technology such as memcached; but because it requires no cross-process access, it is faster than a distributed cache. If we use the ASP.NET Cache as a [level-1 cache] and a distributed cache as a [level-2 cache], just like CPU caches, we can combine the advantages of both and achieve more complete functionality with better speed.
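A sketch of that [level-1 / level-2] idea (DistributedCacheClient and LoadFromDatabase are illustrative names standing in for a real memcached client and data access layer; the lifetimes are assumptions):

```csharp
using System;
using System.Web;
using System.Web.Caching;

public static class TwoLevelCache
{
    public static object GetData(string key)
    {
        // Level 1: in-process ASP.NET Cache, fastest.
        object data = HttpRuntime.Cache[key];
        if (data != null)
            return data;

        // Level 2: distributed cache (e.g. memcached), shared across servers.
        data = DistributedCacheClient.Get(key);
        if (data == null) {
            data = LoadFromDatabase(key);
            DistributedCacheClient.Set(key, data, TimeSpan.FromMinutes(30.0));
        }

        // Populate level 1 with a short lifetime so it stays reasonably fresh.
        HttpRuntime.Cache.Insert(key, data, null,
            DateTime.Now.AddMinutes(1.0), Cache.NoSlidingExpiration);
        return data;
    }

    private static object LoadFromDatabase(string key)
    {
        // Placeholder for the real data-source access.
        return key;
    }
}
```

The short level-1 lifetime bounds how stale an in-process copy can get relative to the shared level-2 cache.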

In fact, a cache is not a strictly defined technology: a static variable is a cache, and a static collection is a cache container. Compared with the ASP.NET Cache, a static variable is obviously faster to access, and a static collection, unless designed very badly, may suffer less concurrency contention than the ASP.NET Cache; precisely because of this, static collections are widely used. However, the ASP.NET Cache has advanced features that a static collection does not, such as expiration times, cache dependencies (including file dependencies), and removal notifications. Therefore, using each of them where appropriate gives the program the best performance as well as more powerful features.

