Transcript of VS2010 Release
VS2010 Connections - PLINQ: Parallel LINQ, and What's New in the ASP.NET 4.0 Runtime? - Amit Gundu

What is PLINQ?
It's Parallel LINQ, in short: an execution engine for running your LINQ queries on multicore systems.
Cool, but what does that mean?
It means you can now run LINQ queries in parallel, across multiple threads, without the headache. No more babysitting threads. It also means even your mom can write parallel code that scales.
The MSDN article "Parallel LINQ: Running Queries On Multi-Core Processors" states: "PLINQ is a query execution engine that accepts any LINQ-to-Objects or LINQ-to-XML query and automatically utilizes multiple processors or cores for execution when they are available."
How and where can it be used?
PLINQ can be put into action whenever you're processing large datasets. (RPV services come to mind.)
For example, if a function takes one millisecond to execute, a sequential query over 1000 elements will take one second to perform that operation, whereas a parallel query on a computer with four cores might take only 250 milliseconds: a speedup of 750 milliseconds, or roughly 4x. Proof is in the pudding... Example 1
The AsParallel extension method binds the subsequent query operators (where and select) to System.Linq.ParallelEnumerable, so you don't have to call AsParallel again on subsequent operators.
var source = Enumerable.Range(1, 10000);
var evenNums = from num in source.AsParallel()
               where Compute(num) > 0
               select num;
PLINQ actually analyzes the overall structure of the query, and if it finds that it wouldn't be optimal to parallelize, it won't: it will execute the query sequentially instead.
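If you need to override that analysis (say, for benchmarking), you can force parallel execution with WithExecutionMode. A minimal sketch, assuming a trivial stand-in Compute method:

```csharp
using System;
using System.Linq;

class ForceParallelDemo
{
    // Stand-in workload; the real Compute is whatever your query calls.
    static int Compute(int n) { return n % 3; }

    static void Main()
    {
        var source = Enumerable.Range(1, 1000);

        // Override PLINQ's cost analysis and insist on parallel execution.
        var results = source.AsParallel()
                            .WithExecutionMode(ParallelExecutionMode.ForceParallelism)
                            .Where(n => Compute(n) > 0)
                            .ToList();

        Console.WriteLine(results.Count); // 667 numbers not divisible by 3
    }
}
```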
PLINQ uses all the CPUs on a computer, up to a maximum of 64. You can tell PLINQ to use no more than a specified number of CPUs with the WithDegreeOfParallelism<TSource> method.
var query = from item in source.AsParallel().WithDegreeOfParallelism(2)
            where Compute(item) > 42
            select item;
You can also preserve the ordering of the source in the results by specifying AsOrdered():
var evenNums = from num in numbers.AsParallel().AsOrdered()
               where num % 2 == 0
               select num;
var nums = Enumerable.Range(10, 10000);
var query = from num in nums.AsParallel()
            where num % 10 == 0
            select num;
// Remember LINQ has lazy execution, so the query above is not executed
// until it's enumerated.
// Process the results as each thread completes, and add them to a
// System.Collections.Concurrent.ConcurrentBag<int>, which can safely
// accept concurrent add operations.
var concurrentBag = new ConcurrentBag<int>();
query.ForAll((e) => concurrentBag.Add(Compute(e)));
The ForAll Operator
When you execute foreach on a query's results, PLINQ has to merge all the data back together in order to iterate through them: foreach doesn't run in parallel, and it requires the output from all the parallel tasks to be merged back into the parent thread. Use foreach when you want to preserve the final ordering of the query results or when you're processing the results in a serial manner. However, when order and sequence are not important, you should use the ForAll<TSource> method, which skips the final merge step.
There are different merge options: PLINQ supports the WithMergeOptions<TSource> method and the ParallelMergeOptions enumeration.
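A minimal sketch of the idea, assuming nothing beyond the standard PLINQ operators: NotBuffered streams each element to the consumer as soon as it's produced, while FullyBuffered waits for the complete result set before yielding anything.

```csharp
using System;
using System.Linq;

class MergeOptionsDemo
{
    static void Main()
    {
        var source = Enumerable.Range(1, 100);

        // Stream results to the consuming foreach as they are produced,
        // trading overall throughput for lower latency on the first element.
        var query = source.AsParallel()
                          .WithMergeOptions(ParallelMergeOptions.NotBuffered)
                          .Where(n => n % 2 == 0);

        int count = 0;
        foreach (var n in query) { count++; }
        Console.WriteLine(count); // 50 even numbers between 1 and 100
    }
}
```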
Look into these for more info!

Resources
http://msdn.microsoft.com/en-ca/concurrency/default.aspx
http://www.microsoft.com/downloads/details.aspx?FamilyID=86b3d32b-ad26-4bb8-a3ae-c1637026c3ee&displaylang=en

What does the runtime consist of?
Caching
Output Caching - Generated content always has to be stored in memory (output caching). On servers that experience high traffic, the memory requirements for output caching can compete with the memory requirements for other parts of the web app.
- ASP.NET 4 now gives you the ability to extend the output-caching operations: you can configure your own custom output-cache provider. Output-cache providers can use any storage mechanism to persist HTML content; storage options include local or remote disks, cloud storage, and distributed cache engines.
- For example, you could create an output-cache provider that caches the "Top 10" pages of a site in memory, while caching pages that get lower traffic on disk.
- You create a custom output-cache provider as a class that derives from the OutputCacheProvider type. You can then configure the provider in the Web.config file by using the new providers subsection of the outputCache element.
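A minimal sketch of such a provider; the ConcurrentDictionary backing store and the class name are illustrative assumptions, and a real provider would honor utcExpiry and persist to disk, cloud, or a distributed cache:

```csharp
using System;
using System.Collections.Concurrent;
using System.Web.Caching;

public class SimpleOutputCacheProvider : OutputCacheProvider
{
    // Illustrative in-memory store; swap in disk/cloud/distributed storage.
    private readonly ConcurrentDictionary<string, object> _store =
        new ConcurrentDictionary<string, object>();

    public override object Get(string key)
    {
        object entry;
        return _store.TryGetValue(key, out entry) ? entry : null;
    }

    // Add returns the existing entry if one is already cached for the key.
    public override object Add(string key, object entry, DateTime utcExpiry)
    {
        return _store.GetOrAdd(key, entry);
    }

    public override void Set(string key, object entry, DateTime utcExpiry)
    {
        _store[key] = entry;
    }

    public override void Remove(string key)
    {
        object removed;
        _store.TryRemove(key, out removed);
    }
}
```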
- By default, in ASP.NET 4, all HTTP responses, rendered pages, and controls use the in-memory output cache: the defaultProvider attribute is set to AspNetInternalProvider. You can change the default output-cache provider for a web application by specifying a different provider name in the defaultProvider attribute.
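Wired together, the Web.config might look like this (the provider name DiskCache and the type are illustrative assumptions):

```xml
<system.web>
  <caching>
    <outputCache defaultProvider="DiskCache">
      <providers>
        <add name="DiskCache"
             type="MyApp.Caching.DiskOutputCacheProvider, MyApp" />
      </providers>
    </outputCache>
  </caching>
</system.web>
```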
- You can select different output-cache providers for individual controls and for individual requests, and programmatically specify which provider to use.

Preloading Web Applications!
- The first request always takes the longest because it initializes everything.
- A new application preload manager (the autostart feature) is available when ASP.NET 4 runs on IIS 7.5 on Windows Server 2008 R2.
- You can use the application preload manager to initialize an application and then signal a load-balancer that the application was initialized and ready to accept HTTP traffic.
- Supports multiple applications in a single pool.
- When an IIS 7.5 server is cold-started, or when an individual application pool is recycled, IIS 7.5 uses the information in the applicationHost.config file to determine which web applications have to be automatically started.

Example Warm-Up
Requirements: IIS 7.5 on Windows 7 or Windows Server 2008 R2
1. Select an application pool to be automatically started by using the following in applicationHost.config:
2. When an app pool is recycled or IIS is cold-started, it reads the applicationHost.config file and determines which apps need to be autostarted. ASP.NET then fires up the type defined in the serviceAutoStartProvider (PreWarmMyCache in this example).
3. To create a managed preload type, implement the IProcessHostPreloadClient interface.
public class CustomInitialization : System.Web.Hosting.IProcessHostPreloadClient
{
    public void Preload(string[] parameters)
    {
        // Perform initialization logic.
    }
}
4. After the Preload method runs, the application is ready to serve up pages.

The applicationHost.config entries from step 1 look like this:
<sites>
  <site name="MySite" id="1">
    <application path="/" serviceAutoStartEnabled="true"
                 serviceAutoStartProvider="PreWarmMyCache" />
  </site>
</sites>
<!-- Additional content -->
<serviceAutoStartProviders>
  <add name="PreWarmMyCache"
       type="WebFx.Utility.CustomInitialization, MyLibrary" />
</serviceAutoStartProviders>

Resources: Core Services Changes Overview:
The Gu' on warming up web applications:
http://weblogs.asp.net/scottgu/archive/2009/09/15/auto-start-asp-net-applications-vs-2010-and-net-4-0-series.aspx

Diagnostics and Performance
Earlier versions of .NET didn't provide a way to determine if a particular application domain was affecting other domains, because Task Manager is only precise to the process level.
In .NET 4 you can now get processor usage and memory usage per app domain.
App domain resource monitoring is available through the managed hosting APIs.
When enabled, it collects statistics on all application domains. Check out the AppDomain.MonitoringIsEnabled property.
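A minimal sketch of reading those per-domain statistics; the Monitoring* properties are the .NET 4 additions referenced above:

```csharp
using System;

class DomainMonitorDemo
{
    static void Main()
    {
        // Opt in; once enabled, monitoring stays on for the process lifetime.
        AppDomain.MonitoringIsEnabled = true;

        var domain = AppDomain.CurrentDomain;
        Console.WriteLine("CPU time:        " + domain.MonitoringTotalProcessorTime);
        Console.WriteLine("Allocated bytes: " + domain.MonitoringTotalAllocatedMemorySize);
        Console.WriteLine("Survived bytes:  " + domain.MonitoringSurvivedMemorySize);
    }
}
```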
- 4.0 now provides background garbage collection. It replaces the concurrent GC of previous versions and has better performance.

F# for Parallel and Async Programming
Some of the challenges with concurrent programming:
Shared State: Hard to maintain, test, and parallelize, and locking is a pain in the ass.
Code Locality: We're used to expressing algorithms linearly; async requires logically dividing them up, and it's difficult to combine async operations and deal with exceptions and cancellation.
I/O Parallelism: When leveraging web services, I/O resources are inherently parallel.

Resources: http://blogs.msdn.com/timng/
Ask Harish... and tell him to work on his pool game. Thank You