Thursday, March 29, 2012
performance of application
What steps should I take to improve the performance of a web application?
ASP.NET Performance (http://msdn.microsoft.com/msdnmag/issues/05/01/ASPNETPerformance/default.aspx)
Developing High-Performance ASP.NET Applications (http://authors.aspalliance.com/aspxtreme/webapps/developinghigh-performanceaspnetapplications.aspx)
Performance of an ASP.Net application
After reviewing the code and compiling the application as a release
version, are there any other deployment and/or .NET CLR settings that
can be used to improve the application performance?
Not really. You can tweak how IIS and ASP.NET work via the machine.config
processModel section, but I doubt you'll get much out of it.
If you are seeing performance issues, caching (using OutputCaching and the
Cache API) as well as database tuning are generally good quick hits.
Karl
http://www.openmymind.net/
http://www.codebetter.com/
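Karl's two quick hits, output caching and the Cache API, look roughly like this in ASP.NET. This is an illustrative sketch, not code from the thread: the class name, cache key, durations, and the placeholder data-access method are all assumptions.

```csharp
// Output caching: cache the whole rendered page, set in the .aspx file:
//   <%@ OutputCache Duration="60" VaryByParam="None" %>
//
// Cache API: keep an expensive result in memory with an expiration.
using System;
using System.Data;
using System.Web;
using System.Web.Caching;

public static class ProductCache
{
    public static DataTable GetProducts()
    {
        Cache cache = HttpRuntime.Cache;
        DataTable products = cache["Products"] as DataTable;
        if (products == null)
        {
            products = LoadProductsFromDatabase();
            // Absolute expiration: hit the database at most once every
            // five minutes, no matter how many requests arrive.
            cache.Insert("Products", products, null,
                         DateTime.Now.AddMinutes(5), Cache.NoSlidingExpiration);
        }
        return products;
    }

    // Illustrative placeholder for the real data-access call.
    private static DataTable LoadProductsFromDatabase()
    {
        return new DataTable("Products");
    }
}
```

Output caching suits pages whose content is the same for many users; the Cache API suits shared data that several pages assemble differently.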
<robin9876@.hotmail.com> wrote in message
news:1162894730.222245.309590@.m73g2000cwd.googlegroups.com...
> After reviewing the code and compiling the application as release
> version, are there any other deployment and/or .Net CLR settings that
> can be used to improve the application performance?
>
Here is a wonderful document which will help you improve the performance of
the application:
http://www.microsoft.com/downloads/...&displaylang=en
Regards
Prabakar
Performance of CollectionBase class
I have a question that is more conceptual than about coding convention.
I have a table called products containing 40,000-odd products, which is
expected to be frequently accessed by the website. I'm unsure how fast I
can serve these product details to the requesting clients.
Basically, I have created the necessary stored procedures to pull the data
from the database. Now, I'm planning to have a data access layer class
called product, which contains properties, enumerators, etc., to load all
the product details by executing the stored procedure. That is, the data
access layer will be used by the business layer to deal with the product
details: populating the web page, filtering, and so on.
So, the product class inherits from CollectionBase to get the enumerator
and indexer functionality; it is essentially a custom data source that can
be used with a DataGrid, etc. All the business logic (how to populate,
what to populate) is written in the business layer, which uses the data
layer and is eventually used by the web developers to build the dynamic
pages of the site.
I'm not sure whether this kind of approach works as well in a web
environment as it would in a Windows application, and whether loading
40,000 records into a CollectionBase-derived object will give good
performance.
Please suggest how to proceed and which is the better way to deal with this.
Thanks in advance,
Vadivel Kumar
I would put it this way: if you are going to pull all 40K records into the
collection class and your web site will have multiple visitors (meaning
multiple instances of the same class running in memory), you will
definitely experience performance problems. If the business logic permits,
I would only pull the records necessary.
Also, .NET has memory problems with larger objects (40K records is
definitely one of that kind). You are on the right track in analysing the
problem, but you may have to compromise some performance to keep the
business logic flowing.
Prodip
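The advice to pull only the records necessary can be sketched as paging: fetch one page of products per request instead of all 40,000. The connection string, table, and column names below are illustrative assumptions.

```csharp
// Sketch: page through products rather than loading the whole table.
using System.Data;
using System.Data.SqlClient;

public static class ProductData
{
    public static DataTable GetProductPage(string connectionString,
                                           int pageIndex, int pageSize)
    {
        using (SqlConnection conn = new SqlConnection(connectionString))
        using (SqlDataAdapter adapter = new SqlDataAdapter(
            "SELECT ProductID, Name, Price FROM Products ORDER BY ProductID",
            conn))
        {
            DataTable page = new DataTable("Products");
            // Fill only one page of rows. Note this overload still reads
            // and discards the earlier rows, so for very large tables a
            // paging stored procedure (WHERE on a key range) scales better.
            adapter.Fill(pageIndex * pageSize, pageSize, page);
            return page;
        }
    }
}
```

With paging, each request holds at most pageSize rows in memory regardless of how many visitors are on the site.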
On Tue, 15 Feb 2005 08:46:01 -0600, "Prodip Saha" <psaha@.bear.com>
wrote:
>Also, .NET has memory problems with larger objects (40K records is
>definitely one of that kind). You are on the right track in analysing the
>problem but you may have to compromise some performance to keep the
>business logic flowing.
40k records would be a large collection of many small objects - which
can cause problems too.
--
Scott
http://www.OdeToCode.com/blogs/scott/
For example, my data layer contains a class called "product" which
implements the IEnumerator interface, and this class is used by the
business layer class "products", in which I have written all my business
logic. So, when retrieving any number of products, I iterate over the
product class from within the products class.
Does this make sense, and am I following the standards? If following the
standards means I have to pay a price in performance, does that show the
standards are not right?
I'm literally confused. Please advise me.
Thanks & Regards
Vadivel Kumar
Also, I would like to know one more thing: I'm developing a set of
libraries that can be used from either a Windows or a web-based interface,
and my library code should behave according to the kind of interface it is
deployed under.
How do I check this in C#? In C I would normally use preprocessor
directives to check which platform, which architecture, etc., and define
my functions based on that.
Kindly help me with this.
Thanks in advance,
Vadivel Kumar
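One common way to make this check in C# is at runtime rather than at compile time: inside an ASP.NET request, HttpContext.Current is non-null, while in a Windows application it is null. This is a sketch under that assumption; the class name is illustrative.

```csharp
// Sketch: detect at runtime whether library code is running inside an
// ASP.NET request or a plain Windows (desktop) process.
// Requires a reference to the System.Web assembly.
using System.Web;

public static class HostEnvironment
{
    // True when called during an ASP.NET request, false otherwise
    // (including on background threads outside any request).
    public static bool IsWeb
    {
        get { return HttpContext.Current != null; }
    }
}
```

Unlike C's preprocessor, which selects code when the binary is built, this decides per call. If you really want compile-time selection, C# also supports #if with symbols you define per build configuration (for example /define:WEB), but that means shipping different binaries for each host.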
performance of code
Is there some way to profile your code in Visual Studio with regard to performance, that is, to see the elapsed time per section of code crunching behind the scenes? I have an application that seems to slow way down on a certain page, but I really have no idea where it is hanging up.
Thanks in advance,
Eric
I haven't found anything that tests for complexity (calculating the McCabe
factor, etc.) like there is for C++, Fortran, or Delphi.
However, here is a link to a freeware program for C# and VB that you can
download to test performance:
http://www.alessandropulvirenti.it/programmazione/
You can enable tracing on your page (just add Trace="true" to the @ Page directive of your .aspx page), and then use Trace.Write() to record at what time different sections of code are executed.
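The tracing approach above looks roughly like this; the page class and the LoadData method are illustrative placeholders for the slow code being investigated.

```csharp
// Enable tracing in the .aspx page directive:
//   <%@ Page Language="C#" Trace="true" Inherits="_Default" %>
// Then write into the trace log around suspect sections; the trace
// output appears at the bottom of the rendered page.
using System;
using System.Web.UI;

public partial class _Default : Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        Trace.Write("Timing", "before data load " +
                    DateTime.Now.ToString("HH:mm:ss.fff"));
        LoadData();   // the section suspected of being slow
        Trace.Write("Timing", "after data load " +
                    DateTime.Now.ToString("HH:mm:ss.fff"));
    }

    private void LoadData() { /* illustrative slow section */ }
}
```

The trace table also shows "From First(s)" and "From Last(s)" columns for every Trace.Write call, so you get elapsed times between writes even without the explicit timestamps.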
Hi Eric,
Based on my understanding, you want to know how to analyze your application performance. If I have misunderstood you, please feel free to let me know.
To better understand your question, could you please confirm the following information:
What version of Visual Studio are you using? If you use Visual Studio 2005 Team Edition, you can use the performance tools integrated with the development environment (IDE) to measure, evaluate, and target performance-related issues in your code. For more information, see Analyzing Application Performance.
I hope this helps.
Tracing is very clumsy and awkward to set up and leaves you with swathes of code to remove once you've sorted the problems out.
Consider using something like this:
http://www.red-gate.com/products/ants_profiler/
You can get some pretty good results from http://www.jetbrains.com/profiler/
They offer a 10-day trial.
Performance of Data View row filter
Please clarify whether a filter on a DataView is a performance bottleneck.
We know that we cannot apply successive filters to a DataView, so the
better way is to combine conditions with 'AND' in the filter expression.
Suppose I have some four AND conditions in my filter expression; what will
the effect on performance be?
Or is there another way around this? If so, please let me know the details.
Thanks in advance.
Regards,
Sundararajan.S
Hi Sundararajan:
Filters can be a drag, but it's impossible to give you a definitive
answer. You have to measure the filters you are using in your
application with the expected load your application will receive to
determine if the performance hit is acceptable or unacceptable.
Does the data underneath the view come from a database query? If so,
one way around the performance problem is to add WHERE or HAVING
clauses to your SQL query - the database is generally much better at
filtering a set of records than .NET is.
Scott
http://www.OdeToCode.com/blogs/scott/
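The two options above can be sketched like this. The table, column, and parameter names are illustrative, not from the thread.

```csharp
using System.Data;
using System.Data.SqlClient;

public static class OrderFilters
{
    // Option 1: filter in memory with a DataView. Successive assignments
    // to RowFilter replace each other, so all conditions go into one
    // AND-combined expression.
    public static DataView FilterInMemory(DataTable orders)
    {
        DataView view = new DataView(orders);
        view.RowFilter =
            "Country = 'US' AND Total > 100 AND Status = 'Open'";
        return view;
    }

    // Option 2, usually faster for large tables: push the conditions
    // into a WHERE clause so only matching rows ever reach .NET.
    public static SqlCommand FilterInDatabase(SqlConnection conn)
    {
        SqlCommand cmd = new SqlCommand(
            "SELECT * FROM Orders " +
            "WHERE Country = @country AND Total > @total AND Status = @status",
            conn);
        cmd.Parameters.AddWithValue("@country", "US");
        cmd.Parameters.AddWithValue("@total", 100);
        cmd.Parameters.AddWithValue("@status", "Open");
        return cmd;
    }
}
```

As Scott says, only measurement under your expected load can tell you whether the in-memory filter is acceptable; the database route also benefits from indexes on the filtered columns.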