Thursday, March 29, 2012

performance of application

I have a desktop application in VB.NET and I want to convert it to a web application, but the performance of the web application should be as good as the desktop application, or at least close to it. What steps should I take to get that performance in a web application?

ASP.NET Performance (http://msdn.microsoft.com/msdnmag/issues/05/01/ASPNETPerformance/default.aspx)

High performance ASP.NET (http://authors.aspalliance.com/aspxtreme/webapps/developinghigh-performanceaspnetapplications.aspx)

Performance of an ASP.Net application

After reviewing the code and compiling the application as a release
version, are there any other deployment and/or .NET CLR settings that
can be used to improve the application performance?

Not really. You can tweak how IIS and ASP.NET work via the machine.config
processModel section, but I doubt you'll get much out of it.
If you are seeing performance issues, caching using OutputCaching and the
Cache API, as well as database tuning, are generally good quick hits.
Karl
http://www.openmymind.net/
http://www.codebetter.com/
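
To make the suggestion concrete, here is a minimal sketch of the Cache API mentioned above (the cache key, the five-minute expiration and the loader method are assumptions, not from the thread). Page-level output caching is the one-line directive <%@ OutputCache Duration="60" VaryByParam="None" %> at the top of the .aspx file.

using System;
using System.Web;
using System.Web.Caching;

public static class ProductCache
{
    // Return a cached value, loading it on a miss and keeping it for
    // five minutes; key, duration and loader are illustrative only.
    public static object GetProducts(HttpContext context)
    {
        object products = context.Cache["Products"];
        if (products == null)
        {
            products = LoadProductsFromDatabase(); // hypothetical helper
            context.Cache.Insert("Products", products, null,
                DateTime.Now.AddMinutes(5), Cache.NoSlidingExpiration);
        }
        return products;
    }

    static object LoadProductsFromDatabase() { return new object(); }
}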
Here is a wonderful document which will help you in improving the performance
of the application:
http://www.microsoft.com/downloads/...&displaylang=en
Regards
Prabakar
"robin9876@.hotmail.com" wrote:

> After reviewing the code and compiling the application as release
> version, are there any other deployment and/or .Net CLR settings that
> can be used to improve the application performance?
>

Performance of CollectionBase class

Hi Guys,

I have a question which is a little bit conceptual rather than about
coding conventions.

I have a table called products with 40,000-odd products, which is expected
to be frequently accessed by the website. I'm unsure how fast I can serve
these product details to the requesting clients.

Basically, I have created the necessary stored procedures to pull the data
out of the database. Now, I'm planning to have a data access layer class
called product, which contains properties, enumerators etc. to load all
the product details by executing the stored procedure. That is, the data
access layer will be used by the business layer to deal with the product
details: populating the web page, filtering and so on.

So, the product class inherits from CollectionBase to get all the
enumerator and indexer functionality; it is essentially a custom data
source which can be used with a DataGrid etc. All the business logic, such
as how and what to populate, is written in the business layer (it uses the
data layer), which is in turn used by the web developers to build the
dynamic pages of the site.

I'm not sure whether this kind of approach will work as well in a web
environment as in a Windows application, and whether loading 40,000
records into a CollectionBase object will give good performance or not.

Please suggest how to proceed, and which is the better way to deal with
this.

Thanks in Advance
Vadivel Kumar

I would put it this way: if you are going to pull all 40K records into the
collection class and your web site will have multiple visitors (that means
multiple instances of the same class would be running in memory), you will
definitely experience performance problems. If the business logic permits,
I would only pull the records necessary.

Also, .NET has memory problems with larger objects (40K records is
definitely one of that kind). You are on the right track in analysing the
problem, but you may have to compromise some performance to keep the
business logic flowing.

Prodip

"Vadivel Kumar" <donotreply@.spam-i-love-u.com> wrote in message
news:uBxbJK2EFHA.2700@.TK2MSFTNGP14.phx.gbl...
> Hi Guys,
> I have a doubt which is little bit conceptual rather than a coding
> convention.
> I have a a table called products in which I have 40000 and odd products
> which is expected to be frequently
> accessed by the website. I'm having doubt in how fast i can serve this
> product details to the requesting clients.
> Basically, I have created necessary stored procedures to pull out the data
> from the database. Now, I'm planning to have
> a data access layer called product (a class) which contains properties,
> enumerators etc., to load all the product details
> by executing the stored procedure. That is, the data access layer will be
> used by the business layer to deal with the product
> details in the means of populating it in the webpage, filtering etc.,
> So, the product class inherits Collectionbase class to have all the
> enumerator, indexer funtionalities which is nothing
> but a custom data source and which can be used with DataGrid etc., and all
> the business logic like how to populate, what
> to populate has been written in the business layer (it uses the data layer
> to define the bl), which is eventually used by
> the web developers to design the dynamic pages of the site.
> I'm having doubt whether this kind of approach will work best in web
> atmosphere rather than a windows application or not?
> and also loading 40,000 records in a collection base object will give
better
> performance or not?
> Please kindly suggest me, how to proceed in the terms of which is better
way
> to deal with this.
> Thanks in Advance
> Vadivel Kumar

On Tue, 15 Feb 2005 08:46:01 -0600, "Prodip Saha" <psaha@.bear.com>
wrote:

>Also, .NET has memory problems with larger objects (40K records is
>definitely one of that kind). You are on the right track in analysing the
>problem, but you may have to compromise some performance to keep the
>business logic flowing.

40k records would be a large collection of many small objects - which
can cause problems too.

--
Scott
http://www.OdeToCode.com/blogs/scott/
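
A minimal sketch of the point made above - pull one page of rows on demand instead of loading all 40,000 into a collection. The stored procedure name and parameters are assumptions, not from the thread:

using System.Data;
using System.Data.SqlClient;

class ProductCatalog
{
    // Fill a DataTable with a single page of products, so the web
    // server only ever holds the rows the current page needs.
    static DataTable GetProductPage(string connectionString, int pageIndex, int pageSize)
    {
        using (SqlConnection connection = new SqlConnection(connectionString))
        using (SqlCommand command = new SqlCommand("GetProductPage", connection))
        {
            command.CommandType = CommandType.StoredProcedure;
            command.Parameters.Add("@PageIndex", SqlDbType.Int).Value = pageIndex;
            command.Parameters.Add("@PageSize", SqlDbType.Int).Value = pageSize;

            DataTable page = new DataTable("Products");
            new SqlDataAdapter(command).Fill(page);
            return page;
        }
    }
}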
For example, my data layer contains a class called "product" which
implements the IEnumerator interface, and this class is used by the
business layer class "products", in which I have written all my business
logic.

So, while retrieving any number of products, I'm iterating over the
product class from the products class.

Does this make sense, and am I following the standards? If following the
standards means I have to pay the price in performance, does that show
the standards are not right?

I'm literally confused. Please advise me.

Thanks & Regards
Vadivel Kumar

"Vadivel Kumar" <donotreply@.spam-i-love-u.com> wrote in message
news:uBxbJK2EFHA.2700@.TK2MSFTNGP14.phx.gbl...
> Hi Guys,
> I have a doubt which is little bit conceptual rather than a coding
> convention.
> I have a a table called products in which I have 40000 and odd products
> which is expected to be frequently
> accessed by the website. I'm having doubt in how fast i can serve this
> product details to the requesting clients.
> Basically, I have created necessary stored procedures to pull out the data
> from the database. Now, I'm planning to have
> a data access layer called product (a class) which contains properties,
> enumerators etc., to load all the product details
> by executing the stored procedure. That is, the data access layer will be
> used by the business layer to deal with the product
> details in the means of populating it in the webpage, filtering etc.,
> So, the product class inherits Collectionbase class to have all the
> enumerator, indexer funtionalities which is nothing
> but a custom data source and which can be used with DataGrid etc., and all
> the business logic like how to populate, what
> to populate has been written in the business layer (it uses the data layer
> to define the bl), which is eventually used by
> the web developers to design the dynamic pages of the site.
> I'm having doubt whether this kind of approach will work best in web
> atmosphere rather than a windows application or not?
> and also loading 40,000 records in a collection base object will give
> better performance or not?
> Please kindly suggest me, how to proceed in the terms of which is better
> way to deal with this.
> Thanks in Advance
> Vadivel Kumar
Also, I would like to know one more thing: I'm developing a set of
libraries which can be used from a Windows or a web-based interface, and
my library code will execute differently based on which kind of interface
it is deployed under.

So, how do I check this in C#? In C I would normally use preprocessor
directives to check which platform, which architecture etc., and define
my functions based on that.

Kindly help me on this.

Thanks in Advance
Vadivel Kumar
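
One note on this last question: C# does have #if/#define preprocessor directives, but they are resolved at compile time, so a single compiled library cannot switch on them per deployment. A common runtime check is whether an HTTP context exists; a minimal sketch, with the class name being an assumption:

using System.Web;

public static class HostEnvironment
{
    // HttpContext.Current is non-null only while code executes inside
    // an ASP.NET request; in a Windows application it is null.
    public static bool IsWebHosted
    {
        get { return HttpContext.Current != null; }
    }
}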

"Vadivel Kumar" <donotreply@.spam-i-love-u.com> wrote in message
news:uBxbJK2EFHA.2700@.TK2MSFTNGP14.phx.gbl...
> Hi Guys,
> I have a doubt which is little bit conceptual rather than a coding
> convention.
> I have a a table called products in which I have 40000 and odd products
> which is expected to be frequently
> accessed by the website. I'm having doubt in how fast i can serve this
> product details to the requesting clients.
> Basically, I have created necessary stored procedures to pull out the data
> from the database. Now, I'm planning to have
> a data access layer called product (a class) which contains properties,
> enumerators etc., to load all the product details
> by executing the stored procedure. That is, the data access layer will be
> used by the business layer to deal with the product
> details in the means of populating it in the webpage, filtering etc.,
> So, the product class inherits Collectionbase class to have all the
> enumerator, indexer funtionalities which is nothing
> but a custom data source and which can be used with DataGrid etc., and all
> the business logic like how to populate, what
> to populate has been written in the business layer (it uses the data layer
> to define the bl), which is eventually used by
> the web developers to design the dynamic pages of the site.
> I'm having doubt whether this kind of approach will work best in web
> atmosphere rather than a windows application or not?
> and also loading 40,000 records in a collection base object will give
> better performance or not?
> Please kindly suggest me, how to proceed in the terms of which is better
> way to deal with this.
> Thanks in Advance
> Vadivel Kumar


performance of code

Is there some way to profile your code in Visual Studio with regard to performance, meaning, to see the time elapsed per section of code that is crunching behind the scenes? I have an application that seems to slow way down at a certain page, but I really have no idea where it is hanging up.

Thanks in advance,

Eric

I haven't found anything that tests for complexity, calculating the McCabe factor etc., like there is for C++, Fortran or Delphi.

However, here is a link that does have a freeware program you can download to test performance:

http://www.alessandropulvirenti.it/programmazione/ (for C# and VB)


You can enable tracing on your page (just add Trace="true" to the page directive of your .aspx page), and then use Trace.Write(DateTime.Now.ToString()) to determine at what time different sections of code are executed.
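
A minimal sketch of that technique (the event handler and the slow section are hypothetical); the timestamps then show up in the trace output appended to the rendered page:

// Code-behind of a page whose directive includes Trace="true".
protected void Page_Load(object sender, EventArgs e)
{
    Trace.Write("Timing", "Before data load: " + DateTime.Now.ToString("HH:mm:ss.fff"));
    BindReportData(); // hypothetical section suspected of being slow
    Trace.Write("Timing", "After data load: " + DateTime.Now.ToString("HH:mm:ss.fff"));
}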


Hi Eric,

Based on my understanding, you want to know how to analyze your application performance. If I have misunderstood you, please feel free to let me know.

To better understand your question, could you please confirm the following information:
What version of Visual Studio are you using? If you use Visual Studio 2005 Team Edition, you can use the Performance Tools integrated with the development environment (IDE) to measure, evaluate, and target performance-related issues in your code. For more information, see Analyzing Application Performance.


I hope this helps.


Tracing is very clumsy and awkward to set up and leaves you with swathes of code to remove once you've sorted the problems out.

Consider using something like this:

http://www.red-gate.com/products/ants_profiler/


You can get some pretty good results from http://www.jetbrains.com/profiler/

They offer a 10 day trial.

Performance of Data View row filter

Dear Folks,

Please clarify for me whether a filter on a DataView is a performance
bottleneck. We know that we cannot apply successive filters to a DataView,
so the better way is to combine them with 'AND' conditions in the filter
expression. Suppose I have some four AND conditions in my filter
expression: what will their effect be on performance? Or is there another
way around this? If so, please let me know the details.

Thanks in advance.

Regards,
Sundararajan.S

Hi Sundararajan:
Filters can be a drag, but it's impossible to give you a definitive
answer. You have to measure the filters you are using in your
application with the expected load your application will receive to
determine if the performance hit is acceptable or unacceptable.
Does the data underneath the view come from a database query? If so,
one way around the performance problem is to add WHERE or HAVING
clauses to your SQL query - the database is generally much better at
filtering a set of records than .NET is.
Scott
http://www.OdeToCode.com/blogs/scott/
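
A minimal sketch of both options; the table and column names are assumptions:

using System.Data;

class FilterDemo
{
    static DataView FilterProducts(DataTable products)
    {
        // Several conditions combined with AND in one filter expression,
        // instead of trying to apply successive filters.
        DataView view = new DataView(products);
        view.RowFilter = "CategoryID = 3 AND Price < 50 AND Discontinued = false";
        return view;

        // Often cheaper: push the same conditions into the query, e.g.
        // SELECT * FROM Products WHERE CategoryID = 3 AND Price < 50 AND Discontinued = 0
    }
}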



Performance of DataBinder.Eval

Howdy All,

I was reading an older posting about the performance hit of
DataBinder.Eval.

http://groups-beta.google.com/group...02b1685ae0f63c0

Am I correct in understanding that

<%# ((DataRowView)Container.DataItem)["EmployeeName"] %>

would have better performance than

<%# DataBinder.Eval(Container.DataItem, "EmployeeName") %>

Does anybody have comments on this, or is there a better way to set the
values of the template column?

thanks

dbl

DBL:
You are correct.

I can't give you numbers, but you ARE correct.

I will say one more thing, however. Casting DataItem to the specific type
might be faster, but it violates your tiers and makes it harder to maintain.
Your presentation shouldn't know what TYPE of data source is being bound; it
should be implementation independent. The minute you start casting
everything, you make it a real pain in the ass to change if you start using
collections or datasets.

Sometimes the penalty is too high to pay, but in my opinion, in this case
I'd strongly suggest you use DataBinder.Eval unless you have compelling
reasons why you shouldn't in your particular case.

Karl

--
MY ASP.Net tutorials
http://www.openmymind.net/ - New and Improved (yes, the popup is annoying)
http://www.openmymind.net/faq.aspx - unofficial newsgroup FAQ (more to
come!)

"DBLWizard" <ibflyfishin@.yahoo.com> wrote in message
news:1121106912.352713.81740@.g44g2000cwa.googlegro ups.com...
Howdy All,

I was reading an older posting about the performance hit of
DataBinder.Eval.

http://groups-beta.google.com/group...02b1685ae0f63c0

Am I correct in understanding that

<%# ((DataRowView)Container.DataIt*em)["EmployeeName"] %
would have better performance than
<%# DataBinder.Eval(Container.DataIt*em, ["EmployeeName"]) %
Does anybody have comments on this or is there a better way to set the
values of the Template Column?

thanks

dbl

Performance of datatable.select() func against database query or the XQuery

Hi all

I am developing a web portal on VS2005 and ASP.NET 2.0, using MS SQL
Server 2005. I have some static records (around 15,000) in the database;
these contents never change while the application is running.

What I'm doing is querying the database only once, in the
Application_Start event of global.asax, and storing the records in a
DataSet; no further query to the database is made for this data.
Wherever I need this data I fetch it from the DataSet through the
DataTable.Select(filterExpression) method. This is done several times for
a single page request, and the website is a high-traffic portal.

Is this an efficient way of doing it, or is there a better way by which I
can achieve the same, given that the amount of data is fairly large
(approx. 15,000 records)? Please suggest.

Thanks

Regards

Deepti Yadav
Noida

OK, here is my suggestion: 15,000 rows for a select is a lot of data to
display in an ASP.NET web application. Are your users realistically going
to need to see all 15,000 rows? Why not have a more restrictive select with
some sort of paging mechanism if they want to move to the next page of results?

Peter
--
Site: http://www.eggheadcafe.com
UnBlog: http://petesbloggerama.blogspot.com
Short urls & more: http://ittyurl.net
"Mukesh" wrote:

Quote:

Originally Posted by

>
Hi all
>
I am developing a webportal on VS2005 and ASP.NET 2.0
Using MsSQL SERVER 2005.
I have some static records(around 15000) in the database.These contents
never changed while the application is running.
>
what i m doing is querying the database only once at the
application_start event of the global.asax file and storing the records
in dataset...and further no query to database is done for this data.
wherever I need this data I fetch it from the dataset...through
datatable.select(filter expression) method.this is done several time for
a single page request and the website is high traffic portal.
>
Is it an efficient way of doing or there is any other better way by
which I can achieve the same..,as the amount of data is very
large...approx 15,000 records.
Plz Suggest me ........
>
>
Thanxxxxxxxxxxxxxx
>
Regards
>
Deepti Yadav
Noida
>


Hi Deepti,

Although you're not directly displaying all 15,000 records to the end user
(you used DataTable.Select to filter), it's still not a good idea to cache
all 15,000 records on the server side.

I would suggest optimizing on the database side: with careful index
design, I think you should get good performance when frequently selecting
from the table. After all, the database server also does pretty well at
caching frequently queried data.

Hope this helps.

Regards,
Walter Wang (wawang@.online.microsoft.com, remove 'online.')
Microsoft Online Community Support

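
If the data really must stay in memory, one alternative to scanning with DataTable.Select on every request is to build a keyed lookup once at Application_Start. A minimal sketch; the table and column names are assumptions:

using System.Collections.Generic;
using System.Data;

public static class StaticLookup
{
    // Built once at Application_Start; a Dictionary keyed on the lookup
    // column turns each fetch into a hash lookup instead of a table scan.
    static readonly Dictionary<int, DataRow> byId = new Dictionary<int, DataRow>();

    public static void Build(DataTable records)
    {
        foreach (DataRow row in records.Rows)
            byId[(int)row["RecordID"]] = row;
    }

    public static DataRow Find(int id)
    {
        DataRow row;
        return byId.TryGetValue(id, out row) ? row : null;
    }
}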

performance of release vs debug build

Hi,
Up until now I have been shipping debug builds of my asp.net
application and including the pdb files too, even into production
systems, primarily because it gives us proper stack traces with line
numbers in the event of an exception. I haven't been aware of any
stability, memory or performance problems with the application, even
for sites that have been running 24/7 for months with worker process
recycling disabled.
Is there any reason I shouldn't continue doing this indefinitely and
only switch to release builds if a customer needs a really highly
tuned system, or are there some gotchas with debug builds?

A related question: any general guidelines as to how much faster C#
code performs in a release build? For most asp.net apps I would
anticipate that the system load of the application as a whole is
probably 50% IIS/framework, 50% database and only a minuscule amount
for my processing.

cheers
Andy

Hi,
See:
http://weblogs.asp.net/scottgu/arch...0_-enabled.aspx
Note also the followup link in the post.
Teemu Keiski
AspInsider, ASP.NET MVP
http://blogs.aspadvice.com/joteke
http://teemukeiski.net
Thanks very much for these links.

It seems to me that we can continue to use a debug build for the code
in the application without too much penalty, but should definitely use
debug="false" in web.config.
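
For reference, a minimal web.config fragment with that setting:

<configuration>
  <system.web>
    <compilation debug="false" />
  </system.web>
</configuration>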
Yup, that's what Scott describes as a sort of "compromise".
Teemu

Performance of Xsl Transformations

Folks, I'm running into some performance issues with my Xsl transformations.
I've done a ton of debugging and digging around, and have come to the
conclusion that the performance issues are NOT caused by slow stored
procedures, or bad XSL/Ts.

I came to this conclusion by doing a test transformation in client-side code
instead of server side aspx, as shown at the bottom of this post. The
transformations were super-fast. However, the client-side javascript isn't a
viable solution for my application.

I'm sure that different versions of the XMLDOM are being used when the code
is client-side as opposed to using System.Xml.Xsl.

Here's my C# code. What I'm trying to do is write out an HTML file of my
transformation.

// Create a FileStream to write with
System.IO.FileStream fs = new System.IO.FileStream(exportPath,
System.IO.FileMode.Create);
// Create an XmlTextWriter for the FileStream
System.Xml.XmlTextWriter oXmlTextWriter = new
System.Xml.XmlTextWriter(fs, System.Text.Encoding.Unicode);

try
{
// Set up the XmlResolver
XmlUrlResolver oXmlUrlResolver = new XmlUrlResolver();
oXmlUrlResolver.Credentials = CredentialCache.DefaultCredentials;

// Set up the XslTransform
System.Xml.Xsl.XslTransform oXslTransform = new
System.Xml.Xsl.XslTransform();
oXslTransform.Load(MyXslPath, oXmlUrlResolver);

// Perform Transformation
XmlDataDocument oXmlDataDocument = new XmlDataDocument(MyDataSet);
oXslTransform.Transform(oXmlDataDocument, null, oXmlTextWriter,
oXmlUrlResolver);

// Clean up
oXmlTextWriter.Close();

return exportPath; // defined elsewhere
}
catch (Exception ex)
{
oXmlTextWriter.Close();
System.IO.File.Delete(exportPath);
ExceptionManager.Publish(ex);
throw(ex);
}

The code works, but it's slow. It was suggested that I use an XPathDocument
instead of an XmlDataDocument. How would I do that?
Any suggestions? Thank You

Client-Side Transformation
<%
var sXml = "MyXml.Xml"
var sXsl = "MyXsl.xsl"

var oXmlDoc = Server.CreateObject("MICROSOFT.XMLDOM");
var oXslDoc = Server.CreateObject("MICROSOFT.XMLDOM");
oXmlDoc.async = false;
oXslDoc.async = false;
oXmlDoc.load(Server.MapPath(sXml));
oXslDoc.load(Server.MapPath(sXsl));
Response.Write(oXmlDoc.transformNode(oXslDoc));
%>

OK, I figured this out. I would like to do some research into why this is
faster (MUCH faster).

The slow way:

DataSet oDataSet = GetReportData(); // hypothetical helper that fills the dataset with report data
System.IO.FileStream oFileStream = new System.IO.FileStream(exportPath,
System.IO.FileMode.Create);
System.Xml.XmlTextWriter oXmlTextWriter = new
System.Xml.XmlTextWriter(oFileStream, System.Text.Encoding.Unicode);
try
{
XmlUrlResolver oXmlUrlResolver = new XmlUrlResolver();
oXmlUrlResolver.Credentials = CredentialCache.DefaultCredentials;
System.Xml.Xsl.XslTransform oXslTransform = new
System.Xml.Xsl.XslTransform();
oXslTransform.Load(_XslPath, oXmlUrlResolver);
XmlDataDocument oXmlDataDocument = new XmlDataDocument(oDataSet);
oXslTransform.Transform(oXmlDataDocument, null, oXmlTextWriter,
oXmlUrlResolver);
oXmlTextWriter.Close();

return exportPath; // path of exported file
}
catch (Exception ex)
{
oXmlTextWriter.Close();
System.IO.File.Delete(exportPath);
throw(ex);
}

The fast way:

DataSet oDataSet = GetReportData(); // hypothetical helper that fills the dataset with report data
XmlTextReader oXmlTextReader = new XmlTextReader(oDataSet.GetXml(),
XmlNodeType.Document, null);
System.IO.FileStream oFileStream = new System.IO.FileStream(exportPath,
System.IO.FileMode.Create);
System.Xml.XmlTextWriter oXmlTextWriter = new
System.Xml.XmlTextWriter(oFileStream, System.Text.Encoding.Unicode);
try
{
XmlUrlResolver oXmlUrlResolver = new XmlUrlResolver();
oXmlUrlResolver.Credentials = CredentialCache.DefaultCredentials;
System.Xml.Xsl.XslTransform oXslTransform = new
System.Xml.Xsl.XslTransform();
oXslTransform.Load(_XslPath, oXmlUrlResolver);
XPathDocument oXPathDocument = new XPathDocument(oXmlTextReader);
oXslTransform.Transform(oXPathDocument, null, oXmlTextWriter,
oXmlUrlResolver);
oXmlTextWriter.Close();

return exportPath; // path of exported file
}
catch (Exception ex)
{
oXmlTextWriter.Close();
System.IO.File.Delete(exportPath);
throw(ex);
}

"George Durzi" <gdurzi@.hotmail.com> wrote in message
news:#eQPvdwzDHA.1744@.TK2MSFTNGP12.phx.gbl...
> Folks, I'm running into some performance issues with my Xsl
transformations.
> I've done a ton of debugging and digging around, and have come to the
> conclusion that the performance issues are NOT caused by slow stored
> procedures, or bad XSL/Ts.
> I came to this conclusion by doing a test transformation in client-side
code
> instead of server side aspx, as shown at the bottom of this post. The
> transformations were super-fast. However, the client-side javascript isn't
a
> viable solution for my application.
> I'm sure that different versions of the XMLDOM are being used when the
code
> is client-side as opposed to using System.Xml.Xsl.
> Here's my C# code. What I'm trying to do is write out an HTML file of my
> transformation.
> // Create a FileStream to write with
> System.IO.FileStream fs = new System.IO.FileStream(exportPath,
> System.IO.FileMode.Create);
> // Create an XmlTextWriter for the FileStream
> System.Xml.XmlTextWriter oXmlTextWriter = new
> System.Xml.XmlTextWriter(fs, System.Text.Encoding.Unicode);
> try
> {
> // Set up the XmlResolver
> XmlUrlResolver oXmlUrlResolver = new XmlUrlResolver();
> oXmlUrlResolver.Credentials = CredentialCache.DefaultCredentials;
> // Set up the XslTransform
> System.Xml.Xsl.XslTransform oXslTransform = new
> System.Xml.Xsl.XslTransform();
> oXslTransform.Load(MyXslPath, oXmlUrlResolver);
> // Perform Transformation
> XmlDataDocument oXmlDataDocument = new XmlDataDocument(MyDataSet);
> oXslTransform.Transform(oXmlDataDocument, null, oXmlTextWriter,
> oXmlUrlResolver);
> // Clean up
> oXmlTextWriter.Close();
> return exportPath; // defined elsewhere
> }
> catch (Exception ex)
> {
> oXmlTextWriter.Close();
> System.IO.File.Delete(exportPath);
> ExceptionManager.Publish(ex);
> throw(ex);
> }
>
> The code works, but it's slow. It was suggested that I use an
XPathDocument
> instead of an XmlDataDocument. How would I do that?
> Any suggestions? Thank You
>
> Client-Side Transformation
> <%
> var sXml = "MyXml.Xml"
> var sXsl = "MyXsl.xsl"
> var oXmlDoc = Server.CreateObject("MICROSOFT.XMLDOM");
> var oXslDoc = Server.CreateObject("MICROSOFT.XMLDOM");
> oXmlDoc.async = false;
> oXslDoc.async = false;
> oXmlDoc.load(Server.MapPath(sXml));
> oXslDoc.load(Server.MapPath(sXsl));
> Response.Write(oXmlDoc.transformNode(oXslDoc));
> %>

Performance on images.

I have a web application which contains loads of images.

When I load it from our hosted site, it's pretty slow. I can see every image as it loads, and sometimes images do not display until they are used (e.g. on mouseover of a certain image panel).

The question is: what is the best practice for handling images to boost performance? Currently I just place my images in a folder and use CSS to reference them.

Will it improve performance to use resource files instead?

thanks!

The only way I can think of is to cache the page, but the first load still needs time, as people physically have to download the images no matter what.

Hi,

The way you are doing it is the common thing, and I think it's the right way too. This problem may be because of poor bandwidth. Anyway, it will be better to use images with fewer KBs.


Thank you for your input. One thing more: I heavily use server controls, and I'm not really sure whether that has any effect on performance, leaving aside the ViewState factor...

I just have one thing to consider: if you are dealing with large images, you can create thumbnails from them, which will decrease their size and increase the loading speed.

Create Thumbnail Images

HC


Can you elaborate on what you mean, please?

Do you mean to use the code you provided in your blog to create thumbnail images, and use those images instead?


The code provided will create a thumbnail image from the original image with a specified width and height (100 px by 100 px), which will decrease its size and make the load process better. For example, if you have an image of about 6 MB, when you create a thumbnail it will be 24 KB or even less.

So what you can do is programmatically create the thumbnails using the provided code, save them in a folder on your web server, and load them instead of the original images.

HC
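
A minimal sketch of that kind of thumbnail generation using System.Drawing; the paths, dimensions and names are assumptions rather than the exact code from the blog:

using System;
using System.Drawing;

class ThumbnailMaker
{
    // Shrink a source image and save the result. Note that the target
    // folder must already exist and be writable, or GDI+ throws its
    // "generic error" on Save.
    static void SaveThumbnail(string sourcePath, string thumbPath, int width, int height)
    {
        using (Image original = Image.FromFile(sourcePath))
        using (Image thumb = original.GetThumbnailImage(width, height, null, IntPtr.Zero))
        {
            thumb.Save(thumbPath);
        }
    }
}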


Hi,

I've read your problem through, and I think both of the above solutions can solve it. You can use output caching on your page, setting the Duration relatively long and the Location to 'Any'. For more information you can refer to:

ms-help://MS.MSDNQTR.v80.en/MS.MSDN.v80/MS.VisualStudio.v80.en/dv_aspnetcon/html/51149563-a347-4b2b-bdc2-57a317e12487.htm

Also, you can follow Haissam's suggestion to create thumbnail images for the original ones.

Well, I have another suggestion: you can choose to host your images on another, more powerful web host (if you have got quite a lot of images). This can make access to the images much faster than now.

Hope my suggestion helps. :)


I appreciate all your responses. Thank you very much! =)

Hi,

How do I open this link? ms-help://MS.MSDNQTR.v80.en/MS.MSDN.v80/MS.VisualStudio.v80.en/dv_aspnetcon/html/51149563-a347-4b2b-bdc2-57a317e12487.htm


The 100 px by 100 px - is that fixed, or do you just need to set it to match the original height and original width?

So if you have an image 1000 px by 400 px, do you just set it to these values?


Another thing: I'm getting an "A generic error occurred in GDI+" error when saving the thumbnail. Any ideas?

Oh, I assumed that you have MSDN installed on your local computer. I'm sorry. You can still refer to: http://msdn2.microsoft.com/en-us/library/zd1ysf1y(VS.80).aspx

It's the same :)

performance problem after porting .net 1.1 to 2.0

Porting the .NET project to .NET 2.0 caused a lot of performance issues.

The server goes down because of CPU usage and memory usage.

thanks

You are going to have to supply more details for anyone to be able to offer suggestions. Typically we see sites run much faster moving to 2.0.


Hi,

We have an application which contains:

1. a client application

2. a web application

These applications use the same web service, which contains methods that
return DataSets. Each DataSet contains 7-8 tables (the tables are merged
into the DataSet using the Merge method). Crystal Reports is also used.

In the web application, reports are exported to PDF format.

In the client application, Crystal Reports 10 is used.

Also, session and view state are used.

Porting the .NET project to .NET 2.0 caused a lot of performance issues.

The server goes down because of CPU usage and memory usage.

thanks

Performance problem with asp.net and database connection

Hello,

I have an ASP.NET application with master pages, skins and different themes.
The application works fine, but the performance is not really good. If I
load an ASPX file which has no database connection, the performance is OK.
ASPX files with one or more database queries have a response time of about
15 or 20 seconds. Each database query needs at most a second to give its
answer, so the queries themselves work fine!
Do you know a reason why the time with a database connection is so long?
How can I make my application faster?

Thanks for the answers and tips

Thomas

Hello Thomas,

Do you use connection pooling?
It seems the reason for your delay is that no connection pooling is
activated, and a new connection takes some amount of time to be initiated.

--
WBR, Michael Nemtsev [.NET/C# MVP].
My blog: http://spaces.live.com/laflour
Team blog: http://devkids.blogspot.com/
"The greatest danger for most of us is not that our aim is too high and we
miss it, but that it is too low and we reach it" (c) Michelangelo

On Tue, 3 Jul 2007 07:26:48 +0000 (UTC), Michael Nemtsev
<nemtsev@.msn.com> wrote:



I was under the impression that connection pooling was activated by
default.

One gotcha with connection pooling is that the default "Min Pool Size" is
zero - the number of permanently open connections.

* Add a "Min Pool Size" entry to your connection string. 5 seems a
suitable minimum pool size.

* Check all your database code to ensure that connections are closed
after being used. the "using" keyword is handy here as it limits the
scope of the connection. Although your app should just fail if this
isn't the case rather than just going slow.

* If you are using datasets - what happens when you have no data in the
dataset?
- make sure you check with something like
if ( rs != null ) ... before you try to access a dataset

* Consider using data caching for some of the data.

* Try not to use datasets when you don't need to. For instance accessing
a DataTable or, better some kind of generic list (e.g. List<T>) is
nearly always faster - this will often use a datareader to read the
actual data - there is no real syntax shortcut but it seems to always
exectute faster.

Read this: http://www.15seconds.com/issue/040830.htm
I can't add anything more to this as we really need to see the code
before commenting further.
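
A minimal sketch of the first two points (server, database and query names are assumptions, not from the thread):

using System.Data.SqlClient;

class PooledQuery
{
    // "Min Pool Size" keeps a few connections permanently open; the
    // using blocks guarantee the connection goes back to the pool even
    // if the query throws.
    const string ConnectionString =
        "Data Source=myServer;Initial Catalog=myDb;Integrated Security=SSPI;Min Pool Size=5";

    static object CountOrders()
    {
        using (SqlConnection connection = new SqlConnection(ConnectionString))
        using (SqlCommand command = new SqlCommand("SELECT COUNT(*) FROM Orders", connection))
        {
            connection.Open();
            return command.ExecuteScalar();
        }
    }
}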



Hi Thomas,

In addition to what Mark said, you could easily identify bottlenecks on the
page by turning on tracing (see the <trace> web.config element). Plus,
inspect the size of the ViewState, as it could have grown large. There's
one more potential cause I have seen in several systems: check how many
queries are run for each page impression, and how they affect the database
(Profiler is the best tool to use in this case).

Hope this helps
--
Milosz

"Hahn, Thomas" wrote:

Quote:

Originally Posted by

Hallo,
>
I have an ASP.NET application with masterpages, skins and diffrent themes.
The application works fine, but the performance is not realy good. If I load
an ASPX file, which has no database connection, is the performance ok. ASPX
file with one or more database queries have a answer time about 15 or 20
seconds. Each database query need maximal a second to give the answer. So
the query works ok!
Know you a reason, why the time with a database connection is so long? How
can I make my application faster?
>
Thanks for the answers and tips
>
Thomas
>
>
>


Performance problem with multiple choice questions

I'm working with a website that presents a series of multiple choice
questions to users.
The problem at hand is that the pages with the questions take some time to
load, and the users are under a time limit to answer the questions.
Would you know if there is a way to "download" the question bitmaps to the
user's PC, to some type of cache, before the timer starts for the user?

Hello Night,
It depends on what your timer is.
Is it possible to start the timer for each page (after page load) and stop
it when the user presses next?
If not, then I would read your data into the ASP.NET Cache and build it,
but the building can take some time.
I'd rely on timer stop/start after the page is loaded, and deduct this time
from the overall time allowed to the user to answer all the questions.
WBR,
Michael Nemtsev [.NET/C# MVP] :: blog: http://spaces.live.com/laflour
"The greatest danger for most of us is not that our aim is too high and we
miss it, but that it is too low and we reach it" (c) Michelangelo
NA> I'm working with a website that presents a series of multiple choice
NA> questions to users.
NA>
NA> The problem at hand is that the pages with the questions take some
NA> time to load and the users are under a time limit to answer the
NA> questions.
NA>
NA> Would you know if there is a way to "download" the question bitmaps
NA> to the user's PC to some type of cache before the timer starts for
NA> the user?
NA>

Performance problem with RegEx

I have a performance issue related to regular expressions and
caching; hopefully someone can point me in the right direction.
I have an asp.net web service that is called several million times a
day. It does not have data caching enabled, since the input variables
for the webmethods change every time it's called. It's my understanding
that if the input parameters change frequently, the hash table
created by a caching option won't actually do any good.
Every time the web service is called, among other things, the webmethod
gets a list of RegEx from a SQL database and then loops through each
expression trying to find a match with a webmethod input parameter.
I have seen that as I add more regular expressions to test against, my
processor usage on the machine goes up accordingly. Right now I have
about 100 RegEx in my db table and my proc usage is pretty consistently
at 100%. In addition I see "random" out of memory exceptions from the
RegEx engine itself; the web server has 4 GB of RAM running on
Server 2003.
I have two questions:
1. Is there a way for me to cache the list of regular expressions I am
getting from the SQL db? If I were using an aspx page I would use the
data cache in ado.net; however, how can I accomplish the same
thing in the web service?
2. Can I "pre-compile" or otherwise improve the performance of the
regular expression testing itself?
thanks,
Jeff
Regex caching is automatic if you use the static methods on the Regex
class, but not if you create your own Regex instances.
I would suggest retrieving the regular expressions from the db once,
creating a Regex instance from each and storing them in a static array.
Use the RegexOptions.Compiled flag when creating the instances, since
they'll be used many times. Regex instances are immutable and thread-safe,
so you don't have to worry about using the same instance many times
simultaneously.
Another thing to look at is optimizing the array order. Say you are
matching against 100 regular expressions and assume you only need to
find the first match. Keep stats on how often the patterns are
matched, and over time reorder the expressions so the most common
expressions are checked first.
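A minimal sketch of the static-array approach; LoadPatternsFromDb is a
hypothetical stand-in for the real database call:

    using System.Text.RegularExpressions;

    public static class RegexCache
    {
        // Built once per AppDomain; Regex instances are immutable and
        // thread-safe, so one array serves all requests.
        private static readonly Regex[] Patterns = BuildPatterns();

        private static Regex[] BuildPatterns()
        {
            string[] patterns = LoadPatternsFromDb(); // hypothetical DB call
            Regex[] result = new Regex[patterns.Length];
            for (int i = 0; i < patterns.Length; i++)
                result[i] = new Regex(patterns[i], RegexOptions.Compiled);
            return result;
        }

        public static bool MatchesAny(string input)
        {
            foreach (Regex r in Patterns)
                if (r.IsMatch(input))
                    return true;
            return false;
        }

        private static string[] LoadPatternsFromDb()
        {
            return new string[0]; // placeholder: read the RegEx table here
        }
    }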
HTH,
Sam
----
We're hiring! B-Line Medical is seeking .NET
Developers for exciting positions in medical product
development in MD/DC. Work with a variety of technologies
in a relaxed team environment. See ads on Dice.com.
On Wed, 18 Jul 2007 23:01:34 -0000, "jmacduff@.gmail.com"
<jmacduff@.gmail.com> wrote:

>I have a performance issue related to regular expressions and
>caching , hopefully someone can point me in the right direction?
>I have a asp.net web service that is called several million times a
>day. It does not have data caching enabled since the input variables
>for the webmethods change every time its called. It's my understanding
>that if the input paramters change frequently that the hash table
>created by a caching option wont actually do any good.
>Everytime the web service is called, among other things, the webmethod
>gets a list of RegEx from a sql database and then loops through each
>expression trying to find a match with a webmethod input paramter.
>I have seen that as I add more regular expressions to test against, my
>processor usage on the machine goes up accordinly. Right now I have
>about 100 RegX in my db table and my proc usage is pretty consistantly
>at 100%. In addition I see "random" out of memory exceptions from the
>RegEx engine itself, the web server has 4 gigs of ram running on
>server 2003.
>I have two questions:
>1. Is there a way for me to cache the list of regular expressions I am
>getting from the sql db? If I was using a aspx page I would use the
>datacache property on ado.net however how can I accomplish the same
>thing the web service?
>2. Can I "Pre-Compile" or otherwise improve the performance of the
>regular expression testing itself?
>thanks,
>Jeff
Store a DataTable in the Application Cache. You will only need to create it
when the Application starts.
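Something like this sketch, say in the Global.asax code-behind; LoadRegexTable
is a hypothetical helper that fills a DataTable from the database:

    using System;
    using System.Data;
    using System.Web;

    public class Global : HttpApplication
    {
        protected void Application_Start(object sender, EventArgs e)
        {
            // Built once at startup, read many times afterwards.
            Application["RegexTable"] = LoadRegexTable();
        }

        private DataTable LoadRegexTable()
        {
            return new DataTable("RegexList"); // placeholder for the DB load
        }
    }

Inside a webmethod, where there is no Page, you can read it back via
(DataTable)HttpContext.Current.Application["RegexTable"].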
HTH,
Kevin Spencer
Microsoft MVP
Printing Components, Email Components,
FTP Client Classes, Enhanced Data Controls, much more.
DSI PrintManager, Miradyne Component Libraries:
http://www.miradyne.net
<jmacduff@.gmail.com> wrote in message
news:1184799694.193160.125230@.j4g2000prf.googlegroups.com...
>I have a performance issue related to regular expressions and
> caching , hopefully someone can point me in the right direction?
> I have a asp.net web service that is called several million times a
> day. It does not have data caching enabled since the input variables
> for the webmethods change every time its called. It's my understanding
> that if the input paramters change frequently that the hash table
> created by a caching option wont actually do any good.
> Everytime the web service is called, among other things, the webmethod
> gets a list of RegEx from a sql database and then loops through each
> expression trying to find a match with a webmethod input paramter.
> I have seen that as I add more regular expressions to test against, my
> processor usage on the machine goes up accordinly. Right now I have
> about 100 RegX in my db table and my proc usage is pretty consistantly
> at 100%. In addition I see "random" out of memory exceptions from the
> RegEx engine itself, the web server has 4 gigs of ram running on
> server 2003.
> I have two questions:
> 1. Is there a way for me to cache the list of regular expressions I am
> getting from the sql db? If I was using a aspx page I would use the
> datacache property on ado.net however how can I accomplish the same
> thing the web service?
> 2. Can I "Pre-Compile" or otherwise improve the performance of the
> regular expression testing itself?
> thanks,
> Jeff
>

Performance problem: DataSet 2 database

Hi group,
I've got a major performance problem.

I've got a dataset with 1 datatable.
This datatable has 3 columns with the following datatypes:
string (18);
datetime;
decimal

There are approx 150,000 records in the datatable.
I need to save the values to the database.
Saving the records using a stored procedure is way too slow (12 minutes!)

Does anybody have a faster way?
I've tried:
http://www.codeproject.com/cs/database/generic_OpenXml.asp?df=100&forumid=171911&select=1344569&msg=1344569
But this didn't work due to the datetime column.

Please help me on this one!!!

TIA

VisualSander
Don't load all 150,000 records at the same time. Is there a reason you need all 150,000 records in there? Can't you do it in batches of, say, 1000?
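Batching aside, one commonly suggested swap for row-by-row stored-procedure
calls is SqlBulkCopy (available from .NET 2.0). A sketch, with a made-up
connection string and table name:

    using System.Data;
    using System.Data.SqlClient;

    public static class BulkSaver
    {
        public static void Save(DataTable table)
        {
            using (SqlConnection conn = new SqlConnection(
                "Data Source=.;Initial Catalog=MyDb;Integrated Security=SSPI"))
            {
                conn.Open();
                using (SqlBulkCopy bulk = new SqlBulkCopy(conn))
                {
                    bulk.DestinationTableName = "dbo.MyTable";
                    bulk.BatchSize = 1000;     // commit the rows in batches
                    bulk.WriteToServer(table); // streams every row in the table
                }
            }
        }
    }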

Performance problems using ASP.NET 2.0

I have just built my first ASP.NET v2.0 app and have it running on a 3 server web farm that runs several other ASP.NET v1.1 apps. The v1.1 apps run great, but the new v2.0 app always seems to run very slowly when first connecting to the site, even if other users have already visited the site. I believe I have pre-compiled the site, and even if I visit each server in turn to ensure the app has been compiled before hitting the load-balanced url, the site just runs very slowly. On our development machines the site runs very quickly, so the poor performance is a bit of a mystery.

The servers in the farm are running Windows 2003 Server and the machine.config on each server has been updated to synchronize the machine keys.

Check the event viewer to see if the app pool is restarting itself. Some people had problems with the application shutting down, and if that's your case, then when the app (aspnet) restarts itself the code has to recompile. Any other details that you might find will help in pointing you towards the right path.

Hi,

Are you working with directories? An ASP.NET 2.0 application restarts as soon as the directory structure changes. See the post below:

http://vikramlakhotia.com/Post.aspx?postID=6

Hope this helps

Vikram

Vikram's Blog


Did not know that. If that's the case, that's a huge bug in asp.net 2.0. It should be reported. Maybe there's a setting in the Machine.Config for the application not to restart when the directory structure changes.
There was nothing in the event viewer for all three servers (relating to asp.net anyway). The slow responses almost feel like every page request is causing the application to be re-built...

I don't know what else to tell you. Maybe you have indexing on and it's getting ahold of the files and thus restarting. Just throwing one out there.

Try creating another app and see how it behaves. If it behaves normally, then the problem is related to your application.

Performance Q..

My development environments are ASP.Net and VB.Net with .Net Framework 1.1.

I remember reading that "using a smaller number of session variables is good
for application performance".

Now, instead of using 20 session variables, I can use one session variable
which carries the data for all the 20 variables that I want to keep track of.

Is it a good idea? Any documentation on how exactly the session variables
are processed and stored in IIS5?

Thanks,

Lalit Singh
Performance generally falls into 2 categories: Processor and Memory.
Anything which saves Processor usage or Memory usage improves performance.
If you can remember that, you can stop relying on opinions you read to
determine what you should do.

Session is memory. Anything which reduces Session size will improve
performance. Combining Session variables into a single object isn't likely
to improve performance, as you aren't changing the size of anything, just
how it is stored. If I put 5 apples in a box, I still have 5 apples. I also
have the box now.

Programming instructions are Processor. If, by putting all of these objects
into a single object enables you to write leaner code, you will improve
performance. For example, using the apples again, taking the apples out of
Session one at a time, I need to write 5 sets of instructions, or a loop
that executes 5 times. If I take the box out, I use only 1 instruction. Of
course, at some point you're going to need to take the apples out of the
box. So, that kind of cancels out the savings there.

But let me get back to my original point: don't rely on opinions. If you're
really concerned about performance, keep your eye on Memory usage and
Processor usage. Don't concentrate on Session. Concentrate on your whole
app. Look for opportunities to write leaner code and to make efficient use
of memory.
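For what it's worth, the "box" version might look like this sketch; UserState
and its fields are hypothetical:

    using System;

    [Serializable]
    public class UserState
    {
        public int CustomerId;
        public string SearchFilter;
        public DateTime LastVisit;
        // ... the rest of the 20 values
    }

    // In a page:
    //   Session["State"] = new UserState();        // store once
    //   UserState s = (UserState)Session["State"]; // one cast, typed access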

--
HTH,
Kevin Spencer
.Net Developer
Microsoft MVP
Neither a follower
nor a lender be.

"Lalit Singh" <LalitSingh_34@.hotmail.com> wrote in message
news:#HlYRKg3EHA.3616@.TK2MSFTNGP09.phx.gbl...
> My development environments are ASP.Net and VB.Net and .NetFramework1.1.
> I remember reading "using less number of session variables is good for
> application performance".
> Now instead of using 20 session variables, I can use one session variable
> which carries the data for all the 20 variables that I want to keep track.
> Is it a good idea? Any documentation on how exactly the sessions variables
> are processed and stored in IIS5?
> Thanks,
> Lalit Singh
Thanks,
I was waiting to post during this hour of the day so that you would respond.
Thanks,
Lalit

"Kevin Spencer" <kspencer@.takempis.com> wrote in message
news:#X6bD6g3EHA.3708@.TK2MSFTNGP14.phx.gbl...
> Performance generally falls into 2 categories: Processor and Memory.
> Anything which saves Processor usage or Memory usage improves performance.
> If you can remember that, you can stop relying on opinions you read to
> determine what you should do.
> Session is memory. Anything which reduces Session size will improve
> performance. Combining Session variables into a single object isn't likely
> to improve performance, as you aren't changing the size of anything, just
> how it is stored. If I put 5 apples in a box, I still have 5 apples. I
also
> have the box now.
> Programming instructions are Processor. If, by putting all of these
objects
> into a single object enables you to write leaner code, you will improve
> performance. For example, using the apples again, taking the apples out of
> Session one at a time, I need to write 5 sets of instructions, or a loop
> that executes 5 times. If I take the box out, I use only 1 instruction. Of
> course, at some point you're going to need to take the apples out of the
> box. So, that kind of cancels out the savings there.
> But let me get back to my original point: Don't rely on opinions. If
you're
> really concerned about performance, keep your eye on Memory usage and
> Processor usage. Don't concentrate on Session. Concentrate on your whole
> app. Look for opportunites to write leaner code, and maximize memory
usage.
> --
> HTH,
> Kevin Spencer
> .Net Developer
> Microsoft MVP
> Neither a follower
> nor a lender be.
> "Lalit Singh" <LalitSingh_34@.hotmail.com> wrote in message
> news:#HlYRKg3EHA.3616@.TK2MSFTNGP09.phx.gbl...
> > My development environments are ASP.Net and VB.Net and .NetFramework1.1.
> > I remember reading "using less number of session variables is good for
> > application performance".
> > Now instead of using 20 session variables, I can use one session
variable
> > which carries the data for all the 20 variables that I want to keep
track.
> > Is it a good idea? Any documentation on how exactly the sessions
variables
> > are processed and stored in IIS5?
> > Thanks,
> > Lalit Singh

Performance problem when deploying

Hi,

We have a quite simple asp.net application that works fine on my XP
development box. When we deploy it to a Windows 2003 Server, performance is
really poor when rendering pages. When I turn tracing on, it shows that it
takes more than 15 seconds between "Begin Render" and "End Render". All
application logic is in OnLoad, which takes less than 0.1 seconds to complete
on both machines. The weird thing is that if we request this page locally
(from the server box) it takes 0.1 seconds. Authentication is Anonymous and
we don't find any errors anywhere. The status line text in IE is flickering
while the request is being processed.

So, the question is: how come the time varies depending on where the
request comes from? It must be security related somehow, but I can't figure
it out.

Thanks !

Mans
You probably have a large payload whose impact only shows up with a network
request. Check how big your ViewState is. If you use Netscape, check Page
Info; if IE, save the page to disk and check the file size.

-- bruce (sqlwork.com)

"Mans" <mans@.nomail.com> wrote in message
news:uQLgeta5DHA.2496@.TK2MSFTNGP09.phx.gbl...
> Hi,
> We have a quite simple asp.net application that works fne on my XP
> development box. When we deploy it to a Windows 2003 Server performance is
> really poor when rendering pages. When I turn tracing on it shows that it
> takes more than 15 seconds between "Begin Render" and "End Render". All
> application logic is in OnLoad which takes less than 0,1 seconds to
complete
> on both machines. The weird thing is that if we request this page locally
> (from the server box) it takes 0,1 seconds. Authentication is Anonymous
and
> we don't find any errors anywhere. The statusline text in IE is flickering
> while the request is being processed.
> So, the question is: How come the times varies depending on where the
> request comes from? It must be security related somehow but I can't figure
> it out.
> Thanks !
> Mans
There are also some useful tools that let you see what is going on between
your browser and the server. The best is http://www.httpwatch.com/ . There's
also http://www.blunck.info/iehttpheaders.html

If it is the size of the output that is the problem, then look at optimising
it: removing viewstate where you aren't using it, etc. You can also reduce
the size with Http Compression - http://www.intesoft.net/aspaccelerator/
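For example, a sketch of trimming ViewState in the code-behind; grid is a
hypothetical DataGrid declared in the .aspx markup:

    using System;
    using System.Web.UI;
    using System.Web.UI.WebControls;

    public class ReportPage : Page
    {
        protected DataGrid grid; // bound to the control in the markup

        protected override void OnLoad(EventArgs e)
        {
            base.OnLoad(e);
            // A databound grid often dominates ViewState size; if it is
            // rebound on every request anyway, its ViewState is pure overhead.
            grid.EnableViewState = false;
        }
    }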

- Simon Green
InteSoft IT Ltd

"bruce barker" <nospam_brubar@.safeco.com> wrote in message
news:eQZuARd5DHA.632@.TK2MSFTNGP12.phx.gbl...
> you probably have a large payload whose impact only shows up with a
network
> request. check how big your viewstate is. if you use netscape, check page
> info, if ie, save the page to disk, and check the file size.
> -- bruce (sqlwork.com)
>
> "Mans" <mans@.nomail.com> wrote in message
> news:uQLgeta5DHA.2496@.TK2MSFTNGP09.phx.gbl...
> > Hi,
> > We have a quite simple asp.net application that works fne on my XP
> > development box. When we deploy it to a Windows 2003 Server performance
is
> > really poor when rendering pages. When I turn tracing on it shows that
it
> > takes more than 15 seconds between "Begin Render" and "End Render". All
> > application logic is in OnLoad which takes less than 0,1 seconds to
> complete
> > on both machines. The weird thing is that if we request this page
locally
> > (from the server box) it takes 0,1 seconds. Authentication is Anonymous
> and
> > we don't find any errors anywhere. The statusline text in IE is
flickering
> > while the request is being processed.
> > So, the question is: How come the times varies depending on where the
> > request comes from? It must be security related somehow but I can't
figure
> > it out.
> > Thanks !
> > Mans
Hi...

Have you used any client-side implementation in the code-behind pages?

As you have written that there is flickering in the status bar, there is some
kind of to-and-fro between server and browser before the HTML is rendered.

Can you show some portion of the first aspx page?

"Mans" <mans@.nomail.com> wrote in message
news:uQLgeta5DHA.2496@.TK2MSFTNGP09.phx.gbl...
> Hi,
> We have a quite simple asp.net application that works fne on my XP
> development box. When we deploy it to a Windows 2003 Server performance is
> really poor when rendering pages. When I turn tracing on it shows that it
> takes more than 15 seconds between "Begin Render" and "End Render". All
> application logic is in OnLoad which takes less than 0,1 seconds to
complete
> on both machines. The weird thing is that if we request this page locally
> (from the server box) it takes 0,1 seconds. Authentication is Anonymous
and
> we don't find any errors anywhere. The statusline text in IE is flickering
> while the request is being processed.
> So, the question is: How come the times varies depending on where the
> request comes from? It must be security related somehow but I can't figure
> it out.
> Thanks !
> Mans

Performance Q...

My development environment is ASP.Net with VB and .Net Framework 1.1. All my
browsers are IE5+.
I am injecting client-side validation code: for example, if the number of
characters is n, set the focus to the next text box; convert all lower-case
letters to upper case; etc. I have about 20 textboxes with client-side
injected code. Will it be a performance issue?
Smith
Probably not...unless your users are running 386s...
Karl
MY ASP.Net tutorials
http://www.openmymind.net/
"Smith John" <JohnSmith56@.hotmail.com> wrote in message
news:e0jdG5X2EHA.4028@.TK2MSFTNGP15.phx.gbl...
> My development environment is ASP.Net with VB and .Net Framework 1.1. All
my
> browsers are IE5+.
> I am injecting client validations codes like if the numbers of characters
> are n set the focus to next text box, covert all the lower case to upper
> case letter etc. I got like 20 textboxes which has client side injected
> code. Will it be a performance issue?
> Smith
>
>
>
Every instruction you write is a performance issue.
HTH,
Kevin Spencer
.Net Developer
Microsoft MVP
Neither a follower
nor a lender be.
"Smith John" <JohnSmith56@.hotmail.com> wrote in message
news:e0jdG5X2EHA.4028@.TK2MSFTNGP15.phx.gbl...
> My development environment is ASP.Net with VB and .Net Framework 1.1. All
my
> browsers are IE5+.
> I am injecting client validations codes like if the numbers of characters
> are n set the focus to next text box, covert all the lower case to upper
> case letter etc. I got like 20 textboxes which has client side injected
> code. Will it be a performance issue?
> Smith
>
>
>
Smith John wrote:

> My development environment is ASP.Net with VB and .Net Framework 1.1. All
> my browsers are IE5+.
> I am injecting client validations codes like if the numbers of characters
> are n set the focus to next text box, covert all the lower case to upper
> case letter etc. I got like 20 textboxes which has client side injected
> code. Will it be a performance issue?
> Smith
I don't think so. We have very extensive client-side validation and get
adequate performance (less than 1 second response to client-side actions).
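If the injected script is identical for each textbox, one way to keep the
page small is to register it once. A sketch using the .NET 1.1 registration
API; the script body is a made-up example of the kind of helper injected:

    using System;
    using System.Web.UI;

    public class EntryPage : Page
    {
        protected override void OnLoad(EventArgs e)
        {
            base.OnLoad(e);
            string script =
                "<script type=\"text/javascript\">" +
                "function toUpper(box) { box.value = box.value.toUpperCase(); }" +
                "</script>";
            // Registering under one key emits the helper once, not 20 times.
            if (!IsClientScriptBlockRegistered("fieldHelpers"))
                RegisterClientScriptBlock("fieldHelpers", script);
        }
    }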

Performance Q..

In one of the search windows (zip code search) I am using an ASP.NET TextBox
to get the search text. Is it required to use the TextBox here? I could use
an HTML input text box and pass its value to the server-side search when the
user clicks the search button.
Is it a good design approach to avoid server-side text boxes in favour of
HTML input boxes?
Nelson
It's all a matter of functionality. If all you need is an HTML input form
object, and you don't need it to retain its value across PostBacks,
certainly, a Server Control is not necessary.
HTH,
Kevin Spencer
.Net Developer
Microsoft MVP
Neither a follower
nor a lender be.
"Nelson" <NelsonSmith1997@.hotmail.com> wrote in message
news:ezSB$aOyEHA.260@.TK2MSFTNGP11.phx.gbl...
> In one of the search window (zip code search) I am using ASP Text box to
get
> the search text. Is it required to use the text box here, I can use html
> input text box and can pass this value for search at the server side when
> user clicks the search button.
> IS it a good design approach to avoid server side text boxes when compared
> to HTML input boxes?
> Nelson
>
Hi Nelson,
I guess you should use the TextBox control, because the normal requirement
for a Search window is to persist the search query after the page posts back
and gets the results. If you use an html INPUT, you would have to use a
hidden variable to achieve this; with the asp.net TextBox control you have
ViewState, which does this automatically if you keep it enabled. If you do
not care about preserving the search criteria, go with an html INPUT control
with runat=server.
"Nelson" <NelsonSmith1997@.hotmail.com> wrote in message
news:ezSB$aOyEHA.260@.TK2MSFTNGP11.phx.gbl...
> In one of the search window (zip code search) I am using ASP Text box to
get
> the search text. Is it required to use the text box here, I can use html
> input text box and can pass this value for search at the server side when
> user clicks the search button.
> IS it a good design approach to avoid server side text boxes when compared
> to HTML input boxes?
> Nelson
>
After the search results (using ZIP Code) I am displaying the city, state,
etc. corresponding to the zip code using a text box control. Is there any way
I can avoid the text box here?
Thanks for your reply.
Nelson
"Kevin Spencer" <kspencer@.takempis.com> wrote in message
news:e1CrReOyEHA.2752@.TK2MSFTNGP11.phx.gbl...
> It's all a matter of functionality. If all you need is an HTML input form
> object, and you don't need it to retain its value across PostBacks,
> certainly, a Server Control is not necessary.
> --
> HTH,
> Kevin Spencer
> .Net Developer
> Microsoft MVP
> Neither a follower
> nor a lender be.
> "Nelson" <NelsonSmith1997@.hotmail.com> wrote in message
> news:ezSB$aOyEHA.260@.TK2MSFTNGP11.phx.gbl...
1. By adding runat=server, can I access the search text from the server-side
code. Is that correct?
2. In terms of performance, is there any difference between an HTML INPUT
control with the runat=server attribute and an ASP.NET server-side TextBox?
Thanks for your suggestions.
Nelson.
"Kumar Reddi" <KumarReddi@.REMOVETHIS.gmail.com> wrote in message
news:OGm3EgOyEHA.3024@.TK2MSFTNGP14.phx.gbl...
> Hi Nelson,
> I guess you should use the TextBox control. Because, as I know, the
normal
> requirement for a Search window is to persist the search query after the
> page post backs and gets the results. So, if you use html INPUT, you would
> have to use another hidden variable to achieve this, with asp.net TextBox
> control you would have the ViewState, which does this automatically if you
> keep it enabled. If you do not bother about preserviing the search
criteria,
> go with html INPUT control with runat server
> "Nelson" <NelsonSmith1997@.hotmail.com> wrote in message
> news:ezSB$aOyEHA.260@.TK2MSFTNGP11.phx.gbl...
I never did performance tests. But if you set "EnableViewState" to false, an
asp.net TextBox is very much the same as an html INPUT with runat=server. And
yes, by making it runat=server, you can access the text on the server side.
Since this is the only textbox control on your page, you shouldn't be too
worried about performance, I guess.
"Nelson" <NelsonSmith1997@.hotmail.com> wrote in message
news:O2UURsOyEHA.2196@.TK2MSFTNGP14.phx.gbl...
> 1. By making Run at server can I access the search text from the server
side
> code. Is it correct?
> 2. In terms of performance does it matter between HTML INPUT control with
> runat server attribute and ASP Server side text box.
> Thanks for your suggestions.
> Nelson.
>
>
> "Kumar Reddi" <KumarReddi@.REMOVETHIS.gmail.com> wrote in message
> news:OGm3EgOyEHA.3024@.TK2MSFTNGP14.phx.gbl...
The only performance hit I see is when the asp.net engine converts the
server control into the HTML output for a normal html textbox. Otherwise,
use the textbox control. Nobody will know the difference. The page has to
be compiled either way, so it's only going to take this hit the first time.
"Nelson" wrote:

> 1. By making Run at server can I access the search text from the server side
> code. Is it correct?
> 2. In terms of performance does it matter between HTML INPUT control with
> runat server attribute and ASP Server side text box.
> Thanks for your suggestions.
> Nelson.
>
>
> "Kumar Reddi" <KumarReddi@.REMOVETHIS.gmail.com> wrote in message
> news:OGm3EgOyEHA.3024@.TK2MSFTNGP14.phx.gbl...
Hi Nelson,
It all depends on what you want the user to do with the data. If you simply
want to display it, a form input object is not necessary; you can simply
write it out to the page. If, on the other hand, you want the user to be
able to edit it, change it, etc., you would need a form field. And again,
if you want the values in a form field to survive PostBacks, you need to use
a Server Control. Adding "runat=server" to a Control, and declaring it in
the CodeBehind class, effectively creates a Server Control (an HTMLInputText
Control). You can then manipulate and work with the Control in the
CodeBehind class on the server, and it will retain its value across
PostBacks.
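A sketch of that wiring; txtZip is a hypothetical field matching a markup
declaration shown in the comment:

    // In the .aspx markup:
    //   <input type="text" id="txtZip" runat="server" />
    using System;
    using System.Web.UI;
    using System.Web.UI.HtmlControls;

    public class SearchPage : Page
    {
        protected HtmlInputText txtZip; // created from the markup declaration

        protected override void OnLoad(EventArgs e)
        {
            base.OnLoad(e);
            if (IsPostBack)
            {
                string zip = txtZip.Value; // read on the server after PostBack
                // ... run the zip code search here
            }
        }
    }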
HTH,
Kevin Spencer
.Net Developer
Microsoft MVP
Neither a follower
nor a lender be.
"Nelson" <NelsonSmith1997@.hotmail.com> wrote in message
news:uhgKBmOyEHA.924@.TK2MSFTNGP10.phx.gbl...
> After the search results (using ZIP Code) I am displaying city, state etc
> corresponds to the zip code using a text box control. Is there anyway I
can
> avoid text box here?
> Thanks for your reply.
> Nelson
>
> "Kevin Spencer" <kspencer@.takempis.com> wrote in message
> news:e1CrReOyEHA.2752@.TK2MSFTNGP11.phx.gbl...
