Javascript How-To — Ask your "How do I do this with Javascript?" questions here.
Welcome to the p2p.wrox.com Forums.
You are currently viewing the Javascript How-To section of the Wrox Programmer to Programmer discussions. This is a community of software programmers and website developers including Wrox book authors and readers. New member registration was closed in 2019. New posts were shut off and the site was archived into this static format as of October 1, 2020. If you require technical support for a Wrox book please contact http://hub.wiley.com

May 20th, 2009, 06:57 PM
Authorized User
Join Date: Apr 2008
Posts: 35
Thanks: 1
Thanked 0 Times in 0 Posts
Multiple, Simultaneous, Asynchronous Requests
I have followed the guides on the net to do this, and although the requests themselves seem to work okay, the data returned isn't right. My code (below) returns the following text:

Process Start at: 20/05/2009 23:47:51
Wait until: 20/05/2009 23:47:54
Delay value: 3
Process Start at: 20/05/2009 23:47:54
Wait until: 20/05/2009 23:47:57
Delay value: 3

As you can see, the second process started only after the first one finished, whereas I was hoping for the following result:

Process Start at: 20/05/2009 23:47:51
Wait until: 20/05/2009 23:47:54
Delay value: 3
Process Start at: 20/05/2009 23:47:51
Wait until: 20/05/2009 23:47:54
Delay value: 3

Hopefully someone can point out the error of my ways here.
Client-Side Javascript
Code:
var sTargetPage = "GetData.asp";
var xmlhttp = new Array();
var aSournceKey = new Array();
aSournceKey[0] = "SOURCE1";
aSournceKey[1] = "SOURCE2";

function fetchData()
{
    for (var i = 0; i < aSournceKey.length; i++)
    {
        xmlhttp[i] = new ActiveXObject("Microsoft.XMLHTTP");
        xmlhttp[i].onreadystatechange = gotArrayData;
        xmlhttp[i].open("POST", sTargetPage, true);
        xmlhttp[i].send("<DATA><SYSTEM>" + aSournceKey[i] + "</SYSTEM></DATA>");
    }
}

function gotArrayData()
{
    for (var i = 0; i < xmlhttp.length; i++)
    {
        if (xmlhttp[i].readyState == 4)
        {
            if (xmlhttp[i].status == 200)
            {
                document.all.item("txtTime" + i).innerHTML = xmlhttp[i].responseText;
            }
            else
            {
                alert("Problem retrieving data: " + xmlhttp[i].statusText);
            }
        }
    }
}
Which points to the Server-Side GetData.asp page, as follows:
Code:
Response.CacheControl = "no-cache"

Dim xmldom
Dim dDate
Dim nDelayIt

Set xmldom = Server.CreateObject("Microsoft.XMLDOM")
If xmldom.load(Request) Then
    Select Case xmldom.selectSingleNode("//DATA/SYSTEM").text
        Case "SOURCE1"
            nDelayIt = 3
        Case "SOURCE2"
            nDelayIt = 3
    End Select

    Response.Write "Process Start at: " & Now & "<br>"
    dDate = DateAdd("s", nDelayIt, Now)
    Do While Now < dDate
        ' Nothing, but wait!
    Loop
    Response.Write "Wait until: " & dDate & "<br>"
    Response.Write "Delay value: " & nDelayIt & "<br>"
Else
    Response.Write "ERROR " & xmldom.parseError.reason & "<br>" & xmldom.xml
End If
Set xmldom = Nothing
__________________
Regards,
Sean

May 21st, 2009, 03:46 AM
Friend of Wrox
Join Date: Jun 2008
Posts: 1,649
Thanks: 3
Thanked 141 Times in 140 Posts
The culprit is your ASP page!
Code:
Do While Now < dDate
' Nothing, but wait!
Loop
That is eating up 100% OF THE CPU during that loop.
So the webserver can *NOT* do ANYTHING else. Including even *start* processing the next ASP request. Your other XMLHTTP request.
I suppose if the webserver was a multi-cpu (multi-processor) machine, then it could be running multiple threads at the same time and then it might work. But maybe not. It's possible that IIS limits ASP to a single thread, just so that ASP can't hog all the processors. Dunno.
Try doing something in the ASP page that uses, say, a database query. Now there will be times when the DB is doing fetches off the disk and it will release the processor waiting for the I/O. Now your dual hits to the same page can run interleaved. Preferably do *different* things in the two calls, so that you aren't grabbing the same resources. Might not matter, but can't hurt.
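To see this point in miniature: a busy-wait holds the one thread of execution, so queued work cannot even start until the spin ends. The sketch below is illustrative only — it uses plain JavaScript with setTimeout standing in for a second incoming request, since classic ASP/VBScript has no non-blocking delay primitive, which is exactly the problem.

```javascript
// Illustrative sketch: why a spin loop starves everything else on the thread.
// The setTimeout callback plays the role of a second queued request.
var order = [];

function busyWait(ms) {
    var end = Date.now() + ms;
    while (Date.now() < end) { /* nothing, but wait! -- burns 100% CPU */ }
}

// Queue some work, as if a second request had just arrived.
setTimeout(function () { order.push("second request runs"); }, 0);

order.push("first request: spin starts");
busyWait(50);                          // the queued callback CANNOT fire during this
order.push("first request: spin ends");
// Only after the spin does the event loop get a turn and run the callback.
```

The queued callback runs strictly after the spin finishes, never during it — the same way the second ASP request sat waiting behind the first one's Do While loop.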

May 26th, 2009, 08:23 AM
Authorized User
Join Date: Apr 2008
Posts: 35
Thanks: 1
Thanked 0 Times in 0 Posts
Thank you for the response, but I think that I may not have been clear in my explanation.
It's not so much what the back-end (ASP) process is doing, it's the fact that I am trying to make multiple hits to "???" and not have the page waiting around for all of the results before displaying something. The cheesy "do...while" was just there to simulate the fact that "???" will have an unpredictable response time.
I found, with a little poking around, that my code example actually works ... ONCE ... the HTTP objects need to be disposed of and re-instantiated for the requests to work again within the same page instance.
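For what it's worth, the "works once" symptom often comes from reusing the same request objects and sharing one handler across all of them. A common alternative pattern is to create a fresh request object per call and give it its own handler via a closure, so nothing is shared or reused. A minimal sketch — note that createRequest here is a hypothetical stub standing in for new ActiveXObject("Microsoft.XMLHTTP"), so the pattern can run anywhere:

```javascript
// Sketch: one request object per call, each with its own handler via a closure.
// createRequest is a hypothetical stub standing in for
// new ActiveXObject("Microsoft.XMLHTTP"); its send() fakes an async response.
function createRequest() {
    return {
        open: function (method, url, async) { /* stub: nothing to record */ },
        send: function (body) {
            var req = this;
            setTimeout(function () {           // simulate an async server reply
                req.readyState = 4;
                req.status = 200;
                req.responseText = "echo: " + body;
                if (req.onreadystatechange) req.onreadystatechange();
            }, 10);
        }
    };
}

var results = [];

function fetchData(keys, done) {
    var pending = keys.length;
    for (var i = 0; i < keys.length; i++) {
        (function (i) {                        // closure: each iteration keeps its own i
            var req = createRequest();         // fresh object per request -- never reused
            req.onreadystatechange = function () {
                if (req.readyState == 4 && req.status == 200) {
                    results[i] = req.responseText;  // handler touches only its own slot
                    if (--pending == 0) done();     // all requests accounted for
                }
            };
            req.open("POST", "GetData.asp", true);
            req.send("<DATA><SYSTEM>" + keys[i] + "</SYSTEM></DATA>");
        })(i);
    }
}
```

Because each handler is bound to its own request object, there is no shared readyState to trample, and the whole thing can be called again without tearing anything down first.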
However, further reading tells me that within IE, Microsoft have followed the HTTP spec to the letter and only allow two simultaneous connections in this fashion, so when I try to obtain results from FIVE sources with a hard-coded delay factor:
Source (hard coded delay)
SRC1 : 5 minutes
SRC2 : 10 minutes
SRC3 : 15 minutes
SRC4 : 20 minutes
SRC5 : 25 minutes
The results that I get are along the lines of:
Start at 10:00am precisely.
SRC1(5) starts
SRC2(10) starts
10:05am
SRC1(5) completes and displays results
SRC3(15) starts
10:10am
SRC2(10) completes and displays results
SRC4(20) starts
10:20am
SRC3(15) completes and displays results
SRC5(25) starts
10:30am
SRC4(20) completes and displays results
10:45am
SRC5(25) completes and displays results
So you see that all five results come back quicker (45 minutes) than a fully synchronous solution would (75 minutes), but what I would be aiming for (as my ideal) would have been:
10am
SRC1(5) starts
SRC2(10) starts
SRC3(15) starts
SRC4(20) starts
SRC5(25) starts
10:05am
SRC1(5) completes and displays results
10:10am
SRC2(10) completes and displays results
10:15am
SRC3(15) completes and displays results
10:20am
SRC4(20) completes and displays results
10:25am
SRC5(25) completes and displays results
Where all of my results have been processed and displayed on screen as and when available, with the slowest one appearing at the 25 minute mark.
... we see the reasoning for IE having this limit of two connections: a situation of 100 users clicking would have the effect of 500 simultaneous requests, with a negative effect on the server's performance. Blah, blah...
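The staircase schedule described above falls straight out of that limit, and can be sketched in miniature: a tiny scheduler with two "slots" stands in for IE's per-host connection pool (this is an illustration of the queueing behaviour, not IE's actual internals), with delays in milliseconds instead of minutes.

```javascript
// Sketch: how a two-connection limit turns five parallel requests into a
// staircase. Requests beyond the limit queue until a slot frees up.
function runLimited(tasks, maxConcurrent, onAllDone) {
    var queue = tasks.slice();   // pending requests, in submission order
    var active = 0;              // how many "connections" are in flight
    var log = [];                // start/complete events, in the order they occur

    function startNext() {
        while (active < maxConcurrent && queue.length > 0) {
            (function (task) {
                active++;
                log.push(task.name + " starts");
                setTimeout(function () {        // simulate the server's delay
                    log.push(task.name + " completes");
                    active--;
                    if (queue.length === 0 && active === 0) onAllDone(log);
                    else startNext();           // a slot freed up: start the next one
                }, task.delay);
            })(queue.shift());
        }
    }
    startNext();
}
```

Running five tasks with staggered delays through a limit of two reproduces the observed pattern exactly: SRC3 cannot start until SRC1 completes, SRC4 waits for SRC2, and so on.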
I have found information which suggests the answer to be using "Asynchronous PreRequest Handler Execution", but I can only find one good article on it ... and can't find a working example of the code to use, which is now my next question, but to keep things appropriate ... I'll start a new thread on that.
__________________
Regards,
Sean