I'm trying to process an XML file I'm getting from a vendor. I managed to get some C# code to read in all 26 items in the XML. I placed this code into a Script Component in SSIS and fed its output into a Union All transformation, then attached a data viewer so I can verify what I received. I use this code to add the rows to the output buffer:
Roles roles = GetWebServiceResult(wUrl);
MessageBox.Show("We have read in " + roles.Items.Length + " items");
//Add each role entry to the output buffer.
for (int i = 0; i < roles.Items.Length; i++)
{
    MessageBox.Show("Adding item " + (i + 1) + " to the output");
    Transfer role = getRole(roles.Items[i]);
    Output0Buffer.AddRow();
    Output0Buffer.roleKey = role.roleKey;
    Output0Buffer.text = role.text;
    Output0Buffer.Item = role.Item;
    Output0Buffer.Users = role.Users;
}
When I run this, I get a popup saying that there are 26 items to process, but I only get one more popup after that, telling me that item #1 has been added. The job then stops with no errors, and I only have one row of output in the data viewer. I don't understand why this is happening when I know that there are 25 more items to add.
Additional: On a whim, I took out the Output0Buffer code and it went through all 26 items.
I figured it out. I ran it using Ctrl-F5 and studied the output in the console. It turns out a column wasn't big enough. I made that column larger and everything works. I would have thought that error would have stopped the processing.
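For anyone else hitting the same silent stop, a minimal defensive sketch (assuming the text output column is DT_WSTR(50); the width is just a placeholder for my setup) is to truncate the value before assigning it to the buffer:

// Hypothetical guard, assuming the 'text' output column is defined as DT_WSTR(50).
const int textColumnWidth = 50;

string textValue = role.text ?? string.Empty;
if (textValue.Length > textColumnWidth)
{
    // Either truncate here or widen the column in the Script Component's output metadata.
    textValue = textValue.Substring(0, textColumnWidth);
}
Output0Buffer.text = textValue;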
I am trying to create a MMF-backed collection that dynamically expands up to 2GB. I wanted to be able to read existing items at the same time new items are being added. It works great on my development machine but I am getting an error on some machines:
The thread tried to read from or write to a virtual address for which it does not have the appropriate access.
Has anyone seen this error before? I am likely doing something wrong with the way I am handling MemoryMappedFiles. I'm creating a 2 GB MMF with the DelayAllocatePages flag so it doesn't commit all of that memory right away:
public const long ONE_GIGABYTE = 1073741824;
long maxCapacity = 2 * ONE_GIGABYTE;

mmStorage = MemoryMappedFile.CreateNew(mmfName,
                                       maxCapacity,
                                       MemoryMappedFileAccess.ReadWrite,
                                       MemoryMappedFileOptions.DelayAllocatePages,
                                       null,
                                       HandleInheritability.Inheritable);
Then I write data to it a chunk at a time, using an internal collection to keep track of where each chunk is located.
lock (writeLock)
{
    // get the most recently added item
    var lastItem = _itemLocations[_itemLocations.Count - 1];

    // calculate the next item's offset
    long newOffset = lastItem.Offset + lastItem.Length;

    // write the next item's data
    using (var mmStream = mmStorage.CreateViewStream(newOffset, itemBytes.Length, MemoryMappedFileAccess.ReadWrite))
    {
        Trace.WriteLine(string.Format("Writing {0} bytes at location {1}", itemBytes.Length, newOffset));
        mmStream.Write(itemBytes, 0, itemBytes.Length);
    }

    // add location info to the list
    _itemLocations.Add(new ItemLocation()
    {
        Offset = newOffset,
        Length = itemBytes.Length
    });
}
On the remote machine the first write goes OK, but the second write causes the exception I mentioned, which kills the program completely.
Writing 5973 bytes at location 0
Writing 5901 bytes at location 5973
The thread tried to read from or write to a virtual address for which it does not have the appropriate access.
Update
I have tried changing
MemoryMappedFileOptions.DelayAllocatePages
to
MemoryMappedFileOptions.None
and it stops throwing the exception, but it also commits the full 2 GB right away. I'd prefer to grow the MMF as needed, but I guess that won't work on all machines. I'm not sure why DelayAllocatePages works on some machines and not others.
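One workaround I'm considering (just a sketch, not what the code above does; the segment size and names are assumptions) is to allocate smaller fixed-size MMF segments on demand instead of one large delayed-allocation MMF:

// Hypothetical alternative: create modest fixed-size segments lazily rather than one
// 2 GB MMF with DelayAllocatePages. Segment size and naming are assumptions.
// Requires System.Collections.Generic and System.IO.MemoryMappedFiles.
private const long SegmentSize = 64L * 1024 * 1024; // 64 MB per segment
private readonly List<MemoryMappedFile> _segments = new List<MemoryMappedFile>();

private MemoryMappedFile EnsureSegment(int index)
{
    // Create segments only as writes reach them; each one is small enough
    // to commit fully on any machine.
    while (_segments.Count <= index)
    {
        _segments.Add(MemoryMappedFile.CreateNew(
            "mmf_segment_" + _segments.Count,
            SegmentSize,
            MemoryMappedFileAccess.ReadWrite));
    }
    return _segments[index];
}

An item's location would then be a (segment index, offset within segment) pair, and an item that doesn't fit in the remaining space of the current segment would simply start at the beginning of the next one.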
I have been looking over Microsoft's documentation and the posts here on getting search results from DirectorySearcher. I am writing code and am not sure of the best-performing way to get a large number of results from AD (right now I am testing with about 4,000 results, but it should scale beyond that).
Question 1: What is the best method?
Here are my efforts so far.
Run 1 description
I did not set PageSize, which returned 2000 results (this seems to be the default on the AD server, not the 1000 I read about in posts/documentation). I do not know how to get the remainder of the results. I tried making calls to Dispose() and then FindAll() multiple times; that did not work (it gave me the same results over and over).
Question 2: How do I get all the results this way?
Run 1:
//ds.PageSize - not setting this property
log.Debug("PageSize=" + ds.PageSize);
log.Debug("SizeLimit=" + ds.SizeLimit);
results = ds.FindAll();
log.Debug("AD count: " + results.Count);
Run 1 Log
PageSize=0
SizeLimit=0
AD Count: 2000
Run 2 description
I set the PageSize higher than my result count (though I really do not want to do this because of performance fears). I got all the results as expected.
Run 2:
ds.PageSize = 5000;
log.Debug("PageSize=" + ds.PageSize);
log.Debug("SizeLimit=" + ds.SizeLimit);
results = ds.FindAll();
log.Debug("AD count: " + results.Count);
Run 2 Log
PageSize=5000
SizeLimit=0
AD Count: 4066
Run 3 description
I set the PageSize lower than my result count so as not to impact performance, thinking this would then allow 'pagination' of the results by calling Dispose() and FindAll(). I got totally unexpected results!
Run 3:
ds.PageSize = 2000;
log.Debug("PageSize=" + ds.PageSize);
log.Debug("SizeLimit=" + ds.SizeLimit);
results = ds.FindAll();
log.Debug("AD count: " + results.Count);
Run 3 Log:
PageSize=2000
SizeLimit=0
AD Count: 4066
Question 3: This makes no sense to me. Please point me in the right direction. I thought subsequent calls to Dispose() and FindAll() would be needed here, but I got all the results on the first call.
Thanks a million!
The value may have been changed in your environment; it is 1000 by default. You can set PageSize to 1000 and the DirectorySearcher class will handle paging for you. If you set it smaller, that's fine too. You should also wrap the code in a using block to make sure the resources get disposed.
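A minimal sketch of that, reusing your logger; searchRoot and ldapFilter are placeholders for whatever search root and filter you already use:

// Requires System.DirectoryServices. searchRoot and ldapFilter are placeholders.
using (var ds = new DirectorySearcher(searchRoot, ldapFilter))
{
    ds.PageSize = 1000;   // let DirectorySearcher page through the server results
    ds.SizeLimit = 0;     // 0 = no client-side cap on the total

    log.Debug("PageSize=" + ds.PageSize);
    log.Debug("SizeLimit=" + ds.SizeLimit);

    using (SearchResultCollection results = ds.FindAll())
    {
        log.Debug("AD count: " + results.Count);
        foreach (SearchResult result in results)
        {
            // process each result here
        }
    }
}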
I'm having an EWS MoveItems issue that I hope someone can help me with. With 1300 emails in the Sent Items folder, I call the MoveItems method to move them ALL to a backup folder, and only a subset of the items gets moved! Looking for a pattern, I recorded the following test numbers:
Test #1: Init Count: 1300; Actual # Moved: 722
Test #2: Init Count: 1300; Actual # Moved: 661
Test #3: Init Count: 1300; Actual # Moved: 738
With each test case, my logging output shows that 1300 items were found and passed to the MoveItems method; however, checking the Sent Items folder shows that not all 1300 were moved (as indicated in the tests above).
Here's a snip of my code:
...
do
{
    ItemView view = new ItemView(pageSize, offset);
    findResults = service.FindItems(folder, emailFilter, view);
    Logger.Write("Email count on this page to be archived: " + findResults.Items.Count);
    foreach (Item email in findResults)
    {
        itemIds.Add(email.Id);
    }
    offset += pageSize;
}
while (findResults.MoreAvailable);

Logger.Write("Total email Ids to be archived: " + itemIds.Count());

if (itemIds.Count() > 0)
{
    Logger.Write("Archiving emails...");
    service.MoveItems(itemIds, folder5.Folders[0].Id);
    Logger.Write("Archive call complete.");
}
else
{
    Logger.Write("No emails found to archive.");
}
...
All of this is wrapped in a try/catch block. No errors are caught.
The only other interesting item worth noting is that the time between the "Archiving emails..." log entry and the "Archive call complete." entry is always within a second or two of one minute, possibly indicating a time-out on the call. Here's a snip of my log:
8/15/2014 4:29:43 PM - Information - Archiving emails...
8/15/2014 4:29:44 PM - Information - Creating search filters...
8/15/2014 4:29:48 PM - Information - Email count on this page to be archived: 1000
8/15/2014 4:29:49 PM - Information - Email count on this page to be archived: 300
8/15/2014 4:29:49 PM - Information - Total email Ids to be archived: 1300
8/15/2014 4:29:49 PM - Information - Archiving emails...
8/15/2014 4:30:51 PM - Information - Archive call complete.
8/15/2014 4:30:51 PM - Information - Email archival completed without errors
I'm pretty much at the end of my rope, so I appreciate any help you may be able to provide.
I had this same issue while working with EWS. I'm not sure what the "correct" solution is, but my workaround seemed to work. I profiled the move and it seemed to do fine moving a few hundred items at a time. Try moving ~250 items in each call to MoveItems.
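A rough sketch of that workaround, reusing the itemIds list and destination folder from your code (assuming itemIds is a List<ItemId>; the 250 batch size is just what worked for me, so adjust as needed):

// Hypothetical batching sketch; requires System.Linq for Skip/Take.
const int batchSize = 250;
for (int i = 0; i < itemIds.Count; i += batchSize)
{
    List<ItemId> batch = itemIds.Skip(i).Take(batchSize).ToList();
    Logger.Write("Moving " + batch.Count + " items starting at index " + i);
    service.MoveItems(batch, folder5.Folders[0].Id);
}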
You should try processing the ServiceResponses that come back when you run the MoveItems method, e.g.:
if (itemIds.Count() > 0)
{
    ServiceResponseCollection<MoveCopyItemResponse> Responses = service.MoveItems(itemIds, folder5.Id);
    Int32 Success = 0;
    Int32 Error = 0;
    foreach (MoveCopyItemResponse respItem in Responses)
    {
        switch (respItem.Result)
        {
            case ServiceResult.Success:
                Success++;
                break;
            case ServiceResult.Error:
                Error++;
                Console.WriteLine("Error with Item " + respItem.ErrorMessage);
                break;
        }
    }
    Console.WriteLine("Results Processed " + itemIds.Count() + " Success " + Success + " Failed " + Error);
}
That will tell you what's going on and why some of your moves have failed. I would suspect it's throttling, so, as Chris suggested, drop your batch size down. In the past, when I've written code to do large moves between a Mailbox and an Archive, I went with a 100-item batch size and never had a problem. When I set the batch size too large, I saw time-outs and throttling errors.
Cheers
Glen
I'm trying to complete a web process; however, I am receiving a 'request timed out' error. I'm not sure what I can do to get around this.
I modified the method to create a new connection for every number being passed in the for loop, but it seems to be yielding the same result.
I am more of a desktop developer, not overly versed in ASP.Net, so any light that could be shed on my issue would be great.
I've looked up info on ASP.NET background workers, which doesn't seem to be a great way to go, and I've increased the server settings to allow a higher timeout, but it still times out if a huge number of parts is provided.
I'm also trying to avoid a separate process that is scheduled on the server to execute a submitted list of numbers. If more info is needed to make sense of this, just let me know.
Also, when I attempt to run the application locally (debug) there are never issues, only when it's placed on the live site.
Here is the exact error received:
Server Error in '/' Application.
Request timed out.
Description: An unhandled exception occurred during the execution of the current web request. Please review the stack trace for more information about the error and where it originated in the code.
Exception Details: System.Web.HttpException: Request timed out.
Source Error:
An unhandled exception was generated during the execution of the current web request. Information regarding the origin and location of the exception can be identified using the exception stack trace below.
Stack Trace:
[HttpException (0x80004005): Request timed out.]
And here is the code:
protected void btnSearch_Click(object sender, EventArgs e)
{
    // Clear our reporting panel.
    panelHolder.Controls.Clear();

    // Store each line in a new array item.
    string[] searchlines = txtboxSearch.Text.Replace("\n", "|").Split('|');

    // Create a table row containing our main table headers.
    panelHolder.Controls.Add(new LiteralControl("<table style=\"width:100%;\">" +
                                                " <tr> " +
                                                " <td></td> " +
                                                " <td>Number</td> " +
                                                " <td>Comparison</td> " +
                                                " </tr>"));

    // Variable to hold the row counts.
    int j = 0;

    // Store our current web member's name for use in our tracking data.
    string MemberName = Member.GetCurrentMember().Text;

    // This table will be used solely for storing our Excel-exported data.
    System.Data.DataTable dt = new System.Data.DataTable();

    // Locate our part comparison results for every line of data supplied by our users.
    for (int i = 0; i < searchlines.Count(); i++)
    {
        using (SqlConnection con = new SqlConnection(dbConnection))
        {
            // If this array item is not blank we will need to collect information about it.
            if (searchlines[i].Trim() != string.Empty)
            {
                // Determine if data collection (reporting) is turned on.
                Boolean isReporting = DataCollection();

                using (SqlDataReader dr = Connect.ExecuteReader("SelectNumbers", con,
                    new SqlParameter("@Number", searchlines[i].Trim()),
                    new SqlParameter("@CurrentMember", MemberName),
                    new SqlParameter("@DataCollection", isReporting)))
                {
                    if (dr.HasRows)
                    {
                        while (dr.Read())
                        {
                            // Add our table rows containing our returned data set.
                            panelHolder.Controls.Add(new LiteralControl("<tr><td>" + Convert.ToString(i + 1) + "</td>"));
                            AddTableData(dr, "Part Number");
                            AddTableData(dr, "Comparison");

                            // Go to our next line item.
                            j += 1;
                        }
                    }
                }
            }
        }
    }

    // Close our table and add it to the panel control.
    panelHolder.Controls.Add(new LiteralControl("</table>"));
}
Your issue may lie in the fact that ASP.NET assumes a maximum period of time for a given request to be processed. By default, the value is 110 seconds (90 seconds in ASP.NET 1.x). Here are a couple of ways you can set a different amount of time:
Via Web.config - Check/add this entry on your web.config file:
<system.web>
<httpRuntime executionTimeout="N" />
</system.web>
Programmatically - You may add this to your server-side code:
Server.ScriptTimeout = N;
Where N is, in both options, the desired number of seconds for the request timeout. Note that this timeout is only enforced when the application is compiled with debug="false", which is why you don't see it when debugging locally.
Additionally, your values may be superseded/ignored if there's an entry present at the server's applicationhost.config or machine.config files, as described here:
http://www.iis.net/configreference/system.applicationhost/sites/sitedefaults/limits
If that's the case, you may have to alter the corresponding entries there or, failing that, rework your code-behind so the request completes within the limit.
In LightSwitch, by default, when you want to delete an item from an on-screen List or DataGrid, you can click the delete button provided out of the box, or you can programmatically delete the item from the VisualCollection by calling, in the screen code:
this.VisualCollection<Entity>.SelectedItem.Delete()
or
this.VisualCollection<Entity>.DeleteSelected()
However, this marks the selected row/entity for deletion and places an 'X' in the leftmost column of the DataGrid/List. The row remains visible to the user, and while this does reflect the transactional/asynchronous nature of the process, it is confusing to users who expect the row to be removed from the list. For example:
Customer: I deleted it why is it still there...
Me: Did you notice the x to the left?
Customer: Oh.... um...
Me: Yeah... you need to click save for the changes to be persisted to the database.
Customer: ....I'll pretend like that makes sense.
Me: .... that's a good lad ....
A better approach would be to remove the item from the VisualCollection when delete is called and then silently persist the change, without the annoying waiting/loading popup caused by the asynchronous save.
I have tried calling this.VisualCollection<Entity>.RemoveSelected(), but that results in a LightSwitchException: Current item cannot be removed.
I have tried saving the record after I call Delete(), but that saves all changes on the screen, displays the aforementioned popup, and is not a good user experience.
After I make any changes to a DataGrid programmatically, I call this function I wrote. It will check for any validation errors and inform the user if they exist so that they can be corrected. Otherwise, it will silently persist the changes in the background. I'm not sure what you mean by a "waiting/loading popup"; the only indication is the small blue spinner next to the screen name on the tab for a second or two.
private void ValidateAndSave()
{
    // Check for validation errors.
    if (this.Details.ValidationResults.HasErrors == false)
    {
        // Save the changes to the database.
        try
        {
            this.DataWorkspace.DatabaseNameData.SaveChanges();
        }
        catch (Exception ex)
        {
            this.ShowMessageBox(ex.ToString());
        }
    }
    else
    {
        // If validation errors exist,
        string res = "";
        // add each one to a string,
        foreach (var msg in this.Details.ValidationResults)
        {
            res = res + msg.Property.DisplayName + ": " + msg.Message + "\r\n";
        }
        // and display them in a message box.
        this.ShowMessageBox(res, "Validation error", MessageBoxOption.Ok);
    }
}
Note: I converted this from VB.NET, so it's probably not a drop-in replacement. In particular, I think the message box is done differently, so double-check that.
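For what it's worth, a hypothetical way to wire it up from the screen code (the collection name Customers and the command name DeleteCustomer are placeholders for whatever your screen uses):

// Hypothetical screen command, assuming a visual collection named Customers.
partial void DeleteCustomer_Execute()
{
    // Mark the selected row for deletion, then persist immediately so the
    // 'X' marker never lingers in the grid.
    this.Customers.SelectedItem.Delete();
    ValidateAndSave();
}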