I have been struggling with an out-of-memory exception for 4 months. My client has a web service and wants me to test it. The web service exposes a function called upload, and I test that function with 1500 users uploading at the same time. I have tried forcing garbage collection (GC.Collect). With a 2 MB file there is no exception, but with an 8 MB file the out-of-memory exception still occurs. I have tried many times and many solutions, but it still happens, and it is driving me crazy. While the upload is in progress I watch the memory on all the test computers, and none of them run out of memory, so I think the problem comes from the web service and the server. But my client says I have to prove that the cause lies in the web service and the server. Do you have any solutions for this? In addition, the client does not publish the code; I can only call the web service's functions to test it. Also, I have to use a VPS to connect to their web service, and the network is rather slow when connecting to the VPS.
I have to make sure that my test script doesn't have any problems. Here is the script I use to test the upload function:
public void UploadNewJob(string HalID, string fileUID, string jobUID, string fileName, out List<string> errorMessages)
{
    errorMessages = null;
    try
    {
        int versionNumber;
        int newVersionNumber;
        string newRevisionTag;
        datasyncservice.ErrorObject errorObj = new datasyncservice.ErrorObject();
        PfgDbJob job = new PfgDbJob();
        job.CompanyName = Constant.SEARCH_CN;
        job.HalliburtonSalesOffice = Constant.SEARCH_SO;
        job.HalliburtonOperationsLocation = Constant.SEARCH_OL;
        job.UploadPersonHalId = HalID;
        job.CheckOutState = Constant.CHECKOUT_STATE;
        job.RevisionTag = Constant.NEW_REVISION_TAG;
        var manifestItems = new List<ManifestItem>();
        var newManifestItems = new List<ManifestItem>();
        var manifestItem = new ManifestItem();
        if (fileUID == "")
        {
            if (job.JobUid == Guid.Empty)
                job.JobUid = Guid.NewGuid();
            if (job.FileUid == Guid.Empty)
                job.FileUid = Guid.NewGuid();
        }
        else
        {
            Guid jobUid = new Guid(jobUID);
            job.JobUid = jobUid;
            Guid fileUid = new Guid(fileUID);
            job.FileUid = fileUid;
        }
        // Change the next line when we transfer .ssp files by parts
        manifestItem.PartUid = job.FileUid;
        job.JobFileName = fileName;
        manifestItem.BinaryFileName = job.JobFileName;
        manifestItem.FileUid = job.FileUid;
        manifestItem.JobUid = job.JobUid;
        manifestItem.PartName = string.Empty;
        manifestItem.SequenceNumber = 0;
        manifestItems.Add(manifestItem);
        errorMessages = DataSyncService.Instance.ValidateForUploadPfgDbJobToDatabase(out newVersionNumber, out newRevisionTag, out errorObj, out newManifestItems, HalID, job, false);
        if (manifestItems.Count == 0)
            manifestItems = newManifestItems;
        if (errorMessages.Count > 0)
        {
            if (errorMessages.Count > 1 || errorMessages[0].IndexOf("NOT AN ERROR") == -1)
            {
                return;
            }
        }
        // Upload new job
        Guid transferUid;
        long a = GC.GetTotalMemory(false);
        byte[] fileContents = File.ReadAllBytes(fileName);
        fileContents = null;
        GC.Collect();
        long b = GC.GetTotalMemory(false);
        //Assert.Fail((b - a).ToString());
        //errorMessages = DataSyncService.Instance.UploadFileInAJob(out transferUid, out errorObj, job.UploadPersonHalId, job, manifestItem, fileContents);
        DataSyncService.Instance.UploadPfgDbJobToDatabase(out errorObj, out versionNumber, job.UploadPersonHalId, job, false, manifestItems);
    }
    catch (Exception ex)
    {
        Assert.Fail("Error from Test Scripts: " + ex.Message);
    }
}
Please review my test code. And if there is no problem in my test code, I have to prove that the cause is not in it. T_T
My guess would be that you hit the 2 GB object size limit of .NET (1500 * 8 MB > 4 GB).
You should consider changing to .NET 4.5 and using the large-object mode (see here); the setting is called gcAllowVeryLargeObjects.
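For reference, this is the app.config entry in question, as a minimal sketch of the setting (note that even with it enabled, a single byte[] is still capped at roughly 2 GB of elements, so very large uploads should be streamed or split rather than read into one array):
<configuration>
  <runtime>
    <!-- Only has an effect in a 64-bit process -->
    <gcAllowVeryLargeObjects enabled="true" />
  </runtime>
</configuration>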
I am uploading files to Dropbox using the following code.
I am using the NuGet package Dropbox.Api and getting the exception System.Threading.Tasks.TaskCanceledException("A task was canceled.")
From this SO question it appears to be a timeout issue.
So how do I modify the following code to set the timeout?
public async Task<FileMetadata> UploadFileToDropBox(string fileToUpload, string folder)
{
    using (var client = new DropboxClient(GetAccessToken()))
    using (var mem = new MemoryStream(File.ReadAllBytes(fileToUpload)))
    {
        string filename = Path.GetFileName(fileToUpload);
        try
        {
            string megapath = GetFullFolderPath(folder);
            string megapathWithFile = Path.Combine(megapath, filename).Replace("\\", "/");
            // Await the upload directly instead of mixing await with .Result
            var updated = await client.Files.UploadAsync(megapathWithFile, WriteMode.Overwrite.Instance, body: mem);
            return updated;
        }
        catch (Exception)
        {
            return null;
        }
    }
}
Try creating and initializing the client like this:
var config = new DropboxClientConfig
{
    HttpClient = new HttpClient { Timeout = new TimeSpan(hr, min, sec) } // choose values
};
var client = new DropboxClient(GetAccessToken(), config);
Reference:
http://dropbox.github.io/dropbox-sdk-dotnet/html/M_Dropbox_Api_DropboxClient__ctor_1.htm
One more thing to keep in mind: UploadAsync will not work for files larger than 150 MB, as per the documentation. You have to use an UploadSessionStartAsync-based implementation for those. I was making that mistake without realizing it, and it took me ages to fish the problem out.
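For illustration, here is a minimal sketch of such an upload-session loop, assuming the standard Dropbox.Api session methods (UploadSessionStartAsync, UploadSessionAppendV2Async, UploadSessionFinishAsync) and a file larger than one chunk; treat it as a starting point rather than a drop-in implementation:
// Sketch: chunked upload for files over 150 MB. Assumes the file spans
// more than one chunk; for smaller files plain UploadAsync is simpler.
async Task UploadLargeFile(DropboxClient client, string localPath, string remotePath)
{
    const int ChunkSize = 4 * 1024 * 1024; // 4 MB per request
    using (var stream = File.OpenRead(localPath))
    {
        string sessionId = null;
        ulong offset = 0;
        var buffer = new byte[ChunkSize];
        int read;
        while ((read = await stream.ReadAsync(buffer, 0, ChunkSize)) > 0)
        {
            bool isLast = stream.Position == stream.Length;
            using (var chunk = new MemoryStream(buffer, 0, read))
            {
                if (sessionId == null)
                {
                    // First chunk opens the session
                    var session = await client.Files.UploadSessionStartAsync(body: chunk);
                    sessionId = session.SessionId;
                }
                else if (!isLast)
                {
                    // Middle chunks are appended at the current offset
                    await client.Files.UploadSessionAppendV2Async(
                        new UploadSessionCursor(sessionId, offset), body: chunk);
                }
                else
                {
                    // Final chunk commits the file
                    await client.Files.UploadSessionFinishAsync(
                        new UploadSessionCursor(sessionId, offset),
                        new CommitInfo(remotePath, WriteMode.Overwrite.Instance),
                        body: chunk);
                }
            }
            offset += (ulong)read;
        }
    }
}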
When is it an advantage/disadvantage to be using RDotNet for making statistical calculations as opposed to generating an R script text file and running it from the application using e.g. Process.Start? Or is there some other better way?
I need to execute a large number of commands and have a feeling that sending them one by one to R takes a lot of time.
I'd say the following two scenarios are stereotypical:
.NET code and R code are quite separate, and not a lot of interaction is needed between them. For example, the .NET code gathers some information and launches a processing script on it, after which the .NET code picks up the results. In this case, spawning an R process (Process.Start) is a simple way to get this working.
A lot of interaction is needed between the .NET code and the R code, and the workflow goes back and forth between .NET and R often. In this case, a more heavyweight, flexible solution such as RDotNet makes a lot of sense. RDotNet allows tighter integration of the .NET code and the R code, the price being that it is often harder to learn, harder to debug, and often needs to be updated for new versions of R.
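As an illustration of the interactive style, a minimal R.NET sketch might look like this (assuming the RDotNet NuGet package; the setup calls can differ between versions, so treat this as a sketch rather than a recipe):
// One persistent R engine, reused across calls (requires using RDotNet and System.Linq)
REngine.SetEnvironmentVariables();  // locate R via the registry/PATH
var engine = REngine.GetInstance(); // only one instance per process
engine.Evaluate("x <- rnorm(100)");
double mean = engine.Evaluate("mean(x)").AsNumeric().First();
Console.WriteLine(mean);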
R.NET can currently only be initialized once per process, so parallel execution will be problematic.
I would suggest using Rscript instead.
Our solution is based on this answer on Stack Overflow: Call R (programming language) from .NET.
With a minor change: we take the R code from a string and save it to a temp file, since users run custom R code when needed.
public static void RunFromCmd(string batch, params string[] args)
{
    // Not required, but our R scripts use almost all CPU resources if run in multiple instances
    lock (typeof(REngineRunner))
    {
        string file = string.Empty;
        string result = string.Empty;
        try
        {
            // Save R code to temp file
            file = TempFileHelper.CreateTmpFile();
            using (var streamWriter = new StreamWriter(new FileStream(file, FileMode.Open, FileAccess.Write)))
            {
                streamWriter.Write(batch);
            }
            // Get path to R from the registry
            var rCore = Registry.LocalMachine.OpenSubKey(@"SOFTWARE\R-core") ??
                        Registry.CurrentUser.OpenSubKey(@"SOFTWARE\R-core");
            var is64Bit = Environment.Is64BitProcess;
            if (rCore != null)
            {
                var r = rCore.OpenSubKey(is64Bit ? "R64" : "R");
                var installPath = (string)r.GetValue("InstallPath");
                var binPath = Path.Combine(installPath, "bin");
                binPath = Path.Combine(binPath, is64Bit ? "x64" : "i386");
                binPath = Path.Combine(binPath, "Rscript");
                string strCmdLine = @"/c """ + binPath + @""" " + file;
                if (args.Any())
                {
                    strCmdLine += " " + string.Join(" ", args);
                }
                var info = new ProcessStartInfo("cmd", strCmdLine);
                info.RedirectStandardInput = false;
                info.RedirectStandardOutput = true;
                info.UseShellExecute = false;
                info.CreateNoWindow = true;
                using (var proc = new Process())
                {
                    proc.StartInfo = info;
                    proc.Start();
                    result = proc.StandardOutput.ReadToEnd();
                }
            }
            else
            {
                result += "R-Core not found in registry";
            }
            Console.WriteLine(result);
        }
        catch (Exception ex)
        {
            throw new Exception("R failed to compute. Output: " + result, ex);
        }
        finally
        {
            if (!string.IsNullOrWhiteSpace(file))
            {
                TempFileHelper.DeleteTmpFile(file, false);
            }
        }
    }
}
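A hypothetical call, for example (REngineRunner and TempFileHelper are this post's own helper classes, not library types):
// Run a short R snippet and let its output go to the console:
REngineRunner.RunFromCmd("x <- c(1, 2, 3); cat(mean(x))");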
Full blog post: http://kostylizm.blogspot.ru/2014/05/run-r-code-from-c-sharp.html
With Process.Start you start up a new R session each time. This can take a while, especially if your script uses packages that have to be loaded on startup.
If you use R.NET, you can create one R instance and keep talking to it. So if you have created a web service connecting R with ASP, you don't want to start up R every time, as that would be very costly; you need it just once and can then work with it interactively.
I need to create a sema4 (semaphore) file that prevents other sessions from trying to open/write to a database if another session is already attempting the same 'transaction'. By transaction, in this case, I mean making a booking similar to one that is already 'in progress'.
Here's the code:
HttpSessionState ss = HttpContext.Current.Session;
string sessionID = ss.SessionID;
DirectoryInfo di = new DirectoryInfo(dataDirectory + "Semaphores");
string facilityIDExt = requestedFacilityID.ToString().PadLeft(3, '0');
string sema4File = string.Format("{0}.{1:yyyyMMdd}.{2}", sessionID, RequestedStartDT, facilityIDExt);
sema4FilePath = Path.Combine(di.FullName, sema4File);
File.Create(sema4FilePath);
FileInfo[] fiPaths = di.GetFiles(string.Format("*.{0}", facilityIDExt));
bool bookingInProgress = true;
int waitPeriod = 60;
while (waitPeriod > 0 && bookingInProgress)
{
    fiPaths = di.GetFiles(string.Format("*.{0}", facilityIDExt));
    bookingInProgress = false;
    foreach (FileInfo item in fiPaths)
    {
        if (item.Name.Contains(string.Format("{0:yyyyMMdd}.{1}", RequestedStartDT, facilityIDExt)) && item.Name != sema4File)
        {
            if (item.LastWriteTime > DateTime.Now.AddMinutes(-1))
            {
                bookingInProgress = true;
                break;
            }
        }
    }
    System.Threading.Thread.Sleep(5000);
    waitPeriod = waitPeriod - 5;
}
The idea is that the actual booking will take much less than 60 seconds to record in the database, and in the meantime no other booking requests are permitted.
The problem I am having is that when I call the following:
if (File.Exists(sema4FilePath))
    File.Delete(sema4FilePath);
IIS Express won't delete the file because it is 'in use', and it is in use by IIS Express itself.
I assume this will happen with IIS as well.
I don't understand why IIS Express keeps the sema4 file open.
How do I get around the 'in use' issue when I want to delete the sema4 file?
When you do this:
File.Create(sema4FilePath);
you get back a FileStream, and you should close it to release the file. Preferably wrap it in a using:
using (var stream = File.Create(sema4FilePath))
{
    // Do your stuff
}
Or close it directly if you don't use the contents:
File.Create(sema4FilePath).Close();
For instances when Active Directory takes too long to replicate data between sites, I need to ensure that the local AD replica contains the most up-to-date information.
How can I get a list of domain controllers for the current site?
I haven't found anything on CodeProject or Stack Overflow.
Going to all this trouble is probably wasted effort. Unless you are experiencing issues with the built-in logic for finding a domain controller, you should just go with the built-in method that returns one. According to Microsoft, it automatically tries to find the closest one: http://technet.microsoft.com/en-us/library/cc978016.aspx.
Just use the static DomainController.FindOne method and pass in your DirectoryContext.
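A minimal sketch of that call (the domain name is a placeholder):
// Let the built-in DC locator pick a (usually nearby) domain controller.
var context = new DirectoryContext(DirectoryContextType.Domain, "corp.example.com");
using (DomainController dc = DomainController.FindOne(context))
{
    Console.WriteLine(dc.Name);
}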
Update
Alright, try the code below and let me know how it works for you. It pings each DC and records the round-trip time, skipping any that return -1 (no connection). It flags PDC status if present, and orders by PDC status followed by ping round-trip time.
static void Main(string[] args)
{
    var dcsInOrder = (from DomainController c in Domain.GetCurrentDomain().DomainControllers
                      let responseTime = Pinger(c.Name)
                      where responseTime >= 0
                      let pdcStatus = c.Roles.Contains(ActiveDirectoryRole.PdcRole)
                      orderby pdcStatus, responseTime
                      select new { DC = c, ResponseTime = responseTime }
                     ).ToList();
    foreach (var dc in dcsInOrder)
    {
        System.Console.WriteLine(dc.DC.Name + " - " + dc.ResponseTime);
    }
    System.Console.ReadLine();
}

private static int Pinger(string address)
{
    Ping p = new Ping();
    try
    {
        PingReply reply = p.Send(address, 3000);
        if (reply.Status == IPStatus.Success) return (int)reply.RoundtripTime;
    }
    catch { }
    return -1;
}
First, I'll answer the question that you actually asked:
System.DirectoryServices.ActiveDirectory.ActiveDirectorySite.GetComputerSite().Servers
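For example, to print them (Servers yields DirectoryServer objects):
// Enumerate the directory servers registered in this machine's AD site.
foreach (DirectoryServer server in ActiveDirectorySite.GetComputerSite().Servers)
{
    Console.WriteLine(server.Name);
}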
But it seems like you're asking how to make sure that you're talking to the closest domain controller possible. Windows doesn't exactly provide this functionality; the best it will do is give you a domain controller in the same site the code is running from.
I think the first thing to check is that you have your sites and subnets configured correctly. Run Active Directory Sites and Services, and make sure that subnets and domain controllers are assigned to the correct sites.
This MSDN page (and the TechNet article in Peter's answer) says that you must search by the DNS name for the DC Locator to attempt to find a DC in the current site. I don't know whether the Name property of the Domain class is the DNS domain name.
I have to assume that DomainController.FindOne is a wrapper for DsGetDcName. At that link, you can find how to turn on tracing for that function. You can use this if you still have problems, or maybe you should just P/Invoke this function.
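If you do go the P/Invoke route, the declaration looks roughly like this (a sketch: the DOMAIN_CONTROLLER_INFO struct and the flag constants are omitted, and the returned buffer must be freed with NetApiBufferFree):
// Rough P/Invoke sketch for DsGetDcName from netapi32.dll.
[DllImport("netapi32.dll", CharSet = CharSet.Auto, SetLastError = true)]
static extern int DsGetDcName(
    string computerName, // null = local computer
    string domainName,   // DNS domain name
    IntPtr domainGuid,   // IntPtr.Zero
    string siteName,     // null = current site
    uint flags,          // e.g. DS_RETURN_DNS_NAME
    out IntPtr dcInfo);  // DOMAIN_CONTROLLER_INFO*; free with NetApiBufferFree

[DllImport("netapi32.dll")]
static extern int NetApiBufferFree(IntPtr buffer);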
Here is a code sample with no hard-coded DCs. Comments and criticism are welcome.
/// <summary>
/// For best results ensure all hosts are pingable and turned on.
/// </summary>
/// <returns>An ordered list of DCs with the PDCE first</returns>
static LinkedList<DomainController> GetNearbyDCs()
{
    LinkedList<DomainController> preferredDCs = new LinkedList<DomainController>();
    List<string> testedDCs = new List<string>();
    using (var mysite = ActiveDirectorySite.GetComputerSite())
    {
        using (var currentDomain = Domain.GetCurrentDomain())
        {
            DirectoryContext dctx = new DirectoryContext(DirectoryContextType.Domain, currentDomain.Name);
            var listOfDCs = DomainController.FindAll(dctx, mysite.Name);
            foreach (DomainController item in listOfDCs)
            {
                Console.WriteLine(item.Name);
                if (IsConnected(item.IPAddress))
                {
                    // Enumerating "Roles" will cause the object to bind to the server
                    ActiveDirectoryRoleCollection rollColl = item.Roles;
                    if (rollColl.Count > 0)
                    {
                        foreach (ActiveDirectoryRole roleItem in rollColl)
                        {
                            if (!testedDCs.Contains(item.Name))
                            {
                                testedDCs.Add(item.Name);
                                if (roleItem == ActiveDirectoryRole.PdcRole)
                                {
                                    preferredDCs.AddFirst(item);
                                    break;
                                }
                                else
                                {
                                    if (preferredDCs.Count > 0)
                                    {
                                        var tmp = preferredDCs.First;
                                        preferredDCs.AddBefore(tmp, item);
                                    }
                                    else
                                    {
                                        preferredDCs.AddFirst(item);
                                    }
                                    break;
                                }
                            }
                        }
                    }
                    else
                    {
                        // The DC exists but has no roles
                        testedDCs.Add(item.Name);
                        if (preferredDCs.Count > 0)
                        {
                            var tmp = preferredDCs.First;
                            preferredDCs.AddBefore(tmp, item);
                        }
                        else
                        {
                            preferredDCs.AddFirst(item);
                        }
                    }
                }
                else
                {
                    preferredDCs.AddLast(item);
                }
            }
        }
    }
    return preferredDCs;
}

static bool IsConnected(string hostToPing)
{
    Ping p = new Ping();
    try
    {
        PingReply reply = p.Send(hostToPing, 3000);
        if (reply.Status == IPStatus.Success)
            return true;
    }
    catch { }
    return false;
}
Here's my approach using PowerShell, but I'm sure it's simple to implement in C#, etc. If DHCP is set up correctly, the primary DNS server in your subnet should be the closest domain controller, so the following code grabs the first DNS IP and resolves it to the hostname of the closest DC. This requires no RSAT or credentials and contains nothing specific to the current domain.
$NetItems = @(Get-WmiObject -Class Win32_NetworkAdapterConfiguration -Filter "IPEnabled = 'True'" -ComputerName $env:COMPUTERNAME)
foreach ($objItem in $NetItems)
{
    if ($objItem.DNSServerSearchOrder.Count -ge 1)
    {
        $PrimaryDNS = $objItem.DNSServerSearchOrder[0]
        $domain = $objItem.DNSDomain
        break
    }
}
[System.Net.Dns]::GetHostByAddress($PrimaryDNS).HostName -replace "\.$($domain)",""
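The same idea in C# might look roughly like this (a sketch assuming the first operational adapter with DNS servers configured is the right one; requires System.Linq, System.Net, and System.Net.NetworkInformation):
// Resolve the primary DNS server of the first usable NIC back to a hostname.
var nic = NetworkInterface.GetAllNetworkInterfaces()
    .First(n => n.OperationalStatus == OperationalStatus.Up
             && n.GetIPProperties().DnsAddresses.Count > 0);
var primaryDns = nic.GetIPProperties().DnsAddresses[0];
string closestDc = Dns.GetHostEntry(primaryDns).HostName;
Console.WriteLine(closestDc);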
I am using the P4.Net API to generate some reports from the metadata.
In one of the reports, I need to generate the number of changed lines for each changeset.
As a reporting tool I am using MS SQL Reporting Services 2008, and I have written a custom DLL that uses the P4.Net API to calculate the number of changed lines. It works locally without any problem. However, when I run the code on the server, it processes roughly the first 20% and then starts throwing 'Unable to connect to the Perforce Server!
Unable to connect to Perforce!' exceptions.
I tried the same credentials locally: it works. I used the command line with the same credentials on the server: it works.
Could anyone help me with this, if you have encountered it before?
Here is the code I use, if needed:
public static class PerforceLib
{
    public static P4Connection p4conn = null;

    private static void CheckConn()
    {
        try
        {
            if (p4conn == null)
            {
                p4conn = new P4Connection();
                p4conn.Port = "address";
                p4conn.User = "user";
                p4conn.Password = "pwd*";
                p4conn.Connect();
                p4conn.Login("pwd");
            }
            else if (p4conn != null)
            {
                if (!p4conn.IsValidConnection(true, false))
                {
                    Log("Check CONN : Connection is not valid, reconnecting");
                    p4conn.Login("pwd*");
                }
            }
        }
        catch (Exception ex)
        {
            Log(ex.Message);
        }
    }

    public static int DiffByChangeSetNumber(string ChangeSetNumber)
    {
        try
        {
            CheckConn();
            P4Record set = p4conn.Run("describe", "-s", ChangeSetNumber)[0];
            string[] files = set.ArrayFields["depotFile"].ToArray<string>();
            string[] revs = set.ArrayFields["rev"].ToArray<string>();
            string[] actions = set.ArrayFields["action"].ToArray<string>();
            int totalChanges = 0;
            List<P4File> lstFiles = new List<P4File>();
            for (int i = 0; i < files.Count(); i++)
            {
                if (actions[i].ToString() == "edit")
                    lstFiles.Add(new P4File() { DepotFile = files[i].ToString(), Revision = revs[i].ToString(), Action = actions[i].ToString() });
            }
            foreach (var item in lstFiles)
            {
                if (item.Revision != "1")
                {
                    string firstfile = string.Format("{0}#{1}", item.DepotFile, (int.Parse(item.Revision) - 1).ToString());
                    string secondfile = string.Format("{0}#{1}", item.DepotFile, item.Revision);
                    P4UnParsedRecordSet rec = p4conn.RunUnParsed("diff2", "-ds", firstfile, secondfile);
                    if (rec.Messages.Count() > 1)
                    {
                        totalChanges = PerforceUtil.GetDiffResults(rec.Messages[1].ToString(), item.DepotFile);
                    }
                }
            }
            GC.SuppressFinalize(lstFiles);
            Log(string.Format("{0} / {1}", ChangeSetNumber, totalChanges.ToString() + Environment.NewLine));
            return totalChanges;
        }
        catch (Exception ex)
        {
            Log(ex.Message + Environment.NewLine);
            return -1;
        }
    }
}
Your help will be appreciated. Many thanks.
I have solved this issue. We identified that the code was cycling through the entire ephemeral port range in around two minutes; once it reached the highest ephemeral port, it tried to reuse the same ports. Because each Perforce command creates a new socket, the available ports ran out after roughly 1000 changesets had been processed.
I set the ReservedPorts value under HKLM\SYSTEM\CurrentControlSet\Services\Tcpip\Parameters to default(1433,143), which gave me a larger range of ephemeral ports.
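For reference, ReservedPorts is a REG_MULTI_SZ of "start-end" range strings. A hypothetical sketch of setting it from C# (the range is illustrative only; writing it needs admin rights, and the change may need a reboot to take effect):
// Illustrative only: reserve a port range so it is excluded from ephemeral use.
using (var key = Registry.LocalMachine.OpenSubKey(
    @"SYSTEM\CurrentControlSet\Services\Tcpip\Parameters", writable: true))
{
    key.SetValue("ReservedPorts", new[] { "1433-1434" }, RegistryValueKind.MultiString);
}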
I also implemented a singleton pattern for the P4Connection, which helped because I no longer close the connection; I only check the validity of the connection and log in again if it is not valid.
Please let me know if any of you needs help regarding this.