First: I have two BizTalk applications. The first one polls SQL Server and sends to an MQ queue; it works fine. The second processes a file and uses a dynamic send port. In the orchestration I have:
TOMQ(Microsoft.XLANGs.BaseTypes.Address) = QueuePath;
TOMQ(Microsoft.XLANGs.BaseTypes.TransportType) = "MQSeries";
It grabs the file, processes it, and sends the results to an output directory (this works fine), then to MQ. MQ throws the error: Error encountered on opening Queue Manager name = .... Reason code = 2354. I checked and it is getting the correct queue path, but it still fails. Does anyone have any suggestions? I've checked everything I can think of.
I answered this question on this post: I have a BizTalk application with a dynamic send port that is set to "MQSeries". Can I programmatically set its properties?
Basically, I had to create a copy of my message and add MQSeries.dll as a reference to my project. I then set the property like this:
DBMSGMQ = DBMSGOUT;
DBMSGMQ(MQSeries.TransactionSupported)="False";
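For context, the whole message-assignment shape ends up looking roughly like this. This is a sketch using the names from the question (TOMQ, QueuePath, DBMSGOUT); the copy DBMSGMQ is what actually gets sent:

    // In a Message Assignment shape: copy the message, set the MQSeries
    // context property on the copy, then configure the dynamic port.
    DBMSGMQ = DBMSGOUT;
    DBMSGMQ(MQSeries.TransactionSupported) = "False";
    TOMQ(Microsoft.XLANGs.BaseTypes.Address) = QueuePath;
    TOMQ(Microsoft.XLANGs.BaseTypes.TransportType) = "MQSeries";

The key point is that context properties must be set on a fresh copy of the message inside a construct block, not on the original.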
Using the SharePoint.Client version 16 package, we are trying to create a migration job in C# and then subsequently want to see the status and logs of that migration job. We managed to provision the containers and queue using the ProvisionMigrationContainers and ProvisionMigrationQueue methods on the Site object, and we managed to upload some files and manifest XMLs. These XMLs still contain some errors in the IDs and structure, so we expect the job to fail. However, we still expect the job to be created and to output some messages and logs. Unfortunately, the message queue seems to be empty and the logs are nowhere to be found (at least we can't find them). The Guid of the created migration job is the null Guid: 00000000-0000-0000-0000-000000000000
According to https://learn.microsoft.com/en-us/sharepoint/dev/apis/migration-api-overview the logs should be saved in the manifest container as a blob. But how would you actually find the name of the log file? The problem is that everything has to be encrypted and it is not allowed to list the blobs in the blob storage (trying this leads to a 403 error).
So the main question is: how are we supposed to access the log files? And the bonus question: assuming that the command to create the migration job is correct, why are we getting the null guid? And last one: why is the queue empty? I could speculate that the migration job is never created and that's why the guid is all zeroes, but how are we supposed to know what is preventing the job from being created?
Here is the code that creates the Migration Job:
public ClientResult<Guid> CreateMigrationJob()
{
    var encryption = new EncryptionOption
    {
        AES256CBCKey = encryptionProvider.Key
    };
    return context.Site.CreateMigrationJobEncrypted(
        context.Web.Id,
        dataContainer.Uri.ToString(),
        metadataContainer.Uri.ToString(),
        migrationQueue.Uri.ToString(),
        encryption
    );
}
context, dataContainer and metadataContainer have all been properly instantiated as members and have been used successfully in other methods. migrationQueue and encryption also look fine, but have not been used elsewhere. The encryption key has been used to upload and download files, though, and works perfectly fine there.
For completeness' sake, here is the code we used to check whether there is anything in the queue:
public void GetMigrationLog()
{
    while (migrationQueue.ApproximateMessageCount > 0) // debug code, this should be done async
    {
        Console.WriteLine(migrationQueue.GetMessage().AsString);
    }
}
It outputs nothing, because the queue is empty. We would expect there to be at least an error message or a message that the logs were created (including the name of the log file).
PS: we realise that it should be possible to download the logs using DownloadToFileEncrypted(encryptionProvider, targetFile.ToString(), System.IO.FileMode.Create) but only if you already know the file name, which you cannot find, so that seems a bit silly.
When you call context.Site.CreateMigrationJobEncrypted in your code, it returns a Guid. The name of the log file will be Import-TheGuidThatWasReturned-ANumberThatStartsAt1ButIncrements.log
So the first log file might be called:
Import-AE9525D9-3CF7-4D1A-A9E0-8AB0DF4F09B2-1.log
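Putting that naming scheme together, a rough sketch of fetching the first log blob. This assumes jobId holds the Guid returned by CreateMigrationJobEncrypted, logFolder is a local path of your choosing, and DownloadToFileEncrypted is the extension method mentioned in the question:

    // Log blobs are named Import-{jobGuid}-{n}.log, with n starting at 1.
    Guid jobId = migrationJobResult.Value;
    int sequence = 1;
    string logBlobName = $"Import-{jobId.ToString().ToUpper()}-{sequence}.log";

    // The logs live in the manifest (metadata) container.
    var logBlob = metadataContainer.GetBlockBlobReference(logBlobName);
    logBlob.DownloadToFileEncrypted(encryptionProvider,
        Path.Combine(logFolder, logBlobName), System.IO.FileMode.Create);

To fetch all logs, increment sequence until the blob download fails with a 404.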
Using encryption should not stop you from reading the queue. You will only be unable to read the queue if you have configured your queue that way, or if you are using the tenancy default rather than your own.
I've got a very simple console app that is having trouble peeking at a message in a remote private queue.
var queues = MessageQueue.GetPrivateQueuesByMachine(machineName);
var queue = queues.Where(x=>x.FormatName == queueName).Single();
Message message = queue.Peek();
The Peek call fails with a MessageQueueException of "Access to Message Queuing system is denied".
Using the same client machine and user I am able to view the queue using Queue Explorer and with the Message Queuing Snap In.
Experimenting with a local queue I am only able to reproduce the error by taking away the Peek permission on the queue but that also stops it in the other tools.
I've seen lots of information that points me to the issues outlined here.
However, it seems like if any of those things was the issue, I wouldn't be able to do it using the other tools as well.
EDIT
I have been able to get this to work using the MSMQQueueInfo/MSMQQueue COM objects without changing any credentials.
It would be nice if I could make it work using the .NET libraries but at least I have a workaround.
My issue was that when GetPrivateQueuesByMachine is used to get the queue, it uses an access mode of SendAndReceive, which asks for more permissions than I had. I had to use the MessageQueue constructor to specify the AccessMode (in this case, Peek).
In the end, I was able to get this to work using code similar to the following:
var queue = new MessageQueue(@"FormatName:DIRECT=OS:machineName\private$\queueName", QueueAccessMode.Peek);
Message message = queue.Peek();
I had the same problem. In my case, I was initializing the message queue in the parent thread and calling Peek in a child thread.
If you are using multithreading, try to keep the initialization and the call in the same thread.
That the queue shows up in various utilities doesn't tell you that much; such a utility is pretty unlikely to be peeking at messages. In general, the default access permissions allow everybody to see the queue and post messages to it, but not retrieve them.
On the machine that owns this queue, use Control Panel > Administrative Tools > Computer Management > Services and Applications > Message Queuing > Private Queues. Select the queue and right-click > Properties > Security tab. Note how Everyone has some rights, like "Get Properties" and "Send Message", but not "Peek Message".
The sane thing to do is just add the user account that you use on the other machine and tick the rights you need to get the job done. If this machine is managed by an admin, you'll need to ask them to do it for you.
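If you own the queue, the same grant can also be scripted with System.Messaging. A sketch, where "DOMAIN\user" is a placeholder for the actual remote account:

    using System.Messaging;

    var queue = new MessageQueue(@".\private$\queueName");
    // Grant the remote account the right to peek messages on this queue.
    queue.SetPermissions(@"DOMAIN\user",
        MessageQueueAccessRights.PeekMessage,
        AccessControlEntryType.Allow);

Add MessageQueueAccessRights.ReceiveMessage in a second call if the account also needs to dequeue.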
I am trying to change a static send port URI from the BizTalk admin console: by opening the configuration, I am able to change the URI.
But if I do the send port URI change using a WMI script in C#, or by updating bts_sendport_transport directly, it changes the URI in the database and the change appears in the send port list as well.
But when I open the MQ configuration, I can still see the old URI in the MQ definition.
Can anyone please help me change the MQ definition in send ports without using the admin console?
I think your approach to the problem is wrong; dynamic ports were invented for exactly this situation. Since your question is not clear enough, maybe I don't understand your problem correctly.
Here is a link on dynamic ports and their usage: http://www.codeproject.com/Articles/502425/BizTalk-Static-and-Dynamic-FTP-Send-Port-Sample
You should focus on the last part:
Msg_DynamicSend(FTP.CommandLogFileName) = "D:\\BiztalkLogs\\FTPLog\\DynamicFTPLog.txt";
Msg_DynamicSend(FTP.UserName) = "FTPUSER";
Msg_DynamicSend(FTP.Password) = "Pass1234";
Msg_DynamicSend(FTP.SpoolingFolder) = "/IN/";
Msg_DynamicSend(FTP.RepresentationType) = "ASCII";
DynSendPort(Microsoft.XLANGs.BaseTypes.Address)= "ftp://inhydeshrilata";
DynSendPort(Microsoft.XLANGs.BaseTypes.TransportType) = "FTP";
I have an application that runs as a Windows service. It stores various settings in a database that are looked up when the service starts. I built the service to support various types of databases (SQL Server, Oracle, MySQL, etc.). End users often choose to configure the software to use SQL Server (they can simply modify a config file with the connection string and restart the service). The problem is that when their machine boots up, SQL Server is often started after my service, so my service errors out on startup because it can't connect to the database. I know that I can specify dependencies for my service to help guide the Windows service manager to start the appropriate services before mine. However, I don't know what services to depend upon at install time (when my service is registered), since the user can change databases later on.
So my question is: is there a way for the user to manually indicate the service dependencies based on the database that they are using? If not, what is the proper design approach that I should be taking? I've thought about trying to do something like wait 30 seconds after my service starts up before connecting to the database but this seems really flaky for various reasons. I've also considered trying to "lazily" connect to the database; the problem is that I need a connection immediately upon start up since the database contains various pieces of vital info that my service needs when it first starts. Any ideas?
Dennis
What you're looking for is SC.exe. This is a command-line tool that users can use to configure services.
sc [Servername] Command Servicename [Optionname= Optionvalue...]
More specifically, you would want to use (note that sc requires a space after depend=):
sc [ServerName] config ServiceName depend= ServiceToDependOn
For the default SQL Server instance, for example: sc config MyService depend= MSSQLSERVER
Here is a link to the command-line options for SC.exe:
http://msdn.microsoft.com/en-us/library/ms810435.aspx
A possible (far from ideal) code solution:
In your startup method, code it as a loop that terminates when you've got a connection. In that loop, trap any database connection errors and keep retrying, as the following pseudo code illustrates:
bool connected = false;
while (!connected)
{
    try
    {
        connected = openDatabase(...);
    }
    catch (connection error)
    {
        // It might be worth waiting for some time here
    }
}
This means that your program doesn't continue until it has a connection. However, it could also mean that your program never gets out of this loop, so you'd need some way of terminating it - either manually or after a certain number of tries.
As you need your service to start in a reasonable time, this code can't go in the main initialisation. You have to arrange for your program to "start" successfully, but not do any processing until this method had returned connected = true. You might achieve this by putting this code in a thread and then starting your actual application code on the "thread completed" event.
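One way to sketch that shape, assuming a ServiceBase-derived service; TryOpenDatabase and RunProcessing are hypothetical methods standing in for your own connection and work logic:

    using System;
    using System.ServiceProcess;
    using System.Threading;

    public partial class MyService : ServiceBase
    {
        private Thread worker;

        protected override void OnStart(string[] args)
        {
            // Return from OnStart quickly so the SCM considers the service
            // started; do the connection retrying on a background thread.
            worker = new Thread(() =>
            {
                while (!TryOpenDatabase())   // hypothetical: false on failure
                {
                    Thread.Sleep(TimeSpan.FromSeconds(10)); // wait, then retry
                }
                RunProcessing();             // hypothetical: the actual work
            });
            worker.IsBackground = true;
            worker.Start();
        }
    }

A retry limit (or exponential backoff) in the loop would address the "never gets out of this loop" concern above.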
Not a direct answer, but some points you can look into:
A Windows service can be started automatically with a delay. You can check this question on SO for some information about it:
How to make Windows Service start as “Automatic (Delayed Start)”
Check this post How to: Code Service Dependencies
We're using Exchange 2007 WS to process mail folders and are hitting various problems if we try and forward a message we've already received. Our process is:
A Windows service monitors the mailbox folder; on finding a new mail, we process the information, move the item to a 'Processed' folder, and store the Exchange message ID.
Users may opt to forward the mail externally. We use the Exchange API to find the item using the message ID we stored earlier, and then use the API again to forward it.
Except finding the mail again is proving rather flaky. We regularly get the following error:
The specified object was not found in the store.
Is there a better/more reliable way we can achieve the same? The documentation for Exchange WS is rather sparse.
This is a bug in the Microsoft Exchange Managed API. Here is a link with more information:
http://maheshde.blogspot.com/2010/09/exchange-web-service-specified-object.html
Are you saving the message ID of the newly found message, or of the message once it has been moved to the 'Processed' folder? The ID changes when the item moves to a new folder.
The method recommended in the book Inside Microsoft Exchange Server 2007 Web Services is to grab the PR_SEARCH_KEY (0x300B, Binary) of the newly discovered item, then move it to the 'Processed' folder. You can then search for it in the new folder based on the PR_SEARCH_KEY and get its new message ID to forward it.
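With the EWS Managed API, that approach looks roughly like this. A sketch only: the service object, itemId, and processedFolderId are assumed to exist already:

    using Microsoft.Exchange.WebServices.Data;

    // PR_SEARCH_KEY (0x300B, Binary) survives the move; the item ID does not.
    var searchKey = new ExtendedPropertyDefinition(0x300B, MapiPropertyType.Binary);

    // 1. Read the search key off the newly discovered item.
    var props = new PropertySet(BasePropertySet.IdOnly, searchKey);
    EmailMessage msg = EmailMessage.Bind(service, itemId, props);
    object keyValue;
    msg.TryGetProperty(searchKey, out keyValue);

    // 2. Move it to the 'Processed' folder.
    msg.Move(processedFolderId);

    // 3. Later: find it again in 'Processed' by the saved search key
    //    (binary extended properties are compared as base64 strings).
    var filter = new SearchFilter.IsEqualTo(searchKey,
        Convert.ToBase64String((byte[])keyValue));
    FindItemsResults<Item> found =
        service.FindItems(processedFolderId, filter, new ItemView(1));

The item returned by FindItems carries the post-move item ID, which can then be used to forward the message.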
I have come to the conclusion that this happens to me because, while my app is processing the emails, someone else is fiddling with an email at the same time.
So to cure the problem, I put the code in a try/catch and check whether the exception is the 'object not found in store' error; if so, I just skip it and move on to the next item. So far it has had no issues.
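A sketch of that skip-and-continue pattern with the EWS Managed API; ProcessItem is a hypothetical stand-in for whatever you do per message:

    using Microsoft.Exchange.WebServices.Data;

    foreach (ItemId id in idsToProcess)
    {
        try
        {
            ProcessItem(id); // hypothetical per-message work
        }
        catch (ServiceResponseException ex)
            when (ex.ErrorCode == ServiceError.ErrorItemNotFound)
        {
            // Someone else moved or deleted the item; skip it and carry on.
            continue;
        }
    }

Catching only ErrorItemNotFound keeps genuine failures (auth, connectivity) from being silently swallowed.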
I wrote a program that reads the emails in an inbox, downloads the attachments to a specified folder, writes the email info and the saved path to the database, and finally deletes the email. I run this program as a Windows service. After all tests were finished, I deployed it to the main server and ran it. The program ran successfully, but sometimes I got this error. I checked everything and finally found that I had forgotten to stop the service on my own computer: two copies, one on my computer and one on the real server, were checking the same mailbox at the same time. If you get this error, make sure that only one program processes the same mailbox at a time.