How to catch LDAP Sync Info Message using Content Synchronization Operation - C#

I am trying to retrieve the deleted UUIDs from an OpenLDAP server using a .NET Core console application.
By using a Perl script and dumping the whole response, I was able to see that a Sync Info Message was indeed sent by my OpenLDAP server and that it contained the UUIDs of the present entries.
I set up an OpenLDAP server with the syncprov overlay (see my previous question, Can't get deleted items from OpenLDAP Server using Content Synchronization Operation (syncrepl)).
After re-reading RFC 4533 several times, going through the OpenLDAP syncrepl documentation, and analysing the response, I concluded that with my current configuration (no accesslog) it is impossible to retrieve deleted entries, only a list of present entries, which is carried in the Sync Info Message. I wish to retrieve that information anyway so I can compute a delta between what is sent and what is on my client.
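To illustrate the delta I have in mind, here is a minimal sketch (the collections and their contents are placeholders, not my real data): anything known locally that the server no longer reports as present must have been deleted on the server.

using System;
using System.Collections.Generic;
using System.Linq;

// Placeholder data: localUuids = entryUUIDs stored on the client,
// presentUuids = entryUUIDs listed in the Sync Info Message's present set.
var localUuids = new HashSet<string> { "uuid-1", "uuid-2", "uuid-3" };
var presentUuids = new HashSet<string> { "uuid-1", "uuid-3" };

// Entries we know locally but the server no longer reports as present
// must have been deleted on the server.
var deletedUuids = localUuids.Except(presentUuids).ToList();
deletedUuids.ForEach(u => Console.WriteLine($"Deleted on server: {u}"));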
Do you know how to catch the message in C#?
I tried using the DirectoryServices.Protocols and the Novell.Directory.Ldap libraries (separately). I must have missed something but don't know what exactly...
I used the Novell code sample (the SearchPersist one, with the corresponding control added) available at https://www.microfocus.com/documentation/edirectory-developer-documentation/ldap-libraries-for-c-sharp/.
I can retrieve added/modified entries but not the Sync Info Message containing the present entries.

By digging a bit into the Novell library, I found some useful classes for decoding ASN.1 objects.
Using the following code, I am able to determine the type of the intermediate Sync Info Message.
var decoder = new LBERDecoder();
var taggedValue = (Asn1Tagged)decoder.decode(intermediateResponse.getValue());
Then, depending on the tag, I am able to decode the message (using the .decode(valueToDecode) method).
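For context, here is a rough sketch of how that tag-based handling might look. It assumes the lowercase Novell API used in the SearchPersist sample; the exact member names for reading the tag (getIdentifier(), Tag) vary between library versions and are assumptions here. The tags follow the syncInfoValue CHOICE from RFC 4533 (newcookie = 0, refreshDelete = 1, refreshPresent = 2, syncIdSet = 3).

using System;
using Novell.Directory.Ldap;
using Novell.Directory.Ldap.Asn1;

// Rough sketch: decode a Sync Info intermediate response (RFC 4533 syncInfoValue).
// Member names such as getIdentifier() and Tag are assumptions for the older Novell API.
static void HandleSyncInfoMessage(LdapIntermediateResponse intermediateResponse)
{
    var decoder = new LBERDecoder();
    var syncInfo = (Asn1Tagged)decoder.decode(intermediateResponse.getValue());

    // The CHOICE tag tells us which syncInfoValue variant was sent.
    switch (syncInfo.getIdentifier().Tag)
    {
        case 0: // newcookie: just an updated sync cookie
            Console.WriteLine("Received newcookie");
            break;
        case 1: // refreshDelete
        case 2: // refreshPresent: end of a refresh phase
            Console.WriteLine("Refresh phase message, tag " + syncInfo.getIdentifier().Tag);
            break;
        case 3: // syncIdSet: carries the set of entryUUIDs (the present entries in my case)
            Console.WriteLine("Received syncIdSet with entryUUIDs");
            break;
    }
}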

Related

"Write" Transactions don't work when using C#/Node.js with Amazon Neptune

I am able to connect to Neptune and was able to add some data to it with no issues. However, when I try the code at https://tinkerpop.apache.org/docs/current/reference/#gremlin-dotnet-transactions, it doesn't seem to work. I receive the following error:
"Received data deserialized into null object message. Cannot operate on it."
I even jumped to a JS sample (https://tinkerpop.apache.org/docs/current/reference/#gremlin-javascript-transactions) and tried again. It doesn't work either.
What am I missing?
At the time of this writing, Amazon Neptune only supports TinkerPop version 3.4.11. The "traversal transaction" semantics using tx() that you are referencing are new as of 3.5.2, which Apache TinkerPop released in mid-January 2022.
Transactions are typically only required when you need to submit multiple queries that must all be bound within a single commit, with a rollback if any of the queries fails. If you don't need this, then each Gremlin query sent to Neptune behaves as a single transaction.
If you do need transaction-like behavior in 3.4.11, here's a link to the documentation on how to do that in Neptune using Gremlin sessions: https://docs.aws.amazon.com/neptune/latest/userguide/access-graph-gremlin-sessions.html
If you don't need transactions, then here are examples of interacting with Neptune by submitting individual queries:
(.NET) https://docs.aws.amazon.com/neptune/latest/userguide/access-graph-gremlin-dotnet.html
(JS) https://docs.aws.amazon.com/neptune/latest/userguide/access-graph-gremlin-node-js.html
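For reference, here is a minimal C# sketch of submitting individual queries with Gremlin.Net against a 3.4.x endpoint; the endpoint name is a placeholder for your Neptune cluster endpoint, and each traversal below runs as its own transaction.

using System;
using Gremlin.Net.Driver;
using Gremlin.Net.Driver.Remote;
using Gremlin.Net.Process.Traversal;

// The endpoint is a placeholder; replace it with your Neptune cluster endpoint.
var gremlinServer = new GremlinServer("your-neptune-endpoint.amazonaws.com", 8182, enableSsl: true);
using var gremlinClient = new GremlinClient(gremlinServer);
var g = AnonymousTraversalSource.Traversal().WithRemote(new DriverRemoteConnection(gremlinClient));

// Two separate statements = two separate transactions on TinkerPop 3.4.x / Neptune.
g.AddV("person").Property("name", "alice").Iterate();
long count = g.V().HasLabel("person").Count().Next();
Console.WriteLine($"person vertices: {count}");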

How to Get SharePoint Online Migration API Logs (using C#)

Using the SharePoint.Client version 16 package, we are trying to create a MigrationJob in C# and then subsequently want to see the status and logs of that migration job. We managed to provision the containers and queue using the ProvisionMigrationContainers and ProvisionMigrationQueue methods on the Site object. And we managed to upload some files and manifest XMLs. These XMLs still contain some errors in the ids and structure, so we expect the job to fail. However, we still expect the job to be created and to output some messages and logs. Unfortunately, the message queue seems to be empty and the logs are nowhere to be found (at least we can't find them). The Guid of the created migration job is the null guid: 00000000-0000-0000-0000-000000000000
According to https://learn.microsoft.com/en-us/sharepoint/dev/apis/migration-api-overview the logs should be saved in the manifest container as a blob. But how would you actually find the name of the log file? The problem is that everything has to be encrypted and it is not allowed to list the blobs in the blob storage (trying this leads to a 403 error).
So the main question is: how are we supposed to access the log files? And the bonus question: assuming that the command to create the migration job is correct, why are we getting the null guid? And last one: why is the queue empty? I could speculate that the migration job is never created and that's why the guid is all zeroes, but how are we supposed to know what is preventing the job from being created?
Here is the code that creates the Migration Job:
public ClientResult<Guid> CreateMigrationJob()
{
    var encryption = new EncryptionOption
    {
        AES256CBCKey = encryptionProvider.Key
    };
    return context.Site.CreateMigrationJobEncrypted(
        context.Web.Id,
        dataContainer.Uri.ToString(),
        metadataContainer.Uri.ToString(),
        migrationQueue.Uri.ToString(),
        encryption
    );
}
context, dataContainer, metadataContainer have all been properly instantiated as members and have been used in other methods successfully. migrationQueue and encryption also look fine, but have not been used elsewhere. The encryption key has been used to upload and download files though and works perfectly fine there.
For completeness' sake, here is the code we tried to use to check if there is anything in the queue:
public void GetMigrationLog()
{
    while (migrationQueue.ApproximateMessageCount > 0) // debug code, this should be done async
    {
        Console.WriteLine(migrationQueue.GetMessage().AsString);
    }
}
It outputs nothing, because the queue is empty. We would expect there to be at least an error message or a message that the logs were created (including the name of the log file).
PS: we realise that it should be possible to download the logs using DownloadToFileEncrypted(encryptionProvider, targetFile.ToString(), System.IO.FileMode.Create) but only if you already know the file name, which you cannot find, so that seems a bit silly.
When you call context.Site.CreateMigrationJobEncrypted in your code, it returns a Guid. The name of the log file will be Import-TheGuidThatWasReturned-ANumberThatStartsAt1ButIncrements.log
So the first log file might be called:
Import-AE9525D9-3CF7-4D1A-A9E0-8AB0DF4F09B2-1.log
Using encryption should not stop you from reading the queue. You will only be unable to read the queue if you have configured your queue this way or you are using the tenancy default rather than your own.
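As a rough, untested sketch of putting that naming scheme together (assuming metadataContainer is the CloudBlobContainer for the manifest container and that the DownloadToFileEncrypted helper mentioned in the question is available with that signature):

// Hypothetical sketch: build the log blob name from the job Guid returned by
// CreateMigrationJobEncrypted and download it with the same encryption key.
ClientResult<Guid> jobResult = CreateMigrationJob();
context.ExecuteQuery();                 // the Guid is only populated after the query executes
Guid jobId = jobResult.Value;

// Log files are named Import-<JobId>-<N>.log, where N starts at 1 and increments.
string logBlobName = $"Import-{jobId.ToString().ToUpperInvariant()}-1.log";

// metadataContainer is assumed to be the CloudBlobContainer for the manifest container.
var logBlob = metadataContainer.GetBlockBlobReference(logBlobName);
logBlob.DownloadToFileEncrypted(encryptionProvider, "migration-job-1.log", System.IO.FileMode.Create);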

Find new emails that come into an Exchange email server

Right now I have a Microsoft Exchange mail server, and I want to extract information from that Exchange server directly into my own SQL database so I can analyze it.
The problem is that the way I currently extract information from this Exchange database is to check every user, then check every folder they have, then check all the content under each folder; if I see any modified or new emails, I copy or update them into my own database.
This method does work for a small amount of data; however, if I try to copy a large amount of data this way (e.g. 1000 users), the process takes minutes if not longer, because I am constantly doing loops over loops to check every single email one by one for differences. I want a way to only loop through a folder if it has been modified.
Looking at Microsoft's documentation on Exchange folders, I see a function called IsDirty. The description says it returns true or false based on whether the object has been modified. I am confused about what the object is being compared to. Does it compare against the state of the object when it was created, or against a specific date? I tried to find information on this and couldn't. Also, is there any other, faster way of detecting new or modified emails in Exchange without looping through all the folders and all the emails? Thanks.

Exchange WS 'The specified object was not found in the store.' error

We're using Exchange 2007 WS to process mail folders and are hitting various problems if we try and forward a message we've already received. Our process is:
A Windows service monitors a mailbox folder; on finding a new mail, we process the information, move the item to a 'Processed' folder, and store the Exchange message id.
Users may opt to forward the mail externally. We use the Exchange API to find the item using the message id we stored earlier, and then use the API again to forward it.
Except finding the mail again is proving rather flaky. We regularly get the following error:
The specified object was not found in the store.
Is there a better/more reliable way we can achieve the same? The documentation for Exchange WS is rather sparse.
This is a bug in the Microsoft Exchange Managed API. Here is a link for more information:
http://maheshde.blogspot.com/2010/09/exchange-web-service-specified-object.html
Are you saving the Message ID of the newly found message or the message once it has been moved to the 'Processed' folder? The id will change when it moves to a new folder.
The method recommended in the book Inside Microsoft Exchange Server 2007 Web Services is to grab the PR_SEARCH_KEY (0x300B, Binary) of the newly discovered item, then move it to the 'Processed' folder. You can then search for it in the new folder based on the PR_SEARCH_KEY and get its new message id to forward it.
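If you are on the EWS Managed API, that approach might look roughly like this sketch (service, processedFolderId, and searchKeyBytes are assumed to already exist in your code; PR_SEARCH_KEY is property tag 0x300B of type Binary, captured from the item before it was moved):

using System;
using Microsoft.Exchange.WebServices.Data;

// PR_SEARCH_KEY (0x300B, Binary) survives the move to the 'Processed' folder.
var searchKeyProp = new ExtendedPropertyDefinition(0x300B, MapiPropertyType.Binary);

// searchKeyBytes was read from the item before it was moved.
var filter = new SearchFilter.IsEqualTo(searchKeyProp, Convert.ToBase64String(searchKeyBytes));
var view = new ItemView(1) { PropertySet = new PropertySet(BasePropertySet.IdOnly, searchKeyProp) };

FindItemsResults<Item> results = service.FindItems(processedFolderId, filter, view);
if (results.TotalCount > 0)
{
    // The item's id in its new folder; bind and forward using that id.
    var message = EmailMessage.Bind(service, results.Items[0].Id);
    message.Forward(new MessageBody("Forwarded by our service"), new EmailAddress("someone@example.com"));
}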
I have come to the conclusion that this happens to me because, while my app is processing the emails, someone else is fiddling with an email at the same time.
So to cure the problem, I put the code in a try/catch and check whether the exception is the 'object was not found in the store' error; if so, I just skip it and move on to the next item. So far this has had no issues.
I wrote a program that reads the emails in the inbox, downloads attachments to a specified folder, writes the email info and the saved path to the database, and finally deletes the email. I run this program as a Windows service. After all tests were finished, I deployed the program to the main server and ran it. The program ran successfully, but sometimes I got this error. I checked everything and finally found that I had forgotten to stop the service on my own computer: two programs, one on my computer and one on the real server, were checking the same mailbox at the same time. If you get this error, make sure that only one program is processing the same mailbox.

How to get the modified time attribute of a certain file on FTP

I need to monitor a certain file on FTP; once it has been updated, I need to fetch it from FTP. But how to identify whether it has been updated is the problem.
Does anybody have any experience with this?
You need to send a LIST command. You'll need to parse the results manually using regex, since there is no standard format for the return result.
File modification date and time can also be obtained using the MLST or MDTM command. Both are extensions of the FTP protocol (not guaranteed on all servers), but at least one of them is supported by most servers. These commands return a standardized format, so the result does not have to be parsed like the output of a LIST command.
See this article for more details.
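For example, in C# the built-in FtpWebRequest can issue MDTM for you through WebRequestMethods.Ftp.GetDateTimestamp (a minimal sketch; the URL and credentials are placeholders, and the server must support MDTM):

using System;
using System.Net;

// Poll the file's modification time over FTP using MDTM (GetDateTimestamp).
var request = (FtpWebRequest)WebRequest.Create("ftp://ftp.example.com/path/to/file.dat");
request.Method = WebRequestMethods.Ftp.GetDateTimestamp;
request.Credentials = new NetworkCredential("user", "password");

using var response = (FtpWebResponse)request.GetResponse();
DateTime lastModified = response.LastModified;
Console.WriteLine($"Last modified: {lastModified:u}");

// Compare lastModified with the value from the previous poll;
// if it is newer, download the file.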
