Navigate through a SAS Project's OutputDatasets using C#

I've been trying to navigate through a SAS project's output datasets using the SAS.EG.Scripting library for C#, but I keep getting an empty collection of datasets, even though the dataset is correctly generated on the SAS server.
I tried to follow the steps explained in this article, which is the only resource about SAS automation I've found on the web: Not Just for Scheduling: Doing More with SAS® Enterprise Guide® Automation.
The code I've written so far is below:
public static void RunSASProject()
{
    SAS.EG.Scripting.Application EGApp = new SAS.EG.Scripting.Application();
    EGApp.SetActiveProfile("almarci");
    SAS.EG.Scripting.Project EGProject = (SAS.EG.Scripting.Project)EGApp.New();
    SAS.EG.Scripting.Code oCode = (SAS.EG.Scripting.Code)EGProject.CodeCollection.Add();
    try
    {
        // Set up the Code object and the SAS program it will run
        oCode.Server = "SASCORP";
        oCode.UseApplicationOptions = false;
        oCode.GenSasReport = false;
        oCode.Name = "Testing";
        oCode.Text = "LIBNAME SRC '/home/cau004/aj/dccvoj/sotcpc/giad/workgroup/Apoio'; DATA SRC.CARS; SET SASHELP.CARS; OUTPUT; RUN;";
        oCode.Run();

        // Save the log and the submitted code ("MM" is months; lowercase "mm" would be minutes)
        oCode.Log.SaveAs(@"C:\Users\almarci\Desktop\SAS\LogSAS" + DateTime.Now.ToString("ddMMyyyy HHmmss") + ".log");
        oCode.TaskCode.SaveAs(@"C:\Users\almarci\Desktop\SAS\TaskSAS" + DateTime.Now.ToString("ddMMyyyy HHmmss") + ".txt");

        // Iterate through the output datasets -- this collection comes back empty
        SAS.EG.Scripting.OutputDatasets outputDatasets = (SAS.EG.Scripting.OutputDatasets)oCode.OutputDatasets;
        foreach (SAS.EG.Scripting.OutputData outputData in outputDatasets)
        {
            Console.Write(outputData.Name.ToString());
        }
    }
    catch (Exception ex)
    {
        Console.WriteLine("\n" + ex.Message);
    }
    finally
    {
        EGApp.Quit();
    }
}
The basic steps it performs are the following:
1) Instantiate an Application object;
2) Define the profile that will be used for connecting to the server;
3) Create a new project;
4) Add a Code object to the recently created project;
5) Set up the Code object properties (Name, Text - which is the SAS command that will be executed);
6) Run the Code object;
7) Save the Code log and submitted program text to files;
8) Iterate through the Code's OutputDatasets collection. This is where I get the strange behavior: even though the code runs successfully, the collection's item count is zero.
Has anyone already faced this kind of problem? Have I forgotten some keyword in the SAS command, or to set some property on the objects involved?
I appreciate any help.

I wrote the paper that you referenced.
There were some automation-related fixes in SAS Enterprise Guide 4.3, post-release. You didn't say which version you have here, but this issue is best tracked with SAS Technical Support, who can advise on hotfixes.
Also, for a higher concentration of SAS expertise, consider posting such questions to the SAS Enterprise Guide discussion forum.

Related

SharePoint Event Receiver Works On One Machine/Computer/User

So we have created an updated version of a WSP for SharePoint 2010 due to our migration/update from 2007 to 2010.
The WSP is an event handler/receiver for ItemAdded(), and we have it working as intended. The issue is that the operation only seems to work from one computer/machine and no others.
When the Item is Added to a list the WSP creates a Folder in Shared Documents library, creates a wiki page, then updates the new List Item with links to the Shared Doc and Wiki.
When triggered by Machine #1 (M1) and User #1 (U1), all operations work; with Machine #2 (M2) and User #2 (U2), or M3 and U3, none of the tasks take place when a new item is created.
User #2 can log in on M1 and create a new item and all operations work. But if U1 uses M2 or M3 to create an item, the events don't trigger. Machine #1 is able to trigger the event as many times as it wants, but no other computer is able to.
If you were able to follow that: is it something with the code, some sort of cache setting on the local machine or the SharePoint server, or something else? Any help is appreciated.
Update: All machines are on the same network. None of the machines is the server; they are various personal laptops. Development was done on a separate machine. All are accessing via the same URL. All users have the same access. This is on our test site currently, which would be switched to being production once the migration/upgrade takes place.
Before current .WSP deployment we noticed the same issue but it was reverse, Machine #2 did all the updates but Machine #1 and #3 couldn't. Only thing we can think of was that those machines were the first to trigger the event after deployment.
I'm not doing the .WSP install myself; our IT guy is (he won't let us have access :/ but I understand). Below are the install commands he is running.
Add-SPSolution -LiteralPath "OurPath/ourFile.wsp"
Install-SPSolution -Identity ourIdentity -WebApplication http://myhost.com/ -GACDeployment
Below is the main part of the code
public class CreateWikiAndFolder : Microsoft.SharePoint.SPItemEventReceiver
{
    public override void ItemAdded(SPItemEventProperties properties)
    {
        try
        {
            //this.DisableEventFiring();
            base.EventFiringEnabled = false;
            string sUrlOfWikiPage = string.Empty;
            string sUrlOfNewFolder = string.Empty;
            string sSubsiteRUL = string.Empty;
            string sCurrentItemTitle = properties.ListItem["Title"].ToString();
            string sWikiListName = "TR Wikis";
            string sDocLibName = "Shared Documents";
            string sTRListID = "TR Status";
            if (sTRListID.ToUpper().Equals(properties.ListTitle.ToString().ToUpper()))
            {
                //Create the Folder
                sUrlOfNewFolder = CreateFolder(properties.ListItem.Web, sDocLibName, sCurrentItemTitle);
                //Create the Wiki
                string ItemDispFormUrl = String.Concat(properties.ListItem.Web.Url, "/", properties.ListItem.ParentList.Forms[PAGETYPE.PAGE_DISPLAYFORM].Url, "?ID=", properties.ListItem.ID.ToString());
                sUrlOfWikiPage = CreateWiki(properties.ListItem.Web, sWikiListName, sCurrentItemTitle, ItemDispFormUrl, sUrlOfNewFolder);
                //Update the current TR Item
                SPWeb myWeb = properties.ListItem.Web;
                myWeb.AllowUnsafeUpdates = true;
                SPListItem myListItem = properties.ListItem;
                SPFieldUrlValue shareFolderURLValue = new SPFieldUrlValue();
                shareFolderURLValue.Description = "Shared Folder";
                shareFolderURLValue.Url = sUrlOfNewFolder;
                myListItem["SharedFolder"] = shareFolderURLValue;
                myListItem.Update();
                myWeb.AllowUnsafeUpdates = false;
            }
            base.EventFiringEnabled = true;
        }
        catch (Exception e)
        {
            //Currently throwing nothing
        }
    }
}
It could be a hardcoded path/URL; however, there is not enough information to identify the problem. I would be glad to update my answer with a more detailed theory if you provide more details or share some of your code.
Figured out the issue. I didn't include it in the code above, but we were using a StreamWriter to write to a text file on the server to help us with debugging. The issue was with that: when User 1 was logged on their machine and the log files didn't exist, they got generated. No other user then had read/write access to those files, so it errored out at our debug files for anyone else. But that Windows user could run it as much as they wanted, since they were the owner of the files :/
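For what it's worth, here is a minimal sketch of a debug-logging helper that avoids that failure mode: each write is wrapped in its own try/catch, so a file-permission problem only costs you the log entry instead of killing the receiver. The path and the File.AppendAllText call are illustrative assumptions, not the original code:
// Hypothetical helper: debug logging that can never break the event receiver.
using System;
using System.IO;
static class DebugLog
{
    private const string LogPath = @"C:\Temp\EventReceiverDebug.log"; // hypothetical location
    public static void Write(string message)
    {
        try
        {
            // AppendAllText creates the file if it is missing; failures are swallowed,
            // so a permissions problem only loses the log entry, not the whole event.
            File.AppendAllText(LogPath, DateTime.Now.ToString("o") + " " + message + Environment.NewLine);
        }
        catch (IOException) { /* ignore logging failures */ }
        catch (UnauthorizedAccessException) { /* ignore logging failures */ }
    }
}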

verbot 5 sdk - loading KnowledgeBases

I'm looking for help from anyone who's worked with the verbot sdk.
I'm making a program in which I want to use LearnedKnowledge.vkb, Teacher.vkb, and any standard bot (julia, for example). Those who've used this before will know that with the rules in Teacher, you can essentially write responses to things the bot doesn't understand and train it on the fly.
I'm planning on using speech recognition and text-to-speech, but my problem right now is that after I load the knowledgebases, I can't seem to get any response from the bot.
Here's what I have: The Verbot5Library.dll, from verbots.sourceforge.net (I got the editor and player too, to make sure the files were working). In my program, I set up the variables as such:
Verbot5Engine verbot = new Verbot5Engine();
KnowledgeBase kb = new KnowledgeBase();
KnowledgeBaseItem kbi = new KnowledgeBaseItem();
State state = new State();
XMLToolbox xmlToolboxKB = new XMLToolbox(typeof(KnowledgeBase));
Then I initialize the verbot engine and load the kbs:
// using the xmlToolboxKB method I saw in this forum: http://www.verbots.com/forums/viewtopic.php?t=2984
kbi.Fullpath = "C:\\[full path to kb...]\\";
kbi.Filename = "LearnedKnowledge.vkb";
kb = (KnowledgeBase)xmlToolboxKB.LoadXML(kbi.Fullpath + kbi.Filename);
verbot.AddKnowledgeBase(kb, kbi);
kbi.Filename = "julia.vkb";
kb = (KnowledgeBase)xmlToolboxKB.LoadXML(kbi.Fullpath + kbi.Filename);
verbot.AddKnowledgeBase(kb, kbi);
//trying to use LoadKnowledgeBase and LoadCompiledKnowledgeBase methods: verbot.LoadKnowledgeBase("C:\\[full path to kb...]\\LearnedKnowledge.vkb");
//verbot.LoadCompiledKnowledgeBase("C:\\[full path...]\\julia.ckb");
//verbot.LoadCompiledKnowledgeBase("C:\\[full path...]\\Teacher.ckb");
// set up state
state.CurrentKBs.Add("C:\\[full path...]\\LearnedKnowledge.vkb");
state.CurrentKBs.Add("C:\\[full path...]\\Teacher.vkb");
state.CurrentKBs.Add("C:\\[full path...]\\julia.ckb");
Finally, I attempt to get a response from the verbot engine:
Reply reply = verbot.GetReply("hello", state);
if (reply != null)
Console.WriteLine(reply.AgentText);
else
Console.WriteLine("No reply found.");
I know julia has a response for "hello", as I've tested it with the editor. But all it ever returns is "No reply found". This code has been taken from the example console program in the SDK download (as very little documentation is available). That's why I need some pointers from someone who's familiar with the SDK.
Am I not loading the KBs correctly? Do they all need to be compiled (.ckb) instead of the XML files (.vkb)? I've used the verbot.OnKnowledgeBaseLoadError event handler and I get no errors. I even removed the resource file Default.vsn that's needed to load the Teacher, and it then throws an error when trying to load it, so I'm pretty sure everything is loading correctly. So why do I always get "No reply found"?
resolved: see http://www.verbots.com/forums/viewtopic.php?p=13021#13021

Automatic update a Windows application

How do I develop my Windows application so it will auto update on the client machine, like Firefox, Skype, etc.?
Is there any simple approach or open-source library that helps me do it by just following some steps or writing a few lines of code?
ClickOnce is what you're searching for.
You might also find these SO questions interesting (which offer some different solutions):
Auto update for WinForms application
How do I implement an auto update strategy for my in-house winform app
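If you do go with ClickOnce, here is a minimal sketch of checking for and applying an update programmatically, assuming the app is actually deployed via ClickOnce (uses System.Deployment.Application; call it at startup or from a menu item):
using System;
using System.Deployment.Application; // reference System.Deployment
using System.Windows.Forms;
static class UpdateChecker
{
    // Minimal sketch: check the deployment manifest for a newer version and apply it.
    public static void CheckAndUpdate()
    {
        if (!ApplicationDeployment.IsNetworkDeployed)
            return; // running outside ClickOnce (e.g. started from the IDE)
        ApplicationDeployment deployment = ApplicationDeployment.CurrentDeployment;
        UpdateCheckInfo info = deployment.CheckForDetailedUpdate();
        if (info.UpdateAvailable)
        {
            deployment.Update();   // downloads and applies the new version
            Application.Restart(); // restart so the new version is loaded
        }
    }
}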
Try Microsoft ClickOnce technology (in MSDN).
You can use wyUpdate or .NET Application Updater Component
There is also the Updater Application Block in the Enterprise Library from Microsoft.
The most popular frameworks are:
Google Omaha - This is what Chrome uses. Very powerful.
Squirrel - This is used in Electron applications. Easy to use but can't update machine-wide installations. Also, no graphical update notifications.
WinSparkle - Gives you graphical update notifications. But less mature than Squirrel.
AutoUpdater.NET - Both graphical and silent updates. Similar to Squirrel and WinSparkle.
I've taken these links from this article, which goes into more detail about the pros and cons of each framework.
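To give an idea of how little code some of these need, here is a minimal AutoUpdater.NET sketch as I understand its basic usage; the manifest URL is a placeholder you host yourself:
using AutoUpdaterDotNET; // NuGet: Autoupdater.NET.Official
using System.Windows.Forms;
public partial class MainForm : Form
{
    public MainForm()
    {
        InitializeComponent();
        // update.xml is an XML manifest describing the latest version and installer URL.
        AutoUpdater.Start("https://example.com/myapp/update.xml");
    }
}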
You can use MD5-Update. It's easy: you only need to add five lines to your application. No configuration is needed in your app; just add the library and publish the files.
1. You need a web server with PHP to publish your files; please include updt.exe.
2. Add index.php to generate the list of update files. It is available in the GitHub repository at https://github.com/jrz-soft-mx/MD5-Update/blob/main/Tools/Tools.zip, or create it with this code:
<?php
$_dat = array();
$_dir = new RecursiveDirectoryIterator(".");
foreach (new RecursiveIteratorIterator($_dir) as $_itm) {
    $_fil = str_replace("." . DIRECTORY_SEPARATOR, "", $_itm);
    if (!is_dir($_fil) && $_fil != "index.php") {
        $_dat[] = array('StrFil' => "$_fil", 'StrMd5' => strtoupper(md5_file($_fil)), 'lonSiz' => filesize($_fil));
    }
}
echo json_encode($_dat, JSON_UNESCAPED_UNICODE);
?>
3. Add the NuGet package to your project:
PM> Install-Package MD5.Update
4. Call the library when your app starts, passing your update folder URL. It updates all the files and downloads the new version of your app into the updt folder; to replace the running app itself you need updt.exe:
string strUrl = "http://yourdomain.com/app/";
if (MD5Update.MD5Update.Check(strUrl, true))
{
    Process.Start(AppDomain.CurrentDomain.BaseDirectory + @"updt.exe", AppDomain.CurrentDomain.FriendlyName + " " + Process.GetCurrentProcess().ProcessName);
    Application.Exit();
}
5. updt.exe replaces the current app with the new app from the updt folder. It is available in the GitHub repository at https://github.com/jrz-soft-mx/MD5-Update/blob/main/Tools/Tools.zip, or create it with this code:
// Requires: System, System.Collections.Generic, System.Diagnostics, System.IO, System.Linq, System.Threading, System.Windows.Forms
try
{
    Application.EnableVisualStyles();
    Application.SetCompatibleTextRenderingDefault(false);
    List<string> lisArg = Environment.GetCommandLineArgs().ToList();
    if (lisArg.Count < 3)
    {
        MessageBox.Show("Please provide the app executable name and process name");
        Application.Exit();
        return;
    }
    string strAppName = lisArg[1];
    string strAppProcees = lisArg[2];
    // Kill any other running instance of the application before replacing it
    Process[] lisPro = Process.GetProcessesByName(strAppProcees);
    foreach (Process Pro in lisPro)
    {
        if (Pro.Id != Process.GetCurrentProcess().Id)
        {
            Pro.Kill();
            Thread.Sleep(1000);
        }
    }
    string strAppMain = AppDomain.CurrentDomain.BaseDirectory + strAppName;
    string strAppUpdate = AppDomain.CurrentDomain.BaseDirectory + @"updt\" + strAppName;
    if (!File.Exists(strAppMain))
    {
        MessageBox.Show("App executable doesn't exist");
        Application.Exit();
        return;
    }
    if (!File.Exists(strAppUpdate))
    {
        MessageBox.Show("Updated app executable doesn't exist");
        Application.Exit();
        return;
    }
    // Overwrite the old executable and wait until the copy has finished
    File.Copy(strAppUpdate, strAppMain, true);
    long fileSize = 0;
    FileInfo currentFile = new FileInfo(strAppMain);
    while (fileSize < currentFile.Length)
    {
        fileSize = currentFile.Length;
        Thread.Sleep(1000);
        currentFile.Refresh();
    }
    Process.Start(strAppMain);
}
catch (Exception Ex)
{
    MessageBox.Show("An error occurred");
    File.WriteAllText(AppDomain.CurrentDomain.BaseDirectory + @"updt_" + DateTime.Now.ToString("yyyyMMddTHHmmss") + ".txt", Ex.ToString());
    Application.Exit();
}
How about System Center 2012 Configuration Manager?
I'd add another possible variation:
https://github.com/synhershko/NAppUpdate
https://github.com/cecon/autoupdatereasy
https://github.com/NetSparkleUpdater/NetSparkle
While ClickOnce is simple and was resurrected for .NET 5, it still has a lot of limitations, so I found that a better option exists nowadays: you can use the app delivery mechanism included in Windows 10, called App Installer, by packaging your app as an MSIX bundle or package.
I covered my findings related to the topic in this answer

Use C# to interact with Windows Update

Is there any API for writing a C# program that could interface with Windows Update and use it to selectively install certain updates?
I'm thinking somewhere along the lines of storing a list of approved updates in a central repository. Then the client-side applications (which would have to be installed once) would interface with Windows Update to determine what updates are available, then install the ones that are on the approved list. That way the updates are still applied automatically from a client-side perspective, but I can select which updates are being applied.
This is not my role in the company, by the way; I was really just wondering if there is an API for Windows Update and how to use it.
Add a Reference to WUApiLib to your C# project.
using WUApiLib;

protected override void OnLoad(EventArgs e)
{
    base.OnLoad(e);
    UpdateSession uSession = new UpdateSession();
    IUpdateSearcher uSearcher = uSession.CreateUpdateSearcher();
    uSearcher.Online = false;
    try
    {
        ISearchResult sResult = uSearcher.Search("IsInstalled=1 And IsHidden=0");
        textBox1.Text = "Found " + sResult.Updates.Count + " updates" + Environment.NewLine;
        foreach (IUpdate update in sResult.Updates)
        {
            textBox1.AppendText(update.Title + Environment.NewLine);
        }
    }
    catch (Exception ex)
    {
        Console.WriteLine("Something went wrong: " + ex.Message);
    }
}
Given you have a form with a TextBox this will give you a list of the currently installed updates. See http://msdn.microsoft.com/en-us/library/aa387102(VS.85).aspx for more documentation.
This will, however, not allow you to find KB hotfixes which are not distributed via Windows Update.
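To actually install only approved updates (your scenario), something along these lines should work with the same WUApiLib COM API; note it needs to run elevated, and the approvedKbs list is a made-up stand-in for whatever central repository you keep:
using System;
using System.Collections.Generic;
using WUApiLib;
class ApprovedUpdateInstaller
{
    static void Main()
    {
        // Hypothetical approved KB numbers pulled from your central repository.
        var approvedKbs = new HashSet<string> { "1234567", "2345678" };
        UpdateSession session = new UpdateSession();
        IUpdateSearcher searcher = session.CreateUpdateSearcher();
        ISearchResult result = searcher.Search("IsInstalled=0 and Type='Software'");
        UpdateCollection toInstall = new UpdateCollection();
        foreach (IUpdate update in result.Updates)
        {
            foreach (string kb in update.KBArticleIDs)
            {
                if (approvedKbs.Contains(kb))
                {
                    if (!update.EulaAccepted) update.AcceptEula();
                    toInstall.Add(update);
                    break;
                }
            }
        }
        if (toInstall.Count == 0) return;
        // Download first, then install; both calls block until they finish.
        UpdateDownloader downloader = session.CreateUpdateDownloader();
        downloader.Updates = toInstall;
        downloader.Download();
        IUpdateInstaller installer = session.CreateUpdateInstaller();
        installer.Updates = toInstall;
        IInstallationResult installResult = installer.Install();
        Console.WriteLine("Result: " + installResult.ResultCode);
    }
}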
The easiest way to do what you want is using WSUS. It's free and basically lets you set up your own local Windows Update server where you decide which updates are "approved" for your computers. Neither the WSUS server nor the clients need to be in a domain, though it makes it easier to configure the clients if they are. If you have different sets of machines that need different sets of updates approved, that's also supported.
Not only does this accomplish your stated goal, it saves your overall network bandwidth as well by only downloading the updates once from the WSUS server.
If in your context you're allowed to use Windows Server Update Service (WSUS), it will give you access to the Microsoft.UpdateServices.Administration Namespace.
From there, you should be able to do some nice things :)
P-L is right. I first tried Christoph Grimmer-Die's method, and in some cases it did not work. I guess it was due to a different version of .NET or OS architecture (32 or 64 bits).
Then, to be sure that my program always gets the Windows Update waiting list of each computer in my domain, I did the following:
Install a server with WSUS (may save some internet bandwidth): http://www.microsoft.com/en-us/download/details.aspx?displaylang=en&id=5216
Add all your workstations & servers to your WSUS server
Get the SimpleImpersonation lib to run this program with different admin rights (optional)
Install only the administration console component on your dev workstation and run the following program:
It will print to the console all Windows updates with UpdateInstallationStates.Downloaded
using System;
using Microsoft.UpdateServices.Administration;
using SimpleImpersonation;

namespace MAJSRS_CalendarChecker
{
    class WSUS
    {
        public WSUS()
        {
            // I use impersonation to use another logon than mine. Remove the following "using" if not needed
            using (Impersonation.LogonUser("mydomain.local", "admin_account_wsus", "Password", LogonType.Batch))
            {
                ComputerTargetScope scope = new ComputerTargetScope();
                IUpdateServer server = AdminProxy.GetUpdateServer("wsus_server.mydomain.local", false, 80);
                ComputerTargetCollection targets = server.GetComputerTargets(scope);
                // Search
                targets = server.SearchComputerTargets("any_server_name_or_ip");
                // To get only one server, use the FindTarget method
                IComputerTarget target = FindTarget(targets, "any_server_name_or_ip");
                Console.WriteLine(target.FullDomainName);
                IUpdateSummary summary = target.GetUpdateInstallationSummary();
                UpdateScope _updateScope = new UpdateScope();
                // See UpdateInstallationStates for all the other property criteria
                _updateScope.IncludedInstallationStates = UpdateInstallationStates.Downloaded;
                UpdateInstallationInfoCollection updatesInfo = target.GetUpdateInstallationInfoPerUpdate(_updateScope);
                int updateCount = updatesInfo.Count;
                foreach (IUpdateInstallationInfo updateInfo in updatesInfo)
                {
                    Console.WriteLine(updateInfo.GetUpdate().Title);
                }
            }
        }

        public IComputerTarget FindTarget(ComputerTargetCollection coll, string computername)
        {
            foreach (IComputerTarget target in coll)
            {
                if (target.FullDomainName.Contains(computername.ToLower()))
                    return target;
            }
            return null;
        }
    }
}

What is the Fastest way to read event log on remote machine?

I am working on an application which reads event logs (Application) from remote machines. I am using the EventLog class in .NET and then iterating over the log entries, but this is very slow. In some cases, machines have 40,000+ log entries and it takes hours to iterate through them.
What is the best way to accomplish this task? Are there any other classes in .NET that are faster, or any other technology?
Man, I feel your pain. We had the exact same issue in our app.
Your solution will branch depending on which OS version you're running and which OS version your "target" machine is running.
If you're both on Vista or Windows Server 2008, you're in luck. You should look at System.Diagnostics.Eventing.Reader.EventLogQuery and System.Diagnostics.Eventing.Reader.EventLogReader. These are new in .net 3.5.
Basically, you can build a query in XML and ship it over to run on the remote computer. Maybe you're just searching for events of a specific type, or maybe just new events from a specific point in time. The search runs on the remote machine, and then you just get back the matching events. The new classes are much faster than the old .net 2.0 way, but again, they are only supported on Vista or Windows Server 2008.
For our app when the target is NOT on Vista/Win2008, we downloaded the raw .evt file from the remote system, and then parsed the file using its binary format. There are several sources of data about the event log format for .evt files (pre-Vista), including link text and an article I recall on codeproject.com that had some c# code.
Vista and Windows Server 2008 machines use a new .evtx format that is a new format, so you can't use the same binary parsing approach across all versions. But the new EventLogQuery and EventLogReader classes are so fast that you won't have to. It's now perfectly speedy to just use the built-in classes.
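For the Vista/Server 2008-or-later case, here is a minimal sketch of running such a query against a remote machine with EventLogSession; the host name and credentials are placeholders:
using System;
using System.Diagnostics.Eventing.Reader;
using System.Security;
class RemoteEventQuery
{
    static void Main()
    {
        // Placeholder credentials/host; adjust to your environment.
        SecureString password = new SecureString();
        foreach (char c in "P@ssw0rd") password.AppendChar(c);
        var session = new EventLogSession(
            "REMOTEHOST", "MYDOMAIN", "someuser", password,
            SessionAuthentication.Default);
        // Only error-level (Level=2) events from the Application log are returned,
        // and the filtering happens on the remote machine.
        var query = new EventLogQuery("Application", PathType.LogName, "*[System/Level=2]")
        {
            Session = session
        };
        using (var reader = new EventLogReader(query))
        {
            for (EventRecord record = reader.ReadEvent(); record != null; record = reader.ReadEvent())
            {
                Console.WriteLine("{0}: {1}", record.TimeCreated, record.FormatDescription());
            }
        }
    }
}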
Event Log Reader is horribly slow... too slow. WTF Microsoft?
Use LogParser 2.2 - Search for C# and LogParser on the Internet (or you can use the log parser commands from the command line). I don't want to duplicate the work already contributed by others.
I pull the log from the remote system by having the log exported as an EVTX file. I then copy the file from the remote system. This process is really quick - even with a network that spans the planet (I had issues with having the log exported to a network resource). Once you have it local, you can do your searches and processing.
There are multiple reasons for having the EVTX - I won't get into the reasons why we do this.
The following is a working example of the code to save a copy of the log as an EVTX:
(Notes: "device" is the network host name or IP. "LogName" is the name of the log desired: "System", "Security", or "Application". outputPathOnRemoteSystem is the path on the remote computer, such as "c:\temp\%hostname%.%LogName%.%YYYYMMDD_HH.MM%.evtx".)
// Requires: using System.Diagnostics.Eventing.Reader;
static public bool DumpLog(string device, string LogName, string outputPathOnRemoteSystem, out string errMessage)
{
    bool wasExported = false;
    string errorMessage = "";
    try
    {
        System.Diagnostics.Eventing.Reader.EventLogSession els = new System.Diagnostics.Eventing.Reader.EventLogSession(device);
        els.ExportLogAndMessages(LogName, PathType.LogName, "*", outputPathOnRemoteSystem);
        wasExported = true;
    }
    catch (UnauthorizedAccessException e)
    {
        errorMessage = "Unauthorized - Access Denied: " + e.Message;
    }
    catch (EventLogNotFoundException e)
    {
        errorMessage = "Event Log Not Found: " + e.Message;
    }
    catch (EventLogException e)
    {
        errorMessage = "Export Failed: " + e.Message + ", Log: " + LogName + ", Device: " + device;
    }
    errMessage = errorMessage;
    return wasExported;
}
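And a hedged usage sketch of the workflow described above (export on the remote box, copy it over, then query the local .evtx); the host name, share, and local paths are placeholders, and it assumes using System.IO and System.Diagnostics.Eventing.Reader:
string err;
// Export the Application log to a path on the remote machine itself.
if (DumpLog("SERVER01", "Application", @"C:\Temp\SERVER01.Application.evtx", out err))
{
    // Copy it over the admin share, then query it locally as a file-based log.
    File.Copy(@"\\SERVER01\C$\Temp\SERVER01.Application.evtx", @"C:\Logs\SERVER01.Application.evtx", true);
    var query = new EventLogQuery(@"C:\Logs\SERVER01.Application.evtx", PathType.FilePath, "*[System/Level=2]");
    using (var reader = new EventLogReader(query))
    {
        for (EventRecord rec = reader.ReadEvent(); rec != null; rec = reader.ReadEvent())
            Console.WriteLine(rec.FormatDescription());
    }
}
else
{
    Console.WriteLine(err);
}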
A good Explanation/Example can be found on MSDN.
EventLogSession session = new EventLogSession(Environment.MachineName);
// [System/Level=2] selects only error-level events.
// "Log" is the name of the log you want to get data from.
EventLogQuery query = new EventLogQuery("Log", PathType.LogName, "*[System/Level=2]");
EventLogReader reader = new EventLogReader(query);
for (EventRecord eventInstance = reader.ReadEvent();
     null != eventInstance;
     eventInstance = reader.ReadEvent())
{
    // Output or save your event data here.
}
Where the old code took 5-20 minutes, this one does it in less than 10 seconds.
Maybe WMI can help you:
WMI with C#
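For example, a rough sketch of querying a remote Application log over WMI with System.Management; the host and credentials are placeholders, and WMI can still be slow for very large logs:
using System;
using System.Management;
class RemoteWmiEventLog
{
    static void Main()
    {
        // Placeholder host/credentials.
        var options = new ConnectionOptions { Username = @"MYDOMAIN\someuser", Password = "P@ssw0rd" };
        var scope = new ManagementScope(@"\\REMOTEHOST\root\cimv2", options);
        scope.Connect();
        // Only error events (EventType = 1) from the Application log.
        var query = new ObjectQuery(
            "SELECT TimeGenerated, SourceName, Message FROM Win32_NTLogEvent " +
            "WHERE Logfile = 'Application' AND EventType = 1");
        using (var searcher = new ManagementObjectSearcher(scope, query))
        {
            foreach (ManagementObject evt in searcher.Get())
            {
                Console.WriteLine("{0} {1}: {2}", evt["TimeGenerated"], evt["SourceName"], evt["Message"]);
            }
        }
    }
}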
Have you tried using the remoting features in PowerShell 2.0? They allow you to execute cmdlets (like the ones that read event logs) on remote machines and return the results (as objects, of course) to the calling session.
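If you go that route, here is a minimal sketch of driving it from C# via System.Management.Automation; the computer name is a placeholder and the target needs PowerShell remoting enabled:
using System;
using System.Management.Automation; // reference System.Management.Automation.dll
class RemoteEventLogViaPowerShell
{
    static void Main()
    {
        using (PowerShell ps = PowerShell.Create())
        {
            // Run Get-WinEvent on the remote machine and bring the results back as objects.
            ps.AddScript(
                "Invoke-Command -ComputerName SERVER01 -ScriptBlock { " +
                "Get-WinEvent -LogName Application -MaxEvents 100 }");
            foreach (PSObject result in ps.Invoke())
            {
                Console.WriteLine("{0}  {1}",
                    result.Properties["TimeCreated"]?.Value,
                    result.Properties["Message"]?.Value);
            }
        }
    }
}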
You could place a program on those machines that saves the log to a file and sends it to your web application. I think that would be a lot faster, as you can do the looping locally, but I'm not sure how to do it, so I can't give you any code :(
I recently did such a thing via a WCF callback interface; however, my clients interacted with the server through WCF and adding a WCF callback was easy in my project. Full code with examples is available here.
Just had the same issue and want to share my solution. It takes a search through the Application, System and Security event logs from 260 seconds (using EventLog) to about 100 times faster than that (using EventLogQuery).
And this in a way where it is possible to check whether the event message contains a pattern, or run any other check, without requiring FormatDescription().
My trick is to use the same mechanism as PowerShell's Get-WinEvent and then run the results through the check.
Here is my code to find all events within last 4 days where the event message contains a filter pattern.
string[] eventLogSources = { "Application", "System", "Security" };
var messagePattern = "*Your Message Search Pattern*";
var timeStamp = DateTime.Now.AddDays(-4);
var matchingEvents = new List<EventRecord>();

foreach (var eventLogSource in eventLogSources)
{
    var i = 0;
    // XPath query: only events created within the last 4 days
    var query = string.Format("*[System[TimeCreated[@SystemTime >= '{0}']]]",
        timeStamp.ToUniversalTime().ToString("o"));
    var elq = new EventLogQuery(eventLogSource, PathType.LogName, query);
    var elr = new EventLogReader(elq);
    EventRecord entryEventRecord;
    while ((entryEventRecord = elr.ReadEvent()) != null)
    {
        // Check the raw event properties instead of calling FormatDescription()
        if ((entryEventRecord.Properties)
            .FirstOrDefault(x => (x.Value.ToString()).Contains(messagePattern)) != null)
        {
            matchingEvents.Add(entryEventRecord);
            i++;
        }
    }
}
Maybe the remote computers could do a little bit of the computing. That way your server would only deal with relevant information. It would be a kind of cluster, using the remote computers for some light filtering while the server does the analysis part.
