Run multiple scripts in SAP GUI - C#

Context: I have two open sessions in my SAP GUI with the following IDs:
/app/con[0]/ses[0]
/app/con[0]/ses[1]
I want to run one script (.vbs) in each session. This is my code:
foreach (GuiSession s in _dicSap[tmpDKey].get_lstSapSession())
{
if (!s.Busy)
{
Process p = Process.Start(scriptName, s.Id); //s.Id=/app/con[0]/ses[0]
await Task.Delay(5000);
break;
}
}
Problem: my scripts are executed correctly, but they run one by one.
I want to run those scripts simultaneously. I don't understand why, because I haven't used .WaitForExit().
Is my code wrong, or is it impossible to run multiple scripts in SAP GUI from C#?
Sorry for my English.
Regards

This might be necro-threading, but this is how I solved a similar problem.
In my case I had to run N tasks, not two. Please also note that I had the script steps in C# code, not in separate files. Anyhow, this solution should fit your requirements.
First of all, you'll need to create multiple sessions of an existing (initial) session:
for (int i = 0; i < numOfSessions - 1 ; i++)
{
SapSession.CreateSession();
}
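Note that CreateSession only requests a new window; the extra sessions typically show up in the connection's Sessions collection a moment later, so it can be worth waiting for them. A crude sketch, assuming sapConnection is the GuiConnection used below:
// crude polling; in real code you would add a timeout
while (sapConnection.Sessions.Count < numOfSessions)
{
    System.Threading.Thread.Sleep(500);
}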
All these sessions will be placed in a list (sessionList). I use a custom sessionDetails class because I need to store IDs and activity information:
for (int i = 0; i < _maxSessions; i++)
{
sessionDetail sd = new sessionDetail((GuiSession)sapConnection.Sessions.Item(i), false, i);
sessionList.Add(sd);
}
class sessionDetail
{
public GuiSession sapSession { get; }
public bool isUsed { get; set; }
public int sessionId { get; set; }
public sessionDetail(GuiSession SapSession, bool IsUsed, int SessionId)
{
sapSession = SapSession;
isUsed = IsUsed;
sessionId = SessionId;
}
}
Secondly, you'll need to parallelize the execution of your program.
Let's assume you've got an array of scripts scr that you need to execute:
string[] scr = { "scriptingTask1", "scriptingTask2", "scriptingTask3" };
Then you’ll create parallel threads for each script:
Parallel.ForEach<string>(scr
, new ParallelOptions { MaxDegreeOfParallelism = _maxSessions }
, (script) => DoSomeWork(script, sessionList)
);
The method you pass as the lambda body assigns a free session to each scripting task and launches it (the session claim is done under a lock so two parallel tasks cannot pick the same session):
private static readonly object _sessionLock = new object();

private void DoSomeWork(string scrptTask, List<sessionDetail> _sessionList)
{
    sessionDetail _sessionToUse = null;
    // Claim a free session under a lock so two parallel tasks
    // cannot grab the same session at the same time.
    lock (_sessionLock)
    {
        foreach (sessionDetail s in _sessionList)
        {
            if (!s.isUsed)
            {
                _sessionToUse = s;
                s.isUsed = true;
                break;
            }
        }
    }
    if (_sessionToUse == null) return;
    try
    {
        //// Do your stuff here, using _sessionToUse.sapSession
    }
    finally
    {
        _sessionToUse.isUsed = false; // free the session for the next task
    }
}
Finally, make sure that the element addresses in your scripts (like "/app/con[0]/ses[0]/wnd[0]/usr/ctxtP_EKORG") use the corresponding session IDs; the session index sits in the middle of the path (ses[0]).
If you keep referencing ses[0] in all the scripts, you'll likely get "element wasn't found by ID" errors.
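For illustration, a small helper (BuildId here is just a hypothetical name) can build such an address from the claimed session's index so that each parallel script targets its own session; FindById and GuiTextField come from the SAP GUI Scripting API, and the field name is only an example:
string BuildId(int sessionIndex, string relativePath)
{
    // e.g. BuildId(1, "wnd[0]/usr/ctxtP_EKORG") -> "/app/con[0]/ses[1]/wnd[0]/usr/ctxtP_EKORG"
    return $"/app/con[0]/ses[{sessionIndex}]/{relativePath}";
}

// Inside DoSomeWork, once a session has been claimed:
// var field = (GuiTextField)_sessionToUse.sapSession.FindById(BuildId(_sessionToUse.sessionId, "wnd[0]/usr/ctxtP_EKORG"));
// field.Text = "1000";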
Constantine.

Related

How to push (append) a list in Firebase Realtime Database from Unity, C#?

So, I want to store the inventory of the player in Firebase Realtime Database.
I set up a basic schema, and I want to add the items (their names, actually) under the inventory "branch".
I am trying to do it with the push method, but it isn't working for me right now.
I have the following script, which updates the player's inventory client-side and should also update the database by appending to the inventory "branch".
What am I doing wrong?
public void pickUp() {
Transform free_place = findFirstFreePlace();
if(free_place) {
for (int i = 0; i < items.Length; i++) {
if (free_place.name.Contains(i.ToString())) {
items[i] = currentPickable;
reference.Child("users").Child(auth.CurrentUser.UserId).Child("inventory").Child(items[i].item_name).Push();
lastIndex = i;
}
}
free_place.GetChild(0).GetChild(0).GetComponent<Image>().sprite = items[lastIndex].item_pic;
free_place.GetChild(0).GetChild(0).GetComponent<Image>().enabled = true;
free_place.GetChild(1).GetComponent<Image>().enabled = true;
free_place.GetChild(1).GetComponent<Button>().interactable = true;
}
}
I'm guessing that there's some confusion around what Push() does in this context. If you read the documentation, it says:
Add to a list of data. Every time you call Push(), Firebase generates a unique key that can also be used as a unique identifier, such as user-scores/<user-id>/<unique-score-id>.
Basically, it's giving you a place you can safely write into rather than actually adding something to the Database. You should check out the sample, but I think I can illustrate both what you intend to do and a cool way to use Push() for your use case.
public async void pickUp() {
Transform free_place = findFirstFreePlace();
if(free_place) {
int lastIndex = 0;
var elements = new List<string>();
for (int i = 0; i < items.Length; i++) {
if (free_place.name.Contains(i.ToString())) {
items[i] = currentPickable;
elements.Add(items[i].item_name);
lastIndex = i;
}
}
await reference.Child("users").Child(auth.CurrentUser.UserId).Child("inventory").SetValueAsync(elements);
free_place.GetChild(0).GetChild(0).GetComponent<Image>().sprite = items[lastIndex].item_pic;
free_place.GetChild(0).GetChild(0).GetComponent<Image>().enabled = true;
free_place.GetChild(1).GetComponent<Image>().enabled = true;
free_place.GetChild(1).GetComponent<Button>().interactable = true;
}
}
What I'm doing here is that I build a list of the items beforehand. Then I set the inventory entry to this list. If I depended on the previous state, I'd probably prefer to do this as a transaction instead.
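A minimal sketch of that transaction approach, assuming the inventory node simply holds a list of item names, might look like this:
var inventoryRef = reference.Child("users").Child(auth.CurrentUser.UserId).Child("inventory");
await inventoryRef.RunTransaction(mutableData => {
    // The delegate is re-run if the server data changed underneath us,
    // so reading and extending the previous state is safe.
    var list = mutableData.Value as List<object> ?? new List<object>();
    list.Add(currentPickable.item_name);
    mutableData.Value = list;
    return TransactionResult.Success(mutableData);
});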
I also use async/await. I'm not sure how important it is that this logic runs before you update your game's UI, but setting anything in Firebase is an asynchronous operation (it runs in the background). You can omit this if it's not important that you stay in sync.
If you wanted to have uniquely identified weapon slots, you could instead do something like:
var childReference = reference.Child("users").Child(auth.CurrentUser.UserId).Child("inventory");
var key = childReference.Push().Key;
// ... more logic here ...
childReference.Child(key).SetValueAsync(item_name);
A few more hints:
I'd recommend that you use the ValueChanged listener whenever possible to keep your game state in sync with your server state. Realtime Database will always be running asynchronously in the background, and this way you stay up to date as more data becomes available (see the sketch after these hints).
If asynchronous programming in Unity is new to you, check out my post on the subject.
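A minimal sketch of such a ValueChanged listener, assuming the same reference and auth fields used above:
void SubscribeToInventory() {
    var inventoryRef = reference.Child("users").Child(auth.CurrentUser.UserId).Child("inventory");
    inventoryRef.ValueChanged += (sender, args) => {
        if (args.DatabaseError != null) {
            Debug.LogError(args.DatabaseError.Message);
            return;
        }
        // Rebuild the local inventory whenever the server data changes.
        foreach (var child in args.Snapshot.Children) {
            Debug.Log($"{child.Key}: {child.Value}");
        }
    };
}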
Okay, so I've finally figured it out.
With the help of the Push() method's unique id generation, I am able to add children with the same item name to the inventory node.
First I create that unique id, then I add it to the database and set its value to the item name.
Edit: I mixed my original solution with Pux0r3's solution, so that it uses async/await.
public async void pickUp() {
Transform free_place = findFirstFreePlace();
if (free_place) {
int lastIndex = 0;
var elements = new List<string>();
for (int i = 0; i < items.Length; i++) {
if (free_place.name.Contains(i.ToString())) {
items[i] = currentPickable;
elements.Add(items[i].item_name);
lastIndex = i;
}
}
string key = reference.Child(auth.CurrentUser.UserId).Child("inventory").Push().Key;
await reference.Child("users").Child(auth.CurrentUser.UserId).Child("inventory").Child(key).SetValueAsync(currentPickable.item_name);
free_place.GetChild(0).GetChild(0).GetComponent<Image>().sprite = items[lastIndex].item_pic;
free_place.GetChild(0).GetChild(0).GetComponent<Image>().enabled = true;
free_place.GetChild(1).GetComponent<Image>().enabled = true;
free_place.GetChild(1).GetComponent<Button>().interactable = true;
}
}

How to use Rx to monitor a project file and files for external changes?

I would like to reproduce the behavior of Visual Studio which informs you when a project file is touched externally and proposes to reload it!
Due to the requirements, I believe reactive is a great match to solve that problem.
I am using a modified reactive FileSystemWatcher described in this post: http://www.jaylee.org/post/2012/08/26/An-update-to-matthieumezil-Rx-and-the-FileSystemWatcher.aspx
public class FileWatcher
{
private static readonly ILog Logger = LogManager.GetLogger(MethodBase.GetCurrentMethod().DeclaringType);
public static IObservable<FileChanged> ObserveFolderChanges(string path, string filter, TimeSpan throttle, Predicate<string> isPartOfProject)
{
return Observable.Create<FileChanged>(
observer =>
{
var fileSystemWatcher = new FileSystemWatcher(path, filter) { EnableRaisingEvents = true, IncludeSubdirectories = true };
var sources = new[]
{
Observable.FromEventPattern<FileSystemEventArgs>(fileSystemWatcher, "Created")
.Where(IsMaybeAProjectFile)
.Select(ev => new FileChanged(ev.EventArgs.FullPath, FileChangeTypes.Added, SourceChangeTypes.FileSystem)),
Observable.FromEventPattern<FileSystemEventArgs>(fileSystemWatcher, "Deleted")
.Where(IsMaybeAProjectFile)
.Select(ev => new FileChanged(ev.EventArgs.FullPath, FileChangeTypes.Deleted, SourceChangeTypes.FileSystem))
};
return sources.Merge()
.Throttle(throttle)
.Do(changed =>
{
if (Logger.IsDebugEnabled)
{
Logger.Debug($"FileWatcher event [{changed.FileChangeType}] {changed.FullPath}");
}
})
.Finally(() => fileSystemWatcher.Dispose())
.Subscribe(observer);
}
);
}
private static bool IsMaybeAProjectFile(EventPattern<FileSystemEventArgs> ev)
{
return ev.EventArgs.FullPath.EndsWith(".zip") || ev.EventArgs.FullPath.EndsWith(".skye");
}
}
public class FileChanged
{
public string FullPath { get; }
public FileChangeTypes FileChangeType { get; }
public SourceChangeTypes SourceChangeType { get; }
public FileChanged(string fullPath, FileChangeTypes fileChangeType, SourceChangeTypes sourceChangeType)
{
FullPath = fullPath;
FileChangeType = fileChangeType;
SourceChangeType = sourceChangeType;
}
}
[Flags]
public enum FileChangeTypes
{
Added = 1,
Deleted = 2
}
[Flags]
public enum SourceChangeTypes
{
FileSystem = 1,
Project = 2
}
Now, in my application, I created an event:
private ProjectChangedEventHandler ProjectChanged { get; set; }
private void OnProjectChanged(FileChanged fileChanged)
{
ProjectChanged?.Invoke(this, fileChanged);
}
public delegate void ProjectChangedEventHandler(object sender, FileChanged fileChanged);
This event is raised like this when I delete or add a file from the project:
OnProjectChanged(new FileChanged(archive.Filename, FileChangeTypes.Deleted, SourceChangeTypes.Project));
OnProjectChanged(new FileChanged(archive.Filename, FileChangeTypes.Added, SourceChangeTypes.Project));
Now I can start to leverage those two streams, and with a join (which needs fine tuning for the left and right duration selectors) I am able to detect which file was modified by my application:
private void ObserveProjectModifications(string projectFilePath)
{
_observeFolderChanges = FileWatcher.ObserveFolderChanges(Path.GetDirectoryName(projectFilePath), "*.*", TimeSpan.FromMilliseconds(500), IsPartOfProject);
_observeProjectChanges = Observable.FromEventPattern<ProjectChangedEventHandler, FileChanged>(h => ProjectChanged += h, h => ProjectChanged -= h).Select(pattern => pattern.EventArgs);
_changes = _observeProjectChanges.Join(_observeFolderChanges, _ => Observable.Never<Unit>(), _ => Observable.Never<Unit>(), ResultSelector).Where(changed => IsPartOfProject(changed.FullPath));
}
private FileChanged ResultSelector(FileChanged fileChanged, FileChanged projectChanged)
{
if (Logger.IsDebugEnabled)
{
Logger.Debug($"ResultSelector File [{fileChanged.FileChangeType}] {fileChanged.FullPath} # Project [{projectChanged.FileChangeType}] {projectChanged.FullPath}");
}
if (fileChanged.FullPath == projectChanged.FullPath)
{
if (fileChanged.FileChangeType == projectChanged.FileChangeType)
{
if (fileChanged.SourceChangeType != projectChanged.SourceChangeType)
{
return projectChanged;
}
return fileChanged;
}
return fileChanged;
}
return fileChanged;
}
private bool IsPartOfProject(string fullPath)
{
if (_projectFileManager.ProjectFilePath.Equals(fullPath)) return true;
return _archives.Values.Any(a => a.Filename.Equals(fullPath));
}
My issue is that I also want to know when a file was modified externally! Any idea would be really helpful! Thanks
Unfortunately the FileSystemWatcher doesn't provide information about which process modified the file, so you are a bit out of luck there. There are a few possibilities that I can think of:
Ignore flag - When your application is making a change, you set a flag and ignore the events raised while the flag is set. This is the simplest way, but you might miss an external change if it happens concurrently while the flag is set, and it gets more complicated because of the throttling you have (a minimal sketch follows below).
Tagging the file - Whenever you change the file, you generate a GUID (or similar) and use it to tag the file. Then, whenever a file change fires, you check the file property (it can be stored as a real file-system property, similar for example to the JPEG metadata you see in the details view of File Explorer; there are several ways to set such a property). If the tag is different from what you have, or is missing, you know the change is external. Here too you need to take care of throttling and the tag being outdated, etc.
Minifilter file system driver - This would be the cleanest solution and is probably very close to what Visual Studio is using (just a guess, though). It is basically a universal Windows driver that monitors any I/O change. Microsoft has created a reference implementation called minispy, which is a small tool to monitor and log any I/O and transaction activity that occurs in the system. You don't have to implement the driver yourself, as there is already a 3rd-party FileSystemWatcher implemented with this approach on GitHub. That file system watcher tells you which process modified the file. The only problem is that the driver needs to be installed before it can be used, so you need an installer with admin privileges.
At the moment that's all I can think of.
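To illustrate the first option, a minimal sketch of the ignore flag (the field and method names are hypothetical, and because of the throttling this can only ever be an approximation):
private int _selfChangeCount; // > 0 while the application itself is writing

private void SaveProjectFile(string path)
{
    Interlocked.Increment(ref _selfChangeCount);
    try
    {
        // ... write the project file here ...
    }
    finally
    {
        Interlocked.Decrement(ref _selfChangeCount);
    }
}

// When composing the streams, drop file-system events raised while we were writing:
// _externalChanges = _observeFolderChanges.Where(_ => Volatile.Read(ref _selfChangeCount) == 0);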

Why isn't it deleting the string from the list, but it does delete something

So I have no idea why it's not deleting the actual string from the websiteList; it's weird, because it does delete from the proxyList.
When debugging, it looks like it does delete something, because websiteList.Count gets lower after running through websiteList.Remove(website);
But it doesn't delete the string; it keeps looping through the same string.
foreach (var website in websiteList.ToArray())
{
var webSplit = website.Split(')');
foreach (var proxy in proxyList.ToArray())
{
if (proxyList.Count > 0)
{
if(websiteList.Count > 0)
{
var proxySplit = proxy.Split(':');
int Port;
bool convert = Int32.TryParse(proxySplit[1], out Port);
if (this returns true)
{
Console.WriteLine("Removing proxy");
proxyList.Remove(proxy);
websiteList.Remove(website);
}
if (this returns true)
{
Console.WriteLine("Removing proxy");
proxyList.Remove(proxy);
websiteList.Remove(website);
}
}
}
else
break;
}
}
You are repeatedly deleting from the same proxyList (i.e. you repeat the whole inner loop as many times as there are websites). Why are those 2 loops nested? The websites seem not to be related to the proxies; nesting would only make sense if the proxy list were extracted from a website.
Are these 2 lists supposed to have the same length, with the proxy at index i belonging to the website at index i? If so, use a for-loop and iterate in reverse order to avoid messing up the indexes:
for (int i = websiteList.Count - 1; i >= 0; i--) {
if (<condition>) {
proxyList.RemoveAt(i);
websiteList.RemoveAt(i);
}
}
If you had a class for the websites, it would be easier to manipulate the things that belong together. It also has the advantage that you can add additional logic belonging to websites and proxies (like extracting the port number):
public class Website
{
public string Site { get; set; }
public string Proxy { get; set; }
public int Port {
get {
string[] proxySplit = Proxy.Split(':');
int portNo = 0;
if (proxySplit.Length == 2) {
Int32.TryParse(proxySplit[1], out portNo);
}
return portNo;
}
}
}
Now the list is of type List<Website> and contains both the websites and the proxies.
You can delete by using the for-loop as before, or use LINQ to create a new list containing only the desired items:
websiteList = websiteList.Where(w => <condition using w.Site, w.Proxy, w.Port>).ToList();
Note: There is a System.Uri class for the manipulation of uniform resource identifiers. Among other things it can extract the port number. Consider using this class instead of your own.
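For example, a rough sketch, assuming the proxy strings look like "host:port" (the scheme is only prepended so that Uri can parse the string):
if (Uri.TryCreate("http://" + proxy, UriKind.Absolute, out Uri uri))
{
    string host = uri.Host;
    int port = uri.Port; // 8080 for "host:8080", or 80 (the http default) if no port was given
}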

Phone version of Universal App seems to be caching database even after uninstall

I'm working on a Universal App using SQLite, and I've been using this tutorial along with SQLite for Windows Runtime (Windows Phone 8.1) and sqlite-net to get started.
I'm using this to check if the database exists:
bool exists = true;
try {
var file = await ApplicationData.Current.LocalFolder.GetFileAsync( "database.db" );
} catch {
exists = false;
}
and if not, running some insert queries to populate default data:
if( !exists ) {
var conn = new SQLiteConnection( "database.db" );
conn.CreateTable<MyClass>();
MyClass.Insert( "Default data value" );
}
where MyClass is
[Table( "MyClass" )]
public class MyClass {
public MyClass() {
this.ID = -1;
this.Name = string.Empty;
}
[PrimaryKey, AutoIncrement]
public int ID { get; set; }
[NotNull]
public string Name { get; set; }
internal static void Insert( string Name ) {
var myclass = new MyClass();
myclass.Name = Name;
var conn = new SQLiteConnection( "database.db" );
try {
if( myclass.ID == -1 ) conn.Insert( myclass );
} catch { }
}
public async static Task<List<MyClass>> List() {
var conn = new SQLiteAsyncConnection( "database.db" );
var rs = conn.Table<MyClass>().OrderBy( t => t.ID );
return await rs.ToListAsync();
}
}
The problem is that every time I do a fresh deploy (after uninstalling from my test device), and the code correctly determines the database doesn't exist and performs the inserts, I'm left with an increasing number of records. More specifically, I'm doing an insert of four default values, but after my last deploy that table has 124 records (meaning I've deployed 31 times), and looking at the data, it's the same four values repeated. Stepping through the code, only four inserts occur, as expected.
It seems like the database is being cached somewhere. I've run the Windows Phone Power Tools and verified there are no files after uninstall. Am I missing something here?
I had a similar problem some time ago (although not with SQLite). Try to check the "Disable automatic backup/restore" checkbox in WMAppManifest.xml (Packaging tab).
In general though, new SQLiteConnection( "database.db" ) by itself does not guarantee that a new file is created, and your code doesn't currently ensure the file does not already exist: you are catching all exceptions, while your condition is only covered by FileNotFoundException; GetFileAsync may fail for reasons other than the file not existing. I would suggest catching FileNotFoundException specifically to begin with, and possibly also creating the file explicitly in code.
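For example, the existence check could be narrowed like this (FileNotFoundException lives in System.IO):
bool exists = true;
try {
    await ApplicationData.Current.LocalFolder.GetFileAsync( "database.db" );
} catch( FileNotFoundException ) {
    exists = false; // only "not found" means the database still needs to be created
}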

ClrZmq returning messages always to first started client

We're creating a WPF app in which we execute Python scripts from different test stations and show the output in the corresponding output panel. To run the scripts in parallel we are using Task, but when we run the scripts in parallel from the stations, we also get the output of the other stations in the station that was started first. We're using the following code:
private void ZmqStatusListener(string endPoint)
{
using (Context context = new Context())
{
StatusPort = string.Empty;
TestResultPort = string.Empty;
using (Socket server = context.Socket(SocketType.REP))
{
try
{
if (isStatusContextActive == false || isPortChanged == true)
{
server.Bind(endPoint);
isStatusContextActive = true;
}
}
catch (ZMQ.Exception ex)
{
if (ex.Errno != 100)
{
string IPCPort = _globalParameters.GlbParam.GlbParamIpcStartPort;
if (IPCPort == string.Empty)
{
IPCPort = "0";
}
if (endPoint == EditorConstants.PortAddress.PortPrefix + IPCPort)
{
StatusPort = endPoint;
TestReultError = EditorConstants.CommonMessageTypes.TestReultError + ex.Message + EditorConstants.CommonMessageTypes.StackTraceMessage + ex.StackTrace;
}
StopExecOfScript(default(object));
isCancelledtask = true;
ScriptStatusDesc = new ScriptStatusDesc()
{
Status = "Failed",
statusDescription = "Failed"
};
}
}
while (true)
{
string message = server.Recv(Encoding.UTF8);
UpdateTestResults(message);
server.Send(" ACK", Encoding.UTF8);
// if (message == "Test Passed")
//break;
}
}
}
}
For testing purposes we're breaking the while loop in this code based on a test message we kept in the Python script; that way we do get the output in the respective station correctly, but then we can only run in a synchronous fashion, which we don't want, since we need to run the test stations in parallel and the while loop should not break, as it should keep listening for responses.
We were able to solve the issue by building a sample app to reproduce it and to first verify whether our ClrZmq pattern was correct for us, and it is. The resolution we followed: we needed to bind the incoming data to the corresponding view's model object in its ViewModel, so we had to retrieve the view's DataContext (which is of type ISomeXViewModel) for the particular test station, using that test station's Id. We did this because all of our test stations are added dynamically, and we also store the DataContext so it can be accessed wherever necessary. The issue was caused by multiple instances of UserControls, so we explicitly had to update the test station manually with a little more effort.
Sample Code Snippet
private void BindTestResult(string xmlPayLoad)
{
// converting xmlPalLoad to a class/model object
ITestStationViewModel viewModel = (ITestStationViewModel)((IView)DynamicTestStationsGrid.Children[StationNumber].Content).DataContext;
// IView class has DataContext property so I am type casting the Content which is ContentControl to IView type first and later to ITestStationViewModel
viewModel.TestStationModel = xmlPayLoadModel;
}
Thanks.
